On Sunday, October 19, policymakers and thought leaders from around the country convened in Lake Arrowhead, CA, for a three-day symposium on the relationship between technology, data, transportation and land use planning, and the environment. The event, entitled “Smart Technologies, Smart Policies,” is the 23rd annual installment in a series of relatively intimate, carefully curated symposia convened by the UCLA Lewis Center for Regional Policy Studies. This year, the panels dealt with such questions as: How can the public sector be responsive to the fast pace and great uncertainty of technological change? How are city governments using more data and new technologies to plan and manage transportation systems in innovative ways, from smart parking meters to taxicab GPS to open transit data feeds? What new institutions, regulations, and systems are needed to deal with a changing technological world?
Assessing Technology’s Roles in Access and Mobility: From Where Have We Come, To Where Are We Headed? (Sunday, 1:30 PM)
…a means not just a thing.”
…embedded in social, political, and economic attitudes and systems.”
Opening: Martin Wachs, Distinguished Professor Emeritus of Urban Planning, UCLA
Martin Wachs, Distinguished Professor Emeritus at UCLA and the RAND Corporation, opened the symposium and its first panel. He offered some historical context. In the early 20th century, owners of private cars (early Ford models, for the most part) would hail passengers curb-side and offer rides for pay. This informal system, the first to bear the name ‘jitneys,’ was quickly banned in London and other cities. Why? Despite the fact that cars were seen as an environmental palliative to the pollution caused by horse-drawn transportation, they were also seen as inappropriate competition with public transit, and transit operators had political sway. Wachs reminded the audience that these early jitneys were not so different from what are considered innovative services today: the so-called peer-to-peer ridesharing services Lyft, Uber, Sidecar, and others. Our hesitations about these services today are entirely different, though, and they are a function of our time.
Dr. Wachs offered us an expansive frame, one that lets us step outside of, or at least recognize, the social, political, and economic attitudes and systems that shape how we in 2013 view technology and in which technology is embedded.
Steven Popper, Senior Economist, RAND Corporation
Dr. Popper offered a broad overview of the interaction between technology and transportation, grounded in the National Cooperative Highway Research Program (NCHRP) study he recently completed, entitled Expediting Future Technologies for Enhancing Transportation System Performance. He provided fundamental definitions and frameworks for thinking about technology. Although we often focus our attention on the “thingyness” of technology, i.e., the objects humans make out of steel and glass, technology also refers to activities: actions undertaken by people to alter the material world. We should also pay attention to the disruptive nature of technology, and to its social nature, in that technology is a cooperative undertaking among humans. Dr. Popper also presented some fundamental claims about the nature of public agencies. Public agencies have difficulty interacting with technology because of certain systemic factors and common characteristics. For example, agencies lack the technical expertise needed to assess technology: many senior decision makers at transportation agencies have civil engineering or transportation planning backgrounds, while few have electrical engineering or computer science backgrounds. Other examples include the politicized nature of government agencies, lack of funding, and stringent contracting procedures. Popper concluded by noting that there is no single, consistent source of obstacles to technology use for transportation agencies. Agencies need to engage in introspection to understand their own posture toward technology and how it shapes their capacity to achieve their missions and goals.
Bruce Schaller, Deputy Commissioner for Traffic and Planning, New York City Department of Transportation
Mr. Schaller began by agreeing with Martin Wachs: technology is about information and communication, he said. He also agreed with Dr. Popper that technology is a means to an end. Where Dr. Popper identified the general need for agencies to consider technology in terms of how it can advance their missions and goals, Mr. Schaller provided an example of an agency that had done just that. He could clearly articulate the purposes and ends of technology in his agency. First, technology helps NYCDOT allocate space and time efficiently, which advances the agency’s fundamental goals for systems operation. Taxi GPS data, smart parking meters, and connected signal systems all enable the efficient allocation of space and time and help to track and record the use of street space and time. Second, technology helps NYCDOT planners communicate upcoming changes to the public in a continuous way, which Mr. Schaller called “seamless.” Mobile and web-based applications for NYC’s bike share, on-street parking, and travel advisories constitute this communication. Third, at NYCDOT, visual communication tools and traffic analysis software (Synchro) work together to serve public outreach: to display information and analyses in a way that the public can understand and immediately validate.
Mr. Schaller also observed that “technology is a goodie,” which is to say technological changes are politically free, whereas changes to the allocation of scarce street space have winners and losers. He concluded his talk, though, by offering some of the challenges that technology poses to NYCDOT. These include ensuring that web and mobile applications such as NYCDOT’s bike-share app reflect the “ground truth,” as well as issues related to privacy and network effects.
Joaquin Lopez, Virtual Business Manager, Cisco Internet Business Solutions
Mr. Lopez provided an overview of the internet. How did the internet evolve, and how will it change in the coming years? Two of the ideas presented by Joaquin recurred throughout the conference in attendees’ questions and comments. First, the Internet of Everything. Joaquin defined it as the digitization of the world and the connection of people, processes, data, and things. Joaquin showed a slide of a cow with a device on it to illustrate the wide range of data and processes that are being connected to the internet. (The Internet of Everything is Cisco’s twist on the more common phrase, “the Internet of Things.”) Any device or item in social or economic life could be connected. Joaquin noted that only 1% of things that could be connected are connected, so there is a lot of potential for growth here. The second idea which resonated throughout the conference was the statement, “I can’t imagine life without…” Joaquin showed data from BITKOM, a consortium of companies based in Europe, that indicated that 97% of 14-29 year olds said that they could not imagine life without a mobile phone. 84% could not imagine life without the internet. By contrast, 64% could not imagine life without a car. This glimpse into the role of various technologies in modern life became a common touchpoint throughout the conference.
Mr. Lopez spoke a bit about his desire for Cisco to collaborate more with public agencies and other stakeholders. First, he noted that Cisco controls how networks work, due to its market dominance: about 80% of the traffic on the net runs over Cisco technology. He called for transportation agencies, other government agencies, and other stakeholders to join the conversation about network features.
Attendee Questions and Comments
What about waste streams? Someone in the audience asked about the additions to the waste stream from adopting and mandating new technology: when agencies or societies mandate or mainstream a new technology, the old technology is thrown away. Joaquin acknowledged the concern, noting that the original build-up of the internet was itself very environmentally costly.
Someone else asked about the costs of technology, for example the costs to human life and limb of texting while driving. The question echoed Martin Wachs’s introduction to the panel: he had wondered if contemporary society is perhaps too eager and optimistic about technologies like smart phones and the internet, where early 20th century society was too hesitant about the emerging technology of the car. Will we realize some of the social costs of today’s technologies and begin to regulate them? Joaquin offered that the technology exists currently to make it impossible to text and drive, but we don’t use it.
There was some discussion about the education of transportation planners and engineers. Bruce suggested that the old stereotype of old-school operations staffers vs. new-school, technologically proficient IT people is becoming obsolete. Joaquin offered that there is a huge opportunity to educate planners in the intersection between IT and planning, or “public sector IT.” He said that planners should be familiar with basic IT skills and principles, like some Python coding.
While Dr. Popper’s talk unpacked the difficulties that transportation agencies have with technology, both Mr. Schaller and Mr. Lopez provided examples of collaboration that could overcome those difficulties. Many of the weaknesses that transportation agencies have when it comes to relating to technology (e.g. lack of technical expertise) disappear when the agency no longer acts as a closed entity.
The panel also presented a contrast. On the one hand, there is rapid change and innovation in the technological sector, in terms of technical discoveries, dissemination, standards, performance, processes, institutions, and more. On the other hand, the features and processes of transportation agencies are slow to change. NYCDOT provides a hopeful example of an agency that has incorporated technological innovation and data-driven processes into its operations. It remains to be seen whether more agencies will follow the NYCDOT model, and how quickly the bulk of agencies will incorporate new uses and forms of information technology.
Changing Behaviors, Changing Cities (Sunday, 3:45 PM)
“the relationship between the real world and the Yelp world”
“let’s look at behavior as distinct from attitude”
“a rising tide [of high-speed internet connections in a central city] lifts all ships [with regard to connectivity in households of color]”
Andrew Mondschein, Assistant Professor, Department of Urban and Environmental Planning, University of Virginia
First, Dr. Mondschein gave an overview of how information generally, and information technology more specifically, might affect travel choices. Fundamentally, “You can’t choose to go somewhere unless you know where it is and how to get there.” The cognitive maps that people have always formed in their heads and used to navigate the city are being replaced or augmented by an explosion of information technology services. These include web and phone applications for real-time travel information (NextBus), mapping and navigation (Google Maps), social media (Twitter, Facebook), GPS-based services (FourSquare), and search services (Yelp). Professor Mondschein shared his research into Yelp reviews in Phoenix, AZ. Yelp is a social reviewing site that allows users to rate establishments (restaurants, parks, dentists’ offices, and any number of other types of places) on a five-star scale and write reviews about them.
Dr. Mondschein showed a number of maps that juxtapose the real world with the “Yelp world.” Yelp reviews mentioning parking and transit cluster in certain neighborhoods of Phoenix. Yelp star ratings, he showed, tend to be higher in higher-income neighborhoods. The volume of Yelp activity (i.e., the number of ratings and reviews, regardless of score) is also higher in higher-income neighborhoods, while some low-income neighborhoods with just as many restaurants and business-tax receipts show relatively little Yelp activity.
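The shape of this kind of comparison can be sketched in a few lines of Python. The neighborhood names, star ratings, and income figures below are invented placeholders, not data from the Phoenix study; the sketch only illustrates the analysis pattern: aggregate review volume and average rating per neighborhood, then set them beside an income measure.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical Yelp reviews: (neighborhood, star rating) -- invented, not study data
reviews = [
    ("Arcadia", 5), ("Arcadia", 4), ("Arcadia", 5), ("Arcadia", 4),
    ("Maryvale", 3), ("Maryvale", 4),
]

# Illustrative median household incomes, also invented
income = {"Arcadia": 74000, "Maryvale": 41000}

def neighborhood_stats(reviews):
    """Return {neighborhood: (review volume, average stars)}."""
    by_hood = defaultdict(list)
    for hood, stars in reviews:
        by_hood[hood].append(stars)
    return {hood: (len(ratings), mean(ratings)) for hood, ratings in by_hood.items()}

stats = neighborhood_stats(reviews)
for hood, (volume, avg) in sorted(stats.items()):
    print(f"{hood}: {volume} reviews, {avg:.2f} avg stars, income ${income[hood]:,}")
```

Even in this toy form, the output makes the pattern Mondschein described visible: the higher-income neighborhood shows both more activity and higher average ratings.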
Does this mean government should take out Yelp ads, perhaps for businesses in the neighborhoods where Yelp activity is disproportionately low? Mondschein is not sure that’s the answer, but he offers it as the kind of question government should be asking. There’s a need, he says, for government to go beyond opening a Facebook and a Twitter account in its engagement with social media and the digitized city.
Evelyn Blumenberg, Professor and Chair of Urban Planning, UCLA
Dr. Blumenberg illustrated the difference between perception and reality when it comes to teens and travel. Although popular media outlets have been publishing stories that advance the narrative that today’s teens will not purchase or drive cars as much as previous generations did, the data do not bear out this narrative. Professor Blumenberg and her team examined National Household Transportation Survey data from 1990, 2001, and 2009. Contrary to what is insinuated by popular media, she finds: 1) very modest declines in licensing among young people, 2) a decline in commute travel by modes other than single-occupancy vehicle, and 3) small differences in travel patterns between young and old. Further, scrutiny of these data in combination with measures of web use shows that personal miles traveled are positively correlated with web use. Thus, the commonly heard hypothesis that cell phone or internet usage is a substitute for travel is not borne out in the data. These technologies complement travel; they do not substitute for it. More work will need to be done to understand how these technologies might affect travel, whether in terms of route choice, spontaneous trip planning, a shift in trip types, or other kinds of shifts.
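The complementarity finding boils down to a positive correlation between web use and personal miles traveled. A minimal sketch of that check, using invented household figures rather than NHTS data:

```python
from math import sqrt

# Hypothetical household records (invented, not NHTS data):
# days of web use per week, and personal miles traveled per day
web_use = [0, 1, 2, 3, 4, 5, 6, 7]
pmt     = [18, 20, 22, 25, 24, 28, 30, 33]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(web_use, pmt)
# A positive r is consistent with complementarity: more web use, more travel
print(f"r = {r:.3f}")
```

Correlation alone cannot establish the direction of the relationship, which is why Blumenberg calls for more work on the mechanisms (route choice, spontaneous trip planning, and so on).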
Karen Mossberger, Professor and Director, School of Public Affairs, Arizona State University
Dr. Mossberger presented a reminder that not all people are connected to the internet or reaping the advantages of communication technology. Her research concerns how high-speed broadband adoption rates vary among cities and neighborhoods within cities in the U.S. She analyzed data from the 2007 and 2009 Current Population Survey for the 50 largest metropolitan statistical areas (MSAs). She shared maps and tables of the cities with the highest rates of broadband adoption (Portland, San Francisco, and San Jose are the top three). Notably, Los Angeles, CA is in the bottom ten of all 50 cities for broadband adoption. Dr. Mossberger also finds that broadband adoption rates vary quite a bit within cities, with cost being the most important barrier nationally.
Much is at stake, Dr. Mossberger says. With the rise of e-government for all kinds of service delivery, from transit routing applications to enrollment in social services, lack of connectivity can mean lack of services.
Attendee Questions and Comments
Attendees had some wide-ranging and far-reaching questions in response to this panel. Hasan Ikhrata, the Executive Director of the Southern California Association of Governments, asked if it is time to shift our long-term policy visions in response to changing technologies. Professor Blumenberg responded that more data is needed to provide a full understanding of the dynamics at work with regards to technology and transportation. Another attendee asked about the role of pricing in pushing technological innovation. I asked about collective action, which is a fundamental role of government and also something that is made possible by social applications like Yelp. What is the difference between the kind of collective action undertaken by government and the kinds of collective actions enabled by Yelp? Are there ways in which government and web technology can work together to enable unique forms of collective action?
We still don’t really know how technologies like Yelp and the internet are changing travel behavior and urban form. We do know that the digital city and the real city are very different, and that the degree to which people access and participate in the digital city varies by socioeconomic factors. Although research like Blumenberg’s and Mondschein’s has focused on travel, both researchers call for broader investigation into the social and economic implications of what they observe.
The New Services on the Block: Technology-Enabled Ridesharing — Opportunities and Challenges (Sunday, 8:00 PM)
“Cabs fell asleep at the switch and created this gap.”
“This is regulation as whack-a-mole.”
“This is regulation lite.”
“Risk management isn’t just about preventing harm, it’s about enabling benefit by enabling people to take, for example, the risk of getting in a stranger’s car”
Intro: Juan Matute, Associate Director, Lewis Center for Regional Policy Studies, UCLA
First, Juan instructed everybody in the audience to fist-bump the person sitting next to them. Fist-bumping is the signature gesture of approval for people riding a Lyft. Lyft, along with its peers Uber, Sidecar, and others, represents an emerging transportation service commonly known as “ridesharing” or “peer-to-peer ridesharing.” Using screen shots and photos (all produced by his smart phone), Juan took us on a visual tour to show us how these services work. The Lyft application shows you a map with the locations of the drivers in your area. You can view a photo and ratings of each of the drivers. You choose one and request a ride, letting them know your location with your phone’s GPS. When the car comes, you recognize it by the pink mustache attached to the front grille, which is Lyft’s trademark. You take your ride, Lyft “suggests” a “donation,” and you pay by credit card, optionally adding a tip.
David King, Assistant Professor of Architecture, Planning, and Preservation, Columbia University
Regulation is a central issue when it comes to ridesharing. Professor King asks the question, “Who are we regulating for?” He presented some charts showing that over the last several decades, there has been no relationship between population change and taxi service change in American cities. This suggests a failure of regulators to keep pace with change and provide consumers with service. Also, the fact that taxis are regulated by municipalities creates a confusing regulatory patchwork.
Dr. King also presented his research on informal ridesharing in New York City. He introduced the informal jitneys that operate throughout the City, many catering primarily to a specific immigrant or ethnic group. Jitneys in Brooklyn and Queens carry quite robust daily ridership, equivalent to the 20th largest bus service in the United States. Dr. King studied one attempt by New York City to formalize jitney services in order to fill a service gap created by bus service cuts. He and his team observed bus stops where both traditional public transit and the jitney service stopped. They found that people’s primary reason for taking the jitney was to save time. Even people with an unlimited transit pass would take the jitney if it was the first vehicle to come to the stop.
Dr. King also fleshed out the category of ridesharing by exploring various concepts and origins of ridesharing. Ridesharing is often conceptualized as an amenity. Amenity ridesharing takes the form of shuttle services offered by employers, large apartment buildings, hotels, and other entities; its riders tend to be wealthy. He also explored the concept of asymmetrical travel demand, wherein travelers take public transit or walk or bike to a store like Costco, buy heavy and bulky goods, and then take a taxi or rideshare home.
What is the role of planning and policy in dealing with these services? Dr. King highlighted the need for planners to manage curb space. Pricing alone does not solve the problems of goods movement, informal jitneys, and public transit all competing for scarce temporary loading space.
Guy Fraker, President, Get2Kno
Fraker comes from the broader perspective of the “shared economy,” which includes a wide variety of sharing services, such as AirBnB (shared housing), ridesharing, and other emerging services (peer-to-peer shared bikes, others). As a participant in the shared economy, Guy Fraker called for a sense of proportion when thinking about risk and trade-offs involving regulating shared ride services. He opened with the question, “If a 747 crashed every 4 days, how often would you fly?” The rate of death in vehicles on the nation’s roads is equivalent to this, but we willingly take this risk every time we get in a car. Ridesharing and autonomous vehicles pose some risk, but Fraker calls for us to compare it to the existing risks we accept. Fraker’s company seeks to enable the shared economy by providing an outside assessment of a person’s trustworthiness. This assessment is based on linguistic analysis of public statements via social media and other online sources. The linguistic method was developed by the national intelligence agencies when they were forced to collaborate after the establishment of the Department of Homeland Security. Since trust is a fundamental component of the shared economy, Get2Kno would reward people who are trustworthy with discounts and other benefits related to shared economy services.
Carol Brown, Chief of Staff, California Public Utilities Commission (CPUC) President
The CPUC is one of the first governmental bodies to regulate ridesharing services, which they refer to as Transportation Network Companies (TNCs). The regulations are relatively brief and can be summarized in a single power point slide, which Ms. Brown displayed as she talked. Among other things, TNCs must carry commercial liability insurance, conduct criminal background checks and driving record checks, ensure no unlawful discrimination for ratings, and keep data on rejected requests and origin and destination of trips. Amidst calls from the taxi industry to ban these services, these regulations address some of the concerns about the safety of TNCs and their perception as professional driver services.
Many of the regulations that apply to taxis are difficult to apply to TNCs: taxi fleets must constantly become greener, and taxis must provide disability access and allow service animals. TNCs also require smartphones and credit cards, which limits access and is potentially a social justice issue. All of these issues and critiques of TNCs exist, but on the other hand there are thousands of underground jitney services that will never come under the purview of the CPUC. So the commission saw an opportunity to have a positive impact on TNCs by crafting some regulation. This approach assumes that insurance companies and investors will be the ones to look into many of the risks and issues that TNCs pose. Carol marveled at the marketing savvy these companies have. It’s amazing that they are known as “ridesharing” services, she said. They are not really ridesharing: most of the drivers are professional drivers, and the service is professionally delivered to customers. Moreover, there are cases where customers have used a TNC to pick up a package.
Attendee Questions and Comments
Many of the questions were directed toward Carol; the attendees were keenly interested in how the public sector can or should react to these emerging services. Several questions explored additional regulatory elements. Did the CPUC consider regulating [X]? Or engaging TNCs in [X]? Someone asked about requiring TNCs to provide Access-like paratransit (no, the CPUC did not explore this). Another person asked if the CPUC built in any kind of data-tracking to evaluate these companies’ claims about pollution reduction, congestion reduction, and the like (they did not). For the most part, the answer to these questions was no, and Carol reiterated that the approach the CPUC is taking should be understood as “regulation lite.” Someone asked Dr. King about the prospects for jitneys as a transit-access mode. He was not optimistic, reiterating that jitneys compete with transit on the basis of travel time.
While the benefits of TNCs and jitneys are clearly indicated by the demand for these services, the role of government is less clear. Taxi regulations provide evidence of a disappointing lack of innovation and timeliness by regulators, resulting in lack of service to consumers. The CPUC’s approach, by contrast, represents an attempt to keep pace with rapid change and enable benefit to consumers, rather than regulating it away. Many questions remain.
Building Better Planning Tools in the Era of Big Data (Monday, 8:30 AM)
“Data is the glue that holds all the scenarios together”
“We’re tripping over new sources of big data that we didn’t realize existed, because transportation planning isn’t lucrative for data providers.”
Gordon Garry, Director of Research and Analysis, Sacramento Area Council of Governments
Mr. Garry defined and described scenario planning and open source planning tools, new approaches that stand in contrast to older, performance-based and proprietary planning methods. Scenario planning is characterized by its exploration of a set of scenarios rather than a single prediction, and by the participation of a variety of stakeholders working toward long-term agreement. Open source planning tools are an emerging response to two facts: planning problems have become more complex and multidisciplinary, involving a broader range of stakeholders outside of government, while at the same time there is insufficient market demand for affordable proprietary software to address these problems. Mr. Garry gave some examples of the kinds of more complex modeling and scenario planning that are occurring, with more fine-grained data to support them. These included tour-based and activity-based travel demand models, as well as integrated models of economic behavior, land use, and travel.
Ron Milam, Principal, Fehr & Peers
Mr. Milam subtitled his presentation, “a first penguin perspective,” and explained that Fehr & Peers has been one of the first firms to try to use these data for transportation planning. No one wants to be the first penguin to jump into the water, because the first penguin bears the most scars from the ice and is the first to find out if there are predators in the water.
Mr. Milam first provided some background. First, he noted that extracting knowledge and wisdom from raw data is a process. Second, more data and information sources exist now. He displayed a list of potential data sources that currently exist, and said that about half of them did not exist a few years ago. He elaborated upon a few examples, including cell phone data (providers: AirSage, TomTom), satellite imagery (which can be used to sample parking lot utilization), and routes and other data from mobile applications used for travel. He then demonstrated some of the ways in which Fehr & Peers has made use of new sources of data for transportation planning, and some of the advantages and drawbacks. Speed and route data from cell phones enabled Mr. Milam’s firm to rapidly and cost-effectively understand congestion. Cell phone origin-destination data allowed for validation of a travel demand model. Social media data (a query for tweets within a certain distance of a transit stop) produced an “attitude assessment.” Advanced software combined with finely grained travel data allows for a visualization of the impacts of a roadway infrastructure modification that is on the cutting edge in its ability to show the nuances of upstream, downstream, and ripple effects.
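A tweet query like the one Mr. Milam described can be approximated with a geographic distance filter. Everything below is a toy sketch: the coordinates, tweets, and keyword lists are invented, and a real attitude assessment would use a proper sentiment model rather than keyword matching.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

# Hypothetical geotagged tweets: (lat, lon, text) -- all invented
tweets = [
    (34.0522, -118.2437, "waiting forever for this bus"),
    (34.0530, -118.2441, "love the new station"),
    (34.1000, -118.3000, "great tacos downtown"),
]

stop = (34.0524, -118.2439)  # illustrative transit stop location

# Keep only tweets within 250 m of the stop
nearby = [t for t in tweets if haversine_m(t[0], t[1], stop[0], stop[1]) <= 250]

# Crude attitude tally from keyword lists (placeholder for real sentiment analysis)
POSITIVE, NEGATIVE = {"love", "great"}, {"forever", "late", "crowded"}
score = sum(
    bool(POSITIVE & set(text.split())) - bool(NEGATIVE & set(text.split()))
    for _, _, text in nearby
)
print(f"{len(nearby)} tweets near stop, attitude score {score}")
```

The distance filter does the same work as the spatial query in Mr. Milam's example; the far-away taco tweet never enters the tally.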
Matt Barth, Professor of Engineering, University of California, Riverside
Professor Barth described ongoing work at UCR involving in-vehicle sensors that obtain and record vehicle activity (velocity, grade, and fuel consumption, either estimated or actual). Based on these parameters, these devices can estimate a vehicle’s energy consumption and emissions in real time.
Using this framework, Dr. Barth’s team is able to do environmental research using real-time information from vehicles. Dr. Barth shared the kinds of information his team works with, including real-time emissions graphs, cumulative emissions by trip, and comparisons between modeled and measured emissions. He identified the possibility for this kind of real-time information to be accumulated in order to estimate energy consumption and emission for roadway links. It is possible to make a real-time estimation of greenhouse gas emissions based on traffic activity and some evidence-based assumptions about the fleet composition, vehicle activity, and factors related to energy and emissions. Dr. Barth shared a map of real-time estimated GHGs in Los Angeles County.
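The link-level estimation Dr. Barth described can be illustrated with a simple calculation: multiply each link's vehicle-kilometers traveled by a fleet-average emission factor. The links, vehicle counts, and emission factors below are invented placeholders; real work would draw factors from an emissions model such as EMFAC or MOVES.

```python
# Hypothetical roadway links: (link_id, vehicle count, length in km, mean speed in kph)
links = [
    ("I-405 N seg 12", 1800, 1.6, 35),
    ("Wilshire Blvd seg 3", 600, 0.8, 22),
]

# Assumed fleet-average CO2 emission factor in g per vehicle-km as a function of speed.
# Real emission models use far richer speed curves; this placeholder only reflects
# that per-km emissions rise in stop-and-go traffic.
def co2_g_per_veh_km(speed_kph):
    return 180 if speed_kph >= 30 else 250

def link_co2_kg(vehicle_count, length_km, speed_kph):
    """Estimated CO2 for one link over the observation period, in kg."""
    vkt = vehicle_count * length_km          # vehicle-kilometers traveled
    return vkt * co2_g_per_veh_km(speed_kph) / 1000.0

total = sum(link_co2_kg(n, l, s) for _, n, l, s in links)
print(f"Estimated network CO2: {total:.1f} kg")
```

Summing such estimates over every link, updated as traffic counts and speeds stream in, is what yields the kind of real-time county-wide GHG map Dr. Barth showed.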
Dr. Barth also shared some work he has done for the USDOT on “connected vehicle applications,” or ways to take advantage of the fact that it is now possible for vehicles to communicate to each other and to the transportation infrastructure. One concept that he shared from that work is the low-emission zone concept, wherein infrastructure would define and communicate such zones and vehicles would comply by going into low-emissions mode.
Finally, Dr. Barth also presented the variety of mobile sensors that now exist and can be attached to cars to detect a variety of different kinds of pollutants, among other types of information.
Based on the viability of and current developments in mobile air quality sensing, on-board vehicle tracking, and vehicle-based communications, Dr. Barth outlined several possibilities: smog checks could be replaced by wireless, continuous monitoring of vehicle health; real-time estimates of fuel economy could replace the EPA’s current fuel economy estimates; and air quality planning and management could rely more upon real-time measurements and less upon models.
Attendee Questions and Comments
One question concerned probability, risk, and confidence intervals. Does big data set us up to better understand probabilistic dynamics? Mr. Milam commented that big data does give us much easier access than ever before to a good understanding of travel time reliability. A member of the consulting firm Iteris noted that cell phone data providers are not always forthcoming about the shortcomings of their data sets, and that the data can be very sparse for certain links of roadway during certain time periods. Mr. Milam agreed that vendors don’t tell you the drawbacks of their data sets, and offered that in order to get a trip table from cell phone data, his firm performed an estimation and then validated it using license plate surveys. A representative of one of the cell phone data vendors, AirSage, noted that they do get questions about sample size quite regularly, and that although their data set may have drawbacks, it is still much more extensive than a traditional travel survey.
The technical processes of transportation planning are poised to change on a wave of software innovation, big data, and dissatisfaction with the limits of traditional planning approaches. Early forays into the use of big data for transportation planning offer hopeful indications that new techniques can address new and more nuanced questions, model more nuanced behaviors, and that big data can provide much-needed model validation and ex post project evaluation. Likewise, a much more fine-grained understanding of pollution, emissions, and vehicle health stands to be enabled by emerging sensing and tracking technology in vehicles. Transportation planning and scenario planning represent a small market for big data providers, presenting a potential obstacle to further incorporation of these data into planning processes. Is there a role for government to collaborate with big data providers to produce robust, usable statistics?
Goods Movement in the Information Age (Monday 10:30 am)
“We probably paid for [New York City’s smart parking meters] because our parking tickets in New York City could fund lots of technological innovation.”
“Don’t graduate any more planners trying to build sustainable communities without understanding supply chain.”
Anne Landstrom, Principal Advisor, Commercial Group, Moffatt & Nichol
Ms. Landstrom provided a whirlwind, behind-the-scenes tour of what she called “the new e-commerce retail environment.” As her talk made clear, the term refers to the ways in which retailers are changing their operations: 1) to accommodate and encourage more online shopping and door-to-door delivery, and 2) to adjust to changing international and globalized contexts, emerging technology, and increased urbanization.
Ms. Landstrom defined two distinct but related logistics terms, fulfillment and distribution, and discussed each at length. Fulfillment is the process of getting an order to a customer. Amazon fulfillment is perhaps the most important case study today, the archetype, due to the company’s market dominance and the new model it has pioneered. Amazon owns very large “fulfillment centers” close to urban areas (by necessity, because urban areas hold so many of the customers and so much of the consuming). These centers hold shelves upon shelves of all the various products Amazon sells. Workers (in combination with machinery and a variety of automated processes) assemble orders for shipment to customers. Ms. Landstrom noted two key facts. First, holiday fulfillment at Amazon is frenetic: the company hires tens of thousands of temporary workers who work long hours under strict rules, with little leeway to take breaks or call in sick. Second, Amazon’s fulfillment costs are currently running high, at 10.2% of the cost of goods, where the market average is 8%. These high costs can be explained by the fact that Amazon’s fulfillment centers sit near urban areas where land is more expensive, and that Amazon’s goal is market dominance over profitability.
Next, Ms. Landstrom discussed distribution, which is the process of getting product to stores. Walmart is the archetypal example of an enormous and complex distribution process. Like fulfillment, distribution is intensely seasonal, with roughly 20% of retail sales occurring in November and December. Distribution can involve at least three distinct processes. One, product moves from regional warehouses to stores. Two, import redistribution occurs near ports (presumably sending goods onward to regions). Three, ‘pre-distribution allocation’ occurs in ‘cross dock facilities’: long structures in ports where containers are arranged along the two long ends and goods are moved between containers. These cross dock facilities are an intermediate step between sea containers and containers going to domestic rail or truck trailers. Enormous distribution warehouses serve to store product and allow it to be sorted and rearranged. Ms. Landstrom noted that some distribution warehouses, termed “dark warehouses,” are run entirely by robots moving goods around inside, while other warehouses involve a collaboration between workers and robots.
Fran Inman, Senior Vice President, Majestic Realty Co.
Ms. Inman’s company, Majestic Realty, builds large buildings, including stadiums, business parks, and — germane to this panel — logistics facilities. She is active in public policy as a state transportation commissioner serving on the California Transportation Commission. She defined smart logistics planning as “developing and operating facilities that are in high demand,” and noted that cross-dock facilities are an example. She also noted that of the 14 buildings she has built that are greater than 1 million square feet in size, only one has been in California. This, she implied, indicates that regulation in California is too strict. She said that SB375, the state law that encourages infill development and coordinated planning of transportation and land use, ignores commercial development, which concerned her. This hinted at the discussion of the “sprawl of commercial land” that Dr. Wachs would later provoke in the Q and A.
Alison Bird, Environmental Manager, FedEx Express
Ms. Bird provided a thorough view of FedEx, a key link in the logistics chain. Among other roles, FedEx and UPS are involved in fulfillment, the growing business of getting orders to customers, more and more of whom are shopping online.
As a framing introduction, Ms. Bird noted that “FedEx is everybody’s Scope 3 emissions,” alluding to the ‘Scopes’ used in formal greenhouse gas inventories and management. As Ms. Bird narrated it, because FedEx customers wanted to understand their carbon footprint, FedEx started tracking and understanding its fuel consumption and efficiency. Immediately upon setting up a performance tracking system, FedEx began to learn “how much [we] didn’t know about [our] company.” A number of changes came from this, including the purchase of new fuel efficient planes, a total attitude shift about having a nonhomogeneous truck fleet, and electrification of vehicles. Ms. Bird said that it was somewhat difficult for the company to adjust to the new data-driven environment with regard to performance. The culmination of many efficiency-seeking changes over the past several years has been a decoupling of revenues from carbon, quite a significant achievement according to Ms. Bird.
Ms. Bird concluded by echoing the need for public sector leadership on matters related to logistics, and pointed out that even with all the technology being developed and used today, we still have airplanes using radar and bridges collapsing.
Attendee Questions and Comments
Many hands were raised during this question-and-answer session, and audience interest in the topic was high. One attendee asked: what is the most important thing that policymakers need to know about logistics? Panel members responded that goods movement is an afterthought in transportation planning. MAP-21 is the first transportation bill to address goods movement, but it is a short bill and provided no funding to address goods movement issues.
There was a discussion about road space and curb space, and how goods movement competes with other uses and users. Alison Yoh described the PierPass program at the ports of Los Angeles and Long Beach, which uses differentiated pricing to encourage trucks to travel at off-peak hours. Ms. Bird from FedEx noted that the planning and pricing of curb space have not considered delivery and loading very thoroughly. She also noted that at FedEx, drivers are penalized far more for missing a delivery time than for getting a parking ticket, and that the company is essentially insensitive to parking tickets.
Finally, Martin Wachs provoked a discussion surrounding the question, “What is the relationship between smart growth and commercial sprawl?” He noted that many urban infill projects are enabled by vacant parcels and rights-of-way left behind by commercial transportation or commercial facilities (example: the High Line in New York City). He asked: how can we change the current dynamic, wherein commercial activities sprawl outward? What regulations might come down for commercial development? Centralized goods movement is more efficient, but there is currently no political will for it. The panelists nodded along to all this. Ms. Inman said that planning schools shouldn’t graduate any more planners trying to build sustainable communities without understanding supply chain.
Communications technology (the internet, ubiquity of online retail) has changed the way people buy goods. The process of moving goods is highly complex and interacts quite a bit with urban policy, in ways planners often overlook.
Using Technology to Enhance and Diversify Transit Services (Monday 1:30 PM)
“Can we integrate all these new modes and services with the old?”
“Next-day adjustments are possible in response to complaints.”
“Real-time transit information makes people feel more safe because they can come out 1-2 minutes before their bus arrives.”
Roger Teal, President, DemandTrans Solutions
Dr. Teal discussed the past, present, and future of transit data and the applications that employ it. First, he gave a sense of where things stand now. There are many new opportunities to deliver transit information to users (and “potential users,” Dr. Teal was careful to note). These include real-time routing applications (e.g. Google Transit directions) and real-time service information (e.g. NextBus). Key factors creating these opportunities are the ubiquity of the internet, mobile devices, and GPS, along with an ecosystem of application developers and development platforms such as Android and iOS. The other linchpin in bringing this information to consumers is the General Transit Feed Specification (GTFS), which enables transit agencies to publish their fixed-route and real-time data for consumption by developers and applications. Dr. Teal gave a short, recent history of the GTFS. In the early 2000s, transit agencies were developing their own online trip planners. At the same time, Google developed Google Transit, which came to dominate over agency-developed applications. Google also published the GTFS specification around this time. Agencies soon realized it was easier to publish their data in the GTFS format and let outside parties develop routing applications, and this became and remains the dominant practice. A few years ago, Google added specifications for real-time transit data to the GTFS.
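Part of what made the GTFS so easy for agencies to adopt is its simplicity: a feed is just a zip archive of comma-separated text files (stops.txt, routes.txt, trips.txt, stop_times.txt, and so on). As a rough illustration, the Python sketch below parses stop locations from a feed; the tiny two-stop feed is constructed in memory for the example, where a real one would be downloaded from an agency’s published URL, and the stop names are invented:

```python
import csv
import io
import zipfile

def load_stops(feed_bytes):
    """Return {stop_id: (lat, lon)} from a GTFS feed's stops.txt."""
    with zipfile.ZipFile(io.BytesIO(feed_bytes)) as feed:
        with feed.open("stops.txt") as f:
            reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8-sig"))
            return {row["stop_id"]: (float(row["stop_lat"]), float(row["stop_lon"]))
                    for row in reader}

# Build a minimal two-stop feed in memory for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as feed:
    feed.writestr("stops.txt",
                  "stop_id,stop_name,stop_lat,stop_lon\n"
                  "S1,Main St,34.05,-118.25\n"
                  "S2,2nd Ave,34.06,-118.24\n")

stops = load_stops(buf.getvalue())
print(stops["S1"])  # (34.05, -118.25)
```

Because the format is plain CSV in a zip, any developer with a standard library can consume a feed, which is much of why third-party routing applications displaced agency-built ones.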
Dr. Teal provided a brief discussion of the implications of these transit applications. Limited evidence exists, and it indicates that while these applications probably do not result in significant increases in ridership, they do have an impact on user perception of quality of service. Mobile devices and transit information applications may create an opportunity for flexible transit services. As opposed to fixed-route transit services, these flexible transit services take requested pick-up origins and destinations as inputs and in response, determine the optimal sequence of pick-ups and drop-offs as well as the route. Historically, paratransit has constituted the bulk of flexible transit, with relatively simple booking and routing systems. Dallas and Denver are exploring smart phone booking capability and real-time integration with fixed-route transit schedules, though these efforts are still nascent.
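The routing problem at the heart of flexible transit is a small “dial-a-ride” instance: given requested pickups and drop-offs, find the shortest visiting order in which every pickup precedes its own drop-off. The sketch below brute-forces a tiny instance to make the structure of the problem concrete; production systems use heuristics rather than enumeration, and all coordinates here are invented:

```python
from itertools import permutations

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def best_sequence(start, requests):
    """requests: list of (pickup, dropoff) coordinate pairs.
    Returns (visiting order, total distance) for the shortest order
    in which each pickup precedes its own dropoff."""
    stops = []
    for i, (pickup, dropoff) in enumerate(requests):
        stops.append(("pick", i, pickup))
        stops.append(("drop", i, dropoff))
    best_order, best_len = None, float("inf")
    for order in permutations(stops):
        # Discard orders that drop a rider off before picking them up.
        picked, valid = set(), True
        for kind, i, _ in order:
            if kind == "pick":
                picked.add(i)
            elif i not in picked:
                valid = False
                break
        if not valid:
            continue
        total, pos = 0.0, start
        for _, _, point in order:
            total += dist(pos, point)
            pos = point
        if total < best_len:
            best_order, best_len = order, total
    return [(kind, i) for kind, i, _ in best_order], best_len

# Two overlapping requests along a line: interleaving them is shortest.
seq, length = best_sequence((0, 0), [((0, 0), (2, 0)), ((1, 0), (3, 0))])
print(seq, length)
```

The optimal answer picks up both riders before dropping either off, which is exactly the kind of pooling that distinguishes flexible transit from point-to-point taxi service.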
To conclude, Dr. Teal noted that although commercial revenue for these applications is limited, agencies should remember that they do own their data, and that it does have value. Agencies should partner with developers to create the most compelling applications. It may make sense for agencies to keep some data proprietary and work with selected firms to develop applications while at the same time publishing much of their data to the GTFS for open use.
Susan Shaheen, Adjunct Professor, Civil and Environmental Engineering, UC Berkeley
Dr. Shaheen’s talk concerned all the “sharing” transportation services – ridesharing, bikesharing, and carsharing – and how these interact with public transit. First, she discussed the broader concept of the “sharing economy.” She cited the range of things that can be shared, such as housing (AirBnB), bikes (Citibike), chores (TaskRabbit), and digital video (Netflix). General principles of the sharing economy: (1) sharing trumps possession, or in other words, access trumps ownership, (2) whether sharing makes financial sense depends upon cost and convenience, and (3)
Focusing on shared transportation services, she presented a typology of the various services now offered. With regard to cars alone, there are at least three models for sharing. Zipcar is the prototypical “round trip” carshare service. Peer-to-peer ridesharing is typified by Lyft, Uber, and Sidecar. There is also one-way carsharing, typified by Enterprise Carshare or Hertz on Demand. Finally, there is a fractional ownership model for car sharing, which has yet to see wide adoption due to insurance-related barriers. Dr. Shaheen shared data showing that carsharing memberships and vehicle fleets have been growing exponentially since 2002. With regard to bikes, again multiple models for sharing exist. Public bikesharing services are the most well known; these are the city-operated systems such as Montreal’s Bixi, New York City’s Citibike, Paris’s Velib, and many others. Closed community bikesharing, on the other hand, refers to bikes that are only available to members of, for example, a corporation or university. There are at least 15 such systems throughout the US, with the shared bikes at Google, Apple, and Facebook being well-known examples. Finally, Dr. Shaheen noted that peer-to-peer models for bikesharing are emerging. Bitlock and Spinster are two companies trying to enable peer-to-peer bikesharing.
To conclude, Dr. Shaheen addressed the relationship between these services and public transit. Surveys of bikeshare users indicate that bikesharing is both a complement to and a competitor for public transit. 95% of users perceive bikeshare as enhancing transit, but there is also some evidence that people use bikeshare to take trips that they previously took by public transit. Dr. Shaheen concluded by inquiring about a “shared mobility vision.” With the proliferation of transportation services, will there be an integrated user experience, both in mobile applications and on the street itself? There are a handful of people — like the company RideScout — pushing toward this vision, but it’s uncertain if it will be realized.
Conan Cheung, Deputy Executive Officer of Service Planning and Development, LA County Metro
Mr. Cheung discussed transit service changes, with an emphasis on technical innovations and real-time data. He began by providing an overview of what is meant by transit service changes or adjustments. These are changes to the routing and frequency of transit lines, and they are made in response to changes in the operating environment or changes in the spatial and temporal distribution of transit demand. Mr. Cheung contrasted “major overhauls,” conducted infrequently and impacting the entire transit system, with “minor adjustments,” conducted frequently at the level of individual lines. He gave an example of a major overhaul he had overseen in San Diego, showing maps of system frequencies and some images of the public involvement process.
Mr. Cheung then moved to a discussion of recent service changes at Metro, which provided a window into two major ways in which technology impacts transit service adjustments. First, communications technology has changed the outreach process and the ability of transit riders to engage with agencies. Metro makes use of social media, mobile applications, Nextrip, and its website to reach riders as well as Metro operators in new ways. Real-time communications from these sources can motivate rapid, incremental adjustments to specific services. Second, increased amounts of automatically collected transit data enable agencies to better understand trip making patterns and better optimize across multiple transit lines and performance goals. When making service changes on the Silver Line, Metro staff were able to break down ridership by line to see what riders would be impacted. They can also infer origin-destination data from Universal Fare Collection media (e.g. Tap cards) and make service changes sensitive to this data. Mr. Cheung also noted that contemporary scheduling and dispatch systems make it possible for Metro to make schedule adjustments in rapid response to details like changes in school start and end times. He concluded by saying that when an agency engages in many small changes, these act as preventative maintenance and stave off the need for major overhauls.
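The origin-destination inference Mr. Cheung alluded to is commonly done with a trip-chaining heuristic for tap-in-only fare systems: assume each trip’s destination is near the rider’s next boarding location, and that the last trip of the day returns to the first origin. A minimal sketch of that heuristic follows; the tap records are invented, and a real pipeline would also filter by time gaps and stop spacing:

```python
from collections import Counter, defaultdict

def infer_od(taps):
    """taps: iterable of (card_id, timestamp, stop_id) tap-in records.
    Returns a Counter of inferred (origin, destination) trip pairs."""
    by_card = defaultdict(list)
    for card, time, stop in taps:
        by_card[card].append((time, stop))

    od = Counter()
    for events in by_card.values():
        events.sort()  # chronological order per card
        stops = [stop for _, stop in events]
        for i, origin in enumerate(stops):
            # Next boarding is the assumed destination; the last trip
            # wraps around to the day's first boarding location.
            dest = stops[(i + 1) % len(stops)]
            if dest != origin:
                od[(origin, dest)] += 1
    return od

taps = [("A", 1, "S1"), ("A", 2, "S2"), ("A", 3, "S1"),
        ("B", 1, "S1"), ("B", 2, "S3")]
od = infer_od(taps)
print(od.most_common())
```

Aggregated over millions of taps, counts like these give an agency the demand matrix against which service changes can be evaluated.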
Attendee Questions and Comments
Someone asked about safety issues with shared transportation services. Dr. Shaheen noted that she is working with Guy Fraker to commission a large study regarding the right level and form of insurance for the shared economy. Roger Teal noted again that real-time transit information makes people feel safer because they do not have to spend long periods of time at stations or stops waiting for a vehicle. Another attendee asked about the possibility that car sharing services could increase VMT by increasing access to cars and the convenience of car use. Dr. Shaheen answered that the data do not support this. Car2Go “drops 400 cars at a pop” when it opens services in a new area, and researchers have not observed increased VMT following these roll-outs. Relatedly, she noted that access to on-street parking is a major constraint for car sharing. The City of San Francisco recently allocated 900 parking spaces for shared cars. The question of on-street parking returned Dr. Shaheen to the vision of integrating all the various mobility services and modes in the street and at the curb. NACTO could play a role in achieving this vision, but it is not playing that role yet: none of the NACTO Design Guides published so far have addressed shared mobility services.
Technological developments are spurring changes in the ways that riders interact with transit services, as well as changes in the portfolio of mobility services, particularly shared mobility services, on offer today. Transit’s position in this portfolio may change as new shared mobility services complement and compete with transit. Government’s attempts to participate in these changes have been a mixed bag. Transit agencies have fumbled by attempting to create their own transit routing applications. On the other hand, LA Metro has found success in using technology to inform service changes. What role should government play? What role can government play well? These continue to be major questions. Some are contending that one answer is that government can do a better job of allocating scarce roadway and curb space to accommodate a new portfolio of mobility services.
The Road to Connected and Automated Cars: Rocky or Smooth? (Monday 8:00 PM)
“Never use the term driverless cars”
“Not within the lifetime of anybody in this room”
[The car as] “driver whisperer”
“Our very modest vision is to transform the way society moves”
“The vehicle industry doesn’t move the way Silicon Valley moves” (Steven Shladover)
Moderator Andre Boutros began by acknowledging the hype: the level of interest in this subject, the extent to which journalists and other commentators have speculated about ‘driverless cars,’ and the high level of interest by symposium attendees.
Steven Shladover, Research Engineer, UC Berkeley
Dr. Shladover proceeded to largely ignore and ultimately refute the dominant view that mass adoption of automated vehicles is just over the horizon and will lead to sweeping changes in transportation systems. His introduction set a tone of skepticism about this dominant view that went unchallenged for the remainder of the session. He reminded the audience that in 1939, General Motors presented a “Futurama” exhibit with a mock-up of an automated car and the promise that it would exist within a few decades. At regular intervals since then, the automated car has resurfaced as an object of aspiration. The implication: people have long loved to dream about the automated car, and these dreams have not necessarily had any connection to what is technically feasible or to what the true social impacts of such a technology would be.
Dr. Shladover then proceeded with a detailed overview of automated vehicle technology. He gave a description of the various levels of automation as they are officially classified. He showed videos and photos of cars, trucks, and transit vehicles employing partial automation at each of the levels that are currently feasible. Finally, he discussed the prospects for higher levels of automation, and enumerated the many barriers to mass adoption of cars employing full automation.
First, Dr. Shladover noted the difference between automation (the capacity to operate without human involvement) and autonomy (the quality of operating independently from other people or entities). Cooperation is the opposite of autonomy, and Dr. Shladover stressed throughout his presentation that many of the benefits of automated cars only come with cooperation. He noted that Google’s goal is to build cars that can operate with high automation and high autonomy, so that each individual car operates without a driver and independent of other cars and systems.
Cooperation refers to vehicle-to-vehicle and vehicle-to-infrastructure communication and the kinds of actions and systems that are made possible by such communication. Dr. Shladover provided dramatic illustration of the power of cooperation. He showed videos of automated vehicles driving down a highway. Without cooperation, when the lead vehicle brakes, a shockwave propagates backwards down the highway as each vehicle behind it sees only the brake lights of the one immediately in front of it. With cooperation, the fact of the lead vehicle braking is immediately communicated to successive vehicles who can then slow down in unison. Dr. Shladover noted that there is an imminent decision pending wherein NHTSA may require cooperation: automated vehicles may be required to transmit cooperative collision warnings.
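The braking shockwave Dr. Shladover illustrated can be reduced to simple arithmetic: without cooperation, each driver reacts only to the brake lights of the car directly ahead, so response delays compound down the platoon, whereas with V2V broadcast every vehicle learns of the braking almost simultaneously. The toy model below makes this concrete; the reaction delay and communication latency values are assumptions for illustration, not figures from the talk:

```python
def braking_start_times(n_cars, reaction_delay, cooperative, comm_latency=0.1):
    """Seconds after t=0 (lead car brakes) at which each car starts braking.
    Without cooperation, each follower waits one reaction delay after the
    car ahead; with V2V, all followers hear the broadcast at once."""
    if cooperative:
        return [0.0] + [comm_latency] * (n_cars - 1)
    return [i * reaction_delay for i in range(n_cars)]

print(braking_start_times(5, 1.0, cooperative=False))  # [0.0, 1.0, 2.0, 3.0, 4.0]
print(braking_start_times(5, 1.0, cooperative=True))   # [0.0, 0.1, 0.1, 0.1, 0.1]
```

In the uncooperative case the fifth car begins braking four seconds after the leader, which at highway speed means well over a hundred meters traveled before its brakes engage; with broadcast, the whole platoon slows nearly in unison.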
Dr. Shladover presented the levels of automation as they have been variously defined by the National Highway Traffic Safety Administration (NHTSA) and the Society of Automotive Engineers International. There are 4 or 5 levels of automation, depending on which of these two systems one references. Level 1 automation is exemplified by features such as adaptive cruise control and lane keeping assistance, which are on the market today. Level 3, to cite a mid-level example, includes a “traffic jam pilot” that would drive the car at slow speeds in congested highway conditions. It allows the driver to do other things (read, surf the web, text) but requires that the driver always remain available to take the wheel. Finally, at the highest automation levels, no driver is needed at all. For each level of automation, he provided details on the timeline of market implementation, possible benefits and drawbacks, and possible uses for the technology. In particular, he noted that automated transit vehicles may be more feasible and imminent than private vehicles at similar levels of automation.
In the final analysis, he concluded that full automation is “not within the lifetime of anybody in this room.” Extensive testing will be needed to demonstrate that fully automated vehicles are significantly safer than today’s safety baseline. Such testing will take many millions of miles and hours to demonstrate that the vehicles can deal with the variety of real-world safety situations and that the computer system does not fail. For now, according to Dr. Shladover, policymakers should focus on providing the technology for cooperation as well as on getting the earliest public benefits from automation by focusing on transit and trucking applications.
Richard Pelletier, Lead Design Strategist, BMW DesignWorks USA
Mr. Pelletier offered a view of automated cars from private industry. As someone who designs cars, he looks into the future to anticipate technological developments, write patents, and think about where the market is going. These activities have given him cause to interact with and consider automated vehicles.
First, Mr. Pelletier explained the relationship between BMW DesignWorks and BMW as a whole. DesignWorks is a separate entity that does work not only for BMW but also for other clients. Mr. Pelletier cited as examples John Deere and the Bay Area Rapid Transit Authority (BART), for whom his group created new designs for the interior of train cars. About half of his work is for BMW, and about half is for outside clients.
Mr. Pelletier focused on the concept of human-robot interactions, and cited Donald Norman’s work on user experience as an influence. Although much of the conversation about automated vehicles is about removing the driver, Mr. Pelletier and BMW are very much concerned with providing a compelling driver experience. As such, Mr. Pelletier talked about the possibility of calibrating systems like steering, the brake and gas pedals, and other kinds of actuators so that the driver’s interaction with them is enjoyable and responsive. One version of this might be called the “car as driver whisperer,” wherein many systems are automated, but the car compels the driver to engage in a way that gives the driver the experience of power and independence and control.
Mr. Pelletier did admit that he could not discuss much of his work related to automated cars, presumably because of the imperative of corporate secrecy in a competitive environment. What he could present was a patent he filed for in Europe in partnership with BMW. This patent exemplified what human-robot interactions might look like when vehicles are very technologically advanced and have the capability to drive themselves. Rather than remove the driver entirely, the patent describes a list of “maneuvers” that can be issued by the driver. These are things like “change lanes,” or “exit the highway,” to cite some examples. Mr. Pelletier explained that BMW drivers are a specific kind of person, who tends to be a control freak. These drivers need to be involved even if the car has the capacity to exclude them. In discussing his European patent application, Mr. Pelletier relayed that a possible role of government in the development of automated cars would be to define “global technical protocols” for vehicle-to-vehicle communications such as the broadcasting of turn signals (in the near term) and the broadcasting of other kinds of maneuvers in the long term.
Jesse Glazer, ITS and Operations Engineer, Federal Highway Administration
Mr. Glazer presented a talk originally prepared by James Pol of the ITS Joint Program Office at the USDOT. It concerned the federal government’s involvement in Intelligent Transportation Systems (ITS).
First, Mr. Glazer situated the ITS Joint Program Office (JPO) within the larger hierarchy of the federal government. The JPO coordinates with the “modal” federal administrations whose work concerns transit, highways, air travel, and other modes of travel. The JPO is a part of the Research and Innovative Technology Administration, which in turn is part of USDOT.
Mr. Glazer then noted some broad trends in the realm of ITS. One has been the increased involvement of the private sector and the public at large. Previously, intelligent transportation systems had been planned, owned, and operated by the public sector. Now, they are much more multilateral. Concurrently, deployment of ITS is more mainstream than ever, partly because the private sector can play roles that were inconceivable only recently.
Next, Mr. Glazer homed in on how the federal government is engaging with connected and automated vehicles. There is a Connected Vehicle Safety Pilot ongoing in Ann Arbor, Michigan wherein over 2,800 vehicles are equipped with a variety of vehicle-to-vehicle and vehicle-to-infrastructure sensors. The vehicles are driving on 73 miles of “instrumented roadway,” which is to say roadway that can communicate with the vehicles. NHTSA will examine one year’s worth of data from these light-duty vehicles and will make a decision in 2013 to regulate V2V and V2I technology or to take nonregulatory action, like issuing safety ratings. NHTSA has plans to make an analogous decision regarding heavy vehicles in 2014, and FHWA plans to release deployment guidelines in 2015.
Finally, Mr. Glazer shared some details about the next ITS Strategic Plan, which will cover the years 2015-2019. This Plan must be responsive to the following contemporary conditions: the increased feasibility of various levels of automation in cars, the goals of the most recent transportation bill (MAP-21), and an increased awareness of the relationship between transportation and the economy. Mr. Glazer shared six draft focus areas for the plan. One of these was “interoperability,” which refers to the importance of cooperation articulated by Dr. Shladover earlier in the session.
Attendee Questions and Comments
Several questions concerned the role of government. How can infrastructure be smarter? One panelist offered that where infrastructure currently communicates through light in the most abstract sense (e.g. paint as a visual marker, bulbs in signals), in the future infrastructure could broadcast other kinds of signals. Another panelist noted that the California Department of Motor Vehicles is currently developing regulations regarding the testing and operation of automated and autonomous vehicles. Someone commented that this role should really go to NHTSA, but NHTSA has been slow to act.
One attendee responded to Dr. Shladover’s claim that the low-hanging fruit for automation is in trucks and transit. What about the fact that the consequences of a crash are more severe for these modes? Dr. Shladover noted that there is currently a truck platooning study being conducted at Berkeley PATH, funded by USDOT; this should address concerns about crashes.
Does automation make driving more attractive? asked one attendee. Dr. Shladover noted that the technology is somewhat mode-neutral in the sense that it can also be used on transit vehicles.
The question and answer session became another opportunity for panelists to underscore the gap between the way that automated vehicles are discussed in the popular media and the real emerging issues they present. One panelist noted that driverless cars have become a distraction from the technology that currently exists, like assisted lane keeping and blind side warning systems. It has become more difficult to engage government agencies in a conversation about what their role should be in facilitating or regulating this technology because the conversation keeps slipping to “driverless cars.”
While it’s true that Google stimulated innovation by devoting energy and attention to the autonomous vehicle, it’s also the case that the technology is decades away from the mass market, and that the hyperbole around the driverless car distracts from much more imminent opportunities for government to regulate or otherwise facilitate public benefit. Ensuring that automated vehicles can communicate with each other and with infrastructure is a key role for government. Humans will continue to be in the driver’s seat, albeit with roles that may change in response to increased automation, for decades to come.
Technology-Enabled Policies: Moving to Markets and Pricing (Tuesday 8:30 AM)
“If you’re going to have a data-driven process then you need data” (Alex)
Spoken by a planner about IT people: “I like acronyms, they like acronyms”
“We [deliberately] make our vehicles disconnected to address privacy issues… just because a technology exists doesn’t mean you should use it.” (Ryan)
Intro: Susan Binder
Jack Opiola, President, D’Artagnan Consulting
Although Mr. Opiola’s presentation was part of a session on “technology-enabled policies,” his discussion of road usage charges (RUC), often called VMT fees, actually contained a number of points about funding realities and projected revenues that were largely independent of technological innovation. Further, one of Mr. Opiola’s central suggestions was to adopt a “low tech” implementation of RUC that simply involves checking vehicle odometers periodically.
He told a story about the decline of the gas tax and the prospect that road usage charges could step into the funding gap. Due to regulation, industry innovation, and market demand, vehicles are much more fuel efficient than they have been in the past, and the vehicle fleet stands to continue to grow more fuel efficient with increased market share for electric vehicles. As a result, since 1970, revenue from the gas tax, the major source of funding for transportation infrastructure, has remained flat while population and vehicle miles traveled have grown. Federal, state, and local government agencies have been examining alternative revenue sources for some time. Mr. Opiola worked on a team that examined this matter for SCAG. They found that VMT-based pricing would raise more revenue than alternatives such as cordon pricing and various forms of toll lanes.
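The revenue squeeze is easy to see with back-of-the-envelope arithmetic: per-gallon tax receipts fall as fuel economy improves, while a per-mile charge is unaffected. All figures in the sketch below are illustrative assumptions, not numbers from the talk:

```python
def annual_revenue_per_vehicle(vmt, mpg, gas_tax_per_gal, ruc_per_mile):
    """Yearly revenue from one vehicle under a per-gallon gas tax
    versus a per-mile road usage charge."""
    gas_tax = vmt / mpg * gas_tax_per_gal
    ruc = vmt * ruc_per_mile
    return gas_tax, ruc

# Assumed: 12,000 miles/year, $0.50/gallon combined gas tax, 1.5 cents/mile RUC.
for mpg in (20, 35, 50):
    gas, ruc = annual_revenue_per_vehicle(12000, mpg, 0.50, 0.015)
    print(f"{mpg} mpg: gas tax ${gas:.0f}, RUC ${ruc:.0f}")
```

Under these assumptions, as a vehicle's fuel economy climbs from 20 to 50 mpg, annual gas tax receipts fall from $300 to $120 while the road usage charge holds steady at $180: exactly the stability argument made for RUC.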
Mr. Opiola presented five common objections to RUC, and offered a data-based rebuttal to each one. (1) RUC is too complicated and expensive to operate: Mr. Opiola presented results from a financial viability study he worked on for the Oregon Department of Transportation that showed that administrative costs as a percent of revenue could be as low as 3.3%, and that these costs drop as more vehicles are added to the RUC system. (2) RUC is inequitable to rural drivers: Mr. Opiola cited data from the Government Accountability Office to argue that while rural drivers do drive further per trip, their yearly VMT is about the same as that of urban and suburban drivers. (3) The technology to implement RUC invades driver privacy: Mr. Opiola showed results from a survey conducted by the Minnesota Department of Transportation, which found that respondents prefer low-tech RUC implementations — such as periodic odometer readings — over high-tech ones, like in-vehicle GPS readers. (4) There’s no business case for RUC: Mr. Opiola’s entire presentation had rebutted this. (5) No politician is brave enough to adopt RUC: Mr. Opiola introduced the audience to the Western Road Usage Charge Consortium, a group of state and local departments of transportation working to encourage the development of a regional, and perhaps eventually interregional, RUC system.
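The low-tech, odometer-based version of RUC that Mr. Opiola favored is simple enough to sketch in a few lines. In the sketch below, the per-mile rate is invented for illustration; only the 3.3% administrative-cost share echoes the Oregon study figure he cited.

```python
def ruc_invoice(prev_odometer_miles, curr_odometer_miles,
                rate_per_mile=0.015, admin_cost_share=0.033):
    """Compute a simple road usage charge from two odometer readings.

    rate_per_mile is an illustrative assumption, not a figure from any
    actual RUC program; admin_cost_share reflects the ~3.3% of revenue
    reported in the Oregon financial viability study.
    """
    miles = curr_odometer_miles - prev_odometer_miles
    if miles < 0:
        raise ValueError("odometer reading decreased; check for error")
    gross = miles * rate_per_mile          # charge before overhead
    admin = gross * admin_cost_share       # cost of administering the fee
    return {"miles": miles,
            "gross_charge": round(gross, 2),
            "admin_cost": round(admin, 2),
            "net_revenue": round(gross - admin, 2)}

# Example: 12,000 miles driven between two periodic readings
print(ruc_invoice(50_000, 62_000))
```

Because the inputs are just two odometer readings, no GPS, cellular link, or in-vehicle device is required, which is exactly the privacy argument for the low-tech option.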
Ryan Morrison, President, True Mileage, Inc.
Mr. Morrison discussed another kind of road user charge, this one primarily the concern of private industry: the mileage-based insurance fee (also known as usage-based insurance, or UBI). Some conceptual background: auto insurance premiums are determined by actuarial analyses of the risk of the insured getting into a crash. Many variables are considered in setting the premium: for example, driving record, whether the vehicle is used for personal matters or business, and where geographically the vehicle is housed. The idea behind UBI is to add new variables to this list, chief among them the number of miles the vehicle is driven. The more miles a person drives, the greater his or her exposure to the risk of being in a crash. Premiums set based on miles driven create an incentive for drivers to minimize risk by driving less, which reduces actual risk and, in turn, claims costs for insurers (a win-win). Mr. Morrison explained that insurance companies compete for customers, and that each wants the low-risk customers.
UBI has faced barriers to implementation, many of them widely understood to be a matter of technological feasibility. Mr. Morrison described UBI as it exists today. Drivers can get discounts of up to 30% off the cost of their premium based on three factors (the combination depending on the state and the insurer): mileage, time of day, and hard braking. Mr. Morrison explained the latter two variables: crash risk goes up at night, and hard braking is associated with higher crash risk. Members of the public often have difficulty understanding why they should be rewarded for not braking, because braking is perceived as a safety measure used in emergency situations. However, hard braking is associated with other unsafe behaviors, such as short following distances and speeding. Mr. Morrison cited a Brookings Institution report that found that VMT drops about 8% when drivers use UBI.
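As a rough illustration of how the three factors might combine into a premium discount, here is a minimal sketch. All of the weights and thresholds are invented for illustration; real insurers fit them from actuarial data, and the variables allowed vary by state. Only the 30% cap comes from the talk.

```python
def ubi_discount(annual_miles, night_share, hard_brakes_per_100mi,
                 max_discount=0.30):
    """Illustrative usage-based insurance discount.

    The weights and thresholds below are hypothetical; an actual
    insurer would derive them from actuarial analysis, and some
    states restrict which variables may be used at all.
    """
    # Low mileage earns up to 20 points; ~12,000 mi/yr or more earns none.
    mileage_pts = max(0.0, 0.20 * (1 - annual_miles / 12_000))
    # Little night driving earns up to 5 points (night driving is riskier).
    night_pts = max(0.0, 0.05 * (1 - night_share / 0.25))
    # Few hard-braking events earn up to 5 points.
    braking_pts = max(0.0, 0.05 * (1 - hard_brakes_per_100mi / 5))
    return round(min(max_discount, mileage_pts + night_pts + braking_pts), 3)

# A 6,000 mi/yr driver, 10% night driving, 1 hard brake per 100 miles
print(ubi_discount(6_000, 0.10, 1))
```

The structure makes the incentive visible: every factor rewards lower-risk behavior, and the discount is capped so the insurer never gives away more than the maximum.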
The situation in California is unique, due to state regulations that are very specific in listing and ranking the variables that can be included in the calculation of insurance premiums. California mandates that mileage be the second most consequential variable in the calculation, and excludes time of day and hard braking altogether. Mr. Morrison said that California might be better served by opening the door to other variables. He also briefly touched on the main implementation of UBI in the U.S. right now, Progressive’s Snapshot program. Progressive installs a device in your car for six months, and then gives you a personalized discount based on your miles driven and other behavioral variables. The limited six-month window is a consequence of the fact that the technology is expensive. A shortcoming of this approach is that discounts end up much smaller than they could be, because there is only a six-month observation period and no opportunity for the driver to continually reduce his or her risk with data feedback.
Mr. Morrison’s company, True Mileage, builds devices that insurers can use to implement UBI. One of the key features Mr. Morrison highlighted is that the new generation of devices can pre-process data, and need not include GPS units or cellular modems that can track the vehicle’s location. This addresses a key privacy concern for many users. It also reduces the cost of implementing these systems. Mr. Morrison had props, and he demonstrated how the True Mileage devices work. A small device plugs into your car’s OBD-II port. Periodically, you can then load data from the device and send it to your insurer. Mr. Morrison showed how you can do this on your smartphone (by holding it next to the device, not even connecting it!) and then view the data summary and send it along. He also elicited a big ‘ooh’ from the audience when he showed technology they are currently developing to allow users to load data onto a ‘smart’ postcard which could then be mailed to the insurance company. You slot the postcard into the device and – bing! – the data is loaded.
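The privacy-by-design approach Mr. Morrison described (pre-processing on the device so that only aggregates ever leave it) can be sketched as follows. The record fields and summary keys are hypothetical, not True Mileage’s actual data format; the point is that any location trace is discarded rather than transmitted.

```python
from dataclasses import dataclass, field

@dataclass
class TripRecord:
    """One trip as logged on the device. Field names are hypothetical."""
    miles: float
    started_at_night: bool
    hard_brakes: int
    gps_trace: list = field(default_factory=list)  # raw points, never sent

def preprocess_on_device(trips):
    """Aggregate raw trip logs into the summary the insurer receives.

    A sketch of on-device pre-processing: only totals leave the
    device, so the insurer can price risk without ever seeing
    where the vehicle has been.
    """
    summary = {
        "total_miles": sum(t.miles for t in trips),
        "night_trips": sum(t.started_at_night for t in trips),
        "hard_brakes": sum(t.hard_brakes for t in trips),
    }
    # gps_trace is deliberately omitted: not summarized, not transmitted.
    return summary
```

As Mr. Morrison put it, just because a technology exists doesn’t mean you should use it; here the decision not to transmit location is what makes the product palatable.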
Alex Demisch, Senior Analyst – SFPark, San Francisco Municipal Transportation Agency
Mr. Demisch gave an overview of the SFPark program, with emphasis on the technical challenges and the lessons for how governments interact with data and information technology. SFPark is a federally funded pilot program to implement demand-responsive parking pricing. In essence, meter rates are adjusted up and down so that there is always about one space available on every block (roughly 85% occupancy). If occupancy rates are too high, prices go up. If very few people are parking, prices go down. The idea of demand-based pricing for parking has existed for some time – see Donald Shoup’s 2005 book The High Cost of Free Parking – but SFPark is the first major roll-out in a large city. Further, SFPark hinged on hot-off-the-press parking sensor technology.
First, Mr. Demisch introduced the basic components of SFPark. It was undertaken by the San Francisco Municipal Transportation Agency (SFMTA), which Mr. Demisch emphasized is an all-in-one transportation agency, with authority over all the surface modes of travel, and with the additional specific advantage that San Francisco is both a city and a county. He displayed a map showing the neighborhoods where the SFPark pilot took place. SFPark also covered fourteen of the twenty parking garages owned by SFMTA. Mr. Demisch listed the necessary infrastructure, both physical and technical, that had to be in place in order to launch SFPark. The city had to invest in new meters, which could display changeable rates and be connected to an IT system. It also had to conduct a census of every parking space in the pilot areas. Parking sensors (magnetometers) were installed below each space. A publicly available data feed was produced, which mobile app developers could draw upon. The agency created a table determining the correspondence between occupancy ranges and price adjustments. Mr. Demisch then showed the results of putting all of those pieces together: demand-responsive rates displaying considerable variation from block to block.
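The table mapping occupancy ranges to price adjustments amounts to a simple rule, which can be sketched as follows. The occupancy bands, $0.25 steps, and rate floor and ceiling below are illustrative approximations, not the pilot’s official figures.

```python
def adjust_meter_rate(current_rate, occupancy,
                      min_rate=0.25, max_rate=6.00):
    """Demand-responsive rate adjustment in the spirit of SFPark.

    occupancy is the observed share of occupied spaces on a block
    (0.0 to 1.0). Bands and step sizes are illustrative assumptions.
    """
    if occupancy >= 0.80:        # too full: raise the price
        delta = 0.25
    elif occupancy >= 0.60:      # near the ~85% target: hold steady
        delta = 0.00
    elif occupancy >= 0.30:      # underused: lower the price
        delta = -0.25
    else:                        # nearly empty: lower it faster
        delta = -0.50
    # Clamp to the allowed range so rates never drift out of bounds.
    return round(min(max_rate, max(min_rate, current_rate + delta)), 2)

print(adjust_meter_rate(2.00, 0.92))  # crowded block
print(adjust_meter_rate(2.00, 0.20))  # mostly empty block
```

Run periodically per block against the sensor data, a rule like this is what produces the block-to-block rate variation Mr. Demisch showed.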
Mr. Demisch concluded by examining some of the lessons the agency learned. First, he explained that in order to deal with over 250,000 records per day (an ‘avalanche of data’), the agency had to invest in top-shelf private sector data management tools, the same kind that financial firms use. Second, SFPark spurred a necessary culture change at the agency about the level of time and investment it takes to acquire, understand, clean, and structure data. Third, SFPark required that asset management and inventory become digital where they had previously been paper-based or nonexistent. People who worked on the paper-based systems feared they would lose their jobs; this was a barrier the project had to overcome. Finally, there were plenty of technical challenges: the agency had to reconcile 10,000 inconsistencies in the meter data, meters lost battery life because of the volume of communications, and the sensors broke frequently. The agency had to learn to deal with these challenges. Mr. Demisch described the performance-based contract SFPark entered into with the sensor company, which had the benefit that whenever the sensors broke, SFMTA did not pay. Mr. Demisch closed by saying that all the lessons learned in San Francisco were to be shared in an e-book and an article in Access.
Attendee Questions and Comments
Someone asked about the utility of pilots in trying out new market and pricing mechanisms. Since pilots had worked so well for demand-responsive parking pricing, might they not also work well for road user charges and usage-based insurance? Yes, said the panelists. Mr. Morrison said that pilots are essential in insurance. Mr. Opiola described a number of pilots that have existed to try out road user charges, and that United States Representative Earl Blumenauer is currently trying to fund more pilots. Other questions concerned road user charges. One person commented that perhaps revenue is not the problem; perhaps the real problem is unwise spending on capital and deferred asset maintenance. Mr. Opiola responded that state departments of transportation should consider reorganizing as service-providing utilities. He mentioned that Oregon Department of Transportation had done so. Finally, someone asked how to front the cost for road user charges. Mr. Opiola suggested that private sector firms might want to invest in systems for account management that could be used to implement RUC but would also benefit the company by enabling them to provide “value-added” services related to parking, usage-based-insurance, perhaps carsharing, and others.
Each of these talks concerned a new form of pricing made possible by technology. In the case of road usage charges and usage-based insurance, counterintuitively, cruder implementations are actually more appealing than technologically advanced options. Given that one of the motivations for the public sector’s very existence is to deal with externalities, one broad appeal of technological innovation for the public sector is to enable it to price externalities in a way it could not before. The key takeaway from these talks is that even when the technology exists to collect all the data to influence such pricing, it may be easier and more politically palatable to throw much of that data away and just keep the final number. The related lesson from the SFPark pilot is that governments will have to become more technically advanced as data managers and information providers in order to be adept consumers of new technology. This involves building expertise in-house, investing in IT capital, and careful contracting practices.
Lead, Follow, or Get Out of the Way: Public Sector Strategies for Getting the Best Out of Rapid Technological Change
“Technology moves fast, but policy and planning processes don’t.” (Andre)
“Let’s not be shy about having visions and moving policy without 100% data to support it. We should move sooner rather than later.” (Hasan)
“The way we do planning should change fundamentally. No more optimization… we need decisions that perform well under a variety of scenarios.” (Marty)
“Information can be threatening to decision makers.” (BT)
Intro: Brian Taylor, Director, Lewis Center for Regional Policy Studies
Dr. Taylor introduced the closing session by offering that at the symposium, we had glimpses into two very different worlds, operating in parallel. There is the wild, unpredictable, fast-paced world of technological innovation on the one hand, and the more slow-moving world of public sector formal systems on the other. The public sector moves slowly for good reasons: it tries to be democratic, for one. How can and should these two worlds interact with one another?
Andre Boutros, Executive Director, California Transportation Commission (CTC)
First, Mr. Boutros gave some context regarding what the CTC does. They set Regional Transportation Plan guidelines for California regions and act as a pass-through for federal and state monies. Thus, in his role as Executive Director of the CTC, Mr. Boutros is privy to the workings and priorities of a variety of public agencies (public works, transportation, and others) throughout the state.
Mr. Boutros was critical of public agencies for being slow to adopt new technology. Maybe he exaggerated for dramatic effect: Mr. Boutros claimed that Caltrans was still running Windows NT and that they just updated to Office 2000. He also said that 50% of information-technology systems in California public agencies were nonfunctional. Public sector cultures, he posited, are not supportive of change, especially not when it comes in leaps and bounds. This can be a barrier to certain policy innovations enabled by technology, such as pricing: congestion pricing needs to be a system approach to provide substantial time savings. Thus, regional, comprehensive approaches are important. Finally, Mr. Boutros questioned whether technological developments that provoke so much excitement will really deliver benefits. Are connected cars really going to solve our problems with respect to congestion? Won’t roadway infrastructure and deferred maintenance remain a problem even if all cars are connected?
Hasan Ikhrata, Executive Director, Southern California Association of Governments (SCAG)
Mr. Ikhrata offered three big questions posed by the symposium. One, how does technology influence transportation, land use, and urban form? Two, what role should government have in private sector innovation? And three, what are the legal, political, and privacy implications of all this: emerging technology, its impact on transportation and land use, and government’s interaction with it?
Next, Mr. Ikhrata drew some specific lessons from the symposium. He referenced Dr. Blumenberg and Mr. Lopez’s talks on Sunday. Given the changing nature of the way people use phones in particular, we should question the fact that the NHTS is still conducted by landline. There is a public policy debate about what the future will look like. While Mr. Ikhrata acknowledged Dr. Blumenberg’s mixed findings as to whether millennials’ travel patterns are really different from their elders’, he also suggested that it was time to make policy changes even in the face of limited data. Mr. Ikhrata underlined a few more specific points for the audience. Goods movement has as much of an impact on urban form as passenger movement. Bad pricing kills good planning: Mr. Ikhrata cited SCAG’s focus on a VMT fee as an example of how they are trying to implement better pricing. Finally, saying that technology will solve the problem of greenhouse gases (an argument Mr. Ikhrata has heard about electric vehicles) is not enough. Even if all vehicles are electric, there are still problems of congestion and safety.
Mr. Ikhrata concluded by emphasizing that policymakers must take initiative to be responsive to emerging conditions, rather than waiting for comprehensive data or understanding. We should move sooner rather than later, he concluded.
Dr. Taylor gave the attendees an invitation to “pontificate” on everything that had transpired at the symposium. The subsequent comments provide a sense of what resonated for attendees, what questions remain, and what points they thought may have been overlooked.
Norm King (Consultant) commented that even when data exists, policy often ignores it, and he cited policies relating to diesel fuel as an example. He also cautioned that innovations can produce more greenhouse gases, and more generally that technological advancement and policy change can have unintended consequences.
Wally Siembab (Research Director, South Bay Cities Council of Governments) was interested in what will become of the millennials. What kinds of workers and managers will they be? Maybe they’ll be great telecommuters. He also questioned the focus on young people. What about the boomers? There are immediate issues we face with respect to the boomers’ aging and the limited mobility that results, and with the kinds of land use patterns that support high levels of access for an aging population. We need to debate greater levels of population density, he argued.
Martin Wachs responded to Andre Boutros. Rather than appeal to regional planning and comprehensiveness, he argued, the way we do planning should change fundamentally. No more 20-30 year plans. Scenario planning (as described in the Monday session with Gordon Garry) is a helpful step in the right direction, but falls short of the kind of planning we need. We need to look at how alternatives perform under a variety of scenarios, and stop optimizing for a given future.
Stephen Finnegan (Manager, Government Affairs and Public Policy, Auto Club) said that we need to demonstrate the value of operations and maintenance to elected officials to counteract their capital bias.
Dean Taylor (Senior Program Manager, Southern California Edison) agreed with Dr. Wachs’ emphasis on uncertainty and connected this to a need for the public sector not to overlegislate or overregulate.
Richard Pelletier (BMW DesignWorks) offered a perspective from industry. Big companies make big bets, he said. When BMW formed the iBrand, their big bet and problem statement was, “Young people don’t want cars.” BMW recognized that they were too big to tackle this problem. They created a start-up company within the company that could operate like a smaller organization and tackle the problem. This model is common in the technological industry, and Mr. Pelletier suggested it could be fruitful for the public sector, as well. He mentioned that he had learned a lot from a United Nations study called “Fast Governments.”
Carl Morehouse (SCAG 1st Vice President, representing the City of Ventura) offered that he still had questions about how to take all of this information back to a political environment shaped by rural politics and red states.
Jodi Litvak (Director, Community Relations, Metro) offered that public agencies have a smaller tolerance for risk. The motivations to take risk are different and fewer when you are a public agency monopoly. This is something to take into consideration when importing models from the private sector.
Stephen Popper, responding to Martin Wachs, said that we might use models not as predictors but as tools to generate scenarios.
Jolene Hayes (Freight and Logistics Consultant, Parsons Brinckerhoff) echoed the call for more nimble planning. She said that we need faster sketch-planning tools and funding mechanisms to support them.
Outro: Brian Taylor
To close the symposium, Brian Taylor made some final comments. He reiterated that the public sector moves slowly for some admirable reasons, including the democratic, participatory process that gives voice to minority views. A recurring question, posed throughout the symposium, was how government can make decisions in fast, contingent, flexible ways in the face of uncertainty. Dr. Taylor referenced BMW’s decision to create a small, independent subsidiary. He extracted a counternarrative about the private and public sectors: while BMW was a slow ship that was too hard to turn, SFPark exemplifies the public sector’s agility in implementing technologically driven, innovative solutions. Finally, Dr. Taylor noted that information can be threatening to decision-makers, and that data-driven processes take power away from elected officials. This, in addition to low agency tolerance for risk, poor information-technology capacity, capital bias, and other structural conditions of the public sector, can explain why agencies are slow to adopt data-driven processes.