Tell the CPUC: We don't need more driverless cars causing trouble on our streets

 

CPUC Voting Meeting
Thursday, August 10, 11 AM

In-person: 505 Van Ness

Listen via stream: https://www.adminmonitor.com/ca/cpuc

Call in for public comment: English and Spanish instructions on the meeting notice

 

Sample comment

(Note: It is always better to use your own words. If you find having a script helpful, you may wish to write your own, incorporating your own thoughts and experiences.)

Hi, my name is ____ and I live in ____ and I’m a member of Indivisible SF. I’m asking the CPUC to change course on items 2 and 3 on the consent agenda. We should not be rolling out more of these vehicles that create hazards, make emergency situations worse, and can’t be told to turn around when they cause a problem. The companies have a lot of work to do before they’re ready to expand. Thank you.

Background

There are two companies currently developing driverless cars (sometimes called “autonomous vehicles” or AVs) and testing them on the streets of San Francisco. Cruise (owned by GM) is the further along of the two: it already has cars operating with no human driver at all, offering ride-for-hire service within a portion of the City. Waymo (owned by Alphabet, Google’s parent company) is the other. You’ve probably seen both companies’ cars roaming around, and maybe seen customers getting into or out of a Cruise car.

Cruise wants to expand its driverless ride-for-hire operations to encompass the whole City—the CEO suggests that “there is the capacity to absorb several thousand per city”—while Waymo wants to begin sending out its own cars without drivers. The California Public Utilities Commission will decide whether or not these expansions move forward.

Those decisions were on last month’s CPUC agenda, but the CPUC punted the proposals at the last minute. They’re now on the agenda for this month’s meeting, which will be tomorrow, August 10.

It’s worth reading the proposed CPUC resolutions. Here’s the one for Cruise’s expansion; the one for Waymo’s deployment is largely the same. Both resolutions approve the companies’ requests; each one claims that the company has provided “a complete Passenger Safety Plan that reasonably addresses its proposed service.” Further down, the resolutions note the protests and responses submitted by San Francisco agencies, the Los Angeles Department of Transportation, and the California Transit Association, documenting unsafe and illegal behavior by existing driverless cars such as those deployed by Cruise.

The protest describes several specific incidents where Cruise AVs blocked SFMTA buses or light rail vehicles, impeding the flow of traffic. San Francisco also expresses concern about expanding commercial service into peak hours of the day, since stoppages and delays would likely affect significantly more passengers, both on the affected transit line(s) and systemwide.

Further, San Francisco describes unplanned stops and unsafe maneuvers by Cruise AVs that have impacted emergency responders. These include incidents where a Cruise AV obstructed a fire department vehicle traveling to an emergency, ran over a fire hose, or improperly entered an emergency scene.

As for the protest submitted by SF, the CPUC says simply that “Commission staff have determined that San Francisco’s arguments are not within the grounds for a proper protest, so will be treated as a response,” before moving into the Responses section covering the feedback from SF, LADOT, and CTA. The resolutions also include a list of organizations that have voiced support.

The resolutions then include a detailed discussion section by CPUC, which we won’t attempt to summarize here—it covers a lot of territory and we encourage you to read it for yourself.

Even with all of the problems identified by SF, LADOT, and CTA, the CPUC still appears unbothered and poised to move forward.

The importance of human accountability

A car driven by a human driver, by definition, has someone in the car who can control the car and respond to external direction (e.g., a cop directing traffic) or unforeseen circumstances (e.g., a closed-off emergency scene).

Mission Local reports that there have been dozens of incidents just this year—and, as the proposal to expand driverless car operations moves forward, more are expected—in which driverless cars have impinged on an active emergency scene. The Fire Department has been vocal about these problems:

… during one operation, [Chief Jeanine Nicholson’s] firefighters had to spend half an hour tending to a disoriented autonomous vehicle. “That’s just unacceptable,” she said. “I will reiterate; it is not our job to babysit their vehicles.”

If that car had had a human driver, a single firefighter could have told the driver via arm gestures to make a U-turn, go down an adjoining road, or otherwise remove themselves from the scene. Or, a human driver might have recognized an emergency scene and turned around on their own initiative. Instead, this sort of “babysitting” is becoming a new normal at emergency scenes, such as the one back in February that had a cop scolding an encroaching Waymo like a misbehaving puppy: “No! You stay!”

The companies’ official guidance to emergency responders is to call a phone number, summon a support team, and wait for them to arrive. Unfortunately, as noted by the Fire Department, “firefighters often don’t have access to a phone at emergency scenes.” A police commander quoted by Mission Local noted that there are 41 authorized AV companies in the state, which means 41 different protocols for dealing with one of those companies’ products intruding on a scene. This is not something our emergency responders can be reasonably expected to cope with.

Ultimately, someone needs to be responsible for the actions of the car. When a cop or firefighter tells a human driver to go around or go back the way they came, that human driver can respond immediately. Driverless cars will simply continue to mill around, if not intrude further, unless constantly held at bay by cops, firefighters, volunteer marshals, or other personnel. There is nobody in the car who can change its course when needed, and nobody accountable when the car continues to linger or intrude, making a bad situation worse.

What does safety mean?

In recent years, there’s been a growing awareness that the way we define the safety of a vehicle has been incomplete.

The traditional definition has been in terms of hazards to the occupants: the driver and any passengers, including small children in car seats. Cars have gotten safer for their occupants over the decades thanks to seat belts, airbags, crumple zones, and other features that protect the people inside the car from injury when the car collides with something else (or is collided with).

While important, that definition ignores an entire other group of people: everyone outside the car, to whom every car represents a potential hazard, if not threat. Whether due to mechanical failure, inattention, or outright malice, any car can severely injure or kill people outside of it (a problem that has gotten worse as manufacturers have made their products bigger and heavier, and thus more lethal). And even if the car never touches anyone or anything else, merely taking up space can create a hazard—consider the double-parked car that forces bicyclists into car lanes, or the driver who fails to pull over to allow an emergency vehicle to get to a scene.

All of this creates a tremendous responsibility upon every driver to use their car safely, to be aware of people, pets, and things around them, and to obey the law. When human drivers break the law, injure others, or damage property, we have mechanisms to hold them accountable, including civil claims for damage and tickets for moving violations. The buck stops with the driver; the driver is clearly and unambiguously responsible for the car’s actions, because those actions are really the driver’s actions.

We should look at driverless cars through this lens.

  • Cruise (owned by GM) advertises its driverless cars as Chevy Bolts plus surveillance equipment, including cameras and microphones, that provides data to the autonomous driving system—and also to the Police Department. Surveillance makes us less safe.

  • Driverless cars have a history, with increasing frequency as Cruise has ramped up fully-driverless operations, of driving into active emergency scenes (examples 1, 2, 3). Interfering with first responders working an emergency scene makes us less safe.

  • Driverless cars also have a history, summarized in the CPUC documents, of unsafe and illegal behavior such as double-parking (including in bike lanes). There’s no way to tell a stopped car to move, or an intruding car to stop or go elsewhere, and it’s not clear how traffic enforcement officers can ticket a car that breaks the law if there’s no driver to hand a ticket to. Cars that create unsafe situations, in violation of the law and with no apparent means of accountability for their behavior, obviously make us less safe.

This is what the CPUC is on the verge of allowing these companies to roll out onto the streets of San Francisco.

What can you do?

Join tomorrow’s CPUC meeting and voice your opinion. It’s 11 AM at 505 Van Ness, or you can watch via web stream and comment via phone.

References

‘Blanket the city’: Cruise CEO says SF can take 'several thousand' driverless cars — a potential tenfold increase, Mission Local, 08/07/2023

SF cops, firefighters vent prior to big vote on driverless car expansion, Mission Local, 08/08/2023

Explore: See the 55 reports — so far — of robot cars interfering with SF fire dept., Mission Local, 08/09/2023

CPUC meeting agenda, 08/10/2023

Cruise Tier 2 advice letter and “passenger safety plan”, 12/16/2022

Cruise response to feedback, 02/01/2023

Waymo Tier 3 advice letter and “passenger safety plan”, 12/12/2022

Waymo response to feedback, 01/30/2023

Cruise video for first responders explaining how to deal with Cruise driverless cars, 11/19/2021

San Francisco Police Are Using Driverless Cars as Mobile Surveillance Cameras, Motherboard, 05/11/2022