Car-hackers of the driverless revolution

[Image: driverless car]

“The safest way to travel” was how they sold the driverless revolution at first, and we loved it. “Car accidents are a thing of the past.” It was true: computers were much faster at making those important decisions about road safety. Cars automatically slammed their own brakes on if a pedestrian ran into the road – something that arrogant teenagers started to abuse once they’d realised they’d be safe every time, sometimes causing traffic chaos.

But like all software, it could be overridden. Computers are fast decision-makers, but they only follow instructions. We called them “Cackers”, car-hackers, people who altered the in-built behaviours of their own vehicles. The manufacturers didn’t like it, of course, and neither did the transport agency.

Cackers hacked their cars to break the automated speed limit, and a new sub-culture of boy-racers was born – something that we thought we’d never see again after the driverless revolution.

Then came the “ethical cackers” who’d hack your car for benefits like better fuel economy, a smoother ride, or a marginally shorter route – as long as you didn’t mind taking slightly sharper corners at speed. As this became more popular, car manufacturers started to build it in, allowing users to select their own “driving style” preference – and for the first time since the driverless revolution, people had individual control again. Well, not absolute control of course, but they were asked what the car should do in certain situations.

These options were presented to the vehicle’s user (the term “driver”, of course, was a thing of the past) as a multiple choice of scenarios. The most important was “directive 112”, which became infamous in the media as the one that divided people most – what to do about arrogant teenagers walking out in front of the car. Should the car brake, potentially causing travel chaos?

Or not?

How would you continue this story? What other modifications could be made to driverless cars?
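For anyone who wants to tinker with the idea, here is a minimal sketch – in Python, with every name, directive and option invented purely for illustration – of how a scenario menu like directive 112 might be represented: each directive is a multiple-choice question, and the car stores the user’s answer so it can look up the chosen behaviour later.

    # A toy sketch (all names invented) of the multiple-choice scenario
    # preferences described above: each "directive" is a scenario with a
    # fixed set of options, and the user's answers are stored so the car
    # can look up the chosen behaviour later.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Directive:
        number: int
        scenario: str
        choices: tuple

    @dataclass
    class UserPreferences:
        selections: dict = field(default_factory=dict)  # directive number -> chosen option index

        def choose(self, directive, choice_index):
            if not 0 <= choice_index < len(directive.choices):
                raise ValueError("invalid choice")
            self.selections[directive.number] = choice_index

        def behaviour_for(self, directive):
            # Default to the first (most cautious) option if the user never answered.
            return directive.choices[self.selections.get(directive.number, 0)]

    # The infamous directive 112, as one entry in the menu.
    DIRECTIVE_112 = Directive(
        number=112,
        scenario="A pedestrian deliberately steps out in front of the car.",
        choices=("Brake, even if it causes travel chaos", "Do not brake"),
    )

    if __name__ == "__main__":
        prefs = UserPreferences()
        prefs.choose(DIRECTIVE_112, 0)
        print(prefs.behaviour_for(DIRECTIVE_112))

Defaulting to the first option when a directive is left unanswered is just a guess at how a cautious manufacturer might behave; swapping that default is exactly the kind of thing a cacker would do.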

  • Maybe the cars of the future will be fitted with small tractor beams to safely move pedestrians and animals out of the way without having to stop.

    • Good idea, but what about disabled or infirm people? Does it want to take the chance of pushing them over? Or pushing them into the path of oncoming traffic?

      • It would be an automated system, so it would be safe like the cars themselves. It would only be a problem if old and disabled people tried to interfere with the system somehow, such as by trying to use the tractor beam to pull you along on your skateboard (or hoverboard).

        • So a car would pick people up and put them down elsewhere? I think people would have a problem with this. Like it’s their civil right to stand where they want, or something. But maybe that’s a good idea for a story right there.

  • John H Reiher Jr

    Directive 112 would be gone after the first class-action lawsuit by bereaved parents soaked the manufacturer for millions, if not billions, of dollars in damages.

    The real solution is to work on the teenagers and do so subtly through their culture.

    First off, why are they jumping out in front of robocars? There’s always the chance that you might get hit if the sensors don’t pick you up in time, or you jump in front within the vehicle’s stopping distance.

    In fact, I doubt it would be all that safe to jump in front of a car. There are too many variables, and too many teens misjudging the stopping distance of the vehicles, for them all to get away with the stunt.

    Now to address the other issue raised in this story idea: Cackers. I like the term, and I can see someone cacking a whole lot of cars to create a disturbance or even a barrier to the police, so that criminals could escape.

    And you forget the car sharing movement. It’s very likely that most people don’t own cars and just subscribe to a service that makes a robocar available when they need it. In this scenario it’s highly unlikely that the car sharing service would allow any customization of a vehicle’s OS at all.

    • I like the idea of sharing cars. In the movie Minority Report that’s what seems to be happening: you just hop in one and off you go. But remember that people with enough money will always want things to be theirs. Some people will misuse the service of course, and drop litter. Some might make a mess, vomit in it while drunk – or worse.

      As for people walking in front of a car – it’s happened to me. Some people are either just arrogant, or maybe just stupid. Maybe they’re just yobbos, or have an inflated sense of self. They knew I was going to stop, and if I didn’t, well, maybe they wanted a big compensation claim?

  • Steven Lyle Jordan

    The inevitable solution would be to fit law enforcement with the tools to detect and stop any illegally modified car on the spot, whenever it was detected driving unsafely or illegally.

    The first line of civilian defense would be surveillance cameras, mounted along roads, at all intersections and on drones in the sky. The same computers that provided vehicles with traffic, weather and safety information would track every vehicle, comparing its actions with those of other cars and with the established safety protocols for that section of road, to determine illegal activity.

    If a car was determined to be driving illegally, the computers would send a signal designed to override the car’s activity, pull it over to the side of the road and shut it down pending the arrival of an officer of the law.

    Eventually, these computers would be modified to query any car behaving illegally, to confirm a reasonable right to drive that way (e.g. an emergency rush to the hospital). This would only happen after numerous instances of (supposed and actual) emergencies had been exacerbated by computer-ordered vehicle immobilizations.

    A “vehicular cold war” between cackers and law enforcement would continue for over 20 years, until governments forced all vehicle electronics to be sealed against owner hacking and fitted with anti-hacking systems that would fry the electronics package upon tampering. Replacing that package would mandate notification of law enforcement and a significant fine and/or a ban from owning a motor vehicle. Essentially, the cackers would make owning a motor vehicle so strict and troublesome that more people would be driven to alternative forms of transportation, such as (less controlled) motorcycles, bicycles and public transportation.

  • Christmas Snow

    A car whose path is disrupted by a “rogue” (altered) vehicle could pick up the rogue’s plate number and transmit it to the main server. Such a rogue vehicle would likely be reported several times, raising law enforcement’s suspicion. This may be the best detection method.

    The computer might be accessible for programming only by physically opening a specialized “black box” to reach the ports needed for reprogramming. Such a crude intervention could automatically and quietly trigger an alarm for the authorities. As a contemporary analogue, think of how terrorists make their bombs tamper-proof, and how the best way (so far) to deal with one is to enclose it in an armored casing and just blow it up.

    Teenagers and criminals alike would not “jump out” in front of a robocar. They would roll a barrel, an old car or anything big enough to activate the sensors into its path. And speaking of the destruction of natural habitats, what if a bear or a moose crossed the street on its way to a new habitat?

    • Well, I assume the software would detect the moose and swerve to avoid it. Although what if it has to choose between hitting a moose or a pedestrian? I assume the programmers would have decided that the pedestrian is more important. But that assumption might not be true – how about a story about a programmer who makes an error, leading to many deaths?

  • It’s not the same, but car hacking is on the horizon http://www.wired.com/2014/07/car-hacker/?mbid=social_fb

  • I just saw this article about people choosing “ethics” settings in their automated cars http://www.wired.com/2014/08/heres-a-terrible-idea-robot-cars-with-adjustable-ethics-settings/?mbid=social_fb