The other day, while driving, I almost pulled out in front of a Google Self-Driving Car. My natural instinct not to physically insert myself into the path of a ton of steel moving at high speed won out, but what if that hadn’t happened?

Almost certainly, the car would have slowed down (abruptly, if necessary) to avoid a collision. I call this a first-order good, but possibly a second-order bad. Because as soon as human drivers get a little more used to this, they’ll just assume the car will stop for them. People will start pulling out in front of SDCs like crazy. Sometimes, being a little aggressive behind the wheel is safer. Show weakness, and someone will jump all over it, often dangerously so.

The long-term solution, of course, is for all vehicles on the road to be autonomous; at that point, car-to-car communication (along with existing visual signals, etc.) plus a bit of algorithmic magic can work out any situation.

But until we get to that point, it will be a very delicate programming act. The vehicles need to seem menacing enough that Yahoos (no pun intended) won’t feel like they can gain from doing something stupid, while not attracting lawsuits of the should-have-been-able-to-prevent-collision variety. How do you program good judgement? Should self-driving cars honk? Or would that just end up being another badge of honor for the Rodney-Ramjets of the world?
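To make the tension concrete, here’s a toy sketch in Python of what that balancing act might look like as a single “assertiveness” knob. Every name, threshold, and signal here is made up for illustration; this is nothing like what any real SDC actually runs, just a way to see how one parameter pulls between “easily bullied” and “lawsuit magnet.”

```python
# A toy sketch of the assertiveness trade-off. All names and thresholds
# are hypothetical illustrations, not any real SDC's logic.

from dataclasses import dataclass


@dataclass
class Encounter:
    time_to_collision_s: float    # estimated seconds until paths intersect
    other_has_right_of_way: bool  # does the other vehicle legally have priority?
    other_is_encroaching: bool    # is the other driver pulling out anyway?


def should_yield(e: Encounter, assertiveness: float = 0.5) -> bool:
    """Decide whether to brake for an encroaching vehicle.

    assertiveness in [0, 1]: 0 always yields early (easily bullied),
    1 yields only at the last safe moment (lawsuit magnet).
    """
    # Hard safety floor: below this, physics wins and we always brake.
    MIN_SAFE_TTC_S = 1.5
    if e.time_to_collision_s <= MIN_SAFE_TTC_S:
        return True
    # If we have right of way and someone is cutting us off, hold our line
    # longer the more assertive we are, so encroaching isn't rewarded.
    if not e.other_has_right_of_way and e.other_is_encroaching:
        comfort_margin_s = MIN_SAFE_TTC_S + 3.0 * (1.0 - assertiveness)
        return e.time_to_collision_s <= comfort_margin_s
    return True  # otherwise, be polite


if __name__ == "__main__":
    cut_off = Encounter(time_to_collision_s=3.0,
                        other_has_right_of_way=False,
                        other_is_encroaching=True)
    print(should_yield(cut_off, assertiveness=0.2))  # timid: True (brakes)
    print(should_yield(cut_off, assertiveness=0.9))  # assertive: False (holds line)
```

Even in this cartoon version, the problem is obvious: set the knob too low and humans learn the car is a pushover; set it too high and the first fender-bender produces a discovery request for exactly this line of code.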

This article has more.