2016-07-02 Sat

Clear Thinking

Christopher Greaves Home_DSCN4170.JPG

I really don’t see what the “moral dilemma” dilemma is all about.

Ever since the first road fatality (an ancient Briton getting run over by a Roman chariot) we have had this problem of where to cast the blame.

With motor-cars, all that changed was that death became a certainty rather than a possibility.

Someone still has to be responsible.

So if a driverless car collides with and kills a pedestrian, then the owner of the car is responsible, no matter whether they are in the car or not.

That should be enough to cause the owner to take out a significant amount of insurance before accepting delivery of the car.

Or just not accepting delivery of the car until the manufacturers have sorted out the angles.

We could use the same arguments when I travel in my vehicle-that-I-don’t-have-to-drive, also known as a local transit bus. The driver is responsible and/or the Toronto Transit Commission has to have insurance, but there’s no dilemma. It all gets settled BEFORE the driver gets behind the wheel, before the Toronto Transit Commission accepts delivery of the bus.

Christopher Greaves Home_DSCN4171.JPG

This line, too, left me puzzled.

I was taught to drive with “never swerve to avoid hitting an animal”, where “animal” I took to mean a frog, a rabbit, a wild cat or similar. Kangaroos were another matter (think “moose” or “cow”).

I was taught, too, that if the brakes failed I was to “slide” the car alongside a stone wall or similar. Better to reduce the panels to bare metal than to collide head-on with a tree. (I was taught to drive before seat belts, let alone airbags.)

In that sense I was meant to treat a car as expendable.

I fail to see what a “self-sacrificing car” has to do with the people inside it.

Observations

Christopher Greaves Home_DSCN4172.JPG

I am not sure who is in charge here. There can be many reasons for running a Pilot Project, but surely the project proper is being considered because it might change driver behaviour?