Self-driving cars work about as well as central planning bureaucrats do

For all the hype about self-driving cars out there, this Wall Street Journal report, titled “The Bumps Ahead for Autonomous Vehicles,” suggests there are some problems:

Everyday scenarios that human drivers navigate with ease, such as bumpy roads or snowy weather, can be unintelligible to a car’s sensors. But there is no industry standard for what sensors or backup mechanisms driverless cars must have if primary sensors fail.

Curved objects, such as the bottom of a soda can, can skew data from radar sensors by falsely amplifying radio waves that bounce off the object. This makes the car think it is approaching a much larger object and could errantly trigger emergency mechanisms such as sudden braking or swerving.

Objects that are low to the ground, such as snakes or sharp debris, can easily fail to register with cameras or lidar sensors placed on top of the car. Missed objects can become roadkill or potentially damage the car.

Cameras aren’t as effective at capturing the environment during low visibility conditions, including after sundown. Lidar and radar are unaffected by darkness, however, because they collect information about the environment from electromagnetic wavelengths higher and lower than that of visible light.

Radar sensors work by matching an object’s outline to a catalog of shapes stored in the car’s operating system’s memory. Objects with irregular shapes may confuse radar, yield inaccurate readings, and possibly cause the car to needlessly brake or swerve.

Driving on uneven surfaces can compromise the calibration of lidar and cause excessive wear on the ball-bearings that stabilize the sensor atop the car. The more often a vehicle encounters these conditions, the more frequently the lidar sensor will need to be replaced.

When bugs collide with the lidar sensors or camera lenses while driving, they leave smudges that require cleaning to return to peak functionality.

Snow is a nuisance to cameras, as falling flakes reduce long-range visibility and snow that accumulates on or near lenses can render a camera useless. Plus, it’s hard to remove frost from a spherical lidar sensor with a flat ice scraper.

In other words, the radar, the cameras, and the other sensors come with some unintended consequences of their own.

Soda cans can confuse the radar, bumpy roads can throw off the lidar, and snow and frost can blind the cameras. Self-driving cars aren’t that good in the dark either, and they have trouble spotting things close to the ground. Even bugs splattering on the sensors leave smudges that degrade them.

I am having a lot of trouble thinking of anything else that could go wrong.

I believe self-driving cars and trucks will be ready and safe for everyone at about the same time that bureaucrats and politicians, given trillions of our dollars, can permanently control temperatures, sea levels, storm activity, and, while they're at it, the entire global economy.

It's been tried before, after all. The central planners of the Soviet Union, with nothing but good intentions, sought to remake an entire economy on the decisions of a few supposedly all-seeing planners, and we know how well that turned out. The reality, as the Austrian school of free-market economics teaches, is that socialism fails every time it's tried for one essential reason: central planners can never know everything about the market forces at work across a broad economy, any more than they can know how the beat of a butterfly's wing ripples through the rest of the universe.

We have been told that self-driving vehicles will be safer than human drivers, just as socialism is supposedly safer for humans (because it's for their own good and the government is always benevolent). But as the article shows, other humans still have to program those vehicles, sensors, cameras and all, for every scenario. That's no different from central planners and bureaucrats doing such a great job of planning for every eventuality from their top-down perch. Have any of them ever pulled that off without the 'unintended consequences' so famous in socialism? Don't think so. Yet now we're supposed to believe that a private company can do the same thing, and that the result will be flawless?

Image credit: Michael Shick, via Wikimedia Commons // CC BY-SA 4.0
