Tuesday, April 18, 2017

Can Self-Driving Cars Ever Really Be Safe?

Analysts estimate that by 2030, self-driving cars and trucks (autonomous vehicles) could account for as much as 60 percent of US auto sales. That’s great! But autonomous vehicles are basically computers on wheels, and computers crash all the time. Besides that, computers get hacked every day. So you gotta ask, “Can self-driving cars ever really be safe?”

The Short Answer

No. Self-driving cars can never really be safe. But they will be safer – so much safer that it's worth a few minutes to understand why.

Humans Are Very Dangerous

First and foremost, according to the National Highway Traffic Safety Administration (NHTSA), 90 percent of all traffic accidents can be blamed on human error. Next, according to the AAA Foundation for Traffic Safety, nearly 80 percent of drivers expressed significant anger, aggression, or road rage behind the wheel at least once in the past year. Alcohol-impaired driving accounted for 29 percent of all vehicle traffic fatalities in 2015. And finally, of the roughly 35,000 annual traffic fatalities, approximately 10 percent (3,477 lives in 2015) are caused by distracted driving.
Remove human error from driving, and you will not only save a significant number of lives but also dramatically reduce the number of serious injuries associated with traffic accidents – there were over 4.4 million such injuries in the United States during 2015.
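
To put a rough number on "significant," here's a quick back-of-envelope calculation using the figures above. Treat it as illustrative only: the 90 percent figure describes crashes, and applying it directly to deaths and injuries is a simplification.

    # Back-of-envelope estimate using the figures cited above (illustrative only:
    # the NHTSA 90 percent share describes crashes, and applying it directly to
    # fatalities and injuries is a simplification).
    annual_fatalities = 35_000       # approximate annual US traffic deaths
    annual_injuries = 4_400_000      # US traffic injuries, 2015
    human_error_share = 0.90         # NHTSA: share of crashes blamed on human error

    print(f"Deaths linked to human error:   ~{annual_fatalities * human_error_share:,.0f} per year")
    print(f"Injuries linked to human error: ~{annual_injuries * human_error_share:,.0f} per year")

That works out to something like 31,500 deaths and nearly 4 million injuries a year with a large human-error component – the prize on the table.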

Data Begins to Make a Case

In May 2016, a 40-year-old man named Joshua Brown died behind the wheel of a Tesla cruising in Autopilot mode on a Florida divided highway. He was the first.
Rage against the machine quickly followed, along with some valid questions about whether Tesla had pushed this nascent technology too fast and too far. Everyone expected the accident to be the fault of a software glitch or a technology failure, but it was not.
The NHTSA investigation found that “a safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.” In other words, the car didn’t cause the crash. But there was more to the story. The NHTSA’s report concluded, “The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.” In reality, while Mr. Brown’s death was both tragic and unprecedented, the investigation highlighted a simple truth: semi-autonomous vehicles crash significantly less often than vehicles piloted by humans.

What Do You Mean by “Safe”?

The same NHTSA report noted that automakers representing 99 percent of the US market had agreed to include Automatic Emergency Braking (AEB) systems in all new cars, with the goal of preventing 28,000 crashes and 12,000 injuries by 2025. The AEB commitment is limited to rear-end crashes, but there are a host of other semi-autonomous features in the works, and by the numbers, all of them will make us safer.
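
To make the AEB idea concrete, here is a deliberately simplified sketch of the core logic: estimate the time to collision with the vehicle ahead and brake if the driver hasn't reacted. The function name and the 1.5-second threshold are my own illustrative assumptions; production systems fuse radar and camera data and are far more sophisticated than this.

    def should_emergency_brake(gap_m, closing_speed_mps, driver_braking, ttc_threshold_s=1.5):
        # Illustrative only: brake if a rear-end collision is imminent and the
        # driver has not already reacted. Real AEB systems are far more complex.
        if closing_speed_mps <= 0:          # not closing on the vehicle ahead
            return False
        time_to_collision_s = gap_m / closing_speed_mps
        return time_to_collision_s < ttc_threshold_s and not driver_braking

    # Example: 12 m behind a stopped car, closing at 10 m/s (~36 km/h), driver not braking.
    print(should_emergency_brake(gap_m=12.0, closing_speed_mps=10.0, driver_braking=False))   # True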
That said, this is very new technology, and regulators will need to define what they mean by “safe.” Must our autonomous vehicles drive flawlessly, or do they just need to be better at it than we are? The RAND Corp think tank says, “A fleet of 100 cars would have to drive 275 million miles without failure to meet the safety standards of today’s vehicles in terms of deaths. At the time of the fatal May 2016 crash, Tesla car owners had logged 130 million miles in Autopilot mode.”
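
Where does a number like 275 million miles come from? One way to land in that ballpark is the statistical "rule of three": if a fleet drives N miles with zero fatalities, you can be roughly 95 percent confident the true fatality rate is below 3/N. The baseline rate below (about 1.09 deaths per 100 million vehicle miles, close to the 2015 US figure) is my assumption for this sketch, not a number from the article.

    # Rule of three: zero fatalities over N miles gives ~95% confidence that the
    # true fatality rate is below 3/N. Solve 3/N = baseline_rate for N.
    baseline_rate = 1.09e-8   # assumed: ~1.09 US traffic deaths per 100 million miles
    miles_needed = 3 / baseline_rate
    print(f"Failure-free miles needed: ~{miles_needed / 1e6:,.0f} million")   # ~275 million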

The Transition to Fully Autonomous Vehicles

In April 2016, Ford, Google, Lyft, Uber, and Volvo Cars established the Self-Driving Coalition for Safer Streets to “work with lawmakers, regulators, and the public to realize the safety and societal benefits of self-driving vehicles.” They have their work cut out for them.

Self-Driving Cars Need to Be Trained

In January 2017, Elon Musk tweeted that a software update featuring Shadow mode was being pushed to all Teslas with HW2 Autopilot hardware. Shadow mode lets the car’s autonomous driving AI “shadow” its human driver, comparing the decisions it would have made to the decisions the human driver actually made. Think of it as self-driving AI in training. The auto industry and several tech giants are working as fast as they can to make autonomous vehicles mainstream. To speed the process, they may need to share some data. Will they? My guess is: absolutely.
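
Conceptually, shadow mode is simple: the system computes what it would have done, never acts on it, and records the cases where it disagrees with the human. Here is a minimal, hypothetical sketch of that loop in Python. The class, function, and threshold are my own illustrative assumptions, not Tesla's implementation.

    from dataclasses import dataclass

    @dataclass
    class Action:
        steering: float   # normalized steering command, -1.0 (left) to 1.0 (right)
        braking: float    # normalized brake command, 0.0 to 1.0

    disagreements = []    # (AI proposal, human action) pairs worth learning from

    def shadow_step(ai_action: Action, human_action: Action, tolerance: float = 0.1) -> Action:
        # The AI's proposal is computed but never executed; only disagreements are logged.
        if (abs(ai_action.steering - human_action.steering) > tolerance
                or abs(ai_action.braking - human_action.braking) > tolerance):
            disagreements.append((ai_action, human_action))
        return human_action   # the human driver always stays in control

    # Example: the AI would have braked harder than the human did.
    shadow_step(Action(steering=0.0, braking=0.6), Action(steering=0.0, braking=0.1))
    print(len(disagreements))   # 1

The logged disagreements become labeled training data: the situations where the AI still has something to learn from human behavior.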

Hacks and Crashes

In September 2016, Chinese researchers discovered some “security vulnerabilities” in the Tesla Model S and remotely hacked into the car. This was notable because it was the first time anyone had remotely hacked a Tesla. We have a thesis here at The Palmer Group: “Anything that can be hacked, will be hacked.” Is this going to be an issue? Yes, but it’s also going to be an arms race. I’m betting on the good guys, but to be fair, hacking across every digital touchpoint is a never-ending battle. We will do our best to combat the bad guys.
As for computer crashes, yes, it is possible for the computer that runs your self-driving car to crash, but it will happen so infrequently that, by the numbers, you will be significantly safer in an autonomous vehicle than if you were driving yourself.

Fear and Assessment of Risk

Some people are afraid to fly. When you point out that flying is the safest form of travel by several orders of magnitude, the response is always some version of, “But when a plane crashes everyone dies.” Human beings are not very good at assessing risk. If you don’t have a gas pedal, a brake pedal, or a steering wheel, and your car crashes, you will feel helpless and out of control. And you may die. But, by the numbers, tens of thousands of people will not die or be injured because semi-autonomous driving and ultimately fully autonomous driving will be much safer than pure human driving. Some will counter that it’s cold comfort if you’re the one who is killed or injured, no matter how rare it is. I agree. But if you were going to make a policy decision for our society at large, you have to agree that saving tens of thousands of lives and preventing millions of injuries is a worthy endeavor.
(BTW: Please do not bring up the absurd “Why Self-Driving Cars Must Be Programmed to Kill” scenario where “One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?” If you had situational awareness and time to consider all of the outcomes posited by this nonsense hypothetical, you’d have time to step on the brake. If you didn’t have time to consider all of the potential actions and outcomes, the AEB would have engaged to prevent the car from hitting what was in front of it – the people you would have killed while you were thinking about what to do.)

A Prediction

I’m pretty sure that before 2030, if you are under the age of 25 or over the age of 70, you are going to need a special permit to manually drive a car. I’m also pretty sure that you will not be allowed to manually drive on certain streets and highway lanes because you will pose too great a threat to the caravans of autonomous vehicles on those roads.
With any luck, the fear-mongers and bureaucrats will get out of the way, and we will all be much safer sooner.


About Shelly Palmer

Named one of LinkedIn’s Top 10 Voices in Technology, Shelly Palmer is CEO of The Palmer Group, a strategic advisory, technology solutions, and business development practice focused at the nexus of media and marketing with a special emphasis on machine learning and data-driven decision-making. He is Fox 5 New York's on-air tech and digital media expert, writes a weekly column for AdAge, and is a regular commentator on CNBC and CNN. Follow @shellypalmer, visit shellypalmer.com, or subscribe to the daily email at http://ow.ly/WsHcb




Wednesday, April 12, 2017

What tourist areas should be avoided in Southeast Asia?


I am doing the usual route of Thailand, Cambodia and Vietnam and want to avoid places crowded with tourists where it is no doubt beautiful but everyone is just boozing.

11 Answers
Christian Bergland
I've got two, and they should be avoided for very different reasons.
#1 - Pattaya, Thailand
Pattaya is a filthy hellhole of neon and sex. I'm no prude, but there really aren't any redeeming qualities to this place. The beach is dirty and quite rundown by Thai standards. The food is horrible, with Thai cuisine taking a back seat to garbage fry-ups and rubbery seafood that is anything but local.
Pattaya has long had a reputation as a center of sin, which in this context basically means sex.  The streets are crowded with bars advertising their services, which run the gamut in terms of the sex trade.  Many of the girls advertising these services are in their teens, and they often look young enough that they should really be in school rather than indirectly or directly engaged in the sex trade.
Hey, maybe this is your cup of tea, but I'd recommend avoiding it.  There are plenty of places in Thailand with nicer beaches, better food, and more cultural activities to engage in.
#2 - Vang Vieng, Laos
Okay, so the issues here are significantly different from those in Pattaya. Vang Vieng is admittedly very beautiful, and I'm sure it's entirely possible to go there and have a wonderful time exploring the outdoors. So why do I recommend avoiding it?
Tubing down the river is the thing to do in Vang Vieng, and Western tourists backpacking through Southeast Asia flock there in droves.  Now, tubing could be a really pleasant activity.  It could be a nice, quiet day along the river.  Instead, the riverside is packed with bars blasting top 40 hits, destroying whatever sense of serenity might have been created by the area's natural splendor. 
People generally start tubing relatively early in the day, and by late afternoon have made their way back into town, having had quite a bit to drink by that point. Everything in town is dedicated to continuing the activities from earlier in the day, only perhaps taking them a step further.
To give a sense of prices: 100,000 kip is about $12 USD. Tourists have been known to lose weeks to Vang Vieng, trapped in a cycle of debauchery of their own making. Drink during the day, drugs at night, oversleep the next morning and miss the only bus out of town, rinse and repeat. Even the restaurants get in on the act, with every establishment having TVs set up playing bootleg DVDs of hit American shows such as The Simpsons, Family Guy, and Friends.
With most restaurants having the same menu of bad Western food and bland Lao food, you essentially choose where to eat based on which show you're comfortable watching.  Now, I'm not someone who thinks that every moment of travel must be focused on maximizing cultural experiences, but Vang Vieng really encapsulates everything about the backpacker ethos that's just totally tone deaf and clueless.  You come halfway around the world to get drunk and watch Friends while going tubing every day for two weeks?  And you spend all your time talking to other people who are doing the same?  Couldn't you have done all of this at home?
I'll note that plenty of people make Vang Vieng a stop on their backpacking circuit and absolutely love the place.  For me, I loved my time in Laos.  Luang Prabang is one of the coolest towns I've ever visited.  Vientiane has a really cool Graham Greene kind of vibe to it.  Savannakhet has a rundown, off the beaten track charm.  But Vang Vieng?  Vang Vieng is there to keep the worst kind of tourism from infiltrating the rest of Laos.  Proceed at your own risk.
