
Wednesday, August 2, 2017

What are the Scary Ripple Effects of Self-Driving Vehicles?

As reported by ReadWrite: By 2040, we think Level 5 autonomy-enabled mobility will be available as a service for the majority of the transportation needs of urban consumers. In other words, 70% of the urban population wouldn’t need to own cars, because cars would be available on demand through their favorite app.

Although several power players like General Motors and Ford are promising fully autonomous cars sooner, many technological, regulatory, and consumer-sentiment issues would need to be addressed before autonomous cars could fulfill our transportation needs.

But imagine the world in 2040, when most of the population doesn’t need cars. Everything from shopping to commuting to long distance road trips will be addressed by fully autonomous vehicles that you can summon with the push of a button. 

Rethinking today’s products and services in the 2040 world

Once you accept that basic premise, it is surprising to see how many of our current products and services do not fit well with the world of the fully autonomous car. When we shift from outright ownership to an on-demand leasing model, automotive OEMs and their supply chains (Tier 1 and Tier 2) will be the first ones affected. Several of them are already devising strategies to grapple with this eventual reality, but this is just the tip of the iceberg. A significant number of multibillion-dollar companies operate around consumers owning one or two cars per family. All of these products and services will either become obsolete or have to be fundamentally rethought.

15 minutes for an insurance product we don’t need?

Let us start with the way we buy cars — at dealerships. Once consumers stop buying cars, there is no need for dealership networks. They are merely a distribution channel that will be replaced by an app that we use to hail our cars.

Then, there’s insurance; we get insurance as soon as we buy a car. Because we will no longer own cars, there is no need to spend 15 minutes to save on car insurance. Accidents because of systemic failures like poor connectivity or algorithmic edge cases are inevitable, but the massive reduction in the frequency of accidents coupled with insurance being bundled into the mobility service will result in an enormous reduction in revenue for the insurance industry.

Car loans, which finance our car ownership, will also be out. In the 2040 world, mobility service providers will end up owning most of the cars. Even if you assume one or two ride-hailing apps will dominate the market, there are economies of scale in owning and operating large autonomous car networks. As a result, we believe the ultimate owners of the cars will be large businesses, not individual car owners who lease their cars out on ride-sharing platforms. These large fleet owners would have a low cost of capital with which to finance their vehicle purchases. Between a massive reduction in the number of cars financed and the lower APRs they can charge such borrowers, auto loan providers will see a huge downsizing of their market.


First stop on the road to oblivion: gas stations

Once we buy our cars, we spend a lot of money on them. Let us think about these products and services and their relevance in the 2040 world.

The first and most obvious car-related expense is gas. In a world of operating autonomous car fleets, we don’t need as many gas stations. A few large gas stations far from high-density areas, perhaps operated, once again, by large fleet owners, should suffice. These cars would essentially act like public transportation vehicles, which are refueled at the end of each day at a central location. Fewer miles traveled, because we will be using ride-sharing services, combined with more fuel-efficient cars, will certainly be a cause for concern for oil companies.

Parking is another common expense with a poor consumer experience. The ubiquity of autonomous ride-sharing cars will leave us with fewer parking lots, located away from high-density areas, where mobility service providers can park their fleets when demand is low. Close to half of our urban land is dedicated to parking, so a massive reduction in parking lots will be a boon to our cities.

Every big technological change has massive unforeseeable consequences

At conception, the Internet was a platform only for email. But its unintended consequences ended up creating companies like Amazon, Google, and Facebook, which have changed the way we buy and consume media. In the same vein, we believe autonomous mobility will fundamentally change the way we live in our cities. So much of our urban real estate is tied to today’s car-ownership model: parking lots, gas stations, dealerships.

No one can foresee the new products or services that could emerge from such surplus real estate becoming available or the effects it will have on our housing markets. Will people live much farther from cities because commuting in an autonomous car could be productive, or would urban housing become cheaper because we don’t need parking lots anymore?

And as for my personal favorite unintended consequence, short-distance flights — would we rather take a flight from San Francisco to Los Angeles, or would we prefer to get there in swankier-than-business-class autonomous vehicles that come with plush beds and Netflix? I am extremely confident that many more positive, albeit unintended, consequences of autonomous cars will emerge en route to 2040.


So why is this scary?

The 2040 world of autonomous mobility is scary because so many of today’s products and services would have to radically evolve to stay relevant. And it is not just the automotive OEMs that are in trouble — auto insurance companies, car loan providers, oil and gas companies, car dealerships, parking lot owners, and auto parts suppliers and stores are all on the chopping block. This foreseeable disruption alone is worth $2 trillion in products and services we consume today. If these companies are affected, it will set off a chain reaction of problems for their suppliers, and that will trigger panic.

This $2 trillion will be reshuffled and redistributed among consumers, new companies, and incumbents. And that is scary. Some of the best incumbent players in all the industries highlighted above are already thinking ahead to prepare for and adapt to the new reality. But many others will likely not survive this disruption. That’s scary, too.

If incumbent companies want to survive this changing landscape, they must prepare for the 2040 future now by partnering with startups and augmenting their organizations with change makers who can imagine the future well before it arrives. For the startup founders and venture capitalists involved in the industry, meanwhile, the evolution of autonomous cars is an enormous and exciting opportunity with the potential to create multiple billion-dollar technology companies. Uber and Lyft are just the beginning.


Tuesday, August 1, 2017

China: Keeping Track of Technology for Self-Driving Vehicles

As reported by the Shanghai Daily: The next generation of cars, already on the drawing boards, requires considerable time, research and testing before it becomes commonplace on the roads.
To place itself at the vanguard of “smart cars,” Shanghai created a test ground known as the National Intelligent Connected Vehicle Shanghai Pilot Zone — a 100 square-kilometer site in the Jiading District that is the first of its kind in China.
The name is a bit of a mouthful, but it simply means a demonstration site where the latest innovations in smart car technology can be tested under real-life conditions.
It’s all part of China’s ambitious plan to become a world leader in a new generation of cars that operate on digital intelligence systems, including self-driving. The plan is targeted for completion by the end of 2025. It’s also part of the nation’s program to upgrade its prime industries to compete in a changing world.
The pilot zone in Shanghai, opened in June last year, has been a testing site for more than 10 major car makers and auto-parts companies, including SAIC, General Motors, Volvo, Ford, BMW, NIO and Delphi. The testing covers a range of technologies, including self-driving capability, sensor performance, lane adherence and high-definition map positioning.
“Companies are actively participating in the development of the pilot zone,” said Rong Wenwei, general manager of Shanghai International Automobile City, which oversees the zone. “What we are building is a platform for companies to accelerate development of the industry.”
Ford Motor Co, for one, is testing its driver-assisted technologies, including left turn assist and traffic light optimal speed advisory. The US company has said it hopes both features will figure in its next-generation vehicles.
“The Shanghai International Automobile City provides a setting where we are able to develop, test and refine future connected vehicle technologies,” said Trevor Worthington, vice president of product development at Ford Asia Pacific.
Left turn assist is a feature that uses information exchanged between vehicles to alert drivers of oncoming traffic when making a left turn.
Traffic light optimal speed advisory technology connects a vehicle to road infrastructure, informing a driver of the best speed to reduce the waiting time at stoplights by monitoring data from roadway devices. Ford said that the advisory could help drivers avoid red lights, thus reducing travel time by up to 20 percent.
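Conceptually, such an advisory boils down to simple arithmetic over the signal-phase timing that roadside units broadcast. Here is a minimal Python sketch of the idea (not Ford's implementation; the function name, inputs, and example numbers are illustrative assumptions):
```python
from typing import Optional

def advised_speed(distance_m: float,
                  green_start_s: float,
                  green_end_s: float,
                  speed_limit_mps: float) -> Optional[float]:
    """Return a speed (m/s) that arrives at the light during its green
    window, or None if no legal speed can make it and the car must stop."""
    earliest_arrival = distance_m / speed_limit_mps   # flat out at the limit
    target_time = max(green_start_s, earliest_arrival)
    if target_time > green_end_s:
        return None                                   # this green phase is out of reach
    return min(distance_m / target_time, speed_limit_mps)

# Example: a light 300 m ahead turns green in 20 s and stays green for 15 s;
# the speed limit is 50 km/h (~13.9 m/s).
print(advised_speed(300, 20, 35, 50 / 3.6))  # ~13.9 m/s: hold the limit
```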
“Imagine your daily commute with less waiting time for red lights,” said Thomas Lukaszewicz, manager of automated driving for Europe and China at Ford. “Vehicle-to-infrastructure technology under development holds great promise to make commuting smoother and less time consuming.”
Delphi, the global auto parts supplier, has demonstrated its self-driving technologies in the zone, fitting its autonomous car with nine radar units and cameras.
The radar captures information about the vehicle’s surroundings, while the cameras capture the road conditions ahead of it.
“Chinese consumers are willing to embrace new solutions, especially Internet and tech-based autonomous driving,” said David Paja, president of electronics and safety at Delphi.
Delphi said it has achieved rapid progress on advanced driver assistance technology in China in the past five years.
“We are actively in discussion with several original equipment manufacturers in the local market,” said Frank Wang, president of electronics and safety business for Asia Pacific at Delphi.
In addition to the vehicle testing, the pilot zone is a useful tool for raising public awareness about what’s coming next in the driving experience. An area of the zone has been set aside for public education. It offers a 70-minute tour of what it will be like to drive intelligent, connected cars. Visitors can also take rides in self-driving cars, including the Tesla Model X, Volvo XC90 and Volvo S90.
“If I get the chance to visit the pilot zone, I want to take a ride in a self-driving car,” said Wang Xing, a student from Tongji University. “I am curious about autonomous driving technology and want to see how it works.”
The pilot zone has set itself a series of goals for the next three years. By 2020, the zone plans to host more than 1,000 intelligent connected vehicles.
The zone also aims to shoulder at least five national major projects and develop more than 10 industry standards related to intelligent and connected vehicles. It is on track to recruit more than 100 top professionals and serve as an incubator to 30 innovative startups in the three-year period.
As a hub of domestic research and development, it also plans to strengthen work relationships among companies, universities and institutions.
Shanghai International Automobile City said it has signed agreements with more than 20 companies to conduct 100 research projects. It also plans to hold more than 50 professional seminars on intelligent and connected vehicles this year, focusing on testing technologies, standards and big data.

Monday, July 31, 2017

Self-Driving Car Demo is the First to Cross the US-Canada Border

As reported by Engadget: As a rule, self-driving car tests tend to be limited to the country where they started. But that's not how people drive -- what happens when your autonomous vehicle crosses the border? Continental and Magna plan to find out by piloting two driverless vehicles all the way from southeastern Michigan to Sarnia, Ontario, making this the first cross-border test of its kind. The machines won't be in complete control for the entire route, but they'll use a combination of cameras, LiDAR and radar to take over when they can, including at two key border crossings (the Detroit-Windsor Tunnel and the Blue Water Bridge).

This isn't the first autonomous driving-related agreement involving Michigan and Ontario, but it's an important one: it'll explore rules and regulations in addition to the usual self-driving data collection.

As you might guess, tests like this will be vital to making autonomy a practical reality. Driverless vehicles need to know how to adapt to changing road rules, such as different signage and units of measurement. While this isn't the greatest challenge, it has to be overcome if you're ever going to embark on cross-border shopping trips without touching your steering wheel.
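To make the "units of measurement" point concrete, here is a toy normalization sketch (the constant and function name are assumptions, not Continental's or Magna's code); the planner's internal target speed stays in meters per second no matter how the limit is posted:
```python
KMH_PER_MPH = 1.609344

def target_speed_mps(posted_value: float, unit: str) -> float:
    """Normalize a posted speed limit to meters per second."""
    kmh = posted_value * KMH_PER_MPH if unit == "mph" else posted_value
    return kmh / 3.6

print(target_speed_mps(70, "mph"))    # a 70 mph Michigan freeway: ~31.3 m/s
print(target_speed_mps(100, "km/h"))  # a 100 km/h Ontario highway: ~27.8 m/s
```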


Tesla’s Model 3 and Apple’s iPhone Have a Few Things in Common

As reported by The Verge: The question of whether, and to what extent, cars are like phones has been gently bubbling along over the past few years as we’ve watched the nexus of innovation shifting from the technology we carry in our pocket to that which carries us along the roads. It’s obvious now that cars will experience transformative change like phones did before them, but how many parallels between the two are really there?
If you want to see a company doing its utmost to reduce the complexities of a car down to a familiar phone-like interface, you need look no further than Tesla and its new Model 3. This is the most affordable electric car in Tesla’s stable, and it has the most aggressively stripped-down interior of any manufacturer’s car. There’s a 15-inch touchscreen in the middle of the dash and a couple of buttons on the steering wheel, and that’s it. Given that Apple’s iPhone was the phone that made this “one touchscreen to rule them all” interface paradigm familiar in the first place, I thought it’d be fitting to look at the similarities between the iPhone and this new Model 3, as a proxy for answering how similar cars and phones have become.

BUTTONS BE DAMNED


Nokia 9500 Communicator

Before the iPhone, phones had as strong an affinity for physical keyboards as laptops still do. The first Android prototypes basically looked like BlackBerrys, and the most advanced smartphones from Nokia (like the 9500 Communicator above) were awkward attempts at marrying the familiar with the new. That stage of evolution is where we find ourselves with car interfaces today: embracing new technology and touch interaction, but only partially. Audi’s latest A8 luxury sedan is a good example of the tentative transition away from traditional button interfaces. Like Nokia before it, Audi is obviously struggling to abandon buttons entirely.
Tesla’s Model 3 is as clean a departure from buttons as the original iPhone was. One touchscreen, all your information and interactions on it. You’ll be adjusting everything, right down to the wing mirrors, via that display, though Tesla retains a couple of basic physical controls on the steering wheel, just as Apple did with the iPhone’s home button. In essence, the Model 3 turns the car’s entire human interface into software. It’s alien to us as a car interior, just as it was once alien as a phone interface — how do you speed-dial anyone without buttons? — but Tesla is betting that we’ll adapt to it over time, just as we did with phones.

MAKING TECHNOLOGY MORE AFFORDABLE

It may seem perverse to allege that the iPhone, which has always been presented and perceived as a luxe phone purchase, has been a democratizing device. But if you think of it as lowering the price of an Apple computer from the MacBook’s four figures down to three, then it has indeed widened access to the latest technology. Still an expensive purchase for many, but much less so than previously.
That’s the position of the Tesla Model 3 today: it’s not the cheapest or most practical car you can purchase, but it brings Tesla’s advanced technologies like Autopilot down to their lowest price. Another similarity: both the iPhone and the Model 3 started rolling out slowly and with very limited initial quantities. That might seem coincidental, but it may also be read as evidence of how aggressive each company has been in pushing its technology to the masses.

OVER-THE-AIR SOFTWARE UPDATES

This isn’t solely a Model 3 feature, but Tesla has pioneered over-the-air (OTA) updates to its cars in much the same way that Apple made OTA updates a feature of iPhone ownership. Phone and car software both used to be static, unchanging things, but faster innovation demands faster updates. What is novel about the Model 3 is that it streamlines the software even further by limiting itself to the one screen. Outside of fantastical concepts, this is the closest that a car’s interface has gotten to the single-screen software environment we know from PCs and their mobile counterparts. Imagine how much easier iterating on the Model 3’s user interface will be: software designers will only have to code for one screen instead of the usual multiplicity of screens and physical controls inside cars.
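For illustration only, the OTA pattern described here reduces to a periodic version check against an update server. The endpoint, JSON fields, and version scheme below are hypothetical rather than Tesla's actual protocol:
```python
import json
import urllib.request

INSTALLED_VERSION = (2017, 28, 4)  # hypothetical installed firmware version

def parse_version(text: str) -> tuple:
    return tuple(int(part) for part in text.split("."))

def check_for_update(endpoint: str):
    """Ask a (hypothetical) update server whether newer software exists."""
    with urllib.request.urlopen(endpoint) as response:
        latest = json.load(response)  # e.g. {"version": "2017.32.1", "url": "..."}
    if parse_version(latest["version"]) > INSTALLED_VERSION:
        return latest  # caller stages the download and installs when parked
    return None
```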

Simplification isn’t easy, and Tesla is setting itself a non-trivial challenge in trying to create software equivalents for all the various buttons and dials scattered across a typical car’s interior. But in standardizing around this one display and a consistent hardware platform, the company can refine and improve its offering as fast as any mobile operating system can. This is the truest application of smartphone software development to cars that we’ve yet seen.

CHARISMATIC SALESMAN CEO

Elon Musk at the Tesla Model 3 launch. Photo by Lauren Goode / The Verge

Tesla CEO Elon Musk has a sometimes-goofy presentation style that’s a million miles from Steve Jobs’ polished sales pitch, but it’s undeniable that both have been massively influential ambassadors for their brands. Musk has made Tesla cool; he’s made it a talking point in general conversation. Even with its limited sales so far, Tesla has grown to be a byword for electric vehicles as a whole, much as the iPhone has been for the smartphone category. BMW, Nissan, and many others also make EVs, but it’s only with Tesla that you can say “I’m getting a Tesla” and need to explain nothing more.
The Model 3, and Tesla as a company, find themselves at a decisive, precarious moment. The company needs the faith of its customers as it works to fulfill orders (and overcome any unforeseen stumbles that may arise), and that’s where a charismatic leader can be very helpful. Before Tesla is able to deliver actual cars to people, all it can sell them is a vision, and the 325,000 initial preorders for the Model 3 have shown that Musk is as capable of doing that as Apple’s Jobs was.

POTENTIAL TO CHANGE THE WORLD

Silicon Valley businesses can often seem smug and self-aggrandizing, but they do have a record of producing things that have been culturally and socially transformative on a global scale. Take your pick from the iPhone, Google search, Facebook, or the original silicon chips that gave the area its nickname. It’s no overstatement, then, to say that the Model 3 “could be Tesla’s iPhone moment,” as Recode’s Johana Bhuiyan argues. It could be the new mass-market product that overhauls the entire category it’s entering and resets expectations.
At first, exactly as with the iPhone, the Model 3 is only resetting the interface paradigm, dispensing with buttons in favor of a streamlined touch UI. What we see today is the foundation for what Musk and his team at Tesla want to achieve: the future they envision is one where you wouldn’t worry about being distracted from driving because you wouldn’t have to drive. And when you do choose to put your hands on the wheel, voice controls and automated settings would keep the need for visual distractions to a minimum. All of those are things that Tesla would look to develop over the longer history of the Model 3, much as Apple’s most transformative changes — the App Store and the iSight camera — came in the years after the initial iPhone launch.
It’s certainly too early to know if Tesla will succeed, but if it does, it will be because of the Model 3. Like the iPhone before it, this car breaks with most of the conventions of its category and opts for a distinctly technological approach to a product that has until now been mostly defined by its mechanical qualities.

Friday, July 28, 2017

Forget the Sci-Fi and Embrace the Engineering: Making AI Work for You

From Mark Burnett @ BearingPoint: Artificial Intelligence (AI) is an increasingly essential component in many products and services. If it’s not in your products and services, it may well be in your competitors’. There are many kinds of AI and even more ways of applying them to business and technical problems.

This paper on Artificial Intelligence gives a practical assessment of the state of development of AI and Machine Learning along with examples of its use and practical suggestions for what you need to consider if you want to use AI to enhance your business, products or services.

Advances in computing power, elastic cloud services, and the ability to quickly and cost-effectively deploy thousands of compute instances running neural nets and other kinds of machine learning on big data offer huge potential for automation, prediction, and the generation of insights from patterns in the data that humans fail to see.

This is a paper for those who want to make a difference now. As such, it encourages visionaries and solution designers to set aside, for now, the sci-fi Utopian view of AI as a general human-level intelligence, and to start by embracing the engineering problems of matching the various kinds of AI to the business problems and jobs-to-be-done they are suited for.

This is a call to tool-up, exploit the cloud, understand the different AI frameworks and platforms, and bring in the knowledge and expertise to build the right kinds of AI/ML/cognitive computing to solve business problems in practical future-proof ways that create competitive advantage from the outset.



Thursday, July 27, 2017

Low-Quality Lidar Will Keep Self-Driving Cars in the Slow Lane

As reported by MIT Technology Review: The race to build mass-market autonomous cars is creating big demand for laser sensors that help vehicles map their surroundings. But cheaper versions of the hardware currently used in experimental self-driving vehicles may not deliver the quality of data required for driving at highway speeds.
Most driverless cars make use of lidar sensors, which bounce laser beams off nearby objects to create 3-D maps of their surroundings. Lidar can provide better-quality data than radar and is superior to optical cameras because it is unaffected by variations in ambient light. You’ve probably seen the best-known example of a lidar sensor, produced by market leader Velodyne. It looks like a spinning coffee can perched atop cars developed by the likes of Waymo and Uber.
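The underlying geometry is simple: each laser return is a distance measured along a known beam direction, which converts directly into a 3-D point. A minimal sketch (the axis conventions here are assumed):
```python
import math

def lidar_return_to_point(range_m: float, azimuth_deg: float,
                          elevation_deg: float) -> tuple:
    """Convert one laser return (range plus beam angles) into an x, y, z point."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# One full spin of the sensor yields a huge cloud of such points, which the
# car's software interprets as the 3-D shape of its surroundings.
print(lidar_return_to_point(20.0, 30.0, -2.0))
```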
But not all lidar sensors are created equal. Velodyne, for example, has a range of offerings. Its high-end model is an $80,000 behemoth called the HDL-64E — this is the one that looks a lot like a coffee can. It emits 64 laser beams, stacked one atop the other. Each beam is separated by an angle of 0.4° (smaller angles between beams mean higher resolution), and the sensor has a range of 120 meters. At the other end of the lineup, the firm sells the smaller Puck for $8,000. This sensor uses 16 beams of light, each separated by 2.0°, and has a range of 100 meters.
To see what those numbers mean, look at the video below. It shows raw data from the HDL-64E at the top and the Puck at the bottom. The expensive sensor’s 64 horizontal lines render the scene in detail, while the image produced by its cheaper sibling makes it harder to spot objects until they’re much closer to the car. So although both sensors nominally have a similar range, the Puck’s lower resolution makes it far less useful for identifying obstacles at a distance.
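A quick back-of-the-envelope calculation shows why that resolution gap matters: the vertical spacing between adjacent beams grows linearly with distance (a small-angle approximation, using the beam separations quoted above):
```python
import math

def beam_gap_m(distance_m: float, separation_deg: float) -> float:
    """Approximate vertical spacing between adjacent laser lines at a given range."""
    return distance_m * math.radians(separation_deg)

for name, sep in [("HDL-64E, 0.4 deg", 0.4), ("Puck, 2.0 deg", 2.0)]:
    print(f"{name}: {beam_gap_m(60, sep):.2f} m between beams at 60 m")
# HDL-64E: ~0.42 m, so several beams land on a standing pedestrian.
# Puck:    ~2.09 m, so a pedestrian can slip between two beams entirely.
```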

At 70 miles per hour, spotting an object at, say, 60 meters out provides about two seconds to react. But when traveling at that speed, it can take 100 meters to slow to a stop. A useful range closer to 200 meters is a better target for making autonomous cars truly safe.
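Those figures are easy to sanity-check from first principles; the snippet below reproduces them, assuming a firm braking deceleration of 5 m/s²:
```python
MPH_TO_MPS = 0.44704

speed = 70 * MPH_TO_MPS                             # ~31.3 m/s
detection_range = 60.0                              # meters, as in the example above
deceleration = 5.0                                  # m/s^2, assumed firm braking

time_to_object = detection_range / speed            # ~1.9 s to react
braking_distance = speed ** 2 / (2 * deceleration)  # ~98 m to stop

print(f"time to object: {time_to_object:.1f} s")
print(f"braking distance: {braking_distance:.0f} m")
```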
That’s where cost comes in. Even an $8,000 sensor would be a huge problem for any automaker looking to build a self-driving car that a normal person could afford. Because of this, many sensor makers are readying new kinds of solid-state lidar devices, which use an array of tiny antennas to steer a laser beam electronically instead of mechanically. These devices promise to be easier to manufacture at scale and cheaper than their mechanical brethren. That would make them a palatable option for car companies, many of which are looking to build autonomous cars for the mass market as soon as 2021.
But some of these new solid-state devices may currently lack the fidelity required for self-driving cars to operate safely and reliably at highway speeds.
The French auto parts maker Valeo, for example, claims to have built the world’s first laser scanner for cars that’s ready for high-volume production, the SCALA. It features four lines of data with an angular resolution of 0.8°. Automotive News previously reported that Valeo will provide the lidar sensor used in the new Audi A8, though at the time of writing Audi declined to confirm this and Valeo didn’t respond to a request for details. The new A8 is the first production car to feature lidar and can drive itself — but only in heavy traffic at speeds below 37 miles per hour.
In June, Graeme Smith, chief executive of the Oxford University autonomous driving spinoff Oxbotica, told MIT Technology Review that he thinks a trade-off between data quality and affordability in the lidar sector might affect the rate at which high-speed autonomous vehicles take to the roads. “Low-speed applications may be more affordable more quickly than higher-speed ones,” he explained. “If you want a laser that’s operating over 250 meters, you need a finely calibrated laser. If you’re working in a lower-speed environment and can get by with 15 meters’ range, then you can afford [to use] a much lower-cost sensor.”
Austin Russell, the CEO of lidar startup Luminar, says his company actively chose not to use solid-state hardware in its sensors, because it believes that while mechanically steering a beam is more expensive, it currently provides more finely detailed images that are critical for safe driving. “It doesn't matter how much machine-learning magic you throw at a couple of points [on an object], you can’t know what it is,” he says. “If you only see a target out at 30 meters or so, at freeway speeds that’s a fraction of a second.”
The standard of solid-state devices available for use in vehicles is likely to improve over time, of course. LeddarTech, for instance, is a Canadian firm based in Quebec that specializes in solid-state devices and is producing reference designs that auto parts makers will then use as a model to produce hardware at scale. The firm’s Luc Langlois says that one of its designs, estimated to cost a car company around $75 to produce, will feature either eight or 16 lines and be available in December 2018. A higher-resolution version, with 64 lines and estimated to cost around $100, will follow about a year later.
For its part, Velodyne has promised to build a solid-state lidar device, which John Eggert, director of automotive sales and marketing, says will use 32 laser lines and boast a range of 200 meters—though he won’t elaborate on the resolution provided by the hardware. And Israeli startup Innoviz Technologies claims to be making a $100 unit with a range of 200 meters and an angular resolution of 0.1°. Both firms have promised to put those sensors into production sometime in 2018, though the scale of production and availability remain unknown. Quanergy, a Silicon Valley startup, is building its own $250 solid-state device due to go into production later this year, but at the time of this writing did not respond to multiple requests for detailed specifications.
Oxbotica’s Smith thinks that automakers might just have to wait it out for a cheap sensor that offers the resolution required for high-speed driving. “It will be like camera sensors,” he says. “When we first had camera phones, they were kind of basic cameras. And then we got to a certain point where nobody really cared anymore because there was a finite limit to the human eye.” Makers of autonomous cars might find that lidar sensor performance levels out, too—eventually.

Monday, July 24, 2017

This Image Is Why Self-Driving Cars Come Loaded with Many Types of Sensors

As reported by MIT Technology Review: Autonomous cars often proudly claim to be fitted with a long list of sensors—cameras, ultrasound, radar, LIDAR, you name it. But if you’ve ever wondered why so many sensors are required, look no further than this picture.

You’re looking at what’s known in the autonomous-car industry as an “edge case”—a situation where a vehicle might have behaved unpredictably because its software processed an unusual scenario differently from the way a human would. In this example, image-recognition software applied to data from a regular camera has been fooled into thinking that images of cyclists on the back of a van are genuine human cyclists.

This particular blind spot was identified by researchers at Cognata, a firm that builds software simulators—essentially, highly detailed and programmable computer games—in which automakers can test autonomous-driving algorithms. That allows them to throw these kinds of edge cases at vehicles until they can work out how to deal with them, without risking an accident.

Most autonomous cars overcome issues like the baffling image by using different types of sensing. “LIDAR cannot sense glass, radar senses mainly metal, and the camera can be fooled by images,” explains Danny Atsmon, the CEO of Cognata. “Each of the sensors used in autonomous driving comes to solve another part of the sensing challenge.” By gradually figuring out which data can be used to correctly deal with particular edge cases—either in simulation or in real life—the cars can learn to deal with more complex situations.
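One simple way to see how complementary sensors guard against a fooled camera is a voting or discounting scheme over per-sensor detections. The sketch below is an illustrative toy, not Cognata's or any automaker's actual fusion logic:
```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    label: str         # e.g. "cyclist"
    confidence: float  # 0.0 to 1.0

def fused_confidence(detections, label: str) -> float:
    """Average per-sensor confidence, discounted unless at least two
    independent sensor types agree on the same label."""
    matching = [d for d in detections if d.label == label]
    if not matching:
        return 0.0
    mean = sum(d.confidence for d in matching) / len(matching)
    sensor_types = {d.sensor for d in matching}
    return mean * min(1.0, len(sensor_types) / 2)

# The printed cyclists: only the camera fires, while radar and lidar report
# a flat surface, so the fused score stays low.
print(fused_confidence([Detection("camera", "cyclist", 0.9)], "cyclist"))  # 0.45
```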


Tesla was criticized for its decision to use only radar, camera, and ultrasound sensors to provide data for its Autopilot system after one of its vehicles failed to distinguish a truck trailer from a bright sky and ran into it, killing the Tesla’s driver. Critics argue that LIDAR is an essential element in the sensor mix — it works well in low light and glare, unlike a camera, and provides more detailed data than radar or ultrasound. But as Atsmon points out, even LIDAR isn’t without its flaws: it can’t tell the difference between a red and a green traffic signal, for example.

The safest bet, then, is for automakers to use an array of sensors, in order to build redundancy into their systems. Cyclists, at least, will thank them for it.