
Wednesday, August 16, 2017

Tesla's Upcoming Electric Semi Truck Will Be Able to Drive Itself

Autopilot, big rig style.
As reported by The Verge: Tesla CEO Elon Musk has been teasing an electric semi truck for a while now, ahead of an official unveiling this fall. But a report in Reuters adds a new, if somewhat unsurprising, wrinkle to the mix: the Tesla big rig is probably going to have self-driving capabilities.
Reuters has seen emails between Tesla and the Nevada DMV where the two sides discussed “potential road tests” of the truck’s self-driving capabilities. The information also apparently describes Tesla’s desire to create long-haul electric semis that can drive themselves in “platoons,” potentially following behind a lead truck piloted by a human driver.
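Platooning control itself is well studied: each follower tries to hold a fixed time gap behind the vehicle ahead. Here is a minimal sketch of a constant-time-gap follower controller; the function name, gains, and numbers are illustrative assumptions, not anything Tesla has disclosed.

```python
def follower_accel(gap_m, v_follow, v_lead, time_gap_s=0.8,
                   kp=0.1, kv=1.0):
    """Toy constant-time-gap platoon controller (illustrative only).

    Commands an acceleration that closes the error between the actual
    gap and the desired gap (time_gap_s * follower speed), while also
    matching the lead vehicle's speed. Gains kp/kv are arbitrary.
    """
    desired_gap = time_gap_s * v_follow
    return kp * (gap_m - desired_gap) + kv * (v_lead - v_follow)

# Follower at 25 m/s, 30 m behind a lead truck also doing 25 m/s:
# the desired gap is 20 m, so the controller commands a mild
# acceleration to close the extra 10 m.
print(follower_accel(30.0, 25.0, 25.0))  # -> 1.0
```

A production platoon controller would add string-stability guarantees and vehicle-to-vehicle communication of the leader's braking, but the gap-plus-speed-error structure above is the core idea.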
The idea that Tesla is working on incorporating self-driving technology into its upcoming semi truck falls in line with how aggressive the company has been at building the same tech into its consumer cars. Tesla offers semi-autonomous features on all of its current models in the form of Autopilot, which costs an additional $5,000 at the time of purchase. It also offers a $3,000 “full self-driving” option, which the company says will be activated once the software is ready. (Tesla claims that all of its cars are already equipped with the hardware necessary for full autonomy.)
What is surprising is that Tesla appears ready to test this technology. The state of Nevada is a likely partner, as it’s one of the few in the country that actually gives out licenses for autonomous vehicle testing. It was also the first state to allow self-driving big rigs to test in 2015 when Daimler acquired two AV licenses for its own Freightliner Inspiration Truck. Volvo is working on adding autonomous capabilities to its own trucking fleet, too.
Autonomy is also a common theme for the Silicon Valley companies that have dipped into the trucking world. In 2016, Uber acquired self-driving truck company Otto, which was led by a former high-profile employee of Google’s own self-driving project. (That employee is now currently at the center of a legal battle between Google’s parent company and Uber.) And Waymo, the company that blossomed out of Google’s self-driving car project, is working on its own self-driving truck program.
Tesla is planning an official reveal of the semi truck in September, so that’s when we’ll likely learn just how far the company wants to push this new part of its self-driving ambitions.

Monday, August 7, 2017

You Can Confuse Self-Driving Cars by Altering Street Signs

As reported by Engadget: While car makers and regulators are mostly worried about the possibility of self-driving car hacks, University of Washington researchers are concerned about a more practical threat: defacing street signs. They've learned that it's relatively easy to throw off an autonomous vehicle's image recognition system by strategically placing stickers on street signs. If attackers know how a car classifies the objects it sees (for example, from target photos of signs), they can generate stickers that trick the car into believing a sign really means something else. For instance, the "love/hate" graphics above made a computer vision algorithm believe a stop sign was really a speed limit notice.
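The weakness being exploited here is the one behind "adversarial examples": a small, deliberately chosen perturbation can flip a classifier's output. The researchers' actual method (robust physical perturbations printed as stickers) is more involved, but the core idea can be sketched with a gradient-sign attack on a toy linear classifier. Everything below, from the random model to the epsilon search, is illustrative and is not the researchers' code.

```python
import numpy as np

# Toy linear classifier: scores = W @ x, prediction = argmax.
# This sketches a gradient-sign attack, not the paper's actual
# sticker-optimization pipeline.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 16))      # 3 "sign classes", 16-pixel image
x = rng.normal(size=16)           # the clean sign image

def classify(x):
    return int(np.argmax(W @ x))

true_class = classify(x)
target = (true_class + 1) % 3     # class the attacker wants

# For a linear model, the gradient of (target score - true score)
# with respect to the input is just the difference of weight rows.
grad = W[target] - W[true_class]

# Grow a bounded sign-of-gradient perturbation (the "sticker")
# until the prediction flips away from the true class.
eps, x_adv = 0.0, x.copy()
while classify(x_adv) == true_class:
    eps += 0.05
    x_adv = x + eps * np.sign(grad)

print(true_class, classify(x_adv), round(eps, 2))
```

The point mirrors the researchers' finding: the perturbation needed to change the answer can be small relative to the image itself, which is exactly why a few well-placed stickers suffice.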

It's easy to see the potential problems. You could make these stickers using a printer at home, so anyone from dedicated attackers to pranksters could try this. It might lead to a crash the moment someone alters the sign, but it could also produce long-term chaos -- picture your city closing a road until maintenance crews can scrape the stickers off a sign.

There are ways to fight this. The research team suggests using contextual information to verify that a sign is accurate. Why would you have a stop sign on the highway, or a high speed limit on a back road? We'd add that local governments could also install signs that use an anti-stick material, or put them out of reach. Whatever happens, something will have to change if passengers are going to trust self-driving cars' sign-reading abilities.

Of course another alternative would be to teach the AI systems to recognize sarcasm when they see it.

Thursday, August 3, 2017

Lockheed to Build $350M Satellite Factory Near Denver

As reported by the Gazette: Aerospace giant Lockheed Martin Corp. said Wednesday it plans to spend $350 million building a production facility for satellites at its Waterton Canyon campus in the Denver area that is designed to boost production capacity, reduce costs and accommodate a growing workforce.
Construction is scheduled to start by the end of the month and be completed in 2020 on a 266,000 square-foot building named Gateway Center that will be one of the largest production facilities on the campus. The facility will be large enough for the company to build five of its flagship A2100 satellites at the same time.
The new project is also designed to reassert the company in a sector that has become a lot more competitive, so-called microsatellites, which are often just a few meters wide.
U.S. Air Force officials talked in April at the Space Symposium in Colorado Springs about smaller, Swiss-Army-Knife-style microsatellites with multiple capabilities that can be quickly rocketed to the heavens to replace bigger satellites damaged in war.
Space "is a big place, but we plan to be a leader in it," said Rick Ambrose, executive vice president for Lockheed Martin's space systems business. "We're trying to develop a capability to handle any direction the market moves."
Plans for the new facility call for manufacturing, assembling and testing centers that are large enough to keep multiple production lines running at the same time, allowing the facility to produce large satellites as it is churning out smaller ones.
Lockheed executives hope this will allow them to compete in the small-satellite market while keeping legacy production lines going, all under one roof.
By putting assembly and testing under the same roof, employees can simply roll a satellite down the hall rather than moving it between buildings for testing. By eliminating the steps of packing the satellite, moving it and unpacking it, the company expects to cut testing time from about two days to under an hour.
The building also will include a thermal vacuum chamber, meant to simulate the harsh environment of space, that will be used to test fully assembled satellites before they are attached to rockets. The facility is to include a stock of 3-D printers to produce the satellites' simpler components.
The construction project will create 1,500 jobs over three years. Lockheed Martin currently has about 8,600 employees in Colorado, including 4,000 at the Waterton Canyon complex. The company has added more than 750 jobs to its Colorado workforce since 2014 and has 350 job openings in Denver. In April, the Colorado Economic Development Commission approved $12.6 million in tax credits for the company in return for bringing 550 new jobs to Jefferson County over eight years.
The project will bolster Colorado's reputation as a leader in the space industry. Colorado has the nation's second-largest aerospace economy, employing 188,280 people across more than 400 companies, with a $3.4 billion annual payroll, according to the Colorado Space Coalition. Gov. John Hickenlooper said at the April Space Symposium that the state has ambitions to take over the top space economy ranking.
Besides Lockheed, seven other top aerospace contractors have significant operations in the state, including Boeing and Raytheon. Colorado is also home to the Air Force Space Command in Colorado Springs and the University of Colorado's Laboratory for Atmospheric and Space Physics in Boulder.
Lockheed has played a role in space operations since the 1950s, when its rockets were used to launch some of the first American satellites into orbit. Satellites remain a critical part of the company's business.
Lockheed's business is being challenged by a cadre of well-heeled newcomers that have pledged to populate the lowest rung of Earth's orbit with microsatellites.
The global telecommunications investment firm SoftBank is spending $1.7 billion to combine Luxembourg-based Intelsat with Richard Branson-backed OneWeb, creating a satellite behemoth that seems singularly focused on smaller, cheaper models. Elon Musk's SpaceX is working to launch 4,425 small satellites into low Earth orbit by 2024.
These competitors contend that their microsatellites can provide a better signal in more places.
Then there is the Air Force's interest.
Air Force Space Command has been eyeing small satellites for years and more recently has taken serious steps to integrate them into future war plans.
While less capable than the school-bus sized spacecraft the military now has in orbit, the diminutive birds are cheaper to build and faster to launch than their complicated cousins.
With nations including Russia, China, Iran and North Korea developing or possessing anti-satellite capabilities, the small satellites could be used as stopgap measures to fill in for larger spacecraft damaged or destroyed in a war that reaches orbit, Space Command boss Gen. Jay Raymond has said.
It remains to be seen how Lockheed will fare. It specializes in taking on the kind of big projects that few companies have the scale to complete. The satellites Lockheed makes circle the planet much farther outside Earth's atmosphere, giving them a broader view of the planet's surface. That means they have to be much more powerful, entailing longer production times and more expensive components.

Wednesday, August 2, 2017

What are the Scary Ripple Effects of Self-Driving Vehicles?

As reported by ReadWrite: By 2040, we think Level 5 autonomous mobility will be available as a service for the majority of urban consumers' transportation needs. In other words, 70% of the urban population wouldn't need to own cars because cars would be available on demand through their favorite app.

Although several power players like General Motors and Ford are promising fully autonomous cars sooner, there are many technological and large-scale regulatory and consumer sentiment issues with autonomous cars that would need to be addressed before they could fulfill our transportation needs.

But imagine the world in 2040, when most of the population doesn’t need cars. Everything from shopping to commuting to long distance road trips will be addressed by fully autonomous vehicles that you can summon with the push of a button. 

Rethinking today’s products and services in the 2040 world

Once you accept that basic premise, it is surprising to see how so many of our current products and services do not fit well with the world of the fully autonomous car. When we shift from complete ownership to an on-demand leasing model, automotive OEMs and their supply chains (Tier 1 and Tier 2) will be the first ones affected. Several of them are thinking of strategies to grapple with this eventual reality, but this is just the tip of the iceberg. A significant number of multibillion dollar companies operate around consumers owning at least one to two cars per family. All these products and services will either become obsolete or have to be fundamentally rethought.

15 minutes for an insurance product we don’t need?

Let us start with the way we buy cars — at dealerships. Once consumers stop buying cars, there is no need for dealership networks. They are merely a distribution channel that will be replaced by an app that we use to hail our cars.

Then, there’s insurance; we get insurance as soon as we buy a car. Because we will no longer own cars, there is no need to spend 15 minutes to save on car insurance. Accidents because of systemic failures like poor connectivity or algorithmic edge cases are inevitable, but the massive reduction in the frequency of accidents coupled with insurance being bundled into the mobility service will result in an enormous reduction in revenue for the insurance industry.

Car loans, which finance our car ownership, will also be out. In the 2040 world, mobility service providers will end up owning most of the cars. Even if you assume one or two ride hailing apps will dominate the market, there are economies of scale in owning and operating large autonomous car networks. As a result, we believe the ultimate owner of the cars will be large businesses, not individual car owners who would lease their cars on ride sharing platforms. These large fleet owners would have a low cost of capital with which to finance their vehicle purchases. Through both a massive reduction in the number of cars and the APR they can charge for each car loan, auto loan providers will see a huge downsizing of their market.

First stop on the road to oblivion: gas stations

Once we buy our cars, we spend a lot of money on them. Let us think about these products and their relevance in the 2040 world.

The first and most obvious car-related expense is gas. In a world of operating autonomous car fleets, we don't need as many gas stations. A few large gas stations far away from high-density areas, perhaps operated, once again, by large fleet owners, should suffice. These cars would essentially act like public transportation vehicles, refueled at the end of each day at a central location. Fewer miles traveled, thanks to ride-sharing, combined with more fuel-efficient cars, will certainly be a cause for concern for oil companies.

Parking is another common expense with a poor consumer experience. The ubiquity of autonomous ride-sharing cars will leave us with fewer parking lots, located away from high-density areas, where mobility service providers can park their fleets when demand is low. Close to half of our urban areas are dedicated to parking, and therefore a massive reduction in parking lots will be a boon to our cities.

Every big technological change has massive unforeseeable consequences

At conception, the Internet was a platform only for email. But its unintended consequences ended up creating companies like Amazon, Google, and Facebook, which have changed the way we buy and consume media. In the same vein, we believe autonomous mobility will fundamentally change the way we live in our cities. So much of our urban real estate is tied to the car-ownership model of today: parking lots, gas stations, dealerships.

No one can foresee the new products or services that could emerge from such surplus real estate becoming available or the effects it will have on our housing markets. Will people live much farther from cities because commuting in an autonomous car could be productive, or would urban housing become cheaper because we don’t need parking lots anymore?

And as for my personal favorite unintended consequence, short-distance flights: would we rather take a flight from San Francisco to Los Angeles, or would we prefer to get there in swankier-than-business-class autonomous vehicles that come with plush beds and Netflix? I am extremely confident that many more positive, albeit unintended, consequences of autonomous cars will emerge en route to 2040.

So why is this scary?

The 2040 world of autonomous mobility is scary because so many of today’s products and services would have to radically evolve to stay relevant. And it is not just the automotive OEMs that are in trouble — auto insurance companies, car loan providers, oil and gas companies, car dealerships, parking lot owners, and auto parts suppliers and stores are all on the chopping block. Just this foreseeable disruption alone is worth $2 trillion in terms of products and services we consume today. If these companies are affected, it will set off a chain reaction of problems for suppliers, which will trigger panic.

This $2 trillion will be reshuffled and distributed to consumers, new companies and incumbents. And that is scary. Some of the best incumbent players in all the industries highlighted above are already thinking ahead to prepare for and adapt to the new reality. But many others will likely not survive this disruption. That's scary, too.

If incumbent companies want to survive in this changing landscape, they must prepare for the 2040 future now, partnering with startups and augmenting their organizations with change makers who can imagine the future well before it arrives. But for the startup founders and venture capitalists involved in the industry, the evolution of autonomous cars is an enormous and exciting opportunity that has the potential to create multiple $1 billion technology companies. Uber and Lyft are just the beginning.

Tuesday, August 1, 2017

China: Keeping Track of Technology for Self-Driving Vehicles

As reported by the Shanghai Daily: The next generation of cars, already on the drawing boards, requires considerable time, research and testing before these vehicles become commonplace on the roads.
To place itself at the vanguard of “smart cars,” Shanghai created a test ground known as the National Intelligent Connected Vehicle Shanghai Pilot Zone — a 100 square-kilometer site in the Jiading District that is the first of its kind in China.
The name is a bit of a mouthful, but it simply means a demonstration site where the latest innovations in smart car technology can be tested under real-life conditions.
It’s all part of China’s ambitious plan to become a world leader in a new generation of cars that operate on digital intelligence systems, including self-driving. The plan’s target for completion is the end of 2025. It’s also part of the nation’s program to upgrade its prime industries to compete in a changing world.
The pilot zone in Shanghai, opened in June last year, has been a testing site for more than 10 major car makers and auto-parts companies, including SAIC, General Motors, Volvo, Ford, BMW, NIO and Delphi. The testing covers a range of technologies, including self-driving capability, sensor performance, lane adherence and high-definition map positioning.
“Companies are actively participating in the development of the pilot zone,” said Rong Wenwei, general manager of Shanghai International Automobile City, which oversees the zone. “What we are building is a platform for companies to accelerate development of the industry.”
Ford Motor Co, for one, is testing its driver-assisted technologies, including left turn assist and traffic light optimal speed advisory. The US company has said it hopes both features will figure in its next-generation vehicles.
“The Shanghai International Automobile City provides a setting where we are able to develop, test and refine future connected vehicle technologies,” said Trevor Worthington, vice president of product development at Ford Asia Pacific.
Left turn assist is a feature that uses information exchanged between vehicles to alert drivers of oncoming traffic when making a left turn.
Traffic light optimal speed advisory technology connects a vehicle to road infrastructure, informing a driver of the best speed to reduce the waiting time at stoplights by monitoring data from roadway devices. Ford said that the advisory could help drivers avoid red lights, thus reducing travel time by up to 20 percent.
“Imagine your daily commute with less waiting time for red lights,” said Thomas Lukaszewicz, manager of automated driving for Europe and China at Ford. “Vehicle-to-infrastructure technology under development holds great promise to make commuting smoother and less time consuming.”
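The arithmetic behind a speed advisory like this is straightforward: a green window in time maps to a window of feasible speeds. Below is a hedged sketch; the function name, speed bounds, and signal-timing values are invented for illustration, since Ford has not published its algorithm.

```python
def green_wave_speed(distance_m, t_green_start, t_green_end,
                     v_min=5.0, v_max=16.7):
    """Pick a speed (m/s) that reaches the light while it is green.

    distance_m: distance to the stop line
    t_green_start / t_green_end: seconds from now during which
        the light will be green (from roadway infrastructure data)
    v_min / v_max: practical/legal bounds (16.7 m/s is ~60 km/h)
    Returns a feasible speed, or None if no legal speed arrives
    during the green window.
    """
    # Arriving at time t requires speed distance_m / t, so the green
    # window [t_start, t_end] maps to a speed window [lo, hi].
    lo = distance_m / t_green_end               # latest arrival
    hi = distance_m / max(t_green_start, 1e-9)  # earliest arrival
    lo, hi = max(lo, v_min), min(hi, v_max)
    if lo > hi:
        return None          # must stop and wait for the next cycle
    return hi                # fastest feasible speed in the window

# 300 m from a light that turns green in 20 s and red again at 45 s:
print(green_wave_speed(300, 20, 45))  # -> 15.0
```

A real system would also weigh fuel use and comfort when choosing within the window; here we simply return the fastest feasible speed.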
Delphi, the global auto parts supplier, has demonstrated its self-driving technologies in the zone. The company applied nine radar units and cameras in the autonomous car.
The radar is used to capture information about the surrounding environment of a vehicle. The camera is used to capture the road conditions in front of the vehicle.
“Chinese consumers are willing to embrace new solutions, especially Internet- and tech-based autonomous driving,” said David Paja, president of electronics and safety at Delphi.
Delphi said it has achieved rapid progress on advanced driver assistance technology in China in the past five years.
“We are actively in discussion with several original equipment manufacturers in the local market,” said Frank Wang, president of the electronics and safety business for Asia Pacific at Delphi.
In addition to the vehicle testing, the pilot zone is a useful tool in raising public awareness about what’s coming next in the driving experience. An area of the zone has been set aside for public education. It offers a 70-minute tour on what it will be like to drive intelligent, connected cars. Visitors can also take rides in self-driving cars, including the Tesla Model X, Volvo XC90 and Volvo S90.
“If I get the chance to visit the pilot zone, I want to take a ride in a self-driving car,” said Wang Xing, a student from Tongji University. “I am curious about autonomous driving technology and want to see how it works.”
The pilot zone has itself a series of goals for the next three years. By 2020, the zone plans to host more than 1,000 intelligent connected vehicles.
The zone also aims to shoulder at least five national major projects and develop more than 10 industry standards related to intelligent and connected vehicles. It is on track to recruit more than 100 top professionals and serve as an incubator to 30 innovative startups in the three-year period.
As a hub of domestic research and development, it also plans to strengthen work relationships among companies, universities and institutions.
Shanghai International Automobile City said it has signed agreements with more than 20 companies to conduct 100 research projects. It also plans to hold more than 50 professional seminars on intelligent and connected vehicles this year, focusing on testing technologies, standards and big data.

Monday, July 31, 2017

Self-Driving Car Demo is the First to Cross the US-Canada Border

As reported by Engadget: As a rule, self-driving car tests tend to be limited to the country where they started. But that's not how people drive -- what happens when your autonomous vehicle crosses the border? Continental and Magna intend to find out: they're piloting two driverless vehicles all the way from southeastern Michigan to Sarnia, Ontario, making this the first cross-border test of its kind. The machines won't be in complete control for the entire route, but they'll use a combination of cameras, LiDAR and radar to take over when they can, including at two key border crossings (the Detroit-Windsor Tunnel and the Blue Water Bridge).

This isn't the first autonomous driving-related agreement involving Michigan and Ontario, but it's an important one: it'll explore rules and regulations in addition to the usual self-driving data collection.

As you might guess, tests like this will be vital to making autonomy a practical reality. Driverless vehicles need to know how to adapt to changing road rules, such as different signage and units of measurement. While this isn't the greatest challenge, it has to be overcome if you're ever going to embark on cross-border shopping trips without touching your steering wheel.

Tesla’s Model 3 and Apple’s iPhone Have a Few Things in Common

As reported by The Verge: The question of whether, and to what extent, cars are like phones has been gently bubbling along over the past few years as we’ve watched the nexus of innovation shifting from the technology we carry in our pocket to that which carries us along the roads. It’s obvious now that cars will experience transformative change like phones did before them, but how many parallels between the two are really there?
If you want to see a company doing its utmost to reduce the complexities of a car down to a familiar phone-like interface, you need look no further than Tesla and its new Model 3. This is the most affordable electric car in Tesla’s stable, and it has the most aggressively stripped-down interior of any manufacturer's car. There’s a 15-inch touchscreen in the middle of the dash and a couple of buttons on the steering wheel and that’s it. Given how Apple’s iPhone was the phone that made this “one touchscreen to rule them all” interface paradigm familiar in the first place, I thought it’d be fitting to look at the similarities between the iPhone and this new Model 3, as a proxy for answering how similar cars and phones have become.


Nokia 9500 Communicator

Before the iPhone, phones had as strong an affinity to physical keyboards as laptops still do. The first Android prototypes basically looked like BlackBerrys, and the most advanced smartphones from Nokia (like the 9500 Communicator above) were awkward attempts at marrying the familiar with the new. That stage of evolution is where we find ourselves with car interfaces today: embracing new technology and touch interaction, but only partially. Audi’s latest A8 luxury sedan is a good example of the trepidatious transition away from the traditional button interfaces. Like Nokia before it, Audi is obviously struggling to abandon buttons entirely.
Tesla’s Model 3 is as clean a departure from buttons as the original iPhone was. One touchscreen, all your information and interactions on it. You’ll be adjusting everything, right down to the wing mirrors, via that display, though Tesla retains a couple of basic physical controls on the steering wheel just as Apple did with the iPhone’s home button. In essence, the Model 3 turns the car’s entire human interface into software. It’s alien to us as a car interior, just as it was once alien as a phone interface (how do you speed-dial anyone without buttons?), but Tesla is betting that we’ll adapt to it over time just as we did with phones.


It may seem perverse to allege that the iPhone, which has always been presented and perceived as a luxe phone purchase, has been a democratizing device. But if you think of it as lowering the price of an Apple computer from the MacBook’s four figures down to three, then it has indeed widened access to the latest technology. Still an expensive purchase for many, but much less so than previously.
That’s the position of the Tesla Model 3 today: it’s not the cheapest or most practical car you can purchase, but it brings Tesla’s advanced technologies like Autopilot down to their lowest price. Another similarity: both the iPhone and the Model 3 started rolling out slowly and with very limited initial quantities. That might seem coincidental, but it may also be read as evidence of how aggressive each company has been in pushing its technology to the masses.


This isn’t solely a Model 3 feature, but Tesla has pioneered over-the-air (OTA) updates to its cars much in the same way that Apple made OTAs a feature of iPhone ownership. Phone and car software both used to be static, unchanging things, but with faster innovation, fast updates are required. What is novel about the Model 3 is that it streamlines the software even further by limiting itself to the one screen. Outside of fantastical concepts, this is the closest that a car’s interface has gotten to the single-screen software environment we know from PCs and their mobile counterparts. Imagine how much easier iterating on the Model 3’s user interface will be: software designers will only have to code for one screen instead of the usual multiplicity of screens and physical controls inside cars.

Simplification isn’t easy, and Tesla is setting itself a non-trivial challenge in trying to create software equivalents for all the various buttons and dials scattered across a typical car’s interior. But in standardizing around this one display and a consistent hardware platform, the company can refine and improve its offering as fast as any mobile operating system can. This is the truest application of smartphone software development to cars that we’ve yet seen.


Elon Musk at the Tesla Model 3 launch
 Photo by Lauren Goode / The Verge

Tesla CEO Elon Musk has a sometimes-goofy presentation style that’s a million miles from Steve Jobs’ polished sales pitch, but it’s undeniable that both have been massively influential ambassadors for their brands. Musk has made Tesla cool; he’s made it a talking point in general conversation. Even with its so far limited sales, Tesla has grown to be a byword for electric vehicles as a whole, much in the same way as the iPhone has been for the smartphone category. BMW, Nissan, and many others are also making EVs, but it’s only with Tesla that you can say “I’m getting a Tesla” and need to explain nothing more beyond that.
The Model 3, and Tesla as a company, finds itself in a very decisive, precarious moment. The company needs to have the faith of its customers as it works to fulfill orders (and overcome any unforeseen stumbles that may arise) and that’s where a charismatic leader can be very helpful. Before Tesla is able to deliver actual cars to people, all it can sell them is a vision, and the 325,000 initial preorders for the Model 3 have shown that Musk is as capable of doing that as Apple’s Jobs was.


Silicon Valley businesses can often seem smug and self-aggrandizing, but they do have a record of producing things that have been culturally and socially transformative on a global scale. Take your pick from the iPhone, Google search, Facebook, or the original silicon chips that gave the area its nickname. It’s no overstatement, then, to say that the Model 3 “could be Tesla’s iPhone moment,” as Recode’s Johana Bhuiyan argues. It could be the new mass-market product that overhauls the entire category it’s entering and resets expectations.
At first, exactly as with the iPhone, the Model 3 is only resetting the interface paradigm by dispensing with buttons in favor of a streamlined touch UI. What we see today is the foundation for what Musk and his team at Tesla want to achieve: the future they envision is one where you wouldn’t worry about being distracted from driving because you wouldn’t have to drive. And when you do choose to put your hands on the wheel, voice controls and automated settings would keep the need for visual distractions to a minimum. All those are things that Tesla would look to develop over the longer history of the Model 3, much as Apple’s most transformative changes, the App Store and the iSight camera, came in the years after the initial iPhone launch.
It’s certainly too early to know if Tesla will succeed, but if it does, it will be because of the Model 3. Like the iPhone before it, this car breaks with most of the conventions of its category and opts for a distinctly technological approach to a product that has until now been mostly defined by its mechanical qualities.

Friday, July 28, 2017

Forget the Sci-Fi and Embrace the Engineering: Making AI Work for You

From Mark Burnett @ BearingPoint: Artificial Intelligence (AI) is an increasingly essential component in many products and services. If it's not in your products and services, it may well be in your competitors'. There are many kinds of AI and even more ways of applying it to business and technical problems.

This paper on Artificial Intelligence gives a practical assessment of the state of development of AI and Machine Learning along with examples of its use and practical suggestions for what you need to consider if you want to use AI to enhance your business, products or services.

Advances in computer power, elastic cloud services, and the ability to quickly and cost-effectively deploy thousands of compute instances running neural nets and other kinds of machine learning on big data offer huge potential for automation, prediction, and the generation of insights from patterns in the data that humans fail to see.

This is a paper for those wanting to find a way to make a difference now and as such, it encourages visionaries and solution designers to forget the sci-fi Utopian view of AI as a general human level intelligence for now and start by embracing the engineering problems of matching the various kinds of AI to the business problems and jobs-to-be-done they are suited for.

This is a call to tool-up, exploit the cloud, understand the different AI frameworks and platforms, and bring in the knowledge and expertise to build the right kinds of AI/ML/cognitive computing to solve business problems in practical future-proof ways that create competitive advantage from the outset.

Thursday, July 27, 2017

Low-Quality Lidar Will Keep Self-Driving Cars in the Slow Lane

As reported by MIT Technology Review: The race to build mass-market autonomous cars is creating big demand for laser sensors that help vehicles map their surroundings. But cheaper versions of the hardware currently used in experimental self-driving vehicles may not deliver the quality of data required for driving at highway speeds.
Most driverless cars make use of lidar sensors, which bounce laser beams off nearby objects to create 3-D maps of their surroundings. Lidar can provide better-quality data than radar and is superior to optical cameras because it is unaffected by variations in ambient light. You’ve probably seen the best-known example of a lidar sensor, produced by market leader Velodyne. It looks like a spinning coffee can perched atop cars developed by the likes of Waymo and Uber.
But not all lidar sensors are created equal. Velodyne, for example, has a range of offerings. Its high-end model is an $80,000 behemoth called the HDL-64E—this is the one that looks a lot like a coffee can. It emits 64 laser beams, stacked one atop the other. Each beam is separated by an angle of 0.4° (smaller angles between beams equal higher resolution), with a range of 120 meters. At the other end of the range, the firm sells the smaller Puck for $8,000. This sensor uses 16 beams of light, each separated by 2.0°, and has a range of 100 meters.
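To see why the angular separation between beams matters so much, it helps to translate it into physical distance. As a rough illustration (not from the article), the vertical gap between adjacent beams at a given range follows from simple trigonometry:

```python
import math

def beam_spacing(separation_deg: float, range_m: float) -> float:
    """Approximate vertical gap between adjacent lidar beams
    at a given distance: gap = range * tan(angular separation)."""
    return range_m * math.tan(math.radians(separation_deg))

# Using the article's figures: 0.4° separation (HDL-64E) vs 2.0° (Puck)
for name, sep in [("HDL-64E (0.4°)", 0.4), ("Puck (2.0°)", 2.0)]:
    gap = beam_spacing(sep, 100.0)
    print(f"{name}: ~{gap:.2f} m between beams at 100 m")
```

At 100 meters, the Puck’s beams are spread roughly 3.5 meters apart vertically—taller than a car—so a distant obstacle may be clipped by only a single beam, while the HDL-64E’s beams sit about 0.7 meters apart and paint the same object with several returns.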
To see what those numbers mean, look at the video below. It shows raw data from the HDL-64E at the top and the Puck at the bottom. The expensive sensor’s 64 horizontal lines render the scene in detail, while the Puck’s lower resolution makes it hard to identify obstacles until they are much closer to the vehicle—even though both sensors nominally have a similar range.

At 70 miles per hour, spotting an object 60 meters out gives the vehicle about two seconds to react. But at that speed, it can take around 100 meters to brake to a stop. A useful range closer to 200 meters is a better target for making autonomous cars truly safe.
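The arithmetic behind those figures is straightforward; a minimal sketch (the helper name is ours, not from the article) converts speed and detection range into reaction time:

```python
def seconds_to_react(detection_range_m: float, speed_mph: float) -> float:
    """Time between detecting an obstacle and reaching it
    at constant speed (mph converted to meters per second)."""
    speed_ms = speed_mph * 1609.344 / 3600  # 1 mile = 1609.344 m
    return detection_range_m / speed_ms

print(f"{seconds_to_react(60, 70):.1f} s to react at 60 m")    # roughly the article's two seconds
print(f"{seconds_to_react(200, 70):.1f} s to react at 200 m")  # a far more comfortable margin
```

Extending detection from 60 to 200 meters more than triples the available reaction time at highway speed, which is why range is treated as a safety-critical specification rather than a nice-to-have.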
That’s where cost comes in. Even an $8,000 sensor would be a huge problem for any automaker looking to build a self-driving car that a normal person could afford. Because of this, many sensor makers are readying new kinds of solid-state lidar devices, which use an array of tiny antennas to steer a laser beam electronically instead of mechanically. These devices promise to be easier to manufacture at scale and cheaper than their mechanical brethren. That would make them a palatable option for car companies, many of which are looking to build autonomous cars for the mass market as soon as 2021.
But some of these new solid-state devices may currently lack the fidelity required for self-driving cars to operate safely and reliably at highway speeds.
The French auto parts maker Valeo, for example, claims to have built the world’s first laser scanner for cars that’s ready for high-volume production, the SCALA. It features four lines of data with an angular resolution of 0.8°. Automotive News previously reported that Valeo will provide the lidar sensor used in the new Audi A8, though at the time of writing Audi declined to confirm this and Valeo didn’t respond to a request for details. The new A8 is the first production car to feature lidar and can drive itself—but only in heavy traffic at speeds below 37 miles per hour.
In June, Graeme Smith, chief executive of the Oxford University autonomous driving spinoff Oxbotica, told MIT Technology Review that he thinks a trade-off between data quality and affordability in the lidar sector might affect the rate at which high-speed autonomous vehicles take to the roads. “Low-speed applications may be more affordable more quickly than higher-speed ones,” he explained. “If you want a laser that’s operating over 250 meters, you need a finely calibrated laser. If you’re working in a lower-speed environment and can get by with 15 meters’ range, then you can afford [to use] a much lower-cost sensor.”
Austin Russell, the CEO of lidar startup Luminar, says his company actively chose not to use solid-state hardware in its sensors, because it believes that while mechanically steering a beam is more expensive, it currently provides more finely detailed images that are critical for safe driving. “It doesn't matter how much machine-learning magic you throw at a couple of points [on an object], you can’t know what it is,” he says. “If you only see a target out at 30 meters or so, at freeway speeds that’s a fraction of a second.”
The standard of solid-state devices available for use in vehicles is likely to improve over time, of course. LeddarTech, for instance, is a Canadian firm based in Quebec that specializes in solid-state devices and is producing reference designs that auto parts makers will then use as a model to produce hardware at scale. The firm’s Luc Langlois says that one of its designs, estimated to cost a car company around $75 to produce, will feature either eight or 16 lines and be available in December 2018. A higher-resolution version, with 64 lines and estimated to cost around $100, will follow about a year later.
For its part, Velodyne has promised to build a solid-state lidar device, which John Eggert, director of automotive sales and marketing, says will use 32 laser lines and boast a range of 200 meters—though he won’t elaborate on the resolution provided by the hardware. And Israeli startup Innoviz Technologies claims to be making a $100 unit with a range of 200 meters and an angular resolution of 0.1°. Both firms have promised to put those sensors into production sometime in 2018, though the scale of production and availability remain unknown. Quanergy, a Silicon Valley startup, is building its own $250 solid-state device due to go into production later this year, but at the time of this writing did not respond to multiple requests for detailed specifications.
Oxbotica’s Smith thinks that automakers might just have to wait it out for a cheap sensor that offers the resolution required for high-speed driving. “It will be like camera sensors,” he says. “When we first had camera phones, they were kind of basic cameras. And then we got to a certain point where nobody really cared anymore because there was a finite limit to the human eye.” Makers of autonomous cars might find that lidar sensor performance levels out, too—eventually.