Friday, March 20, 2015

Startup Promises Business Insights from Satellite Images Using 'Deep Learning' AI

Orbital Insight is using deep learning to find financially useful information in aerial imagery. For instance, the company's software can identify objects such as crude oil storage containers in satellite photos.
As reported by MIT Technology Review: The next time you drive to Home Depot you may help a Wall Street firm decide whether it should invest in the company. A startup called Orbital Insight is using commercially available satellite imagery and machine learning to analyze the parking lots of 60 different retail chains to assess their performance.

Founder James Crawford expects images from above to provide all sorts of business intelligence. “We’re just starting to discover what can be done with this kind of large-scale data,” says the alum of both NASA and the Google project that digitized over 20 million books.

The shadows in such images can indicate the fullness of a container.

Interest in satellite imaging is growing, and the cost is coming down. Google snatched up the satellite-image-processing company Skybox Imaging last August, and today Google Ventures and other investors, including Sequoia and Bloomberg Beta, announced they had sunk $8.7 million into Crawford’s company.
Orbital Insight is using a promising new technique known as deep learning to find economic trends through satellite-image analysis. Deep learning uses a hierarchy of artificial “neurons” to learn to recognize patterns in data (see “Deep Learning”).

To predict retail sales based on retailers’ parking lots, humans at Orbital Insight use Google Street View images to pinpoint the exact location of the stores’ entrances. Satellite imagery is acquired from a number of commercial suppliers, some of it refreshed daily. Software then monitors the density of cars and the frequency with which they enter the lots.
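
The output of such a pipeline is ultimately just a number per store per image. As a rough sketch of the final step (the car detector is taken as given, and the counts and lot capacity below are invented for illustration), turning per-image car counts into an occupancy trend might look like this:

```python
import numpy as np

# Hypothetical daily car counts for one store's lot, as produced by a
# deep-learning car detector run over each day's satellite image.
daily_counts = np.array([112, 98, 120, 131, 125, 140, 152, 149, 160, 171])
lot_capacity = 220  # assumed, e.g. estimated once from an empty-lot image

occupancy = daily_counts / lot_capacity            # fill ratio per image
slope = np.polyfit(np.arange(len(daily_counts)), daily_counts, 1)[0]

print(f"mean occupancy: {occupancy.mean():.0%}")
print(f"trend: {slope:+.1f} cars/day")  # rising traffic, possibly rising sales
```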

Crawford’s company can also use shadows in a city to gather information on rates of construction, especially in secretive places like China. Satellite images could also predict oil yields before they’re officially reported, because it’s possible to estimate how much crude oil is in a container from the height of its floating lid. Scanning the extent and effects of deforestation would be useful to both investors and environmental groups.
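
That lid-height observation reduces to simple shadow geometry: the tank wall casts one shadow on the ground, and the rim casts another onto the sunken lid. A minimal sketch with invented measurements (both shadows come from the same image, so the sun angle cancels out):

```python
def tank_fill_fraction(exterior_shadow_m: float, interior_shadow_m: float) -> float:
    """Estimate how full a floating-roof tank is from two shadow lengths.

    exterior_shadow_m: shadow the tank wall casts on the ground,
                       proportional to the tank's total height.
    interior_shadow_m: shadow the rim casts onto the floating lid,
                       proportional to how far the lid has sunk.
    """
    lid_depth_fraction = interior_shadow_m / exterior_shadow_m
    return 1.0 - lid_depth_fraction

# Hypothetical measurements from one image:
print(f"{tank_fill_fraction(exterior_shadow_m=24.0, interior_shadow_m=6.0):.0%}")
# -> 75%: the lid sits a quarter of the way down, so the tank is ~75% full.
```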

Over time, Orbital Insight’s software can identify trends and make predictions. “Then it’s not an image anymore—it’s some sort of measurement,” Crawford says.

That still leaves open the question of an unwanted eye in the sky. Not everyone likes the idea of being monitored as they run errands, and businesses may reject the idea of being watched from space. Crawford says satellites are already collecting this information—intelligence agencies have been using it for decades—and Orbital Insight is just making sense of the data.

“A satellite can cover every square inch of the earth every two weeks. You can’t stop that,” he says. “We don’t drive what imagery the satellite takes.”



Farmers of the Future Will Utilize Drones, Robots and GPS

As reported by Physics.org: Today's agriculture has transformed into a high-tech enterprise that most 20th-century farmers might barely recognize.

After all, it was only around 100 years ago that farming in the US transitioned from animal power to combustion engines. Over the past 20 years, the Global Positioning System (GPS) and other new tools have moved farming even further into a technological wonderland.

Beyond the now de rigueur air conditioning and stereo system, a modern large tractor's enclosed cabin includes computer displays indicating machine performance, position, and the operating characteristics of attached machinery like seed planters.

And as amazing as today's technologies are, they're just the beginning. Self-driving machinery and flying robots able to automatically survey and treat crops will become commonplace on farms that practice what's come to be called precision agriculture.

The ultimate purpose of all this high-tech gadgetry is optimization, from both an economic and an environmental standpoint. We only want to apply the optimal amount of any input (water, fertilizer, pesticide, fuel, labor) when and where it's needed to efficiently produce high crop yields.

Global positioning gives hyperlocal info
GPS provides accurate location information at any point on or near the earth's surface by calculating your distance from at least four orbiting satellites at once (the fourth resolves the receiver's clock error). So farming machines with GPS receivers are able to recognize their position within a farm field and adjust operation to maximize productivity or efficiency at that location.
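
To make that concrete, here is a minimal, idealized trilateration sketch (satellite positions are invented and the ranges are noise-free; a real receiver also solves for its own clock bias):

```python
import numpy as np

# Hypothetical satellite positions (km) and ranges (km) to the receiver.
sats = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
receiver_true = np.array([1000.0, 2000.0, 3000.0])
ranges = np.linalg.norm(sats - receiver_true, axis=1)  # stand-in for measurements

# Subtracting the first sphere equation |x - p_i|^2 = d_i^2 from the rest
# linearizes the problem: 2*(p_i - p_0) . x = |p_i|^2 - |p_0|^2 - d_i^2 + d_0^2
A = 2.0 * (sats[1:] - sats[0])
b = (np.sum(sats[1:] ** 2, axis=1) - np.sum(sats[0] ** 2)
     - ranges[1:] ** 2 + ranges[0] ** 2)
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(position))  # -> [1000. 2000. 3000.]
```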

Take the example of soil fertility. The farmer uses a GPS receiver to locate preselected field positions and collect soil samples. A lab then analyzes the samples and creates a fertility map in a geographic information system (GIS), essentially a computer database program adept at dealing with geographic data and mapping. Using the map, a farmer can prescribe the amount of fertilizer for each field location that was sampled, and variable-rate technology (VRT) fertilizer applicators dispense exactly the amount required across the field. This process is an example of what's come to be known as precision agriculture.
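
A toy version of that prescription step might look like the following (the coordinates, nutrient numbers, and simple top-up rule are all invented for illustration; real prescriptions come from calibrated agronomic models):

```python
# Hypothetical soil-test results: available nitrogen (kg/ha) at GPS-sampled points.
TARGET_N = 150.0  # assumed agronomic target, kg/ha

soil_samples = {
    (30.615, -96.340): 95.0,
    (30.616, -96.340): 120.0,
    (30.615, -96.341): 142.0,
    (30.616, -96.341): 160.0,   # already above target -> apply nothing
}

# Prescribe whatever tops each sampled point up to the target.
prescription = {pt: max(0.0, TARGET_N - n) for pt, n in soil_samples.items()}
for (lat, lon), rate in sorted(prescription.items()):
    print(f"({lat:.3f}, {lon:.3f}) -> apply {rate:.0f} kg/ha")
```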

Info, analysis, tools
Precision agriculture requires three things to be successful. It needs site-specific information, which the soil-fertility map satisfies. It requires the ability to understand and make decisions based on that site-specific information. Decision-making is often aided by computer models that mathematically and statistically analyze relationships between variables like soil fertility and the yield of the crop.
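
As a minimal example of the kind of model meant here (the trial data are invented), a quadratic fit of yield against nitrogen captures the diminishing returns a farmer has to weigh:

```python
import numpy as np

# Invented field-trial data: soil nitrogen (kg/ha) vs. grain yield (t/ha).
nitrogen = np.array([60, 80, 100, 120, 140, 160, 180])
yield_t = np.array([3.1, 3.9, 4.6, 5.1, 5.5, 5.6, 5.7])

# Quadratic fit: past some rate, extra fertilizer buys almost no extra yield.
model = np.poly1d(np.polyfit(nitrogen, yield_t, deg=2))
print(f"predicted yield at 130 kg/ha: {model(130):.2f} t/ha")
```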

Finally, the farmer must have the physical tools to apply the management decisions. In the example, the GPS-enabled VRT fertilizer applicator serves this purpose by automatically adjusting its rate as appropriate for each field position. Other examples of precision agriculture involve varying the rate of planting seeds in the field according to soil type and using sensors to identify the presence of weeds, diseases, or insects so that pesticides can be applied only where needed.

Site-specific information goes far beyond maps of soil conditions and yield to include even satellite pictures that can indicate crop health across the field. Such remotely sensed images are also commonly collected from aircraft. Now unmanned aerial vehicles (UAVs, or drones) can collect highly detailed images of crop and field characteristics. These images, whether analyzed visually or by computer, show differences in the amount of reflected light that can then be related to plant health or soil type, for example. Clear crop-health differences in such images – diseased areas appear much darker than healthy ones – have been used to delineate the presence of cotton root rot, a devastating and persistent soilborne fungal disease. Once a disease's extent is identified in a field, future treatments can be applied only where the disease exists. Advantages of UAVs include relatively low cost per flight and high image detail, but the legal framework for their use in agriculture remains under development.
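
The article doesn't name an index, but a common way of turning reflected light into a plant-health measure is the normalized difference vegetation index (NDVI), which exploits the fact that healthy vegetation reflects strongly in near-infrared. A sketch with invented reflectance values (the 0.4 threshold is an assumption; real thresholds are crop-specific):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index from NIR and red reflectance."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Toy 2x3 "image": two reflectance bands with values in [0, 1].
nir = np.array([[0.60, 0.55, 0.20],
                [0.58, 0.25, 0.18]])
red = np.array([[0.10, 0.12, 0.15],
                [0.11, 0.16, 0.14]])

index = ndvi(nir, red)
print(np.round(index, 2))
print("possibly stressed pixels:", int((index < 0.4).sum()))
```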

Let's automate
Automatic guidance, whereby a GPS-based system steers the tractor in a much more precise pattern than the driver is capable of, is a tremendous success story. Safety concerns currently limit completely driverless capability to smaller machines. Fully autonomous or robotic field machines have begun to be employed in small-scale, high-profit-margin agriculture such as wine grapes, nursery plants and some fruits and vegetables.
Autonomous machines can replace people performing tedious tasks, such as hand-harvesting vegetables. They use sensor technologies, including machine vision, that can detect things like the location and size of stalks and leaves to inform their mechanical processes. Japan is a trend leader in this area: agriculture there is typically performed on smaller fields and plots, and the country is an innovator in robotics. But autonomous machines are becoming more evident in the US, particularly in California, where much of the country's specialty crops are grown.

The development of flying robots raises the possibility that most field-crop scouting currently done by humans could be replaced by UAVs with machine vision and hand-like grippers. Many scouting tasks, such as checking for insect pests, require someone to walk to distant locations in a field, grasp leaves on representative plants and turn them over to check for insects. Researchers are developing technologies to enable flying robots to do this without human involvement.

Breeding + sensors + robots
High-throughput plant phenotyping (HTPP) is an up-and-coming precision agriculture technology at the intersection of genetics, sensors and robotics. It is used to develop new varieties or "lines" of a crop to improve characteristics such as nutritive content and drought and pest tolerance. HTPP employs multiple sensors to measure important physical characteristics of plants, such as height; leaf number, size, shape, angle, color and wilting; stalk thickness; and number of fruiting positions. These are examples of phenotypic traits, the physical expression of what a plant's genes code for. Scientists can compare these measurements to already-known genetic markers for a particular plant variety.

The sensor combinations can very quickly measure phenotypic traits on thousands of plants on a regular basis, enabling breeders and geneticists to decide which varieties to include or exclude in further testing, tremendously speeding up further research to improve crops.
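
A cartoon of that selection step (field names, numbers, and thresholds below are invented for illustration):

```python
# Hypothetical phenotype records from one HTPP sensor pass over a trial plot.
plants = [
    {"line": "A12", "height_cm": 92, "leaf_count": 14, "wilting": 0.05},
    {"line": "B07", "height_cm": 71, "leaf_count": 9,  "wilting": 0.40},
    {"line": "C03", "height_cm": 88, "leaf_count": 13, "wilting": 0.10},
    {"line": "D21", "height_cm": 95, "leaf_count": 15, "wilting": 0.55},
]

# Keep lines that stayed tall and leafy while wilting little under stress --
# candidates to advance to the next round of testing.
keep = [p["line"] for p in plants
        if p["height_cm"] >= 85 and p["leaf_count"] >= 12 and p["wilting"] <= 0.2]
print("advance to further testing:", keep)  # -> ['A12', 'C03']
```
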
Agricultural production has come so far in just the past couple of decades that it's hard to imagine what it will look like in a few more. But the pace of high-tech innovation in agriculture is only increasing. Don't be surprised if, 10 years from now, you drive down a rural highway and see a small helicopter fly over a field, descend into the crop, use robotic grippers to manipulate leaves and machine vision to look for insects, then rise back above the crop canopy and head toward its next scouting location. All with nary a human being in sight.

U.S. Regulators Give Amazon Go-Ahead for Drone Tests

As reported by Reuters: Amazon.com Inc has won approval from U.S. federal regulators to test a delivery drone outdoors, as the e-commerce company pursues its goal of sending packages to customers by air, even as it faces public concern about safety and privacy.

The Federal Aviation Administration said on Thursday it issued an experimental airworthiness certificate to an Amazon business unit and its prototype drone, allowing test flights over private, rural land in Washington state.

The FAA also granted Amazon an exemption from other flight restrictions so the experimental drone can conduct those flights.

The approval is a win for Seattle-based Amazon, the largest e-commerce company in the United States, and advances plans by the company and others to deliver packages using small, self-piloted aircraft.

There are limitations, however. The experimental certificate applies to a particular drone and Amazon must obtain a new certification if it modifies the aircraft or flies a different version, making it difficult to adapt the model quickly in the field. Amazon's petition for permission indicated it was testing several iterations of a drone at an indoor facility in Seattle.

Amazon must keep flights below 400 feet (120 meters) and keep the drone in sight, according to the FAA.

The company had asked for permission to fly at altitudes up to 500 feet (150 meters).

The drone operators must have private pilot licenses and current medical certification. Amazon must supply monthly data to the regulators.

The company did not respond to requests for comment. Amazon public policy chief Paul Misener is set to testify at a congressional hearing on drones next Tuesday.

As part of Amazon Chief Executive Jeff Bezos' plan to deliver packages under a program dubbed "Prime Air," the company is developing drones that fly at speeds of 50 miles per hour (80 kph), operate autonomously and sense and avoid objects. Amazon also is working with NASA on an air-traffic management system for drones.

Amazon sought permission from the FAA to test drones in outdoor areas near Seattle, where one of its research and development labs is developing the technology. The company has conducted test flights outside the United States, in countries with looser restrictions.

In February, the FAA proposed long-awaited rules to try to set U.S. guidelines for drones, addressing growing interest from both individuals and corporations in using unmanned aerial vehicles. The draft rules still must undergo public comment and revision before becoming final, which is expected to take at least a year.

Additionally, Amazon announced a one-hour delivery service called 'Prime Now' in Baltimore and Miami.

The service will be available in select zip codes to Amazon Prime subscribers, who pay $99 a year for unlimited free two-day delivery on more than 20 million items. The one-hour service, available through the Prime Now mobile app, costs $7.99, while two-hour delivery will be free.

Amazon Prime's success has blown away the company's projections and "petrified" local and national retailers, said Howard Davidowitz, chairman of Davidowitz & Associates, a national retail consulting and investment banking firm headquartered in New York City.

"If you're a retailer and you're not scared of Amazon ... you should be," he said. "They are the change agent. They are leading the change in retail."

Davidowitz expects the Prime Now program to catch on rapidly in Baltimore the way it has in New York.

The service is made possible by the state-of-the-art fulfillment technology in Amazon's new 1 million-square-foot distribution center in Southeast Baltimore, at the site of the former General Motors plant on Broening Highway and a short drive from much of the city.

That facility will open in the next couple of weeks, said Amazon spokeswoman Kelly Cheeseman.

Thursday, March 19, 2015

Tesla's Musk Touts Self-Driving Car, Promises to End 'Range Anxiety'

As reported by the LA Times: New Tesla vehicles will soon be able to steer themselves, park themselves and brake in an emergency, Tesla Motors Chief Executive Elon Musk said.

Such vehicles, already being tested, have driven from San Francisco to Seattle with virtually no driver input, Musk said.

And current Tesla Model S sedans will now be able to tell you exactly how much juice you have in the battery, and exactly what to do about it.

During an invitation-only telephone news conference, the Silicon Valley-based billionaire touted software updates for his company's all-electric Model S that will dramatically reduce the electric vehicle condition known as range anxiety -- the fear that the car will run out of power before it reaches its destination.

Musk said the new updates, which will download wirelessly to Model S cars already on the road sometime in the next 10 days or so, will scan the locations of all Tesla charging stations and tell drivers exactly how far it is to the best one, and then recommend the best route for getting there.
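
Tesla hasn't published the routing logic, but the core idea is straightforward: find the superchargers reachable on the current charge, keep a safety margin, and steer toward the best one. A minimal sketch with invented coordinates and a flat 10% reserve:

```python
import math

def haversine_km(a: tuple, b: tuple) -> float:
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

chargers = {"Harris Ranch": (36.25, -120.24), "Tejon Ranch": (34.99, -118.95)}
car_pos, range_km = (35.37, -119.02), 180.0

reachable = {}
for name, pos in chargers.items():
    d = haversine_km(car_pos, pos)
    if d < 0.9 * range_km:      # keep a 10% reserve
        reachable[name] = d

best = min(reachable, key=reachable.get)
print(f"route to {best}, {reachable[best]:.0f} km away")
```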

The new features "are going to make a key difference to people driving the car and their perception of it as they are driving the car," Musk said. "It makes it impossible to run out of range unintentionally. The car will always take care of you."

Musk also promised another set of software updates that will make it possible for the car to drive itself on highways and major roads -- "parking lot to parking lot," he said.

During test drives along a route from the Bay Area to the Northwest, he said, "We are able to travel almost all the way without the driver touching any controls at all."

Perfecting those features will require "a lot of validation testing," Musk cautioned. But these capabilities could be a reality "in three months or so."

The car will also be its own valet, Musk said, though not in public parking lots.

"On private property you will be able to press the 'summon' button and your car will be able to find you," he said. "You can press it again and the car will put itself to bed in the garage, and close the garage door."

Though Musk's motor vehicles are by far the most expensive electric cars on the road -- the lowest-priced Model S goes for over $70,000, while many cost more than $110,000 -- they already offer the greatest range.

Currently, a top-end Model S sedan can get as much as 295 miles out of a single charge, the company has said. Even the entry-level Model S can go 265 miles before recharging.

No other electric vehicle offers even half that. While many EVs now on the road can go 80 to 100 miles between charges, only the Toyota RAV4 EV cracks the century mark -- and only at an estimated 103 miles.

And, unlike other electric cars, the Tesla comes with a substantial charging infrastructure where most drivers can, for free, recharge in a short time.

Refueling the battery on a household 110-volt plug could take more than 24 hours. But a Tesla "supercharger," at stations the company has installed across North America, can replenish 80% of the battery's juice in 30 to 40 minutes.
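
A quick back-of-the-envelope check of those charging times (the pack size and power figures below are assumptions, not numbers from the article):

```python
pack_kwh = 85.0                  # common Model S pack size of the era
household_kw = 110 * 12 / 1000   # 110 V outlet at ~12 A is about 1.3 kW
supercharger_kw = 120.0          # typical supercharger peak of the era

hours_household = pack_kwh / household_kw
minutes_to_80pct = 0.8 * pack_kwh / supercharger_kw * 60

print(f"household outlet, empty to full: ~{hours_household:.0f} hours")
print(f"supercharger, to 80%: ~{minutes_to_80pct:.0f} minutes")
# ~64 hours and ~34 minutes -- consistent with the figures above.
```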

The 12-year-old company currently has only the Model S sedan available through its unique no-dealership sales arrangement.

The company said at the time of its fourth-quarter earnings reports in February that it produced 35,000 Model S vehicles in 2014.

Tesla's long-delayed midsized crossover SUV, the Model X, is expected to begin delivery late this year. The company has said it already has more than 20,000 orders for the highly anticipated falcon-wing X.

Musk said all the dramatic new features currently being applied to the Model S will be available on the Model X as well.

NVIDIA to Install Computers in Cars to Teach Them How to Drive

As reported by ITWorld: As thousands of dashcam videos on YouTube vividly demonstrate, drivers see the craziest things. Be it an angry bear, a low-flying aircraft or even a guy riding a shopping cart on the freeway, the videos make for entertaining viewing but also illustrate a problem facing developers of self-driving cars: how can you program a computer to make sense of all this?

On Tuesday, chip maker Nvidia introduced a $10,000 computer that it says will allow cars to learn the right and wrong reactions to different situations, essentially figuring out what to do from experience rather than a rigid set of pre-defined situations.

“Driving is not about detecting, driving is a learned behavior,” said Jen-Hsun Huang, CEO of Nvidia, during a presentation at the company’s GTC 2015 conference in San Jose.

The Drive PX is based on two of the company’s Tegra X1 processors and will crunch video from up to 12 cameras. Over time it should learn, for example, to slow down for dogs but not slam on the brakes for a piece of newspaper blowing across the road.
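
Nvidia hasn't published Drive PX's internals; the point is the contrast between hand-coded rules and a mapping learned from examples. As a deliberately tiny stand-in for the deep networks involved, here is a single logistic "neuron" trained on invented features to brake for dog-like obstacles but not newspaper-like ones:

```python
import numpy as np

# Toy training set: [apparent_mass, moves_on_its_own] -> 1 = brake, 0 = ignore.
# A dog is small but self-propelled; blowing newspaper is light and passive.
X = np.array([[0.90, 1.0], [0.80, 1.0], [0.05, 0.0],
              [0.10, 0.0], [0.70, 1.0], [0.03, 0.0]])
y = np.array([1, 1, 0, 0, 1, 0])

w, b = np.zeros(2), 0.0
for _ in range(2000):                          # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # predicted brake probability
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

def should_brake(mass: float, self_propelled: float) -> bool:
    return (np.dot([mass, self_propelled], w) + b) > 0

print(should_brake(0.85, 1.0))   # dog-like obstacle -> True
print(should_brake(0.04, 0.0))   # newspaper-like    -> False
```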

Today’s commercial autonomous systems are largely related to detecting when cars stray from their lanes or preventing collisions. Several fully self-driving cars have been developed as part of research projects, but they rely on highly detailed maps and are generally restricted to operating in controlled environments.

A DARPA project already proved the learning technology on a lower level, said Huang. A small autonomous robot was fed with 225,000 images of a backyard. When it started out, the robot ran straight into an obstacle, but after analyzing the images, it managed to successfully scoot around the yard without hitting objects, figuring out for itself how to get around.

The Drive PX is intended to be used by auto makers in research and development projects and is unlikely to mean self-driving cars are coming anytime soon. But if it works as promoted, it could help advance their arrival.

One proponent of autonomous driving, Tesla Motors CEO Elon Musk, said the most difficult part of realizing the technology was at speeds between 10 and 50 miles per hour.

“It’s fairly easy to deal with things that are sub five or 10 miles per hour; you just make sure it hits nothing,” said Musk, who was speaking alongside Huang at the event. “From 10 to 50 miles per hour in complex suburban environments, that’s when you can get a lot of unexpected things happening. Once you’re above 50 miles per hour, it gets easier again.”

An additional element of Drive PX will ensure that actions learned in one car are shared with others.

Nvidia didn’t say which auto makers would be using the platform, which will be available from May, but did say that it’s already receiving inquiries from car companies about the technology.

Wednesday, March 18, 2015

New Technology May Double Radio Frequency Data Capacity

As reported by Columbia Engineering: A team of Columbia Engineering researchers has invented a technology—full-duplex radio integrated circuits (ICs)—that can be implemented in nanoscale CMOS to enable simultaneous transmission and reception at the same frequency in a wireless radio. Up to now, this has been thought to be impossible: transmitters and receivers either work at different times or at the same time but at different frequencies. The Columbia team, led by Electrical Engineering Associate Professor Harish Krishnaswamy, is the first to demonstrate an IC that can accomplish this. The researchers presented their work at the International Solid-State Circuits Conference (ISSCC) in San Francisco on February 25.

“This is a game-changer,” says Krishnaswamy, director of the Columbia high-Speed and Mm-wave IC (CoSMIC) Lab. “By leveraging our new technology, networks can effectively double the frequency spectrum resources available for devices like smartphones and tablets.”

In the era of Big Data, the current frequency spectrum crisis is one of the biggest challenges researchers are grappling with and it is clear that today's wireless networks will not be able to support tomorrow's data deluge. Today's standards, such as 4G/LTE, already support 40 different frequency bands, and there is no space left at radio frequencies for future expansion. At the same time, the grand challenge of the next-generation 5G network is to increase the data capacity by 1,000 times.

So the ability to have a transmitter and receiver re-use the same frequency has the potential to immediately double the data capacity of today's networks. Krishnaswamy notes that other research groups and startup companies have demonstrated the theoretical feasibility of simultaneous transmission and reception at the same frequency, but no one has yet been able to build tiny nanoscale ICs with this capability.

“Our work is the first to demonstrate an IC that can receive and transmit simultaneously,” he says. “Doing this in an IC is critical if we are to have widespread impact and bring this functionality to handheld devices such as cellular handsets, mobile devices such as tablets for WiFi, and in cellular and WiFi base stations to support full duplex communications.”

The biggest challenge the team faced with full duplex was canceling the transmitter's echo. Imagine that you are trying to listen to someone whisper from far away while at the same time someone else is yelling while standing next to you. If you can cancel the echo of the person yelling, you can hear the other person whispering.

“If everyone could do this, everyone could talk and listen at the same time, and conversations would take half the amount of time and resources as they take right now,” explains Jin Zhou, Krishnaswamy’s PhD student and the paper’s lead author. “Transmitter echo or ‘self-interference’ cancellation has been a fundamental challenge, especially when performed in a tiny nanoscale IC, and we have found a way to solve that challenge.”
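
The Columbia chip performs this cancellation in RF circuitry, but the principle can be caricatured digitally: since the radio knows exactly what it transmitted, an adaptive filter can learn the echo channel and subtract the radio's own signal from what the antenna hears. A minimal least-mean-squares (LMS) sketch with an invented three-tap echo:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

tx = rng.standard_normal(n)                # our own (known) transmit signal
far = 0.01 * rng.standard_normal(n)        # weak remote signal we want to hear
echo_taps = np.array([0.9, -0.3, 0.1])     # unknown self-interference channel
rx = np.convolve(tx, echo_taps)[:n] + far  # what the antenna actually receives

# LMS adaptive filter: learn the echo channel, subtract the estimated echo.
w = np.zeros(3)
mu = 0.01                                  # adaptation step size
out = np.zeros(n)
for i in range(3, n):
    x = tx[i:i-3:-1]                       # last three transmitted samples
    out[i] = rx[i] - w @ x                 # residual after echo subtraction
    w += mu * out[i] * x                   # nudge w toward the true taps

print(np.round(w, 3))                      # -> approx [0.9, -0.3, 0.1]
suppression_db = 10 * np.log10(np.var(rx) / np.var(out[n // 2:]))
print(f"self-interference suppressed by ~{suppression_db:.0f} dB")
```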

Krishnaswamy and Zhou plan next to test a number of full-duplex nodes to understand what the gains are at the network level. “We are working closely with Electrical Engineering Associate Professor Gil Zussman and his PhD student Jelena Marasevic, who are network theory experts here at Columbia Engineering,” Krishnaswamy adds. “It will be very exciting if we are indeed able to deliver the promised performance gains.”


This work was funded by the DARPA RF-FPGA program.

Could Human-Driven Cars Become Illegal?

As reported by the Huffington Post: Self-driving cars might be a novelty today. But in the not-too-distant future, they could become common.

Eventually, autonomous cars might prove to be so much safer than human drivers that you won't even be allowed to take the wheel anymore, Tesla co-founder and CEO Elon Musk said on Tuesday.

"People may outlaw driving cars because it's too dangerous," Musk told NVidia CEO Jen-Hsun Huang at the company's GPU Technology Conference in San Jose, California, according to CNBC. "You can't have a person driving a two-ton death machine."

Musk later clarified on Twitter that he doesn't support outlawing human-driven cars -- only that he could envision it happening in the future. 

In any case, it would be a while before human drivers are completely replaced. Musk said there are 2 billion cars on the road, and automakers can make 100 million vehicles per year. That means it would take at least 20 years to replace every car with an autonomous one.

While Musk has in the past called artificial intelligence "our biggest existential threat," and compared it to "summoning a demon," he said on Tuesday that autonomous cars won't be that demon.

"That's sort of like a narrow form of AI," Musk said, according to The Verge. "It would be like an elevator. They used to have elevator operators, and then we developed some simple circuitry to have elevators just automatically come to the floor that you're at ... the car is going to be just like that."