Tuesday, November 26, 2013

Tracing the World's Most Complicated Roadways With GPS Data

As reported by The Atlantic Cities: Who knew GPS could be so beautiful?  The intricate highway interchange is always easier to appreciate from above. Take away the congestion, the last-minute mergers, the tail-pipe exhaust, the conflicting road signs and the vertigo, and a perfect cloverleaf really starts to look like a marvel of engineering.


Perhaps you've seen photos like these that capture the most complex Interstate overpasses as interlocking ribbons of asphalt. The above image, though, presents some of this same information in a quieter, more beautiful way, reducing interchanges – in this case, the intersection of I-70 and the I-465 beltway around Indianapolis – to their simplest geometry.
That picture comes from a layer of GPS traces on OpenStreetMap, where it's now possible to visualize the open-source mapping project's vast, ever-updating GPS database. The traces come from individual contributors, often driving their own cars, creating their own data streams via something as simple as an app on their smartphones. Such data can correct imprecise maps or validate earlier edits. But GPS data also produces a compass byproduct: Using it, we can verify the direction of a one-way street captured from a moving car, or unravel the elaborate logic of a four-way stacked overpass in a way that's not possible from a satellite photo.
Forget the old two-toned picture of road traffic: red for tail lights and white for head lights. This map, courtesy of MapBox and the OpenStreetMap Foundation, paints moving GPS traces with a full color wheel, giving each direction of travel its own hue. Eric Fischer, who worked on the project, explained the method by email.
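As a rough illustration of the idea, and not MapBox's actual rendering pipeline, direction can be recovered from any ordered GPS trace by computing the bearing between consecutive fixes and mapping that bearing onto the hue wheel. A minimal Python sketch, with all function names our own:

```python
import colorsys
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from fix 1 to fix 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def color_segments(trace):
    """Assign each segment of an ordered (lat, lon) trace an RGB color keyed to its bearing."""
    segments = []
    for a, b in zip(trace, trace[1:]):
        hue = bearing(*a, *b) / 360.0  # bearing -> position on the color wheel
        segments.append((a, b, colorsys.hsv_to_rgb(hue, 1.0, 1.0)))
    return segments

# A trace that heads north and then turns east: the two legs get different hues.
print(color_segments([(37.00, -122.00), (37.01, -122.00), (37.01, -121.99)]))
```

Rendered this way, a one-way street shows up in a single hue, while a divided highway appears as two parallel ribbons of opposite colors.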
The resulting map of the world portrays every traced road by both location and direction. The highway interchanges, though, pop out as some of the most compelling parts of our infrastructure when viewed this way. With the help of Fischer, we pulled out some of our favorites below.
Consider this a more zen appreciation of highway infrastructure than what you'll undoubtedly experience on the roads this week heading to and from Thanksgiving.
The famous Spaghetti Junction outside of Birmingham in the United Kingdom

A four-level interchange in Los Angeles.

The confluence of I-90, I-190 and I-294 outside of O'Hare International Airport in Chicago.

New Jersey from I-95 and I-495

Arc de Triomphe in Paris

A particularly heavily traveled cloverleaf in Moscow.

Monday, November 25, 2013

Faulty Traffic Sensors Dull How 'Smart' Freeways Are

As reported by Wireless Week: California's highways aren't as smart as they used to be.

Buried under thousands of miles of pavement are 27,000 traffic sensors that are supposed to help troubleshoot both daily commutes and long-term maintenance needs on some of the nation's most heavily used and congested roadways. And about 9,000 of them do not work.

The sensors are a key part of the "intelligent transportation" system designed, for example, to detect the congestion that quickly builds before crews can get out and clear an accident.

A speedy response matters: Every minute a lane is blocked during rush hour means about four extra minutes of traffic. Fewer sensors can mean slower response times, so the fact that 34 percent are offline — up from 26 percent in 2009 — creates an extra headache in California's already-sickly traffic situation.

"(It) is not an acceptable number, really," said California's top transportation official, Brian Kelly.
With limited space and money for new lanes, Kelly said, maximizing flow on existing freeways is critical. To do so, planners rely on a network of cameras, above-road detectors, message boards and the in-road sensors called "loops" because of their shape.

Some loops were cut during construction, others yanked out by copper wire thieves. Many have succumbed to old age.

The resulting blind spots show up as strings of gray amid the green, yellow or red on the large map that freeway managers overseeing Los Angeles and Ventura counties monitor for signs of trouble. Even worse off than LA, according to Caltrans, are inland areas such as the San Joaquin Valley and San Bernardino and Riverside counties.

The outages are significant enough that the sensors alone cannot produce real-time traffic maps useful to the public, especially when compared with the many private traffic-mapping services that drivers rely on to get around.

So, to post online traffic maps that are ready for public consumption, California and other states are paying the private sector.

Caltrans gives away data from its working loop sensors to Google and other companies; Caltrans also pays Google for a traffic map that incorporates its own data as well as information the tech giant gets from vehicles and cellphones whose owners have agreed to share location data.

California's tab is not large — Caltrans estimates it at $25,000 per year for its public-facing Quickmap — but other states are giving away sensor data and buying back reliable maps as well. Michigan's transportation department said it pays Inrix Inc. about $400,000 annually for data to populate its Mi Drive map.

An Inrix spokesman said the company has contracts with 25 state transportation departments.

Loops are a simple technology that can last decades when properly installed. A bundle of wires under the pavement detects the size, speed and number of vehicles that pass over it, transmitting the information to a roadside box. That data records traffic in real time, but also helps planners who want to know how many of what kinds of vehicles use a road so they can project when it will start to deteriorate (more big trucks means more potholes, sooner).
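The article doesn't describe Caltrans's processing, but the classic dual-loop "speed trap" arithmetic behind these measurements is straightforward: two loops a known distance apart each record when a vehicle arrives, speed falls out of the spacing divided by the time difference, and the time the vehicle dwells over one loop gives its effective length. A hypothetical sketch in Python (the loop spacing and timestamps are invented):

```python
LOOP_SPACING_M = 6.0  # assumed distance between the paired loops, in meters

def speed_and_length(t_on_1, t_off_1, t_on_2):
    """Estimate vehicle speed and effective length from dual-loop timestamps (seconds).

    t_on_1 / t_off_1: when the vehicle starts / stops covering the upstream loop.
    t_on_2:           when it starts covering the downstream loop.
    """
    speed_mps = LOOP_SPACING_M / (t_on_2 - t_on_1)  # spacing / traversal time
    length_m = speed_mps * (t_off_1 - t_on_1)       # speed * occupancy time
    return speed_mps, length_m

speed, length = speed_and_length(0.00, 0.25, 0.20)
print(f"{speed * 3.6:.0f} km/h, effective length ~{length:.1f} m")  # 108 km/h, ~7.5 m
```

Counting vehicles per interval and binning them by that length estimate is what lets planners separate big trucks from passenger cars.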

Drivers may be familiar with loops at surface street intersections, where a circular cut in a turn lane means a loop will detect an idling car and tell the light to change. Replacement materials cost only a few hundred dollars — but installing a loop on a freeway can cost thousands, because to embed the wire crews must close two lanes, likely during off hours when labor is more expensive.

In the Fresno and San Francisco Bay areas alone, Caltrans plans to spend $35 million to fix loop sensors — as well as freeway lights, cameras, ramp meters and other electrical systems — that are down due to metal scavengers or other problems.

The state that pioneered the use of loop sensors starting in the 1970s is not alone in its struggle to keep them producing reliable data.

In Utah, transportation officials estimated about 20 percent of loops do not work.

"Does it impair our ability to make informed decisions? Certainly," said Blaine Leonard, manager of the state's intelligent transportation systems program.

Information from loops informs the estimated travel times posted on freeway message boards.
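The conversion is simple: each stretch between sensor stations contributes its length divided by the speed the loops there are reporting. A toy illustration (segment lengths and speeds invented):

```python
def travel_time_minutes(segments):
    """segments: (length_km, avg_speed_kmh) pairs reported by successive loop stations."""
    return sum(length / speed * 60 for length, speed in segments)

# Three freeway segments; the middle one is congested.
print(f"{travel_time_minutes([(5, 100), (4, 25), (6, 90)]):.0f} min")  # ~17 min
```

A single dead station forces the system to guess at one of those terms, which is how bad loop data turns into bad posted times.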

"If the data is bad and therefore the travel times are bad, at some point in time the public goes, 'Well, they don't know what they're doing,'" Leonard said.

About 75 percent of loops in the Austin, Texas, area are not working due to large-scale freeway resurfacing, according to the state department of transportation. Michigan's transportation planners abandoned loops because they found too many failed during winter's freeze-thaw cycle; they've moved to above-road sensors that use microwaves to detect traffic.

Saturday, November 23, 2013

Who Knew GPS Could Look So Beautiful?

As reported by the New Scientist: A crop-dusting aircraft's graceful, looping route over Russian farmland is tracked by the pilot's GPS, resulting in a beautiful map you won't see anywhere else.

This aerial concoction is one of many by custom map-maker MapBox, which has developed a way to overlay the world's largest trove of open-source GPS data – submitted over nine years to the free wiki OpenStreetMap – on top of aerial imagery to create beautiful, traveler-friendly maps.

MapBox's GPS routes are color-coded by direction of travel, with each direction given its own hue, to help future users verify one-way streets and roads not displayed on traditional maps – or, in this case, to display one aircraft's vivid rainbow path across the sky.

Friday, November 22, 2013

How WiFi Could Revolutionize the Cellular Industry

As reported by the Washington Post: It's easy to forget that WiFi has actually gotten faster over time. In 2003, your garden-variety WiFi network managed theoretical speeds of 54 Mbps. Fast forward a decade, and we're now browsing over WiFi, in some cases, at 1 Gbps or more.

Those advances aren't just creating faster Internet experiences. They're also giving rise to a new crop of cellular services. These alternatives to the traditional wireless carrier take advantage of the spread of cheap and plentiful WiFi to deliver low-cost voice, SMS and data in ways that should make the giants in the industry deeply jealous. If the budget-minded upstarts get their way, they could wind up overturning the entire way that cellular service is bought and sold. Here's how.
The country is dominated by four national wireless carriers that operate their own networks. These companies charge relatively high prices. Some of the cost is justified; in addition to providing your mobile service, the companies have to invest in upgrading towers, buying the airwaves over which your calls travel, and other infrastructure costs.
But the small cellular companies now moving aggressively to shake up this system pay no such costs. Collectively, these businesses are called MVNOs — mobile virtual network operators. By signing deals with the larger businesses, MVNOs get to use those companies' infrastructure without actually having to build it all themselves. In some cases, MVNOs also cut costs by forgoing customer service teams. That can add up to savings that are passed on to consumers.
The idea isn't all that new; in fact, MVNOs are really popular overseas. The United States itself is home to dozens of cellular operators that piggyback off of AT&T, Sprint, T-Mobile and Verizon. But the business model that helped sustain MVNOs through the 1990s and 2000s is changing.
Consider Republic Wireless, a Raleigh-based business that announced this month it would sell Motorola's new flagship phone, the Moto X. Republic enjoys all the traditional advantages of an MVNO — low capital expenditures on infrastructure and spectrum — but it's taken the additional step of cutting out 3G and 4G data use whenever it can. Technically, Republic operates on Sprint's network, but it's more appropriate to think of Sprint as a backup for when a call or message can't be completed over WiFi.
Yes, you read that right: WiFi. Republic's business depends on shunting all of your communications — data, voice, everything — onto the free stuff you get in your office or in coffee shops. What makes this beautiful is that whenever a Republic customer chooses to place a call over WiFi, that saves Republic money. As a result, Republic can offer a $5-a-month plan for unlimited talk, text and data. For another $5 a month, customers get access to Sprint's cellular network (minus 3G). Higher-tier plans provide 3G and 4G Internet on Sprint, though it's almost a joke to call them "higher-tier" when the most expensive plan tops out at just $40 a month. The tiered plan supersedes an old, $19-a-month all-you-can-eat plan.
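Conceptually, Republic's routing preference is just "use the unlicensed network whenever it's good enough, fall back to licensed when it isn't." A deliberately simplified sketch of that policy, not Republic's actual handoff logic (which operates down at the telephony layer):

```python
def choose_bearer(wifi_up, wifi_quality, cellular_up, quality_floor=0.5):
    """Pick a network for a call or message, preferring WiFi whenever it's usable.

    wifi_quality: a 0.0-1.0 link-quality score; the 0.5 floor is an assumption.
    """
    if wifi_up and wifi_quality >= quality_floor:
        return "wifi"       # offloaded: costs the MVNO essentially nothing
    if cellular_up:
        return "cellular"   # the paid fallback onto Sprint's network
    raise RuntimeError("no usable network")

print(choose_bearer(wifi_up=True, wifi_quality=0.8, cellular_up=True))  # wifi
print(choose_bearer(wifi_up=True, wifi_quality=0.2, cellular_up=True))  # cellular
```

Every call that resolves to the first branch is revenue the customer didn't hand to a carrier, which is the business model in miniature.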
"The crazy plans at $5 and $10 have never been tried," said CEO David Morken. "That's because we focus on unlicensed spectrum as the primary, and licensed spectrum as the secondary."
That's the opposite of the way traditional wireless companies work. Most national providers place a premium on "licensed spectrum," or spectrum that only they have the rights to. The problem is that while valuable spectrum can help increase call quality, buying the rights is expensive. T-Mobile, for example, is reportedly eyeing a $3 billion spectrum deal with Verizon.
Republic pays none of those costs. What's more, because its parent company is the same one that handles calls made over Google Voice, Vonage and a host of other VoIP services, it's gotten incredibly experienced at not dropping your WiFi calls.
It almost sounds too good. And your mileage will certainly vary, depending on where you are and the strength of your connection. But the business model alone is extraordinary, because it threatens one of the main ways that national wireless companies make their money: selling network access.
Other MVNOs are catching on, too. Toronto-based Ting, which charges you separately for minutes, text and data as you use them (rather than bundling it into one opaque monthly rate), reports seeing data consumption drop by between 50 percent and 75 percent as a result of WiFi offloading.
"Our users switch on WiFi at home and at work on their smartphones so much more than the average user," said Elliot Noss, Ting's CEO.
There's some evidence that the large carriers are relying more heavily on WiFi to manage loads, as well — they're just not talking about it much. The growing demand for WiFi all around is one argument for allocating more spectrum for unlicensed usage ahead of a major spectrum auction in 2014. A recent New America Foundation study reports that WiFi offloading saves the wireless industry $20 billion a year, which amounts to 29 percent of its total annual revenues.
That poses a couple of big problems for us all, actually. In a future where MVNOs and large carriers alike push more of their traffic onto WiFi, the incentives to build new mobile infrastructure begin to erode. Why should a carrier invest in expensive network upgrades if it can provide the same experience by dumping traffic onto a customer's home or office network?
Not only does that create potential pitfalls over the long term, but it also transfers more business to providers of fixed, wireline broadband, like cable companies, giving them a great deal more bargaining power in the process.
Asked whether he was concerned about potentially kneecapping one incumbent only to replace it with another, Morken laughed.
"One dragon at a time," he said.

Thursday, November 21, 2013

Wireless Carriers May Be Avoiding Technology to Disable Stolen Smartphones to Prevent Phone Sales from Dipping

As reported by ABC News: About 1.6 million smartphones were stolen in 2012, Consumer Reports estimates.

George Gascón, the district attorney of San Francisco, wants to decrease that number by working with manufacturers to install kill switches that would render smartphones inoperable if reported as stolen. Gascón's biggest opponents aren't the phone manufacturers, but the cellular providers.

Gascón said he reached out to Samsung this summer to implement the kill switches. "They engaged a third-party developer willing to develop it, and said they would roll it out with the Galaxy 5 phones," he told ABC News. "But the carriers said to Samsung, 'Absolutely not.' We were perplexed, so we started to look into it."
Gascón said he is suspicious of the wireless carriers' motives for rejecting the kill switch. "There were email conversations between Samsung and the kill-switch developer, saying that the carriers were concerned about losing business," he said. "I became outraged."
Samsung declined to comment on specific details involving Gascón, but issued the following statement: "We are working with the leaders of the Secure Our Smartphones (S.O.S.) Initiative to incorporate the perspective of law enforcement agencies. We will continue to work with them and our wireless carrier partners towards our common goal of stopping smartphone theft."
It might not be immediately apparent how a kill switch would decrease the number of smartphones stolen. Gascón said it might take some time to trickle down, but that once smartphone thieves see that they can't do anything with a stolen smartphone, their motivation to steal more phones will disappear.
He estimates that any effects could be two to three years down the road, depending on how often people replace their devices or update their operating system.
Both Verizon and AT&T declined to speak about the issue and deferred to CTIA-The Wireless Association for further comment. Jamie Hastings, vice president of external and state affairs for CTIA, did not directly address the decision regarding kill switches, but said all carriers are working on a multi-pronged approach to lower the number of phone thefts in the country.
"One of the components of the efforts was to create an integrated database designed to prevent stolen phones from being reactivated," Hastings said in a statement. "To assist users, we offer a list of apps to download that will remotely erase, track and/or lock the stolen devices."
Kevin Mahaffey, the chief technology officer of Lookout Mobile Security, said it's also important not to rush into any manufacturing decision that could have a big impact. "There are different risks associated with different technologies in order to solve a problem," he said. "There's no silver bullet or pixie dust to make it work."
While a kill switch might deter thieves, it could increase the risk of a cyberattack that could affect millions of phones at a time. "You have to appreciate the carrier perspective as well," Mahaffey said. "If your phone stops working, who do you expect to replace it?"
Like many issues, it all comes down to better understanding and communication between law enforcement, cell carriers and phone manufacturers.
"No one party has the whole picture," he said. "Each has their own insight, and we need to get all of these parties to work together."

EU Parliament Approves €7 Billion to Complete GNSS/GPS Projects

As reported by PC World: The European Parliament on Wednesday approved €7 billion (US$9.5 billion) in funding to further develop and complete Europe’s satellite navigation programs, including the Galileo and EGNOS projects.


The funding will cover the projects from 2014 to 2020 and will be spent on completion of the satellite navigation infrastructure as well as the development of fundamental components such as Galileo-enabled chipsets or receivers in smartphones.
“The overall economic impact of Galileo and EGNOS is estimated to be around €90 billion over the next 20 years,” said Industry Commissioner Antonio Tajani. “In addition to the opening up of new business opportunities, everyday users will be able to enjoy increasingly accurate satellite navigation services with every new satellite launch.”
Galileo, the fully E.U.-owned autonomous satellite navigation system under civil control, will provide its first services at the end of 2014. When fully operational (before 2020), it will provide a freely accessible service for positioning, navigation and timing, using the dual-frequency Galileo Signal in Space.
Positioning and timing signals provided by satellite navigation systems are used in many critical areas including electronic trading, mobile phone networks, power grid synchronization, air traffic management and, of course, in-car navigation.
EGNOS (the European Satellite Based Augmentation System) has been fully operational since 2011. It works to increase the accuracy of GPS positioning, making it suitable for safety-critical applications such as aircraft navigation. EGNOS improves the positioning accuracy of GPS to within three meters. In comparison, people using a GPS receiver without EGNOS can only be sure of their position to within 17 meters.
Responsibilities for the completion and operation of the satellite navigation programs will be divided: the European Commission will remain responsible for the progress of the programs and their overall supervision; the Prague-based European GNSS Agency will gradually take charge of EGNOS and Galileo’s operational management; and the deployment of Galileo, along with the design and development of next-generation systems, will be entrusted to the European Space Agency.
The Council of the E.U. is expected to approve the regulation at a ministerial meeting next month. It will then enter into force on Jan. 1.

Wednesday, November 20, 2013

Proposed FUEGO Satellite Could Locate Wildfires in Real Time

The proposed Berkeley FUEGO satellite would continuously scan the US for wildfires, so they could potentially be controlled before they become overwhelming. The technology could also ease pressure on the roughly $2.5 billion the US sets aside annually to fight wildfires, as well as reduce the property damage and loss of life the fires invariably cause among both civilians and firefighters.
As reported by the UC Berkeley News Center: As firefighters emerge from another record wildfire season in the Western United States, University of California, Berkeley, scientists say it’s time to give them a 21st century tool: a fire-spotting satellite.

Such a satellite could view the Western states almost continuously, snapping pictures of the ground every few seconds in search of hot spots that could be newly ignited wildfires. Firefighting resources could then be directed to these spots in hopes of preventing the fires from growing out of control and threatening lives and property.
The UC Berkeley scientists have designed such a satellite using state-of-the-art sensors, written analysis software to minimize false alarms, and even given it a name – the Fire Urgency Estimator in Geosynchronous Orbit (FUEGO). They’re hopeful it can be built for several hundred million dollars, either by government or private entities.
“If we had information on the location of fires when they were smaller, then we could take appropriate actions quicker and more easily, including preparing for evacuation,” said fire expert Scott Stephens, a UC Berkeley associate professor of environmental science, policy and management. “Wildfires would be smaller in scale if you could detect them before they got too big, like less than an acre.”
Stephens, physicist Carl Pennypacker, remote sensing expert Maggi Kelly and their colleagues describe the satellite in an article published online Oct. 17 by the journal Remote Sensing.
“With a satellite like this, we will have a good chance of seeing something from orbit before it becomes an Oakland fire,” said Pennypacker, a research associate at UC Berkeley’s Space Sciences Laboratory and scientist at Lawrence Berkeley National Laboratory, referring to the devastating 1991 fire that destroyed more than 3,000 homes in Berkeley and Oakland. “It could pay for itself in one firefighting season.”
With global warming, Stephens said, wildfires are expected to become more frequent and more extensive. This year alone, California’s firefighting arm, CAL FIRE, has responded to over 6,000 wildfires, 1,600 more than average, according to tweets by the department’s information officer Daniel Berlant. Wildfire-prone areas stretching from Spain to Russia could also benefit from their own dedicated satellites.
Updating an outmoded system
Fire detection today is much like it was 200 years ago, Stephens said, relying primarily on spotters in fire towers or on the ground and on reports from members of the public. This information is augmented by aerial reconnaissance and lightning detectors that steer firefighters to ground strikes, which are one of the most common wildfire sparks.
Infrared images of the area around Yosemite National Park on Aug. 17, 2013, before and 10 minutes after ignition of the Rim Fire. The images, taken by the GOES weather satellite, show that fire hotspots can be detected from space. GOES is a powerful, all-purpose satellite that was not designed exclusively for fire detection, unlike the proposed FUEGO geosynchronous satellite, which could scan areas every few minutes.
“Even today, most fires are detected, in some way or another, by people,” he said. “Even the Rim Fire near Yosemite National Park this past summer was detected by someone who saw a smoke column.”
But satellite technology, remote sensing and computing have advanced to the stage where it’s now possible to orbit a geostationary satellite that can reliably distinguish small, but spreading, wildfires with few false alarms. Pennypacker estimates that the satellite, which could be built and operated by the federal government, like the Geostationary Operational Environmental Satellite (GOES); as a partnership between government and the private sector, like the Landsat satellite program; or by a private company alone, would cost several hundred million dollars – a fraction of the nation’s $2.5 billion yearly firefighting budget.
The idea of a fire detection satellite has been floated before, but until recently, detectors have been prohibitively expensive, and the difficulty of discriminating a small burning area from other bright hotspots, such as sunlight glinting off a mirror or windshield, made the likelihood of false alarms high. Today, computers are faster, detectors cheaper and more sensitive, and analysis software far more advanced, making false alarms much less likely, according to researchers.
“Simply put, we believe we have shown that this kind of rapid, sensitive fire detection of areas bigger than 10 feet on a side is probably feasible from space, and we have evidence that the false alarm rate will not be crazy,” said Pennypacker, who has designed sensitive satellite-borne detectors for 40 years. “Our work requires further testing, which we are eager to do.”
The approach is similar to what Pennypacker and colleague Saul Perlmutter used 20 years ago to search for exploding stars to study the expansion of the universe. In that case, they created an automated system to compare consecutive images of the night sky to look for new points of light that could be supernovas. Perlmutter, UC Berkeley professor of physics, shared the 2011 Nobel Prize in Physics for this work, which proved that the expansion of the universe is accelerating.
How it works
“In concept, this is a simple system: a telephoto camera, an infrared filter and a recording device. We are just looking for something bright compared to the surroundings or changing over time,” Kelly said. “Then, we do these rapid calculations to determine if one image is different from the next.”
Images taken in two different infrared wavelengths reveal different details of a smoky fire, demonstrating that a fire-spotting satellite could see ignition sites obscured by smoke. These images are of a 2003 fire in the San Bernardino National Forest near Los Angeles, taken by the ASTER satellite.
Pennypacker and graduate student Marek K. Jakubowski developed a computer analysis technique, or algorithm, to detect these differences in space and time and to distinguish them from bright lights that might look like fires. This involves several billion calculations per second on images taken every few seconds, covering the entire West every few minutes. The new paper reports on tests of this algorithm using existing imagery from real fires, but the team hopes to get funding to test the system on a fire that is starting, such as a prescribed burn.
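The published algorithm is considerably more careful than this (it has to reject glints, clouds and detector noise), but its core operation, differencing consecutive infrared frames of the same scene and flagging pixels that suddenly brighten, can be sketched in a few lines of NumPy. The threshold and image sizes below are invented for illustration:

```python
import numpy as np

def hotspot_candidates(prev_frame, curr_frame, rise_threshold=50.0):
    """Flag pixels whose infrared brightness jumps between consecutive frames.

    prev_frame, curr_frame: 2-D arrays of IR intensity from the same pointing.
    Returns (row, col) indices of candidate new hotspots.
    """
    diff = curr_frame.astype(float) - prev_frame.astype(float)
    return np.argwhere(diff > rise_threshold)  # sudden brightening, not steady sources

rng = np.random.default_rng(0)
before = rng.normal(100.0, 5.0, (512, 512))   # synthetic IR background
after = before.copy()
after[200, 300] += 80.0                       # a new ignition-sized hot pixel
print(hotspot_candidates(before, after))      # -> [[200 300]]
```

The hard part FUEGO's software must add on top is deciding, across the several billion comparisons per second the paper describes, which of those candidates behave like a growing fire rather than a reflection or a sensor glitch.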
“The point is, satellites like Landsat and GOES provide great information after a fire starts; they can focus and monitor a fire by looking at smoke plumes, fire spread, hot spots at the edges, etc.,” Kelly said. “FUEGO is designed for early detection of smaller fires. Right now, we lose a lot of time because fires are already big by the time we see them.”
The FUEGO design, for which UC Berkeley has filed a patent, was developed with funds from the Office of the Vice Chancellor for Research.