
Sunday, November 10, 2013

All About Beamforming, the Faster Wi-Fi You Didn't Know You Needed

As reported by PC World: Beamforming is one of those concepts that seem so simple that you wonder why no one thought of it before. Instead of broadcasting a signal to a wide area and hoping to reach your target, why not concentrate the signal and aim it directly at the target?


Sometimes the simplest concepts are the most difficult to execute, especially at retail price points. Fortunately, beamforming is finally becoming a common feature in 802.11ac Wi-Fi routers (at least at the high end). Here’s how it works.
First, a bit of background: Beamforming was actually an optional feature of the older 802.11n standard, but the IEEE (the international body that establishes these standards) didn’t spell out how exactly it was to be implemented. The router you bought might have used one technique, but if the Wi-Fi adapter in your laptop used a different implementation, beamforming wouldn’t work.
Some vendors developed pre-paired 802.11n kits (with Netgear’s WNHDB3004 Wireless Home Theater Kit being one of the best examples), but these tended to be expensive, and they never had much of an impact on the market.
The IEEE didn’t make the same mistake with the 802.11ac standard that’s in today’s high-end devices. Companies building 802.11ac products don’t have to implement beamforming, but if they do, they must do so in a prescribed fashion. This ensures that every company’s products will work together. If one device (such as the router) supports beamforming, but the other (such as the Wi-Fi adapter in your laptop) doesn’t, they’ll still work together. They just won’t take advantage of the technology.
Beamforming can help improve wireless bandwidth utilization, and it can increase a wireless network’s range. This, in turn, can improve video streaming, voice quality, and other bandwidth- and latency-sensitive transmissions.
Beamforming is made possible by transmitters and receivers that use MIMO (multiple-input, multiple-output) technology: Data is sent and received using multiple antennas to increase throughput and range. MIMO was first introduced with the 802.11n standard, and it remains an important feature of the 802.11ac standard.
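To get a rough feel for why multiple antennas increase throughput, here’s a minimal Python sketch of the standard MIMO channel-capacity formula. It is purely illustrative: the channel matrices are randomly generated and aren’t tied to any particular 802.11 implementation.

```python
# Illustrative only: Shannon capacity of a MIMO link with a random
# (hypothetical) channel matrix H, versus a single-antenna link.
import numpy as np

def mimo_capacity_bps_per_hz(H, snr_linear):
    """C = log2 det(I + (SNR/Nt) * H * H^H), the classic MIMO capacity
    formula with equal power split across Nt transmit antennas."""
    nr, nt = H.shape
    eye = np.eye(nr)
    return np.log2(np.linalg.det(eye + (snr_linear / nt) * H @ H.conj().T)).real

rng = np.random.default_rng(0)
snr = 10 ** (20 / 10)  # 20 dB SNR, chosen arbitrarily for the example

H_siso = rng.normal(size=(1, 1)) + 1j * rng.normal(size=(1, 1))
H_mimo = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))  # 3x3, like many 802.11n/ac radios

print(f"1x1 capacity: {mimo_capacity_bps_per_hz(H_siso, snr):.1f} bit/s/Hz")
print(f"3x3 capacity: {mimo_capacity_bps_per_hz(H_mimo, snr):.1f} bit/s/Hz")
```

With a random 3x3 channel at this SNR, the multi-antenna link typically comes out at close to three times the single-antenna capacity, which is the intuition behind MIMO’s throughput gains.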

How beamforming works

Wireless routers (or access points) and wireless adapters that don’t support beamforming broadcast data pretty much equally in all directions. For a mental picture, think of a lamp without a shade as the wireless router: The bulb (transmitter) radiates light (data) in all directions.
Devices that support beamforming focus their signals toward each client, concentrating the data transmission so that more data reaches the targeted device instead of radiating out into the atmosphere. Think of putting a shade on the lamp (the wireless router) to reduce the amount of light (data) radiating in all directions. Now poke holes in the shade, so that concentrated beams of light travel to defined locations (your Wi-Fi clients) in the room.
If the Wi-Fi client also supports beamforming, the router and client can exchange information about their respective locations in order to determine the optimal signal path. Any device that beamforms its signals is called a beamformer, and any device that receives beamformed signals is called a beamformee.
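For the curious, the hole-in-the-lampshade effect boils down to phase alignment. The sketch below shows generic phased-array steering: it computes per-antenna phase weights for a four-antenna array aimed at a client 30 degrees off broadside, then checks the array gain in a couple of directions. This is textbook array math, not the actual channel-sounding and feedback protocol that 802.11ac specifies.

```python
# Illustrative phased-array beam steering: each antenna's signal is
# phase-shifted so the transmissions add constructively toward the client.
import numpy as np

def steering_weights(n_antennas, spacing_wl, angle_deg):
    """Complex weights that steer a uniform linear array toward angle_deg
    (0 = broadside), with antennas spaced spacing_wl wavelengths apart."""
    n = np.arange(n_antennas)
    phase = 2 * np.pi * spacing_wl * n * np.sin(np.radians(angle_deg))
    return np.exp(-1j * phase) / np.sqrt(n_antennas)

def array_gain_db(weights, spacing_wl, angle_deg):
    """Gain of the weighted array in a given direction, in dB relative
    to a single antenna."""
    n = np.arange(len(weights))
    response = np.exp(1j * 2 * np.pi * spacing_wl * n
                      * np.sin(np.radians(angle_deg)))
    return 20 * np.log10(abs(weights @ response))

w = steering_weights(n_antennas=4, spacing_wl=0.5, angle_deg=30)
for direction in (30, -45):  # ~ +6 dB toward the client, several dB down elsewhere
    print(f"gain toward {direction:+d} deg: {array_gain_db(w, 0.5, direction):+.1f} dB")
```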

Netgear's Beamforming+

As mentioned earlier, beamforming support is an optional element of the 802.11ac standard, and any vendor offering it must support a specific technique. But the vendor can also offer other types of beamforming in addition to that standard technique.
Netgear’s Beamforming+ is a superset of the beamforming technique defined in the 802.11ac standard, so it’s interoperable with any other 802.11ac device that also supports beamforming. But Beamforming+ does not require the client device to support beamforming, so you could see range and throughput improvements by pairing one of Netgear’s routers (specifically, models R6300, R6200, and R6250) with any 5GHz Wi-Fi device. (Netgear’s R7000 Nighthawk router also supports beamforming on its 2.4GHz network.)
Netgear is not the only router manufacturer to support beamforming, of course. It’s becoming a common feature on all of the higher-end Wi-Fi routers and access points. If you’re in the market and want a router that supports beamforming, check the router’s specs on the box or at the vendor’s website. Here are three other routers you might consider: the Linksys EA6900, the D-Link DIR-868L, and the Trendnet TEW-812DRU.

Saturday, November 9, 2013

New Foundation Formed to Pursue eLoran as Backup for GPS

As reported by Inside GNSS: A new nonprofit has been launched to push for repurposing the United States’ old Loran-C infrastructure to support a new, privately funded Enhanced Loran (eLoran) service as a backup to GPS.

The Resilient Navigation and Timing Foundation (RNT Foundation), headquartered in Alexandria, Virginia, was formed to support the creation of an Enhanced Loran or eLoran service, possibly through a public-private partnership to be funded by the system’s users. ELoran, the foundation said, is needed to provide a backup to GPS, a key element of the nation’s critical infrastructure that is increasingly at risk.

“The Department of Homeland Security has determined that the GPS timing signal is essential for the operation of 11 of the nation’s 16 critical infrastructure sectors; all sectors use GPS information in some form. The number of jamming and spoofing incidents in the US continues to grow each year and the threat to our national, homeland and economic security increases,” the RNT Foundation wrote in a white paper released Wednesday (November 6, 2013).

The need for some sort of secondary system was underscored by a Government Accountability Office report, also released Wednesday, that said federal agencies have failed to do enough to back up the GPS system and mitigate disruptions.

“Few people realize how navigation and timing services are essential to nearly every facet of our lives,” said RNT President and Executive Director Dana Goward in a statement. “Every nation needs multiple, and complementary services to help ensure navigation and timing signals are available whenever and wherever needed.”

Goward recently retired from the U.S. Coast Guard where he was director for Marine Transportation Systems. He served more than 40 years in the Coast Guard, both in uniform and as a civilian, and led delegations to several international organizations as the nation’s maritime navigation authority.

Joining Goward at the foundation as vice president is Martin Faga, a former CEO of MITRE Corporation and a past assistant secretary of the Air Force. “Our society has become dependent on navigation and timing systems, requiring a robust backup to our primary Global Navigation Satellite Systems, like GPS and others,” Faga said in the organization’s announcement.

Loran, shorthand for LOng RAnge Navigation, is a network of fixed, terrestrial radio beacons that broadcasts signals centered on 100 kHz. This is a far lower frequency than that used by the GPS system, meaning Loran signals will not be affected by a jammer targeting GPS signals. Loran signals are also far more powerful, 1.3 million times stronger on average, said Goward, which means they can be used indoors, underground, and underwater. Proponents insist that eLoran can be a fully independent source of positioning and timing signals and a backup to GPS if GPS were disrupted.
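As a back-of-envelope check on that figure (assuming it refers to received signal power), 1.3 million times stronger works out to about 61 decibels:

```python
import math

power_ratio = 1.3e6  # Goward's "1.3 million times stronger" figure
print(f"{10 * math.log10(power_ratio):.1f} dB")  # prints 61.1 dB
```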

ELoran, which is in use in several nations around the world, has proven itself to be a reliable backup, proponents say.

“The British system has shown that the technology is more than mature. The South Koreans — the (request for proposals) to build their system closes on the 13th of this month — they do not see it as developmental; they see it as current market technology. As do apparently the Saudis and the Indians who are also looking to contract for systems,” said Goward.

At one point, the United States was also going to establish an eLoran network and spent some $160 million to upgrade the existing Loran-C beacons before changing direction. Shortly after it came into office, the Obama administration decided to terminate the U.S. program, possibly as a cost-saving measure. The system ceased operation early in 2010.

Resurrecting plans for eLoran in the United States would give the country a separate system with 8-10 meter navigation accuracy and timing accuracy of better than 50 nanoseconds, said the foundation’s white paper. The system could be built in the continental U.S. in three years for about $40 million, with usable signals potentially available within the first year of operation. The cost to run the system would be about $16 million a year, the organization estimated.
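Those two numbers are consistent with each other: radio signals travel at the speed of light, so a 50-nanosecond timing error corresponds to roughly 15 meters of range error, the same order of magnitude as the quoted 8-10 meter positioning accuracy. A one-line sanity check:

```python
c = 299_792_458        # speed of light, m/s
timing_error = 50e-9   # the white paper's 50 ns timing figure
print(f"{c * timing_error:.1f} m")  # prints 15.0 m of equivalent range error
```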

Funding Ideas: Fees on Receivers, Telephones, Electricity
To finance the system, the foundation is proposing that a public-private partnership (P3) be formed to take over the existing Loran sites and convert them to eLoran service. Revenue could come from carrying high-priority, critical text messages for a fee or from contracting with companies or government agencies to use eLoran to detect GPS interference. The P3 could also charge for a high-accuracy timing service that provides precision users with timing signals as accurate as 30 nanoseconds.

Among the other ideas are a $1 tax on each standalone or integrated eLoran and satellite navigation receiver sold — which could generate $20 million annually in the United States and “fund operation of the entire system” according to the white paper. A temporary eight-cent fee on every monthly U.S. cell phone and electric bill could, in one year, provide enough funding to endow the P3 in perpetuity.

“All of those are just possible funding ideas. We think there are many ways that the public-private partnership could be funded with no direct cost to the taxpayer,” Goward told Inside GNSS.

Whether or not the foundation would be part of the public-private partnership is to be determined.

“The important thing is to get these things done. Whether we are part of the partnership or not, or we just advocate for the partnership – as long as the end result is that the U.S. gets the system and the resiliency that we need to protect our critical infrastructure and ensure our citizens are safe we’ll consider it a job well done,” Goward said.

One company that would like to be part of the public-private partnership is UrsaNav, Inc., of Chesapeake, Va.

UrsaNav would “absolutely” be interested in participating in a future eLoran system supported by the private sector, president and co-founder Charles Schue told Inside GNSS. “I think there is room in the public-private partnership for various interested companies.”

The firm is already among the supporters of the foundation and has provided some financial backing, said Schue.

The opportunity could be lost if there is no action soon, he added.

“They are pretty far along (in tearing down the existing sites),” said Schue, “and they are moving pretty fast. They have taken down a lot of antennas already — the transmitting antennas.”

The government plans to take down more antennas in December and still more in March 2014, he said.

“They are also removing equipment — technology. So you have, from my perspective, $160 million of taxpayer money spent to put new technology at the sites and that technology seems to be headed out the door to the scrap pile,” said Schue. “That seems to be a real waste.”

If a possibility exists that the sites might be put to use for eLoran, then perhaps work to dismantle and divest the sites should be delayed, suggested Schue. “It would seem to me that during the period that they are thinking about it, they would want to preserve stuff instead of getting rid of stuff that you may have to go out and buy again.”

Getting a decision on the sites, however, may not be easy.

“It is not a technical problem,” said Schue. “It is probably not even a money problem because the money is so small. It is just the political will for the agencies that are charged with solving GPS vulnerability — that’s (the departments of Homeland Security, Transportation and Defense) — to just make a decision to do something. That’s all that’s required is the decision. The technology is proven and the money is budget dust.”

Critical Infrastructure Not Prepared for GPS Disruption

As reported by GCN: Although position, navigation and timing services from the Global Positioning System are widely used in the nation’s critical infrastructure, government and industry are not prepared to address the risks of GPS disruptions, according to a recent study.
The Government Accountability Office said GPS has become “an invisible utility” underpinning many applications critical to the nation’s security and economy. A number of executive directives have mandated programs to detect and mitigate accidental or malicious interference, but, “sectors’ increasing dependency on GPS leaves them potentially vulnerable to disruptions,” GAO concludes.
The Transportation and Homeland Security departments have primary responsibility for ensuring the security of systems relying on GPS, but a lack of resources and cooperation has limited progress in identifying backup technologies in the last eight years.
In its report, GPS Disruptions: Efforts to Assess Risks to Critical Infrastructure and Coordinate Agency Actions Should Be Enhanced, GAO recommends that DHS produce a more reliable assessment of the risks of GPS disruption together with metrics for the effectiveness of risk mitigation and that the two departments establish a formal agreement laying out roles and responsibilities.
GPS is a satellite-based system providing precise timing signals that can also be used to determine position and for navigation. Timing functions are used widely in critical infrastructure, and transportation industries, particularly aviation and maritime, use GPS for navigation. Because it relies on radio signals, GPS is susceptible to natural interference from weather on Earth and in space, to accidental interference from other devices, and to intentional blocking or spoofing.
Disruptions to service have not been common. The U.S. Coast Guard, which fields reports of GPS problems, received 44 such reports in 2012. But reporting is not mandatory, and GAO noted that USCG’s role is not widely known, so incidents could be underreported.
GAO examined the use of GPS in four critical infrastructure sectors — communications, energy, financial services and transportation — as well as DHS and DOT efforts at risk management. The communications and transportation industries are most reliant on the service, although the financial services and energy sectors use its timing features to a lesser extent.
DHS has produced a National Risk Estimate for GPS, released in 2012 for official use only. GAO criticized the report, saying it is incomplete and has limited usefulness because it does not meet the department’s own guidance for risk management. DHS defended the study, saying that its scope was limited, that it fulfills its intended purposes and that it “sufficiently characterized the risk environment.”
GPS is the backbone for NextGen, the Federal Aviation Administration’s next-generation air traffic control system, and because of its use for navigation DOT is the lead civilian agency for GPS reliability efforts. The department was charged in a 2004 national security directive with developing backup capabilities for government and industry, with the assistance of DHS. An implementation plan for a national position, navigation and timing architecture has been released, and potential backup alternatives for FAA NextGen are being researched. Current navigational alternatives to GPS do not support NextGen capabilities, and FAA expects to make a decision by 2016 on a backup system.
USCG is doing research to test alternative non-space-based sources of timing, NIST is researching the possibility of using the nation’s fiber networks as an alternative, and DHS has commissioned a study of ways to detect and mitigate sources of disruptions. The Defense Advanced Research Projects Agency is also working on alternative navigation tools.
But GAO found little progress had been made in identifying adequate backup technologies, due to a lack of staffing and budget and to a lack of cooperation between the two lead departments. Roles in the effort have not been clearly defined, and DOT sees its job as addressing only the needs of the transportation sector, leaving the rest to DHS. But DHS says that the terms of the directive put DOT in the lead position.
DHS said that it will establish a formal, written agreement with DOT “that will clearly delineate roles and responsibilities” in developing GPS backup capabilities. But it noted that “the ability to fully implement agreed-upon shared tasks will be contingent on the availability of personnel and financial resources.”

Friday, November 8, 2013

Inertial Sensors Boost Smartphone GPS Performance

As reported by MIT Technology Review: If you've ever used a smartphone to navigate, you’ll know that one of the biggest problems is running out of juice. GPS sensors are a significant battery drain, so any journey of appreciable length requires some kind of external power source. Added to that is the difficulty of even getting a GPS signal in city-center locations, where towering office blocks, bridges, and tunnels regularly conspire to block the signal.

So a trick that reduces power consumption while increasing the device’s positioning accuracy would surely be of use.

Today, Cheng Bo at the Illinois Institute of Technology in Chicago and a few pals say they've developed just such a program, called SmartLoc, and have tested it extensively while traveling throughout the Windy City.

They say that in the city, GPS has a positioning accuracy of about 40 meters. By comparison, their SmartLoc system pinpoints its location to within 20 meters, 90 percent of the time.

So how have these guys achieved this improvement? The trick that Bo and pals use is to exploit the smartphone’s inertial sensors to determine its position whenever the GPS is offline.

The way this works is straightforward. Imagine a smartphone fixed to the windscreen of a car driving around town. Given a GPS reading to start off with, the smartphone knows where it is on its built-in or online map. It then uses its inertial sensors to measure acceleration, indicating a move forward, a turn to the left or right, and so on.

By itself, this kind of data is not very useful because it’s hard to tell how far the vehicle has traveled and whether the acceleration was the result of the car speeding up or going over a humpback bridge, for example.

To get around this, the smartphone examines the section of road on the map looking for road layouts and features that might influence the sensors; things like bends in the road, traffic lights, humpback bridges and so on. Each of these has a specific inertial signature that the phone can spot. In this way, it can match the inertial signals to the road features at that point.

The key here is that each road feature has a unique signature. Bo and co have discovered a wide range of inertial signatures, such as the deceleration, waiting and acceleration associated with a set of traffic lights, the forces associated with turnings (and how these differ from the forces generated by changing lanes, for example) and even the change in the force of gravity when going over a bridge.
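The paper’s exact algorithm isn’t spelled out here, but the general idea can be sketched as dead reckoning plus landmark correction: integrate forward acceleration to estimate distance traveled, and snap the estimate back to a known map feature whenever that feature’s inertial signature is detected. In the toy Python sketch below, the feature positions, thresholds, and detection rules are all made-up placeholders, not SmartLoc’s actual classifier.

```python
# Hypothetical sketch of inertial dead reckoning with landmark correction,
# in the spirit of SmartLoc (not the authors' actual algorithm).

# Map features along the road, as distance from the last GPS fix (meters).
# These positions and labels are made-up examples.
ROAD_FEATURES = [
    (150.0, "traffic_light"),   # stop, wait, accelerate
    (420.0, "left_turn"),       # sustained lateral acceleration
    (700.0, "bridge"),          # brief change in vertical acceleration
]

def classify_signature(accel_window):
    """Toy detector: map a window of (forward, lateral, vertical) accel
    samples to a feature label, or None. A real system would use learned
    signatures for each feature type."""
    fwd = [a[0] for a in accel_window]
    lat = [a[1] for a in accel_window]
    if min(fwd) < -2.5 and max(fwd) > 2.0:
        return "traffic_light"          # hard stop followed by acceleration
    if max(abs(a) for a in lat) > 3.0:
        return "left_turn"
    return None

def dead_reckon(samples, dt, features=ROAD_FEATURES):
    """Integrate forward acceleration into distance traveled; when a
    detected signature matches a map feature, snap to its position,
    cancelling the drift accumulated since the last fix."""
    velocity, distance = 0.0, 0.0
    window = []
    for accel in samples:               # accel = (forward, lateral, vertical), m/s^2
        velocity += accel[0] * dt
        distance += velocity * dt
        window.append(accel)
        if len(window) > 50:            # keep ~5 s of history at 10 Hz
            window.pop(0)
        label = classify_signature(window)
        if label:
            candidates = [pos for pos, kind in features if kind == label]
            if candidates:
                distance = min(candidates, key=lambda p: abs(p - distance))
            window.clear()
    return distance
```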

(Image caption: Note the conspicuously missing GPS positions in the area of the Eisenhower Tunnel on I-70 in Colorado.)
Having gathered this data, the SmartLoc program looks for these signatures while the car is on the move. These guys have tested it using a Galaxy S3 smartphone on the city streets in Chicago and say it works well. They point out that in the city center, the GPS signal can disappear for distances of up to a kilometer, which would leave a conventional navigation system entirely confused.

However, SmartLoc simply fills in the gaps using its inertial signature database and a map of the area. “Our extensive evaluations shows that SmartLoc improves the localization accuracy to less than 20m for more than 90% roads in Chicago downtown, compared with ≥ 50% with raw GPS data,” they say.

That certainly looks handy. And this kind of performance could also help save battery power by allowing a smartphone to periodically switch off the GPS sensor and run only using the inertial sensor.
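As a sketch of how that duty cycling might work (an assumption on our part; the paper’s actual policy may differ): leave the GPS radio off while the estimated dead-reckoning drift stays small, and wake it for a fresh fix only when the accumulated uncertainty approaches the accuracy target.

```python
# Hypothetical GPS duty-cycling policy: leave the GPS radio off while the
# inertial position estimate is trusted, and wake it when estimated drift
# grows too large. The drift model below is a made-up placeholder.

DRIFT_RATE_M_PER_S = 0.5   # assumed dead-reckoning drift growth rate
MAX_DRIFT_M = 20.0         # re-fix before error exceeds ~20 m

def gps_should_wake(seconds_since_fix: float) -> bool:
    return seconds_since_fix * DRIFT_RATE_M_PER_S >= MAX_DRIFT_M

# With these made-up numbers, the GPS wakes every 40 s instead of running
# continuously, which is where the battery savings would come from.
for t in (10, 30, 40):
    print(t, gps_should_wake(t))
```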

What Bo and co don’t do is explain their plans for their new system. One obvious idea would be to release it as an app; it clearly already works on the Android platform. Another idea would be to sell the technology to an existing mapping company. Perhaps they’re planning both. Whatever the goal, it seems worth keeping an eye on.

Share of Proprietary RF in Stolen Vehicle Tracking Systems to Drop to 10% by 2018

As reported by Business Wire: While the total number of stolen vehicle tracking solutions in use globally will increase from 35.5 million in 2013 to 143 million in 2018, the share of proprietary RF based systems will decrease from 24% in 2013 to 10% in 2018.

“Stolen vehicle tracking, recovery, slow down or immobilization has been offered by vendors such as GM-OnStar for more than a decade as a factory installed solution. It has also seen success as an aftermarket solution for high-end vehicles with LoJack's RF-based approach positioned as a premium offer providing deep indoor coverage allowing recovery rates of more than 90%. As embedded factory-installed telematics technology and advanced indoor location become more widespread, aftermarket solutions will gradually become less popular, and in its wake, RF will lose overall market share. However, in regions like the US and South Africa RF will continue to be used respectively as a premium technology for high-end vehicles and as a reliable and robust technology to address rampant vehicle theft,” says VP and practice director Dominique Bonte.

This disruptive trend is prompting established RF SVT players to shift their strategies. Market leader LoJack has entered into a partnership with TomTom aimed at jointly selling premium SVT and fleet management services through LoJack’s dealer channels. At the same time, LoJack is transitioning to pre-install business and recurring revenue models, leveraging its privileged law enforcement relationships as well as its premium brand image as sustainable competitive advantages. DigiCore has started combining the complementary characteristics of RF and cellular/GNSS technologies to provide continuous, deep-indoor location tracking.

Thursday, November 7, 2013

Google Maps Adds Waze Traffic Data to the Desktop, Brings Back Pegman

As reported by Engadget: The dramatic overhaul of Google Maps on the desktop saw the world lose a good friend. That friend: Pegman.

The tiny yellow avatar that you could drop almost anywhere to get instant access to Street View disappeared. Instead, the ground-level perspective was accessed by first clicking on a point on the map, then selecting Street View from the pop-over in the upper left-hand corner. Clearly, that's much less convenient. Thankfully, Pegman is making a grand return with the latest update to Google Maps on the desktop.

Additionally, that Waze acquisition is continuing to pay dividends. The company's traffic data is finally coming to the desktop site, after being added to the Android and iOS mobile apps back in August. You'll be able to see areas of congestion and even spot incidents like accidents that bring your commute to a standstill. Slowly but surely, the new Google Maps is reaching feature parity with its predecessor, thanks to constant updates like these.
Mapping isn't just about navigation, however. Google also sees it as a tool for exploration and education. That's why it's pushing a new feature called Earth Tours, which brings 3D bird's-eye imagery of particular locales to WebGL-enabled browsers. Now you can fly around Boston or the Alps, just like you would in Google Earth, but without the need to install another piece of software. A video below demonstrates many of these new features:



The Internet Killed Distance. Mobile Computing Brought It Back.

Here’s why location matters again in e-commerce.
As reported by MIT Technology Review: For retailing, the key change produced by the Internet is that shopping online spared consumers the economic costs (in time, grief, and gas money) of visiting a store and locating a product. This has been called the “death of distance.”

When even isolated individuals can buy anything from a global marketplace, physical location does not confer any commercial advantage, and online merchants might be expected to win every battle.

But an emerging body of economic research shows that there is no independent “online world.” Physical context matters to e-commerce. It shapes our choices and tastes, and it strongly determines what we buy online. With the rise of mobile computing, these local effects matter even more.

Given how easy it is to find and buy books, electronics, and other items online, why do people continue to buy in stores at all? The reason is that online buying generates what economists call disutility: inspecting products digitally is difficult, shipping can be slow or expensive, and returning products can be challenging.

Research shows that people weigh these disadvantages against the benefits of buying online. Along with colleagues Chris Forman and Anindya Ghose, I examined what happened to Amazon’s book sales at 1,497 U.S. locations when a Walmart or Barnes & Noble opened nearby. We found that customers who lived near the newly opened stores bought many fewer best-sellers from Amazon.


This means that for mainstream products, local retail options—the offline world—had large economic effects on online business. The physical environment shapes online behavior in other powerful ways. Neighbors tend to like the same music, books, and cars. Social networks are also local. Most e-mail a person receives comes from the same city, often the same building. So even though we speak of the Internet as a “place” where users “visit” websites, this metaphor falls flat when we consider actual behavior. All online behavior has an offline context.

Mobile computing strengthens the links between online and offline life. Before, online activity happened in a specific place: sitting at a desk. Now smartphones mean that wherever consumers happen to be, they can gather information online, compare prices, or buy something. Brick-and-mortar stores worry that customers might be browsing products in their aisles, but buying online (see “It’s All E-Commerce Now”).

Yet the offline environment is actually more important when consumers connect through a mobile device. With colleagues including Sang Pil Han of the City University of Hong Kong, I studied 260 users of a South Korean microblogging service similar to Twitter. We found that behavior on the small mobile screen differed from behavior on the PC. Searching became harder to do, meaning that people clicked on the top links more often. The local environment also mattered more: ads for stores close to a user’s home were more likely to be viewed. For every mile closer a store was, smartphone users were 23 percent more likely to click on its ad; on a PC, users were only 12 percent more likely to click on ads for nearby stores.

Thus, the mobile Internet is less “Internet-like” than Web browsing on a PC: search costs are higher and distance matters more. We do not yet know how the growth of the mobile Internet will affect the balance between online and offline retailers. But it appears certain that real-world stores will do better if they can leverage the information available online, and that online retailers will need to understand their customers’ offline environment in order to succeed.