Friday, December 20, 2013

California Bill Would Require 'Kill Switch' For Smartphones

As reported by USA Today: Two California officials have announced plans to introduce legislation requiring smartphones to have a "kill switch" that would render stolen or lost devices inoperable.

State Sen. Mark Leno and San Francisco District Attorney George Gascon announced Thursday that the bill, which they believe will be the first of its kind in the United States, will be formally introduced in January at the start of the 2014 legislative session.
Leno, a San Francisco Democrat, joins Gascon, New York Attorney General Eric Schneiderman and other law enforcement officials nationwide who have been demanding that manufacturers create kill switches to combat surging smartphone theft across the country.
"One of the top catalysts for street crime in many California cities is smartphone theft, and these crimes are becoming increasingly violent," Leno said. "We cannot continue to ignore our ability to utilize existing technology to stop cellphone thieves in their tracks. It is time to act on this serious public safety threat to our communities."
Almost 1 in 3 U.S. robberies involve phone theft, according to the Federal Communications Commission. Lost and stolen mobile devices — mostly smartphones — cost consumers more than $30 billion last year, according to a study cited by Schneiderman in June.
In San Francisco alone, more than 50 percent of all robberies involve the theft of a mobile device, and in Los Angeles mobile phone thefts are up almost 12 percent in the last year, the San Francisco DA's office said.
Samsung Electronics, the world's largest mobile phone manufacturer, earlier this year proposed installing a kill switch in its devices. But the company told Gascon's office the nation's biggest carriers rejected the idea.
But the CTIA-The Wireless Association, a trade group for wireless providers, says a permanent kill switch has serious risks, including potential vulnerability to hackers who could disable mobile devices and lock out not only individuals' phones but also phones used by entities such as the Department of Defense, Homeland Security and law enforcement agencies.
The CTIA has been working with the FCC, law enforcement agencies and elected officials on a national stolen phone database that debuted last month.
Gascon and Schneiderman have given manufacturers a June 2014 deadline to come up with solutions to curb smartphone theft.
"I appreciate the efforts that many of the manufacturers are making, but the deadline we agreed upon is rapidly approaching and most do not have a technological solution in place," Gascon said. "Californians continue to be victimized at an alarming rate, and this legislation will compel the industry to make the safety of their customers a priority."

Placing an Economic Value on GPS/GNSS Spectrum

"With other industries fighting for the finite raw material of spectrum, the GPS industry must continue to generate and update its economic valuation work or risk being marginalized in policy debates." (Bartlett Cleland)
As reported by Inside GNSS:  The nation’s leading GPS experts are struggling to quantify how the world’s premier navigation and timing system affects the U.S. economy, an effort critical to building a political firewall around GPS spectrum in the face of ballooning demand for broadband capacity.

“It is an impossible question,” said James Schlesinger, chairman of the National Space-Based Positioning, Navigation, and Timing (PNT) Advisory Board. “It’s methodologically harder than evaluating the Gross Domestic Product. What we are being asked to do is calculate something that is incalculable.”

The research challenge was highlighted in comments by Jules McNeff, vice-president at Overlook Systems Technologies, in the current issue of Inside GNSS: "In the case of GNSS services, these [benefits] are more difficult to quantify because they are both direct and indirect, include second- and third-order benefits, and are not simply revenue streams from direct subscription services. They are spread across all economic sectors and represent revenues from increased efficiencies in logistics and related economies of operation, improved safety, reductions in personnel costs and exposure to hazards, and others."

Alternative data from a similar market report.
The PNT advisory board, which agreed to look at economic impacts last year at the request of the National Executive Committee for Space-Based PNT (ExCom), spent part of its December meeting weighing reports from three researchers it had commissioned to find and assess data on GPS benefits and spectrum valuation that others had already gathered. One of the most recent studies discussed was Issue 3 of the GNSS Market Report, a study released by the European GNSS Agency (GSA) a few months ago.

GSA found that the annual shipment of GNSS devices worldwide jumped from 125 million in 2006 to roughly 850 million in 2012, said Nam Pham of ADP Analytics. By 2022 that is expected to grow to 2.5 billion units a year, bringing the total number of GNSS devices in circulation to around 7 billion — enough for every person on the planet to have one. The total global revenue for GNSS devices was forecast to grow to $143.9 billion by 2022 from $59.4 billion in 2012.
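The GSA shipment and revenue figures above imply steady compound growth. As a quick back-of-the-envelope check, the implied annual growth rates can be computed with the standard CAGR formula (the figures are the ones cited in the article; the percentages are derived, not from the GSA report itself):

```python
# Implied compound annual growth rates (CAGR) from the GSA figures
# cited above. The formula is (end / start) ** (1 / years) - 1.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Annual GNSS device shipments: 125M (2006) -> 850M (2012) -> 2.5B (2022 forecast)
past = cagr(125e6, 850e6, 2012 - 2006)      # ~37.6% per year
forecast = cagr(850e6, 2.5e9, 2022 - 2012)  # ~11.4% per year

# Global GNSS device revenue: $59.4B (2012) -> $143.9B (2022 forecast)
revenue = cagr(59.4e9, 143.9e9, 10)         # ~9.2% per year
```

In other words, the forecast assumes shipment growth slows sharply from its 2006-2012 pace while remaining well above overall economic growth.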

According to the GSA study, the GNSS market is dominated by location-based services (LBS), said Pham, a category that includes GNSS-enabled consumer products such as smartphones, tablets and digital cameras. The GSA found that though such devices make up the vast majority of GNSS devices, they likely will be responsible for only 47 percent of the revenue from 2012 to 2022, a scant lead over the 46.3 percent generated by rail and road applications. Surveying and mapping is expected to produce 4.1 percent of revenue for the period, followed by agriculture and aviation/marine uses at 1.4 and 1.3 percent, respectively.

Of the global activity in 2012, North America generated more than 70 percent of the aviation revenue and nearly 60 percent of that from agriculture. Thirty percent of LBS revenue was from North America, while surveying and road applications came in at around 25 percent each, rail at roughly 15 percent, and maritime applications at around 10 percent.

Direct revenue is only one of the economic factors for any given industry, said Pham. Companies buy raw materials and hire workers who in turn use their wages to buy homes and other goods. Pham estimated that GNSS manufacturers contributed $32 billion in output and $6.8 billion in earnings to the U.S. economy, generating more than 105,000 total jobs in the process. He based his figures on the 2011 U.S. Census Bureau Annual Survey of Manufacturers.

Work with the Data You Have, Not the Data You Want
The European study, however, appeared incomplete to at least one member of the board — an assessment with which Pham agreed.

“There is something missing there,” said board member James Geringer, a former governor of Wyoming. “I think of all the other downstream applications . . . any of the aerial imagery that uses GPS. It is a meaningless service without the GPS or other coordinate references,” Geringer said. The geospatial information system (GIS) firm ESRI, where he serves as director of policy and public sector strategies, does $1 billion in annual business by itself, he said, adding, “You’ve got a great big hole in there somewhere, it seems to me.”

Geringer’s point illustrated the shortcomings in the available data at the heart of a wide-ranging discussion within the board, which was weighing how to approach the question and at what point it should present numbers — even preliminary numbers — to the ExCom.

It is like trying to measure the value of energy, Schlesinger said.

Even so, the European study appears to be one of the best available in that it is recent — it was released in October of this year — and broad in scope.
Most of the economic studies of GPS tend to be very narrowly cast and 5 to 15 years old, so old that they fail to capture the benefits of new applications and rapid shifts in technology, said spectrum expert Bartlett Cleland.

Cleland presented the results of a literature review of the market research. Among those new applications, he said, are GPS-enabled clothing, including shoes that help you find your destination, and GPS-based sports analytics that teams can use to assess and improve their play against a specific opponent.

Moore’s law, which holds that computing power doubles and costs are halved every 18 months, is “roughly applicable” to the GPS industry, Cleland said, and is a realistic guide for how often new studies of an industry are needed.

“About every 18 months you have to reshuffle the deck,” he said.

Getting Equipped for the Spectrum Gold Rush
The need for better data on PNT’s economic impact was made clear during the fight over LightSquared’s proposal to repurpose spectrum near the band used by GPS, thereby creating what tests eventually showed would be debilitating interference to GPS receivers. The economic data presented by the wireless broadband industry at the time seemed to give the telecom companies the upper hand in the policy debate, said Geringer.

GPS signal strength is below the normal noise threshold of the spectrum. Additional 'noise' in the form of competing communication signals makes the GPS signal more difficult to 'dig out' of the spectrum.
Many studies support the argument that broadband needs more spectrum, said Cleland. The number of studies is partly due to the industry being well organized.

Interest in the subject is global, he said, and some broadband companies have the cash to support economic-benefit research for their industry. For example, he said, a report from the International Telecommunications Union and the United Nations Educational, Scientific and Cultural Organization details the global economic effects of broadband with GDP numbers, penetration-to-growth rates, and analysis of ancillary benefits such as the eradication of poverty or increased educational opportunities. Related reports are also available from the Telecommunications Industry Association and even from individual companies such as Cisco.

LightSquared's potential interference ultimately led the FCC to deny the company a license, and to a lawsuit by Harbinger Capital against several industry players.
“With other industries fighting for the finite raw material of spectrum,” Cleland said, “the GPS industry must continue to generate and update its economic valuation work or risk being marginalized in policy debates.”

That fight is only going to become more intense as companies scramble to capture consumers who want “everything, everywhere,” said John Kneuer, a former National Telecommunications and Information Administration chief and a spectrum policy expert. Wireless carriers are trying to provide content inside the home, cable companies like Comcast and DISH are devising ways to deliver content outside of the home, and applications providers like Google and Amazon are maneuvering to become content conduits.

The broadband industry is converging on a “singularity” where all the companies are competing with each other to control the same customers — and it all rides on the availability of spectrum, said Kneuer.

The Advisory Board will continue its work into next year, said the board’s vice chairman, Brad Parkinson, while cautioning against the expectation that they could come up with a single number that policy makers could use for an “elevator speech.”

The sum of the economic effects of GNSS throughout the world “is certainly many tens of billions of dollars per year, but the analysis to date tends to be based on cost rather than value,” Parkinson said. “There is a good reason for that. It is really hard to come up with the total value. Hence, we tend to undervalue the GPS contribution.”

Although Cleland said that he didn’t see why “GPS doesn’t sit right alongside a lot of these technology issues in importance,” he said it was unlikely that the pressure on GPS spectrum will ease.

“The broadband debate and the need for spectrum will never end,” said Cleland. “It will never end.”

Thursday, December 19, 2013

Apple Patent Shows Off “Layered” Maps

As reported by GigaOM: Are we looking at the iPhone’s map of the future? Today, the U.S. Patent and Trademark Office (USPTO) published an application from Apple that details the workings of an “Interactive Map” with many layers, according to Apple Insider.

The map itself is a program that allows users to view different “layers” of content, pulled from the Internet, over the map. In the report, Apple gives the example of a weather layer that could tell users when a storm is approaching. Other sample layers include Shopping, Commuting, and even Tourism.

The map is also designed to take into account these layers when the user searches in the area, to provide further context for searching. For example, someone in the Shopping layer who types in “skateboard” is more likely to get a skate shop than a skate park. The layers would also have some geospatial features — users in the “Tourism” layer could get access to information for all the landmarks in the immediate area.

The patent also discusses the maps’ interactivity. The application describes the ability to tap on specific features or landmarks to get information, like the population of a city or the year a landmark was built. Touching two points on the map would create a route between them, like a more sophisticated pin drop. Pins themselves would also receive an overhaul, with more robust features, instead of being phased out entirely.

Apple has been in pursuit of a better Maps program since the Cupertino company had a falling-out with Google in 2012. But the implementation of a proprietary Maps system has been a thorn in the giant’s paw, so to speak, providing the most headaches for users with its bad directions and glitchy behavior.

While necessary updates like transit directions are likely in the works, it’s promising to see that Apple is actually looking further ahead to bring a more enriched experience to the app. It might be a long while before the company has a Maps app worth using, but this patent could be a promising first step.

GPS-Aided Crowdshipping: TwedEx and Deliv

As reported by Inside GNSS: Crowdsourcing has gained some attention in the GNSS community as a means for detecting and mitigating RF interference and signal spoofing.

In the past year, researchers and commercial ventures have developed ways to take advantage of the GNSS-derived knowledge of real-time location of participants to create ad hoc networks of couriers.
Earlier this year, Palo Alto, California–based Deliv announced plans to begin “crowd-shipping” operations using part-time couriers to deliver goods based on their proximity to participating retail outlets and their customers. The company’s cofounder and CEO, Daphne Carmeli, was a founding team member of WebMD and is the former head of Netscape e-commerce.
Last Thursday (December 12, 2013) Deliv announced alliances with three of America's largest mall operators, Simon Property Group, Macerich, and Westfield, to provide the same-day service. Together with a partnership established earlier this year with General Growth Properties (GGP), the second-largest U.S. mall operator, the companies manage more than 660 malls including thousands of retailers across the country.
The operators have begun a rollout of Deliv-powered services across nine malls including GGP's Stonestown Galleria (San Francisco), Eastridge (San Jose), Glendale Galleria (Los Angeles), and Oakbrook Center (Chicago); Macerich's Santa Monica Place (Santa Monica) and Westside Pavilion (Los Angeles); Simon's Stanford Shopping Center (Palo Alto); and Westfield's Century City (Los Angeles) and Valley Fair (Santa Clara).

Backed by $7.85 million in venture capital and partnerships with leading mall operators in the country, Deliv draws on a crew of crowdsourced workers — typically part-timers, such as college students and retirees — equipped with GPS-enabled smartphones to provide same-day delivery in their local area.
With typical delivery charges between $5 and $15 — which may be further reduced by retailer subsidies — the business will compete with traditional delivery operations by FedEx, UPS, and Amazon, but without the need for a bricks-and-mortar warehouse infrastructure of the latter.
Deliv’s technology can be integrated directly into retailers’ e-commerce systems enabling customers to select a same-day delivery option during online transactions. The Deliv application programming interface (API) makes extensive use of GPS-derived lat/lon coordinates for routing, estimation of delivery time, and real-time tracking available to customers on online maps.
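The article notes that the service's API makes heavy use of GPS-derived latitude/longitude for routing and delivery-time estimation. As a rough illustration of the idea (the function names and the assumed average speed are hypothetical, not Deliv's actual API), great-circle distance between fixes is the basic building block:

```python
import math

# Hypothetical sketch of how a crowdshipping service might estimate
# delivery time from GPS coordinates. Names and parameters are
# illustrative assumptions, not Deliv's actual API.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def estimate_delivery_minutes(courier, store, customer, avg_speed_kmh=30.0):
    """Naive ETA: courier -> store -> customer at an assumed average speed."""
    leg1 = haversine_km(*courier, *store)
    leg2 = haversine_km(*store, *customer)
    return (leg1 + leg2) / avg_speed_kmh * 60

# Example: a courier near a Palo Alto store delivering across town
eta = estimate_delivery_minutes((37.45, -122.17), (37.443, -122.171), (37.429, -122.138))
```

A production system would of course use road-network routing rather than straight-line distance, but the GPS fix of each courier is what makes the proximity matching possible in the first place.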
Customers can order products online for delivery to local stores, which in effect serve as distribution centers for the operation. Growth prospects for the venture are attractive. Of today’s $70 billion delivery market, same-day delivery is estimated to be only about $500 million.
Mobile Delivery En Route
Researchers associated with Microsoft Corporation and Google are trying to extend the crowdshipping capability even further, by predicting the future locations of individuals within a window of time that could allow them to become part of a delivery network.
In a paper presented at the 2012 AAAI Conference on Artificial Intelligence, “Far Out: Predicting Long-Term Human Mobility,” Adam Sadilek, a Google data scientist and former visiting researcher at Microsoft, and John Krumm, a principal researcher at Microsoft Research, introduced the concept of “TwedEx,” using geotagged Twitter messages to establish linkages in a potential delivery network.
The authors proposed “an efficient nonparametric method that extracts significant and robust patterns in location data, learns their associations with contextual features (such as day of week), and subsequently leverages this information to predict the most likely location at any given time in the future.”
Using more than 32,000 days’ worth of GPS-tagged “tweets” from 703 Twitter-using participants in a Seattle-area field trial, the researchers showed that their model predicts the correct location with high accuracy, even years into the future.
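The core idea of learning associations between location and contextual features such as day of week can be illustrated with a toy model. This is a deliberately simplified sketch of the concept, not the nonparametric method from the paper:

```python
from collections import Counter, defaultdict

# Toy sketch of the "Far Out" idea: predict a person's most likely
# location from contextual features such as day of week and hour.
# Locations are discretized labels here; real systems use GPS traces.

class ContextualLocationModel:
    def __init__(self):
        self.counts = defaultdict(Counter)  # (weekday, hour) -> cell counts

    def observe(self, weekday, hour, cell):
        self.counts[(weekday, hour)][cell] += 1

    def predict(self, weekday, hour):
        """Most frequently observed cell for this context, or None."""
        ctx = self.counts.get((weekday, hour))
        return ctx.most_common(1)[0][0] if ctx else None

model = ContextualLocationModel()
for week in range(52):                 # a year of synthetic habits
    for weekday in range(5):
        model.observe(weekday, 9, "office")
        model.observe(weekday, 20, "home")
    model.observe(5, 9, "gym")

print(model.predict(2, 9))   # "office": weekday morning
print(model.predict(5, 9))   # "gym": Saturday morning
```

Because human routines are highly periodic, even a frequency table like this captures much of the predictability the paper exploits; the published method additionally handles pattern extraction and robustness to noise.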
In a follow-up paper, “Crowdphysics: Planned and Opportunistic Crowdsourcing for Physical Tasks,” Sadilek and Krumm, joined by another Microsoft researcher, Eric Horvitz, examined whether their technique could support delivery of packages to a moving target — what they referred to as “local opportunistic routing under uncertainty.”
Presented last July at the 7th International AAAI Conference on Weblogs and Social Media, the paper extended the evaluation to New York and invited readers to consider how such a service, currently unavailable, would operate:
“Imagine that you wish to send a gift to a friend. You write only the friend's unique identifier on the package and drop it into the crowd substrate. The identifier can be a telephone number, email address, or Twitter handle. Workers then route the package among themselves until one of them meets the target person. Thus, the only constraint in this system is that the package has to be delivered to the recipient as fast as possible. Aside from that, the delivery can occur anywhere, anytime.”

Wednesday, December 18, 2013

How Micro-Location, Geofencing and Indoor Location Are Driving The Retail Revolution

As reported by GIS User: With the hype surrounding the launch of iBeacon in the Apple retail stores, proximity sensors, also called proximity detection devices or micro-location, increasingly feels like a revolution.

Since Apple launched iBeacon with the release of iOS 7 earlier this year, and Google finally upgraded Android to include Bluetooth Low Energy (BLE), the entire retail world is now excited by the opportunity of deploying BLE beacons to develop new retail services and create additional revenue sources.

As a consequence, the market is now flooded with startups offering BLE beacon prototypes, compatible or not with Apple’s iBeacon specifications (nobody actually knows), each promising that deploying beacons will solve the main problem brick-and-mortar retailers face in their competition with e-commerce sites: creating a direct link with end users so that they spend their money in the store rather than back on a competitor’s website.

Capitalizing on the huge interest generated by these announcements, many self-declared “domain experts” are now releasing reports cataloging the growing number of startups announcing next-generation beacons that look much better on paper than those of established indoor location experts, but that are limited in features and untested in the field.

This situation makes it very hard for venue owners, retailers, shopping mall operators and others to get an overall picture of this megatrend and to separate what is true from what is not.

Evolution or revolution?
Indoor location is definitely a key feature with immense value. Massive adoption is on the way, and the market has finally reached the maturity typical of mainstream adoption: established business models, a clear competitive landscape, and consolidation.

Customers are not early adopters anymore and decisions are now driven by added value considerations. Improved end user experience (which is a critical issue for most venue owners in particular in their battle against e-tailers) and identified tangible ROIs are key drivers for the Indoor location market.

The maturity of the indoor location market is already impressive, judging simply by the requests coming in every day from retail stores, shopping mall owners, department store managers, large airports, train stations, convention and exhibition centers, and smart buildings.

In other words, the technology works. That is why it’s crucial for technology providers to set the right expectations, especially with the recent rise of point solutions that are very limited both in functionality and in room to evolve. The key added value of an indoor location service is not in the beacon itself but in the software intelligence that only indoor location experts can bring to third parties, with easy-to-use deployment tools and an end-to-end platform.

For example, to make sure a user is inside an area or a shop in order to credit her with loyalty points, simple micro-location is not good enough. It lacks the additional intelligence to aggregate all the available location data (GPS, 3G/4G, Wi-Fi and so on) and provide the right location information in real time. The main risk with unreliable systems is alerting users in the wrong area and flooding them with irrelevant offers or discounts, ultimately rendering the interaction with the user completely ineffective.

Aside from the magic, the success of indoor location can be attributed to three key levels of service:

Indoor location, like an indoor GPS, helps users find their way with step-by-step navigation, lets them discover their surroundings, optimize their visits and locate friends and colleagues, helps ensure their security, and provides merchants with behavioral analytics on visitors’ paths.

Micro-location allows travelers, visitors and shoppers to interact with a specific item, such as a product on a shelf or an art piece. The consumer’s presence is identified only when she is close to a BLE 4.0 beacon and lost when she is far from it.

Geofencing adds smart and reliable interaction to the two previous approaches. It acts as a virtual, invisible fence that sends specific information when users enter or leave a specific area or venue.

Geomarketing or loyalty-oriented interactions with context awareness are among the most common use cases.
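A geofence of the kind described above can be sketched minimally as a circular region with enter/leave detection between successive GPS fixes. The coordinates and radius below are illustrative assumptions:

```python
import math

# Minimal geofence sketch: a circular fence defined by a center and a
# radius; enter/leave events are detected by comparing successive fixes.

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation, adequate at geofence scales."""
    k = 111_320.0  # meters per degree of latitude
    dx = (lon2 - lon1) * k * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * k
    return math.hypot(dx, dy)

def geofence_event(fence, prev_fix, fix):
    """Return 'enter', 'leave', or None for two successive GPS fixes."""
    lat, lon, radius_m = fence
    was_inside = distance_m(lat, lon, *prev_fix) <= radius_m
    is_inside = distance_m(lat, lon, *fix) <= radius_m
    if is_inside and not was_inside:
        return "enter"
    if was_inside and not is_inside:
        return "leave"
    return None

# A 100 m fence around a hypothetical mall entrance
fence = (34.052, -118.244, 100.0)
print(geofence_event(fence, (34.060, -118.244), (34.0521, -118.2441)))  # enter
```

The "smart and reliable" part in a real deployment lies in fusing noisy fixes from multiple sources before running a check like this, so that a single jittery GPS reading does not trigger a spurious event.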

Many new use cases are being developed for more personal applications, including in-store walk-in detection, locating a partner or customer during a show, and guidance to the right gate, with path time, in an airport.

Another huge value these technologies bring to retail is to allow venue owners like retailers or shopping mall operators to track, in real-time, consumer behaviors to improve customer service and boost sales.

While knowing that someone is entering a shop is interesting in itself, it remains very poor data compared to what a supplier of a holistic indoor location solution, like Pole Star, can offer today: a unique set of technologies and services covering a large spectrum of use cases to lead the visitor in a personalized way.

In another example, in large buildings, the paths consumers take provide valuable insight into their behavior and interplay at different points of presence, and help the venue owner optimize the venue and its overall sales and marketing strategy. Within a department store or a supermarket, knowing the paths of shoppers throughout the venue will definitely help drive a more personalized user experience and improved customer relationships, as well as mobile context-aware advertising.

This is why it doesn’t make sense to consider micro-location a standalone solution to the complex requirements and use cases of indoor location. Although micro-location is an important piece of the indoor location value chain, the only way to maximize value for a venue owner is to follow consumer behavior from the initial shopping intent, well before the final interaction at the point of presence.

Realizing the full value of indoor location comes first from attracting people into the building or shop, then assisting and guiding them according to what they intend to do. Similar to what the Google search engine does online, but in the physical world, indoor location is the only technology able to suggest user interactions based on a user’s geolocation in a particular area or close to a specific object or shelf.

This is no revolution. The technology exists and has been already deployed and used in many indoor locations (shopping malls, airports, museums…).

The new ability Apple brings with iBeacon is waking up an application when a BLE beacon is close to the user’s iPhone, making it possible to engage with customers even if the app was not open. The big question now, however, is how to motivate end users to download the venue’s or brand’s application in the first place. Innovation now has to happen in the service to consumers, with killer applications that satisfy both consumers and venue owners, and in how location data can be used to analyze users’ indoor behavior (i.e., big data).

A global and holistic view of indoor location, not just micro-location, is essential to maximize the value it can bring to consumers, retailers and society as a whole, and to guarantee the satisfaction of venue owners.

Galileo Achieves First Airborne Tracking

Aircraft position data as obtained by Galileo-only receiver
during a Netherlands test flight.
As reported by GPSWorld:  The European Space Agency’s Galileo satellites have achieved their first aerial fix of longitude, latitude and altitude, enabling the inflight tracking of a test aircraft. ESA’s four Galileo satellites in orbit have supported months of positioning tests on the ground across Europe since the first fix in March.

Now the first aerial tracking using Galileo has taken place, marking the first time that Europe has been able to determine the position of an aircraft using only its own independent navigation system. The milestone took place on a Fairchild Metro-II above Gilze-Rijen Air Force Base in the Netherlands at 12:38 GMT on November 12. It was part of an aerial campaign overseen jointly by ESA and the National Aerospace Laboratory of the Netherlands, NLR, with the support of Eurocontrol, the European Organisation for the Safety of Air Navigation, and LVNL, the Dutch Air Navigation Service Provider.

A pair of Galileo test receivers was used aboard the aircraft, the same kind employed for Galileo testing in the field and in labs across Europe. They were connected to an aeronautical-certified triple-frequency Galileo-ready antenna mounted on top of the aircraft.

Tests were scheduled during periods when all four Galileo satellites were visible in the sky – four being the minimum needed for positioning fixes. The receivers fixed the plane’s position while also determining key variables such as position, velocity and timing accuracy; time to first fix; signal-to-noise ratio; range error; and range-rate error.
Testing covered both Galileo’s publicly available Open Service and the more precise, encrypted Public Regulated Service, whose availability is limited to governmental entities.
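Four satellites are the minimum because the receiver must solve for three position coordinates plus its own clock bias, four unknowns in all. A toy least-squares solve on synthetic pseudoranges illustrates the principle (this is a textbook sketch, not flight-test code; satellite geometry and numbers are invented):

```python
import numpy as np

# Why four satellites: a receiver solves for three position coordinates
# plus its own clock bias, so four pseudoranges are the minimum.
# Illustrative Gauss-Newton solver on synthetic data (units: meters).

def solve_fix(sat_pos, pseudoranges, iters=10):
    """Iteratively solve for (x, y, z, clock_bias) from pseudoranges."""
    x = np.zeros(4)  # initial guess: Earth's center, zero bias
    for _ in range(iters):
        ranges = np.linalg.norm(sat_pos - x[:3], axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Jacobian: negative unit vectors to satellites, plus a bias column
        J = np.hstack([-(sat_pos - x[:3]) / ranges[:, None],
                       np.ones((len(ranges), 1))])
        x += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x

# Synthetic test: 4 satellites at ~20,000 km, receiver near Earth's surface
sats = np.array([[20e6, 0, 10e6], [0, 20e6, 10e6],
                 [-20e6, 0, 10e6], [0, -20e6, 15e6]])
truth = np.array([1e6, 2e6, 6e6])
bias = 300.0  # receiver clock error expressed in meters
rho = np.linalg.norm(sats - truth, axis=1) + bias
est = solve_fix(sats, rho)
```

With noise-free measurements the solver recovers the true position and clock bias; the flight tests' range and range-rate error figures quantify how far real measurements depart from this ideal.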

Fairchild Metro-II aircraft used for Galileo airborne testing.
Flights covered all major phases: take off, straight and level flight with constant speed, orbit, straight and level flight with alternating speeds, turns with a maximum bank angle of 60º, pull-ups and push-overs, approaches and landings.

They also allowed positioning to be carried out under a wide variety of conditions, such as vibration, speeds up to 456 km/h, accelerations up to 2 g horizontal and 0.5–1.5 g vertical, and rapid jerks. The maximum altitude reached during the flights was 3,000 meters.

NLR’s Fairchild Metro-II previously performed initial European GPS testing in the 1980s, as well as the first tests of the European Geostationary Navigation Overlay Service (EGNOS), which sharpens GPS accuracy and monitors its reliability over Europe for high-accuracy and even safety-of-life uses.
The definition and development of Galileo’s in-orbit validation phase were carried out by ESA and co-funded by ESA and the EU.

The Full Operational Capability phase is managed and fully funded by the European Commission. The Commission and ESA have signed a delegation agreement by which ESA acts as design and procurement agent on behalf of the Commission.

Tuesday, December 17, 2013

Hell on Wheels: The State With America's Worst Drivers Revealed

As reported by the Car Connection: When Shakespeare wrote about Italians and Spaniards, he tended to make them a little crazy. Too much sun makes the blood boil, went the thinking at the time.

Four hundred years down the road, we know that Renaissance medical theories were largely inaccurate, but could a new study about driving in America lend some support to the idea that warm-weather people are of similar ilk?

Maybe, maybe not.  

The study was conducted by CarInsuranceComparison.com and compiled five types of law-enforcement data:
  • Fatality rate (per 100 million vehicle miles traveled)
  • Failure to obey citations issued (traffic signals and seat belts)
  • Drunk driving citations
  • Number of tickets issued
  • Careless driving citations
The data itself came from a range of sources, including the National Highway Traffic Safety Administration and Mothers Against Drunk Driving. CarInsuranceComparison ranked each state in each category according to how well (or poorly) it performed, then added the rankings to create a total score. For example, a state that ranked 32nd in careless driving citations and 14th in drunk driving citations would have a combined score of 46 for those two areas. (For a peek at the actual tabulations, check this Google doc.)
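The scoring method described above is a simple rank sum, which can be sketched in a few lines. The category names and values below are invented for illustration; the study used the five law-enforcement metrics listed earlier:

```python
# Sketch of the study's scoring method: rank each state in each
# category (1 = best), then sum the ranks; higher totals mean worse
# drivers. The data values here are invented for illustration.

def score_states(metrics):
    """metrics: {category: {state: value}}, where higher value = worse."""
    totals = {}
    for category, values in metrics.items():
        ordered = sorted(values, key=values.get)  # best (lowest) first
        for rank, state in enumerate(ordered, start=1):
            totals[state] = totals.get(state, 0) + rank
    return totals

metrics = {
    "careless_driving": {"A": 1.2, "B": 3.4, "C": 2.0},
    "drunk_driving":    {"A": 0.9, "B": 1.1, "C": 2.5},
}
print(score_states(metrics))  # A scores 2 (best); B and C tie at 5
```

One known weakness of rank sums, which the article's caveats section touches on, is that they ignore how far apart states are within a category: ranking 2nd by a hair costs as much as ranking 2nd by a mile.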

When all the dust settled, Louisiana had earned the dubious distinction of having the worst drivers in America. It might interest you to know that despite Louisiana's liberal alcohol policies, DUIs aren't the cause of its low rank. In fact, Louisiana "only" scored 38th (out of 51, including the District of Columbia) in rates of drunk driving arrests, but really hit bottom in failure-to-obey citations, tickets issued, and incidents of careless driving.

Whatever's wrong with Louisiana, it's catching, because the Bayou State's neighbors didn't fare too well, either. In fact, seven of the ten worst-ranked states are situated in the South. (Assuming we want to associate Texas with the South, which is a matter of some debate.) And so, without further ado, the states with the worst drivers are:
1. Louisiana
2. South Carolina
3. Mississippi
4. Texas
5. Alabama
6. Florida
7. Missouri
8. North Carolina
9. Montana
10. North Dakota

At the other end of the list -- the good end -- we find states that are generally farther to the north. The best of the bunch is Vermont, which boasts a higher-than-average number of drunk-driving incidents, but on every other metric, it's in the top ten. The states with the best drivers are:
51. Vermont
50. Utah
49. New Hampshire
48. Minnesota
47. Oregon
46. Maine
45. Connecticut
44. District of Columbia
43. Iowa
42. Massachusetts
41. Alaska

RATIONALE & CAVEATS
Can we draw any broad conclusions about America based on this study? As much as we love to be equal-opportunity finger-pointers, it's hard to ignore the fact that the worst-driving states fall largely in one southern cluster, while the better-driving states lie mostly to the north.

But surely the disparity isn't related to weather -- otherwise, why would Montana and North Dakota make the top-ten worst-driver list? The overlap doesn't seem to be alcohol-related, either -- at least not among the most accident-prone segment of drivers -- nor does it seem linked to licensing laws for teens. Frankly, the problem of bad driving seems most closely aligned with poverty rates, though even that isn't a perfect match.

Then again, maybe the clusters are a coincidence, or perhaps the study is inaccurate -- something we've seen before and will likely see again. Methodology nerds could surely poke a few holes in the survey's backend, and our own experiences tell us that some states may deserve slightly poorer rankings than they received this round.