
Monday, February 3, 2014

Agency Completes HOS Field Study, Concludes Current Provisions Yield Safer, Better Rested Drivers

As reported by CCJ Digital: In its long-awaited field study on the current hours-of-service rule, the Federal Motor Carrier Safety Administration has concluded that the 34-hour restart provision's requirement of two 1 a.m. to 5 a.m. periods causes truck operators to drive more safely and be better rested.

The results of the study were released Jan. 30. It was conducted by the Sleep and Performance Research Center at Washington State University in Spokane, Wash., and Pulsar Informatics in Philadelphia, Pa.

Researchers followed 106 drivers, ages 24-69, across two duty cycles and during the 34-hour restarts on either side.

Drivers who did not include two 1 a.m. to 5 a.m. periods reported “greater sleepiness, especially towards the end of their duty periods,” than when following the provisions of the current HOS rule, the study says. They also deviated from their lane more often and “exhibited more lapses of attention, especially at night,” the study says.

Also, drivers who didn’t include the two nighttime periods in their restart got most of their sleep during the day, spent more time driving overall and more time driving at night, and logged most of their on-duty time at night, the study says.

“This new study confirms the science we used to make the hours-of-service rule more effective at preventing crashes that involve sleepy or drowsy truck drivers,” said FMCSA Administrator Anne Ferro. “For the small percentage of truckers that average up to 70 hours of work a week, two nights of rest is better for their safety and the safety of everyone on the road.”

The study's results differ from those of research and surveys done by both the American Trucking Associations’ American Transportation Research Institute and the Owner-Operator Independent Drivers Association. Both groups released studies in the fall concluding that, in addition to carriers and drivers losing productivity and revenue due to the rules, drivers were less rested on the whole under the current rule than before. Click here to see CCJ coverage of ATRI’s study, and click here to see coverage of OOIDA’s.

Another study, done by the University of Tennessee and released this month, reached similar conclusions: drivers are generally more fatigued under the current provisions than under the prior hours rule.

The completion of FMCSA’s study has been a point of contention between FMCSA and members of the House responsible for oversight of the agency. The report was supposed to be completed, per the current MAP-21 highway funding law, by Sept. 30, 2013.

Rep. Richard Hanna (R-N.Y.) chastised the agency more than once for missing the deadline. He and other representatives also took a stern tone with Ferro over the incomplete study in a November House hearing held to examine the impacts of the hours-of-service rules.

Hanna also is the sponsor of a bill in the House that would, if passed, overturn the hours-of-service rule until Congress could further study FMCSA’s methodology in creating it. The Senate has also introduced a version of the bill.

Of the 106 drivers in the study, 44 were local drivers, 26 regional and 36 over-the-road. Together they drove 414,937 miles during the study, while researchers gathered a combined 1,260 days of data.

The drivers wore wrist activity monitors to record sleep and wake periods and used smartphones to take psychomotor vigilance tests and to log fatigue, sleepiness, sleep and wake times, and caffeine intake.

Also, trucks in the study were equipped with data acquisition systems that recorded distance traveled, speed, acceleration, lateral lane position, steering wheel angle, headway distance, fuel use and other parameters.

All trucks in the study were equipped with electronic logging devices (ELDs), and researchers obtained the logged data from the carriers.

The tests were conducted between January 2013 and July 2013.

Click here to see the results of the study. 

Friday, January 31, 2014

Telematics For Tennis Rackets Can Provide For Virtual Coaching

As reported by EuroSport: Many sporting gadgets come and go with barely a flicker of attention, but there is now a tool that could transform tennis forever.

An exaggeration? Only time will tell, but the number of top companies involved surely gives an indication as to its genuine potential.
Sony have unveiled a tennis sensor - a little gadget that is attachable to the base of a tennis racket - and Babolat have released their 'Play Pure Drive' effort with one already embedded into the grip.
To put the device in its simplest terms, it is like having a virtual 'tennis coach' to assess your every shot, sensing where the ball strikes the racket and the quality of the contact.
It counts forehands and backhands, serves and smashes, and provides stats that can be analysed, stored and compared.
The sensor can gather data such as ball speed, accuracy and angle, and can pair with phones and computers via Bluetooth or a USB connection to share the information.
More than simply a coaching aid, the sensor would allow even the top players to quickly and effectively assess their own shots and learn from specific errors during a match.
Would this go right the way to the top elite level? That all depends on how it is received within the tennis world, but the potential is there for it to improve broadcasting tools in addition to personal analysis.
Babolat's latest venture into the field of personal sporting analytics has been put through the International Tennis Federation's official approval process and could well impact the professional game if it is viewed as beneficial to everyone involved.
Babolat's Play Pure Drive (Babolat)
Put simply, if the ITF approve the sensors then they could be used in Grand Slams. Given that the technology already exists on the market, the top players would provide companies the exposure and publicity they desire.
Gael Moureaux, tennis racquets products manager at Babolat, has said: "We integrated sensors inside the handle of the racquet, but it does not change the specification.
"And these sensors will analyse your tennis game, so your swing - your motion - and all this information will be collected by the racquet.
"During the development process of the racquet, we did a lot of lab tests with a lot of players around the world to make sure the data is accurate and to have the right data for the player."
What does this mean for your average amateur tennis lover? The Babolat Play Pure Drive is already out on the market, while Sony's Smart Tennis Sensor will be priced at around £106 when it moves from Japan, where it is currently available.
According to Sony, which announced the sensor’s availability in Japan, the sensor will initially be compatible with around six Yonex EZone and VCore tennis rackets, with additional racket compatibility to follow before long.
Sensor-connected racquets are already with us, and who is to say that this will not end up becoming the accepted next phase of the tennis equipment revolution?
We've come quite a way from wooden racquets with tiny heads. This crazy-looking new gadget could yet transform the sport as we know it.

Android App Warns When You’re Being Tracked

As reported by Technology Review: A new app notifies people when an Android smartphone app is tracking their location, something not previously possible without modifying the operating system on a device, a practice known as “rooting.”

The new technology comes amid new revelations that the National Security Agency seeks to gather personal data from smartphone apps (see “How App Developers Leave the Door Open to NSA Surveillance”). But it may also help ordinary people better grasp the extent to which apps collect and share their personal information. Even games and dictionary apps routinely track location, as collected from a phone’s GPS (global positioning system) sensor.

Existing Android interfaces do include a tiny icon showing when location information is being accessed, but few people notice or understand what it means, according to a field study done as part of a new research project led by Janne Lindqvist, an assistant professor at Rutgers University. Lindqvist’s group created an app that puts a prominent banner across the top of the app saying, for example, “Your location is accessed by Dictionary.” The app is being readied for Google Play, the Android app store, within two months.

Lindqvist says Android phone users who used a prototype of his app were shocked to discover how frequently they were being tracked. “People were really surprised that some apps were accessing their location, or how often some apps were accessing their location,” he says.

According to one Pew Research survey, almost 20 percent of smartphone owners surveyed have tried to disconnect location information from their apps, and 70 percent wanted to know more about the location data collected by their smartphone.

The goal of the project, Lindqvist says, is to goad Google and app companies into providing more prominent disclosures, collecting less personal information, and allowing users to select which data they will allow the app to see. A research paper describing the app and the user study can be found here. It was recently accepted for an upcoming computer security conference.

In many cases, location information is used by advertisers to provide targeted ads. But information gained by apps often gets passed around widely to advertising companies (see “Mobile-Ad Firms Seek New Ways to Track You” and “Get Ready for Ads That Follow You from One Device to the Next”).

Google, which maintains the Android platform, has engineered it to block an app from gaining information about other apps. So Lindqvist’s team took an indirect route: a function within Android’s location application programming interface (API) signals when any app requests location information. “People have previously done this with platform-level changes—meaning you would need to ‘root’ the phone,” says Lindqvist. “But nobody has used an app to do this.”

Google has flip-flopped on how much control it gives users over the information apps can access. In Android version 4.3, available since July of last year, users gained the ability to individually disable and enable apps’ “permissions” one by one, but then Google reversed course in December 2013, removing the feature in an update numbered 4.4.2, according to this finding from the Electronic Frontier Foundation.

The new app and study from Lindqvist’s team could help push Google back toward giving users more control. “Because we know how ubiquitous NSA surveillance is, this is one tool to make people aware,” he says.

The work adds to similar investigative work about Apple’s mobile operating system, iOS. Last year different academic researchers found that Apple wasn’t doing a good job stopping apps from harvesting the unique ID numbers of a device (see “Study Shows Many Apps Defy Apple’s Privacy Advice”). Those researchers released their own app, called ProtectMyPrivacy, that detects what data other apps on an iPhone try to access, notifies the owner, and makes a recommendation about what to do. However, that app requires users to first “jailbreak” or modify Apple’s operating system. Still, unlike Android, Apple allows users to individually control which categories of information an app can access.

“Telling people more about their privacy prominently and in an easy-to-understand manner, especially the location, is important,” says Yuvraj Agarwal, who led that research at the University of California, San Diego, and has since moved on to Carnegie Mellon University. Ultimately, though, Agarwal believes users must be able to take action on an app’s specific permissions. “If my choice is to delete Angry Birds or not, that’s not really a choice,” he says.

A 96-Antenna System Tests the Next Generation of Wireless

As reported by MIT Technology Review: Even as the world’s carriers build out the latest wireless infrastructure, known as 4G LTE, a new apparatus bristling with 96 antennas taking shape at a Rice University lab in Texas could help define the next generation of wireless technology.

The Rice rig, known as Argos, represents the largest such array yet built and will serve as a test bed for a concept known as “Massive MIMO.”

MIMO, or “multiple-input, multiple-output,” is a wireless networking technique aimed at transferring data more efficiently by having several antennas work together to exploit a natural phenomenon that occurs when signals are reflected en route to a receiver. The phenomenon, known as multipath, can cause interference, but MIMO alters the timing of data transmissions in order to increase throughput using the reflected signals.

MIMO is already used for 4G LTE and in the latest version of Wi-Fi, called 802.11ac; but it typically involves only a handful of transmitting and receiving antennas. Massive MIMO extends this approach by using scores or even hundreds of antennas. It increases capacity further by effectively focusing signals on individual users, allowing numerous signals to be sent over the same frequency at once. Indeed, an earlier version of Argos, with 64 antennas, demonstrated that network capacity could be boosted by more than a factor of 10.
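That scaling can be sketched with a toy simulation. The model below is a simplified illustration of maximum-ratio beamforming over Rayleigh fading, not anything from the Argos codebase; with the antenna weights matched to the channel, the coherently combined signal power, and hence the capacity available to share among users, grows roughly in proportion to the number of antennas.

```python
import math
import random

def rayleigh_channel(m, rng):
    # m independent complex fading coefficients with unit average power
    s = math.sqrt(0.5)
    return [complex(rng.gauss(0.0, s), rng.gauss(0.0, s)) for _ in range(m)]

def beamforming_gain(m, trials, rng):
    # With maximum-ratio transmission the weights are proportional to the
    # conjugate channel, so the combined signal power equals |h|^2; averaged
    # over fading, that power grows linearly with the antenna count m.
    total = 0.0
    for _ in range(trials):
        h = rayleigh_channel(m, rng)
        total += sum(abs(c) ** 2 for c in h)
    return total / trials

rng = random.Random(1)
for m in (4, 16, 64):
    print(m, round(beamforming_gain(m, 2000, rng), 1))
```

Running it shows the average gain tracking the antenna count almost exactly, which is the intuition behind serving more users with more antennas.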

“If you have more antennas, you can serve more users,” says Lin Zhong, associate professor of computer science at Rice and the project’s co-leader. And the architecture allows it to easily scale to hundreds or even thousands of antennas, he says.

Massive MIMO demands more processing power because base stations must direct radio signals narrowly to the phones intended to receive them, and shaping those beams takes extra computation. The point of the Argos test bed is to see how much benefit can be obtained in the real world. Processors distributed throughout the setup allow it to test different network configurations, including how it would work alongside another emerging class of base stations, known as small cells, which serve small areas.

“Massive MIMO is an intellectually interesting project,” says Jeff Reed, director of the wireless research center at Virginia Tech. “You want to know: how scalable is MIMO? How many antennas can you benefit from? These projects are attempting to address that.”

An alternative, or perhaps complementary, approach to an eventual 5G standard would use extremely high frequencies, around 28 gigahertz. Wavelengths at this frequency are roughly an order of magnitude shorter than those of the frequencies that carry cellular communications today, allowing more antennas to be packed into the same space, such as within a smartphone. But since 28 gigahertz signals are easily blocked by buildings, and even foliage and rain, they’ve long been seen as unusable except in special line-of-sight applications.
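A quick back-of-the-envelope check of those wavelengths (taking a typical ~2 GHz cellular band as the comparison point, an assumed figure):

```python
C = 299_792_458  # speed of light, m/s

def wavelength_cm(freq_hz):
    # wavelength = speed of light / frequency, converted to centimeters
    return C / freq_hz * 100.0

print(round(wavelength_cm(28e9), 2))  # 28 GHz -> about 1.07 cm
print(round(wavelength_cm(2e9), 1))   # ~2 GHz cellular band -> about 15.0 cm
```

At a centimeter or so, a 28 GHz antenna element is small enough that dozens could fit inside a handset.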

But Samsung and New York University have collaborated to solve this, also by using multi-antenna arrays. They send the same signal over 64 antennas, dividing it up to speed up throughput, and dynamically changing which antennas are used and the direction the signal is sent to get around environmental blockages (see “What 5G Will Be: Crazy Fast Wireless Tested in New York City”).

Meantime, some experiments have been geared toward pushing existing 4G LTE technology further. The technology can, in theory, deliver 75 megabits per second, though it is lower in real-world situations. But some research suggests it can go faster by stitching together streams of data from several wireless channels (see “LTE Advanced Is Poised to Turbocharge Smartphone Data”).

Emerging research done on Argos and in other wireless labs will help to define a new 5G phone standard. Whatever the specifics, it’s likely to include more sharing of spectrum, more small transmitters, new protocols, and new network designs. “To introduce an entirely new wireless technology is a huge task,” says Tom Marzetta, the Bell Labs researcher who originated the massive MIMO concept.


In Motorola Purchase, Lenovo Gains Big Footprint in Smartphones

As reported by the NY Times: Apple and Samsung Electronics dominate the smartphone business, controlling about half of the sales and most of the profits. An ever-changing roster of also-rans has struggled to close the gap.

Now one of those challengers, Lenovo, has broken free of the pack and pushed itself into a clear No. 3 position with an agreement to buy Motorola Mobility from Google.

Assuming the deal is completed, do Samsung and Apple need to start looking over their shoulders? Not yet, it seems.

With a combined share of 6.4 percent of smartphone sales in the fourth quarter of 2013, Lenovo and Motorola would still be only a distant third to Samsung, with 29 percent, and Apple, with 17 percent, according to Counterpoint Technology Market Research. Motorola, with barely more than 1 percent of the smartphone market share, is a shadow of the company whose Razr handsets were the must-have devices of the pre-smartphone era.

No other mobile phone maker has managed to climb back after falling from the heights. And Google piled up millions of dollars in losses from Motorola during a brief ownership that began in 2012.

But Lenovo executives say the Motorola brand remains valuable. And analysts agree that the deal includes other assets that could give Lenovo a better chance of eventually challenging the top two than other second-tier smartphone makers.

“Lenovo now has extra scale in smartphones and a seat near the top table,” said Neil Mawston, an analyst at Strategy Analytics, a research firm.

Some analysts said the deal was reminiscent of Lenovo’s 2005 acquisition of IBM’s PC business, which turned a parochial Chinese electronics company into a global technology power. After several other acquisitions, Lenovo is the biggest PC maker in the world, and it moved this month to expand its computer-making operations further by snapping up IBM’s low-end server business.

Investors are worried about the cost of the two most recent deals: more than $5 billion. Lenovo shares fell 8.2 percent in Hong Kong on Thursday. But analysts say that if the company can successfully integrate Motorola, it could gain considerable advantages.

For one, there would be geographical benefits, which Lenovo executives pointed out in separate conference calls with reporters on Wednesday in the United States and on Thursday in Asia.

Although the company has been pushing to expand its smartphone business internationally, more than 90 percent of its sales are still in China. Lenovo has entered a few other developing markets, but it has not begun selling phones in the United States or Western Europe.

Lenovo executives said they would retain both brand names, and in some cases, the two brands might be sold alongside each other.

“We are not restricting Lenovo to China or Motorola to the U.S.,” said Wai Ming Wong, the Lenovo chief financial officer. “They are two different brands with different sets of propositions for the customers. The key for us is to sell more devices to the market.”

Motorola sells its flagship phone, the Moto X, for $399 in the United States without a mobile service contract. That is more than double the price of some Lenovo smartphones in China.

Although Motorola is not a big player globally, it has distribution relationships with more than 50 mobile carriers, Mr. Wong said.

Analysts said it could take several years to rebuild the brand in Europe, where it has mostly disappeared, but Motorola would give Lenovo a more immediate entree into the United States.

One of the biggest barriers to entering the American smartphone market for Lenovo has been its lack of a strong patent portfolio. This would have made it vulnerable to so-called patent trolls — entities that buy patents to collect royalties from technology manufacturers. Under the agreement, Lenovo said it would receive more than 2,000 patents and license others from Google, affording it a measure of protection.

“Motorola understands the smartphone market very well, especially in the mature markets,” Mr. Wong said.

Because the $2.9 billion price of the Motorola sale includes $750 million in Lenovo stock, the deal also gives Lenovo what it described as a “strategic relationship” with Google, developer of the Android mobile operating system, which is widely used by Lenovo, Samsung and other phone makers.

By aligning itself more closely with Google, Lenovo can keep pace with Samsung, which this week announced a 10-year deal to share patents with Google. That agreement, along with the planned sale of Motorola to Lenovo, eased concerns over possible strains in the Google-Samsung alliance.

“With these agreements, Google gets away from unnecessary friction with the manufacturers,” said Tom Kang, an analyst at Counterpoint.

Close ties to Google are important for Lenovo and Samsung because they build official versions of the Android operating system into their phones, integrating them with Google’s other online services. Some other phone makers use so-called forked versions of Android, which are stripped of some of their Google functions.

“Lenovo has the expertise and track record to scale Motorola into a major player within the Android ecosystem,” Larry Page, the chief executive of Google, said in a statement.

When Google acquired Motorola, the move prompted speculation that it was trying to free itself from its dependence on Samsung phones and create a more integrated system combining hardware and software, along the lines of Apple. Samsung began developing more of its own software, including a new operating system called Tizen.

This week, Lenovo announced a restructuring that created a separate unit to develop “ecosystem and cloud services.”

But analysts said they expected Lenovo to continue to use Android in its mobile phones, rather than develop its own mobile operating system.

“For Lenovo, it’s the cheapest and best-performing solution,” Mr. Kang said.

By relying on Android, Lenovo — like Samsung — falls short of the level of hardware and software integration that Apple provides. But the addition of Motorola would give it a range of hardware that few others could match.

“This puts Lenovo in position to have leading offerings in smartphones, tablets and PCs — a vital trifecta that no other global manufacturer has — besides Apple,” said Frank E. Gillett, an analyst at Forrester Research.

Thursday, January 30, 2014

EU Police Want ‘Remote Kill Switch’ On Every Car

As reported by Reuters: The EU is considering requiring all cars sold in the union to be equipped with devices that would allow police to remotely disable engines, according to leaked documents.

If the plan goes ahead, European law enforcers will be able to stop fugitives, suspected criminals and even speeding drivers with a simple radio command from a control room.

The technology is part of a six-year development plan by the ‘European Network of Law Enforcement Technologies’, or Enlets, a working group for police cooperation across the EU, reports the Telegraph.
"Cars on the run can be dangerous for citizens," the newspaper cites a document leaked by state power watchdog Statewatch.
"Criminal offenders will take risks to escape after a crime. In most cases the police are unable to chase the criminal due to a lack of efficient means to stop the vehicle safely," it says.
Remote control of car electronics is far from novel. A modern car is equipped with a network of microcomputers that monitor and control everything from the ignition and fuel flow to the radio station being played. And increasingly, cars can communicate wirelessly, a technology called telematics.
Loan firms and car dealerships have been using the benefits of electronically controlled cars for years. A vehicle sold in the subprime market can be equipped with a black box that reminds the client of overdue payments with honking horns and flashing lights, and that disables the engine completely a few days later unless the money is paid. A GPS receiver then tells the dealership the exact location where the car can be collected.
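The escalating logic such a black box applies can be sketched in a few lines; the thresholds, messages and function name below are invented purely for illustration.

```python
# Toy sketch of the payment-enforcement escalation described above.
# All thresholds and actions are assumed, not taken from any real product.
def enforcement_action(days_overdue, warn_after=1, disable_after=5):
    if days_overdue >= disable_after:
        return "disable engine; report GPS location for repossession"
    if days_overdue >= warn_after:
        return "remind driver: honk horn, flash lights"
    return "no action"

print(enforcement_action(2))  # remind driver: honk horn, flash lights
print(enforcement_action(7))  # disable engine; report GPS location for repossession
```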
Remote tracking and control is also used as an anti-theft measure. Services like General Motors’ Stolen Vehicle Slowdown can force a stolen car to drop speed and stop on a remote command from the service provider.

Giving police the ability to do the same to any car in the EU does not thrill some rights advocates cautious of giving the government more authority.
"We need to know if there is any evidence that this is a widespread problem. Let's have some evidence that this is a problem, and then let's have some guidelines on how this would be used," Statewatch told the Telegraph.
Apart from that, there is the concern of possible hacker attacks that could use the remote kill switch for nefarious ends. In March 2010, Texas police arrested a former car dealership employee who used the dealership's tracking and repossession system to disable some 100 vehicles in Austin in revenge for being laid off.
Researchers from the University of California, San Diego and the University of Washington tested how much harm hacking could do through a car’s electronic controllers. The study, conducted in 2010, showed that an attacker can interfere relatively easily with safety-critical systems like the brakes.
The security of connected cars has not become hacker-proof since. At the 2014 Consumer Electronics Show this month, technology firm Harman warned that the hacking problem for modern cars is serious: the infrastructure of their electronic components was not designed with networking in mind, so they are not ready for the level of exposure to cyber-attacks that internet connectivity brings.

Repurposing Technology: GPS Used To Help Predict Hazards

As reported by GCN: While necessity may be the mother of invention, tight budgets are the mother of innovative repurposing of technologies.
Scientists at NASA's Jet Propulsion Laboratory, the National Oceanic and Atmospheric Administration and the Scripps Institution of Oceanography have teamed up to take advantage of a network of hundreds of GPS stations in Southern California, originally installed to measure tectonic movements, using them to monitor and predict hazardous events such as earthquakes and flash floods.
While the scientific-grade GPS stations use satellite signals to record very small changes in location, those same signals can also be used to measure water vapor in the atmosphere.
"A GPS receiver fundamentally is measuring the amount of time it takes signals to travel from the GPS satellites to the receiving antenna on the ground," explained Angelyn W. Moore, a scientist on JPL's Geodynamics and Space Geodesy Group. "That travel time is modified by the amount of water vapor in the atmosphere. The upshot is that whenever we measure a geodetic-quality GPS station's position, we are also measuring the delay due to water vapor. That delay can be related to precipitable water vapor with a surface pressure and temperature measurement."
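The conversion Moore describes can be sketched as follows. This is the standard GPS-meteorology relation between zenith wet delay and precipitable water vapor; the refractivity and physical constants are textbook values, and the mean atmospheric temperature default is a rough assumption rather than a figure from this project.

```python
def zenith_hydrostatic_delay_m(pressure_hpa):
    # Saastamoinen approximation of the hydrostatic (dry) delay at sea level
    return 0.0022768 * pressure_hpa

def precipitable_water_mm(ztd_m, pressure_hpa, mean_temp_k=273.0):
    """Convert a GPS zenith total delay (meters) to precipitable water vapor (mm)."""
    # Subtract the pressure-derived hydrostatic part to isolate the wet delay
    zwd_m = ztd_m - zenith_hydrostatic_delay_m(pressure_hpa)
    # Refractivity constants (for water vapor pressure in Pa) and physical
    # constants; mean_temp_k is a crude stand-in for a real temperature profile.
    k2_prime = 0.221   # K/Pa
    k3 = 3.739e3       # K^2/Pa
    rho_w = 1000.0     # density of liquid water, kg/m^3
    r_v = 461.5        # specific gas constant of water vapor, J/(kg*K)
    pi_factor = 1e6 / (rho_w * r_v * (k3 / mean_temp_k + k2_prime))  # ~0.156
    return pi_factor * zwd_m * 1000.0  # meters of water -> millimeters

# e.g. a 2.40 m total delay at 1013 hPa surface pressure
print(round(precipitable_water_mm(2.40, 1013.0), 1))  # about 14.6 mm
```

Roughly 15 percent of the wet delay shows up as precipitable water, which is why a surface pressure and temperature reading is enough to turn a timing measurement into a moisture estimate.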
When this moisture data is combined with data from barometers and thermometers, it can give forecasters greater accuracy in predicting rainfall and flash floods. According to Moore, there are approximately 40 stations providing water-vapor estimates every half hour. "We are evaluating hardware to provide water vapor [data] at 5 minutes or less at a test site, and plan to install that at approximately 25 sites," she added.
While forecasters are currently accessing the system's water-vapor estimates via a Web interface, said Moore, "We are pursuing integration into their standard forecaster displays."
According to Moore, the network in Southern California consists of 475 GPS stations, about 175 of which are operating in real time and 17 of which have been equipped with accelerometers. While GPS measurements can capture large movements during an earthquake, accelerometers can measure smaller ones. More importantly, accelerometers can detect primary waves, or P-waves, which can help seismologists predict the arrival of the secondary waves, or S-waves, which signal the phase of violent shaking during an earthquake. Since P-waves move through the earth faster than S-waves, data from P-waves could be used to provide an early-warning system.
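The size of that warning window is easy to estimate. Assuming typical crustal speeds of about 6 km/s for P-waves and 3.5 km/s for S-waves (illustrative values, not figures from the article):

```python
# Illustrative crustal wave speeds (assumed typical values)
P_SPEED_KM_S = 6.0   # primary (compressional) wave
S_SPEED_KM_S = 3.5   # secondary (shear) wave

def warning_seconds(distance_km):
    # Window between detecting the P-wave and the damaging S-wave arriving
    return distance_km / S_SPEED_KM_S - distance_km / P_SPEED_KM_S

print(round(warning_seconds(100), 1))  # about 11.9 s for a quake 100 km away
```

A few seconds is not much, but it can be enough to halt trains, open firehouse doors or trigger automated shutdowns before the shaking starts.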
While Moore doesn't know of imminent plans to expand the system, she says doing so would offer important data. "Certainly the spatial extent can be extended," she said. "Existing real-time GPS stations tend to be located in California more than in Arizona, so north is the most obvious opportunistic direction at present. That would enable use of the GPS water vapor [data] for other weather conditions such as atmospheric rivers." Atmospheric rivers are relatively narrow regions in the atmosphere in which relatively large amounts of water vapor are transported horizontally. According to NOAA, "While ARs come in many shapes and sizes, those that contain the largest amounts of water vapor, the strongest winds, and stall over watersheds vulnerable to flooding, can create extreme rainfall and floods."
In fact, NOAA already has several hundred GPS-equipped weather stations scattered throughout the nation. Although the density of that network is not sufficient for the kind of system being developed by the JPL/NOAA/Scripps team, those stations could be incorporated into an expanded network.