Friday, November 7, 2014

How New Transportation Technologies Will Change Everything

As reported by Government Technology: The transportation systems around which the modern world has been built are on the verge of a significant transformation. Intelligent transportation systems (ITS) are making driving and traffic management better and safer for everyone. 

Transportation typifies the FutureStructure framework. (FutureStructure is a sister publication of Government Technology.) Soft infrastructure — the realm of concepts, policies and legislation — is rapidly evolving to accommodate the demand for global investment in hard transportation infrastructure. Technology is bridging the two as vehicles and the infrastructure on which they operate become increasingly connected.

Traffic and population growth create demand for more transportation infrastructure, but many jurisdictions don’t have sufficient money or space to build more roads and rail.

Even where cities lack the funds, population growth will continue — the World Health Organization expects 7 out of 10 people on the planet to live in cities by mid-century. Coupled with concerns about climate change, that growth means city leaders must start rethinking the very nature of existing transportation systems.

New transportation technologies are emerging to meet these challenges, including connected and autonomous vehicles, alternative fuels, keyless fleet management and traffic analytics, as well as local zoning and planning policies that support transit-oriented development. New technology for on-road communications will dramatically change how vehicles operate and provide information and capabilities for better, real-time traffic management — if the necessary network infrastructure is in place.

ITS is poised to transform transportation into a connected, dynamic component of the city-as-a-system. Perhaps more importantly, the greater ease in moving about will have a positive impact on quality of life and commerce for residents, visitors and local businesses.

The Promise of ITS

“Cities are struggling with transportation today and will struggle even more in the future,” said Bill Ford, Jr., executive chairman of the Ford Motor Company, while addressing the ITS World Congress in Detroit in September 2014. “We need to redefine what mobility is for the coming century.”

According to the U.S. Department of Transportation, ITS improves transportation safety and mobility by integrating advanced, wireless communications technologies into transportation infrastructure and vehicles. The purpose of ITS is to process and share information that can prevent vehicle collisions, keep traffic moving and reduce environmental impacts.

Coordinated traffic signals, signal priority for transit lanes, electronic information signs and variable speed limit signs are all part of the burgeoning ITS industry. Also part of ITS is the ability to automatically distribute real-time traffic data to websites, social media feeds, mobile apps, and local TV and radio stations.

“Instead of a bunch of independent systems on the local, national or even global level, ITS creates a transportation network that works like the Internet, where everything is connected, but also open for standards-based communication, which reduces costs and creates value for everyone involved in managing traffic,” said David Pickeral, who leads the Industry Smarter Solutions Team for Transportation at IBM.

Autonomous and Connected Vehicles

Perhaps the most anticipated element of ITS is the connected vehicle. The imminent arrival of connected vehicles is one reason for new visions of transportation within a metro area.

Connected technology focuses on wireless communication: vehicle-to-vehicle (V2V), vehicle-to-pedestrian (V2P) and vehicle-to-infrastructure (V2I), collectively referred to as V2X. Intended primarily to improve safety, V2V technology allows cars to continually communicate with the vehicles around them so each is aware of the others’ speed, heading and direction.
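
In software terms, the idea is that every equipped vehicle broadcasts a small status message several times a second and listens for the same from its neighbors. The sketch below is only a toy illustration of that pattern: the field names, the JSON encoding and the proximity check are stand-ins, not the actual SAE J2735 message format used by real V2V radios.

```python
# Toy sketch of a V2V-style status broadcast: each equipped car periodically
# shares its position, speed and heading so nearby vehicles can anticipate it.
# The fields, JSON encoding and proximity check are illustrative only -- this
# is not the real SAE J2735 Basic Safety Message format.
import json
from dataclasses import dataclass, asdict

@dataclass
class SafetyMessage:
    vehicle_id: str
    latitude: float     # decimal degrees
    longitude: float    # decimal degrees
    speed_mps: float    # meters per second
    heading_deg: float  # 0-360, clockwise from north

def broadcast(msg: SafetyMessage, send) -> None:
    """Serialize the message and hand it to a radio send() callback."""
    send(json.dumps(asdict(msg)).encode("utf-8"))

def too_close(a: SafetyMessage, b: SafetyMessage, threshold_m: float = 50.0) -> bool:
    """Crude flat-earth proximity check a receiving car might run on each message."""
    meters_per_degree = 111_000  # rough conversion near mid-latitudes
    dx = (a.longitude - b.longitude) * meters_per_degree
    dy = (a.latitude - b.latitude) * meters_per_degree
    return (dx * dx + dy * dy) ** 0.5 < threshold_m

if __name__ == "__main__":
    me = SafetyMessage("car-42", 42.3314, -83.0458, 13.4, 90.0)
    broadcast(me, send=lambda payload: print(payload.decode()))
    neighbor = SafetyMessage("car-7", 42.3316, -83.0458, 0.0, 0.0)
    print(too_close(me, neighbor))  # about 22 m apart -> True
```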

Connected vehicles also help in recognizing and alerting drivers to dangerous situations. By adding communication points in hazardous road areas and intersections, V2I technology extends crash-reduction capabilities by allowing automatic control of signal timing, speed management, and operation of transit and commercial vehicles.

“The connected vehicle technologies are ready,” said Suzanne Murtha, senior program manager for intelligent transportation initiatives at Atkins Global. “Now it’s a matter of governments capturing and sharing data about real-time, on-the-street traffic conditions so drivers can make better choices.”

A different but related technology is that of autonomous vehicles, perhaps the most famous example of which is the Google self-driving car. Autonomous cars use a combination of LIDAR (similar in principle to radar, but using laser light), GPS, optical cameras and massive processing power to analyze millions of possible roadway scenarios and then take the appropriate action. The ultimate goal for autonomous vehicle technology is to make the vehicle so intelligent that no driver input is needed. However, truly autonomous vehicles, wherein the driver can give up complete control to the car, remain on the distant horizon. According to Ford, it is incremental technological advancement that will one day lead to driverless cars.

“By the time we get to full autonomy, the last step won’t seem like such a big deal,” he said. “Even as we put in a lot of these features the driver still has to be vigilant and in control.”
Malcolm Dougherty, director of the California Department of Transportation, agreed. At the ITS World Congress he said that while he believed “the development of autonomous vehicle technology is going to accelerate … for the time being the motorist will always be responsible for the vehicle.”

If you buy a new car today, you’re getting a preview of how driving will change as we move into the era of autonomous, connected vehicles. Features that help you park the car in a tight spot, automatically adjust cruise control speeds and sound an alert when the car drifts out of its lane are examples of technology now offered by automakers. Several states have already passed laws that allow autonomous vehicles to operate on public roads.

In September, California approved three permits for Volkswagen, Mercedes and Google to start autonomous vehicle testing under Senate Bill 1298, which requires the state to adopt formal autonomous vehicle testing rules by 2015.

“When SB 1298 was working its way through, everyone thought that the technology was quite a number of years away — and we were all very surprised as we met with the car manufacturers and industry, about how far along the technology really is,” California Department of Motor Vehicles CIO Bernard Soriano told FutureStructure’s sister publication Techwire. “Getting a chance to see the technology up close and being able to experience it is mind-boggling. It’s exciting to be working on this because we’re on the cusp of societal change. I’m not one to use hyperbole, but this one is a game-changer. It will change the way we function as a society, for the better.”

California State Sen. Alex Padilla (D-Pacoima), who introduced SB 1298 two years ago, shared in the excitement, saying in a statement that “this technology takes a bold step forward. Driverless vehicles will revolutionize transportation, reduce traffic accidents and save lives. Establishing safety standards for these vehicles is an essential step in that process.”
Terry D. Bennett, senior industry program manager, civil engineering and planning at Autodesk, said while the autonomous vehicle concept is compelling, focusing on V2I and V2V makes more practical sense in cities.

“I think [autonomous cars] more than anything create a lot of space for people to think differently,” he said. “But with Detroit and other cities looking at dedicated roads for vehicle-to-vehicle or vehicle-to-infrastructure communication, you’re starting to see the point that having infrastructure that’s intelligent, has sensors and can communicate, is a much better long term approach than trying to automate a single car.”

Indeed, the U.S. Department of Transportation estimates that V2V technology may eliminate or reduce the impact of up to 80 percent of crashes involving unimpaired drivers. In a Governing Institute survey, 62 percent of local officials agreed that autonomous and connected vehicles will mean fewer crashes. Fifty-one percent also foresee improved mobility and reduced congestion as more intelligent vehicles take to the road.

Electric Vehicles

Oregon is gaining both environmental and economic development benefits from its infrastructure and program investments to support electric vehicles (EVs). The most visible of these investments is the West Coast Electric Highway, which includes charging stations along Interstate 5 in Oregon, Washington and eventually California. Based on positive public response, Oregon is installing EV charging stations along other key highways and encouraging private businesses to install stations as well.

Travel Oregon, the state’s tourism office, runs a targeted EV tourism program, “Oregon Electric Byways,” with suggested itineraries and a partnership with Enterprise Rent-a-Car for EV rental.
“It’s hard to separate the infrastructure from economic development because the infrastructure starts the conversation about EVs, especially outside of major cities,” said Ashley Horvat, Oregon’s chief electric vehicle officer and the first person in the public sector to hold this role. “By placing charging stations around the state, we went into communities that had never seen EVs, which really increased adoption and created a positive perception for Oregon within the EV industry.”

In September, California Gov. Jerry Brown signed Senate Bill 1275, which sets a goal for the state to put 1 million zero-emission vehicles on the road by 2023. The bill also authorizes the state to provide financial incentives for consumers to purchase such vehicles, part of the governor’s effort to make electric cars affordable for lower-income workers.

“I’m excited that California is charging ahead with plans to have electric vehicles in every zip code across the state,” the bill’s author, California State Sen. Kevin De León (D-Los Angeles) said in a statement. “We’re going to lead the way in the fight against climate change by putting a million EVs on the roads, which means making them affordable to all drivers, not just the wealthy.”

Driving Data

Intelligent infrastructure generates data that helps civic leadership make better decisions. For local transportation managers, connected vehicles and connected infrastructure will be tools for traffic data collection and analytics.

Better traffic flow is achievable in part with better systems for collecting and analyzing real-time traffic data. In this arena, transportation managers can learn from the technologies and practices deployed by private companies, especially those with large fleets.

For instance, some keen-eyed observers know that the familiar brown UPS trucks rarely make a left turn. The reason is that for decades UPS has worked to optimize routes. The UPS On-Road Integrated Optimization and Navigation (ORION) software, which provides analytics for routing the company’s delivery trucks, is the latest step in that effort. The system combines daily data on package delivery commitments with historical route tracking to identify the optimal path (out of hundreds of thousands of possibilities) for each UPS driver to follow that day. UPS expects the ORION system to significantly reduce fuel consumption and miles driven in its trucks. Public transportation departments will benefit from using similar analytics tools, said Tom Madrecki, strategic communications manager at UPS.
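
ORION itself is proprietary, but the flavor of the problem is easy to sketch. The toy heuristic below greedily picks the nearest remaining stop while penalizing left turns; the coordinates, the left-turn test and the penalty weight are invented for illustration and are not UPS’s actual algorithm.

```python
# Toy route heuristic in the spirit of (but far simpler than) fleet-routing
# software like ORION: greedily pick the next stop, discouraging left turns.
# Distances, the left-turn test and the penalty weight are all illustrative.
import math

def heading(a, b):
    """Bearing in degrees from point a to point b (x = east, y = north)."""
    return math.degrees(math.atan2(b[0] - a[0], b[1] - a[1])) % 360

def is_left_turn(prev_heading, new_heading):
    """Treat a large counterclockwise heading change as a 'left turn'."""
    delta = (new_heading - prev_heading) % 360
    return 180 < delta < 315

def plan_route(depot, stops, left_turn_penalty=0.5):
    """Greedy nearest-neighbor tour that inflates the cost of left turns."""
    route, current, current_heading = [depot], depot, 0.0
    remaining = list(stops)
    while remaining:
        def cost(stop):
            d = math.dist(current, stop)
            turn = is_left_turn(current_heading, heading(current, stop))
            return d * (1 + left_turn_penalty * turn)
        nxt = min(remaining, key=cost)
        current_heading = heading(current, nxt)
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

if __name__ == "__main__":
    print(plan_route(depot=(0, 0), stops=[(2, 1), (1, 3), (-1, 2)]))
```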

“It’s really about diving into the data and, based on where people need to go, determining how to make the transportation system the best it can be, then investing in the needed technology to realize those improvements,” Madrecki said.  
           

ITS and the Path to Smart Cities

No matter how promising the new technology, local transportation officials are caught in a classic funding bind — it’s impossible to reduce costs without making investments, but funding for infrastructure investments is scarce or nonexistent. In a recent Governing Institute survey, 78 percent of respondents indicated lack of funding was the key barrier to developing ITS, well ahead of the 45 percent who cited an aging infrastructure as the key barrier.

“We need to create a framework for private entrepreneurship to lead the way,” argued Florida Department of Transportation Secretary Ananth Prasad during a panel at the ITS World Congress. “Legislation at the state and federal level needs to be loosened up.”

In many cases, public-private partnerships will have a larger role in financing new transportation projects. “Many state and local governments don’t have the staff and other resources to implement projects on this large scale,” said Nicholas Fluehr, a managing director at Wells Fargo. “Although municipal bond financing is still a viable option, partnering with the private sector can be a good option from both a cost and efficiency standpoint.”

As traffic volumes continue to grow in the coming decades, the public sector will need to consider every possible opportunity to better manage all transportation systems and infrastructure.

“For state and local governments, the question is which investments will allow them to more effectively and efficiently utilize the existing transportation infrastructure,” said Murtha. “You can spend billions on new roads and light rail or you can make a much smaller investment in the communications technology that will allow more vehicles to operate intelligently on current streets and highways.”

One of the recurring themes of the ITS World Congress was that we’re on the cusp of an extraordinary revolution in transportation, one that may save government billions of dollars by facilitating far better utilization of existing transportation infrastructure.

“Investing in last century’s infrastructure is cheaper in the short run but more costly in the long run,” said Verizon Chairman and CEO Lowell C. McAdam in a keynote address at the ITS World Congress.

That’s why the smart cities of the future will be those that embrace and integrate intelligent transportation systems. While driverless cars may be a long way off, vehicle connectivity is not.
“A smart, connected infrastructure will improve the quality of all our lives,” McAdam said. “Job No.1 in achieving this potential is bringing connectivity to every car.”

Too Tired to Practice? Ask a GPS Device

As reported by ABC News: Nebraska's Tommy Armstrong Jr. was running play after play during a preseason practice and was beginning to wear down in the heat.

He could have asked for a break, but he didn't have to. An assistant strength coach who was keeping electronic tabs on Armstrong could tell by looking at his laptop that the quarterback was fatigued. Armstrong was ordered to the sideline.
"Dial it down," he was told.
Armstrong had just entered the "red zone" — and not the kind that extends from the end zone to the 20-yard line. This "red zone" meant Armstrong — who was wearing a tracking device relaying biomechanical data to the staffer's laptop in real time — was overexerting himself and at greater risk for injury.
It's one of the features of technology being used by about 30 college football teams and 15 NFL teams to monitor the movements and physical output of players during conditioning, practices and games.
The Australia-based company Catapult developed the system about eight years ago. Rugby and soccer teams were among the first to use it. Football teams in the United States began signing on with Catapult three years ago, and several hockey and basketball teams have followed.
"You build a portfolio of data on each player so over a period of time you can tell when they're wearing down, do they need an extra rest, do they need a day off, all those things," Tennessee coach Butch Jones said. "The most important thing is what you do throughout the week to get them ready to perform at their peak, at their optimal level, come game day."
At Nebraska, the top 50 football players slip a monitor weighing about 3 ounces into a pouch in the back of the tight-fit shirts they wear under their shoulder pads. Head strength coach James Dobson said it's too expensive to track all of the Huskers' 130 players. As it is, Nebraska will pay Catapult more than $363,000 over three years to rent equipment.
Each monitor includes a GPS device and other sensors that measure hundreds of variables per second, many of them hard to pronounce.
Some of the basic metrics: how far and how fast the player traveled during a practice or game, his rate of acceleration, how many times he went right versus left and whether he moved faster when he went one way or the other. The monitor is so sensitive that it can detect even a slight change in a player's gait, which can be a sign of fatigue or injury.
Data collected is put into an algorithm developed by Catapult, and the result is a number called "player load." The load is a number that varies depending on a player's position, but the average in college football would be about 350, said Catapult sports performance manager Ben Peterson. The higher a player's number goes, the greater his exertion.
A baseline is established for each player, and his readings can be monitored in real time.
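
Catapult's exact formula is proprietary, but published descriptions of "player load" generally accumulate how sharply acceleration changes across the sensor's three axes. The sketch below follows that general shape; the scaling factor and the red-zone threshold are assumptions for illustration, not Catapult's numbers.

```python
# Sketch of an accumulated "player load"-style metric: sum the change in
# acceleration across three axes over a session. The /100 scaling and the
# red-zone threshold are assumptions for illustration, not Catapult's values.
import math

def player_load(accel_samples):
    """accel_samples: list of (ax, ay, az) accelerometer readings in g,
    taken at a fixed sample rate. Returns an accumulated load value."""
    load = 0.0
    for (x0, y0, z0), (x1, y1, z1) in zip(accel_samples, accel_samples[1:]):
        load += math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)
    return load / 100.0  # arbitrary scaling so values land in a readable range

def in_red_zone(load, baseline, threshold=1.25):
    """Flag a player whose session load runs well above his personal baseline."""
    return load > baseline * threshold

if __name__ == "__main__":
    samples = [(0.0, 0.0, 1.0), (0.2, 0.1, 1.1), (0.5, -0.2, 0.9), (0.1, 0.0, 1.0)]
    session = player_load(samples)
    print(round(session, 4), in_red_zone(session, baseline=0.008))
```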
"On certain days you have to be in certain zones," said Armstrong, the Nebraska quarterback. "If you go over that, they tell you, 'Hey, yesterday you were in the red, so make sure you're not today.' If you are in the red zone, you take a few series off."
Under NCAA rules, Catapult data cannot be looked at in real time during games because it could provide a competitive advantage if one team is using the system and the other is not.
Peterson said college teams using the system have reported an average of a 27-percent decrease in soft-tissue injuries.
When an athlete does get hurt, sports medicine personnel can use Catapult data to manage his recovery. For instance, if an injured wide receiver were able to reach only 70 percent of his maximum acceleration or speed, it would show he has a ways to go before he's ready to play in a game. The data also could be used to establish points of emphasis in a hurt athlete's rehabilitation protocol.
Alabama coach Nick Saban said he looks at player load readings to see which players are working as hard as they can and, conversely, to identify ones who aren't. Saban said players who know they're going to play on Saturdays tend to give maximum effort all the time, but that's not necessarily the case for those who aren't as likely to play.
Saban said it's telling to track defensive backs.
"When they're covering a good receiver, their numbers are higher," Saban said. "When they're covering a guy who's not as fast, they're not as good."
Tennessee safety Brian Randolph said the technology helps coaches put players in the best position for success.
"They don't want to overwork us. It shows that they care," Randolph said. "They definitely tell you when you've had a lot of reps or when you have a lot of mileage on your legs from the day before, so they tell you to get in the cold tub and get extra recovery."

Thursday, November 6, 2014

A Brain-Inspired Chip Takes to the Sky

As reported by MIT Technology Review: There isn’t much space between your ears, but what’s in there can do many things that a computer of the same size never could. Your brain is also vastly more energy efficient at interpreting the world visually or understanding speech than any computer system.

That’s why academic and corporate labs have been experimenting with “neuromorphic” chips modeled on features seen in brains. These chips have networks of “neurons” that communicate in spikes of electricity (see “Thinking in Silicon”). They can be significantly more energy-efficient than conventional chips, and some can even automatically reprogram themselves to learn new skills.

Now a neuromorphic chip has been untethered from the lab bench, and tested in a tiny drone aircraft that weighs less than 100 grams.

In the experiment, the prototype chip, with 576 silicon neurons, took in data from the aircraft’s optical, ultrasound, and infrared sensors as it flew between three different rooms.

The first time the drone was flown into each room, the unique pattern of incoming sensor data from the walls, furniture, and other objects caused a pattern of electrical activity in the neurons that the chip had never experienced before. 

That triggered it to report that it was in a new space, and also caused the ways its neurons connected to one another to change, in a crude mimic of learning in a real brain. Those changes meant that next time the craft entered the same room, it recognized it and signaled as such.
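
A conventional-software caricature of that behavior looks something like the sketch below: store the activity pattern each room produces, report "new" when an incoming pattern resembles nothing in memory, and nudge stored patterns toward repeat observations as a rough stand-in for the chip's synaptic changes. It is only an analogy, not a model of HRL's actual 576-neuron hardware.

```python
# Rough caricature of the behavior described above: remember the pattern each
# room produces, report "new" for unfamiliar patterns, and strengthen stored
# patterns on repeat visits. Ordinary Python, not a model of HRL's chip.
import numpy as np

class NoveltyMemory:
    def __init__(self, similarity_threshold=0.9):
        self.prototypes = []  # one stored pattern per "familiar" place
        self.threshold = similarity_threshold

    def observe(self, pattern):
        """Return 'known' or 'new', updating memory Hebbian-style either way."""
        pattern = pattern / (np.linalg.norm(pattern) + 1e-9)
        for i, proto in enumerate(self.prototypes):
            if float(pattern @ proto) > self.threshold:
                # Familiar: nudge the stored prototype toward this observation.
                blended = 0.9 * proto + 0.1 * pattern
                self.prototypes[i] = blended / np.linalg.norm(blended)
                return "known"
        self.prototypes.append(pattern)  # Novel: remember this pattern.
        return "new"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    room_a, room_b = rng.random(576), rng.random(576)
    memory = NoveltyMemory()
    print(memory.observe(room_a))                            # -> new
    print(memory.observe(room_b))                            # -> new
    print(memory.observe(room_a + 0.01 * rng.random(576)))   # -> known
```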

The chip involved is far from ready for practical deployment, but the test offers empirical support for the ideas that have motivated research into neuromorphic chips, says Narayan Srinivasa, who leads HRL’s Center for Neural and Emergent Systems. “This shows it is possible to do learning literally on the fly, while under very strict size, weight, and power constraints,” he says.

The drone, custom built for the test by the drone maker AeroVironment, based in Monrovia, California, is six inches square, 1.5 inches high, and weighs only 93 grams, including the battery. HRL’s chip made up just 18 grams of the craft’s weight, and used only 50 milliwatts of power. That wouldn’t be nearly enough for a conventional computer to run software that could learn to recognize rooms, says Srinivasa.

The flight test was a challenge set by the Pentagon research agency DARPA as part of a project under which it has funded HRL, IBM, and others to work on neuromorphic chips. One motivation is the hope that neuromorphic chips might make it possible for military drones to make sense of video and sensor data for themselves, instead of always having to beam it down to earth for analysis by computers or humans.

Prototypes made under DARPA’s program—like HRL’s—have delivered promising results, but much work remains before such technology can perform useful work, says Vishal Saxena, an assistant professor working on neuromorphic chips at Boise State University. “The biggest challenge is identifying what the applications will be and developing robust algorithms,” he says.

Researchers also face a chicken-and-egg scenario, with chips being developed without much idea of what algorithms they will run and algorithms being written without a firm idea of what chip designs will become established. At the same time, neuroscientists are still discovering new things about how networks of real brain cells work on information. “There’s a lot of work to be done collectively between circuit and algorithm experts and the neuroscience community,” says Saxena.

Still, HRL’s owners, GM and Boeing, are already considering how they might commercialize the technology, says Srinivasa. One option could be to use neuromorphic chips to build a degree of intelligence into the sensors increasingly found in cars, planes, and other systems.

Wednesday, November 5, 2014

Treating Ebola: The Bluetooth Method

As reported by National Geographic: Before they set a toe into the concrete-walled isolation room, the doctors and nurses become fortresses unto themselves: face shields, of course, but respirators, too, plus three layers of gloves on each hand, duct-taped to their sleeves. Nurses watch over a webcam to keep them on protocol, and Bluetooth stethoscopes relay heart data directly to a remote location—no ear canal exposure required.
Call it the no-touch approach to medicine. And it's the little-heralded reason that a hospital in Nebraska, of all places, has emerged as a leader in the stateside fight against Ebola. Already, it's brought two Ebola patients to recovery and prevented transmission to health care providers. 

The Centers for Disease Control and Prevention has held up the hospital as a model for others.

All around the world, of course, the health care workers who've been treating the terrifying disease avoid skin-to-skin contact with patients and use a battery of protective equipment, like gloves and air-filtering PAPR suits. But Nebraska Medicine, near downtown Omaha, has taken protection to a whole new frontier—and into the slightly eerie field of hands-free medicine. If successful, the approach could have implications for medical practice, even beyond Ebola, especially as the burgeoning field of telehealth takes off. (The U.S. telehealth market could grow more than 50 percent annually through 2018, Forbes reports.)

The challenge is to harness technology's protective power without jettisoning the bedside manner, a key to healing, Nebraska health care practitioners acknowledge. They're navigating the trade-offs with computer screens that display "almost life-size" images, said Nebraska Medicine lead nurse Kathleen Boulter. And although providers remain hidden beneath layers of latex and paper, their patients have surprised them with an ability to recognize them by their eyes. "A lot of emotion is expressed by our eyes," she said.

And so far, the hospital has a 100 percent success rate on Ebola. Its first Ebola patient, 51-year-old missionary Dr. Rick Sacra, stayed for nearly three weeks before his release. On Oct. 22, the hospital discharged its second patient, NBC freelance cameraman Ashoka Mukpo, 33, after a roughly two-week stay, said Boulter.


So how does Nebraska Medicine work? It starts with a secured entrance. To limit traffic in and out of the isolation room—and the risk of spreading disease—it uses the Vidyo videoconferencing platform. The isolation room houses a webcam-equipped computer connected to the front desk, the biocontainment unit's conference rooms and providers' offices outside the unit. And inside the isolation room, providers can request a second opinion or order supplies without ever leaving. "If something's going on, we know right away," Boulter said.

Traditional stethoscopes also pose a huge contamination risk, medical professionals say, because they require practitioners to lodge earpieces into their ear canals. Tech, of course, has found a way around this. The 3M Littmann Electronic Stethoscope looks much like a regular stethoscope, but its Bluetooth capabilities allow Nebraska Medicine providers to take their ears out of the equation. Instead, a sensor goes onto the patient's chest. A USB dongle, connected to the computer in the isolation room, establishes a Bluetooth connection with a remote computer. Providers outside can listen to a patient's heart and lung sounds in real time. They can even tell health care workers inside the isolation room to reposition the sensor.
Another stethoscope used by Nebraska Medicine is the Thinklabs One Digital Stethoscope. Its high sound quality allows health care workers to wear earpieces over their surgical caps, eliminating ear-canal exposure. They slip them on just before entering the isolation room and plug them into a hockey puck-sized sensor—equipped with a volume-control module—that picks up sounds from the patient's chest. Providers chuck the earpieces into the hazardous waste bin when they doff their protective gear.

Meanwhile, devices that monitor pulse and other vital signs upload measurements to the patient's electronic health record. And a wireless-capable X-ray allows nurses to send images directly to radiologists, skipping the step of transporting bulky film cassettes to the medical imaging department for processing.

Behind the no-touch push is Nebraska Medicine's information technology department, which is "robust across all units, not just biocontainment," Boulter said. "Even on regular floors, nurses have laptops on them" and rely on the same wireless X-rays. And the Center for Medicare & Medicaid Innovation awarded the hospital a $10 million telehealth grant in July.

And while other hospitals have embraced telehealth too, practitioners hope the healing touch is here to stay. "There are times when something as simple as holding a patient or family member's hand conveys calmness, caring, reduces fear. ... I don't believe the effect of a human touch is something that can be replaced," Boulter said.

Even the hospital's telehealth guru, Kyle Hall, agrees: "[I]t's still about a human diagnosing the patient."

New Clock May Redefine Time As We Know It

Strontium atoms floating in the center of this photo are the heart of the world's most precise clock. The clock is so exact that it can detect tiny shifts in the flow of time itself.
As reported by NPR: "My own personal opinion is that time is a human construct," says Tom O'Brian. O'Brian has thought a lot about this over the years. He is America's official timekeeper at the National Institute of Standards and Technology in Boulder, Colorado.

To him, days, hours, minutes and seconds are a way for humanity to "put some order in this very fascinating and complex universe around us."

We bring that order using clocks, and O'Brian oversees America's master clock. It's one of the most accurate clocks on the planet: an atomic clock that uses oscillations in the element cesium to count out 0.0000000000000001 (1×10⁻¹⁶) second at a time. If the clock had been started 300 million years ago, before the age of dinosaurs began, it would still be keeping time — down to the second. But the crazy thing is, despite knowing the time better than almost anyone on Earth, O'Brian can't explain time.
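
The arithmetic behind that 300-million-year claim is straightforward if you take the clock's fractional uncertainty to be roughly one part in 10¹⁶, the figure quoted above:

```latex
3\times10^{8}\ \text{yr} \times 3.15\times10^{7}\ \tfrac{\text{s}}{\text{yr}} \approx 9.5\times10^{15}\ \text{s},
\qquad
9.5\times10^{15}\ \text{s} \times 10^{-16} \approx 0.9\ \text{s}
```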

"We can measure time much better than the weight of something or an electrical current," he says, "but what time really is, is a question that I can't answer for you."

Maybe it's because we don't understand time that we keep trying to measure it more accurately. But that desire to pin down the elusive ticking of the clock may soon be the undoing of time as we know it: The next generation of clocks will not tell time in a way that most people understand.

The New Clock
At the nearby University of Colorado Boulder is a clock even more precise than the one O'Brian watches over. The basement lab that holds it is pure chaos: Wires hang from the ceilings and sprawl across lab tables. Binder clips keep the lines bunched together.

In fact, this knot of wires and lasers actually is the clock. It's spread out on a giant table, parts of it wrapped in what appears to be tinfoil. Tinfoil?

"That's research grade tinfoil," says Travis Nicholson, a graduate student here at the JILA, a joint institute between NIST and CU-Boulder. Nicholson and his fellow graduate students run the clock day to day. Most of their time is spent fixing misbehaving lasers and dealing with the rats' nest of wires. ("I think half of them go nowhere," says graduate student Sara Campbell.)

At the heart of this new clock is the element strontium. Inside a small chamber, the strontium atoms are suspended in a lattice of crisscrossing laser beams. Researchers then give them a little ping, like ringing a bell. The strontium vibrates at an incredibly fast frequency. It's a natural atomic metronome ticking out teeny, teeny fractions of a second.
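
How teeny? The strontium transition used in clocks like this one oscillates roughly 4.3 × 10¹⁴ times per second (the figure here is rounded), so a single "tick" lasts only a couple of femtoseconds:

```latex
f_{\mathrm{Sr}} \approx 4.3\times10^{14}\ \text{Hz}
\quad\Longrightarrow\quad
\frac{1}{f_{\mathrm{Sr}}} \approx 2.3\times10^{-15}\ \text{s}
```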

This new clock can keep perfect time for 5 billion years.
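
As a rough back-of-the-envelope figure, keeping time for 5 billion years without losing a second implies a fractional accuracy of a few parts in 10¹⁸:

```latex
5\times10^{9}\ \text{yr} \approx 1.6\times10^{17}\ \text{s},
\qquad
\frac{1\ \text{s}}{1.6\times10^{17}\ \text{s}} \approx 6\times10^{-18}
```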

"It's about the whole, entire age of the earth," says Jun Ye, the scientist here at JILA who built this clock. "Our aim is that we'll have a clock that, during the entire age of the universe, would not have lost a second."

But this new clock has run into a big problem: This thing we call time doesn't tick at the same rate everywhere in the universe. Or even on our planet.

Time Undone
Right now, on the top of Mount Everest, time is passing just a little bit faster than it is in Death Valley. That's because the speed at which time passes depends on the strength of gravity. Einstein himself discovered this dependence as part of his theory of relativity, and it is a very real effect.

The relative nature of time isn't just something seen in the extreme. If you take a clock off the floor and hang it on the wall, Ye says, "the time will speed up by about one part in 10¹⁶."

The world's most precise atomic clock is a mess to look at. But it can tick for billions of years without losing a second.

That is a sliver of a second. But this isn't some effect of gravity on the clock's machinery. 

Time itself is flowing more quickly on the wall than on the floor. These differences didn't really matter until now. But this new clock is so sensitive, little changes in height throw it way off. Lift it just a couple of centimeters, Ye says, "and you will start to see that difference."
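
That one-part-in-10¹⁶ figure follows from the standard gravitational time-dilation estimate for a small change in height near the earth's surface, shown here for a height change of about one meter, roughly floor to wall:

```latex
\frac{\Delta f}{f} \approx \frac{g\,\Delta h}{c^{2}}
\approx \frac{9.8\ \tfrac{\text{m}}{\text{s}^{2}} \times 1\ \text{m}}{\left(3\times10^{8}\ \tfrac{\text{m}}{\text{s}}\right)^{2}}
\approx 1.1\times10^{-16}
```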

This new clock can sense the pace of time speeding up as it moves inch by inch away from the earth's core.

That's a problem, because to actually use time, you need different clocks to agree on the time. Think about it: If I say, 'let's meet at 3:30,' we use our watches. But imagine a world in which your watch starts to tick faster, because you're working on the floor above me. Your 3:30 happens earlier than mine, and we miss our appointment.

This clock works like that. Tiny shifts in the earth's crust can throw it off, even when it's sitting still. Even if two of them are synchronized, their different rates of ticking mean they will soon be out of synch. They will never agree.

The world's current time is coordinated between atomic clocks all over the planet. But that can't happen with the new one.

"At this level, maintaining absolute time scale on earth is in fact turning into nightmare," Ye says. This clock they've built doesn't just look chaotic. It is turning our sense of time into chaos.
Ye suspects the only way we will be able to keep time in the future is to send these new clocks into space. Far from the earth's surface, the clocks would be better able to stay in synch, and perhaps our unified sense of time could be preserved.

But the NIST's chief timekeeper, Tom O'Brian, isn't worried about all this. As confusing as these clocks are, they're going to be really useful.

"Scientists can make these clocks into exquisite devices for sensing a whole bunch of different things," O'Brian says. Their extraordinary sensitivity to gravity might allow them to map the interior of the earth, or help scientists find water and other resources underground.

A network of clocks in space might be used to detect gravitational waves from black holes and exploding stars.

They could change our view of the universe.

They just may not be able to tell us the time.

Tuesday, November 4, 2014

GPS and Relativity

As reported by GPSWorld: An educational video by the Perimeter Institute of Theoretical Physics shows how GPS, a navigational tool that can pinpoint your location to within a few meters, incorporates a number of effects from Einstein’s theory of relativity.


Special relativity effects include the speed of the GPS satellites (about 14,000 km/h), which causes the atomic clocks on board the satellites to register time about 7 microseconds per day slower than clocks on earth. General relativistic effects include the reduced-gravity environment the satellites inhabit at about 20,000 km above the earth's surface, which causes their clocks to run about 45 microseconds per day faster than they do on earth. The overall effect is a net gain of about 38 microseconds per day in measured time, which is compensated for on the satellites.
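
Those are the standard back-of-the-envelope numbers; using an orbital speed of roughly 3.9 km/s and an orbital radius of roughly 26,600 km for the satellites, the two rate offsets work out as follows:

```latex
\begin{aligned}
\text{velocity (special relativity):}\quad & -\frac{v^{2}}{2c^{2}} \approx -8.3\times10^{-11} &&\Rightarrow\ \approx -7\ \mu\text{s/day}\\
\text{gravity (general relativity):}\quad & \frac{GM_{\oplus}}{c^{2}}\!\left(\frac{1}{R_{\oplus}} - \frac{1}{r}\right) \approx +5.3\times10^{-10} &&\Rightarrow\ \approx +45\ \mu\text{s/day}\\
\text{net:}\quad & && \approx +38\ \mu\text{s/day}
\end{aligned}
```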


Monday, November 3, 2014

The Plane Crash That Gave Americans GPS

As reported by The Atlantic: On the first day of September in 1983, the Soviet Union shot down a plane. Its military officers thought it was a spy plane, they said later. But it was not: It was a passenger jet, Korean Air Lines Flight 007, and the 269 people on the plane all died.

The flight had originated in New York; one of the passengers was a U.S. congressman. At first, the Soviet Union wouldn't even admit its military had shot the plane down, but the Reagan administration immediately started pushing to establish what had happened and stymie the operations of the Soviet Aeroflot airline. President Reagan also made a choice that, while reported at the time, was not the biggest news to come out of this event: He decided to speed up the timeline for civilian use of GPS.

The U.S. had already launched into orbit almost a dozen satellites that could help locate its military craft, on land, in the air, or on the sea. But the use of the system was restricted. (It was meant, for instance, to help powerful weapons hit their targets—it wasn't the sort of tool governments usually want to make publicly available.) Now, Reagan said, as soon as the next iteration of the GPS system was working, it would be available for free.

It took more than $10 billion and more than a decade for the second version of the U.S.'s GPS system to come fully online. But in 1995, as promised, it was available to private companies for consumer applications. Sort of. The government had built in some protection for itself—"selective availability," which reserved access to the best, most precise signals for the U.S. military (and anyone it chose to share that power with).

It didn't take long, though, for commercial providers of GPS services to start complaining. Location-based services, after all, are only as good as their actual usefulness—and if you've got a customer lost in the woods, you want that customer to know as precisely as possible where they are so they can get un-lost. In 2000, not that long before he left office, President Clinton got rid of selective availability and freed the world from ever depending on paper maps or confusing directions from relatives again.

GPS has not, however, been a panacea for international conflicts over the positioning of large vehicles. Just a few years ago, in 2007, a group of British sailors were detained by the Iranian government, which said they had wandered into Iranian waters. The British GPS system showed the boats in Iraqi waters. But it didn't matter. According to the Iranian authorities, they had been in Iranian waters. The sailors were released eventually—but only after almost two weeks of discussion over where, exactly, they had been.