
Wednesday, November 5, 2014

Treating Ebola: The Bluetooth Method

As reported by National Geographic: Before they set a toe into the concrete-walled isolation room, the doctors and nurses become fortresses unto themselves: face shields, of course, but respirators, too, plus three layers of gloves on each hand, duct-taped to their sleeves. Nurses watch over a webcam to keep them on protocol, and Bluetooth stethoscopes relay heart data directly to a remote location—no ear canal exposure required.
Call it the no-touch approach to medicine. And it's the little-heralded reason that a hospital in Nebraska, of all places, has emerged as a leader in the stateside fight against Ebola. Already, it's brought two Ebola patients to recovery and prevented transmission to health care providers. 

The Centers for Disease Control and Prevention has held up the hospital as a model for others.

All around the world, of course, the health care workers who've been treating the terrifying disease avoid skin-to-skin contact with patients and use a battery of protective equipment, like gloves and air-filtering PAPR suits. But Nebraska Medicine, near downtown Omaha, has taken protection to a whole new frontier—and into the slightly eerie field of hands-free medicine. If successful, the approach could have implications for medical practice, even beyond Ebola, especially as the burgeoning field of telehealth takes off. (The U.S. telehealth market could grow more than 50 percent annually through 2018, Forbes reports.)

The challenge is to harness technology's protective power without jettisoning the bedside manner, a key to healing, Nebraska health care practitioners acknowledge. They're navigating the trade-offs with computer screens that display "almost life-size" images, said Nebraska Medicine lead nurse Kathleen Boulter. And although providers remain hidden beneath layers of latex and paper, their patients have surprised them with an ability to recognize them by their eyes. "A lot of emotion is expressed by our eyes," she said.

And so far, the hospital has a 100 percent success rate on Ebola. Its first Ebola patient, 51-year-old missionary Dr. Rick Sacra, stayed for nearly three weeks before his release. On Oct. 22, the hospital discharged its second patient, NBC freelance cameraman Ashoka Mukpo, 33, after a roughly two-week stay, said Boulter.


So how does Nebraska Medicine's no-touch approach work? It starts with a secured entrance. To limit traffic in and out of the isolation room—and the risk of spreading disease—the hospital uses the Vidyo videoconferencing platform. The isolation room houses a webcam-equipped computer connected to the front desk, the biocontainment unit's conference rooms and providers' offices outside the unit. And from inside the isolation room, providers can request a second opinion or order supplies without ever leaving. "If something's going on, we know right away," Boulter said.

Traditional stethoscopes also pose a huge contamination risk, medical professionals say, because they require practitioners to lodge earpieces into their ear canals. Tech, of course, has found a way around this. The 3M Littmann Electronic Stethoscope looks much like a regular stethoscope, but its Bluetooth capabilities allow Nebraska Medicine providers to take their ears out of the equation. Instead, a sensor goes onto the patient's chest. A USB dongle, connected to the computer in the isolation room, establishes a Bluetooth connection with a remote computer. Providers outside can listen to a patient's heart and lung sounds in real time. They can even tell health care workers inside the isolation room to reposition the sensor.
Another stethoscope used by Nebraska Medicine is the Thinklabs One Digital Stethoscope. Its high sound quality allows health care workers to wear earpieces over their surgical caps, eliminating ear-canal exposure. They slip them on just before entering the isolation room and plug them into a hockey puck-sized sensor—equipped with a volume-control module—that picks up sounds from the patient's chest. Providers chuck the earpieces into the hazardous waste bin when they doff their protective gear.

Meanwhile, devices that monitor pulse and other vital signs upload measurements to the patient's electronic health record. And a wireless-capable X-ray allows nurses to send images directly to radiologists, skipping the step of transporting bulky film cassettes to the medical imaging department for processing.

Behind the no-touch push is Nebraska Medicine's information technology department, which is "robust across all units, not just biocontainment," Boulter said. "Even on regular floors, nurses have laptops on them" and rely on the same wireless X-rays. And the Center for Medicare & Medicaid Innovation awarded the hospital a $10 million telehealth grant in July.

And while other hospitals have embraced telehealth too, practitioners hope the healing touch is here to stay. "There are times when something as simple as holding a patient or family member's hand conveys calmness, caring, reduces fear. ... I don't believe the effect of a human touch is something that can be replaced," Boulter said.

Even the hospital's telehealth guru, Kyle Hall, agrees: "[I]t's still about a human diagnosing the patient."

New Clock May Redefine Time As We Know It

Photo caption: Strontium atoms, the heart of the world's most precise clock, which is so exact that it can detect tiny shifts in the flow of time itself.
As reported by NPR: "My own personal opinion is that time is a human construct," says Tom O'Brian. O'Brian has thought a lot about this over the years. He is America's official timekeeper at the National Institute of Standards and Technology in Boulder, Colorado.

To him, days, hours, minutes and seconds are a way for humanity to "put some order in this very fascinating and complex universe around us."

We bring that order using clocks, and O'Brian oversees America's master clock. It's one of the most accurate clocks on the planet: an atomic clock that uses oscillations in the element cesium to count out 0.0000000000000001 (1×10^-16) second at a time. If the clock had been started 300 million years ago, before the age of dinosaurs began, it would still be keeping time — down to the second. But the crazy thing is, despite knowing the time better than almost anyone on Earth, O'Brian can't explain time.
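
A quick back-of-the-envelope check of that claim, assuming the clock's fractional accuracy is the one part in 10^16 quoted above:

```python
# Back-of-the-envelope: how much error does a clock with a fractional
# accuracy of about 1e-16 accumulate over 300 million years?
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.16e7 seconds

fractional_accuracy = 1e-16                # roughly the level quoted for the cesium clock
years = 300e6                              # "before the age of dinosaurs began"

elapsed_seconds = years * SECONDS_PER_YEAR # ~9.5e15 seconds
accumulated_error = elapsed_seconds * fractional_accuracy

print(f"Elapsed time:      {elapsed_seconds:.2e} s")
print(f"Accumulated error: {accumulated_error:.2f} s")   # ~0.95 s, "down to the second"
```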

"We can measure time much better than the weight of something or an electrical current," he says, "but what time really is, is a question that I can't answer for you."

Maybe it's because we don't understand time that we keep trying to measure it more accurately. But that desire to pin down the elusive ticking of the clock may soon be the undoing of time as we know it: The next generation of clocks will not tell time in a way that most people understand.

The New Clock
At the nearby University of Colorado Boulder is a clock even more precise than the one O'Brian watches over. The basement lab that holds it is pure chaos: Wires hang from the ceilings and sprawl across lab tables. Binder clips keep the lines bunched together.

In fact, this knot of wires and lasers actually is the clock. It's spread out on a giant table, parts of it wrapped in what appears to be tinfoil. Tinfoil?

"That's research grade tinfoil," says Travis Nicholson, a graduate student here at the JILA, a joint institute between NIST and CU-Boulder. Nicholson and his fellow graduate students run the clock day to day. Most of their time is spent fixing misbehaving lasers and dealing with the rats' nest of wires. ("I think half of them go nowhere," says graduate student Sara Campbell.)

At the heart of this new clock is the element strontium. Inside a small chamber, the strontium atoms are suspended in a lattice of crisscrossing laser beams. Researchers then give them a little ping, like ringing a bell. The strontium vibrates at an incredibly fast frequency. It's a natural atomic metronome ticking out teeny, teeny fractions of a second.

This new clock can keep perfect time for 5 billion years.

"It's about the whole, entire age of the earth," says Jun Ye, the scientist here at JILA who built this clock. "Our aim is that we'll have a clock that, during the entire age of the universe, would not have lost a second."

But this new clock has run into a big problem: This thing we call time doesn't tick at the same rate everywhere in the universe. Or even on our planet.

Time Undone
Right now, at the top of Mount Everest, time is passing just a little bit faster than it is in Death Valley. That's because the speed at which time passes depends on the strength of gravity. Einstein himself discovered this dependence as part of his theory of relativity, and it is a very real effect.

The relative nature of time isn't just something seen in the extreme. If you take a clock off the floor, and hang it on the wall, Ye says, "the time will speed up by about one part in 10^16."

The world's most precise atomic clock is a mess to look at. But it can tick for billions of years without losing a second.

That is a sliver of a second. But this isn't some effect of gravity on the clock's machinery. 

Time itself is flowing more quickly on the wall than on the floor. These differences didn't really matter until now. But this new clock is so sensitive, little changes in height throw it way off. Lift it just a couple of centimeters, Ye says, "and you will start to see that difference."

This new clock can sense the pace of time speeding up as it moves inch by inch away from the earth's core.
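
The size of those shifts follows from the standard weak-field formula for gravitational time dilation: a fractional rate change of roughly g times the height change divided by c squared. A minimal sketch with textbook constants, using the heights mentioned above:

```python
# Weak-field gravitational time dilation: raising a clock by a height dh
# speeds up its rate by a fraction of roughly g * dh / c^2.
g = 9.81        # m/s^2, surface gravity
c = 2.998e8     # m/s, speed of light

def fractional_rate_change(dh_meters):
    return g * dh_meters / c ** 2

print(f"{fractional_rate_change(1.0):.1e}")    # ~1.1e-16: floor to wall, one part in 10^16
print(f"{fractional_rate_change(0.02):.1e}")   # ~2.2e-18: lifting it a couple of centimeters
```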

That's a problem, because to actually use time, you need different clocks to agree on the time. Think about it: If I say, 'let's meet at 3:30,' we use our watches. But imagine a world in which your watch starts to tick faster, because you're working on the floor above me. Your 3:30 happens earlier than mine, and we miss our appointment.

This clock works like that. Tiny shifts in the earth's crust can throw it off, even when it's sitting still. Even if two of them are synchronized, their different rates of ticking mean they will soon be out of synch. They will never agree.

The world's current time is coordinated between atomic clocks all over the planet. But that can't happen with the new one.

"At this level, maintaining absolute time scale on earth is in fact turning into nightmare," Ye says. This clock they've built doesn't just look chaotic. It is turning our sense of time into chaos.
Ye suspects the only way we will be able to keep time in the future is to send these new clocks into space. Far from the earth's surface, the clocks would be better able to stay in synch, and perhaps our unified sense of time could be preserved.

But the NIST's chief timekeeper, Tom O'Brian, isn't worried about all this. As confusing as these clocks are, they're going to be really useful.

"Scientists can make these clocks into exquisite devices for sensing a whole bunch of different things," O'Brian says. Their extraordinary sensitivity to gravity might allow them to map the interior of the earth, or help scientists find water and other resources underground.

A network of clocks in space might be used to detect gravitational waves from black holes and exploding stars.

They could change our view of the universe.

They just may not be able to tell us the time.

Tuesday, November 4, 2014

GPS and Relativity

As reported by GPSWorld: An educational video by the Perimeter Institute for Theoretical Physics shows how GPS, a navigational tool that can pinpoint your location to within a few meters, incorporates a number of effects from Einstein’s theory of relativity.


Special relativity comes into play because of the satellites' speed (about 14,000 km/h), which causes the atomic clocks on board to register time about 7 microseconds per day slower than clocks on earth.  General relativity enters through the weaker gravity the satellites experience at about 20,000 km above the earth's surface, which causes the clocks to run about 45 microseconds a day faster than they do on earth.  The net effect is that the satellite clocks gain about 38 microseconds per day, a drift that is compensated for on board the satellites.
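
Those figures can be reproduced with the first-order formulas from special and general relativity. A rough sketch with rounded constants and the actual GPS altitude of about 20,200 km; the real correction is applied by offsetting the satellite clock frequency before launch and also accounts for orbital eccentricity:

```python
# Rough reproduction of the relativistic rate offsets quoted above.
c  = 2.998e8        # speed of light, m/s
GM = 3.986e14       # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6   # Earth's radius, m
SECONDS_PER_DAY = 86400

v = 14000e3 / 3600             # satellite speed: 14,000 km/h in m/s (~3.9 km/s)
r_sat = R_EARTH + 20.2e6       # orbital radius for ~20,200 km altitude

# Special relativity: moving clocks run slow by roughly v^2 / (2 c^2).
sr_per_day = -(v**2 / (2 * c**2)) * SECONDS_PER_DAY

# General relativity: clocks higher in the gravitational potential run fast
# by roughly GM * (1/R_earth - 1/r_sat) / c^2.
gr_per_day = (GM * (1 / R_EARTH - 1 / r_sat) / c**2) * SECONDS_PER_DAY

print(f"Special relativity: {sr_per_day * 1e6:+.1f} microseconds/day")   # ~ -7
print(f"General relativity: {gr_per_day * 1e6:+.1f} microseconds/day")   # ~ +45
print(f"Net offset:         {(sr_per_day + gr_per_day) * 1e6:+.1f} microseconds/day")  # ~ +38
```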


Monday, November 3, 2014

The Plane Crash That Gave Americans GPS

As reported by The Atlantic: On the first day of September in 1983, the Soviet Union shot down a plane. Its military officers thought it was a spy plane, they said later. But it was not: It was a passenger jet, Korean Air Lines Flight 007, and the 269 people on the plane all died.

The flight had originated in New York; one of the passengers was a U.S. congressman. At first, the Soviet Union wouldn't even admit its military had shot the plane down, but the Reagan administration immediately started pushing to establish what had happened and stymie the operations of the Soviet Aeroflot airline. President Reagan also made a choice that, while reported at the time, was not the biggest news to come out of this event: He decided to speed up the timeline for civilian use of GPS.

The U.S. had already launched into orbit almost a dozen satellites that could help locate its military craft, on land, in the air, or on the sea. But the use of the system was restricted. (It was meant, for instance, to help powerful weapons hit their targets—it wasn't the sort of tool governments usually want to make publicly available.) Now, Reagan said, as soon as the next iteration of the GPS system was working, it would be available for free.

It took more than $10 billion and over 10 years for the second version of the U.S.'s GPS system to come fully online. But in 1995, as promised, it was available to private companies for consumer applications. Sort of. The government had built in some protection for itself—"selective availability," which reserved access to the best, most precise signals for the U.S. military (and anyone it chose to share that power with).

It didn't take long, though, for commercial providers of GPS services to start complaining. Location-based services, after all, are only as good as their actual usefulness—and if you've got a customer lost in the woods, you want that customer to know as precisely as possible where they are so they can get un-lost. In 2000, not that long before he left office, President Clinton got rid of selective availability and freed the world from ever depending on paper maps or confusing directions from relatives again.

GPS has not, however, been a panacea for international conflicts over the positioning of large vehicles. Just a few years ago, in 2007, a group of British sailors were detained by the Iranian government, which said they had wandered into Iranian waters. The British GPS system showed the boats in Iraqi waters. But it didn't matter. According to the Iranian authorities, they had been in Iranian waters. The sailors were released eventually—but only after almost two weeks of discussion over where, exactly, they had been.

SpaceShipTwo's Rocket Engine Did Not Cause Fatal Crash

As reported by Discovery: It wasn't SpaceShipTwo’s hybrid rocket motor -- which was flying on Friday with a new type of fuel -- that caused the fatal crash, the head of the accident investigation agency said late Sunday.

The ship’s fuel tanks and its engine were recovered intact, indicating there was no explosion.
“They showed no signs of burn-through, no signs of being breached,” Christopher Hart, acting chairman of the National Transportation Safety Board, told reporters at the Mojave Air and Space Port in Mojave, Calif.

Instead, data and video relayed from the ship show its hallmark safety feature -- a foldable tail section designed for easy re-entry into the atmosphere from space -- was deployed early.

“The engine burn was normal up until the extension of the feathers,” said Hart.

Normally, the feather system wouldn't be unlocked until the rocket-powered spaceship is moving about Mach 1.4, or 1.4 times faster than the speed of sound.

Instead, the co-pilot moved the lever from locked to unlocked when the spaceship was traveling at about Mach 1, Hart said.

“I’m not stating that this is the cause of the mishap,” he added. “We have months and months of investigation to determine what the cause was.”

In addition to the possibility of pilot error, Hart said the NTSB is looking into a variety of other issues that may have caused or contributed to the accident, including training, spacecraft design and the safety culture at Virgin Galactic and Scaled Composites, which designed and manufactured the spaceship.

“There is much more that we don’t know and our investigation is far from over,” Hart said.

The accident claimed the life of Scaled Composites test pilot Mike Alsbury, who was serving as the spaceship co-pilot, Scaled’s website shows. Pilot Pete Siebold, who was able to parachute to the ground, survived with a serious shoulder injury.

SpaceShipTwo took off on Friday morning for what was expected to be its fourth powered test flight. It was released as planned from its carrier jet at an altitude of about 45,000 feet. Seconds later, the spaceship’s hybrid motor, which was using a new plastic propellant, powered up.

About nine seconds later, the ship’s feathering system was unlocked, said Hart. Two seconds after that, the ship’s tail section moved toward the deployed position.

"This was an uncommanded feather, which means the feather occurred without the feather lever being moved into the feather position," Hart told Discovery News.
“Shortly after the feathering occurred, the telemetry data terminated and the video data terminated,” he said.

Debris was scattered over a five-mile area north of the spaceport, indicating the spaceship broke apart in flight.

About 800 people already have paid or put down deposits to fly on SpaceShipTwo. Virgin Galactic hoped to begin passenger service next year. The company's second ship is about 65 percent complete.


Saturday, November 1, 2014

Virgin Galactic Spaceship Crash Caps Terrible Week for Commercial Spaceflight

As reported by Space.com: The burgeoning field of commercial spaceflight suffered two serious blows this week.

The bad news began on Tuesday (Oct. 28), when Orbital Sciences Corp.'s Antares rocket exploded just seconds after blasting off on an unmanned cargo mission to the International Space Station for NASA. Then, on Friday (Oct. 31), Virgin Galactic's SpaceShipTwo crashed during a test flight; one of the two pilots aboard was killed and the other injured, apparently seriously.

The causes of the two accidents are unclear at the moment, and so are the consequences. But the fallout could be huge for Orbital Sciences, Virgin Galactic and the entire private spaceflight industry, which has been building up some serious momentum over the past several years. [Photos: SpaceShipTwo's Test Flights]

Virginia-based Orbital Sciences holds a $1.9 billion contract with NASA to make eight robotic cargo runs to the space station using Antares and the company's Cygnus spacecraft. Orbital had completed two such missions without incident before Tuesday's rocket explosion.

Another company, California-based SpaceX, also signed a deal to ferry cargo to the space station for NASA. The agency is paying SpaceX $1.6 billion to fly 12 unmanned supply missions to the orbiting lab using the firm's Dragon capsule and Falcon 9 rocket. So far, SpaceX has flown four of these missions, and all have been successful.

NASA is also looking to the private sector to take astronauts to and from low-Earth orbit. Last month, the agency awarded SpaceX and Boeing multibillion-dollar contracts to continue developing their crewed vehicles — a manned version of Dragon in SpaceX's case and a capsule called the CST-100 for Boeing.

NASA officials hope at least one of these spaceships is up and running by 2017. The agency has been dependent on Russian Soyuz spacecraft to ferry American astronauts to and from the space station since 2011, when NASA's space shuttle fleet retired.
NASA officials expressed confidence in Orbital Sciences after Tuesday's launch mishap, citing the company's two successful supply missions to the space station. The agency also seemed to affirm its commitment to private cargo delivery.

"Launching rockets is an incredibly difficult undertaking, and we learn from each success and each setback," Bill Gerstenmaier, head of NASA's Human Exploration and Operations Directorate, said in a statement Tuesday. "Today's launch attempt will not deter us from our work to expand our already successful capability to launch cargo from American shores to the International Space Station."

Meanwhile, Virgin Galactic and Scaled Composites — the company that built the six-passenger, two-pilot SpaceShipTwo — are dealing with a tragedy that claimed a life.
Virgin Galactic founder Sir Richard Branson has previously expressed hope that commercial operations of SpaceShipTwo will begin sometime in 2015. Friday's crash, which occurred during the suborbital space plane's fourth rocket-powered flight and 55th overall test flight, will almost certainly push that timeline back.

But Virgin Galactic representatives vowed that they will continue their work to get SpaceShipTwo up and running. And the entire industry will bounce back as well, said Stuart Witt, CEO of Mojave Air and Space Port in California, which hosts SpaceShipTwo's test flights.

"It hasn't been an easy week. It's certainly been a challenge," Witt said during a post-crash news conference Friday. "But where I'm from, this is where you find out your true character."

Friday, October 31, 2014

Spooky Action at a Distance: How Entanglement Generating Satellites Will Make the Quantum Internet Global

Sending entangled photons to opposite sides of the planet will require a small fleet of orbiting satellites, say physicists.
As reported by MIT Technology Review: One of the challenges that physicists face in creating a quantum Internet is to distribute entangled photons around the planet. The idea is that a user in Tokyo could use this entanglement to send a perfectly secure message to somebody in Moscow or Johannesburg or New York.

The problem is that entangled photons are difficult to send over these distances because optical fibers absorb them. This process of absorption limits the distance over which physicists can distribute entanglement to about 100 kilometers.
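
The scale of the problem is easy to see from the attenuation of standard telecom fiber, typically about 0.2 dB per kilometer. Here is a sketch using that typical figure; the article itself does not quote a loss rate:

```python
# Photon survival probability in optical fiber with a typical loss of 0.2 dB/km.
LOSS_DB_PER_KM = 0.2

def transmission(distance_km, loss_db_per_km=LOSS_DB_PER_KM):
    """Fraction of photons surviving a fiber run of the given length."""
    return 10 ** (-loss_db_per_km * distance_km / 10)

for d in (50, 100, 500, 1000):
    print(f"{d:5d} km: {transmission(d):.2e}")

# 100 km still passes about 1% of the photons; 1,000 km passes roughly 1 in 10^20,
# which is why direct fiber distribution stops being practical around 100 km.
```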

One solution is to place quantum repeaters along a fiber that pass on the entanglement without destroying it. Physicists are currently developing these kinds of devices and expect to have them operating in the next few years.

However, quantum repeaters will operate at temperatures close to absolute zero and require their own power and cooling infrastructure. That is all possible on land but is much harder to make work for transoceanic cables. Which is why physicists are looking for alternative ways to distribute entanglement over long distances.

Today, Kristine Boone at the University of Calgary in Canada and a few pals outline a plan to distribute entanglement around the planet from satellites orbiting a couple of hundred kilometers above the Earth. “Our proposed scheme relies on realistic advances in quantum memories and quantum non-demolition measurements and only requires a moderate number of satellites equipped with entangled photon pair sources,” they say.

One feature of quantum technology is that it is rapidly changing as advances are made in laboratories all over the planet. But any technology aboard a satellite cannot be changed once it is launched. So a potential danger with a satellite-based network is that it would be unable to take advantage of important advances.

Boone and co get around this by keeping much of the most advanced technology on the ground. Their proposed satellites will be little more than vehicles for producing entangled photons, a process that is relatively well understood and straightforward to achieve.

Each satellite will generate a constant stream of entangled pairs. Each member of the pair will be sent to separate stations on the ground, where it will be stored in quantum memories. In this way, the satellites will entangle quantum memories across the globe.

The ground stations will consist of relatively small one-meter telescopes, aimed at the satellites as they pass overhead. These will collect photons and direct them towards quantum memories. It is the quantum memories that are likely to advance rapidly in the coming years.
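
For a sense of why a downlink from orbit is workable at all, here is a toy diffraction-limited link budget. The wavelength and the satellite's transmit aperture are my own illustrative assumptions; only the one-meter ground telescope comes from the article:

```python
import math

# Toy diffraction-limited link budget for a satellite-to-ground photon downlink.
wavelength = 810e-9     # m (assumed)
d_transmit = 0.10       # m, satellite telescope aperture (assumed)
d_receive  = 1.0        # m, ground telescope aperture (from the article)

def collected_fraction(distance_m):
    """Rough fraction of a diffraction-limited beam caught by the ground telescope."""
    divergence = wavelength / d_transmit      # diffraction-limited beam divergence
    spot_radius = distance_m * divergence     # beam radius when it reaches the ground
    return min((d_receive / 2) ** 2 / spot_radius ** 2, 1.0)

for altitude_km in (300, 500, 1000):
    frac = collected_fraction(altitude_km * 1e3)
    print(f"{altitude_km:5d} km: {frac:.3f} collected (~{-10 * math.log10(frac):.0f} dB loss)")

# Unlike fiber, this loss grows only quadratically with distance rather than
# exponentially, which is the basic advantage of sending photons through vacuum.
```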

Once the entanglement is stored on the ground, it can then be used as needed to send secure messages, or even sent locally across the quantum Internet using short optical fibers.
Boone and co perform various calculations to show that their proposal is well founded. “We have argued that quantum repeaters based on LEO satellite links are a viable approach to global quantum communication,” they say.

An interesting question is whether the system they propose would be better than the one we discussed last week, in which entanglement is transported around the world in quantum memories on container ships. At first glance, that seems to have the potential to be cheaper given that the transport infrastructure is already in place and known to be cost-effective. By contrast, rocket launches and the satellites they carry are hugely expensive.

One thing is clear. Entanglement is set to become a valuable resource that is likely to be bought and sold, much like oil and gas today. Just how the incipient market for entanglement emerges will be interesting to watch.

Ref: arxiv.org/abs/1410.5384 : Entanglement Over Global Distances via Quantum Repeaters with Satellite Links