
Tuesday, June 18, 2013

Atomic Clocks to sync wirelessly for more accurate GPS satellite systems

NIST researchers transferred ultra-precise time signals over the air between a laboratory on NIST's campus in Boulder, Colo., and nearby Kohler Mesa. Signals were sent in both directions, reflected off a mirror on the mesa, and returned to the lab, a total distance of approximately 2 km. "The actual link is a loop," explained study co-author Nathan Newbury of NIST's Quantum Electronics and Photonics Division. The experiment used an infrared laser to generate ultra-short pulses, each about 1 picosecond long, at a very precise rate of one pulse every 10 nanoseconds, where 10 ns corresponds to a set number of "ticks" of an optical atomic clock. The two-way technique cancels the timing distortions that atmospheric turbulence imposes on the signals, and shows how next-generation atomic clocks at different locations could be linked wirelessly to improve distribution of time and frequency information, among other applications.
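The cancellation at the heart of a two-way exchange can be sketched numerically. The timestamps below are illustrative, not data from the NIST experiment; the point is that a propagation delay shared by both directions drops out of the combination:

```python
# Sketch of why a two-way exchange cancels the shared path delay.
def clock_offset(t1, t2, t3, t4):
    """Two-way time transfer: A sends at t1 (A's clock), B receives at t2
    (B's clock), B replies at t3, A receives at t4. For a symmetric path,
    the unknown propagation delay cancels in this combination."""
    return ((t2 - t1) - (t4 - t3)) / 2.0

# Example (all times in ns): B's clock runs 5 ns ahead, and a ~2 km air
# path adds about 6,671 ns of delay in each direction.
offset_ns = 5.0
delay_ns = 6671.0
t1 = 0.0
t2 = t1 + delay_ns + offset_ns       # arrival, measured on B's clock
t3 = t2 + 100.0                      # B replies 100 ns later
t4 = (t3 - offset_ns) + delay_ns     # arrival, measured back on A's clock
print(clock_offset(t1, t2, t3, t4))  # recovers the 5 ns offset
```

Turbulence changes the delay from moment to moment, but as long as both directions see the same delay during the exchange, the offset estimate is unaffected, which is why the two-way link tolerates the atmosphere.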

The stability of the transferred infrared signal matched that of NIST's best experimental atomic clock, which operates at optical frequencies. Infrared light is very close to the frequencies used by these clocks, and both are much higher than the microwave frequencies in conventional atomic clocks currently used as national time standards. Operating frequency is one of the most important factors in the precision of optical atomic clocks, which have the potential to provide a 100-fold improvement in the accuracy of future time standards. But the signals need to be distributed with minimal loss of precision and accuracy. 

The test was done across land, but eventually, the researchers hope, it should be possible to transfer the pulses via satellites. 

In the future, optical atomic clocks could be used for satellite-based experiments to create more precise GPS satellite navigation systems, which "could be improved in the sense that you could put better optical clocks in satellites and cross-link them optically," Newbury said. 

For GPS systems, an error of just one nanosecond, or a billionth of a second, would mean the location is about 12 inches (30 centimeters) off.
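That figure falls straight out of multiplying the clock error by the speed of light:

```python
# Why 1 ns matters for GPS: range error = speed of light x clock error.
C = 299_792_458  # speed of light, m/s

def range_error_m(clock_error_s):
    return C * clock_error_s

err = range_error_m(1e-9)          # a one-nanosecond clock error
print(round(err * 100, 1), "cm")   # about 30 cm, roughly 12 inches
```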

Monday, June 17, 2013

The 'Astronomical Data' Behind UPS' New Tool to Deliver Packages More Efficiently

An excellent article from 'Wired' on the logistical challenges of UPS, and the use of heuristics to make their delivery methodology even more efficient.  

Here is a quick summary of some of the numbers that they deal with on a daily basis:


30—The maximum number of inches UPS specifies a driver should have to move to select the next package. This is accomplished through a meticulous system for loading packages into the truck in the order in which they’ll be delivered.

74—The number of pages in the manual for UPS drivers detailing the best practices for maximizing delivery efficiency.

200—The number of data points monitored on each delivery truck to anticipate maintenance issues and determine the most efficient ways to operate the vehicles.

55,000—The number of “package cars” (the brown trucks) in UPS’ U.S. fleet. If the figures involved in determining the most efficient route for one driver are astronomical in scale, imagine how those numbers look for the entire fleet.

16 million—The number of deliveries UPS makes daily.

$30 million—The cost to UPS per year if each driver drives just one more mile each day than necessary. By that same logic, the company saves $30 million if each driver finds a way to drive one mile less.

85 million—The number of miles UPS' Jack Levis says the company's analytics tools are saving drivers per year.

100 million—The reduction in the number of minutes UPS trucks spend idling thanks in part, the company says, to onboard sensors that helped figure out when in the delivery process to turn the truck on and off.

200 million—The number of addresses mapped by UPS drivers on the ground.

15 trillion trillion—The number of possible routes a driver with just 25 packages to deliver can choose from. As illustrated by the classic traveling salesman problem, the mathematical phenomenon that makes figuring out the best delivery routes so difficult is called a combinatorial explosion.
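That last figure is just 25 factorial, the number of orderings of 25 stops, which a couple of lines of Python confirms:

```python
import math

# A driver with 25 stops can visit them in 25! possible orders: the
# combinatorial explosion behind the traveling salesman problem.
routes = math.factorial(25)
print(f"{routes:,}")     # 15,511,210,043,330,985,984,000,000
print(f"{routes:.2e}")   # ~1.55e+25, i.e. about 15 trillion trillion
```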

Saturday, June 15, 2013

Google's 'Loon'y Internet Balloons

Using helium-filled balloons launched from New Zealand and floating about 12 miles (20 km) up in the stratosphere, Google is testing Internet access delivered via remote-controlled balloons with onboard flight computers. The balloons control their location by changing altitude to pick up different wind patterns. The transmitter on each balloon beams Internet service down to an area of about 480 square miles (1,250 square kilometers), roughly one and a half times the land area of New York City, but requires a special transceiver on the ground. Connection speeds will be equivalent to 3G. Inter-balloon and ground-to-balloon communication will use the unlicensed ISM bands (specifically 2.4 and 5.8 GHz). You can follow the project's progress on g+.

Friday, June 14, 2013

Parking sensors, smart street signs and their implications for vehicle tracking systems

Imagine a robotic street sign that can anticipate points of interest for you as you approach it in your vehicle - using a host of information such as the time of day, holidays, weather, important local events, and location-relevant social media data, like the start (or end) of a nearby concert.


That's the concept of a company called 'Breakfast' and their product called 'Points': a smart, dynamically rotating set of digital signage.

In an ongoing effort to set up intelligent parking infrastructure, several cities have started installing smart parking sensors that indicate when parking spaces are taken, and when they have recently become available. These devices are embedded in the pavement to detect when a vehicle is parked in an individual space.

How are these and other related sensors going to impact drivers and vehicle tracking systems?

In the future, with a more prominent foothold for the 'Internet of things', machine-to-machine communication will be a given. Not only will the parking sensor know that you've parked there, it will be able to identify the vehicle's make and model, and in some cases the VIN as well. The sensor will be able to securely relay your associated credit card information to the electronic parking meter and verify that you don't have any outstanding parking or speeding tickets - while your in-vehicle navigation system suggests the closest parking spot to your destination, and your vehicle tracking service lets your company and your customer (or vendor/associate) know that you've arrived.
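As a sketch of what such a machine-to-machine exchange might look like, here is a hypothetical parking-event payload; the field names, schema, and VIN are entirely made up for illustration and do not reflect any city's actual protocol:

```python
import json
from datetime import datetime, timezone

# Hypothetical parking-event message a pavement sensor might publish
# when a vehicle occupies a space. All field names are illustrative.
event = {
    "sensor_id": "space-0042",
    "event": "occupied",
    "timestamp": datetime(2013, 6, 14, 9, 30, tzinfo=timezone.utc).isoformat(),
    "vehicle": {
        "make": "Ford",
        "model": "Transit",
        "vin": "1FTBW2CM0DKA00000",  # made-up VIN for illustration
    },
    "location": {"lat": 40.7128, "lon": -74.0060},
}

payload = json.dumps(event)
# A parking meter or city backend would parse and act on the message:
received = json.loads(payload)
print(received["event"], received["vehicle"]["make"])
```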

Vehicle tracking system providers will have the opportunity to take the lead in this technology revolution by providing a variety of communication system integration features (such as integrated GPS, ZigBee, RFID, OBD-II, and/or Bluetooth voice and data connections), acting as a communication hub, and retrofitting vehicles with tracking and communication technology that gives their owners relatively easy access to these systems. This will allow for smoother integration between the vehicular environment and the communication and automation devices that the driver and passengers are wearing or carrying.


'Smart hubs' will be able to rapidly increase the uptake of integration with systems like hand-held or wearable communication devices, automated road systems, and Internet parking - systems that cities will be actively (and in some cases aggressively) setting up to help manage services like parking and traffic control. These hubs will also handle crowd-sourced information: real-time traffic reporting by the vehicles themselves, vehicle health, accident reports, road or environmental hazards, and stolen vehicles. They could even provide access to social features, such as how close friends and family are in case you wish to set up an impromptu get-together - not to mention location-relevant ads for possible meeting places.

Why is all of this important? Because the underlying driver for this technology will be the ongoing pressure regarding energy use, increased efficiency, and decreasing impact on our environment - pressure that is going to continue to build over time. This kind of flexible technology will be able to help reduce overall costs while providing new amenities and efficiency to individuals, businesses, and governments alike.

Thursday, June 13, 2013

U.S. Supreme Court strikes down operational mandates set by the Port of Los Angeles that attempted to impose regulation on interstate commerce

In an opinion authored by Justice Kagan, a unanimous Supreme Court rejected the Port’s contention. 

The legal arguments concerned "concession agreements" that the Port made mandatory for drayage trucks performing short-haul movement of cargo in and out of the port, under the auspices of the "Clean Truck Program" implemented in 2007. The agreements required that a trucking company meet additional requirements for financial capacity, maintenance, and its employment of drivers. They also required that specific placards be placed on the vehicles with a phone number for reporting issues or concerns, and that the company submit a plan for off-street parking. An amended tariff made it a misdemeanor for terminal operators to grant access to any unregistered drayage truck.


Agreeing with the American Trucking Associations (ATA) on these rules, the Court concluded that, whatever the Port’s asserted motivation, the concession agreements amounted to “classic regulatory authority” and thus fell within the scope of the FAAAA’s preemption provision. It observed that the concession agreements, while technically contracts between the Port and trucking companies, were not the “result merely of the parties’ voluntary commitments.” Rather, the Port compelled trucking companies to enter into the contracts as a condition of access to the Port, by “wielding coercive power over private parties, backed by the threat of criminal punishment.” By imposing the concession agreements through coercion rather than “ordinary bargaining,” Los Angeles was “performing its prototypical regulatory role.”


A possible GPS-based technical solution to the conflict over the Blue Nile dam in Ethiopia

Ethiopia is building several hydro-electric dams throughout the country in an effort to provide reliable electrical power to its citizens, and to help generate income by selling power to some of its neighboring countries, including Sudan and Kenya as well as Djibouti.

The most contentious, though, is the Grand Ethiopian Renaissance Dam (GERD), currently under construction on the Blue Nile - once completed, it will be the largest dam in Africa, at a cost of 4.8B euros ($6.4B USD) and with an expected generating capacity of 6,000 MW. The Blue Nile and the White Nile meet near Khartoum in Sudan to form the Nile, which subsequently flows into Egypt. The Nile is considered the longest river in the world and impacts multiple countries; the Blue Nile originates in Ethiopia at Lake Tana. The Blue Nile provides the Nile with 85 percent of its overall flow, so it is clearly an important tributary. Egypt's 'war of words' over GERD oscillates between righteous indignation and outright threats over the building of the dam, with some Egyptian ministers actually going so far as to suggest air strikes against the structure. So far Ethiopia is standing its ground and, as shown in the picture above, has already diverted the Blue Nile in preparation for building the dam - which is estimated to be 21% complete.

Egypt claims that historical and colonial-era treaties grant it 70 percent of the water supply (55 billion cubic meters of water per year), and considers any significant loss of its water rights to be a matter of national security for its population of more than 84 million. The Nile provides Egypt with 90% of all of its natural water resources. In May 2010, five members of the Nile Basin Initiative signed a cooperative framework agreement to seek more water from the River Nile - a move strongly opposed by Egypt and Sudan.

In spite of the controversy, I suspect there will be a diplomatic set of solutions and compromises among all interested parties, as Egypt is an influential trading partner with Ethiopia, and throughout Africa and the Middle East. However, part of that overall diplomatic solution could be a technical one.

By using an intelligent 'Internet of things' strategy, buoys could be placed at strategic locations upstream and downstream of the dam to collect real-time readings on several items - the depth of the river, the speed of water flow, air and water temperature, pH, and so on - coupled with the specific GPS location of the buoy where the data is being collected. This data could then be provided to all interested parties for near real-time analysis to help show compliance with any compromise agreements, or with existing agreements regarding the responsible management of the Blue Nile's natural resources. The data and subsequent analysis can be used to show when use of the dam may be needed to help mitigate potential flooding, but can also be used to show real-time shortfalls in water supplies to downstream entities - while providing ongoing historical data about seasonal water supplies within the Nile basin.
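A minimal sketch of what one buoy's telemetry record might look like - the schema, buoy ID, and readings are hypothetical, not an actual Nile Basin data format:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical telemetry record for a river-monitoring buoy.
@dataclass
class BuoyReading:
    buoy_id: str
    lat: float           # GPS latitude, decimal degrees
    lon: float           # GPS longitude, decimal degrees
    depth_m: float       # river depth
    flow_m_per_s: float  # water flow speed
    water_temp_c: float
    air_temp_c: float
    ph: float

reading = BuoyReading(
    buoy_id="blue-nile-07",
    lat=11.215, lon=35.093,   # illustrative coordinates near the dam site
    depth_m=9.4, flow_m_per_s=1.8,
    water_temp_c=24.1, air_temp_c=31.5, ph=7.2,
)

# Published as JSON, the record could feed all parties' near real-time analysis.
print(json.dumps(asdict(reading)))
```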

Wednesday, June 12, 2013

Original GPS design team member shares a similar vision with Waze


Bradford Parkinson, a former US Air Force Colonel, professor emeritus at Stanford University, and integral member of the team that invented GPS technology, shared his vision of the future of GPS with CNN in a recent interview.

In the interview, he indicates that self-driving cars with GPS will be one of the next major steps forward for the technology - using, among other things, cooperation between cars.

This is similar to the 'crowd sourcing' of highway, traffic, and road condition information that Waze uses to help drivers navigate from location to location. It's no wonder that Google is snapping up the company for an estimated $1.3B, since it is already heavily invested in self-driving vehicle technology. Automated sharing of this kind of crowd-sourced information from other drivers or automated vehicles should only further advance the technology - and the Google patent base. The self-driving automobile market already has a high barrier to entry - Google is making sure it stays that way for the foreseeable future.
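A toy sketch of the crowd-sourcing idea: vehicles report their speed per road segment, and a service aggregates the reports into a live congestion estimate. The segment names, speeds, and threshold here are made up, not Waze's actual algorithm:

```python
from collections import defaultdict
from statistics import mean

# Toy aggregation of crowd-sourced speed reports, keyed by road segment.
reports = [
    ("I-280-N:exit-22", 61.0),
    ("I-280-N:exit-22", 58.5),
    ("I-280-N:exit-22", 12.0),   # one slow report among fast ones
    ("US-101-S:exit-4", 8.0),
    ("US-101-S:exit-4", 11.5),
]

by_segment = defaultdict(list)
for segment, mph in reports:
    by_segment[segment].append(mph)

for segment, speeds in sorted(by_segment.items()):
    avg = mean(speeds)
    status = "congested" if avg < 25 else "flowing"  # arbitrary cutoff
    print(f"{segment}: {avg:.1f} mph ({status})")
```

A real service would also need outlier rejection, time decay, and map matching of GPS fixes to segments, but the averaging step is the core of turning many noisy reports into one usable estimate.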

In September 2012, California legalized self-driving vehicles; the state is expected to produce a first draft of regulations and legislation for self-driving vehicles in 2015, with the first self-driving vehicles expected to hit the highways in 2016.

While experts have indicated that current self-driving systems are somewhat impractical - since they involve system technology costing nearly $70,000 - the CNN article reminds us that the first 'portable' GPS system was the 'Manpack', which weighed 40 pounds and cost over $400,000, while today's GPS devices can be "smaller than a fingernail and cost $1.50".