
Wednesday, March 26, 2014

Using Adaptive Notch Filters To Reduce The Effects Of GPS Jamming


As reported by Inside GNSS: GNSS jammers are small portable devices able to broadcast powerful disruptive signals in the GNSS bands. A jammer can overpower the much weaker GNSS signals and disrupt GNSS-based services in a geographical area with a radius of several kilometers. Despite the fact that the use of such devices is illegal in most countries, jammers can be easily purchased on the Internet and their rapid diffusion is becoming a serious threat to satellite navigation.

Several studies have analyzed the characteristics of the signals emitted by GNSS jammers. From the analyses, it emerges that jamming signals are usually characterized by linear frequency modulations: the instantaneous frequency of the signal sweeps a range of several megahertz in a few microseconds, affecting the entire GNSS band targeted by the device.

The fast variations of their instantaneous frequency make the design of mitigation techniques particularly challenging. Mitigation algorithms must track fast frequency variations and filter out the jamming signals without introducing significant distortions on the useful GNSS components. The design problem becomes even more challenging if only limited computational resources are available.

We have analyzed the ability of an adaptive notch filter to track fast frequency variations and mitigate a jamming signal. In this article, we begin by briefly describing the structure of the selected adaptive notch filter along with the adaptive criterion used to adjust the frequency of the filter notch.

When the adaptation parameters are properly selected, the notch filter can track the jamming signals and significantly extend the ability of a GNSS receiver to operate in the presence of jamming. Moreover, the frequency of the filter notch is an estimate of the instantaneous frequency of the jamming signal. Such information can be used to determine specific features of the jamming signal, which, in turn, can be used for jammer location using a time difference of arrival (TDOA) approach.

The capabilities of the notch filter are analyzed through a series of experiments performed in a large anechoic chamber. The experiments employed a hardware simulator to broadcast GPS and Galileo signals and a real jammer to disrupt GNSS operations. The GNSS and interfering signals were recorded using an RF signal analyzer and analyzed in post-processing. We processed the collected samples using the selected adaptive notch filter and a custom GNSS software receiver developed in-house.

The use of mitigation techniques, such as notch filtering, significantly improves the performance of GNSS receivers, even in the presence of strong and fast-varying jamming signals. The presence of a pilot tone in the Galileo E1 signal enables pure phase-locked loop (PLL) tracking and makes the processing of Galileo signals more robust to jamming.

Adaptive Notch Filter
Several interference mitigation techniques have been described in the technical literature and are generally based on the interference cancellation principle. These techniques attempt to estimate the interference signal, which is subsequently removed from the input samples. For example, transform domain excision techniques first project the input signal onto a domain where the interference signal assumes a sparse representation. (See the articles by J. Young et alia and M. Paonni et alia, referenced in the Additional Resources section near the end of this article.) The interference signal is then estimated from the most powerful coefficients of the transformed domain representation. The interfering signal is removed in the transformed domain, and the original signal representation is restored.

When the interfering signal is narrow band, discrete Fourier transform (DFT)-based frequency excision algorithms, described in the article by J. Young and J. Lehnert, are particularly effective. Transform domain excision techniques are, however, computationally demanding, and other mitigation approaches have been explored. For example, notch filters are particularly effective for removing continuous wave interference (CWI). M. Paonni et alia, cited in Additional Resources, considered the use of a digital notch filter for removing CWI, the center frequency of which was estimated using the fast Fourier transform (FFT) algorithm. Despite the efficiency of the FFT algorithm, this approach can result in a significant computational burden and alternative solutions should be considered.

The article by M. Jones described a finite impulse response (FIR) notch filter for removing unwanted CW components and highlighted the limitations of this type of filter. Thus, we adopted an infinite impulse response (IIR) structure and experimentally demonstrated its suitability for interference removal. In particular we considered the adaptive notch filter described in the article by D. Borio et alia listed in Additional Resources and investigated its suitability for mitigating the impact of a jamming signal.

This technique has been selected for its reduced computational requirements and for its good performance in the presence of CWI. Note that the notch filter under consideration has been extensively tested in the presence of CWI; however, its performance in the presence of frequency-modulated signals has not been assessed. Also, note that removing a jamming signal poses several challenges that derive from the swept nature of this type of interference. (For details, see the paper by R. H. Mitch et alia.)


Jamming signals are usually frequency modulated with a fast-varying center frequency. The time-frequency evolution of the signal transmitted by an in-car GPS jammer is provided as an example in Figure 1. The instantaneous center frequency of the jamming signal sweeps a frequency range of more than 10 megahertz in less than 10 microseconds. The adaptation criterion selected for estimating the center frequency of the jamming signal has to be sufficiently fast to track these frequency variations.

The notch filter considered in this work is characterized by the following transfer function:

Equation 1: H(z) = (1 - z0[n] z^-1) / (1 - kα z0[n] z^-1)

where kα is the pole contraction factor and z0[n] is the filter zero. kα controls the width of the notch introduced by the filter, whereas z0[n] determines the notch center frequency. Note that z0[n] is progressively adapted using a stochastic gradient approach, described in the textbook by S. Haykin, with the goal of minimizing the energy at the output of the filter. A thorough description of the adaptation algorithm can be found in the article by D. Borio et alia.
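To make the filter structure and its adaptation concrete, here is a minimal Python sketch of a single-pole adaptive IIR notch filter of this general form, with a normalized stochastic-gradient update of the zero that minimizes the output energy. It is an illustration only, not the authors' implementation: the function name, step size, and normalization are assumptions.

```python
import numpy as np

def adaptive_notch(x, k_alpha=0.9, mu=0.1, tz=0.75):
    """Sketch of a single-pole adaptive IIR notch filter.

    x       : complex baseband samples (GNSS signals + noise + jammer)
    k_alpha : pole contraction factor (controls the notch width)
    mu      : step size of the stochastic-gradient update of the zero
    tz      : threshold on |z0[n]| used to declare interference present
    Returns the filter output y, the zero trajectory z0, and a boolean
    mask marking the samples where |z0[n]| exceeded tz.
    """
    y = np.zeros(len(x), dtype=complex)
    z0 = np.zeros(len(x), dtype=complex)
    active = np.zeros(len(x), dtype=bool)
    z = 0j         # current filter zero
    xi_prev = 0j   # delayed output of the autoregressive (pole) section
    for n, xn in enumerate(x):
        xi = xn + k_alpha * z * xi_prev   # pole (denominator) section
        y[n] = xi - z * xi_prev           # zero (numerator) section
        # normalized stochastic-gradient step minimizing |y[n]|^2
        z = z + mu * y[n] * np.conj(xi_prev) / (abs(xi_prev) ** 2 + 1e-12)
        z0[n] = z
        active[n] = abs(z) > tz
        xi_prev = xi
    return y, z0, active
```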

The notch filter is able to place a deep null at the instantaneous frequency of narrowband interference and, if the zero adaptation parameters are properly chosen, to track the interference frequency variations. The energy of the filter output is minimized when the filter zero is placed at the jammer instantaneous frequency:

Equation 2: z0[n] = exp(j2π Φ(nTs) / fs)

where Φ(nTs) is the jammer instantaneous frequency and fs = 1/Ts is the sampling frequency.

This implies that z0[n] can be used to estimate the instantaneous frequency of the interfering signal. The magnitude of z0[n] also strongly depends on the amplitude of the interfering signal. Indeed, |z0[n]| approaches one as the amplitude of the jamming signal increases. Thus, |z0[n]| can be used to detect the presence of interference, and the notch filter activates only if |z0[n]| passes a predefined threshold, Tz. A value of Tz = 0.75 was empirically selected for the tests described in the following section.
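Continuing the hypothetical sketch above, the zero trajectory doubles as an instantaneous-frequency estimator (Equation 2) and an interference detector. The synthetic chirp below only loosely mimics Figure 1, and the sampling frequency and signal levels are assumptions, not values from the experiment.

```python
import numpy as np

fs = 20.0e6                               # assumed sampling frequency (Hz)
t = np.arange(400_000) / fs

# synthetic sawtooth chirp: roughly 10 MHz swept every 10 microseconds
period = 10.0e-6
f_true = -5.0e6 + 10.0e6 * ((t % period) / period)
phase = 2.0 * np.pi * np.cumsum(f_true) / fs
x = np.exp(1j * phase) + 0.05 * (np.random.randn(t.size)
                                 + 1j * np.random.randn(t.size))

y, z0, active = adaptive_notch(x, k_alpha=0.9, mu=0.1, tz=0.75)

# Equation 2: the angle of the zero tracks the jammer instantaneous frequency
f_hat = np.angle(z0) * fs / (2.0 * np.pi)

# activation logic: use the notch output only while |z0[n]| exceeds Tz = 0.75
out = np.where(active, y, x)
```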

Experimental Setup and Testing
To test the capability of the adaptive notch filter to mitigate against a typical in-car jammer, we conducted several experiments in a large anechoic chamber at the Joint Research Centre (JRC) of the European Commission.


Figure 2 provides a view of the JRC anechoic chamber where the jamming tests were conducted. The anechoic chamber offers a completely controlled environment in which all sources of interference besides the jammer under test can be eliminated.

The experimental setup is similar to that employed to test the impact of LightSquared signals on GPS receivers (for details, see the article by P. Boulton et alia listed in Additional Resources). We used a simulator to provide a controlled GPS and Galileo constellation, with a static receiver operating under nominal open-sky conditions. The GNSS signals were broadcast from a right hand circular polarization (RHCP) antenna mounted on a movable sled on the ceiling of the chamber. A survey grade GNSS antenna was mounted inside the chamber, and the sled was positioned at a distance of approximately 10 meters from this antenna. The GNSS receiving antenna was connected via a splitter to a spectrum analyzer, an RF signal analyzer, and a commercial high sensitivity GPS receiver. Table 1 lists the RF signal analyzer parameters.

To provide the source of jamming signals a commercially available (though illegal) in-car jammer was connected to a programmable power supply. We removed the jammer’s antenna and connected the antenna port, via a programmable attenuator with up to 81 decibels of attenuation, to a calibrated standard gain horn antenna. This gain horn was positioned at approximately two meters from the GNSS receiving antenna.

The goal of this configuration was to permit variation of the total jammer power received at the antenna.
Unfortunately, the jammer itself is very poorly shielded; so, a significant amount of the interfering power seen by the receiver was found to come directly from the body of the jammer, rather than through the antenna.
To minimize this effect, we exercised great care to shield the jammer as much as possible from the GNSS antenna. We placed the jammer body in an aluminum box, which was subsequently surrounded by RF absorbent material. The jammer body and the receiving GNSS antenna were separated by approximately 15 meters, thereby ensuring approximately 60 decibels of free space path loss.

The experiment was controlled via a PXI controller, which generated synchronous triggers for the RF data collection and simulator signal generation, controlled the power supplied to the jammer, and updated the attenuation settings according to a desired profile. All events (trigger generation, jammer power on/off, attenuation setting) were time stamped using an on-board timing module. The commercial receiver was configured to log raw GPS measurements including carrier-to-noise (C/N0) values.

The experimental procedure involved two trials, each lasting approximately 40 minutes. In the first trial, the simulator and data collection equipment were both enabled, but the jammer remained powered off. In the second trial, the same scenario was generated in the simulator, the data collection equipment was enabled and, after a period of three minutes, the jammer was powered on.

We initially set the attenuation to its maximum value of 81 decibels. We subsequently reduced this in two-decibel decrements to a minimum value of 45 decibels. We maintained each level for a period of 60 seconds. Finally, we again increased the attenuation in two-decibel increments to its maximum value. Figure 3 presents this attenuation profile.
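As a rough illustration, the stepped profile can be scripted as below; the list-based schedule is entirely hypothetical and is not the PXI control code used in the test. With 60 seconds per level the staircase lasts about 37 minutes, consistent with the roughly 40-minute trial that also included three minutes of jammer-off data.

```python
# Hypothetical reconstruction of the attenuation schedule shown in Figure 3:
# start at 81 dB, step down by 2 dB every 60 s to 45 dB, then step back up.
down = list(range(81, 44, -2))   # 81, 79, ..., 45 dB
up = list(range(47, 82, 2))      # 47, 49, ..., 81 dB
profile = down + up              # one 2-dB staircase down and back up

schedule = [(i * 60, atten_db) for i, atten_db in enumerate(profile)]
for start_s, atten_db in schedule[:3]:
    print(f"t = {start_s:4d} s  ->  attenuator = {atten_db} dB")
```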

We performed a calibration procedure whereby the total received jammer power at the output of the active GNSS receiving antenna was measured using a calibrated spectrum analyzer while the attenuation level was varied. Further, the total noise power was measured in the same 12-megahertz bandwidth with the jammer switched off. This permitted the computation of the received jammer-to-noise density power ratio (J/N0) as a function of the attenuator setting.
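The J/N0 computation itself is straightforward once the two power measurements are available. The sketch below assumes the jammer-on measurement is dominated by the jammer; the power levels in the example are illustrative, not the calibrated values from the test.

```python
import math

def j_over_n0_dbhz(p_jammer_dbm, p_noise_dbm, bw_hz=12.0e6):
    """Jammer-to-noise-density ratio (dB-Hz) from two power measurements.

    p_jammer_dbm : power measured with the jammer on (assumed jammer-dominated)
    p_noise_dbm  : noise power in the same bandwidth with the jammer off
    bw_hz        : measurement bandwidth (12 MHz in these tests)
    """
    n0_dbm_per_hz = p_noise_dbm - 10.0 * math.log10(bw_hz)  # noise density
    return p_jammer_dbm - n0_dbm_per_hz

print(j_over_n0_dbhz(-70.0, -95.0))   # illustrative inputs -> about 95.8 dB-Hz
```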

Figure 3 also shows the calibrated J/N0 at the output of the active GNSS antenna as a function of time. The analysis provided in the next section is conducted as a function of the J/N0.

Sample Results
This section provides sample results obtained using the adaptive notch filter described earlier. In particular, the loss in C/N0 experienced by the GPS and Galileo software receivers used for analysis is experimentally determined as a function of the J/N0.

The adaptive notch filter is used to reduce the C/N0 loss. Figure 4 shows the loss in C/N0 experienced in the presence of the jammer as a function of J/N0. The first curve arises from software receiver processing of the GPS signals, the second from software receiver processing of the Galileo signals, and the third from the commercial high sensitivity receiver that processed only the GPS signals.

Note the small difference between the GPS and Galileo results. This is to be expected due to the wideband nature of the jammer. In fact, for both GPS and Galileo processing the jammer is effectively averaged over many chirp periods, thereby giving it the appearance of a broadband (white) noise source. The one difference between the GPS and Galileo signals is that the tracking threshold of the Galileo signals is approximately six decibels lower than that for the GPS signals. This is due to the use of a pure PLL processing strategy using only the E1C (pilot) component of the Galileo signal.

The other interesting point to note from Figure 4 is that the commercial receiver exhibits better resilience against the jammer. This is most likely due to a narrower front-end bandwidth in the commercial receiver, although this cannot be confirmed because the receiver manufacturer does not provide this information.

From the time-frequency evolution of the jamming signal used for the experiment and shown in Figure 1, it emerges that the bandwidth of the jamming component is approximately 10 megahertz. If the commercial receiver had a smaller bandwidth, then it would effectively filter out some of the jammer power, thereby improving its performance with respect to the software receiver results.

Figure 4 provides an indication of the performance degradation caused by a jamming signal when no mitigation technique is employed. The notch filter is expected to improve the receiver performance. The improvement depends on the filter parameters and their ability to track the jammer’s rapid frequency variation.

Two configurations of the adaptive notch filter were tested: kα = 0.8 and kα = 0.9. The first configuration has a smaller contraction factor and, hence, a wider notch than the second.

The adaptive step size of the stochastic gradient algorithm was tuned for the jammer under consideration. (The adaptation of the filter zero must be fast to track the frequency variations of the jammer’s chirp signal.) In each case the magnitude of the zero of the notch filter was used as a detector for interference. We chose a threshold of 0.75 so that when the amplitude of the zero was greater than this threshold, the notch filter was enabled and the receiver processed this filtered data. Otherwise the receiver processed the raw data collected from the antenna.

Figure 5 and Figure 6 illustrate the results of the filtering for the two cases. In these plots, the upper portion shows the time evolution of the frequency content of the raw data, with the frequency estimate of the notch filter superimposed as a dashed red line. The lower plots show the time evolution of the frequency content of the filtered data. From these lower plots the wider notch appears to do a better job of removing the jammer signal. On the other hand, this will also result in a greater reduction of the useful signal power.

The effect of the notch filter on the reception of GNSS signals in terms of the C/N0 degradation is illustrated in Figure 7 and Figure 8 for Galileo and GPS signals, respectively. Again, the difference between the impact on GPS and Galileo signals is slight, due to the wideband nature of the interferer. On the other hand, the benefit of the notch filter is clear in both figures. The sidebar, “Track the Jamming Signal” (at the end of this article), provides access to data and tools with which readers can test different configurations of the notch filters themselves.

Interestingly, it appears that two limiting curves exist, one for the case of no filtering and one for the case where a notch filter is applied. The variation in the contraction factor (over the range considered) has little effect on the C/N0 effectively measured by the GPS and Galileo software receivers.

The separation between the two curves is approximately five decibels, i.e., the receiver that applies the notch filter experiences approximately five decibels less C/N0 loss than an unprotected receiver for the same J/N0. Of course, we must remember that this result applies for the data collection system considered in this test, which consists of a 14-bit analog-to-digital converter (ADC) with no automatic gain control (AGC). In commercially available receivers with a limited number of bits for signal quantization, the non-linear effects introduced by the combination of these two front-end components will likely lead to additional losses.

Conclusion
We have proposed an IIR adaptive notch filter as an easy-to-implement mitigation technique for the chirp signals typical of the commercially available jammers that have become increasingly common in recent years. A simple stochastic gradient adaptation algorithm was implemented, with an associated simple interference detection scheme. Our analysis showed that, for a receiver with sufficient dynamic range, the proposed technique leads to an improvement of approximately five decibels in terms of effective C/N0.
We tested the proposed scheme on data collected from a low-cost commercial jammer in a large anechoic chamber. We used a software receiver to process both GPS and Galileo signals. The broadband nature of the chirp signal means that its effect on GNSS signal processing is similar to an increase in the thermal noise floor. Hence, the impact is very similar on both GPS and Galileo receivers. On the other hand, the chirp signal is instantaneously narrowband, a feature that is exploited by the use of a notch filter with a highly dynamic response to variations in the frequency of the interferer.

Acknowledgment
This study is mainly based on the paper “GNSS Jammers: Effects and Countermeasures” presented by the authors at the Satellite Navigation Technologies and European Workshop on GNSS Signals and Signal Processing (NAVITEC), December 2012.

Additional Resources
[1]
Borio, D., Camoriano, L., and Lo Presti, L., “Two-pole and Multi-pole Notch Filters: A Computationally Effective Solution for GNSS Interference Detection and Mitigation,” IEEE Systems Journal, Vol. 2, No. 1, pp. 38–47, March 2008
[2]
Boulton, P., Borsato, R., and Judge, K., “GPS Interference Testing, Lab, Live, and LightSquared,” Inside GNSS, pp. 32-45, July/August 2011
[3]
Haykin, S., Adaptive Filter Theory, 4th ed., Prentice Hall, September 2001
[4]
Jones, M., “The Civilian Battlefield, Protecting GNSS Receivers from Interference and Jamming,” Inside GNSS, pp. 40-49, March/April 2011
[5]
Mitch, R. H., Dougherty, R. C., Psiaki, M. L., Powell, S. P., O’Hanlon, B. W., Bhatti, J. A., and Humphreys, T. E., “Signal Characteristics of Civil GPS Jammers,” Proceedings of the 24th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2011), Portland, OR, pp. 1907–1919, September 2011
[6]
Paonni, M., Jang, J., Eissfeller, B., Wallner, S., Avila-Rodriguez, J. A., Samson, J., and Amarillo-Fernandez, F., “Wavelets and Notch Filtering, Innovative Techniques for Mitigating RF Interference,” Inside GNSS, pp. 54–62, January/February 2011
[7]
Young, J. and Lehnert, J., “Analysis of DFT-based Frequency Excision Algorithms for Direct Sequence Spread-Spectrum Communications,” IEEE Transactions on Communications, Vol. 46, No. 8, pp. 1076–1087, August 1998

Could Future Gasoline Engines Emit Less CO2 Than Electric Cars?

As reported by Green Car Reports: Is it possible to make a gasoline engine so efficient that it would emit less carbon dioxide per mile than is created by generating electricity to run an electric car over that same mile?


Small Japanese carmaker Mazda says yes.
In an interview published last week with the British magazine Autocar, Mazda claimed that its next generation of SkyActiv engines will be so fuel-efficient that they'll be cleaner to run than electric cars.

That's possible. But as always, the devil is in the details.

Specifically, total emissions of carbon dioxide (CO2) in each case depend on both the test cycles used to determine the cars' emissions and the cleanliness of the electric generating plants used to make the electricity.

In the U.S., the "wells-to-wheels" emissions from running a plug-in electric car 1 mile on even the dirtiest grids in the nation (North Dakota and West Virginia, which burn coal to produce more than 90 percent of their power) equate to those from the best non-hybrid gasoline cars: 35 miles per gallon or more.
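A back-of-the-envelope comparison shows why roughly 35 mpg is the break-even point on a coal-heavy grid. The consumption and emissions factors in the sketch below are common rough approximations, not figures from the article.

```python
# Rough wells-to-wheels comparison; every factor here is an approximation.
GASOLINE_G_CO2_PER_GALLON = 8_887   # tailpipe CO2 per gallon of gasoline
UPSTREAM_FUEL_FACTOR = 1.25         # ~25% extra for extraction and refining
EV_KWH_PER_MILE = 0.33              # wall-to-wheels consumption, typical EV
COAL_GRID_G_CO2_PER_KWH = 950       # roughly, for a coal-dominated grid

def gas_car_g_per_mile(mpg):
    return GASOLINE_G_CO2_PER_GALLON * UPSTREAM_FUEL_FACTOR / mpg

def ev_g_per_mile(grid_g_per_kwh=COAL_GRID_G_CO2_PER_KWH):
    return EV_KWH_PER_MILE * grid_g_per_kwh

print(round(gas_car_g_per_mile(35)))  # ~317 g CO2/mile for a 35 mpg car
print(round(ev_g_per_mile()))         # ~314 g CO2/mile for an EV on coal power
print(round(gas_car_g_per_mile(50)))  # ~222 g CO2/mile for a ~50 mpg engine
```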

The U.S. average for MPG equivalency is far higher, however, and it's roughly three times as high--near 100 mpg--for California, the state expected to buy as many plug-in cars as the next five states combined.
In Europe, however, 35 mpg is a perfectly realistic real-world fuel efficiency for small diesel cars (generally compacts and below). And their official ratings are often higher still.

European test cycles for measuring vehicle emissions (which translate directly to fuel efficiency) are gentler than the adjusted numbers used in the U.S. by the EPA to provide gas-mileage ratings.

On the generation side, some European countries use coal to produce a large proportion of their national electricity. (Some also buy their natural gas from Russia, a supplier that may appear more problematic today than in years past.)

So if Mazda can increase the fuel economy of its next-generation SkyActiv engines by 30 percent in real-world use, as it claims, it's possible that its engines might reach levels approaching 50 mpg or more--without adding pricey hybrid systems.

And those levels would likely be better than the wells-to-wheels carbon profile of an electric car running in a coal-heavy country--Poland, for example.

Mazda will raise its current compression ratio of 14:1 to as much as 18:1 and add elements of homogeneous charge-compression ignition (HCCI) to its new engines.

The HCCI concept uses compression itself to ignite the gas-air mixture--as in a diesel--rather than a spark plug, improving thermal efficiency by as much as 30 percent, though so far only under light loads.
Mazda's next round of SkyActiv engines won't emerge until near the end of the decade, "before 2020." Even its current-generation diesel models still haven't been launched in the U.S.

With rising proportions of renewable sources like wind and solar, and perhaps more natural gas, some European grids will then be cleaner than they are today--making the comparison tougher for Mazda.

But the company's assertion is at least plausible. We'll wait for actual vehicles fitted with the new and even more efficient engines to emerge, and see how they compare to the latest grid numbers then.

Electric-car advocates may be tempted to pooh-pooh any vehicle with any tailpipe emissions. Or they may point out that electric-car owners in the U.S. appear to have solar panels on their homes at a much higher rate than the country at large--meaning much of their recharging is done with virtually zero carbon emitted.

But every effort to reduce the carbon emissions per mile of the trillions of miles we drive globally every year is a step in the right direction.

Will Mazda lead the march along that path? We look forward to learning more about its next SkyActiv engines.

Boom In Sale Of Telematics Solutions Predicted

Vehicle tracking using OBDII port devices is becoming more popular due to easy installation and the ability to track personal vehicles and/or rental/leased vehicles.
As reported by Out-Law.com: Sales of telematics devices are expected to increase by more than 1000% over the next five years, according to a new study.

Technology market analysts ABI Research said that 117.8 million 'onboard diagnostics (OBD) aftermarket telematics solutions' are expected to be subscribed to in 2019, up from 9.5 million this year.
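Those two figures imply roughly a twelvefold increase over five years, or about 65 percent compound annual growth; a quick check:

```python
subs_2014 = 9.5e6
subs_2019 = 117.8e6
growth = subs_2019 / subs_2014 - 1.0             # total growth over five years
cagr = (subs_2019 / subs_2014) ** (1 / 5) - 1.0  # compound annual growth rate
print(f"total growth: {growth:.0%}")             # ~1140%, i.e. "more than 1000%"
print(f"CAGR: {cagr:.0%}")                       # ~65% per year
```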

It said the demand for solutions that plug in to the OBD port built into vehicles is strongest in European and North American markets at the moment but that smartphone applications will provide increasing competition within the telematics technology market.
Smartphone HTML5 applications are being used more commonly to provide detailed information not only on a vehicle's location but also to gather and process sensor information.
"Beyond the 2019 forecast horizon, the window of opportunity for OBD-dongles will gradually close as open factory-installed OEM (original equipment manufacturer) telematics becomes more widespread," ABI Research said in a statement.
"OBD solutions will also face competition from aftermarket telematics solutions based on smartphones connecting directly to the vehicle OBD port via Bluetooth or Wi-Fi adapters. Even standalone smartphone applications are starting to be explored for applications such as UBI (usage-based insurance) and driver behavior monitoring of truck drivers leveraging the built-in GPS, accelerometer, and connectivity," it said.
'Telematics' is a term most commonly associated with the motor insurance industry, though in general it refers to any system that collects remote or mobile sensor data. Insurance companies are increasingly recording information via devices in cars that allow them to set insurance premiums that reflect the driving style of motorists. Using recorded data, companies are able to pinpoint specific driver or vehicle risks rather than using more generalized area statistics.
Earlier this year competition law expert Natasha Pearman of Pinsent Masons, the law firm behind Out-Law.com, warned insurance companies to be careful to ensure that arrangements governing how telematics data is gathered, managed and accessed do not fall foul of competition and privacy laws.

Tuesday, March 25, 2014

2014: GPS Modernization Stalls

As reported by Inside GNSS: With the optimism of college-bound seniors touring the Ivy League, GPS managers have been weighing options to dramatically change the GPS constellation. Now, after studying the costs, considering the benefits, and assessing the funding climate, officials have made the starkly fiscal decision to stick close to home and take a few extra years to finish.

Although the final decisions will not be made until sometime this spring, proposals for a distinctly different type of GPS constellation appear to be off the table, sources tell Inside GNSS. The plan now appears to forego any major shift in the design of the satellites such as those proposed in Lower Cost Solutions for Providing Global Positioning System (GPS) Capability, an Air Force report delivered to Congress last April. 

The established course of modernization will proceed largely unchanged, say sources, but it will take longer to build and launch the GPS III satellites and add the new signals. Full implementation of the new military M-code, for example, will be pushed back roughly four years at least, noted one source. Those waiting for the new civil signals will also have to be patient. 

The current course of action will likely raise the total cost of the modernized system although the higher costs will be spread out over time in a way that fits more appropriately into budgets constrained by sequestration and the overall post-war downsizing taking hold at the Pentagon. 

“We’ll pay a unit price that’s a little higher, but just like when you buy your car on payments, you pay more for the car but you have cash flow management,” Maj. Gen. Robert McMurry, director for Air Force space acquisition told reporters during a briefing on the military space budget. 

The fiscal year 2015 (FY15) budget McMurry was describing reflects the decisions so far. The FY15 request for money to procure GPS III satellites is just $292.397 million — down dramatically from the White House’s request for $477.598 million in FY14 and roughly half of the $531 million the White House projected it would need for the program just last year. It is less even than the $450.598 million allocated by Congress. 

The administration is also asking $212.571 million for GPS III development, somewhat less than the $221.276 million requested last year and only slightly smaller than the $215 million that was projected to be needed last year. The request is a bit more than the $201.276 million Congress appropriated for FY14. 

The request to procure Block IIF satellites is about the same as last year — $52.09 million — but the new ground control segment faces a reduction. Whether it is a big or small cut depends on how you look at it. The White House is asking for $299.76 million in FY15 for the Next Generation Operational Control System or OCX. That is down sharply from its $383.5 million request last year and the congressionally approved FY14 amount of $373.5 million. 

In the projections that accompany each budget, however, the White House last year only anticipated asking for $303.5 million for FY15. In fact, both the FY14 and FY15 budgets project falling GPS allocations for the next several years. This is despite the fact that the program has been experiencing delays and challenges — the sort of things that normally add to the cost. How these two trends will mesh in the end is unclear. 

Implications 
Where it had planned to buy two GPS IIIs this year, the Air Force now plans to buy only the ninth in the series and put down money for long lead items on Space Vehicle 10 (SV10). It will then buy just one new satellite next year and three each year for the next three years, according to McMurry. 

The launches will be stretched out a bit as well. Five GPS missions will see their booster procurement moved past 2017, said McMurry. That has implications for Department of Defense (DoD) plans to introduce more competition into the launch procurements. 

“Those five will still be available for competition,” he said. “It’s to be determined how we’ll do that competition. It will be part of the phase two, and they’re working on that strategy.” 

The long life of the existing satellites is making it possible to find savings without undermining the quality of the GPS system, officials said. 

“(The) satellites are living longer than we predicted so we didn’t need to replenish those as fast as we originally planned. And it made no sense to spend that money if we didn’t need the satellites,” said Troy Meink, deputy under secretary of the Air Force for space. 

The Constellation 
Less clear is how many satellites the Air Force plans to have in the constellation over the long run. 
The FY15 budget “reprofiles Global Positioning System, GPS III, to meet constellation sustainment demands,” said Under Secretary of the Air Force Eric Fanning. 

“Sustainment,” however, does not necessarily mean supporting the constellation in its current configuration, which stands at 31 satellites plus spares. While having more satellites is advantageous to those on the ground, it is not strictly necessary according to Air Force mandates. 

“The requirement is 27 satellites to maintain 24,” McMurry told the audience at a March 7 breakfast on Capitol Hill sponsored by the Mitchell Institute. “I think our approach will be to absolutely assure that requirement, which meets that mandated performance, but in doing so I will expect that we’ll, in reality, maintain a slight surplus to that as we move on.” 

“It is clear that they are going to stay with the ‘enhanced 27,’ which is 27+3 for as long as they can,” said a source familiar with the issue. “And even if for some reason those (satellites) that are turned on now drop out, they’ve got as many as five that they can turn back on.” 

Those backup satellites just have “to last until the next launch,” said the source, suggesting it is realistic that the Air Force will be able to keep the number of satellites up even though the spacecraft have been operating far past their design lives. 

The question is whether the next launch, or launches, can happen fast enough. The existing satellites were launched in clusters and are therefore at risk of failing in groups as they age out. Will the Air Force have the satellites it needs and the lift capacity required to deal with a sudden, rapid loss of the older spacecraft? 
It appears that the answer may be “Yes.” 

Dual Launch After All 
Inside GNSS has learned that, even though the Pentagon has slashed the funding for dual launch, plans are in the works to enable the lighter payloads and dual-launch capability needed to rapidly refresh the constellation. 

The United Launch Alliance, a 50/50 joint venture between Lockheed Martin and Boeing, is stepping in to finish the work. “ULA is funding the launch vehicle development work that will enable dual launches of GPS III and other potential satellites, with a planned first launch capability in 2017,” the company said in a written response to a question from Inside GNSS.

And the Air Force will continue to fund the technology needed on the satellites to make dual launch possible. 

“We have maintained the development of dual-launch capability within the satellite line,” McMurry said. “The satellite itself will have the dual transponders and radio frequencies in place so that, if you launch two of them together, you could communicate with both of them independently.”

Slimmer Sats in a Pinch
The satellites might still be too heavy to launch two at a time unless some other changes are made. 
To address that, said a source, military managers are planning to build flexibility into the GPS III satellites that will enable the service to drop the Nuclear Detonation Detection System or NDS payload from the GPS III satellite if necessary. 

“They are going to ensure that they have the option to fly GPS III without the NDS,” said the source, adding that, without NDS, “should there be the requirement . . . to rapidly replace the constellation, they will actually be able to launch two.” 

The built-in flexibility also means the GPS program will not have to plan around any delays in the new NDS payload — which is on schedule but still needs a good deal of testing, the source said. 

Dual-launch capability and potentially lighter satellites are a variation on a far more ambitious proposal to develop a stripped-down version of the GPS III spacecraft that would not carry the NDS and provide fewer signals. These smaller satellites would likely have been used in combination with some of the larger, regular GPS spacecraft. 

The austere NavSats, or “NibbleSats,” would have been cheaper to build and could have been launched at least two at a time — perhaps even in clutches of three or four — dramatically reducing the cost of maintaining the constellation. But the proposal was set aside after hitting a number of stumbling blocks involving funding and contract management, sources told Inside GNSS.

“I think the pressure to reduce the cost of the satellite is very much there,” said an expert. “Obviously anything the Air Force can do to drive the cost of the satellite down — I think they’re still alert to those opportunities. But I think the Congress has made it very difficult for them to create a ‘new start.’”

Not only is it hard to get money for new starts (new programs); the rules governing the seemingly endless rounds of Congressional budget extensions mean the program could have become tangled in delays, said Stan Collender, national director of financial communications for Qorvis Communications and an expert on the federal budget. 

“Under a continuing resolution,” said Collender, “new starts are prohibited.” 

Moving to small satellites would also have been such a big change that it likely would have required the Air Force to recompete the GPS III contract. New procurements face a long process fraught with complexities and the uncertainties created whenever new contractors enter a program where legacy satellites have been in place so long that they are frequently referred to jokingly as being “old enough to vote” — that is, have reached 18 years of age. 

The prospect of reopening the contract is why “NavSats, small sats, little sats — that stuff is off the table,” said a source, who has been following the issue. 

Thrifty Innovation
That does not mean the GPS III program will proceed without a few enhancements. 

Col. Bill Cooley, director of the GPS Directorate, said the Air Force is “looking at a design turn” and is examining the use of better solar panels and traveling wave tube amplifiers, or TWTAs (pronounced “TWEE-tas”), which could reduce power requirements. 

Changing the batteries is also under consideration, according to sources. One expert pointed out that some of the older battery technology is not even available anymore. Another source said that lithium-ion batteries were being considered. 

“The decisions are in the process of being made,” Cooley told Inside GNSS. That decision process may be playing a role in the delay of the first GPS III satellite, which is now not expected to be ready until FY16. 
The design of the first satellite is supposed to be a template of sorts for the rest, and Cooley made it clear to the Mitchell Institute audience that he wanted the design to be “repeatable.” 

Right now, however, interference problems within the satellite’s navigation payload have to be resolved. The problems in the payload, which is being built by Virginia-based Exelis, have already been cited as a reason for delays in the program. 

The GPS IIIs must generate and transmit eight signals — legacy military P-code on L1 and L2 frequencies and civil C/A-code on L1, as well as the new dual-frequency M-code, and civil L1C, L2C, and L5 signals. In order to keep the timing and everything synchronized, there is “one critical box” involved in their generation, said Cooley. 

“There’s a whole bunch of techniques you can use,” Cooley told Mitchell Institute attendees. You can use absorptive material, he said, redesign some of the boards or separate the signals by putting them in separate boxes. “All options are on the table.” 

Sources confirm that the Air Force seems to have workable solutions in hand. Although those solutions may need extensive testing, the potentially two-year delay could also give the program time to work in some of the innovations mentioned previously. Program managers may also just be working hard to get the most out of the new satellites. 

“The GPS IIIs have a design life of 15 years,” said Cooley. “That’s a real challenge — to get 15 years in that harsh environment. We hope that they will last much longer. We hope that we can get a satellite that can vote and drink and all those kinds of things.” 

What the changes, delays, and development problems mean for GPS III prime contractor Lockheed Martin is unclear. 

“Over the next few weeks we will review the budget in detail to understand the specific impacts to our business,” the company said in a written response to Inside GNSS. “We look forward to working with the administration and Congress over the coming months as budget discussions continue.” 

OCX and Civil Money
What the delay in the first GPS III satellite means to Raytheon is that the OCX prime will have more time to work on ground segment modernization that has already slipped behind schedule, said McMurry, who cited slippage in the initial GPS III delivery as the biggest reason for a new delay in the OCX program. 

OCX program managers may get more time still if all the money from the civil side of the GPS program does not come through. 

As noted earlier, the White House scaled back its FY15 defense budget request for OCX to $300 million. This is just under the $303.5 million that was projected to be needed this year, even though the program has been experiencing difficulties. This is the request for the Defense Department. OCX also gets part of its money from the U.S. Department of Transportation (DoT), and a long string of underpayments from DoT has put GPS program managers in a position where they will be forced to reprogram OCX, adding some six months to the schedule and tens of millions to DoT’s bill, according to an expert with knowledge of the issue. 

The White House gave DoT the responsibility for funding those parts of the GPS program needed by civil users, and DoT handed the Federal Aviation Administration (FAA) the actual funding task. 

The FAA has largely failed, however, to persuade Congress to allocate the money for the civil funding. This has forced the agency, which has cost overruns on other programs, to short its payments to DoD for the last several years. 

The FAA is now trying to make up for those too-small payments but “it’s not pretty,” said the source, who spoke on condition of anonymity. 

The Administration’s FY15 budget request for civil funding actually jumped from the $20 million requested last year to $27 million — better, but still a far cry from the $40 to $50 million that was supposed to be allocated each year for five years. Even so, if FAA convinces Congress to approve the whole request it will be a dramatic improvement over the scant $6 million it got for FY14. 

Failure to win over lawmakers could have significant consequences. 

Up to now the Air Force has been able to manage around the budget shortfalls and keep things more or less on track. With sequestration and other cuts coming out of DoD space programs, GPS program managers are no longer in a position to finagle funding for civil capabilities. 

The source told Inside GNSS that, should the FAA fail to secure adequate monies, the OCX program will have to be stretched out some six months at considerable expense — money that FAA will also be expected to make up. The total bill for what FAA owes plus the added cost of delay would top $100 million, said the source — nearly four times the current budget request. 

DoD and FAA will have to work together pretty closely to manage during these austere times, said the source. “When you get cut to the bone you have to work pretty closely just to survive.” 

Sequestration Looms 
The budget crunch could get even worse if sequestration is applied without changes in 2016.
 
Military officials made it clear that they are assuming sequestration will not resume in full in 2016, as is now the law. If it should reemerge, said Fanning, “we would be unable to procure one of the three GPS III satellites planned in FY17.” 

Still unclear is how the launch rate or other aspects of the program would be affected, but it is clear, said Collender, that budget politics mean sequestration is likely to remain a factor. He estimated a 3-out-of-4 chance that the cuts will return in full force in the 2016 budget. 

“It will be a presidential year,” he said. “You’ll have Republicans that don’t want to be blamed for increasing spending; so, spending cuts might be in place. There’s probably a 75 percent chance that the 2016 sequester stays in place as is.” 

That could force still more changes to the GPS program and perhaps even reopen consideration of a GPS III redesign, experts hinted. Asked what options the Air Force was looking at for the long term, Cooley left the door open. 

“What options are we not looking at?” he said. 

Does Google Glass Distract Drivers? The Debate Continues

As reported by NPR: Shane Walker hops into his Toyota Prius hybrid and puts on his Google Glass. It's a lightweight glasses frame with a tiny computer built into the lens.

Google is at the forefront of a movement in wearable technology, gadgets we put on our bodies to connect us to the Internet, and perhaps nothing embodies that more than Glass. But the eyewear is raising eyebrows outside the high-tech industry. Before Glass even hits stores, lawmakers in several states want to ban it on the roads.

Walker, an independent developer living in San Francisco, turns on the GPS app and starts driving. Instead of talking out loud, like an app on a smartphone might, it shows him his route as a thin blue line and a triangle on the upper right corner of the lens.

"Google did a good job of making it nonintrusive, so it's not directly in your line of sight," he says.
But Walker's favorite feature is the camera. Say you're on a road trip. With a tap of the side, you can record the entire thing in decent resolution and then, with another tap, share it with your friends. Or you can wink and take a picture.

At a stop sign, Walker strokes the Glass frame with his right index finger. He's flipping through stored photos. The movement is so discreet — no bending his neck down like you would with a smartphone — and I have to ask him: "Is it something you would do if there was a police officer right in front of you?"
"I mean, it's debatable," he replies. "It is hands-free, so I do feel like in my legal right, it's OK for me to interact with stuff that doesn't require my hands, like winking, taking pictures."

Legislative Battle Over Public Safety
Ira Silverstein, a Democratic state senator from Illinois, disagrees. "Yeah, it's hands-free, but it can affect your vision," he says.

He's written a bill that says using Glass distracts drivers. "The first offense would be a misdemeanor. The second offense, if, God forbid, it causes death, could be a felony."

Leading car insurance companies have not yet taken a position on Glass, but at least eight states have proposed legislation banning the use of Google Glass on the road. In West Virginia, Republican state Delegate Gary Howell says lawmakers need to act before Glass gets out of hand.
Glass is the ultimate multitasking machine. It streams incoming emails and scans the human eyelid for commands. But Howell says its high-tech creators aren't seeing a basic fact about the real world.

"Have they driven on mountain roads in West Virginia, where you've got one 15-mile-an-hour turn after another one, where you really need to be concentrating on what you're doing?" he says. "You could be wearing it, not looking at your driving but watching a video screen."

Google is responding to this roadblock by sending lobbyists around the country to dispel concerns. Spokesman Chris Dale says Glass can help drivers.

"It's actually not distracting, and it allows you — rather than looking down at your phone, you're looking up and you're engaging with the world around you," Dale says. "It was specifically designed to do that: to get you the technology you need, just when you need it, but then to get out of your way."

Not Texting, But Text
Back in Walker's car, Glass does something that a smartphone can't do. We turn a corner past a golden fire hydrant, and obscure facts suddenly start streaming in front of Walker's iris.

"When San Francisco burst into flames in the days following the disastrous 1906 earthquake, much of the city's network of fire hydrants failed," Walker reads. "Miraculously, this fire hydrant, nicknamed 'The Little Giant,' is said to have been the only functional ..." He goes on reading like this for about half a minute.

"Are you reading all of that from the upper right-hand corner of your eye?" I ask.

"Yeah," he says. "It's pretty cool. It's like text just floating in air."

Walker has a theory about why the text is not distracting him: "The layer is transparent, so your eye does a good job of seeing through it while also staring at it."

Earl Miller, a professor of neuroscience at MIT who specializes in multitasking, says this sounds like wishful thinking.

"You think you're monitoring the road at the same time, when actually what you're doing [is] you're relying on your brain's prediction that nothing was there before, half a second ago — that nothing is there now," he says. "But that's an illusion. It can often lead to disastrous results."

In other words, the brain fills in the gaps in what you see with memories of what you saw a half-second ago. Among scientists, that statement is not controversial. The politics of Google Glass — and where it's worn — clearly is.

Monday, March 24, 2014

LA Police Argue All Vehicles Are Under Investigation

As reported by Gizmodo: Do you drive a car in the greater Los Angeles Metropolitan area? According to the L.A. Police Department and L.A. Sheriff's Department, your car is part of a vast criminal investigation.

The agencies took a novel approach in the briefs they filed in EFF and the ACLU of Southern California's California Public Records Act lawsuit seeking a week's worth of Automatic License Plate Reader (ALPR) data. They have argued that "All [license plate] data is investigatory." The fact that it may never be associated with a specific crime doesn't matter.

This argument is completely counter to our criminal justice system, in which we assume law enforcement will not conduct an investigation unless there are some indicia of criminal activity. In fact, the Fourth Amendment was added to the U.S. Constitution exactly to prevent law enforcement from conducting mass, suspicion-less investigations under "general warrants" that targeted no specific person or place and never expired.

ALPR systems operate in just this way. The cameras are not triggered by any suspicion of criminal wrongdoing; instead, they automatically and indiscriminately photograph all license plates (and cars) that come into view. This happens without an officer targeting a specific vehicle and without any level of criminal suspicion. The ALPR system immediately extracts the key data from the image—the plate number and time, date and location where it was captured—and runs that data against various hotlists. At the instant the plate is photographed not even the computer system itself—let alone the officer in the squad car—knows whether the plate is linked to criminal activity.
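That capture-then-compare flow is simple to picture in code. The record fields, plate numbers, and hotlist in the sketch below are entirely hypothetical, meant only to illustrate the step being described, not any agency's actual system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    """One ALPR capture: the plate text plus when and where it was seen."""
    plate: str
    timestamp: datetime
    latitude: float
    longitude: float

# hypothetical hotlist of flagged plate numbers
hotlist = {"7ABC123", "4XYZ789"}

def check_read(read: PlateRead, hotlist: set) -> bool:
    """Return True if the captured plate matches a hotlist entry.

    Note that the read is stored either way; that indiscriminate
    retention is exactly the practice at issue in the lawsuit.
    """
    return read.plate in hotlist

read = PlateRead("5DEF456", datetime(2014, 3, 24, 14, 30), 34.05, -118.25)
print(check_read(read, hotlist))   # False -> no hit, yet the record is kept
```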

Taken to an extreme, the agencies' arguments would allow law enforcement to conduct around-the-clock surveillance on every aspect of our lives and store those records indefinitely on the off-chance they may aid in solving a crime at some previously undetermined date in the future. If the court accepts their arguments, the agencies would then be able to hide all this data from the public.

However, as we argued in the Reply brief we filed in the case last Friday, the accumulation of information merely because it might be useful in some unspecified case in the future certainly is not an "investigation" within any reasonable meaning of the word.

LAPD and LASD Recognize Privacy Interest in License Plate Data

In another interesting turn in the case, both agencies fully acknowledged the privacy issues implicated by the collection of license plate data.  LAPD stated in its brief:
"[T]he privacy implications of disclosure [of license plate data] are substantial. Members of the public would be justifiably concerned about LAPD releasing information regarding the specific locations of their vehicles on specific dates and times. . . . LAPD is not only asserting vehicle owners' privacy interests. It is recognizing that those interests are grounded in federal and state law, particularly the California Constitution. Maintaining the confidentiality of ALPR data is critical . . . in relation to protecting individual citizens' privacy interests"
The sheriff's department recognized that ALPR data tracked "individuals' movement over time" and that, with only a license plate number, someone could learn "personal identifying information" about the vehicle owner (such as the owner's home address) by looking up the license plate number in a database with "reverse lookup capabilities such as LexisNexis and Westlaw."

The agencies use the fact that ALPR data collection impacts privacy to argue that—although they should still be allowed to collect this information and store it for years—they should not have to disclose any of it to the public. However, the fact that the technology can be so privacy invasive suggests that we need more information on where and how it is being collected, not less. This sales video from Vigilant Solutions shows just how much the government can learn about where you've been and how many times you've been there when Vigilant runs their analytics tools on historical ALPR data. We can only understand how LA police are really using their ALPR systems through access to the narrow slice of the data we've requested in this case.

We will be arguing these points and others at the hearing on our petition for writ of mandate in Los Angeles Superior Court, Stanley Mosk Courthouse, this coming Friday at 9:30 AM.

Apps By The Dashboard Light

As reported by MIT Technology Review: Starting next month, many car buyers will be getting a novel feature: Internet connections with speeds similar to those on the fastest smartphones—and even a few early dashboard-based apps, engineered to be as dumbed-down as possible.

Backseat passengers could get streaming movies and fast Wi-Fi connections to smart watches and tablets in (and near) the car. For drivers, high-resolution navigation maps would load quickly, and high-fidelity audio could stream from Internet radio services. But the first dashboard apps will be limited, spare versions of familiar ones like the Weather Channel, Pandora, and Priceline.

The first U.S. model with the fast wireless connection—known as 4G LTE, around 10 times faster than 3G connections—is expected to be the 2015 Audi A3, which goes on sale next month for a starting price of $29,900. Data plans will cost extra—an average of around $16 a month.

GM says it expects to sell 4G-equipped 2015 Chevrolets and other models starting in June. Many other carmakers, including Ford and Toyota, are following suit, both in the U.S. and worldwide, using partnerships with wireless carriers to deliver the connectivity.

By providing apps, carmakers see an opportunity for product differentiation and steady revenue streams. They also suggest that connectivity can lead to new safety features, and that using these onboard services will be safer than furtively glancing at phones.

But when drivers browse the GM AppShop, they shouldn’t expect what they get on an iPhone or a Galaxy phone. GM expects to provide just 10 apps initially, most of them mapping, news, and radio services.



That’s partly because the automaker’s screening process for apps is brutal, says Greg Ross, director of product strategy and infotainment for GM vehicles. “They go through rigorous safety and security standards,” he says. “And since it’s pulling data from the car, it’s locked down before it ever gets into the vehicle.”

As a result, the technology and interface need to be almost as simple as an analog radio knob, says Bruce Hopkins, cofounder of BT Software, based in San Diego. He is one of a very few developers whose apps will be available in GM cars.

Called Kaliki, BT Software’s app provides audio readings of stories—done by humans, not text-to-speech software—pulled from mainstream publications such as USA Today and TV Guide, as well as podcasts from radio and TV stations. (Its advantage over the radio? “Radio has been around for the last eight decades, and you still can’t pause it,” he says.)

Hopkins followed detailed rules from GM—no pinch-zoom controls or tiny icons allowed, for example—and spent two years developing the app, including time in a test facility in Detroit. “One of the terms GM talks a lot about is driver workload,” he says. “You cannot have anything that would require the driver to have several different things they have to think about. At the end of the day, they want something that works as simple as the regular radio.”

The apps know if you are driving. Drivers will never be able to open a “terms and conditions” screen—or play a game, assuming games ever come—unless the vehicle’s transmission is in “park.”
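A gate of that kind could be as simple as the sketch below; the screen names, gear states, and function are hypothetical, not GM's actual API.

```python
# Hypothetical sketch of gating interactive screens on the transmission state.
BLOCKED_WHILE_DRIVING = {"terms_and_conditions", "game", "text_entry"}

def can_open(screen: str, gear: str) -> bool:
    """Allow distraction-prone screens only when the transmission is in park."""
    if screen in BLOCKED_WHILE_DRIVING:
        return gear == "park"
    return True

print(can_open("terms_and_conditions", "drive"))  # False: blocked while driving
print(can_open("terms_and_conditions", "park"))   # True
print(can_open("now_playing", "drive"))           # True: simple screens allowed
```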

Despite the hurdles, 4,000 developers have registered with GM’s app store, because the payoff could be large for them: getting their apps included in a car could help them market versions that work on smartphones. And apps in cars command much more attention if they are among just a few that a driver can choose from while sitting behind the wheel for an hour or two every day.

In the longer term, apps will emerge that draw on data generated by the car, says GM’s Ross. 

This could be useful for maintenance or driving efficiency—or to generate data for insurance discounts. Apps tapping information from many cars could alert drivers to accidents; signals indicating hard braking or slipping wheels in other cars could warn of slick roads ahead. 

Sensors can ultimately help bring about semi-autonomous or fully autonomous cars (see “Data Shows Google’s Robot Cars Are Smoother, Safer Drivers Than You or I”).

Henry Tirri, CTO of Nokia, says the potential for apps in cars is vast, given the amount of data vehicles produce. “The car is already probably the densest sensor hub that an individual owns right now,” he says. (See “After Microsoft Deal, What’s Left of Nokia Will Bet on Internet of Things.”)


In Audi’s case, the service will cost $100 for up to five gigabytes of data over six months, or $500 for 30 gigabytes over 30 months. GM has not announced pricing except to say that customers can get various plans combining service to their homes, phones, and cars. Both GM and Audi are using AT&T to provide service (see “GM and AT&T Blur Line Between Car and Smartphone”).
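Assuming the full allowance is used, those prices work out to roughly the monthly figure quoted earlier, with the larger plan slightly cheaper per gigabyte:

```python
plans = {
    "Audi 5 GB / 6 months": (100.0, 5.0, 6),
    "Audi 30 GB / 30 months": (500.0, 30.0, 30),
}
for name, (price_usd, gigabytes, months) in plans.items():
    print(f"{name}: ${price_usd / months:.2f}/month, "
          f"${price_usd / gigabytes:.2f}/GB")
# -> about $16.67 per month for both plans; $20.00/GB versus $16.67/GB
```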