
Wednesday, March 25, 2015

New Car Tech Can Prevent You From Accidentally Speeding

As reported by Ford Social: Breaking the speed limit is not something we always do on purpose. All the same, it can be costly in terms of fines and driving bans, and speeding plays a significant role in many road accidents.

In the U.K. alone, in 2013, more than 15,000 drivers received fines of £100 or more for speeding.

Ford is now launching Intelligent Speed Limiter, a technology that could help prevent drivers from unintentionally exceeding speed limits.

The system monitors road signs with a camera mounted on the windscreen, and slows the vehicle as required. As the speed limit rises, the system allows the driver to accelerate up to the set speed – providing it does not exceed the new limit.
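
To make the behaviour concrete, here is a minimal Python sketch of the kind of logic such a limiter might apply; the function and variable names are illustrative, not Ford's implementation. The car may accelerate up to the driver's set speed, but never past the most recently detected posted limit.

```python
from typing import Optional

def allowed_speed(set_speed_kph: float, detected_limit_kph: Optional[float]) -> float:
    """Speed the limiter should permit (illustrative logic only, not Ford's code).

    The driver chooses a ceiling; the sign-recognition camera supplies the
    latest posted limit; the car is held to whichever is lower.
    """
    if detected_limit_kph is None:
        return set_speed_kph                      # no sign read yet: use the set speed
    return min(set_speed_kph, detected_limit_kph)

# Driver sets 120 km/h; camera reads a 100 km/h sign -> car holds 100 km/h.
# When a 130 km/h sign appears, the car may climb back to the 120 km/h setting.
print(allowed_speed(120, 100))   # 100
print(allowed_speed(120, 130))   # 120
```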

“Drivers are not always conscious of speeding and sometimes only become aware they were going too fast when they receive a fine in the mail or are pulled over by law enforcement,” said Stefan Kappes, active safety supervisor, Ford of Europe. “Intelligent Speed Limiter can remove one of the stresses of driving, helping ensure customers remain within the legal speed limit.”

Further new technologies available for the new S-MAX include the Pedestrian Detection system that will reduce the severity of some collisions involving vehicles and pedestrians, or help drivers avoid some impacts altogether.

The S-MAX is also equipped to help out at junctions where it is difficult to see. At low speeds, a camera fitted in the grille monitors the view from the front of the car, which is then displayed inside the car.

Tuesday, March 24, 2015

US Fixing Software Glitch with Boeing GPS Satellites

As reported by Reuters: The U.S. Air Force said Sunday it is working to resolve a technical error that affected some Boeing Co Global Positioning System (GPS) satellites, although it did not hurt the accuracy of GPS signals received by users around the world.

Air Force Space Command said the glitch appeared to involve the ground-based software used to index, or sort, some messages transmitted by GPS IIF satellites built by Boeing, but officials were still investigating other possible causes.

Lockheed Martin Corp runs the GPS "ground control" segment, which enables Air Force officials to operate all GPS satellites, including the IIF satellites built by Boeing.

The Air Force said the issue came to light in recent days, but a close examination of archived data showed the problem had gone unnoticed since 2013. It gave no details of the extent of the problem, its impact on the overall system or how it had come to light.

It said the glitch appeared related to the ground software that builds and uploads messages transmitted by GPS satellites, resulting in an occasional message failing to meet U.S. technical specifications.

The Air Force said it had put in place a temporary solution and officials were working on a permanent fix.

Boeing, prime contractor for the GPS IIF satellites, had no immediate comment on the news, which comes days before the Air Force is due to launch the ninth GPS IIF satellite into space.

Lockheed officials also had no immediate comment.

Air Force Space Command spokesman Andy Roake said it was unclear which contractor was responsible for the problem.

GPS is a space-based worldwide navigation system that provides users with highly accurate data on position, timing and velocity 24 hours a day, in all weather conditions.

The system is used by the military for targeting precision munitions and steering drones. It also has a wide range of commercial applications, including verification of automated bank transactions, farming and tracking shipments of packages. Car navigation systems and mobile phones use GPS to determine their location.

Boeing is under contract to build 12 GPS IIF satellites. The first of the GPS IIF satellites was launched in May 2010.

Three GPS/GNSS Satellite Launches Coming Up

Galileo satellites being moved prior to mating with the Fregat stage of a Soyuz rocket in preparation for a March 27th 2015 launch.
As reported by Inside GNSS: Four GNSS satellites will be launched during the coming week: a GPS Block IIF, two full operational capability (FOC) Galileo spacecraft, and an Indian Regional Navigation Satellite System (IRNSS) satellite.

United Launch Alliance (ULA) will send the ninth GPS IIF into space on Wednesday (March 25, 2015) from Cape Canaveral, Florida; a Russian Soyuz rocket will lift the Galileo FOC 3 and 4 into orbit from Kourou, French Guiana on Friday, March 27; and India’s Polar Satellite Launch Vehicle will carry the fourth IRNSS payload from Satish Dhawan Space Center on Saturday, March 28.

The Air Force Second Space Operations Squadron (2 SOPS) indicates that IIF-9 (identified by space vehicle and pseudorandom noise code, respectively, as SVN-71/PRN-26) will replace SVN-35 (currently being operated in Launch, Anomaly Resolution and Disposal Operations or LADO status) in the B plane slot 1F.

Meanwhile, SVN-38/PRN-08 will be taken out of the operational constellation prior to SVN-71 payload initialization and sent to LADO. PRN-08 will be assigned to SVN-49 in May and set to test, but is tentatively scheduled for assignment to IIF-10 to launch on June 16.  SVN-35, launched on August 30, 1993, has been in a residual status since March 2013 in an expanded node slot in the B plane, having served 21.5 years, 14.0 years beyond its designed service life.
The launch of India's fourth Navigation Satellite IRNSS-1D is scheduled for Saturday March 28th.

The US Air Force Will Train with Remote-Controlled F-16s

As reported by Engadget: To keep their skills sharp, US Air Force pilots routinely fly simulated sorties against domestic planes with flight capabilities similar to those of enemy planes. For years, this decoy duty has fallen to specially modified, unmanned F-4 Phantom IIs; however, these Vietnam-era fighters can no longer keep up with America's modern warplanes. That's why the USAF recently took delivery of a new breed of autonomous target based on the venerable F-16 Fighting Falcon.

Boeing delivered the first of an expected 126 remote-controlled QF-16 target drones to Florida's Tyndall Air Force Base last week. "It was a little different to see it without anyone in it, but it was a great flight all the way around," USAF Lt. Col. Ryan Inman said in a 2013 statement. "It's a replication of current, real world situations and aircraft platforms they can shoot as a target. Now we have a 9G capable, highly sustainable aerial target."

Another five QF-16s are currently being outfitted as part of the company's initial pre-production run and are expected to enter service by early October. They'll be employed by the 82nd Aerial Targets Squadron as stand-ins for the MiG-29 Fulcrum and Sukhoi Su-27 Flanker, a pair of fighter jets that our forces are likely to encounter should Russia's recent spate of saber-rattling and annexations lead to actual armed conflict.



Monday, March 23, 2015

US Navy Will Fire Fighter Jets Into the Air With Electromagnets

As reported by Engadget: For the last 60 years, the US Navy has launched fighters from carrier decks using steam catapults. While that made for some atmospheric Top Gun shots, the jerky motion adds wear-and-tear to aircraft and pilots alike. The military is now ready to test the next generation Electromagnetic Aircraft Launch System (EMALS) aboard the new USS Gerald R. Ford after successful land trials (see the video below). EMALS uses a prescribed dose of electromagnetic energy to smoothly launch a variety of aircraft at the precise speeds needed, reducing stress on airframes. It's more adaptable to different aircraft and launch conditions than current catapults, and is well-suited for lightweight drone systems like the X-47B now aboard US carriers.
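
To get a feel for the energy involved, here is a rough back-of-envelope calculation in Python; the aircraft mass, end speed and stroke length below are assumed round numbers, not published EMALS figures.

```python
# Back-of-envelope numbers for a single catapult shot (all inputs are assumptions).
mass_kg = 23_500        # loaded F/A-18-class aircraft (assumed)
end_speed_ms = 75.0     # required speed at the end of the stroke (assumed)
stroke_m = 91.0         # usable catapult stroke length (assumed)

kinetic_energy_j = 0.5 * mass_kg * end_speed_ms ** 2
accel_ms2 = end_speed_ms ** 2 / (2 * stroke_m)   # constant-acceleration model
accel_g = accel_ms2 / 9.81

print(f"Energy delivered per launch: {kinetic_energy_j / 1e6:.0f} MJ")
print(f"Average acceleration: {accel_g:.1f} g over {stroke_m:.0f} m")
```

Delivering tens of megajoules smoothly, rather than in the abrupt surge of a steam piston, is what lets an electromagnetic catapult be tuned to each airframe.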

Starting in June, the Navy will start catapulting "dead loads" into a river. Eventually, EMALS will launch F/A-18 Super Hornets, EA-18G Growlers, E2D Advanced Hawkeyes and other craft aboard Ford-class ships, which can pump out three times the voltage (13,800 volts) of older carriers. It will also catapult the controversial F-35 Joint Strike Fighter, which has already been tested with EMALS at the Lakehurst land proving ground. The Navy's F-35C Lightning II variant recently went through a two-week sea trial with 124 successful "cat shots" on a regular steam launcher and is scheduled to go into service by 2018.

Friday, March 20, 2015

Startup Promises Business Insights from Satellite Images Using 'Deep Learning' AI

Orbital Insight is using deep learning to find financially useful information in aerial imagery. For instance, Orbital Insight's software can identify objects such as crude oil containers, as shown above.
As reported by MIT Technology Review: The next time you drive to Home Depot you may help a Wall Street firm decide whether it should invest in the company. A startup called Orbital Insight is using commercially available satellite imagery and machine learning to analyze the parking lots of 60 different retail chains to assess their performance.

Founder James Crawford expects images from above to provide all sorts of business intelligence. “We’re just starting to discover what can be done with this kind of large-scale data,” says the alum of both NASA and the Google project that digitized over 20 million books.

The shadows in such images can indicate the fullness of a container.
Interest in satellite imaging is growing, and the cost is coming down. Google snatched up the satellite-image-processing company Skybox Imaging last August, and today Google Ventures and other investors, including Sequoia and Bloomberg Beta, announced they had sunk $8.7 million into Crawford’s company.
Orbital Insight is using a promising new technique known as deep learning to find economic trends through satellite-image analysis. Deep learning uses a hierarchy of artificial “neurons” to learn to recognize patterns in data (see “Deep Learning”).

To predict retail sales based on retailers’ parking lots, humans at Orbital Insight use Google Street View images to pinpoint the exact location of the stores’ entrances. Satellite imagery is acquired from a number of commercial suppliers, some of it refreshed daily. Software then monitors the density of cars and the frequency with which they enter the lots.
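
Orbital Insight has not published its models, but the general technique is easy to sketch: a small convolutional network labels fixed-size image tiles as containing a car or not, and the car count for a lot is the sum over its tiles. The PyTorch snippet below is an illustrative stand-in, with arbitrary layer sizes and untrained weights.

```python
import torch
import torch.nn as nn

class TileClassifier(nn.Module):
    """Tiny CNN that labels 32x32-pixel satellite-image tiles as 'empty' (0) or 'car' (1).

    Illustrative only: the real models, tile sizes and training data are not public.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 8 * 8, 2)   # 32x32 input -> 8x8 feature map

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Counting cars in a lot is then a sum of per-tile predictions (dummy tiles here).
model = TileClassifier().eval()
tiles = torch.rand(64, 3, 32, 32)              # 64 dummy RGB tiles covering one lot
with torch.no_grad():
    car_count = (model(tiles).argmax(dim=1) == 1).sum().item()
print(f"estimated cars in this lot: {car_count}")
```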

Crawford’s company can also use shadows in a city to gather information on rates of construction, especially in secretive places like China. Satellite images could also predict oil yields before they’re officially reported because it’s possible to see how much crude oil is in a container from the height of its lid. Scanning the extent and effects of deforestation would be useful to both investors and environmental groups.
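
The oil-tank trick relies on floating-roof tanks, whose lids sink as the tank empties, so the rim casts a longer shadow onto the roof. A toy version of that geometry, with invented numbers (not Orbital Insight's algorithm), might look like this:

```python
import math

def fill_fraction(interior_shadow_m: float, tank_height_m: float,
                  sun_elevation_deg: float) -> float:
    """Estimate how full a floating-roof tank is from the shadow its rim casts
    onto the roof (illustrative geometry only)."""
    # Right-triangle geometry: a longer interior shadow means the roof sits deeper.
    roof_depth_m = interior_shadow_m * math.tan(math.radians(sun_elevation_deg))
    roof_depth_m = min(roof_depth_m, tank_height_m)    # clamp to physical limits
    return 1.0 - roof_depth_m / tank_height_m

# Example: 20 m tall tank, 6 m shadow inside the rim, sun 40 degrees above the horizon.
print(f"{fill_fraction(6.0, 20.0, 40.0):.0%} full")   # roughly 75% full
```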

Over time, Orbital Insight’s software can identify trends and make predictions. “Then it’s not an image anymore—it’s some sort of measurement,” Crawford says.

That still leaves open the question of an unwanted eye in the sky. Not everyone likes the idea of being monitored as they run errands, and businesses may reject the idea of being watched from space. Crawford says satellites are already collecting this information—intelligence agencies have been using it for decades—and Orbital Insight is just making sense of the data.

“A satellite can cover every square inch of the earth every two weeks. You can’t stop that,” he says. “We don’t drive what imagery the satellite takes.”



Farmers of the Future Will Utilize Drones, Robots and GPS

As reported by Physics.org: Today's agriculture has transformed into a high-tech enterprise that most 20th-century farmers might barely recognize.

After all, it was only around 100 years ago that farming in the US transitioned from animal power to combustion engines. Over the past 20 years, the Global Positioning System (GPS) and other new tools have moved farming even further into a technological wonderland.

Beyond the now de rigueur air conditioning and stereo system, a modern large tractor's enclosed cabin includes computer displays indicating machine performance, position and operating characteristics of attached machinery like seed planters.

And as amazing as today's technologies are, they're just the beginning. Self-driving machinery and flying robots able to automatically survey and treat crops will become commonplace on farms that practice what's come to be called precision agriculture.

The ultimate purpose of all this high-tech gadgetry is optimization, from both an economic and an environmental standpoint. We only want to apply the optimal amount of any input (water, fertilizer, pesticide, fuel, labor) when and where it's needed to efficiently produce high crop yields.

Global positioning gives hyperlocal info
GPS provides accurate location information at any point on or near the earth's surface by calculating your distance from several orbiting satellites at once (at least four in practice, since the fourth resolves the receiver's clock error). So farming machines with GPS receivers are able to recognize their position within a farm field and adjust operation to maximize productivity or efficiency at that location.
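
Conceptually, the receiver solves a small geometry problem: find the point whose distances to the known satellite positions match the measured ranges. The toy Python sketch below (invented positions, perfect ranges, no clock error) shows the idea with a least-squares solver.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative satellite positions in km (invented, not real ephemerides).
sat_positions = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
true_receiver = np.array([0.0, 0.0, 6370.0])   # a point roughly on Earth's surface
ranges = np.linalg.norm(sat_positions - true_receiver, axis=1)   # "measured" ranges

def residuals(pos):
    # Mismatch between ranges implied by a candidate position and the measured ones.
    return np.linalg.norm(sat_positions - pos, axis=1) - ranges

estimate = least_squares(residuals, x0=np.array([0.0, 0.0, 6000.0])).x
print(np.round(estimate, 1))   # ~[0. 0. 6370.] -- the receiver position is recovered
```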

Take the example of soil fertility. The farmer uses a GPS receiver to locate preselected field positions to collect soil samples. Then a lab analyzes the samples, and creates a fertility map in a geographic information system. That's essentially a computer database program adept at dealing with geographic data and mapping. Using the map, a farmer can then prescribe the amount of fertilizer for each field location that was sampled. Variable-rate technology (VRT) fertilizer applicators dispense just exactly the amount required across the field. This process is an example of what's come to be known as precision agriculture.
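
A stripped-down version of that prescription step might look like the following Python sketch; the sample points, nutrient values and the rate rule are all invented for illustration.

```python
# Sampled soil-test points: (easting_m, northing_m, soil_nitrate_ppm). Invented values.
samples = [
    (100.0, 100.0,  8.0),
    (100.0, 300.0, 15.0),
    (300.0, 100.0, 22.0),
    (300.0, 300.0, 11.0),
]

def prescribed_rate(easting_m: float, northing_m: float) -> float:
    """Fertilizer rate (kg N/ha) for the applicator's current GPS position:
    look up the nearest sampled point, then apply less where the soil already
    tests high (an illustrative rule, not an agronomic recommendation)."""
    _, _, nitrate = min(
        samples,
        key=lambda s: (s[0] - easting_m) ** 2 + (s[1] - northing_m) ** 2,
    )
    return max(0.0, 150.0 - 5.0 * nitrate)

# The VRT applicator queries this continuously as it moves across the field.
print(prescribed_rate(290.0, 120.0))   # 40.0 near the high-testing 22 ppm sample
print(prescribed_rate(110.0, 110.0))   # 110.0 near the low-testing 8 ppm sample
```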

Info, analysis, tools
Precision agriculture requires three things to be successful. It needs site-specific information, which the soil-fertility map satisfies. It requires the ability to understand and make decisions based on that site-specific information. Decision-making is often aided by computer models that mathematically and statistically analyze relationships between variables like soil fertility and crop yield.

Finally, the farmer must have the physical tools to apply the management decisions. In the example, the GPS-enabled VRT fertilizer applicator serves this purpose by automatically adjusting its rate as appropriate for each field position. Other examples of precision agriculture involve varying the rate of planting seeds in the field according to soil type and using sensors to identify the presence of weeds, diseases, or insects so that pesticides can be applied only where needed.

Site-specific information goes far beyond maps of soil conditions and yield to include even satellite pictures that can indicate crop health across the field. Such remotely sensed images are also commonly collected from aircraft. Now unmanned aerial vehicles (UAVs, or drones) can collect highly detailed images of crop and field characteristics. These images, whether analyzed visually or by computer, show differences in the amount of reflected light that can then be related to plant health or soil type, for example. Clear crop-health differences in such images (diseased areas can appear much darker) have been used to delineate the presence of cotton root rot, a devastating and persistent soilborne fungal disease. Once disease extent is identified in a field, future treatments can be applied only where the disease exists. Advantages of UAVs include relatively low cost per flight and high image detail, but the legal framework for their use in agriculture remains under development.
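
One common way to turn those reflectance differences into a number is a vegetation index such as NDVI, which contrasts near-infrared and red reflectance. The sketch below uses dummy band data and an arbitrary threshold.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index. Healthy vegetation reflects strongly
    in the near-infrared and absorbs red light, so stressed areas score lower."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Two dummy 4x4 reflectance bands standing in for a UAV image of a field.
nir_band = np.array([[0.6, 0.6, 0.3, 0.3]] * 4)
red_band = np.array([[0.1, 0.1, 0.2, 0.2]] * 4)

needs_scouting = ndvi(nir_band, red_band) < 0.4   # arbitrary low-vigour threshold
print(needs_scouting)   # flags the right-hand half of this toy field
```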

Let's automate
Automatic guidance, whereby a GPS-based system steers the tractor in a much more precise pattern than the driver is capable of, is a tremendous success story. Safety concerns currently limit completely driverless capability to smaller machines. Fully autonomous or robotic field machines have begun to be employed in small-scale, high profit-margin agriculture such as wine grapes, nursery plants and some fruits and vegetables.
Autonomous machines can replace people performing tedious tasks, such as hand-harvesting vegetables. They use sensor technologies, including machine vision systems that can detect things like the location and size of stalks and leaves, to inform their mechanical processes. Japan is a trend leader in this area. Typically, agriculture is performed on smaller fields and plots there, and the country is an innovator in robotics. But autonomous machines are becoming more evident in the US, particularly in California where many of the country's specialty crops are grown.

The development of flying robots gives rise to the possibility that most field-crop scouting currently done by humans could be replaced by UAVs with machine vision and hand-like grippers. Many scouting tasks, such as for insect pests, require someone to walk to distant locations in a field, grasp plant leaves on representative plants and turn them over to see the presence or absence of insects. Researchers are developing technologies to enable such flying robots to do this without human involvement.

Breeding + sensors + robots
High-throughput plant phenotyping (HTPP) is an up-and-coming precision agriculture technology at the intersection of genetics, sensors and robotics. It is used to develop new varieties or "lines" of a crop to improve characteristics such as nutritive content and drought and pest tolerance. HTPP employs multiple sensors to measure important physical characteristics of plants, such as height; leaf number, size, shape, angle, color, wilting; stalk thickness; number of fruiting positions. These are examples of phenotypic traits, the physical expression of what a plant's genes code for. Scientists can compare these measurements to already-known genetic markers for a particular plant variety.

The sensor combinations can very quickly measure phenotypic traits on thousands of plants on a regular basis, enabling breeders and geneticists to decide which varieties to include or exclude in further testing, tremendously speeding up further research to improve crops.
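
In data terms, HTPP boils down to turning many sensor readings per plant into a few summary traits per line and then culling lines that miss the breeder's targets. A deliberately simple Python sketch, with invented lines, traits and thresholds:

```python
from statistics import mean

# Per-plant sensor readings for each candidate line: (height_cm, wilting_score 0-1).
# Everything here is invented for illustration.
measurements = {
    "line_A": [(82, 0.10), (85, 0.12), (80, 0.08)],
    "line_B": [(74, 0.35), (70, 0.40), (76, 0.30)],
    "line_C": [(90, 0.15), (88, 0.18), (91, 0.12)],
}

def advance(plants, min_height=80.0, max_wilting=0.20):
    """Keep a line for the next trial only if it is tall enough on average and
    shows acceptable wilting under stress (thresholds are arbitrary)."""
    heights, wilting = zip(*plants)
    return mean(heights) >= min_height and mean(wilting) <= max_wilting

shortlist = [line for line, plants in measurements.items() if advance(plants)]
print(shortlist)   # ['line_A', 'line_C']
```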
Agricultural production has come so far in even the past couple decades that it's hard to imagine what it will look like in a few more. But the pace of high-tech innovations in agriculture is only increasing. Don't be surprised if, 10 years from now, you drive down a rural highway and see a very small helicopter flying over a field, stopping to descend into the crop, use robotic grippers to manipulate leaves, cameras and machine vision to look for insects, and then rise back above the crop canopy and head toward its next scouting location. All with nary a human being in sight.