Sunday, April 27, 2014

FTC Comes Out In Favor Of Tesla Direct Sales, Against Dealer-Backed Bans

As reported by Green Car Reports:  The ongoing fight between Tesla Motors and car dealers across the country has spilled from the headlines into the legal system, but so far, the outcome is far from certain. Will Tesla be allowed to sell its cars directly to consumers? Or is there some state interest in forcing Tesla into the dealer franchise model America's major carmakers use?

The Federal Trade Commission (FTC) weighed in through its "Competition Matters" blog, making it plain that the agency supports the Tesla direct sales approach, likening it to past technological advances in consumer-business relations.

"In this case and others, many state and local regulators have eliminated the direct purchasing option for consumers, by taking steps to protect existing middlemen from new competition. We believe this is bad policy for a number of reasons," wrote Andy Gavil, Debbie Feinstein, and Marty Gaynor in the FTC's "Who decides how consumers should shop?" posting to the Competition Matters blog.

The strong statement of policy is not a change to any law or regulation, but it does clearly indicate the FTC's stance on the matter. Gavil is the director of the FTC's Office of Policy Planning, Feinstein is director of the Bureau of Competition, and Gaynor is director of the Bureau of Economics.

The post continues, "Dealers contend that it is important for regulators to prevent abuses of local dealers. This rationale appears unsupported, however, with respect to blanket prohibitions of direct sales by manufacturers. And, in any event, it has no relevance to companies like Tesla. It has never had any independent dealers and reportedly does not want them."

Tesla CEO Elon Musk has explained in the past why Tesla doesn't want conventional car dealers. Though noting that it would be an easier path for Tesla, Musk thinks conventional car dealers would have a conflict of interest in conveying the benefits of electric cars, since they would still rely on conventional (gasoline-burning) cars for the majority of their sales and profits.

Tesla's battle for direct sales is framed by existing franchise laws that prohibit anyone not licensed as a car dealer from selling vehicles to the public. Laws vary from state to state, but in all, 48 states have some version of the restriction.

The FTC appears to take issue not with those laws, but with how they're being used, and with the direct-sales bans being passed in several states.

"Regulators should differentiate between regulations that truly protect consumers and those that protect the regulated," the post continued.

Tesla now has more than 50 stores and galleries in the U.S., with six more due to open soon. Over 40 service centers are also currently in operation, with another 23 planned.                                         

Verizon, AT&T Will Face Bidding Limits In Incentive Auction

As reported by GigaOm: Last week the Federal Communications Commission laid out all of its proposed rules for next year’s controversial broadcast airwave incentive auction, save one. It didn't address the most contentious rule of them all: whether the country’s two mega-carriers, AT&T and Verizon, will have free rein in the auction or face restrictions on how many airwaves they can buy.

The FCC is now taking a whack at the political piñata, and AT&T and Verizon aren't going to be pleased with what comes out. On Thursday, FCC Chairman Tom Wheeler began circulating proposed rules for low-band spectrum auctions — of which the incentive auction is most definitely one — that would limit Verizon and AT&T’s ability to bid on all licenses in markets where competition for frequencies is particularly intense.

What that means is that in areas where there’s the most demand for mobile broadband airwaves, such as the big cities, the FCC will set aside up to 30 MHz of airwaves for carriers that don’t already own a lot of low-band spectrum. The rules aren’t exactly a surprise since Wheeler has been leaning in this direction for months, though they’re likely to get overshadowed by the FCC’s controversy du jour, net neutrality.

Low-band spectrum is valuable because of its propagation — it can reach long distances in rural areas and punch through walls in dense metro areas. Most of the low-band spectrum in use in the U.S. today is owned by, you guessed it, Verizon and AT&T, both of which have tapped 700 MHz for the backbones of their LTE networks.
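The physics behind that advantage can be seen in the standard free-space path-loss formula, which the article doesn't spell out. This short sketch compares the 700 MHz low band against a typical mid-band frequency; the distances and frequencies are chosen only for illustration.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a given distance and frequency.

    Standard formula: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44.
    Lower frequencies lose less signal over the same distance.
    """
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare loss over 5 km for low-band 700 MHz vs. mid-band 1900 MHz.
low_band = fspl_db(5, 700)
mid_band = fspl_db(5, 1900)
print(f"700 MHz:  {low_band:.1f} dB")   # ~103.3 dB
print(f"1900 MHz: {mid_band:.1f} dB")   # ~112.0 dB
```

The roughly 8.7 dB difference is why a low-band tower covers far more ground, and why the reserved licenses are so coveted.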

Wheeler elaborated in the FCC’s blog:
“… two national carriers control the vast majority of that low-band spectrum.  This disparity makes it difficult for rural consumers to have access to the competition and choice that would be available if more wireless competitors also had access to low-band spectrum.  It also creates challenges for consumers in urban environments who sometimes have difficulty using their mobile phones at home or in their offices.
To address this problem, and to prevent one or two wireless providers from being able to run the table at the auction, I have proposed a market based reserve for the auction.”
The nitty-gritty
The way the auction would work under the FCC’s proposal is that in any given market, all carriers would bid freely for these 600 MHz airwaves. But after bidding hits a particular trigger point indicating high demand for those licenses, the FCC would basically split the auction in two, creating a reserve chunk of airwaves up to 30 MHz that only smaller carriers like Sprint, T-Mobile and regional operators could bid on. The unreserved portion would remain open to all bidders.
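The split described above can be sketched as a toy function. Everything here (the function name, the boolean trigger and eligibility flags) is a simplification for illustration; the actual proposal ties the trigger to bidding activity and to carriers' low-band holdings in each market.

```python
RESERVE_CAP_MHZ = 30  # maximum set-aside per market, per the proposal

def split_market(total_mhz: int, demand_trigger_met: bool,
                 small_carriers_bidding: bool) -> tuple[int, int]:
    """Return (reserved_mhz, open_mhz) for one market.

    Hypothetical simplification: before the demand trigger is hit,
    everything stays open to all bidders; after it, up to 30 MHz
    moves into a reserve that only carriers with little low-band
    spectrum (e.g. Sprint, T-Mobile, regional operators) may bid on.
    """
    if not (demand_trigger_met and small_carriers_bidding):
        return 0, total_mhz
    reserved = min(RESERVE_CAP_MHZ, total_mhz)
    return reserved, total_mhz - reserved

# A big-city market with 70 MHz on offer and heavy bidding:
print(split_market(70, True, True))   # (30, 40)
# A rural market where bidding never hits the trigger:
print(split_market(70, False, True))  # (0, 70)
```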

Verizon and AT&T wouldn’t necessarily face restrictions in every market. It all depends on the extent of their low-band holdings in any given region. There are even a few geographical cases where regional carriers like U.S. Cellular hold enough 700 MHz spectrum that they would be excluded from the reserve camp, FCC officials said.


FCC Commissioners (L to R): Commissioner Ajit Pai, Commissioner Mignon Clyburn, Chairman Tom Wheeler, Commissioner Jessica Rosenworcel and Commissioner Michael O’Rielly (Source: FCC)

The rules certainly aren't final. In May they go before the full commission, which will decide on specific mechanisms, such as at which auction stage reserve bidding would be triggered and what percentage of licenses in any given market could be reserved. The commission could also change the rules entirely, easing the restrictions on AT&T and Verizon or tossing them out altogether. Those carriers are putting a lot of political pressure on the FCC and Congress for an entirely open auction, and AT&T has even threatened to sit the whole auction out.

AT&T may just be bluffing, but the threat has to give the FCC some pause. A major bidder sitting out the auction wouldn’t just mean less revenue for the government, it could cause the entire auction to fail. The way this complex auction is structured (I spell out all the details here), the broadcasters currently using the UHF band would agree to part with their TV channels, but only if their selling prices are met. The fewer bidders there are to buy those repurposed airwaves, the less likely the auction will meet those prices.

We’re still a year away from the first bids being placed, and it’s becoming increasingly clear there’s no way the FCC can make all the various broadcasters, carriers, politicians and public interest groups involved happy. It’s just a question of whether it can make enough of them happy to actually pull the auction off.

Musk’s SpaceX to Sue Over Lockheed-Boeing Launch Monopoly


As reported by Bloomberg Businessweek: Elon Musk’s space company will sue the U.S. Air Force to protest a Lockheed Martin Corp.-Boeing Co. team’s monopoly on Pentagon satellite launches, the billionaire said today.

“These launches should be competed,” he told reporters at the National Press Club in Washington. “If we compete and lose, that is fine. But why would they not even compete it?”
Musk’s Space Exploration Technologies Corp., known as SpaceX, is trying to break the joint venture’s lock on U.S. military satellite launches, which have an estimated value of $70 billion through 2030. He has said competition in that market may save taxpayers more than $1 billion a year.


SpaceX, based in Hawthorne, California, plans to file its suit Monday in the U.S. Court of Federal Claims. It seeks to reopen competition for a military contract awarded to the joint venture, United Launch Alliance LLC, for 36 rocket cores, said Ian Christopher McCaleb, senior vice president at Levick, a public relations firm representing SpaceX.

Taxpayer Cost
The Air Force agreed to the bulk purchase of the main rocket components last year in an attempt to hold down costs.  “This contract is costing U.S. taxpayers billions of dollars for no reason,” said Musk, who earlier today made a presentation at the U.S. Export-Import Bank’s annual conference.

Mark Bitterman, a spokesman for United Launch Alliance, said the military’s “robust acquisition and oversight process,” as well as the company’s improved performance, led to $4 billion in savings compared with prior acquisition approaches.

The joint venture recognizes the Pentagon’s “plan to enable competition and is ready and willing to support missions with same assurance that we provide today,” Bitterman said in an e-mail.

Matthew Stines, an Air Force spokesman, said in an e-mail that the service has “no formal statement” on Musk’s announcement of the SpaceX lawsuit.

Russian Engines
SpaceX will require three successful launches as part of the process to win U.S. certification, the service has said. Technical reviews and audits of the proposed rockets, ground systems and manufacturing process also are needed, according to the Air Force.


Musk, also chairman and chief executive officer of Tesla Motors Inc., told U.S. lawmakers last month that the Lockheed-Boeing venture’s Atlas V rockets use engines from Russia, posing supply risks following that country’s invasion of Crimea in Ukraine.

The U.S. and Europe have been considering a possible expansion of sanctions against Russia.

Pentagon officials have asked the Air Force to review whether the use of Russian engines for the military launches poses a national security risk. 

Friday, April 25, 2014

The FCC Doesn’t Want To Destroy Net Neutrality, But It’s Going To Anyway

As reported by GigaOm: The Federal Communications Commission doesn't want companies like Netflix or Viacom to have to pay to get their content to end users of broadband networks, but it doesn't see a way (or maybe even a reason) to ban the practice.

In a call with reporters on Thursday, FCC officials laid out the agency’s thinking on new network neutrality rules and tried to address concerns that the internet as we know it is broken.

The agency’s hope is to have new rules in place by the end of this year, and it plans to release a public document called a Notice of Proposed Rule Making (NPRM) outlining its thinking and asking questions about the new rules. It plans to release this NPRM in three weeks at its May 15 open meeting. Once the documents are released, the public will have a chance to comment on them.

What was once unreasonable discrimination now becomes commercially unreasonable

Since some of the content of that document was released Wednesday, the media and public interest groups have been concerned about what the new network neutrality framework would allow — namely, how the agency planned to ensure that ISPs won’t discriminate against the packets flowing across their networks. The answer? The agency will replace the “unreasonable discrimination” clause from the original net neutrality rules that were defeated in court this year with standards associated with “commercial reasonableness.”

It’s a subtle shift, but an important one. When the U.S. Court of Appeals gutted the Open Internet Order that set forth the net neutrality rules in January, it did so on the basis that the agency didn’t use the right justification for its rules. It tried to turn ISPs into common carriers and regulate them that way, but the court declared that the FCC couldn’t put that burden on the ISPs without changing the law or going through a regulatory process that was bound to cause a fight.

Instead we get a compromise by which the FCC attempts to honor the original intent of the 2010 Open Internet Order with a new test for discrimination. That test is the “commercial reasonableness” standard. Here’s how the FCC wants to do it.

If the devil is in the details, here are the details

First, the net neutrality rules that were gutted by the courts made a distinction between wireline broadband and wireless broadband. For a history on why, check out this post or this one. The FCC plans to keep those distinctions intact for the new rules. With this understanding, let’s hit the three main topics the FCC plans to cover, saving the most complicated element for last.

Transparency: Both the original and the new Open Internet Order make a provision for transparency, namely that network operators must disclose to consumers how they are managing their network traffic. This applied to both wireline and wireless networks, so if your ISP is treating certain traffic differently, it has to tell you. The FCC’s upcoming documents also ask if this transparency could go further.

When asked if the order could require greater transparency about company networks such as how congested they might be or if ISPs are charging for prioritization or access because the market is uncompetitive, an FCC official said, “The answer is yes.” He added that the agency believes that greater transparency will help consumers and the commission determine how the broadband networks are functioning. That’s a pretty exciting promise if the FCC can wrangle that type of data from ISPs. Right now, ISPs view that data as competitive and proprietary.

An AT&T network operations center. How much transparency is enough?

Blocking: The courts struck down the original order’s anti-blocking provision, which said ISPs on wireline networks couldn't block lawful traffic and wireless ISPs couldn't block competing over-the-top calling and texting services. The new FCC documents will make the case that blocking traffic interrupts the “virtuous cycle” of broadband access: people use broadband because it gives them access to a variety of services, and because broadband access is beneficial, anything that makes people less inclined to use broadband causes harm.

This new reasoning would allow the FCC to implement a no-blocking position without resorting to calling ISPs common carriers. Another interesting tidbit here is that the FCC plans to ask about establishing a baseline of broadband service and view anything that goes below this baseline as blocking. This might seem esoteric, but in 2007 when Comcast was interfering with the delivery of BitTorrent packets, it argued that it wasn't actually blocking them. Instead it was delaying delivery so the routers in effect dropped the packets and customers couldn't access their files.
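The baseline idea can be illustrated with a trivial check; the baseline figure and the tolerance below are invented placeholders, since the FCC has not defined either.

```python
def is_de_facto_blocking(measured_mbps: float, baseline_mbps: float,
                         tolerance: float = 0.1) -> bool:
    """Flag delivery that falls below a defined service baseline.

    Illustrative only: the FCC has not set a baseline, and the 10%
    tolerance is an invented placeholder. The point is that degrading
    traffic until it is unusable (as in the 2007 Comcast/BitTorrent
    case) would count as blocking even if no packet is formally
    'blocked'.
    """
    return measured_mbps < baseline_mbps * (1 - tolerance)

print(is_de_facto_blocking(0.5, 10.0))  # True: throttled far below baseline
print(is_de_facto_blocking(9.5, 10.0))  # False: within tolerance
```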


Commercial reasonableness: Here is the heart of last night’s controversy and where the FCC is walking its finest line. The agency wants to ensure that the spirit of network neutrality lives on, but legally it has to use a standard that opens the door to prioritization. The FCC even seems okay with prioritization in certain cases, with an agency official offering up the example of packets coming from a connected heart monitor as a protected class that could be prioritized over other traffic.

However, it will seek to avoid the obvious examples of Netflix having to pay an ISP to see its traffic prioritized over another content provider’s. It will do this using the standards the FCC set forth in a 2011 cell phone roaming order that has been tested in court. As part of that order, which dictated that mobile carriers have an obligation to offer roaming agreements to other such providers on “commercially reasonable” terms, the agency created a class of behaviors that were commercially unreasonable. The questions it plans to ask include:
  • Does this practice have an impact on future and present competition?
  • How does vertical integration affect any deals and what is the impact on unaffiliated companies?
  • What is the impact on consumers, their free exercise of speech and on civic engagement?
  • Are the parties acting in good faith? For example is the ISP involved in a good faith negotiation?
  • Are there technical characteristics that would shed light on an ISP practice that is harmful?
  • Are there industry practices that can shed light on what is reasonable?
  • And finally, a catch-all: are there any other factors that should be considered as part of the totality of the facts?
FCC Commissioners (L to R): Commissioner Ajit Pai, Commissioner Mignon Clyburn, Chairman Tom Wheeler, Commissioner Jessica Rosenworcel and Commissioner Michael O’Rielly (Source: FCC)

Of course, one challenge with this format is that it requires an ISP to behave badly before the FCC can act. The agency said it will be on the lookout for such violations and will accept both formal and informal complaints. Once a problem is registered with the FCC, the agency will ask how it should handle the complaint and whether a time limit should be imposed for a resolution.

Finally, the official acknowledged that the agency asks in its documents if there is ever a reason for a flat prohibition against certain behaviors even if an ISP isn’t a common carrier. The agency would have to make the case that paid prioritization is such a consumer or industry harm that it should be prohibited altogether. But based on the thinking and attention devoted to the commercial unreasonableness standard, as well as the heart rate monitor example, it feels like the FCC isn't keen to walk this path.

So these are the topics and questions on which the FCC will vote on May 15 and, if approved, put out for public comment. At that point the agency typically offers a 30- or 90-day comment period.


So get ready, internet: the FCC does want to know your stance on this issue.

Thursday, April 24, 2014

Apple Tech Uses Wi-Fi Access Points For Indoor Navigation, 3D Positioning

As reported by Apple Insider: While most mobile devices rely on GPS for mapping and navigation, the system only works outdoors and in range of satellite timing signals. However, new technology from Apple could extend accurate positioning indoors without need for additional hardware aside from existing Wi-Fi infrastructure.

A patent granted to Apple by the U.S. Patent and Trademark Office on Tuesday describes a robust system that combines GPS, Wi-Fi access points and onboard location databases to provide mobile devices accurate positioning data in nearly any environment.

According to Apple's U.S. Patent No. 8,700,060 for "Determining a location of a mobile device using a location database," the method employs location estimation through the successful communication with one or multiple Wi-Fi access points.

By calculating a number of factors, including access point filtering, hardware communication range and so-called "presence areas," a mobile device can narrow down its position on a map with relative precision. This includes products without GPS receivers.

One of the first steps in Apple's patent calls for a location-aware device or devices (with GPS capabilities) to transmit their position to a first Wi-Fi access point, which in turn relays the information to a server-based location system. From this data, the system can then estimate the approximate location, or "presence areas," of other devices within the communication range of the access point.

To calculate these presence areas, the system may use any number of analyses including an averaging of geographic locations based on location-aware mobile devices, signal strength of a given access point and surrounding building architecture, among other variables. Presence areas may be selected in a multi-pass process by filtering out potentials based on "popularity, stability, longevity, and freshness."

Loaded with data, the system can plot out connected mobile devices in cells on a geographic grid. Each cell acts as a container for presence areas and corresponding access points. As seen in the image above, location-aware devices are represented as black triangles that are within or nearby presence areas denoted by circles.

One way a mobile device can calculate its location is by detecting multiple presence areas and averaging distance from those close by, while discarding data from "outliers" farthest away from a given position. Following processing, the device can then display its average location on a mapping app.
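A minimal sketch of that averaging step, assuming presence areas reduced to their center points. The outlier rule here (discard points farther than 1.5 times the mean distance from a first-pass centroid) is an invented stand-in; the patent does not fix a specific threshold.

```python
import math

def estimate_position(presence_centers, outlier_factor=1.5):
    """Estimate device position from nearby Wi-Fi presence areas.

    Sketch of the averaging the patent describes: take a first
    average of all presence-area centers, discard 'outliers' far
    from it, then average the remainder. The 1.5x-mean-distance
    cutoff is an assumption made for this illustration.
    """
    def centroid(points):
        return (sum(p[0] for p in points) / len(points),
                sum(p[1] for p in points) / len(points))

    first = centroid(presence_centers)
    dists = [math.dist(p, first) for p in presence_centers]
    mean_d = sum(dists) / len(dists)
    kept = [p for p, d in zip(presence_centers, dists)
            if d <= outlier_factor * mean_d] or presence_centers
    return centroid(kept)

# Three nearby presence areas and one distant outlier (arbitrary units):
areas = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (50.0, 50.0)]
print(estimate_position(areas))  # outlier discarded, result near (0.33, 0.33)
```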

Alternatively, an access point can send position information about other access points nearby, including only those that are within a mobile device's area of interest. This method of filtering is also used to approximate margin of error, which is denoted by a radius or radii extending from a focal point within a presence area.

In addition, Apple's method accounts for three-dimensional space by taking into consideration altitude data from devices supporting such GPS metrics.



From left: Multi-pass analysis, multi-pass analysis with outlier, and 3D positioning grid.

Tuesday's patent is similar to technology created by "indoor GPS" firm WifiSLAM, which Apple purchased in March 2013 for about $20 million. WifiSLAM's system relies largely on Wi-Fi signals to accurately position mobile devices while indoors and does not require GPS to operate.

Apple's patent for a Wi-Fi-based positioning system was first filed in 2010 and credits Ronald K. Huang as its inventor.


Wednesday, April 23, 2014

Japan’s Plan for Centimeter-Resolution GPS

As reported by IEEE Spectrum: A stranger to Tokyo could easily get lost in its urban canyons. And GPS navigation, stymied by low resolution and a blocked view of the sky, might not be much help. But that won’t be the case after 2018. Engineers at Tokyo-based Mitsubishi Electric Corp. report that they’re on track to start up the first commercial, nationwide, centimeter-scale satellite positioning technology. As well as spot-on navigation, the technology will also usher in a variety of innovative new applications, its proponents say.

Named the Quasi-Zenith Satellite System (QZSS), it is designed to augment Japan’s use of the U.S.-operated Global Positioning System (GPS) satellite service. By precisely correcting GPS signal errors, QZSS can provide more accurate and reliable positioning, navigation, and timing services.

Today’s GPS receivers track the distance to four or more GPS satellites to calculate the receiver’s position. But because of the various errors inherent in the GPS system, location can be off by several meters. By using the data from QZSS to correct the measured distance from each satellite, the accuracy of the calculated position is narrowed down to the centimeter scale.
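In code, the correction step the article describes reduces to adjusting each measured satellite range before the position solve. The numbers below are illustrative values, not real QZSS data.

```python
def corrected_ranges(measured_m, corrections_m):
    """Apply per-satellite range corrections before solving for position.

    Simplified illustration of differential correction: each measured
    satellite distance is adjusted by a correction derived from the
    QZSS broadcast. A real receiver then solves a nonlinear system
    from four or more such ranges; here the correction is just a
    subtraction.
    """
    return [m - c for m, c in zip(measured_m, corrections_m)]

# Four raw pseudoranges (meters) carrying clock/ionosphere errors of a
# few meters, plus illustrative corrections broadcast via QZSS:
measured = [20_186_432.7, 21_054_110.2, 22_310_998.4, 23_001_567.9]
corrections = [3.2, -1.7, 4.9, 0.8]
print(corrected_ranges(measured, corrections))
```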

“GPS positioning can be off by as much as 10 meters due to various kinds of errors,” says Yuki Sato, a research engineer in Mitsubishi Electric’s Advanced Technology R&D Center, the prime contractor for the space portion of the project. “And in Japan, with all its mountains and skyscrapers blocking out GPS signals, positioning is not possible in some city and country locations,” he adds.

The Japan Aerospace Exploration Agency (JAXA) got the project under way with the launch of QZS-1 in September 2010. Three additional satellites are slated to be in place by the end of 2017, with a further three launches expected sometime later to form a constellation of seven satellites—enough for sustainable operation and some redundancy. The government has budgeted about US $500 million for the three new satellites, which are to be supplied by Mitsubishi. It also apportioned an additional $1.2 billion for the ground component of the project, which is made up of 1200 precisely surveyed reference stations. That part’s being developed and operated by Quasi-Zenith Satellite System Services, a private company established for this purpose.

The four satellites will follow an orbit that, from the perspective of a person in Japan, traces an asymmetrical figure eight in the sky. While the orbit extends as far south as Australia at its widest arc, it is designed to narrow its path over Japan so that at least one satellite is always in view high in the sky—hence the name quasi-zenith. This will enable users in even the shadowed urban canyons of Tokyo to receive the system’s error-correcting signals.

“Errors can be caused, for example, by the satellite’s atomic clock, orbital shift, and by Earth’s atmosphere, especially the ionosphere, which can bend the signal, reducing its speed,” says Sato.

To correct the errors, a master control center compares the satellite’s signals received by the reference stations with the distance between the stations and the satellite’s predicted location. These corrected components are compressed from an overall 2-megabit-per-second data rate to 2 kilobits per second and transmitted to the satellite, which then broadcasts them to users’ receivers.

“This is all done in real time, so compression is really important,” says Ryoichiro Yasumitsu, a deputy chief manager in Mitsubishi’s Space Systems Division. “It would take too long to transmit the original data.” 

Compression also means a practical-size antenna can be employed in the user’s receiver. Yasumitsu notes that in QZS-1 trial tests, the average accuracy was about 1.3 centimeters horizontally and 2.9 centimeters vertically.

This centimeter-scale precision promises to usher in a number of creative, or at least greatly improved, applications beyond car and personal navigation. Besides pointing out obvious uses like mapping and land surveying, Sam Pullen, a senior research engineer in the department of aeronautics and astronautics at Stanford, says precision farming and autonomous tractor operations will be big applications. “Unmanned aerial vehicles and autonomous vehicles in general,” he adds, “will also find centimeter-level positioning valuable in maintaining and assuring separation from other vehicles and fixed obstacles.”

In addition, the Japanese government plans to use the service to broadcast short warning messages in times of disaster, when ground-based communication systems may be damaged. As instructed by the government, the control center will transmit a brief warning message to the QZSS satellite, which will then broadcast it to users on the same frequency.

Given the range of promised applications and relatively low cost of the Japanese system compared with the €5 billion ($6.9 billion) budgeted for the EU’s Galileo, for instance, other nations will be watching and waiting to see if QZSS achieves its goals.

Google's Street View Lets You Step Back In Time

As reported by The Verge: Three years ago, a magnitude 9.0 earthquake struck off the coast of Japan and moved the entire island by 8 feet, changing the way the Earth spun on its axis in the process. The devastation of the tsunami that followed resulted in the loss of thousands of lives and billions of dollars in damage to homes, businesses, and the country’s infrastructure.  

In the aftermath, Google set out to preserve imagery it had captured prior to the disaster, including original Street View recordings that became an unintended time capsule. The company made a one-off site called Memories for the Future that let viewers see certain areas before and after the devastation. It was an unusual site considering Google’s standard operating procedure: a feverish pace of updates that erased the old with the new and never looked back.

Google’s changing that now with a feature that lets you step back in time to earlier versions of its Street View data, going back to 2006. Since then, each time the company updated Street View data, it also quietly kept the older versions. And in numerous cases, skipping between them is the difference between desolation and a sprawling metropolis, or — like in Japan’s case — vice versa.

The feature, which Google is rolling out to the web version of Maps today, generally stays out of the way unless you want to go back in time. If you’re viewing a location with earlier recorded images, there’s now an hourglass and a slider in the top left of the screen that shows you the month and year. Dial it back and it sweeps to that copy stored on Google’s servers, almost as if you were clicking on a location just up the road.

The result is a kind of time warp that can show you months' and years' worth of human ingenuity, and just as quickly show it erased following a disaster or new construction project. With Street View now recording more than 6 million miles across 55 countries, there are a lot of those.

"We have built this very complicated graph of imagery through time and space," says Luc Vincent, the director of engineering for Street View. He says the option to go back and forth through time was the most requested by Google Maps users, who have been hounding the company to add it for years. This was primarily for simple things, like seeing older images of their house, school, or neighborhood. "People would say, ‘My house, can you please preserve it? Because I like it this way,’" Vincent told The Verge. "We can show you everything now."

Google is creating so much data, in fact, Vincent says the current iteration of Time Machine is intentionally dialing back what people see. The smallest interval of time you can jump to is a month, even if Google’s gone through and captured Street View recordings more frequently. That’s not a normal occurrence for most places, Vincent says, but there are places like Google’s campus, and major cities where Street View cars are recording more than once a month, sometimes even several times a week.

"Algorithms pick the best looking images to show you"
Vincent says the company’s using an algorithm designed to pick the best imagery from the data that’s collected each month. It goes through the images the company has captured and weeds out sets that tend to have a lot of motion blur, or that have particularly bad weather.
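A sketch of what such a selection pass might look like. The scoring fields, thresholds, and weights are invented for illustration; Google has not published its actual criteria.

```python
def pick_best_slice(image_sets):
    """Pick the best Street View image set for a month's 'slice'.

    Illustrative sketch of the selection Vincent describes: weed out
    candidate sets with heavy motion blur or bad weather, then score
    what remains. The 0.5 cutoffs and the additive score are invented
    placeholders, not Google's real algorithm.
    """
    candidates = [s for s in image_sets
                  if s["blur"] < 0.5 and s["bad_weather"] < 0.5]
    if not candidates:          # fall back rather than show nothing
        candidates = image_sets
    # Prefer the sharpest, clearest capture among survivors.
    return min(candidates, key=lambda s: s["blur"] + s["bad_weather"])

month = [
    {"id": "run-01", "blur": 0.7, "bad_weather": 0.1},  # blurry
    {"id": "run-02", "blur": 0.2, "bad_weather": 0.6},  # rainy
    {"id": "run-03", "blur": 0.1, "bad_weather": 0.2},  # keeper
]
print(pick_best_slice(month)["id"])  # run-03
```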

But expanding the recordings to what Vincent refers to as "slices" has opened up new avenues for the company to show off Street View imagery it once kept to itself. That includes roads with shoulders heaped with snow, drenched dark forests, or simply alternate views of familiar places. "We can show you Times Square at night," Vincent says, a first for the service that overwhelmingly prefers clear blue skies. "When we chose the image, the freshest imagery is typically the best … now you don’t have to make a choice."

One wrinkle in all this is that the physical location of roads changes over time, whether through human intervention or Mother Nature. In the case of the movement from the 9.0 earthquake, for instance, roads and buildings that were in one place when Google first drove through were at new GPS coordinates when its cars went through again. That’s been preserved in Time Machine, Vincent says.



"It’s not a bug; after the earthquake, the ground shifted by 3 meters. Everything else is from the same geo-coordinates," Vincent says. "It was the same thing with Hurricane Katrina in New Orleans."

Vincent and company hope Time Machine will be more than just a way to gawk at before and after photos of disasters, and perhaps become a tool for planning travel. They imagine people using it when planning a vacation to somewhere they've never been in order to see what it looks like during that particular time of year.

Google won’t initially offer Time Machine for indoor imagery of buildings, or on trails, something it’s captured using its special Trekker backpack. It also won’t be available from the get-go on mobile devices. Vincent made no promises on timing short of saying that the company was working on it. With that said, Street View on the go is often meant as a way to get your bearings on what’s around you now, not years ago. But that behavior, just like images of the world Google is capturing, might ultimately change.

"We’ve been driving 3D cars for more than seven years," Vincent says. "It was totally different from what it is now."