Monday, April 28, 2014

Scale Is Increasingly The Name Of The Game In Cloud Computing

As reported by GigaOm: Resistance to cloud computing might not be futile, but it’s at least beginning to look foolish — especially as services from the top providers such as Amazon Web Services keep getting cheaper while their performance gets better. It’s also looking like smaller-scale or “enterprise” cloud platforms will have to promise some serious differentiation in order to justify their higher costs.

To highlight this trend, here’s a chart from publishing analytics startup Parse.ly graphing its IT spending from inception until early 2014.


The long story made short — you can read the whole thing up through September 2013 here — is that Parse.ly started off using Rackspace primarily and AWS for backup and a variety of ad hoc workloads (e.g., Hadoop jobs). In 2012, it opted to cut costs by moving its primary analytics database to physical servers in a co-location center while continuing to run its cloud workloads primarily on Rackspace. In late 2013, it began shifting more workloads to AWS, and it completed the full transition to AWS in late February 2014.

After paying double (to both Rackspace and AWS) during the transition, Parse.ly is now paying less per month than it was before making the move. Its spending patterns might be unique because of the workloads it’s running, but they’re compelling nonetheless.

And, Parse.ly Co-founder and CTO Andrew Montalenti told me, there’s icing on this cake, as well: “What’s crazy is we got a speed-up and saved money.” The company’s primary analytics database is now running significantly faster on AWS SSD-backed instances than it did on bare metal (albeit hard-disk-backed) servers.

If recent claims from Google about adjusting its pricing in accordance with Moore’s Law come true — and if its unique strategy around price reductions on long-running instances catches on — we should be in for continually lower prices on basic cloud computing services. AWS is the cloud king, but Google and Microsoft are positioned as strong contenders, and if low costs are what wins users, they’ll all play along to ensure no one else owns that story. The same goes for improved performance and rapid feature updates.

More and more, it looks like the future of cloud computing will be renting the infrastructure that lets users operate like, well, Amazon, Google and Microsoft but at a fraction of the cost (and scale). We’ll hear a lot more about where the industry is heading at the Structure conference, which takes place June 18 and 19 in San Francisco, and features, among many others, Google’s Urs Hölzle, Amazon’s Werner Vogels and Microsoft’s Scott Guthrie.

T-Mobile Has Started Building A More Resilient LTE Network

As reported by GigaOm: T-Mobile has begun upgrading its LTE network with a new kind of antenna technology that will help fix one of the biggest problems in mobile: the inconsistent signals and connection speeds our phones see as we move through the mobile network.

Anyone who has ever had five bars and a rocking data link to the tower, only to lose it 20 yards later, can attest to this. But starting in Chicago, Dallas and San Antonio, T-Mobile users will soon see those peaks and valleys become plateaus.

The technology is called 4-by-2 multiple input-multiple output, or 4×2 MIMO. You may already know MIMO from how LTE or Wi-Fi works: multiple antennas send parallel transmissions from the transmitter to the device. While nearly all LTE systems today use 2×2 MIMO — two antennas at the tower connecting to two antennas in the phone — T-Mobile is doubling the number of antennas transmitting at the tower.

What that means is that there will be a lot more signals flying at your T-Mobile 4G phone, tablet or mobile hotspot, ensuring you can get a better downlink connection even if you’re at the fringes of the network or there are obstacles between you and the tower. The biggest benefits will be on the return trip, though. With more antennas at the tower to pick up your phone’s generally weaker signals, you’ll get a big boost in your uplink connection.
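
To get a feel for why extra transmit paths flatten the valleys rather than raising the peaks, here is a minimal, purely illustrative simulation — not T-Mobile’s actual configuration. It assumes independent Rayleigh fading on each path, ideal coherent combining at the receiver, and a made-up 3 dB average per-path SNR, and it ignores transmit power splitting; the function name and numbers are mine.

```python
import numpy as np

rng = np.random.default_rng(42)

def cell_edge_snr_db(n_paths, mean_branch_snr_db=3.0, trials=100_000):
    """Simulate the combined SNR of a link whose n_paths signal paths each
    fade independently (Rayleigh), assuming ideal coherent combining.
    Returns the 5th-percentile SNR in dB -- a rough proxy for the
    'valleys' a user sees at the cell edge."""
    mean_snr = 10 ** (mean_branch_snr_db / 10)
    # For Rayleigh fading, the per-path power gain |h|^2 is exponential.
    branch_snrs = rng.exponential(scale=mean_snr, size=(trials, n_paths))
    combined = branch_snrs.sum(axis=1)  # ideal combining adds branch SNRs
    return 10 * np.log10(np.percentile(combined, 5))

for n in (2, 4):
    print(f"{n} transmit paths -> 5th-percentile SNR: {cell_edge_snr_db(n):+.1f} dB")
```

Under those assumptions, going from two to four paths lifts the worst-case (5th-percentile) SNR by several dB while doing little for the already-strong peaks — the “plateaus instead of peaks and valleys” effect described above.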

Gigaom first got the scoop on T-Mobile’s plans last June, when one of its vendors, Nokia Solutions and Networks, confirmed to me that T-Mobile planned on deploying the antenna array technology. At the time, T-Mobile wouldn’t even acknowledge that it was using 4×2 MIMO, but this week T-Mo VP of Technology Mark McDiarmid confirmed to me that T-Mobile is in the process of rolling it out in multiple cities across its network this year as part of a larger LTE upgrade.

“We do see the benefits 4×2 MIMO offers and will be deploying this in many cities in 2014 as part of our Wideband LTE rollout,” McDiarmid said in a statement to Gigaom. “All of T-Mobile’s available devices currently support 4×2 MIMO and we’ll ensure that new devices will as well. We believe this will be one of the first deployments by a top carrier network in the US.”
Source: Flickr / swruler9284

Sprint is performing trials of a similar technology called 8T8R, which actually creates eight transmit paths as opposed to T-Mobile’s four, and will incorporate it into future upgrades to its new tri-band Spark network.

Historically, T-Mobile has trailed its competitors in launching new generations of network technology. It was the last to get 3G and the last to start rolling out LTE, but once it got started it took advantage of its newer network equipment to surpass its rivals. It built the fastest 3G network in the U.S. in 2011, and with 4×2 MIMO it’s now among the pioneers in one of the latest advancements in 4G networking.

T-Mobile wouldn’t offer any details as to where the upgraded network is now live, but Gigaom’s favorite network tracker, Milan Milanovic, found evidence of 4×2 in the wild in Chicago, Dallas and San Antonio by polling fellow network testers on Howard Forums. This screenshot supplied by forums user besweeet shows an iPhone in engineering mode in San Antonio, with the arrow indicating four transmit signals from the tower.

Source: Howard Forums user besweeet

What does this mean to me?

So if you’re a T-Mobile subscriber with an LTE handset, 4×2 MIMO basically means you’re going to get a more resilient connection as you move throughout the network. You won’t actually see your peak speeds improve, but you’ll be able to maintain a fast, consistent connection far more often, even when the network starts getting crowded.

According to Nokia Networks’ head of technology for North America, Petri Hautakangas, at the cell edge — those fringe areas of the network where your connection often suffers the most — you could see a 50 percent to 60 percent boost in download speeds and as much as a 100 percent increase in upload speeds.
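
As a back-of-the-envelope illustration of what those percentages mean for a user — the baseline speeds here are hypothetical, not T-Mobile measurements:

```python
# Hypothetical cell-edge baseline: 4 Mbps down, 1 Mbps up (invented figures).
down_now, up_now = 4.0, 1.0  # Mbps

down_boosted = down_now * 1.5  # low end of Nokia's quoted 50-60% gain
up_boosted = up_now * 2.0      # Nokia's quoted "as much as 100%" gain

print(f"Downlink: {down_now:.1f} -> {down_boosted:.1f} Mbps")
print(f"Uplink:   {up_now:.1f} -> {up_boosted:.1f} Mbps")
```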

That boost provides a lot of advantages to T-Mobile as well as its customers. By connecting more customers throughout its network at faster speeds, T-Mobile increases its overall data capacity considerably, meaning it will take a lot more traffic to congest its network.

As for where the upgrade heads next, the locations of the three sightings we've had so far provide a hint: they’re all on Nokia-built networks. Nokia’s systems are concentrated in the interior of the U.S., while Ericsson holds T-Mobile’s contract for most East and West Coast cities. If Nokia has the jump on Ericsson with this new technology, it might take a while before 4×2 MIMO arrives in New York or San Francisco.

In any case, this technology is an important step for mobile networking, demonstrating the subtle shift away from building faster networks to building better networks. The 5-10 Mbps speeds we typically see on a smartphone today are plenty fast. But providing a consistent 5-10 Mbps connection no matter where you go in the network? That’s where the mobile industry should be heading.

Sunday, April 27, 2014

FTC Comes Out In Favor Of Tesla Direct Sales, Against Dealer-Backed Bans

As reported by Green Car Reports:  The ongoing fight between Tesla Motors and car dealers across the country has spilled from the headlines into the legal system, but so far, the outcome is far from certain. Will Tesla be allowed to sell its cars directly to consumers? Or is there some state interest in forcing Tesla into the dealer franchise model America's major carmakers use?

The Federal Trade Commission (FTC) weighed in through its "Competition Matters" blog, making it plain that the agency supports the Tesla direct sales approach, likening it to past technological advances in consumer-business relations.

"In this case and others, many state and local regulators have eliminated the direct purchasing option for consumers, by taking steps to protect existing middlemen from new competition. We believe this is bad policy for a number of reasons," wrote Andy Gavil, Debbie Feinstein, and Marty Gaynor in the FTC's "Who decides how consumers should shop?" posting to the Competition Matters blog.

The strong statement of policy is not a change to any law or regulation, but it does clearly indicate the FTC's stance on the matter. Gavil is the director of the FTC's Office of Policy Planning, Feinstein is director of the Bureau of Competition, and Gaynor is director of the Bureau of Economics.

The post continues, "Dealers contend that it is important for regulators to prevent abuses of local dealers. This rationale appears unsupported, however, with respect to blanket prohibitions of direct sales by manufacturers. And, in any event, it has no relevance to companies like Tesla. It has never had any independent dealers and reportedly does not want them."

Tesla CEO Elon Musk has explained in the past why Tesla doesn't want conventional car dealers. Though noting that it would be an easier path for Tesla, Musk thinks that conventional car dealers would have a conflict of interest in conveying the benefits of electric cars, since they would still rely on conventional (gasoline-burning) cars for the majority of their sales and profits.

Tesla's battle for direct sales is framed by existing franchise laws that prohibit anyone not licensed as a car dealer from selling vehicles to the public. Laws vary from state to state, but in all, 48 states have some version of the restriction.

The FTC appears to take issue not with those laws, but with how they're being used, and with the direct-sales bans being passed in several states.

"Regulators should differentiate between regulations that truly protect consumers and those that protect the regulated," the post continued.

Tesla now has more than 50 stores and galleries in the U.S., with six more due to open soon. Over 40 service centers are also currently in operation, with another 23 planned.                                         

Verizon, AT&T Will Face Bidding Limits In Incentive Auction

As reported by GigaOm: Last week the Federal Communications Commission laid out all of its proposed rules for next year’s controversial broadcast airwave incentive auction, save one. It didn't address the most contentious rule of them all: whether the country’s two mega-carriers, AT&T and Verizon, will have free rein in the auction or face restrictions on how many airwaves they can buy.

The FCC is now taking a whack at the political piñata, and AT&T and Verizon aren't going to be pleased with what comes out. On Thursday, FCC Chairman Tom Wheeler began circulating proposed rules for low-band spectrum auctions — of which the incentive auction is most definitely one — that would limit Verizon and AT&T’s ability to bid on all licenses in markets where competition for frequencies is particularly intense.

What that means is that in areas where there’s the most demand for mobile broadband airwaves, such as the big cities, the FCC will set aside up to 30 MHz of airwaves for carriers that don’t already own a lot of low-band spectrum. The rules aren’t exactly a surprise since Wheeler has been leaning in this direction for months, though they’re likely to get overshadowed by the FCC’s controversy du jour, net neutrality.

Low-band spectrum is valuable because of its propagation — it can reach long distances in rural areas and punch through walls in dense metro areas. Most of the low-band spectrum in use in the U.S. today is owned by, you guessed it, Verizon and AT&T, both of which have tapped 700 MHz for the backbones of their LTE networks.

Wheeler elaborated in the FCC’s blog:
“… two national carriers control the vast majority of that low-band spectrum.  This disparity makes it difficult for rural consumers to have access to the competition and choice that would be available if more wireless competitors also had access to low-band spectrum.  It also creates challenges for consumers in urban environments who sometimes have difficulty using their mobile phones at home or in their offices.
To address this problem, and to prevent one or two wireless providers from being able to run the table at the auction, I have proposed a market based reserve for the auction.”

The nitty gritty

The way the auction would work under the FCC’s proposal is that in any given market, all carriers would bid freely for the 600 MHz airwaves. But once bidding hits a particular trigger point indicating high demand for those licenses, the FCC would basically split the auction in two, creating a reserved chunk of airwaves of up to 30 MHz that only smaller carriers like Sprint, T-Mobile and regional operators could bid on. The unreserved portion would remain open to all bidders.
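
Here’s a minimal sketch of that split, based only on the description above. The trigger level, eligibility threshold and carrier holdings below are invented placeholders — the commission hadn’t yet settled any of those mechanics:

```python
TRIGGER_DOLLARS = 1_000_000_000  # hypothetical per-market demand trigger
RESERVE_MHZ = 30                 # cap on the set-aside, per the proposal

def split_market(total_mhz, bids_so_far, low_band_holdings_mhz, threshold_mhz=45):
    """Once bidding in a market passes the trigger, carve out a reserved
    block that only carriers under the low-band holdings threshold may bid
    on; the rest stays open to everyone."""
    if bids_so_far < TRIGGER_DOLLARS:
        return {"open_mhz": total_mhz, "reserved_mhz": 0, "reserve_eligible": []}
    reserved = min(RESERVE_MHZ, total_mhz)
    eligible = [c for c, mhz in low_band_holdings_mhz.items() if mhz < threshold_mhz]
    return {"open_mhz": total_mhz - reserved, "reserved_mhz": reserved,
            "reserve_eligible": eligible}

# Invented holdings, purely for illustration:
holdings = {"Verizon": 70, "AT&T": 60, "T-Mobile": 12, "Sprint": 14}
print(split_market(total_mhz=70, bids_so_far=1_200_000_000,
                   low_band_holdings_mhz=holdings))
```

Note how the eligibility test is per-market: a carrier with heavy low-band holdings in one region could still be reserve-eligible in another, which matches the FCC officials’ point below.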

Verizon and AT&T wouldn’t necessarily face restrictions in every market. It all depends on the extent of their low-band holdings in any given region. There are even a few geographical cases where regional carriers like U.S. Cellular hold enough 700 MHz spectrum that they would be excluded from the reserve camp, FCC officials said.


FCC Commissioners (L to R): Commissioner Ajit Pai, Commissioner Mignon Clyburn, Chairman Tom Wheeler, Commissioner Jessica Rosenworcel and Commissioner Michael O’Rielly (Source: FCC)

The rules certainly aren't final. In May they go before the full commission, which will decide on specific mechanisms, such as the auction stage at which reserve bidding would be triggered and what percentage of licenses in any given market could be reserved. The commission could also change the rules entirely, easing the restrictions on AT&T and Verizon or tossing them out altogether. Those carriers are putting a lot of political pressure on the FCC and Congress for a fully open auction, and AT&T has even threatened to sit the whole auction out.

AT&T may just be bluffing, but the threat has to give the FCC some pause. A major bidder sitting out the auction wouldn’t just mean less revenue for the government; it could cause the entire auction to fail. The way this complex auction is structured (I spell out all the details here), the broadcasters currently using the UHF band would agree to part with their TV channels, but only if their selling prices are met. The fewer bidders there are to buy those repurposed airwaves, the less likely the auction is to meet those prices.

We’re still a year away from the first bids being placed, and it’s becoming increasingly clear there’s no way the FCC will be able to make all the various broadcasters, carriers, politicians and public interest groups involved happy. It’s just a question of whether it can make enough of them happy to actually pull the auction off.

Musk’s SpaceX to Sue Over Lockheed-Boeing Launch Monopoly


As reported by Bloomberg Businessweek: Elon Musk’s space company will sue the U.S. Air Force to protest a Lockheed Martin Corp.-Boeing Co. team’s monopoly on Pentagon satellite launches, the billionaire said today.

“These launches should be competed,” he told reporters at the National Press Club in Washington. “If we compete and lose, that is fine. But why would they not even compete it?”

Musk’s Space Exploration Technologies Corp., known as SpaceX, is trying to break the joint venture’s lock on U.S. military satellite launches, which have an estimated value of $70 billion through 2030. He has said competition in that market may save taxpayers more than $1 billion a year.



SpaceX, based in Hawthorne, California, plans to file its suit Monday in the U.S. Court of Federal Claims. It seeks to reopen competition for a military contract awarded to the joint venture, United Launch Alliance LLC, for 36 rocket cores, said Ian Christopher McCaleb, senior vice president at Levick, a public relations firm representing SpaceX.

Taxpayer Cost

The Air Force agreed to the bulk purchase of the main rocket components last year in an attempt to hold down costs. “This contract is costing U.S. taxpayers billions of dollars for no reason,” said Musk, who earlier today made a presentation at the U.S. Export-Import Bank’s annual conference.

Mark Bitterman, a spokesman for United Launch Alliance, said the military’s “robust acquisition and oversight process,” as well as the company’s improved performance, led to $4 billion in savings compared with prior acquisition approaches.

The joint venture recognizes the Pentagon’s “plan to enable competition and is ready and willing to support missions with the same assurance that we provide today,” Bitterman said in an e-mail.

Matthew Stines, an Air Force spokesman, said in an e-mail that the service has “no formal statement” on Musk’s announcement of the SpaceX lawsuit.

Russian Engines

SpaceX will require three successful launches as part of the process to win U.S. certification, the service has said. Technical reviews and audits of the proposed rockets, ground systems and manufacturing process also are needed, according to the Air Force.


Musk, also chairman and chief executive officer of Tesla Motors Inc., told U.S. lawmakers last month that the Lockheed-Boeing venture’s Atlas V rockets use engines from Russia, posing supply risks following that country’s invasion of Crimea in Ukraine.

The U.S. and Europe have been considering a possible expansion of sanctions against Russia.

Pentagon officials have asked the Air Force to review whether the use of Russian engines for the military launches poses a national security risk. 

Friday, April 25, 2014

The FCC Doesn’t Want To Destroy Net Neutrality, But It’s Going To Anyway

As reported by GigaOm: The Federal Communications Commission doesn't want companies like Netflix or Viacom to have to pay to get their content to end users of broadband networks, but it doesn't see a way (or maybe even a reason) to ban the practice.

In a call with reporters on Thursday, FCC officials laid out the agency’s thinking on new network neutrality rules and tried to address concerns that the internet as we know it is broken.

The agency’s hope is to have new rules in place by the end of this year. It plans to release a public document called a Notice of Proposed Rulemaking (NPRM) outlining its thinking and asking questions about the new rules at its May 15 open meeting, three weeks from now. Once the documents are released, the public will have a chance to comment on them.

What was once unreasonable discrimination now becomes commercially unreasonable

Since some of the content of that document was released Wednesday, the media and public interest groups have been concerned about what the new network neutrality framework would allow — namely, how the agency planned to ensure that ISPs won’t discriminate against the packets flowing across their networks. The answer? The agency will replace the “unreasonable discrimination” clause from the original net neutrality rules that were defeated in court this year with standards associated with “commercial reasonableness.”

It’s a subtle shift, but an important one. When the U.S. Court of Appeals gutted the Open Internet Order that set forth the net neutrality rules in January, it did so on the basis that the agency didn’t use the right justification for its rules. The FCC had tried to turn ISPs into common carriers and regulate them that way, but the court declared that the agency couldn’t put that burden on ISPs without changing the law or going through a regulatory process that was bound to cause a fight.

Instead we get a compromise by which the FCC attempts to honor the original intent of the 2010 Open Internet Order with a new test for discrimination. That test is the “commercial reasonableness” standard. Here’s how the FCC wants to do it.

If the devil is in the details, here are the details

First, the net neutrality rules that were gutted by the courts made a distinction between wireline broadband and wireless broadband. For a history on why, check out this post or this one. The FCC plans to keep those distinctions intact for the new rules. With this understanding, let’s hit the three main topics the FCC plans to cover, saving the most complicated element for last.

Transparency: Both the original and the new Open Internet Order make a provision for transparency, namely that network operators must share with consumers how they are managing their network traffic. This applies to both wireline and wireless networks, so if your ISP is treating certain traffic differently, it has to tell you. The FCC’s upcoming documents also ask if this transparency could go further.

When asked if the order could require greater transparency about company networks such as how congested they might be or if ISPs are charging for prioritization or access because the market is uncompetitive, an FCC official said, “The answer is yes.” He added that the agency believes that greater transparency will help consumers and the commission determine how the broadband networks are functioning. That’s a pretty exciting promise if the FCC can wrangle that type of data from ISPs. Right now, ISPs view that data as competitive and proprietary.

An AT&T network operations center. How much transparency is enough?

Blocking: The courts struck down the original order’s anti-blocking provision, which said ISPs on wireline networks couldn't block lawful traffic and wireless ISPs couldn't block competing over-the-top calling and texting services. The new FCC documents will make the case that blocking traffic interrupts the “virtuous cycle” of broadband access: people use broadband because it gives them access to a variety of services, and because broadband access is beneficial, anything that makes people less inclined to use broadband would cause harm.

This new reasoning would allow the FCC to implement a no-blocking position without resorting to calling ISPs common carriers. Another interesting tidbit here is that the FCC plans to ask about establishing a baseline of broadband service and view anything that goes below this baseline as blocking. This might seem esoteric, but in 2007 when Comcast was interfering with the delivery of BitTorrent packets, it argued that it wasn't actually blocking them. Instead it was delaying delivery so the routers in effect dropped the packets and customers couldn't access their files.


Commercial reasonableness: Here is the heart of last night’s controversy and where the FCC is walking its finest line. The agency wants to ensure that the spirit of network neutrality lives on, but legally it has to use a standard that opens the door to prioritization. The FCC even seems okay with prioritization in certain cases, with an agency official offering up the example of packets coming from a connected heart monitor as a protected class that could be prioritized over other traffic.

However, it will seek to avoid the obvious cases, such as Netflix having to pay an ISP to see its traffic prioritized over another content provider’s. It will do this using the standards the FCC set forth in a 2011 cell phone roaming order that has been tested in court. As part of that order, which dictated that mobile carriers have an obligation to offer roaming agreements to other such providers on “commercially reasonable” terms, the agency created a class of behaviors that were commercially unreasonable. To judge whether a practice crosses that line, the FCC plans to ask questions such as these:
  • Does this practice have an impact on future and present competition?
  • How does vertical integration affect any deals and what is the impact on unaffiliated companies?
  • What is the impact on consumers, their free exercise of speech and on civic engagement?
  • Are the parties acting in good faith? For example, is the ISP engaged in a good-faith negotiation?
  • Are there technical characteristics that would shed light on an ISP practice that is harmful?
  • Are there industry practices that can shed light on what is reasonable?
  • And finally, a catch-all: are there any other factors that should be considered as part of the totality of the facts?
FCC Commissioners (L to R): Commissioner Ajit Pai, Commissioner Mignon Clyburn, Chairman Tom Wheeler, Commissioner Jessica Rosenworcel and Commissioner Michael O’Rielly (Source: FCC)

Of course, one challenge with this format is that it requires an ISP to behave badly before the FCC can act. The agency said it will be on the lookout for such violations and will accept both formal and informal complaints. Once a problem is registered with the FCC, the agency will ask how it should handle the complaint and whether a time limit should be imposed for a resolution.

Finally, the official acknowledged that the agency asks in its documents if there is ever a reason for a flat prohibition against certain behaviors even if an ISP isn’t a common carrier. The agency would have to make the case that paid prioritization is such a consumer or industry harm that it should be prohibited altogether. But based on the thinking and attention devoted to the commercial unreasonableness standard, as well as the heart rate monitor example, it feels like the FCC isn't keen to walk this path.

So these are the topics and questions on which the FCC will vote on May 15 and, if approved, put out for public comment. At that point the agency typically offers a 30- or 90-day comment period.


So get ready, internet: the FCC does want to know your stance on this issue.

Thursday, April 24, 2014

Apple Tech Uses Wi-Fi Access Points For Indoor Navigation, 3D Positioning

As reported by Apple Insider: While most mobile devices rely on GPS for mapping and navigation, the system only works outdoors and in range of satellite timing signals. However, new technology from Apple could extend accurate positioning indoors without the need for additional hardware beyond existing Wi-Fi infrastructure.

A patent granted to Apple by the U.S. Patent and Trademark Office on Tuesday describes a robust system that combines GPS, Wi-Fi access points and onboard location databases to provide mobile devices with accurate positioning data in nearly any environment.

According to Apple's U.S. Patent No. 8,700,060 for "Determining a location of a mobile device using a location database," the method employs location estimation through the successful communication with one or multiple Wi-Fi access points.

By calculating a number of factors, including access point filtering, hardware communication range and so-called "presence areas," a mobile device can narrow down its position on a map with relative precision. This includes products without GPS receivers.

One of the first steps in Apple's patent calls for a location-aware device or devices (with GPS capabilities) to transmit their position to a first Wi-Fi access point, which in turn relays the information to a server-based location system. From this data, the system can then estimate the approximate location, or "presence areas," of other devices within the communication range of the access point.

To calculate these presence areas, the system may use any number of analyses, including averaging the geographic locations reported by location-aware mobile devices, the signal strength of a given access point and the surrounding building architecture, among other variables. Presence areas may be selected in a multi-pass process by filtering out candidates based on "popularity, stability, longevity, and freshness."
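
To make the multi-pass idea concrete, here is a hypothetical sketch of such a filter. The criteria names come from the patent, but the data model, field mapping and thresholds are invented for illustration:

```python
import time
from dataclasses import dataclass

@dataclass
class PresenceArea:
    lat: float
    lon: float
    radius_m: float
    report_count: int   # "popularity": how many devices have reported it
    first_seen: float   # "longevity"/"stability": epoch seconds of first report
    last_seen: float    # "freshness": epoch seconds of most recent report

def filter_presence_areas(areas, now=None, min_reports=5,
                          min_age_s=86_400, max_stale_s=604_800):
    """Multi-pass filter: drop unpopular candidates first, then ones too
    new to trust, then ones that have gone stale."""
    now = time.time() if now is None else now
    areas = [a for a in areas if a.report_count >= min_reports]      # popularity
    areas = [a for a in areas if now - a.first_seen >= min_age_s]    # longevity
    areas = [a for a in areas if now - a.last_seen <= max_stale_s]   # freshness
    return areas

# Example: one well-established area passes, one brand-new area is dropped.
now = time.time()
areas = [PresenceArea(37.33, -122.03, 50, report_count=12,
                      first_seen=now - 30 * 86_400, last_seen=now - 3_600),
         PresenceArea(37.33, -122.04, 50, report_count=2,
                      first_seen=now - 600, last_seen=now - 60)]
print(len(filter_presence_areas(areas)))  # -> 1
```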

Loaded with data, the system can plot out connected mobile devices in cells on a geographic grid. Each cell acts as a container for presence areas and corresponding access points. As seen in the image above, location-aware devices are represented as black triangles that are within or nearby presence areas denoted by circles.

One way a mobile device can calculate its location is by detecting multiple presence areas and averaging the locations of those close by, while discarding data from "outliers" farthest away from a given position. Following processing, the device can then display its average location on a mapping app.
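
Here is a minimal sketch of that averaging step. The flat-earth distance approximation and the fraction of outliers to discard are my assumptions, not the patent's:

```python
import math

def estimate_position(centers, drop_fraction=0.25):
    """Estimate a device's position from nearby presence-area centers
    (lat, lon), discarding the farthest outliers before averaging."""
    n = len(centers)
    # Provisional reference point: centroid of every detected center.
    ref_lat = sum(lat for lat, _ in centers) / n
    ref_lon = sum(lon for _, lon in centers) / n

    def dist_m(c):
        # Flat-earth approximation; adequate at building scale.
        dlat = (c[0] - ref_lat) * 111_000
        dlon = (c[1] - ref_lon) * 111_000 * math.cos(math.radians(ref_lat))
        return math.hypot(dlat, dlon)

    keep = max(1, int(n * (1 - drop_fraction)))
    kept = sorted(centers, key=dist_m)[:keep]
    return (sum(lat for lat, _ in kept) / len(kept),
            sum(lon for _, lon in kept) / len(kept))

# Three tightly clustered presence areas plus one far-away outlier,
# which gets discarded before the final average:
print(estimate_position([(37.3320, -122.0310), (37.3321, -122.0312),
                         (37.3319, -122.0309), (37.4000, -122.1000)]))
```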

Alternatively, an access point can send position information about other access points nearby, including only those that are within a mobile device's area of interest. This method of filtering is also used to approximate margin of error, which is denoted by a radius or radii extending from a focal point within a presence area.

In addition, Apple's method accounts for three-dimensional space by taking into consideration altitude data from devices supporting such GPS metrics.



From left: Multi-pass analysis, multi-pass analysis with outlier, and 3D positioning grid.

Tuesday's patent is similar to technology created by "indoor GPS" firm WifiSLAM, which Apple purchased in March 2013 for about $20 million. WifiSLAM's system relies largely on Wi-Fi signals to accurately position mobile devices while indoors and does not require GPS to operate.

Apple's patent for a Wi-Fi-based positioning system was first filed in 2010 and credits Ronald K. Huang as its inventor.