As reported by Wired: Your next iPhone will be even better at guessing what you want to type before you type it. Or so say the technologists at Apple.
Let’s say you use the word “play” in a text message. In the latest version of the iOS mobile operating system, “we can tell the difference between the Orioles who are playing in the playoffs and the children who are playing in the park, automatically,” Apple senior vice president Craig Federighi said Monday morning during his keynote at the company’s annual Worldwide Developer Conference.
Like a lot of big tech companies, Apple is deploying deep neural networks, networks of hardware and software that can learn by analyzing vast amounts of data. Specifically, Apple uses “long short-term memory” neural networks, or LSTMs. They can “remember” the beginning of a conversation as they’re reading the end of it, making them better at grasping context.
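In rough terms, an LSTM carries a “memory cell” from word to word, and small learned “gates” decide what gets written into it, what gets erased, and what gets revealed at each step. The standard formulation (Apple hasn’t published the exact variant it uses) looks like this:

\[
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{forget gate} \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{input gate} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{output gate} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate memory} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{updated memory cell} \\
h_t &= o_t \odot \tanh(c_t) && \text{output}
\end{aligned}
\]

The persistent cell state \(c_t\) is what lets the network “remember” the Orioles from the start of a sentence while it reads the end of it.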
Google uses a similar method to drive Smart Reply, which suggests responses to email messages. But Apple’s “QuickType”—that’s what the company calls its version—shows that not only is Apple pushing AI onto personal devices, it’s pushing harder than Federighi let on.
Today, on its website, Apple also introduced an application programming interface, or API, that lets outside businesses and coders use a similar breed of neural network. This tool, Basic Neural Network Subroutines, or BNNS, is a “collection of functions that you can use to construct neural networks” on a wide range of Apple operating systems, including iOS as well as OS X (for desktops and laptops), tvOS (for TVs), and watchOS (for watches), according to the documentation. “They’re making it as easy as possible for people to add neural nets to their apps,” says Chris Nicholson, CEO and founder of deep learning startup Skymind.
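Judging from Apple’s documentation, BNNS is a C-level API in the Accelerate framework, built around “filters” that each represent one layer of a network. The sketch below, which assumes the published BNNS function and struct names, builds and runs a single fully connected layer with placeholder weights; a real app would substitute parameters from a pre-trained model:

```c
// Minimal BNNS sketch: one fully connected layer (3 inputs -> 2 outputs, ReLU).
// Weights and biases here are placeholders, not a trained model.
// Build with: clang demo.c -framework Accelerate
#include <Accelerate/Accelerate.h>
#include <stdio.h>

int main(void) {
    // Describe the layer's input and output vectors.
    BNNSVectorDescriptor in_desc  = { .size = 3, .data_type = BNNSDataTypeFloat32 };
    BNNSVectorDescriptor out_desc = { .size = 2, .data_type = BNNSDataTypeFloat32 };

    // Row-major weight matrix (out_size x in_size) and per-output biases.
    float weights[6] = { 0.1f, 0.2f, 0.3f, 0.4f, 0.5f, 0.6f };
    float bias[2]    = { 0.0f, 0.0f };

    BNNSFullyConnectedLayerParameters layer_params = {
        .in_size    = 3,
        .out_size   = 2,
        .weights    = { .data = weights, .data_type = BNNSDataTypeFloat32 },
        .bias       = { .data = bias,    .data_type = BNNSDataTypeFloat32 },
        .activation = { .function = BNNSActivationFunctionRectifiedLinear },
    };

    // NULL filter parameters selects the default allocator and threading.
    BNNSFilter filter = BNNSFilterCreateFullyConnectedLayer(&in_desc, &out_desc,
                                                            &layer_params, NULL);
    if (filter == NULL) return 1;

    // Run inference entirely on-device: no server round trip.
    float input[3]  = { 1.0f, 2.0f, 3.0f };
    float output[2] = { 0.0f, 0.0f };
    BNNSFilterApply(filter, input, output);

    printf("output: %f %f\n", output[0], output[1]);
    BNNSFilterDestroy(filter);
    return 0;
}
```

Larger networks are assembled the same way: create one filter per layer and chain each filter’s output into the next, with convolution and pooling filters covering the image-recognition cases the library seems aimed at.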
For now, BNNS looks better suited to identifying images than to understanding natural language. But either way, neural networks don’t typically run on laptops and phones. They run atop computer servers on the other side of the Internet, then deliver their results to devices across the wire. (Google just revealed that it has built a specialized chip that executes neural nets inside its data centers before sending the results to your phone.) Apple wants coders to build neural nets that work even without a connection back to the ’net, and that’s unusual. Both Google and IBM have experimented with the idea, but Apple is doing it now.
It might not work. Apple doesn’t provide a way of training the neural net, the process by which it actually learns a task from data. The new Apple API is just a way of executing the neural net once it’s trained. Coders, Nicholson says, will have to handle training on their own or use pre-trained models from some other source. Plus, no one yet knows how well Apple’s neural nets will run on a device as small as a phone or a watch; they may demand more processing power and battery life than such devices can provide. But those are details: one day, neural nets will work on personal devices, and Apple is moving toward that day.
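Concretely, the split Nicholson describes means training a model in some other framework, exporting the learned parameters as flat arrays, and bundling them with the app for BNNS to execute. A hypothetical loader for such an exported file (the raw-float32 format is an assumption for illustration; BNNS itself defines no model file format) might look like:

```c
// Hypothetical loader for weights trained offline and exported as raw
// float32 values. The file layout here is an illustrative assumption.
#include <stdio.h>
#include <stdlib.h>

float *load_weights(const char *path, size_t count) {
    FILE *f = fopen(path, "rb");
    if (f == NULL) return NULL;

    float *w = malloc(count * sizeof *w);
    if (w != NULL && fread(w, sizeof *w, count, f) != count) {
        free(w);   // short read: file doesn't hold `count` floats
        w = NULL;
    }
    fclose(f);
    return w;   // caller hands this buffer to a BNNS layer, then free()s it
}
```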