As reported by IoT M2M Council: Intel CEO Bob Swan kicked off this week's Consumer Electronics Show (CES) in Las Vegas by announcing breakthroughs in artificial intelligence (AI) that pave the way for autonomous driving, a new era of mobile computing innovation, and the future of immersive sports and entertainment.
Intel demonstrated all of these and more at CES, showing how the company is infusing intelligence across the cloud, network, edge and PC, and driving positive impact for people, business and society.
Swan shared updates from Intel's Mobileye business, including a demonstration of its self-driving robo-car navigating traffic in a natural manner. The drive demonstrated Mobileye's approach to delivering safer mobility for all through a combination of artificial intelligence, computer vision, the responsibility-sensitive safety (RSS) regulatory science model, and redundancy through independent sensing systems.
Swan also highlighted Intel's work with the American Red Cross and its Missing Maps project to improve disaster preparedness. Using integrated AI acceleration on second-generation Xeon scalable processors, Intel is helping the project build highly accurate maps of bridges and roads in remote regions of the world, which helps emergency responders in the event of a disaster.
"At Intel, our ambition is to help customers make the most of technology inflections like AI, 5G and the intelligent edge so that together we can enrich lives and shape the world for decades to come,” said Swan. “As we highlighted today, our drive to infuse intelligence into every aspect of computing can have positive impact at unprecedented scale."
Mobile computing was an area of emphasis, as Intel made announcements spanning products, partnerships and platform-level innovations to transform the way people focus, create and engage.
Intel executive vice president Gregory Bryant gave a first look and demonstration of the latest Intel Core mobile processors, code-named Tiger Lake. Tiger Lake is designed to bring Intel's people-led vision for mobile computing to life.
With optimizations spanning the CPU, AI accelerators and discrete-level integrated graphics based on the Intel Xe graphics architecture, Tiger Lake should deliver double-digit performance gains, AI performance improvements, a leap in graphics performance and four times the throughput of USB 3 via integrated Thunderbolt 4. Built on Intel's 10nm+ process, the first Tiger Lake systems are expected to ship this year.
Intel vice president of architecture for graphics and software Lisa Pearce provided insight into the progress on the Intel Xe graphics architecture, which will provide performance gains in Tiger Lake, and previewed Intel's first Xe-based discrete GPU, code-named DG1.
Updates were announced on Intel's Project Athena innovation program, including the first Project Athena-verified Chromebooks. Project Athena-verified designs have been tuned, tested and verified to deliver system-level innovation and benefits spanning battery life, consistent responsiveness, instant wake, application compatibility and more.
Intel has verified 25 Project Athena designs to date, and Bryant announced an expanded partnership with Google that has already resulted in the first two Project Athena-verified Chromebooks, the Asus Chromebook Flip (C436) and the Samsung Galaxy Chromebook. Intel expects to verify approximately 50 more designs across Windows and Chrome this year and deliver a target specification for dual-screen PCs.
Through deepened co-engineering efforts with OEM partners, Intel helps deliver category-defining devices based on Intel Core processors. This includes dual-screen and foldable designs such as the Lenovo ThinkPad X1 Fold, which leverages the Intel Core processor with Intel Hybrid Technology (code-named Lakefield) expected to ship midyear, and the Dell Concept Duet. Bryant also previewed the company's latest concept device, a foldable OLED display form factor, code-named Horseshoe Bend.
The data center is the force that delivers intelligence to businesses around the world, and Xeon scalable processors continue to be its foundation. Intel executive vice president Navin Shenoy announced that third-generation Xeon scalable processors, coming in the first half of 2020, will include DL Boost extensions for built-in AI training acceleration, providing up to a 60% increase in training performance over the previous family.
Shenoy highlighted several ways Intel is threading intelligence into data platforms across cloud, network and edge, and how this is transforming sports and entertainment. For example, Netflix has used the latest video compression technology, AV1, to enhance its media streaming services and bring content to life across the globe, with up to 60% better compression efficiency than the previous compression technology.
Intel and Netflix's joint efforts continue with the development of an open-source high-performance encoder (SVT-AV1), optimized on second-generation Xeon scalable processors, that delivers quality and performance gains, making it viable for commercial deployment.
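For readers who want to try the encoder mentioned above, the open-source SVT-AV1 project ships a reference command-line application, SvtAv1EncApp. The following is a minimal sketch, not Intel's or Netflix's production pipeline; the file names, resolution and preset are placeholders, and exact option names can vary between SVT-AV1 releases.

# Minimal sketch: drive the open-source SVT-AV1 reference encoder (SvtAv1EncApp)
# from Python. Paths, resolution and preset are illustrative placeholders;
# check `SvtAv1EncApp --help` for the exact options in your SVT-AV1 version.
import subprocess

def encode_av1(raw_yuv: str, out_ivf: str, width: int, height: int, fps: int = 30) -> None:
    """Encode a raw YUV 4:2:0 clip to an AV1 bitstream in an IVF container."""
    cmd = [
        "SvtAv1EncApp",
        "-i", raw_yuv,          # raw YUV input
        "-w", str(width),       # frame width
        "-h", str(height),      # frame height
        "--fps", str(fps),      # frame rate
        "--preset", "8",        # speed/quality trade-off (lower = slower, higher quality)
        "-b", out_ivf,          # output AV1 bitstream
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    encode_av1("clip_1080p.yuv", "clip_1080p.ivf", 1920, 1080)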
A claimed first-of-its-kind in computer vision, 3D Athlete Tracking (3DAT) uses AI to enhance the viewing experience with near real-time insights and visualizations. 3DAT uses highly mobile cameras to capture the form and motion of athletes, then applies algorithms optimized with DL Boost and powered by Xeon scalable processors to analyse the biomechanics of athletes' movements.
Shenoy announced that this technology would enhance replays of the 100m and other sprinting events at the Olympic Games in Tokyo later this year.
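To give a sense of the kind of insight such a replay overlay can surface, the sketch below estimates a sprinter's per-frame and top speed from a tracked position time series. It is purely illustrative and is not Intel's 3DAT pipeline; the hip-centre coordinates, frame rate and sample values are hypothetical, and a real system would first need pose estimation and camera-to-field mapping.

# Illustrative only (not Intel's 3DAT pipeline): given per-frame (x, y) hip-centre
# positions for one sprinter, already in field coordinates (metres), estimate
# instantaneous and top speed, the kind of figure overlaid on a 100m replay.
from typing import List, Tuple
import math

def speeds(track: List[Tuple[float, float]], fps: float) -> List[float]:
    """Per-frame speed in m/s from consecutive positions sampled at `fps`."""
    out = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        out.append(math.hypot(x1 - x0, y1 - y0) * fps)
    return out

if __name__ == "__main__":
    # Hypothetical 60 fps track of a sprinter moving 0.18 m per frame.
    track = [(0.18 * i, 0.0) for i in range(10)]
    v = speeds(track, fps=60.0)
    print(f"top speed ~ {max(v):.1f} m/s")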
Intel and the sports industry are transforming the sports viewing experience with volumetric video, a progression towards enabling sports viewing without limitations. Intel True View synthesizes the entire volume of a stadium’s field to provide virtually unlimited angles, allowing fans to choose any vantage point or player perspective and stream it to their devices.
Intel and the NFL showcased the power of streaming volumetric video with a play from the Cleveland Browns versus Arizona Cardinals game. The data produced from the first quarter of an NFL game alone exceeds 3Tbyte per minute.