A new Intel chip due in December will be able to run a generative artificial intelligence chatbot on a laptop rather than having to tap into cloud data centers for computing power, the company said on Tuesday.
The capability, which Intel showed off during a software developer conference held in Silicon Valley, could let businesses and consumers test ChatGPT-style technologies without sending sensitive data off their own computer. It is made possible by new AI data-crunching features built into Intel’s forthcoming “Meteor Lake” laptop chip and by new software tools the company is releasing.
At the conference, Intel demonstrated laptops that could generate a song in the style of Taylor Swift and answer questions in a conversational style, all while disconnected from the Internet. Chief Executive Officer Pat Gelsinger said Microsoft’s “Copilot” AI assistant will be able to run on Intel-based PCs.
“We see the AI PC as a sea change moment in tech innovation,” Gelsinger said.
Intel shares were down 1.5% after the company’s presentation.
Intel executives also said the company is on track to deliver a successor chip called “Arrow Lake” next year, and that Intel’s manufacturing technology will rival the best from Taiwan Semiconductor Manufacturing Co, as the company has promised. Intel, once the industry’s leading chip manufacturer, lost that position and now says it is on track to return to the front.
Intel has struggled to gain ground against Nvidia in the market for the powerful chips used in data centers to “train” AI systems such as ChatGPT. Intel said on Tuesday that it was building a new supercomputer that would be used by Stability AI, a startup that makes image-generating software. China’s Alibaba Group Holding is using Intel’s newest central processors to serve up chatbot technology, Intel said.
But the market for chips that will handle AI work outside data centers is far less settled, and it is there that Intel aimed to gain ground on Tuesday.
Intel said that through a new version of its OpenVINO software, developers will be able to run on laptops a version of a large language model made by Meta Platforms, the class of technology behind products like ChatGPT. That will enable faster responses from chatbots and will mean that data does not leave the device.
“You can get a better performance, a lower cost and more private AI,” Sachin Katti, senior vice president and general manager of Intel’s network and edge group, told Reuters in an interview.
Dan Hutcheson, an analyst with TechInsights, told Reuters that business users who are wary of handing sensitive corporate data over to third-party AI firms might be interested in Intel’s approach.
If Intel Chief Gelsinger can make AI “so that anyone can use it, that creates a much bigger market for chips – the chips that he makes,” Hutcheson said.