<p>San Francisco: In September, Amazon said it would invest up to $4 billion in Anthropic, a San Francisco startup working on artificial intelligence.</p><p>Soon after, an Amazon executive sent a private message to an executive at another company. He said Anthropic had won the deal because it agreed to build its AI using specialised computer chips designed by Amazon.</p><p>Amazon, he wrote, wanted to create a viable competitor to the chipmaker Nvidia, a key partner and kingmaker in the all-important field of AI.</p><p>The boom in generative AI over the past year exposed just how dependent big tech companies had become on Nvidia. They cannot build chatbots and other AI systems without a special kind of chip that Nvidia has mastered over the past several years. They have spent billions of dollars on Nvidia’s systems, and the chipmaker has not kept up with the demand.</p><p>So Amazon and other giants of the industry — including Google, Meta and Microsoft — are building AI chips of their own. With these chips, the tech giants could control their own destiny. They could rein in costs, eliminate chip shortages and eventually sell access to their chips to businesses that use their cloud services.</p><p>While Nvidia sold 2.5 million chips last year, Google spent $2 billion to $3 billion building about 1 million of its own AI chips, said Pierre Ferragu, an analyst at New Street Research. Amazon spent $200 million on 100,000 chips last year, he estimated. Microsoft said it had begun testing its first AI chip.</p><p>But this work is a balancing act: competing with Nvidia while working closely with the chipmaker and its increasingly powerful CEO, Jensen Huang.</p><p>Huang’s company accounts for more than 70 per cent of AI chip sales, according to research firm Omdia. It supplies an even larger percentage of the systems used in the creation of generative AI.
Nvidia’s sales have shot up 206 per cent over the past year, and the company has added about $1 trillion in market value.</p><p>What is revenue to Nvidia is a cost for the tech giants. Orders from Microsoft and Meta made up about one-fourth of Nvidia’s sales in the past two full quarters, said Gil Luria, an analyst at the investment bank D.A. Davidson.</p><p>Nvidia sells its chips for about $15,000 each, while Google spends an average of just $2,000 to $3,000 on each of its own, according to Ferragu.</p><p>“When they encountered a vendor that held them over a barrel, they reacted very strongly,” Luria said.</p><p>Companies constantly court Huang, jockeying to be at the front of the line for his chips. He regularly appears on event stages with their CEOs, and the companies are quick to say they remain committed to their partnerships with Nvidia. They all plan to keep offering its chips alongside their own.</p><p>While the big tech companies are moving into Nvidia’s business, it is moving into theirs. Last year, Nvidia started its own cloud service where businesses can use its chips, and it is funneling chips into a new wave of cloud providers, such as CoreWeave, that compete with the big three: Amazon, Google and Microsoft.</p><p>“The tensions here are a thousand times the usual jockeying between customers and suppliers,” said Charles Fitzgerald, a technology consultant and investor.</p><p>Nvidia declined to comment.</p><p>The AI chip market is projected to more than double by 2027, to roughly $140 billion, according to research firm Gartner. Venerable chipmakers such as AMD and Intel are also building specialized AI chips, as are startups such as Cerebras and SambaNova.
But Amazon and other tech giants can do things that smaller competitors cannot.</p><p>“In theory, if they can reach a high enough volume and they can get their costs down, these companies should be able to provide something that is even better than Nvidia,” said Naveen Rao, who founded one of the first AI chip startups and later sold it to Intel.</p><p>Nvidia builds what are called graphics processing units, or GPUs, which it originally designed to help render images for video games. But a decade ago, academic researchers realized these chips were also really good at building neural networks, the systems that now drive generative AI.</p><p>As this technology took off, Huang quickly began modifying Nvidia’s chips and related software for AI, and they became the de facto standard. Most software systems that are used to train AI technologies were tailored to work with Nvidia’s chips.</p><p>“Nvidia’s got great chips, and more importantly, they have an incredible ecosystem,” said Dave Brown, who runs Amazon’s chip efforts. That makes getting customers to use a new kind of AI chip “very, very challenging,” he said.</p><p>Rewriting software code to use a new chip is so difficult and time-consuming that many companies don’t even try, said Mike Schroepfer, an adviser and former chief technology officer at Meta. “The problem with technological development is that so much of it dies before it even gets started,” he said.</p><p>Rani Borkar, who oversees Microsoft’s hardware infrastructure, said Microsoft and its peers needed to make it “seamless” for customers to move between chips from different companies.</p><p>Amazon, Brown said, is working to make switching between chips “as simple as it can possibly be.”</p><p>Some tech giants have found success making their own chips. Apple designs the silicon in iPhones and Macs, and Amazon has deployed more than 2 million of its own traditional server chips in its cloud computing data centers.
But achievements such as these take years of hardware and software development.</p><p>Google has the biggest head start in developing AI chips. In 2017, it introduced its tensor processing unit, or TPU, named after a kind of calculation vital to building AI. Google used tens of thousands of TPUs to build AI products, including its online chatbot, Google Bard. And other companies have used the chip through Google’s cloud service to build similar technologies, including high-profile startup Cohere.</p><p>Amazon is now on the second generation of Trainium, its chip for building AI systems, and has a second chip made just for serving up AI models to customers. In May, Meta announced plans to work on an AI chip tailored to its needs, although it is not yet in use. In November, Microsoft announced its first AI chip, Maia, which will focus initially on running Microsoft’s own AI products.</p><p>“If Microsoft builds its own chips, it builds exactly what it needs for the lowest possible cost,” Luria said.</p><p>Nvidia’s rivals have used their investments in high-profile AI startups to fuel use of their chips. Microsoft has committed $13 billion to OpenAI, the maker of the ChatGPT chatbot, and its Maia chip will serve OpenAI’s technologies to Microsoft’s customers. Like Amazon, Google has invested billions in Anthropic, and it is using Google’s AI chips, too.</p><p>Anthropic, which has used chips from both Nvidia and Google, is among a handful of companies working to build AI using as many specialized chips as they can get their hands on. Amazon said that if companies such as Anthropic used Amazon’s chips on an increasingly large scale and even helped design future chips, doing so could reduce the cost and improve the performance of these processors. Anthropic declined to comment.</p><p>But none of these companies will overtake Nvidia anytime soon.
Its chips may be pricey, but they are among the fastest on the market. And the company will continue to improve their speed.</p><p>Rao said his company, Databricks, trained some experimental AI systems using Amazon’s AI chips but built its largest and most important systems using Nvidia chips because they provided higher performance and played nicely with a wider range of software.</p><p>“We have many years of hard innovation ahead of us,” Amazon’s Brown said. “Nvidia is not going to be standing still.”</p>