AI Computing to Consume Power of Multiple NYCs by 2026, Says Lightmatter Co-Founder
April 18, 2025
Steven Hill
Nvidia and Partners Expand AI Data Centers Worldwide
Nvidia, along with its partners and customers, has been rapidly expanding computing facilities around the globe to meet the enormous computational demands of training massive artificial intelligence (AI) models such as GPT-4. That expansion will become even more critical as more AI models move into production, according to Thomas Graham, co-founder of the optical computing startup Lightmatter. In a recent interview in New York with Mandeep Singh, a senior technology analyst at Bloomberg Intelligence, Graham highlighted the growing need for computational resources.
"The demand for more compute isn't just about scaling laws; it's also about deploying these AI models now," Graham explained. Singh inquired about the future of large language models (LLMs) like GPT-4 and whether they would continue to grow in size. In response, Graham shifted the focus to the practical application of AI, emphasizing the importance of inferencing, or the deployment phase, which requires substantial computational power.
"If you consider training as R&D, inferencing is essentially deployment. As you deploy, you'll need large-scale computers to run your models," Graham stated during the "Gen AI: Can it deliver on the productivity promise?" conference hosted by Bloomberg Intelligence.
Graham's perspective aligns with that of Nvidia CEO Jensen Huang, who has emphasized to Wall Street that advancing "agentic" forms of AI will require not only more sophisticated training but also significantly greater inference capacity, driving an exponential increase in compute requirements.

"If you view training as R&D, inferencing is really deployment, and as you're deploying that, you're going to need large computers to run your models," said Graham. Photo: Bloomberg, courtesy of Craig Warga
Lightmatter's Role in AI Infrastructure
Founded in 2018, Lightmatter is at the forefront of developing chip technology that uses optical connections to link multiple processors on a single semiconductor die. These optical interconnects can move data faster, and with less energy, than traditional copper wires. The technology can streamline connections within and between data center racks, improving the overall efficiency and economics of the data center, according to Graham.
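To make the energy argument concrete, here is a rough, illustrative sketch in Python. The picojoule-per-bit figures are assumptions chosen only for illustration (the article quotes no numbers for either technology), but they show why shaving the energy cost of each bit matters at data-center bandwidths:

    # Illustrative link-power comparison: copper vs. optical interconnects.
    # The energy-per-bit values are ASSUMED ballpark figures for illustration;
    # the article does not quote specific numbers for either technology.

    ENERGY_COPPER_PJ_PER_BIT = 5.0   # assumed: long-reach electrical SerDes link
    ENERGY_OPTICAL_PJ_PER_BIT = 1.0  # assumed: co-packaged optical link

    def link_power_watts(bandwidth_tbps: float, pj_per_bit: float) -> float:
        """Power drawn by a link moving bandwidth_tbps terabits per second."""
        bits_per_second = bandwidth_tbps * 1e12
        return bits_per_second * pj_per_bit * 1e-12  # picojoules -> joules

    for tech, energy in (("copper", ENERGY_COPPER_PJ_PER_BIT),
                         ("optical", ENERGY_OPTICAL_PJ_PER_BIT)):
        print(f"{tech:>7}: {link_power_watts(10.0, energy):.0f} W for a 10 Tb/s link")

At a fixed bandwidth, link power scales linearly with energy per bit, so under these assumed figures a 10 Tb/s link drops from roughly 50 W to 10 W; multiplied across the millions of links in a gigawatt-class facility, that difference adds up to a meaningful share of the power budget.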
"By replacing copper traces in data centers—both on the server's printed circuit board and in the cabling between racks—with fiber optics, we can dramatically increase bandwidth," Graham told Singh. Lightmatter is currently collaborating with various tech companies on the design of new data centers, and Graham noted that these facilities are being built from the ground up. The company has already established a partnership with Global Foundries, a contract semiconductor manufacturer with operations in upstate New York, which serves clients like Advanced Micro Devices.
While Graham did not disclose specific partners and customers beyond this collaboration, he hinted that Lightmatter works with silicon providers such as Broadcom and Marvell to create custom components for tech giants like Google, Amazon, and Microsoft, which design their own data center processors.
The Scale and Future of AI Data Centers
To illustrate the magnitude of AI deployment, Graham noted that at least a dozen new AI data centers are planned or under construction, each requiring a gigawatt of power. "For context, New York City uses about five gigawatts of power on an average day. So, we're talking about the power consumption of multiple New York Cities," he said. He predicts that by 2026, AI data centers worldwide will demand 40 gigawatts of power, the equivalent of eight New York Cities.
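A quick back-of-envelope check of those figures, using only the numbers quoted above (a sketch of the arithmetic, nothing from Lightmatter itself):

    # Back-of-envelope check of the power figures quoted above.
    # Every input comes from Graham's remarks; the script only does arithmetic.

    NYC_AVG_POWER_GW = 5.0    # "about five gigawatts of power on an average day"
    PLANNED_DATACENTERS = 12  # "at least a dozen new AI data centers"
    GW_PER_DATACENTER = 1.0   # "each requiring a gigawatt of power"
    PROJECTED_2026_GW = 40.0  # Graham's 2026 projection for AI data centers

    planned_gw = PLANNED_DATACENTERS * GW_PER_DATACENTER
    print(f"Planned build-out: {planned_gw:.0f} GW "
          f"= {planned_gw / NYC_AVG_POWER_GW:.1f} New York Cities")
    print(f"2026 projection:   {PROJECTED_2026_GW:.0f} GW "
          f"= {PROJECTED_2026_GW / NYC_AVG_POWER_GW:.0f} New York Cities")

The dozen announced facilities alone come to 12 GW, about 2.4 New York Cities, and the 40 GW projection works out to exactly the eight New York Cities Graham cites.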
Lightmatter recently secured a $400 million venture capital investment, valuing the company at $4.4 billion. Graham mentioned that the company aims to start production "over the next few years."
When asked about potential disruptions to Lightmatter's plans, Graham expressed confidence in the ongoing need for expanding AI computing infrastructure. However, he acknowledged that a breakthrough in AI algorithms requiring significantly less compute or achieving artificial general intelligence (AGI) more rapidly could challenge current assumptions about the need for exponential compute growth.
Comments (20)
FrankClark
April 18, 2025 at 1:31:52 PM GMT
The idea that AI computing might consume power equivalent to multiple NYCs by 2026 is mind-blowing! It's exciting to see Nvidia pushing the boundaries, but also a bit scary thinking about the energy impact. Can't wait to see how this unfolds! 🌍💡
GaryGonzalez
April 19, 2025 at 1:09:03 PM GMT
The idea that AI computing could consume the power of multiple New York Cities by 2026 is astonishing! It's thrilling to watch Nvidia push the boundaries, but it's also a little frightening when you consider the energy impact. Looking forward to seeing how it plays out! 🌍💡
RaymondWalker
April 19, 2025 at 2:32:21 PM GMT
The idea that AI computing could consume power equivalent to multiple New York Cities by 2026 is mind-blowing! It's exciting to see Nvidia pushing the limits, but also a bit terrifying to think about the energy impact. I can't wait to see how this unfolds! 🌍💡
JamesGreen
April 19, 2025 at 2:08:23 AM GMT
The notion that AI computing could consume energy equivalent to several New Yorks by 2026 is staggering! It's exciting to watch Nvidia push the boundaries, but also a bit frightening when you think about the energy impact. Can hardly wait to see how this develops! 🌍💡
LarryWilliams
April 18, 2025 at 2:24:22 PM GMT
The idea that AI computing could consume power equivalent to several New York Cities by 2026 is mind-boggling! It's exciting to see Nvidia pushing the limits, but also a little frightening to think about the energy impact. I can't wait to see how this plays out! 🌍💡
TerryYoung
April 20, 2025 at 12:05:08 AM GMT
This AI expansion sounds insane! By 2026, the power consumption could match multiple New York Cities? That's wild! I'm excited to see what kind of AI models they'll be training, but also kinda worried about the environmental impact. Can't wait to see how this pans out! 🤯