AI data centers are becoming ‘mind-blowingly large’


The construction of ever more powerful data centers for artificial intelligence, packed with more and more GPU chips, is driving data centers to enormous size, according to the chief executive of Ciena, which makes fiber-optic networking equipment sold to cloud computing vendors to connect their data centers together.

"Some of these large data centers are just mind-blowingly large, they're enormous," says Gary Smith, CEO of Hanover, Maryland-based Ciena.

"You have data centers that are over two kilometers," says Smith, more than 1.24 miles. Some of the newer data centers are multi-story, he notes, adding a second dimension of distance on top of the horizontal sprawl.

Smith made the remarks in an interview with the financial publication The Technology Letter last week.

Even as cloud data centers grow, corporate campuses are straining to support clusters of GPUs as their size increases, Smith said.

"These campuses are getting bigger and longer," he says. The campus, which comprises many buildings, is "blurring the line between what was a wide-area network and what's inside the data center."

"You're starting to see these campuses get to fairly respectable distances, and that's putting enormous strain on the direct-connect technology."

A direct-connect system is networking that is purpose-built to let GPUs talk to one another, such as Nvidia's "NVLink" networking products.

Smith's remarks echo comments by others serving the AI industry, such as Thomas Graham, co-founder of chip startup Lightmatter, who said last month at a Bloomberg Intelligence conference that there are at least a dozen new AI data centers planned or under construction that each require a gigawatt of power to run.

"Just for context, New York City pulls five gigawatts of power on an average day, so, multiple NYCs." By 2026, Graham said, the world's AI processing is expected to require 40 gigawatts of power "specifically for AI data centers, so eight NYCs."

Smith said that the strain placed on Nvidia's direct-connect technology means that traditional fiber-optic links, heretofore reserved for long-distance telecom networks, will start to be deployed inside cloud data centers in the coming years.

"Given the speed of the GPUs, and the distances that are now occurring in these data centers, we think there's an intersect point for that [fiber optics] technology, and that's what we're focused on," Smith told the publication.
