Artificial intelligence is being used to protect utility infrastructure, advance renewable energy research and help permit clean energy projects – but at the same time, the technology threatens to strain the grid with massive load growth.
The AI-driven data center boom could account for 44% of U.S. electricity load growth from 2023 to 2028, Bain & Company said in an October report, and "meeting that demand would require utilities to boost annual generation by as much as 26% by 2028."
"We have had relatively flat electricity demand for almost two decades now," said Neil Chatterjee, former chairman of the Federal Energy Regulatory Commission, who last week joined the advisory council of AiDash, a satellite analytics company that uses AI to remotely monitor utility infrastructure. "I don't know that policymakers, regulators, politicians or industry are prepared for the sharp increase in demand."
Chatterjee said one of the reasons he is proud to partner with AiDash is that the company offers an example of how AI "can really advance the clean energy transition and mitigate the risk of climate change" by helping utilities address risks and challenges such as vegetation management, which costs them $6 billion to $8 billion per year.
"I really think the climate case for AI should be made," he said.
The United States has around 7 million kilometers of power lines, "(more than) 200 million poles and billions of trees," said AiDash co-founder and CEO Abhishek Singh. "With a changing climate, all of these assets are exposed to greater risks than they were five years ago. And more importantly, the way work is done today, in terms of the number of people we would need, there is not enough workforce to inspect these assets more and more frequently at scale."
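The article does not describe AiDash's actual models, but as a rough illustration of how satellite-derived vegetation data could feed inspection prioritization across millions of line spans, here is a minimal sketch; the field names, numbers and scoring formula are entirely hypothetical assumptions.

```python
# Hypothetical sketch: prioritize utility line inspections from satellite-derived
# vegetation data. This is NOT AiDash's actual method; the fields, thresholds and
# scoring formula below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Span:
    span_id: str
    clearance_m: float            # estimated distance from vegetation to conductor
    growth_rate_m_per_yr: float   # species/climate-dependent growth estimate

def encroachment_risk(span: Span, horizon_years: float = 1.0) -> float:
    """Toy risk score: projected growth over the horizon vs. remaining clearance."""
    remaining = span.clearance_m - span.growth_rate_m_per_yr * horizon_years
    # Negative remaining clearance suggests likely contact; clamp score to [0, 1].
    return max(0.0, min(1.0, 1.0 - remaining / max(span.clearance_m, 0.1)))

spans = [
    Span("TX-0041", clearance_m=2.5, growth_rate_m_per_yr=1.8),
    Span("TX-0042", clearance_m=6.0, growth_rate_m_per_yr=0.6),
    Span("TX-0043", clearance_m=1.0, growth_rate_m_per_yr=2.2),
]

# Inspect (or dispatch trim crews to) the highest-risk spans first.
for s in sorted(spans, key=encroachment_risk, reverse=True):
    print(f"{s.span_id}: risk={encroachment_risk(s):.2f}")
```

In practice, the point of such a score is simply to rank a backlog of assets that is far too large to inspect exhaustively, which is the workforce problem Singh describes.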
Recent breakthroughs in generative AI are what allow AiDash to operate, Singh said. Before 2019, this kind of AI use was "not possible."
At the same time, the AI-driven rise in electricity demand raises "a lot of complex questions," Chatterjee said, such as the question of data center co-location.
Last month, FERC rejected an amended interconnection service agreement that "would essentially allow a behind-the-meter data center within the footprint of an existing nuclear power plant," Chatterjee said.
Although the agreement would have supplied the data center with a carbon-free source of baseload power: "By taking it behind the meter, you potentially increase costs for ratepayers at a time when capacity is constrained, and therefore spread higher costs among fewer ratepayers," he said. "And then if this agreement, and the precedent it would establish, were to lead to a flood of other co-location transactions, then you could potentially have a resource adequacy challenge."
"I think we need leadership at the highest level on what our approach to adapting to AI is going to be," Chatterjee said. "I don't think it's fair, quite frankly, that my former FERC colleagues are caught in the middle of this data center co-location dilemma, because it is kind of a lose-lose."
Still, he said, he is optimistic, because the dilemma "ultimately comes down to a math equation": meeting the surge in demand and maintaining reliability and affordability without backsliding on the decarbonization goals the United States has set.
“It is achievable,” said Chatterjee.
Help with permitting, siting and risk assessment
When the first ChatGPT model was released in November 2022, software platform company Paces was about six months old and working to build a permitting data tool using conventional AI, said co-founder and CEO James McWalter.
"We were actually able to build an initial data set, but it took a lot of money and a lot of time, and it was immediately obsolete," McWalter said. "We couldn't find a way to scale it appropriately to serve customers at a reasonable cost. So we actually stopped that project completely, and we didn't really plan to pick it back up."
Paces returned to the project after the first version of ChatGPT was released, McWalter said, and now uses AI tools to help clean energy developers speed up their projects by providing assistance with permitting, siting, interconnection and environmental risk assessment.
The company is able to use AI for major data collection and classification tasks, as well as to produce and submit reports.
"Our view is that within two years, almost all desktop analysis should be able to be automated with very high fidelity," McWalter said, so that project developers can focus on tasks like "(attending) a town hall meeting, building a relationship with a utility, all these really, really important things they often don't have enough time to do."
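As a rough illustration of the kind of desktop analysis a language model can automate, the sketch below asks an LLM to pull setback requirements out of a zoning ordinance excerpt. This is not Paces' actual pipeline; the prompt, model name, ordinance text and output schema are assumptions made for the example.

```python
# Hypothetical sketch: extract permitting constraints from an ordinance excerpt
# with an LLM. Illustrative only; not any vendor's actual workflow.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

ordinance_excerpt = """
Section 4.12: Ground-mounted solar energy systems shall be set back a minimum
of 150 feet from any adjoining residential property line and 50 feet from any
public right-of-way.
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Extract solar permitting constraints as JSON with keys "
                    "'residential_setback_ft' and 'row_setback_ft'. "
                    "Return only JSON."},
        {"role": "user", "content": ordinance_excerpt},
    ],
)

print(response.choices[0].message.content)
# Expected (approximate) output:
# {"residential_setback_ft": 150, "row_setback_ft": 50}
```

Scaling a step like this across thousands of jurisdictions is what makes it possible to keep a permitting data set current, the problem that McWalter said sank the pre-ChatGPT version of the project.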
In September, McWalter attended a roundtable hosted by the White House that brought together executives from AI companies, hyperscalers and utilities to discuss American leadership in AI infrastructure and "(consider) strategies to meet clean energy, permitting and workforce demands" for AI's energy needs.
"There are certainly these kinds of conversations happening," he said. "We'll see with the new administration whether that continues, but there are certainly things being considered by people at the (Department of Energy) and other government agencies."
Paces' mission "is to build as much clean generation (as possible) and move the entire economy off fossil fuels," McWalter said. "So one of the things we're currently working on is allowing large load centers, such as AI data centers, to have additional optionality with things like behind-the-meter generation, colocated generation and so on."
"Our view is that a lot of this kind of large-scale data center load is simply going to use a lot of natural gas, unless we figure out a way to add renewables into the mix," he said.
Helping develop nuclear fusion
AI is also one of the new tools helping to accelerate research into commercializing nuclear fusion as an energy source, according to a November Clean Air Task Force report.
"I am optimistic because there are a number of enabling technologies that make fusion possible now; before, it was much more difficult," said Sehila Gonzales de Vicente, CATF's global director of fusion and lead author of the report. "One of them is AI."
There are a number of aspects of fusion "where artificial intelligence can play a really important role and be a game changer, in the sense that before, we were not able to solve these problems," Gonzales de Vicente said. "There was no easy way to solve them."
One of those problems is maintaining the stability of the plasma inside which fusion reactions occur. A major instability, or disruption, in the plasma will end any fusion reaction.
"Understanding the nature of disruptions, understanding why they occur and preventing them has been really difficult," Gonzales de Vicente said. "The approach has been to mitigate them instead of preventing them."
A central piece of a "multi-institutional collaboration to develop a fusion data platform for machine learning applications using magnetic fusion energy" led by the Massachusetts Institute of Technology is DisruptionPy, "an open-source Python library designed to retrieve, process and analyze data related to plasma disruptions," according to the report. "A key strength of DisruptionPy is its ability to streamline data retrieval and generate large validated data sets for (AI) applications."
"Being able to predict where these disruptions will take place is not a human task," Gonzales de Vicente said. "A human being is not able to address and solve this problem."
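As a rough, hypothetical illustration of the kind of supervised learning used for disruption prediction (not the MIT collaboration's actual method, and not DisruptionPy's API), the sketch below trains a classifier to flag disruption-prone plasma time slices; the features and data are synthetic stand-ins.

```python
# Toy disruption-prediction sketch: classify plasma time slices from a few
# diagnostic-like features. Real work uses far richer diagnostics, curated
# experimental data sets and careful cross-machine validation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 5000
# Synthetic features loosely standing in for quantities such as locked-mode
# amplitude, normalized density and internal inductance.
X = rng.normal(size=(n, 3))
risk = 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)
y = (risk > 1.0).astype(int)  # 1 = slice precedes a (synthetic) disruption

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

The value of a validated, machine-readable disruption data set, as the report describes, is precisely that it lets researchers train and benchmark models like this on real experimental shots rather than synthetic data.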
With advances in artificial intelligence and high-performance computing, much more optimized modeling can be carried out inside a computer, or in silico, she said. "You still have to build machines, but you need to build fewer of them, and you can optimize your parameters much better."
"Artificial intelligence is a fairly cross-cutting tool for everything. Everything related to producing designs, optimizing designs, testing designs is in silico," Gonzales de Vicente said. "Now you have the possibility of not having to build a whole piece of equipment or an entire plant. You can test it in silico. Imagine the savings in terms of time and money, and the level of optimization you end up with when you do build this equipment. Before, you couldn't have that."
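A toy illustration of testing a design in silico: optimize a single made-up design parameter against a simulated cost function instead of building hardware prototypes. The objective and numbers below are invented for illustration; real fusion design optimization couples detailed plasma physics and engineering simulations.

```python
# Toy "in silico" design loop: pick the cheapest design parameter by running a
# simulation (here, a made-up cost function) instead of building equipment.
from scipy.optimize import minimize_scalar

def simulated_cost(coil_current_ka: float) -> float:
    # Pretend trade-off: too little current hurts confinement, too much wastes power.
    confinement_penalty = (35.0 - coil_current_ka) ** 2
    power_penalty = 0.5 * coil_current_ka
    return confinement_penalty + power_penalty

result = minimize_scalar(simulated_cost, bounds=(10.0, 60.0), method="bounded")
print(f"best coil current ~ {result.x:.1f} kA, simulated cost {result.fun:.2f}")
```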
Gonzales de Vicente said she thinks artificial intelligence is useful for any industrial process or case where high-tech equipment is needed, largely thanks to its ability to optimize operations.
"Over the next five years, we will have better tools. And these tools will be further developed, and then we will have new tools," she said. "And that has been the problem with fusion. Fusion is a very complex technology, using machines that are very difficult to design and build. And now, this is the first time that we have all these tools in our hands to make it real."