Back-to-back winter storms left 4.5 million people in Texas without power amid record-low temperatures. Some lawmakers were quick to blame renewable energy sources for the outages, but the actual picture was more complicated.
It’s true that nearly half of Texas’s wind-energy capacity was taken out during the storms, although twice as much capacity was lost from other sources, most notably gas-fired power stations. At the same time, demand for electricity rocketed, leading the grid operator to institute rolling blackouts. Texas is the only state in the continental US to operate its own electricity grid, leaving it isolated from other regions and unable to import electricity in times of crisis. This winter’s deep freeze made it clear that Texas needs a more resilient energy system – and distributed generation could help.
In a distributed energy system, technologies such as solar panels, batteries and small-scale windfarms generate and store electricity near where it will be used. These multiple inputs can be coordinated by way of a smart grid, a network that facilitates a two-way exchange of data and electricity. When electricity-generating technologies are distributed across a region – rather than concentrated at a handful of nuclear or coal-fired power stations – supply is less vulnerable to shutdowns or shocks.
However, balancing supply and demand from thousands of small-scale renewable generators will not be a straightforward task. Some experts have suggested that this is where artificial intelligence (AI) will play an important role. Machine-learning algorithms have already been deployed during successful pilot schemes, including one 2015 partnership between computing giant IBM and the US Department of Energy. The company found that solar and wind forecasts produced using machine-learning techniques were as much as 30% more accurate than those generated using a conventional method.
Traditionally, projecting the output of wind and solar assets was a tricky task, with operators relying on individual weather models that provided only a narrow snapshot of key variables. The lack of accurate solar and wind forecasting data meant that utility companies had to hold higher amounts of capacity in reserve to compensate for potential inaccuracies. In contrast, IBM’s “self-learning” weather model was able to analyse and integrate information from many weather models and data sources. This is one of AI’s most promising attributes for the energy sector: the ability to evaluate large quantities of disparate data.
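The idea of blending many imperfect forecasts into one better one can be sketched in a few lines. The snippet below is a minimal illustration in the spirit of that approach, not a reproduction of IBM’s system: it learns weights for three hypothetical weather models by least squares, using invented numbers.

```python
import numpy as np

# Each row: solar-output forecasts (MW) from three independent weather
# models for one hour. All figures are invented for illustration.
model_forecasts = np.array([
    [42.0, 38.5, 45.1],
    [55.2, 57.0, 52.8],
    [30.1, 28.9, 33.4],
    [61.5, 60.2, 63.0],
])
# Actual measured output (MW) for the same hours.
actual_output = np.array([41.0, 55.5, 30.8, 62.1])

# Learn blending weights by least squares: the combined forecast is a
# weighted sum of the individual models' forecasts.
weights, *_ = np.linalg.lstsq(model_forecasts, actual_output, rcond=None)

# Apply the learned blend to a new hour's forecasts.
new_forecasts = np.array([48.0, 46.2, 50.3])
blended = float(new_forecasts @ weights)
print(round(blended, 1))
```

A production forecaster would use far richer inputs (cloud cover, satellite imagery, turbine telemetry) and a nonlinear model, but the principle is the same: let the data decide how much to trust each source.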
Machine-learning algorithms also have the potential to optimise the way that households can consume electricity – if they’re given access to data from smart meters. In theory, information gathered from these devices could help to direct distribution across entire cities by predicting patterns of consumption.
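As a toy example of that idea, a household’s expected demand can be estimated from smart-meter history by averaging past readings taken at the same hour of day. This is an invented, deliberately simple sketch, not an actual utility algorithm:

```python
from collections import defaultdict

# (hour_of_day, kWh) readings from a smart meter over several days.
# Values are invented for illustration.
readings = [
    (7, 0.8), (8, 1.2), (18, 2.5), (19, 2.9),
    (7, 0.9), (8, 1.1), (18, 2.4), (19, 3.0),
    (7, 0.7), (8, 1.3), (18, 2.6), (19, 2.8),
]

# Accumulate a running total and count for each hour of the day.
totals = defaultdict(lambda: [0.0, 0])
for hour, kwh in readings:
    totals[hour][0] += kwh
    totals[hour][1] += 1

# Predicted demand for each hour: the historical mean for that hour.
profile = {hour: s / n for hour, (s, n) in totals.items()}

print(round(profile[19], 2))  # mean of 2.9, 3.0 and 2.8
```

Summing such per-household profiles across a city gives the grid operator a crude consumption forecast; real systems would add weather, day-of-week and tariff signals on top.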
In some commercial spheres, AI systems are already making decisions about energy management. In 2016, Google’s DeepMind division developed an AI-powered recommendation system to improve the efficiency of data centres. In 2018, the company announced that it was going to allow the system to directly control cooling at the facilities, under the supervision of human operators. Every five minutes, the AI creates a snapshot of the data centres’ cooling systems based on input from thousands of sensors.
According to a company blog post, deep neural networks “predict how different combinations of potential actions will affect future energy consumption”. The AI system subsequently identifies which courses of action will result in the lowest level of consumption. While applications like this are not yet widespread across industry, their potential for managing electricity more effectively is beyond doubt.
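The decision step described in that blog post — score candidate actions with a predictive model, then take the cheapest — can be sketched as follows. The surrogate model and the actions here are invented stand-ins, not DeepMind’s actual system, which uses trained deep neural networks and operator-enforced safety constraints:

```python
def predicted_consumption(setpoint_c: float, pump_speed: float) -> float:
    # Toy surrogate model: cooling harder (lower setpoint, faster pumps)
    # is assumed to cost more energy. A real system would query a neural
    # network trained on sensor snapshots.
    return 100.0 - 3.0 * setpoint_c + 40.0 * pump_speed

# Hypothetical combinations of potential cooling actions to evaluate.
candidate_actions = [
    {"setpoint_c": 18.0, "pump_speed": 0.9},
    {"setpoint_c": 20.0, "pump_speed": 0.7},
    {"setpoint_c": 22.0, "pump_speed": 0.6},
]

# Pick the action with the lowest predicted energy consumption.
best = min(
    candidate_actions,
    key=lambda a: predicted_consumption(a["setpoint_c"], a["pump_speed"]),
)
print(best)
```

In practice, the chosen action would also be checked against safety limits before being applied, which is why human operators remain in the loop.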
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.