Employees at Exelon's Colorado Bend Energy Center in Texas. The company has announced it will deploy new software across its entire fleet to improve performance and reliability. Credit: Exelon Corporation

The electrical grid is looking more and more like a supercomputer. Just as Internet-enabled technology has transformed cars, phones and other everyday devices, big data and code are only beginning to reshape the megastructure that keeps the lights on.

Earlier this month, Chicago-based energy company Exelon announced it would adopt Predix, a software platform for industry applications, across its entire fleet of power plants. The aim is to use data and analytics to increase performance and reliability in Exelon Generation’s 32,700-megawatt nuclear, hydroelectric, wind, solar and natural gas portfolio.

Developed by General Electric, Predix is “a cloud-based Platform-as-a-Service (PaaS) for the industrial internet.” Essentially, Predix acts as an operating system for individual machines and the industrial networks they comprise.

Just as a computer uses an operating system to help users capitalize on its processing power, machines — from oil wells to jet engines to MRI scanners to envelope stuffers — could also benefit.

Predix also enables programmers and engineers to build applications for the industrial world. For example, ThetaRay, an Israeli cybersecurity firm, developed a third-party “Anomaly Detection Service” app on the Predix platform. The app could potentially be used at a power plant to analyze huge reams of data to find unusual patterns or other early warning signs of impending failure or cyberattack.
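ThetaRay's detection methods are proprietary, so as a rough illustration of the general idea only, the sketch below flags sensor readings that deviate sharply from their recent baseline using a rolling z-score. The function name, signal values, and threshold are all hypothetical, not part of any Predix or ThetaRay API.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline.

    A reading is anomalous if it lies more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A steady turbine-temperature signal with one injected spike:
signal = [500.0 + (i % 5) * 0.5 for i in range(40)]
signal[30] = 560.0  # simulated fault
print(flag_anomalies(signal))  # flags index 30
```

Production systems use far more sophisticated models, but the principle — compare live data against a learned baseline and alert on deviation — is the same.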

The overarching aim of these and other Predix-based apps is to create a “digital twin” of an industry’s physical reality, says Sham Chotai, chief technology officer of software and analytics at GE Power.

“Can I have a high-fidelity, virtual representation of my machine, and all of my assets,” says Chotai, emphasizing the questions at the heart of the so-called Industrial Internet. “And then can I take that information and knowledge, bring it back and look at my entire fleet of assets?”
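In software terms, the "digital twin" Chotai describes is a virtual object kept in sync with a physical machine, and a fleet view that aggregates those objects. The minimal sketch below uses illustrative class and field names of my own; it is not GE's data model.

```python
from dataclasses import dataclass, field

@dataclass
class AssetTwin:
    """A virtual stand-in for one physical machine, updated from telemetry."""
    asset_id: str
    latest: dict = field(default_factory=dict)  # most recent sensor readings

    def ingest(self, telemetry: dict):
        self.latest.update(telemetry)

@dataclass
class FleetView:
    """Aggregates individual twins so the whole fleet can be queried at once."""
    twins: dict = field(default_factory=dict)

    def register(self, twin: AssetTwin):
        self.twins[twin.asset_id] = twin

    def fleet_average(self, metric: str):
        values = [t.latest[metric]
                  for t in self.twins.values() if metric in t.latest]
        return sum(values) / len(values) if values else None

# Three hypothetical turbines reporting exhaust temperature:
fleet = FleetView()
for i, temp in enumerate([512.0, 498.0, 505.0]):
    twin = AssetTwin(asset_id=f"turbine-{i}")
    twin.ingest({"exhaust_temp_c": temp})
    fleet.register(twin)
print(fleet.fleet_average("exhaust_temp_c"))  # 505.0
```

The fleet-level query is the payoff: once every asset has a twin, "look at my entire fleet" becomes a single function call rather than a plant-by-plant inspection.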

Predicting grid performance

A smart network of machines is a holy grail pursued by a range of tech companies across heavy industries like energy, healthcare and transportation. By 2025, the World Economic Forum estimates that the digital transformation of the electricity sector alone will be worth $1.3 trillion across the globe.

Much of that value is in predictive forecasting. If a piece of software can tell an energy company that a substation is going to fail a week before it does, that warning can help the company head off a costly failure.

Earlier this month, GE also rolled out the latest release of its Predix-based Digital Power Plant software, which uses algorithms to describe and predict potential failures in gas, steam and nuclear plants. According to GE, the technology can reduce unplanned downtime by up to 5 percent, reduce false positive alerts by up to 75 percent and reduce operational costs by up to 25 percent.

Predictive forecasting is also critical for a grid that increasingly relies on distributed, intermittent energy sources like wind and solar power. Grid operators must constantly take into account current and future weather conditions and demand curves to know whether to ramp up production or scale it back.
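Real dispatch decisions involve market bids, transmission constraints and unit-by-unit economics, but the core balancing logic can be sketched simply: use forecast renewable output first, then schedule dispatchable plants to cover the remaining gap plus a reserve. The function and figures below are hypothetical.

```python
def dispatch_decision(forecast_demand_mw, forecast_renewables_mw,
                      reserve_margin=0.1):
    """Decide how much dispatchable (e.g. gas) generation to schedule.

    Renewables are used first; dispatchable plants cover the gap plus a
    safety reserve. Returns megawatts to schedule (0 means scale back).
    """
    target = forecast_demand_mw * (1 + reserve_margin)
    gap = target - forecast_renewables_mw
    return max(gap, 0.0)

# Windy afternoon: renewables nearly cover demand, so little gas is needed.
print(dispatch_decision(1000.0, 950.0))
# Calm evening: renewables drop, and gas plants must ramp up.
print(dispatch_decision(1200.0, 200.0))
```

The better the forecasts feeding a function like this, the smaller the reserve margin can safely be — which is exactly where the predictive software described above earns its keep.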

If smart meters can convert consumers into data points on the demand side, the hope is that so-called “edge-to-cloud” software like Predix can help turn terawatts into terabytes on the supply side.

“As the only Fortune 100 company in the electricity sector, we have a unique opportunity to lead the energy industry in the exploration, development and deployment of the next generation of clean, diverse energy technologies,” Chris Crane, president and CEO of Exelon, said in a press release. “This agreement allows for enhanced collaboration between GE and Exelon to develop solutions to complex industry challenges and accelerate the adoption of new, digital technologies across our industry.”

Other energy providers have also inked deals with GE. In October, the New York Power Authority — the country’s largest state-owned power organization — said it would use GE’s Predix-based “Asset Performance Management” software to improve power performance and reliability. GE has reportedly supplied software and analytics in a more limited capacity to more than 20 utilities worldwide.

Human components, cybersecurity

There’s also a human component to the digitization of energy. With Predix, GE aims to embrace a technologist’s ethos of collaborative tinkering among a range of diverse, third-party developers. It borrows staples of the open Internet — app libraries, hackathons and code documentation, for example — and incorporates them into the Predix platform. Anyone with the right skills and knowledge can build an app for the iPhone. The same is increasingly true for building apps for a power plant.

Predix offers would-be project developers an online interactive demo to experiment with visualizing data from a wind turbine, for instance. Need to identify “failure points in networks, such as wires, pipes, and railway lines”? There’s an app for that, too.

However, with interconnection comes the potential for leaks, hacks and other security breaches. Information security in critical infrastructure is of growing concern as utilities, energy producers and other industries connect their assets to the cloud. As far back as 2007, the U.S. government demonstrated how hackers might destroy a generator. Last December, hackers cut power to more than 80,000 people in Ukraine.

GE maintains that security is of foremost concern. Predix relies on governance, certification, transparency and other pillars of security to provide “industrial grade security that builds end-to-end trust.”

But it can be difficult to know exactly how these security measures function: They require very specific technical expertise, and revealing their inner workings could itself compromise the underlying security.

GE isn’t alone in pursuing the Industrial Internet, either. “Operational Intelligence” firms like OSIsoft, Aspen Technology and Splunk develop software to collect and analyze machine-generated data.

By the end of the decade, tens of billions of assets are expected to be connected to the Internet, according to multiple projections. It reflects a broader trend of supplementing, or even replacing, tasks once done by people with algorithm-driven, cloud-based computing.

“In the old days you’d have an engineer who could listen for a knock in the engine or [could] feel the vibration and know something’s wrong,” Chotai says. “Today, you’ve got a set of millennials who’ve got their noise-cancelling headphones on. They’re looking for data. They want to interact with these machines in a very different way, and they need to.”

David started writing for Midwest Energy News in 2016. His work has also appeared in InsideClimate News, The Atlantic, McClatchy DC and other outlets. Previously, he was the energy editor at The Christian Science Monitor in Boston, where he wrote and edited stories about the global energy transition toward cleaner fuels.