CIO

A power plant hack that anybody could use

Researcher Dillon Beresford has developed code that can take down Siemens industrial systems. But should he release it?

The night before the start of this week's Black Hat hacker conference here in Las Vegas, security researcher Dillon Beresford gave a demonstration to a small audience in his room at Caesar's Palace. The topic: how a hacker could take over the Siemens S7 computers that are used to control engines, machines and turbines in tens of thousands of industrial facilities.

It was a preview of the talk he was set to give Wednesday, and Beresford seemed both nervous and relieved to be finally talking to the handful of reporters and industry and government officials in the room. A few months ago it wasn't clear when or if he'd ever be able to go public with his research. Concerned that his research could be misused, he pulled out of an earlier conference to give Siemens more time to fix the problems he'd uncovered. Even now, after months of work with Siemens and the U.S. Department of Homeland Security, coordinating patch after patch for many of the bugs he's found, Beresford can't say everything he knows.

But clearly, he knows quite a lot. The question is, how much will he make public?

The NSS Labs researcher said he's found ways to bypass the S7's security measures and read data from and write data to the computer's memory -- even when the system has password protection enabled. He can steal sensitive information from the systems, he said. And on one model, the S7 300, he found a command shell, apparently left in the system's firmware by Siemens engineers, that he can connect to and use to run commands on the system.

After poking around for a bit, he discovered a hard-coded username and password -- both "basisk" -- that gave him access to a Unix-like shell program on the systems, where he could run his own commands.

This shell is a "back door" to the system that could be misused by an attacker, Beresford said.
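To make concrete why a hard-coded credential amounts to a back door, here is a minimal, hypothetical sketch of how an operator might check devices on their own lab network for a service that still accepts such a login. The port, prompt strings and simple line-based exchange are assumptions made for illustration; they are not Siemens' actual interface or Beresford's tooling.

```python
# Illustrative sketch only: checks whether hosts on a network you own still answer
# a plain-text login prompt with a known hard-coded credential (basisk/basisk).
# The port, prompts and line-based protocol are assumptions for this example;
# real device firmware differs and may use full Telnet negotiation.
import socket

HARDCODED_USER = b"basisk"
HARDCODED_PASS = b"basisk"

def accepts_hardcoded_login(host: str, port: int = 23, timeout: float = 5.0) -> bool:
    """Return True if the host appears to grant a shell to the known credential."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.settimeout(timeout)
            banner = sock.recv(1024)                 # e.g. a "login:" prompt
            if b"login" not in banner.lower():
                return False
            sock.sendall(HARDCODED_USER + b"\n")
            sock.recv(1024)                          # expect a "password:" prompt
            sock.sendall(HARDCODED_PASS + b"\n")
            reply = sock.recv(1024)
            # A shell prompt ("$" or "#") suggests the back door is still open.
            return b"$" in reply or b"#" in reply
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical addresses of devices you own and are authorized to test.
    for addr in ["192.168.0.10", "192.168.0.11"]:
        print(addr, "vulnerable" if accepts_hardcoded_login(addr) else "ok")
```

The point of the sketch is how little is required: once a credential is baked into firmware and becomes public, anyone who can reach the device on the network can use it.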

He also discovered dancing monkeys: a goofy graphic of four of them, apparently an Easter egg -- a software developer's version of graffiti, left for other geeks to discover -- tucked into the S7 300's firmware.

The demo wasn't much to look at. The S7s are like futuristic grey shoeboxes with green LED lights on them. Smoking a cigarette, Beresford would type into his laptop, and one by one the machines would turn off. But considering that each one of those machines could be running a nuclear centrifuge or an elevator, the demonstration held everyone's attention.

The government official in the room Tuesday night -- a contractor from the U.S. Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team -- didn't want to be quoted. Neither did Tim Roxey, a staffer with the North American Electric Reliability Corp., the nonprofit corporation chartered with helping to keep the U.S. supply of electricity online.

Clearly both groups are interested in Beresford's work. The S7 300 systems on which Beresford found the back door and dancing monkeys are the same computers that were targeted by the Stuxnet worm, thought to have destroyed centrifuges at Iran's Natanz uranium enrichment facility.

For decades, makers of these industrial computer systems -- companies such as Siemens, Rockwell Automation and Honeywell International -- lived in a bubble. They built computer systems that were adapted by electrical engineers for the factory floor. It used to be that these systems operated entirely on their own, disconnected from the rest of the networked world, but gradually they've been networked with Windows computers. They are supposed to be run on networks that are physically separate from the rest of the world, but these networks can have misconfigured routers, and every time a consultant plugs a laptop into them, it's another opportunity for a virus to spread.

The problem is that these industrial systems were not built with security in mind, according to Dale Peterson, CEO of security consultancy Digital Bond. Industrial systems security experts like Peterson have known for at least 10 years that these kinds of problems were coming, but not enough has been done. "We've made progress in a lot of areas, but we haven't made progress on these field devices," Peterson said.

He and other security experts say Siemens is hardly alone: all industrial control systems suffer from the kinds of bugs that Beresford discovered.

The industry could add strong authentication control to machines like the Siemens S7, so they only run code that's given to them by trusted sources. But in a world where rebooting a computer means taking a power plant offline for a day, that's not easily done. "No one in the industry wanted to do this because of the possible consequences," Peterson said.
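Peterson's point about trusted code can be illustrated with a small, conceptual sketch: before accepting a new control program, a device (or a gateway in front of it) could verify an authentication tag computed with a key shared only with trusted engineering stations. This is not how the S7 works today; the program format, key handling and function names here are assumptions made purely for illustration.

```python
# Conceptual sketch: accept an uploaded control program only if it carries a valid
# authentication tag, computed with a key shared between trusted engineering
# stations and the device. Program format and key provisioning are assumptions.
import hashlib
import hmac

SHARED_KEY = b"replace-with-a-secret-provisioned-out-of-band"

def sign_program(program_image: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Engineering-station side: compute an authentication tag for the image."""
    return hmac.new(key, program_image, hashlib.sha256).digest()

def accept_program(program_image: bytes, tag: bytes, key: bytes = SHARED_KEY) -> bool:
    """Device side: load the image only if the tag checks out."""
    expected = hmac.new(key, program_image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

if __name__ == "__main__":
    image = b"\x00\x01\x02 example control-program blob"
    tag = sign_program(image)
    print("accepted:", accept_program(image, tag))          # True
    print("accepted:", accept_program(image + b"!", tag))   # False: image was altered
```

Even a simple check like this would force an attacker to obtain the key rather than just a network connection, which is Peterson's argument -- and also why retrofitting it onto devices that can never be rebooted is so hard.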

On the other hand, as Stuxnet has shown, the risks of a cyber-attack on these industrial systems are very real. And malicious programs wind up on factory floors all the time.

In February 2011, the two-year-old Conficker worm infected systems at a Brazilian power plant, according to Marcelo Branquinho, executive director with TI Safe, the consulting company that has been working on fixing the problem these past few months. Engineers would clean up the infection only to find it reappear on the network, most likely spread there by an infected machine that they had missed. "This is not the first Conficker infection we've seen in Brazilian automation plants," he said in an e-mail interview.

Branquinho wouldn't name the power plant, but the infection was clearly disrupting operations. The plant's management systems were freezing up and not displaying data from the field. This forced operators to control their systems the same way they did before computers -- using radios to communicate with each other.

If those infected Conficker machines had contained the type of software that Beresford has written, things would have been much worse.

This isn't the first time that researchers have released code relating to industrial systems, but past releases have focused on the Windows-based management consoles that these systems use -- not the control systems themselves. And the fact that Beresford has hacked the S7 300 -- widely used in the energy sector -- puts his work in a category by itself.

In fact, Beresford isn't sure when he's going to make the software he's written public. There are 15 modules -- small programs written for the open-source Metasploit hacking toolkit -- but he wants to give Siemens' customers time to patch their systems before he releases the code. He said that six months might be an appropriate window.

Once his code is available, anyone could use it. But Beresford believes that he's only making public what others have secretly known for a long time.

Digital Bond's Peterson says that releasing the code might be what it takes to push the industry to finally fix its security problems. "At this point, I'm like, let's give it a shot," he said. "I don't think he's telling the nasty people anything they don't already know."

Ralph Langner, one of the researchers who helped crack the Stuxnet mystery, thinks that Beresford should never release his code. "Dillon did not ask me for advice," he said. "But the advice I would give him is, 'Don't ever release the Metasploit code, because this is dynamite.'"

The Metasploit modules would make it easy for a less-skilled hacker to build software that could disrupt a power plant. And even if Siemens has addressed all of the underlying issues, it will be years before the patches are installed. One day of downtime at a power plant can easily cost the operator US$1 million, Langner said. "Don't assume that a power plant operator will say, 'I will shut my plant down for a day to install the damned patch,'" he said.

It turns out that Langner is the guy who inspired Beresford to look into Siemens systems in the first place. Because of the apparent reconnaissance work and sophisticated PLC programming involved in Stuxnet, Langner believes that only a few organizations have the technical know-how to pull something like this off.

Beresford wanted to prove that industrial hacking could be done on the cheap too. His company kicked in $20,000 to buy the Siemens systems, but Beresford did most of the work from his bedroom in a couple of weeks. "It's not just the spooks who have these capabilities," he said when he finally gave his Black Hat presentation. "Average guys sitting in their basements can pull this off."

Robert McMillan covers computer security and general technology breaking news for The IDG News Service. Follow Robert on Twitter at @bobmcmillan. Robert's e-mail address is robert_mcmillan@idg.com