Saving the System with Security

Can embedded security keep our military systems safe?

By Geoffrey James - Contributing Editor

Security is a big problem in the world of commercial computing. Every day seems to bring a new horror story of a corporate computer hacked, or a security flaw emerging in a major software program. One measure of the scope of the problem is the growth in sales of security software, a $9.1 billion market in 2007, with a year-to-year growth rate of nearly 11 percent, according to Nicole Latimer-Livingston, a principal research analyst at the market research firm Gartner. “Prioritizing, choosing and maintaining security technologies will continue to be top issues for enterprises in 2007,” she says.

These problems come at a time when military electronics are shedding their traditional reliance on custom-designed electronics. “There is a definite trend towards the use of commercial components whenever possible,” explains former Marine Colonel Terry Morgan, director of global defense initiatives at Cisco. The U.S. military intends to use commercial technologies like WiFi and RFID to help implement something called “Network Centric Warfare,” which Tom Flynn, director of strategic initiatives and network systems at defense contractor Raytheon, describes as “a network of distributed platforms that are constantly communicating with one another, creating an evolving and accurate picture of any combat situation.” (See Figure)

The use of commercial technologies in military electronics will require systems designers to pay renewed attention to system security, according to Arun Subbarao, vice president of Engineering at LynuxWorks, a company that develops embedded operating systems and tools. “Great care will be needed to ensure that embedded systems remain secure from the problems that have plagued traditional data center and desktop computing,” he explains, adding that “chip designers and operating system programmers must work closely together to ensure that embedded systems remain safe and secure.”

A Little History

The challenges facing embedded security stem directly from major architectural mistakes made in the development of commercial computing, according to John Joseph, a system programmer who worked in operating system development in the 1970s. “The original time-sharing systems, like the highly influential MULTICS, implemented rings of security that put the powerful system-level machine instructions out of the reach of normal applications,” explains Joseph. “The purpose of the operating system was to keep programs isolated from each other and especially from the operating system.”

Unfortunately, a decision was made at Microsoft in the 1990s to reject that kind of architectural isolation in favor of a closer relationship between the operating system and the applications running on it. (Apparently the reasoning was that such a change would make it easier for Microsoft to leverage the ubiquity of its operating systems into dominance of other software markets and the Internet.) As a result, today’s commercial systems are highly susceptible to computer viruses and hacker attacks, according to Joseph. “Systems are noticeably less secure than they were twenty years ago,” he warns.

Embedded operating systems followed a different architectural path than generalized operating environments, according to Subbarao. “Because embedded systems have tended to be real-time and used to control discrete devices, embedded operating systems tend to be less general purpose and less open to hacking, lacking the kind of open access points that have made commercial systems so vulnerable,” he says.

The End of Isolation

Because commercial systems and products based upon them are inherently less secure, their use in military electronics opens up the proverbial can of worms. For example, imagine that you’re building an avionics controller for a fighter aircraft. It would obviously be useful to have a running report, displayed on an on-board screen or heads-up display, showing the weather conditions ten miles ahead in the current flight path.

In the past, a military electronics firm would probably have constructed an entirely secure and proprietary system for accomplishing this, presumably using scrambled data from secure military satellites. However, that kind of custom electronics work is prohibitively expensive in today’s budget-conscious military electronics industry. For a fraction of the cost of developing new technology, a defense electronics firm could simply incorporate a browser into the avionics system, allowing it to access weather data on the Internet. However, if that implementation isn’t well thought out from a security standpoint, it might be possible for a terrorist or enemy combatant to use the Internet to hack the weather request to get a GPS fix on the location of the aircraft. Worst case, a hacker might figure out how to introduce a virus into the avionics controller, which might cause the aircraft to behave erratically.

The best way to keep this sort of thing from happening, according to Subbarao, is to “run two separate operating systems on the chip, with some form of data communication between the two functions.” In this case, the embedded operating system vendor is essentially using the presence of multiple operating systems to recreate the original (but neglected) “ring” structure that provided system security in the early days of timesharing.
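The partitioned approach Subbarao describes can be sketched in miniature. The following is a purely illustrative model, not a real separation kernel, and every name in it is invented: an untrusted partition (the Internet-facing side) can pass data to a trusted partition (the avionics side) only through a guard that enforces a fixed message schema, recreating the spirit of the old “ring” isolation in software.

```python
# Conceptual sketch only (not a real separation kernel or certified
# product): two isolated "partitions" exchange data solely through a
# guard that enforces a declared message schema, so the trusted side
# never sees arbitrary content from the untrusted side.

class Guard:
    """One-way channel from an untrusted partition to a trusted one.
    Only messages that exactly match the declared schema pass through."""

    def __init__(self, schema):
        self.schema = schema              # field name -> expected type

    def filter(self, message):
        if set(message) != set(self.schema):
            return None                   # missing or unexpected fields: drop
        for field, expected_type in self.schema.items():
            if not isinstance(message[field], expected_type):
                return None               # wrong type: drop
        return dict(message)              # pass a copy, never a shared reference

# The trusted avionics partition only ever sees vetted weather records.
weather_guard = Guard({"lat": float, "lon": float, "visibility_km": float})

ok = weather_guard.filter({"lat": 37.6, "lon": -122.4, "visibility_km": 8.0})
bad = weather_guard.filter({"lat": 37.6, "lon": -122.4, "payload": "exec(...)"})
# ok is the vetted record; bad is None because of the unexpected field.
```

A real separation kernel enforces this boundary in the scheduler and memory map, not in application code, but the policy idea is the same: the trusted side accepts only a narrowly defined message format.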

Help from the Hardware

While embedded operating system vendors like LynuxWorks believe that it’s possible to create an acceptable level of system security inside these “virtualized” operating system environments, other elements of commercial technology pose additional security challenges. For example, embedded systems that interface with devices through a standard data bus open themselves to hacks that examine the flow of data on that bus. “These architectures have been open for decades in order to make it easier for developers to connect new devices and new technology,” explains Subbarao. “What’s needed is a new level of hardware security that makes it more difficult to access data that’s flowing around in the commercial side of the system.”

Unfortunately, while they support the general notion of security, chip designers are loath to take the extra effort required to implement expensive security features, according to Joe Grand, CEO of Grand Idea Studios, an electronics product development firm, and an expert on computer hardware security. “Embedded security is expensive, and chip designers are often goaled on getting a product out the door within a certain market window rather than making sure that their product can’t be hacked,” he explains.

Despite this reluctance, there’s a growing sense in the chip design field that they must do more to support security. “Running security code on a general-purpose CPU is a very power-hungry way of executing the function,” says Smith. “Putting the security function in the hardware is by far the best answer.” Mukesh Lulla, president of Team F1, a privately held supplier of OEM-ready software to the embedded systems market, agrees. “A standard technique to detect viruses and hacks is to analyze data streams for certain patterns,” says Lulla. “The most efficient way to analyze hundreds of streams of data is to do the pattern matching directly inside specialized circuitry on the ASIC.”
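The stream-scanning technique Lulla describes can be sketched in software; an ASIC implementation would run the same matching in parallel silicon against many streams at wire speed. The signatures below are invented byte patterns for illustration, not real malware signatures, and the one subtlety worth showing is the carry-over buffer that catches a signature split across two chunks of the stream.

```python
# Minimal software sketch of signature-based stream scanning, the
# technique described in the text; the signatures are made-up byte
# patterns, not real ones. A hardware version would do this matching
# in parallel circuitry rather than a Python loop.

SIGNATURES = [b"\xde\xad\xbe\xef", b"EVILCODE"]

def scan_stream(chunks, signatures=SIGNATURES):
    """Scan an iterable of byte chunks for any signature, keeping just
    enough overlap between chunks that a signature split across a
    chunk boundary is still detected. Returns (offset, signature) hits."""
    tail_len = max(len(s) for s in signatures) - 1
    carry = b""                 # tail of the previous chunk
    hits = []
    chunk_pos = 0               # global offset of the current chunk
    for chunk in chunks:
        window = carry + chunk
        window_start = chunk_pos - len(carry)
        for sig in signatures:
            i = window.find(sig)
            while i != -1:
                # Matches entirely inside the carry were already
                # reported in the previous iteration; skip them.
                if i + len(sig) > len(carry):
                    hits.append((window_start + i, sig))
                i = window.find(sig, i + 1)
        carry = window[-tail_len:] if tail_len else b""
        chunk_pos += len(chunk)
    return hits

# A signature split across chunk boundaries is still found:
hits = scan_stream([b"xxEVIL", b"CODEyy\xde\xad", b"\xbe\xefzz"])
```

The hardware advantage is exactly the part this sketch does serially: each signature comparison is independent, so dedicated circuitry can check all of them against all streams at once.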

Not a Panacea

This is not to say that building security features into the chip will solve every problem, according to Jordan Selburn, a principal analyst at iSuppli, a research firm that studies the semiconductor industry. “There’s always going to be some kind of trade-off that needs to be made between flexibility, performance and security,” he explains.

Indeed, there may be a downside to building security into a chip, because doing so makes it more difficult for system designers to determine whether their product is actually more secure. “In the case of software, it’s easier to examine lines of code and determine whether an application will behave in a secure manner,” explains Lulla. “But the same kind of features, when implemented in hardware, would require an analysis of the actual circuitry, which is far more involved and difficult.” In fact, it was recently pointed out that a minor mathematical error introduced into a CPU chip (whether intentionally or unintentionally) could make it possible for a clever hacker to break the public-key encryption schemes on which the Internet depends for secure e-commerce.
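One concrete version of that risk is the well-known fault attack on RSA signatures computed with the Chinese Remainder Theorem speed-up, due to Boneh, DeMillo and Lipton: if a hardware glitch or flawed multiplier corrupts just one half of the computation, a single gcd recovers a secret factor of the modulus. The toy sketch below uses textbook-sized numbers (real keys are thousands of bits) and deliberately injects the fault in software to stand in for buggy silicon.

```python
# Toy demonstration of the Boneh-DeMillo-Lipton fault attack on
# RSA-CRT signing. Numbers are textbook-tiny for readability; the
# "fault" is injected in software to model a flawed CPU multiplier.

from math import gcd

# Tiny textbook RSA key (never use sizes like this in practice).
p, q = 61, 53
N = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def sign_crt(m, fault=False):
    """Sign m using the CRT speed-up; optionally corrupt the mod-p
    half, as a hardware arithmetic error might."""
    sp = pow(m, d % (p - 1), p)      # signature mod p
    if fault:
        sp = (sp + 1) % p            # one wrong step in the p-branch
    sq = pow(m, d % (q - 1), q)      # signature mod q (still correct)
    # Recombine the two halves (Garner's formula).
    h = (pow(q, -1, p) * (sp - sq)) % p
    return (sq + q * h) % N

m = 42
good = sign_crt(m)                   # verifies: good**e ≡ m (mod N)
bad = sign_crt(m, fault=True)        # correct mod q, wrong mod p

# The attacker knows only N, e, m and the faulty signature, yet one
# gcd exposes the secret prime q: q divides bad**e - m, but p does not.
recovered_q = gcd((pow(bad, e, N) - m) % N, N)
```

This is why a subtle arithmetic bug in shipped silicon is so dangerous: the attacker never needs to read the key, only to observe one miscomputed signature.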

Since security is important and hardware security is no panacea, the most prudent approach for military electronics designers is to depend upon reliable embedded software designers, according to Subbarao. “While the software industry would like to see security features built into hardware, the electronics industry will continue to be dependent upon reputable software suppliers to ensure that their systems are as secure as possible,” he explains.

Geoffrey James is a contributing editor for Embedded Intel® Solutions and Chip Design magazines. He can be reached at:

http://www.geoffreyjames.com