Visualize the Virtualized Embedded World

By John Blyler

Although embedded developers live in a material world, their creations often start out and sometimes even remain in the virtual world. Virtualization--the creation of a virtual rather than actual "thing"--is not a new concept to embedded engineers. Most of them remember when virtualization meant dividing a hard drive into different partitions--typically to run MS-DOS and a Unix hybrid on the same machine.

Virtualization was once the exclusive province of mainframe system architects and programmers. Now, this method is an accepted way to share and better utilize resources on both PCs and embedded systems--from processors and memory through I/O devices. The capability to share resources is achieved through the use of a new layer of software known as the virtual machine monitor (VMM).

In traditional systems, a single OS or RTOS controls all hardware resources. This is not the case with a virtualized system, in which the virtual machine monitor regulates access to all of the underlying physical host’s hardware resources. In doing so, the VMM enables multiple operating systems to share the host’s hardware. The VMM also provides each operating system with a set of virtual interfaces that constitute a virtual machine (VM).

The value of such efficient use of precious resources has not been lost on Intel. Support for off-the-shelf systems based on Intel’s architecture (IA) hardware is well known. Enhancements to both processor and I/O subsystems have reduced the performance overhead that is inherent in a virtualized system. Additional software improvements have opened VMs to a whole new set of applications. For more details, visit the Intel Virtualization Technology website at: www.intel.com/technology/computing/vptech.

Several technical articles in this issue of EIS magazine focus on virtualization. First, IP-Fabrics’ (www.ipfabrics.com) piece on the virtualization of network surveillance shows how application-specific programs implemented on virtual machines can help developers focus on packet processing and fight cyber-terrorism. Or consider Ardence’s (www.ardence.com) article on RTOS virtualization over a streaming network. Even the news section looks at Intel’s latest vPro technology, which includes the second-generation of Intel’s Active Management and Virtualization technologies.

But the virtualization process extends beyond the sharing of hardware systems and efficient resource utilization. Virtual-based approaches--specifically, system-accurate functional models--have been used to optimize the design of both hardware and software embedded systems. In fact, the last several issues of EIS magazine have featured articles that dealt with the advantages of virtual prototypes for both tradeoff analysis and pre-hardware software development. Consider Graham Hellestrand’s piece on the virtual approach to the power optimization of real-time systems [Spring 2006]. For pre-hardware software design, consider Or Wloka’s and Shaviv’s article on the use of abstract yet timing-accurate functional models to enable early software development on Intel’s XScale platform [Summer 2006]. Both can be freely downloaded from the Embedded Intel Solutions website at www.embeddedintel.com.

Virtual prototypes are indeed another way to virtualize an embedded system. Perhaps the most touted advantage of these prototypes is the creation of virtual processors that allow software developers to begin writing code before the physical hardware is available. But virtual prototypes help in the testing phase of an embedded system too.

A processor must exist in some form--either physically or virtually--before software can be developed. Similarly, the system must exist before testing can begin. Traditionally, a comprehensive test lab was built to fully exercise the hardware and software. But as embedded systems grow more complex, the cost and effort of building such labs rise as well.

Like their design brethren, many of today’s embedded testers have turned to the virtual world for help. Complete system simulations containing a mix of physical and virtual hardware are now being used to test large software programs both early on and throughout the development process. Software must still be tested on actual hardware before deployment. Yet much of the testing during the development, debugging, and QA phases can be accomplished on virtual models.

The great advances in hardware system modeling that have benefited the embedded-software design community are now helping the test group. Most simulators now use just-in-time compilation techniques and time compression to achieve simulator speeds as high as billions of simulated instructions per second, according to a recent IEEE Computer article [“The Virtual Test Lab” by Peter S. Magnusson, May 2005, Vol. 38, No. 5]. Such high performance means that it is now possible to run a full software load of applications. Clearly, virtualization has grown beyond the early ideas of resource sharing to include the actual design and testing of embedded software on virtual hardware models. All of these points make it very easy to visualize a virtual future.

Please share your thoughts with the embedded Intel community by e-mailing me at jblyler@extensionmedia.com.