Trusted Computing and NGSCB

The problem of insecure PCs

Today's desktop and laptop computers are essentially open platforms, giving the user-owner total choice about what software runs on them, and the power to read, modify or delete files stored on them. Using firewalls and other tools, users can also determine what kinds of communication their computers can have with the rest of the world.

This freedom has led to problems, such as:
  • insecurity for the user, since open platforms are prone to infection by viruses and worms, and to the inadvertent installation of spyware, denial-of-service attackers, compromised software, keyboard keycatchers, etc.
  • insecurity for the network on which the computer is placed, since it may harbour viruses, worms, denial-of-service attackers, etc., which threaten other machines on the network.
  • insecurity for software authors and media content providers, since open platforms allow programs, music files, images etc. to be copied without limit and without loss of quality.

Trusted Computing (TC)

Trusted Computing is a cluster of proposals for a locked-down PC architecture which can give guarantees about the application software it is running, and which allows applications to communicate securely with other applications and with servers. In its strongest form, pathways between the computer and peripherals such as the keyboard, mouse and monitor are encrypted. The encryption keys are built into the hardware, and are not available to the PC owner. The PC only runs the operating system if it can verify its signature, and the operating system communicates securely with servers to authenticate application software before running it (attestation).
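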
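As a rough sketch of the signature check described above (Python, using the cryptography package; the key names and the boot function are hypothetical, not Microsoft's or the TCG's actual interfaces), the hardware holds a public key fixed at manufacture and refuses to hand control to an OS image whose signature does not verify:

    # Hypothetical sketch: the hardware holds a public key and will only run
    # an OS image whose signature verifies against it.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    _vendor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    HARDWARE_PUBLIC_KEY = _vendor_key.public_key()   # stand-in for the key baked into the chip

    def sign_os_image(image: bytes) -> bytes:
        # What the OS vendor does before shipping an image.
        return _vendor_key.sign(image, padding.PKCS1v15(), hashes.SHA256())

    def boot(image: bytes, signature: bytes) -> None:
        # Refuse to run an image whose signature does not verify.
        try:
            HARDWARE_PUBLIC_KEY.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        except InvalidSignature:
            raise SystemExit("untrusted OS image: refusing to boot")
        print("signature verified; handing control to the OS")

    os_image = b"...kernel bytes..."
    boot(os_image, sign_os_image(os_image))

The same pattern repeats one level up: the verified operating system measures and attests the applications it loads, as sketched later in these notes.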

Microsoft is one of the main drivers for TC; its version is called Next-Generation Secure Computing Base (NGSCB), formerly known as Palladium. NGSCB includes a mechanism for introducing TC in parallel with present-day open systems. Future versions of Microsoft Windows incorporating NGSCB will have two modes, the trusted mode and the untrusted mode. The untrusted mode will be like Windows is now, allowing near-complete freedom to the PC owner. The trusted mode will be the locked-down one. The owner need not use the trusted mode, but it will be necessary to do so in order to access certain kinds of content, such as emails and documents whose authors have imposed TC restrictions, and TC-managed media files. It will not be possible to export files from the trusted mode to the untrusted mode.

TC will require several hardware changes, to provide tamper resistance, memory curtaining, and secure storage for encryption keys. Intel's LaGrande Technology (LT) and AMD's Secure Execution Mode (SEM) provide the hardware support for the major ideas of NGSCB. The Trusted Computing Group (TCG) is an alliance of Microsoft, Intel, IBM, HP, AMD and other companies, formed to coordinate these activities.

Applications of TC

The original motivation was digital rights management (DRM): music and video files will be encrypted, and can only be played by recognised application software on a TC platform. The software will prevent you from making copies, and can restrict you in arbitrary other ways, e.g. by playing files only a certain number of times, or for a limited period. Early announcements of TC included much more draconian measures, such as software which would delete ordinary applications and media files if it detected copyright violations which took place outside the scope of TC.

Current motivations and applications for TC extend way beyond DRM. Bill Gates: `We came at this thinking about music, but then we realized that e-mail and documents were far more interesting domains'. Email which cannot be printed or forwarded, and which self-destructs after a specified period, opens up many possibilities. Similarly, document authors could enforce privacy by restricting the ways copies are made or extracts taken by cut-and-paste, or by preventing them altogether. Organisations can enforce restrictive distribution policies on documents created by their staff, preventing leaks to journalists or competitors.

These restrictions are enforced by the software. The TC version of Microsoft Word will check whether you have the right to copy-and-paste from the current document, before allowing you to do so. You will not be able to open the document with another application, because the document is encrypted and only MS Word has the key (securely held, of course).
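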
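A toy illustration of this kind of enforcement (Python; Fernet stands in for whatever encryption NGSCB would actually use, and all names and the policy format are invented): the document is stored encrypted alongside a policy, and the trusted application consults the policy before allowing an operation such as cut-and-paste.

    # Toy illustration of a rights-managed document: encrypted content plus a
    # policy that the trusted application checks before each operation.
    # Names and policy format are hypothetical.
    from dataclasses import dataclass, field
    from cryptography.fernet import Fernet

    # In a real TC system this key would be sealed inside the trusted
    # application by the hardware; here it is simply a module-level variable.
    _app_cipher = Fernet(Fernet.generate_key())

    @dataclass
    class RightsManagedDocument:
        ciphertext: bytes
        policy: dict = field(default_factory=dict)   # e.g. {"copy_paste": False, "print": False}

    def create_document(text: str, policy: dict) -> RightsManagedDocument:
        return RightsManagedDocument(_app_cipher.encrypt(text.encode()), policy)

    def copy_paste(doc: RightsManagedDocument) -> str:
        # The application, not the user, decides whether the operation is allowed.
        if not doc.policy.get("copy_paste", False):
            raise PermissionError("this document does not permit cut-and-paste")
        return _app_cipher.decrypt(doc.ciphertext).decode()

    doc = create_document("confidential memo", {"copy_paste": False})
    # copy_paste(doc) raises PermissionError; another application never sees the key at all.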

Distributed firewalls represent another application of TC. Traditionally, firewalls assume that everyone on the inside of the network is trusted. However, the increased use of wireless access points, dial-ins, VPNs and tunnels breaks down the distinction between inside and outside. With a distributed firewall, every node in the network runs part of the firewall, protecting the host from the network and protecting the network from the host. But how do we ensure that the distributed firewall is running according to the organisation's policy? This was easy with a centrally-managed firewall, but is harder with a distributed one, where every PC user can try to tamper with the rules. TC provides an answer, by making the host's part of the firewall attest its rule set. Other TC firewall features are also possible: for example, an attested firewall can enforce rate limits that prevent denial-of-service attacks, and by limiting the rate at which machines can send email, it can also reduce the generation of spam.
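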
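As a sketch of what attesting a rule set might involve (hypothetical names, not the actual TCG protocol): the host hashes its current firewall rules, signs the hash with its hardware key, and the network side admits the host only if the signature verifies and the hash matches the organisation's approved policy.

    # Sketch of rule-set attestation for a distributed firewall (hypothetical).
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    host_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # per-host hardware key

    def attest_ruleset(rules: list[str]) -> tuple[bytes, bytes]:
        # Host side: hash the rules in a canonical order and sign the digest.
        digest = hashlib.sha256("\n".join(sorted(rules)).encode()).digest()
        return digest, host_key.sign(digest, padding.PKCS1v15(), hashes.SHA256())

    def admit_host(digest: bytes, signature: bytes, approved_digest: bytes) -> bool:
        # Network side: check the signature and that the rules match policy.
        try:
            host_key.public_key().verify(signature, digest, padding.PKCS1v15(), hashes.SHA256())
        except InvalidSignature:
            return False
        return digest == approved_digest

    rules = ["deny inbound tcp port 23", "limit outbound smtp to 10 messages/minute"]
    digest, sig = attest_ruleset(rules)
    print(admit_host(digest, sig, approved_digest=digest))   # True only if the rules match policy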

How TC works

TC-capable hardware must be manufactured with a public/private key pair. The private key is held securely by the chip, and is never released. Ideally, the manufacturing process destroys all records of the private key. The chip is tamper-proof (it self-destructs rather than give up its private key). Memory is curtained, to prevent debuggers and other software from getting the private key, for example during signing operations. Applications authenticate themselves to a server by sending the hardware's public key to the server, together with a digital fingerprint of the application. The server checks that both the hardware and the application are trusted before sending the content (see the attestation protocol box). Servers need to know the set of valid public keys.
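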
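The sketch below gives a rough picture of that exchange (Python; the message format and names are invented, and the real protocol in [2] involves more steps): the client sends its hardware public key and a fingerprint of the running application, and the server releases content only if both appear on its lists of trusted values and the signature verifies.

    # Approximate, hypothetical sketch of the attestation exchange described above.
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Per-machine key pair fixed at manufacture; the private half never leaves the chip.
    chip_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    chip_pub_pem = chip_key.public_key().public_bytes(
        serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo)

    APP_BINARY = b"...media player binary..."

    def client_attest() -> dict:
        # Client side: hardware public key, fingerprint of the app, and a signature.
        fingerprint = hashlib.sha256(APP_BINARY).digest()
        return {"public_key": chip_pub_pem,
                "fingerprint": fingerprint,
                "signature": chip_key.sign(fingerprint, padding.PKCS1v15(), hashes.SHA256())}

    TRUSTED_KEYS = {chip_pub_pem}                          # servers must know the valid public keys
    TRUSTED_APPS = {hashlib.sha256(APP_BINARY).digest()}   # and the fingerprints of trusted apps

    def server_release_content(msg: dict) -> bytes:
        # Server side: release content only to trusted hardware running a trusted app.
        if msg["public_key"] not in TRUSTED_KEYS or msg["fingerprint"] not in TRUSTED_APPS:
            raise PermissionError("untrusted hardware or application")
        key = serialization.load_pem_public_key(msg["public_key"])
        try:
            key.verify(msg["signature"], msg["fingerprint"],
                       padding.PKCS1v15(), hashes.SHA256())
        except InvalidSignature:
            raise PermissionError("attestation signature does not verify")
        return b"...content, released only to this trusted platform..."

    print(server_release_content(client_attest()))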

Memory curtaining is a strong, hardware-enforced memory isolation feature that prevents untrusted programs from reading the memory allocated to trusted programs. TC-compliant hardware must also have secure IO, to address the threats posed by keycatchers, screen grabbers and sound recording devices. A keycatcher is a hardware device between the keyboard and the computer, which records what you type. A screen grabber records what is displayed on the screen. Secure IO can also guarantee that input is provided by a physically-present user, as distinct from another program impersonating a user.

Why TC is a bad thing

TC has been much criticised by respected commentators, and with good reason. It removes control of the PC from its owner/user, and gives the control to the software and content provider. This can easily be abused. For example, TC can enforce censorship; if someone writes a paper that a court decides is defamatory, the author can be compelled to censor it by withdrawing all access rights -- and the software company that wrote the word processor could be ordered to do the deletion if she refuses [3].

TC will also allow software companies to increase their monopolies. You may feel it is hard to migrate from MS Office today, because it is accepted as an industry standard. But in a TC world, this lock-in will be even harder to break. Companies will receive TC-Office documents, and will need TC-Office to read them. Moreover, they will need to keep paying the rent for TC-Office in perpetuity, if they want to continue to have access to their archives. Home users will need it too, in order to read their gas bill.

"Trusted Computing" means PCs are more trustworthy from the point of view of software vendors and content providers, but less trustworthy from the point of view of their owners. It means your computer is working for other people, not for you. It gives them complete power over what your computer does, and it prevents you from even knowing in what ways it is using against you information which you have provided.

Will TC take off, or will it die?

Replacing the PC with a closed platform is obviously impossible, because PC buyers value precisely the fact that they can run any software they like, and that they have control over their computer.

TC allows the freedom of the open platform to coexist with the security of the closed platform. It allows the restrictions to be introduced gradually. Users' objections will be assuaged by the reassurance that TC can be turned off, so it needn't seem such a threat. But eventually the price of turning it off will be too great. At work, you will need it to read TC'd emails and documents that are being sent to you. At home, you will need TC to communicate with your bank, your city council, and your entertainment provider; and you will need it because it will become your company's policy for teleworkers. Increasingly, the peripheral hardware you buy for your computer will only work with the TC mode of your computer. The non-TC world will continue to exist, but soon it will be perceived as GNU/Linux is today: great because it gives you more freedom, but a pain because it gives you less choice.

The counter-argument is persuasive too. People won't use it if it stops them doing what they want to do. Why should I rent music from Sony, and put up with all the TC restrictions, when I can have it for free, and without restrictions, from my friend who has produced an open MP3 version? The first bank that enforces TC will find its customers preferring to move to another bank. If TC is a way of making the Chinese pay for software, they won't use it. Nor will students, hobbyists and enthusiasts. If these people don't use it, it may fail. Even if you are prepared to try to live with the costs and restrictions that TC seeks to impose, the sheer difficulty of coping with its constant checking, attesting, and nannying may make you turn it off. Companies can't even tolerate the intrusion of Microsoft's XP activation feature, so still less will they tolerate attestations on the network and through the firewall every time their staff want to open a document.

Another argument why TC won't work for DRM is that there will be cracks and workarounds. Note, however, that cracking TC is harder than (say) cracking the DVD encryption mechanism CSS (which has been cracked). That is because TC is designed not to be BORE: break-once, run everywhere. BORE means that one person cracks it, and everybody benefits. Individual hardware keys and the attestation protocol help ensure that the fact of one person cracking it doesn't help anyone else.

So cracks may take longer, but workarounds are relatively easy. Even if the pathway is encrypted all the way to the speaker, I can still record what comes out of the speaker, and then create an MP3 out of that. That's BORE. Some quality is lost, for sure, but only once.

If I want to forward a TC'd email with non-forward restrictions, I may have to resort to taking a digital photograph of my screen displaying the email. I can email that to whoever I like, together with an OCR'd version for easier reading.

So in the end, TC won't work. And there are reports that even Microsoft is beginning to think so. The problems it addresses will be solved by other means. For media content, easy distribution on the web will never go away, so we will have to find other ways of rewarding artists. In the office, TC offers a sledgehammer to crack a nut; the problems of privacy, confidentiality and authenticity can be solved more readily and more fairly by open technologies like PGP.

Resources

  1. Microsoft's NGSCB papers, including some technical information.
  2. Tal Garfinkel, Mendel Rosenblum and Dan Boneh, Flexible OS Support and Applications for Trusted Computing. Gives some detail on a possible protocol (described in these notes).
  3. Ross Anderson, Trusted Computing FAQ. An excellent source of information, with lots of links and references.
  4. Richard Stallman, Can You Trust Your Computer?
  5. Seth Schoen, Trusted Computing: Promise and Risk.