Where were ye, Orange Book ...

By David Collier-Brown
December 21, 2013

Big chunks of our current "NSA" problems were solved long ago in computer science. Unfortunately, they were solved back in the mainframe era, and the solutions were put on the shelf as too expensive. It's now time to bring some of them back.

The Bank and I
Imagine the files and programs on my phone have labels on them. My banking program has one label that says "The Bank", while another says "David Collier-Brown". The files it creates have the same labels, and no program can read them unless it has both. The banking program will send carefully selected information to programs that have just my label on them. This happens to include my printer and email programs, so I can email or print my bank statements and holdings. It can send much more to the bank itself, labeled with both the bank and my name. Let's call these labels (M & B), for me and the bank.

When written to files, the labels take the form of public keys. Anything encrypted with my private key and the bank's public key can be read by people who have my public key and the bank's private key. My public key is easy for anyone to get, so the bank can have a copy, but the bank's private key is very closely guarded, so only the bank is likely to have it. That allows the program to send encrypted files to the bank over ordinary insecure networks without anyone being able to read them.

There are also other labels used by the machine. One of these is the family key, F, which stands for both myself and my wife. Any bank accounts we share are labeled (F & B), and my wife and I have copies of the private key for it. That allows me to access our shared accounts as well as my private ones, but not her private ones.
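The access rule above can be sketched in a few lines of Python. This is a deliberate simplification: labels are modeled as plain sets and the "keys" are just names, where a real system would enforce the rule with the encryption just described. All the names here are illustrative, not from any actual implementation.

```python
# Sketch of the label rule: a file carries a set of labels, and a
# reader can open it only if they hold a key for EVERY label on it.

def can_read(reader_keys, file_labels):
    """True only if the reader holds keys covering all of the file's labels."""
    return set(file_labels) <= set(reader_keys)

# Labels: M = me, W = my wife, F = the shared family key, B = the bank.
statement = {"M", "B"}      # my private bank statement
joint_account = {"F", "B"}  # an account my wife and I share

my_keys = {"M", "F"}        # keys I hold
her_keys = {"W", "F"}       # keys my wife holds
bank_keys = {"B"}           # keys the bank holds

# The bank and I together can read my statement; I alone cannot,
# and my wife can reach the joint account through the family key F.
print(can_read(my_keys | bank_keys, statement))        # True
print(can_read(my_keys, statement))                    # False
print(can_read(her_keys | bank_keys, joint_account))   # True
```

The subset test is the whole trick: adding a label to a file can only shrink the set of readers, never grow it.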

My Company and I
I work for a company, C, that wants their data treated with as much seriousness as my bank account. It in turn supplies two other companies, A and B, who are competitors of each other. A does not want to share things with C if they might leak out to B. B has the same concerns about leaks to A.

We therefore have files with labels like (C & A), which are only readable by people who have been approved by both companies to use them. If I wanted to send a file to one of my colleagues who had a label of just C, it would be useless to them: they only have half of what they need to decrypt it.

What I would have to do in that case is send the file to A's security officer, who has a program that can remove A's label from a file. If they agreed, they'd send back a copy labeled with just C. If they didn't agree, they wouldn't remove A from the label, and the file would remain unreadable to people with just label C.
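That declassification step can be sketched the same way as the label rule itself; again, this is an illustrative model in plain Python, with made-up names, not a real security officer's tool. The one property worth showing is that only someone with authority over a label can strip it.

```python
# Sketch of declassification: stripping a label from a file requires
# the approval of whoever controls that label.

def remove_label(file_labels, label, approver_authority):
    """Return a relabeled copy if the approver controls the label;
    otherwise return the labels unchanged (request denied)."""
    if label in approver_authority:
        return set(file_labels) - {label}
    return set(file_labels)

doc = {"C", "A"}          # a file shared between companies C and A
a_officer = {"A"}         # A's security officer controls label A
just_me = set()           # I have no authority over A's label

released = remove_label(doc, "A", a_officer)  # now labeled {"C"} only
denied = remove_label(doc, "A", just_me)      # still labeled {"C", "A"}
print(sorted(released), sorted(denied))
```

Note that the denied request hands back an unchanged copy rather than an error: the file simply stays unreadable to anyone missing A's key, which is the point of the scheme.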

If A wanted a copy readable by anyone in their company, they would have to apply to me to have me remove C from the label.

If we were writing programs with A and B, we'd have all three labels on them. If A decided that the part of the program they wrote shouldn't be given to B, their security officer wouldn't agree to take their label off.

Alternatively, A might only agree to let B have the program if everyone else in the world did too, and would only agree to take their label off if we and B would agree to remove ours, too.

This doesn't keep someone from B who's part of the three-company team from printing it out and typing it back in again, but people would notice if B were to try to walk out of the building carrying printouts with "(A & B & C) Confidential" printed in bright red across every page.

The Orange Book
What I've just described was implemented and put into use in 1985, from a project that started in the U.S. Department of Defense in 1967 to provide adequate security for computers used by the military. When it started, all there were were mainframes and ideas about future personal computers. Shortly afterward, Data General introduced its Nova minicomputer. In 1984, the Apple Mac was introduced, and in 1985 the Trusted Computer System Evaluation Criteria was finally published as an orange-covered paperback.

A few trusted systems were built: the NSA bought one, "Dockmaster", a Honeywell Multics B2 mainframe, and Sun and HP briefly produced machines that managed to meet the lesser B1 criteria, but just barely. The technology of the day limited its use to substantially powerful machines, and to companies that had the expertise to implement labels without cryptography.

Nowhere in the standard was there mention of the cryptography that would make it practical. Considering that public-key encryption was invented in 1976, this was nearly inexcusable.

Several other key technologies have also come into existence in the years since 1985, including system integrity features and trusted path in hardware, and multi-factor authentication for ensuring one person can't log on as another. Not the least of these technologies is sheer computing power. I have more power today in my wristwatch than Dockmaster ever possessed!

It's Long Past Time
It's now worthwhile to make a serious effort to provide decent, workmanlike computer security to everyone with a laptop, tablet, or phone. It's time to search out the successes of the past and see if they will help us with the problems of today.

David Collier-Brown

[Ps: the poem that the "where were ye" line comes from was

Where were ye, Nymphs, in what sequester'd grove

Where were ye, Nymphs, when Daphnis pin'd with love

Did ye on Pindus' steepy top reside

Or where through Tempe, Peneus rolls his tide

- The Idylliums of Theocritus]
