Designing for CIA

May 28th, 2014 | Posted by Don Boylan in Availability Management

Many years ago, I worked for a company that specialized in producing slides from PowerPoint (remember slides, those little translucent pieces of plastic you would shine light through?). We were a Microsoft partner and what the industry called “in the box” with PowerPoint (remember when software came in boxes?). When you installed PowerPoint, the installation software would automatically set up a new printer that, when printed to, would produce a file. You would then send the file to our service center in Memphis, Tennessee via a modem hooked to your computer/telephone line (remember modems?). Because we were located in Memphis next to the FedEx airport terminal, you could upload your presentation to us at 10:00pm, and we could have the physical slides to you by 10:30 the next morning.

The reason I’m giving this background is to create a building block for ITIL’s remaining coverage of security. When ITIL was first released, it came out with a supplementary book devoted to IT security. Indeed, when I took the ITIL v2 Foundation class, the course covered the ten ITIL processes described in the Service Support and Service Delivery books, and then threw in a Xerox copy of a few pages from the Security Management book to study. Those few pages covered the only aspect of security that ITIL v3 still addresses. The rest of the security topics that were covered in ITIL v2’s Security Management book have been deferred to the acknowledged authority for IT security – ISO 27001.

ITIL v3 discusses the importance of addressing the management of data and information in the Service Design phase. The scope of concepts surrounding the management of data has grown dramatically, but three important security aspects remain:

  • Confidentiality
  • Integrity
  • Availability

These are the CIA of IT security as it relates to data management in the design of a service. These aspects must be addressed during design, because design failures become significant risks once the technology is in production. If you want to know how significant those risks can be, ask the former CIO of Target.

That leads me back to when I worked at the slide company that produced shiny pieces of plastic from PowerPoint. Our company was the premier service provider for PowerPoint and, at the time, PowerPoint owned both the Mac and PC presentation market. For this reason, I got to see all sorts of cool industry presentations, like previews of all the slides for the Comdex keynote addresses days before the conference started (remember Comdex?).

One day I was playing on our DEC VAX (remember DEC VAXs?) teaching myself VMS (remember VMS?) when I noticed that there had been a recent login to the DEC maintenance account. That was strange. The DEC maintenance person showed up twice a year with a mini-vacuum cleaner and cleaned the dust out of the CPU cabinet, verified that all the hardware was working properly, and loaded any critical patches, but he hadn’t been around for months (um… why doesn’t Dell come out with a vacuum twice a year to clean my system?).

I started poking around in the log files and noticed a LOT of activity from accounts that hadn’t been used in years. Programmer and administrative accounts had all been accessed for multiple hours every day for the previous three months.

Uh oh, we had hackers.

The first thing I did after telling my boss was call the FBI. Yup, back in those days, if you had a hacker break into a system, you called the Feds. We called the Feds because the hackers had uncovered issues with our system’s CIA (Confidentiality, Integrity and Availability) design.

It turns out that the program we used to upload the files created by the print driver was just a slightly modified version of a very old file transfer program called Kermit (no one remembers Kermit). If you used the DOS-based version of the Kermit utility, dialed into our VAX via modem, and pressed ctrl-c after being prompted to begin the file upload, the VAX would drop you to a system prompt with admin privileges. The hackers had apparently found this defect in our security and started setting up fake accounts that mimicked programmer and admin accounts so they could dial in directly without relying on the Kermit utility.
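The core defect was an unhandled interrupt: ctrl-c escaped the restricted transfer utility straight into a privileged shell. A modern equivalent of the missing control, sketched in Python (the function names are illustrative, not anything from our actual system), is to trap the interrupt inside the restricted entry point and terminate the session rather than fall through:

```python
import signal
import sys

# Hypothetical restricted upload front end. The lesson from the Kermit defect:
# an interrupt must never escape the restricted program into a shell, so we
# install a handler that ends the session cleanly instead.

def handle_interrupt(signum, frame):
    print("Transfer aborted; closing session.")
    sys.exit(0)  # terminate the session -- do NOT drop to a prompt

def run_upload_session():
    signal.signal(signal.SIGINT, handle_interrupt)
    print("Ready to receive file. Press ctrl-c to abort.")
    # ... receive file bytes here ...

run_upload_session()
```

The same principle applies to any restricted front end: every escape path (interrupts, errors, EOF) should end the session, never hand the caller a more privileged context.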


Worse, the print driver that prepared the PowerPoint file for upload captured the user’s contact information, shipping address, and payment information, then simply prepended that information to the PowerPoint file and gave the file a different extension. Oh, and the prepended information was stored as plain, unencrypted text.

Oh shit! The hackers had access to all of our customers’ credit card information.

This was a failure of Confidentiality in the design of our system. There should have been controls in place to ensure that only authorized users could see confidential data.


It is actually fairly rare for anyone to physically look at 150 slides before they start displaying them on the big screen. Wouldn’t it have been hilarious for the hackers to have secretly inserted a few well-worded slides into the middle of John Dvorak’s, Jim Seymour’s, or Peter Norton’s Comdex keynote address? Therein lies the issue with our system’s Integrity design. There was no way to verify that the files sent to us by our customers were actually the files we imaged onto 35mm slides.
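The missing Integrity control is straightforward to sketch. Assuming the upload client and the service center shared a secret key (something our system never had), an HMAC over the file would let the center verify that the slides it images are the bytes the customer actually sent; an inserted slide changes the digest. The key and file contents below are illustrative:

```python
import hashlib
import hmac

# Hedged sketch of a file-integrity check: client signs the upload with a
# shared secret, the service center recomputes the tag before imaging.

SHARED_KEY = b"per-customer secret issued with the upload software"

def sign_upload(file_bytes):
    return hmac.new(SHARED_KEY, file_bytes, hashlib.sha256).hexdigest()

def verify_upload(file_bytes, claimed_digest):
    expected = sign_upload(file_bytes)
    return hmac.compare_digest(expected, claimed_digest)

deck = b"Comdex keynote slides"
tag = sign_upload(deck)
assert verify_upload(deck, tag)                        # untampered file passes
assert not verify_upload(deck + b" extra slide", tag)  # inserted slide fails
```

`hmac.compare_digest` is used instead of `==` so the comparison runs in constant time, which avoids leaking the correct digest through timing.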


Lastly, our bank of modems was the starting point for all of our company’s revenue. It was a major issue when the telephone lines failed, or we had a hardware issue with any of the twelve 2400 baud modems (remember when we measured our communication speed in baud?). Our logs showed that the hackers had been using up to four of our precious in-bound telephone lines at a time, essentially locking out our paying customers from being able to send us their files. This reduction in Availability directly affected our company’s ability to make money.
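The Availability fix is essentially a per-caller cap: with only twelve lines, no single caller should be able to hold four of them. A toy sketch of the idea (the limits and caller IDs are made up for illustration):

```python
from collections import Counter

# Sketch of an Availability control: cap concurrent connections per caller so
# one party cannot starve paying customers of the shared modem lines.

TOTAL_LINES = 12
MAX_LINES_PER_CALLER = 2

active = Counter()

def accept_call(caller_id):
    if sum(active.values()) >= TOTAL_LINES:
        return False                      # every modem line is busy
    if active[caller_id] >= MAX_LINES_PER_CALLER:
        return False                      # caller already has their share
    active[caller_id] += 1
    return True

def end_call(caller_id):
    if active[caller_id] > 0:
        active[caller_id] -= 1

assert accept_call("hacker")
assert accept_call("hacker")
assert not accept_call("hacker")   # third simultaneous line refused
assert accept_call("customer")     # paying customers still get through
```

It is the same reasoning behind modern rate limiting: a fairness cap on a shared, finite resource is part of the design, not an afterthought.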

The Feds discovered that the hackers were all from a college in upstate New York. They never told us the outcome of their investigation. We quickly added the necessary controls to block the hackers’ access to the VAX, but it was a valuable lesson learned.

Design for CIA.
