Information assurance must adapt to changing technology

The new semester has begun at Norwich University, and we've had a couple of class sessions for the IS340 Introduction to Information Assurance course. One change I've made in the last year is that I no longer inflict death by PowerPoint on my students. Instead of pontificating at them, I distribute printouts of the lectures (six slides per page), make the PDF and PPTX versions of the slides available through a folder on the course Web site, and guide the students in vigorous discussion at every class meeting. We also use a learning platform (an implementation of Moodle) for online discussions, some tests, and submission of assignments.

After a first session discussing the grading plan, syllabus, requirements for essays and presentations, academic honesty and so on, we turned in the second meeting to a discussion of the history and mission of information assurance (IA).

The slides for that part of the course include pictures of computational equipment all the way back to the abacus. I asked the students what information security involved in the days before even the abacus, when people calculated using small stones (calculi in Latin). The stones helped the calculators keep track of the numbers they were working with. So what kinds of security issues were significant back then?

The students thought of surreptitiously removing stones in the middle of a calculation or adding stones; either would corrupt the results. These breaches would involve physical security. Then what, I asked, were the key attributes of the information affected by such manipulation of the stones? They immediately answered that tampering with the stones would affect the integrity of the calculations. And if the theft or insertion of stones were noticed, there could be delays in finishing the work: a denial of service.

We talked about the kinds of harm that might befall users of an IBM 1401 computer in, say, 1966 (the machine I personally used for FORTRAN IV-G programming when I entered McGill University that year). Did we have to cope with worms? With damage from hackers interfering with the operation of our programs? Not really: the 1401 was not a multiprogramming system; it ran exactly one program at a time for a single user. Memory was cleared after each program, so one program left no residual effects on the next to be run. Somebody could mix up our punch cards, remove some, or even add some, but they'd have to attack our program by physical means, not electronic ones.

So what's the point of discussing old technology in an IA course?

In our discussion, I made it clear that the issue is that IA must adapt to changing technology. We discussed possible security issues when direct neural interfaces allow direct brain-to-computer operations [see Wolpaw et al. 2000, "Brain-Computer Interface Technology: A Review of the First International Meeting," IEEE Transactions on Rehabilitation Engineering 8(2):164] and perhaps even direct computer-to-brain interactions. In addition to the usual effects of man-in-the-middle attacks on the communications channels, such interference in brain-computer-brain interactions (computer-aided telepathy) could lead to new forms of propaganda. The folks who like to wear aluminum foil deflector beanies to prevent control waves from affecting their brains might actually have a point if governments, advertisers, and anyone else could beam specific thoughts and impressions into our minds without permission.

I told my young students that whatever changes in information technology they face, they must adapt to the new aspects of IA. They must never become stick-in-the-mud conservatives who snarl that in their day they did things differently, and who resist change simply because they don't like it.

I told them the story of a programmer called Jacques who worked in the data center where I was director of technical services in the mid-1980s. We had just started using RELATE/3000, an early relational database management system (RDBMS) with its own fourth-generation language (4GL) for report-writing. Jacques was constantly causing the interpreter to crash on a stack overflow. I investigated and discovered that he was spelling out the precise location of every field in his reports using the RDBMS macro language instead of allowing the report writer to do its work and place the data automatically on the page, something that would have taken two lines and 60 seconds. So why was Jacques doing this? Because, he said, that's the way he had done it in COBOL for many years, and it gave him the precise control over layout that he was used to.
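
For readers who never used a 4GL report writer, a rough sketch of the contrast in modern terms may help. The Python below is purely illustrative (the field names and column positions are my own invention; RELATE/3000's macro language looked nothing like this): the first function hand-places every field at a fixed column offset, Jacques-style, while the second declares the layout once and lets the formatter place the data.

    # Illustrative sketch only; hypothetical layout, not RELATE/3000 syntax.
    rows = [
        ("1001", "WIDGET", 4.25),
        ("1002", "GADGET", 9.10),
    ]

    # Jacques' approach: position every field by hand, offset by offset.
    def report_by_hand(rows):
        for part, name, price in rows:
            line = [" "] * 40                         # blank 40-column print line
            line[0:len(part)] = part                  # part number at column 1
            line[10:10 + len(name)] = name            # description at column 11
            price_str = f"{price:8.2f}"
            line[28:28 + len(price_str)] = price_str  # price at column 29
            print("".join(line))

    # The report-writer approach: declare the layout once per line.
    def report_declarative(rows):
        for part, name, price in rows:
            print(f"{part:<10}{name:<18}{price:>8.2f}")

The second version is the "two lines and 60 seconds" solution; multiply the first version's bookkeeping by dozens of fields per report and it's easy to see why the interpreter's stack suffered.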

I cheerily informed him that if he overflowed the stack again by using the macro language in this way, I'd have him fired.

He changed his programming style right away once the situation became clear to him, and he used the RDBMS properly from that point on.

Professionals adapt to change. Period.

