Social engineering in penetration testing: Cases

* Penetration tests with a social-engineering angle

My friend and colleague Dr. John Orlando helped create the Master of Science in Information Assurance at Norwich University and has been teaching ethics courses for many years. He recently wrote a paper on the ethical dimensions of social engineering as a tool of penetration testing and has kindly allowed me to publish an edited version of his work for Network World readers.

What follows in this column and the next is entirely Orlando’s work with minor edits.

* * *

Penetration testing is an important means of assessing the strength of an organization’s information security program. A security system may look good from the inside, but a test is an excellent way to determine if it will hold up under pressure. These tests can range from simple port scans to all-out hacking attacks.

However, since security depends on people, not just on technology, social engineering is one possible tool for use in penetration tests. Deception is a common means of breaching a security system, and a social engineering test can ascertain the strength of policies and how well employees follow those policies.

Yet the use of social engineering in penetration tests raises ethical issues, because humans are being used for research purposes. Abuses such as the Nazi experiments on prisoners and the Tuskegee Syphilis Study have led to a body of widely accepted guidelines for the ethical use of human subjects in research. I will draw upon human-research principles and a few sample cases to identify ethical guidelines for the use of social engineering in penetration testing.


Piggybacking: A security consultant wearing a suit and tie and carrying a briefcase stands at the front entrance to a corporation. He waits for an employee to unlock the door with her ID scan and follows her in.

Shoulder Surfing: A security consultant notices employees standing outside a door smoking on their break. He walks over, mills about, and looks over employees' shoulders as they enter the keypad code to reenter the building. With that information he lets himself in.

Computer Technician: Two security consultants walk into an office wearing “Computer Doctors” jumpsuits. They tell the administrative assistant that they have an order to fix the system. The assistant says, “Mr. Smith did not tell me about this, and he’s on vacation today and can’t be reached.” They reply, “We’re booked for the next two weeks. The system is overheating and could melt down at any moment. If it burns up because we were not allowed to work on it, somebody’s going to get fired. Are you sure you didn’t forget the order?” The assistant nervously lets them in.

Bribery: A security consultant posing as a representative of another company approaches an employee outside of work and offers him $50,000 to get some memos concerning the company’s plans for a new product.

In the next column, Orlando presents his analysis of the ethical issues raised by these applications of social engineering. In the meantime, readers may want to apply the principles discussed in the recent series of columns about ethical decision-making and come to their own conclusions before reading his comments.

* * *

John Orlando, MSIA, PhD, is Instructional Resource Manager in the School of Graduate Studies at Norwich University. He earned his doctorate in philosophy from the University of Wisconsin at Madison in 1993 and has more than a decade of experience in online university education. He teaches undergraduate ethics and philosophy courses at Norwich and can be reached by e-mail



Copyright © 2007 IDG Communications, Inc.