DEC co-founder writes memoir, traces company's rise and fall

Computer pioneer Harlan Anderson on competing vs. IBM, man/machine interaction and Digital Equipment's lack of software expertise

Harlan Anderson, who founded Digital Equipment Corp. with Ken Olsen in 1957, has written a new book on his days as a computer pioneer: "Learn, Earn and Return: My Life as a Computer Pioneer," published by Locust Press (read an excerpt here). In it, he chronicles his humble beginnings on an Illinois farm up through his first interactions with computers at the University of Illinois, large-scale projects at MIT's Lincoln Lab, and then founding, growing and watching, from afar, the ultimate demise of DEC. Anderson shared some thoughts on DEC and computer advancements in general with Network World Managing Editor Jim Duffy.

What made you and Ken Olsen decide to start Digital Equipment?

We were restless. This giant [SAGE Air Defense] project at MIT was moving into the implementation phase. No longer was it the forefront of adventurous exploration of new technology. One of the things that was not uppermost in our minds was going out to make lots of money.


What was DEC's biggest contribution to the industry?

Bringing the man/machine interaction capabilities to the commercial world. That was the initial important contribution. Software … was never Digital's strong suit for many, many years. The other thing was, the computer – compared with IBM machines at the time – was cheap enough that it was economical to let somebody tinker with it. It was uneconomical for a several-million-dollar machine to be tied up by one programmer. Time sharing was an attempt to get around that. Time sharing was the concept to allow diddling.

What was DEC's biggest mistake?

There weren't a lot of serious mistakes until near the end because it had an enormous rise. The decline occurred for two reasons. First, when the VAX strategy was wearing out and no longer propelling the company, there was no new computer strategy that could come close to it. Second, the innovation in the industry was moving from hardware to software, which was not part of DEC's strong suit. They did software because they had to, but they were certainly not a leader in it.

Another was when the personal computer came along the industry moved to standards. Microsoft Windows became the default standard and all applications were programmed to work with that operating system standard. DEC had thrived on proprietary things which were not compatible. [Legendary DEC engineer] Gordon Bell pointed out that the company had the technology – it knew how to build the personal computer, it had the Rainbow – but there was a lot of infighting at the company at the time about which one of the different approaches was the right one to take, but neither was compatible with the industry. The matrix form of organization made it very difficult to make decisions. Decisions had to be based on consensus and you had two competing groups that were strong in their different ideas in how to do it. They really missed the boat for a variety of reasons but that's probably the biggest thing that started the downward spiral of the company.

How did it feel to realize you were going to compete with IBM after years of collaborating with them at MIT?

That wasn't the primary focus, worry or concern because we didn't plan to compete directly with them. We were trying to nibble around the edges. The investors encouraged us to change the name from Digital Computer Co. to Digital Equipment. We didn't feel we were competing with IBM at that time, and for a long time. We sold a lot of computers into high energy physics research where they were used by physicists to track the results of their experiments. They collected tons of data that they couldn't analyze manually.

Our first computer was called the Program Data Processor (PDP). It kept us out of the gun sights of IBM. We never would have guessed in the beginning where our market was going to be. It was basically high-speed, high-performance (computers), without software, and it was kind of the better mousetrap concept. It was for those people trying to use computers for unusual applications like communications, pattern recognition, etc. Those people would recognize the inherent quality of the computer, its performance, and weren't overwhelmed by the fact that we were not prepared to provide a lot of software, backup support or hand-holding like IBM was well known for.

What do you consider the most groundbreaking development in computer technology?

From a hardware standpoint, it was magnetic core memory. Its predecessor was storage tube technology which was cumbersome, not reliable, and expensive and slow. It was really choking the future of big, high-speed computers.

The most significant development in terms of software and applications came directly out of MIT – the ability for man and machine to work together in real time with real data. The ability to collect data in digital form over communications lines and analyze it and combine it from different sets and make a meaningful picture.

What's been the most significant development in the computer industry over the last 40 years?

The fundamental technology thing that I think was crucial in propelling the industry was cheap memory. When we developed the memory test computer, that magnetic core memory was costing a dollar a bit. Now for a couple of bucks you get a billion bits. That has allowed lots of things that would have been totally uneconomical. The idea of computers being fast enough to generate frequencies that sounded like music has been around a long time, but it was not practical other than as a stunt, because memory was too expensive.

The idea of giant databases – Google's ambitious thing to digitize every book in the world – that's awesome. And the government's idea to digitize everybody's medical records is another awesome, and scary, thing. All of that's possible because of cheap memory.

Another significant development was fiber optics and what it's meant to cheap communications. Put those two together and it's pretty all-encompassing.

Did you ever foresee the impact the computer age would have on society?

No way. To us it was just a challenge to make things cheaper, and stay alive. One of the very important things to us was, we stumbled into a group of extremely intelligent visionary users of our equipment.

What's the most important contribution computers could make to society? What's the most dangerous?

The biggest contribution is probably bringing educational opportunities and information to remote parts of the world, where it's been difficult for economic and social advancements to occur because they were isolated. Creating opportunities related to better information sources is the broad (contribution) that computers can bring.

The danger is a threat to our privacy and too much information of the wrong kind in the hands of the wrong people. We make a lot of charitable contributions and lots of the organizations find it easy and convenient to put on their Internet Web sites a list of all the contributors. What happens? Google tracks down all of this stuff and I'm kind of embarrassed they're there. I never meant that to be a public endorsement of that organization or an invasion of my privacy. And yet it just happened. So I would say government and other invasions of people's privacy is a new, or growing, danger.

