Network World - Austin, Texas -- The Design Automation Conference (DAC) held here this week celebrated its 50th anniversary, prompting the industry and academic community that poured so much energy into chip, software and hardware design both to look back and to predict what the future may hold.
The first DAC in 1964 brought the electronic design automation pioneers together in Atlantic City, in an era led by Bell Labs and the aerospace firms. It was also the year the Beatles arrived in America, pointed out William Joyner, director of computer-aided design and test at Semiconductor Research Corp., in his talk. Joyner is credited with developing the first practical logic-synthesis design tool, and was formerly director of research at IBM's Thomas J. Watson Research Center.
Not unlike today, there were worries expressed back in the 1960s that computer-based processing was going to wipe out jobs. The keynote address presented at DAC that year by a U.S. government representative raised the issue of "whether we are going to have people on breadlines because of design automation," Joyner said. The technology explosion that followed brought more decades of DACs highlighting the latest research from industry and academia pushing chip and software design forward.
However, as he reviewed papers from the first 25 years of DAC in preparation for his talk, Joyner said he could not help but notice how demeaning attitudes toward women were in the 1960s. He cited a number of references in these old research papers, written by men, in which the authors described ordering up a small army of "girls" to be assembled to perform manual tasks on the technical projects they were running. Session chair Soha Hassoun, associate professor in the Department of Computer Science at Tufts University, could only roll her eyes and express thanks that she didn't live through that era.
Rob Rutenbar, professor of computer science at the University of Illinois at Urbana-Champaign, then recapped the past 25 years of DAC, where the focus was on simulation, transistors, and the first symbolic model checking. The early 1990s were an age of "embedded systems" and "just big chips" in which "formal verification breaks out as a category," as researchers and industry sought to become more rigorous in their designs to manage flows and power. In the early part of this century, "timing and interconnect" and radio frequency came to the fore, and new frontiers arose, such as systems and heterogeneity.
In 2002, the first "network on a chip" was developed by William Dally and Brian Towles, and it "spawned an industry," Rutenbar said. From then on, systems have been the focus, with multi-core systems on a chip. "What's new is communications" that are configurable and programmable, Rutenbar added.
The era of complementary metal-oxide semiconductor (CMOS) technology is largely on the way out, and that of nanometers and the FinFET, a term coined by University of California, Berkeley researchers for a multi-gate transistor architecture, has begun.