Austin, Texas -- The Design Automation Conference (DAC) held here this week celebrated its 50th anniversary, prompting the industry and academic community that has poured so much energy into chip, hardware, and software design both to look back and to predict what the future may hold.
The first DAC, in 1964, brought the pioneers of electronic design automation together in Atlantic City, in an era led by Bell Labs and the aerospace firms. It was also the year the Beatles arrived in America, noted William Joyner, director of computer-aided design and test at Semiconductor Research Corp., in his talk. Joyner, formerly director of research at IBM's Thomas J. Watson Research Center, is credited with developing the first practical logic-synthesis design tool.
Not unlike today, there were worries expressed back in the 1960s that computer-based processing was going to wipe out jobs. The keynote address presented at DAC that year by a U.S. government representative raised the question of "whether we are going to have people on breadlines because of design automation," Joyner said. The technology explosion that followed brought more decades of DACs that highlighted the latest research from industry and academia pushing chip and software design forward.
However, while reviewing the now-bygone papers from the first 25 years of DAC in preparation for his talk, Joyner said he could not help but notice how demeaning attitudes toward women were in the 1960s. He cited a number of references in these old research papers, all written by men, in which project leaders described ordering up a small army of "girls" to be assembled to perform manual tasks. Session chair Soha Hassoun, associate professor in the Department of Computer Science at Tufts University, could only roll her eyes and express thanks that she didn't live through that era.
Rob Rutenbar, professor of computer science at the University of Illinois at Urbana-Champaign, then recapped the past 25 years of DAC, during which the focus was on simulation, transistors, and the first symbolic model checking. The early 1990s were an age of "embedded systems" and "just big chips," when "formal verification breaks out as a category" as researchers and industry sought more rigorous designs to manage flows and power. In the early part of this century, "timing and interconnect" and radio frequency came to the fore, and new frontiers arose, such as systems and heterogeneity.
In 2002, the first "network on a chip" was developed by William Dally and Brian Towles, and it "spawned an industry," Rutenbar said. Since then, systems have been the focus, with multi-core systems on a chip. "What's new is communications" that are configurable and programmable, Rutenbar added.
The era of complementary metal-oxide-semiconductor (CMOS) technology is largely on the way out, and the era of nanometers and the FinFET, a term coined by University of California, Berkeley researchers for a multi-gate architecture, has begun.
"Suddenly, trust and security show up as topics," said Rutenbar, alluding to papers given at DACs in those years. "Can I trust your fab, your chip?"
Lithography, optics, and statistics gained interest, and from 2009 on, synthesis and layout, verification and test, and reliability gained prominence. It is now acknowledged that designers must "deal with a gigantic scale" in an era where smartphones, cars, and healthcare represent new demands on computing, said Rutenbar.
And what of the next 25 years to come? Leon Stok, vice president of electronic design automation at IBM, addressed that question in his presentation, noting his ideas were also intended to reflect some thoughts he gathered in an informal survey of colleagues.
First off, Stok said the industry might well be concerned that the best engineering talent is leaving the field to join the social-networking players that are also shaking things up, “the Googles and Facebooks of the world.” He emphasized the design-automation industry needs to make a “compelling argument” to retain talent and help it grow.
It is time for another "revolution," Stok said, to come up with a new means of getting to the heart of "doing something useful," the goal of systems, rather than the vast computational work that dominates today. Design automation currently relies on disparate tools cobbled together. But the age confronting designers is that of the cloud, Stok said. "We have to get data from anywhere, any place," he said. "We need scalable design and methodologies."
One question is whether the future of electronic-design automation should be more oriented toward open source, Stok said. “Currently, everything is locked up.” And he posed the question for the future: “Would open source help to create a platform?”
Ellen Messmer is senior editor at Network World, an IDG publication and website, where she covers news and technology trends related to information security. Twitter: MessmerE. E-mail: email@example.com.