Dr. Jiang Hu is a professor and co-director of graduate programs in the Department of Electrical & Computer Engineering at Texas A&M University. He received his BS in optical engineering from Zhejiang University, and a PhD in electrical engineering from the University of Minnesota.
Dr. Hu’s research interests include electronic design automation (EDA), computer architecture, approximate computing, and machine learning for EDA.
Dr. Hu has received five IEEE conference Best Paper Awards, the IBM Invention Achievement Award, and the Humboldt Research Fellowship. He has served on the editorial boards of IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems and ACM Transactions on Design Automation of Electronic Systems. He was the general chair of the 2012 ACM International Symposium on Physical Design and was named an IEEE Fellow in 2016. Dr. Hu co-authored Machine Learning Applications in Electronic Design Automation, released in 2022.
“Today, many people are working on AI and ML for chip design,” Dr. Hu says. “It’s a definite trend not only in the research field, in academia, but also in the industry.”
The first major application of AI and ML in chip design is what’s known as design prediction. It stems from the fact that chip design is extremely complex, and decisions early on in the process have significant downstream effects. But it’s hard to gauge what those effects will be until they appear. Going back and making changes is extremely costly in time and other resources.
For decades, engineers have attempted to solve design prediction, either through simple equations and formulas or through “quick-and-dirty” prototyping that forgoes full-fledged design in favor of a rough framework; neither has been wholly successful. Conventional methods must also start from scratch each time: a prediction that works for one design cannot be reused for the next. The result is an enormous drain on resources.
“The big difference is AI/ML can extract and reuse knowledge,” Dr. Hu says. “We can use data to train an AI/ML model to quickly predict what the impact of an early decision will be on the downstream steps, and that prediction will be much faster than the traditional methods.”
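To make the idea concrete, here is a minimal sketch of design prediction as Dr. Hu describes it: a model trained on data from past designs that estimates a downstream metric from features available early in the flow. The features, the synthetic dataset, and the target metric (post-route wirelength) are all illustrative assumptions, not a real EDA dataset or any specific published method.

```python
# Minimal sketch: predict a downstream design metric (here, a made-up
# "post-route wirelength") from hypothetical early-stage features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical early-stage features for 1,000 past design blocks:
# cell count, average fan-out, placement utilization, clock target (ns).
n = 1000
X = np.column_stack([
    rng.integers(10_000, 500_000, n),  # cell count
    rng.uniform(2.0, 6.0, n),          # average fan-out
    rng.uniform(0.5, 0.9, n),          # placement utilization
    rng.uniform(0.4, 2.0, n),          # clock period target
])
# Synthetic "post-route wirelength" that depends on the features plus noise.
y = 0.002 * X[:, 0] * X[:, 1] * X[:, 2] + rng.normal(0, 50, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Once trained, the model estimates the downstream metric in milliseconds,
# instead of running hours of placement and routing to find out.
print("R^2 on held-out designs:", model.score(X_test, y_test))
print("Prediction for a new block:", model.predict(X_test[:1]))
```

The payoff is exactly the reuse Dr. Hu highlights: the knowledge extracted from earlier designs is captured in the trained model and applied to new ones, rather than being rederived from scratch.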
The other main application of AI and ML in chip design is through flow parameter tuning. There are many steps in the flow of chip design, and each step has its own software tool, designed by EDA professionals. The tools are extremely complex, with their own suite of parameters, which need to be specified by users to determine the tradeoffs of a particular chip’s design (e.g., faster processing speed versus lower power consumption).
“In the past, it was a bit of an art, with designers setting up parameters according to their experience,” Dr. Hu says. “There was no systematic way of doing it. But this is where AI and ML can help, in what we call automatic flow parameter tuning.”
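A rough sketch of what automatic flow parameter tuning looks like in code follows. The parameter names and the `run_flow` function are hypothetical stand-ins for a real tool invocation and its quality-of-results (QoR) report; random search stands in for the more sophisticated optimizers used in practice.

```python
# Minimal sketch: tune flow parameters by random search over a space of
# hypothetical tool settings, keeping whichever setting scores best.
import random

PARAM_SPACE = {
    "placement_effort":   ["low", "medium", "high"],
    "target_utilization": [0.6, 0.7, 0.8],
    "max_fanout":         [16, 32, 64],
    "clock_uncertainty":  [0.05, 0.10, 0.15],
}

def run_flow(params):
    """Stand-in for launching a real synthesis/place/route flow and
    parsing its QoR report; returns a mock score to minimize."""
    score = random.gauss(0, 0.1)
    score += {"low": 1.0, "medium": 0.5, "high": 0.2}[params["placement_effort"]]
    score += abs(params["target_utilization"] - 0.7)
    return score

def random_search(trials=50):
    best_params, best_score = None, float("inf")
    for _ in range(trials):
        params = {k: random.choice(v) for k, v in PARAM_SPACE.items()}
        score = run_flow(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

random.seed(1)
print(random_search())
```

Production tuners typically swap the random sampler for Bayesian optimization or an ML model trained on previous flow runs, but the loop structure, propose parameters, run the flow, score the result, is the same.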
Design prediction and automatic flow parameter tuning are what Dr. Hu calls the low-hanging fruit of AI/ML applications for chip design. But he also points to newer research applying reinforcement learning, like the kind behind Google’s AlphaGo, to chip design. These techniques could help with the macro placement of digital chip designs, or with tuning transistor sizes in analog designs.
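To give a flavor of the reinforcement-learning formulation, here is a toy REINFORCE loop that learns to place a few macros on a grid to shrink the wirelength of a made-up netlist. Everything here, the grid, the netlist, and the tabular softmax policy, is a deliberately simplified assumption; the published work uses deep networks over far richer state, and a real placer would also penalize overlap and density.

```python
# Toy sketch: REINFORCE (policy-gradient RL) for macro placement on a grid.
import numpy as np

rng = np.random.default_rng(0)
GRID = 8                          # place macros on an 8x8 grid
NETS = [(0, 1), (1, 2), (0, 2)]   # which macro pairs are connected
N_MACROS = 3
CELLS = GRID * GRID

# One softmax policy (a table of logits over grid cells) per macro.
logits = np.zeros((N_MACROS, CELLS))

def wirelength(cells):
    """Total Manhattan distance over all connected macro pairs."""
    xy = [(c // GRID, c % GRID) for c in cells]
    return sum(abs(xy[a][0] - xy[b][0]) + abs(xy[a][1] - xy[b][1])
               for a, b in NETS)

baseline, lr = 0.0, 0.05
for step in range(2000):
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    # Sample a placement (one cell per macro) from the current policy.
    cells = [rng.choice(CELLS, p=probs[m]) for m in range(N_MACROS)]
    reward = -wirelength(cells)               # shorter wires = more reward
    baseline += 0.05 * (reward - baseline)    # running-mean baseline
    # REINFORCE update: grad of log pi = onehot(action) - probs.
    for m, c in enumerate(cells):
        grad = -probs[m].copy()
        grad[c] += 1.0
        logits[m] += lr * (reward - baseline) * grad

best = [int(np.argmax(probs[m])) for m in range(N_MACROS)]
print("learned placement:", best, "wirelength:", wirelength(best))
```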
AI probably won’t be taking away the jobs of many engineers, but it will become a coworker to more and more engineers as time goes on. AI is best at taking over tedious and repetitive tasks. Ideally, it can function as an engineer’s creative assistant.
But the proliferation of AI in different engineering disciplines, like chip design, means that engineers and engineering students may need to adjust their skills.
“It’s important for chip designers and others working in electronic design automation to understand basic AI and ML techniques,” Dr. Hu says. “TensorFlow by Google and PyTorch by Meta are very well-known platforms, and it’ll be good for engineering students to be familiar with those tools and to understand the basic concepts of AI and ML.”
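The familiarity Dr. Hu recommends starts small. Something like the following PyTorch snippet, define a tiny network, run a forward pass, take one gradient step, covers the basic concepts he mentions; nothing in it is specific to chip design.

```python
# Minimal PyTorch exercise: one forward/backward pass on a tiny network.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(32, 4)          # a batch of 32 examples, 4 features each
y = torch.randn(32, 1)          # target values

pred = model(x)                 # forward pass
loss = loss_fn(pred, y)         # training loss
optimizer.zero_grad()
loss.backward()                 # backpropagate gradients
optimizer.step()                # update the weights
print("loss:", loss.item())
```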
At the same time, the next generation of engineers can’t neglect conventional techniques. It’s tempting for engineering students to spend all their time on AI and ML tools, but there will be some deficiencies without fundamental skills in C++ programming and algorithm design.
“AI and ML are powerful, but they’re not everything, and they have their own weaknesses,” Dr. Hu says. “We still need to have conventional techniques.”
One of the major limitations of AI and ML techniques is that they can’t guarantee accuracy. A human brain can say with certainty that two plus two is four, but AI and ML models can’t. That lack of final precision means human involvement and conventional techniques remain crucial, particularly in signoff tools: the category of tools that evaluate whether a design is ready for fabrication and verify that a finished chip will run at its stated specifications.
“There are tools for doing this type of work, but it’s not what AI and ML are good at,” Dr. Hu says.
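The two-plus-two point is easy to demonstrate. A small neural network trained on thousands of addition examples (a contrived setup, assumed here purely for illustration) lands very close to four, but it cannot certify the exact answer, which is precisely the guarantee signoff requires.

```python
# Illustration: a learned model approximates 2 + 2 but can't certify it.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(5000, 2))
y = X.sum(axis=1)                # ground truth: the exact sum

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                     random_state=0).fit(X, y)
print(model.predict([[2.0, 2.0]]))  # close to 4, but not exactly 4
```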
AI and ML are changing the future of chip design, finally addressing what’s known as the design productivity crisis. The crisis stems from the fact that the average transistor count of a chip keeps rising, pushing design complexity upward faster than society’s engineering capability can keep pace.
“There’s been an increasing gap between chip complexity and design productivity,” Dr. Hu says. “But now, with AI and ML techniques, we have a hope to reduce this gap.”
Successful applications of AI and ML in chip design have reflexive benefits. As chip design gets cheaper and more efficient, more chips will be available to boost the power of AI and ML computing. Today, large language models (LLMs) like ChatGPT are costly to train, requiring tens of thousands of high-end GPU boards. But if AI and ML reduce the cost of chip design and make those chips more powerful and effective, they can feed a virtuous cycle in which AI makes better chips that in turn make better AI. The AI is, from a certain perspective, building itself with help from engineers.
“It is indeed an exciting time to be working on AI and ML for chip design,” Dr. Hu says.