Cerebras Systems Releases the First-Ever Trillion Transistor Chip

By Enterprise Technology Review | Monday, March 16, 2020

The Cerebras WSE, a semiconductor chip, transforms AI ecosystems with its unique compute density.

FREMONT, CA: Cerebras Systems has launched the largest semiconductor chip ever built. The Cerebras Wafer-Scale Engine (WSE) is 56 times larger than the next-largest chip and delivers more compute, memory, and communication bandwidth.

It packs 1.2 trillion transistors and is optimized exclusively for AI and deep learning workloads.

The WSE is built around an 8.5-inch-square piece of silicon etched with a pattern of interconnect lines. The chip's massive parallel-processing capacity allows AI research to proceed faster than ever before.

By comparison, the largest graphics processing unit (GPU), from Nvidia, has 21.1 billion transistors. With 1.2 trillion transistors and 400,000 AI-optimized cores, the WSE ranks as the biggest AI chip ever made.
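A quick back-of-the-envelope check, using only the figures quoted in this article, shows the transistor counts are consistent with the "56 times larger" claim:

```python
# Transistor counts as quoted in the article.
wse_transistors = 1.2e12   # Cerebras WSE: 1.2 trillion
gpu_transistors = 21.1e9   # largest GPU (Nvidia): 21.1 billion

# Ratio of WSE transistors to the largest GPU's transistors.
ratio = wse_transistors / gpu_transistors
print(f"The WSE holds roughly {ratio:.0f}x the transistors of the largest GPU")
```

The ratio works out to about 57, in line with the article's statement that the WSE is 56 times larger than the next-largest chip.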

"While AI is used in a general sense, no two data sets or AI tasks are the same. New AI workloads continue to emerge, and the data sets continue to grow larger," said Jim McGregor, principal analyst and founder at TIRIAS Research.

The engineers behind the chip's design say it can be used in large data centers and will help advance AI across every field of development, from self-driving cars to voice-controlled digital assistants like Amazon's Alexa.

Cerebras Systems' CEO Andrew Feldman said, "Designed from the ground up for AI work, the Cerebras WSE contains fundamental innovations that advance the state-of-the-art by solving decades-old technical challenges that limited chip sizes—such as cross-reticle connectivity, yield, power delivery, and packaging."

The advantage of the WSE is that, in AI work, larger chips process information more quickly and produce answers in less time. As a result, neural networks that once took months to train can be trained in minutes on the Cerebras WSE.

“Every architectural decision was made to optimize performance for AI work. The result is that the Cerebras WSE delivers, depending on workload, hundreds or thousands of times the performance of existing solutions at a tiny fraction of the power draw and space,” Feldman concluded.

Cerebras Systems, founded in 2016, is a startup that emerged from stealth mode with a team of experts comprising computer architects, computer scientists, deep learning researchers, and engineers of all types.
