
Sparse AI Hardware Could Cut Energy Use Without Shrinking Models
Researchers argue that new hardware built around sparsity could dramatically lower the energy and time needed to run large AI models by skipping computations involving values near zero.
Key Takeaways
- Researchers argue that sparse-native hardware can exploit the many near-zero values inside large AI models.
- A Stanford team reports a chip that averaged far lower energy use than a CPU while running faster across tested workloads.
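The article does not describe the chip's internal design, but the core idea it reports is easy to illustrate in software: if an operand is effectively zero, the multiply-accumulate it would feed contributes nothing and can be skipped. The toy sketch below is an assumption-laden analogue of that principle, not the Stanford team's method; the `sparse_dot` function name and the `threshold` value are illustrative choices.

```python
# Toy illustration of sparsity exploitation: skip multiply-accumulate
# work whenever an operand is near zero. This mimics in software what
# sparse-native hardware would do at the circuit level; it is NOT the
# reported chip's actual algorithm.

def sparse_dot(weights, activations, threshold=1e-6):
    """Dot product that skips terms whose activation is near zero."""
    total = 0.0
    skipped = 0
    for w, a in zip(weights, activations):
        if abs(a) < threshold:  # near-zero operand: skip this multiply
            skipped += 1
            continue
        total += w * a
    return total, skipped

weights = [0.4, -1.2, 0.7, 0.05, 2.1]
activations = [0.0, 3.0, 0.0, 0.0, -1.5]  # ReLU outputs are often exactly zero

value, skipped = sparse_dot(weights, activations)
print(f"result={value:.2f}, multiplies skipped={skipped} of {len(weights)}")
```

In this example three of the five multiplies are skipped; in hardware, skipping a computation means the corresponding circuitry never switches, which is where the energy savings the researchers describe would come from.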
DT Editorial AI · via spectrum.ieee.org