SambaNova, builders of the fastest chip for agentic AI, released research highlighting the mounting concerns over the energy demands of AI data centres and their impact on households and national power grids. As AI deployment accelerates, business leaders and consumers are increasingly aware that legacy, GPU-based infrastructure is not built for the efficiency and scale required in a power‑constrained world.
The survey of 2,525 adults across the US and UK shows that concern about AI’s energy appetite is no longer abstract. Awareness of AI data centres' electricity usage is widespread, and consumers are drawing a direct line between the infrastructure choices providers make and their own monthly bills.
Key findings from the AI Energy Survey
The findings echo SambaNova’s 2024 AI Leadership Survey, which exposed a readiness gap inside enterprises as AI deployments surged. One year ago, 49.8% of business leaders were concerned about AI’s energy and efficiency challenges, yet only 13% monitored the power consumption of their AI systems. Today, concern has moved beyond the data centre floor and into living rooms: while leaders still struggle to measure AI power usage, three in four consumers worry AI infrastructure will raise their household bills and strain national grids.
AI data centres must be engines of efficient growth
“The findings reveal a new reality: AI is no longer just an enterprise technology story – it is an infrastructure story that reaches all the way to consumers’ electricity bills,” said Rodrigo Liang, CEO and co‑founder of SambaNova. “Data centres are the growth engine of AI, but if they are built on inefficient hardware, that growth will come with unacceptable power and cost trade‑offs.”
“People want powerful, always‑on AI – but they also want providers to keep grids stable and energy costs under control,” Liang continued. “This is why we’re focused on building efficient systems that dramatically increase tokens‑per‑second and throughput per rack, without blowing past standard power envelopes.”
Liang added: “With our new SN50-based systems, customers can stand up high‑density AI data centres that run fleets of intelligent agents in real time while staying within 20 kW per rack and using standard air cooling – no exotic power or cooling retrofits required. The SN50 chip delivers up to 5x more compute per accelerator and up to 3x better inference efficiency than leading GPU-based systems, enabling operators to scale AI services faster, serve larger models and longer context, and still reduce total cost of ownership. This is how we turn AI data centres into efficient, high‑growth infrastructure for the next decade, instead of a drag on national power systems.”
Underestimating AI’s power implications
Last year’s AI Leadership Survey showed enterprises were racing ahead with AI adoption while underestimating its power implications: “Last year, we projected that by 2027 more than 90% of leaders would be concerned about AI’s power demands and would monitor consumption as a board‑level KPI,” stated Liang. “This new data suggests the inflexion point may arrive faster than expected, with three‑quarters of consumers worried about AI’s impact on their bills and 83% explicitly calling for energy‑efficient AI.”
SN50 Chip: Built for power‑constrained, AI‑first data centres
SambaNova’s fifth‑generation SN50 RDU chip is purpose‑built for fast inference and agentic workloads in modern AI data centres. Each 20 kW SambaRack SN50 integrates 16 SN50 processors, and up to 16 racks can be interconnected to support 256 accelerators over a multi‑terabyte‑per‑second fabric, enabling customers to deploy very large models with longer context while maintaining high throughput and low latency.
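As a back‑of‑the‑envelope check on the configuration described above, the per‑rack figures scale straightforwardly. The sketch below uses only the numbers quoted in the article (20 kW and 16 SN50 processors per SambaRack, up to 16 interconnected racks); the function name and structure are illustrative, not part of any SambaNova tooling.

```python
# Capacity arithmetic for a multi-rack SN50 cluster, using the
# article's stated figures. Illustrative only.
RACK_POWER_KW = 20        # stated per-rack power envelope
PROCESSORS_PER_RACK = 16  # SN50 processors per SambaRack SN50
MAX_RACKS = 16            # maximum interconnected racks

def cluster_figures(racks: int) -> dict:
    """Return accelerator count and total power draw for a rack count."""
    if not 1 <= racks <= MAX_RACKS:
        raise ValueError(f"rack count must be between 1 and {MAX_RACKS}")
    return {
        "accelerators": racks * PROCESSORS_PER_RACK,
        "total_power_kw": racks * RACK_POWER_KW,
    }

# A fully interconnected deployment reaches the article's 256-accelerator
# figure within a 320 kW total rack power envelope.
print(cluster_figures(16))  # {'accelerators': 256, 'total_power_kw': 320}
```

At full scale this confirms the article's arithmetic: 16 racks × 16 processors = 256 accelerators, inside 16 × 20 kW = 320 kW of rack power.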
For data centre operators facing tightening power budgets and surging AI demand, SN50 turns existing facilities into high‑density AI zones, allowing them to expand capacity quickly inside current power and cooling envelopes – exactly the kind of infrastructure shift consumers are now demanding.