Stealth Startup Niv-AI Unleashes GPU Power for AI

Published 16 hours ago · 3 minute read
Uche Emeka

The burgeoning field of artificial intelligence relies heavily on electricity as a fundamental raw material. However, current processing techniques in AI are straining the capabilities of data center operators to efficiently manage their interactions with the power grid, often forcing them to reduce GPU utilization by as much as 30%. This significant inefficiency was underscored by Nvidia CEO Jensen Huang, who, during the company's annual GTC customer conference, stated, "There is so much power squandered in these AI factories" and highlighted that "Every unused watt is revenue lost," indicating a substantial economic impact.

The core of this challenge lies in the dynamic power demands of modern AI workloads. As frontier labs orchestrate thousands of Graphics Processing Units (GPUs) to train and execute advanced models, there are frequent, millisecond-scale power demand surges. These surges occur as processors rapidly transition between intensive computation tasks and communication with other GPUs. Such unpredictable and sharp fluctuations make it exceedingly difficult for data centers to maintain a stable and manageable power draw from the electrical grid.
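A toy simulation can make the scale of these swings concrete. The sketch below is illustrative only, not Niv-AI's model or measured data: it assumes per-GPU draw of roughly 700 W in compute and 250 W during communication (plausible round numbers, not vendor figures) and lets a fleet of GPUs flip between the two phases nearly in lockstep, as synchronized training steps tend to do. The peak-to-average ratio it prints is the swing a data center's grid connection would have to absorb.

```python
import random

# Toy model (illustrative assumptions, not Niv-AI's workload data):
# each GPU alternates between a compute phase (high draw) and a
# communication phase (low draw) every 10 ms.
COMPUTE_W, COMM_W = 700.0, 250.0   # assumed per-GPU draw in watts
N_GPUS, N_MS = 1024, 1000          # fleet size, simulated milliseconds

random.seed(0)
# Synchronized training steps mean most GPUs switch phases together;
# model that with a shared 10 ms period plus small per-GPU jitter.
offsets = [random.randint(0, 2) for _ in range(N_GPUS)]

def fleet_power(t_ms):
    """Total draw (watts) at millisecond t for the whole fleet."""
    total = 0.0
    for off in offsets:
        in_compute = ((t_ms + off) // 10) % 2 == 0  # 10 ms phases
        total += COMPUTE_W if in_compute else COMM_W
    return total

samples = [fleet_power(t) for t in range(N_MS)]
peak, avg = max(samples), sum(samples) / len(samples)
print(f"peak draw : {peak / 1e3:8.1f} kW")
print(f"avg draw  : {avg / 1e3:8.1f} kW")
print(f"peak/avg  : {peak / avg:.2f}x")  # swing the grid must absorb
```

Because the phases are synchronized, the fleet's peak draw lands well above its average, which is exactly the gap a data center must cover by buying buffer storage or throttling GPUs.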

To mitigate the risk of insufficient electricity supply, data centers currently resort to costly solutions. They either pay for temporary energy storage to buffer these power surges or, more commonly, throttle their GPU usage. Both approaches directly diminish the return on investment in expensive AI chips, presenting a critical economic and operational hurdle. Lior Handelsman, a partner at Grove Ventures and a board member for Niv-AI, emphasized the urgency of the situation, stating, "We just can’t continue building data centers the way we build them now."

Addressing this pressing issue, the startup Niv-AI has emerged from stealth with a $12 million seed funding round. Based in Tel Aviv and founded last year by CEO Tomer Timor and CTO Edward Kizis, the company is backed by prominent investors including Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. Niv-AI's mission is to resolve this power management problem by precisely measuring GPU power consumption through innovative new sensors and subsequently developing advanced tools for more efficient energy management.

Niv-AI's initial roadmap focuses on a deep understanding of power dynamics. The company is actively deploying rack-level sensors designed to detect power usage at a millisecond resolution directly on GPUs, both on their own hardware and in collaboration with design partners. The immediate goal is to meticulously map the specific power profiles associated with various deep learning tasks. This data will then be used to develop sophisticated mitigation techniques, ultimately enabling data centers to unlock and utilize more of their existing computing capacity without risking grid instability.
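One basic operation on millisecond-resolution telemetry of this kind is flagging surges against a rolling baseline. The sketch below is a minimal illustration of that idea, not Niv-AI's actual pipeline; the 50 ms window and 1.25x threshold are assumed values for the example, and real profiling would tune them per workload.

```python
from collections import deque

def detect_surges(samples_w, window_ms=50, threshold=1.25):
    """Flag millisecond samples whose draw exceeds `threshold` times
    the rolling mean of the previous `window_ms` samples.

    samples_w: per-millisecond power readings (watts) for one rack.
    Returns (time_ms, reading_w, baseline_w) tuples for each surge.
    """
    recent = deque(maxlen=window_ms)
    surges = []
    for t, w in enumerate(samples_w):
        if len(recent) == window_ms:
            baseline = sum(recent) / window_ms
            if w > threshold * baseline:
                surges.append((t, w, baseline))
        recent.append(w)
    return surges

# Usage: a flat 20 kW rack trace with a brief 30 kW spike at 80-82 ms.
trace = [20_000.0] * 200
for t in (80, 81, 82):
    trace[t] = 30_000.0
for t, w, base in detect_surges(trace):
    print(f"t={t} ms: {w / 1e3:.0f} kW vs ~{base / 1e3:.1f} kW baseline")
```

Mapping which deep learning tasks produce which surge signatures is then a matter of correlating flagged intervals like these with the workload running at the time.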

Looking ahead, Niv-AI plans to leverage the extensive data collected to build a sophisticated AI model. This model will be trained to predict and synchronize power loads across an entire data center, effectively serving as an "AI copilot" for data center engineers. The founders envision their ultimate product as a crucial "intelligence layer" bridging the gap between data centers and the electrical grid. As CEO Tomer Timor explained, "The grid is actually afraid of the data center consuming too much power at a specific time."

This innovative approach offers dual benefits. On one side, it empowers data centers to maximize their GPU utilization and better leverage the power they already pay for. On the other, it fosters the creation of "much more responsible power profiles" between data centers and the grid, alleviating concerns about excessive power draw. This solution is particularly attractive at a time when hyperscalers constructing new data centers face significant challenges related to land use and supply chain disruptions. Niv-AI anticipates having an operational system deployed in a handful of U.S. data centers within the next six to eight months.
