
Nvidia has announced the finalization of a long-term agreement to supply Meta with large volumes of its current and next-generation artificial intelligence accelerators, as well as central processing units that compete with offerings from Intel and AMD.
Nvidia did not disclose the financial terms of the arrangement, but confirmed that it covers both the existing Blackwell processors and the upcoming Rubin AI chips, along with standalone Grace CPU systems and the Vera line of processors.
The announcement comes as Meta is developing its own AI silicon and is in talks with Google about potentially using that company's Tensor Processing Units (TPUs) for its AI workloads.
Ian Buck, General Manager of Nvidia's Hyperscale and High-Performance Computing division, noted that Grace CPUs have already shown they can run standard workloads, such as database operations, at half the power, and that the next generation, Vera, is expected to deliver even greater performance gains.
Nvidia does not break out revenue from Meta, but Reuters reports that Meta is one of four major customers that together account for 61% of Nvidia's total revenue.