
Google has once again shaken up the fast-moving AI race, drawing the attention of its biggest rivals.
In a post on X on November 25, Nvidia congratulated Google on its achievements, writing, “We are pleased with Google’s success—they have made significant strides in AI, and we remain a supplier to Google.” But the post went on to assert that “NVIDIA provides superior performance, adaptability, and interchangeability compared to ASICs,” the specialized chips of the kind Google is developing.
OpenAI CEO Sam Altman also commented on X, offering congratulations: “Bravo to Google for Gemini 3! It appears to be a formidable model.”
These public acknowledgments came just days into a wave of excitement over Google’s Gemini 3 model and the custom hardware behind it. Salesforce CEO Marc Benioff said on X that he would not return to ChatGPT after testing Google’s new offering, stating, “The leap is astonishing—in reasoning, velocity, visuals, video… everything is crisper and quicker. It genuinely feels like the world has shifted once more.”
Furthermore, The Information reports that Meta is in discussions with Google about acquiring its tensor processing units (TPUs). That follows Anthropic’s announcement in October that it plans to substantially expand its use of Google’s chips.
Google’s stock experienced a rise of almost 8% over the previous week, whereas Nvidia saw a modest decline of slightly over 2%.
What is at stake transcends mere accolades or a few contract signings. Given the tech sector’s conviction that AI will revolutionize global structures—impacting the investment holdings of everyone from ultra-wealthy individuals to retirees relying on their 401(k)s—the ultimate victor and its guiding philosophy could have consequences for nearly everyone in America.
On the surface, Nvidia’s communication suggests a lack of worry regarding Google encroaching on its market share. This perspective is understandable, as Google’s chips operate on fundamentally different principles than Nvidia’s standard product line, meaning they are not direct, head-to-head substitutes.
Yet, the very fact that both OpenAI and Nvidia felt compelled to publicly recognize Google’s progress is telling.
Angelo Zino, Senior Vice President and Technology Lead at CFRA, remarked to CNN, “For the moment, they are leading, let’s say, until someone else unveils the next iteration.”
Google and Meta did not immediately respond to requests for comment. Nvidia declined to comment.
The current leader
Google is far from being an outsider in the AI field. Alongside ChatGPT, Gemini stands as one of the world’s most utilized AI conversational agents. Moreover, Google belongs to the elite group of cloud service providers large enough to be termed “hyperscalers”—a designation for the few massive tech entities that provide large-scale, rented computing infrastructure to external corporations. Google’s established services, such as Search and Translate, have incorporated AI capabilities dating back to the early 2000s.
Despite that history, Google appeared caught off guard by the disruptive arrival of OpenAI’s ChatGPT in 2022. Google management reportedly declared a “code red” in December 2022 after ChatGPT’s seemingly overnight rise, according to The New York Times. OpenAI says ChatGPT now has at least 800 million weekly active users; Google’s Gemini app counts 650 million monthly active users.
Attendees view the new features of the Gemini AI model during a “Made by Google” event in Mountain View, California, on August 13, 2024. Manuel Orbegozo/Reuters
However, Gemini 3, launched on November 18th, now occupies the top spots on various benchmark leaderboards for tasks involving text creation, image modification, image interpretation, and text-to-image synthesis, positioning it ahead of competitors like ChatGPT, xAI’s Grok, and Anthropic’s Claude in those specific domains.
Google said more than one million users tried Gemini 3 within its first 24 hours, through both the company’s AI coding tools and the APIs that let outside digital services connect to the model.
However, Ben Barringer, Global Head of Technology Research at investment firm Quilter Cheviot, notes that users often turn to different AI models for different tasks. For example, benchmark tests show models from xAI and Perplexity outperforming Gemini 3 on search-related metrics.
Zino commented further, “This doesn’t automatically imply that (Google’s parent company) Alphabet will become… the definitive authority in AI,” adding, “They are simply one more component within an AI ecosystem that is continuously expanding.”
Heightened hardware competition
Google began developing its tensor processing units (TPUs) well before the recent surge in AI interest. Still, Nvidia dominates the AI chip market, reporting sales up 62% year over year in the October quarter and profits up 65% from a year earlier.
This strong performance is largely attributed to the power and broad applicability of Nvidia’s hardware. Nvidia, along with its main competitor AMD, specializes in chips known as Graphics Processing Units, or GPUs, capable of executing massive volumes of complex mathematical operations very rapidly.
Google’s TPUs fall into the category of ASICs, or application-specific integrated circuits, meaning they are tailored for particular uses.
Components of an Nvidia Corp. GB300 GPU shown during the Hon Hai Tech Day conference in Taipei, Taiwan, on Friday, November 21, 2025. An Rong Xu/Bloomberg/Getty Images
Jacob Feldgoise, a senior data research analyst at Georgetown’s Center for Security and Emerging Technology, explained via email to CNN that while both GPUs and Google’s chips can be employed for both the training and execution of AI models, ASICs are generally optimized for “more narrowly defined workloads” than GPUs are designed to handle.
Beyond the inherent architectural differences between the chip types, Nvidia provides complete technological ecosystems for data centers, encompassing not only GPUs but also other vital elements like networking components.
Moreover, it provides a software development environment that lets engineers fine-tune their code so applications extract maximum performance from Nvidia’s hardware, a crucial factor in securing long-term client relationships. Even Google is one of Nvidia’s customers.
Ted Mortonson, Technology Sector Strategist at Baird, stated, “When you observe the sheer scope of what Nvidia offers, virtually no one can compete.”
Hardware like Google’s is unlikely to supplant Nvidia hardware in the immediate future. Nevertheless, the wider acceptance of ASICs, coupled with added competition emerging from AMD, might signal a trend where companies seek to lessen their dependence on Nvidia.
Barringer of Quilter Cheviot suggests that Google will not be the sole source of competition in AI hardware, and it is improbable that it will attain the level of market saturation seen with Nvidia.
“I view this as contributing to a balance,” he concluded.
