Larry Ellison, co-founder and CTO of Oracle, has raised concerns regarding what he perceives as a significant flaw in today’s leading large language models, including OpenAI’s ChatGPT, Google’s Gemini, xAI’s Grok, Meta’s Llama, and Anthropic’s Claude. He contends that these models are rapidly becoming commoditized due to their reliance on a shared, publicly available pool of internet data for training.

During Oracle’s fiscal Q2 2026 earnings call held in December 2025, Ellison expressed that the fundamental issue lies in the uniformity of training data among major AI players. He stated, “All the large language models—OpenAI, Anthropic, Meta, Google, xAI—they’re all trained on the same data. It’s all public data from the internet. So they’re all basically the same. And that’s why they’re becoming commoditized so quickly.” This commonality, he argues, leads to a lack of meaningful differentiation among these technologies, creating a landscape where competition is primarily based on cost and features, rather than innovation and value.

However, Ellison sees this situation as an opportunity rather than an endpoint for artificial intelligence. He expects the next phase of AI development to be driven by systems that leverage private, proprietary enterprise data, moving beyond the limitations of publicly sourced datasets. He believes this “second wave” of AI infrastructure will surpass the current growth driven by GPUs and public-data models, unlocking significant economic potential.

Ellison has emphasized Oracle’s unique position in this shift, noting that much of the world’s valuable corporate data already resides in Oracle databases, giving the company a competitive edge in developing secure, enterprise-grade AI applications. To seize this opportunity, Oracle is making substantial investments, projecting around $50 billion in capital expenditures for the full fiscal year, a notable increase from the initially estimated $35 billion. This includes plans for a 50,000-GPU supercluster powered by AMD MI450 chips, expected to launch in Q3 2026, as well as the OCI Zettascale10 supercomputer connecting hundreds of thousands of NVIDIA GPUs.

Oracle’s aggressive investments are reflected in its cloud backlog, exceeding $500 billion as of late 2025, primarily fueled by escalating AI demand from enterprises. Despite this positive outlook, Ellison’s vision is met with fierce competition, as rival companies like Amazon Web Services, Microsoft Azure, and Google Cloud aggressively expand their own enterprise AI solutions. Additionally, advancements in synthetic data generation may further challenge the reliance on proprietary datasets in the future.

As the landscape of AI continues to evolve, Oracle’s focus on harnessing private enterprise data could play a pivotal role in redefining the competitive dynamics of the industry and unlocking new avenues for innovation.
