For over a decade, Nvidia’s quarterly earnings have served as a barometer for the semiconductor industry; in recent years, they have become a real-time gauge of AI market sentiment. Its latest results not only smashed expectations but also solidified a broader narrative that strategic investors like Kam Thindal, Managing Partner at Core Capital Partners, have been preparing for: AI infrastructure is becoming the backbone of modern economies.
The numbers speak for themselves. For the quarter ending April 27, Nvidia posted $44.1 billion in revenue, a 69% year-over-year increase, with $39.1 billion coming from its data center division alone. Despite a $4.5 billion inventory charge related to China export restrictions, Nvidia still delivered non-GAAP EPS growth of 33%. That performance, and its guidance of $45 billion for the upcoming quarter, cements its position not only as a dominant AI player but as a bellwether for where capital and compute are converging.
“AI is no longer experimental — it’s operational,” said Kam Thindal. “We’re in a phase where the real opportunity is in the infrastructure layer: compute, deployment environments, and sovereign capacity.”
Infrastructure as the Investment Thesis
CEO Jensen Huang’s framing of AI compute as “electricity for cognition” captures a key shift in how markets, and investors, are evaluating the space. The demand curve is expanding across three fronts: agentic AI that reasons in chains, physical AI in robotics and simulations, and edge inference running in PCs, vehicles, and factories. Each of these domains multiplies the cycles of inference and, by extension, pushes GPU demand far beyond the cloud.
At Core Capital Partners, this evolving landscape has only deepened the firm’s conviction in backing intelligent infrastructure. Aman Thindal, the firm’s CFO, points to Nvidia’s ability to withstand geopolitical headwinds as a marker of long-term resilience. “When a company absorbs a multi-billion-dollar hit from regulatory policy and still raises guidance to record levels, it tells you how strong the underlying market really is,” he noted.
The China Gap and the Rise of Sovereign Compute
Nvidia’s strategic pivot away from China is now being matched by rapid adoption across sovereign initiatives. In Saudi Arabia, the state-backed HUMAIN program has already contracted for 18,000 Grace Blackwell chips, with options for hundreds of thousands more. In Abu Dhabi, the Stargate initiative, led by G42, OpenAI, and Oracle, is on track to build a 1GW AI cluster, with its first 200MW phase expected online in 2026.
These developments reinforce a trend that Core Capital has been following closely: national AI infrastructure as the next capital-intensive frontier. “The movement toward sovereign compute is not just a geopolitical hedge — it’s a generational buildout,” said Kam Thindal. “And like any infrastructure wave, the early backers will capture outsized value.”
Inference at Scale: From Concept to Norm
Huang’s emphasis on a tenfold increase in AI inference token generation is more than a technical metric — it’s a signal. Inference, unlike one-off model training, reflects real-time AI use cases that are always on. To support this, Nvidia is introducing Dynamo, a high-throughput open-source inference framework that succeeds the Triton Inference Server. Nvidia’s early benchmarks show up to a 30x throughput gain on Blackwell systems.
For investors, inference-heavy workloads change the economics of compute. “When infrastructure is in use 24/7, utilization compounds, and so does return on investment,” said Aman Thindal. “This is the kind of structural shift we look for across all verticals — technology becoming foundational.”
Blackwell Ultra and the Performance Runway
Hardware execution remains a core Nvidia strength. The GB300 NVL72 rack, the flagship configuration for Blackwell Ultra, is expected to ship to OEM partners in the second half of 2025. The platform promises real-time inference speeds 30x faster than Hopper chips for trillion-parameter models, with materially lower cost per token.
The cadence has impressed the investment world. Early access is already underway through Oracle Cloud, and hyperscalers are racing to reserve capacity, suggesting that revenue from Blackwell Ultra will begin contributing as early as fiscal Q3.
A Shifting Customer Mix with Strategic Implications
Concentration risk, historically tied to U.S. tech giants, is steadily declining. Nvidia’s most recent 10-K shows three customers contributing 10% or more of total revenue, widely believed to be Microsoft, Amazon, and Google. Yet the emergence of sovereign customers, Tier-2 cloud providers like CoreWeave and Lambda, and large enterprise on-premise deployments is creating a more balanced revenue base.
“The distribution of demand is evolving, and with it comes opportunity,” said Kam Thindal. “The AI economy is no longer centralized. It’s global, multi-tiered, and increasingly investable.”
Market Context and Strategic Outlook
While the S&P 500 posted roughly 13% earnings growth in the March quarter, Nvidia’s net income jumped 31% despite its China-related charge, and its 69% revenue growth ran at more than five times the index’s earnings growth rate. That spread explains why investors remain willing to assign Nvidia a ~30x forward P/E, an uncommon premium for a company already generating over $44 billion in revenue per quarter.
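The arithmetic behind that comparison is easy to verify. The sketch below is a back-of-the-envelope check using only the figures cited in this article ($44.1 billion in quarterly revenue, 69% revenue growth, ~13% index earnings growth); it is illustrative, not a valuation model.

```python
# Back-of-the-envelope check of the growth figures cited above.
# All inputs are the article's own numbers; nothing here is new data.

nvda_revenue_q = 44.1          # Nvidia quarterly revenue, $B
nvda_revenue_growth = 0.69     # 69% year-over-year revenue growth
sp500_earnings_growth = 0.13   # ~13% S&P 500 earnings growth, March quarter

# Implied year-ago quarter: growing that base by 69% yields today's $44.1B
prior_year_quarter = nvda_revenue_q / (1 + nvda_revenue_growth)

# Growth spread vs. the index average
growth_multiple = nvda_revenue_growth / sp500_earnings_growth

print(f"Implied year-ago quarter: ~${prior_year_quarter:.1f}B")
print(f"Growth multiple vs. index: {growth_multiple:.1f}x")
```

Running it shows an implied year-ago quarter of roughly $26 billion and a growth multiple of about 5.3x, consistent with the “more than five times faster” claim.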
Kam Thindal sees this as validation of a broader thesis. “This is what long-duration capital looks for: dominant players in markets with asymmetric upside and limited downside. Nvidia’s role in the AI economy is structural, not cyclical.”
A Global AI Arms Race and the Role of Capital
As Nvidia continues to push hardware innovation and software integration, while navigating complex regulatory environments, its story becomes about more than chips. It becomes a roadmap for where capital, policy, and technology will collide in the coming decade.
“AI infrastructure is no longer a subset of tech investing,” said Aman Thindal. “It’s becoming a category of its own.”
And for Core Capital Partners, that’s exactly where the focus is: not on chasing trends, but on identifying the infrastructure that will define them. With Nvidia leading the charge, and global demand expanding beyond borders, Kam Thindal and his team are ensuring their portfolio is positioned not just for what’s now, but for what’s next.
Jordan French is the Founder and Executive Editor of Grit Daily Group, encompassing Financial Tech Times, Smartech Daily, Transit Tomorrow, BlockTelegraph, Meditech Today, High Net Worth magazine, Luxury Miami magazine, CEO Official magazine, Luxury LA magazine, and flagship outlet Grit Daily. A champion of live journalism, Grit Daily’s team hails from ABC, CBS, CNN, Entrepreneur, Fast Company, Forbes, Fox, PopSugar, SF Chronicle, VentureBeat, Verge, Vice, and Vox. An award-winning journalist, he was on the editorial staff at TheStreet.com and a Fast 50 and Inc. 500-ranked entrepreneur with one sale. Formerly an engineer and intellectual-property attorney, his third company, BeeHex, rose to fame for its “3D printed pizza for astronauts” and is now a military contractor. A prolific investor, he has invested in 50+ early-stage startups with 10+ exits through 2023.