Google’s Power Play: Owning the Stack, Risking the Crown

orsiri
05-07 08:57

Reinvention in Plain Sight

Alphabet is no longer the market’s favourite ‘obvious’ company, which is precisely why I find it compelling. At just under $4.7 trillion in market capitalisation and trading near its highs, it looks fully appreciated on the surface. A trailing P/E of roughly 29 and a forward multiple above 30 do not scream bargain. Yet those numbers obscure a deeper transition: Alphabet is shifting from a predominantly advertising-driven enterprise into a vertically integrated AI infrastructure company.

This is not a cosmetic pivot. It is a structural rewrite of how revenue is generated, how costs are controlled, and how competitive advantage is sustained. The market, in my view, is still pricing Alphabet as a highly efficient incumbent rather than a company attempting to control the full economics of the AI stack.

Control the stack, and you control the outcome

The TPU Gambit: Cost Control Disguised as Innovation

Alphabet’s decision to externalise its Tensor Processing Units is the most under-analysed development in its current strategy. For years, TPUs were an internal efficiency lever. Now they are becoming a commercial product.

This move is less about competing head-on with GPUs and more about reshaping cost curves. By designing and deploying its own silicon, $Alphabet(GOOGL)$ reduces its dependency on third-party suppliers and gains tighter control over inference economics. That matters because AI profitability is increasingly determined not by capability, but by cost per computation.

Google Cloud’s 21.8% year-on-year growth is often attributed to demand for AI workloads. I would argue the more interesting driver is supply-side efficiency. When infrastructure is purpose-built, margins expand quietly. Alphabet is not simply selling compute; it is monetising its ability to deliver that compute more cheaply than peers.

Here is an insight that tends to be overlooked: if inference costs fall materially—and early signs suggest they are—Alphabet is structurally positioned to capture that decline as margin expansion rather than pass it through as price cuts.
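The arithmetic behind that claim is simple enough to sketch. The figures below are hypothetical illustration only, not Alphabet's actual unit economics; they just show the difference between holding price and passing a cost decline through:

```python
# Stylised sketch: how a fall in inference cost can land as margin
# rather than price cuts. All figures are hypothetical, chosen for
# illustration only.

def gross_profit(price: float, cost: float) -> float:
    """Gross profit per 1k inference calls."""
    return price - cost

price = 10.0        # hypothetical price charged per 1k calls
cost_before = 6.0   # compute cost before efficiency gains
cost_after = 4.0    # same workload on cheaper, purpose-built silicon

# Hold price: the whole $2 cost decline becomes extra gross profit.
captured = gross_profit(price, cost_after)
# Pass it through: customers keep the saving, gross profit is flat.
passed = gross_profit(price - (cost_before - cost_after), cost_after)

print(f"gross profit if price held:     ${captured:.2f} per 1k calls")
print(f"gross profit if passed through: ${passed:.2f} per 1k calls")
```

Whether Alphabet can hold price depends on competitive pressure, which is exactly where the next point bites.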

However, this is also where the bear case becomes genuinely uncomfortable. The AI developer ecosystem is deeply entrenched in NVIDIA’s CUDA stack, which is not just a software layer but an entire tooling, optimisation, and talent ecosystem built over more than a decade. Switching to TPUs is not a simple hardware decision; it requires retraining teams, rewriting workloads, and accepting performance trade-offs in non-optimised use cases.

In effect, CUDA is a form of economic gravity. Even if TPUs offer superior cost efficiency for specific workloads, enterprises must weigh that against migration risk, engineering cost, and the loss of ecosystem flexibility. Multi-cloud strategies further complicate this, as firms prefer portable architectures rather than ones tightly coupled to a single provider’s silicon.

This creates a meaningful adoption ceiling. TPUs can thrive internally and among tightly integrated partners, yet still fall short of becoming a broad industry standard. If that happens, Alphabet captures efficiency but not ecosystem dominance—and the valuation uplift tied to external monetisation begins to look optimistic.

Gemini 2.0: Fewer Clicks, More Value?

The integration of Gemini 2.0 into Search and Workspace is often framed as defensive. I think that misses the more interesting angle: Alphabet is attempting to change what it monetises.

Traditional search relies on volume—more queries, more clicks, more ads. AI-enhanced search shifts the focus towards intent. Fewer links, but higher relevance. In theory, that allows Alphabet to extract more value per interaction.

This is already bleeding into Workspace, where AI features are driving subscription growth tied to measurable productivity gains. That introduces recurring, enterprise-style revenue with higher visibility and operating leverage.

With operating margins above 36% and net income exceeding $160 billion, Alphabet has the financial strength to absorb this transition. The key question is whether the new model enhances or dilutes its core economics.

The risk is sharper than it appears. AI-generated answers compress ad inventory, and while higher intent may offset that, the elasticity is unproven at scale. Even marginal declines in revenue per query would have significant aggregate effects.

A second, underappreciated dynamic is session compression. As answers become more complete, users spend less time searching. That improves user experience while quietly undermining engagement-driven monetisation. If monetisable surface area shrinks, the burden shifts to subscription and productivity revenues to scale faster to compensate—raising the bar for Gemini’s commercial success.
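To see why "marginal declines" matter at this scale, consider a back-of-envelope sensitivity check. Query volume and revenue per query below are hypothetical assumptions, not disclosed figures; the point is only how small per-query erosion compounds:

```python
# Sensitivity sketch: aggregate ad revenue vs. revenue per query.
# Both inputs are hypothetical, chosen only to show how small
# per-query declines compound at scale.

queries_per_year = 5_000_000_000_000   # hypothetical annual query volume
rev_per_query = 0.04                   # hypothetical ad revenue per query (USD)

baseline = queries_per_year * rev_per_query   # $200bn hypothetical baseline

for decline in (0.01, 0.03, 0.05):     # 1%, 3%, 5% fall in revenue per query
    hit = baseline * decline
    print(f"{decline:.0%} decline in revenue/query -> ~${hit / 1e9:.0f}bn less revenue")
```

Even a low single-digit percentage decline wipes out billions, which is why subscription revenue has to scale quickly if the ad surface shrinks.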

Waymo: Scaling Physics, Not Just Software

Waymo is often described as a future profit engine, but its present reality is far more constrained by operational physics than software elegance.

The challenge is not whether autonomous driving works—it increasingly does—but whether it scales economically across fragmented urban environments. Each city introduces regulatory variation, mapping requirements, and safety validation processes that do not scale cleanly. Expansion is not replication; it is requalification.

Unit economics are equally critical. At current utilisation levels, the cost per mile must absorb vehicle depreciation, sensor maintenance, fleet management, and remote operations support. High utilisation can, in theory, drive attractive margins, but achieving that utilisation consistently across cities is non-trivial. Idle vehicles are not just inefficient; they are capital sitting still.
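The relationship between utilisation and cost per mile can be made concrete. Every number below is a hypothetical assumption for illustration, not a Waymo disclosure; the shape of the curve, not the level, is the point:

```python
# Unit-economics sketch: cost per revenue mile vs. fleet utilisation.
# All figures are hypothetical assumptions, not Waymo disclosures.

def cost_per_mile(annual_fixed_cost_per_vehicle: float,
                  variable_cost_per_mile: float,
                  miles_per_vehicle_per_year: float) -> float:
    """Fixed costs (depreciation, sensors, remote-ops overhead) spread
    over the miles a vehicle actually drives, plus per-mile costs."""
    return (annual_fixed_cost_per_vehicle / miles_per_vehicle_per_year
            + variable_cost_per_mile)

fixed = 60_000      # hypothetical: depreciation + sensors + fleet overhead
variable = 0.30     # hypothetical: energy, cleaning, wear per mile

for miles in (20_000, 50_000, 100_000):   # low vs. high utilisation
    print(f"{miles:>7,} miles/yr -> ${cost_per_mile(fixed, variable, miles):.2f}/mile")
```

Because fixed costs dominate, halving utilisation roughly doubles the fixed-cost share of each mile — which is why idle vehicles are so punishing.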

This leads to a more grounded bear case: Waymo’s path to profitability depends on synchronising three variables—regulatory approval, rider demand density, and fleet utilisation. If any one of these lags, returns compress quickly.

There is also the question of capital intensity. Scaling a robo-taxi network requires continuous fleet expansion and infrastructure investment. Unlike software, this is not infinitely scalable at near-zero marginal cost. If revenue growth does not outpace capital deployment, Waymo risks becoming a structurally dilutive business.

The strategic upside remains real, particularly as a non-correlated revenue stream and data engine. But the path there is narrower and more execution-dependent than the narrative often suggests.

Financial Deep Dive: The Cost of Owning Everything

Trend intact—but increasingly priced for perfection

Alphabet’s financial strength is unquestionable, but its direction of travel matters more than its current position. Revenue of $422.5 billion and operating cash flow of $174 billion provide immense capacity—but that capacity is being deployed into an increasingly capital-intensive model.

Levered free cash flow of $27.5 billion highlights the scale of ongoing investment. This is not incidental spending; it is the financial expression of the TPU and Gemini strategy. Data centres, custom silicon, and AI infrastructure are absorbing capital today in exchange for margin control tomorrow.
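Using the two figures quoted above, the gap between operating cash flow and levered free cash flow gives a rough sense of how much cash the strategy absorbs each year. Note the hedge: levered free cash flow also nets out debt service, so this gap is an upper bound on investment spend, not a pure capex figure:

```python
# Back-of-envelope from the figures in the text (USD billions).
# Levered FCF nets out debt service as well as capex, so the gap
# overstates pure investment spend.

operating_cash_flow = 174.0
levered_free_cash_flow = 27.5

cash_absorbed = operating_cash_flow - levered_free_cash_flow
print(f"Cash absorbed before levered FCF: ${cash_absorbed:.1f}bn")
print(f"Share of operating cash flow: {cash_absorbed / operating_cash_flow:.0%}")
```

Roughly five-sixths of operating cash flow is being redeployed before it reaches equity holders, which is what "front-loading investment" means in cash terms.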

The key linkage to the thesis is timing. TPU-driven cost advantages and Gemini-driven revenue shifts must materialise before capital intensity normalises. If they do, Alphabet emerges with structurally higher margins and tighter control over its economics. If they do not, the company risks a prolonged period of compressed returns on invested capital.

This is the trade-off at the heart of the story: Alphabet is front-loading investment to internalise future profit pools. The market is already pricing in a degree of success. The risk is not that the strategy fails outright, but that it takes longer—or delivers less—than expected.

Competitive Fault Lines: Integration vs Dependency

The competitive landscape becomes clearer when viewed through integration strategies.

$Microsoft(MSFT)$ has prioritised speed through its OpenAI partnership, embedding advanced AI rapidly across its ecosystem. That delivers immediate relevance, but introduces economic sharing and strategic dependency.

$Alphabet(GOOGL)$ is pursuing control. By owning its stack, it seeks to internalise margins and dictate its own roadmap. This creates execution risk, but also the potential for margin sovereignty.

$NVIDIA(NVDA)$ remains the gatekeeper of AI infrastructure economics. Alphabet’s TPU push is one of the few credible attempts to dilute that influence, shifting pricing power from a single supplier towards a more competitive dynamic.

$Apple(AAPL)$ introduces the most structurally disruptive alternative. Its on-device AI strategy challenges the assumption that intelligence must reside in the cloud. If this model scales faster than expected, it does not simply slow Alphabet’s cloud growth—it undermines the premise that centralised infrastructure will capture the majority of AI value.

Momentum, Consensus, and Fragility

Momentum is strong; expectations may be stronger

Alphabet’s outperformance—24% year-to-date versus 6% for the S&P 500—reflects strong institutional conviction. With over 80% institutional ownership and relatively low short interest, positioning is crowded in a quiet, confident way.

That becomes relevant when expectations are high and narratives are aligned. In this setup, downside is often triggered not by catastrophic failure, but by incremental disappointment. A single earnings miss tied to weaker cloud margins, slower TPU uptake, or softer AI monetisation can force institutional rebalancing. Given the scale of ownership, that rebalancing is not passive—it can create sharp, liquidity-driven drawdowns as large funds adjust exposure simultaneously.

In other words, the risk is not panic selling. It is coordinated realism.

Dominance scales beautifully—until the strain quietly compounds

Verdict: Control Comes at a Cost

Alphabet is attempting to do something few companies manage: own the entire economic stack of a technological shift while it is still unfolding.

The logic is compelling. Control the infrastructure, compress the cost base, reshape the revenue model, and introduce new growth vectors. If it works, Alphabet does not just participate in the AI economy—it defines it.

But the risks are equally structural. Ecosystem inertia may limit TPU adoption, AI may dilute rather than enhance search economics, capital intensity may persist, and decentralised AI paradigms may erode the value of centralised infrastructure.

My view is cautiously constructive, but with a clear caveat. Alphabet’s strategy is not wrong—but it is expensive, complex, and timing-sensitive.

Owning everything is powerful.

It is also unforgiving if even one piece slips.

@TigerStars @Daily_Discussion @Tiger_comments @Tiger_SG @Tiger_Earnings @TigerClub @TigerWire

