AI Going Into 2026, Year Of The Agents (Part 2)
The Trade Broadens
In Part 1, I laid out the fracturing of the AI landscape. Three labs, three philosophies, three very different bets on what intelligence is and how to build it sustainably.
Here, I want to talk about the “AI trade” and how the landscape is evolving.
From late 2022 through 2024, the AI trade was insultingly simple. If you believed “this stuff is real,” the move was basically: own Nvidia and don’t get shaken out. From the October 2022 lows, NVDA is up on the order of ~10x, depending on where you start the clock. GPUs became the de facto tax on intelligence, hyperscaler capex went vertical, and every investor deck discovered “AI inference” by slide three.
That phase is over.
Not because AI is done, but because the narrative and the economics have fractured. The value chain has stopped behaving like a neat stack with Nvidia cemented at the bottom and “AI apps” at the top. Instead, it looks increasingly like a web of differing infrastructures: multiple types of compute, different power contracts, different network topologies, different memory bottlenecks, and very different software economics layered on top.
The simple trade (long NVDA, ignore the rest) has morphed into a set of smaller, more nuanced ones. Nvidia can still win, but its total addressable slice of the pie is starting to look like 60–70% of the high‑end market over time, not the 100% NVDA bulls were hoping for. Hyperscalers are building their own silicon (Google TPUs, Amazon’s Trainium/Inferentia, Microsoft’s Maia/Cobalt), and that alone forces you to think more carefully about where value actually accrues.
While I’ve written in the past about how the NVDA fear from TPUs is overblown, the fracturing of the infrastructure stack can’t be ignored.
Watch the Bottlenecks
No matter what chips the hyperscalers choose, or which chips fill the data centers, they will all need billions or trillions of dollars’ worth of power, connectivity infrastructure, high‑bandwidth memory, and more.
We’re no longer asking, “How big can Nvidia’s TAM get?” We’re asking, “How much of this multi‑trillion dollar AI capex cycle flows through a single vendor?” I still own Nvidia, but the edge now lives in the second‑order trade: bottlenecks and enablers that sit around and beneath the GPUs.
2025 was all about finding the bottlenecks: what parts of the AI stack were going to be in the tightest supply and demand balance. 2026 will be the same way.
Power: The Most Underappreciated Bottleneck
For many of these massive data center projects, the binding constraint hasn’t been GPUs for a while now. The binding constraint is power.
The next generation of AI data centers is being planned at gigawatt scale. Stargate alone is targeting more than 5 gigawatts of power capacity by 2028. That’s the rough equivalent of several large nuclear plants pointed directly at tokens‑per‑second.
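To make that equivalence concrete, here is a quick back‑of‑envelope sketch. The 5 gigawatt figure comes from the text; the roughly 1 gigawatt of output per large nuclear reactor is my own assumed round number, not something from the original.

```python
# Back-of-envelope: how many large nuclear plants does 5 GW represent?
stargate_target_gw = 5.0   # Stargate's stated 2028 target (from the text)
reactor_output_gw = 1.0    # rough output of one large reactor (assumption)

plants_equivalent = stargate_target_gw / reactor_output_gw
print(f"~{plants_equivalent:.0f} large reactors' worth of power")
```

The exact reactor figure doesn’t matter much; the point is that a single project is asking for power on the scale of a national utility’s flagship plants.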
The hyperscalers have internalized this and are now in an arms race for electrons. Microsoft is locking up nuclear capacity. Amazon is buying into nuclear‑adjacent data center assets. Google is signing early‑stage SMR deals. They all have two problems at the same time: first, the sheer amount of total power they need, and second, the requirement that a growing share of that power be “clean,” because they’ve made public net‑zero commitments that investors actually track.
To me, this makes power a direct AI input, not just “macro background.” I’m watching names like Vistra (VST) as a way to be long tight US power markets and data‑center‑adjacent expansions. Constellation (CEG) is interesting as one of the cleanest plays on nuclear and zero‑carbon baseload. Small modular reactor plays like Oklo and NuScale are way earlier and riskier, but structurally pointed at the right buyer base: hyperscalers who want dedicated, predictable, low‑carbon power for 20+ years at a time.
If GPUs are the “brains” of AI, the grid is the circulatory system. The market is only starting to price that in.
“Power” was an emerging theme in 2025, and various names got some traction; nuclear stocks in particular had a big run‑up. But mostly these names have not outperformed as much as other parts of the AI stack.
I’m watching for this to change in 2026, and for the market to grapple with the idea that this power demand is here to stay and will continue to grow.
Memory: The Other Semiconductor Bet
Memory is another trade that has run a lot in 2025, but that I expect to continue to outperform in 2026.
Modern AI accelerators are HBM‑bound. You can have the best GPU architecture on earth, but if you can’t feed it data fast enough, you’re leaving performance on the table. HBM is how you feed it: stacked, ultra‑wide, high‑bandwidth memory sitting right next to the GPU die. Every serious AI chip at the top end today ships with large amounts of HBM; there isn’t a real substitute.
This is a real oligopoly. SK Hynix has the lead in HBM bit share and technology, with Samsung close behind and pushing hard to close the gap. Micron, which used to be more of an afterthought in this segment, has come from behind and is now gaining share with competitive HBM3E offerings.
Critics of the memory thesis would likely say that these are cyclical stocks and memory is a commodity. But HBM economics look different from commodity DRAM. Manufacturing is more complex because of stacking and through‑silicon vias. Capacity additions are lumpy and slow. A meaningful chunk of volumes are governed by multi‑year purchase agreements tied directly to GPU roadmaps, not spot pricing. The result is a structurally tighter supply‑demand balance that feels more like a specialty component than a fungible memory chip.
Memory has also traditionally been a “boom and bust” industry, which has left these manufacturers wary of building up too much capacity. This is likely to extend tight supply conditions out far longer than critics appreciate.
If you do rough math, a high‑end AI GPU will easily have low‑thousands of dollars of HBM content on it. Multiply that by hundreds of thousands or millions of accelerators over a multi‑year cycle and you end up with a very large pool of revenue sitting with a small number of vendors.
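A minimal sketch of that rough math, with every input an illustrative assumption rather than a reported figure (per‑GPU HBM content, unit volumes, and cycle length are all just order‑of‑magnitude stand‑ins for “low thousands of dollars” and “hundreds of thousands or millions of accelerators”):

```python
# Back-of-envelope HBM revenue pool. All inputs are assumptions.
hbm_content_per_gpu = 2_000        # dollars of HBM per accelerator (assumed)
accelerators_per_year = 2_000_000  # units shipped per year (assumed)
cycle_years = 3                    # length of the capex cycle (assumed)

hbm_revenue_pool = hbm_content_per_gpu * accelerators_per_year * cycle_years
print(f"${hbm_revenue_pool / 1e9:.0f}B of HBM revenue over the cycle")
```

Even with conservative stand‑in numbers, the pool lands in the tens of billions of dollars, split among only three vendors.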
My positioning reflects that. Micron (MU) is a core position as the cleanest US‑listed way to be long HBM share gains plus an improving DRAM/NAND cycle. On top of that, I own SK Hynix via its German listing (HY9H). SK Hynix is difficult for American retail traders to buy at many brokerages, but that itself represents an additional opportunity: access may improve in the future via a US‑listed ADR or broader US retail access to Korean stocks.
The only reason I don’t own Samsung is that it’s a company with many segments beyond memory, and I prefer to get direct access to the theme. But Samsung stock has done quite well this year, and might be worth a look.
Optical Networking: The Bandwidth Bottleneck
After power, the next bottleneck is bandwidth inside and between data centers.
Training large models is basically about moving enormous tensors around as quickly and reliably as possible. That’s a networking problem. Copper wiring can’t keep up with what these systems want to do in terms of speed and density, which is why you’re seeing the industry march from 100G to 400G to 800G and now planning for 1.6T per port in the 2025–2026 window.
There are really two worlds here. Inside the data center, you have short‑reach optics connecting GPUs, racks, and switches. Between data centers, you have coherent optics doing long‑haul and data‑center‑interconnect links for model checkpoints, replication, and serving. Both sides get a structural uplift as AI workloads ramp.
My exposure here is through a small cluster of names. Lumentum (LITE) gives me leverage to the laser and component side: the engines that make high‑speed links possible. Coherent (COHR) is a more vertically integrated bet that spans materials, components, and modules, which can pay off if they keep execution tight. Ciena (CIEN) is the systems‑level play, especially on the coherent DCI and long‑haul builds.
The bull case is straightforward: TAM is expanding right now as AI architectures demand denser fabrics and more bandwidth per GPU, but these companies have not been rerated the way Nvidia has. They only started getting noticed in the second half of 2025. Whether the chip in the slot is an H100, a TPU, or a Maia, the optical plumbing has to exist and has to keep evolving.
The bear case is mostly about cyclicality and concentration. Optical, like memory, has a long history of painful inventory cycles. A handful of hyperscalers drive the bulk of demand, and when even one slams the brakes, everyone in the supply chain feels it.
There’s also a structural question about how much margin eventually migrates into silicon photonics and onto the switch ASICs themselves, away from discrete optics vendors.
But for the foreseeable future these optics names are likely here to stay, and whereas NVDA has 10x’d since its 2022 lows, these names are up much less.
I don’t think of these names as “Nvidia 2.0.” I think of them as under‑owned infrastructure enablers that capture the second wave of AI spending as networks densify.
Broadcom And The Broader Semi Ecosystem
Broadcom (AVGO) is one of the more holistic names in the whole web.
On one side, it sells networking and switching silicon that underpins AI data centers broadly. If traffic grows and fabrics become more complex, Broadcom benefits. On the other side, it designs custom ASIC accelerators for hyperscalers: chips that explicitly exist to reduce those buyers’ dependence on Nvidia. In other words, if the world sticks with Nvidia, Broadcom wins via networking. If the world pivots harder into custom silicon, Broadcom wins via ASIC design.
I had avoided AVGO for most of the AI trade (to my detriment), but I think it’s in an interesting place here.
The stock has been quite beaten down since its last earnings call, but I think it might well be positioned better than ever. It is set up nicely to benefit from a “broadening out” of the AI trade into non‑NVDA names, and the recent sentiment shift might represent a brief “lull” in the stock that quickly reverts higher.
Around that broadening out thesis, I like some smaller “picks and shovels” plays.
Teradyne (TER) sits on the test side: every advanced chip eventually has to go through increasingly complex test regimes, and AI‑driven complexity doesn’t change that. It’s a name that has only recently started being included in the “AI trade” and hasn’t moved much.
Photronics (PLAB) sells photomasks and benefits from the sheer number of tape‑outs at advanced nodes. They are positioning themselves close to TSMC, and stand to benefit greatly if the AI buildout sustains. It’s only after their last earnings call that the market really started to see the potential for PLAB, and I am looking for that newfound optimism to continue.
Phase 3: Enterprise Software
If Phase 1 was NVDA, and Phase 2 has been data center buildouts, Phase 3 is software.
Software companies that benefit from AI fall into one of two categories:
1. Beneficiaries that can offer a better service or a lower price thanks to AI
2. Tools that find themselves with a new group of customers because of AI
AI is clearly a threat to certain categories: thin‑moat customer service tools, simple workflow wrappers around the same APIs everyone has access to, low‑end codegen products with no real integration into how work is actually done. Those things should get commoditized quickly.
But there’s also a paradox of productivity. If 2026 is the year of the agent, as many are saying, those agents will drive ever‑increasing productivity output, and will need to be managed by humans. Faster deployment and cheaper experimentation mean more services, more endpoints, more complexity. That increases the coordination problem and the surface area that has to be managed.
That’s where I think AI is net‑bullish for specific types of software.
Atlassian (TEAM) might be one of them. If you sit in a modern engineering organization, Jira and Confluence are essentially the operating system for work. AI will auto‑generate tickets, write documentation drafts, and create tests, but those artifacts still need to be tracked, triaged, and linked to real projects. More activity might, paradoxically, create more Jira, not less, especially in the short term.
Salesforce (CRM) is a higher‑beta, higher‑execution‑risk bet on a different layer: agent orchestration. If you imagine a world where AI agents are reading and writing customer records, triggering workflows, and coordinating sales or service activity, they need a single source of truth for customer data. CRM is a natural candidate for that role, and Salesforce is very obviously trying to turn itself into the control plane for those agents. Whether they execute well enough is a different question, but the positioning is there.
MongoDB (MDB) is another interesting option that falls more in that second category. AI‑native apps and agentic workflows generate new kinds of data: embeddings, vectors, logs, traces, per‑user state, and all sorts of semi‑structured garbage that needs to live somewhere flexible. Mongo already sits in the “developer default” slot for a lot of modern applications, and I think that only gets stronger as teams build custom AI tools rather than relying exclusively on off‑the‑shelf SaaS. MongoDB has a chance to be the “data layer” for all the agentic systems coming online in 2026 and beyond.
AI For The Real Economy: The Private Equity Angle
One of the more contrarian parts of my book is private equity.
Private equity is, at its core, an operational improvement business. You buy stable cash flows, use leverage, and then grind out better margins through operational changes over time. AI is an operational toolkit that can be deployed across dozens or hundreds of portfolio companies: back‑office automation, faster and more systematic due diligence, sales and pricing optimization, supply chain analytics, and so on.
The macro backdrop is also quietly improving with rates coming down and the economy reaccelerating.
The industry is still sitting on roughly $2 trillion of uncalled capital. Deal and financing markets, which froze up during the sharpest part of the rate shock, have started to reopen. The big platforms are fundraising successfully again and slowly working through their backlog of exits.
Public software gets de‑rated if investors think AI compresses its margins or commoditizes its product. PE funds don’t necessarily need to chase the high‑multiple AI infrastructure names at all; they can simply apply AI inside their portfolio operations and harvest the incremental EBITDA in private.
I’m expressing that view primarily through TPG, Blackstone (BX), and KKR. TPG skews a bit more growth‑oriented and trades at a discount relative to some peers, but I like it because it lives most directly in the “traditional private equity” business model. Blackstone is the industry heavyweight with enormous diversification. KKR rounds out the basket as another high‑quality platform with a long runway of dry powder. I categorize all three as “real economy beneficiaries of AI” rather than AI trades per se.
The Google Bet
My largest single‑stock position remains Alphabet (GOOG).
The core of the thesis is that Google has quietly built the best combination of cost structure, data, and distribution for large‑scale AI, and while the stock has done well recently, I think the market doesn’t fully appreciate the implications of that.
They don’t pay the Nvidia tax, because their Gemini models run on TPUs they built and deployed themselves. They have the deepest pool of proprietary data across Search, YouTube, Maps, Gmail, Docs, Android, Chrome, and the rest of the stack. They own the distribution layers where AI shows up for consumers and enterprises: the browser, the phone, the productivity suite, and a rapidly growing cloud platform.
And with Gemini 3 they’ve moved to the frontier on capability.
There was a common belief that AI would cannibalize search, but Google says the opposite is happening. According to CEO Sundar Pichai and Chief Business Officer Philipp Schindler, AI Overviews lead to more searches, and they monetize at the same rate as normal searches.
In my opinion, the combination of distribution, model quality, and core business strength makes them well positioned to be the single biggest AI winner.
Just as importantly, Google is being under‑credited as infrastructure in a market obsessed with the OpenAI brand. If the winning configuration in this space is “who can deliver the most useful tokens at the lowest marginal cost into the broadest set of workflows,” Google’s combination of custom silicon, scale, data, and distribution is extremely hard to beat over a multi‑year horizon.
The Agent Economy
With many AI leaders calling 2026 the “year of the agent” it’s important to think deeply about how agents will affect the ways we work and do business.
Most software today is priced per seat: you pay $X per user per month, and revenue roughly scales with headcount. In an agentic world, that breaks. One agent might do the work of ten software engineers, but it’s a tough sell to charge a buyer 10x as much, and nearly impossible to gate the subscription to a single agent.
That leads to a “Service as a Software” model: usage‑based pricing tied to work done rather than people licensed. You can already see this in pockets of customer service and BPO‑adjacent products where per‑conversation or per‑resolution pricing is emerging. The logical endpoint is that AI services start to look a lot like BPO contracts, just delivered through software.
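A toy sketch of the difference (every number here is an illustrative assumption, not a real price): under per‑seat pricing, revenue is capped by headcount; under usage‑based pricing, it scales with the work the agents actually do.

```python
# Per-seat pricing: revenue scales with licensed humans (numbers assumed).
seat_price_per_month = 50
human_users = 100
per_seat_revenue = seat_price_per_month * human_users  # capped by headcount

# "Service as a Software": price the outcome, not the seat (rates assumed).
price_per_resolution = 0.50
resolutions_per_month = 20_000  # agents can scale this without adding seats
usage_revenue = price_per_resolution * resolutions_per_month

print(per_seat_revenue, usage_revenue)
```

The specific numbers don’t matter; the point is that usage‑based revenue decouples from headcount, which is exactly what a vendor selling agent labor needs.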
If that’s right, the first set of targets are obvious: call centers and BPOs (TTEC, Concentrix, Teleperformance, etc.), low‑end outsourcing, and any hourly‑billing professional services segment where a lot of the work is repeatable and structured. The market will demand the benefits of AI productivity, and those business models are structurally exposed to that pressure.
The winners will likely be the platforms with three characteristics: proprietary data that agents need to read and write (systems of record like CRM and ERP), workflow orchestration capabilities that can coordinate humans and agents together (ServiceNow, Salesforce, TEAM, etc.), and integration layers or standards that let agents safely touch live systems without bespoke plumbing each time.
I’m not all‑in on this yet, but I am watching ServiceNow (NOW) and similar names as potential “agent orchestration” winners that sit one layer above the model providers.
I have been shy about my software exposure because:
I don’t know what the space is going to look like yet.
I don’t know when the market will appreciate who the winners are.
Picking “AI winners” in software is quite challenging because we are so early in the process. While I have taken a stab at picking a few names, this is a space to watch closely for emerging winners. It might look a lot different than we expect.
What I’m Watching In AI Overall
In the near term, the main thing I care about is how the hyperscalers talk about:
Broad AI Capex plans
How much goes to Nvidia versus in‑house silicon
Power demand
Perhaps more importantly, I am watching the shape of the demand for these AI tokens. Is it still mostly experimentation, or are we seeing real production workloads with committed spend?
For agents, I can see in my own daily life that we are nearing an inflection point where the models have become good enough to do the work autonomously. In 2026 I am watching closely for adoption. Where is the adoption taking place, and how is it being implemented? What model runs the implementation? Etc.
Many are talking about 2026 as the year of the agent, but I am thinking of it more as the year of ROI. The models are good enough; the question is where the value shows up on the bottom lines of the people providing and using the models themselves, not just the infrastructure layer building out the data centers.
The Risks
While AI is clearly a transformational technology it’s important to remember that the trade can fail while the technology “succeeds.”
One outcome is an AI winter 2.0 where capabilities plateau faster than expected, and we are forced to wait for a technological breakthrough before the growth can continue. In this scenario the market realizes it has funded more infrastructure than can be economically justified. That would strand a lot of capex and compress multiples across the whole ecosystem, with many of our favorite names down significantly.
While I rate this scenario as fairly unlikely, particularly in 2026 given how good Gemini already is and how likely that model and others like it are to improve over the next 12 months, it is an important risk to monitor.
Regulation also sits in the background of all of this. Heavy‑handed AI rules, competition remedies, or data regulations could compress margins and slow deployment in ways that are hard to model upfront, particularly in Europe.
And over all of it hangs the macro. A real recession or credit event will pull correlations toward one and drag almost everything down together, regardless of the underlying AI story.
None of these are reasons to avoid the space entirely, but they are reasons to size positions carefully and avoid pretending this is a free option on “the future.” We can be 100% confident that AI will change the world, and still need to tread carefully.
How I’m Positioned
Putting it all together, the way I think about the AI book today is roughly as follows.
At the core infrastructure layer, I own Alphabet (GOOG), Nvidia (NVDA), Broadcom (AVGO), Micron (MU), and SK Hynix. That’s a core combination of chips, memory, and integrated AI infrastructure I want to be long.
Around the bandwidth bottleneck, I own Lumentum (LITE), Coherent (COHR), and Ciena (CIEN) as targeted exposure to the optical and networking uplift.
On the software side I hold Atlassian (TEAM), MongoDB (MDB), and Salesforce (CRM). I think this “software” side of the portfolio is extremely important to watch in 2026, and I don’t love how I’m positioned. This will be an evolving story.
For real‑economy leverage to AI operational improvements, I own TPG, Blackstone (BX), and KKR.
On the watchlist, I’m spending time on power — Vistra (VST) and Constellation (CEG) in particular — plus ServiceNow (NOW) on the agent orchestration side, and a broader basket of SMR and advanced nuclear names as the power race evolves.
2025 was an absolutely amazing year in AI, an amazing year to own many AI stocks, and 2026 is shaping up to be the same way. But nothing is certain.
Happy New Year, and good luck out there!
Disclaimer: The information provided here is for general informational purposes only. It is not intended as financial advice. I am not a financial advisor, nor am I qualified to provide financial guidance. Please consult with a professional financial advisor before making any investment decisions. This content is shared from my personal perspective and experience only, and should not be considered professional financial investment advice. Make your own informed decisions and do not rely solely on the information presented here. The information is presented for educational reasons only. Investment positions listed in the newsletter may be exited or adjusted without notice.
I hold positions in several securities mentioned in this post, including but not limited to GOOG, NVDA, AVGO, MU, LITE, COHR, CIEN, CRM, TEAM, MDB, TPG, BX, KKR, and others. My positions may change at any time without notice. This is not investment advice - I’m sharing my personal views and positioning. Do your own research.