AI Stocks: Can Investors Still Win Big or Is The Crash Ahead?
Summary
Investment in AI: The podcast discusses the rapid pace of investment in AI, highlighting significant deals such as OpenAI's $300 billion deal with Oracle and Nvidia's $100 billion investment in AI data centers.
AI Industry Structure: AI is broken down into four layers: applications, foundational models, hyperscalers, and hardware/infrastructure, with each layer facing unique financial challenges and opportunities.
Profitability Challenges: Many AI applications and foundational models are not yet profitable due to high compute costs, with venture capital and large tech companies' cash flows funding these losses.
Hyperscalers and Hardware: Hyperscalers like Amazon, Microsoft, and Alphabet, along with specialized companies like CoreWeave, are profiting from AI by providing computing power, while hardware providers like Nvidia and Cisco benefit from infrastructure demand.
Funding and Risk: The AI ecosystem relies heavily on external funding, creating a circular spending pattern that could lead to greater risk if market confidence falters.
Investment Opportunities: Investors are advised to focus on bottlenecks and constraint points in the AI supply chain, such as electrical infrastructure and edge computing, with companies like Micron and Qualcomm highlighted as potential winners.
Long-term AI Outlook: Despite current profitability issues, the podcast suggests AI spending has a long runway, driven by tech giants' commitment to AI as an existential business necessity.
Stansberry Research's AI Portfolio: The podcast promotes Stansberry Research's new AI-driven investment system, claiming it outperforms traditional benchmarks and offers a structured approach to investing in AI.
Transcript
Good luck tracking the AI industry these days. It is a full-time job. Every day there's a new model, a new multi-billion-dollar deal, and another stock soaring. But of course, you can't ignore AI as an investor; you must understand it. And with a new industry like this, it's helpful to sit down and chart out exactly how it works: who does what, who's paying whom, and which businesses will actually turn into money makers. So today, I'm going to break it all down for you right here in this video. Along the way, I'll talk about a whole bunch of AI stocks, and I'll bring in an AI expert to help us make sense of it all. Let me start with the billion-dollar deals being thrown around every day. OpenAI made a $300 billion deal to buy processing power from Oracle, and Oracle shares rose 36% in a day. Nvidia pledged to invest $100 billion into OpenAI as part of a plan to build 10 gigawatts of AI data centers. CoreWeave announced it added an additional $6.5 billion to its standing deal with OpenAI, and then another deal a week later for $14 billion with Meta. Shares of CoreWeave were up 32% in a month. And all of that I just listed happened in September alone. Now, under normal circumstances, deals with this many zeros would take weeks or months to hammer out, but the pace of activity in Silicon Valley is blistering, with deals worth hundreds of billions of dollars happening at what seems like an overnight pace. And what's also interesting is that many of these deals come as a big surprise and upend strategies these companies have pursued for years. For instance, Meta has developed its own LLM, and it has spent billions in salaries for its own superintelligence research team, plus tens of billions on its own infrastructure. But now we hear Mark Zuckerberg is in talks with Alphabet to use its Gemini model to improve Meta's ad business. Or take Microsoft, a major investor in and supplier for OpenAI.
It has made a huge bet on OpenAI and integrated its AI into Microsoft's Copilot product. But Microsoft just cut a deal with Anthropic to use its Claude AI in Office apps. So money and computing and intelligence are flying around every which way, and it practically takes a superintelligence to track it all at this point. That's what we're going to break down today. The stocks ripping higher on AI excitement do all sorts of different things. Some are extraordinarily profitable, and others are acting only to incinerate cash, at least for now. The way I look at this, I break AI down into four layers: applications, foundational models, hyperscalers, and then hardware and infrastructure. Now, most people are the end users of AI. They're working at that application layer, so let's start there. Today, people use AI through things like Perplexity, NotebookLM, and Cursor for coding. And of course the chatbots too, but I'll get to those in a second. These apps are built on top of foundational models made by the big AI companies. The AI apps collect money from subscription revenue and pay the model creators for access to intelligence. And these app companies aren't even close to profitable yet. Since AI eats up so much costly compute, each action from a user actually loses these companies money. They have a long way to go until profitability, and you can extend this assumption to the popular chatbots like OpenAI's ChatGPT, Google's Gemini, and Anthropic's Claude. So yes, these chatbots work at the application layer. They are the product built on top of the foundational model. In their case, it just so happens that the AI product is built by the same company that built the foundational model. And you'll see as we go along that many of these AI companies straddle several of these layers. So, if we could separate ChatGPT from OpenAI, it's likely that ChatGPT would pay OpenAI more than it takes in as revenue. Same goes for Claude and Gemini.
The application layer just is not profitable yet. But if you step down to the next layer, the foundational models, they're not really making money either. Look at what I consider the five big models: OpenAI's ChatGPT, Anthropic's Claude, xAI's Grok, Alphabet's Gemini, and Meta's Llama. These companies carry massive compute costs. They pay not just to serve answers to model users, which is known as inference compute, but they also spend hundreds of millions of dollars on training runs to improve the models. Now, these companies don't really provide financials, but we can tell that building models is a money pit. Reuters reports OpenAI earned $4.3 billion of revenue in the first half of 2025 but posted a loss of about $2.5 billion. Anthropic is probably in the same boat. Elon Musk's xAI is burning about $1 billion per month. And if you could break them out, the numbers for Alphabet and Meta would probably look about the same. So these foundational models simply cost way too much to build and run. But those costs are the revenue that gets sent down to the next layer, the hyperscalers. All the money is being spent on computing power. The hyperscalers are the companies building and running massive data centers, which they rent out to train and run models. Prior to the AI boom, the three biggest hyperscalers were Amazon, Microsoft, and Alphabet, in that order. Now, as time has gone on, the market has grown. There are specialty cloud companies like CoreWeave that cater to AI companies. Oracle is now expanding its compute capacity to provide services for OpenAI. And then of course the model builders are also becoming their own hyperscalers: Meta, OpenAI, and xAI are all building data centers of their own. And this compute part is where the profits start to show. The hyperscalers earn outstanding margins by renting out traditional compute. They have a business with profitable unit economics, as we call it.
That means they can sell compute for more than it costs them. But they are investing all those profits right back into the business for the future. You can see this when you look at the financials of a pure play like CoreWeave. The company has positive cash flow from operations but negative free cash flow after you consider its capital spending. So it's profitable now but plowing those profits into the future. And speaking of profits from AI, I want to point out that Stansberry Research is working at the cutting edge of AI, using it to create profits for investors just like you. In our case, we just launched a fully quantitative system to select the best businesses in the market and combine them into an AI-driven portfolio that any investor can follow. It's built on the proven fundamentals of finance paired with AI. The backtested results are incredible, beating the S&P 500, gold, a 60/40 portfolio, even Warren Buffett's Berkshire Hathaway, and you can learn all about it at stansberrysystem.com right now. There's a link in the description, but let me get back to the AI industry for now. The biggest boom really has happened where those hyperscalers are spending, and that's on the next layer, the hardware providers. That's Nvidia for GPUs, Cisco for networking, and it spreads to all kinds of building and infrastructure companies. Much of the AI spend is ending up in companies that make wiring, cooling, and electrical power for data centers. And the companies at this layer are in full-on boom mode, making high revenue, earning big profits, and their stocks show it. You can see this in HVAC company Comfort Systems (ticker FIX), power generator Constellation Energy, Corning, which does fiber optics, and Amphenol, which does high-speed interconnects. These picks-and-shovels plays on AI are still booming. But if you step back and take the 10,000-foot view, you can see that AI is not paying for itself.
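That CoreWeave-style cash flow pattern, positive operating cash flow but negative free cash flow, comes down to one piece of arithmetic. Here is a minimal sketch of it; the dollar figures are hypothetical placeholders for illustration, not any company's actual financials:

```python
# Sketch of the "profitable operations, negative free cash flow" pattern.
# All figures below are hypothetical, chosen only to illustrate the math.

def free_cash_flow(cash_from_operations: float, capex: float) -> float:
    """Free cash flow = cash from operations minus capital expenditures."""
    return cash_from_operations - capex

cfo = 500.0      # cash generated by renting out compute ($M, hypothetical)
capex = 2_000.0  # spending on new data-center buildout ($M, hypothetical)

fcf = free_cash_flow(cfo, capex)
print(f"Cash from operations: ${cfo:,.0f}M")   # positive: unit economics work
print(f"Free cash flow:      ${fcf:,.0f}M")    # negative: profits plowed into growth
```

The unit economics are profitable on their own, but capital spending on future capacity swamps the cash generated, so free cash flow stays negative until the buildout slows down.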
It's clear that the revenue from AI end users is nowhere near the spending on data centers. So the money isn't flowing through the layers from the top down. It still needs funding to come into the ecosystem, and each layer is getting that funding from different sources. As an investor, you need to understand where the money is coming from. If you look at the top, the losses for AI applications are being funded mostly by venture capital. If you move down to the foundational models, it starts to split. For private firms like OpenAI and xAI, that money is coming from venture capital. But for Alphabet, Meta, and the other big Magnificent Seven companies, the massive free cash flows from their advertising businesses are funding their AI projects. If you go and look at the hyperscalers, it's also mixed. The huge tech companies can fuel much of their own growth with cash flows, but they are also borrowing. So this is where debt comes into the picture. Despite having $47 billion in cash on hand, this summer Meta sought to raise $26 billion in private debt. CoreWeave carries $14 billion in private debt against its $3 billion in revenue. And the burgeoning private credit industry is financing hundreds of billions of AI spend. UBS Global Research reports that private lending to the tech sector reached $450 billion earlier this year, up $100 billion from the prior year. So there's still no value being created by AI. It's only supported by money from outside investors and lenders. But that is not necessarily a bad thing. This is how it goes with young industries and new technologies. The business model's not there yet, but that doesn't mean it won't develop. That is what is great about a modern capitalist economy: we can fund progress without an immediate reward and end up living in a more productive and richer world down the line. So the question you have to ask is: how long can the money keep flowing into AI?
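A quick sanity check on those funding figures shows just how fast the debt side is growing. This is straightforward arithmetic on the numbers cited above, nothing more:

```python
# Implied growth and leverage from the funding figures quoted in the video.

# UBS figure: private lending to the tech sector hit $450B,
# up $100B from the prior year, implying a $350B base.
current_lending = 450.0                   # $B
prior_lending = current_lending - 100.0   # $B, implied prior-year base
yoy_growth = 100.0 / prior_lending
print(f"Prior-year base: ${prior_lending:.0f}B, growth: {yoy_growth:.0%}")

# CoreWeave figure: $14B of private debt against $3B of revenue.
debt_to_revenue = 14.0 / 3.0
print(f"Debt-to-revenue: {debt_to_revenue:.1f}x")
```

Roughly 29% year-over-year growth in private lending, and a debt load near 4.7 times revenue, which is the kind of ratio that would normally make credit markets nervous.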
And I can tell you that AI spending has a very long runway. I have to admit, early on I thought we only had a few quarters, maybe, before we needed to see significant AI revenue or investors would start to sour on the idea. But now I believe that markets can keep supporting AI spend for much longer. And here is exactly why: most of the spending decisions and the capital come from the gigantic tech companies, and they are all in. Microsoft, Alphabet, Meta: they see AI as an existential threat to their existing businesses, and they will not cede any ground at all to their competitors. Microsoft CEO Satya Nadella said he's haunted by the thought of Microsoft not surviving into the AI era. Mark Zuckerberg of Meta said he'd rather end up misspending a couple hundred billion than miss out on AI. And Alphabet founder Larry Page has reportedly told employees he would rather go bankrupt than lose the AI race. And for now, the share prices of AI companies show that there are no reservations about this spending just yet. However, I need to point out that the AI ecosystem has gotten more complicated and circular. AI companies with capital are pouring it right back into other layers of the AI stack. As noted earlier, Nvidia is investing $100 billion into OpenAI, and OpenAI will use that investment to buy chips from Nvidia. And Nvidia has done the same thing with other companies. It owns more than 6% of CoreWeave, which uses Nvidia chips to build its data centers. So these are roundabout deals in which suppliers are investing back into their customers, who then use the money to buy more from the supplier. This circular nature of AI spending does create greater risk, right? This feedback loop of AI spending suggests that any crash would be swift and severe. If you think about an industry with a traditional value chain, changes in the business sort of scale linearly to asset values, right?
If we take the business process for Walmart and they improve a little bit, the stock goes up a little bit. But when you have a circular system, that makes it nonlinear, chaotic even, right? Small changes can lead to bigger reactions. So the cycle is driving all of AI higher today. If faith in the AI future falters, the whole thing can fall apart. But thanks to the almost religious belief in AI from Silicon Valley's tech CEOs, the spending can run for a long, long time. So that is what you need to watch. And if you want to invest in AI, like everybody does these days, you really need to understand how the industry works. That's why I brought in Josh Balin. Josh is here at Stansberry Research, and he's exactly the kind of person you want to hear from on these topics. He worked for one of the biggest and best hedge funds in the world, ran his own quant hedge fund, and then spent more than a decade working in tech with AI, robotics, wearables, and so much more. So Josh, you know this stuff better than me. What am I missing here in the AI story? >> Look, I think you've captured the circular nature of the ecosystem at this point. But there really are what I would refer to as three constraint layers that amplify both the risk and the opportunity in the space. Okay, so first, and this underscores how the hyperscalers have been spending: initially they were spending on chips, right? They had to lock down Nvidia chips. At this point, the bottleneck is really shifting from that chip capacity to power. Data centers need massive amounts of electrical capacity, and the traditional grid can't really keep up. You're even hearing anecdotally that people's power bills are going up because of this incremental data center demand.
We're even seeing deals out there in the market where AI companies are buying power plants or investing directly in power generation companies, because they know that this is now the new constraint. And what this really does is create an ancillary investment thesis around electrical infrastructure: transformers, grid modernization, more efficient power transformation within the data centers, cabling, chips, et cetera. So that's one constraint. The second constraint, and you didn't really touch on this, is that data quality is actually becoming a moat for some of those foundational model builders. OpenAI has already cut a big deal with Reddit for millions of dollars, right, for training. And it's not really the compute there; it's the content. It's the underlying content that makes these models smart. Stack Overflow, same thing. New York Times. And so all of a sudden, these proprietary data sets, these good data sets that allow for high-level training of these foundational models, are really going to be a competitive differentiator. And this is really the third one, and we've written about this and talked about this, but everything we've primarily talked about so far has been about the data center itself, and this next point is really crucial for investors.
Edge infrastructure really becomes the next battleground for these hyperscalers, and frankly a huge opportunity for investors, because AI can't just run in data centers. It needs to run locally in cars, factories, cell phones, AirPods, all of these devices. And this proliferation of new AI-enabled devices actually creates a new opportunity that investors can take advantage of. It's compute infrastructure for basically all of these little devices that are starting to proliferate. And so in a sense, back to your original question, the spend makes sense, and the circular nature of it is kind of okay. I know we talked about this briefly before, but in a way, this direct spend at such high levels is actually a way for some of these foundational model builders that rely on venture capital or debt financing to eliminate those expenses and the friction of adding a third-party financier; they're just going direct to the supplier to cut the deal. So in a way, they're leveraging the strong financial position of some of the big suppliers, like an Nvidia, to cut direct deals and eliminate some of the unnecessary financial transactions. >> Yeah, it's interesting from a financial perspective. You think of the circular nature as adding risk, but they're actually kind of de-risking by locking in their suppliers and overcoming these bottlenecks, rather than having their business fall apart when they don't have what they need. So it could work out in the industry's favor. Like I said, it sounds like a risk, but if you dig in, it's still all pretty bullish. And so, Josh, you're plugged into the tech industry. You know lots of people in both finance and Silicon Valley.
When you talk to people, do you get any sense that the people making these spending decisions are starting to have doubts or getting cost-conscious, or is the mood in the room still full speed ahead? >> So, look, we started having these discussions about an AI boom or an AI bubble back in April or May, Matt. And I told you back then that the canary in the coal mine would be when these hyperscalers start to pull back on spending, announce that they've built too much, or pull capex back. Six months later, we haven't seen it. And so to some extent, it still is full speed ahead. But as I mentioned before, it's really "buy everything we can" to secure those critical constraint points. So when you talk about some of these mega deals, to my earlier point, this isn't panic spending. These are strategic moves to lock up scarce resources: power capacity, chip allocation, data center real estate, cooling, cabling. There just isn't enough to go around right now. And I think in some ways, when you're seeing some of the cross-pollination you were talking about earlier, where Meta is using Google's models, Apple is using Google's models, and Anthropic is actually collaborating with OpenAI and vice versa, this isn't indecision. This is actually, to some extent, the spreading of risk: diversification. I think folks are starting to understand that it's going to be difficult for one winner to emerge from some of these foundational models, and so they're really hedging their bets. And so, you know, you mentioned CoreWeave.
I think the biggest indication that spending hasn't slowed, and that some of the bigger hyperscalers and spenders haven't decided to slow spending, is that CoreWeave has a very significant debt burden that would traditionally worry financial markets, and that just does not seem to be the case yet. And so the spending continues without question. I'm going to tell you the exact same thing I told you in May, June, July, even August: when you hear one of these larger companies, a Meta or an Amazon, beginning to pull back on spend, that is when we need to be very cautious about what they're pulling back on and what the impact will be. But as of right now, back to those earlier points that Zuckerberg made, that Ellison made, that certain folks at Google made: this isn't a bet that AI is going to be big. They are betting that if they don't invest now, that is the existential risk. And so that's really a higher bar for stopping spend. So yes, it is full steam ahead, but again, very strategically at the moment. >> Okay, so if an investor is looking at the stock market now, you've talked about the bottlenecks, and investing bottlenecks are where the opportunities are, which is always fun to point out. Where along this ecosystem do you think investors should be making bets today? Are we still at picks and shovels? Are there applications that are actually going to start turning profitable? What specific stocks should people be looking at? >> Well, I think you mentioned some of the picks-and-shovels opportunities within those constraint points. I've written pretty extensively about Micron. That's obviously within data centers, and even at the edge, high-bandwidth memory is a scarce resource. Every AI player is going to need it, both in data centers and at the edge.
And so you've seen Micron and some other memory providers act pretty well. Same thing with power, which we talked about: power and cooling within data centers. Vertiv has been one where, yes, the stock has ripped, but there aren't that many competitors to something like it. So when you find these constraint-driven opportunities, the Vertivs, the Microns, those are good places. I think the other category I would point to is, let's call it, the "profitability now" bucket, where the Oracles of the world do have stable cash flows from other businesses; they're paying a dividend; yes, the stock has worked. But again, until there's an indication that the trends are changing, or there's some type of negative catalyst, I think some of these "profitability now" opportunities will remain. And I think the third is, as we get more and more edge expansion, some other more traditional localized plays, be they semiconductor plays or even data center plays, will continue to work. I've highlighted Qualcomm in the past as a potential winner here as the proliferation of devices grows, and Broadcom obviously continues to work. In short, I really do want some portion of an investor's portfolio allocated to these opportunities. The returns are there. They're not without risk, but again, spending continues, and I really do think the returns can continue.
So, back to the first point that you made: this circular funding mechanism is not a bug, it's a feature. It's actually how the ecosystem is showing some maturity. It doesn't mean it doesn't come with some risk and some peril, but again, I think this really does create more certainty that revenues amongst the big players continue, support amongst the ecosystem continues, and subsequently certain stocks will continue to work from here. >> All right. Well, that's great, Josh. Thank you so much for breaking this down. It's an incredible new industry being built, and there's so much to learn. Glad we can learn it from you. There's a lot of profit opportunity here. And folks, before you go, don't forget that Josh is just one of the people we have working here at Stansberry Research. In fact, to build our new quantitative AI-based investing system, we enlisted experts in quantitative finance, computer science, even a PhD in astrophysics. If you want to see how to build a simple but profitable AI portfolio, with AI helping you make the stock selection, go and visit stansberrysystem.com to get the whole story. You won't be disappointed. As always, like and subscribe here on YouTube, and I will be back next week.
AI Stocks: Can Investors Still Win Big or Is The Crash Ahead?
Summary
Transcript
Good luck tracking the AI industries these days. It is a full-time job. Every day there's a new model, a new multi-billion dollar deal, and another stock soaring. But of course, you can't ignore AI as an investor, you must understand it. And with a new industry like this, it's helpful to sit down and chart out exactly how it works, who does what, who's paying who, and what businesses will actually turn into money makers. So today, I'm going to break it all down for you right here in this video. Along the way, I'll talk about a whole bunch of AI stocks, and I'll bring in an AI expert to help us make sense of it all. Let me start with the billiondoll deals being thrown around every day. So, OpenAI made a $300 billion deal to buy processing power from Oracle, and Oracle shares rose 36% in a day. Nvidia pledged to invest $100 billion into Open AI as part of a plan to build 10 gigawatts of AI data centers. Coreweave announced it added an additional $6.5 billion to its standing deal with Open AI. And another one a week later for $14 billion with Meta. Shares of Core Weave were up 32% in a month. And all of that that I just listed happened in September alone. Now, under normal circumstances, deals with these many zeros would take weeks or months to hammer out, but the pace of activity in Silicon Valley is blistering with deals worth hundreds of billions of dollars happening at what seems like an overnight pace. And what's also interesting is many of these deals come as a big surprise and they upend strategies these companies have pursued for years. So for instance, Meta has developed its own LLM and now it spent billions in salaries for its own super intelligence research team. Plus, it spent tens of billions on its own infrastructure. But now we hear Mark Zuckerberg is in talks with Alphabet to use its Gemini model to improve Meta's ad business. Or or take Microsoft, a major investor and supplier for Open AI. 
It's made a huge bet on Open AI, integrated its AI into Microsoft's co-pilot product. But Microsoft just cut a deal with Anthropic to use its claw AI and office apps. So money and computing and intelligence are flying around every which way and it practically takes a super intelligence to track it all at this point. That's where we're going to break it down today. So uh the stocks ripping higher on AI excitement do all sorts of different things. Some are extraordinarily profitable and others are acting only to incinerate cash at least for now. If you want to look at this, I break AI down into four layers. Applications, foundational models, hyperscalers, and then hardware and infrastructure. Now, most people are the endusers of AI. They're working at that application layer. So, let's start there. Uh, today people use AI through things like Perplexity's notebook and cursor for coding. And of course the chatbots too, but I'll get to those in a second. Uh uh these apps are built on top of foundational models made by the big AI companies. The AI apps collect money from subscription revenue and pay the model creators for access to intelligence. Uh and these app companies aren't even close to profitable yet. Uh since AI eats up so much costly compute, each action from a user actually loses these companies more money. They have a long way to go. until profitability and you can extend this assumption to the popular chat bots like open II chat GPT Google Gemini and Anthropics Claude. So yes, these chatbots work on the application layer. They are the product built on top of the foundational model. In the case of these, it just so happens that the AI product is built by the same company that has built the foundational model. And you'll see as we go along, many of these companies and AI straddle several of these layers. So, if we could separate Chat GPT from OpenAI, it's likely that Chat GPT would pay OpenAI more than it takes in in revenue. Same goes for Claude and Gemini. 
The application layer, it just is not profitable yet. But if you step down to the next layer, the foundational models, they're not really making money either. You know, look at what I consider the five big models. Open AAI's Chat GPT, Anthropics Claude, AI's Grock, Alphabets, Gemini, and Meta Lama. uh these companies carry massive compute costs. They pay not just to serve answers to model users which is known as inference compute but they spend hundreds of million dollars on training runs to improve the models. So these companies don't really provide financials. Uh but we can tell that building models is a money pit. Reuters reports OpenAI earned 4.3 billion of revenue in the first half of 2025 but posted a loss of about 2.5 billion. Anthropic's probably in the same boat. Elon Musk's XAI is burning about$1 billion dollars per month. And if you could break them out, the numbers for Alphabet and Meadow would probably look about the same. So these foundational models simply cost way too much to build and run. But those costs are the revenue that gets sent down to the next layer, the hyperscalers. And all the money is being spent on computing power. The hyperscalers are the companies building and running massive data centers which they rent out to train and run models. So prior to the AI boom, the three biggest hyperscalers were Amazon, Microsoft and Alphabet in that order. Uh and now as time has gone on, the market has grown. There are now specialty cloud companies like Coreweave that cater to AI companies. Oracle now is expanding their compute capacity to provide services for open AI. And then of course the model builders are also becoming their own hyperscalers. Meta, OpenAI, XAI, all building data centers of their own. And this compute part is where the profits start to show. The hyperscalers earn outstanding margins by renting out traditional compute. They have a business with profitable unit economics, we call it. 
That means they can sell compute for more than it costs them. But they are investing all those profits right back into the business for the future. You can see this when you look at the financials of a pure play like Cororeweave. The company has positive cash flow from operations but negative free cash flow after you consider their capital spending. So profitable now but plowing those profits into the future. And speaking of profits from AI, I want to point out that Stansbury Research is working at the cutting edge of AI uh using it to create profits for investors just like you. In our case, we just launched a fully quantitative system to select the best businesses in the market and combine them into an AIdriven portfolio that any investor can follow. It's built on the proven fundamentals of finance paired with AI. The backtested results are incredible, beating the S&P 500, gold, a 6040 portfolio, even Warren Buffett's, Birkshshire Hathway, and you could learn all about it at stanberry system.com right now. There's a link in the description, but let me get back to the AI industry for now. Uh, the biggest boom really has happened where those hyperscalers are spending and that's on the next layer, the hardware providers. That's Nvidia for GPUs, Cisco for networking, and it spreads to all kinds of building and infrastructure companies. Much of the AI spend is ending up in companies that make wiring, cooling, and electrical power for data centers. And these companies at this layer are in fullon boom mode, making high revenue, earning big profits, and their stocks show it. You can see this in HVAC company Comfort Fix, Power Generation, Constellation Energy, uh, and Corning, which does fiber optics, Amphanol, which does high-speed interconnects. These picks and shovels plays on AI are still booming. But if you step back, take the 10,000 foot view, you can see that AI is not paying for itself. 
It's clear that the revenue from AI end users is nowhere near the spending on data centers. So the money isn't flowing through the layers from the top down. Funding still needs to come into the ecosystem, and each layer is getting it from different sources. As an investor, you need to understand where the money is coming from. At the top, the losses for AI applications are being funded mostly by venture capital. Move down to the foundational models and it starts to split. For private firms like OpenAI and xAI, that money is coming from venture capital. But for Alphabet, Meta, and the other big Magnificent Seven companies, the massive free cash flows from their advertising businesses are funding their AI projects. Look at the hyperscalers and it's also mixed. The huge tech companies can fuel much of their own growth with cash flows, but they are also borrowing. This is where debt comes into the picture. Despite having $47 billion in cash on hand, this summer Meta sought to raise $26 billion in private debt. CoreWeave carries $14 billion in private debt against its $3 billion in revenue, and the burgeoning private credit industry is financing hundreds of billions of AI spend. UBS Global Research reports that private lending to the tech sector reached $450 billion earlier this year, up $100 billion from the prior year. So there's still no net value being created by AI. It's only supported by money from outside investors and lenders. But that is not necessarily a bad thing. This is how it goes with young industries and new technologies. The business model isn't there yet, but that doesn't mean it won't develop. That is what is great about a modern capitalist economy: we can fund progress without an immediate reward and end up living in a more productive and richer world down the line. So the question you have to ask is, how long can the money keep flowing into AI?
And I can tell you that AI spending has a very long runway. I have to admit, early on I thought we only had a few quarters before we needed to see significant AI revenue, before investors started to sour on the idea. But now I believe that markets can keep supporting AI spend for much longer. Here is exactly why: most of the spending decisions and the capital come from the gigantic tech companies, and they are all-in. Microsoft, Alphabet, and Meta see AI as an existential threat to their existing businesses, and they will not cede any ground at all to their competitors. Microsoft CEO Satya Nadella said he's haunted by the thought of Microsoft not surviving into the AI era. Mark Zuckerberg of Meta said he'd rather end up misspending a couple hundred billion than miss out on AI. And Alphabet founder Larry Page has reportedly told employees he would rather go bankrupt than lose the AI race. For now, the share prices of AI companies show no reservations yet about this spending stopping. However, I need to point out that the AI ecosystem has gotten more complicated and circular. AI companies with capital are pouring it right back into other layers of the AI stack. As noted earlier, Nvidia is investing $100 billion into OpenAI, and OpenAI will use that investment to buy chips from Nvidia. Nvidia has done the same thing with other companies: it owns more than 6% of CoreWeave, which uses Nvidia chips to build its data centers. These are roundabout deals in which suppliers are investing back into their customers, who use the money to buy more from the supplier. This circular nature of AI spending does create greater risk. The feedback loop suggests that any crash will be swift and severe. If you think about an industry with a traditional value chain, changes in the business scale roughly linearly to asset values, right?
If Walmart improves its business a little bit, the stock goes up a little bit. But when you have a circular system, it becomes nonlinear, chaotic even. Small changes can lead to much bigger reactions. So the cycle is driving all of AI higher today, and if faith in the AI future falters, the whole thing can fall apart. But thanks to the almost religious belief in AI from Silicon Valley's tech CEOs, the spending can run for a long, long time. That is what you need to watch. And if you want to invest in AI, like everybody does these days, you really need to understand how the industry works. That's why I brought in Josh Balin. Josh is here at Stansberry Research, and he's exactly the kind of person you want to hear from on these topics. He worked for one of the biggest and best hedge funds in the world, ran his own quant hedge fund, and then spent more than a decade working in tech with AI, robotics, wearables, and so much more. So Josh, you know this stuff better than me. What am I missing here in the AI story? >> Look, I think you've captured the circular nature of the ecosystem at this point. But there really are what I would refer to as three constraint layers that amplify both the risk and the opportunity in the space. So first, and this underscores how the hyperscalers have been spending: initially they were spending on chips, right? They had to lock down Nvidia chips. At this point, the bottleneck is really shifting from chip capacity to power. Data centers need massive amounts of electrical capacity, and the traditional grid can't really keep up. You're even hearing anecdotally that people's power bills are going up because of this incremental data center demand.
We're even seeing deals out there in the market where AI companies are buying power plants or investing directly in power generation companies, because they know this is now the new constraint. What that does is create an ancillary investment thesis around electrical infrastructure: transformers, grid modernization, more efficient power conversion within the data centers, cabling, chips, et cetera. So that's one constraint. The second constraint, and you didn't really touch on this, is that data quality is actually becoming a moat for some of those foundational model builders. OpenAI has already cut a big deal with Reddit, worth millions of dollars, for training. And it's not really the compute there; it's the content. It's the underlying content that makes these models smart. Stack Overflow, same thing. The New York Times. All of a sudden, these proprietary, high-quality data sets that allow for high-level training of foundational models are going to be a real competitive differentiator. And this is really the third, and we've written about this and talked about this: everything we've primarily talked about has been about the data center itself, but this next part is really crucial for investors.
Edge infrastructure really becomes the next battleground for these hyperscalers, and frankly a huge opportunity for investors, because AI can't just run in data centers. It needs to run locally in cars, factories, cell phones, AirPods, all of these devices. And this proliferation of new AI-enabled devices creates a new opportunity that investors can take advantage of: compute infrastructure for basically all of these little devices that are starting to proliferate. And so, back to your original question, the spend makes sense, and the circular nature of it is kind of okay. I know we talked about this briefly before, but in a way, this direct spend at such high levels is actually a way for some of these foundational model builders that do rely on venture capital or debt financing to eliminate those expenses and the friction of adding a third-party financier; they just go direct to the supplier to cut the deal. So in a way, they're leveraging the strong financial position of some of the big suppliers, like an Nvidia, to cut direct deals and eliminate some of the unnecessary financial transactions. >> Yeah, it's interesting from a financial perspective. You'd think of the circular nature as adding risk, but they're actually kind of de-risking by locking in their suppliers and getting ahead of these bottlenecks, rather than having their business fall apart when they don't have what they need. So it could work out in the industry's favor. Like I said, it sounds like a risk, but if you dig in, it's still all pretty bullish. And so, Josh, you're plugged into the tech industry. You know lots of people in both finance and Silicon Valley.
When you talk to people, do you get any sense that the people making these spending decisions are starting to have doubts or getting cost conscious, or is the mood in the room still full speed ahead? >> So look, we started having these discussions about an AI boom or an AI bubble back in April or May, Matt. And I told you back then that the canary in the coal mine would be when these hyperscalers start to pull back on spending, announce that they've built too much, or pull capex back. Six months later, we haven't seen it. So to some extent, it still is full speed ahead. But as I mentioned before, it's really "buy everything we can to secure those critical constraint points." When you talk about some of these mega deals, to my earlier point, this isn't panic spending. These are strategic moves to lock up scarce resources: power capacity, chip allocation, data center real estate, cooling, cabling. There just isn't enough to go around right now. And in some ways, when you see the cross-pollination you were talking about earlier, where Meta is using Google's models, Apple is using Google's models, and Anthropic is actually collaborating with OpenAI and vice versa, that isn't indecision. That is, to some extent, the spreading of risk, diversification. I think folks are starting to understand that it's going to be difficult for one winner to emerge from these foundational models, and so they're hedging their bets. And you mentioned CoreWeave.
I think the biggest indication that spending hasn't slowed, and that the bigger hyperscalers and spenders haven't decided to slow spending, is that CoreWeave carries a very significant debt burden that would traditionally worry financial markets, and that just does not seem to be the case yet. And so they continue to spend without question. I'm going to tell you the exact same thing I told you in May, June, July, even August: when you hear one of these larger companies, a Meta, an Amazon, beginning to pull back on spend, that is when we need to be very cautious about what they're pulling back on and how it impacts the ecosystem. But as of right now, back to the earlier points that Zuckerberg made, that Ellison made, that certain folks at Google made: this isn't a bet that AI is going to be big. They are betting that not investing now is the existential risk. And that's really a higher bar for stopping spend. So yes, it is full steam ahead, but again, very strategically at the moment. >> Okay, so say an investor is looking at the stock market now. You've talked about the bottlenecks, and bottlenecks are where the opportunities are, which is always fun to point out. Where along this ecosystem do you think investors should be making bets today? Are we still at picks and shovels? Are there applications that are actually going to start turning profitable? What specific stocks should people be looking at? >> Well, I think you mentioned some of the picks-and-shovels opportunities within those constraint points. I've written pretty extensively about Micron. Within data centers, and even at the edge, high-bandwidth memory is a scarce resource. Every AI player is going to need it, both in data centers and at the edge.
And so you've seen Micron and some other memory providers act pretty well. Same thing with power and cooling within data centers: Vertiv has been one where, yes, the stock has ripped, but there aren't that many competitors to something like it. So when you find these constraint-driven opportunities, the Vertivs, the Microns, those are good places. The other category I would put in here is the, let's call it, "profitability now" bucket, where the Oracles of the world do have stable cash flows from other businesses, they're paying a dividend, and yes, the stock has worked. Until there's an indication that the trends are changing, or some type of negative catalyst, I think some of these "profitability now" opportunities will remain. And the third is, as we get more and more edge expansion, some of the more traditional localized plays, be they semiconductor plays or even data center plays, will continue to work. I've highlighted Qualcomm in the past as a potential winner here, as the proliferation of devices grows, and Broadcom obviously continues to work. In short, I really do want some portion of an investor's portfolio allocated to these opportunities. The returns are there. They're not without risk, but again, spending continues, and I really do think the returns can continue.
So back to the first point you made about this circular funding mechanism: it's not a bug, it's a feature. It's actually how the ecosystem is showing some maturity. That doesn't mean it comes without risk and peril, but I think it really does create more certainty that revenues among the big players continue, support across the ecosystem continues, and subsequently certain stocks will continue to work from here. >> All right. Well, that's great, Josh. Thank you so much for breaking this down. It's an incredible new industry being built, and there's so much to learn. Glad we can learn it from you. There's a lot of profit opportunity here. And folks, before you go, don't forget that Josh is just one of the people we have working here at Stansberry Research. In fact, to build our new quantitative AI-based investing system, we enlisted experts in quantitative finance, computer science, even a PhD in astrophysics. If you want to see how to build a simple but profitable AI portfolio, with AI helping you make the stock selection, go and visit stansberrysystem.com to get the whole story. You won't be disappointed. As always, like and subscribe here on YouTube, and I will be back next week.