Would you rather have an assistant with the intelligence of, like, Einstein, but they have no access to the internet and they don't know anything about your history? Or would you rather have an assistant that's just above average intelligence, knows everything about you, and can use the internet? Okay, you would choose that. Hey, great to have you here in person. I remember when you were right up this hill. Yeah. You guys were over here. So I had a couple thoughts this morning. First, I got to take a ride in Full Self-Driving 12. How was that? It was mind-boggling. I think this is going to be a bit of a ChatGPT moment for full self-driving, but it really just reminded me of the magic of this moment. Tesla is rebuilding their models for how they do self-driving around imitation learning, and there's all this interesting stuff going on over there. I think they've probably made more progress in the last 12 months than in the last seven years in terms of what's going on there. And it's going to be rolled out here. It's already rolled out to 5,000 people, and so people are going to start experiencing that. And I think we're having more and more of these moments because of this substrate we're going to talk a lot about today, AI, and the compute, and what it unlocks.
The second, in prepping for this pod, was how bad you make my head hurt. I was thinking about this. What I love about this pod is it's a forcing function. You and I talk all the time. You're always challenging me. We're always comparing notes. But now, with a little bit of structure around it, every couple of weeks we have to think about some topics. Today we're going to talk a lot about AI and compute and chips and their impact on big businesses. And honestly, I liken it to an athlete. They say, in order to be the best I can possibly be, maybe like Kobe, I want to practice against the best. No, but like, listen, the reality is, it's like running 10 miles a day to get ready for a big game. If you're in this business and you're not exhausted by the analysis you're doing, the thinking you're doing, particularly at moments like these, to try to gather this data and try to gather edge, then you're probably not going to end up on top of the heap. Yeah, I agree. And I think that having a topic or an idea that you want to fully flesh out and be able to talk about causes you to place a few phone calls, you know, read a few PDFs. And before you know it, you actually realize you've learned something you didn't know, you know, five days earlier. To pull the screen back a little bit: you and I interacted five, ten times a day over the course of last week on these topics. And then we turn over a rock and we find more data, more information; we share that with one another, and it leads to another conversation. And the combined networks allow us to ask a lot of the smartest people in the world the questions we need to be asking to try to figure out this moment. So it's been a lot of fun, but it does give me a bit of a headache. Hopefully a good one.
So we remain kind of in an earnings season. Yeah. And so what happened in the past few weeks that you think is super important? Yeah. Well, we have a few stocks that have run a lot, Meta, Nvidia, up 30, 40%, even with the pullback that we had today. But the truth is the NASDAQ hasn't really moved that much. I mean, I think we're up three or four percent through today. If you look at the median stock, I think it's up about 1%. In fact, I think we have a chart here just on the dispersion that we see in the NASDAQ. And so remember, last year was like this risk-on moment, a mean-reverting moment for all of technology.
And this year we're really starting to see the winners and the losers. We have some software companies that reported after the bell tonight that are down a lot because they're not seeing the AI pull-forward that maybe an Amazon or a Microsoft is. So that's my first takeaway. My second takeaway is, you know, against these higher prices for some of these companies, the backdrop looks a little bit more challenging. We had a CPI print that came out last week that ran a little hotter than people expected.
The 10-year is back up to 4.3; remember, at the end of the year, I think it had gotten down to 3.5. And then we had what I thought was a really provocative tweet at the end of the week from Larry Summers, where he said the next move by the Fed could be higher. Now, why is this so provocative? Well, the market is betting for sure. The only debate about the soft landing has been: when is the Fed going to cut? Right? And so you have Summers come out and say, hey, I think the next move could be higher. That would be a shock to the market. Was he being provocative, or do you think there's real data that suggests that the soft landing isn't a foregone conclusion?
Well, listen, Larry was spot on in 2022. Okay. I think last year he was a little bit too aggressive as to where he thought rates were going to have to go. At the end of '22, I think he said maybe they could have to go to six or seven percent. But I'm humble in the face of the fact that the future is unknown and unknowable. Like, we don't know. That's the truth of the matter. So as investors, we have to try to handicap these probabilities. And so if I go back and look at this: real rates, right? So real rates, this is the restrictiveness that we have in the economy. This is effectively the interest rate we have less the expected inflation rate in the future. They're as high as they've been since the fall of 2007.
And the last time they were higher than that was in the summer, August of 2000. Okay. Now, what was going on in August of 2000 and the fall of 2007? Well, the economy was on a heater and the Fed was trying to slow it down. Okay. So that's the level at which the Fed currently has its foot on the brakes. And every month that inflation comes down, if it does, okay, then the restrictiveness goes up, right? So if inflation is coming down and the Fed does nothing, its foot goes harder on the brake. That's why Powell has said we have to cut rates just to stay equally restrictive.
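The passive-tightening mechanism described here is simple arithmetic; a minimal sketch with made-up numbers (not actual Fed or inflation data) shows why rates would have to be cut just to stay equally restrictive:

```python
# Real rate ≈ nominal policy rate - expected inflation (Fisher approximation).
# Illustrative numbers only -- not actual Fed or CPI data.

def real_rate(nominal, expected_inflation):
    """Real (inflation-adjusted) rate, the measure of restrictiveness."""
    return nominal - expected_inflation

before = real_rate(nominal=5.5, expected_inflation=3.0)  # 2.5% real
after = real_rate(nominal=5.5, expected_inflation=2.0)   # 3.5% real

# Holding the nominal rate fixed while expected inflation falls makes
# policy a full point MORE restrictive -- the foot presses harder on the brake.
assert after > before
print(before, after)
```

In other words, doing nothing while inflation falls is itself a tightening move, which is the point of the "cut just to stay equally restrictive" line.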
So for Larry to be right, we would really have to see a reversal in inflation, which I don't think many people forecast or see. But I think the important takeaway is this. As investors, I know you're already probably saying, God, how did Brad sidetrack me onto macro? I don't want to talk about this. You know, I often think about that famous saying: if you don't do macro, macro does you. But when I think about it in this moment, it's just to say stocks have run up a bunch at the start of this year. Okay. The backdrop has gotten a little less predictable. There's now this tug of war that's going on. So I think we're going to have to see both of those things play out.
And then, of course, this week, the monster that comes tomorrow, Bill, is Nvidia. In fact, CNBC is screaming every day: whichever way Nvidia goes, so goes the market. Now, I don't think it's quite like that. But one of the things that I was thinking about in regard to this is that we were making a bet that AI was for real, that training workloads were going to be large, and that these inference workloads were going to kick in. As investors, we often take what we call this private-equity approach to the public markets, which is: let's get the big trends, the phase shifts, the super cycles, right?
I think about when you had me over to Benchmark. This is years ago. And you said, Brad, will you come and talk about booking.com and the case you do at Columbia Business School, in the old Graham and Dodd class that I teach with Chris Begg on occasion. And the thing I tried to teach the students in that class is why all the analysts on Wall Street missed booking.com, missed Priceline. Now, remember, Priceline was a billion-dollar company in the public markets. Today, it's $120 billion, a 120-bagger in the public markets. I mean, there aren't many venture capitalists that ever get a 120-bagger, let alone a public market investor. And the takeaway in the class is that all the analysts on Wall Street were so focused on how many hotels they were going to add in the quarter, right? And there'd be a lot of volatility around the number of hotels added in the quarter. Nobody really took the time horizon to say, in five, 10, 15 years, how much more of the offline world is going to book their hotels online and how much bigger that's going to be. So often, because of the short-term trading, they would get the long-term conviction right, but they would end up trading out of the position.
So I look at Nvidia tomorrow, and the honest-to-God truth is we have no edge on a quarter or on the day-to-day trading of these things. I think we do believe, and we're going to talk about this later, that the amount of compute that's going to have to be built in the world is way bigger than consensus estimates currently forecast. But I think tomorrow is going to be really interesting. What could really move? I mean, they're sold out, right? And their production's known. So it's just pricing that could be different. Correct. Well, every hedge fund, every long-only person, they track all this data, right? The CoWoS data, you know, the order books, the H100 data. And I think what people are seeing, and there have been some tweets about this, is that the lead times on Nvidia H100s are going down. So what might you think if the lead times are going down? You would say, oh, the demand must be going down, or the supply must be going up, catching up with demand. And you know, we've all been trained that every supply constraint is ultimately met with a glut, right? So that's the wall of worry around Nvidia: when does the glut come? We've pulled forward all this training demand. It's dark fiber, like in the year 2000.
I think those things are not accurate. But of course, I have no idea what this means as to tomorrow. So I think there are just tons of questions about AI chips and inference, and how much of it's going to be going on. I know we're going to hit on a bunch of that today. So, you know, we stirred the pot last week, or two weeks ago, by questioning the consensus view on Google, which is that they're going to be a big AI winner. I think we called it the $2 trillion question. I tweeted about it. You know, Mark Suster chimed in and said, I'll take the side that they're going to be an AI loser. But why don't we dive in a little bit? You had this good idea: hey, let's look at these large-cap tech companies through the lens of, are they a winner or a loser from AI? So let's start with Google. Yeah. And I wanted to back up a little bit and borrow a framework from one of my close friends, and someone that I think a lot of people have listened to and learned from around investing, Michael Mauboussin. Years ago, shortly after I first met him, he was teaching a class at Columbia, and he and this guy Paul Johnson started talking about an acronym they titled CAP: competitive advantage period. And what they would do is take a company's market cap, look at the trends in the company, and back into the number of years into the future that Wall Street was telling you this company was going to have a competitive advantage, by basically counting the number of years of free cash flow it would take to build up to the market cap. And the point he made is that different businesses have different amounts of durability.
And so, you know, Coca-Cola might only have a 3% growth rate, but it might have a 40 or 50 P/E, because everyone is willing to bet that 75 years from now you'll still see Coca-Cola on the shelf. Yeah. Right. Whereas you look at a company that all of a sudden faces the innovator's dilemma, faces disruption, and this CAP can close super fast, and it has dramatic impacts on the market cap of the company. I remember when BlackBerry first got in trouble, the valuation retrenched so aggressively that many people got fooled into thinking it was a value stock because it was trading at 10 times earnings. Yeah. Same way. Yeah. And what was happening is the people in the know were saying this company's competitive advantage just quickly became un-durable. Yes. I'm not sure un-durable is a word, but you understand the point that I'm making. Oftentimes, because we have a lot of high-growth stocks in Silicon Valley, they get assigned big multiples. Multiples are a byproduct of a couple things: how fast you're growing, because we're trying to forecast those free cash flows into the future, but also, to the second point, the durability, because we have to assign a discount rate.
What's the probability that we're actually going to be able to collect those annuities sometime in the future? And so the less confidence we have about the future, the higher the discount rate. And so even though you may have high growth, we have to discount it a lot. And Mike has gone on, I think, to talk about optionality, especially around tech companies. Sometimes you have a platform position that increases the optionality; you're going to be able to move into other fields, and therefore that would also be a positive. But other people have questioned why these tech companies have high multiples at all, because they're so susceptible to tech disruption, in which case you could argue the other side. But anyway, that's the reason I thought this would be an interesting way to talk about some of the large companies and AI. I don't think there's a single person out here that is arguing that AI is not some kind of fundamental phase transition, Clay Christensen's disruptive wave or whatever. And in fact, I think the number one way you could commit hara-kiri as a public company would be to say, on your earnings call, we think AI is full of shit, like we don't want anything to do with it.
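The CAP exercise described above, counting how many years of discounted free cash flow it takes to add up to the market cap, can be sketched in a few lines. All inputs here are hypothetical round numbers for illustration, not estimates for any real company:

```python
def implied_cap_years(market_cap, fcf, growth, discount_rate, max_years=100):
    """Back into the implied competitive advantage period: the number of
    years of discounted, growing free cash flow needed to accumulate to
    the current market cap. Returns None if max_years isn't enough."""
    total = 0.0
    cash = fcf
    for year in range(1, max_years + 1):
        cash *= 1 + growth                           # grow next year's FCF
        total += cash / (1 + discount_rate) ** year  # discount back to today
        if total >= market_cap:
            return year
    return None

# Hypothetical $2T company with $70B of free cash flow growing 8% a year.
print(implied_cap_years(2_000e9, 70e9, growth=0.08, discount_rate=0.08))  # 29 years
print(implied_cap_years(2_000e9, 70e9, growth=0.08, discount_rate=0.10))  # 42 years
```

The same market cap that implies roughly 29 years of advantage at an 8% discount rate requires roughly 42 years at 10%, which is the durability-and-discount-rate point in one picture: less confidence in the future means the market is implicitly demanding a much longer advantage period to justify the same price.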
And so everyone's forced to have an answer. And I think one example that's pretty obvious to everyone is Microsoft. I think it became very clear to a lot of investors when they learned about an LLM and what it was capable of. The fact that it could help write code, and then it could help you write a paper, and it could help you with creative endeavors. People looked at the Microsoft portfolio of assets, especially where they make money around the Office suite, right? And the developer community, where they also control a lot of the IDEs that are used to program. And they said, oh, this is easy. Microsoft will be enhanced by this. And we should also add their adeptness at moving quickly with the OpenAI relationship; they were out in front of this early. Absolutely. And so with all three of those things you go, oh, they're a winner. And lo and behold, their stock went up and they had multiple expansion.
Yeah, I mean, I think the first question we ask as investors is: is this thing real? And what do we mean by real? What I mean is: is the juice worth the squeeze? It costs me something to have AI if I'm a customer, right? And are the productivity gains that I'm getting as a business worth it? So, you know, I was talking about Tesla to start off the conversation. Well, if this model and this compute and all of this capability allow me to develop full self-driving, and to win the automotive market because of that, then of course the juice is worth the squeeze. I think if Copilot allows my engineers to be 30, 40, 50% more productive, then I'm replacing human beings with machines. Of course that's worth the squeeze. And so I would say that as we sit here in the early innings... but I would even say it another way in the Microsoft case: if you're not using their tool and you're programming without it, you're falling behind. Right. And so it becomes a tool that you have to have to remain competitive.
Yeah. And so, you know, we really try to look at it through the lens of: if we look at their existing business, is that existing business enhanced or attacked because of AI? And then what new business opportunities do they have? And let's go back to Google for a second here, because I think it's kind of the iconic case study, because the consensus view was that they were going to be a huge winner in AI. And let's step back for a second. Twenty years ago, the idea that you were going to be able to ask any question and immediately get information for free, like Google gives us, was just beginning. And the gains to humanity caused by the revolution that Google really led around information discovery, and how efficiently and how quickly they provided information discovery, really just changed the world in every respect. It moved humanity forward, back to Ridley's idea of ideas having sex, right? It just allowed us to have more ideas, collect more information, exchange more information.
I asked the team to do a little analysis, as investors: where do we think the right multiple should be for Google? And what are the things that are inputs into that? So if you think about this, Google does about 10 billion queries a day, right? So, you know, a couple queries, that's more than a query for every human on the planet. And it has the most efficient system in the world for doing that. I think if you pull up this tweet from Vivek, it'll show that the growth in the number of queries asked of Google has slowed down a lot; they're growing at about 4% a year. Monetization is up a lot, 13%, more ads on the page. We've talked about that. And so you have a 17% CAGR around search.
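That ~17% figure is just query growth compounded with monetization growth; a one-line check using the rough numbers from the discussion (not reported financials):

```python
# Search revenue growth ≈ (1 + query growth) × (1 + monetization growth) - 1.
# Numbers are the ballpark figures mentioned, not disclosed data.
query_growth = 0.04          # queries growing ~4% a year
monetization_growth = 0.13   # revenue per query up ~13% (more ads on the page)
revenue_growth = (1 + query_growth) * (1 + monetization_growth) - 1
print(f"{revenue_growth:.1%}")  # ~17.5%, i.e. the ~17% CAGR cited
```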
And so the first thing that we would just say is that the base engagement, even before AI, had slowed down a lot, because you're already at 10 billion queries a day. So next we took a crack at a chart that says: how many of those queries are going to become ChatGPT-like queries? Okay. And so the black line here is the number of queries that are information-retrieval queries on Google. And the blue line is the actual, and the forecast by us, for the ChatGPT-like equivalent queries that are going to occur. Now, where do we get that information? Well, first, we know a little something about the number of queries on ChatGPT, right? And we know that OpenAI and Google are working on these search-integrated experiences. I think Google calls it SGE, where they're going to have answers, like Perplexity, in line with search. And Microsoft is now doing this with the rebranded Copilot.
So a lot of the information-retrieval searches are going to be replaced with these ChatGPT-like searches. And this is where it starts to get interesting, because two things kick in, Bill. Number one, it costs a hell of a lot more, right, to provide answers than it did to provide 10 blue links. So if you ask, what does it cost to provide 10 blue links? It's about a third of a penny or less per query. Now, what does it cost to do that for 750 tokens today? And of course this will go down over time, but it's 10x more, right? It's four cents per query. And then look at the refinement of queries that's really going on. A lot of times, Bill, what they do is they'll send back these 10 blue links, and then they'll use that as their prompt, right, to re-query the engine. This could be up to 50x more expensive to serve an answer, a high-quality answer, to the consumer versus 10 blue links. So your cost goes up a lot.
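A quick sanity check on those serving-cost numbers (ballpark figures from the conversation, not measured data); note that a third of a penny versus four cents is actually closer to 12x than 10x:

```python
# Per-query serving-cost comparison, using the rough figures mentioned.
COST_TEN_BLUE_LINKS = 0.0033  # ~1/3 of a penny per classic search results page
COST_LLM_ANSWER = 0.04        # ~4 cents per ~750-token generated answer
REFINEMENT_MULTIPLE = 50      # worst case with multi-step re-querying

print(f"LLM answer vs. 10 blue links: ~{COST_LLM_ANSWER / COST_TEN_BLUE_LINKS:.0f}x")
print(f"Refined answer, worst case:  ~${COST_TEN_BLUE_LINKS * REFINEMENT_MULTIPLE:.3f}/query")
```

Even if per-token costs fall over time, the structural point survives: an answer engine carries an order-of-magnitude higher cost of goods per query than a links page.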
Well, what about revenue? So I asked the question on the other side. Well, we know that revenue goes down. Why? Because I'm not clicking on all these ads on the page, right? And so revenue per search likely goes down. Then look at that in terms of what the gross margin is to Google, right? The cost of serving going up, the revenue coming down. You may take a 95% margin today on a business where you have 99% share; your share likely goes down over time, because you have people like ChatGPT you have to compete with. But worse yet, your margin on each of those queries goes down over time. I don't know where it nets out, 50%, 60%.
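One way to see how that could net out in the 50-60% range: a toy gross-margin calculation with hypothetical per-query revenue and cost (none of these are disclosed figures; they just illustrate the squeeze from both sides):

```python
def gross_margin(revenue_per_query, cost_per_query):
    """Gross margin as a fraction of per-query revenue."""
    return (revenue_per_query - cost_per_query) / revenue_per_query

# Hypothetical classic search: cheap to serve, ad-heavy page.
classic = gross_margin(revenue_per_query=0.05, cost_per_query=0.0033)
# Hypothetical answer engine: serving cost up several-fold even after
# optimization, revenue per query down because an answer has fewer ad clicks.
answers = gross_margin(revenue_per_query=0.04, cost_per_query=0.018)

print(f"classic search margin: {classic:.0%}")  # ~93%
print(f"answer engine margin:  {answers:.0%}")  # ~55%
```

Cost rising and revenue falling compound: neither alone takes a 90%+ margin to ~55%, but together they do.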
Now, mind you, for Perplexity or Copilot or Meta, et cetera, a 50% margin business is a great business. And you know, it reminds me of what we've all said so many times: Google's margin is their opportunity. So the problem for Google is they have to do this, because their competitors are forcing them to do it. Okay, but it's definitely going to be a lower-margin business, and they're unlikely to have the 99% share. Now, everybody has texted and emailed me: yeah, but they've got YouTube, and they've got Gmail, and they've got Gemini 1.5, and they've got all this stuff. And I stipulate all of that is true. Okay, but this is the business that today produces the vast majority of profits for the company. What percent? I mean, listen, I think that search and YouTube produce over 100% of the profits, because they have a lot of money-losing units in the business, but it's over 80% of the profits in the business. And so when you think about that... now listen, they've got great management. They can cut costs. There are lots of things they can do. I'm not saying this is going to occur overnight. But if you and I were talking with Clayton Christensen about the innovator's dilemma and we were analyzing this business, this would really be case exhibit number one.
Now, the irony of the innovator's dilemma, Bill, is that most of the companies that face it know they face it. So the question is, why don't they do anything about it? I think some don't know. But in this case, I think there's no doubt that they know. Of course they know. So the question is, they're trying to thread the needle, right? Can we somehow modify this in a way where we continue to grow our quarterly earnings? Because just setting the platform on fire and retrenching in the public markets and doing all of that is very, very difficult. And they're advantaged by one thing: the searches that are going away first are Wikipedia-like searches that don't have much monetization. So the revenue doesn't come apart right away, even though you might be losing search volume. And more importantly, people start getting addicted to the answer right away, which is very different from 10 blue links.
I think the horse is out of the barn on answers, right? Once consumers experience the magic of an answer, they're not going back to hunting and pecking for a roster for an athletic team through 10 blue links. You're just going to use it. And all of this reminded me, lastly, that somebody tweeted out a Grantham quote this week that I thought was pretty interesting. If you pull up this tweet from Charlie, he says: S&P profit margins moved down to 10.6% in Q4 '23, the lowest since Q4 2020. And then there's the quote from Grantham I love: profit margins are probably the most mean-reverting series in finance.
And if profit margins don't mean-revert, then something has gone badly wrong with capitalism. Right. I mean, what's happening here with Google? It's not that this is anomalous. Right. When there are big pools of profits like exist in Google search, capitalism has a way of redistributing those. Who would give it up? Like, I agree; Apple and Microsoft had given it up 100% prior to this new reality. That's why Satya says if you're in technology, if you run a company like Microsoft, all the money is made in the two to three years around a phase shift. You cannot miss a phase shift. If you miss it, then you miss all the value capture for the next decade.
And I might argue that with AI, it's going to be even bigger value capture and disruption, and it's going to last longer than a decade. One other thing we don't know yet that I'd just be interested in your thoughts on is: what's the business model for this? Because right now, the premium versions of Perplexity and ChatGPT have a dollar amount; they're subscriptions. This kind of looks like the Netflix-type situation. So is it subscription or is it free? Do you want ads around your answers or not? Well, I mean, listen, remember the disruptors. They don't have to generate a lot of margin on this, because they make no money on it today.
So what do they have to do? They have to cover their costs. Yeah. And these disruptors want to see Google dance, as Satya said. I was surprised when he said that, too. But listen, the fire in the belly is exactly what you need. I think Satya has founder-level fire in the belly about this moment in time. And I think he has it not just because he wants to see the stock price go up. I think he has it because people like Satya, they're post-money. And what they care about right now is moving humanity forward.
And they understand that they've worked their entire careers to get to this place, where we go from computers acting like calculators that are modestly beneficial, to computers helping us answer and solve the most perplexing and fundamental questions that we face. And so I suspect that they're going to underprice this. If I were Perplexity, I wouldn't have any ads in the thing, right? No need to put ads in it. And I would attack, and I would try to get share, right? And so it would be surprising to me if we don't see Meta AI and Microsoft and ByteDance and Perplexity
and all the others who are providing answer engines, ChatGPT, coming at them below margin. Okay. Now, let me answer the question. I'm talking about the shorter term: while everybody is fighting to gain share, they'll price it so they cover their costs, or maybe not; maybe Microsoft's willing to eat it, you know, here for a while. And I'll continue: as of right now, and I've played with all of them on the consumer side, I don't think anyone has a product edge. Perplexity came out as a lot faster.
That was pretty cool. But when I just look at the quality of the answers... I mean, on different searches, one might be better than the other, but I don't see anything that's so holistically notable. Like, I remember when Google search came out, where I was. I remember trying it versus AltaVista, and it was like, you could tell: oh, this is better, right? And I don't see that. Right. I did have that feeling with ChatGPT versus a traditional Google search. Oh, yeah. I mean, you're saying among the answer engines.
Now you have four or five, right? Right. I look at, you know, the companies competing in this consumer AI space, and I don't see anyone breaking out yet. Now, as I said on our last pod, I think if you get this memory thing right, it could change. And since we did that, OpenAI published a release that says: we're working on it, here comes memory. But the promise there, I think, is pretty thin relative to the potential. Again, because I think this is so important: you and I are in agreement that this could be the next 10x moment for GPT-like experiences for consumers.
So just double-click again on what your understanding of memory is, and why your sensibilities are that it's so important. So I think there are two elements to this. One of them is a user-expectation thing, and the second one is a technical observation. On the latter, let me get to that in a second.
But on the user-expectation thing, you know, it's funny, I always go back to the movie Her, which I thought was just incredible. You, I think, want to be able to talk to this thing and have it remember everything that you ever told it. And if you had one that knew all of your emails, all of your contacts, could remember your to-do lists, could, when you're about to meet someone, bring up the last four times you met with them and the reminders you left yourself at that point in time... You know, we talk about the programmer's 30% productivity increase.
This could be a human 30 or 40%, if you have this thing in your head that just remembers everything. And by the way, I said this last time: I think most people have just extrapolated AGI into infinity and think it's going to do all these things. But it's not doing it right now. And you know this, because you and I have been talking about this for a couple of months now: if you talk to the people at the tops of these firms and you say, hey, why can't this thing remember everything I want? They go, oh, that's a hard problem.
Exactly. And it turns out that, just because of the way this thing works, it would literally have to retrain every night on each individual user, and training costs are super expensive. Right now it's trained on the internet. It's regurgitating the internet. It's not training on everything in your database. And there are people working on this, including OpenAI; I think this is another one of those things they're all aware of. But they don't know how to do it technically.
And, you know, I invite anyone to come on the show that thinks they know how to do it technically, if they'd like to correct us or whatever; I'd love it. Or if there's a startup that thinks they know how, please come see us, right? You'd get funded. And you and I have talked about this. I mean, listen, I think OpenAI opened the kimono a little bit. I think they're further out in front than they've revealed.
But, you know, again, I come back to this idea that I'm really lucky out here. My assistant, Brit, she's been with me for 15 years. She knows me longitudinally: my likes, my dislikes, my family, everything about my kids, everything about hotels I've stayed at, rooms I want to stay in, et cetera. So my expectation of her is that she can offload a lot of that, because she has all that prior history, right?
If every human had that. Exactly. Think about the productivity unlock for humans if you give that for free to every human in their pocket. And I'm convinced it's going to happen. But one of the things I would suggest... I had a really interesting conversation with some friends about Apple. Okay. Because this is the giant, right? This is the thing in everybody's pocket. Nobody's talking about them. But they have so much information about me. Okay.
They have my contact list. They have my emails. They have my texts. They have all these applications. And so one of the things I'll just drop out there, which I think is a little provocative, is about what they may be doing. Because I've read a bunch of stuff on Twitter about how they're building a large language model of their own. My sense is they're not doing that. Okay. My sense is, in fact... think about it in the context of my assistant, right? So here's the metaphor I give to you.
Would you rather have an assistant with the intelligence of, like, Einstein, but no access to the internet and no knowledge of your history? Or would you rather have an assistant that's just above average intelligence but knows everything about you and can use the internet? Okay, you would choose that, right? You would choose the latter. And so think about what Apple is going to do. Maybe it's more like a small language model: really understand all the language, really understand everything about me, really understand how I interact with all these applications.
And then, when I have a deep problem I need to solve, they can sub-agent it out, right? They can send me down the path of ChatGPT, or send me down the path of Gemini, or send me to Meta AI for an answer engine if I want to go down that path. But I think there's this layer on top that's just a different architecture, a different way of thinking about this. It's going to be more like my assistant Brit, just steering everybody in the right directions.
I think Apple is superbly positioned to do this. But of course, you also want to be able to tell it things: just remember this, or mark this down, or attach this to a note. And we've talked in the past about how an LLM could be a user-interface disruption. So you could imagine a small business starting with a CRM that is only voice, right? You say, this customer did this, and you just talk to it and you want it to remember. But that has to be architected. Think about this: Bret Taylor's new business, Sierra. We're looking at a bunch of businesses in this space. Again, you and I are talking about it in the consumer landscape: remember everything longitudinally about me. But what is a CRM? It's remembering everything longitudinally about your customers. Well, one thing I want to do just to wrap this up, because you and I are analysts, and oftentimes in our business people ask, is this company good or is this company bad? One of the things you and I think about a lot is the distribution of probabilities, and whether it's reflected in valuation.
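Going back to the voice CRM idea for a second, the "remember everything longitudinally about your customers" behavior can be sketched in a few lines. Everything here (the class, method names, the example customer) is invented purely to illustrate the shape of the idea, not any real product:

```python
from collections import defaultdict

class CustomerMemory:
    """Minimal sketch of a longitudinal memory store: append notes per
    customer as they're spoken, and read back the full history later."""

    def __init__(self):
        self._notes = defaultdict(list)  # customer name -> ordered notes

    def remember(self, customer: str, note: str) -> None:
        self._notes[customer].append(note)

    def history(self, customer: str) -> list[str]:
        return list(self._notes[customer])

# "This customer prefers morning calls" -- just talk to it, it remembers.
crm = CustomerMemory()
crm.remember("Acme Co", "prefers morning calls")
crm.remember("Acme Co", "renewed annual contract")
```

A real voice CRM would put a speech-to-text and retrieval layer in front of this, but the durable asset is exactly this ordered, per-customer history.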
So if you pull up this chart we did, the MANG comparison, it shows the growth rates and the multiples applied to Microsoft, Amazon, Nvidia, and Google. Here all we did was take consensus numbers, so these are not Altimeter's numbers; ours are higher for some and lower for others. But one thing I want to point out at the top: these are the 2023-through-2025 expected growth rates, 14% for Microsoft, 12% for Amazon, 42% for Nvidia, and 11% for Google. So of those four, Google is already expected to be growing at the slowest rate. But then, what's interesting, if you come down to the price-to-earnings ratio, you'll see that Google is trading at the lowest P/E: 21 times 2024 and 18 times 2025 expected earnings.
So all the things you and I just talked about, Bill, about growth rate and the durability of free cash flows into the future, I would argue a lot of that is already discounted in the stock, right? People are already placing those bets. And so one might take the other side and say, yeah, Brad, yeah, Bill, I know all those things to be true, but they can cut a lot of costs and do a lot of things, and that could cause the multiple to go up. But then go to the line under that, the PEG ratio, because this is one a lot of people want to ignore. On a price-to-earnings multiple, for example, Nvidia is a lot higher. But if you actually look at it on a PEG ratio this year, it's a much less expensive company. If you look at it on 2025 PEG ratios, it's just a little bit different. So there are two ways to look at future price-to-earnings multiples: one is growth-adjusted, which tries to take growth out of the equation, and the other just looks at strict valuation.
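To put numbers on the growth-adjusted point, the PEG calculation is just forward P/E divided by expected growth. The Google figures below are the consensus numbers quoted above; the 42x/42% example is hypothetical, to show why a high-multiple fast grower can screen cheaper on a growth-adjusted basis:

```python
def peg(forward_pe: float, growth_pct: float) -> float:
    """PEG ratio: forward P/E divided by expected earnings growth in percent.
    A PEG near 1.0 is conventionally read as growth fairly priced."""
    return forward_pe / growth_pct

# Google, using the consensus figures quoted in the discussion:
google_2024 = peg(21, 11)   # ~1.9
google_2025 = peg(18, 11)   # ~1.6

# A hypothetical stock at 42x earnings growing 42% a year:
fast_grower = peg(42, 42)   # 1.0 -- higher P/E, but cheaper growth-adjusted
```

This is why a 42%-grower at a rich multiple can still look less expensive than an 11%-grower at 21x, which is exactly the Nvidia-versus-Google point being made.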
So my big takeaway from this, Bill, is that we're not here to pick on Google. We're just saying this is an important case study to watch on the innovator's dilemma. And it's clear to me that investors are already discounting some of these coming headwinds. And I think there may be an opportunity. You know, I said the other day, if they manage to thread this needle, trust me, I'll climb on board that bus, because I think there are tremendous costs they can cut out of that business. There's a lot of fitness they can drive into that business. And the real questions are: how are they going to drive down the cost of serving these inferences, and how are they going to monetize this? By the way, it's funny, because I think they have other assets. When you talked about Apple, you said they have the handset. Well, Google controls the entire Android market, which is a big market.
They have a competitor to Microsoft's Office suite. Now, they have historically not invested a lot in that; it's not a big driver of their revenue. But they could all of a sudden triple down. You know, they were ahead in type-ahead, if you remember. My kids used those products while I was always on Microsoft, and I can remember it finishing sentences for my kids. I was like, what was that? Right, that was inside of Gmail first. Yes. And so they have assets they could bring to bear. And I think everything you said about Apple is true. Having control of the physical device seems real to me. Whereas with Meta, the notion that my AI would live in my WhatsApp as a person...
Like, that doesn't feel intellectually perfect to me. It being in the phone? Yes, that feels perfect to me. This thing's with me all the time. But let's talk about two things in that regard. So we talked about memory being a 10x ChatGPT moment. You said ChatGPT was one of these 10x moments compared to blue links. If we got memory, that would probably feel 10x. By the way, while you're there, I have to say one thing that relates to valuation. One thing that drives durability, going back to our competitive advantage period, is switching costs. If I start relying on one of these things as my memory, and I don't have a way to pull that out and jump to something else, I'm stuck.
Yes. I am hooked, locked, stuck, right? Which is very, very positive for whoever gets there. So I'm looking for memory as a 10x moment. The other 10x moment I'm looking for, Bill, is actions, right? Going from answers to actions. So let's talk about that for a second. Yeah. You know, this company Rabbit has been making some waves. They have their version one out. I think Tony Fadell tweeted the other day that he can't wait to get his hands on one. There's a bunch of cool demos online. We've spent some time with the company. And the thing they talk about as a huge differentiator is what they call a large action model, not a large language model. A large action model. And basically, think of it like cursor control, Bill. In fact, we did this demo upstairs when they were visiting. I said, book an Uber, and it was able to do it.
It had literally trained on the cursor behavior of people using these apps, and it was able to book that without any other intervention by me. So I took to doing some research and asked, could Apple do this? Because Apple knows exactly what pixel I'm hitting on the screen when I hit a book button on Booking.com or on Uber or whatever the case may be. Now remember, Karpathy talked about this. When he went to OpenAI the first time, he worked on a project he called World of Bits. And the iconic thing he tried to do there (this was maybe five or six years ago) was to book a hotel. Could he get an AI to book a hotel? He said at the time it was damn near impossible: he had to write all these very specific algorithms and try to figure out what every booking page looked like. And he said on Lex Fridman's podcast, maybe a year ago, that if he tried to do it now, using the general capabilities that exist today, it would be a lot easier.
So I think Apple is clearly working on this, and startups like Rabbit are working on it. I think that's another 10x moment in front of us: we go from answers, where I'm just asking for information, to actions. Once it can start booking my hotel and reserving my restaurant, then I just say, same thing, do it again, right? Because it has a little bit of memory about my prior action. Those are really powerful. And there's an element of this that's just a fancier version of screen scraping, right? There's a hackiness to this notion. I have often said, you know, why in the world, in the self-driving world, are we writing millions and millions of lines of code to infer the state of a traffic light? Why don't we just broadcast the state of the traffic light? It would be three orders of magnitude less code. But guess what, I think we're literally going to bypass that. And if we had done that, it also would have been intensive, right? Because then we would have had to wire everything up to broadcast.
Well, here's where I think the world's going. We meet with these robotics companies, we meet with Tesla, et cetera: imitation learning. Okay, they're not even going to know what the stop sign is, or the traffic light, or the dog in the street. They're not going to write C++ for every one of those specific incidents. They're literally going to watch the behavior of five-star human drivers for enough hours, and they're going to imitate it. All right, but you're missing my point back on the internet side, which is that having the AI move my cursor around and click and fill things out is not the most efficient way to do this. You would have APIs for these different services, and a way to interact. And that's going to be an interesting evolution. There are a number of startups working on this too, on different ways to try to drive action. Some of them will sit on top of browsers, and some of them might try to sit on top of your phone; of course, Google and Apple will stop them from doing that. I totally agree with that. It's funny, I was asking our analysts: there are 10 billion queries a day on Google today. Does the number of queries in the future go up or go down? And somebody, I think at our event, from Perplexity, said to me, well, the number of queries probably goes down, because you don't have to ask so many times; it'll just give you an answer. And I said, what about the positive reflexivity? Once I get the answer, I've got more questions, right? As long as it's fast and it's producing that information, I actually think actions and memory will unlock more interactions, because it's so much more valuable to me. I'll start using it more and more for these future things. I don't know.
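The contrast being drawn here, pixel-level action models versus structured APIs, can be sketched roughly as follows. Every name in it (`Click`, `book_ride`, the coordinates) is invented for illustration; a real large action model would emit the click sequence from a trained policy rather than the hard-coded stand-in shown:

```python
from dataclasses import dataclass

@dataclass
class Click:
    """One screen-level action: tap a pixel coordinate."""
    x: int
    y: int

def act_via_ui(request: str) -> list[Click]:
    """'Large action model' style: a model trained on recorded cursor
    behavior emits a sequence of screen taps. Hard-coded stand-in here."""
    return [Click(120, 640), Click(200, 980)]  # open the app, tap 'book'

def act_via_api(request: str) -> dict:
    """API style: one structured call replaces the whole click sequence.
    Less fragile, but the service has to choose to expose it."""
    return {"endpoint": "book_ride", "params": {"request": request}}
```

The UI route works on any app today but breaks whenever the screen layout changes, which is the "hackiness"; the API route is the elegant version, but it depends on Uber, OpenTable, and the rest agreeing to integrate.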
It'll be interesting to see. You know, for a while we've had the Alexas of the world, or whatever, do integrations, right? So maybe the possibility exists that if I'm an Uber customer and an OpenTable customer, eventually I will tell them my favorite front end, and they'll come to some agreement so they can pass my registration information through, and that all happens seamlessly. But there's a lot of work to do to make all that happen. Right. I mean, I think there's a lot of agent-to-agent interaction that will go on: an AI agent representing both of those parties. But what's interesting about the action model, the hackiness that you talk about, right? I imagine this will get solved by startups in some pretty hacky ways to begin with, but then it will ultimately, likely, be solved at scale in more elegant ways, whether that's APIs or agent-to-agent interactions, et cetera. But we're starting to see real experimentation, and I've seen some of the early prototypes of actions actually coming to pass. And that feels to me like the next big breakthrough, along with this memory. And by the way, I said this last time, and it's a subtle point, but it's another issue in the disruption: I don't think Google has treated its partners well in the search ecosystem. So there's a lot of angst there and a lot of mistrust. If OpenAI or Perplexity came along and said, would you integrate and pass tokens? They might say yes. I think they're going to be more reluctant to do that with Google. At a minimum, we know they would probably like more competitors in the game of sending them leads, right? So just the fact that you're a smaller player, that you can be another source of competition, and that they're not so dependent on Google for upstream traffic, is probably an advantage to you.
Well, I know we're going to move to the topic of chips in a second. But before we get there: we touched on Microsoft, Google, and Apple by way of comparison, and people have heard me talk about Meta a little bit in this regard. So again, the way we approach the analysis for all large-cap tech: does their existing business get better or worse because of AI, and do their new business opportunities get bigger? In the case of Google, they have this massive, super-profitable business that's under assault by answers and actions. In the case of Meta, we've seen their core business get better as a result of AI. Why? Because you're now targeting videos on Reels. And that was already starting to happen. But the big difference between ByteDance and Meta, I think, was that Yiming at ByteDance adopted an approach around AI and GPUs before Meta did. I think Mark really made that transition about three years ago; you can see it in their capex spend. But the big question was, obviously, he had to spend the money before he got the results. So investors like us were holding our breath, asking, would this lead to better engagement? Well, now we know it led to massively better engagement. And I'm not talking just about Reels. This is on the core big blue Facebook product. This is on WhatsApp. This is on Instagram. So they have these big platforms benefiting from both more engagement and better ad targeting. Remember, this stock was at 90 bucks and everybody said Facebook was dead, because Apple pushed through changes that disabled their ability to really track people. And basically, because of AI, they've been able to backfill that monetization completely. No investors thought that was going to be the case 18 months ago.
So their core business got a lot stronger. Now, as we look ahead, think about the new business opportunities in front of them, and I'm just talking about the things Mark talked about on the call. Number one, they've got tens of millions of business customers, and now they're literally creating AI customer-service agents for every single WhatsApp business. We don't see that as much here in the US, even though WhatsApp is the fastest-growing messaging platform in the US. But if you go to a place like India or Brazil, people are transacting. Some of the biggest AI bot companies are being built on WhatsApp as a platform in Brazil and in India, where they already have tens of millions of customers using them. So these have become platform companies enabling vertical and horizontal bots. And they're going to build their own. They're going to build them for celebrities. They're going to build shopping agents that assist me in buying things on Instagram. You know, I always see all these things I like on Instagram, but it's a pain to actually buy them; the one-click never got that easy. Now I think you're going to see shopping agents that assist with that. And then just think about content creation, Bill, whether you're an advertiser or a creator. Think about what we saw this week with Sora: text-to-video. Now think of that in the context of an advertiser trying to drive demand, or a creator. My sons are creators on these platforms. This is going to unleash monster amounts of creation in the world at lower costs. So all of that benefits their core business, and you have these new businesses they get to move into that I just mentioned.
Then, of course, there was another interesting thing from, I think, the Morning Brew pod Mark was on last week. He talked about the Meta AI glasses that all my analysts have, right? He said most people saw Mark taking the video reviewing the Vision Pro from his couch, and that got a lot of glamour on Twitter. But the fact of the matter is, Mark said the way you want to think of VR and AR is really as your desktop or your laptop. The Meta AI glasses, he said, think of as your phone. Because I'm going to be able to text, to call, to listen to music, to order my Uber; I'm going to be able to do all these things from those glasses, and I don't have to pull out this rectangular thing I keep in my pocket. I think that's why you're seeing such incredible demand for them. And of course, the form factors will change and evolve over time. But that's an entirely new line of business. So this is a company that's been spending $20 billion a year on these other businesses that haven't been generating a lot of return, and I think now the market's starting to assign some value to them. But we should be fair, right? Because YouTube benefits from those same dynamics you talked about. And if you're talking about the glasses being the phone, Apple and Google already have a huge installed base; how many orders of magnitude bigger is that than the number of Ray-Ban Meta glasses?
Yeah, no, for sure. But I think the question is where you go from where you are today, right? And so I'll stipulate YouTube will be a better business in the future. Content targeting will be better. Ad targeting will be better. And as long as Google is able to backfill the core of search, like we just discussed, then it's going to be worth more in the future. There's no doubt about it. And of course, in terms of their basic research and development around AI, what they did with Gemini 1.5, et cetera, they have incredible talent and resources. The only liability is they have an incumbent business that is a monopoly business with monopoly profits. So that we can move on, let's do a fast drive-by. I'll do one on Apple and then you do Amazon. Okay. As a reminder to everybody: just our opinions, not investment advice. So for me, you know, with Apple, you could argue they have the best asset in the world in this phone. And if you look at its user base compared to the Android user base, it's just perfect, right? And they've been doing Siri for a while. So you connect those two things and you say, shit, if they put an LLM on top of this, they could get to all the data. You could give Siri permission to read your emails, so it could literally get to all the data. So that's a massive positive. Now, the negative is they haven't been known for internet services. Look at Spotify relative to Apple Music. And Siri's kind of been... it's been terrible. Not evolving, right? It's very much like it was the day it came out. Yeah. And so it would almost require a pivot like Mark did on cost. You'd almost have to see Tim come out and say, we're making a massive pivot, 180 degrees, we're going to be all in, we're putting our best engineers on this. And until that happens, I'm in the doubters' camp. Yeah. Well, I mean, clearly it's been an underperformer this year, mostly related to the China market.
Yeah, you have the China market, but you also have all these concerns in the market about the durability of revenue going forward. But they have the assets. Imagine if you took, like, five of the top AI people at these companies and they were there, the way Tony Fadell was there early on for the iPod. Listen, for the first time, they have real challengers, whether it's Humane or Rabbit or the Meta AI glasses, these other things, right? I'm just saying the door has been cracked on the ecosystem. Siri has not evolved, right? I think they'd be the first to acknowledge that, and I think they are going to try to disrupt themselves on it.
I'm not sure whether we'll see a big breakthrough moment this year. I think we'll definitely see announcements this year about AI on the edge, running on the phone, and all these other things. They'll start to crack the door on this. To me, the big breakthrough for Apple is if they can run a 5-to-10-billion-parameter model on the phone, on the edge, without consuming all of my battery (and there's a lot of talk that they're going to be able to do that), maintain some memory about me, and then show me the early part of actions on this device. That would unlock a huge new device cycle. Okay, and that's what drives the stock. Was I the one supposed to do Amazon? Okay, you do it. I'm going to ask you quick questions.
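As a back-of-envelope check on that on-device point (the quantization levels below are generic assumptions about model compression, not anything Apple has announced), a model's weight memory is roughly parameter count times bytes per weight:

```python
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB: params * bits / 8 bytes each.
    Ignores KV cache and activations, so real usage is higher."""
    return params_billions * bits_per_weight / 8

# A 5B-parameter model:
fp16_gb = weight_memory_gb(5, 16)  # 10.0 GB: more RAM than most phones have
int4_gb = weight_memory_gb(5, 4)   # 2.5 GB: starts to fit on a device
```

This is why the battery and memory question is the crux: at full 16-bit precision the model doesn't fit, and aggressive quantization is what makes the edge scenario plausible at all.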
So on the e-commerce side of the business, does AI help or hurt Amazon? And how much? Yeah. When I think about retail e-commerce, I think about it from two directions. First, Amazon has been in the business of AI from a merchandising perspective for a long time, just like Alibaba has. Think about the largest retailer in the world. Think about the way Macy's used to work: there was somebody at the store who would say, we're going to show black t-shirts today at the front of the store. At Amazon today, nobody knows why they are targeting Brad Gerstner with certain things. It's a black box. Okay, so they're using it. But here's the thing I think is happening a bit to them on that front. And by the way, Andy Jassy is getting fit: they are tightening the screws on costs and all the other things.
But look at a company like Temu. Okay, Pinduoduo in China owns it. It quickly became the largest advertiser on Facebook, and its e-commerce sales are through the roof. Now, what they're doing is so incredibly clever. It's full-stack AI. They don't even have inventory or merchandise. They literally go out and collect data from customers about what they think they will want, assess how many of those things to build, and then build it themselves. So they vertically integrated this AI e-commerce business, and they're pushing it out the other end. I think a lot of people in the US were dismissive, but they've been shocked at how big that business has become in such a short period of time. We're starting to see this out of TikTok as well, where they're turning into an e-commerce business. And I think this opportunity sits in front of Meta as well. So there are some orthogonal challenges. But I think Amazon's core continues to get better because of better targeting and AI reducing costs. Think about their customer care costs. We do have to move on.
Hit AWS as quickly as possible. Yeah. So I would say, with AWS, at the end of the day these companies are in the business of renting AI services to enterprises, right? And as much as we talk about Azure and Microsoft running the table today, here's the truth: we've seen almost no share shift from AWS to Azure as a result of OpenAI. If you had asked me 12 months ago, I would have said the jury's out on whether that would happen. It didn't. Amazon responded quickly enough. And here's the other thing: you know, Frank Slootman talks a lot about this term, data gravity. It turns out all my data is in AWS. As long as they have a decent AI solution, I'm going to stick with them, because I don't have to move anything. And I think they delivered that solution. Jassy was on a podcast talking about the fact that they have proprietary chips for both training and inference. And obviously, as the AWS stack grew up, they moved into networking chips; they've moved into a lot of technologies people wouldn't have thought of Amazon owning or designing.
In this bigger transition, do you give them any chance of being competitive from an AI silicon perspective? So I think the right way to think about it is not whether they will build a better GPU than Nvidia. The right way to think about it is whether they can supplant part of the supercomputer, the entire system. Are there pieces they can pull out and plug in, or specific workloads they can serve with lower-cost infrastructure, because they're doing hyper-targeted silicon all the way to the model? I think the answer to that is yes. But I still think they'll be one of the largest buyers of Nvidia GPUs in the world, because that doesn't get replaced for a lot of really important workloads. Well, he also said something I think would be good for the listeners to hear. I don't want to overstate where Alexa is (we talked about Siri earlier), but he said that as Alexa got bigger, the training costs were tiny compared to the inference costs. And he suggested, and maybe this is me interpolating, that the inference market over time is going to be much more susceptible to things that are lower power and lower cost: all the things that aren't just performance from a silicon perspective. And I totally believe that to be true. And with that, we can move to the biggest headline of the week, which we finally got to: this debate over the future of the compute build-out needed to support AI. And to your earlier point about valuation: how unique is this revenue, and how long does it last? So we have a couple of charts and tweets to bang through here to contextualize this. First, Nvidia's stock is up a lot, but that's because the revenue and the profits have greatly exceeded expectations. This chart just shows what their data center market share has grown to in the year, right? The world is shifting toward AI as a compute infrastructure, and they benefited.
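The training-versus-inference point is easy to see with round numbers. Every figure below is hypothetical, chosen only to show why a recurring per-query cost dominates a one-time training run at Google-like scale:

```python
# One-time training run vs. ongoing inference, all numbers hypothetical.
training_cost = 100_000_000          # $100M, paid once per model
cost_per_query = 0.002               # $0.002 to serve one query
queries_per_day = 1_000_000_000      # a large consumer-scale query load

# Recurring annual inference bill at that load:
annual_inference = cost_per_query * queries_per_day * 365  # $730M per year
```

At these made-up rates, the inference bill passes the entire training cost in under two months, which is why silicon optimized for cheap, low-power inference can matter more over time than raw training performance.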
One of the areas I think has been greatly underestimated, and I tweeted about this, is the idea of sovereign demand. I tweeted this week, you know, Jensen was over in the Middle East meeting with several of the GCC countries. And I think what people still don't appreciate is that there are probably dozens of sovereigns trying to get into the Nvidia order book, and they view building out AI capabilities as one of their top three national priorities. And I think the size you're talking about for some of these GCC countries is going to be competitive with the hyperscalers themselves. So in that context, when Sam Altman suggested, and blew everybody's mind, that he was going to raise seven trillion dollars to build chips (I don't know if he ever actually said it; it was inferred and repeated over and over again), he threw out a big number. But I do think we're talking trillions of dollars over the course of the next four to five years as we rebuild the world's compute infrastructure. And then finally, Masa did not want to be left out, so he came out and said he's going to raise a hundred billion dollars to build fabs and chips to compete with Nvidia as well. You've watched the semi industry for a long time, and, more importantly, the dynamics of supply and demand. So just step back for a second. What do you make of all of this?
Well, I have some cynicism, but that comes naturally to me. The first thing I would say is they're all talking about raising money from the exact same people. So if I were those people, I would just be a little careful, because to a certain extent there's a smell of loose money. That's how I would interpret it. Because they're not just saying they want to raise this money; they're saying they want to raise it from a very specific group in the Middle East. So that's one thing. Two, I was struck when I read about sovereign server stacks. You know, there needs to be a reason, right? It would have to be about wanting control over certain amounts of information. It could be information proprietary to your country, or wanting to control how all of these models operate in that country. Servers typically depreciate a bit like fish, you know, and that's been true of DRAM and storage and all of these. Like fish? Fish, right. They last a day. Yeah, well, I'm being provocative, obviously, but people have talked about that. You wouldn't want to hoard any, because what happens is the next generation comes along and the value goes down quickly. So, you know, there was a time at which Microsoft was trying to convince the world that we'd all need an NT server for every employee. And when I heard that the first time, I was trying to get my head around it. So I don't know. I don't know if countries need server stacks. Maybe. Like I said, they'd have to have those particular reasons in mind.
The second thing that struck me, and this gets more to the Altman and Masa quotes, is the idea that we're just going to go compete with Nvidia. That's pretty radical. There are already people competing with Nvidia (Intel, obviously), people that have somewhat of a head start. Like decades, yeah. So you're just going to go do it? That's bold. It's not like chip design bends to disruption the way software does. This is hard stuff.
Yeah. And then, once again, I don't know that there was an exact quote, but the idea that you're going to build a fab and compete with TSMC and Nvidia at the same time? Like, no chance. Yeah. Like, no chance. Because let's say, and we all know how the math works, say you've got a 20% chance of competing with either of them. Right. Then you're down to a 4% chance of pulling off both. And by the way, think about the time scale that you're going to need. I mean, well, we'll get into it in a minute.
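The compounding point above is just multiplication of independent probabilities. A minimal sketch, assuming independence between the two fronts (and the 20% per-front figure is the hypothetical from the conversation, not a measured number):

```python
# Rough sketch of the compounding-odds argument above.
# Both figures are hypothetical; independence is an assumption.
p_beat_tsmc = 0.20    # chance of standing up a competitive fab
p_beat_nvidia = 0.20  # chance of shipping a competitive chip design

# If the two bets are independent, the joint probability is the product.
p_both = p_beat_tsmc * p_beat_nvidia
print(f"Chance of winning on both fronts: {p_both:.0%}")  # prints "Chance of winning on both fronts: 4%"
```

If the two outcomes are correlated (say, the same team or capital pool drives both), the joint odds change, but the directional point stands: stacking hard bets multiplies the risk.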
Because we'll talk about what it means to have a competitive fab around the world. But TSMC is far, far ahead of the competition. And one of the reasons AMD has a higher market cap than Intel today is specifically because they got out of the fab business and bet on TSMC. So I think it's really important to tease out those two things, the fab and the chip design, right? And obviously Nvidia is already designing two to three generations ahead.
I mean, the B100 is already taking orders in the order book, likely to launch in Q3 of this year. And, you know, it's orders of magnitude better than the H100s that are out there today. And they're already designing what comes after Blackwell. So it's not as though they're standing still, and anybody who knows Jensen knows he's an animal, and that company is playing for the future. And then if you look at TSMC, I shared with you a video; maybe just pull up a little bit of the findings from that.
But this is from the founder of TSMC and its CEO for decades, Morris Chang, talking about the competitive advantages, Bill, because this really gets to the fab. Like, why are all the world's fabs in Taiwan? Okay? And why aren't the fabs in Texas anymore with Texas Instruments? Or why aren't the fabs in other parts of the world? And what does that mean for the future of this build-out? And I think the implication of these releases is that we're going to start building a bunch of fabs in the Middle East. We know we're trying to build fabs in Arizona. There's some talk about building fabs in Mexico. But maybe let's just deconstruct that one. What does it mean to build a fab outside of Taiwan to make next-generation AI chips?
You shared this link with me. It's a talk that Morris Chang gave at MIT, I think in November, right? Very recently; he's over 90. It's like an hour-long talk, and the first 60% is a history lesson. But the last 40% I would encourage everyone to go watch, like everyone, including every politician in the United States of America. Because Morris makes the point that the reason Taiwan is competitive has to do with the labor model that exists there, and the type of work people are willing to do, and your ability to retain them.
And he walks through his history of hiring in Texas and other parts of the US. Interestingly enough, he ran a fab for Texas Instruments in Texas. And he explains why a fab in Texas could never possibly compete with a fab in Taiwan. And he even admits that the US was a manufacturing powerhouse in the 50s and 60s, but the social requirements that we put on labor at that time were different than they are today.
And so whether you look at TSMC, and it turns out the same thing's true of a Foxconn factory in Mexico, you have people working longer hours, sometimes living in dormitories. And he mentions that in his talk. And he says the country most likely to disrupt Taiwan would be Vietnam or India, not an advanced economy. And to think you're going to re-onshore a fab and ignore Morris Chang is just kind of crazy to me.
And I look at the requirements, once again, that we put on companies around labor, and I say to myself, it's not going to happen here. And people will quickly react to that and say, oh, you're in favor of forced labor, or you like super hard labor. But the people that are choosing that at that point in time are choosing a better life. And to deny them that opportunity, like, the individual that lives in Juárez and commutes to this Foxconn plant is getting a better life, even with his four nights a week in the dorm. And to deny him that and insist on our circa-2023 social norms for that country is unfair. Right. From my point of view. Right. I think it denies them their chance at disruption. So when you unpack, and we'll put the link to the video here, when you unpack that message, it's really that Taiwan thrived because these operators and technicians would spend their lives working in the same fab, getting better and better at the same thing. And he talked about a 12% churn rate.
I think when he was at the fab in Texas, he said the problem is the second a better job came along, they would leave for the better offer. And that 12% was during a recession, implying that it was much higher, more like 25%, when times were good. And he said you just can't run a fab with 25% churn among the operators; you produce really bad product. And I think the point is it's not just a better life, it's also kind of these cultural norms. And so that's why he said, you know, in Vietnam and India they have cultural norms he believes are more consistent with loyalty and staying with something longer.
And on top of that, it would be an improvement in the quality of life for the people who would take these jobs, and therefore the incentive is there for them to stay in those positions. And either right or wrong, like, I'm not trying to be provocative. Yeah, it's provocative, because it says that if America is really worried about the concentration in Taiwan, they should probably be trying to help build some of these plants in Mexico or Vietnam or that kind of thing, versus bringing them here. Because the odds of operating them here in a competitive way, globally competitive, are low. So I guess, does that make you skeptical of the CHIPS Act? I mean, I see Intel is back in Washington looking for another $10 billion to subsidize the work that they're doing. I mean, I get the US national security concern, particularly considering that 100%, or virtually 100%, of these advanced chips are being manufactured in a place that has political risk associated with it. Bill, let me ask you this.
By the way, I am somewhat skeptical of the CHIPS Act. And then the other thing I would say to you is, China is probably in a really good place. Yeah. They're really smart. They have all the intellectual capability of being competitive. And they probably still have this opportunity, you know, over time, in terms of the willingness of a certain part of their population to work in those types of situations. Yeah. I mean, I'll probably take the under on that. I think that the opportunity, like, now there is a global imperative to diversify the source of manufacturing.
And I think Morris Chang was having this conversation at MIT recently because he understands the global imperative. I think you are going to see plants get built in places like Vietnam and Japan. I think you are going to see them get built in India. You're probably going to see some attempts in the Middle East. Obviously, we're trying to do some of this stuff here. I think from a United States perspective, both in terms of wanting to maintain leadership in AI and wanting to diversify the political risk associated with Taiwan, it's not so important that everything is produced in the US, right? That shouldn't be our goal or objective, for all the reasons that Morris Chang points out.
But I do think it would be better if we had four or five places around the world that were load-balancing the manufacturing of these chips. That's a fair point. And I think the whole re-onshoring argument conflates the national security interest with an interest in American jobs and that kind of thing. Going back quickly to these new chip companies that are going to miraculously compete with Nvidia, I would tell you, and this is maybe an older venture capitalist talking, one who's watched different partners sit on the boards of startup semiconductor companies: it ain't easy. The first silicon that comes back doesn't always work. And you might be $50 million to first silicon. You might be $100 million to first silicon.
You've got to get in line at TSMC. How are you going to break that door down? How are you going to outcompete Nvidia for TSMC's time? How are you going to get a place in that line? It's hard. And by the way, once you do get working silicon, your yields are probably crappy. That's what happens. This is physical, material-science-type stuff. It's not software. Right. And you're going up against, again, two companies that are run in pretty exceptional ways by exceptional founders. In Jensen's case, he's been there for three decades; he's devoted his life to this. TSMC seemingly has similar types of leadership. But one of the things I wanted to pivot to on this, Bill, because it raises the question: why is Sam throwing out this really big number? Right? Why is Masa throwing out this really big number?
And I think the answer, like, one of the things I want to talk about is this: what is the size of the market opportunity that we're talking about here? And so I have a slide. When Jensen was in the Middle East, he mentioned, and this was just from Feb 24, the quote was: there's about a trillion dollars' worth of installed base of data centers around the world, and over the course of the next four to five years we'll have two trillion dollars of data centers powering software around the world, and it will all be accelerated compute. Okay. And so I asked my team to break that down a little bit. Like, what does that look like per year, right, to get to this number? So, of course, I'm doing this from the outside in; we take a swag at it. Pull up this next bar chart.
So this is the AI data center build-out. In blue is the new accelerated compute, right? In green is the replacement data center spend that we think will go to accelerated. And then in gray is the replacement that's non-accelerated compute, so this would be more like, you know, x86. And again, I'm certain this is wrong in the specifics, but that's what we're in the business of doing, trying to build a forecast based upon folks who are providing this information. The line running through the center starts at 55% and goes down to 26%. That is Nvidia's share of that global compute build-out, based on current consensus numbers for Nvidia. Okay.
So the consensus forecast that has the stock at $700 a share assumes, if you believe this TAM to be accurate, that their share will go from 55% today to 26% in 2028. Now, I think if you just step back and ask, okay, do we think we're going to go from a trillion dollars of data centers to two trillion dollars of data centers, just ask that at a high level? Okay. You and I just spent an hour-plus talking about how 10 billion queries a day are likely to move from information retrieval to inference, as we as humans expect to get answers rather than a card catalog. We talked about enterprises, whether it's their engineers doing code generation, or their customer service centers, or whether it's Tesla and full self-driving, or sovereigns who are taking on national security issues, you know, drone fleets, or whether it's proteins and life sciences or material sciences. There isn't going to be a single industry on the planet that's not employing accelerated compute to solve the problems of their enterprise.
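To make the shape of that forecast concrete, here is a back-of-the-envelope sketch. Everything in it is an assumption layered on the numbers quoted above: it spreads the $1T of net new build plus replacement of the existing $1T base evenly over five years, and walks Nvidia's share down linearly from 55% to 26%. The actual model behind the chart is certainly different.

```python
# Illustrative only: a naive take on Jensen's $1T -> $2T installed-base quote.
# Assumes ~$1T of net new build plus ~$1T of replacement, spread evenly over
# five years, and Nvidia's share falling linearly from 55% to 26%.
YEARS = 5
NET_NEW_TOTAL = 1_000_000_000_000      # $1T of new data centers
REPLACEMENT_TOTAL = 1_000_000_000_000  # replacing the existing $1T base

annual_buildout = (NET_NEW_TOTAL + REPLACEMENT_TOTAL) / YEARS  # $400B per year
for year in range(YEARS):
    # Linear interpolation between the 55% starting share and 26% ending share.
    share = 0.55 + (0.26 - 0.55) * year / (YEARS - 1)
    nvidia_rev = annual_buildout * share
    print(f"Year {year + 1}: share {share:.0%}, Nvidia ~${nvidia_rev / 1e9:.0f}B")
```

Even with share nearly halving, the implied Nvidia dollars stay large because the annual build-out itself roughly doubles versus the historical base, which is the tension the conversation keeps circling.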
So with that as the backdrop: a year ago, I think the big question, Bill, was, are there going to be enough productivity gains in the world to justify this compute build-out? Remember, people thought, oh, we're pulling forward all the demand for Nvidia, this is like dark fiber in 2000, we're going to be way oversupplied, we're going to have a glut. I think the evidence on the field is that that was wrong. I think the evidence on the field is that, in fact, just like we talked about on the last pod, we tend to underestimate the size of these super cycles, because when you have these phase shifts, everything changes. You have positive reflexivity in the world; more begets more, because it's better. Right. And so the bigger question when I look at this chart, what I push my team on, is pricing. The rumor is that B100s are going to cost even more than the H100s. And so I say to my team, these margins have to get competed down, right? But the feedback, and something I think that's really important, is that although the B100 is more expensive, it's so much more powerful. Right. It's like this.
If you had an employee, Bill, and you were paying them $100,000, and I said, hey, you ought to hire this other guy, he costs $200,000, you'd say, well, why would I hire him if he costs $200,000? I would say, he does 10x the work of the guy you pay $100,000 to. You would pay him $200,000 in a second. Right. And I think that's why Nvidia today is getting those margins. In the future, I expect that there's going to be more competition, whether it comes from these custom chips that you're talking about, whether it's from other competitors like AMD, or whether it's from, you know, new startups from Masa or Sam, etc.
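The hiring analogy is really a cost-per-unit-of-work calculation. A minimal sketch, where the 10x productivity multiple is the hypothetical from the analogy, not a measured benchmark:

```python
# Price-per-unit-of-work comparison behind the hiring analogy above.
# The 10x productivity multiple is hypothetical, for illustration only.
def cost_per_unit_of_work(price: float, output: float) -> float:
    """Dollars paid per unit of work produced."""
    return price / output

incumbent = cost_per_unit_of_work(100_000, 1.0)   # $100k, baseline output
candidate = cost_per_unit_of_work(200_000, 10.0)  # $200k, 10x the output

print(f"Incumbent: ${incumbent:,.0f} per unit of work")  # prints $100,000
print(f"Candidate: ${candidate:,.0f} per unit of work")  # prints $20,000
```

Same logic applies to a B100 versus an H100: a chip can cost twice as much per unit and still be the cheaper option per unit of compute delivered, which is the argument for why the premium pricing can hold until competitors close the performance gap.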
But you and I just talked about how the challenge to build a fab and the challenge to design those chips is non-trivial, and, you know, the probabilities are somewhat low. And it's going to take time. I mean, if you're starting today, when would you have an impact? But one question I would have for you on this is, if you're right about this, I wonder about TSMC's capacity, right? That is limited, right? So you're looking at the chart and saying, how do we get to two trillion of replacement and new build if TSMC is gated in their ability to produce these?
Now, Jensen gave this number, and he's intimately familiar with TSMC and their ability to produce. So I assume he has a sense in his head about what they can get done. I think the other limiting factor we're going to run up against here pretty quick is not going to be capital, right? It may or may not be TSMC. It's power consumption. So even for the B100, the data center designs, like, you're talking about liquid cooling, custom-designed data centers; they're going to consume massive amounts of power. And I think part of the reason you're hearing about this sovereign demand from the Middle East, Bill, is they understand that their chief competitive advantage is low-cost energy, right?
And I don't think they're talking about building all of this just to service the needs of their country. I think they're smart enough to understand they're trying to equitize their petrochemical wealth into the technology wealth of the future. And if I were running one of those countries, I would look at this phase shift as an opportunity to become the supplier to the world of computer-aided intelligence, right? And they may be able to do that, because they have lower-cost energy and because they can recruit the likes of Sam Altman and the likes of TSMC to set up shop in their countries and design chips there.
It's not all that different than digging wells, right? Think about digging a well: you have to spend a lot of money, a lot of time, a lot of research, and you're hoping you get your payback five, 10, 15 years into the future. And so I think this is a rational decision by them to build out this capability. But to your point, it's a non-trivial undertaking to try to get it done. Now, if they do that, it's going to put them in competition with some of the hyperscalers, right? From CoreWeave to AWS to others who are in the business of renting out AI capacity.
But I think it's good for the world, because what we want to see is a lot of competition. We want to see the price of AI compute fall over time. That's going to lead to a lot more consumption. And because of the productivity gains from the end applications, again, whether it's self-driving cars, or coming up with vaccines, or solving complex problems, or just allowing consumers to have answers instead of 10 blue links, we need the cost to come down on all this stuff. And maybe this would be a good way to wrap. But if you bring that attitude to the table, I mean, it sounds like, and I don't want to put words in your mouth, but there's no reasonable end in sight from where you sit, like, on this wave.
We're at the very beginning, and it's going to go for a long time. I said last week, and I think we had a little video that went out, I said we are going to hit a zone of disillusionment. I don't know whether it's this quarter; maybe tomorrow with Nvidia is when it starts, right? We're going to hit a zone of disillusionment where you have a mismatch between supply and demand. And then all the skeptics are going to say, I told you so: the Internet's a fad, AI's a fad, mobile's a fad. It happens in every super cycle.
We're going to use that as a buying opportunity, because we are absolutely convinced that the runway is longer and wider, and the impact on humanity is going to be greater, because the end applications that are using AI to assist them in everything they do are so compelling. But that's really where the tug of war is in the world today, Bill. And that's what makes a market, right? You're going to have those people who are skeptics about that demand. That's what creates a wall of worry. You know, why does Nvidia trade at 20 to 30 times earnings, right? I would say because there are a lot of skeptics, and there's a lot of worry about whether or not this free cash flow is durable into the future, right?
I bet the worry is more on pricing than volume, right? And by the way, it's unknowable. Like, I don't know; nobody who follows this knows. So you have to assign some probability to that future outcome. But I think you're right. It is a good place to leave it.
I usually wish you could be here; I know you're anchored down there in Texas, speaking of Texas fabs. But this is fun, having you here in person. Good to see you. Like old times. And I think we're going to be talking about and debating this for as long as we're doing this pod, for sure. No doubt. No doubt. Good to see you.