Here’s what happened: Google dropped some fancy new compression technology called TurboQuant this week, and the market collectively freaked out. Memory stocks like Micron, SanDisk, Western Digital, and Seagate got absolutely hammered—we’re talking double-digit losses. The fear? That Google’s tech makes AI computing so efficient it’ll tank demand for memory chips. Sounds scary, right? Bank of America says everyone needs to take a breath.
The tech itself is genuinely impressive: TurboQuant can reportedly shrink the memory footprint of AI inference to as little as one-sixth of the original without a meaningful hit to accuracy. That’s legitimately cool. But here’s the thing: compression tech isn’t exactly new. Nvidia’s been doing similar stuff for the past year, and the underlying approach has been on everyone’s radar in the chip space. So why the panic?
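To see why a “six times” figure is plausible at all, it helps to look at the generic arithmetic behind weight compression. TurboQuant’s actual method isn’t detailed here, so the numbers below are purely illustrative: a hypothetical 70B-parameter model, with the compressed bits-per-weight chosen to match the claimed ratio.

```python
# Illustrative sketch: why lower precision per weight shrinks inference memory.
# These figures are assumptions for the example, not TurboQuant specifics.

def model_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight-storage memory (decimal GB) at a given precision."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

params = 70  # a 70B-parameter model, for illustration

fp16_gb = model_memory_gb(params, 16)        # half-precision baseline
compressed_gb = model_memory_gb(params, 2.67)  # ~2.67 bits/weight ≈ 6x smaller

print(f"FP16 weights:  ~{fp16_gb:.0f} GB")
print(f"Compressed:    ~{compressed_gb:.0f} GB "
      f"({fp16_gb / compressed_gb:.1f}x smaller)")
```

The punchline is that the savings come from bits per weight, not from fewer weights, which is exactly why this kind of compression has been a known quantity in the chip world rather than a bolt from the blue.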
Welcome to Wall Street’s greatest hits: fear beats facts, at least for a day or two.
BofA’s analysts made a solid point in their Friday note: “AI capex remains the ultimate proof point of AI spend/demand, not efficiency measures.” Translation: the real story isn’t whether AI gets more efficient—it’s how much money companies are actually throwing at it. And spoiler alert: they’re throwing *a lot*.
Think about it logically. Yes, compression tech makes AI run leaner. But companies aren’t going to suddenly stop building AI infrastructure because it’s more efficient. If anything, efficiency makes AI *more* accessible, which means *more* adoption, which means *more* demand for chips. Economists call this the Jevons paradox: when something gets cheaper to use, people tend to use more of it, not less. It’s like saying cars got more fuel-efficient, so nobody will buy gas anymore. That’s not how this works.
The numbers back this up. AI spending is projected to surpass $1 trillion by 2030, according to BofA’s “conservative” estimates. That’s not a typo. A *trillion* dollars. Even if compression tech cuts memory demand by 50%, you’re still talking about massive capex flowing into the chip ecosystem.
This whole thing actually reminds BofA of the DeepSeek panic from 2025, when everyone lost their minds over a Chinese AI model and forgot that fundamentals still matter. Spoiler: the market recovered, and the sky didn’t fall.
Here’s where it gets interesting: Micron, the poster child for this week’s memory stock meltdown, is trading at the low end of its historical valuation despite having more than *doubled* in the past six months. BofA’s price target implies over 35% upside from current levels. That’s not nothing.
The bank still ranks memory among its top chip subsectors—right behind AI computing, semiconductor capital equipment, and AI networking plays. They’re not exactly running for the exits.
Look, market panics are part of the game. Someone invents something cool, everyone assumes it kills an entire industry, and then reality sets in. Memory stocks got whacked this week, but the underlying thesis—that AI capex is going to be massive for years—hasn’t changed. If anything, more efficient AI infrastructure just means the party lasts longer.
Sometimes the best opportunities come right after everyone overreacts.