SanDisk Corp. (SNDK.US), a global leader in SSD storage products, extended the exceptionally strong rally it has staged since 2025, closing Monday's session up more than 8%. Since the beginning of 2026, the stock has surged over 350%, significantly outperforming global equity benchmarks, and its total market capitalization is approaching $160 billion; over the whole of 2025 it had already gained a staggering 580%. Driven by seemingly endless demand for storage chips fueled by the AI infrastructure boom, SanDisk continues an epic rally in the global storage chip sector, solidifying its status as an undeniable AI computing infrastructure "super stock" in global markets.
SanDisk is scheduled to report its fiscal year 2026 third-quarter financial results after the market closes on Thursday, April 30, Eastern Time. Just ahead of the highly anticipated earnings release, several Wall Street financial giants, including Morgan Stanley, have raised their latest target prices and earnings per share (EPS) estimates for SanDisk, highlighting widespread market optimism that the company's results will significantly exceed expectations and drive the stock to new highs.
An analyst team at Morgan Stanley, led by veteran analyst Joseph Moore, stated in a client report on Monday: "Third-party forecasts currently indicate that overall NAND average selling prices (ASP) will rise sharply by approximately 90% in the first quarter, with an expected increase of 70% to 75% in the second quarter. Enterprise SSDs for large data center customers are likely to show relatively stronger performance—a segment where SanDisk holds particular strength compared to NAND storage peers." Furthermore, Morgan Stanley raised its price target for SanDisk significantly from $690 to $1,100, maintaining its "Overweight" rating.
The analysts, including Moore, noted: "We remain very positive on the stock, but the market has fully priced in its recent strength; conviction in how long that strength will last may take more time to build. For us, DRAM remains the more severe bottleneck for AI growth, but NAND will follow closely. Longer-term enterprise data center prepayments should themselves tell the story. Moreover, NAND capital expenditure remains lower than DRAM's, so supply expansion is not catching up with demand."
Regarding earnings per share, Morgan Stanley substantially raised its fiscal 2026 EPS estimate for SanDisk from $41.09 to $53.24. It also increased its fiscal 2027 EPS estimate from $82.73 to $155.06 and raised its fiscal 2028 EPS estimate from $91.92 to $134.13.
For the fiscal 2026 third quarter, the consensus expectation is for SanDisk to report adjusted EPS of approximately $14.55, with GAAP EPS expected to be $13.82. Total revenue for the quarter is projected to reach as high as $4.72 billion, compared to only about $1.7 billion in the same period last year.
Another prominent Wall Street firm, the analyst team at Melius Research led by star analyst Ben Reitzes, initiated coverage on SanDisk with a "Buy" rating and a high target price of $1,350. As of Monday's close, SanDisk's stock price was near $1,070. The analysts at Melius believe the artificial intelligence boom will drive sustained growth in storage demand through the end of this decade (i.e., 2030), and that U.S. storage chip leaders Micron (MU.US) and SanDisk (SNDK.US) both have further upside potential.
According to data from market research firm Counterpoint Research, the storage market has entered a "super bull market" or "super-cycle" phase, with current supply-demand dynamics and pricing conditions far surpassing the previous peak seen during the 2018 cloud computing boom. Since the start of 2026, DRAM/NAND chip prices have continued their rapid ascent. The latest memory price survey from research firm TrendForce estimates that overall Conventional DRAM contract prices will rise by a significant 58%-63% quarter-over-quarter in Q2 2026, on top of an expected Q1 increase of 93%-98%. The NAND Flash market continues to be dominated by AI training/inference and broad-based data center demand, with chain-reaction price increases persisting across all product lines; overall Q2 contract prices are projected to rise sharply by 70%-75% on top of the nearly 100% increase seen in Q1.
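As a rough illustration of how these back-to-back quarterly increases compound, the sketch below uses the midpoints of the cited TrendForce ranges (the midpoints themselves are an assumption for this example, not reported figures):

```python
# Compounding of consecutive quarter-over-quarter contract price increases,
# using midpoints of the TrendForce ranges cited above.
# The midpoint values are illustrative assumptions, not reported data points.

q1_dram = 0.955   # midpoint of the 93%-98% Q1 Conventional DRAM increase
q2_dram = 0.605   # midpoint of the 58%-63% Q2 increase

# A contract price indexed at 1.00 entering 2026 compounds through both quarters.
dram_index = 1.0 * (1 + q1_dram) * (1 + q2_dram)
print(f"DRAM contract price index after Q2: {dram_index:.2f}")
print(f"Cumulative H1 2026 increase: {dram_index - 1:.0%}")

q1_nand = 1.00    # the "nearly 100%" Q1 NAND increase
q2_nand = 0.725   # midpoint of the 70%-75% Q2 range
nand_index = 1.0 * (1 + q1_nand) * (1 + q2_nand)
print(f"NAND contract price index after Q2: {nand_index:.2f}")
```

Under these assumptions, a DRAM contract price would more than triple over the first half of 2026, which is the scale of move behind the "super-cycle" label.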
With the concentrated emergence in 2026 of super AI agent tools capable of autonomous task execution, such as Anthropic's Claude Cowork and OpenClaw, this wave of AI Agents has rapidly swept across the globe. The bottleneck in AI computing architecture is fundamentally shifting from GPUs, centered on matrix multiplication throughput, to data center CPUs, which focus on control flow, task orchestration, and memory/IO coordination. High-performance CPUs for hyperscale AI data centers are facing severe supply shortages.
In the view of financial giants like Morgan Stanley, the core narrative of AI computing investment is transitioning from "competition focused on single-point AI GPU/ASIC computing power" to "full-stack AI systems driven by AI Agents." In this shift of the AI narrative, data center CPUs and storage chips are likely to be the biggest winners.
With the South Korean benchmark KOSPI index, heavily weighted toward Samsung and SK Hynix, hitting record highs despite worsening geopolitical tensions; the Taiwanese stock market, led by heavyweight TSMC, a major beneficiary of the AI boom, also reaching new peaks; and the Philadelphia Semiconductor Index posting a record 17-day winning streak, investors are increasingly convinced that the "AI computing investment theme" can overpower other market noise.
Whether it's Google's massive TPU AI computing clusters or vast clusters of Nvidia AI GPUs, all rely on comprehensively integrated HBM memory systems paired with AI chips. Beyond HBM, tech giants like Google and OpenAI are accelerating the construction or expansion of AI data centers, necessitating large-scale purchases of server-grade DDR5 memory and enterprise-grade high-performance SSD/HDD storage solutions.
From a fundamental hardware theory perspective, AI computing is inherently limited not only by computing power but also by "data movement capability." Whether for Nvidia GPUs or TPU systems, what truly determines the efficiency of large model training and inference is not just the number of Tensor Cores/matrix units, but the bandwidth available per second to feed weights, KV cache, activation values, and intermediate tensors into the computing cores.
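This "data movement" constraint can be made concrete with a back-of-the-envelope roofline check: a chip is bandwidth-bound whenever the workload's arithmetic intensity (FLOPs per byte moved) falls below the chip's ratio of peak compute to memory bandwidth. The numbers below are hypothetical, chosen only to illustrate why memory bandwidth rather than raw compute often caps large-model inference throughput:

```python
# Back-of-the-envelope roofline check: is a hypothetical accelerator
# compute-bound or bandwidth-bound for a given workload?
# All figures are illustrative assumptions, not specs of any real chip.

peak_tflops = 1000.0      # hypothetical peak compute, TFLOP/s
hbm_bandwidth_tbs = 4.0   # hypothetical HBM bandwidth, TB/s

# Machine balance point: FLOPs the chip can execute per byte moved from memory.
balance = (peak_tflops * 1e12) / (hbm_bandwidth_tbs * 1e12)  # FLOPs per byte

# Autoregressive decoding streams each weight roughly once per generated token,
# so its arithmetic intensity is low: ~2 FLOPs (multiply+add) per 2-byte weight.
decode_intensity = 2 / 2  # ~1 FLOP per byte

if decode_intensity < balance:
    # Throughput is capped by how fast weights stream from HBM, not by compute.
    achievable_tflops = decode_intensity * hbm_bandwidth_tbs
    print(f"Bandwidth-bound: ~{achievable_tflops:.0f} of {peak_tflops:.0f} "
          f"TFLOP/s usable ({achievable_tflops / peak_tflops:.1%})")
```

Under these assumed figures, less than 1% of peak compute is usable during decoding, which is why adding memory bandwidth and capacity, not just more matrix units, raises real-world AI throughput.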
From a cross-analytical perspective of semiconductors and AI data center infrastructure, DRAM/NAND storage chips are "perfectly positioned" within the AI wave because they benefit from both the training expansion and inference expansion trends. Furthermore, they act as a "universal toll gate" that is cross-platform, cross-architecture, and cross-ecosystem. As the AI era shifts from being training-dominated to being dominated by inference, agents, long context, and retrieval-augmented generation, system demands for capacity, bandwidth, power efficiency, and data persistence layers will only intensify.
Morgan Stanley's predictive data indicates that by 2030, an additional 15 to 45 exabytes (EB) of DRAM storage chip demand will be generated, equivalent to 26% to 77% of the entire industry's annual supply volume in 2027. The Morgan Stanley analyst team emphasized that storage chips are becoming one of the most "sustainably monetizable" layers within the AI computing infrastructure system. Whether host-level DRAM, memory interface chips, or CXL expansion and tiered storage architectures, all will become important carriers of long-term value. Storage chips are no longer just capacity configuration options but core components that directly determine the efficiency and throughput of AI system workloads.
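The two endpoints of Morgan Stanley's range are internally consistent; treating the cited figures as the only inputs, both back out roughly the same implied 2027 industry supply:

```python
# Sanity check on the Morgan Stanley figures cited above: incremental DRAM
# demand of 15-45 EB by 2030 is said to equal 26%-77% of 2027 annual supply.
# Backing out the implied supply from each endpoint should give similar values.

low_demand_eb, low_share = 15.0, 0.26
high_demand_eb, high_share = 45.0, 0.77

implied_supply_low = low_demand_eb / low_share     # ~57.7 EB
implied_supply_high = high_demand_eb / high_share  # ~58.4 EB

print(f"Implied 2027 supply (low endpoint):  {implied_supply_low:.1f} EB")
print(f"Implied 2027 supply (high endpoint): {implied_supply_high:.1f} EB")
```

Both endpoints imply an industry-wide 2027 DRAM supply of roughly 58 EB, so the forecast hinges on how much incremental AI demand materializes, not on inconsistent baselines.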
No matter how powerful GPU/TPU computing power is, without HBM providing the bandwidth for data feeding, and without enterprise NAND and high-capacity HDDs handling training checkpoints, vector databases, and inference data lakes, the utilization rate and AI workload efficiency of the entire AI computing infrastructure cluster cannot be maximized. Global capital markets are therefore willing to assign higher valuations to the storage chain because it benefits from a triple leverage effect of "volume growth + price increases + long-term supply constraints," rather than just single-factor shipment growth.
Wall Street's current core forecast regarding storage chip supply-demand dynamics is that the DRAM/NAND supply-demand mismatch could persist until around 2028, and the market is still underestimating the profit growth trajectory of storage chip manufacturers in this cycle. Precisely because of this, SanDisk's stock price has risen substantially since the start of 2026 and has undergone extreme revaluation over the past year. This is not merely speculative sentiment but essentially the market re-rating the "AI-driven pricing power of storage."
Storage chips are being re-priced by the market from traditional "strong-cycle commodities" to core bottleneck assets within AI infrastructure. SK Hynix's latest earnings report serves as a sample of this super-cycle: Q1 revenue was approximately 52.6 trillion Korean won—a year-over-year increase of 198% and a quarter-over-quarter increase of about 60%. Operating profit was about 37.6 trillion won—a year-over-year surge of 405% and a quarter-over-quarter increase of about 96%, marking the company's first time breaking the 50 trillion won quarterly revenue milestone, with an operating profit margin as high as 72%. Management also emphasized that AI-related HBM demand continues to significantly outstrip capacity. Samsung Electronics forecast Q1 operating profit potentially reaching 57.2 trillion won, implying roughly an eight-fold year-over-year increase, with the core driver similarly being the AI data center construction boom's pull on DRAM/HBM/NAND.
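The reported SK Hynix figures hang together arithmetically; a quick cross-check (won amounts as cited, everything else derived) recovers the stated margin and the implied prior-period revenue:

```python
# Cross-check of the SK Hynix quarterly figures cited above
# (amounts in trillions of Korean won, as reported in the text).

revenue_q1 = 52.6
op_profit_q1 = 37.6

# Operating margin, close to the ~72% the report cites.
margin = op_profit_q1 / revenue_q1
print(f"Operating margin: {margin:.1%}")

# Implied year-ago and prior-quarter revenue from the stated growth rates.
revenue_year_ago = revenue_q1 / (1 + 1.98)  # +198% YoY
revenue_prior_q = revenue_q1 / (1 + 0.60)   # +60% QoQ
print(f"Implied year-ago revenue:      {revenue_year_ago:.1f}T won")
print(f"Implied prior-quarter revenue: {revenue_prior_q:.1f}T won")
```

The implied year-ago base of under 18 trillion won shows how abruptly the cycle turned: revenue roughly tripled in four quarters.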
In other words, AI servers don't just need GPUs; they also need HBM for high-bandwidth proximal memory, DDR5/LPDDR5 for system memory, and NAND/eSSD for data lakes and inference caching. Storage has transformed from a "supporting component" into the throughput bottleneck of the AI factory.
SanDisk's position corresponds to the high-beta revaluation of NAND/SSDs. Morgan Stanley raised its price target for SanDisk from $690 to $1,100 and significantly increased its EPS forecasts for 2026-2028, citing the core logic of continued NAND ASP increases, strong AI and enterprise SSD demand, and still-restrained NAND capital expenditure, making it difficult for supply to catch up with demand quickly. Melius went further, assigning SanDisk a bull-case target of $1,350 and Micron a target of $700, arguing that AI is transforming storage demand from short-cycle PC/phone drivers into long-term infrastructure demand driven by cloud providers, AI servers, agents, and physical AI.
The market's willingness to assign higher valuations to storage chip giants like SanDisk and Micron essentially stems from the belief that these companies can transition completely from "cyclical manufacturers" to "AI infrastructure suppliers" characterized by long-term supply agreements, prepayments, and highly predictable cash flows.
More interestingly, the market's investment logic is expanding from a pure "HBM premium" to include shortages across traditional DRAM/NAND supply. Profit margins for traditional DRAM are even beginning to outpace those for HBM. As prices for DDR5, LPDDR5, enterprise SSDs, and NAND continue to soar, storage chip manufacturers may not be willing to allocate all incremental capital expenditure solely to HBM, which involves extremely complex advanced packaging, huge costs, and yield challenges, but will instead reassess capital returns across HBM, traditional DRAM, NAND, and advanced packaging.
In other words, HBM remains the star asset for AI training/massive inference workloads, but the greater upside surprise currently comes from the fact that the "entire storage pool is facing shortages," especially as high-performance DRAM and eSSDs are also being used extensively for AI inference workloads. The demand and price increase potential for these components may be far from over.
Large cloud computing providers, major AI ASIC customers, and server manufacturers are indeed pushing for multi-year agreements to secure large-scale supplies of HBM, DDR5, and eSSD for the coming years. However, in an environment where spot and contract prices are rising significantly each quarter and supply is likely to remain tight until 2028 or even 2030, storage chip manufacturers have greater incentive to retain quarterly pricing power or demand prepayments, price reset clauses, and higher guaranteed returns.
The ultimate outcome may be that the bull market in the storage sector is no longer just about "Nvidia igniting HBM" but has evolved into a full-stack storage super-cycle lasting until around 2030, driven by sustained expansion in AI capital expenditure, limited wafer capacity expansion, bottlenecks in advanced packaging, inventory building by cloud providers, and rising prices for traditional storage.