HPE's Securities Analyst Meeting 2025: Strategy, Signals, and Why It Matters
Hewlett Packard Enterprise (HPE) used its Securities Analyst Meeting (SAM) 2025, which PAC attended, to present a clear vision of the company’s goals for the next three years: becoming a networking-driven, AI- and cloud-focused business with tighter cost controls and increased cash returns. Management’s strategy is built on three pillars: winning in the market (leading in AI-era networking, scaling profitable AI infrastructure, and expanding software-led hybrid cloud), driving operational leverage (about $1 billion in structural savings), and increasing capital returns (free cash flow exceeding $3.5 billion and a higher dividend), all while reducing leverage to roughly two times net debt to EBITDA by fiscal year 2027.
The financial outlook is clear. For fiscal year 2026, HPE forecasts non-GAAP diluted net earnings per share (EPS) of $2.20–$2.40, free cash flow (FCF) of $1.5–$2.0 billion, networking margins in the low 20s percent, and Cloud & AI margins of 7–9 percent. By fiscal year 2028, the company expects at least $3 in non-GAAP EPS, over $3.5 billion in FCF, networking margins of 25–28 percent, and Cloud & AI margins of 8–10 percent.
The portfolio HPE aims to develop
Networking is at the core. After integrating Juniper, HPE argues that networking becomes mission-critical in the AI era: fabrics that move east-west traffic efficiently, routing that scales with new data-center topologies, and security that merges with the network itself. The company’s vision combines “AI for networks” (operations powered by analytics and automation) with “networks for AI” (data-center and campus/branch networks designed for AI workloads). The aim is to increase networking’s share from about half of HPE’s non-GAAP operating profit in Q3 FY25 to nearly 60 percent by FY28.
Cloud & AI focuses on profitable growth, where HPE believes it has a structural advantage. That begins with Private Cloud AI (PCAI), HPE’s “AI factory” bundle, and expands to customized AI factories for sovereign and enterprise customers. The company states that sovereign and enterprise customers already account for more than half of cumulative AI orders since Q1 FY23, and that “sovereign” AI orders grew 250 percent sequentially from Q2 to Q3 FY25. It also involves expanding HPE GreenLake (the company’s hybrid cloud platform) for observability, data protection, orchestration, and multi-vendor virtualization; and emphasizing HPE-owned storage IP, especially the Alletra MP platform. HPE ties these initiatives to long-term demand trends: AI infrastructure nearly doubling from 2025 to 2028, and a hybrid cloud market that favors operational consistency across public and private environments.
The cost and capital playbook
Two programs underpin the targeted margin expansion and cash-flow improvement. First, at least $600 million in annualized synergies from the Juniper integration by FY28; second, at least $350 million of “Catalyst” savings by FY27, covering stock-keeping-unit (SKU) simplification, vendor consolidation, workforce alignment, and operational excellence. The company expects these savings to flow through to improved non-GAAP operating profit and higher free cash flow, which in turn fund dividend increases (a proposed 10 percent rise to $0.57 per share for FY26) and more share repurchases, while net leverage falls to around 2× by FY27.
What it means for the IT market
HPE’s thesis is that the bottleneck for AI is shifting from compute to the network. If this is true (and in PAC’s opinion, it is a valid assumption), buyers will place a higher value on Ethernet fabric design, routing, network-native security, and the operational layer (Artificial Intelligence for IT Operations, or AIOps) that keeps these environments stable. Expect more opinionated “full-stack” offerings combining networking, security, and operations, along with a shift in enterprise architectures toward AI-ready data-center networks featuring liquid cooling and automated management.
At the same time, “sovereign AI” deployments aligned with country-specific policies, locations, and supply-chain rules are becoming mainstream in regulated and public-sector contexts. HPE is heavily investing in this area with PCAI and larger AI factories, arguing that hybrid models will dominate: public cloud for elasticity and ecosystem support, and private AI for data control, predictable costs, and compliance. Storage software is regaining importance as data grows and training pipelines require both throughput and efficiency, a dynamic HPE aims to capture with its own IP in Alletra MP. Additionally, as many customers reassess their virtualization stacks, HPE is positioning GreenLake to serve as the control plane across multi-vendor, multi-cloud environments.
What it means for HPE, division by division
Networking is the key focus. The combined HPE-Juniper portfolio aims to lead in campus and branch networking, routing (including data-center interconnects), data-center fabrics for AI, and converged security, with momentum the company expects to build through 2028. The margin target is clear (25–28 percent by FY28), and the value-creation plan relies on cross-selling across a base of more than 47,000 partners, engineering integration, and capturing synergies. If achieved, networking will account for the majority of HPE’s operating profit by the end of the plan.
Cloud & AI pulls together four levers. First, servers: guiding customers to next-generation platforms that lower power consumption and space needs while enhancing security, with more attached software and services to lift margins. Second, storage: unifying solutions around HPE’s software and systems (notably Alletra MP) to improve performance and reduce total cost of ownership. Third, GreenLake: expanding cloud management, orchestration, observability, backup, disaster recovery, and runtime software to deliver unified operations across public and private environments, including for customers exploring virtualization alternatives. Fourth, AI infrastructure: scaling PCAI and custom “AI factories,” especially for sovereign and enterprise accounts where HPE’s high-performance computing (HPC) expertise and liquid-cooling experience set it apart.
Corporate Investments & Other houses non-core items and provides clearer disclosure around the two growth engines. In parallel, HPE Financial Services supports adoption with financing and lifecycle services, a lever the company highlights as a way to accelerate enterprise cloud and AI deployments.
What it means for HPE customers
For enterprise and public-sector buyers, the key takeaway is a more integrated and prescriptive HPE. The company will increasingly offer “opinionated” stack options that combine networking, compute, storage, security, and operations into a cohesive lifecycle under HPE GreenLake. This can reduce time-to-value and operational overhead, especially with AI-assisted operations and unified observability, but it also steers customers toward HPE’s reference architectures and fabric designs. Organizations with data-sovereignty or regulatory constraints should find a faster path to private AI through PCAI and related services, supported by financing and asset-lifecycle options that ease capital costs. As HPE executes its synergy and Catalyst programs, customers can expect fewer overlapping SKUs, more consistent support models, and potential improvements in price-performance for standardized builds, including liquid-cooled options, across edge and core data centers.
Broader industry impact and execution watch-outs
If HPE’s thesis holds, the industry’s focus will shift further toward AI-driven networking: Ethernet fabrics optimized for AI traffic, routing designed for new interconnect patterns, and security that is inherently integrated into the network. This puts pressure on competitors to strengthen security and AIOps and to deliver comprehensive “AI fabrics,” not just individual products. It also supports a hybrid approach to AI, combining hyperscale and private environments, rather than a one-directional move back to private data centers or an exclusive reliance on the public cloud. Finally, it underscores the ongoing competition in storage software around throughput, data reduction, cyber-resilience, and integration with private cloud systems.
The risks are typical for plans of this size. The $600 million of annualized Juniper synergies and $350 million of Catalyst savings are key to the margin and FCF results; integration across the portfolio, channel, and supply chain will determine how much of that value materializes on time. The capital plan also assumes disciplined working capital and steady demand as HPE simultaneously raises dividends and accelerates buybacks while bringing leverage back to approximately 2×. And the competitive landscape in enterprise AI remains crowded, from hyperscalers and chip makers to storage specialists, so execution on openness, pricing, and time-to-value will be crucial.
Bottom line
HPE is repositioning itself around a straightforward narrative: lead with networking in the AI era, scale profitable AI infrastructure, and grow a software-led hybrid cloud business, then turn that operational leverage into higher and more predictable cash returns. The plan aligns with where many large buyers are heading: hybrid operations, private AI for sensitive data, and networks that both serve as the platform for AI and are increasingly operated by AI themselves. If HPE executes its integration and cost-reduction programs while keeping its stacks open enough to accommodate the multi-vendor reality of modern IT, it can play a larger, more profitable role in building and managing AI-era data centers and hit the EPS and FCF targets set for FY28.