AWS Summit London: Inference is the new battleground for cloud AI dominance

Attending a massive event like the AWS Summit London leaves you with many impressions and pieces of information to digest. Perhaps the most surprising thing was the absence of the usual fanfare of launch announcements and the excitement that comes with them. As a result, the learnings were less about what was announced and more about how things were said, or what was left unsaid.

Let’s start with the scale. The packed halls of London’s ExCeL convention centre seemed to corroborate AWS’s claim that more than 20,000 people attended the event; for a London tech event, it rarely gets bigger than that. Unsurprisingly, AWS had to cater to a diverse set of executives and interests, which helps explain its choice of content. The sessions appeared to focus on two clusters. First, emphasizing the acceleration of the transformation journey through Generative and Agentic AI without getting lost in technical details. One AWS executive summed it up: scale is about the consistent application of AI that gives an organization operational efficiency, and it is here that differentiation and value creation kick in. Second, and perhaps more surprisingly, elevating partner ecosystems. Equally surprising was that no customer appeared on the main stage to talk about the pivot toward scaling Generative and Agentic AI. The discussions at the many booths made up for this, for instance TCS showcasing how Generative AI is turbocharging mainframe modernization. At the same time, the acceleration of Agentic AI is opening a much broader set of buying centers for AWS; executives asserted that Line of Business decision-makers are already sitting at the table when AI workloads are discussed.

Inference is the next frontier

The main keynote was delivered by Alison Kay, VP of UK and Ireland, and David Brown, VP of AWS Compute, who positioned the AWS portfolio as an expansive set of building blocks for customers and partners to assemble. Translated into analyst speak, the strategic priority is model choice. The connective tissue for those building blocks is security, ranging from silicon to end users and connectivity, evidenced by AWS’s fiber network, supposedly the largest globally. Brown’s pitch centered on positioning inference as the third major pillar for AWS after compute and storage. To that end, he focused on AI governance, pointing to the guardrails embedded in Amazon Bedrock that have been significantly enhanced by automated reasoning, which is meant to help block hallucinations: automated reasoning systems can verify the logical consistency of statements generated by AI. AWS claims to have pioneered this capability; currently in private preview, it is expected to reach general availability over the coming months.

The client journey of the London Stock Exchange Group (LSEG) reinforced the points Brown was trying to get across. LSEG embarked on a multi-cloud journey, leveraging many of the building blocks he had called out. Beyond the more obvious ones, it highlighted its use of Outposts, a service that brings AWS infrastructure and services to on-premises data centers and edge locations, for inference. This provided a much-needed nuance on the interplay of cloud, data, and AI, rather than the aspirations around Generative and Agentic AI that supply-side marketing tends to throw around. This is not just semantics: it signals AWS’s evolving ambition to be the fabric that enterprises stitch together for AI outcomes. The focus on inference also reflects the next chapter of GenAI: not experimentation, but operationalization.
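To make the idea of automated reasoning concrete: such systems check whether a generated statement can be true given a set of known facts or policies. The sketch below is a deliberately tiny, brute-force propositional consistency check in Python; it illustrates the principle only and is not how Amazon Bedrock's guardrails are actually implemented (production systems use SAT/SMT-style solvers, and the refund scenario here is an invented example).

```python
from itertools import product

def consistent(facts, claim, variables):
    """Return True if some truth assignment satisfies all facts and the claim.

    facts and claim are predicates over an assignment dict of booleans.
    Brute force is exponential in len(variables); real automated-reasoning
    engines rely on SAT/SMT solvers instead.
    """
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(f(assignment) for f in facts) and claim(assignment):
            return True
    return False

# Hypothetical policy facts: a refund requires a receipt; no receipt is on file.
facts = [
    lambda a: (not a["refund"]) or a["receipt"],  # refund implies receipt
    lambda a: not a["receipt"],                   # no receipt on file
]
# A hallucinated model claim: "the customer is eligible for a refund".
claim = lambda a: a["refund"]

print(consistent(facts, claim, ["refund", "receipt"]))  # False: the claim contradicts the facts
```

A guardrail built on this principle would block or flag the model's answer whenever no assignment makes the claim compatible with the policy, rather than relying on the model to police itself.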

Sovereign Cloud takes center stage

As is so often the case, the more nuanced insights and narratives came through focused discussions with executives rather than the plenary sessions. For example, executives suggested thinking about governance and the business case for AI in more holistic ways: organizations should plan for failure at the GPU level, including cluster management that allows them to roll back deployments. On the topic of Sovereign Cloud, which gained new prominence amid the tariff-war discussions, executives pointed to the long-term nature of the negotiations between compliance bodies and the hyperscalers; it is difficult to chase the shadows of political events. In the end, the fundamental question providers like AWS must answer is how to remain compliant while preserving customer value. Even though concerns about geopolitics were palpable, those long-term considerations seem to drive AWS’s decision-making. These comments came on the back of the announcement to invest €7.8 billion in the AWS European Sovereign Cloud, which plans to launch its first AWS Region in the State of Brandenburg, Germany, by the end of 2025, available to all customers.

AWS’ trinity of modularity, governance, and cost-aware scale

In summary, the at-times implicit motivation for presenting such disparate pieces of information appeared to lie in a modular rather than hierarchical view of enterprises, with AWS reinforcing its leading position in cloud by double-clicking on model choice and competitive pricing. At the same time, having launched the Amazon Nova foundation models, AWS has more control over its stack and is thus in a better position to capture value from deployments; suffice it to say, that is the part that was not said. AWS is subtly recasting itself not as the flashiest AI innovator, but as the most reliable and configurable one. By reinforcing modularity, governance, and cost-aware scale, it is making a clear play to be the enterprise AI backbone.
