Scale your AI transformation with a powerful, secure, and adaptive cloud infrastructure



The foundation of Microsoft’s AI advancements is its infrastructure. It was custom designed and built from the ground up to power some of the world’s most widely used and demanding services. While generative AI is now transforming how businesses operate, we’ve been on this journey for over a decade, developing our infrastructure, designing our systems, and reimagining our approach from software to silicon. The end-to-end optimization that forms our systems approach gives organizations the agility to deploy AI capable of transforming their operations and industries.

From agile startups to multinational corporations, Microsoft’s infrastructure offers more choice in performance, power, and cost efficiency so that our customers can continue to innovate. At Microsoft Ignite, we’re introducing important updates across our entire cloud and AI infrastructure, from advancements in chips and liquid cooling, to new data integrations, and more flexible cloud deployments.

Unveiling the latest silicon updates across Azure infrastructure

As part of our systems approach to optimizing every layer of our infrastructure, we continue to combine the best of industry and innovate from our own unique perspectives. In addition to Azure Maia AI accelerators and Azure Cobalt central processing units (CPUs), Microsoft is expanding our custom silicon portfolio to further enhance our infrastructure and deliver more efficiency and security. Azure Integrated HSM (hardware security module) is our newest in-house security chip: a dedicated hardware security module that hardens key management so encryption and signing keys remain within the bounds of the HSM, without compromising performance or increasing latency. Azure Integrated HSM will be installed in every new server in Microsoft’s datacenters starting next year to increase protection across Azure’s hardware fleet for both confidential and general-purpose workloads.
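To illustrate the key-boundary principle in code, here is a minimal Python sketch using the existing Azure Key Vault Managed HSM SDK (azure-keyvault-keys): only a message digest is sent to the service, and the private key never leaves the HSM. This is an analogous, publicly available pattern, not the programming interface of the new Azure Integrated HSM chip; the vault URL and key name are placeholders.

# Minimal sketch: HSM-backed signing in which the private key never leaves
# the HSM boundary. Uses the existing Azure Key Vault Managed HSM SDK to
# illustrate the pattern; it is NOT the interface of Azure Integrated HSM.
import hashlib

from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient
from azure.keyvault.keys.crypto import CryptographyClient, SignatureAlgorithm

credential = DefaultAzureCredential()

# Hypothetical Managed HSM endpoint and key name.
key_client = KeyClient(
    vault_url="https://contoso-hsm.managedhsm.azure.net/",
    credential=credential,
)
key = key_client.get_key("signing-key")

# Only the digest travels to the service; signing happens inside the HSM.
crypto = CryptographyClient(key, credential=credential)
digest = hashlib.sha256(b"payload to sign").digest()
result = crypto.sign(SignatureAlgorithm.rs256, digest)
print(result.signature.hex())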

We are also introducing Azure Boost DPU, our first in-house DPU designed for data-centric workloads with high efficiency and low power, capable of absorbing multiple components of a traditional server into a single piece of dedicated silicon. We expect future DPU-equipped servers to run cloud storage workloads at three times less power and four times the performance compared to existing servers.

We also continue to advance our cooling technology for GPUs and AI accelerators with our next-generation liquid cooling “sidekick” rack (heat exchanger unit), supporting AI systems built on silicon from industry leaders as well as our own. The unit can be retrofitted into Azure datacenters to support cooling of large-scale AI systems, such as those from NVIDIA, including GB200, in our AI infrastructure.

Liquid cooling heat exchanger unit

In addition to cooling, we are optimizing how we deliver power more efficiently to meet the evolving demands of AI and hyperscale systems. We have collaborated with Meta on a new disaggregated power rack design, aimed at enhancing flexibility and scalability as we bring AI infrastructure into our existing datacenter footprint. Each disaggregated power rack will feature 400-volt DC power that enables up to 35% more AI accelerators in each server rack, allowing dynamic power adjustments to meet the different demands of AI workloads. We are open sourcing these cooling and power rack specifications through the Open Compute Project so that the industry can benefit.

Azure’s AI infrastructure builds on this innovation at the hardware and silicon layer to power some of the most groundbreaking AI advancements in the world, from revolutionary frontier models to large-scale generative inferencing. In October, we announced the launch of the ND H200 V5 Virtual Machine (VM) series, which utilizes NVIDIA’s H200 GPUs with enhanced memory bandwidth. Our continuous software optimization efforts across these VMs mean Azure delivers performance improvements generation over generation. Between NVIDIA H100 and H200 GPUs, that performance improvement rate was twice that of the industry, demonstrated across industry benchmarking.
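For readers planning deployments, here is a minimal Python sketch using the Azure Compute management SDK (azure-mgmt-compute) to check which H200-based ND-series sizes are offered in a region. The subscription and region values are placeholders, and matching on the “H200” substring is an assumption about the size naming; consult the official VM size documentation for exact names.

# Minimal sketch: list H200-based VM sizes offered in a region via the
# Azure Compute management SDK. Subscription and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"  # placeholder
REGION = "eastus2"                          # hypothetical region

client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Enumerate compute SKUs in the region and keep the H200-based ND series.
for sku in client.resource_skus.list(filter=f"location eq '{REGION}'"):
    if sku.resource_type == "virtualMachines" and "H200" in sku.name:
        restricted = any(r.reason_code for r in (sku.restrictions or []))
        print(f"{sku.name}: {'restricted' if restricted else 'available'}")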

We are also excited to announce that Microsoft is bringing the NVIDIA Blackwell platform to the cloud. We are beginning to bring these systems online in preview, co-validating and co-optimizing with NVIDIA and other AI leaders. Azure ND GB200 v6 will be a new AI-optimized Virtual Machines series that combines the NVIDIA GB200 NVL72 rack-scale design with state-of-the-art Quantum InfiniBand networking to connect tens of thousands of Blackwell GPUs and deliver AI supercomputing performance at scale.

We are also sharing today our latest advancement in CPU-based supercomputing, the Azure HBv5 virtual machine. Powered by custom AMD EPYC™ 9V64H processors available only on Azure, these VMs will be up to eight times faster than the latest bare-metal and cloud alternatives on a variety of HPC workloads, and up to 35 times faster than on-premises servers at the end of their lifecycle. These performance improvements are made possible by 7 TB/s of memory bandwidth from high bandwidth memory (HBM) and the most scalable AMD EPYC server platform to date. Customers can now sign up for the preview of HBv5 virtual machines, which will begin in 2025.

Accelerating AI innovation through cloud migration and modernization

To get the most from AI, organizations need to integrate data residing in their critical business applications. Migrating and modernizing these applications to the cloud helps enable that integration and paves the way to faster innovation while delivering improved performance and scalability. Choosing Azure means selecting a platform that natively supports all the mission-critical enterprise applications and data you need to fully leverage advanced technologies like AI. This includes your workloads on SAP, VMware, and Oracle, as well as open-source software and Linux.

For example, thousands of customers run their SAP ERP applications on Azure, and we are bringing unique innovation to these organizations, such as the integration between Microsoft Copilot and SAP’s AI assistant Joule. Companies like L’Oréal, Hilti, Unilever, and Zeiss have migrated their mission-critical SAP workloads to Azure so they can innovate faster. And since the launch of Azure VMware Solution, we’ve been working to support customers globally with geographic expansion. Azure VMware Solution is now available in 33 regions, with support for VMware VCF portable subscriptions.

We are also continually improving Oracle Database@Azure to better support the mission-critical Oracle workloads of our enterprise customers. Customers like The Craneware Group and Vodafone have adopted Oracle Database@Azure to benefit from its high performance and low latency, which allows them to focus on streamlining their operations and gain access to advanced security, data governance, and AI capabilities in the Microsoft Cloud. We’re announcing today that Microsoft Purview supports Oracle Database@Azure, providing comprehensive data governance and compliance capabilities that organizations can use to manage, secure, and track data across Oracle workloads.

Additionally, Oracle and Microsoft plan to provide Oracle Exadata Database Service on Exascale Infrastructure in Oracle Database@Azure for hyper-elastic scaling and pay-per-use economics. We’ve also expanded the availability of Oracle Database@Azure to a total of nine regions and enhanced Microsoft Fabric integration with Open Mirroring capabilities.

To make it easier to migrate and modernize your applications to the cloud, starting today you can assess your application’s readiness for Azure using Azure Migrate. The new application-aware method provides technical and business insights to help you migrate entire applications with all their dependencies as one.

Optimizing your operations with an adaptive cloud for business growth

Azure’s multicloud and hybrid approach, or adaptive cloud, integrates separate teams, distributed locations, and diverse systems into a single model for operations, security, applications, and data. This allows organizations to use cloud-native and AI technologies to work across hybrid, multicloud, edge, and IoT environments. Azure Arc plays an important role in this approach by extending Azure services to any infrastructure and supporting organizations in managing their workloads and operating across different environments. Azure Arc now has over 39,000 customers across every industry, including La Liga, Coles, and The World Bank.
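As a small illustration of that single control plane, the Python sketch below uses the Azure Hybrid Compute management SDK (azure-mgmt-hybridcompute) to enumerate Arc-enabled servers in a subscription, wherever they physically run. The subscription ID is a placeholder.

# Minimal sketch: list Arc-enabled servers registered in a subscription via
# the Azure Hybrid Compute management SDK, the control plane behind Azure Arc.
from azure.identity import DefaultAzureCredential
from azure.mgmt.hybridcompute import HybridComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"  # placeholder

client = HybridComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Each item is a machine connected through the Arc agent, regardless of
# whether it runs on-premises, at the edge, or in another cloud.
for machine in client.machines.list_by_subscription():
    print(machine.name, machine.location)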

We’re excited to introduce Azure Local, a new, cloud-connected, hybrid infrastructure offering provisioned and managed in Azure. Azure Local brings together Azure Stack capabilities into one unified platform. Powered by Azure Arc, Azure Local can run containers, servers, and Azure Virtual Desktop on Microsoft-validated hardware from Dell, HPE, Lenovo, and more. This unlocks new possibilities to meet custom latency, near-real-time data processing, and compliance requirements. Azure Local comes with enhanced default security settings to protect your data, and flexible configuration options, like GPU-enabled servers for AI inferencing.

We recently announced the general availability of Windows Server 2025, with new features that include easier upgrades, advanced security, and capabilities that enable AI and machine learning. Additionally, Windows Server 2025 is previewing a hotpatching subscription option enabled by Azure Arc that will let organizations install updates with fewer restarts, a major time saver.

We’re also announcing the preview of SQL Server 2025, an enterprise AI-ready database platform that leverages Azure Arc to deliver cloud agility anywhere. This new version continues its industry-leading security and performance and has AI built in, simplifying AI application development and retrieval-augmented generation (RAG) patterns with secure, performant, and easy-to-use vector support. With Azure Arc, SQL Server 2025 offers cloud capabilities to help customers better manage, secure, and govern their SQL estate at scale across on-premises and cloud.
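As a rough illustration of the vector support described above, the following Python sketch (via pyodbc) stores embeddings and runs a similarity query for a RAG-style lookup. It assumes the preview VECTOR data type and VECTOR_DISTANCE function announced for SQL Server 2025; the exact T-SQL syntax, connection details, and embedding dimension are assumptions and may differ in the released product.

# Minimal RAG-style sketch, assuming the preview VECTOR type and
# VECTOR_DISTANCE function described for SQL Server 2025; syntax may change.
import json
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
    "DATABASE=docs;Trusted_Connection=yes;TrustServerCertificate=yes"
)
cur = conn.cursor()

# A table of document chunks with an embedding column (dimension illustrative).
cur.execute("""
    CREATE TABLE dbo.chunks (
        id INT IDENTITY PRIMARY KEY,
        content NVARCHAR(MAX),
        embedding VECTOR(1536)
    )
""")
conn.commit()
# (Inserting embedded chunks from your embedding model is omitted here.)

# Retrieve the chunks closest to a query embedding by cosine distance.
query_embedding = json.dumps([0.0] * 1536)  # placeholder vector from your model
cur.execute("""
    SELECT TOP (5) content,
           VECTOR_DISTANCE('cosine', embedding, CAST(? AS VECTOR(1536))) AS dist
    FROM dbo.chunks
    ORDER BY dist
""", query_embedding)
for content, dist in cur.fetchall():
    print(dist, content[:80])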

Transform with Azure infrastructure to achieve cloud and AI success

Successful transformation with AI starts with a powerful, secure, and adaptive infrastructure strategy. And as you evolve, you need a cloud platform that adapts and scales with your needs. Azure is that platform, providing the optimal environment for integrating your applications and data so that you can start innovating with AI. As you design, deploy, and manage your environment and workloads on Azure, you have access to best practices and industry-leading technical guidance to help you accelerate your AI adoption and achieve your business goals.

Jumpstart your AI journey at Microsoft Ignite

Key sessions at Microsoft Ignite

Discover more announcements at Microsoft Ignite

Resources for AI transformation
