How hollow core fiber is accelerating AI  


This blog is part of the ‘Infrastructure for the era of AI’ series, which focuses on emerging technology and trends in large-scale computing. This post dives deeper into one of our newest technologies, hollow core fiber (HCF).

AI is at the forefront of people’s minds, and innovations are happening at lightning speed. But to continue the pace of AI innovation, companies need the right infrastructure for the compute-intensive AI workloads they are trying to run. This is what we call ‘purpose-built infrastructure’ for AI, and it’s a commitment Microsoft has made to its customers. This commitment doesn’t just mean taking hardware developed by partners and placing it in its datacenters; Microsoft is dedicated to working with partners, and occasionally on its own, to create the newest and greatest technology to power scientific breakthroughs and AI solutions.


Infrastructure for the era of AI

Explore how you can integrate into the world of AI

One of the technologies highlighted at Microsoft Ignite in November was hollow core fiber (HCF), an innovative optical fiber that is set to optimize Microsoft Azure’s global cloud infrastructure, offering superior network quality, improved latency, and secure data transmission.

Transmission by air 

HCF technology was developed to meet the heavy demands of workloads like AI and to improve global latency and connectivity. It uses a proprietary design in which light propagates through an air core, which has significant advantages over traditional fiber built with a solid glass core. An interesting detail here is that the HCF structure has nested tubes, which help reduce unwanted light leakage and keep the light travelling in a straight path through the core.


Because light travels faster through air than through glass, HCF is 47% faster than standard silica glass, delivering increased overall speed and lower latency. It also has higher bandwidth per fiber, but what is the difference between speed, latency, and bandwidth? While speed is how quickly data travels over the fiber medium, network latency is the amount of time it takes for data to travel between two end points across the network. The lower the latency, the faster the response time. Bandwidth, meanwhile, is the amount of data that can be sent and received over the network. Imagine two vehicles travelling from point A to point B, setting off at the same time. The first vehicle is a car (representing single mode fiber (SMF)) and the second is a van (HCF). Both vehicles are carrying passengers (the data); the car can take four passengers, whereas the van can take sixteen. The vehicles reach different speeds, with the van travelling faster than the car. This means the van takes less time to travel to point B, arriving at its destination first (demonstrating lower latency).
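To put rough numbers on that speed difference, here is a minimal Python sketch estimating one-way propagation delay over a given distance. The group indices used (about 1.47 for silica SMF, about 1.0 for an air core) are illustrative assumptions; the ratio between them is where the roughly 47% figure comes from.

```python
# Rough propagation-delay comparison between solid-core SMF and hollow core fiber (HCF).
# Assumed values: group index ~1.47 for silica SMF, ~1.0 for an air core (illustrative only).

C_VACUUM_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_us(distance_km: float, group_index: float) -> float:
    """Return one-way propagation delay in microseconds over distance_km."""
    speed_km_per_s = C_VACUUM_KM_PER_S / group_index
    return distance_km / speed_km_per_s * 1e6

if __name__ == "__main__":
    for distance in (1, 100, 1000):
        smf = one_way_delay_us(distance, 1.47)  # traditional single mode fiber
        hcf = one_way_delay_us(distance, 1.00)  # hollow (air) core fiber
        print(f"{distance:>5} km  SMF: {smf:8.1f} us   HCF: {hcf:8.1f} us   "
              f"(SMF delay {(smf / hcf - 1) * 100:.0f}% longer)")
```

Over a 1,000 km route, that refractive-index gap alone works out to well over a millisecond of one-way delay saved, before any other effects are considered.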

For over half a century, the industry has been dedicated to making steady, yet small, advancements in silica fiber technology. Despite the progress, the gains have been modest due to the limitations of silica loss. A significant milestone for HCF technology was reached in early 2024, attaining the lowest optical fiber loss (attenuation) ever recorded at a 1550nm wavelength, even lower than pure silica core single mode fiber (SMF).1 Along with low attenuation, HCF offers higher launch power handling, broader spectral bandwidth, and improved signal integrity and data security compared to SMF.
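To give a feel for what an attenuation figure means in practice, the sketch below converts a loss in dB/km into the fraction of launch power remaining after a span. The 0.11 dB/km value is the record result cited in the sources; the roughly 0.17 dB/km figure for conventional low-loss SMF and the 80 km span length are assumptions for illustration.

```python
# Convert fiber attenuation (dB/km) into remaining optical power after a span.
# 0.11 dB/km comes from the record HCF result cited in the sources; the
# ~0.17 dB/km value for conventional low-loss SMF is an illustrative assumption.

def remaining_power_fraction(loss_db_per_km: float, distance_km: float) -> float:
    """Fraction of launch power left after distance_km of fiber."""
    total_loss_db = loss_db_per_km * distance_km
    return 10 ** (-total_loss_db / 10)

if __name__ == "__main__":
    span_km = 80  # a typical amplifier spacing, assumed for illustration
    for label, loss in (("HCF (record)", 0.11), ("low-loss SMF", 0.17)):
        frac = remaining_power_fraction(loss, span_km)
        print(f"{label:>14}: {frac * 100:5.1f}% of launch power left after {span_km} km")
```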

The need for speed

Imagine you’re playing an online video game. The game requires quick reactions and split-second decisions. If you have a high-speed connection with low latency, your actions in the game are transmitted quickly to the game server and to your friends, allowing you to react in real time and enjoy a smooth gaming experience. On the other hand, if you have a slow connection with high latency, there will be a delay between your actions and what happens in the game, making it hard to keep up with the fast-paced gameplay. Whether you’re missing key reaction windows or falling behind others, lag is highly annoying and can seriously disrupt gameplay. Similarly, for AI models, lower latency and high-speed connections help the models process data and make decisions faster, improving their performance.

Reducing latency for AI workloads

So how can HCF help the performance of AI infrastructure? AI workloads are tasks that involve processing large amounts of data using machine learning algorithms and neural networks. These tasks range from image recognition and natural language processing to computer vision, speech synthesis, and more. AI workloads require fast networking and low latency because they often involve multiple steps of data processing, such as data ingestion, preprocessing, training, inference, and evaluation. Each step can involve sending and receiving data from different sources, such as cloud servers, edge devices, or other nodes in a distributed system. The speed and quality of the network connection affect how quickly and accurately the data can be transferred and processed. If the network is slow or unreliable, it can cause delays, errors, or failures in the AI workflow, resulting in poor performance, wasted resources, or inaccurate outcomes. These models often need immense amounts of processing power and ultra-fast networking and storage to handle increasingly sophisticated workloads with billions of parameters, so ultimately low latency and high-speed networking can help speed up model training and inference, improve performance and accuracy, and foster AI innovation.
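As a rough illustration of why per-hop latency matters across a multi-step pipeline, the sketch below sums network wait time over a handful of hypothetical stages. The stage names and hop counts are assumptions for illustration, not measurements of any real workload.

```python
# Illustrative only: how per-hop network latency accumulates across the stages of
# an AI data pipeline. Stage names and hop counts below are assumptions.

PIPELINE_STAGES = {
    "ingestion": 2,       # e.g. edge device -> storage -> compute
    "preprocessing": 1,
    "training step": 4,   # e.g. exchanges between nodes in a distributed system
    "inference": 1,
    "evaluation": 1,
}

def pipeline_network_delay_ms(round_trip_ms: float) -> float:
    """Total network wait (ms) for one pass through all pipeline stages."""
    total_hops = sum(PIPELINE_STAGES.values())
    return total_hops * round_trip_ms

if __name__ == "__main__":
    for rtt in (1.0, 5.0, 20.0):  # round-trip times in milliseconds
        print(f"RTT {rtt:5.1f} ms -> {pipeline_network_delay_ms(rtt):6.1f} ms of network wait per pass")
```

The point is simply that every extra millisecond of round-trip time is paid repeatedly, so shaving propagation delay compounds across the whole workflow.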

Helping AI workloads everywhere

Fast networking and low latency are particularly important for AI workloads that require real-time or near-real-time responses, such as autonomous vehicles, video streaming, online gaming, or smart devices. These workloads need to process data and make decisions in milliseconds or seconds, which means they cannot afford any lag or interruption in the network. Low latency and high-speed connections help ensure that data is delivered and processed in time, allowing AI models to provide timely and accurate results. Autonomous vehicles exemplify AI’s real-world application, relying on AI models to swiftly identify objects, predict movements, and plan routes amid unpredictable surroundings. Rapid data processing and transmission, facilitated by low latency and high-speed connections, enable near real-time decision-making, enhancing safety and performance. HCF technology can accelerate AI performance, providing faster, more reliable, and more secure networking for AI models and applications.

Regional implications 

Beyond the specific hardware that runs your AI models, there are further implications. Datacenter regions are expensive, and both the distance between regions, and between regions and the customer, make a world of difference to the customer and to Azure as it decides where to build these datacenters. When a region is located too far from a customer, it results in higher latency because the model is waiting for data to travel to and from a datacenter that is further away.

If we think back to the car versus van example and how that relates to a network: with the combination of higher bandwidth and faster transmission speed, more data can be transmitted between two points in a network in two thirds of the time. Alternatively, HCF offers longer reach, extending the transmission distance of an existing network by up to 1.5x with no impact on network performance. Ultimately, you can cover a greater distance within the same latency envelope as traditional SMF, and with more data. This has huge implications for Azure customers, minimizing the need for datacenter proximity without increasing latency or reducing performance.
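The “two thirds of the time” and “up to 1.5x distance” figures both fall out of the same speed ratio. A minimal sketch, assuming the roughly 47% speed advantage quoted earlier:

```python
# How the "two thirds of the time" and "~1.5x distance" figures relate to the
# ~47% speed advantage quoted earlier (illustrative arithmetic only).

speed_ratio = 1.47  # assumed HCF propagation speed relative to silica SMF

# Same distance, less time: transit time shrinks by the inverse of the speed ratio.
time_fraction = 1 / speed_ratio
print(f"Same distance covered in {time_fraction:.2f}x the time (about two thirds)")

# Same latency budget, more distance: reach grows by the speed ratio.
distance_multiplier = speed_ratio
print(f"Same latency budget reaches {distance_multiplier:.2f}x the distance (about 1.5x)")
```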

The infrastructure for the era of AI

HCF technology was developed to improve Azure’s global connectivity and meet the demands of AI and future workloads. It offers several benefits to end users, including higher bandwidth, improved signal integrity, and increased security. In the context of AI infrastructure, HCF technology can enable fast, reliable, and secure networking, helping to improve the performance of AI workloads.

As AI continues to evolve, infrastructure technology remains a critical piece of the puzzle, ensuring efficient and secure connectivity for the digital era. As AI advancements continue to put further strain on existing infrastructure, AI users are increasingly looking to benefit from new technologies like HCF, virtual machines like the recently announced ND H100 v5, and silicon like Azure’s own first-party AI accelerator, Azure Maia 100. These advancements collectively enable more efficient processing, faster data transfer, and ultimately, more powerful and responsive AI applications.

Keep up with our “Infrastructure for the era of AI” series to get a better understanding of these new technologies, why we are investing where we are, what these advancements mean for you, and how they enable AI workloads.

More from the series

Sources

1 Hollow Core DNANF Optical Fiber with <0.11 dB/km Loss

The post How hollow core fiber is accelerating AI appeared first on Microsoft Azure Blog.
