Submarine communication cables – almost 560 of them deployed to date – crisscross our oceans, interconnecting continents and carrying over 99% of intercontinental data traffic. This article looks at how the emergence of artificial intelligence (AI) will affect these cables in terms of traffic demand as well as in terms of how we design and operate the key equipment that feeds data into these cables.

AI and Network Traffic Volumes
There are several reports that predict a huge increase in the processing power required for AI applications as well as the electrical power consumed by future AI workloads. Indeed, power consumption is already reaching critical levels in countries such as Singapore and Ireland, where data center projects have been put on pause by government intervention because of concerns over insufficient capacity on their power grids. But there is very little published data at the moment on the likely workload that AI will place on transport networks – be they metro, long-haul terrestrial, or even submarine links.

Is there a lesson to be learned from cloud computing?

As shown in Figure 1, two distinct types of traffic emerged in cloud computing 10 years ago.

Figure 1: Traffic flows

North-South: User-to-Network Traffic

First was “user-to-network” traffic. Because of the way PowerPoint diagrams tend to be drawn, this is usually portrayed as running up and down, or north-south.
North-south traffic was originally thought to be the main contributor in a cloud architecture because users would need to move information back and forth to the cloud applications in the data center instead of running the applications locally.

East-West: Intra- and Inter-Data Center Traffic
The other type of traffic – east-west – is the local (intra-data center) and long-distance (inter-data center) traffic between servers. Initially, this was assumed to be just occasional updates, with most of the real traffic staying inside the data center, travelling between servers in the same rack or racks in the same building.

Traffic Evolution
As time passed, however, something odd happened, as highlighted in Figure 2.

Figure 2: Traffic volumes

The volume of north-south traffic grew steadily as companies migrated to cloud applications, but the volume of east-west traffic grew at an even faster rate – and soon dwarfed north-south traffic levels. The east-west growth arose from virtualization, cloud computing, and the intensive use of databases and internal applications that require constant server-to-server communication. For example, your Facebook page may look like a single pane of information, but it’s actually a collage of multiple information streams that come from different parts of the virtualized cloud of storage. And this east-west growth can happen over long distances.

To explain why, let’s look at the Facebook model, which is driven by advertising revenue. As a U.K. resident, I spend most of my money at home, so when I travel to the U.S. or around Asia, there’s little point in serving me U.S. or Asian ads. The Facebook data center in Singapore, for example, has to synchronize with one of the company’s three European data centers to retrieve the appropriate advertising and deliver it with low latency; otherwise I’d scroll past before it loads, and Facebook would not get paid for my “eyeballs.” East-west traffic also lets data centers load-balance and act as resilient backups for one another.

From Conventional Cloud to AI Data Centers

The biggest question mark over AI data centers is what effect AI processes such as training and inference will have on traffic patterns. The initial observations are that the move to AI data centers for training large neural network models is boosting east-west growth because, for one thing, these models now exceed the compute capacity that can be rented at any one time in a single data center, forcing training to be distributed across sites. I’ve shown this speculative growth prediction with the shaded section in Figure 2.
Inference requires low latency, whereas training is regarded as less latency-sensitive for the simple reason that it usually happens within a single data center. In the inference phase, the network traffic is modest: a pre-trained model is queried with relatively short text prompts, and the responses sent back are short text documents or still images. North-south traffic per user will obviously increase as AI moves from text responses to still images and then to ever-higher-quality video.

Figure 3: The operational and network-based impact of AI in SLTE technology

AI’s Impact on Submarine Network Technology

Once a submarine cable is deployed, it has an engineering life of 25 years, but the only thing we can upgrade during that lifetime is the submarine line-terminating equipment (SLTE), and there may be multiple opportunities to enhance cable performance by taking advantage of the latest transponder and SLTE terminal technology. AI can play a part in this enhancement in two different ways: operational and network-based functionality.

Operational Use of AI

There are three main operational areas where AI can be used to enhance SLTE capability, and all of these functions tend to be housed within the management and automation software of the network.

Automating Network Planning and Capacity Optimization

In a modern open cable system, it’s essential to characterize each fiber pair to ensure that the cable operator got what they paid for in terms of wet plant performance. Characterization using a range of parameters has been well defined by the SubOptic Open Cable Working Group. But the process of taking these measurements can be labor intensive, and AI can certainly play its part in reducing the manual work required.
The next step for operators is spectrum planning and allocation. That can include the need to support spectrum sharing on a given fiber pair. This stage is performed knowing the exact type of transponders that will be deployed, and these performance parameters can be plugged into an AI model that can help optimize the available spectrum.
Finally, optical power planning is an ongoing aspect of fiber pair operation, and it’s essential that this is carefully designed in order to ensure stable service levels.
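To make the spectrum planning step concrete, here is a minimal sketch of what a planner might automate. Everything in it is an illustrative assumption: the channel names, the channel widths, and the roughly 4.8 THz C-band figure are placeholders, not real transponder specifications, and a real tool (AI-assisted or otherwise) would optimize against measured wet-plant performance rather than simple greedy packing.

```python
# Hypothetical sketch: greedy spectrum allocation on a single fiber pair.
# Channel widths and the band size are illustrative, not real specs.

def allocate_spectrum(total_ghz, requests):
    """Greedily pack channel requests (name, width_ghz) into the band.

    Returns a list of (name, start_ghz, end_ghz) assignments plus the
    leftover spectrum. Widest channels are placed first to reduce
    fragmentation -- a simple heuristic, not an optimal planner.
    """
    assignments = []
    cursor = 0.0
    for name, width in sorted(requests, key=lambda r: -r[1]):
        if cursor + width > total_ghz:
            continue  # request does not fit; a real planner would re-groom
        assignments.append((name, cursor, cursor + width))
        cursor += width
    return assignments, total_ghz - cursor

# Example: three hypothetical carriers sharing one fiber pair.
requests = [("carrier-A", 150.0), ("carrier-B", 112.5), ("carrier-C", 75.0)]
plan, leftover = allocate_spectrum(4800.0, requests)  # ~4.8 THz C-band
```

The interesting design question, which AI models can help answer, is how to trade channel width and modulation format per carrier against the measured OSNR of each spectral slice rather than treating the band as uniform, as this toy version does.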

Enhanced Network Monitoring

Coherent transponders and, to a lesser extent, the transport platforms they are housed in offer a veritable fire hydrant flow of telemetry, with the latest transponders having hundreds of information points in their published data models. Dealing with this flood of information over the data communication network has always been a challenge, but this is exactly the kind of problem that AI can help with by extracting correlations and patterns that the human eye alone cannot recognize.
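As a flavor of what even the simplest automated monitoring looks like, here is a hedged sketch that flags outliers in one telemetry stream with a rolling z-score. The synthetic series loosely mimics a pre-FEC-BER-style counter; real SLTE monitoring would ingest hundreds of such counters, and the AI value comes from correlating across them, which this single-stream toy does not attempt.

```python
# Illustrative sketch: flag anomalous telemetry samples with a rolling
# z-score. The series is synthetic; a learned detector would correlate
# many counters instead of thresholding one.
from statistics import mean, stdev

def zscore_anomalies(series, window=20, threshold=4.0):
    """Return indices whose value deviates more than `threshold` standard
    deviations from the trailing window of `window` samples."""
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Synthetic counter with a small periodic ripple and one injected step.
baseline = [1e-3 + 1e-5 * (i % 5) for i in range(40)]
baseline[35] = 5e-3  # e.g. a sudden pump-laser or connector event
```

A statistical threshold like this catches step changes; the patterns the article alludes to, which "the human eye alone cannot recognize," are the slow, multi-variable drifts where trained models earn their keep.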

Predictive Maintenance

Hand in hand with enhanced monitoring is the ability to use the processed telemetry from the transponder, transport platform, SLTE terminal, and repeater chain to detect potential operational issues and to deal with them before they become a problem.
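The simplest form of predictive maintenance is trend extrapolation: fit the drift in a health metric and estimate when it will cross an alarm threshold. The sketch below does exactly that with a least-squares line over a synthetic optical-margin series; the 0.1 dB/day erosion rate and 1 dB threshold are invented for illustration, and production systems would use far richer models than a straight line.

```python
# Hedged sketch: estimate when a degrading metric (here, margin in dB)
# will cross an alarm threshold, using an ordinary least-squares line.

def days_until_threshold(samples, threshold):
    """Fit a line through (day, value) pairs; return the day at which the
    fitted line reaches `threshold`, or None if the metric is not
    degrading (non-negative slope)."""
    n = len(samples)
    xs = list(range(n))
    x_mean = sum(xs) / n
    y_mean = sum(samples) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, samples)) \
        / sum((x - x_mean) ** 2 for x in xs)
    if slope >= 0:
        return None  # stable or improving -- nothing to schedule
    intercept = y_mean - slope * x_mean
    return (threshold - intercept) / slope

# Synthetic example: 3 dB of margin eroding at 0.1 dB per day.
margin_db = [3.0 - 0.1 * day for day in range(10)]
```

Calling `days_until_threshold(margin_db, 1.0)` projects the crossing at day 20, giving the operator a window to schedule intervention before service is affected.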

Network-Based Use of AI

Network-based features generally run in AI-enhanced transponders, and at least two different features have seen active development.

Threat Monitoring and Proactive Protection

For several years, Infinera has been involved in the development of techniques for seismic anomaly detection – basically using existing submarine cables to enhance tsunami warning systems. Cable operators have shown interest in this capability, but they have also thrown down the challenge of enhancing sensitivity in order to detect threats such as seabed trawling or dragging anchors in the vicinity of the cable.

Advanced Nonlinear Compensation

The limiting impairments for modern coherent transponders are nonlinear effects caused by high optical power levels in the fiber. Using fully effective, algorithmic nonlinear compensation (NLC) techniques is extremely computationally intensive – well beyond the immediate roadmap for ASIC processing power. Limited NLC does deliver useful additional performance, but one direction of research is looking at the use of a neural network approach for NLC that uses far less processing power yet delivers a significant performance improvement.
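To give a feel for the learned-compensation idea without claiming anything about the actual research, here is a deliberately tiny stand-in: instead of a neural network, it fits a single perturbation-style coefficient for a cubic (Kerr-like) distortion from training symbols, then applies the first-order inverse. The channel model, the real-valued (rather than complex) symbols, and the coefficient are all synthetic assumptions; the research direction the article describes replaces this one learned term with a small neural network spanning many taps.

```python
# Toy illustration only: learn the coefficient c of a cubic distortion
# rx ~= tx + c * tx * |tx|^2 from known training symbols, then undo it.
# Real coherent signals are complex-valued; reals keep the sketch short.

def fit_nlc_coefficient(tx, rx):
    """Least-squares fit of c using the feature f(t) = t * |t|^2."""
    feats = [t * abs(t) ** 2 for t in tx]
    num = sum(f * (r - t) for f, r, t in zip(feats, rx, tx))
    den = sum(f * f for f in feats)
    return num / den

def compensate(rx, c):
    # First-order inverse: subtract the estimated cubic term from rx.
    # Residual error is second order in c, which is small in practice.
    return [r - c * r * abs(r) ** 2 for r in rx]

# Synthetic training data with a known distortion coefficient.
tx = [-1.5, -0.5, 0.5, 1.0, 1.5]
c_true = 0.05
rx = [t + c_true * t * abs(t) ** 2 for t in tx]
c_hat = fit_nlc_coefficient(tx, rx)
```

The point of the neural-network approach mentioned above is precisely that real fiber nonlinearity is not a single memoryless cubic term like this one: it mixes neighboring symbols and polarizations, which is why exact algorithmic NLC is so computationally expensive and why a compact learned approximation is attractive.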
