Chief Technology & Operating Officer, Neos Networks.
Supporting GenAI
The case for edge networking is clear.
AI applications are both data-heavy and compute-intensive.
This is particularly true where inferencing and localized data processing are needed.
The more time-critical the system is, the more that data should be stored and processed at the edge.
Take AI inferencing (using an AI model to draw conclusions from new information or data), for example.
This is why we must build out data centers at the edge, beyond central locations.
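To make the inferencing step concrete, here is a minimal, illustrative sketch (not drawn from the article) of new data being scored locally at an edge site, using a toy logistic-regression stand-in for a real deployed model:

```python
import numpy as np

# Toy stand-in for a deployed model: a logistic-regression classifier whose
# weights would normally come from a trained model shipped to the edge site.
WEIGHTS = np.array([0.8, -0.4, 1.2])
BIAS = -0.1

def infer(sensor_reading: np.ndarray) -> float:
    """Run inference locally: turn a new reading into a probability score."""
    logit = WEIGHTS @ sensor_reading + BIAS
    return 1.0 / (1.0 + np.exp(-logit))

# New data arrives at the edge and is scored on the spot, with no round trip
# to a central data center.
reading = np.array([0.5, 1.0, 0.2])
score = infer(reading)
print(f"Local inference score: {score:.3f}")
if score > 0.5:
    print("Time-critical action triggered at the edge")
```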
According to Goldman Sachs, a ChatGPT enquiry requires almost 10 times as much electricity to process as a Google search.
By distributing the computing burden across the network, power demand is spread rather than concentrated.
Fiber-optic cables provide significantly lower latency and higher bandwidth than traditional copper cables, allowing for faster data transfer rates.
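As a rough back-of-the-envelope illustration (the distances and fiber propagation speed below are assumptions, not figures from the article), propagation delay alone shows why a nearby edge site can respond faster than a distant central data center:

```python
# Back-of-the-envelope propagation delay over optical fiber.
# Light travels through fiber at roughly two thirds the speed of light,
# i.e. about 200,000 km/s, or 200 km per millisecond.
SPEED_IN_FIBER_KM_PER_MS = 200

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# Illustrative distances: a metro edge site vs. a remote central data center.
for label, distance_km in [("Edge site (50 km)", 50),
                           ("Central data center (1,000 km)", 1000)]:
    print(f"{label}: ~{round_trip_ms(distance_km):.1f} ms round trip")
```

The gap widens further once queuing and processing delays in congested core networks are added, which is the time-criticality argument made above.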
To meet AI demands, data center buildout needs to be supplemented by edge buildout.
A hybrid model?
The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc.
If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro