Edge AI could test Starlink's mettle
By Diana Goovaerts | Jan 15, 2025 2:35pm
"Edge systems on the planet are much less expensive, relatively easy to deploy and maintain, and can be updated as often as needed and at a reasonable cost,ā Jack Gold noted.
- AI workloads are expected to move from centralized training to inferencing at the edge
- Starlink might have more trouble serving these workloads than terrestrial telcos
- Latency, power and compute are key issues
Starlink burst onto the broadband scene in recent years, lighting up industry conversations like a supernova. But could the hype around the company end up flaming out in the era of edge AI?
The question is an interesting one, and the answer, according to several analysts, is: it depends.
Certainly, Starlink has a lasting place in the broadband conversation due to its unique ability to quickly connect folks in remote areas. And depending on how the next few years play out, that is, if the company secures money from the Broadband Equity, Access, and Deployment (BEAD) Program for expansions and gets its hands on more spectrum, it could become an even more prominent figure in the broadband scene.
But there are two things that artificial intelligence (AI), and edge AI in particular, requires that Starlink lacks: low latency and compute.
As Colin Campbell, SVP of Technology for North America at Cambridge Consultants, noted, "If you want to be truly on the edge, you want to be as close as possible [to end users], and space networks aren't close" by definition. Additionally, satellites by design have limited space for the compute power required to process edge AI workloads at scale.
Sure, you could send those workloads to ground for processing, he said, but that would just add to the latency issue.
According to Starlink's website, the company currently provides typical latency of 25 to 60 milliseconds (ms), with latency climbing to over 100 ms for certain extremely remote locations. While that sounds high compared to the 10-20 ms of latency fiber providers like AT&T and Frontier deliver, it's not actually that much of a problem, at least not yet.
If you want to be truly on the edge, you want to be as close as possible [to end users], and space networks aren't close.
Colin Campbell, SVP of Technology for North America, Cambridge Consultants
Why? Well, as Recon Analytics founder Roger Entner told Fierce, "We are still looking for the use case where a few milliseconds or even 10 or 20 milliseconds of additional latency make a difference."
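For a sense of where those latency figures come from, here is a rough back-of-the-envelope sketch (not from the article). It assumes a roughly 550 km orbital altitude, a figure commonly cited for Starlink, a bent-pipe hop through a nearby gateway, and illustrative fiber route lengths; the altitude, route distances and function names are assumptions for illustration only.

```python
# Rough propagation-delay floor for a bent-pipe LEO link vs. a terrestrial fiber path.
# Assumed (not from the article): ~550 km satellite altitude, gateway near the user,
# and fiber signal speed of roughly 2/3 the speed of light.

C_KM_PER_MS = 299_792.458 / 1000        # speed of light in vacuum, km per millisecond
FIBER_KM_PER_MS = C_KM_PER_MS * 2 / 3   # light in glass travels ~2/3 as fast

def leo_rtt_floor_ms(altitude_km: float = 550.0) -> float:
    """Best-case round trip: user -> satellite -> gateway -> satellite -> user,
    with the satellite directly overhead (four vertical hops of altitude_km)."""
    return 4 * altitude_km / C_KM_PER_MS

def fiber_rtt_ms(route_km: float) -> float:
    """Round-trip propagation over a terrestrial fiber route of route_km."""
    return 2 * route_km / FIBER_KM_PER_MS

print(f"LEO floor (550 km, overhead satellite): {leo_rtt_floor_ms():.1f} ms")  # ~7.3 ms
print(f"Fiber, 500 km route:                    {fiber_rtt_ms(500):.1f} ms")   # ~5.0 ms
print(f"Fiber, 1500 km route:                   {fiber_rtt_ms(1500):.1f} ms")  # ~15.0 ms
```

The gap between that single-digit physical floor and the 25-60 ms Starlink reports is largely queuing, routing between satellites or gateways, and terrestrial backhaul, which is also where the lack of nearby edge compute shows up.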
Houston, we have some problems
Jack Gold, of J. Gold Associates, pointed out there are a few other factors that likely won't work in Starlink's favor when it comes to serving edge AI. First, satellites don't exactly get updated that often, for obvious reasons. And second, they're expensive.
In contrast, "edge systems on the planet are much less expensive, relatively easy to deploy and maintain, and can be updated as often as needed and at a reasonable cost," he noted.
Then there's just the nature of how satellite networks function. "You are not always connected to the same one even for the same communications," Gold added. "The satellites will pass you off to the next one as they fly by overhead. So, if you are computing something on one, it may not even finish by the time you move to the next satellite ... So it seems impractical to run AI on the satellite."
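To put numbers on that handoff window, here is a small sketch (again, not from the article) estimating how long a single satellite stays usable overhead. It assumes a circular orbit at roughly 550 km and a 25-degree minimum elevation angle, both figures commonly cited for Starlink rather than confirmed here, and the function names are illustrative.

```python
# Estimate the maximum time a LEO satellite stays above the minimum elevation angle
# for a ground user, i.e. the window in which any on-board computation must finish.
# Assumed (not from the article): ~550 km circular orbit, 25 degree minimum elevation.
import math

MU_EARTH = 398_600.4418   # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6_371.0         # km, mean Earth radius

def orbital_period_s(altitude_km: float) -> float:
    """Circular-orbit period from Kepler's third law."""
    a = R_EARTH + altitude_km
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH)

def max_pass_seconds(altitude_km: float, min_elevation_deg: float) -> float:
    """Longest possible pass, i.e. the satellite flies directly overhead."""
    e = math.radians(min_elevation_deg)
    # Earth-central angle between user and satellite at the elevation limit
    beta = math.acos((R_EARTH / (R_EARTH + altitude_km)) * math.cos(e)) - e
    return orbital_period_s(altitude_km) * (2 * beta) / (2 * math.pi)

print(f"Orbital period: {orbital_period_s(550) / 60:.1f} min")            # ~95.5 min
print(f"Best-case pass above 25 deg: {max_pass_seconds(550, 25) / 60:.1f} min")  # ~4.5 min
```

At roughly four to five minutes per best-case pass, any inference job pinned to a single satellite faces a hard deadline before the next handoff, which is the impracticality Gold describes.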
And then there's the question of what running any real AI compute on a satellite would do to the bird's power load. If it pushes the power draw too high, that "may be a problem in getting enough solar power or at the least would add more costs," Gold said.
Long story short, there seem to be a lot of drawbacks to the prospect of running edge AI applications on satellites. But one use case could end up being Starlink's Goldilocks zone.
"For some applications, satellites may be the way to interact with terrestrial-based edge systems, especially for remote use cases where latency is not a critical issue (e.g., not health or safety related, where a few seconds or even minutes won't affect outcomes negatively), or if there is no real terrestrial network to access," Gold concluded.
It's hard to know how or even if Starlink is thinking about offering edge AI services. Fierce tried to reach out to the company via its parent SpaceX (since apparently Starlink lacks a media contact) but got no response. Starbase, we're here if you want to talk.