Edge AI could test Starlink's mettle
By Diana Goovaerts | Jan 15, 2025 2:35pm
"Edge systems on the planet are much less expensive, relatively easy to deploy and maintain, and can be updated as often as needed and at a reasonable cost,â Jack Gold noted.
- AI workloads are expected to move from centralized training to inferencing at the edge
- Starlink might have more trouble serving these workloads than terrestrial telcos
- Latency, power and compute are key issues
Starlink burst onto the broadband scene in recent years, lighting up industry conversations like a supernova. But could the hype around the company end up flaming out in the era of edge AI?
The question is an interesting one, and the answer, according to several analysts, is: it depends.
Certainly, Starlink has a lasting place in the broadband conversation due to its unique ability to quickly connect folks in remote areas. And depending on how the next few years play out (that is, if the company secures money from the Broadband Equity, Access, and Deployment Program, or BEAD, for expansions and gets its hands on more spectrum), it could become an even more prominent figure in the broadband scene.
But there are two things that artificial intelligence (AI), and edge AI in particular, requires that Starlink lacks: low latency and compute.
As Colin Campbell, SVP of Technology for North America at Cambridge Consultants, noted, "If you want to be truly on the edge, you want to be as close as possible [to end users], and space networks aren't close" by definition. Additionally, satellites by design have limited space for the compute power required to process edge AI workloads at scale.
Sure, you could send those workloads to ground for processing, he said, but that would just add to the latency issue.
According to Starlink's website, the company currently provides typical latency of 25 to 60 milliseconds (ms), with latency climbing to over 100 ms for certain extremely remote locations. While that sounds high compared to the 10-20 ms of latency fiber providers like AT&T and Frontier provide, it's not actually that much of a problem, at least not yet.
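For rough context on those figures, here is a minimal back-of-envelope sketch of the physical floor on a low-Earth-orbit round trip. The ~550 km altitude and the simple four-leg path are illustrative assumptions, not figures from Starlink or from this article; real-world latency adds slant-range geometry, routing through ground stations, processing and queuing on top.

```python
# Back-of-envelope propagation sketch (illustrative assumptions, not Starlink data):
# a ~550 km orbital altitude and a four-leg best-case path
# (user -> satellite -> ground station, then the reply back the same way).

SPEED_OF_LIGHT_KM_S = 299_792.458

def propagation_ms(distance_km: float) -> float:
    """Time for a radio signal to cover distance_km in free space, in milliseconds."""
    return distance_km / SPEED_OF_LIGHT_KM_S * 1000

# Best case: satellite directly overhead, so each leg is roughly the orbital altitude.
legs_km = [550, 550, 550, 550]  # up, down, and the same two legs for the reply
floor_ms = sum(propagation_ms(d) for d in legs_km)

print(f"Physical floor for a LEO round trip: ~{floor_ms:.1f} ms")
# -> ~7.3 ms; the 25-60 ms Starlink publishes reflects everything layered on top.
```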
If you want to be truly on the edge, you want to be as close as possible [to end users], and space networks aren't close.
Colin Campbell, SVP of Technology for North America, Cambridge Consultants
Why? Well, as Recon Analytics founder Roger Entner told Fierce, "We are still looking for the use case where a few milliseconds or even 10 or 20 milliseconds of additional latency make a difference."
Houston, we have some problems
Jack Gold, of J. Gold Associates, pointed out there are a few other factors that likely won't work in Starlink's favor when it comes to serving edge AI. First, satellites don't exactly get updated that often, for obvious reasons. And second, they're expensive.
In contrast, "edge systems on the planet are much less expensive, relatively easy to deploy and maintain, and can be updated as often as needed and at a reasonable cost," he noted.
Then there's just the nature of how satellite networks function. "You are not always connected to the same one even for the same communications," Gold added. "The satellites will pass you off to the next one as they fly by overhead. So, if you are computing something on one, it may not even finish by the time you move to the next satellite... So it seems impractical to run AI on the satellite."
And then there's the question of what running any real AI compute on a satellite would do to the bird's power load. If the draw climbs too high, that "may be a problem in getting enough solar power or at the least would add more costs," Gold said.
Long story short, there seem to be a lot of drawbacks to the prospect of running edge AI applications on satellites. But there is one use case that could end up being Starlink's Goldilocks zone.
"For some applications, satellites may be the way to interact with terrestrial-based edge systems, especially for remote use cases where latency is not a critical issue (e.g., not health or safety related, where a few seconds or even minutes won't affect outcomes negatively), or if there is no real terrestrial network to access," Gold concluded.
It's hard to know how or even if Starlink is thinking about offering edge AI services. Fierce tried to reach out to the company via its parent SpaceX (since apparently Starlink lacks a media contact) but got no response. Starbase, we're here if you want to talk.