BRN Discussion Ongoing

mrgds

Regular
Hi Tech, I don't know what his wife looks like but Keith Witek seems to mingle with a very sexy crowd!

Before being hired as chief operating officer for Tenstorrent, Keith apparently spent more than three years at Google as Director of Strategic Alliances, after over a year at SiFive as SVP of corporate development, strategy, and general counsel. He was also the director of R&D and enablement and associate general counsel at Tesla for nearly three years. Prior to that, he spent nearly 14 years at AMD across a number of roles, including as a corporate vice president.
Yes, he's had one
S- iFive
T- esla
T- enstorrent
A- MD
G- oogle

PARTY
doesn't really want another

😂
 
  • Haha
  • Like
Reactions: 12 users

Realinfo

Regular
Oops…he did it again !!

Tech just mentioned his oft-stated view that January 2025 is his personal timeline. Correct me if I’m wrong, Texta, but by this you mean for solid and consistently rising earnings?

For mine, I think you’re being a tad conservative, but if you are right, a whole raft of good things will have to happen between now and then...and they will come into the public domain either by way of announcements (formal via the ASX, informal via press releases etc.) OR from the admirable efforts of super sleuths such as our very own 1000 Eyes.

The impact of all this stuff coming to light will be very positive on our battler’s share price between now and January 2025.

After our fortuitous meeting with the Mercedes racer at The Fox Goes Free pub, my better half and I have been meandering around the West Country. The only thing I can add to what came from the get together, is that after eighteen months of being hit over the head because of a falling share price, my shareholding better half has been placated by what she learned.

On this note, I want to reiterate some previous utterings relating to going the way of Kodak and dinosaurs…

There continues to be much discussion about whether we are in this and whether this is us. I firmly believe that the technology world is very aware of Akida, very aware of what it can do, and very aware that it’s not going away anytime soon. Given this, if you were somebody in authority within a Nvidia, a Qualcomm or the like, do you think you would risk your company’s future by ignoring something truly game-changing and thus missing the Akida bus?

I don’t think so. This is why I believe that the Nvidias, the Qualcomms…name your company, people…no matter what name you nominate, they and/or their favourite tech partner would be looking into how they can use Akida.

Not doing so would risk going the way of Kodak and the dinosaurs.
 
  • Like
  • Love
  • Fire
Reactions: 57 users

IloveLamp

Top 20
Screenshot_20230728_170434_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 29 users

Sirod69

bavarian girl ;-)
Thomas Hülsing · System-of-Systems Engineering
40 min. •


What useful things can AI do for us in everyday life?

Artificial intelligence has received a lot of attention through ChatGPT, and many other tools have popped up very quickly.

On the one hand, research in the field of artificial intelligence is receiving the attention it has long been missing. The fundamentals of artificial intelligence have been researched since the 1960s, and with the increasing performance of computer processors and falling prices for data storage, ever more powerful artificial intelligence could be created.

On the other hand, the multitude of different systems fuels a fear of artificial intelligence, and restriction and regulation of artificial intelligence are immediately demanded.

Centralized systems like ChatGPT and other artificial intelligences run in energy-guzzling data centers that are expanded daily. In addition, masses of data can be stored where nobody knows what will happen to them. This development is worrying, because we want to disclose less data about ourselves and also want and need to use less energy.

This is where edge computing and Edge AI are getting more interesting every day, because the new edge computing technology consumes as little energy as natural brains. In addition, with edge computing our data stays local.

What artificial intelligence at the edge can do for us, how it can support us in everyday life, and how it can even save lives is what this BrainChip podcast is about, with experts who have a lot of experience with Edge AI.

 
  • Like
  • Fire
  • Love
Reactions: 28 users

Frangipani

Regular
Care for some more speculative dot-joining?

I wonder which company in the Edge AI space (pun intended) would be able to assist HPE (Hewlett Packard Enterprise) in processing data on the surface of the moon from 2026 onwards and also happens to have a CEO that used to work for HPE? 😊

HPE, Astrolab will take edge computing to the moon in 2026​

By Tommy Clift Jul 3, 2023 08:04am


Still shot of the July 20, 1969 moon landing.


HPE's Norm Follett outlined the company's plans to “hitch a ride” on Astrolab’s Flexible Logistics and Exploration (FLEX) rover set to be launched on a SpaceX mission to the moon in mid-2026. (NASA)

Artificial intelligence (AI) isn’t just taking over planet Earth, it’s also expanding its influence in space with a little help from HPE. The company recently unveiled plans to bring AI-enabled edge computing services to the moon via an Astrolab rover set to launch in 2026.

HPE initially brought edge computing and AI technology to space with the launch of Spaceborne Computer-2 (SBC-2), installed onto the International Space Station (ISS) in May of 2021. The SBC-2 packed a one-two punch with its Edgeline Converged EL4000 Edge system and the ProLiant DL360 server to provide a system capable of withstanding harsh space environments and enabling workloads across edge, high performance compute (HPC) and AI.

The SBC-2 was part of HPE’s “greater mission to significantly advance computing and reduce dependence on communications as humans travel farther into space, to the Moon, Mars and beyond,” according to an update the company posted last year. At the time, it announced the completion of 24 data processing experiments using the SBC-2. Those experiments spanned across healthcare, natural disaster recovery, image processing, 3D printing, 5G and AI-enabled solutions in an effort to “prove reliability in space.”

During HPE Discover earlier this month, company officials sat down with executives from AstroLab and Axiom to discuss various moves to expand this work. During the discussion, HPE Senior Director of Space Technologies and Solutions Norm Follett outlined plans to “hitch a ride” on Astrolab’s Flexible Logistics and Exploration (FLEX) rover set to be launched on a SpaceX mission to the moon in mid-2026. In this case, the hitchhiker will be SBC technology attached to the rover, which will provide data services to both HPE and AstroLab customers.

“We’re actually able to take 1500 kilograms of customer cargo with us,” Astrolab CEO Jaret Matthews stated. “We anticipate that a lot of our customers are going to want to make use of the edge-computing service on our platform provided by HPE.” He explained that for the first few years of the rover’s expedition, data will be the priority export.

HPE’s edge-computing power will enable the rover to ship back fully-formed “insight” rather than pieces of a puzzle, Matthews continued. “If you can produce a refined map, rather than 1000s of images, you save yourself bandwidth,” he explained.
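Matthews' bandwidth argument is easy to sanity-check with a rough calculation. All the numbers below are hypothetical stand-ins (the article gives no figures for data volumes or link rates); only the shape of the comparison matters:

```python
# Rough, illustrative numbers only (not from the article): compare downlinking
# raw survey imagery vs. a single refined map product from the lunar surface.

def downlink_hours(data_gb: float, link_mbps: float) -> float:
    """Hours needed to transmit `data_gb` gigabytes over a `link_mbps` link."""
    bits = data_gb * 8e9
    return bits / (link_mbps * 1e6) / 3600

raw_images_gb = 2000        # hypothetical: thousands of raw survey images
refined_map_gb = 5          # hypothetical: one stitched, processed map
link_mbps = 10              # hypothetical moon-to-Earth downlink rate

raw_h = downlink_hours(raw_images_gb, link_mbps)
map_h = downlink_hours(refined_map_gb, link_mbps)
print(f"raw: {raw_h:.0f} h, map: {map_h:.1f} h, saving: {raw_h / map_h:.0f}x")
```

Under these made-up numbers the refined map costs 400x less bandwidth than the raw imagery, which is the whole point of pushing the processing onto the rover.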

Follett emphasized that “there’s still some work to do” ahead of the mission. The two companies have a reservation agreement, but they still need to work through additional business details including how customers will take advantage of the service. They also still need to "solve a lot of technical challenges," he said.

“Right now, we are 254 miles up in space. We are at the edge; we have the most powerful computer to ever go into space. That’s not deep space; it’s low earth orbit. And we’re gonna go further,” Follett concluded.









Coverage from SiliconANGLE's livestreaming video studio

UPDATED 12:19 EDT / JUNE 22 2023
[Photo: Norm Follett and Jaret Matthews at HPE Discover, June 2023]
EMERGING TECH

Breaking boundaries: How Astrolab and HPE are redefining edge computing in space​

BY CHAD WILSON

Space exploration has always captured the imagination of humanity, pushing the boundaries of what we thought was possible.

In the quest to explore new frontiers, innovative partnerships are being formed to revolutionize space technology. One such partnership between Venturi Astrolab Inc., a startup specializing in planetary robotic systems, and Hewlett Packard Enterprise Co., is set to redefine the future of space exploration.

Jaret Matthews (pictured, right), founder and chief executive officer of Astrolab, joined theCUBE at HPE Discover to talk about Astrolab’s and HPE’s partnership and the innovations that are coming out of the partnership in edge computing, as well as the future of computational science in space.

“Astrolab is developing the next link in the transportation network for the solar system,” Matthews said.

Matthews and Norm Follett (pictured), director of global technical marketing at HPE, spoke with theCUBE industry analysts Dave Vellante and Lisa Martin at HPE Discover, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the HPE/Astrolab partnership and how the companies are leveraging cutting-edge technology to overcome challenges and pave the way to the moon and beyond. (* Disclosure below.)

Astrolab’s mission to develop planetary robotic systems​

Astrolab, a startup founded three years ago, is making waves in the space exploration industry. Based in Los Angeles, the company is focused on developing novel planetary robotic systems. Matthews brings a wealth of experience from his time at NASA’s Jet Propulsion Laboratory and Space Exploration Technologies Corp., where he worked on projects such as the Mars rovers and spacecraft mechanisms.

With the imminent launch of SpaceX’s Starship lander, which is set to be the largest rocket ever created, Matthews believes that Astrolab can pave the way for groundbreaking missions on the moon and eventually Mars.

“Having spent seven years at that organization myself, I’m very confident that they’re gonna get there,” Matthews said when speaking of SpaceX. “You know, they’re trying something that’s extremely hard, but they have a really incredible team and deep, deep experience now and accomplishing really hard goals. I have every confidence that they’ll get there.”

Revolutionizing data transmission from the moon with HPE​

One of the key challenges faced by those operating on the moon is limited bandwidth for data transmission back to Earth. To tackle this issue, Astrolab has formed a strategic partnership with HPE to leverage its edge-computing capabilities.

The partnership aims to revolutionize the way data is processed and transmitted from the lunar surface. Rather than sending vast amounts of raw data, Astrolab and HPE are working together to extract valuable insights locally, using edge computing. This approach not only optimizes bandwidth usage, but also enables more efficient and meaningful data analysis.

“For the first five or 10 years of lunar development, in the coming decade, the main export from the lunar surface is going to be data,” Matthews said, when talking about the significance of the collaboration.

By harnessing HPE’s expertise in edge computing, Astrolab aims to maximize the value derived from lunar missions and accelerate the exploration of the moon as a stepping stone to Mars.

Drawing a parallel between the current developments in space exploration and the building of railroads to California in 1870, Matthews highlighted the economic opportunities presented by expanding humanity’s reach beyond the current horizon, with influential figures, such as Jeff Bezos and Elon Musk, investing billions of dollars in developing a transportation network for the moon.

As Astrolab and HPE forge ahead with their partnership, the prospects for space exploration and the utilization of edge computing in space are set to soar. With the moon becoming the next frontier of exploration and economic potential, Astrolab’s groundbreaking work will help to shape the future of space travel and scientific discovery.

“Soon it will become not only frequent, but economical to send stuff to the moon,” Matthews added.
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of HPE Discover:



(* Disclosure: This is an unsponsored editorial segment. However, theCUBE is a paid media partner for HPE Discover. Hewlett Packard Enterprise Co., Intel Corp. and other sponsors of theCUBE’s event coverage have no editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE​



If you’ve read so far and were pretty confident about the possibility of Akida being integrated but then got disheartened when spotting “Intel” in the above disclosure (“Oh, so it must be Loihi then…”), let me tell you this:

While Intel is one of the biggest HPE Discover event sponsors (a so-called Emerald Sponsor, alongside Kioxia, Microsoft and VMware - NVIDIA, AMD, Samsung and AWS are “merely” Platinum Sponsors) and was also given ample opportunity to market its Xeon processors and what not, no one from Intel was present for the above “edge computing in space” interview with the HPE/Venturi Astrolab representatives, which leads me to (naively?) conclude that Intel is not involved in this project.

Moreover, in another interview from the June 20-22 HPE Discover 2023 Edge-To-Cloud Conference in Las Vegas, HPE CEO Antonio Neri made some interesting remarks, such as

From 2:32 min:
“And last but not least obviously was AI. Because everything is about AI. And we have a right to play with a unique - you know - intellectual property and we announce - I think - a bold move to offer the first language model available as a service that you can privately train your data in a sustainable environment… To me the most important metric, Lisa, is the customer feedback. I come here to listen and - not just to tell, but to listen - and gather what is next in their minds, so we can continue to address those challenges.”

And lo and behold, a couple of minutes later, Antonio Neri even mentions a three-legged stool, albeit his sturdy piece of furniture is a bit different from Sean Hehir’s.

From 8:34 min:
“… I mean, we cannot innovate everything. But to me innovation is a three-legged stool: Number one is our organic innovation - GreenLake is a great example of that (…) as people are coming now, saying AI is a big opportunity, a big investment - we already invested 2.6 billion $ in that business since 2019, and you mentioned some of the acquisitions, but, you know, the other piece of this is that you have to complement your, you know, own innovation with external innovation, because the other thing you have to think about is bring different type of talent. And every acquisition we have brought, brought that different way to think about issues or solutions, and that complemented our talent. And then the third piece of this is the broader partner ecosystem, and part of the partner ecosystem is to make strategic bets in small, you know, call it “start up companies”…”

From 12:13 min: “We have the infrastructure to run at a scale, because we have unique IP both on the silicon side that allows us to give openness and choice to customers … and then the software, the software to really optimise that infrastructure to the workload (?) that really differentiates us.”

And he goes on emphasising sustainability.


 
  • Like
  • Fire
  • Love
Reactions: 29 users

Tony Coles

Regular
Has anybody seen this before? Interesting, low-power/battery-operated face recognition.

Check out the video at the YouTube link.

 
  • Like
  • Fire
Reactions: 7 users

Murphy

Life is not a dress rehearsal!
Has anybody seen this before? Interesting, low-power/battery-operated face recognition.

Check out the video at the YouTube link.


Does it use Akida, Tony? Sounds remarkably like it.
 

Tony Coles

Regular
Does it use Akida, Tony? Sounds remarkably like it.

In the background of the pic there's a poster; enlarge it and it says Enable AI and Machine Learning. Did you see the YouTube video? Interesting. What caught my eye at first was the poster in the background, and then I checked the link.
 

equanimous

Norse clairvoyant shapeshifter goddess
  • Like
  • Haha
  • Love
Reactions: 15 users

AARONASX

Holding onto what I've got
Has anybody seen this before? Interesting, low-power/battery-operated face recognition.

Check out the video at the YouTube link.


Analog
 
  • Like
Reactions: 7 users

Tony Coles

Regular

Mentions an ultra-low-power deep neural network CNN accelerator for analog, interesting hey. From the Arm Cortex family, and was posted only yesterday.
 
  • Like
Reactions: 2 users

Mt09

Regular
Mentions an ultra-low-power deep neural network CNN accelerator for analog, interesting hey. From the Arm Cortex family, and was posted only yesterday.
Analog not Akida.
 
  • Like
Reactions: 7 users

Lex555

Regular
Mentions an ultra-low-power deep neural network CNN accelerator for analog, interesting hey. From the Arm Cortex family, and was posted only yesterday.
Akida is digital neuromorphic; anything to do with analog is not as accurate and isn’t Akida.
 
  • Like
Reactions: 15 users
Someone seems to like us :)

Recent little blog post by a small private Canadian engineering consulting / placement firm specialising in ASICs, by the looks of it.



"BrainChip: Revolutionizing AI Efficiency and Ubiquity for a Sustainable Future"​


Chipmonk Consulting

Chipmonk Consulting​

"Empowering Semiconductor Innovation with Top-Tier Talent"​

Published Jul 19, 2023
"Welcome to another edition of Chipmonk's spotlight on the game-changers in the AI/ML and Accelerated Computing Ecosystem. Today, we delve into the innovative world of BrainChip, a company that is reshaping the landscape of artificial intelligence with its unique and efficient solutions. Join us as we explore how BrainChip is revolutionizing AI efficiency and ubiquity, paving the way for a sustainable future."

"BrainChip: Revolutionizing AI Efficiency and Ubiquity for a Sustainable Future"
BrainChip is a global technology company that is revolutionizing the field of Artificial Intelligence (AI) and Internet of Things (IoT) through its innovative products and solutions. The company's mission is to make AI ubiquitous by combining advanced IoT infrastructure with AI technologies inspired by the human brain. This unique approach leads to more efficient operations, proactive problem-solving, improved human-machine interactions, and enhanced security.
BrainChip's products are designed to be efficient, high-performing, and sustainable. The company's flagship product, Akida™, sets a new standard in AI efficiency. It delivers high performance with an extremely low-power budget and a negligible thermal footprint. Akida performs complex functions at the device level, providing real-time, instant responses. This efficiency allows Akida to substantially reduce the need for cloud computing in edge AI applications.
On-chip learning capabilities allow Akida to be customized on-device, eliminating the need for expensive retraining of models on the cloud. This feature also enhances data security as no learning data is stored anywhere, reducing the exposure of sensitive data on the internet.

BrainChip also offers MetaTF software and the Akida1000 reference chip for easy implementation and evaluation of its technology. MetaTF provides a model zoo, performance simulation, and CNN model conversion. The Akida1000 reference chip is fully functional and enables working system evaluation. BrainChip's development systems, including PCIe boards, Shuttle PCs, and Raspberry Pi, complement its IP and reference SoC to enable the easy design of intelligent endpoints.

BrainChip is not just about providing cutting-edge technology; it's also about building a community. The company invites innovators, technologists, and businesses to join them in pushing the limits of AI on-chip compute to maximize efficiency, eliminate latency, and conserve energy.

BrainChip: Pioneering Edge AI Technology
BrainChip is a global technology company that's revolutionizing the field of artificial intelligence (AI) and neuromorphic computing. With a mission to make every device with a sensor AI-smart, BrainChip is the worldwide leader in edge AI on-chip processing and learning. The company's world-class engineering team brings together global expertise in artificial intelligence, semiconductor design, software development, and technology from talent located in California; Toulouse, France; Hyderabad, India; and Perth, Australia.

Leadership Team
The leadership team at BrainChip comprises individuals with vast experience and expertise in their respective fields:
Sean Hehir, CEO: Sean has managed large global teams and been responsible for explosive revenue growth for global enterprise organizations such as Compaq and HP and smaller, fast-growing companies like Fusion-io.

Peter van der Made, Founder & CTO: Peter has been at the forefront of computer innovation for 40 years. He is the inventor of the computer immune system and has designed the first generations of digital neuromorphic devices on which the Akida chip is based.
Anil Mankar, Co-Founder & Chief Development Officer: Anil has spent 40 years developing products in the semiconductor industry. He has developed multiple products across industry segments and later became the company's Chief Development Officer overseeing all product development.
Ken Scarince, CFO: Ken has served as a consultant at 8020 Consulting, working on all aspects of finance at various companies globally. Previously, he served as Controller at Virgin Galactic and Vice President of Finance and Chief Accounting Officer at Virgin America.
Nandan Nayampally, CMO: Nandan is an entrepreneurial executive with over 25 years of success in building or growing disruptive businesses with industry-wide impact. He was most recently at Amazon, leading the delivery of Alexa AI tools for Echo, FireTV, and other consumer devices.

BrainChip's Products
1. Akida IP: Akida IP is a complete neural processing core for integration into ASIC devices. It brings AI to the edge in a way that existing technologies are not capable of. The solution is high-performance, small, ultra-low power and enables a wide array of edge capabilities.
2. Akida 2.0: Akida 2.0 is the second generation of BrainChip's neuromorphic processor IP. It offers significant enhancements over the first generation, including increased performance, lower power consumption, and improved ease of use.
3. MetaTF Development Environment: MetaTF is a development environment that allows users to develop, train, and test neural networks, supporting the creation of AI applications for various end markets.
4. Akida Neural Processor SoC: The Akida Neural Processor SoC is a complete neural network processing device ideal for edge applications. It brings AI to the edge in a way that existing technologies are not capable of.
5. Akida Enablement Platforms: The Akida Enablement Platforms are tiered programs that bring you from concept to working prototype with varying levels of model complexity and sensor integration.

BrainChip's innovative products and the expertise of its leadership team make it a formidable player in the AI industry. The company is publicly traded on the Australian Securities Exchange (ASX: BRN) and the OTC Markets (BRCHF).

Note: This blog post is based on the information available on BrainChip's official website and other linked pages. For the most accurate and up-to-date information, please visit the official BrainChip website.


Was looking at employees / founders. AMD, Astera https://www.asteralabs.com

 
  • Like
  • Love
  • Fire
Reactions: 49 users
Hmmmmm.....:unsure:.....c'mon IFS.....ya know ya wanna FFS :LOL:

From a few hours ago.




JULY 27, 2023 BY LIDIA PERSKA

Intel CEO: AI to be Integrated into All Intel Products​

Intel CEO Pat Gelsinger announced during the company’s Q2 2023 earnings call that Intel is planning to incorporate artificial intelligence (AI) into every product it develops. This comes as Intel prepares to release Meteor Lake, its first consumer chip with a built-in neural processor for machine learning tasks.

Previously, Intel had hinted that only their premium Ultra chips would feature AI coprocessors. However, Gelsinger’s statement suggests that AI will eventually be integrated into all of Intel’s offerings.

Gelsinger often emphasizes the “superpowers” of technology companies, which typically include AI and cloud capabilities. However, he now suggests that AI and cloud are not mutually exclusive. Gelsinger points out that certain AI-powered tasks, such as real-time language translation in video calls, real-time transcription, automation inference, and content generation, need to be done on the client device rather than relying on the cloud. He highlights the importance of edge computing, where AI processing occurs locally, rather than relying on round-tripping data to the cloud.

Gelsinger envisions AI integration in various domains, including consumer devices, enterprise data centers, retail, manufacturing, and industrial use cases. He even mentions the potential for AI to be integrated into hearing aids.

This strategy is crucial for Intel to compete with Nvidia, the dominant player in AI chips powering cloud services. While Nvidia has seen immense success in the AI market, Intel aims to find its own path by integrating AI into their products. This aligns with the growing demand for edge computing and the desire for more localized AI processing.

Furthermore, Gelsinger’s remarks highlight the shift in the tech industry towards AI-driven innovation. Microsoft, for example, has embraced AI, with the forthcoming Windows 12 rumored to integrate Intel’s Meteor Lake chip with its built-in neural engine. Similarly, Microsoft’s AI-powered Copilot tool is expected to revolutionize document editing.

Overall, Intel’s plans to incorporate AI into its products signify the company’s commitment to staying competitive and driving innovation in the AI landscape.
 
  • Like
  • Fire
  • Love
Reactions: 86 users

Sam

Nothing changes if nothing changes
Oops…he did it again !!

Tech just mentioned his oft-stated view that January 2025 is his personal timeline. […] This is why I believe that the Nvidias, the Qualcomms…name your company, people…no matter what name you nominate, they and/or their favourite tech partner would be looking into how they can use Akida. […]
Telstra 😬
 
I might have missed this short vid with Nandan being posted.

Nice to see Akida & the Prophesee camera working as well.

Posted by Edge AI & Vision Alliance a couple or so weeks ago.


BrainChip Demonstration of Sensor-agnostic, Event-based, Untethered Edge AI Inference and Learning​



 
  • Like
  • Fire
  • Love
Reactions: 27 users
And this quick one from Nikunj around the same date.



 
  • Like
  • Fire
Reactions: 21 users

CHIPS

Regular
I might be mistaken, but it seems to me that somebody in Germany has been buying BIG today via Tradegate.
 

Attachments

  • brainchip.JPG (186.2 KB)
  • Like
  • Thinking
Reactions: 12 users

Tothemoon24

Top 20

Researchers successfully train a machine learning model in outer space for the first time​

ARTIFICIAL INTELLIGENCE · ASTROPHYSICS · INNOVATION · MATHEMATICAL, PHYSICAL AND LIFE SCIENCES · RESEARCH · SCIENCE
For the first time, researchers have trained a machine learning model in outer space, on board a satellite. This achievement could enable real-time monitoring and decision making for a range of applications, from disaster management to deforestation.
Machine learning has a huge potential for improving remote sensing – the ability to push as much intelligence as possible into satellites will make space-based sensing increasingly autonomous. This would help to overcome the issues with the inherent delays between acquisition and action by allowing the satellite to learn from data onboard. Vít’s work serves as an interesting proof-of-principle.
Professor Andrew Markham (Department of Computer Science), supervisor for the project
Data collected by remote-sensing satellites is fundamental for many key activities, including aerial mapping, weather prediction, and monitoring deforestation. Currently, most satellites can only passively collect data, since they are not equipped to make decisions or detect changes. Instead, data has to be relayed to Earth to be processed, which typically takes several hours or even days. This limits the ability to identify and respond to rapidly emerging events, such as a natural disaster.
To overcome these restrictions, a group of researchers led by DPhil student Vít Růžička (Department of Computer Science, University of Oxford) took on the challenge of training the first machine learning program in outer space. During 2022, the team successfully pitched their idea to the Dashing through the Stars mission, which had issued an open call for project proposals to be carried out on board the ION SCV004 satellite, launched in January 2022. During the autumn of 2022, the team uplinked the code for the program to the satellite already in orbit.
The researchers trained a simple model to detect changes in cloud cover from aerial images directly onboard the satellite, in contrast to training on the ground. The model was based on an approach called few-shot learning, which enables a model to learn the most important features to look for when it has only a few samples to train from. A key advantage is that the data can be compressed into smaller representations, making the model faster and more efficient.
Vít Růžička explained: ‘The model we developed, called RaVAEn, first compresses the large image files into vectors of 128 numbers. During the training phase, the model learns to keep only the informative values in this vector; the ones that relate to the change it is trying to detect – in this case, whether there is a cloud present or not. This results in extremely fast training due to having only a very small classification model to train.’ Whilst the first part of the model, to compress the newly-seen images, was trained on the ground, the second part (which decided whether the image contained clouds or not) was trained directly on the satellite.
Normally, developing a machine learning model would require several rounds of training, using the power of a cluster of linked computers. In contrast, the team’s tiny model completed the training phase (using over 1300 images) in around one and a half seconds.
When the team tested the model’s performance on novel data, it automatically detected whether a cloud was present or not in around a tenth of a second. This involved encoding and analysing a scene covering an area of about 4.8 x 4.8 km (equivalent to almost 450 football pitches).
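The two-stage pipeline described above — a pretrained encoder compressing each image tile into a vector of 128 numbers, then a tiny classifier "trained" on board from a handful of labelled examples — can be caricatured in a few lines of plain Python. The random-projection encoder, the nearest-centroid few-shot classifier, and the synthetic tiles below are all illustrative stand-ins, not RaVAEn's actual code:

```python
import random

LATENT = 128  # RaVAEn compresses each image tile to a vector of 128 numbers

# Stand-in for the pretrained (on-ground) VAE encoder: a fixed random
# projection from pixels to a 128-dim latent vector. Purely illustrative.
random.seed(0)
PIXELS = 32 * 32
PROJ = [[random.gauss(0, 1) for _ in range(PIXELS)] for _ in range(LATENT)]

def encode(image):
    """Compress a flat list of pixel values into LATENT numbers."""
    return [sum(w * p for w, p in zip(row, image)) for row in PROJ]

def fit_centroids(examples):
    """'Few-shot training': average the latent vectors per class label."""
    sums, counts = {}, {}
    for label, vec in examples:
        counts[label] = counts.get(label, 0) + 1
        sums[label] = [a + b for a, b in zip(sums.get(label, [0.0] * LATENT), vec)]
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def classify(centroids, vec):
    """Nearest-centroid decision: cloud or clear."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], vec))

# Synthetic 'images': uniformly bright tiles stand in for cloudy ones.
cloudy = [[0.9] * PIXELS for _ in range(3)]
clear = [[0.1] * PIXELS for _ in range(3)]
shots = [("cloud", encode(im)) for im in cloudy] + [("clear", encode(im)) for im in clear]
centroids = fit_centroids(shots)
print(classify(centroids, encode([0.85] * PIXELS)))  # → cloud
```

The averaging step is why this kind of on-board "training" is so fast: like RaVAEn's second stage, it needs only a handful of samples and no gradient descent, so it completes almost instantly compared with training a full model from scratch.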
According to the researchers, the model could easily be adapted to carry out different tasks, and to use other forms of data. Vít Růžička added: ‘Having achieved this demonstration, we now intend to develop more advanced models that can automatically differentiate between changes of interest (for instance flooding, fires, and deforestation) and natural changes (such as natural changes in leaf colour across the seasons). Another aim is to develop models for more complex data, including images from hyperspectral satellites. This could allow, for instance, the detection of methane leaks, and would have key implications for combatting climate change.’
Performing machine learning in outer space could also help overcome the problem of on-board satellite sensors being affected by the harsh environmental conditions, so that they require regular calibration. Vít Růžička said: ‘Our proposed system could be used in constellations of non-homogeneous satellites, where reliable information from one satellite can be applied to train the rest of the constellation. This could be used, for instance, to recalibrate sensors that have degraded over time or experienced rapid changes in the environment.’
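The constellation-recalibration idea can also be sketched concretely. Assuming (hypothetically — the article doesn't specify a drift model) that a degraded sensor's readings relate to a reliable sibling satellite's readings by a simple linear drift, co-observed reference scenes are enough to fit a correction:

```python
# Illustrative only: one way a 'reliable' satellite could help recalibrate a
# drifted sensor on a sibling satellite, assuming a linear drift model
# (drifted reading = true_gain * truth + true_offset) and co-observed scenes.

def fit_linear_drift(reference, drifted):
    """Least-squares fit of gain/offset mapping drifted readings back to reference."""
    n = len(reference)
    mx = sum(drifted) / n
    my = sum(reference) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(drifted, reference))
    var = sum((x - mx) ** 2 for x in drifted)
    gain = cov / var
    offset = my - gain * mx
    return gain, offset

truth = [10.0, 20.0, 30.0, 40.0]            # readings from the reliable satellite
drifted = [1.1 * t + 2.0 for t in truth]    # simulated drift on the sibling sensor
gain, offset = fit_linear_drift(truth, drifted)
corrected = [gain * d + offset for d in drifted]  # recovers the reference readings
```

Since the simulated drift here is exactly linear, the fit recovers the reference readings to within floating-point error; real sensor degradation would of course be noisier and need a richer model.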
This project has been summarised in a pre-print publication ‘Fast model inference and training on-board of Satellites’ available at arXiv. The work was also presented at the International Geoscience and Remote Sensing Symposium (IGARSS) conference on 21 July 2023.
This project was conducted in collaboration with the European Space Agency (ESA) Φ-lab via the Cognitive Cloud Computing in Space (3CS) campaign and the Trillium Technologies initiative Networked Intelligence in Space (NIO.space) and partners at D-Orbit and Unibap.
 
  • Like
  • Love
Reactions: 20 users