BRN Discussion Ongoing

jrp173

Regular
Evening Frangipani,

One has a favour to ask....

Regarding a job advertisement from BrainChip management, advertised on LinkedIn,
looking for / hiring a...

Social Media Marketing Intern in Laguna Hills, CA / LinkedIn.

* Amongst all the prerequisites, at the bottom (apparently, and this also rings true with my recollection), the last prerequisite was....

* EXISTING MEMBER OF TSEX OR HC FORUMS AN ADVANTAGE.

Only if you have the time.... would it be possible to locate this original post on LinkedIn & see if the above * BOLD WRITING has been removed or altered in any way?

The only reason I ask is that I've seen your FORENSIC ability to track posts (day, date, time) utilising a specific program thingy, well and truly above and beyond my expertise.

A photo of the original post from BrainChip can be found here...

Over on the Crapper,
Poster: StockHound81
Date: 3/7/25
Post No: #79675458.

Trying to clear up much confusion amongst shareholders, as it's not like our management seem to bother.

Thank you in advance.
Regards,
Esq.

Who knows if the version posted by Humble Genius that includes "existing member of Tsex and HC" is original or modified, or where it came from, but what seems strange to me is that BrainChip (when advertising for a job based in the USA, for an intern probably in their 20s, who is probably not a shareholder) would use the expressions Tsex and HC (rather than their full names). I personally think it's highly unlikely that someone of that age in the USA would even know what Tsex and HC are.....

Strange that the formatting of the two ads is different. Check out the wrapping of the text, the lack of spaces between the bullet points and the text, and the missing line between "Skills Needed" and the previous points.

Perhaps Humble Genius could shed some light on his post as to where he found this ad.

I wonder if someone is playing games and added the extra line to stir things up (not suggesting Humble Genius; maybe whoever posted it originally!).

1751537378000.png
 
Last edited:
  • Fire
  • Like
Reactions: 3 users
Good Evening Fullmoonfever,

BINGO, cheers mate.

Just need Frangipani to run it through her special forensic app / site, which should reveal whether the original has been diddled or not.

Regards,
Esq.
I don't recall Humble posting a link myself, so maybe they could share their source?

I can't seem to locate the original post either, so maybe it was deleted or I'm using the wrong keywords?

I just tried the link on the Wayback Machine and it hasn't archived anything, unfortunately.
 
  • Fire
Reactions: 1 users

Guzzi62

Regular
From FF


BrainChip and HaiLa cut power use for IoT monitoring

July 2, 2025 Steve Rogerson

Australian AI firm BrainChip is working with HaiLa Technologies, a Canadian innovator in low power wireless, on IoT connectivity across medical, environmental and infrastructure monitoring.

Together, the companies will demonstrate how BrainChip’s Akida neuromorphic technology pairs seamlessly with HaiLa’s BSC2000 radio frequency integrated circuit (RFIC) to enable breakthrough power efficiency for connected sensor applications in IoT, medical and smart infrastructure markets.

The combined technologies produce an efficient architecture that paves the way for continuously connected battery-operated devices that can last the entire life of the product on a single coin cell battery. This joint demonstration leverages HaiLa’s power-efficient passive backscatter wireless communications over standard wifi infrastructure with BrainChip’s Akida AKD1500 event-based AI processor.

The integration provides a platform for anomaly detection, condition monitoring and other sensor-intelligence tasks while operating on microwatts of power.

BrainChip and HaiLa are teaming up to deliver smarter, low-power offerings for intelligent connected edge devices, making it easier to run AI at the edge without draining battery life. HaiLa’s BSC2000 is a wifi-compatible connectivity RFIC designed to showcase power savings in IoT environments. When paired with Akida’s energy-efficient, event-driven AI compute, the result is said to be a uniquely optimised approach.

“As a pioneer in neuromorphic computing, we are excited to partner with HaiLa to demonstrate how advanced low-power AI processing can work in tandem with ultra-efficient wireless connectivity,” said Steve Brightfield, CMO at BrainChip. “By combining our Akida technology with HaiLa’s innovative RF platform, we’re making intelligent, battery-powered edge sensors a practical reality.”

Patricia Bower, vice president at HaiLa, added: “Our collaboration with BrainChip brings together two power-conscious technologies that redefine what is possible at the edge. With backscatter wifi and neuromorphic AI operating on microwatts, developers can create continuously monitored, intelligent sensors that last for years without battery replacement. This is transformative for anomaly detection, predictive maintenance and other real-time sensing applications.”

Founded in 2019, HaiLa (www.haila.io) is a fabless semiconductor and software company developing low-power multi-protocol radio communication for IoT devices. Originally conceptualised at Stanford University, HaiLa enables pervasive edge AI and the scaling of battery-free IoT by offering power-efficient wireless connectivity on standard wireless protocols such as wifi, Bluetooth and cellular.

BrainChip (www.brainchip.com) is a specialist in edge AI on-chip processing and learning. The company’s Akida processor is said to be the first commercial, fully digital, event-based AI that mimics the way the brain analyses data, processing only essential inputs with efficiency and speed. Akida supports edge learning directly on the chip, without the need for cloud connectivity, providing advantages in latency, privacy and energy consumption. Akida IP is suitable for integration into SoCs used in a wide range of real-world applications, from connected vehicles and consumer electronics to industrial automation and IoT sensors.


 
  • Like
  • Fire
Reactions: 16 users

Frangipani

Top 20
The following LinkedIn comment by Mercedes-Benz suggests to me once again that they continue to weigh their options with regard to various neuromorphic offerings - the good news is, we are apparently still in the running, but at the same time it is definitely not a confirmation of an upcoming deal or an acknowledgment of Akida’s superiority over competitors’ technology (note they did not give Kimberly’s earlier post a 👍🏻).

A very diplomatic reply, I’d say, focusing on the potential of BrainChip being considered in the future based on a positive past experience.
But no promise.



View attachment 76873
And another reply from Mercedes:


View attachment 76887
Thanks (danke), @CHIPS!

This follow-up reply should put to bed any speculation that Akida might already be implemented in any of the Mercedes-Benz automobiles set to roll off the production line in the near future, such as the CLA 2025, which will celebrate its world premiere in March and will be the first model to feature the all-new MB.OS (Mercedes-Benz Operating System).

(Not to mention that no IP license has been signed to date.)
A fellow forum user, who in recent months repeatedly referred to his brief LinkedIn exchange with Mercedes-Benz Chief Software Officer Magnus Östberg (and thereby willingly revealed his identity to all of us here on TSE, which in turn means I’m not guilty of spilling a secret that should have been kept private with this post), asked Mercedes-Benz a question in the comment section underneath the company’s latest LinkedIn post on neuromorphic computing. This time, however, he decided not to share the carmaker’s reply with all of us here on TSE. You gotta wonder why.

Could it possibly have to do with the fact that MB’s reply refutes the hypothesis he had been advancing for months, namely that Mercedes-Benz, who have been heavily promoting their future SDV (software-defined vehicle) approach that gives them the option of OTA (over-the-air) updates, would “more than likely” have used Akida 2.0/TENNs simulation software in the upcoming MB.OS release as an interim solution during ongoing development, until the not-yet-existing Akida 2.0 silicon became available at a later stage? The underlying reason would have been competitive pressure to be first to market…

The way I see it, the January 29 reply by MB clearly puts this speculation to bed:



View attachment 77012


Does that sound as if an MB.OS “Akida inside” reveal at the upcoming world premiere of the CLA were on the cards?


Setting aside the questions

a) about any requirements for testing and certification of car parts making up the infotainment system (being used to German/EU bureaucracy, I find it hard to believe there wouldn’t be any at all - maybe someone who is knowledgeable about automotive regulations within Germany and the EU could comment on this) and

b) whether any new MB model containing our tech could roll off the production line despite no prior IP license deal having been signed (or at least an Akida 1.0 chip sales deal; there has never been a joint development announcement either, which could possibly somehow circumvent the necessity of an upfront payment showing up in our financials)…

… various MB statements in recent months (cf. Dominik Blum’s presentation at HKA Karlsruhe that I shared in October: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352 - HKA being the university of applied sciences they have since confirmed to be cooperating with on neuromorphic camera research - journalists quoting MB engineers after visiting their Future Technologies Lab, as well as relevant posts and comments on LinkedIn) have diminished the likelihood of neuromorphic tech making its debut in any soon-to-be-released Mercedes-Benz models.

If NC were to enhance voice control and infotainment functions in their production vehicles much sooner than safety-critical ones (ADAS), MB would surely have clarified this in their reply to the above question posed to them on LinkedIn, which specifically referred to the soon-to-be-released CLA, the first model to come with the next-generation MB.OS that also boasts the new AI-powered MBUX Virtual Assistant (developed in collaboration with Google).

Instead, they literally wrote:

“(…) To fully leverage the potential of neuromorphic processes, specialised hardware architectures that efficiently mimic biologically inspired systems are required (…) we’re currently looking into Neuromorphic Computing as part of a research project. Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years.”

They are evidently exploring full-scale integration to maximise the benefits of energy efficiency, latency and privacy. The voice control implementation of Akida in the Vision EQXX was their initial proof of concept to demonstrate the feasibility of NC in general (cf. the podcast with Steven Peters, MB’s former Head of AI Research from 2016 to 2022: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-407798). Whether they’ll eventually partner with us or with a competitor (provided they are happy with their research project’s results) remains to be seen.

So I certainly do not expect the soon-to-be-revealed CLA 2025 with the all-new MB.OS to have “Akida inside”, although I’d be more than happy to be proven wrong, as we’d all love to see the BRN share price soar on that news…
Time - and the financials - will ultimately tell.

I found evidence that Mercedes-Benz have shown continued interest in Akida throughout 2024, while at the same time also testing out several other neuromorphic processors: those by our competitors Intel, SynSense and Innatera, as well as presumably also Applied Brain Research’s Time Series Processor (TSP) - highly likely via the Uni Waterloo research project, as ABR is a Uni Waterloo spin-off company; the ABR Time Series Processor was also shown in a table depicting various examples of neuromorphic hardware during an October 2024 presentation at Hochschule Karlsruhe by Dominik Blum, one of MB’s neuromorphic engineers (cf. https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-440033).

Mercedes-Benz have now repeatedly stated they are looking into neuromorphic computing as part of a research project, emphasising that they are still at a very early stage of research and are experimenting with NC in their Future Technologies Lab (which Markus Schäfer referred to as MB’s “early-tech kitchen” in a November 2024 LinkedIn post), and that “depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years” (comment by MB on LinkedIn, 29 January 2025, see my last tagged post above).

Part of this and other research at Mercedes-Benz is conducted by Bachelor’s, Master’s or PhD students who were lucky enough to get an industrial placement with the company for the duration of their theses.

One such example is the Master’s thesis “Analyzing Frameworks and Strategies for Converting Neural Networks for Neuromorphic Processors” by Sreelakshmi Rameshan, a 2024 Uni Stuttgart M.Sc. Computer Science graduate, which @Fullmoonfever and I posted about in December.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-443909
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-443910

And voilà, here is another relevant Master’s thesis I spotted:

In February, Krishnaprasad Thoombayil Satheesh graduated from Uni Stuttgart with an M.Sc. in Electrical Engineering, Smart Systems. During his last semester, he wrote his Master’s thesis on “Transitioning Deep Neural Network solutions into Neuromorphic Processors”, while being placed with Mercedes-Benz.

Unfortunately his thesis has not (yet?) been published on https://elib.uni-stuttgart.de/home, so all we’ve got for now is what he shared via his LinkedIn profile.

It does give us some novel info, though, as it reveals that during the second half of 2024, Mercedes-Benz were experimenting with converting ANNs to SNNs for child presence detection, and were apparently testing out the implementation on Akida, Speck* and Loihi 2.

*Speck combines an IniVation dynamic vision sensor (DVS, i.e. an event camera) with SynSense’s spiking neural network processor Dynap-CNN - so we know this particular solution must have been based on vision, not on radar sensing, as some other CPD (child presence detection) solutions are.


As for the research thesis stated under Education, “Implementation of Robotic Manipulator using ROS and Object Detection for Reliability Analysis”, I am a bit puzzled. AFAIK, “robotic manipulator” is usually the term for the “arm” of a robot? What could this have to do with in-cabin monitoring systems? A camera mounted on a mobile arm? 🤔

And does that actually refer to his Master’s thesis at all, or could it be unrelated and refer to some other research work during his graduate degree? See his work on “Segmentation and Localization of crane-hook” (under Experience), when he was working part-time as a research assistant (= HiWi in German) at Uni Stuttgart’s Institute of System Dynamics during his Master’s studies.
Any thoughts welcome…




8D926410-BD1A-4F83-9AFB-B7E6B6D2C58C.jpeg



22A7C9BD-BA55-433F-8EB4-C0D23FBD2BFB.jpeg

7C0CE303-23CE-4131-B724-C3149AD890A2.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Frangipani

Top 20
I found evidence that Mercedes-Benz have shown continued interest in Akida throughout 2024, while at the same time also testing out several other neuromorphic processors: those by our competitors Intel, SynSense and Innatera, as well as presumably also Applied Brain Research’s Time Series Processor (TSP) - highly likely via the Uni Waterloo research project, as ABR is a Uni Waterloo spin-off company; the ABR Time Series Processor was also shown in a table depicting various examples of neuromorphic hardware during an October 2024 presentation at Hochschule Karlsruhe by Dominik Blum, one of MB’s neuromorphic engineers (cf. https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-440033).

Mercedes-Benz have now repeatedly stated they are looking into neuromorphic computing as part of a research project, emphasising that they are still at a very early stage of research and are experimenting with NC in their Future Technologies Lab (which Markus Schäfer referred to as MB’s “early-tech kitchen” in a November 2024 LinkedIn post), and that “depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years” (comment by MB on LinkedIn, 29 January 2025, see my last tagged post above).

Part of this and other research at Mercedes-Benz is conducted by Bachelor’s, Master’s or PhD students who were lucky enough to get an industrial placement with the company for the duration of their theses.

One such example is the Master’s thesis “Analyzing Frameworks and Strategies for Converting Neural Networks for Neuromorphic Processors” by Sreelakshmi Rameshan, a 2024 Uni Stuttgart M.Sc. Computer Science graduate, which @Fullmoonfever and I posted about in December.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-443909
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-443910

And voilà, here is another relevant Master’s thesis I spotted:

In February, Krishnaprasad Thoombayil Satheesh graduated from Uni Stuttgart with an M.Sc. in Electrical Engineering, Smart Systems. During his last semester, he wrote his Master’s thesis on “Transitioning Deep Neural Network solutions into Neuromorphic Processors”, while being placed with Mercedes-Benz.

Unfortunately his thesis has not (yet?) been published on https://elib.uni-stuttgart.de/home, so all we’ve got for now is what he shared via his LinkedIn profile.

It does give us some novel info, though, as it reveals that during the second half of 2024, Mercedes-Benz were experimenting with converting ANNs to SNNs for child presence detection, and were apparently testing out the implementation on Akida, Speck* and Loihi 2.

*Speck combines an IniVation dynamic vision sensor (DVS, i.e. an event camera) with SynSense’s spiking neural network processor Dynap-CNN - so we know this particular solution must have been based on vision, not on radar sensing, as some other CPD (child presence detection) solutions are.


As for the research thesis stated under Education, “Implementation of Robotic Manipulator using ROS and Object Detection for Reliability Analysis”, I am a bit puzzled. AFAIK, “robotic manipulator” is usually the term for the “arm” of a robot? What could this have to do with in-cabin monitoring systems? A camera mounted on a mobile arm? 🤔

And does that actually refer to his Master’s thesis at all, or could it be unrelated and refer to some other research work during his graduate degree? See his work on “Segmentation and Localization of crane-hook” (under Experience), when he was working part-time as a research assistant (= HiWi in German) at Uni Stuttgart’s Institute of System Dynamics during his Master’s studies.
Any thoughts welcome…




View attachment 87992


View attachment 87993
View attachment 87994

Speaking of in-cabin monitoring:


351522A8-DF90-4D18-9FA1-633BA5BC61AD.jpeg



BrainChip has recently updated the White Paper “Designing Smarter and Safer Cars with Essential AI”, which says it is now Version 2, last updated 28 May 2025:



EE1420DD-A3C7-4FB5-9B05-F44C520F1F16.jpeg



I haven’t had time to compare the latest version with an older one, but here are the most important pages:

D2EB3635-26BF-4E18-B671-80890B21287E.jpeg
BF575C97-1FFD-4789-974B-DB69C930404B.jpeg
85CD4314-0F42-4751-B749-EB96BA4E8060.jpeg

5F2A0F69-C6CD-4244-B9BE-73DE1228E60D.jpeg
69119A88-DBB7-44BC-AFAA-BE90A9F44E0E.jpeg
B9756AB2-B9F0-4BBB-ADAC-5193AAA61267.jpeg
A3172E4C-BEE9-43C4-9028-73F616844351.jpeg
4EE43BCB-9D72-4D4D-B0B5-B90F5C47E6CF.jpeg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 24 users

7für7

Top 20
Good Evening Fullmoonfever,

BINGO, cheers mate.

Just need Frangipani to run it through her special forensic app / site, which should reveal whether the original has been diddled or not.

Regards,
Esq.
What???? They mentioned Reddit, but not the biggest BRN fan community platform on the planet??!!!!! TSE?
A clear SELL!!!!
 
  • Haha
Reactions: 1 users

Tothemoon24

Top 20

IMG_1210.jpeg

IMG_1209.jpeg


By Douglas McLelland
July 3, 2025

As humanity pushes the boundaries of space exploration, the demands on intelligent, autonomous systems capable of operating in harsh, resource-constrained environments have never been greater. Traditional AI accelerators, while powerful, often require significant energy and cooling—resources that are scarce aboard spacecraft and satellites.

This creates a critical demand for ultra-low-power solutions that can deliver real-time inference without compromising performance. Enter Akida from BrainChip—a neuromorphic processor designed to mimic the efficiency of the human brain, offering a new compute paradigm for AI applications in space.

In this post, we’ll explore how Akida’s unique architecture addresses the energy and latency challenges of space-based AI, and why it’s poised to become a game-changer for edge intelligence beyond Earth.

Unlocking Hidden Efficiency: Sparsity and the Power of the Akida Architecture

One of the standout features of the Akida architecture is its smart use of sparsity in neural network models and streaming data. Instead of following the traditional exhaustive compute approach that processes every single data point—whether it holds valuable information or not—Akida selectively computes the data that is most likely to impact the output result. It focuses only on what’s meaningful, skipping over zero activations and triggering computation only where it’s truly needed.

This event-based processing isn’t just clever—it’s efficient. But it naturally raises a key question: is there actually enough sparsity in neural networks to make this worthwhile?

Over the past several years, I’ve had the opportunity to collaborate closely with developers across a range of industries—including some working on cutting-edge space applications. One preconception I often hear is that Akida’s event-based processing might only perform well when paired with event-based inputs.

While it’s true that this can be a great pairing (more on this below), it’s equally important to emphasize that mainstream models using standard inputs—like images or audio—also contain significant sparsity. Indeed, when we analyze sparsity in models, we find that once you’re past the first few layers, sparsity is determined much more by architecture and task than by input type.
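
To make the measure concrete, here is a minimal sketch (illustrative NumPy only, not BrainChip’s tooling) of what “activation sparsity” means and how one might track it layer by layer; on a real CNN you would hook the post-ReLU feature maps instead, but the measure is the same: count the zeros.

```python
# Illustrative sketch (not BrainChip's tooling): measuring activation
# sparsity, i.e. the fraction of zeros a ReLU layer emits. On event-based
# hardware like Akida, zero activations generate no events and therefore
# cost no compute.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sparsity(a):
    """Fraction of zero-valued activations (higher = less work to do)."""
    return 1.0 - np.count_nonzero(a) / a.size

x = rng.standard_normal((1, 64))  # dense input: essentially 0% sparsity
print(f"input  : sparsity = {sparsity(x):.0%}")
for layer in range(4):
    w = rng.standard_normal((x.shape[1], 64)) / np.sqrt(x.shape[1])
    x = relu(x @ w - 0.5)  # bias shift pushes more pre-activations below zero
    print(f"layer {layer}: sparsity = {sparsity(x):.0%}")
```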

Real-World Validation: Testing Akida in a Satellite Workflow

A great example of this can be found in a recent study we did alongside Gregor Lenz from Neurobus [1] on frame-based data. The study tested Akida in a typical Earth-observation satellite workflow, showing the throughput and energy requirements that could be expected for space-based detection, which avoids transmitting all of the images to the ground for subsequent processing.

One common element in these tasks is that objects of interest may be present in only a minority of images, so an efficient system should perform low-cost filtering, if possible, prior to any more intensive analysis (or downlinking). We therefore demonstrated a very simple task: image classification (ship/no-ship) on the Airbus Ship Detection dataset [2].



Figure 1: The Airbus Ship Detection dataset [2]. A minority of images (22%) contain one or more ships, and the task is to localize these, typically requiring processing of a relatively high-resolution image. However, many more images contain no ships: it is possible to perform ship/no-ship filtering very effectively at lower resolution. From Lenz & McLelland (2024).

Especially noteworthy is that even with standard image inputs, the model delivered significant energy savings—all thanks to activation sparsity.

As shown in Figure 2, the input image itself is fully dense, meaning there’s zero sparsity at the first layer. But that changes quickly. Just a few layers into the network, sparsity climbs past 50%, and by the deeper layers it often exceeds 90%. That’s a very typical picture for a standard CNN backbone. (A quick clarification on sparsity, because it’s a bit of a back-to-front measure: high sparsity is good; it means fewer non-zero values and therefore less processing on Akida. 90% sparsity means there are only 10% non-zeros to process.)



Figure 2: Input sparsity per layer for a classification model (ship/no-ship) processing dense image inputs. Even in the early layers, sparsity is typically higher than 50%, and by the latter part of the model is often higher than 90%. Adapted from Lenz & McLelland (2024).

This level of sparsity allowed Akida to process large volumes of satellite imagery efficiently, at just over 5 mJ per image and a dynamic power of less than 0.5 W at 85 FPS. This level of performance enables systems to filter out irrelevant data before passing selected images on for further processing. A similar approach was outlined by Kadway et al. (2023) [3], where Akida was used to pre-filter satellite images based on cloud cover. In short: high activation sparsity in models does not require event-based inputs.
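
(A quick sanity check on those figures, my arithmetic rather than the study’s: 0.5 W of dynamic power spread over 85 frames per second works out to 0.5 / 85 ≈ 5.9 mJ of dynamic energy per frame, which sits comfortably alongside the quoted “just over 5 mJ per image”.)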

The Natural Synergy Between Akida and Event-Based Sensors

While Akida delivers strong efficiency gains even with traditional inputs, it truly shines when paired with event-based sensors—a combination especially relevant in space applications. Dynamic Vision Sensors (DVS), for example, have garnered increasing interest for tasks ranging from spacecraft navigation and landing [4] to space situational awareness [5]. DVS sensors offer two standout advantages that make them particularly well-suited for space:

  • Ultra-high temporal resolution: unlike traditional cameras that capture full frames at fixed intervals, DVS sensors operate asynchronously, recording changes in brightness at each pixel as they occur—often with microsecond precision. This allows them to capture fast-moving objects or sudden events, such as docking maneuvers or debris flybys, without motion blur or latency. In the vacuum of space, where objects can move at several kilometers per second, this responsiveness is critical for real-time perception and control.
  • Exceptional dynamic range: with dynamic ranges exceeding 120 dB—far surpassing conventional image sensors—DVS can function effectively in extreme lighting conditions, such as when a spacecraft transitions from deep shadow into direct sunlight. In such scenarios, traditional cameras often saturate or lose detail, while a DVS continues to deliver usable data. (A toy sketch of the event-generation principle follows below.)
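
For readers unfamiliar with how these sensors produce data, here is a toy sketch of that event-generation principle (an illustrative simplification, not any vendor’s actual pipeline; the function name and threshold are mine):

```python
# Toy model of DVS event generation (an illustrative simplification, not
# any vendor's pipeline): each pixel fires an event when its log-intensity
# drifts more than `threshold` away from the level recorded at its last
# event, instead of being sampled on a fixed frame clock.
import numpy as np

def frames_to_events(frames, threshold=0.2, eps=1e-6):
    """Yield (t, y, x, polarity) events from a stack of grayscale frames."""
    ref = np.log(frames[0].astype(np.float64) + eps)  # per-pixel reference
    for t, frame in enumerate(frames[1:], start=1):
        logf = np.log(frame.astype(np.float64) + eps)
        diff = logf - ref
        for polarity, mask in ((+1, diff >= threshold), (-1, diff <= -threshold)):
            for y, x in zip(*np.nonzero(mask)):
                yield (t, int(y), int(x), polarity)
            ref[mask] = logf[mask]  # reset the reference where a pixel fired
```

In a mostly static scene almost no pixels fire, which is exactly the kind of input sparsity an event-based processor can exploit.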

Making DVS Work for ML with Akida and TENNs

Despite these attractive features, DVS has faced barriers to adoption on the machine learning side.

On the one hand, the more strongly neuromorphic approaches (with fully asynchronous spiking neural networks, for example) can achieve theoretically impressive results, but they have less familiar training pipelines and often lack support on commercially available hardware that could exploit their advantages.

On the other hand, standard CNNs have proven perfectly capable of processing DVS inputs, but typical NPU hardware implementations are then unable to exploit the very high spatial sparsity of the data.

That’s where Akida offers a combination of benefits: the efficiency of event-based processing, built on very familiar and mature ML training stacks for CNNs. Add in the support for our new TENNs models on Akida 2 – intrinsically designed to handle spatiotemporal data and tasks, and thus a natural fit for DVS – and you really have a perfect match.

From Concept to Orbit: Akida Heads to Space with Frontgrade Gaisler

Akida’s capabilities are not just theoretical. Frontgrade Gaisler, a leader in radiation-hardened microprocessors for space, is integrating Akida IP alongside their NOEL-V RISC-V processor in the upcoming GR801 SoC, part of the new GRAIN product line. This space-grade chip is designed specifically for energy-efficient in-orbit AI, combining traditional and neuromorphic processing in a radiation-tolerant package.

Embedding Akida at the heart of this GRAIN architecture highlights its suitability for the harsh constraints of space missions and signals growing trust in neuromorphic computing as a cornerstone of next-gen space systems.

Looking Ahead

I’m incredibly excited about the work we are advancing in this domain. We look forward to the results as the event-based datasets continue to mature. Meanwhile, our recent state-of-the-art achievements in other areas [6] using these models leave me with no doubt about the enormous potential for Akida in future space applications.

If you’re exploring how to bring intelligent, low-power processing to your next project—whether in orbit or on Earth—we’d love to hear from you. Get in touch with the BrainChip team to learn more about how Akida can help you build smarter, more efficient AI at the edge.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 26 users

Frangipani

Top 20
Evening Frangipani,

One has a favour to ask....

Regarding a job advertisement from BrainChip management, advertised on LinkedIn,
looking for / hiring a...

Social Media Marketing Intern in Laguna Hills, CA / LinkedIn.

* Amongst all the prerequisites, at the bottom (apparently, and this also rings true with my recollection), the last prerequisite was....

* EXISTING MEMBER OF TSEX OR HC FORUMS AN ADVANTAGE.

Only if you have the time.... would it be possible to locate this original post on LinkedIn & see if the above * BOLD WRITING has been removed or altered in any way?

The only reason I ask is that I've seen your FORENSIC ability to track posts (day, date, time) utilising a specific program thingy, well and truly above and beyond my expertise.

A photo of the original post from BrainChip can be found here...

Over on the Crapper,
Poster: StockHound81
Date: 3/7/25
Post No: #79675458.

Trying to clear up much confusion amongst shareholders, as it's not like our management seem to bother.

Thank you in advance.
Regards,
Esq.
Is this the one, Esq.?

Pretty sure Humble Genius might have posted it originally here. That version was different from this one, I believe.

It's still on LinkedIn.



Social Media Marketing Intern

BrainChip Laguna Hills, CA

3 weeks ago · Over 200 applicants

Objectives:
To grow BrainChip’s social media footprint, increase engagement, and implement paid demand generation campaigns.

Description:
This role will oversee the expansion of BrainChip’s organic and paid social activities across LinkedIn, X, Bluesky, and Instagram, as well as developer-focused channels like Reddit, Substack and Medium. Responsibilities include:

  • Expand social media reach to Bluesky, Reddit and Instagram platforms
  • Research best practices on each platform to increase engagement
  • Increase BrainChip's engagement with partner and prospect accounts
  • Assist with the development of paid advertising strategy for upcoming campaigns
  • Track & report on engagement metrics to identify highest performing tactics
  • Respond to activity on BrainChip accounts and field as needed
  • Assist in content development such as podcast to increase reach and audiences
  • Monitor and listen to industry social account and groups for key topics
  • Amplify corporate content with thought leadership
  • Identify engagement opportunities and industry influencers


Skills Needed:
  • Previous experience with organic social media content
  • Mastery of social media platforms: LinkedIn, X, Bluesky, Instagram
  • Basic skills in Canva or similar platform
  • Ability to work independently and in teams
  • Excellent verbal, written, and presentation skills
Who knows if the version posted by Humble Genius that includes "existing member of Tsex and HC" is original or modified, or where it came from, but what seems strange to me is that BrainChip (when advertising for a job based in the USA, for an intern probably in their 20s, who is probably not a shareholder) would use the expressions Tsex and HC (rather than their full names). I personally think it's highly unlikely that someone of that age in the USA would even know what Tsex and HC are.....

Strange that the formatting of the two ads is different. Check out the wrapping of the text, the lack of spaces between the bullet points and the text, and the missing line between "Skills Needed" and the previous points.

Perhaps Humble Genius could shed some light on his post as to where he found this ad.

I wonder if someone is playing games and added the extra line to stir things up (not suggesting Humble Genius; maybe whoever posted it originally!).

View attachment 87997

Hi everyone, including StockHound81 over on HC, who I trust will be reading this:

The image @Humble Genius posted with the added line “Existing members of Tsex or HC forums an advantage” is not a screenshot of the original LinkedIn job ad by BrainChip. I assume it was his idea of a joke, given the real LinkedIn ad does say “Monitor and listen to industry social account and groups for key topics”.

I know for sure it wasn’t the original, as I happened to be the one who had posted the original (without the added TSE/HC line!) shortly before @Humble Genius posted the altered version, which he should have marked as parody to avoid all the confusion. As @jrp173 correctly noted, it is in a different font from LinkedIn posts and was thus discernible as a joke to me personally, having just posted a screenshot of the real ad myself and thus being doubly aware of the inauthenticity of the added line. I also agree that it is highly unlikely that BrainChip would include those forum abbreviations in a job ad for a US-based social media intern who is unlikely to have ever heard of them.

That said, there has indeed been a bit of editing/deleting going on with BrainChip’s LinkedIn posts in recent months, and in addition, the AGM webcast never got uploaded to YouTube (and we all know why…), so it should not come as a surprise when some people’s suspicions get aroused.

Here is the proof (I needed to take two screenshots, as the ad was too long for one - the second screenshot starts below the lines I had marked in green):

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465055

57B6C8E8-9FE0-43B3-BF75-E9717B6D992F.jpeg


CDC60F34-FF9C-471F-AB11-8F69C984F496.jpeg



Meanwhile, the contested post by @Humble Genius has apparently been deleted. It was between mine and that of @smoothsailing18 (to whom it was addressed), and @smoothsailing18’s reaction is still visible.

1DCA5C33-6830-43CE-8C19-F7B85871B743.jpeg



Let me make this clear: While the image @Humble Genius posted was not a screenshot of the original LinkedIn job ad, StockHound81 did not alter or forge that screenshot in any way, if it is the same as the one shown in @jrp173’s post (since I don’t have an HC account, I can read the texts, but the images/screenshots posted are mostly blurry to me). It was definitely posted here on TSE on 7 June 2025 and was not marked as a parody post. I can vouch for that.

So yes, StockHound81 may in this case be “guilty” of gullibility, but those posters who have accused him of forgery or criminal behaviour, apparently claiming he was a paid downramper who wanted to harm the company, owe him an apology, if you ask me…

Anyway, it would be best if @Humble Genius could clear up the confusion himself.
 
  • Like
  • Love
  • Fire
Reactions: 10 users
Evening Frangipani,

One has a favour to ask....

Regarding a job advertisement from BrainChip management, advertised on LinkedIn,
looking for / hiring a...

Social Media Marketing Intern in Laguna Hills, CA / LinkedIn.

* Amongst all the prerequisites, at the bottom (apparently, and this also rings true with my recollection), the last prerequisite was....

* EXISTING MEMBER OF TSEX OR HC FORUMS AN ADVANTAGE.

Only if you have the time.... would it be possible to locate this original post on LinkedIn & see if the above * BOLD WRITING has been removed or altered in any way?

The only reason I ask is that I've seen your FORENSIC ability to track posts (day, date, time) utilising a specific program thingy, well and truly above and beyond my expertise.

A photo of the original post from BrainChip can be found here...

Over on the Crapper,
Poster: StockHound81
Date: 3/7/25
Post No: #79675458.

Trying to clear up much confusion amongst shareholders, as it's not like our management seem to bother.

Thank you in advance.
Regards,
Esq.
Hi again Esq.

Frangipani has already posted a side-by-side of the job descriptions, and apparently LinkedIn does indicate if a job posting has been edited. I don't know if that is automatic in their system or something a prospective employer can bypass.

So, unless BRN deleted the "so-called original" that Humble posted and replaced it with a new ad, it appears that the existing ad still available on LinkedIn, which Frangipani and I posted earlier, has not been edited by BRN - unless the "Edited" notice can be bypassed, or the onus is on the employer to flag it as such.

If the edit flag is LinkedIn-generated rather than employer-generated, then it's plausible that someone else doctored the original ad as a screenshot and posted it themselves - in poor taste, in the end.


Yes, LinkedIn does indicate when a job posting has been edited. While it doesn't show the specific changes made, it will display an "Edited" label next to the job posting, according to LinkedIn's help pages. This helps users understand that the job details have been updated since the initial posting.
 
  • Like
  • Love
Reactions: 4 users

Frangipani

Top 20


Edge AI Milan 2025


Join BrainChip for Edge AI Milan 2025, July 2-4, an inspiring and forward-thinking event exploring how edge AI is bridging the gap between digital intelligence and real-world impact. Attendees can engage with industry leaders and experience the latest innovations, including BrainChip’s Akida neuromorphic, event-driven AI at the edge.

Register



BrainChip will be exhibiting at Edge AI Milan 2025 next week.
It’s a pity, though, that no one from our company will be giving a presentation at that conference, especially since Innatera will be spruiking their technology.

In addition, Innatera’s Petruț Antoniu Bogdan will give a workshop “on the current state of neuromorphic computing technologies and their potential to transform Edge AI”, in his capacity as co-chair of the Edge AI Foundation Working Group on Neuromorphic Computing, to which BrainChip also belongs.
Our CMO Steve Brightfield is co-chair of another Edge AI Foundation Working Group, namely Marketing.


View attachment 87744





View attachment 87745 View attachment 87746



View attachment 87747

Pete Bernard, CEO of the Edge AI Foundation, on - you guessed it - Edge AI and the upcoming Edge AI Milan (🏆☕😊), in a recorded message to the Hackster.io community that was played during their 26 June “Impact Spotlights: Edge AI” livestream, hosted by Edge Impulse staff who said they were not wearing their Edge Impulse hats for this event, but those of members of the Hackster community of developers. (In case you were wondering: none of the three projects introduced during the livestream used Akida, as far as I could tell by fast-forwarding.)










3B1352B0-262B-40D9-A457-F8BC8B06FBAE.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 11 users
Waiting for the day that never comes..

 
  • Haha
  • Fire
  • Like
Reactions: 4 users

Frangipani

Top 20
Hi everyone, including StockHound81 over on HC, who I trust will be reading this:

The image @Humble Genius posted with the added line “Existing members of Tsex or HC forums an advantage” is not a screenshot of the original LinkedIn job ad by BrainChip. I assume it was his idea of a joke, given the real LinkedIn ad does say “Monitor and listen to industry social account and groups for key topics”.

I know for sure it wasn’t the original, as I happened to be the one who had posted the original (without the added TSE/HC line!) shortly before @Humble Genius posted the altered version, which he should have marked as parody to avoid all the confusion. As @jrp173 correctly noted, it is in a different font from LinkedIn posts and was thus discernible as a joke to me personally, having just posted a screenshot of the real ad myself and thus being doubly aware of the inauthenticity of the added line. I also agree that it is highly unlikely that BrainChip would include those forum abbreviations in a job ad for a US-based social media intern who is unlikely to have ever heard of them.

That said, there has indeed been a bit of editing/deleting going on with BrainChip’s LinkedIn posts in recent months, and in addition, the AGM webcast never got uploaded to YouTube (and we all know why…), so it should not come as a surprise when some people’s suspicions get aroused.

Here is the proof (I needed to take two screenshots, as the ad was too long for one - the second screenshot starts below the lines I had marked in green):

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465055

View attachment 88008

View attachment 88009


Meanwhile, the contested post by @Humble Genius has apparently been deleted. It was between mine and that of @smoothsailing18 (to whom it was addressed), and @smoothsailing18’s reaction is still visible.

View attachment 88012


Let me make this clear: While the image @Humble Genius posted was not a screenshot of the original LinkedIn job ad, StockHound81 did not alter or forge that screenshot in any way, if it is the same as the one shown in @jrp173’s post (since I don’t have an HC account, I can read the texts, but the images/screenshots posted are mostly blurry to me). It was definitely posted here on TSE on 7 June 2025 and was not marked as a parody post. I can vouch for that.

So yes, StockHound81 may in this case be “guilty” of gullibility, but those posters who have accused him of forgery or criminal behaviour, apparently claiming he was a paid downramper who wanted to harm the company, owe him an apology, if you ask me…

Anyway, it would be best if @Humble Genius could clear up the confusion himself.


Since the post in question was either deleted by the poster himself or got moderated, which makes the situation even more confusing (why not just have it edited and clarify it was a parody post?!), here is further evidence that it did indeed exist:


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465078

07BC00CE-6E9C-49BE-83A9-0007567A1DA8.jpeg


When you click on the tagged post, you’ll get this message:

6B6D6943-9B7E-4191-9659-73B5067D5B9C.jpeg



https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465178

B24469BD-94D3-4A46-B175-F1701B78D171.jpeg
 
  • Like
  • Fire
Reactions: 5 users

Frangipani

Top 20
Evening Frangipani,

One has a favour to ask....

(…)

The only reason I ask is that I've seen your FORENSIC ability to track posts (day, date, time) utilising a specific program thingy, well and truly above and beyond my expertise.

Good Evening Fullmoonfever,

BINGO, cheers mate.

Just need Frangipani to run it through her special forensic app / site, which should reveal whether the original has been diddled or not.

Regards,
Esq.

I don't recall Humble posting a link myself, so maybe they could share their source?

I can't seem to locate the original post either, so maybe it was deleted or I'm using the wrong keywords?

I just tried the link on the Wayback Machine and it hasn't archived anything, unfortunately.

Hi Esq.

The “forensic tool” you’re alluding to is exactly what @Fullmoonfever tried: using the Wayback Machine to search for archived webpages and then comparing an earlier version with a later one to look for differences.

No secret software or hacking involved! 😊

But of course not all webpages get archived, and those that do are not necessarily webcrawled at regular intervals, so a search via the Wayback Machine may not be of much help in a specific case.

As long as someone knows how to copy and paste (or simply retype) a URL (aka web address), they can equally become a time-travel detective aboard the Wayback Machine… 🕵️‍♀️ 🕵️‍♂️ Give it a try, it’s easy-peasy!
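
For anyone who would rather script the check than use the web form, the Internet Archive also exposes a public “availability” endpoint that returns the archived snapshot closest to a given date. A minimal Python sketch (the job-ad URL below is a placeholder, not the real LinkedIn link):

```python
# Minimal sketch: ask the Wayback Machine's public "availability" API for
# the snapshot closest to a given date -- the same data the web interface
# shows, no special software required.
import json
import urllib.parse
import urllib.request

def closest_snapshot(url, timestamp="20250607"):
    """Return (snapshot_url, snapshot_timestamp), or None if never archived."""
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    with urllib.request.urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"], closest["timestamp"]
    return None

# Placeholder URL -- substitute the real LinkedIn job-ad link:
print(closest_snapshot("https://www.linkedin.com/jobs/view/0000000000"))
```

If it returns None, as @Fullmoonfever found for this ad, the page was simply never crawled.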




9306BA1A-68A3-4FFE-A92F-D5ECF2A4EC74.jpeg



 
  • Like
  • Love
Reactions: 4 users

itsol4605

Regular
Since the post in question was either deleted by the poster himself or got moderated, which makes the situation even more confusing (why not just have it edited and clarify it was a parody post?!), here is further evidence that it did indeed exist:


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465078

View attachment 88017

When you click on the tagged post, you’ll get this message:

View attachment 88018


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465178

View attachment 88019
From my observations and his posts, StockHound81 seems to be a complicated character.

When the modified job advertisement was posted, it was pretty clear that it was a joke – and others recognized it as a joke as well.
It's unclear what StockHound81's intention is in repeatedly spreading rumors.

StockHound81 claims to be an investor himself – unfortunately, this cannot be proven.

In any case, StockHound81 enjoys the attention some people give him.

Everyone has the opportunity to use the ignore list.
I've taken that opportunity and am very happy with it.
 
  • Like
  • Fire
Reactions: 2 users

Frangipani

Top 20

View attachment 88015



936CCB51-669A-4C4E-B143-BD28ED0E5EBA.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 9 users

Frangipani

Top 20
Speaking of interns:

Ghazaleh Ostovar recently started her summer internship with our Laguna Hills office (working remotely). Unlike other summer interns, who are usually still enrolled at uni, she has a PhD and years of work experience “in mathematical modeling, numerical simulation, and machine learning for biological systems”.

Given her background in health/medical physics and her stated interest in “applying data science and ML to problems in biotech and healthcare”, she presumably applied for Project 6, which was slated to have a team size of 4-5?


View attachment 87721


View attachment 87722




View attachment 87723





7FFCFB33-D8C6-453F-9DAD-B418E81F1E60.jpeg
 
  • Like
  • Thinking
  • Fire
Reactions: 6 users

Frangipani

Top 20

View attachment 88025

Looks as if the LinkedIn profile of one of our new interns, Mohamed Benomar El Kati, contains some info that hasn’t yet been posted:

It appears the official name of our FPGA-based hardware architecture that supports TENNs models is going to be Aether Core.


C33CD864-F012-4675-B145-5FBD8B97BE34.jpeg



AF805D2E-80E7-458A-9E64-F4D45FA9A43E.jpeg





0BB98D40-5B20-4D4E-AA7A-46561D0B838E.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Frangipani

Top 20

View attachment 88025
Had a quick look at those interns, whose LinkedIn profiles were tagged:

We even have at least two “repeat offenders” 😉 - Daniel Endraws (a 2022 BrainChip summer intern alumnus) and Yueqi Zhao, who has returned as a research intern for the third consecutive year! Wow, that must be a win-win situation.

What I find striking about this summer’s cohort is the number of interns who are either PhD students or already have a PhD…



59368C11-2FB4-4BEC-89CB-40A8D5E21C55.jpeg




DE0DFA1B-B829-4DAD-9C5C-BA4C94F618A2.jpeg




B87FC5D9-0C0A-48CA-892B-379F204A9971.jpeg


D0BDF3B1-1DF6-49E5-975E-36D2773CC101.jpeg




0EA037DE-8BCB-491A-9495-43FB7587B4AC.jpeg

E0731E3C-5D16-4EDD-B735-DA0EBDB8B736.jpeg




6DF82F94-5623-4DB0-A52C-6B8BE5F90F94.jpeg


2F7D4B72-A54E-4303-A949-893DB6C0DB89.jpeg




6C9A47E2-FA0D-4ACC-B996-D38F92CDF123.jpeg

8A220C9C-939A-4824-8CBF-567CC674B466.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 9 users