BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!

Here's an article published by NeuromorphicCore.ai, which is a niche, community-driven news/insights site focused on neuromorphic computing and related fields. It offers articles and technical overviews, sometimes written by enthusiasts or aggregated from across the space.

The article was published back in March. Its claims about Intel’s alleged pivot back to neuromorphic computing appear largely speculative, since they are not backed by direct quotes or roadmaps from Intel, but time will tell whether this reasoned hypothesis proves accurate.


Intel’s New CEO Lip-Bu Tan: A Neuromorphic Future Fueled by His Cadence Turnaround and a $25 Million Stock Bet​

zoey2002
NeuromorphicCore, March 18, 2025
0 Comments



On March 12, 2025, Intel Corporation (INTC) named Lip-Bu Tan as its new Chief Executive Officer, effective March 18, ushering in a critical juncture for the embattled chipmaker. Tan, a semiconductor luminary with an unmatched legacy at Cadence Design Systems (CDNS) from 2009 to 2021, recently underscored his commitment by purchasing $25 million in Intel stock. His expertise and bold vision, paired with Intel’s pioneering work in neuromorphic computing—embodied in Loihi 2, the Lava software framework, and the Hala Point system—could position the company to leapfrog competitors in the AI era.
A Proven Turnaround Artist from Cadence: The Proof is in the Pudding
In 2009, Lip-Bu Tan took the helm of Cadence Design Systems, then a struggling player in the electronic design automation (EDA) market. Over the next 12 years, he orchestrated a remarkable turnaround, enhancing shareholder returns, doubling top-line growth and driving a 3,200% increase in the company’s stock price, a turnaround comparable to the one Lisa Su led at Advanced Micro Devices (AMD). His approach was multifaceted: a relentless focus on customer-driven innovation, strategic partnerships with industry giants like Taiwan Semiconductor Manufacturing Company (TSMC), and a keen eye for emerging technological trends. By enhancing Cadence’s tools to meet the demands of cutting-edge chip designs, he positioned the company as an indispensable partner for fabless giants like Apple and Broadcom. This proven ability to transform a company through strategic vision and execution could be instrumental in Intel’s resurgence, with neuromorphic computing potentially taking center stage.

Neuromorphic Computing: Loihi 2, Lava, and Hala Point as Tan’s Launchpad​

Intel has established a strong foothold in neuromorphic computing, a field that aims to replicate the brain’s neural structure. The Loihi 2 chip, introduced in 2021, marks a significant advance over its predecessor, providing eight times the neuron capacity (up to one million neurons per chip) and improved energy efficiency for real-time, brain-like processing. Intel’s open-source software framework, Lava, complements this hardware by simplifying programming for neuromorphic systems, enabling developers to tackle AI applications such as adaptive robotics and sensory processing.
Intel’s crown jewel is Hala Point, a neuromorphic supercomputer launched in 2024 with 1.15 billion neurons across 1,152 Loihi 2 chips. Hala Point performs 20 quadrillion operations per second, making it the world’s largest system of its kind, designed for sustainable AI and scientific research. These innovations give Intel a unique advantage in a market dominated by power-hungry GPUs, a foundation Tan could build on, leveraging his Cadence expertise in advanced design and ecosystem integration.
Tan’s focus at Cadence on supporting complex, next-generation architectures matches neuromorphic demands to a tee. In contrast to traditional processors, Loihi 2 and Hala Point excel at sparse, event-driven computation, reducing energy consumption and enabling dynamic learning, which makes them ideal for edge AI, autonomous systems, and green computing, to name a few examples. With Lava simplifying development, Tan could accelerate adoption, much as he expanded Cadence’s influence through strategic tools and partnerships.
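To make the “sparse, event-driven computation” point concrete, here is a minimal illustrative sketch of a leaky integrate-and-fire layer in plain NumPy. It is deliberately not the Lava API or anything Loihi-specific; all sizes, constants and spike rates are arbitrary and chosen only to show why work is only done when events (spikes) actually arrive.

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) layer illustrating sparse, event-driven
# computation. Plain NumPy for illustration only; not Lava and not Loihi 2.
rng = np.random.default_rng(0)
n_in, n_out, t_steps = 64, 16, 100

weights = rng.normal(0.0, 0.5, size=(n_out, n_in))
v = np.zeros(n_out)               # membrane potentials
v_threshold, v_decay = 1.0, 0.9

for t in range(t_steps):
    # Sparse input: most timesteps carry only a handful of spikes (events).
    in_spikes = rng.random(n_in) < 0.02       # roughly 2% of inputs fire
    active = np.flatnonzero(in_spikes)

    if active.size == 0:
        v *= v_decay                           # nothing fired: just leak, no compute
        continue

    # Event-driven update: only the weight columns of inputs that actually
    # fired contribute, instead of a full dense matrix multiply every step.
    v = v * v_decay + weights[:, active].sum(axis=1)

    out_spikes = v >= v_threshold              # neurons crossing threshold emit events
    v[out_spikes] = 0.0                        # reset fired neurons

    if out_spikes.any():
        print(f"t={t:3d}: {out_spikes.sum()} output spikes")
```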

The $25 Million Vote of Confidence​

Tan’s $25 million Intel stock purchase within his first 30 days as CEO, coupled with a $66 million compensation package, demonstrates his strong belief in Intel’s potential. For a company that posted a $19 billion loss in 2024, its first since 1986, this personal investment echoes his Cadence strategy of aligning leadership with results. It also suggests ambitious plans, potentially focused on expanding neuromorphic technologies like Loihi 2 and Hala Point to restore Intel’s position as an innovation leader.

Challenges and Opportunities Ahead​

Intel faces challenges, including an AI strategy behind Nvidia’s, a foundry business behind TSMC’s, a stock price weighed down by skepticism, and internal cultural hurdles. However, Tan’s Cadence success, characterized by streamlined operations, minimized capital expenditure (CAPEX), and investment in high-growth areas, provides a potential path forward. His 3,200% stock price increase at Cadence was not accidental but clearly the result of execution. In addition, his time on Intel’s board until August 2024 gave him insight into the company’s internal issues, which he reportedly addressed. Now, as CEO, he is ready to make the difficult choices needed to refocus the company.
Neuromorphic computing could be a vital area for Intel. While Nvidia leads AI with GPUs, Loihi 2’s efficiency and Hala Point’s scale offer a competitive advantage, particularly for energy-efficient applications. Lava’s open-source model has the potential to build a strong developer ecosystem, much like Cadence’s tools attracted chip designers. Tan’s ties to TSMC and his foundry expertise could also refine Intel’s IDM 2.0 strategy. IDM 2.0 is Intel’s hybrid manufacturing model, under which the company uses both its own advanced fabrication facilities and external foundries like TSMC, giving it greater manufacturing flexibility and enabling it to produce a wider range of chips. By leveraging his experience to tailor IDM 2.0 to the unique requirements of neuromorphic computing, Tan could attract AI innovators seeking cutting-edge hardware.

The Rising Market for Neuromorphic Computing​

The market potential for neuromorphic computing could significantly fuel Tan’s interest. The global neuromorphic computing market is projected to reach $6.7 billion in 2024 and grow to $55.6 billion by 2033, a compound annual growth rate of 26.4%. Analysts also project the neuromorphic chip market, valued at $0.5 billion in 2023, to reach $10 billion by 2030, a compound annual growth rate exceeding 50%. This rapid growth is driven by the demand for energy-efficient AI in edge devices, autonomous vehicles, and IoT systems, areas where traditional architectures struggle. According to Steve Furber, a key figure in neuromorphic computing and co-designer of the ARM processor, the field is at a “critical juncture” needing a “killer app” to demonstrate its potential. Intel’s early lead with Loihi 2 and Hala Point positions it to capture a significant share, especially if Tan accelerates commercialization. His Cadence experience scaling EDA tools for emerging technologies suggests he could turn this niche into a mainstream revenue driver. Furthermore, Tan’s TSMC relationships and foundry knowledge may also help him adapt Intel’s IDM 2.0 strategy, described above, to the specific needs of neuromorphic technology and attract AI innovators.

A New Chapter for Intel​

Lip-Bu Tan’s ascent to Intel’s helm on March 18, 2025, backed by his $25 million stake, signifies a defining moment. His Cadence triumph, turning a $4 stock into a $132 powerhouse for a 3,200% gain, highlighted his ability to turn vision into value. Intel’s neuromorphic trio of Loihi 2, Lava, and Hala Point provides the tools to do it again. As he navigates Intel’s challenges, Tan’s knack for bridging design and deployment could make brain-inspired computing the key to Intel’s renaissance, redefining its role in an AI-driven world. According to Steve Furber, co-designer of the ARM processor, neuromorphic computing is at a critical juncture and needs a killer app to demonstrate its potential. With Tan’s proven track record and Intel’s advanced neuromorphic technologies, the company is well positioned to deliver that application, potentially transforming the landscape of energy-efficient AI.

 
  • Like
  • Thinking
  • Love
Reactions: 17 users
Intel’s neuromorphic trio, Loihi 2, Lava, and Hala Point, provides the tools to do it again.

Why aren't we getting mentioned?
 
  • Like
  • Thinking
Reactions: 6 users

7für7

Top 20
  • Haha
Reactions: 1 users

genyl

Member
Intel’s neuromorphic trio, Loihi 2, Lava, and Hala Point, provides the tools to do it again.

Why aren't we getting mentioned?
Because the big fish are taking over as usual. Brainchip are probably getting crushed by the big money/companies, which could be why we haven't heard anything in a long time. Still haven't lost hope completely, but I'm getting close.
 
  • Like
  • Fire
Reactions: 4 users

Yoda

Regular
Because the big fish are taking over as usual. Brainchip are probably getting crushed by the big money/companies, which could be why we haven't heard anything in a long time. Still haven't lost hope completely, but I'm getting close.
If our technology is so good though, surely one of them is interested in a takeover if not a licence?
 
  • Like
  • Love
Reactions: 8 users
Parasites everywhere
 
  • Like
Reactions: 1 users

Guzzi62

Regular
Because the big fish are taking over as usual. Brainchip are probably getting crushed by the big money/companies, which could be why we haven't heard anything in a long time. Still haven't lost hope completely, but I'm getting close.
If our technology is so good though, surely one of them is interested in a takeover if not a licence?
Yes, agreed, some of the big players might ignore us, but they will only do that until one of them says: heck, this is really something we could use and earn a ton of money in the process. They will surely go for the best technology they can get!
NDAs are in play and some of them are (hopefully) playing around with BRN technology, like Akida2 and TENNs.
I don't know how far into the process the NDA extends; the customer might say to BRN: this is hush-hush until product launch, or they might even say: this is a secret sauce we don't want anybody to know about.
If it's the latter, we will first find out when the money starts to flow in! In a way, that could be really cool: we could see a rapid SP increase that costs the short sellers a lot of money to get out of, LOL.

Surely some announcement will happen this year, but it has been awfully quiet since the AGM.

Since all announcements about a contract or IP deal so far have been for Akida 1000/1500, I would really like to see one utilizing Akida2! Come on, TATA, what are you waiting for?
 
  • Like
  • Fire
  • Haha
Reactions: 7 users

manny100

Top 20
With a total neuromorphic market of $28 mill in 2024, I doubt anyone would be interested in acquiring BRN and/or its tech until we get a little closer to the projected total market of $822 mill by 2029 (with an estimate of $8,352 mill by 2034).
The total market of $28 mill in 2024 is a reality check on our expectations.
Actually, if we get the $9 mill revenue as stated at the AGM, we will be doing pretty well.
 
  • Like
Reactions: 5 users

Mccabe84

Regular
Because the big fish are taking over as usual. Brainchip are probably getting crushed by the big money/companies which could be why we havent heard anything in a long time. Still havent lost hope completely but im getting close
I'm out if there's another cap raise
 
  • Like
  • Thinking
Reactions: 5 users

Frangipani

Top 20
First BRN press release I am aware of that was written by someone from Bospar Communications, the Public Relations & Marketing Agency that was recently hired:


Launch of BrainChip Developer Hub Accelerates Event-Based AI Innovation on Akida™ Platform with Release of MetaTF 2.13​

NEWS PROVIDED BY
Brainchip
June 19, 2025, 06:21 GMT



BrainChip announces new Developer Hub and MetaTF toolkit, enabling seamless development and deployment of machine learning models on its Akida™ platform.

LAGUNA HILLS, CA, UNITED STATES, June 19, 2025 /EINPresswire.com/ --

BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based brain-inspired AI, today announced the release of MetaTF 2.13 on its newly launched Developer Hub, a comprehensive portal designed to accelerate AI development on the Akida™ platform. The BrainChip Developer Hub serves as a centralized resource for developers building intelligent edge applications, providing access to tools, pre-trained models, technical documentation, and the company’s MetaTF toolkit. MetaTF 2.13 features seamless conversion, quantization, and deployment of machine learning models on Akida. It is compatible with leading frameworks including Keras and ONNX with support for Jupyter Notebooks, enabling rapid prototyping and optimization.
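For readers who want a feel for what that workflow looks like in practice, here is a rough, hedged sketch of the Keras-to-Akida path the release describes (float model, quantization with QuantizeML, conversion with CNN2SNN, then inference). Treat it as an outline only: exact import paths, function arguments, calibration steps and defaults should be checked against the MetaTF 2.13 documentation on the Developer Hub, as this has not been verified against the toolkit.

```python
import numpy as np
import tensorflow as tf
from quantizeml.models import quantize   # quantization step (check args against MetaTF docs)
from cnn2snn import convert              # quantized Keras/ONNX -> Akida conversion

# 1. A tiny stand-in Keras model; any real application network goes here.
float_model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# 2. Quantize weights/activations so the network fits Akida's integer pipeline
#    (default parameters assumed here; real use would pass calibration samples).
quantized_model = quantize(float_model)

# 3. Convert the quantized network into an Akida model. This runs on the Akida
#    simulator by default; mapping to a physical device is a separate step.
akida_model = convert(quantized_model)
akida_model.summary()

# 4. Inference takes integer (uint8) inputs.
dummy = np.random.randint(0, 255, size=(1, 32, 32, 3), dtype=np.uint8)
outputs = akida_model.predict(dummy)
print(outputs.shape)
```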

“We created the Developer Hub to streamline the experience for edge AI developers and give them the tools to move from concept to deployment quickly,” said Sean Hehir, CEO of BrainChip. “With our Akida processor, highly intuitive software stack, and world-class models, we’re delivering solutions that are both high-performing and energy-efficient.”

As part of this launch, BrainChip introduced two high-efficiency models optimized for edge performance. The eye-tracking model is ideal for smart glasses and wearable devices, delivering over 99% accuracy. Built on BrainChip’s proprietary Temporal Event-based Neural Networks (TENNs), it offers real-time gaze detection while dramatically reducing power consumption by processing only motion-relevant data.

The gesture recognition model is designed for embedded applications in consumer electronics, robotics, and IoT and achieves 97% accuracy. By leveraging Akida’s event-based processing and high-speed vision sensors from event-based cameras, it enables ultra-low latency gesture interfaces without sacrificing precision.

These models demonstrate the power of Akida’s event-based architecture across a wide array of real-world applications including autonomous vehicles, industrial automation, AR/VR and spatial computing, smart environments and IoT, and security and surveillance.

BrainChip’s new Developer Hub and AI models underscore the company’s commitment to making edge AI more accessible and scalable. With Akida, developers can build responsive, privacy-aware applications that operate at ultra-low power—ideal for battery-constrained and latency-sensitive environments.

Developers can access the models and tools today by visiting: https://developer.brainchip.com

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)
BrainChip is the global leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain. By analyzing only the essential sensor inputs at the point of acquisition, Akida delivers data processing with unmatched efficiency, precision, and energy savings. Integrated into SoCs on any digital process technology, Akida Neural Processor IP has demonstrated significant advantages across today's workloads and networks. It provides a platform for developers to build, fine-tune, and run their models using standard AI tools such as TensorFlow and Keras.

BrainChip’s Temporal Event-based Neural Networks (TENNs) build on state space models (SSMs) by introducing a time-sensitive, event-driven processing framework that enhances efficiency and makes them ideal for real-time, streaming Edge applications. By enabling efficient computation with optimized models and hardware execution, BrainChip makes real-time streaming Edge AI universally deployable across industries such as aerospace, autonomous vehicles, robotics, mobile, consumer electronics, and wearable technology. BrainChip is leading the way toward a future where ultra-low power, on-chip AI near the sensor not only transforms products but also benefits the planet. Learn more at www.brainchip.com.

Follow BrainChip on Twitter: @BrainChip_inc
Follow BrainChip on LinkedIn: BrainChip LinkedIn

Madeline Coe
Bospar Communications
+1 224-433-9056

maddie@bospar.com

Legal Disclaimer:
EIN Presswire provides this news content "as is" without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the author above.
 
  • Like
  • Fire
  • Love
Reactions: 44 users
I'm out if there's another cap raise

I don’t think we’ve finished raising funds from the last one? And at the current SP it’s probably going to fall way short of what the company hoped for, I guess. So another raise will be due, with absolutely no sign of any decent announcement as we all hoped for.
 
  • Like
  • Sad
  • Wow
Reactions: 9 users

Frangipani

Top 20
On a brighter note, BrainChip will be attending four large-scale events later this month:

View attachment 86433

We’d already found out about three of the four events prior to the newsletter release. Here is some more info about the fourth one, the Living Planet Symposium in Vienna, organised by ESA.

Douglas McLelland and Gilles Bézard will be representing BrainChip with a talk titled “Event-driven computation and sparse neural network activity deliver low power AI” as part of the session Orbital Intelligence for Earth Observation applications: The edge of AI In Space, chaired by Gabriele Meoni from ESA, who has first-hand experience of AKD1000 (cf. https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-462334).


View attachment 86435

View attachment 86434 View attachment 86436 View attachment 86437 View attachment 86438 View attachment 86439 View attachment 86440

We’d already found out that Douglas McLelland and Gilles Bézard will be representing BrainChip at the upcoming Living Planet Symposium in Vienna (organised by ESA) thanks to a reference to that conference in the latest BRN newsletter (see my 8 June post for more details 👆🏻).

That event in Austria’s capital now also shows up on our website under “What’s New”:


(screenshots of the “What’s New” listing attached)
 
  • Like
  • Love
  • Fire
Reactions: 25 users

DK6161

Regular
You were led to believe in the company?

By whom? (Be specific.)

When you invest in a company, generally you do some kind of research on the company before you invest in it.

Did YOU do this? Then, when YOU decided to throw money at it, did YOU continue to do research and keep updated on what's going on?

You say you No Longer believe that the company is going to succeed. So obviously YOU have seen something that has brought YOU to that conclusion.

Notice how I emphasised the word YOU there. That implies that it is YOU who made the decision to firstly research what the company does, then YOU decided to invest, then YOU continued to research and follow the progress of the company and now YOU find yourself no longer believing in what the company is trying to achieve for reasons XYZ.

To be honest with you if I were in your shoes I know exactly what I would do next. I'm not going to tell you what that is because that is a decision that ONLY YOU can make.

For the record, we all make bad decisions, and anyone who says otherwise is flat out lying. I've sold shares in 2 companies this year: 1 because I bought hype and got my ass handed to me, and the other, funnily enough, I should have held on to, as it has now done a 3x from my original buy price. I sold it at 150% profit.

I'm still invested in Brainchip because I still firmly believe that they will be successful. That is MY BELIEF. Taking a bit longer than I expected but hey, I can't control t…

We all know who the number 1 fan boy is that made the company look like a "no brainer" to invest in.
1. We all know the no 1 fan boy that made the company look like it was about to go to the moon.
2. Look at previous AGMs where the CEO said there was going to be an explosion of revenue.

But yes, I made the decision to invest based on the hype.
Thanks for pointing that out.
 
  • Like
  • Sad
  • Fire
Reactions: 7 users

DK6161

Regular
  • Like
  • Haha
  • Fire
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Back in January, after listening to a podcast featuring Steven Brightfield, I mentioned that I had the impression he may have been hinting at Meta as one of the companies involved in discussions around smart glasses (as above).

So, it’s with great interest that I’ve seen reports today about a new collaboration between Meta and Oakley on a sports-focused line of smart glasses, with an official announcement reportedly scheduled for Friday, 20 June.

The partnership builds on the existing relationship between Oakley and Ray-Ban under their parent company, EssilorLuxottica, which already works closely with Meta on the Ray-Ban Meta smart glasses.

I’ll definitely be tuning into this announcement to learn more. Early reports suggest the glasses are being optimised for sports and athletic use, with a focus on hands-free video recording capabilities. The announcement will also likely reveal specific features, pricing, and availability for the glasses.

Over the past few days, I’ve noticed a recurring theme across industry commentary - battery life continues to be a major challenge for smart glasses, particularly when video capture is involved, which significantly increases power consumption.

Thanks to our collaboration with Onsor, we know BrainChip’s Akida technology can support all-day battery life without the need for external battery packs or cloud connectivity. That’s a capability that most players in this space are still chasing.

If Meta plans to push harder into high-performance, video-centric smart glasses, it seems likely they would need a low-power solution like ours to get them across the line. Edit: As pointed out to me by @Diogenese, monitoring for early signs of a seizure, as in the Onsor glasses, is largely a passive function, similar to wake-word detection. In contrast, continuous video processing and classification is a significantly more active workload for any processor. So, while Akida with TENNs could theoretically help extend battery life, we’d need to wait for real-world performance data before drawing conclusions about its impact in such demanding use cases.

And - whether it’s meaningful or not - we’ve seen a few likes from Meta employees on LinkedIn.

Oh, and then there's also the small fact that Sean recently confirmed that the company that is manufacturing the glasses for Onsor is the same one doing them for Meta, EssilorLuxottica.

So, I welcome you to draw your own conclusions.



Video start time: 14:26, when Sean mentions the Onsor frames are made by EssilorLuxottica, who also make the Meta frames.










View attachment 87214




An EXTRACT from this evening's press release.


(screenshot of the press release extract attached)








Hopefully...🤞

(two screenshots attached)
 
  • Like
  • Fire
  • Thinking
Reactions: 34 users

manny100

Top 20
It's interesting how AKIDA and TENNs co-exist. Bravo's post above concerning Onsor's epilepsy detectors is a good example: AKIDA, the hardware, provides the signals, and TENNs (with perhaps software designed by Onsor?) does the thinking and tells AKIDA, which provides the response, e.g. a warning of a seizure within the hour.
They are complementary. TENNs are the brains, and Akida is the body.
Another example: if I am hiking in bush I have never been to before, I could use a drone for forward navigation purposes.
If for some reason I wanted to communicate with the drone via hand signals (e.g. a raised arm means return), a camera or other suitable device would pick up my arm motions.
AKIDA provides the cloud-free, event-based, low-power, on-chip learning platform.
TENNs do the 'thinking'.
Akida collects and pre-processes that event data in real time (say, my arm moves upward).
It then runs the TENNs model internally to interpret what that movement means (maybe it's my gesture for “come back and hover”).
Then, Akida enables the appropriate response: the drone returns and hovers.
It's all in real time on low power. Traditional edge AI would run out of power in no time.
The above is, very basically, how AKIDA and TENNs run together.
The key is that AKIDA is the hardware, and it needs software, e.g. TENNs, to do the 'thinking'.
Easy to see why tiny PICO and TENNs are a good pair.
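As a thought experiment only, the loop below sketches that division of labour in Python-style pseudocode. Every name in it (read_event_frame, classify, execute, the gesture labels) is hypothetical and invented purely to illustrate the post above; it is not BrainChip's, Onsor's or any drone vendor's API, just the shape of the event-driven loop.

```python
# Hypothetical sketch: event data comes in, a TENNs gesture model running on
# Akida classifies it, and the drone reacts. All names here are made up.

GESTURE_ACTIONS = {
    "arm_raised": "return_and_hover",
    "arm_swept_left": "bank_left",
    "no_gesture": "continue_course",
}

def read_event_frame(sensor):
    """Hypothetical: pull the latest batch of events (x, y, t, polarity) from an event camera."""
    return sensor.poll_events()

def run_gesture_pipeline(sensor, akida_tenns_model, drone):
    while drone.is_flying():
        events = read_event_frame(sensor)
        if not events:            # nothing moved, so nothing to compute (event-driven)
            continue

        # The TENNs model (mapped onto Akida) consumes the raw event stream and
        # returns a gesture label; this is the "thinking" step from the post.
        label = akida_tenns_model.classify(events)

        action = GESTURE_ACTIONS.get(label, "continue_course")
        drone.execute(action)     # e.g. "return_and_hover" when an arm is raised
```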
 
  • Like
  • Love
  • Fire
Reactions: 15 users

FiveBucks

Regular
I'm out if there's another cap raise
It is inevitable.

They said at the AGM they are aiming for $9 mill revenue this year. Our cash burn is significantly higher.
 
  • Like
  • Sad
  • Fire
Reactions: 11 users
New GitHub update 20 hrs ago on Akida/CNN2SNN, including a TENNs release, modules and models etc., by the looks of it.

@Diogenese will probably know if there's anything unusual or new tucked in there.



20 hours ago
ktsiknos-brainchip
Tag 2.13.0-doc-1, commit d8435c2

Upgrade to Quantizeml 0.16.0, Akida/CNN2SNN 2.13.0 and Akida models 1.7.0 (Latest)


Update QuantizeML to version 0.16.0

New features​

  • Added a bunch of sanitizing steps targeting native hardware compatibility:
    • Handle first convolution that cannot be a split layer
    • Added support for "Add > ReLU > GAP" pattern
    • Added identity layers when no merge layers are present after skip connections
    • BatchNormalisation layers are now properly folded in ConvTranspose nodes
    • Added identity layers to enforce layers to have 2 outbounds only
    • Handled Concatenate node with a duplicated input
  • Added support for TENNs ONNX models, which include sanitizing, converting to inference mode and quantizing
  • Set explicit ONNXScript requirement to 0.2.5 to prevent later versions that use numpy 2.x

Bug fixes​

  • Fixed an issue where calling sanitize twice (or sanitize then quantize) would lead to invalid ONNX graphs
  • Fixed an issue where sanitizing could lead to invalid shapes for ONNX Matmul/GEMM quantization

Update Akida and CNN2SNN to version 2.13.0

Aligned with FPGA-1679(2-nodes)/1678(6-nodes)​

New features​

  • [cnn2snn] Updated requirement to QuantizeML 0.16.0
  • [cnn2snn] Added support for ONNX QuantizedBufferTempConv and QuantizedDepthwiseBufferTempConv conversion to Akida
  • [akida] Full support for TNP-B in hardware, including partial reconfiguration with a constraint that TNP-B cannot be the first layer of a pass
  • [akida] Full support of Concatenate layers in hardware, feature set aligned on Add layers
  • [akida] Prevented the mapping of models with both TNP-B and skip connections
  • [akida] Renamed akida.NP.Mapping to akida.NP.Component
  • [akida] Improved model summary for skip connections and TNP-B layers. The summary now shows the number of required SkipDMA channels and the number of components by type.
  • [akida] Updated mapping details retrieval: model summary now contains information on external memory used. For that purpose, some C++/Python binding was updated and cleaned. The NP objects in the API have external members for memory.
  • [akida] Renamed existing virtual devices and added SixNodesIPv2 and TwoNodesIPv2 devices
  • [akida] Introduced create_device helper to build custom virtual devices
  • [akida] Mesh now needs an IP version to be built
  • [akida] Simplified model statistics API and enriched with inference and program clocks when available
  • [akida] Dropped the deprecated evaluate_sparsity tool

Update Akida models to 1.7.0

  • Updated QuantizeML dependency to 0.16.0 and CNN2SNN to 2.13.0
  • Sparsity tool name updated. Now returns python objects instead of simply displaying data and support models with skip connections
  • Introduced tenn_spatiotemporal submodule that contains model definition and training pipelines for DVS128, EyeTracking and Jester TENNs models
  • Added creation and training/evaluation CLI entry points for TENNs

Introducing TENNs modules 0.1.0

  • First release of the package that aims at providing modules for BrainChip TENNs
  • Contains blocks of layers for model definition: SpatialBlock, TemporalBlock, SpatioTemporalBlock that come with compatibility checks and custom padding for Akida
  • The TemporalBlock can optionally be defined as a PleiadesLayer following https://arxiv.org/abs/2405.12179
  • An export_to_onnx helper is provided for convenience

Documentation update

  • Added documentation for TENNs APIs, including tenns_modules package
  • Introduced two spatiotemporal TENNs tutorials
  • Updated model zoo page with mAP50, removed 'example' column and added TENNs
  • Added automatic checks for broken external links and fixed a few
  • Cosmetic changes: updated main logo and copyright to 2025
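To give a flavour of what the BufferTempConv / spatiotemporal TENNs layers mentioned in these notes are doing conceptually, here is a small, self-contained NumPy sketch of a streaming (buffered) temporal convolution: a FIFO of the last few feature frames, with one output produced per incoming frame. The shapes, kernel size and weights are arbitrary, and this is not BrainChip's implementation, just the general idea of buffered temporal filtering for streaming edge inputs.

```python
import numpy as np
from collections import deque

# Conceptual sketch of a streaming ("buffered") temporal convolution: keep a
# FIFO of the last kernel_size feature frames and emit one output per frame.
channels, kernel_size = 8, 5
rng = np.random.default_rng(0)
kernel = rng.normal(size=(kernel_size, channels))    # one weight per (tap, channel)

fifo = deque([np.zeros(channels) for _ in range(kernel_size)], maxlen=kernel_size)

def step(frame: np.ndarray) -> np.ndarray:
    """Push one feature frame and return the temporally filtered output for this step."""
    fifo.append(frame)                        # oldest frame drops out automatically
    stacked = np.stack(fifo)                  # (kernel_size, channels)
    return (stacked * kernel).sum(axis=0)     # causal 1-D conv over time, per channel

# Streaming usage: one frame in, one output out, with O(kernel_size) state per channel.
for t in range(10):
    out = step(rng.normal(size=channels))
    print(t, np.round(out, 2))
```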
 
  • Like
  • Fire
  • Love
Reactions: 25 users