BRN Discussion Ongoing

I think it’s basically a waiting game… They’re trying to make sure that, with all the focus on the big names – Intel, IBM, or whoever – companies like BrainChip get overlooked. The narrative goes: ‘If the giants aren’t ready yet, then it will probably take a while.’ But once things are actually ready, they want to be the ones holding the sceptre again.

The thing is, BrainChip has already laid such solid groundwork with its partners that the big players can’t just ignore them when it’s time to share the cake… they’d better bring a big, juicy cake…
"The narrative goes: ‘If the giants aren’t ready yet, then it will probably take a while.’ But once things are actually ready, they want to be the ones holding the sceptre again."

I think that's definitely what Intel thinks, and although I'd say they're pretty far behind the eight ball now, they do have the resources to potentially catch up, if they choose to pursue neuromorphic technology more strongly.

Like LDN said years ago..

"You can't discount Intel"
Or words to that effect..
 
Last edited:
  • Like
Reactions: 7 users

Diogenese

Top 20
Ummmm ok.

TENNs???

@Diogenese ??

If this author's information is correct, I'll take a Siemens relationship and hopefully a contract (though it could be through a partner, I guess) any day of the week.

Who knows though.... can't find any substantial confirmation yet.



Smart Contract Formal Verification: Practical Guide

ANKUSH CHOUDHARY

17 Aug 2025 — 9 min read
As blockchain penetrates critical infrastructure sectors, 2025 has witnessed catastrophic smart contract failures causing over $1.2B in losses, pushing formal verification from academic luxury to operational necessity for any production system handling digital assets or automated decisions.

Core Insights

  • Neuromorphic co-processors accelerate verification runtime by 5.7x through parallel proof execution
  • Quantum-resistant cryptographic proofs now required for all financial smart contracts
  • EU AI Act Article 22 mandates formal verification for autonomous transaction systems
  • Generative adversarial networks synthesize edge-case contract scenarios undetectable by traditional testing
  • 2025 benchmark: 94% reduction in vulnerabilities when using model checking pre-deployment

Implementation Framework

Modern verification stacks integrate three critical layers: symbolic execution engines for path exploration, temporal logic model checkers for property validation, and equivalence checkers for compiler output consistency. The 2025 reference architecture employs NVIDIA's cuVerify SDK leveraging GPU-accelerated SMT solvers alongside OpenAI's VeriGen for adversarial test generation. Crucially, neuromorphic co-processing units now handle state explosion problems through spiking neural networks that prune irrelevant execution paths in real-time.

Python

# Formal verification pipeline with neuromorphic acceleration
from formal_verify_2025 import SymbolicEngine, NeuromorphicOptimizer
from eu_compliance import Article22Auditor, ComplianceError

def verify_contract(contract_bytecode):
    # Initialize neuromorphic path optimizer
    nopt = NeuromorphicOptimizer(cores=8, topology="sparse")

    # Symbolic execution with quantum-resistant constraints
    engine = SymbolicEngine(
        bytecode=contract_bytecode,
        optimizers=[nopt],
        crypto_standard="CRYSTALS-Kyber"
    )

    # EU compliance pre-check
    if not Article22Auditor.validate(engine):
        raise ComplianceError("EU AI Act Article 22 requirements not met")

    # Generate proof certificates
    return engine.verify(
        properties="safety_properties.spec",
        timeout=300  # seconds
    )

3.8x faster verification cycles vs. 2024 toolchains (IEEE Security 2025)
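The article's toolchain (cuVerify, VeriGen, the 2025 SDKs) isn't publicly documented, but the core idea behind "model checking pre-deployment" can be illustrated with a self-contained toy: an explicit-state search over a two-account token contract that finds the execution trace violating a safety property (no negative balances). Everything below (the state encoding, the transition rules, the function names) is invented for illustration and is not any vendor's API.

```python
from collections import deque

def check_invariant(initial_state, transitions, invariant, max_states=10_000):
    """Breadth-first search over reachable states; returns a counterexample
    trace if the invariant can be violated, else None."""
    seen = {initial_state}
    queue = deque([(initial_state, [initial_state])])
    while queue:
        state, trace = queue.popleft()
        if not invariant(state):
            return trace  # counterexample found
        for nxt in transitions(state):
            if nxt not in seen and len(seen) < max_states:
                seen.add(nxt)
                queue.append((nxt, trace + [nxt]))
    return None

# State: (alice_balance, bob_balance). Transfers of 1 or 2 tokens either way.
def transitions(state):
    a, b = state
    moves = []
    for amt in (1, 2):
        moves.append((a - amt, b + amt))  # note: no balance check -> unsafe
        moves.append((a + amt, b - amt))
    return moves

# Safety property: balances never go negative.
invariant = lambda s: s[0] >= 0 and s[1] >= 0

trace = check_invariant((1, 1), transitions, invariant)
print(trace is not None)  # the missing balance check is caught
```

Real verifiers replace the brute-force search with symbolic execution and SMT solving so the state space doesn't have to be enumerated explicitly, but the contract: "explore reachable states, report a trace that breaks the property" is the same.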

Real-World Deployment: Siemens' Industrial Smart Contracts for Energy Grids

Facing critical infrastructure vulnerabilities in their blockchain-based energy trading platform, Siemens implemented a formal verification pipeline after a 2024 exploit drained 12,000 MWh of power credits. Their solution integrated Cadence's Temporal Logic Verifier with custom neuromorphic hardware from BrainChip, reducing state space analysis from 48 hours to 11 minutes. The system now automatically generates mathematical proofs for all contract upgrades before grid deployment.

CASE STUDY
Energy Grid Smart Contract Verification Metrics
Siemens implemented formal verification for 142 critical contracts:

  • Prevented 3 zero-day exploits during deployment
  • Reduced audit costs by 67%
  • Achieved EU regulatory compliance 4 months ahead of deadline

Implementation Results

  • Verification throughput: 12 contracts/hour
  • False positive rate: 0.3%
  • Critical bug detection: 100% of known vulnerability classes
Interestingly @7für7 posted the other day that a Global Sales Exec from Siemens EDA commented on the BRN LinkedIn post re Akida Cloud.

Hi Fmf,

About blockchain "I know noTHing!" ... but "state space analysis" is suggestive of TENNs.
 
  • Like
  • Wow
  • Fire
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Howdy All,

I just noticed that Vision Transformers (ViTs) have been mentioned in Renesas’ latest blog.

BrainChip has been vocal about supporting transformer-based workloads, including ViTs, on Akida's roadmap.
  • Akida 2.0 introduced support for Transformer and CNN layers, making ViTs feasible for edge deployment.
  • Akida 3.0 is expected to extend this further with FP16/FP32 precision, opening the door to higher-accuracy applications.
  • BrainChip’s aTENNuate paper also explored techniques for making ViTs more energy-efficient, aligning perfectly with TinyML and embedded vision goals.
IMO the above steps position TENNs/Akida as one of the most relevant technologies for ViT deployment on edge devices.

Renesas and BrainChip are already official partners. So while this Renesas blog post doesn't explicitly mention BrainChip, doesn't it seem reasonable to see TENNs/Akida as a candidate for Renesas' long-term ViT strategy?

I would have thought that if Renesas plans to scale ViT-ready MPUs into production, utilizing TENNs/Akida for energy-efficient, low-latency ViT inference would be a logical and highly synergistic path.

As ViTs become standard in autonomous driving, robotics, and smart vision, hopefully BrainChip can capture market share by positioning TENNs/Akida as the go-to lightweight ViT accelerator within Renesas’ portfolio.🤞🙏


fingers-crossed-jenny-hagel.gif

Screenshot 2025-08-22 at 10.57.56 am.png

EXTRACT ONLY

Screenshot 2025-08-22 at 11.04.09 am.png
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 21 users

JB49

Regular
Howdy All,

I just noticed that Vision Transformers (ViTs) have been mentioned in Renesas’ latest blog.

BrainChip has been vocal about supporting transformer-based workloads, including ViTs, on Akida's roadmap.
  • Akida 2.0 introduced support for Transformer and CNN layers, making ViTs feasible for edge deployment.
  • Akida 3.0 is expected to extend this further with FP16/FP32 precision, opening the door to higher-accuracy applications.
  • BrainChip’s aTENNuate paper also explored techniques for making ViTs more energy-efficient, aligning perfectly with TinyML and embedded vision goals.
IMO the above steps position Akida as one of the most relevant technologies for ViT deployment on edge devices.

Renesas and BrainChip are already official partners. So while this Renesas blog post doesn't explicitly mention BrainChip, doesn't it seem reasonable to see Akida as a candidate for Renesas' long-term ViT strategy?

I would have thought that if Renesas plans to scale ViT-ready MPUs into production, utilizing Akida for energy-efficient, low-latency ViT inference would be a logical and highly synergistic path.

As ViTs become standard in autonomous driving, robotics, and smart vision, hopefully BrainChip can capture market share by positioning Akida as the go-to lightweight ViT accelerator within Renesas’ portfolio.🤞🙏


View attachment 89976

View attachment 89972

EXTRACT ONLY

View attachment 89974
they also mention post-quantum cryptography which is what we are currently working on with Chelpis and Mirle.
 
  • Like
  • Fire
Reactions: 13 users

Rach2512

Regular



Screenshot_20250822_105044_Samsung Internet.jpg

Screenshot_20250822_104554_Samsung Internet.jpg
Screenshot_20250822_104529_Samsung Internet.jpg
 
  • Like
  • Fire
  • Love
Reactions: 14 users

7für7

Top 20
Since we’re asking ChatGPT about everything these days, I figured I’d ask it about this too… probably should’ve just left it alone though…. I think she is short


Overall Picture
  • Revenue growth: +859% sounds impressive, but it’s almost entirely based on service contracts, not product sales.
  • Cash burn: ~6.5M in 6 months → runway about 12–15 months (without new income).
  • Financing: The LDA program is the lifeline, but every draw means new shares → shareholders pay through dilution.
  • Risk: Without a larger commercial rollout, BrainChip remains a “service house” and survives only through issuing shares.
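The runway arithmetic above can be sanity-checked; note the cash-on-hand figure in the sketch below is a hypothetical input for illustration, not a reported number.

```python
# Back-of-envelope runway check for the figures quoted above.
burn_6_months = 6.5          # quoted: ~6.5M burned over 6 months
monthly_burn = burn_6_months / 6

def runway_months(cash_on_hand, monthly_burn):
    """Months until cash runs out at the current burn rate."""
    return cash_on_hand / monthly_burn

# A 12-15 month runway implies roughly 13-16M on hand.
# Example with a hypothetical 14M balance:
print(round(runway_months(14.0, monthly_burn), 1))  # ~12.9 months
```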


👉 In short: Technology: progress, partnerships, developer hub.
But finances: dependence on LDA, no product revenue, high cash burn.
 
  • Like
  • Thinking
  • Sad
Reactions: 8 users

Diogenese

Top 20



View attachment 89983
View attachment 89984 View attachment 89985
Originally I thought the GenX 320 was developed as a low pixel count sensor for compatibility with Synsense, but that was before Synsense teamed up with Inivation.

We had already been working with Prophesee for some time before that, although Synsense had been with them for longer.

Back then Prophesee had their larger sensor, and it seemed that Synsense struggled to handle its capabilities, but Akida 1 worked well.


https://inf.news/en/digital/176678d553cdb3d82a319e4838c15256.html

Sony reveals new stacked CMOS sensor details, new process manufacturing

2025-08-22

1755832836735.png


Previously, photosensors and control electronics were formed on a single layer. In Sony's new system, the photosensors are formed on the top layer and the control electronics on the lower layer, increasing the proportion of surface area available for the photosensors.


This is the Prophesee hi-fi sensor built with Sony 3D tech:

https://docs.prophesee.ai/stable/hw/sensors/imx636.html

Sony Semiconductor Solutions and Prophesee collaborated to produce SONY IMX636 (HD), a stacked event based vision sensor developed around 4.86um contrast detection pixels.

This event-based vision sensor features high-speed/low-latency response, wide dynamic range operation, focal-plane data compression and low power consumption.

A high level of programmability is guaranteed via a four-wire serial peripheral interface. It allows the user to define region of interest and program various functions.

The vision sensor data output is fully digital and consists of 16-wire parallel data bus with an external clock and data valid flag. This interface can easily be connected to an external parallel to LVDS serializer IC.

Features


  • 1280x720 Pixel CMOS vision sensor
  • 4.86μm x 4.86μm event-based pixel
  • <1/2” optical format
  • Monochrome
  • 16-bit parallel data output
  • 1G event/s (1GEPS) peak output
  • Random Programmable Region of Interest (ROI)
  • Serial Peripheral Interface (SPI)
  • High Dynamic Range (HDR) beyond 120 dB
  • −40°C to +85°C Operational temperature range
  • 50-70mW typical power consumption

This has about 10 times the number of pixels of GenX320.
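A quick check of the "about 10 times" claim, taking the GenX320 at its 320 x 320 resolution:

```python
# Quick check of the "about 10 times the pixels" claim.
imx636_pixels = 1280 * 720      # from the Prophesee IMX636 extract above
genx320_pixels = 320 * 320      # Prophesee GenX320 resolution
print(imx636_pixels / genx320_pixels)  # 9.0
```

So 921,600 vs. 102,400 pixels: a factor of 9, i.e. roughly 10x as stated.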

Now that Inivation has teamed up with Synsense, we're sure to be Prophesee's best friend.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 17 users

JB49

Regular
 
  • Like
Reactions: 3 users

IloveLamp

Top 20
1000010372.jpg
 
  • Like
  • Fire
  • Thinking
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

Elon Musk’s xAI Plan: Phones as AI Edge Nodes Bypassing OS and Apps





EXTRACT ONLY



Elon Musk envisions smartphones and computers as "edge nodes" for AI inference, driven by bandwidth limits that make cloud processing impractical. xAI aims to enable devices to generate pixels and audio directly, bypassing traditional OS and apps. This could revolutionize interactions but raises concerns over power, security, and environmental impacts.
xai-tmp-imgen-286ba395-787e-40d8-86dc-5c51424fecb0.jpeg

Elon Musk’s xAI Plan: Phones as AI Edge Nodes Bypassing OS and Apps

Written by John Smart
Thursday, August 21, 2025
In a recent post on X, formerly known as Twitter, Elon Musk outlined a provocative vision for the future of computing, suggesting that personal devices like smartphones and computers will evolve into mere “edge nodes” for AI inference. This shift, he argued, is inevitable due to fundamental bandwidth limitations that make it impractical to handle all processing on remote servers. Musk’s statement, shared on August 21, 2025, echoes broader trends in AI development where edge computing—running models locally on devices—addresses latency and connectivity issues that plague cloud-dependent systems.

The idea builds on discussions within the tech community, including insights from X user @amXFreeze, who highlighted xAI’s long-term strategy to transform devices into platforms that directly generate pixels and audio via AI inference. This approach would bypass traditional operating systems and applications, rendering interfaces dynamically through artificial intelligence. Such a paradigm could revolutionize user experiences, making interactions more seamless and adaptive, but it raises questions about power consumption, hardware requirements, and software ecosystems.
 
  • Like
  • Love
  • Fire
Reactions: 12 users

Diogenese

Top 20
Originally I thought the GenX 320 was developed as a low pixel count sensor for compatibility with Synsense, but that was before Synsense teamed up with Inivation.

We had already been working with Prophesee for some time before that, although Synsense had been with them for longer.

Back then Prophesee had their larger sensor, and it seemed that Synsense struggled to handle its capabilities, but Akida 1 worked well.


https://inf.news/en/digital/176678d553cdb3d82a319e4838c15256.html

Sony reveals new stacked CMOS sensor details, new process manufacturing

2025-08-22

View attachment 89989

Previously, photosensors and control electronics were formed on a single layer. In Sony's new system, the photosensors are formed on the top layer and the control electronics on the lower layer, increasing the proportion of surface area available for the photosensors.


This is the Prophesee hi-fi sensor built with Sony 3D tech:

https://docs.prophesee.ai/stable/hw/sensors/imx636.html

Sony Semiconductor Solutions and Prophesee collaborated to produce SONY IMX636 (HD), a stacked event based vision sensor developed around 4.86um contrast detection pixels.

This event-based vision sensor features high-speed/low-latency response, wide dynamic range operation, focal-plane data compression and low power consumption.

A high level of programmability is guaranteed via a four-wire serial peripheral interface. It allows the user to define region of interest and program various functions.

The vision sensor data output is fully digital and consists of 16-wire parallel data bus with an external clock and data valid flag. This interface can easily be connected to an external parallel to LVDS serializer IC.

Features


  • 1280x720 Pixel CMOS vision sensor
  • 4.86μm x 4.86μm event-based pixel
  • <1/2” optical format
  • Monochrome
  • 16-bit parallel data output
  • 1G event/s (1GEPS) peak output
  • Random Programmable Region of Interest (ROI)
  • Serial Peripheral Interface (SPI)
  • High Dynamic Range (HDR) beyond 120 dB
  • −40°C to +85°C Operational temperature range
  • 50-70mW typical power consumption

This has about 10 times the number of pixels of GenX320.

Now that Inivation has teamed up with Synsense, we're sure to be Prophesee's best friend.
Just noticed something interesting in the Sony diagram:

There are 4 photodiodes in a group:

1 RED;
1 Blue;
2 Green.

I recall that the human eye is most sensitive to green light, so this may be to imitate that.
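A small sketch of that idea, assuming the grouping is the classic Bayer (RGGB) mosaic (that reading of the Sony diagram is the post's inference, not confirmed): half the sites are green, and the standard Rec. 601 luma weights show the same green bias in perceived brightness.

```python
# The 1 red / 2 green / 1 blue grouping matches the classic Bayer (RGGB) tile.
bayer_tile = [["R", "G"],
              ["G", "B"]]
green_fraction = sum(row.count("G") for row in bayer_tile) / 4
print(green_fraction)  # 0.5 - half the photosites are green

# Rec. 601 luma weights: green contributes most to perceived brightness,
# consistent with the eye's peak sensitivity to green light.
LUMA = {"R": 0.299, "G": 0.587, "B": 0.114}
print(max(LUMA, key=LUMA.get))  # G
```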
 
  • Thinking
  • Fire
  • Wow
Reactions: 5 users

Rach2512

Regular
Just noticed something interesting in the Sony diagram:

There are 4 photodiodes in a group:

1 RED;
1 Blue;
2 Green.

I recall that the human eye is most sensitive to green light, so this may be to imitate that.


Thanks @Diogenese, sorry to sound like a thicky, does that mean that Akida will be a part of the below event based vision sensor?


Sony Semiconductor Solutions and Prophesee collaborated to produce SONY IMX636 (HD), a stacked event based vision sensor developed around 4.86um contrast detection pixels.
 
  • Like
  • Thinking
Reactions: 3 users

Diogenese

Top 20
Thanks @Diogenese, sorry to sound like a thicky, does that mean that Akida will be a part of the below event based vision sensor?


Sony Semiconductor Solutions and Prophesee collaborated to produce SONY IMX636 (HD), a stacked event based vision sensor developed around 4.86um contrast detection pixels.
Hi Rach,

I think my brain just short circuited:

"According to Sony, the stacked CMOS image sensor with two-layer transistor pixels can improve the image quality without increasing the size of the mobile phone sensor, but considering that the integrated DRAM stacked CMOS sensor is not yet popular in the mobile phone market, and the mobile phone market is increasingly crowded with toothpaste, and mobile phones are used in mobile phones. The new stack sensor is probably a matter of the year of the monkey."

This must be how it feels to be the president of the US.
 
  • Haha
Reactions: 7 users

Diogenese

Top 20
Hi Rach,

I think my brain just short circuited:

"According to Sony, the stacked CMOS image sensor with two-layer transistor pixels can improve the image quality without increasing the size of the mobile phone sensor, but considering that the integrated DRAM stacked CMOS sensor is not yet popular in the mobile phone market, and the mobile phone market is increasingly crowded with toothpaste, and mobile phones are used in mobile phones. The new stack sensor is probably a matter of the year of the monkey."

This must be how it feels to be the president of the US.
The blue box labeled "Circuitry" is where the signal processing would take place. Initially this would have been Synsense in the GenX320. The question is whether Prophesee will choose to switch out the Synsense circuitry now that Synsense is integrated with their somewhat unfriendly rival Inivation (Synsense owns Inivation). You'd think they would want to distance themselves from Inivation and, by association, Synsense, in the GenX320, which is a Prophesee product, i.e. not branded Sony.

As to the ~1 million pixel Sony IMX636, as this is a Sony product, I guess it's Sony's decision, but, given the earlier comments from Luca implying Synsense's inadequacy re the large Prophesee pixel array, I think that must favour Akida, especially now we have TENNs.

Also note the article is dated 20250822, so ... it would be nice.
 
  • Like
  • Fire
  • Love
Reactions: 7 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Buying Intel means securing the semiconductor industry in the US, jobs, reputation, and it's also part of MAGA. The Government always makes such decisions without consulting the parties needing the material. Trump wants the best drones for the Marines, Air Force, etc. He would not settle for less if somebody told him that Akida is better.

And they also cannot wait years until Loihi 2 is ready for use and thoroughly tested. Akida is already there and tested!
I thought MAGA was against govt involvement in private companies?????
Isn't that a form of socialism???? We'll probably hear nothing about that.
Oh dear....

200 (8).gif


Happy as Larry
 
  • Fire
  • Love
  • Haha
Reactions: 4 users


Diogenese

Top 20
I thought MAGA was against govt involvement in private companies?????
Isn't that a form of socialism???? We'll probably hear nothing about that.
Oh dear....

View attachment 89995

Happy as Larry
I think Intel should pray that Trump has no managerial input.
 
  • Haha
  • Love
Reactions: 6 users
THE HACK ANGELS RECOVERY EXPERT // A LEGITIMATE CRYPTO BITCOIN / USDT RECOVERY EXPERT

I came across a company website that promised a big return on investment. I put 790k USDT in an online cryptocurrency investment platform and I was scammed out of everything. I was very disappointed with myself. When I went online, I found testimony about this hacker called THE HACK ANGELS RECOVERY EXPERT collective with a reputation for recovering cryptocurrency, i decided to get in touch with them and submitted my case to the expert. After a few hours of work with them, I was shocked to learn that they had recovered all of my stolen USDT in just 24 hours. I said that I will not hold this to myself but share it to the public I advise everyone seeking to recover their lost bitcoin Reach out to THE HACK ANGELS You can have a chat with him through. EMAIL

Email at support@thehackangels.com

Website at www.thehackangels.com

WhatsApp +1(520)2 0 0-2 3 2 0
I am so happy and in tears of joy to get back my funds.
688803d141c15d001de74499.jpg
 

Diogenese

Top 20
Dreddbot is sleeping on the job!
 
  • Haha
  • Fire
  • Like
Reactions: 5 users

TheDrooben

Pretty Pretty Pretty Pretty Good
I think Intel should pray that Trump has no managerial input.
Might be bankruptcy number 7 😂
 
  • Haha
  • Fire
Reactions: 3 users