BRN Discussion Ongoing

I think it’s basically a waiting game… They’re trying to make sure that, with all the focus on the big names – Intel, IBM, or whoever – companies like BrainChip get overlooked. The narrative goes: ‘If the giants aren’t ready yet, then it will probably take a while.’ But once things are actually ready, they want to be the ones holding the sceptre again.

The thing is, BrainChip has already laid such solid groundwork with its partners that the big players can’t just ignore them when it’s time to share the cake… and it had better be a big, juicy cake.
"The narrative goes: ‘If the giants aren’t ready yet, then it will probably take a while.’ But once things are actually ready, they want to be the ones holding the sceptre again."

I think that's definitely what Intel thinks, and although I'd say they're well behind the eight ball now, they do have the resources, if they choose to pursue neuromorphic technology more strongly, to potentially catch up.

Like LDN said years ago..

"You can't discount Intel"
Or words to that effect..
 
  • Like
Reactions: 4 users

Diogenese

Top 20
Ummmm ok.

TENNs???

@Diogenese ??

If this author's information is correct, I'll take a Siemens relationship – and hopefully a contract, though it could be through a partner, I guess – any day of the week.

Who knows though.... can't find any substantial confirmation yet.



Smart Contract Formal Verification: Practical Guide

ANKUSH CHOUDHARY

17 Aug 2025 — 9 min read
As blockchain penetrates critical infrastructure sectors, 2025 has witnessed catastrophic smart contract failures causing more than $1.2B in losses – turning formal verification from an academic luxury into an operational necessity for any production system that handles digital assets or automated decisions.

Core Insights

  • Neuromorphic co-processors accelerate verification runtime by 5.7x through parallel proof execution
  • Quantum-resistant cryptographic proofs now required for all financial smart contracts
  • EU AI Act Article 22 mandates formal verification for autonomous transaction systems
  • Generative adversarial networks synthesize edge-case contract scenarios undetectable by traditional testing
  • 2025 benchmark: 94% reduction in vulnerabilities when using model checking pre-deployment
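
To make the model-checking claim in the last bullet a bit more concrete, here is a minimal sketch of the kind of property an SMT-backed checker proves. It uses the open-source Z3 solver; the toy token-transfer model is invented for illustration and is not from the article – real tools work on actual contract bytecode rather than a hand-written model.

Python

# Minimal, self-contained sketch of SMT-based property checking with Z3.
# The token-transfer model below is invented for illustration; production
# pipelines check compiled contract bytecode, not a hand-written model.
from z3 import BitVec, Solver, ULE, unsat

def transfer_preserves_total_supply() -> bool:
    sender = BitVec("sender_balance", 256)
    receiver = BitVec("receiver_balance", 256)
    amount = BitVec("amount", 256)

    s = Solver()
    s.add(ULE(amount, sender))  # precondition: sender can cover the transfer (unsigned <=)

    new_sender = sender - amount
    new_receiver = receiver + amount

    # Ask Z3 for a counterexample to "a transfer never changes the total supply".
    s.add(new_sender + new_receiver != sender + receiver)
    return s.check() == unsat   # unsat: no counterexample exists, so the property holds

print("total supply preserved:", transfer_preserves_total_supply())

A model checker in a production stack does essentially this, but across every reachable execution path of the deployed contract.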

Implementation Framework

Modern verification stacks integrate three critical layers: symbolic execution engines for path exploration, temporal logic model checkers for property validation, and equivalence checkers for compiler output consistency. The 2025 reference architecture employs NVIDIA's cuVerify SDK leveraging GPU-accelerated SMT solvers alongside OpenAI's VeriGen for adversarial test generation. Crucially, neuromorphic co-processing units now handle state explosion problems through spiking neural networks that prune irrelevant execution paths in real-time.

Python

# Formal verification pipeline with neuromorphic acceleration
from formal_verify_2025 import SymbolicEngine, NeuromorphicOptimizer
from eu_compliance import Article22Auditor, ComplianceError


def verify_contract(contract_bytecode):
    # Initialize the neuromorphic path optimizer
    nopt = NeuromorphicOptimizer(cores=8, topology="sparse")

    # Symbolic execution with quantum-resistant constraints
    engine = SymbolicEngine(
        bytecode=contract_bytecode,
        optimizers=[nopt],
        crypto_standard="CRYSTALS-Kyber",
    )

    # EU compliance pre-check
    if not Article22Auditor.validate(engine):
        raise ComplianceError("EU AI Act Article 22 requirements not met")

    # Generate proof certificates
    return engine.verify(
        properties="safety_properties.spec",
        timeout=300,  # seconds
    )
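
For completeness, a hypothetical call site for the pipeline above, still assuming the article's formal_verify_2025 / eu_compliance packages and a contract that has already been compiled to bytecode; the file name is a placeholder.

Python

# Hypothetical usage of verify_contract(), assuming the imports above;
# "EnergyTrading.bin" is a placeholder for pre-compiled contract bytecode.
with open("EnergyTrading.bin", "rb") as f:
    bytecode = f.read()

try:
    certificate = verify_contract(bytecode)
    print("Proof certificate issued:", certificate)
except ComplianceError as err:
    print("Blocked before deployment:", err)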

3.8x faster verification cycles vs. 2024 toolchains (IEEE Security 2025)

Real-World Deployment: Siemens' Industrial Smart Contracts for Energy Grids

Facing critical infrastructure vulnerabilities in their blockchain-based energy trading platform, Siemens implemented a formal verification pipeline after a 2024 exploit drained 12,000 MWh of power credits. Their solution integrated Cadence's Temporal Logic Verifier with custom neuromorphic hardware from BrainChip, reducing state space analysis from 48 hours to 11 minutes. The system now automatically generates mathematical proofs for all contract upgrades before grid deployment.
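
To give a feel for what "state space analysis" means here, below is a toy explicit-state model checker in plain Python for an invented energy-credit ledger: it exhaustively explores every reachable state and asserts that no account ever goes negative. It is purely illustrative – not Siemens' or Cadence's actual tooling – and real contracts have state spaces far too large for this brute-force approach, which is exactly the explosion the accelerated pipeline above is meant to tame.

Python

# Toy explicit-state model check: exhaustively explore all reachable states of an
# invented 3-account energy-credit ledger and assert the safety property
# "no balance ever goes negative". Illustrative only.
from itertools import product

ACCOUNTS = ("plant", "grid", "factory")
START = {"plant": 3, "grid": 0, "factory": 0}  # MWh credits

def successors(state):
    # Every possible 1-credit transfer between two distinct accounts
    for src, dst in product(ACCOUNTS, repeat=2):
        if src != dst and state[src] > 0:
            nxt = dict(state)
            nxt[src] -= 1
            nxt[dst] += 1
            yield nxt

def check_no_negative_balances():
    seen, frontier = set(), [START]
    while frontier:
        state = frontier.pop()
        key = tuple(sorted(state.items()))
        if key in seen:
            continue
        seen.add(key)
        assert all(v >= 0 for v in state.values()), f"negative balance in {state}"
        frontier.extend(successors(state))
    return len(seen)

print("reachable states checked:", check_no_negative_balances())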

CASE STUDY
Energy Grid Smart Contract Verification Metrics
Siemens implemented formal verification for 142 critical contracts:

  • Prevented 3 zero-day exploits during deployment
  • Reduced audit costs by 67%
  • Achieved EU regulatory compliance 4 months ahead of deadline

Implementation Results

  • Verification throughput: 12 contracts/hour
  • False positive rate: 0.3%
  • Critical bug detection: 100% of known vulnerability classes

Interestingly @7für7 posted the other day that a Global Sales Exec from Siemens EDA commented on the BRN LinkedIn post re Akida Cloud.

Hi Fmf,

About blockchain "I know noTHing!" ... but "state space analysis" is suggestive of TENNs.
 
  • Like
  • Wow
  • Love
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Howdy All,

I just noticed that Vision Transformers (ViTs) have been mentioned in Renesas’ latest blog.

BrainChip has been vocal about supporting transformer-based workloads, including ViTs, on Akida's roadmap.
  • Akida 2.0 introduced support for Transformer and CNN layers, making ViTs feasible for edge deployment.
  • Akida 3.0 is expected to extend this further with FP16/FP32 precision, opening the door to higher-accuracy applications.
  • BrainChip’s aTENNuate paper also explored techniques for making ViTs more energy-efficient, aligning perfectly with TinyML and embedded vision goals.
IMO the above steps position TENNs/Akida as one of the most relevant technologies for ViT deployment on edge devices.

Renesas and BrainChip are already official partners. So while this Renesas blog post doesn’t explicitly mention BrainChip, wouldn't it be reasonable to see TENNs/Akida as a candidate for Renesas’ long-term ViT strategy?

I would have thought that if Renesas plans to scale ViT-ready MPUs into production, utilizing TENNs/Akida for energy-efficient, low-latency ViT inference would be a logical and highly synergistic path.

As ViTs become standard in autonomous driving, robotics, and smart vision, hopefully BrainChip can capture market share by positioning TENNs/Akida as the go-to lightweight ViT accelerator within Renesas’ portfolio.🤞🙏
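
If anyone wants to tinker, here is a very rough sketch of what a ViT-to-Akida path might look like with BrainChip's public MetaTF tooling. Treat every line as an assumption: build_keras_vit() is a placeholder for your own Keras ViT, the 8-bit settings are just a guess in line with Akida's integer pipeline, and none of this is confirmed by Renesas or BrainChip.

Python

# Rough sketch only. Assumes BrainChip's MetaTF packages (quantizeml, cnn2snn);
# build_keras_vit() is a hypothetical placeholder for a Keras ViT defined elsewhere.
from quantizeml.models import quantize, QuantizationParams
from cnn2snn import convert

keras_vit = build_keras_vit()  # placeholder: your own (or a model-zoo) Keras ViT

# Assumption: 8-bit weights/activations to match Akida's integer pipeline
qparams = QuantizationParams(weight_bits=8, activation_bits=8)
quantized_vit = quantize(keras_vit, qparams=qparams)

akida_vit = convert(quantized_vit)  # should yield an akida.Model ready for the device
akida_vit.summary()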


[Attached: fingers-crossed GIF and two screenshots of the Renesas blog post (extract only)]
 
  • Like
  • Love
  • Thinking
Reactions: 11 users

JB49

Regular
[Quoting Bravo's post above on Renesas' latest blog and Vision Transformers]
They also mention post-quantum cryptography, which is what we are currently working on with Chelpis and Mirle.
 
  • Like
Reactions: 7 users

Rach2512

Regular



[Three screenshot attachments, 22 Aug 2025]
 
  • Like
Reactions: 1 users

7für7

Top 20
Since we’re asking ChatGPT about everything these days, I figured I’d ask it about this too… probably should’ve just left it alone, though… I think she’s short.


Overall Picture
  • Revenue growth: +859% sounds impressive, but it’s almost entirely based on service contracts, not product sales.
  • Cash burn: ~6.5M in 6 months → runway about 12–15 months (without new income).
  • Financing: The LDA program is the lifeline, but every draw means new shares → shareholders pay through dilution.
  • Risk: Without a larger commercial rollout, BrainChip remains a “service house” and survives only through issuing shares.


👉 In short: on the technology side – progress, partnerships, a developer hub.
On the financial side – dependence on LDA, no product revenue, high cash burn.
 
  • Sad
Reactions: 1 users