Oh! It was Esq who had the G&T and shovel?

Shoulda given it the shovel and put the bear to work digging your trenches to help out round the place!
Hi @7für7
Ehh? Ceva is an American company and provides them with solutions?
I asked chatty just for fun because of some speculation going on over on the crapper…
Possible M&A Roadmap for BrainChip
1. Preparation
- NVIDIA / another buyer runs due diligence (tech, patents, customer pipeline).
- Goal: enter as cheaply as possible, but still realistic enough to not alienate shareholders.
2. Initial Offer
- Buyer starts with A$1.20–1.40 (a typical M&A premium over the ~A$0.20 share price).
- Expectation: some shareholders celebrate, but long-term holders with the A$2 benchmark in mind reject immediately.
- Reaction: “Too low. Trim says A$1.00 fair value, and we already traded at A$2.00 during the Mercedes hype.”
3. Shareholder Front
- Large holders, forums, funds push back: “Akida is unique. We won’t sell under A$2.00.”
- Media & analysts pick up the story → more pressure on the buyer.
4. Counteroffer / Bidding Up
- Buyer raises to A$1.60–1.80 to soften resistance.
- Still: many investors see A$2.00 as the “magic level.”
5. Endgame
- Bidding war (Intel, Qualcomm, etc. join): price climbs to A$2.00–2.20 automatically.
- Single-bidder case: buyer wants certainty → goes straight to A$2.00 to lock the deal.
Outcome Probabilities
- Below A$1.50 → no chance, shareholders block.
- A$1.60–1.80 → possible compromise, but hard sell without board backing.
- A$2.00 → most likely minimum successful price.
- Above A$2.00 → only if a real bidding war happens or major automotive deals surface.
Conclusion
- A$2.00 isn’t just psychological; it’s the logical clearing price to win shareholder approval.
- NVIDIA (or any buyer) might start lower, but the final destination is around A$2.00 unless there’s zero competition and weak resistance.
Hi @7für7
Have LLVision bought a license from Brainchip?
Have Ceva bought a license from Brainchip??
Is that NO.... to both?
Has America cracked down on anything tech that might benefit China???
Yep, BRN got a pass back in 2020, but it is a different world now. Real shame..............
BrainChip Receives U.S. Government Export Approval
9.18 onwards... this year.

Podcast!!
BrainChip CEO on Transform NOW Podcast: Edge AI and Neuromorphic AI | BrainChip posted on the topic | LinkedIn
Our CEO, Sean Hehir, joined host Michael Marchuk on the Transform NOW Podcast to discuss the future of AI at the edge and what makes BrainChip’s approach unique. The conversation covers how AI is evolving from centralized data centers to edge computing, why neuromorphic AI matters, and how...
www.linkedin.com
The bear is on the leaves again

Oh! It was Esq who had the G&T and shovel?
I thought that

Hey!
Just to clarify… my original question was more general (and, to be honest, a bit on the light-hearted side) because his statement felt rather vague. And I wasn’t saying Ceva or LLVision had already bought a BrainChip license.
And then, after your reply, my point was different: if Ceva, a US company, can openly power AR glasses developed by a Chinese firm (LLVision), and those products are set for a global launch, then clearly it’s not some universal “forbidden zone” where no Western tech/IP can be used in China.
Sure, BrainChip doesn’t have a license there yet, but that’s not the same as saying it can’t happen. The Ceva example shows the door is not completely shut. Export restrictions hit certain categories (high-end GPUs, supercomputing, military tech), but they don’t block everything across the board.
So the whole “China is totally off-limits for BrainChip” argument doesn’t really hold up.
I thought that
“We don’t need China yet.”
I’m thinking this was said years ago by BrainChip’s CEO at the time.
Can someone confirm this??? Or am I going crazy?
Should be EDT and PDT in the US, not standard time!
#ai #edgeai #brainchip #aiinfrastructure #solutionsreview | BrainChip
This Friday at 12:00 PM EST (9:00 AM PST), our CDO, Jonathan Tapson, will join industry experts on the Solutions Review Panel to discuss AI Infrastructure Strategy: Build, Buy, and Orchestrate for Intelligence at Scale. The session will explore how organizations are approaching AI...
www.linkedin.com
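For what it's worth, the EST-vs-EDT slip in that announcement is easy to check with Python's `zoneinfo` module. A minimal sketch (the exact event date here is an assumption; any Friday in late September works, since US daylight saving runs until early November):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical event date: Friday, 26 Sep 2025 at noon, US Eastern time.
# "EST" taken literally means UTC-5, but in September the US East Coast
# observes daylight saving time (EDT, UTC-4).
event = datetime(2025, 9, 26, 12, 0, tzinfo=ZoneInfo("America/New_York"))
print(event.tzname())  # EDT (not EST) on this date

# The matching West Coast wall-clock time is 9:00 AM PDT, not PST.
pacific = event.astimezone(ZoneInfo("America/Los_Angeles"))
print(pacific.strftime("%I:%M %p %Z"))  # 09:00 AM PDT
```

Using the IANA zone name ("America/New_York") rather than a fixed abbreviation is what makes the DST switch automatic.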
When I first started working, our technical director had worked in England during the big show at a company that made the vacuum tubes for radar, and he always told the new engineers this tale.

Just a reminder of where technology first starts before going mainstream.
So: military money and requirements kick-started and accelerated semiconductor R&D and early demand, especially during the Cold War and the space race.
- Pre-chips groundwork (1930s–1940s): Military needs (radar in WWII, communications, navigation) pushed research into vacuum tubes, microwave electronics and solid-state physics. That work laid groundwork for later devices.
- Transistor (1947): The transistor — the fundamental building block of modern chips — was invented at Bell Labs (Bardeen, Brattain, Shockley). That was largely academic/industrial research, although motivated by broader electronics needs (including military applications).
- Early semiconductor industry (1950s): Companies and labs (Bell, RCA, GE, Shockley Semiconductor, Texas Instruments) developed semiconductor devices. Military and government contracts provided important funding and demand (e.g., for radios, guidance systems, satellites), but commercial uses (telephony, instrumentation, calculators) also mattered.
- Integrated circuit (late 1950s): Jack Kilby (TI, 1958) and Robert Noyce (Fairchild, 1959) independently invented the integrated circuit. The space race and defense programs accelerated interest and procurement of ICs, but ICs were immediately attractive to commercial electronics and computing too.
- 1960s–1970s: Cold War, space, and defense programs (missiles, satellites, avionics, early computers) were big customers and sponsors (through agencies like DARPA, NASA), which helped scale manufacturing and design capability. At the same time, minicomputers, telecom, and consumer electronics created parallel commercial markets.
- Microprocessor era (1971 onward): Intel’s 4004 (1971) and subsequent microprocessors opened huge commercial markets (PCs, consumer devices). Military used them too, but the explosive growth came from business and consumer demand.
"Catastrophe" - usually you mount the head as the trophy...

Oh, Meta's AI glasses spectacularly flopped today, crumpling under the weight of their own hype like a pair of cheap shades in a toddler's grip—turns out, without BrainChip's Akida neuromorphic processor baked in, those fancy specs were basically just overpriced binoculars with a side of glitchy AI that couldn't tell a cat from a catastrophe, leaving users blindly fumbling through augmented reality fails while Akida sat on the sidelines, chuckling in low-power efficiency, proving once again that skimping on edge AI smarts is a surefire recipe for a face-plant in the tech world.
Did they? You have a supporting link?
The reviewer from The Verge seems pretty impressed, and The Verge has 3.5 million subscribers, so it's not some little obscure reviewer.