BRN Discussion Ongoing

AusEire

Founding Member
I can't help but agree with you. First thing tomorrow morning I will be emailing TD requesting Brainchip only employ deadbeats until revenue kicks in.
I couldn't agree more. In fact, I'm going to make my email super abusive and call for the sacking of all management, including PVDM and Anil Mankar.

Edit
I'm also going to take an impromptu trip to Perth just so I can do this to all the nerds in the Brainchip office as they leave work?

Homer Simpson Cartoon GIF
 
Reactions: 24 users

Luppo71

Founding Member
I couldn't agree more. In fact, I'm going to make my email super abusive and call for the sacking of all management, including PVDM and Anil Mankar.

Edit
I'm also going to take an impromptu trip to Perth just so I can do this to all the nerds in the Brainchip office as they leave work?

Homer Simpson Cartoon GIF
We'll be back.
 

Attachments

  • rs_1024x759-170626174434-1024.Curtis-Armstrong-Revenge-of-the-Nerds.ms.062617.jpg
Reactions: 9 users
Thanks for the fun banter tonight, it makes a nice change from all the excitement from BrainChip Inc.
Hope we all have a great, prosperous week.
I feel every day we must be getting closer to a huge announcement.
Stay positive everybody.
Night, JohnBoy
 
Reactions: 14 users
All levels of ability and logical thought inhabit these threads, so it may not be immediately obvious to everyone that, thanks to the BrainChip nerds, humble unqualified posters here at TSEx and at HC have been able to use BrainChip's AKIDA technology to make videos showing what they have done with the AKD1000 in the real world, because it has been made so simple to use, and that the technology IP has been licensed by international companies for real-world applications.

Compare that real-world, average-Joe and commercial use with Mike Davies and his team of hundreds of merry nerds over at Intel, who have ONLY managed to dumb down the process of training and using Loihi 1 and 2 to a level where you now only need a PhD to use it, and for research purposes ONLY.

So I say three cheers for the BrainChip NERDS. Without them, shareholder hangers-on like me, who add absolutely nothing of commercial value to the company, would have zero, zilch, absolutely nothing of value.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 46 users

Foxdog

Regular
I can't help but agree with you. First thing tomorrow morning I will be emailing TD requesting Brainchip only employ deadbeats until revenue kicks in.
Well, how about Brainchip employs someone who can actually get the revenue to kick in? Silicon Valley is crawling with super-qualified, out-of-work technicians ATM; there have been colossal mass layoffs in the industry. I'm not convinced that any current additions have been attracted by AKIDA any more than by their need to put food on the table. I just think that people here put way too much weight on the addition of new employees and far too little emphasis on the fact that we have made very little progress on revenue-producing IP sales. Perhaps the next 4C will prove me wrong; I certainly hope so, but how many consecutive 4Cs without self-sustaining revenue will be acceptable? Oh, but you don't have to consider that difficult question, because we've just paid for another super-qualified technician whom we might not actually need.
 
Reactions: 5 users
Evening team, any reason for a Valeo Deep Learning employee to share our Gen2 news? :cool:
Screenshot_20230327_214946_LinkedIn.jpg
Screenshot_20230327_215126_LinkedIn.jpg
Screenshot_20230327_215146_LinkedIn.jpg
 
Reactions: 79 users

Slade

Top 20
Brainchip could save a lot of money if they replaced Peter and Anil with University work experience students. Same Same.
 
Reactions: 22 users
I’m just about to listen to this podcast about VVDN’s future roadmap so I can’t comment on the content.





Edit: heard it now. No mention of Brainchip. More on the company's expansion plans to support "Made in India": plans to expand from 10k staff to 35k staff! Also spending 100 mil on a new PCB plant.

Possibly 1.2 bil in revenue to compete for in India!

Interesting 17-min podcast!
 
Reactions: 17 users
Reactions: 30 users

charles2

Regular
Reactions: 11 users
Was just reading an article on some old news about the AKD1500.

Read it, then had a look at the original BRN release.


The article's author made a statement that I didn't see in the original BRN release, but maybe it was mentioned elsewhere and I probably missed it?

If I didn't miss it and it is new, then it's nice to know :)

Highlighted towards the end of the write-up.



BRAINCHIP ANNOUNCES SUCCESSFULLY TAPED OUT AKD1500 CHIP ON GLOBALFOUNDRIES’ 22NM FD-SOI PROCESS

Posted on 4 March, 2023 by Abhishek Jadhav
BrainChip Tapes Its Akida AKD1500
BrainChip, a leading provider of ultra-low power edge AI technology, has successfully taped out its AKD1500 chip on GlobalFoundries’ 22nm FD-SOI process. This marks an important milestone for BrainChip as it proves the portability of its technology and enables the company to take advantage of the benefits of GlobalFoundries’ process.
The AKD1500 is BrainChip’s flagship product designed to deliver AI processing capabilities on edge. The chip features BrainChip’s patented spiking neural network (SNN) technology, which is capable of learning, recognizing, and processing patterns in real-time. The AKD1500 is ideal for various applications, including advanced driver assistance systems (ADAS), surveillance, and autonomous robotics.
The AKD1500 combines event-based Akida AI IP with GlobalFoundry’s low leakage FDX process technology platform to deliver an always-on, at-sensor application with low power consumption, suitable for industrial automation and the automotive industry.
GlobalFoundries’ 22nm FD-SOI process is an ideal choice for BrainChip as it provides several benefits, including low power consumption, high performance, and excellent reliability. The process is also well-suited for edge AI applications as it offers a compact form factor and low cost.
“This is an important validation milestone in the continuing innovation of our event-based, neuromorphic Akida IP, even if it is fully digital and portable across foundries. The AKD1500 reference chip using GlobalFoundries’ very low-leakage FD SOI platform, showcases the possibilities for intelligent sensors in edge AI.”
The AKD1500 chip is expected to be available soon and will be a significant addition to BrainChip’s portfolio of edge AI solutions. The company has already received strong interest from several customers looking to use AKD1500 in their products.
BrainChip’s successful tape-out of the AKD1500 chip in GlobalFoundries’ 22nm FD-SOI process is an important milestone for the company. This will help BrainChip to deliver more innovative and advanced AI solutions to its customers in the future.
 
Reactions: 67 users

Sirod69

bavarian girl ;-)
BrainChip
2 hrs

Edge Impulse - Edge ML Series is starting soon. Don't miss the BrainChip keynote - “Edge AI: Ready When You Are!” Register here: https://lnkd.in/gxbB96Mp

1679938078312.png

The Edge ML Series is coming to San Jose on March 27th! Mark your calendars and request your invite now.

This exclusive, in-person, one-day event will explore the benefits of edge machine learning, ways to differentiate your products with embedded intelligence, and how to deliver value in less time while lowering operational cost using AI tools like Edge Impulse. Featuring:

• Keynotes from industry leaders

• Hands-on workshops

• Customer stories

• Insights on deploying ML solutions at scale

• Demos

• Networking opportunities

Agenda (PT)
8:30–9:00 Registration and coffee

9:00–10:40 Keynotes


  • Welcome and housekeeping (10 min)
  • “Demystifying Edge ML” — Edge Impulse keynote (30 min)
    The fast-evolving ecosystem of edge ML includes silicon, sensors, connectivity, and more. We'll guide you through the world of edge ML and uncover its potential for your business.
  • "Advancing Intelligence at the Edge with Texas Instruments" — Texas Instruments keynote (30 min)
  • “Edge AI: Ready When You Are!” — BrainChip keynote (30 min)
    AI promises to provide new capabilities, efficiencies, and economic growth, all with the potential for improving the human condition. There are numerous challenges to delivering on this, in particular the need for performant, secure edge AI. We'll describe the readiness of the industry to deliver edge AI, and the path to the imminent transition.
10:40–11:00 Coffee break (demo area)

11:00–12:00 Seminars


  • Edge Impulse — Use case (20 min)
  • “Why AI Acceleration Matters for Edge Devices” — Alif (20 min)
    Three key parameters to consider when selecting a hardware platform for AI-enabled edge ML are performance, power consumption, and price. This talk will help you maximize your project's chance of success by showing you how to maximize the performance parameter while keeping the others in check.
  • “The Advantages of Nordic’s Ultra-Low-Power Wireless Solutions and Machine Learning” — Nordic Semiconductors (20 min)
    Nordic Semiconductor creates low-power wireless devices that utilize popular protocols including Bluetooth Low Energy, Wi-Fi 6, and cellular IoT. Optimizing the radio transmitter can only save power up to a point; implementing ML on the processor to detect and send only relevant data can reduce power requirements on SoCs and SiPs much further. Discover how Nordic's wireless technology can leverage ML to achieve ultra-low power without needing a dedicated machine learning core (a toy sketch of this pattern follows the agenda below).
12:00–13:00 Lunch (demo area)

  • Demos from TI, BrainChip, Alif, Nordic, Sony, MemryX, and NovTech
13:00–15:30 Workshops

  • "Hands-On With the TDA4VM" — Texas Instruments workshop
    (75 min)
  • Break (15 min)
  • “Enabling the Age of Intelligent Machines” — Alif workshop
    (60 min)
    Alif Semiconductor and Edge Impulse will demonstrate how anyone can create, train, tune, and deploy advanced machine learning models on the next-generation Ensemble family of AI-accelerated microcontrollers.
15:30–16:00 Wrap and close
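The Nordic seminar blurb above is, to me, the key power argument for edge ML: keep the radio asleep unless a small on-device model flags something worth sending. Here is a toy sketch of that gating pattern; the classifier, threshold and send_over_radio() function are placeholders I made up, not Nordic or Edge Impulse APIs.

```python
# Toy sketch of radio gating: run a small on-device model over each sensor
# window and only wake the radio when something relevant is detected.
# All names and values below are invented placeholders.

import random

RELEVANCE_THRESHOLD = 0.7

def tiny_classifier(window):
    """Stand-in for an embedded ML model; returns a relevance score in [0, 1]."""
    return sum(window) / len(window)

def send_over_radio(payload):
    print("radio TX:", payload)  # the expensive step we want to keep rare

def sample_window(n=16):
    """Stand-in for reading n sensor samples (e.g. accelerometer)."""
    return [random.random() for _ in range(n)]

if __name__ == "__main__":
    for i in range(10):
        score = tiny_classifier(sample_window())
        if score >= RELEVANCE_THRESHOLD:      # only power up the radio for events
            send_over_radio({"window": i, "score": round(score, 2)})
        # otherwise: stay in low-power mode, transmit nothing
```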


 
Reactions: 42 users

cosors

👀
Just a snippet.
She has an interesting job:
1679925227165.png

How AI research is creating new luxury in the vehicle.
"Hey Mercedes, drive me to work" - With artificial intelligence (AI), we will interact with our car intuitively in the future. Even small talk will be possible then," says Dr Teresa Botschen. Together with her colleagues in the AI Research Team at Mercedes-Benz, the PhD computer scientist and AI expert is working on making vehicles smarter - for even more comfort and the perfect driving experience. In her interview, she tells us how she bridges the gap between research and the vehicle, and how her team can implement ideas together.

Dr. Botschen, will vehicles soon be as smart as humans?

That depends on your definition of "smart" (laughs). Vehicles will certainly be able to react much more flexibly to different situations in the future. Machine learning makes this possible. The most prominent example of this is currently the development of automated vehicles. However, artificial intelligence offers many more fields of application. At Mercedes-Benz we use technology in a wide variety of areas. One field that I find particularly exciting is driver-vehicle communication, for example the question: How will we communicate and interact with our vehicles in the future? The potential here is huge.

At Mercedes-Benz for AI: In the AI Research Team, Dr. Teresa Botschen supports users from all over the Group in using machine learning.
In the AI Research Team you are the expert for Natural Language Processing. What exactly is this about?

Put simply, I teach our cars to understand human language better. A major challenge for machines is the ambiguity of our language. This often leads to misunderstandings in communication between people. A simple example: In the distant future, when you tell your self-driving vehicle to "park at the bank", it could be taken to mean park on the river bank or near the bank branch around the corner. It depends on the context. In our team, we are developing methods to enable vehicles to make situational decisions, for example by using additional cameras and combining voice and image processing in a multimodal way.

This means that in the future we will be able to have a proper conversation with our vehicle?

Almost (laughs). Imagine you are on your way to work in the morning and have your car read out the latest news to you, you comment on the news and the car searches for corresponding further information pertaining to your comment. This can already come very close to a real conversation, though we place great emphasis on the transparent and responsible use of AI. Our goal is not to pretend that the driver is interacting with a human being, but to support the driver in the car in the best possible way. But human-vehicle interaction will certainly become much more intuitive, and at the same time the vehicles will become more proactive.

What do you mean by that?

If you give your automated vehicle the order "Drive me to Stuttgart", the system could recognise who is currently on board and make individual suggestions for the route. For example, you get into the car early in the morning, carrying your laptop bag. The algorithm deduces that you are on your way to work and offers you a stop at your favourite bakery. And if you get in with your family in the afternoon, perhaps a stopover at a nice area with a playground.

One field that I find particularly exciting is driver-vehicle communication, for example how will we communicate with our vehicles in the future? There is huge potential for AI here.
Well connected to the next innovation: When developing smart AI solutions, Dr. Teresa Botschen is in contact with start-ups from the community and research.
And in the AI Research Team, you develop solutions to implement such technologies at Mercedes-Benz?

Yes, but language processing and driver-vehicle communication are only part of our projects. With our team, we support specialist departments across the whole Group in using various machine learning technologies for their processes - whether as a development tool or in the vehicle, for example for a Digital Luxury Experience, which are applications that ensure the perfect driving experience in our vehicles. Every project is different. There is no such thing as "one size fits all". That's what makes my work so exciting.

How would you describe your team?

Quite multifaceted and interdisciplinary. We are colleagues from very different disciplines - from physics and mathematics to electrical engineering and computer science. This is also important when we work on new, innovative topics. Depending on the expertise required for a project, we assemble a project group with two or three colleagues. Of course, we always work very closely with the development departments, but also with Legal. And for projects involving language and text processing, I have established a Group-wide NLP (Natural Language Processing) community together with a core team.

Simply try out new ideas: At Mercedes-Benz, Dr. Teresa Botschen has a lot of freedom to advance research on AI - and to put it directly into practice.
What makes the AI Research Team special, in your opinion?

We are a motivated and creative group. The interdisciplinary exchange is super-exciting, and often brings completely new perspectives. And we have a great team spirit, not only in the AI Research Team but in the whole department. We often also offer subjects for academic theses or projects for students in our team. The exchange with universities and start-ups is very important for me personally.

Because you come from a research background?

Yes, before I started in the AI Research Team, I was a doctoral student at the Technical University of Darmstadt, where I did my doctorate in the research field of Natural Language Processing and developed systems for automatic text analysis and multimodal speech understanding. During this time, I attended an international conference where my later colleagues from the Group also presented new research results, and we got talking. I was impressed by the depth of Mercedes-Benz's research into artificial intelligence.

What makes Mercedes-Benz a good employer for you?

What I personally find great is that here I have the opportunity to implement the results from research in specific applications. Mercedes-Benz is investing a great deal in the future. We have a large innovation workshop where we can build prototypes and simply try out new ideas. After all, research means that sometimes the things that emerge can't immediately be mass-produced. Here we have the freedom to accelerate research into AI.

Finally, we would like to know: What is the must-have in your dream office?

A place for my shepherd dog. She keeps me company in the home office right now, and every video conference with colleagues becomes much more relaxed when she moves into the picture every now and then, to see whom I am talking to (laughs).

Dr Teresa Botschen (30) brings the latest AI technologies from science into the car at the Mercedes-Benz Group. Together with her interdisciplinary team, she is driving AI research forward in the Group. The natural language processing expert started her journey with artificial intelligence in the Cognitive Science programme at the University of Tübingen. She completed her Master's degree in Computational Statistics and Machine Learning at University College London, followed by a doctorate in Natural Language Processing at the Technical University of Darmstadt. Besides her research at Mercedes-Benz, she enjoys discovering the world on city trips and mountain tours, singing musical songs in a choir or having fun training her dog."
https://group.mercedes-benz.com/car...-intelligence/interviews/teresa-botschen.html
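Her "park at the bank" example is essentially multimodal disambiguation: use what the vehicle's cameras see to choose between readings of an ambiguous phrase. A toy sketch of the idea, with entirely made-up candidate readings and scene hints (this is not Mercedes-Benz code):

```python
# Toy multimodal disambiguation: the phrase "park at the bank" has two
# readings, and camera-derived scene labels pick the more plausible one.
# Candidates, hints and scoring are invented for illustration only.

CANDIDATES = {
    "river bank": {"river", "water", "trees", "grass"},
    "bank branch": {"buildings", "atm", "storefront", "parking lot"},
}

def disambiguate(scene_labels):
    """Pick the reading whose expected visual context best matches the cameras."""
    scores = {reading: len(hints & scene_labels) for reading, hints in CANDIDATES.items()}
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    print(disambiguate({"buildings", "storefront", "cars"}))  # -> bank branch
    print(disambiguate({"river", "trees", "grass"}))          # -> river bank
```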
 
Reactions: 29 users

IloveLamp

Top 20
Reactions: 49 users

equanimous

Norse clairvoyant shapeshifter goddess
 
Reactions: 30 users

Mugen74

Regular
Think I'm crazy, but I'm developing the idea of bringing the AKD1000 board to the largest computer museum in the world.
Do you know Mr. Nixdorf? He was a pioneer of modern computing, even if his company's future took a different turn due to tragic decisions, and almost no one knows him today. But the foundation he set up behind this world's largest computer museum still exists, as does the museum itself.
If it were up to me, I would like to see this exciting step in computer history, neuromorphic computing, in the permanent exhibition as a further milestone: Loihi, TrueNorth and Akida in one exhibition. They are, of course, already dealing with the topic of NC.
Just a thought I had while listening to the last podcasts, but it is developing in my brain, and I am thinking about contacting the curator for the first time today, so that PVDM is remembered there alongside Heinz Nixdorf.
https://www.hnf.de/en/home.html
Don't forget Anil Mankar!
 
Reactions: 20 users

cosors

👀
Don't forget Anil Mankar!
I'm developing the thought. It is precisely because of comments like yours that I post this. It would certainly be possible for me to have that conversation.

___
Added:
I'm also interested in feeling out the resonance a little. It is perhaps still a little too early for this. Besides, the museum, like many others, is a place to make knowledge tangible and understandable. And how could that be done better than with event-based cameras or face, gesture or speech recognition?
But all of this is only just emerging. Nevertheless, it can be attractive for a museum to be up to date. And in addition, it is not just a matter of putting up another display case, but of initiating an exhibition for NC, and that would certainly take a year, in my mind. So I am wavering back and forth and will leave this here for a short while in case any of you have thoughts.

___
My thought on the other matter:
My thought about fuel recognition: I will let it rest from my side. It's just one of countless use cases the board is bound to be bombarded with, I think, and I don't feel in a position to address it properly. Even though the matter has now been decided in the EU, it was under a different heading. How, what and when will be evaluated at a later date. So there is more time if the company wanted to participate and to take the sensor technology from wine to fuels. In addition, it is a politically charged topic.
 
Reactions: 8 users

Deadpool

Did someone say KFC
Reactions: 6 users

goodvibes

Regular
Intel Labs

Deep Learning with Spiking Neural Networks

Our Neuromorphic Computing Lab is highly active in researching Spiking Neural Networks (SNNs) and how they can be useful for deep learning. SNNs, sometimes dubbed the third generation of Artificial Neural Networks (ANNs), replace the non-linear activation functions in ANNs with spiking neurons and show real promise in AI, image processing, NLP, video compression, and more. Read the blog from Sumit Bam Shrestha to learn more about SNN research. https://intel.ly/40GzeiQ

#Research #DeepLearning #Developer
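The core idea in that Intel post, replacing an ANN's non-linear activation with spiking neurons, can be shown in a few lines. A toy illustration with made-up weights and timestep count (my own sketch, not Intel's Lava code):

```python
# Toy illustration: take a plain ANN layer with a ReLU activation and swap the
# activation for spiking neurons that emit 0/1 events over several timesteps,
# with the firing rate playing the role of the activation value.

import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 8))              # same weights for both versions
x = rng.normal(size=8)

# Classic ANN layer: continuous ReLU output
ann_out = np.maximum(W @ x, 0.0)

# "SNN" version: accumulate a leaky membrane potential over T timesteps and
# emit a spike whenever it crosses the threshold, resetting after each spike.
T, leak, threshold = 10, 0.8, 1.0
potential = np.zeros(4)
spike_counts = np.zeros(4)
for _ in range(T):
    potential = leak * potential + W @ x
    fired = potential >= threshold
    spike_counts += fired
    potential[fired] = 0.0               # reset the neurons that spiked

print("ReLU output :", np.round(ann_out, 2))
print("spike counts:", spike_counts)     # higher count ~ larger activation
```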

 
Reactions: 7 users
It's interesting that GM is awarding Valeo for service delivery in the ADAS arena:

1679949170587.png



 
Reactions: 47 users