BRN Discussion Ongoing

Aretemis

Regular
Hi Fmf,

That just triggered a couple of obscure dots ... Akida works on probability, what does the image most closely resemble?

In fact, I reckon we are on the path of the Infinite Improbability Drive. How many heads does PvdM have?
Not if the vogons get their hands on it first
 
  • Like
  • Haha
Reactions: 4 users
Recall someone (sorry, haven't searched who) had mentioned Cambridge Consultants previously.

Excerpt from an Oct article by their VP referencing the need for energy considerations. Right up our alley.



Efficient AI means businesses can achieve more with less​

Cambridge Consultants' Ram Naidu outlines how to pick the right technique for your AI needs.
October 20, 2022


Every business now has a concern over the rising cost of energy, linked to the carbon cost and the need for sustainable solutions. The more we can get away from the greedy energy demands of large compute costs and adopt efficient AI methods the better. This leads to an exploration of energy efficiency, whether through neuromorphic methods or using low-bit encoding. Again, there will not be a universal off-the-shelf solution to cutting energy costs. But it is a parameter we must consider and find where the right compromise can be made.

So, what does a successful AI solution look like? Its approach must depend on data quantity, labeled data availability, and the energy cost of implementation, amongst a host of other considerations. Looking at one component of this in isolation isn’t the path to success. A successful AI solution requires a holistic approach to cover the needs and costs with a mature view of all the competing drivers. This was inevitable - AI had so much success so soon with low-hanging fruit. As the field matures, so must our ability to approach AI with a clear eye on the value it can bring. If you’re all set for the summit, great. I look forward to seeing you, and perhaps continuing the conversation, at the IoT World & The AI Summit in Austin, Texas, on Nov. 2-3, 2022.
 
  • Like
  • Love
  • Fire
Reactions: 20 users

TopCat

Regular
They’re a design partner with ARM and they’ve worked with Prophesee to develop PureSentry, a way of detecting contamination in cell therapy.
 
  • Like
  • Fire
Reactions: 10 users
1671090440375.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 25 users

Mt09

Regular
  • Like
Reactions: 7 users

Diogenese

Top 20
I don’t remember seeing or reading this article before, but given Rob Telson's statement that they saw Nvidia more as a partner than as a competitor, and given that Nvidia, through Mercedes-Benz at least, is fully aware of AKIDA Science Fiction, I would like to think that in their role as a consultant to Sony's EV effort they may have mentioned Brainchip:

Computing Hardware Underpinning the Next Wave of Sony, Hyundai, and Mercedes EVs​

January 30, 2022 by Tyler Charboneau

Major automakers Sony, Hyundai, and Mercedes-Benz have recently announced their EV roadmaps. What computing hardware will appear in these vehicles?

With electric vehicles (EVs) becoming increasingly mainstream, automakers are engaging in the next great development war in hopes of elevating themselves above their competitors. Auto executives expect EVs, on average, to account for 52% of all sales by 2030. Accordingly, investing in new computing technologies and EV platforms is key.
While the battery is the heart of the EV, intelligently engineering the car's “brain” is equally important. The EV’s computer is responsible for controlling a plethora of functions—ranging from regenerative-braking feedback, to infotainment operation, to battery management, to instrument cluster operation. Specifically, embedded chips like the CPU enable these features.

Diagram of some EV subsystems

Diagram of some EV subsystems. Image used courtesy of MDPI

Modernized solutions like GM’s Super Cruise and Ultra Cruise claim to effectively handle 95% of driving scenarios. Ultra Cruise alone will leverage a new AI-capable 5nm processor. Drivers are demanding improved safety features like advanced lane centering, emergency braking, and adaptive cruise control. In fact, Volkswagen’s ID.4 EV received poor marks from buyers because it lacked such core capabilities.
What other hardware-level developments have manufacturers unveiled?

Sony Enters the EV Fray​

At CES 2022, Sony announced its intention to form a new company called Sony Mobility. This offshoot will be dedicated solely to exploring EV development—building on Sony’s 2020 VISION-S research initiative. While Sony unveiled its coupe EV prototype two years ago, dubbed VISION-S 01, this year’s VISION-S 02 prototype is an SUV. However, the company hasn’t committed to bringing these cars to mass-market consumers itself.
It’s said that both Qualcomm and NVIDIA have been involved throughout the development process. However, the two prominent electronics manufacturers haven’t made their involvement with Sony clear (and vice versa). Tesla has adopted NVIDIA hardware to support its machine-learning algorithms; it’s, therefore, possible that Sony has taken similar steps.
Additionally, NVIDIA has long touted its DRIVE Orin SoC, DRIVE Hyperion, and DRIVE AGX Pegasus SoC/GPU. These are specifically built to power autonomous vehicles. The same can be said for its DRIVE Sim program, which enables self-driving simulations based on dynamic data.

The NVIDIA DRIVE Atlan

The NVIDIA DRIVE Atlan. Image used courtesy of NVIDIA

The Sony VISION-S 02 features a number of internal displays and driver-monitoring features. This is where Qualcomm’s involvement may begin. The chipmaker previously introduced the Snapdragon Digital Chassis, a hardware-software suite that supports the following:
  • Advanced driver-assistance feature development
  • 4G, 5G, Wi-Fi, and Bluetooth connectivity
  • Virtual assistance, voice control, and graphical information
  • Car-to-Cloud connectivity
  • Navigation and GPS
It’s unclear if any of Sony’s EVs are reliant on either supplier for in-cabin functionality or overall development. However, both companies have a vested interest in the EV-AV market, and at least have held consulting roles with Sony for two years.

Hyundai and IonQ Join Forces​


Since Hyundai unveiled its BlueOn electric car in 2010, the company has been hard at work developing improved EVs behind the scenes. These efforts have led to recent releases of the IONIQ EV and Kona Electric. However, the automaker concedes that battery challenges have plagued the ownership experience of EVs following their market launch. Batteries continue to suffer wear and tear from charge and discharge cycling. Capacities have left something to be desired, as have overall durability and safety throughout an EV’s lifespan.
A recent partnership with quantum-computing experts at IonQ aims to solve many of these problems. Additionally, the duo hopes to lower battery costs while improving efficiency along the way. IonQ’s quantum processors are doing the legwork here—alongside the company’s quantum algorithms. The goal is to study lithium-based battery chemistries while leveraging Hyundai’s data and expertise in the area.

IonQ

One of IonQ’s ion-trap chips announced in August 2021. Image used courtesy of IonQ

By 2025, Hyundai is aiming to introduce more than 12 battery electric vehicles (BEVs) to consumers. Batteries remain the most expensive component in all EVs, and there’s a major incentive to reduce their costs and pass savings down to consumers. This will boost EV uptake. While the partnership isn’t supplying Hyundai vehicles with hardware components at scale, the venture could help Hyundai design better chip-dependent battery-management systems in the future.

Mercedes-Benz Delivers Smarter Operation​

Stemming from time in the lab, including contributions from Formula 1 and Formula E, Mercedes-Benz has developed its next-generation VISION EQXX vehicle. A major selling point of Mercedes’ newest EV is the cockpit design—which features displays and graphics spanning the vehicle’s entire width. The car is designed to be human-centric and actually mimic the human mind during operation.
How is this possible? The German automaker has incorporated BrainChip’s Akida neural processor and associated software suite. This chipset powers the EQXX’s onboard systems and runs spiking neural networks. This operation saves power by only consuming energy during periods of learning or processing. Such coding dramatically lowers energy consumption.

Diagram of some of Akida's IP

Diagram of some of Akida's IP. Image used courtesy of Brainchip

Additionally, it makes driver interaction much smoother via voice control. Keyword recognition is now five to ten times more accurate than it is within competing systems, according to Mercedes. The result is described as a better driving experience while markedly reducing AI energy needs across the vehicle’s entirety. The EQXX and EVs after it will think in much more humanistic ways and support continuous learning. By doing so, Mercedes hopes to continually refine the driving experience throughout periods of extended ownership, across hundreds of thousands of miles.

The Future of EV Electronics​

While companies have achieved Level 2+ autonomy through driver-assistance packages, upgradeable EV software systems may eventually unlock fully-fledged self-driving. Accordingly, chip-level innovations are surging forward to meet future demand.
It’s clear that EV development has opened numerous doors for electrical engineers and design teams. The inclusion of groundbreaking new components rooted in AI and ML will help drivers connect more effectively with their vehicles. Interestingly, different automakers are taking different approaches on both software and hardware fronts.
Harmonizing these two facets of EV computing will help ensure a better future for battery-powered cars—making them more accessible and affordable to boot.
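The power-saving claim in the Mercedes section above ("only consuming energy during periods of learning or processing") is essentially a sparsity argument. Here is a toy sketch of the principle, not Akida's actual implementation: with an event-based input, the work an event-driven processor does scales with the number of events, not the size of the input.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "event stream": 10,000 inputs, ~2% of them carrying a spike.
spikes = (rng.random(10_000) < 0.02).astype(np.float32)
weights = rng.random(10_000).astype(np.float32)

# Frame-based (dense) accelerator: one MAC per input, spike or not.
dense_macs = spikes.size

# Event-driven processor: work only where a spike occurred.
event_macs = int(np.count_nonzero(spikes))

print(f"dense MACs: {dense_macs}")
print(f"event MACs: {event_macs} (~{100 * event_macs / dense_macs:.0f}% of dense)")

# Both paths produce the same weighted sum; the event-driven one
# simply never touches the zeros.
assert np.isclose(weights @ spikes, weights[spikes > 0].sum())
```

Scale that ratio across every layer of a network running continuously in a car and the energy gap with a dense accelerator becomes large.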


Brainchip's stated ambition in automotive is first to make every automotive sensor smart, and later to take control by becoming the central processing unit to which all these smart sensors report.

My opinion only so DYOR
FF

AKIDA BALLISTA

PS: As we approach the festive season, when hopefully there will be time for reflection: if you have been too busy to decide upon a plan, please use some of that time to do so, as 2023 is shaping up as a breakout year for Brainchip.

If it was not clear to you from the MF article, it should be: manipulators are already planning their activities for 2023 and will be out in force. Even if the price is rising off the back of price-sensitive announcements, they will claim that whatever income starts to appear does not justify the share price, hoping to manipulate retail holders.

The only way to avoid being manipulated is to have a plan locked in before emotion comes into play and hasty decisions are made which later become a cause for regret.

I always find that it is useful to look at the timing of events such as collaborations, product launches ...

Sony Mobility was announced as a concept at CES 2022.

Sony Enters the EV Fray​

At CES 2022, Sony announced its intention to form a new company called Sony Mobility. This offshoot will be dedicated solely to exploring EV development—building on Sony’s 2020 VISION-S research initiative. While Sony unveiled its coupe EV prototype two years ago, dubbed VISION-S 01, this year’s VISION-S 02 prototype is an SUV. However, the company hasn’t committed to bringing these cars to mass-market consumers itself.
It’s said that both Qualcomm and NVIDIA have been involved throughout the development process. However, the two prominent electronics manufacturers haven’t made their involvement with Sony clear (and vice versa). Tesla has adopted NVIDIA hardware to support its machine-learning algorithms; it’s, therefore, possible that Sony has taken similar steps.
Additionally, NVIDIA has long touted its DRIVE Orin SoC, DRIVE Hyperion, and DRIVE AGX Pegasus SoC/GPU. These are specifically built to power autonomous vehicles. The same can be said for its DRIVE Sim program, which enables self-driving simulations based on dynamic data.


CES 2022 was in January, 6 months before the Prophesee/Akida reveal and 6 months after the Sony/Prophesee collaboration was announced (but a couple of years after the collaboration commenced ... NDA?)


https://www.prophesee.ai/2021/09/09/sony-event-based-vision-sensors-prophesee-co-development/

20210909:

Sony to Release Two Types of Stacked Event-Based Vision Sensors with the Industry’s Smallest*1 4.86μm Pixel Size for Detecting Subject Changes Only​


Atsugi, Japan — Sony Semiconductor Solutions Corporation (“Sony”) [SSS] today announced the upcoming release of two types of stacked event-based vision sensors. These sensors designed for industrial equipment are capable of detecting only subject changes, and achieve the industry’s smallest*1 pixel size of 4.86μm.

These two sensors were made possible through a collaboration between Sony and Prophesee, by combining Sony’s CMOS image sensor technology with Prophesee’s unique event-based vision sensing technology.
This enables high-speed, high-precision data acquisition and contributes to improving the productivity of industrial equipment.

So clearly, Prophesee and Sony had been collaborating for quite a while before September 2021, going through initial feasibility analysis, software simulation, silicon design & tapeout, engineering samples ... . That looks like maybe 2 years of collaboration. Presumably iCatch/VeriSilicon were in at the start as well.

... and thanks to @Fullmoonfever we know that the logic was from iCatch using CNN hardware from VeriSilicon.

... and thanks to Prophesee (June 2022), we know Akida beat the socks off VeriSilicon and any other pretenders (including Qualcomm and Nvidia).

So maybe Sony Mobility can save itself a couple of years of futile tinkering ...

... if only they talk to SSS.

@Diogenese

Maybe the answer.

In 2020 iCatch started using the VeriSilicon NPU for the V37, their latest SoC at the time.

Would presume it has just been iterations of the same since.

Haven't looked into the VeriSilicon capabilities yet. Time for 💤



VeriSilicon VIP9000 and ZSP are Adopted by iCatch Next Generation AI-powered Automotive Image Processing SoC

Shanghai, China, May 12, 2020 – VeriSilicon today announced that iCatch Technology, Inc. (TPEX: 6695), a global leader in low-power and intelligent image processing SoC solutions, has selected VeriSilicon VIP9000 NPU and ZSPNano DSP IP. Both will be utilized in iCatch’s next generation AI-powered image processing SoC with embedded neural network accelerators powered by VeriSilicon’s NPU, for applications such as automotive electronics, industrial, appliance, consumer electronics, AIoT, smart home, commercial and more.
 
  • Like
  • Fire
  • Love
Reactions: 32 users
The International VLSI Design & Embedded Systems conference held in Hyderabad, India will be happening from 8 - 12 Jan 2023. A lot of the big name Semiconductor companies will be there.

Here's the link https://vlsid.org/

"International VLSI Design & Embedded Systems conference is a premier global conference with a legacy of over three and a half decades. This global annual technical conference, which focusses on the latest advancements in VLSI and Embedded Systems, is attended by over 2000 engineers, students & faculty, industry, academia, researchers, bureaucrats and government bodies.

Semiconductors are the intangible backbone of every industry across the globe. Silicon took the lion’s share over the past decades and remained the primary enabler for digitization of the world. With scaling reaching its fundamental limits, it is time to look at addressing technological challenges at higher levels of abstraction in CMOS based design and at the same time, look beyond Silicon for further performance enhancement.

VLSID 2023 – the first physical conference post pandemic – acts as a platform for industry and academia alike to discuss, deliberate and explore the frontiers of the semiconductor eco-system that could eventually enable disruptive technologies for global digitalization."
 
  • Like
  • Fire
  • Love
Reactions: 21 users

equanimous

Norse clairvoyant shapeshifter goddess
  • Like
  • Fire
  • Love
Reactions: 39 users

TechGirl

Founding Member
Word is getting out there (y)

Little 6 min podcast from yesterday. First 2 mins he talks about the OTC stock (worth listening to, as our tech will no doubt help them grow given what they are doing), next 2 mins he talks about BRN recently joining IFS, at minute 5 he talks about "the market for Machine Learning is projected to grow from $21.5 billion USD in 2021 to $276.58 billion by 2028", and the last minute is just his disclosures.


AI Eye Podcast 744: Stocks discussed: (OTCPINK: GTCH) (ASX: BRN) (NasdaqGS: INTC)​





Listen to today's podcast:

https://www.investorideas.com/Audio/Podcasts/2022/121422-AI-Eye.mp3


Vancouver, Kelowna, Delta, BC - December 14, 2022 (Investorideas.com Newswire) Investorideas.com, a global investor news source covering Artificial Intelligence (AI) brings you today's edition of The AI Eye - watching stock news, deal tracker and advancements in artificial intelligence - featuring technology company GBT Technologies Inc. (OTCPINK:GTCH).


AI Eye Podcast 744:​

Stocks discussed: (OTCPINK: GTCH) (ASX: BRN) (NasdaqGS: INTC)


Hear the AI Eye on Spotify
Today's Column -
The AI Eye - Watching stock news, deal tracker and advancements in artificial intelligence

GBT Files Continuation Application for AI-Powered Facial/Body Recognition Patent, and BrainChip Joins Intel Foundry Services


Stocks discussed: (OTCPINK:GTCH) (ASX:BRN) (NasdaqGS:INTC)


Link to website:
 
  • Like
  • Love
  • Fire
Reactions: 61 users
... and in the wheel department, we are looking for anyone who has any ideas about how to reduce the wear on the corners of our basalt square tyres.

We have also had some reports of mal de mer on our rectangular tyres, but only when they get out of synch.
That’s easily solved: just fit square pneumatic rubber-compound tyres.

Had exactly the same problem with my wheel barrow. Fixed it instantly.😂🤣🤡😂😂🤓
 
  • Haha
  • Like
  • Wow
Reactions: 9 users
“the market for Machine Learning is projected to grow from $21.5 billion USD in 2021 to $276.58 billion by 2028",


You were waiting for this.

Some, though, will think it's Science Fiction, but one percent by 2027 is approximately $2.76 billion.

Half of one percent by 2027 is $1.38 billion approximately.
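For what it's worth, the back-of-envelope maths checks out against the quoted projection (a quick Python sanity check; the $276.58 billion figure is the 2028 number quoted from the podcast):

```python
# Market-share arithmetic against the projection quoted above.
market_billion = 276.58  # projected ML market size (USD billions)

one_percent = market_billion * 0.01
half_percent = market_billion * 0.005

print(f"1.0% share: ${one_percent:.2f} billion")   # $2.77 billion
print(f"0.5% share: ${half_percent:.2f} billion")  # $1.38 billion
```

The $2.76 billion above is the same number, just truncated rather than rounded.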

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 54 users

Braintonic

Regular
In everything everywhere = $276 billion by 2027 😉
 
  • Haha
  • Like
  • Fire
Reactions: 26 users

TechGirl

Founding Member
  • Haha
  • Like
Reactions: 17 users
Some of the recent changelog entries in MetaTF are quite interesting, including:
-Mentions of AKD500
-Mention of an attention layer, plus multiple mentions of transformer models such as ViT and DeiT (all these are for AKD2000).
-Mention of an Akida USB vendor from August

Note that a possible reason for AKD2000 being delayed could be that the software was not ready, so it's good to see continuing progress on this front.


2.2.6 (14 Dec)
  • [akida] Attention layer
  • [akida] Identify AKD500 devices
  • Transformers pretrained models updated to 4-bits

2.2.3 (23 Aug)
  • [akida] update Akida USB vendor ID
  • introduced ViT and DeiT transformer model architectures that are using quantizeml features
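For anyone wondering what "pretrained models updated to 4-bits" involves: I'm not reproducing MetaTF's actual quantizeml API here, but the underlying idea is standard low-bit weight quantization. A minimal generic sketch (my own illustration, not BrainChip code):

```python
import numpy as np

def quantize_weights(w, bits=4):
    """Uniform symmetric quantization: map float weights to signed
    `bits`-bit integer codes plus one float scale, so q * scale ~ w."""
    qmax = 2 ** (bits - 1) - 1                        # 7 for 4-bit signed
    scale = max(float(np.abs(w).max()) / qmax, 1e-12)
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

w = np.array([0.8, -0.31, 0.05, -0.7])
q, scale = quantize_weights(w, bits=4)
print(q)                       # 4-bit codes in [-8, 7]
print(np.round(q * scale, 2))  # dequantized approximation of w
```

Each weight then needs 4 bits instead of 32, with the quantization error bounded by half the scale, which is why 4-bit transformer models are such a big deal for on-device memory and energy.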
 
  • Like
  • Fire
  • Love
Reactions: 69 users
Longtime lurker from Germany,
but now I have to get a few thoughts off my mind regarding the discussion/rumors about the apple/sony/prophesee -> akida dots ...

If we can assume the main features/advantages of this sensor (sony/prophesee/akida fusion) would be:
  • low energy consumption (portable device?)
  • compatibility with high dynamic range light settings (darkness, bright sunlight etc.)
  • extremely fast object detection/recognition/tracking (at high frame rates)
I wonder if an iphone camera/sensor upgrade would be my first guess as a related product.
Sure, that makes sense too and would be a nice addition.
But couldn't it be about a product that needs all these features much more?

Something where object detection and recognizing perspective/view/position in a 3-dimensional environment
at the lowest possible energy consumption is an absolute requirement?

If I had to guess, I would assume we could be talking just as well about a product apple is rumored to finally reveal in 2023.
What do you think?
 

Boab

I wish I could paint like Vincent
I'm not sure what all this means, but it looks like the "Bug Fixes" have been reduced.
There is also the mention of Transformers again. Coincidence, given Rob's hint on the last podcast?
It appears we are making progress?
Thanks for sharing.
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Violin1

Regular
  • Like
Reactions: 6 users
Hi Slade,

There are several Maxim Integrated patents:

https://worldwide.espacenet.com/patent/search?q=pa = "maxim integrated" AND nftxt = "machine learning"

After close perusal of these documents, this seems to be the most relevant:

US2022334634A1 SYSTEMS AND METHODS FOR REDUCING POWER CONSUMPTION IN EMBEDDED MACHINE LEARNING ACCELERATORS

View attachment 20998

[0051] As previously mentioned, the architecture of hardware accelerator 508 may be different from that of memory 504 or the CPU that memory 504 is embedded in. For example, the bus word size of hardware accelerator 508 may be different from the typical 32-bit or 64-bit bus word size of the CPU or memory 504 . Instead, the architecture of hardware accelerator 508 may be optimized to efficiently perform computations on various sizes of data that do not nicely align with the sizes found in common memory devices.

The MAX78000 data sheet mentions 1, 2, 4, 8 bits:
https://datasheets.maximintegrated.com/en/ds/MAX78000.pdf
442k 8-Bit Weight Capacity with 1,2,4,8-Bit Weights


View attachment 20995

[0013] FIG. 8 depicts a simplified block diagram of a computing device/information handling system, in accordance with embodiments of the present disclosure.

The data sheet does not talk about spikes (except in the context of interference), but it looks like they are running 1, 2, 4, or 8 bits on CPU/GPU.
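A quick way to see why those 1/2/4/8-bit modes matter: the datasheet's "442k 8-bit weight capacity" is a fixed pool of weight memory, so lower-precision weights buy proportionally larger networks. Rough arithmetic, assuming dense packing and ignoring any per-layer overhead:

```python
# "442k 8-Bit Weight Capacity with 1,2,4,8-Bit Weights" implies a
# fixed weight memory of 442,000 bytes; capacity scales with 1/bits.
weight_memory_bits = 442_000 * 8

for bits in (8, 4, 2, 1):
    capacity = weight_memory_bits // bits
    print(f"{bits}-bit weights: {capacity:>9,} weights")
```

So the same silicon holds roughly 3.5 million weights at 1-bit precision versus 442k at 8-bit, which is the whole point of supporting sub-byte weights.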
@Diogenese

Was doing some googling as usual and came across some info on Maxim, did a TSE search and see you've had a look as well.

Anyway, it's on the MAX5871 Evaluation Kit (not sure when it came out), but it looks like it could be using SyncNN on Xilinx by Futurewei.

I could be wrong and SyncNN means something else in the diagrams?

If you do a SyncNN word search on the attached data sheet you'll find the mentions.



You'll probs be able to check and confirm, and if correct that could rule them out of Akida for the mo.

Also, an Aug 22 paper by a team at Futurewei Technologies & a Canadian uni about SyncNN on Xilinx, with various benchmarks against other systems.

Appears to be 4 & 8 bit SNN. Haven't read it all, but thought you may find it interesting.



 
Last edited:
  • Like
  • Fire
Reactions: 15 users

Diogenese

Top 20
Hi Fmf,

I looked at the MAX78000, which includes a hardware CNN accelerator. Interestingly it has an Arm Cortex-M4 and a 32-bit RISC-V processor, as well as the CNN accelerator. They talk about 1, 2, 4 and 8 bit weights in the NN accelerator, and about having the flexibility to support other NNs such as RNN and MLP, but they don't mention SNN.

A CNN accelerator will need MAC matrix silicon. They also refer to the CNN engine having a 442k weight memory.

1671113465104.png




1671113485439.png




Re MAX5871,
SYNCNN is an output port pin of the MAX5871 (see pg 48, bottom left, and zoom in):


1671114392995.png

The MAX5871 is an RF DAC; I can't see any relevance to SNNs.
https://au.mouser.com/new/maxim-integrated/maxim-max5871-dac/#Bullet-2
Maxim MAX5871 Interpolating & Modulating RF 16-bit DAC enables multi-standard and multi-band transmitters in wireless communications applications. The radio frequency digital-to-analog converter can directly synthesize up to 600MHz of instantaneous bandwidth from direct current to frequencies up to 2.8GHz. The 5.9Gsps device meets spectral mask requirements for communication standards such as Global System for Mobile (GSM), Universal Mobile Telecommunications System (UMTS), and Long-Term Evolution (LTE). Applications include cellular base-station transmitters, point-to-point microwave links, and wireless backhaul. MAX5871 also uses a differential current-steering architecture and can produce a 0dBm full-scale output signal level with a 50Ω load. The device consumes 2.5W at 4.9Gsps operating from 1.8V and 1.0V power supplies.

From the look of their "neural" patents, they are stuck in the CNN era.
https://worldwide.espacenet.com/patent/search?q=pa = "maxim integrated" AND nftxt = "neural"
 
  • Like
  • Love
  • Fire
Reactions: 28 users

Sirod69

bavarian girl ;-)

Why ADLINK & Arm?​

ADLINK is a trusted Edge AI solution provider, a partner with Arm, in Project Cassini and Arm SystemReady, and an active participant in PSA Certification and Parsec compliance programs. ADLINK is keen on assisting industry users to build an open, secured, and seamless cloud-native software experience. It effectively improves clients’ time-to-market deployment, OPEX savings, IoT device management, and security.
Based on Arm architecture, ADLINK also collaborates with Ampere, NXP, MediaTek, Qualcomm, and Rockchip in module computing development and value-added solutions across varied industries, including smart manufacturing, autonomous driving, robotics, AMR, drone, transportation, logistics, retail, infotainment, healthcare, security, and more.
With plug-and-play tools, development kits, and all-encompassing systems, ADLINK and Arm empower developers to accelerate and realize their innovations.

 
  • Like
  • Love
Reactions: 8 users