BRN Discussion Ongoing

Diogenese

Top 20
Look at the power savings with Akida. View attachment 1759 View attachment 1760

Can someone point me to a post or website with a comprehensive technical review/paper dealing with Akida? I'd like to have my friend (IT, ex-Google) read up on it. There are too many posts here now and I can't see the forest for the trees. I've already checked "The Brainchip Story" by Fact Finder and the general information on neuromorphic engineering, and am looking for something more technical.
Hi Berlinforever,

TechGirl posted a comparative performance graph a few pages back.
 
  • Like
Reactions: 5 users

YLJ

Swing/Position Trader


I'm sure this is not new to many of you here (it seems to have been uploaded 2 days ago), but after a few searches in the forum I didn't find any matching results, so just in case...
I would like to see some case demonstrations on something a little more subtle, though that is probably more connected to the sensors than to the chip. It looks like a while since they have uploaded that type of content, so hopefully more is to come...
 
  • Like
  • Fire
Reactions: 12 users

Diogenese

Top 20
Hi Berlinforever,

TechGirl posted a comparative performance graph a few pages back.
... or you could just take the word of Mercedes, Renesas, Socionext, TSMC, Valeo, MagikEye, Democritus University of Thrace, MegaChips ...
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Diogenese

Top 20
Hi Berlinforever,

TechGirl posted a comparative performance graph a few pages back.

... or there's this:
1646310649718.png
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Proga

Regular
Sorry to intrude on the NRL/Newtown discussion (didn't Singo fly over to Thailand straight away? I wonder why), but I have a worry about some of the newbies and forgotten soldiers of BRN on the HC site.
Yes, everyone left for legitimate reasons, but there is a bit of a concern.
The site, as most will have witnessed, is awash with all sorts of nonsense opinions and videos from the usual suspects, even more so now.
The problem is there are a few heroic BRN holders (not me, I don't know enough) battling an avalanche of negativity and misinformation over there.
I invested in BRN originally because of the quality of information supplied by the people on HC who are now here. It was a major factor in holding during grim times.
Now I fear that any newbies, and those having doubts, will fall victim to the siren song of these types running their agendas unfettered.
Is there any way of supplying one post per day with real information over there, which would at least give some respite from the constant fear-mongering that could affect investor sentiment?
We don't want The Dean, Shareman et al. creating the prevailing narrative, with investors leaving or potential investors shying away.
Yes, it's the individual investor's decision, but ironically it's these retail investors who are being duped by the very people who purport to be acting in their interests.
Help is needed over there.
Back to the fox hole. The best I can do is throw in the odd sarcastic comment. Doug Piranha would be proud.
The problem is, Paddle, that those few (in the past, many) actually thought they were heroic, but they are part of the problem. If you ignore trolls, they go away. What all the heroic holders were actually doing by replying was creating a target-rich environment for the trolls. And didn't the trolls love it. If the few heroic posters still left stopped directly replying to, i.e. engaging, the trolls, the trolls would soon disappear because all the fun would be gone.
 
  • Like
  • Thinking
Reactions: 12 users

AusEire

Founding Member. It's ok to say No to Dot Joining
Peter (in the first pic), if you're in here, mate, I'd love to know how the investigation is going. These attacks are outrageous!

@butcherano solid effort tackling that muppet 😂

I would like to address one thing though! To the scumbag who accused PVDM of scamming investors: I call on you to put your name to the accusation on here 🤬
 

Attachments

  • Screenshot_20220303-234644.png (541 KB)
  • Screenshot_20220303-234107.png (507.5 KB)
  • Like
  • Wow
Reactions: 17 users

Quatrojos

Regular
  • Like
  • Fire
Reactions: 13 users

Quatrojos

Regular
  • Like
  • Fire
  • Love
Reactions: 19 users


I'm sure this is not new to many of you here (it seems to have been uploaded 2 days ago), but after a few searches in the forum I didn't find any matching results, so just in case...
I would like to see some case demonstrations on something a little more subtle, though that is probably more connected to the sensors than to the chip. It looks like a while since they have uploaded that type of content, so hopefully more is to come...


@YLJ

You may or may not have seen this when I posted it here https://thestockexchange.com.au/threads/asynchronously-coded-electronic-skin-aces-platform.3385/

This Singapore company is all about the sensors, but the algos etc. are written with Akida or Loihi as the preferred tactile processing chips.

Their website says they should be shipping in early 2022.



To break new ground in robotic perception, we are exploring neuromorphic technology – an area of computing that emulates the neural structure and operation of the human brain – to process sensory data from ACES.

We are developing sensory integrated artificial brain system that mimics biological neural networks, which can run on a power-efficient neuromorphic processor, such as Intel’s Loihi chip and Brainchip’s Akida neural processor. This novel system integrates ACES and vision sensors, equipping robots with the ability to draw accurate conclusions about the objects they are grasping based on the data captured by the sensors in real-time - while operating at a power level efficient enough to be deployed directly inside the robot.


The videos give an idea of what it's capable of sensor-wise, but it obviously needs processing, for which they indicated their preferred chips, so it would be good if they move units WITH AKIDA or with advice that AKIDA is the PREFERRED chip.





 
  • Like
  • Fire
Reactions: 25 users
Renesas has been busy, or so it seems, with what @Quatrojos posted and the release below.

Question for anyone more tech savvy.

Fixstars appears more aligned to the software and CNN side, with Renesas providing the evaluation boards with the chip.

With Akida's CNN2SNN capability, I wonder whether the specialised development by Fixstars for ADAS/AD could easily be carried across to a board with Akida processing, removing the cloud requirements 🤔
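For anyone wondering what CNN-to-SNN conversion means in practice, here's a rough pure-Python sketch of the underlying idea: quantize a trained CNN activation down to a few bits, then express it as a spike count (rate coding) instead of a continuous value. This is a conceptual illustration only, not BrainChip's actual MetaTF/CNN2SNN API, and the 4-bit choice is just an assumption for the example.

```python
# Conceptual CNN-to-SNN sketch: a continuous ReLU-style activation is
# quantized to a 4-bit level, and that level becomes a spike count.

def quantize(value, bits=4, max_value=1.0):
    """Map a float activation in [0, max_value] onto an integer level."""
    levels = (1 << bits) - 1                 # 15 levels for 4 bits
    clamped = max(0.0, min(value, max_value))
    return round(clamped / max_value * levels)

def to_spike_count(value, bits=4):
    """Rate coding: a stronger activation fires more spikes per window."""
    return quantize(value, bits)

activations = [0.0, 0.2, 0.5, 1.0]
print([to_spike_count(a) for a in activations])   # [0, 3, 8, 15]
```

The point of the exercise: once activations are small integer spike counts, a layer only has to do work when a spike arrives, which is where the event-driven power saving comes from.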


Renesas and Fixstars to Establish Automotive SW Platform Lab to Develop Software and Operating Environments for Deep Learning

As Part of Collaboration, Companies Launch Cloud-Based Evaluation Environment for R-Car SoCs to Enable Simple Initial Evaluation for Device Selection
TOKYO — (BUSINESS WIRE) — March 2, 2022 — Renesas Electronics Corporation (TSE:6723), a premier supplier of advanced semiconductor solutions, and Fixstars Corporation, a global leader in multi-core CPU/GPU/FPGA acceleration technology, announced their intention to collaborate in the automotive deep learning field. In April 2022, the two companies will establish an Automotive SW Platform Lab tasked with the development of software and operating environments for Renesas automotive devices. The new Lab will support early development and ongoing operation of advanced driver-assistance systems (ADAS) and autonomous driving (AD) systems. The two companies will develop technologies aimed at software development for deep learning and building operating environments that have the ability to continuously update learned network models to maintain and enhance recognition accuracy and performance.
This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20220302005331/en/
(Graphic: Business Wire)

(Graphic: Business Wire)
“Fixstars possesses both advanced software technology for deep learning and optimization technology that allows more efficient utilization of hardware,” said Takeshi Kataoka, Senior Vice President, General Manager of the Automotive Solution Business Unit at Renesas. “I am confident that our collaboration will enable us to provide strong support for software development optimized for automotive applications and allow our customers to fully leverage the superior performance of Renesas’ automotive devices.”
“After developing a deep learning application, it is not possible to maintain high recognition accuracy and performance without constantly updating it with the latest learning data,” said Satoshi Miki, CEO of Fixstars. “Fixstars plans to focus on these machine learning operations (MLOps) for the automotive field, as we work together with Renesas to develop a deep learning development platform optimized for Renesas devices.”
GENESIS for R-Car Cloud-Based Evaluation Environment
As part of their collaboration, today Renesas and Fixstars are launching GENESIS for R-Car, a cloud-based evaluation environment for R-Car that supports early development of ADAS and AD systems. The new environment facilitates instant initial evaluations when selecting devices. It utilizes the GENESIS cloud-based device evaluation environment from Fixstars as its platform.
Poring over specifications is time consuming and inefficient. Evaluation based on actual use cases is essential when selecting devices. Users typically need to obtain an evaluation board and basic software to evaluate devices, and technical expertise is also required in order to build an evaluation environment. The new GENESIS for R-Car cloud-based evaluation environment does not require specialized technical expertise.
GENESIS for R-Car lets engineers confirm the processing execution time in frames per second (fps) and recognition accuracy percentage of R-Car V3H’s CNN accelerators on sample images using generic CNN models, such as ResNet or MobileNet. It also allows engineers to select the device and network they wish to evaluate and perform operations remotely on an actual board. Engineers can use the GENESIS environment to confirm evaluation results in tasks such as image classification and object detection, with the option to use their own images or video data. This greatly simplifies the initial evaluation to determine whether R-Car V3H is suitable for the customer’s system. Future plans include the rollout of a service that will allow customers to use their own CNN models for evaluations.
Availability
The GENESIS for R-Car is available now. For more information, please visit: https://www.renesas.com/software-tool/genesis-r-car-fixstars.
 
  • Like
Reactions: 7 users

Quatrojos

Regular

'...Renesas already has an impressive global customer base in industrial automation. This includes companies such as Rockwell Automation and Emerson in the Americas, Siemens, Hilscher and Schneider Electric in Europe, Inovance and Delta in the Asia-Pacific region and heavy hitters Mitsubishi Electric, Fanuc and Yaskawa in Japan.

With these and many other customers under its belt, Renesas touts itself as a leading semiconductor supplier for "Industry 4.0." In this vision for the future, highly integrated smart factories dominate the industrial landscape, promising to significantly enhance productivity, efficiency, and safety by implementing advanced automation, control, monitoring, and analysis capabilities...'

 
  • Like
Reactions: 6 users

Diogenese

Top 20
Renesas has been busy, or so it seems, with what @Quatrojos posted and the release below.

Question for anyone more tech savvy.

Fixstars appears more aligned to the software and CNN side, with Renesas providing the evaluation boards with the chip.

With Akida's CNN2SNN capability, I wonder whether the specialised development by Fixstars for ADAS/AD could easily be carried across to a board with Akida processing, removing the cloud requirements 🤔



I would like to see the next update of the R-Car V3H incorporate Akida 1000 ... see:
1646320436062.png
in the block diagram below.

https://www.renesas.com/us/en/document/fly/renesas-r-car-v3h?language=en&r=1215766
1646320297533.png


https://www.renesas.com/us/en/produ...ntelligent-camera-deep-learning-capabilities?

Building on the state-of-the-art recognition technology introduced with the R-Car V3H in February 2018, which includes integrated IP for convolutional neural networks (CNN), the updated R-Car V3H delivers 4 times the performance for CNN processing compared to the earlier version

Features

  • Low power consumption and highly-efficient image recognition engine delivers up to 7.2 TOPS, including 3.7 TOPS for CNN, with optimized performance-to-power balance
  • Quad Arm® Cortex®-A53 for application programming and dual Cortex-R7 lockstep cores to run AUTOSAR, supporting ASIL D development process for systematic capability for the full SOC
  • Features a full set of video processing and image recognition IP for advanced sensing and recognition, including CNN-IP, computer vision engine, image-distortion-correction IPs, stereovision, classifier and dense optical flow
I wonder what "low power" is in this context.
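One way to pin down what "low power" means is energy per inference (power divided by throughput). The figures below are made-up placeholders to show the arithmetic, NOT published numbers for the R-Car V3H or Akida.

```python
# Back-of-envelope energy-per-inference comparison.
# All wattage/throughput numbers are illustrative assumptions only.

def energy_per_inference_mj(power_watts, inferences_per_second):
    """Energy per inference in millijoules: power / throughput * 1000."""
    return power_watts / inferences_per_second * 1000.0

# Hypothetical operating points at the same 100 inferences/sec:
soc_accelerator = energy_per_inference_mj(4.0, 100)   # a few-watt SoC block
neuromorphic = energy_per_inference_mj(0.3, 100)      # sub-watt event-based part

print(f"{soc_accelerator:.1f} mJ vs {neuromorphic:.1f} mJ per inference")
```

Two parts quoted at the same TOPS can still differ by an order of magnitude on this metric, which is why a bare "low power" claim without watts and throughput is hard to evaluate.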
 
  • Like
  • Fire
Reactions: 20 users

buena suerte :-)

BOB Bank of Brainchip
As the big names keep rolling in... it's looking like we are going to be contributing to just about every global manufacturer, thanks to our great partnerships with RENESAS/MEGACHIPS/TSMC and others... from EVs to household products, phones, security, aeronautics, gaming, health care, accounting, recognition, safety, and the list is... ENDLESS!
We really do have such a unique 'once in a lifetime' revolutionary product that is GAME CHANGING ...Very exciting times ahead.

I would like to add that without the BRILLIANCE of PVDM, his endless research, and the untold hours clocked up in the lab and at home, chopping and changing until he finally achieved what he had been working towards for many years, none of this would be possible. Making his dreams and hard work come to fruition is testament to his passion, intelligence, and commitment to getting BRN to where it is now and to where it will be in the future. Without him, this conversation, forum, excitement, life-changing wealth, and world-changing tech would not exist. So for all that got on board: good luck and be happy. I am so pleased to have been part of this huge journey since 2015. There have been plenty of ups and downs, but I'm stoked to have kept my faith in this awesome company. For my wife and me it has changed our lives dramatically, and it will continue to do so! I hope this will be the same for many of you guys :) Also, thanks for the amazing wealth of knowledge that is shared by so many contributors on our great new site (thanks Zeeb0t) :) Over and out... take care... x
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 49 users

Yak52

Regular
"Usually they're just pinging away in the background and don't affect price much."

Herein lies the problem: a fair market is always fair, not just usually. If you went to a public auction and the auctioneer slammed his hammer down before the first bid was publicly made and said 'Sold to the bot that no one can see', you would immediately call foul. There is no difference here. A fair market provides an equal playing field. A fair market does not have classes of fairness.

My opinion only DYOR
FF

AKIDA BALLISTA


Brilliantly put, FF. I loved and laughed at your answer, as it is exactly so!

The "ONLY" exception I can see for using a bot is if you're a fundie/bank doing either a big sell-off or buy-up and you do not want to unduly affect the SP, so you use a bot to "dribble" sell or buy orders into the market over an extended time period. (It is rarely done just for this.)

The argument against this is: doesn't the market (i.e. all of us) deserve to know when a large player is either getting out (selling off) or jumping in (buying up)? That information would affect our decisions about our own holdings. Fair and equal playing field, no?

As far as what a bot does (trades), well, we traders used to sit around doing it ourselves before bots ever came into existence! Bots are definitely more efficient than we were, but this is no excuse for their use to control markets and SPs, and therefore companies!

Just my opinions anyway.
DYOR etc blah blah.

Yak52
 
  • Like
  • Fire
Reactions: 21 users

Yak52

Regular
My post was a response to the complaint about small value trades. Instos accumulating or distributing large orders in small chunks (like $10 every 30 seconds over 6 months) is perfectly legal. In fact it's a good thing. If an insto sold a holding of $10 mill in BRN in one chunk, you can imagine the chaos that would ensue. Even if they broke it down into $500k lots, it would affect the price and volatility enormously. It could easily trigger a crash. Instos pay low or no brokerage in return for their high turnover. If you or I were turning over $100's of millions each day, we'd be offered the same deal.
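The "large orders in small chunks" pattern described above can be sketched in a few lines. This is a toy TWAP-style schedule under the simplifying assumption of near-equal child orders; real execution algorithms randomise both size and timing to avoid being detected.

```python
# Toy order-slicing sketch: split a large parent order into
# near-equal child orders to be dribbled out over time.

def slice_order(total_shares, num_slices):
    """Split total_shares into num_slices near-equal child orders."""
    base, remainder = divmod(total_shares, num_slices)
    # The first `remainder` children each carry one extra share.
    return [base + (1 if i < remainder else 0) for i in range(num_slices)]

children = slice_order(1_000_000, 6)
print(children)            # [166667, 166667, 166667, 166667, 166666, 166666]
assert sum(children) == 1_000_000   # nothing lost to rounding
```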

The illegal stuff is something we have no control over. Retail traders just have to work within the constraints it imposes. What other choice is there? In any case, it's not always instos doing the wrong thing. Some of the bigger retail traders would manipulate the depth and message boards, I'm sure of it.
Hmmm... hello, windfall. I'm surprised to see your posts so supportive of the bot trading carried out by instos here on the ASX.

You seem to have some inside knowledge about bots and brokerage trading, which is interesting considering your claim that bigger retail traders would be manipulating the depth and message boards here!

Have you had another tag (ID) over on Hotcrapper that we would know of at all, windfall? @zeeb0t

Yak52.
 
  • Like
Reactions: 13 users

Yak52

Regular
I think you're quoting me there with "I am an accountant not a techie" but that post will have been days or weeks old. I can't seem to find it now.

There are plenty of people out there that have had plenty of hard knocks in their life and through the constant effort of fighting an uphill battle will learn to live with a 'victim mentality'. The below passage is an amazing example of this and there are many people that feel this way, whether deep down or at the surface level:

"I have never been right about anything else in my life. I never manage to buy a winning scratchie or lotto ticket. The car I bought last year was a lemon. The plumber ripped me off. I am a failure so how can I have got this right."

It's so sad that there are businesses out there that know this and will feed on it. Even sadder that the people who can't afford to be lured into this sort of strategy are the ones that usually are.

It can be a very cruel and unforgiving world.

Silver linings: at least our world is semi-civilised, unlike the animal kingdom, where it is literally kill-or-be-killed on a daily basis. For those that haven't seen the "Nature is Metal" Instagram account, check it out: https://www.instagram.com/natureismetal/

Very eye opening as to the benefits of having a civilized society.
Sorry, SERA2g... but looking at current events over in Ukraine, I fail to see what benefits Europe (especially the Ukrainians) is getting from having a so-called civilised society.

For those here in Oz: the similarities between the build-up between Ukraine and Russia leading to this mess and current Chinese directions are talked about, with detailed plans made by those living here in northern Qld. We have one of the three best beachhead locations on the east coast of Australia here, which again has operational similarities to the Ukrainian conflict. Also, within 10 km is Qld's best deep-water port, namely Mourilyan Harbour.
Many of us are ex-mil and have family currently in the services too. We feel strongly about what Ukraine is going through.

Yak52
 
  • Like
  • Thinking
  • Fire
Reactions: 17 users

Rskiff

Regular
Brainchip gets a great run-down again in this video: "market worth $116 billion by 2026". A healthy % of that would be nice for BRN!
 
  • Like
  • Fire
  • Love
Reactions: 29 users

Sickdude

Member
 
  • Like
  • Love
  • Fire
Reactions: 15 users

TechGirl

Founding Member
Brainchip just Tweeted



max-0137-image-for-home-page.jpg

March 3, 2022

Brain Chips are Here!


by Max Maxfield
I remember the heady thrill of the early 1980s when the unwashed masses (in the form of myself and my friends) first started to hear people talking about “expert systems.” These little rascals (the systems, not the people) were an early form of artificial intelligence (AI) that employed a knowledge base of facts and rules and an inference engine that could use the knowledge base to deduce new information.
It’s not unfair to say that expert systems were pretty darned clever for their time. Unfortunately, their capabilities were over-hyped, while any problems associated with creating and maintaining their knowledge bases were largely glossed over. It was also unfortunate that scurrilous members of the marketing masses started to attach the label “artificial intelligence” to anything and everything with gusto and abandon, but with little regard for the facts (see also An Expert Systems History).

As a result, by the end of the 1990s, the term “artificial intelligence” left a bad taste in everyone’s mouths. To be honest, I’d largely pushed AI to the back of my mind until it — and its machine learning (ML) and deep learning (DL) cousins — unexpectedly burst out of academia into the real world circa 2015, give or take.

There are, of course, multiple enablers to this technology, including incredible developments in algorithms and frameworks and processing power. Now, of course, AI is all around us, with new applications popping up on a daily basis.

Initially, modern incarnations of AI were predominantly to be found basking in the cloud (i.e., on powerful servers in data centers). The cloud is still where a lot of the training of AI neural networks takes place, but inferencing — using trained neural network models to make predictions — is increasingly working its way to the edge where the data is generated. In fact, the edge is expected to account for almost three quarters of the inference market by 2025.



max-0137-01-inference-market-2025.jpg

The predicted inference market in 2025 (Image source: BrainChip)

The thing about working on the edge is that you are almost invariably limited with respect to processing performance and power consumption. This leads us to the fact that the majority of artificial neural networks (ANNs) in use today are of a type known as a convolutional neural network (CNN), which relies heavily on matrix multiplications. One way to look at a CNN is that every part of the network is active all of the time. Another type of ANN, known as a spiking neural network (SNN), is based on events. That is, the SNN’s neurons do not transmit information at each propagation cycle (as is the case with multi-layer CNNs). Rather, they transmit information only when incoming signals cross a specific threshold.

The ins and outs (no pun intended) of all this are too complex to go into here. Suffice it to say that an SNN-based neuromorphic chip can process information in nanoseconds (as opposed to milliseconds in a human brain or a GPU) using only microwatts to milliwatts of power (as opposed to ~20 watts for the human brain and hundreds of watts for a GPU).
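The threshold behaviour described above can be sketched with a toy leaky integrate-and-fire neuron in plain Python. This is a conceptual illustration of event-driven spiking, not Akida's actual neuron model; the threshold and leak values are arbitrary.

```python
# Minimal event-driven sketch: the neuron integrates input, leaks a little
# each step, and only emits a spike when its potential crosses a threshold.

def run_spiking_neuron(inputs, threshold=1.0, leak=0.1):
    """Leaky integrate-and-fire over a list of input currents.
    Returns the spike train (1 = spike, 0 = silent)."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = max(0.0, potential - leak) + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0          # reset after firing
        else:
            spikes.append(0)
    return spikes

print(run_spiking_neuron([0.3, 0.3, 0.6, 0.0, 1.2]))  # [0, 0, 1, 0, 1]
```

Note how the neuron does nothing most of the time; in hardware, "nothing" costs almost no energy, which is the contrast with a CNN's always-on matrix multiplications.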

Now, if you are feeling unkind, you may be thinking something like, “Pull the other one, it’s got bells on.” Alternatively, if you are feeling a tad more charitable, you might be saying to yourself, “I’ve heard all of this before, and I’ve even seen demos, but I won’t believe anything until I’ve tried it for myself.” Either way, I have some news that will bring a smile to your lips, a song to your heart, and a glow to your day (you’re welcome).

You may recall us talking about a company called BrainChip here on EEJournal over the past couple of years (see BrainChip Debuts Neuromorphic Chip and Neuromorphic Revolution by Kevin Morris, for example). Well, I was just chatting with Anil Mankar, who is the Chief Development Officer at BrainChip. Anil informed me that the folks at BrainChip are now taking orders for Mini PCIe boards flaunting their AKD1000 neuromorphic AIoT chips.
max-0137-02-brainchip-akd1000-1024x640.jpg

Mini PCIe board boasting an AKD1000 neuromorphic AIoT chip
(Image source: BrainChip)


These boards and chips and BrainChip’s design environment support all of the common AI frameworks like TensorFlow, including trained CNN to SNN translation.

If you are a developer, you can plug an AKD1000-powered Mini PCIe board into your existing system to unlock the capabilities for a wide array of edge AI applications, including Smart City, Smart Health, Smart Home, and Smart Transportation. Even better, BrainChip will also offer the full PCIe design layout files and the bill of materials (BOM) to system integrators and developers to enable them to build their own boards and implement AKD1000 chips in volume as stand-alone embedded accelerators or as co-processors. Just to give you a tempting teaser as to what’s possible, there’s a bunch of videos on BrainChip’s YouTube channel, including the following examples:
These PCIe boards are immediately available for pre-order on the BrainChip website, with pricing starting at $499. As the guys and gals at BrainChip like to say: “We don’t make the sensors — we make them smart; we don’t add complexity — we eliminate it; we don’t waste time — we save it.” And, just to drive their point home, they close by saying, “We solve the tough Edge AI problems that others do not or cannot solve.” Well, far be it from me to argue with logic like that. For myself, I’m now pondering how I can use one of these boards to detect and identify stimuli and control my animatronic robot head. Oh, I forgot that I haven’t told you about that yet — my bad. Fear not, I shall rend the veils asunder in a future column. In the meantime, do you have any thoughts you’d care to share of a neuromorphic nature?
 
  • Like
  • Fire
  • Love
Reactions: 57 users