BRN Discussion Ongoing

Don’t give up on the bread making, @Rise from the ashes. I have been making sourdough bread for three years now, and it took a few attempts to get it right. It took a while, but now I understand the process and know when my dough is ready to bake. Just like everything in life, you learn the most when things don’t go quite right. 🍞🍞🍞🍞
Ohhh. I may hit you up in due course for tips if that's ok.
This was my latest attempt early this morning. https://thestockexchange.com.au/threads/cooking-baking-with-air-fryers.136192/
 
  • Like
Reactions: 1 users
@Deadpool sourdough can be tolerated by most people who are gluten intolerant, as the long ferment changes the structure of the gluten and makes it more digestible. That’s the reason I started making it. There is something so satisfying about being able to make your own bread, and the smell and taste is soooo gooood 🍞🍞🍞🍞
 
  • Like
  • Love
  • Fire
Reactions: 14 users


buena suerte :-)

BOB Bank of Brainchip
Absolutely.... I have had my sourdough starter for around three years now.... makes awesome bread 🫓🍞 (y)
 
  • Like
  • Love
Reactions: 5 users

Diogenese

Top 20
It certainly pays not to be a listed ASX company when it comes to getting your message out.

Then again, you could be smart like Brainchip and have the best of both worlds by partnering with companies not hamstrung by the ASX, like Nviso, Prophesee, ARM, Renesas, Intel, ISL, Socionext, Mercedes Benz etc.

My opinion only DYOR
FF

AKIDA BALLISTA
The ASX is an alternative universe isolated from the real world by its own event horizon.
 
  • Like
  • Haha
  • Love
Reactions: 21 users

Tothemoon24

Top 20
We are all here with the same intention: to make a little dough
 
  • Haha
  • Like
Reactions: 21 users

wilzy123

Founding Member
penguin-publishing.gif
 
  • Haha
  • Like
Reactions: 12 users

TheFunkMachine

seeds have the potential to become trees.

Arrived in my email

NVISO BrainChip CES 2023

NVISO · 12 hours ago
NVISO's unique holistic platform incorporates 3D headpose that enables many applications such as emotion analysis, facial behavior analysis, and affective computing. Industry use cases included digital avatars, human machine interfaces, and driver monitoring for drowsiness and fatigue.
See live demo at CES - January 5th to 8th 2023!
Socionext in the Vehicle Tech and Advanced Mobility Zone
Las Vegas Convention Center
North Hall in Booth 10654

I’m sure this post from Tim at Nviso has been posted already; I just haven’t had a chance to scroll through, but it is worth a reshare! The amount of quality free advertising we get from Nviso is amazing. Can’t wait to see how this plays out!
 

Attachments

  • 7046B6E2-DFB2-4C1B-AACC-E8BDF2DCFE63.png
    7046B6E2-DFB2-4C1B-AACC-E8BDF2DCFE63.png
    651 KB · Views: 98
  • F5CFA8AE-7907-48CB-8C4F-1DDB2A80B6FC.png
    F5CFA8AE-7907-48CB-8C4F-1DDB2A80B6FC.png
    776.8 KB · Views: 99
  • Like
  • Love
  • Fire
Reactions: 25 users

Newk R

Regular
OK, I'm gluten intolerant (not coeliac). I gave up being gluten free over Christmas/New Year and of course I have paid the price. I do buy sourdough bread and love it.
Here's a little tip, not proven so DYOR. I only use French T45, T55 and T65 flour at home and make pizza bases, pasta etc. with it. Apparently the French have to use old-grain flour to make bread and call it a baguette; they can't use the crap engineered grains grown in the US and Aus, as it is illegal. So the old grain, as I understand it, has a shorter gluten protein which is more easily digestible. Also, as I understand it, there is an enzyme in wheat that helps digestion of gluten. We process that out; the French leave it in.
As I said, not proven so DYOR. But it works for me: I have no reaction to it.
 
  • Like
  • Love
  • Fire
Reactions: 15 users

Diogenese

Top 20
View attachment 26003
Just posted on Nviso’s Twitter page, can’t get the link to work, sorry

nViso has expanded the range of applications in its Software Development Kit (SDK) that run on Akida, and this increases the potential market for Akida with nViso.

https://www.nviso.ai/en/news/nviso-...l-neuromorphic-processor-platform-at-ces-2023

NVISO’s latest Neuro SDK to be demonstrated running on the BrainChip Akida fully digital neuromorphic processor platform at CES 2023

...
Lausanne, Switzerland – 2nd January, 2023 – nViso SA (NVISO), the leading Human Behavioural Analytics AI company, is pleased that its Neuro SDK will be demonstrated running on the Brainchip Akida platform at the Socionext stand at CES2023. Following the porting of additional AI Apps from its catalogue, NVISO has further enhanced the range of Human Behavioural AI Apps that it supports on the BrainChip Akida event-based, fully digital neuromorphic processing platform. These additions include Action Units, Body Pose and Gesture Recognition on top of the Headpose, Facial Landmark, Gaze and Emotion AI Apps previously announced with the launch of the Evaluation Kit (EVK) version. This increased capability supports the further deployment of NVISO Human Behavioural Analytics AI software solutions with these being able to further exploit the performance capabilities of BrainChip neuromorphic AI processing IP to be deployed within the next generation of SOC devices. Target applications include Robotics, Automotive, Telecommunication, Infotainment, and Gaming.

“BrainChip’s event-based Akida platform is accelerating today’s traditional networks and simultaneously enabling future trends in AI software applications,” said Rob Telson, VP Ecosystems, BrainChip. “NVISO is a valued partner of BrainChip’s growing ecosystem, and their leadership in driving extremely efficient software solutions gives a taste of what compelling applications are possible at the edge on a minimal energy budget.”

...
Benchmark performance data for the AI Apps running on the BrainChip neuromorphic AI processor

[Chart: NVISO Neuro SDK latency under 1 ms]



A major factor in Akida's performance is the use of 2-bit or 4-bit model libraries (and the secret sauce).

The performance is equal to or better than a GPU, but the cost differential is massive.
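For anyone wondering what low-bit models mean in practice, here is a minimal sketch of symmetric 4-bit weight quantization in Python. It is purely illustrative, generic quantization rather than BrainChip's actual scheme (let alone the secret sauce):

```python
import numpy as np

def quantize_weights(w, bits=4):
    """Symmetric uniform quantization of a weight array to signed `bits`-bit integers.

    Generic illustration only, not any vendor's actual scheme.
    """
    levels = 2 ** (bits - 1) - 1            # e.g. 7 levels either side for 4-bit
    scale = np.max(np.abs(w)) / levels      # map the largest weight magnitude to the top level
    q = np.clip(np.round(w / scale), -levels, levels).astype(np.int8)
    return q, scale                         # small integers plus a single float scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)
q4, s = quantize_weights(w, bits=4)
print("mean reconstruction error:", np.mean(np.abs(w - dequantize(q4, s))))
```

The appeal is the budget: 4-bit integers plus one scale factor need roughly an eighth of the memory of 32-bit floats, and far simpler arithmetic.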
 
  • Like
  • Love
  • Fire
Reactions: 57 users

buena suerte :-)

BOB Bank of Brainchip
Thanks for the flour tips Newk ... (y)
 
  • Like
Reactions: 4 users

Harwig

Regular
Run up this afternoon for us... c'est la vie shorters.
 
  • Like
  • Haha
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This could be something and it could be nothing, but I'm sharing it because of the links between Mercedes and "Avatar: The Way of Water" and the VISION AVTR concept car, and also because it says "built-in neural network".


[Two screenshots attached]


 
  • Like
  • Fire
  • Thinking
Reactions: 20 users

Boab

I wish I could paint like Vincent
After watching this morning’s promo video for CES, surely one of the “expanded range” of capabilities is the ability to measure breathing, seemingly without being hooked up to a heart rate monitor?
There was so much happening in the video it was hard to keep up.
Cheers from Big Kev, I mean Boab. 😁😁

 
  • Like
  • Haha
  • Fire
Reactions: 12 users

jk6199

Regular
@Fact Finder, may need marriage advice, damn neuromorphic subliminal BRN messaging made me buy some more :-(
 
  • Haha
  • Fire
  • Love
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

Tracking How the Event Camera is Evolving

Article By: Sunny Bains

Event camera processing is advancing and enabling a new wave of neuromorphic technology.
Sony, Prophesee, iniVation, and CelePixel are already working to commercialize event (spike-based) cameras. Even more important, however, is the task of processing the data these cameras produce efficiently so that it can be used in real-world applications. While some are using relatively conventional digital technology for this, others are working on more neuromorphic, or brain-like, approaches.

Though more conventional techniques are easier to program and implement in the short term, the neuromorphic approach has more potential for extremely low-power operation.

By processing the incoming signal before having to convert from spikes to data, the load on digital processors can be minimized. In addition, spikes can be used as a common language with sensors in other modalities, such as sound, touch or inertia. This is because when things happen in the real world, the most obvious thing that unifies them is time: When a ball hits a wall, it makes a sound, causes an impact that can be felt, deforms and changes direction. All of these cluster temporally. Real-time, spike-based processing can therefore be extremely efficient for finding these correlations and extracting meaning from them.
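As a toy illustration of that temporal clustering point (my sketch, not from the article), coincidences between two spike streams can be found simply by pairing events whose timestamps fall within a small window of each other:

```python
# Toy illustration: pair up events from two sensor streams when their
# timestamps fall within a small coincidence window of each other.
def coincident_events(stream_a, stream_b, window_ms=5.0):
    """Return (t_a, t_b) pairs whose timestamps differ by at most window_ms.

    Both streams are sorted lists of event timestamps in milliseconds.
    """
    pairs, j = [], 0
    for t_a in stream_a:
        # skip events in stream_b that are too old to ever match t_a
        while j < len(stream_b) and stream_b[j] < t_a - window_ms:
            j += 1
        k = j
        while k < len(stream_b) and stream_b[k] <= t_a + window_ms:
            pairs.append((t_a, stream_b[k]))
            k += 1
    return pairs

vision_spikes = [10.0, 42.5, 43.1, 90.0]   # e.g. pixel-change events (ms)
audio_spikes = [41.9, 44.0, 200.0]         # e.g. sound-onset events (ms)
print(coincident_events(vision_spikes, audio_spikes))
# -> [(42.5, 41.9), (42.5, 44.0), (43.1, 41.9), (43.1, 44.0)]
```

The article's point is that spike-based processing makes this kind of temporal matching fall out of the computation itself, rather than needing an explicit loop over timestamps.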

Last time, on Nov. 21, we looked at the advantage of the two-cameras-in-one approach (DAVIS cameras), which uses the same circuitry to capture both event images (recording only changing pixels) and conventional intensity images. The problem is that these two types of images encode information in fundamentally different ways.

Common language

Researchers at Peking University in Shenzhen, China, recognized that to optimize that multi-modal interoperability all the signals should ideally be represented in the same way. Essentially, they wanted to create a DAVIS camera with two modes, but with both of them communicating using events. Their reasoning was both pragmatic—it makes sense from an engineering standpoint—and biologically motivated. The human vision system, they point out, includes both peripheral vision, which is sensitive to movement, and foveal vision for fine details. Both of these feed into the same human visual system.

The Chinese researchers recently described what they call retinomorphic sensing or super vision that provides event-based output. The output can provide both dynamic sensing like conventional event cameras and intensity sensing in the form of events. They can switch back and forth between the two modes in a way that allows them to capture the dynamics and the texture of an image in a single, compressed representation that humans and machines can easily process.

These representations include the high temporal resolution you would expect from an event camera, combined with the visual texture you would get from an ordinary image or photograph.

They have achieved this performance using a prototype that consists of two sensors: a conventional event camera (DVS) and a Vidar camera, a new event camera from the same group that can efficiently create conventional frames from spikes by aggregating over a time window. They then use a spiking neural network for more advanced processing, achieving object recognition and tracking.
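A rough sketch of the aggregation idea (illustrative only, not the actual Vidar design): counting each pixel's spikes over a time window already yields something that looks like a conventional intensity frame.

```python
import numpy as np

def frame_from_spikes(spike_events, shape, t_start, t_end):
    """Build an intensity frame by counting per-pixel spikes in [t_start, t_end).

    spike_events: iterable of (t, row, col) tuples. Real sensors also
    normalise by threshold and window length and deal with noise.
    """
    frame = np.zeros(shape, dtype=np.float32)
    for t, r, c in spike_events:
        if t_start <= t < t_end:
            frame[r, c] += 1.0
    if frame.max() > 0:
        frame /= frame.max()          # scale to [0, 1] for display
    return frame

# A pixel that spikes often in the window comes out brighter.
events = [(0.2, 1, 1), (0.4, 1, 1), (0.7, 1, 1), (0.5, 0, 2)]
print(frame_from_spikes(events, shape=(3, 3), t_start=0.0, t_end=1.0))
```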

The other kind of CNN

At Johns Hopkins University, Andreas Andreou and his colleagues have taken event cameras in an entirely different direction. Instead of focusing on making their cameras compatible with external post-processing, they have built the processing directly into the vision chip. They use an analog, spike-based cellular neural network (CNN) structure where nearest-neighbor pixels talk to each other. Cellular neural networks share an acronym with convolutional neural networks, but are not closely related.

In cellular CNNs, the input/output links between each pixel and its eight nearest neighbors are built directly in hardware and can be specified to perform symmetrical processing tasks (see figure). These can then be sequentially combined to produce sophisticated image-processing algorithms.

Two things make them particularly powerful. One is that the processing is fast because it is performed in the analog domain. The other is that the computations across all pixels are local. So while there is a sequence of operations to perform an elaborate task, this is a sequence of fast, low-power, parallel operations.
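A toy digital stand-in for one such local operation (my sketch; the real chip evaluates every pixel in parallel in analog hardware), using a 3x3 edge-extraction template over each pixel's eight nearest neighbors:

```python
import numpy as np

# One cellular-CNN-style step: every pixel combines itself with its eight
# nearest neighbors through a fixed 3x3 template. Here the template is a
# Laplacian-like edge extractor; other templates give other operations.
EDGE_TEMPLATE = np.array([[-1, -1, -1],
                          [-1,  8, -1],
                          [-1, -1, -1]], dtype=np.float32)

def cellular_step(image, template=EDGE_TEMPLATE):
    padded = np.pad(image, 1, mode="edge")
    out = np.zeros_like(image, dtype=np.float32)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            out[r, c] = np.sum(padded[r:r + 3, c:c + 3] * template)
    return out

img = np.zeros((6, 6), dtype=np.float32)
img[2:4, 2:4] = 1.0                       # a small bright square
print(np.round(cellular_step(img), 1))    # non-zero responses cluster around the square
```

Chaining a few such steps with different templates is how the more elaborate pipelines the article mentions get built up.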

A nice feature of this work is that the chip has been implemented in three dimensions using Chartered 130nm CMOS and Tezzaron interconnection technology. Unlike many 3D systems, in this case the two tiers are not designed to work separately (e.g. processing on one layer, memory on the other, and relatively sparse interconnects between them). Instead, each pixel and its processing infrastructure are built on both tiers operating as a single unit.

Andreou and his team were part of a consortium, led by Northrop Grumman, that secured a $2 million contract last year from the Defense Advanced Research Projects Agency (DARPA). While exactly what they are doing is not public, one can speculate that the technology they are developing will have some similarities to the work they’ve published.

Shown is the 3D structure of a Cellular Neural Network cell (right) and layout (bottom left) of the Johns Hopkins University event camera with local processing.
In the dark

We know DARPA has strong interest in this kind of neuromorphic technology. Last summer the agency announced that its Fast Event-based Neuromorphic Camera and Electronics (FENCE) program granted three contracts to develop very-low-power, low-latency search and tracking in the infrared. One of the three teams is led by Northrop Grumman.

Whether or not the FENCE project and the contract announced by Johns Hopkins University are one and the same, it is clear that event imagers are becoming increasingly sophisticated.

 
  • Like
  • Love
  • Fire
Reactions: 21 users

wilzy123

Founding Member
I have asked nabtrade (just now) - will report back.

Maybe they too (like the ASX) sit around all day firing pencils into the ceiling instead of ensuring their systems work as they should.

@jk6199 - I received a response from nabtrade today: "Thanks for your email query. The top 20s lists would exclude stocks that have a stock price less than $1.00 as small changes in the stock price of these low priced stocks can still lead to large % changes thereby if included would skew the list to these kinds of stocks.". Dumb logic... but at least we know why now.
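For illustration of their reasoning: a one-cent move on a 70-cent stock is roughly 1.4%, while the same one-cent move on a $40 stock is about 0.025%, so an unfiltered "top movers by percentage" list would tend to be dominated by sub-$1 stocks.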
 
  • Like
  • Love
  • Fire
Reactions: 21 users

JK200SX

Regular
Perhaps the TseX forum is already using AKIDA, as forum members are already liking my posts from tomorrow :)
I guess a big announcement is on the cards tomorrow :)
 

Attachments

  • FCE29590-3F60-4591-BF53-288CF0230D24.png
    FCE29590-3F60-4591-BF53-288CF0230D24.png
    766.7 KB · Views: 158
  • Haha
  • Like
  • Wow
Reactions: 18 users
Silly question incoming...
Do we have any confirmation that BRN will be uploading footage of our tech being demonstrated at CES?
 
  • Like
Reactions: 3 users
I've noticed that happening several times in my notifications.
 
  • Like
  • Haha
Reactions: 4 users