BRN Discussion Ongoing

jk6199

Regular
@Fact Finder, may need marriage advice, damn neuromorphic subliminal BRN messaging made me buy some more :-(
 
  • Haha
  • Fire
  • Love
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

Tracking How the Event Camera is Evolving

Article by: Sunny Bains


Event camera processing is advancing and enabling a new wave of neuromorphic technology.
Sony, Prophesee, iniVation, and CelePixel are already working to commercialize event (spike-based) cameras. Even more important, however, is the task of processing the data these cameras produce efficiently so that it can be used in real-world applications. While some are using relatively conventional digital technology for this, others are working on more neuromorphic, or brain-like, approaches.

Though more conventional techniques are easier to program and implement in the short term, the neuromorphic approach has more potential for extremely low-power operation.

By processing the incoming signal before having to convert from spikes to data, the load on digital processors can be minimized. In addition, spikes can be used as a common language with sensors in other modalities, such as sound, touch or inertia. This is because when things happen in the real world, the most obvious thing that unifies them is time: When a ball hits a wall, it makes a sound, causes an impact that can be felt, deforms and changes direction. All of these cluster temporally. Real-time, spike-based processing can therefore be extremely efficient for finding these correlations and extracting meaning from them.
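As a rough illustration of that idea (my own sketch, not from the article): finding temporal coincidences between two event streams can be as simple as matching timestamps within a small window. The streams, timestamps and window below are made-up values for illustration.

```python
# Minimal sketch: cross-modal coincidence detection on event (spike) streams.
# Each stream is assumed to be a sorted list of timestamps in seconds; the
# window and the example events are illustrative assumptions.

def coincidences(stream_a, stream_b, window=0.005):
    """Return (t_a, t_b) pairs whose timestamps fall within `window` seconds."""
    pairs = []
    j = 0
    for t_a in stream_a:
        # advance j until stream_b[j] could still be within the window of t_a
        while j < len(stream_b) and stream_b[j] < t_a - window:
            j += 1
        k = j
        while k < len(stream_b) and stream_b[k] <= t_a + window:
            pairs.append((t_a, stream_b[k]))
            k += 1
    return pairs

# Example: visual events from a ball hitting a wall vs. audio onset events.
vision_events = [0.100, 0.101, 0.450, 0.702]
audio_events  = [0.102, 0.455, 0.900]
print(coincidences(vision_events, audio_events))
# [(0.1, 0.102), (0.101, 0.102), (0.45, 0.455)]
```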

Last time, on Nov. 21, we looked at the advantage of the two-cameras-in-one approach (DAVIS cameras), which uses the same circuitry to capture both event images, including only changing pixels, and conventional intensity images. The problem is that these two types of images encode information in fundamentally different ways.

Common language

Researchers at Peking University in Shenzhen, China, recognized that to optimize that multi-modal interoperability all the signals should ideally be represented in the same way. Essentially, they wanted to create a DAVIS camera with two modes, but with both of them communicating using events. Their reasoning was both pragmatic—it makes sense from an engineering standpoint—and biologically motivated. The human vision system, they point out, includes both peripheral vision, which is sensitive to movement, and foveal vision for fine details. Both of these feed into the same human visual system.

The Chinese researchers recently described what they call retinomorphic sensing or super vision that provides event-based output. The output can provide both dynamic sensing like conventional event cameras and intensity sensing in the form of events. They can switch back and forth between the two modes in a way that allows them to capture the dynamics and the texture of an image in a single, compressed representation that humans and machines can easily process.

These representations include the high temporal resolution you would expect from an event camera, combined with the visual texture you would get from an ordinary image or photograph.

They have achieved this performance using a prototype that consists of two sensors: a conventional event camera (DVS) and a Vidar camera, a new event camera from the same group that can efficiently create conventional frames from spikes by aggregating over a time window. They then use a spiking neural network for more advanced processing, achieving object recognition and tracking.
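The article doesn't give implementation detail, but the basic aggregation idea can be sketched as follows, assuming (my assumption, not the group's published scheme) that each pixel fires spikes at a rate roughly proportional to brightness, so counting spikes per pixel over a time window approximates an intensity frame. Array sizes and the window are illustrative.

```python
import numpy as np

# Sketch of Vidar-style frame aggregation: count spikes per pixel over a time
# window and treat the normalized counts as an intensity image.

def aggregate_frame(events, height, width, t_start, t_end):
    """events: iterable of (t, x, y) spikes; returns a normalized intensity frame."""
    frame = np.zeros((height, width), dtype=np.float32)
    for t, x, y in events:
        if t_start <= t < t_end:
            frame[y, x] += 1.0
    if frame.max() > 0:
        frame /= frame.max()  # normalize spike counts to [0, 1]
    return frame

# Illustrative use: a 4x4 sensor and a 10 ms aggregation window.
events = [(0.001, 1, 1), (0.002, 1, 1), (0.004, 2, 3), (0.015, 0, 0)]
print(aggregate_frame(events, height=4, width=4, t_start=0.0, t_end=0.010))
```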

The other kind of CNN

At Johns Hopkins University, Andreas Andreou and his colleagues have taken event cameras in an entirely different direction. Instead of focusing on making their cameras compatible with external post-processing, they have built the processing directly into the vision chip. They use an analog, spike-based cellular neural network (CNN) structure where nearest-neighbor pixels talk to each other. Cellular neural networks share an acronym with convolutional neural networks, but are not closely related.

In cellular CNNs, the input/output links between each pixel and its eight nearest neighbors are built directly into hardware and can be specified to perform symmetrical processing tasks (see figure). These can then be sequentially combined to produce sophisticated image-processing algorithms.

Two things make them particularly powerful. One is that the processing is fast because it is performed in the analog domain. The other is that the computations across all pixels are local. So while there is a sequence of operations to perform an elaborate task, this is a sequence of fast, low-power, parallel operations.
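The chip does this in the analog domain; as a purely digital sketch of the programming model (my own illustration, not the Johns Hopkins design), each pixel's update is a fixed 3x3 template applied to its eight nearest neighbours, and several such passes can be chained. The templates below are generic examples, not the published ones.

```python
import numpy as np
from scipy.ndimage import convolve

# Digital sketch of the cellular-CNN programming model: every pixel interacts
# only with its 8 nearest neighbours through a 3x3 template, and templates are
# chained to build larger image-processing pipelines.

EDGE_TEMPLATE = np.array([[-1, -1, -1],
                          [-1,  8, -1],
                          [-1, -1, -1]], dtype=float)

SMOOTH_TEMPLATE = np.full((3, 3), 1.0 / 9.0)

def cnn_step(state, template):
    """One local, parallel update: each pixel mixes with its 8 neighbours."""
    return convolve(state, template, mode="nearest")

def run_pipeline(image, templates):
    """Chain several local steps to build a more elaborate operation."""
    state = image.astype(float)
    for template in templates:
        state = cnn_step(state, template)
    return state

image = np.random.rand(64, 64)
result = run_pipeline(image, [SMOOTH_TEMPLATE, EDGE_TEMPLATE])  # smooth, then edge-detect
```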

A nice feature of this work is that the chip has been implemented in three dimensions using Chartered 130nm CMOS and Tezzaron interconnection technology. Unlike many 3D systems, in this case the two tiers are not designed to work separately (e.g. processing on one layer, memory on the other, and relatively sparse interconnects between them). Instead, each pixel and its processing infrastructure are built on both tiers operating as a single unit.

Andreou and his team were part of a consortium, led by Northrop Grumman, that secured a $2 million contract last year from the Defense Advanced Research Projects Agency (DARPA). While exactly what they are doing is not public, one can speculate that the technology they are developing will have some similarities to the work they’ve published.

Shown is the 3D structure of a Cellular Neural Network cell (right) and layout (bottom left) of the Johns Hopkins University event camera with local processing.
In the dark

We know DARPA has a strong interest in this kind of neuromorphic technology. Last summer the agency announced that its Fast Event-based Neuromorphic Camera and Electronics (FENCE) program had granted three contracts to develop very-low-power, low-latency search and tracking in the infrared. One of the three teams is led by Northrop Grumman.

Whether or not the FENCE project and the contract announced by Johns Hopkins University are one and the same, it is clear that event imagers are becoming increasingly sophisticated.

 
  • Like
  • Love
  • Fire
Reactions: 21 users

wilzy123

Founding Member
I have asked nabtrade (just now) - will report back.

Maybe they too (like the ASX) sit around all day firing pencils into the ceiling instead of ensuring their systems work as they should.

@jk6199 - I received a response from nabtrade today: "Thanks for your email query. The top 20s lists would exclude stocks that have a stock price less than $1.00 as small changes in the stock price of these low priced stocks can still lead to large % changes thereby if included would skew the list to these kinds of stocks.". Dumb logic... but at least we know why now.
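For what it's worth, the maths behind their reasoning is easy to reproduce: the same one-cent tick is a much bigger percentage move on a 70-cent stock than on a $20 stock, which is what would flood a top-20-by-percentage list. A quick sketch (prices made up, not real quotes):

```python
# Illustration of nabtrade's stated reasoning: a fixed one-cent move is a much
# larger percentage change on a low-priced stock. Prices are illustrative.

def pct_move(price, tick=0.01):
    return 100.0 * tick / price

for name, price in [("70c stock", 0.70), ("$5 stock", 5.00), ("$20 stock", 20.00)]:
    print(f"{name}: a 1c tick is {pct_move(price):.2f}%")
# 70c stock: a 1c tick is 1.43%
# $5 stock: a 1c tick is 0.20%
# $20 stock: a 1c tick is 0.05%
```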
 
  • Like
  • Love
  • Fire
Reactions: 21 users

JK200SX

Regular
Perhaps TseX forum is already using AKIDA, as forum members are already liking my posts from tomorrow:)
I guess a big announcement is on the cards tomorrow:)
 

Attachments

  • FCE29590-3F60-4591-BF53-288CF0230D24.png
  • Haha
  • Like
  • Wow
Reactions: 18 users
Silly question incoming...
Do we have any confirmation that BRN will be uploading footage of our tech being demonstrated at CES?
 
  • Like
Reactions: 3 users
Perhaps TseX forum is already using AKIDA, as forum members are already liking my posts from tomorrow:)
I guess a big announcement is on the cards tomorrow:)
I've noticed that happening several times in my notifications.
 
  • Like
  • Haha
Reactions: 4 users

buena suerte :-)

BOB Bank of Brainchip
Maybe a little run 🆙 this arvo?! 🤞

611 buyers for 7,406,890 units

273 sellers for 2,747,793 units
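For anyone wanting to put a number on that imbalance, a quick sketch using the depth figures quoted above:

```python
# Buy/sell depth imbalance from the figures quoted above.
buy_units, sell_units = 7_406_890, 2_747_793
buy_orders, sell_orders = 611, 273

print(f"units on the bid vs offer:  {buy_units / sell_units:.2f} : 1")   # ~2.70 : 1
print(f"orders on the bid vs offer: {buy_orders / sell_orders:.2f} : 1")  # ~2.24 : 1
```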
 
  • Like
  • Haha
Reactions: 13 users

HopalongPetrovski

I'm Spartacus!
Run up this afternoon for us... c'est la vie shorters.
Hope so. After spending the weekend reviewing and updating my plans, I threw them all out the window in a moment of impetuosity this morning when I became convinced the run up had started, and got another parcel at 78... before it really took off... 🤣
Literally, and I kid you not, within nanoseconds of me executing the trade, she started on the way back down! 🤣
Seriously, you guys should get together and pay me to sell some 'cause I can guarantee you that will put a rocket under the share price.:ROFLMAO:
Somewhere in the family history we must've come from the Murphy's. 🤣
Oh well, back to the old sour dough.

 
  • Haha
  • Like
  • Love
Reactions: 24 users

equanimous

Norse clairvoyant shapeshifter goddess
The ASX is an alternative universe isolated from the real world by its own event horizon.
The ASX is still prehistoric. It only understands digging and moving rocks...
 
  • Like
  • Haha
Reactions: 9 users

Oops if this is a double up but the video is brilliant !! 👏

A1F2885B-41CB-4A1C-BFCD-98525B36517F.jpeg
 
  • Like
  • Love
Reactions: 27 users
Hope so. After spending the weekend reviewing and updating my plans, I threw them all out the window in a moment of impetuosity this morning when I became convinced the run up had started, and got another parcel at 78... before it really took off... 🤣
Literally, and I kid you not, within nanoseconds of me executing the trade, she started on the way back down! 🤣
Seriously, you guys should get together and pay me to sell some 'cause I can guarantee you that will put a rocket under the share price.:ROFLMAO:
Somewhere in the family history we must've come from the Murphy's. 🤣
Oh well, back to the old sour dough.


Let me know the next time you top up so I can wait a few seconds/minutes to grab some bargain prices 🤣
You're not alone mate, that has happened to me on more than one occasion.
Latest results on poll. People are damn excited about 2023 for BRN.
 
  • Haha
  • Like
Reactions: 17 users

Kachoo

Regular
Hope so. After spending the weekend reviewing and updating my plans, I threw them all out the window in a moment of impetuosity this morning when I became convinced the run up had started, and got another parcel at 78... before it really took off... 🤣
Literally, and I kid you not, within nanoseconds of me executing the trade, she started on the way back down! 🤣
Seriously, you guys should get together and pay me to sell some 'cause I can guarantee you that will put a rocket under the share price.:ROFLMAO:
Somewhere in the family history we must've come from the Murphy's. 🤣
Oh well, back to the old sour dough.


I have felt that pain before lol. Last week I asked the missus to give me money to buy 10k shares as I'm all tapped out for cash and can't find any. She did agree but said it will be her profit, not mine.

I'll likely have to also cover the tax while she will walk away with it at face value. 🙄😔
 
  • Haha
  • Like
  • Love
Reactions: 21 users

Mt09

Regular
Who have we got at CES 2023?

Socionext
Nviso
VVDN technologies
Prophesee

Renesas - will they have their MCU with Akida on show?
Mercedes - will they confirm Akida in production vehicles or any further use cases?
Valeo - is our IP in their next-gen lidar solution?

Exciting times ;)
 
  • Like
  • Love
  • Fire
Reactions: 53 users
Last edited:
  • Like
  • Fire
Reactions: 6 users

Oops if this is a double up but the video is brilliant !! 👏

View attachment 26023
Great advertising, I agree!
The CEO out there spruiking our tech too gives me such a warm glow.
What I also like to see is a long list, 18 in fact, of videos doing the same. Applications for BRN tech offered by NVISO are displayed bottom right in this link


So we hear that the partnership with NVISO is 9 months on and market applications are now available through that relationship.
I love ecosystems, they make accessibility issues disappear over time.

Our day in the sun is approaching:
  • products have been on sale since 'yesterday'
  • others continue to be developed today and
  • as-yet non-existent applications are being discovered for tomorrow.
Every day is a good day for me as a holder....

 
  • Like
  • Fire
  • Love
Reactions: 48 users

Worker122

Regular
Hope so. After spending the weekend reviewing and updating my plans, I threw them all out the window in a moment of impetuosity this morning when I became convinced the run up had started, and got another parcel at 78... before it really took off... 🤣
Literally, and I kid you not, within nanoseconds of me executing the trade, she started on the way back down! 🤣
Seriously, you guys should get together and pay me to sell some 'cause I can guarantee you that will put a rocket under the share price.:ROFLMAO:
Somewhere in the family history we must've come from the Murphy's. 🤣
Oh well, back to the old sour dough.


I believe that, amazingly, in the time your buy or sell gets to the exchange, it is possible for the manipulators to change the price in their favour. E*Trade used to have an advice popup advising me that my trade would be 3 pips higher or lower than when I pressed the button, can't win.
 
  • Like
  • Wow
  • Sad
Reactions: 12 users
Could be worth a watch or look into.

Haven't dug far yet; however, I noticed they will be at CES and, as the article says below, they are ARM IoT funded and they use an ARM CPU and NPU, which at first I thought, yeah ok... probs Ethos or something.

Then I visited their site and spotted something in the pic below. Didn't think the Ethos N Series was this???

Could be nothing....could be something :unsure:

IMG_20230103_114956.jpg



ACCESSWIRE

eYs3D Hits CES with Computer Vision and AI for Medical, AIoT, Industrial and Retail Robotics

NEWS PROVIDED BY
ACCESSWIRE
Dec 15, 2022, 4:31 PM ET
Intelligent Heterogeneous Integration Gives Machines Brains, Eyes and Senses
TAIPEI, TAIWAN / ACCESSWIRE / December 15, 2022 / eYs3D, a silicon-plus-AI computer vision solutions company, will showcase a powerful new vision platform for robotics at the upcoming CES 2023, January 5-8. Additionally it will demonstrate innovative applications for a range of market sectors, including smart medical, AIoT, industrial, and retail robotics. eYs3D will demonstrate its solutions at 2 booth locations, one at the LVCC, Booth #15769, Central Hall and the other in the Venetian at Eureka Park, Booth 65200, AT1, Hall G.
eYs3D offers a one-stop-shop computer vision source, ranging from a computer vision development framework and processor, stereo video and 3D depth camera, to prototyping 3D sensing camera design services and tailoring a subsystem for custom products.
The stereo vision depth sensing products use advanced ISPs (Imaging Signal Processors), stereo depth processors and various operating system/AI SDKs (software development kits) and wrappers that are easily adapted and integrated for different solutions:
  • Medical: The company's intelligent medical products offer precise depth maps, 3D point cloud images, and reproduction of objects as holograms for augmented reality (AR) and 3D reconstructed scenes.
  • AIoT: Its 3D cameras utilize AIoT technology to capture 3D facial models for anti-spoofing facial recognition and authentication as well as people-counting, and collection of demographic information for smart entry and digital kiosks.
  • Industrial: For industrial autonomous mobile robots (AMRs) eYs3D offers accurate depth perception and a 108-degree diagonal field of vision.
  • Retail: Its retail computer vision subsystems upgrade a customer's purchase experience, using eYs3D's latest 3D cameras for tracking merchandise movements and transactions. The computer vision solution also allows for grab-and-go checkout, inventory, and logistics management.
eYs3D CES Product Showcase
The company will highlight its XINK PaaS (Platform as a Service), a cost-effective development tool for applications in the robotics and AIoT markets. XINK is powered by eYs3D's eCV1 AI chip, incorporating an ARM-based CPU and NPU processing unit.
It will also feature a next-generation stereo video and depth processor, the eSP879. This IC enhances object edge, optimizes depth de-noising and outputs HD quality 3D depth data, saving computing power and bandwidth.
Additionally the company will focus on application-specific stereo depth cameras, including the G100-2 wide FoV, the G100g2 GMSL2 interface, as well as an eSP879-enabled 3D camera YX8077. These cameras deliver a variety of solutions and frame rates for 3D point cloud data creation.
The company also will have on hand jointly-developed ASV (Active Stereo Vision) depth cameras Ref-B3 and Ref-B6, the result of a partnership with STMicroelectronics. The cameras incorporate a pair of STMicro's high-BSI performance global shutter image sensors for mid-to-long-range 3D sensing distances. The cameras are ideal for applications such as service robots, AMRs (Autonomous Mobile Robots), and AGVs (Autonomous Guided Vehicles).
"The robotics industry - from autonomous mobile applications to smart retail - demands sophisticated levels of computer vision and processing," said James Wang, Chief Strategy Officer of eYs3D. "The surging interest in our technology motivated us to come back to CES with our partners and show a wide range of new capabilities and demos."
eYs3D will showcase its solutions at 2 booth locations:
LVCC, Booth #15769, Central Hall and
Venetian, Eureka Park, Booth 65200, AT1, Hall G
About eYs3D Microelectronics
eYs3D Microelectronics is an ARM IoT fund-invested fabless IC design house with capabilities in semiconductor and system design.
The company focuses on computer vision processors and specializes in 3D stereo vision solutions. As one of the earliest ventures in 3D technology, eYs3D has design wins with multiple tier-one brands in VR, robotics and IoT devices. eYs3D's state-of-the-art stereo vision depth IC and module offer customers more integrated value in bringing 3D sensing into real applications, realizing computer vision with human perception incorporated with A.I. For further information visit www.eys3d.com.
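For anyone wondering how the stereo depth cameras described above get a depth map: the core relationship is plain triangulation, depth Z = f·B/d for focal length f, baseline B and pixel disparity d. A minimal sketch, with all numbers being illustrative assumptions rather than eYs3D specifications:

```python
# Minimal sketch of stereo triangulation, the principle behind stereo depth
# cameras. Focal length, baseline and disparity values below are illustrative
# assumptions, not eYs3D specifications.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d. Returns depth in metres; disparity must be > 0."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 800-pixel focal length, 6 cm baseline, 32-pixel disparity -> 1.5 m.
print(depth_from_disparity(32, focal_px=800, baseline_m=0.06))  # 1.5
```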
 
  • Like
  • Fire
  • Thinking
Reactions: 28 users

Xray1

Regular
Could be worth a watch or look into.

Haven't dug far yet; however, I noticed they will be at CES and, as the article says below, they are ARM IoT funded and they use an ARM CPU and NPU, which at first I thought, yeah ok... probs Ethos or something.

Then I visited their site and spotted something in the pic below. Didn't think the Ethos N Series was this???

Could be nothing....could be something :unsure:

eYs3D Hits CES with Computer Vision and AI for Medical, AIoT, Industrial and Retail Robotics

I think that BRN is most likely getting to the stage where they will need to hire another patent lawyer just to check whether there are any infringements of our own patents by other companies or developers.
 
  • Like
Reactions: 8 users

HopalongPetrovski

I'm Spartacus!
I believe that, amazingly, in the time your buy or sell gets to the exchange, it is possible for the manipulators to change the price in their favour. E*Trade used to have an advice popup advising me that my trade would be 3 pips higher or lower than when I pressed the button, can't win.
Yeah, the bastardos have it pretty much rigged to greatly favour "the house".
Fortunately for me, the few extra cents each I have paid for my top up today will likely pale into insignificance by the time I come to sell them.
Won't need or want to part with these particular new additions to my flock for much under $40 per.
I figure we'll be there in about 5 years time, or maybe less, depending on just how the cookie crumbles. :ROFLMAO:
GLTAH
 
  • Like
  • Fire
  • Love
Reactions: 26 users

hamilton66

Regular

A topic for our keen diggers of info.
The Ocean of Things, a DARPA program, is worth kicking over a few sea shells to see if anything related to Akida is on the net.
Rise, lots of rabbits, digging plenty of holes. I have another one. Given the significant increase in data breaches over the past year, not only in Oz but across the entire world, I'm mystified that BRN news on this front has pretty much been non-existent. Two years ago it seemed such a promising source of income. Now it seems to have completely gone off the radar. Or is it in stealth mode? Any thoughts from the 1000 eyes would be appreciated. Looks like we've navigated today's market carnage quite well. Will have to wait and see what damage the bots can conjure up in the closing auction.
GLTA
 
  • Like
  • Fire
  • Love
Reactions: 9 users