BRN Discussion Ongoing

zeeb0t

Administrator
Staff member
“Attack Problems, not People”.

I’ll be cleaning up this thread today to erase the long-term embarrassment that was a lot of the posts from yesterday up until now…

Please take this as a warning.

Continue debating and attacking problems by all means - but attacking each other’s characters is not acceptable practice, nor does it reflect well on, or strengthen, the point you are trying to make.

If anything it detracts from both your point and potentially your own character.

It’s a simple mantra to live by.

I challenge everyone to consider this, even when you’ve been personally attacked: does meeting them at their level aid any of the points you have made to date, or will make in the future, if it lets your opponent make you look defeated? Rise above this embarrassing display and let them damage their own character, lest you meet them at their level and damage your own equally.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 141 users

Vanman1100

Regular
 
  • Like
  • Love
  • Fire
Reactions: 10 users
“Attack Problems, not People”.

I’ll be cleaning up this thread today to erase the long-term embarrassment that was a lot of the posts from yesterday up until now…

Please take this as a warning.

Continue debating and attacking problems by all means - but attacking each other’s characters is not acceptable practice, nor does it reflect well on, or strengthen, the point you are trying to make.

If anything it detracts from both your point and potentially your own character.

It’s a simple mantra to live by.

I challenge everyone to consider this, even when you’ve been personally attacked: does meeting them at their level aid any of the points you have made to date, or will make in the future, if it lets your opponent make you look defeated? Rise above this embarrassing display and let them damage their own character, lest you meet them at their level and damage your own equally.
I believe you should leave the posts so people can judge whose character is flawed. But it’s your site and I understand.
 
Last edited:
  • Like
  • Fire
Reactions: 8 users

wilzy123

Founding Member
 
  • Like
  • Fire
Reactions: 11 users

Serengeti

Regular
Hey Rise,

In response to your post #47,878: if you click on the magnifying glass and type a member’s name into the ‘By Member’ field, even if they’ve blocked their profile from view, all of the queried member’s posts pop up for you to read. Quite interesting… sometimes, personally, I pick up a pattern.

Screen shot attached below to help.

Hope it helps 😸
 

Last edited:
  • Like
  • Fire
Reactions: 8 users
Can any of our German friends attend this with questions from us, especially from @Diogenese?


I won't be able to attend, but it has been shared in the German forum already and some investors seem to be very keen on attending the conference. Maybe you guys can convince them :D

 
  • Like
  • Love
  • Fire
Reactions: 15 users

Esq.111

Fascinatingly Intuitive.
“Attack Problems, not People”.

I’ll be cleaning up this thread today to erase the long-term embarrassment that was a lot of the posts from yesterday up until now…

Please take this as a warning.

Continue debating and attacking problems by all means - but attacking each other’s characters is not acceptable practice, nor does it reflect well on, or strengthen, the point you are trying to make.

If anything it detracts from both your point and potentially your own character.

It’s a simple mantra to live by.

I challenge everyone to consider this, even when you’ve been personally attacked: does meeting them at their level aid any of the points you have made to date, or will make in the future, if it lets your opponent make you look defeated? Rise above this embarrassing display and let them damage their own character, lest you meet them at their level and damage your own equally.
Good Morning Zeeb0t,

Thank you.

Unpleasant to see from start to end.

Regards,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 32 users

BaconLover

Founding Member
Hey Rise,

In response to your post #47,878: if you click on the magnifying glass and type BaconLover or anyone else’s name into the ‘By Member’ field, even if they’ve blocked their profile from view, all of the member’s posts pop up for you to read. Quite interesting… sometimes, personally, I pick up a pattern.

Screen shot attached below to help.

Hope it helps 😸

I have unblocked my profile page.
Glad for people to go there and see what downramping I have done. I haven’t deleted any posts over the last few days or weeks, maybe.

All day I was being accused of it; I’d love someone to show me what I’ve done wrong.
If people can't find it, move on.

(Not directed at you, Serengeti - just a public announcement.)
 
  • Like
  • Fire
  • Love
Reactions: 11 users

BaconLover

Founding Member
“Attack Problems, not People”.

I’ll be cleaning up this thread today to erase the long-term embarrassment that was a lot of the posts from yesterday up until now…

Please take this as a warning.

Continue debating and attacking problems by all means - but attacking each other’s characters is not acceptable practice, nor does it reflect well on, or strengthen, the point you are trying to make.

If anything it detracts from both your point and potentially your own character.

It’s a simple mantra to live by.

I challenge everyone to consider this, even when you’ve been personally attacked: does meeting them at their level aid any of the points you have made to date, or will make in the future, if it lets your opponent make you look defeated? Rise above this embarrassing display and let them damage their own character, lest you meet them at their level and damage your own equally.
👍
I'll not fire first.
 
  • Like
Reactions: 2 users
 
  • Like
  • Fire
  • Love
Reactions: 25 users

BaconLover

Founding Member
Anyone who calls someone else a prick is a prick! ... oops! I think I just got sucked into the vortex. :ROFLMAO:
Need a strong :coffee: now.
Deena,
See Z’s post above.
I won’t stop speaking my opinion, even if others call me names. I did bite back last time; from now on I’ll stay away from name-calling in rebuttals, but I won’t stop sharing my opinions and facts about the company.

I'm more than happy for posters to put me on ignore if you don't want to read my posts. Brainchip is an investment for me. It is a forum, you'll hear both sides, if not prepared for it, that's a shame.
Have a great day.
 
  • Like
  • Fire
  • Love
Reactions: 24 users

Boab

I wish I could paint like Vincent
  • Like
  • Fire
  • Love
Reactions: 16 users
  • Haha
  • Like
  • Fire
Reactions: 8 users

Boab

I wish I could paint like Vincent
  • Like
  • Fire
Reactions: 7 users

Tothemoon24

Top 20


Science
22 FEB 2023 2:14 AM AEDT

Neuromorphic Camera Boosts Nanoscopic Imaging with ML



Indian Institute of Science (IISc)
In a new study, researchers at the Indian Institute of Science (IISc) show how a brain-inspired image sensor can go beyond the diffraction limit of light to detect minuscule objects such as cellular components or nanoparticles invisible to current microscopes. Their novel technique, which combines optical microscopy with a neuromorphic camera and machine learning algorithms, presents a major step forward in pinpointing objects smaller than 50 nanometers in size. The results are published in Nature Nanotechnology.
Since the invention of optical microscopes, scientists have strived to surpass a barrier called the diffraction limit, which means that the microscope cannot distinguish between two objects if they are smaller than a certain size (typically 200-300 nanometers). Their efforts have largely focused on either modifying the molecules being imaged, or developing better illumination strategies – some of which led to the 2014 Nobel Prize in Chemistry. “But very few have actually tried to use the detector itself to try and surpass this detection limit,” says Deepak Nair, Associate Professor at the Centre for Neuroscience (CNS), IISc, and corresponding author of the study.
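
For readers wanting the number behind that 200-300 nm figure (this context is mine, not part of the IISc release), the classical Abbe diffraction limit relates the smallest resolvable separation to the illumination wavelength and the objective's numerical aperture:

```latex
% Abbe diffraction limit: smallest resolvable separation d,
% for illumination wavelength \lambda and numerical aperture NA.
d \approx \frac{\lambda}{2\,\mathrm{NA}}
\qquad \text{e.g. } \lambda = 550\ \text{nm},\ \mathrm{NA} = 1.4
\ \Rightarrow\ d \approx 196\ \text{nm}
```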

Measuring roughly 40 mm (height) by 60 mm (width) by 25 mm (diameter), and weighing about 100 grams, the neuromorphic camera used in the study mimics the way the human retina converts light into electrical impulses, and has several advantages over conventional cameras. In a typical camera, each pixel captures the intensity of light falling on it for the entire exposure time that the camera focuses on the object, and all these pixels are pooled together to reconstruct an image of the object. In neuromorphic cameras, each pixel operates independently and asynchronously, generating events or spikes only when there is a change in the intensity of light falling on that pixel. This generates sparse data, in far lower volumes than traditional cameras, which capture every pixel value at a fixed rate regardless of whether there is any change in the scene. This functioning of a neuromorphic camera is similar to how the human retina works, and allows the camera to “sample” the environment with much higher temporal resolution – because it is not limited by a frame rate like normal cameras – and also perform background suppression.
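
To make the contrast with a frame-based sensor concrete, here is a minimal, illustrative Python sketch (my own simplification, not tied to the camera or software used in the study) of how per-pixel ON/OFF events are generated only where the intensity changes:

```python
import numpy as np

def frames_to_events(frames, threshold=0.15):
    """Convert a stack of grayscale frames (T, H, W) into sparse events.

    Returns (t, y, x, polarity) tuples: polarity +1 (ON) when the log
    intensity rose past the threshold, -1 (OFF) when it fell.
    """
    log_ref = np.log1p(frames[0].astype(float))   # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        log_now = np.log1p(frame.astype(float))
        diff = log_now - log_ref
        on, off = diff > threshold, diff < -threshold
        for pol, mask in ((+1, on), (-1, off)):
            ys, xs = np.nonzero(mask)
            events.extend((t, int(y), int(x), pol) for y, x in zip(ys, xs))
        # only pixels that fired update their reference (background suppression)
        log_ref = np.where(on | off, log_now, log_ref)
    return events

# A static background produces no events; only the moving bright spot does.
stack = np.zeros((10, 32, 32))
for t in range(10):
    stack[t, 16, 3 + 2 * t] = 1.0          # a bead drifting across the field
print(len(frames_to_events(stack)), "events from a 10-frame sequence")
```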
“Such neuromorphic cameras have a very high dynamic range (>120 dB), which means that you can go from a very low-light environment to very high-light conditions. The combination of the asynchronous nature, high dynamic range, sparse data, and high temporal resolution of neuromorphic cameras make them well-suited for use in neuromorphic microscopy,” explains Chetan Singh Thakur, Assistant Professor at the Department of Electronic Systems Engineering (DESE), IISc, and co-author.
In the current study, the group used their neuromorphic camera to pinpoint individual fluorescent beads smaller than the limit of diffraction, by shining laser pulses at both high and low intensities, and measuring the variation in the fluorescence levels. As the intensity increases, the camera captures the signal as an “ON” event, while an “OFF” event is reported when the light intensity decreases. The data from these events were pooled together to reconstruct frames.
To accurately locate the fluorescent particles within the frames, the team used two methods. The first was a deep learning algorithm, trained on about one and a half million image simulations that closely represented the experimental data, to predict where the centroid of the object could be, explains Rohit Mangalwedhekar, former research intern at CNS and first author of the study. A wavelet segmentation algorithm was also used to determine the centroids of the particles separately for the ON and the OFF events. Combining the predictions from both allowed the team to zero in on the object’s precise location with greater accuracy than existing techniques.
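
As a rough illustration of that localisation step (my own toy version: a simple intensity-weighted centroid stands in for the wavelet segmentation, and the deep network's prediction is simply passed in as a pair of numbers), the ON-event, OFF-event and learned estimates can be fused like this:

```python
import numpy as np

def weighted_centroid(frame):
    """Sub-pixel centroid (y, x) of a single fluorescent spot in an event frame."""
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return np.array([(ys * frame).sum() / total, (xs * frame).sum() / total])

def fuse_estimates(on_frame, off_frame, cnn_estimate):
    """Average the ON-event, OFF-event and deep-network centroid estimates."""
    estimates = [weighted_centroid(on_frame),
                 weighted_centroid(off_frame),
                 np.asarray(cnn_estimate, dtype=float)]
    return np.mean(estimates, axis=0)

# Toy example: a spot near row 12, column 20.45, seen through ON/OFF frames.
on_frame = np.zeros((32, 32));  on_frame[12, 20] = 0.6;  on_frame[12, 21] = 0.4
off_frame = np.zeros((32, 32)); off_frame[12, 20] = 0.5; off_frame[12, 21] = 0.5
print(fuse_estimates(on_frame, off_frame, cnn_estimate=(12.1, 20.4)))
```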
“In biological processes like self-organisation, you have molecules that are alternating between random or directed movement, or that are immobilised,” explains Nair. “Therefore, you need to have the ability to locate the centre of this molecule with the highest precision possible so that we can understand the thumb rules that allow the self-organisation.” The team was able to closely track the movement of a fluorescent bead moving freely in an aqueous solution using this technique. This approach can, therefore, have widespread applications in precisely tracking and understanding stochastic processes in biology, chemistry and physics.

/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature, edited for clarity, style and length. The views and opinions expressed are those of the author(s).


 
  • Like
  • Fire
  • Love
Reactions: 10 users

Damo0127

Member
  • Like
Reactions: 4 users

alwaysgreen

Top 20
Let's hope we get some sort of announcement this week, with some revenue attached, before the half-yearly drops.
 
  • Like
  • Fire
  • Haha
Reactions: 9 users

chapman89

Founding Member

As they say, it's a matter of opinions, but it is always good to see different views.
Luke Winchester has forgotten that between Renesas and MegaChips alone we would have 1%, but there is no mention of those companies, or of the fact that Renesas is currently taping out chips containing Akida IP.

So yes, he is right in saying that whoever can create low-power solutions will find it very lucrative 😉
 
  • Like
  • Love
  • Fire
Reactions: 59 users
[Screenshot attached]

I think this is the first time this company has mentioned “and others” - I only recall them ever mentioning Loihi before.
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users

Dozzaman1977

Regular
THIS IS NEW. FROM THE BLOG ON THE BRN WEBSITE. INTERESTING STUFF!!!
FEB 21st 2023


Developing CNNs for Neuromorphic Hardware

(Been There! Done That!)

By Nikunj Kotecha

Often, we hear that neuromorphic technology is cool, classy, low-power, next-gen hardware for AI and the most suitable technology for edge devices. Neuromorphic technology mimics the brain, the most efficient computation engine known, to create a computing and natural learning paradigm for devices. Neuromorphic design is complemented by Spiking Neural Networks (SNNs), which emulate how neurons fire and hence only compute when absolutely necessary. This is unlike today’s “MAC monster” engines - ones that execute lots of MACs (Multiply-Accumulate operations, the basis of most AI computation) in parallel, many of whose results are often discarded.
So neuromorphic hardware is exciting! However, it is very difficult to develop and deploy current state-of-the-art solutions onto neuromorphic hardware, and it is extremely limiting to run existing convolutional neural network (CNN) based models on these platforms. The difficulty primarily stems from the assumption that neuromorphic hardware is analog and only runs advanced SNN algorithms, which are currently in short supply. Therefore, the production models of today, typically accelerated by traditional Deep Learning Accelerators (DLAs) as a safe path to commercialization, are not supported by neuromorphic hardware. But BrainChip Akida™ is changing the game.
Akida is a fully digital, synthesizable (and thus process-independent), silicon-proven neuromorphic technology. It is designed to be scalable and portable across foundries, and architected to be embedded into low-power edge devices. It fully supports acceleration of feed-forward CNNs and accelerates other neural networks such as Deep Neural Networks (DNNs), Recurrent Neural Networks (RNNs) and more, while providing the efficiency benefits of a neuromorphic design. It removes risk and simplifies development.

Figure 1. Akida Tech Foundations. Fundamentally different and extremely efficient.

At BrainChip, our mission is to Unlock the future of AI. We believe we can do this by enabling edge devices with Akida, next-generation technology that advances the growth and intelligence of these devices and ultimately provides end users with a sense of privacy and security, energy and cost savings, and access to new features. We realize that to achieve our mission, we must make it easy for our end users (who may or may not be experts in the field of AI) to use our technology and support current solutions. BrainChip provides development boards for its reference chip, the AKD1000, for anybody in the community to use to build prototype modules. For commercial use, BrainChip provides licenses of the technology so it can be integrated into a custom ASIC, board, or module that can be used in millions of edge devices.
BrainChip promotes neuromorphic technology with proven silicon, like AKD1000, but also focuses on enabling end users to use the benefits of this technology with little to no knowledge of neuromorphic science. There are three ways to leverage Akida technology (refer to Figure 2) and deploy complex models:


Figure 2. BrainChip development ecosystem and access to deployment of solutions to Akida technology


1. Through the BrainChip MetaTF™ framework: a free and robust ML framework with Python packages for model development and conversion of TensorFlow/Keras models to Akida. It is very popular among AI experts and custom developers. The Python packages for the MetaTF framework are public, and developers can access the framework here: https://doc.brainchipinc.com (a minimal conversion sketch follows this list).
2. Through Edge Impulse Studio: a platform that provides end-to-end development and deployment of machine learning models on targeted hardware, with little to no code and minimal AI expertise required. Core functions of the BrainChip MetaTF framework are embedded into Edge Impulse Studio to deploy models onto Akida-targeted silicon, such as the AKD1000 SoC. To learn more about developing with Edge Impulse, visit https://www.edgeimpulse.com
3. Through Solutions Partners of BrainChip, such as NVISO: BrainChip has partnered with solutions providers and enabled them to create complex models using MetaTF and build applications for specific, common AI use cases such as human monitoring. This allows faster time-to-market for solutions using Akida technology. To learn more about BrainChip Solutions Partners, contact us at sales@brainchip.com.
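
A minimal sketch of option 1 above, following the general pattern in the MetaTF documentation (https://doc.brainchipinc.com): quantize a trained Keras CNN and convert it to an Akida model. The layer sizes here are arbitrary, and the exact quantize/convert signatures have changed between MetaTF releases, so treat this as an outline rather than authoritative API usage.

```python
import tensorflow as tf
from cnn2snn import quantize, convert   # MetaTF packages: pip install akida cnn2snn

# Any feed-forward Keras CNN trained in the usual way (architecture is arbitrary here).
keras_model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

# Quantize weights and activations to low bit widths, then convert the
# quantized network into an event-based Akida model.
quantized = quantize(keras_model, weight_quantization=4, activ_quantization=4)
akida_model = convert(quantized)
akida_model.summary()
```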
These avenues provide an opportunity to create and develop a functioning model that is suitable for running on Akida technology. The models are converted using MetaTF and saved as a serialized byte file. These models can be evaluated offline by running simulations using the Software Runtime provided with MetaTF, or evaluated on the AKD1000 mini PCIe development board using the Hardware backend, as shown in Figure 3a. Once through the evaluation stage, these models can be deployed into production on any target device with Akida technology. The Akida Runtime library, a low-level, OS-agnostic library, is used to compile the saved model and run inference on any target device that has Akida technology. Customers who license Akida technology for their device are able to compile this Akida Runtime library with any of their application software and host OS, as shown in Figure 3b.
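
To illustrate the evaluation flow described above, here is a short sketch using the akida Python package bundled with MetaTF. The model file name is hypothetical and method names may differ slightly between MetaTF versions; by default inference runs in the software simulator unless the model has been mapped to attached Akida hardware such as the AKD1000 PCIe board.

```python
import numpy as np
from akida import Model, devices

model = Model("my_model.fbz")      # hypothetical serialized model saved from MetaTF

hw = devices()                     # enumerate attached Akida devices (empty list if none)
if hw:
    model.map(hw[0])               # run inference on the AKD1000 mini PCIe board

# Without a mapped device, the same call runs in the software runtime/simulator.
dummy_input = np.random.randint(0, 255, (1, 32, 32, 3), dtype=np.uint8)
outputs = model.forward(dummy_input)
print(outputs.shape)
```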


Figure 3a. Using the MetaTF Software backend for simulations and the Hardware backend for model deployment on the AKD1000 mini PCIe development board



Figure 3b. Using the low-level Akida Runtime Library for production deployment of models in target devices with Akida technology

BrainChip is very excited about the ecosystem that is available for our end users to develop and deploy complex AI models on Akida neuromorphic technology. Expert AI developers who are familiar with CNN architectures can use the BrainChip MetaTF framework to deploy familiar models on Akida. Developers with little to no coding experience can use Edge Impulse Studio to deploy models on Akida technology, and users who want faster time to market can work with Solutions Partners such as NVISO.
To learn more about how you can harness the power of AI, request a demo or visit BrainChip.com.
Nikunj Kotecha is a Machine Learning Solutions Architect at BrainChip. With many years of experience and a strong programming background, Kotecha brings a passion for AI-driven solutions to the BrainChip team, with a unique eye for data visualization and analysis. He develops neural networks for neuromorphic hardware and event-based processors and optimizes CNN-based networks for conversion to SNN. Nikunj has an MS in Computer Science from Rochester Institute of Technology.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 57 users