BRN Discussion Ongoing

jtardif999

Regular
Now, here's a thing:

Air has a refractive index different from that of a vacuum: nominally 1.000273 for air, compared to exactly 1 for a vacuum.

The speed of light in a medium is inversely proportional to its refractive index.

But the refractive index is dependent on wavelength. That is to say, different wavelengths meet different "resistance" in the same medium. This means that different wavelengths travel at different speeds in the same medium, e.g., air. Think of a prism.

So, given normal dispersion (refractive index rising with frequency), increasing the frequency (reducing the wavelength) of radar transmissions means the shorter-wavelength radar signals actually travel slightly slower in air than the longer-wavelength radar waves of yore, not faster. Either way, you'd need to be quick with a stopwatch to measure the difference over a 200 m roundtrip.

https://en.wikipedia.org/wiki/Refractive_index#Dispersion

The refractive index of materials varies with the wavelength (and frequency) of light.[28] This is called dispersion and causes prisms and rainbows to divide white light into its constituent spectral colors.[29] As the refractive index varies with wavelength, so will the refraction angle as light goes from one material to another. Dispersion also causes the focal length of lenses to be wavelength dependent. This is a type of chromatic aberration, which often needs to be corrected for in imaging systems. In regions of the spectrum where the material does not absorb light, the refractive index tends to decrease with increasing wavelength, and thus increase with frequency. This is called "normal dispersion", in contrast to "anomalous dispersion", where the refractive index increases with wavelength.[28] For visible light normal dispersion means that the refractive index is higher for blue light than for red.
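A quick back-of-envelope sketch of why the stopwatch quip holds: taking the nominal index of air and an assumed (purely hypothetical) index difference of 1e-7 between two radar bands, the travel-time difference over a 200 m roundtrip comes out vanishingly small.

```python
# Back-of-envelope: how much slower is a higher-index radar band over a
# 200 m roundtrip in air? The index difference between bands here is an
# illustrative assumption, not a measured value for any real radar band.

C = 299_792_458.0        # speed of light in vacuum, m/s
n_low = 1.000273         # nominal refractive index of air
delta_n = 1e-7           # assumed dispersion between two radar bands (hypothetical)
n_high = n_low + delta_n

distance = 200.0         # roundtrip path, metres

t_low = distance * n_low / C    # travel time in the lower-index band
t_high = distance * n_high / C  # travel time in the higher-index band

print(f"travel time difference: {t_high - t_low:.3e} s")
```

For these assumptions the difference is on the order of 1e-13 seconds, which is why no stopwatch (and very little test equipment) would ever notice it.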
I think you are taking FF down the proverbial rabbit hole 🙂
 
  • Like
  • Love
Reactions: 4 users
I received another email this morning after I asked where he can see Brainchip being involved. His reply-

‘Nothing will come out of this neuromorphic silicon neurons approaches. Brainchip is behind Intel and they can’t make it!’

He also stated in a previous email-

‘I believe there is no future of neural board replicating neurons, the use is very limited, HOTS as an example could work on a spiking neural network but this is not silicon friendly. The idea is to adapt what you want to compute to the available substrate.’

🤔
Either Mercedes Benz is lying or this person is a Richard Cranium.

When you consider all of those who do not support his view apart from WANCA I know which answer I prefer.

His response completely denies the reality that Socionext went to a trade show last year on its own account, promoting a real-world commercial chip called AKD1000 that has been available for sale as a chip or as IP since August 2021.

Meanwhile, Intel is making videos broadcasting to the world that Loihi is still in research, that they are not sure how it will be used, and that it may never come to market as a chip but might one day find a place in the cloud.

I think he needs to be ignored, particularly after the statements by Nviso, SiFive and the MegaChips podcast today, wherein they state their own engineers have dug deep into Brainchip's AI solution over the last 18 months and are finding near-term traction with customers who see the low cost, low power and on-chip learning as distinct advantages.

My opinion only supported by publicly available facts.
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I thought it was great that Doug highlighted Akida's versatility when he said something along these lines: "There is a need for more than one solution. There isn't a single thing that fits all applications, but one of the things that we feel strongly about the BrainChip solution is that it fits into a wide variety of applications."

When asked about what excites him most about BrainChip he mentions the value in terms of:
  • it being a small, low cost, low power solution where "virtually every customer has a concern and the BrainChip solution has immediate attraction"
  • the on-chip learning capability
  • the "real traction in the marketplace" and the validation from customers
It's pretty clear from what Doug said that we can expect to see Akida in the NEAR-TERM in the following areas:
  • cameras
  • gaming
  • appliances
  • Industrial applications such as anomaly detection and preventative maintenance
For the LONGER-TERM (which I guess is 3-5 years) he seemed very excited about:
  • "very large" opportunities in automotive.
 
  • Like
  • Fire
  • Love
Reactions: 52 users

Bloodsy

Regular
On the MegaChips podcast, Rob Telson asks: “What key areas do you believe will incorporate AI into their product offerings in the short term?”

Answer by Doug Fairbairn: “So MegaChips already has a significant position in consumer-related products, so there are applications in CAMERAS, GAMING AND APPLIANCES, which I think are going to yield some near-term applications and production opportunities for BOTH OF US.”


Now does anybody need any more reassurance, not that we needed any, that Brainchip WILL be a very successful company, not in my opinion, but in the opinion of industry experts and customers.
More music to my ears

"We are a very high volume producer, we ship 150 to 200 MILLION units PER YEAR"

"We feel strongly about the Brainchip solution is the fact that it actually fits into a wide variety of applications"

Now what amount of millions in units counts as wide?

50/200 million?
100/200 million?
200/200 million?

Any of those numbers x $1 per chip or $2 per chip or $5 per chip, and the days of worrying about 200k in a quarterly will be a laughing matter.
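The speculation above can be tabulated in a few lines; all of the volumes and per-chip royalties here are the hypothetical figures from the post, not company guidance.

```python
# Scenario table: annual unit volumes crossed with per-chip royalty guesses.
# Every figure is a hypothetical from the forum post, not any real pricing.

volumes = [50_000_000, 100_000_000, 200_000_000]  # assumed units per year
royalties = [1.0, 2.0, 5.0]                       # assumed $ per chip

for units in volumes:
    for r in royalties:
        revenue = units * r
        print(f"{units:>12,} units x ${r:.0f}/chip = ${revenue:>15,.0f}/year")
```

Even the most conservative cell of that table (50 million units at $1) dwarfs a $200k quarterly receipt, which is the post's point.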

MegaChips is just ONE of Brainchip's customers :love:
 
  • Like
  • Fire
  • Love
Reactions: 65 users

jtardif999

Regular
"The idea is to adapt what you want to compute to the available substrate"
... in that case why did nvidia invent the GPU?
Why adapt what you want to yesterday's paradigms?
Because yesterday is today and not tomorrow…
 
  • Love
  • Like
Reactions: 2 users

Taproot

Regular
Well Ouster have heard of neural networks, and there is plenty of scope for them to incorporate Akida:

WO2021046547A1 PROCESSING OF LIDAR IMAGES



a kernel-based coprocessor; and a scanning light ranging device comprising: a transmission circuit comprising a plurality of light sources that emit light pulses; a detection circuit comprising: an array of photosensors that detect reflected light pulses and output signals measured over time; and a signal processor connected to the array of photosensors and configured to determine depth values from measurements using the array of photosensors; and an image reconstruction circuit communicably coupled with the detection circuit and configured to: assign a sensor ID to each of first depth values for a first scan of the scanning light ranging device; construct a first lidar image using the first depth values by: mapping, using the sensor IDs, the first depth values to first lidar pixels in the first lidar image, the first lidar image being a rectilinear image, wherein the mapping uses a mapping table that specifies a lidar pixel based on a corresponding sensor ID; and store the first lidar pixels of the first lidar image in a local image buffer of the scanning light ranging device; and send the first lidar pixels of a local frame of the first lidar image or of a complete frame of the first lidar image to the kernel-based coprocessor.


[0177] Classifier 1514 can provide classification information (e.g., classified lidar images with certain pixels identified as corresponding to a same object) to a signal processor 1516, which may be optional or have functions that are implemented in classifier 1514 instead. Various models can be used for classifier 1514, including convolution neural networks, which can include convolutional kernels that can be implemented by a filter kernel (e.g., 1414 of FIG. 14). A classified color image can assign each lidar pixel to an object (e.g., using an ID) so that all pixels corresponding to a same object can be determined via the contents of the classified lidar image. Accordingly, the classification information can indicate which lidar pixels of a lidar image correspond to the same object.

...

For example, classifications using various models can be applied to a same lidar image, e.g., a decision tree and a neural network, or different types of such models, or use of different parameters for a same model, such as number of nodes or hidden layers.

...

[0216] ...
Machine learning techniques, e.g., a neural network may be used.
For those who are not aware of the Ouster / Brainchip connection:

Ouster are well aware of Akida through Emmanuel T. ("Manny") Hernandez, who has been on the board of directors at Ouster since its inception in 2015.

"Manny" is still on the board of Ouster today .

"Manny" also joined the board of directors at Brainchip in 2017, and of course went on to become Brainchip's Chairman.

He stepped aside for Antonio Viana earlier this year.
 
  • Like
  • Fire
  • Love
Reactions: 28 users

Taproot

Regular
This bloke does some dot joining between Apple + Ouster which is really interesting.

 
  • Like
  • Fire
  • Love
Reactions: 14 users
D

Deleted member 118

Guest
Either Mercedes Benz is lying or this person is a Richard Cranium.

When you consider all of those who do not support his view apart from WANCA I know which answer I prefer.

His response completely denies the reality that Socionext went to a trade show last year on its own account, promoting a real-world commercial chip called AKD1000 that has been available for sale as a chip or as IP since August 2021.

Meanwhile, Intel is making videos broadcasting to the world that Loihi is still in research, that they are not sure how it will be used, and that it may never come to market as a chip but might one day find a place in the cloud.

I think he needs to be ignored, particularly after the statements by Nviso, SiFive and the MegaChips podcast today, wherein they state their own engineers have dug deep into Brainchip's AI solution over the last 18 months and are finding near-term traction with customers who see the low cost, low power and on-chip learning as distinct advantages.

My opinion only supported by publicly available facts.
FF

AKIDA BALLISTA
 
  • Haha
  • Like
Reactions: 9 users

chapman89

Founding Member
I thought it was great that Doug highlighted Akida's versatility when he said something along these lines: "There is a need for more than one solution. There isn't a single thing that fits all applications, but one of the things that we feel strongly about the BrainChip solution is that it fits into a wide variety of applications."

When asked about what excites him most about BrainChip he mentions the value in terms of:
  • it being a small, low cost, low power solution where "virtually every customer has a concern and the BrainChip solution has immediate attraction"
  • the on-chip learning capability
  • the "real traction in the marketplace" and the validation from customers
It's pretty clear from what Doug said that we can expect to see Akida in the NEAR-TERM in the following areas:
  • cameras
  • gaming
  • appliances
  • Industrial applications such as anomaly detection and preventative maintenance
For the LONGER-TERM (which I guess is 3-5 years) he seemed very excited about:
  • "very large" opportunities in automotive.
It is my opinion, and I stress my opinion only, that we will see revenue from IP sales from MegaChips before Renesas.
 
  • Like
  • Fire
  • Love
Reactions: 29 users
It is my opinion, and I stress my opinion only, that we will see revenue from IP sales from MegaChips before Renesas.
Hi @chapman89

Is it not great that we have reached a point with Brainchip where we can argue about which multinational partner is going to produce income for Brainchip first?

The shortest design cycle according to the former CEO Mr. Dinardo was for appliances, i.e. white goods, and that was six months.

This six months starts from when they have gone through the sales and agreement cycles.

So while I love your enthusiasm, and I get it as a great endorsement/validation of Brainchip's future with MegaChips, I think Renesas, with an MCU suite of products incorporating AKIDA due for release in the next couple of months, is going to be the winner.

My opinion only DYOR
FF

AKIDA BALLISTA

PS: There of course is room for someone to argue based on Doug’s statement that one of the things that attracted MegaChips to Brainchip was the already existing traction they have in the market with customers (???) that first income will come from left field.
 
  • Like
  • Fire
  • Love
Reactions: 49 users

Fox151

Regular
Is there an opportunity for Lidar to work on consensus? And by that I mean, literally communicate with one another and form an opinion on what a given environment looks like.

I.e. a mesh network where all the "receivers" within range talk with one another to adopt a view of a given environment? So let's say Car A is travelling North and surveying the road ahead. Car B is travelling South, "handshakes" Car A, and the two compare notes on their shared interpretation of the terrain and risks until they pass...

I dived down a rabbit hole on Near Earth Automation and bent my brain on the potentialities of 4 cars sitting at an intersection all sharing real-time Lidar data of both physical elements and moving targets like pedestrians.

If all information is sorted at the Edge by individual "sensors", then uplinked to a global view, weight could be granted to the "best perspective" of a given obstacle, then a consensus adopted and fed back to all participants. The same provides an opportunity to cancel noise within a "radar congested" environment.

Thoughts?
Don't need the internet and a global view. You could potentially use local data links like navy ships do to create a common operating picture.
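The weighted-consensus idea in the quoted post can be sketched in a few lines: each vehicle reports its estimate of an obstacle's position plus a confidence score, and the shared picture is the confidence-weighted average. Everything here (the class, the scores, the positions) is a hypothetical illustration, not any real V2X protocol.

```python
# Toy sketch: fuse several vehicles' estimates of the same obstacle by
# weighting each estimate by a confidence score. All values hypothetical.

from dataclasses import dataclass

@dataclass
class Observation:
    x: float           # obstacle position estimate, metres (east)
    y: float           # obstacle position estimate, metres (north)
    confidence: float  # 0..1, e.g. derived from range and occlusion

def fuse(observations: list[Observation]) -> tuple[float, float]:
    """Confidence-weighted average of the position estimates."""
    total = sum(o.confidence for o in observations)
    x = sum(o.x * o.confidence for o in observations) / total
    y = sum(o.y * o.confidence for o in observations) / total
    return x, y

# Four cars at an intersection see the same pedestrian from different angles:
reports = [
    Observation(10.2, 5.1, 0.9),   # clear line of sight
    Observation(10.8, 5.4, 0.4),   # partially occluded
    Observation(10.1, 5.0, 0.8),
    Observation(11.5, 6.0, 0.1),   # long range, low confidence
]
print(fuse(reports))  # consensus pulled toward the high-confidence reports
```

A real system would use something closer to a Kalman or covariance-intersection fusion, but the weighting intuition, trusting the "best perspective" most, is the same.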
 
  • Like
  • Love
  • Fire
Reactions: 6 users

Labsy

Regular
I still can't believe this awesome spike we had on the back of the Mercedes news... Imagine what the future holds... gives me goose bumps. At any moment this could happen again, perhaps all the way to 5 dollars this time.
Of course I am but a humble observer with no financial knowledge.
AKIDA BALLISTA!!
 

  • Like
  • Fire
  • Love
Reactions: 38 users
I still can't believe this awesome spike we had on the back of the Mercedes news... Imagine what the future holds... gives me goose bumps. At any moment this could happen again, perhaps all the way to 5 dollars this time.
Of course I am but a humble observer with no financial knowledge.
AKIDA BALLISTA!!
The good old moments. When we hit $5.00 we will look back and say remember when you could have bought BRN for $2.34.

Feel like a Little River song coming on. 😂
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 42 users

chapman89

Founding Member
The good old moments. When we hit $5.00 we will look back and say remember when you could have bought BRN for $2.34.

Feel like a Little River song coming on. 😂
FF

AKIDA BALLISTA
No FF, we will look back at Christmas 2024 and say remember when it was $10 😎
 
  • Like
  • Fire
  • Haha
Reactions: 44 users
I love when it's free to dream
 
  • Like
Reactions: 1 users
No FF, we will look back at Christmas 2024 and say remember when it was $10 😎
Gosh, you're in an argumentative mood today. Renesas, no, MegaChips. $5, no, $10. Sorry I upset you. Walking-on-eggshells time.😂🤣😎

FF

AKIDA BALLISTA
 
  • Haha
  • Like
  • Thinking
Reactions: 28 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
The good old moments. When we hit $5.00 we will look back and say remember when you could have bought BRN for $2.34.

Feel like a Little River song coming on. 😂
FF

AKIDA BALLISTA


 
  • Like
Reactions: 5 users

Diogenese

Top 20
Is there an opportunity for Lidar to work on consensus? And by that I mean, literally communicate with one another and form an opinion on what a given environment looks like.

I.e. a mesh network where all the "receivers" within range talk with one another to adopt a view of a given environment? So let's say Car A is travelling North and surveying the road ahead. Car B is travelling South, "handshakes" Car A, and the two compare notes on their shared interpretation of the terrain and risks until they pass...

I dived down a rabbit hole on Near Earth Automation and bent my brain on the potentialities of 4 cars sitting at an intersection all sharing real-time Lidar data of both physical elements and moving targets like pedestrians.

If all information is sorted at the Edge by individual "sensors", then uplinked to a global view, weight could be granted to the "best perspective" of a given obstacle, then a consensus adopted and fed back to all participants. The same provides an opportunity to cancel noise within a "radar congested" environment.

Thoughts?
There are proposals (and patents) for AVs to communicate with one another.
 
  • Like
  • Fire
  • Love
Reactions: 7 users