BRN Discussion Ongoing

Evermont

Stealth Mode
Yes, that's what the naming process looks like.

I actually think my version sounds betterer..

This is the Big One we've been waiting for!

AKIDA2000 (if that's what the next iteration of AKIDA 1.0 is called) is not as big a deal as AKIDA 2.0, which this is.

It makes me wonder though, is this effectively AKIDA 2.0 1000? 🤔..

I understand Akida 2.0 is Akida 2000.

Looks like the naming conventions were revised.

 
  • Like
  • Thinking
  • Love
Reactions: 17 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
It's a bit like not knowing which size of the Russian Doll family you're supposed to belong to.

 
  • Haha
  • Like
  • Fire
Reactions: 10 users

wasMADX

Regular
Ok simple. Do I have milk in my cornflakes or 30 year old scotch in my cornflakes?
Cornflakes??? Why not shaved almonds, coconut, and cashews interspersed with gold leaf?
 
  • Haha
  • Like
  • Love
Reactions: 8 users

Pandaxxx

Regular
A formal platform launch timed for US market open.
Why would they time this for US market open, when BRN is listed on the ASX?

Not sure I follow the significance?
 
  • Like
Reactions: 3 users
Cornflakes??? Why not shaved almonds, coconut, and cashews interspersed with gold leaf?
That actually sounds pretty good.
 
  • Haha
  • Like
Reactions: 4 users
So to clarify - Akida 2000 still to come and this is Akida 2 which is entirely different. Remember the project manager role for 2.0 launch....
No.
THE AKIDA TECHNOLOGY FAMILY IS MADE UP OF:

A. AKD 1000 rebranded AKIDA 1.0 from time to time

B. AKD 1500 yet to be rebranded AKIDA 1.5

C. AKIDA next generation was to be AKD 2000, then rebranded AKIDA 2.0, then the company advised they were considering what it would be officially named.

In short, there are three chip designs.

On the original roadmap, put out when Peter van der Made was Acting CEO, there were AKD 500, 1000 & 1500, after which came AKD 2000, 2500, 3000, 3500, 4000, 4500 & 5000, and in the last podcast the 10,000 was slated for 2030.

The AKD 500 slated for 2023 has not yet appeared, but I suspect the Renesas chip built around two nodes may fill that position, even though it will likely carry a Renesas name. The AKD 500 was talked about for use in white goods and other appliances.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 68 users
Why would they time this for US market open, when BRN is listed on the ASX?

Not sure I follow the significance?
I think it has more to do with the fact that it is a public holiday in Western Australia, so Tony Dawe is in the office but probably no one else, and it is Sunday in the USA, so no one is there until their Monday morning.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I thought it went something like this:

AKIDA 1.0:
  • Akida 1000
  • Akida 1500
AKIDA 2.0:
  • Akida 2000
  • Akida 2500 (maybe, who knows in the near future?)
Then it goes all the way up to...

AKIDA 10.0 (in 2030)
  • AKIDA 10,000 (the "holy grail" of general artificial intelligence, by which time we'll all be so vastly wealthy IMO that we might not give two figs what it's called).
 
  • Like
  • Haha
  • Love
Reactions: 29 users

Xray1

Regular
AND DON'T FORGET EVERYONE .....

" A formal platform launch press release will take place on 7 March at 1:00am AEDT, 6 March at 6:00am US PST. "


I think this BRN Co press release in the USA tomorrow morning will most likely gain substantive international exposure for our Co and its technology, and may even entice a large number of overseas financial institutions to invest in the Co ..... I'm hoping that Cathie Wood will read the release when she has her first cup of coffee in the morning.

Maybe .... LDA can sell them some spare shares if need be.
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Steve10

Regular
ChatGPT reckons:

BrainChip has partnered with several companies to integrate its Akida chip into their products. For example, they have partnered with the German company, Infineon, to integrate the Akida chip into Infineon's REAL3 ToF (Time of Flight) 3D image sensor for use in smartphones and other devices.


The Infineon tech was used for Dreame's W10 vacuum robot. Maybe they have more new products.

Infineon’s REAL3™ ToF imager enables advanced obstacle avoidance and smart navigation in DREAME’s new vacuum cleaning robot W10 Pro​



 
  • Like
  • Fire
  • Haha
Reactions: 17 users

ndefries

Regular
  • Like
  • Fire
  • Haha
Reactions: 16 users

VictorG

Member
I'd really like to see a gap at close today and another gap up at opening tomorrow.
 
  • Like
  • Fire
Reactions: 11 users
Will today have any impact on our ASX200 standing ??
The rebalancing has already been announced, and BRN is still included, so no worries there.

BTW, some great news for all the BRN faithful. Congratulations on this news. It's great to see the ASX allowed it and you can get some recognition through the official channels..

GLTAH, and I'll be on my way again.
 
  • Like
Reactions: 11 users

HopalongPetrovski

I'm Spartacus!
  • Like
Reactions: 8 users

Boab

I wish I could paint like Vincent
There are more sellers appearing but the price is holding.
 
  • Like
  • Fire
Reactions: 12 users

rgupta

Regular
It took some time for a non-technical person to understand what Akida can do. Now, with the second generation, they have come out with ballistic possibilities; I imagine these will melt the minds of even a few tech-savvy people.
On the whole, it is not just like going from iPhone 13 to iPhone 14; it is a sea change in the capabilities of the newer Akida.
DYOR
2nd Generation - when no-one else can offer a first generation competitor...
 
  • Like
  • Love
  • Fire
Reactions: 11 users

Steve10

Regular

ToF 3D image sensors are changing the way we engage with photography and mixed reality​


Fatima Khalid
23 Aug, 2022



Article #3 of the Enabling IoT Series. This article looks at how Time of Flight (ToF) sensors are used for this purpose and are changing the way we can interact with VR, and other video technologies.​

Augmented Reality (AR)- IoT- Semiconductors- Sensors- Virtual Reality (VR)


This is the third article in a 6-part series exploring the technologies enabling the cutting-edge IoT.

The emergence of video technology brought a revolution in audio-visual communication across the globe. Today, the world heavily relies on photo and video-based mediums for a range of purposes, including business, communication, education, and entertainment. Mechanisms of capturing and presenting video have also seen rapid advancement. The introduction of 3D cameras has allowed users to view things with a greater amount of information than ever before.

An integral part of this process is the addition of depth information in digital videos which takes them one step closer to reality. This article looks at how Time of Flight (ToF) sensors are used for this purpose and are changing the way we can interact with VR, and other video technologies.

Introducing ToF 3D Image Sensors

Time of Flight, or ToF, is basically a method to measure the distance to an object based on the flight time of light. ToF sensors or cameras emit a modulated infra-red beam and measure the phase difference between the light received from different surfaces at various distances. This data is translated into distance and gives depth to the illuminated scene. ToF sensors use pixel-by-pixel processing, which requires sophisticated algorithms, but their use is becoming increasingly popular in logistics, automotive, and consumer electronics due to their ubiquitous applicability. By scanning in three dimensions, ToF sensors can create 3D images of target objects.

ToF Sensor Working Principle

The operation of a 3D ToF image sensor is quite straightforward. Detection is based on the principle of Photonic Mixer Devices (PMD). There are two general categories of ToF sensors: direct and indirect sensors. Direct ToF sensors work by emitting light, collecting the reflected signal, and then measuring the delay in the signal [Fig 1]. The time of flight for the light is then converted into a distance measurement, which enables the 3D reconstruction of the scene.

[Figure 1: Direct ToF sensing principle]
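
For readers who want to see the arithmetic behind direct ToF, the following is a minimal sketch (not from the article; the delay value is purely illustrative) that converts a measured round-trip delay into a target distance using d = c·Δt/2.

```python
# Minimal sketch of the direct ToF distance calculation (illustrative only).
# The emitted pulse travels to the target and back, so the one-way distance
# is half the round-trip flight time multiplied by the speed of light.

C = 299_792_458.0  # speed of light in m/s

def direct_tof_distance(round_trip_time_s: float) -> float:
    """Convert a measured round-trip delay (seconds) into target distance (metres)."""
    return C * round_trip_time_s / 2.0

# Example: a round-trip delay of ~6.67 ns corresponds to a target roughly 1 m away.
print(f"{direct_tof_distance(6.67e-9):.3f} m")  # -> 1.000 m
```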

Indirect ToF sensors continuously emit modulated signals and analyse the reflected signals to determine phase shift. Two receiving wells are continuously opened and closed at the modulated light frequency with one receiver operating out of phase with its counterpart. Therefore, each receiver gets a portion of the reflected signal, and based on the amount of light received, the phase shift is determined, as shown in Fig 2.

[Figure 2: Indirect ToF phase-shift measurement]
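
To make the phase-based method above concrete, here is a small sketch (the 20 MHz modulation frequency is an assumed example, not a figure from the article) that turns a measured phase shift into distance via d = c·φ/(4π·f_mod) and shows the unambiguous range c/(2·f_mod) beyond which the phase wraps.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def indirect_tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance implied by the measured phase shift of the modulated signal.

    A full 2*pi of phase corresponds to one modulation wavelength of
    round-trip travel, so distance = c * phase / (4 * pi * f_mod).
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum distance before the phase wraps past 2*pi and aliases."""
    return C / (2.0 * mod_freq_hz)

# Example with an assumed 20 MHz modulation frequency:
f_mod = 20e6
print(f"{unambiguous_range(f_mod):.2f} m")               # ~7.49 m unambiguous range
print(f"{indirect_tof_distance(math.pi, f_mod):.2f} m")  # pi rad of phase -> ~3.75 m
```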



Benefits and Applications

ToF provides an optimized solution for 3D-enhanced photography by increasing the range without sacrificing image resolution while, at the same time, delivering every digital detail in a low-cost, reliable, and compact package. The latest ToF sensors consume little power, making them an attractive choice for mobile devices. Additionally, ToF sensors are less dependent on precise mechanical alignment than technologies such as structured light emission. Newer ToF sensors allow the use of higher modulation frequencies, which significantly increases accuracy.

According to Stratview Research, ToF is expected to see an annual growth of 16.8% from 2021 to 2026 (1). The applications of ToF cameras are mainly focused on 3D imaging and scanning in both consumer electronics and the logistics industry. It has the potential to revolutionize 3D detection and imaging by providing rich 3D visuals at a low cost.

ToF sensors can be deployed to aid areas such as facial recognition, security tracking, inventory management, and environmental and architectural mapping. In modern vehicles, ToF sensors enable mapping of the close environment, assist in parking, and many other functions in the cabin such as driver monitoring or gesture control. The most intriguing application, however, lies in consumer electronics and mixed reality. Due to its small form factor and power efficiency, ToF may be the only 3D imaging technology prepared to venture into consumer electronics.

Case Study: Enhancing the AR/MR experience

Incorporation of 3D ToF sensing in consumer electronics is an untapped market with game changing implications. According to Counterpoint Research, incorporation of ToF cameras in smartphones is expected to be a major trend of the future (2). In a recently held webinar, experts from TECNO, Samsung, and DXOMARK Image Labs claimed that features like ToF and Dynamic Vision Sensors (DVS) could enable mobile phone cameras to outperform DSLR cameras (3). Smartphones and gaming devices have utilized 3D ToF cameras in the past for highly realistic augmented/mixed-reality (AR/MR) applications. AR/MR applications blend the real world with virtual elements to give users an enhanced experience of business interactions, 3D design and gaming or entertainment.

Although AR/MR has not yet become a mainstream application, with the rapid progression in the tools behind these technologies it is only a matter of time before it becomes synonymous with everyday smartphone use.

The two main elements required to initiate an MR experience are a smartphone camera and an application that can detect the user’s surroundings and show them on the screen. In this way, the real world can be mapped and virtual images can then be placed into it. To achieve this, the camera of the smartphone must be able to provide an in-depth and detailed image of the scanned area.

ToF sensors become the main player in this scenario by accurately mapping objects and their surroundings. MR is a promising technological aid in 3D design, for instance, an interior designer can map out a room using their phone and place digitally created furnishings. Similarly, any real-world object can be 3D mapped using a ToF sensor camera and digitally manipulated. A number of such commercial mobile applications already exist but fail to give a realistic mapped result due to the device’s camera (4).

ToF would significantly enhance depth information and, in turn, the mapped images, opening endless possibilities for innovative and quick 3D design. Another application of mixed reality lies in business interactions, whereby professionals can show realistic site/project images for greater understanding.

MR is also being used to create real-world business backdrops where people can be inserted into a person’s field of view, and a virtual meeting can become a real-life meeting. ToF lies at the heart of this operation as only the depth information provided by 3D imaging can take this experience to a level close to reality. The recently introduced metaverse concept by social networking giants will make use of this precise technology increasing the demand and innovation for ToF sensors.

Gaming is another area that is already utilizing the benefits of ToF in AR/MR domains. Gaming devices and now smartphones are using the gesture recognition capability of ToF sensors for total immersive experiences. When introduced into the front camera of mobile devices, ToF can greatly enhance security measures such as face recognition. With the use of 3D in the rear camera, ToF promises outstanding computational photography and an immersive VR experience. Without extensive post-processing, ToF sensors can effectively render background blur in pictures and videos.

Mixed reality is considerably more advanced than virtual and augmented reality technologies. It sets comparatively stringent requirements on processing power and requires highly accurate acquisition of 3D depth data in real time. Moreover, lighting conditions present a major challenge in achieving the desired realism in MR. A robust ToF camera must be able to capture accurate depth data in all lighting conditions, ranging from indoor environments like houses and vehicles to outdoor, weather-dependent lighting.

The use of MR in business interactions and day-to-day life may seem a thing of the far future to many. However, during the COVID-19 pandemic the world experienced an exponential increase in demand for services that blur the boundaries between the virtual and real world, leading to an acceleration of non-face-to-face services. It can, therefore, be reasonably expected that AR/MR will emerge as essential technologies for work and entertainment in the near future. Needless to say, technologies like ToF, which form the backbone of superior MR performance, will become all the more important.

ToF sensors from Infineon

Infineon introduced the REAL3™ image sensor family in collaboration with pmd technologies, a German company specializing in and leading the field of ToF technology. The sensors use infrared light for Time of Flight sensing and deploy micro lenses for each pixel, resulting in virtually no loss and higher accuracy. Infineon's sixth generation of REAL3™ 3D ToF sensors is especially focused on consumer applications, such as smartphone applications, specifically AR/MR-based photo and videography. The sensor consumes 40% less power than previous generations, which is key to preserving battery life in applications such as gaming that require the ToF camera to be on for long intervals. Keeping in view the strict area constraints of modern smart devices, the ToF sensor is designed to occupy a mere 4.4 × 4.8 mm² footprint, which is a 35% reduction compared to the older generations.

Another distinguishing feature of the REAL3™ imager system is the enhanced sensing range of up to 10 m. A ToF sensor's depth accuracy increases with higher modulation frequency; however, the sensor's range suffers. The REAL3™ imager therefore utilizes two modulation frequencies and careful post-processing of the collected data to achieve a substantial range without sacrificing depth quality. The sensors are also equipped with pmd technologies' SBI technology (Suppression of Background Illumination), which reduces pixel saturation, allowing the camera to operate in various lighting conditions. The sensors can also be dynamically reconfigured via the I²C interface to adapt to new operating environments. With all these features consolidated into a single imaging system, seemingly impossible applications now appear to be a definite reality. Designers must keep a lookout for Infineon's latest innovations for highly accurate 3D imagers.
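
The accuracy-versus-range trade-off mentioned above can be illustrated with a short sketch: a higher modulation frequency shortens the unambiguous range, but measuring with two frequencies lets the device resolve the phase ambiguity over the much longer range set by their difference frequency. The two frequencies below are assumed for illustration only and are not Infineon's published values.

```python
C = 299_792_458.0  # speed of light in m/s

def unambiguous_range(f_hz: float) -> float:
    """Range before the measured phase wraps, for one modulation frequency."""
    return C / (2.0 * f_hz)

# Assumed example frequencies (illustrative only, not REAL3 specifications)
f_high, f_low = 80e6, 60e6

# Each frequency alone gives only a short unambiguous range...
print(f"{unambiguous_range(f_high):.2f} m")  # ~1.87 m at 80 MHz
print(f"{unambiguous_range(f_low):.2f} m")   # ~2.50 m at 60 MHz

# ...but the pair behaves like its difference (beat) frequency, so combining
# the two phase measurements extends the unambiguous range considerably.
print(f"{unambiguous_range(abs(f_high - f_low)):.2f} m")  # ~7.49 m at 20 MHz
```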

Details of ToF Sensors from Infineon for consumer and industrial markets can be accessed here and for automotive from here.

About the sponsor: Infineon Technologies

Infineon Technologies AG is a world leader in semiconductor solutions that make life easier, safer and greener. Microelectronics from Infineon are the key to a better future. With around 50,280 employees worldwide, Infineon generated revenue of about €11.1 billion in the 2021 fiscal year (ending 30 September) and is one of the ten largest semiconductor companies worldwide. To learn more click here.


References

1. Stratview Research. Time-of-Flight (ToF) Sensor Market | Market Size, Share & Forecast Analysis | 2021-26 [Internet]. 2022 Jul [cited 2022 Jul 31]. Available from: https://www.stratviewresearch.com/1731/time-of-flight-(ToF)-sensor-market.html

2. Counterpoint Research. Brighter, steadier, smarter: How smartphone cameras will improve in 2022 [Internet]. BBC Future. [cited 2022 Aug 20]. Available from: https://www.bbc.com/storyworks/futu...r-how-smartphone-cameras-will-improve-in-2022

3. Counterpoint Research. Global Mobile Camera Trends 2022: Innovation Talk [Internet]. Counterpoint Research. 2021 [cited 2022 Aug 20]. Available from: https://www.counterpointresearch.com/global-mobile-camera-trends-2022-innovation-talk/

4. Velichko Y. How AR Fills Your Room with Virtual Decor [Internet]. PostIndustria. 2021 [cited 2022 Jul 31]. Available from: https://postindustria.com/how-ar-fi...irtual-furniture-placement-apps-ar-furniture/
 
  • Like
  • Fire
  • Love
Reactions: 11 users
I thought it went something like this:

AKIDA 1.0:
  • Akida 1000
  • Akida 1500
AKIDA 2.0:
  • Akida 2000
  • Akida 2500 (maybe, who knows in the near future?)
Then it goes all the way up to...

AKIDA 10.0 (in 2030)
  • AKIDA 10,000 (the "holy grail" of general artificial intelligence, by which time we'll all be so vastly wealthy IMO that we might not give two figs what it's called).
If we're having this much trouble with the naming of the product lineup, imagine what trouble Kochie would be having..



He's still trying to work out how to pronounce nuromorphic! 🤣

(edit. Of course I know how to spell neuromorphic! Sheezz 🙄.. Do you think I'm heavily invested in something I can't even spell, let alone understand? That was put in for effect 😉)
 
Last edited:
  • Haha
  • Like
Reactions: 27 users
D

Deleted member 2799

Guest
I wonder what the status is of the collaboration with the IFS Accelerator - IP Alliance. This must be a real blow for the other participants, but also an enrichment... Has anyone heard anything lately about the current status? It's dead quiet there too.
 
  • Like
Reactions: 1 user