BRN Discussion Ongoing

ndefries

Regular
I'll just leave this here 👇👇


View attachment 31272
shorters have about 80M shares to buy to get us down to about 2% shorted, which is more common. It will be great to have the shorters on the buy side each day.
 
  • Like
  • Fire
  • Haha
Reactions: 16 users

VictorG

Member
I'd really like to see a gap at close today and another gap up at opening tomorrow.
 
  • Like
  • Fire
Reactions: 11 users
Will today have any impact on our ASX200 standing?
Rebalancing already announced, of which BRN is still included, so no worries there.

BTW some great news to all BRN faithful. Congratulations on this news. It's great to see it was ASX allowed and you can get some recognition through the official channels.

GLTAH, and I'll be on my way again.
 
  • Like
Reactions: 11 users

HopalongPetrovski

I'm Spartacus!
  • Like
Reactions: 8 users

Boab

I wish I could paint like Vincent
There are more sellers appearing but the price is holding.
 
  • Like
  • Fire
Reactions: 12 users

rgupta

Regular
It took some time for a non-technical person to grasp what Akida can do. Now, with the second generation, they have come out with ballistic possibilities; I imagine the same will melt even a few tech-savvy people.
But on the whole it is not just like going from iPhone 13 to iPhone 14; it is a sea change in the capabilities of the newer Akida.
DYOR
2nd Generation - when no-one else can offer a first generation competitor...
 
  • Like
  • Love
  • Fire
Reactions: 11 users

Steve10

Regular

ToF 3D image sensors are changing the way we engage with photography and mixed reality


Fatima Khalid
23 Aug, 2022



Article #3 of the Enabling IoT Series. This article looks at how Time of Flight (ToF) sensors add depth information to digital imaging and are changing the way we interact with VR and other video technologies.

Augmented Reality (AR) · IoT · Semiconductors · Sensors · Virtual Reality (VR)


This is the third article in a 6-part series exploring the technologies enabling cutting-edge IoT.

The emergence of video technology brought a revolution in audio-visual communication across the globe. Today, the world heavily relies on photo and video-based mediums for a range of purposes, including business, communication, education, and entertainment. Mechanisms of capturing and presenting video have also seen rapid advancement. The introduction of 3D cameras has allowed users to view things with a greater amount of information than ever before.

An integral part of this process is the addition of depth information to digital videos, which takes them one step closer to reality. This article looks at how Time of Flight (ToF) sensors are used for this purpose and are changing the way we interact with VR and other video technologies.

Introducing ToF 3D Image Sensors

Time of Flight (ToF) is a method of measuring the distance to an object based on the flight time of light. ToF sensors or cameras emit a modulated infra-red beam and measure the phase difference between the light received from different surfaces at various distances. This data is translated into distance and gives depth to the illuminated scene. ToF sensors use pixel-by-pixel processing, which requires sophisticated algorithms, but their use is becoming increasingly popular in logistics, automotive, and consumer electronics due to their broad applicability. By scanning in three dimensions, ToF sensors can create 3D images of target objects.

ToF Sensor Working Principle

The operation of a 3D ToF image sensor is quite straightforward. Detection is based on the principle of Photonic Mixer Devices (PMD). There are two general categories of ToF sensors: direct and indirect sensors. Direct ToF sensors work by emitting light, collecting the reflected signal, and then measuring the delay in the signal [Fig 1]. The time of flight for the light is then converted into a distance measurement, which enables the 3D reconstruction of the scene.

[Fig 1]
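The direct-ToF relationship described above can be sketched numerically. This is a minimal illustration, not from the article; the function name and the 20 ns example value are assumptions:

```python
# Direct ToF sketch: the sensor measures the round-trip delay t of an
# emitted light pulse; distance is half the path travelled at light speed.
C = 299_792_458.0  # speed of light, m/s

def direct_tof_distance(round_trip_s: float) -> float:
    """Distance to the reflecting surface from the round-trip delay."""
    return C * round_trip_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
print(round(direct_tof_distance(20e-9), 3))  # -> 2.998
```

Nanosecond-scale timing resolution is why direct ToF needs fast detection electronics: 1 ns of timing error corresponds to about 15 cm of range error.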

Indirect ToF sensors continuously emit modulated signals and analyse the reflected signals to determine phase shift. Two receiving wells are continuously opened and closed at the modulated light frequency with one receiver operating out of phase with its counterpart. Therefore, each receiver gets a portion of the reflected signal, and based on the amount of light received, the phase shift is determined, as shown in Fig 2.

[Fig 2]
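The two-well phase measurement described above can be sketched as follows. This is an idealized illustration under stated assumptions (square-wave demodulation, no ambient light); the function names and values are not from the article:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def two_well_phase(q1: float, q2: float) -> float:
    """Crude phase-shift estimate from the charge collected in the two
    out-of-phase receiver wells (ideal demodulation, no ambient light)."""
    return math.pi * q2 / (q1 + q2)

def phase_to_distance(phase_rad: float, f_mod_hz: float) -> float:
    """Distance from the measured phase shift: d = c * phi / (4 * pi * f)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# Equal charge in both wells implies a quarter-cycle shift (pi/2),
# which at 20 MHz modulation corresponds to about 1.874 m.
phi = two_well_phase(q1=1.0, q2=1.0)
print(round(phase_to_distance(phi, 20e6), 3))  # -> 1.874
```

Note the trade-off this exposes: raising the modulation frequency makes each radian of phase correspond to less distance (better accuracy), but the phase wraps sooner, shrinking the unambiguous range.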



Benefits and Applications

ToF provides an optimized solution for 3D-enhanced photography by increasing the range without sacrificing image resolution, while delivering every digital detail in a low-cost, reliable, and compact package. The latest ToF sensors consume little power, making them an attractive choice for mobile devices. Additionally, ToF sensors are less dependent on precise mechanical alignment than technologies such as structured light emission. Newer ToF sensors allow the use of higher modulation frequencies, which significantly increases accuracy.

According to Stratview Research, the ToF sensor market is expected to see annual growth of 16.8% from 2021 to 2026 (1). The applications of ToF cameras are mainly focused on 3D imaging and scanning in both consumer electronics and the logistics industry. The technology has the potential to revolutionize 3D detection and imaging by providing rich 3D visuals at a low cost.

ToF sensors can be deployed to aid areas such as facial recognition, security tracking, inventory management, and environmental and architectural mapping. In modern vehicles, ToF sensors enable mapping of the close environment, assist in parking, and support many other functions in the cabin such as driver monitoring and gesture control. The most intriguing application, however, lies in consumer electronics and mixed reality. Due to its small form factor and power efficiency, ToF may be the only 3D imaging technology prepared to venture into consumer electronics.

Case Study: Enhancing the AR/MR experience

Incorporation of 3D ToF sensing in consumer electronics is an untapped market with game changing implications. According to Counterpoint Research, incorporation of ToF cameras in smartphones is expected to be a major trend of the future (2). In a recently held webinar, experts from TECNO, Samsung, and DXOMARK Image Labs claimed that features like ToF and Dynamic Vision Sensors (DVS) could enable mobile phone cameras to outperform DSLR cameras (3). Smartphones and gaming devices have utilized 3D ToF cameras in the past for highly realistic augmented/mixed-reality (AR/MR) applications. AR/MR applications blend the real world with virtual elements to give users an enhanced experience of business interactions, 3D design and gaming or entertainment.

Although AR/MR has not become mainstream yet, with the rapid progression of the tools behind these technologies it is only a matter of time before it becomes synonymous with everyday smartphone use.

The two main elements required to initiate an MR experience are a smartphone camera and an application that can detect the user’s surroundings and show them on the screen. In this way, the real world can be mapped and virtual images can then be placed into it. To achieve this, the camera of the smartphone must be able to provide an in-depth and detailed image of the scanned area.

ToF sensors become the main player in this scenario by accurately mapping objects and their surroundings. MR is a promising technological aid in 3D design: for instance, an interior designer can map out a room using their phone and place digitally created furnishings. Similarly, any real-world object can be 3D mapped using a ToF sensor camera and digitally manipulated. A number of such commercial mobile applications already exist but fail to give a realistic mapped result due to the limitations of the device's camera (4).

ToF would significantly enhance depth information and, in turn, mapped images, opening endless possibilities for innovative and quick 3D design. Another application of mixed reality lies in business interactions, whereby professionals can show realistic site/project images for greater understanding.

MR is also being used to create real-world business backdrops where people can be inserted into a person's field of view, so that a virtual meeting feels like a real-life meeting. ToF lies at the heart of this operation, as only the depth information provided by 3D imaging can take this experience to a level close to reality. The metaverse concept recently introduced by social networking giants will make use of precisely this technology, increasing demand and innovation for ToF sensors.

Gaming is another area that is already utilizing the benefits of ToF in AR/MR domains. Gaming devices, and now smartphones, are using the gesture-recognition capability of ToF sensors for totally immersive experiences. When introduced into the front camera of mobile devices, ToF can greatly enhance security measures such as face recognition. With the use of 3D in the rear camera, ToF promises outstanding computational photography and an immersive VR experience. Without extensive post-processing, ToF sensors can effectively render background blur in pictures and videos.

Mixed reality is much more demanding than virtual and augmented reality technologies. It sets comparatively stringent requirements on processing power and requires highly accurate acquisition of 3D depth data in real time. Moreover, lighting conditions present a major challenge in achieving the desired realism in MR. A robust ToF camera must be able to capture accurate depth data in all lighting conditions, ranging from indoor environments like houses and vehicles to outdoor, weather-dependent lighting.

The use of MR in business interactions and day-to-day life may seem a thing of the far future to many. However, during the COVID-19 pandemic the world experienced an exponential increase in demand for services that blur the boundaries between the virtual and real worlds, leading to an acceleration of non-face-to-face services. It can therefore reasonably be expected that AR/MR will emerge as essential technologies for work and entertainment in the near future. Needless to say, technologies like ToF, which form the backbone of superior MR performance, will become all the more important.

ToF sensors from Infineon

Infineon introduced the REAL3™ image sensor family in collaboration with pmd technologies, a German company specializing in, and leading in, ToF technology. The sensors use infrared light for Time of Flight sensing and deploy micro-lenses for each pixel, resulting in virtually no loss and higher accuracy. Infineon's sixth generation of REAL3™ 3D ToF sensors is especially focused on consumer applications, such as smartphone applications, specifically AR/MR-based photo and videography. The sensor consumes 40% less power than previous generations, which is key to preserving battery life in applications such as gaming that require the ToF camera to be on for long intervals. Keeping in view the strict area constraints of modern smart devices, the ToF sensor is designed to occupy a mere 4.4 × 4.8 mm² footprint, a 35% reduction compared to older generations.

Another distinguishing feature of the REAL3™ imager system is the enhanced sensing range of up to 10 m. A ToF sensor's depth accuracy increases with higher modulation frequency; however, the sensor's range suffers. The REAL3™ imager therefore utilizes two modulation frequencies and careful post-processing of the collected data to achieve a substantial range without sacrificing depth quality. The sensors are also equipped with pmd technologies' SBI technology (Suppression of Background Illumination), which reduces pixel saturation, allowing the camera to operate in various lighting conditions. The sensors can also be dynamically reconfigured via the I²C interface to adapt to new operating environments. With all these features consolidated into a single imaging system, seemingly impossible applications now appear to be a definite reality. Designers must keep a lookout for Infineon's latest innovations in highly accurate 3D imagers.
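The range-versus-accuracy trade-off and the dual-frequency workaround mentioned above can be sketched as follows. The frequency pair here is hypothetical, not Infineon's documented values, and this shows only the range-extension idea, not the full disambiguation algorithm:

```python
# Dual-frequency range-disambiguation sketch (illustrative): a single
# modulation frequency f wraps at the unambiguous range c / (2 f);
# combining two frequencies extends the effective unambiguous range to
# that of their greatest common divisor ("beat" frequency).
from math import gcd

C = 299_792_458.0  # speed of light, m/s

def unambiguous_range(f_mod_hz: int) -> float:
    """Max distance measurable before the phase wraps: c / (2 f)."""
    return C / (2.0 * f_mod_hz)

f_hi, f_lo = 80_000_000, 60_000_000        # hypothetical frequency pair
f_beat = gcd(f_hi, f_lo)                   # 20 MHz effective frequency
print(round(unambiguous_range(f_hi), 2))   # -> 1.87  (high frequency alone)
print(round(unambiguous_range(f_beat), 2)) # -> 7.49  (pair combined)
```

This is why a single high frequency gives good depth precision but only a couple of metres of range, while pairing it with a second frequency recovers a much longer unambiguous range without giving up that precision.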

Details of ToF Sensors from Infineon for consumer and industrial markets can be accessed here and for automotive from here.

About the sponsor: Infineon Technologies

Infineon Technologies AG is a world leader in semiconductor solutions that make life easier, safer and greener. Microelectronics from Infineon are the key to a better future. With around 50,280 employees worldwide, Infineon generated revenue of about €11.1 billion in the 2021 fiscal year (ending 30 September) and is one of the ten largest semiconductor companies worldwide. To learn more click here.


References

1. Stratview Research. Time-of-Flight (ToF) Sensor Market | Market Size, Share & Forecast Analysis | 2021-26 [Internet]. 2022 Jul [cited 2022 Jul 31]. Available from: https://www.stratviewresearch.com/1731/time-of-flight-(ToF)-sensor-market.html

2. Counterpoint Research. Brighter, steadier, smarter: How smartphone cameras will improve in 2022 [Internet]. BBC Future. [cited 2022 Aug 20]. Available from: https://www.bbc.com/storyworks/futu...r-how-smartphone-cameras-will-improve-in-2022

3. Counterpoint Research. Global Mobile Camera Trends 2022: Innovation Talk [Internet]. Counterpoint Research. 2021 [cited 2022 Aug 20]. Available from: https://www.counterpointresearch.com/global-mobile-camera-trends-2022-innovation-talk/

4. Velichko Y. How AR Fills Your Room with Virtual Decor [Internet]. PostIndustria. 2021 [cited 2022 Jul 31]. Available from: https://postindustria.com/how-ar-fi...irtual-furniture-placement-apps-ar-furniture/
 
  • Like
  • Fire
  • Love
Reactions: 11 users
I thought it went something like this:

AKIDA 1.0:
  • Akida 1000
  • Akida 1500
AKIDA 2.0:
  • Akida 2000
  • Akida 2500 (maybe, who knows in the near future?)
Then it goes all the way up to...

AKIDA 10.00 (in 2030)
  • AKIDA 10,000 (the "holy grail" of general artificial intelligence, by which time we'll all be so vastly wealthy IMO that we might not give two figs what it's called).
If we're having this much trouble with the naming of the product lineup, imagine what trouble Kochie would be having..



He's still trying to work out how to pronounce nuromorphic! 🤣

(edit. Of course I know how to spell neuromorphic! Sheezz 🙄.. Do you think I'm heavily invested in something I can't even spell, let alone understand? That was put in for effect 😉)
 
Last edited:
  • Haha
  • Like
Reactions: 27 users
D

Deleted member 2799

Guest
I wonder what the status is regarding the collaboration with the IFS Accelerator - IP Alliance. This must be a real blow for the other participants, but also an enrichment... Has anyone heard anything lately about the current status? It's dead quiet there, too.
 
  • Like
Reactions: 1 users
It took some time for a non-technical person to grasp what Akida can do. Now, with the second generation, they have come out with ballistic possibilities; I imagine the same will melt even a few tech-savvy people.
But on the whole it is not just like going from iPhone 13 to iPhone 14; it is a sea change in the capabilities of the newer Akida.
DYOR
Well, at least one thing will come out of this: we will not have to suffer the 'is this competition?' question where AKIDA 2000 is concerned, because with a five-year lead, and all that it can do about to be laid out in the press release, on the website, and by Edge Impulse, there will be no grey areas for people to worry that someone has a similar product.

It will be like comparing a genuine Police Call Box with Dr. Who's Tardis.

The only similarity will be they are painted the same colour.

Like the Tardis, the minute you open the door to the AKIDA specs it will be glaringly obvious that you have walked into a science fiction future.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 48 users
If we're having this much trouble with the naming of the product lineup, imagine what trouble Kochie would be having..

View attachment 31282

He's still trying to work out how to pronounce nuromorphic! 🤣
Hope he's enjoying his Bitcoin fame🤣
 
  • Like
  • Haha
Reactions: 6 users

BaconLover

Founding Member
  • Like
  • Love
  • Fire
Reactions: 11 users

TheFunkMachine

seeds have the potential to become trees.
Without going into full detail, that ANN has some very powerful wording from start to finish.
It's like Sean has gone into full beast mode. 💪
And this is how you defend your company with a falling share price.
 
  • Like
  • Love
  • Fire
Reactions: 10 users
Can someone answer me this riddle:
Why are we not on CommSec's most advanced stocks list,
or on there as a mid cap? We are up 18.2%, for f sake.
 
  • Like
Reactions: 2 users

Beebo

Regular
Why would they time this for US market open, when BRN is listed on the ASX?

Not sure I follow the significance?
Because exposing this Australian gem of a company to THE top dog market of the world is a no brainer!

We have the GOODS to become tech-darling with explosive sales around the corner.

So what else will the official launch unveil? Who else might be invited? 😀
 
  • Like
  • Fire
Reactions: 12 users

Mugen74

Regular
Press 'more' and look in small cap most active.
 
  • Like
  • Thinking
Reactions: 4 users

Cyw

Regular
Can someone answer me this riddle:
Why are we not on CommSec's most advanced stocks list,
or on there as a mid cap? We are up 18.2%, for f sake.
It was in the small cap section earlier, before all these mining companies shot up more than 20% and pushed BRN off the list. Don't forget BRN is a small cap stock now in CommSec's books.
 
  • Like
Reactions: 2 users

Diogenese

Top 20
What seems like a lifetime ago, but was only last year, I reminded everyone that BrainChip announced around May 2019 that they had released AKIDA1000 IP to select customers. Then in early 2020 they announced that the AKIDA1000 IP was generally available. But it was not until October 2020 that actual engineering samples of the AKIDA1000 were released to Early Access Customers.

I suggested, based upon this experience and the fact that Peter van der Made had completed the AKD2000 IP in late 2021 and was shortly going to hand it off to engineering, that they may well have also released this IP to select customers. And since Peter van der Made predicted it might solve the plastic-bag-blowing-across-the-road problem, Mercedes Benz might well be one of those select customers, not to mention Valeo.

So if, as some have suggested, a customer or customers are part of the upcoming press release, then for this to occur my idea that there was an early release of the IP would need to have been correct.

The fact that today's announcement makes a strong point about customer feedback being the driver, and the recent reveal that MegaChips worked on the backend design for AKIDA1500, suggests very strongly that this has been the case.

If I am correct then any such early customer engagement with the IP will have significant importance to the semiconductor industry at large.

Assuming of course that any such comments are positive.

My opinion only DYOR
FF

AKIDA BALLISTA
At an early stage of the design of 2.0, a software simulation (a la ADE/MetaTF) would have been produced and distributed to EAPs, and changes in accordance with customer recommendations incorporated as they were received.

The software implementations would have been the easiest part of the development. Designing the electrical logic circuitry to implement the changes would have been the most difficult part ... and then the electrical circuit would have been mapped to the mask works for the tape out (just a simple cut and paste/drag and drop job) prior to putting the wafer in the oven.

If Intel do not embrace Akida, they run the risk of being relegated to a fly-spot on the tapestry of history.
 
  • Like
  • Fire
  • Love
Reactions: 57 users

Dr E Brown

Regular
Mickleboro article is out on Fool. Oh dear!
 
  • Haha
  • Like
Reactions: 8 users
Top Bottom