BRN Discussion Ongoing

I picked up this idea while looking into the idea of a Nasdaq listing, and I read about special provisions for Emerging Growth Companies which, at face value, would cover Brainchip if it chose to move to the Nasdaq.

My first reading indicated that reporting conditions were greatly reduced, but on a second reading it seems the level and type of reporting is determined on a case by case basis, so I am unable to say if my first impression was correct. It does seem that annual reporting requires full disclosure.

My opinion only DYOR
FF

AKIDA BALLISTA
 
@Fullmoonfever, the above was for you - I did not attach it properly.
 

equanimous

Norse clairvoyant shapeshifter goddess


[AI Seminar] Limitations of Neuro AI​

Aug 9, 2021
In the fifth AI seminar, we had the opportunity to listen to a lecture on the 'Limitations of Neuro AI' given by senior researchers from LG Electronics AI Research Center, Kim Ko-keun and Kim Jae-hong. The lecture was divided into two sessions. The first session introduced Neuromorphic and Neuro-symbolic AIs, and the second session dealt with some research cases in the field of Neuro-symbolic AI.

Session 1: Neuromorphic AI and Neuro-symbolic AI​

Over the past few years, the field of AI made a big leap forward with great improvements in processing capacity and computational efficiency, as well as new insights in deep learning. Despite these developments, AI still has limitations. How can they be overcome?
There are many ways to approach the problem, but in this session we took a look at "Hyperscale AI", "Neuromorphic AI", and "Neuro-symbolic AI."
3 ways to overcome the limitations of Neuro AI



1. Hyperscale AI​

'Hyperscale AI' uses computing infrastructure with high computation speed to learn large volumes of data.
LG Electronics plans to invest at least 100 million dollars in AI computing infrastructure over the next three years to develop Hyperscale AI that thinks, learns and makes decisions on its own. That is a huge investment, right? It will be the first investment of this scale in the field made by a global manufacturing company. LG Electronics will continue its efforts to improve customer value through Hyperscale AI, by enhancing service quality, shortening the product development process, and more.

2. Neuromorphic AI​

The human brain has more than 150 trillion neuronal junctions called synapses, which neurons use to exchange signals with other neurons, and it processes information in a split second. The computational and decision-making speed of a human brain cannot yet be matched by even the most advanced AI. The trend now is to mimic the human brain to achieve higher processing speed, and a synaptic information transfer system called the spiking neural network is emerging as a new key in AI. AI that mimics the neural structure of the human body is called “Neuromorphic AI.”
Neuromorphic AI lets the computer imitate the parallel processing of the human brain rather than use serial processing in order to remember and compute large amounts of data simultaneously. Many efforts are currently being made to narrow the gap between the human brain and AI.
Comparison of human neural networks and AI computational processing methods


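The spiking idea described above can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron. This is a toy sketch, not LG's or any vendor's implementation; the weight, decay and threshold values are illustrative assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# leaks over time, integrates weighted input spikes, and emits an output
# spike (then resets) whenever it crosses a threshold.

def lif_neuron(input_spikes, weight=0.5, decay=0.9, threshold=1.0):
    """Simulate one LIF neuron over a binary input spike train."""
    potential = 0.0
    output_spikes = []
    for spike in input_spikes:
        potential = potential * decay + weight * spike  # leak, then integrate
        if potential >= threshold:
            output_spikes.append(1)   # fire
            potential = 0.0           # reset after spiking
        else:
            output_spikes.append(0)
    return output_spikes

# A steady input drives the potential up until the neuron fires periodically.
print(lif_neuron([1, 1, 1, 1, 1, 1]))  # → [0, 0, 1, 0, 0, 1]
```

Information is carried in the timing of spikes rather than in dense activations, which is what lets neuromorphic hardware stay idle (and power-efficient) between events.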

3. Neuro-symbolic AI​

Do you remember that the title of the fourth AI seminar was “Can deep learning teach rationality, ethics and philosophy from real life?” We had already dealt with "Neuro-symbolic AI" in that seminar, and we re-visited the subject in this lecture. Certainly, this tells us how hot this topic is in the field of AI at the moment.
Neuro-symbolic AI combines deep learning with reasoning and application capabilities. It enables the computer to understand human languages and dialogs better and is being used as an important measure of development in the field of natural language processing. If the development of Neuro-symbolic AI can be accelerated, we will soon see our voice assistants and chatbots giving us helpful answers to most of our questions rather than nonsensical ones.
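The "neural perception feeds symbolic reasoning" pattern can be sketched in a few lines. Everything here is an illustrative assumption: the detector is a stub standing in for a real vision model, and the rule table and function names are invented for the example, not taken from LG's system.

```python
# Toy neuro-symbolic pipeline: a (stand-in) neural perception step emits
# symbolic facts, and a rule-based reasoner answers queries over them.

def neural_perception(image):
    """Stub for a neural detector: returns (object, confidence) facts."""
    # A real system would run a trained vision model here.
    return [("cat", 0.92), ("sofa", 0.88)]

RULES = {
    # Symbolic knowledge base: object -> category.
    "cat": "animal",
    "sofa": "furniture",
}

def reason(facts, query_category, min_conf=0.5):
    """Symbolic step: which detected objects belong to a category?"""
    return [obj for obj, conf in facts
            if conf >= min_conf and RULES.get(obj) == query_category]

facts = neural_perception(image=None)
print(reason(facts, "animal"))  # → ['cat']
```

The split matters because the neural half handles noisy perception while the symbolic half gives explicit, auditable reasoning steps, which is the combination the lecture is pointing at.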

Session 2: Neuro-symbolic AI research cases​

Neuro-symbolic AI is categorized into six types, as illustrated in the figure below.
The most prominent of these at the moment is the third type, which uses a neural model to sense objects and a symbolic model to reason. In the second session, we looked at research cases on this particular type of neuro-symbolic AI, focusing on formulations and research trends.
6 Types of Neuro-Symbolic Systems



Although AI technology is growing at a rapid pace, it has not yet reached a level comparable to human intelligence. However, as the gap between humans and AI narrows and AI technology becomes more sophisticated, AI will be able to better understand our thoughts and communicate with us more smoothly.
Let’s hope that the "Hyperscale AI", "Neuromorphic AI" and "Neuro-symbolic AI" technologies can enable AI to more deeply understand humans and provide more beneficial answers to human society.

Overcoming the limitations of Neuro AI by Hyperscale AI, Neuromorphic AI, and Neuro-symbolic AI.



ThinQ.AI is also making strides to reach a point where it does not need any human intervention to function.
This September, LG Electronics will release the knowledge-based ThinQ.AI 5.0 to offer real value to customers. Please look forward to experiencing this product.
Introduce ThinQ.AI 5.0
 

JK200SX

Regular
Back in January we were all contemplating AKIDA going into space with NASA, and there was also the comment by Anil Mankar of the 19nm or 90nm chip that may be used by them.
Artemis 1 launches in about an hour......... possibility of an AKIDA chip onboard?
 

wilzy123

Founding Member
There are two explanations for a post like this.

1. Somebody has access to pharmaceuticals that they shouldn't necessarily be indulging in

OR

2. Somebody is overdue for a session with their therapist

Either way, I value the enthusiasm.

 

uiux

Regular
Back in January we were all contemplating AKIDA going into space with NASA, and there was also the comment by Anil Mankar of the 19nm or 90nm chip that may be used by them.
Artemis 1 launches in about an hour......... possibility of an AKIDA chip onboard?

There are CubeSats onboard, apparently.

But NASA's mission plans are explained here:


"July 2023: PACE4 nanosat mission through Van Allen belts. TRL"

Always a possibility there are classified projects happening in parallel to the public projects. If that's the case, who is to say Akida isn't already orbital?
 

Kozikan

Regular

uiux

Regular
Ford research on event based vision eg. Prophesee



EBSnoR: Event-Based Snow Removal by Optimal Dwell Time Thresholding

We propose an Event-Based Snow Removal algorithm called EBSnoR. We developed a technique to measure the dwell time of snowflakes on a pixel using event-based camera data, which is used to carry out a Neyman-Pearson hypothesis test to partition event stream into snowflake and background events. The effectiveness of the proposed EBSnoR was verified on a new dataset called UDayton22EBSnow, comprised of front-facing event-based camera in a car driving through snow with manually annotated bounding boxes around surrounding vehicles. Qualitatively, EBSnoR correctly identifies events corresponding to snowflakes; and quantitatively, EBSnoR-preprocessed event data improved the performance of event-based car detection algorithms.
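The dwell-time thresholding idea in the abstract can be sketched roughly as follows. This is a simplification in the spirit of EBSnoR, not the paper's method: the event format, the fixed microsecond threshold, and the use of inter-event gaps as a proxy for dwell time are all assumptions, and the actual algorithm derives its threshold from a Neyman-Pearson hypothesis test.

```python
# Sketch of dwell-time thresholding: events whose per-pixel dwell time
# (gap between successive events at the same pixel) falls below a
# threshold are labeled transient (snowflake-like); the rest are treated
# as background.

def split_by_dwell_time(events, threshold_us=5000):
    """events: list of (x, y, t_us) tuples, sorted by time.
    Returns (transient, background) event lists."""
    last_seen = {}                    # pixel -> timestamp of its previous event
    transient, background = [], []
    for x, y, t in events:
        prev = last_seen.get((x, y))
        if prev is not None and (t - prev) < threshold_us:
            transient.append((x, y, t))   # short dwell: snowflake-like
        else:
            background.append((x, y, t))  # first event, or long dwell
        last_seen[(x, y)] = t
    return transient, background

events = [(10, 10, 0), (10, 10, 1000), (20, 20, 0), (20, 20, 100000)]
transient, background = split_by_dwell_time(events)
```

Running a detector only on the background stream is what gives the reported improvement: the snowflake-like events are filtered out before car detection rather than confusing it.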

---

This work was made possible in part by funding from Ford Motor Company University Research Program.
 

Proga

Regular

uiux

Regular
 

uiux

Regular
 

Learning

Learning to the Top 🕵‍♂️
The McKinsey Technology Trends Report.
It's great to be a shareholder.
 
I personally think you can argue two ways:

1. As income is reported quarterly and licence fees are reported on this basis he could mean quarterly, or

2. As the Annual Report is the final arbiter of the year's performance he could mean annually.

I think 2. because he is American and quarterly reporting is not a thing required by the SEC.

My wild speculation so DYOR
FF

AKIDA BALLISTA
Or if company X pays 2 million in licensing, BRN could be expected to be paid 8 to 10 million in royalties from that same company once implementation in their product has taken place.

That was my layman's interpretation of that interview.
 
What an amazing individual you are. Many thanks I now have something to read at Spotlight while my darling wife takes hours to not make up her mind. LOL

FF

AKIDA BALLISTA
Amazing individual :ROFLMAO: You're too kind with your words. I hope the Spotlight shopping expedition was not too painful.
It brings me back to the days when I was a toddler going to Kmart with my mother. The dress aisles were so boring when I just wanted to hang out in the toy section and the bicycle section.
 
The risk is in the tail:

“ If the announcement is not capable of being drafted to meet these requirements without including the commercially sensitive information, then Listing Rule 3.1 will require the commercially sensitive information to be disclosed”

The problem with the ASX is that they will not give guidance or advice to companies in advance, and only step forward to judge after publication, when the ASX can direct that additional information be disclosed.

It is a legal nightmare for small growing technology companies on the ASX.

My opinion only DYOR
FF

AKIDA BALLISTA
Look up ASX Listing Rules Guidance Note 8

SC
 

Proga

Regular
True, generally younger people are MORE tech savvy; however, some of the older generation like to read the car's user manual back to front first. There is also a reasonable number of the older generation here who are invested in BRN, so you could say they are also tech savvy.

Either way like most things in life people are adaptable and can learn!

If Merc are going to roll it out, they're going to capitalise and roll it out on all applicable models,
Sorry @AARONASX I missed your post. Too busy defending myself from a couple of grubs.

If Merc are going to roll it out, they're going to capitalise and roll it out on all applicable models - that was my first thought until Bravo posted the article. Bravo and I had been discussing it for a few weeks, and my thinking, not dissimilar to yours, was that Merc was going to wow the market and roll it out in all models. I posted that a couple of times in our discussions. But it looks like they are taking the conservative route to make sure they get it right before unleashing it on their more valuable, higher-margin clients. Interesting to note, the C and E class are MB's highest selling models.

When reading it I had the same thoughts about us older-ish tech savvy posters in here. It was disappointing as hell and made me re-evaluate my own retirement plans. Bravo and I were hoping for it to be released in 2023 for the 2024 model, but as Bravo reposted today, it is going to be released in 2024 for the 2025 models, and then only in the MMA platform electric versions. I knew they weren't going to electrify the B-class, and if they did it would have been produced on the MMA platform, so I didn't include it in my original post. I read another article some months back saying they were still thinking about producing ICE vehicles after 2030 for non-western countries if the demand was there. Obviously those countries will be slower to adopt and implement the infrastructure required for EVs, and to start legislating the banning of ICE vehicles.

from the article:
Mercedes-Benz's plan is to build MB.OS for the first time into compact and medium-sized electric vehicles based on the MMA platform, and to adopt the self-developed operating system in all subsequent models - I'm assuming/praying this includes ICE vehicles as well, not just their electric range.

The reason for choosing to launch MB.OS on the MMA platform models: Mercedes-Benz's consideration is that users of the entry-level models are younger, more receptive to new things, and can also propose improvements to the system.

I hope this helps
 

Cardpro

Regular
I DON'T KNOW ABOUT ANYONE ELSE, BUT I'M GETTING VERY EXCITED!!!!!!!

The new Mercedes EQE SUV will make its official world debut on October 16 2022!

So, we all know that Mercedes-Benz plans to officially launch MB.OS in 2024. But what is less well known is that from 2022 to 2023, Mercedes-Benz will equip the next generation of new E-Class models with a lightweight version of the MB.OS operating system, before the full version launches in 2024.

Here are some snippets from various articles about the EQE SUV which are making me feel obligated to dash off to Dan Murphy's to top-up my diminishing supply of champers!

🍾🥂



EQE and the MBUX Hyper screen
 