BRN Discussion Ongoing

AusEire

Founding Member. It's ok to say No to Dot Joining
Perhaps this is a good omen to keep your mouth watering... :)

View attachment 45975

Interesting codename 🧐

Drool would be the least of my worries if it contains Akida in any form!
Aroused The Office GIF
 
  • Haha
  • Like
Reactions: 14 users

Wags

Regular
Hi all, not sure of the correct protocol, if any.
Just advising a name change from Maccareadsalot to Wags.
Same person, same attitude, same ingredients.
Stay well all,
Macca
 
  • Like
  • Haha
  • Fire
Reactions: 26 users

Diogenese

Top 20
  • Haha
  • Like
Reactions: 12 users

Deadpool

hyper-efficient Ai
Not technically BRN related, but Space Jesus is worried 😟

 
  • Haha
  • Like
Reactions: 4 users

cosors

👀
Good question, I wonder myself where all those dumb, nonsense speculations without any proof or context are supposed to come from.
According to Mr Karl Popper (and me), a theory is right, or at least not wrong, until it is disproved. So make an effort to disprove it.
I stick to a plausible causal chain until it has been disproved.
Simply stating that something is dumb nonsense is not enough for me.
Before, you were specific about the Lidar and MB. Here you are far too general for me.
Someone once claimed that the world is a sphere. That was considered heresy. We know the rest of the story.
That is just an example of why I think this approach makes more sense than calling others out for their research and dot joining.
 
Last edited:
  • Like
  • Love
Reactions: 17 users

cosors

👀
Great suggestion! Let’s see some “serious research” from you then!
Thank you for your tireless work. Please never let up!
And, I really like your feet. Maybe this will help motivate you.
 
  • Haha
  • Like
  • Love
Reactions: 19 users

Satchmo25

Member
  • Like
  • Thinking
Reactions: 2 users

cosors

👀
I am thrilled! I only have three pages of posts to check this Sunday to see if anyone is saying anything wrong.
 
  • Haha
  • Like
  • Love
Reactions: 16 users

IloveLamp

Top 20
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Dhm

Regular
This is Apple's toaster patent:

US2022222510A1 MULTI-OPERATIONAL MODES OF NEURAL ENGINE CIRCUIT 20210113



View attachment 45961

[0052] Referring to FIG. 3, an example neural processor circuit 218 may include, among other components, neural task manager 310, a plurality of neural engines 314A through 314N (hereinafter collectively referred to as “neural engines 314” and individually also referred to as “neural engine 314”), kernel direct memory access (DMA) 324, data processor circuit 318, data processor DMA 320, and planar engine 340.

[0053] Each of neural engines 314 performs computing operations for machine learning in parallel. Depending on the load of operation, the entire set of neural engines 314 may be operating or only a subset of the neural engines 314 may be operating while the remaining neural engines 314 are placed in a power-saving mode to conserve power. Each of neural engines 314 includes components for storing one or more kernels, for performing multiply-accumulate operations, for performing parallel sorting operations, and for post-processing to generate output data 328, as described below in detail with reference to FIGS. 4A and 4B. Neural engines 314 may specialize in performing computation heavy operations such as convolution operations and tensor product operations. Convolution operations may include different kinds of convolutions, such as cross-channel convolutions (a convolution that accumulates values from different channels), channel-wise convolutions, and transposed convolutions.


View attachment 45962


[0063] FIG. 4A is a block diagram of neural engine 314, according to one embodiment. Specifically, FIG. 4A illustrates neural engine 314 performing operations, including operations to facilitate machine learning such as convolution, tensor product, and other operations that may involve heavy computation, in the first mode. For this purpose, neural engine 314 receives input data 322, performs multiply-accumulate operations (e.g., convolution operations) on input data 322 based on stored kernel data, performs further post-processing operations on the result of the multiply-accumulate operations, and generates output data 328. Input data 322 and/or output data 328 of neural engine 314 may be of a single channel or span across multiple channels.

It has 16 neural engines each with an array of MAC (multiply accumulate) processors.

They need to find some sort of neural processor which does not rely on MACs.
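
To make the MAC point above a little more concrete, here is a rough Python sketch (my own illustration only; the function names and the toy 1-D example are invented, and this is not code from Apple's patent or from BrainChip). It contrasts a dense MAC-style convolution, the kind of arithmetic an array of MAC units is built for, with an event-driven formulation that only does work when an input is non-zero:

```python
import numpy as np

def mac_convolution(inputs, kernel):
    """Dense 1-D convolution the way a conventional NPU's MAC array sees it:
    one multiply-accumulate per kernel tap, regardless of whether the input
    sample is zero."""
    out_len = len(inputs) - len(kernel) + 1
    out = np.zeros(out_len)
    for i in range(out_len):
        for j, w in enumerate(kernel):
            out[i] += inputs[i + j] * w  # a MAC every time, even for x == 0
    return out

def event_driven_convolution(inputs, kernel):
    """Event-driven view: only non-zero (spiking) inputs generate work, and each
    event just scatters weighted contributions into the output accumulators.
    With binary spikes (x == 1) the multiply disappears and only additions of
    stored weights remain. Purely illustrative -- not how Akida or any specific
    neuromorphic device is actually implemented."""
    out_len = len(inputs) - len(kernel) + 1
    out = np.zeros(out_len)
    for t, x in enumerate(inputs):
        if x == 0:
            continue  # silent input: no event, no work
        for j, w in enumerate(kernel):
            i = t - j
            if 0 <= i < out_len:
                out[i] += x * w
    return out

# Toy check: both formulations give the same result on a sparse input.
x = np.array([0, 0, 3, 0, 0, 0, 1, 0, 0, 0], dtype=float)
k = np.array([0.5, -1.0, 0.25])
assert np.allclose(mac_convolution(x, k), event_driven_convolution(x, k))
```

The point of the contrast is only that the dense version burns a MAC on every tap, while the event-driven version's cost scales with input activity, which is the property people are getting at when they say a processor "does not rely on MACs".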
Hi @Diogenese, thanks for your sterling efforts explaining this. Rhetorical question coming: why couldn't our senior management make it abundantly clear that we have the solution to Apple's problem?
 
  • Like
  • Fire
  • Thinking
Reactions: 17 users

Esq.111

Fascinatingly Intuitive.
  • Like
  • Wow
  • Thinking
Reactions: 17 users

jtardif999

Regular
Looks like our mates at the US Air Force (AFRL) got another $7m approved for the kitty, as below.

Hopefully they allocate some of it to development with Akida.



Washington, September 25, 2023

Amendments to H.R. 4365 – Department of Defense Appropriations Act, 2024



Carey (R-OH) – Amendment No. 116 – Increases and decreases by $7 million for research, development, test and evaluation for the Air Force with the intent that the $7 million will be used for the development of a cognitive EW machine learning/neuromorphic processing device to counter AI-enabled adaptive threats (Air Force RDT&E, Line 162, PE# 0207040F, Multi-Platform Electronic Warfare Equipment)
I worked with EW years ago and have always thought it would be a likely eventual use case for Akida.
 
  • Like
Reactions: 10 users
I do agree with your views, but I would like to add another dynamic. Maybe, just maybe, two things are or were at play this week: either the ASX didn't accept the way the company's planned announcement was presented, or there is a material aspect to the announcement(s) that wasn't accepted in the form submitted for release. Personally, I'm happy with my earlier view of something within the first two weeks of October. Time shall tell.

The company could be erring on the side of caution as AKD 2.0 wasn't quite ready for release, and that's a mature and responsible approach also.

Go Brisbane 🦁 Regards...Tech
It's definitely a possibility that a clarification email exchange between the ASX and BRN has occurred. It's fairly common.
 
  • Like
Reactions: 3 users
Probably not what we want to be honest. That would just give shorters further ammunition. Though arguably there are a couple of members that could do with the flick for adding no value.

I have met Antonio (and a couple of other board members) personally, and I'm comfortable with him leading our board. Well spoken, knowledgeable, and very experienced with this type of business at this stage of its life.

Let’s just hope that the landscape is different by the next AGM and shareholders are happy with what the company has delivered.

They're publicly stating that AKD2000 is a game changer for us, so in my eyes we should see some contracts before then. The EAPs have apparently had this chip for some months, and AKD2000 was itself forged from customer directives. With this in mind, I see no reason why the company can't get these contracts across the line. I would also expect our two existing contracts to bear fruit soon, and I'm particularly interested in the Valeo LiDAR system piece. I believe we could be involved, and with billions of dollars in preorders we should get a slice.

The next 12 months are make or break in my eyes. Let's see what they can produce, hopefully without any unnecessary restructures.
I'm 💯 in agreement with you, Rob. Surely there'll be at least 1-2 new IP deals before the year's out, as Chapman alluded to, and that should ward off the wolves at least until the next AGM.
 
  • Like
Reactions: 6 users

Mt09

Regular
  • Like
Reactions: 8 users

Deadpool

hyper-efficient Ai
  • Like
  • Love
  • Fire
Reactions: 23 users
They make an NPU (Neural Processing Unit), but they aren't touting it as neuromorphic?.. I think there's a difference, @Diogenese?


A lot of talk about using it for ChatGPT models, so, like in your article Esq, they definitely are trying to "ride" the A.I. wave..
They know where the attention is and they're going for it.

I don't think what they've got is a technical threat to us, but they are competition.
 
  • Like
  • Thinking
  • Fire
Reactions: 8 users

A neural processor, a neural processing unit (NPU), or simply an AI accelerator is a specialized circuit that implements all the control and arithmetic logic necessary to execute machine learning algorithms, typically by operating on predictive models such as artificial neural networks (ANNs) or random forests (RFs).

Nothing special, an accelerator..
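
For anyone wondering what "nothing special, an accelerator" looks like in practice, here's a tiny Python sketch (the names, shapes, and values are invented for illustration) of the dense ANN arithmetic a generic NPU is built to offload. The control logic that sequences layers plus this bulk matrix math is all the definition above is describing:

```python
import numpy as np

def dense_layer(x, weights, bias):
    """One fully connected ANN layer: a matrix-vector multiply plus a ReLU.
    This bulk arithmetic is the sort of thing a generic NPU / AI accelerator
    exists to execute quickly; nothing event-driven or neuromorphic about it."""
    return np.maximum(weights @ x + bias, 0.0)

# Toy two-layer forward pass: the "control logic" is just sequencing the layers,
# the "arithmetic logic" is the matrix math inside dense_layer.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
h = dense_layer(x, rng.standard_normal((16, 8)), rng.standard_normal(16))
y = dense_layer(h, rng.standard_normal((4, 16)), rng.standard_normal(4))
print(y.shape)  # (4,)
```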
 
  • Like
  • Fire
Reactions: 10 users