Pom down under
Top 20
Wonder what they mean by more hmm
Morning
Yes - it's above my pay grade and very heavy going. Thank you.
I understood that.
A context of 8,000 previous words is amazing to me.
Had no idea LLMs were that powerful.
No wonder they use a lot of juice.
Well don't go back to bed and get up again.
Morning
Our price was red when I got up, then it was green, and now it's at 0%. At the moment it doesn't know where it wants to go.
It's definitely Intel right?..
Last time I looked it was.
The official comments coming from them on this sound a little "too" casual?...
Hi Bravo, Ooh-Ooh-Ooh!
Hey Eskie, there may be 4 entities, as SoftBank's Masayoshi Son is looking to raise up to $100 billion for a chip venture that will rival Nvidia. This project is apparently set to focus on semiconductors essential for artificial intelligence.
PS: Bravo reporting for duty, currently couch surfing at my Mum's, armed with an iPad and not much else, after exiting my town that is still without power or interwebs.
If you swap the numbers on each side of the decimal point ...
Hi Sirod,
Machine Learning Engineer - Neuromorphic Computing (Akida 2) - London
We are at the forefront of advancing neuromorphic computing technology. We are dedicated to developing cutting-edge solutions that transform how machines learn and interact with the world. Our team is growing, and we are seeking a talented Machine Learning Engineer to join our London office, focusing on developing applications using the Akida 2 neuromorphic computing platform.
Job Description:
As a Machine Learning Engineer, you will play a crucial role in our dynamic team, focusing on the development and implementation of machine learning algorithms tailored for the Akida 2 neuromorphic computing platform. Your expertise will contribute to optimizing AI models for energy efficiency and performance, aligning with the unique capabilities of neuromorphic computing.
The share price will go up, down or stay the same - or all of them.
Larry needs an accurate wake word recognition program and digital assistant! Now where would I find one? (Warning: strong language)
Definitely the company Intel, as when you click it takes you to their page.
I tried that, but it appears to be just an image for me...
Yikes!
Yes - it's above my pay grade and very heavy going.
In speech recognition, the speech must in some cases first be converted to text, although it is also possible to work with phonemes.
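As a toy illustration (mine, not from any particular toolkit): a CMUdict-style lookup that maps recognised words to phoneme strings instead of leaving them as text:

```python
# Toy sketch: a tiny CMUdict-style lookup mapping words to phonemes.
# The entries here are illustrative; real systems use a full
# pronunciation dictionary or a grapheme-to-phoneme model.
PHONEMES = {
    "speech": ["S", "P", "IY", "CH"],
    "to":     ["T", "UW"],
    "text":   ["T", "EH", "K", "S", "T"],
}

for word in "speech to text".split():
    print(word, "->", PHONEMES.get(word, ["<unknown>"]))
```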
The processor needs to understand the nature of the words:
Nouns
Verbs
Adjectives
Adverbs
Prepositions
Conjunctions
Articles
... then to comprehend the context.
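To make that first step concrete, here is a toy sketch (my own, with a hand-made dictionary standing in for a real trained tagger such as NLTK's or spaCy's):

```python
# Toy part-of-speech tagger: a hand-made dictionary stands in for a real
# trained model. Words not in the dictionary come back as "unknown".
POS = {
    "the": "article", "quick": "adjective", "lazy": "adjective",
    "fox": "noun", "dog": "noun",
    "jumps": "verb", "barks": "verb",
    "over": "preposition", "and": "conjunction", "quickly": "adverb",
}

transcript = "the quick fox jumps over the lazy dog and barks quickly"
print([(word, POS.get(word, "unknown")) for word in transcript.split()])
```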
One problem is discovering how far back you need to go to find the correct context.
In written language, a lot of context can be found in a single sentence. A paragraph captures much more. In a book, it may be necessary to recall a whole chapter to descry the context.
In normal speech, the context should be close at hand (or ear), unless it is a familiar term.
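A toy sketch of that trade-off (my own example sentence): whether an ambiguous word can be resolved depends entirely on how far back the window reaches:

```python
# Toy sketch: whether "bank" can be disambiguated (river bank vs the
# financial kind) depends on how far back the context window reaches.
tokens = "the river burst its banks so the bank was closed".split()
target = 7                                   # position of the ambiguous "bank"
for window in (2, 5, len(tokens)):           # phrase, clause, whole passage
    context = tokens[max(0, target - window):target]
    print(f"window={window}: {context}")
```

Only the largest window reaches back to "river", which is what settles the meaning.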
LSTM, Transformers, and Attention, not to mention LLMs, have come along in quick succession.
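For a feel of what Attention actually computes, here is a bare-bones scaled dot-product self-attention in NumPy (my own sketch of the standard Transformer formulation, not code from the paper below):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)            # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                       # weighted sum of the values

# Toy example: 4 time steps, 8-dimensional hidden states, attending to itself.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))                  # encoder hidden states
out = scaled_dot_product_attention(H, H, H)  # self-attention over the sequence
print(out.shape)                             # (4, 8)
```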
This 2021 paper gives an inkling of the complexity:
Thank you for Attention: A survey on Attention-based Artificial Neural Networks for Automatic Speech Recognition
Priyabrata Karmakar, Shyh Wei Teng, Guojun Lu
https://arxiv.org/pdf/2102.07259.pdf
There have been several attempts to find the optimal Attention system:
TABLE I: DIFFERENT TYPES OF ATTENTION MECHANISM FOR ASR

Global/Soft [10]: At each decoder time step, all encoder hidden states are attended.
Local/Hard [23]: At each decoder time step, a set of encoder hidden states (within a window) are attended.
Content-based [24]: Attention is calculated using only the content information of the encoder hidden states.
Location-based [25]: Attention calculation depends only on the decoder states, not on the encoder hidden states.
Hybrid [11]: Attention is calculated using both content and location information.
Self [20]: Attention is calculated over different positions (or tokens) of a sequence itself.
2D [26]: Attention is calculated over both the time and frequency domains.
Hard monotonic [27]: At each decoder time step, only one encoder hidden state is attended.
Monotonic chunkwise [28]: At each decoder time step, a chunk of encoder states (prior to and including the hidden state identified by the hard monotonic attention) is attended.
Adaptive monotonic chunkwise [29]: At each decoder time step, the chunk of encoder hidden states to be attended is computed adaptively.
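To illustrate the difference between the first two entries, here is a small sketch (mine, not from the paper) of the masks they imply: Global/Soft lets each decoder step attend to every encoder hidden state, while Local/Hard restricts it to a window:

```python
import numpy as np

def attention_mask(n_enc, mode="global", center=None, window=2):
    """Boolean mask over encoder steps for one decoder step.

    "global" attends to every encoder hidden state;
    "local" attends only to a window around a chosen position.
    """
    mask = np.ones(n_enc, dtype=bool)
    if mode == "local":
        mask[:] = False
        lo, hi = max(0, center - window), min(n_enc, center + window + 1)
        mask[lo:hi] = True
    return mask

print(attention_mask(8))                     # all True: global/soft
print(attention_mask(8, "local", center=4))  # True only for steps 2..6
```

The monotonic and chunkwise variants constrain that window further, forcing it to move left-to-right as decoding advances.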
This is a diagram of the configuration of a transformer-based encoder/decoder:
[Attachment 57212: transformer-based encoder/decoder architecture diagram]
Unfortunately, the paper does not cover TeNNs.
It's getting to where there's only one chair left. Who will find a seat when the music stops?