Looks to me like the different charges went bang and we've lost both?
Hopefully I'm wrong.
Frangi, I would also miss your detective work for us very much.
Simon Thorpe has advocated a universal income for many years to offset the disruption to the workplace from AI.
Now Sam Altman has jumped on the bandwagon and will no doubt make a bigger splash:
OpenAI's Statement SHOCK the Entire Industry! AI Riots vs "Moore's Law for Everything" by Sam Altman (youtube.com)
...
@ 19:30
This will be a hard sell in ideologically entrenched America. Still, they hounded Chaplin out, so this bloke shouldn't prove too difficult.
Foundation models, now powering most of the exciting applications in deep learning, are almost universally based on the Transformer architecture and its core attention module. Many subquadratic-time architectures such as linear attention, gated convolution and recurrent models, and structured state space models (SSMs) have been developed to address Transformers' computational inefficiency on long sequences, but they have not performed as well as attention on important modalities such as language. We identify that a key weakness of such models is their inability to perform content-based reasoning, and make several improvements. First, simply letting the SSM parameters be functions of the input addresses their weakness with discrete modalities, allowing the model to selectively propagate or forget information along the sequence length dimension depending on the current token. Second, even though this change prevents the use of efficient convolutions, we design a hardware-aware parallel algorithm in recurrent mode. We integrate these selective SSMs into a simplified end-to-end neural network architecture without attention or even MLP blocks (Mamba). Mamba enjoys fast inference (5× higher throughput than Transformers) and linear scaling in sequence length, and its performance improves on real data up to million-length sequences. As a general sequence model backbone, Mamba achieves state-of-the-art performance across several modalities such as language, audio, and genomics. On language modeling, our Mamba-3B model outperforms Transformers of the same size and matches Transformers twice its size, both in pretraining and downstream evaluation.
Smart people in Singapore, right?
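For anyone curious how the "selection" bit in that abstract actually works, here's a toy sketch in plain numpy: the SSM parameters B, C and the step size delta are computed from the current token, so the recurrence can decide, token by token, what to propagate and what to forget. All the shapes, projections and the simplified Euler-style discretization are my own assumptions for illustration, not the paper's hardware-aware kernel:

```python
# Toy selective SSM: parameters depend on the input (the "selection" idea).
import numpy as np

rng = np.random.default_rng(0)

d_model, d_state, seq_len = 4, 8, 16                  # toy sizes (assumptions)
A = -np.exp(rng.standard_normal((d_model, d_state)))  # fixed, negative for stable decay
W_B = 0.1 * rng.standard_normal((d_model, d_state))   # input -> B_t
W_C = 0.1 * rng.standard_normal((d_model, d_state))   # input -> C_t
W_dt = 0.1 * rng.standard_normal((d_model, d_model))  # input -> delta_t

def selective_ssm(x):
    """x: (seq_len, d_model) -> y: (seq_len, d_model), a sequential O(L) scan."""
    h = np.zeros((d_model, d_state))
    ys = []
    for x_t in x:                                     # no attention matrix anywhere
        # input-dependent parameters: this is the selection mechanism
        B_t = x_t[:, None] * W_B                      # (d_model, d_state)
        C_t = x_t[:, None] * W_C
        dt = np.log1p(np.exp(x_t @ W_dt))             # softplus keeps step size > 0
        # discretize A, then one recurrence step (simplified Euler step for B)
        h = np.exp(dt[:, None] * A) * h + dt[:, None] * B_t * x_t[:, None]
        ys.append((C_t * h).sum(axis=-1))             # y_t = C_t . h_t per channel
    return np.stack(ys)

y = selective_ssm(rng.standard_normal((seq_len, d_model)))
print(y.shape)                                        # (16, 4)
```

A large delta lets the state reset around the current token, while a delta near zero leaves the state untouched — that's the "selectively propagate or forget" behaviour the abstract is talking about.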
Whilst I originally posted the NASA BrainStack paper back in Aug, I just found the presso from the same time while doing some random googling. Whilst only a small mention, it's great to see confirmation we're still in the mix with some heavy hitters over at NASA Ames Research Centre.
Gotta say something about the tech
The original neuromorphic flight "test" was with Loihi on the TechEdSat, from what I read, but the paper states they're moving to the next phases with more neuromorphic hardware.
BRAINSTACK – A Platform for Artificial Intelligence & Machine Learning Collaborative Experiments on a Nano-Satellite
Date Acquired: August 2, 2023
Publication Date: August 11, 2023
Publication: SmallSat Conference Proceedings
Part of the conclusion...
Building on earlier successes with GPUs, and more recently, a neuromorphic processor flight test, the notion of a collaborative BrainStack orbital AI/ML laboratory module is presented. The intention is to be able to perform experiments on multiple hardware and software AI/ML elements on the same flight with different collaborative teams. The TES-n Common AI/ML Software Interface will permit a menu-driven set of experiments across individual elements. The next set of TES BrainStack experiments will host combinations of GPUs and neuromorphic processors, with flexibility to support upcoming novel systems and their unique interfaces. Such a collaborative BrainStack system will greatly expand the use of these remarkable new tools and methods in the space sector.
Full paper:
HERE
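For anyone wondering what a "menu-driven set of experiments across individual elements" could look like in practice, here's a purely hypothetical sketch — every name and structure below is my own invention for illustration; the paper doesn't publish the actual API of the TES-n Common AI/ML Software Interface:

```python
# Hypothetical sketch of a common interface dispatching one experiment menu
# across whatever AI/ML hardware elements are on the flight. Names are mine.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Element:
    """One hardware/software AI/ML element hosted on the satellite."""
    name: str
    run: Callable[[str], str]   # experiment id -> result summary

def gpu_run(experiment: str) -> str:
    return f"GPU completed {experiment}"

def neuromorphic_run(experiment: str) -> str:
    return f"neuromorphic processor completed {experiment}"

ELEMENTS: Dict[str, Element] = {
    "gpu": Element("gpu", gpu_run),
    "neuro": Element("neuro", neuromorphic_run),
}

MENU = ["exp-01-inference", "exp-02-onboard-learning"]  # made-up experiment ids

def run_menu() -> None:
    """Run each menu experiment on each element, as different teams might."""
    for experiment in MENU:
        for element in ELEMENTS.values():
            print(element.run(experiment))

if __name__ == "__main__":
    run_menu()
```

The point of such a design is that collaborating teams only write against one interface, and the same experiment menu runs unchanged whether the element underneath is a GPU, a neuromorphic processor, or some novel system added later.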
You're a bad smell, frangipani. Stay away until those you targeted feel free to return.
Don't worry, my detective agency has not closed down. So stay tuned!
But I do have a life outside TSE, and yesterday, I preferred to enjoy real-life fresh spring air over breathing in all those toxic fumes blowing in my direction in the virtual world of anonymous posters.
Maybe this type of message is best sent through messages so we don't all have to read it? Or do you need team support for this type of comment? (Asking for a friend).
You're a bad smell, frangipani. Stay away until those you targeted feel free to return.
Good Morning!
Can I see a BOOM please? Thank you!
Maybe tell your friend to follow their own advice. Frankly, if they can't be more generous and less of a narcissist, if they can't be slightly distracted by the improving SP, then move on. Why do they want to target individual contributors anyway?
Maybe this type of message is best sent through messages so we don't all have to read it? Or do you need team support for this type of comment? (Asking for a friend).
A "supercut " of Intels presentation with "ARM" partnership worth a watch imo.Goodmorning everyone.... Have a great week! It's going to be a cracker.... ( just hoping of course, no insider info).
A "supercut " of Intels presentation with "ARM" partnership worth a watch imo.
Now what little start-up is partnered with both? ............
AKIDA BALLISTA