BRN Discussion Ongoing

Sirod69

bavarian girl ;-)
 
  • Like
  • Fire
Reactions: 13 users

Deadpool

hyper-efficient Ai
not sure if posted but this is for UK defence

Senior Data Scientist - Spiking Neural Networks

Posted 24/01/2023 by MFK Recruitment
Location: TW8, Brentford, Greater London
Our client based in Brentford is seeking a Senior Data Scientist (Spiking Neural Networks) to join their team.
MFK Recruitment has successfully recruited 2 individuals to the team in the past 18 months; both candidates are really enjoying their roles!
The Senior Data Scientist will work on-site/hybrid in Brentford, working on fascinating projects, mainly for clients in the Defence sector.
My client delivers systems that can perceive and understand the world with the same semantic richness as the human brain, using the same neural mechanisms they believe are at work in the brain.
Their approaches are inspired by their recent advances in theoretical neuroscience, which are beginning to deliver an understanding of how the brain makes sense of the world.
My client requires a Senior Data Scientist with a passion for both technology and the development of practical usable applications to join a growing project team. The project team will work on the development of a Spiking Neural Network (SNN) Engine designed for practical engineering applications.
The successful applicant will have current experience working on the development of SNNs, with strong coding skills in C++ and knowledge of CUDA. Experience in Python and PyTorch would also be valuable. Postgraduate or post-doctoral experience working on SNNs would be an advantage, as would experience developing SNNs in a commercial setting.
If you join our team, you will have the opportunity to interact with and contribute to work on the cutting edge of research through the development of key core component technologies required by the company. When you push out your code, you will be able to see the results of your efforts through your impact on product development and the enrichment of key customer relationships. You will contribute to the system design process and work towards the delivery of our collective business goals. Your code and your work will have a very direct, measurable impact on achieving those goals.
Senior Data Scientist – Spiking Neural Networks
Job responsibilities:

  • Apply your existing knowledge of SNNs, machine vision and image analysis techniques to the development of a novel SNN engine based on our research outputs. You may also be asked to work on projects that make use of CNNs and Transformer networks.
  • Enhance machine learning software with the latest in machine learning algorithms.
  • Work with other team members tasked with the development of a system for automating the production of synthetic training images derived from digital 3D virtual reality models of objects.
  • Contribute to the development of processes to curate real-world intelligence, surveillance, and reconnaissance (ISR) imagery, and to present both ISR and synthetic images to networks during training and testing.
  • Contribute to leading development and software engineering efforts to develop solutions to our most pressing problems and to put your work into practice.
  • Contribute to the development of systems that will be core components of our machine vision, image analysis and other AI/ML products.
  • Support the work of other data scientists and developers in the team.
  • Work with research colleagues and offer technical support to colleagues in commercial roles in developing business opportunities. Work with research colleagues to understand the solutions you will need to code.
  • Contribute to development and software engineering efforts as required on other projects yet to be defined.
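For anyone wondering what the ad means by a spiking neural network: at the core is usually something like a leaky integrate-and-fire (LIF) neuron, which accumulates weighted input over time and emits a spike only when a threshold is crossed. Here is a minimal sketch in Python/NumPy, purely illustrative and not the advertiser's or BrainChip's implementation:

```python
# Toy leaky integrate-and-fire (LIF) neuron, illustrative only.
import numpy as np

def lif_simulate(input_current, v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Simulate one LIF neuron over discrete time steps.

    input_current: 1-D array of input drive at each step.
    Returns a boolean spike train of the same length.
    """
    v = 0.0
    spikes = np.zeros(len(input_current), dtype=bool)
    for t, i_t in enumerate(input_current):
        v = leak * v + i_t        # leaky integration of the input
        if v >= v_thresh:         # threshold crossing emits a spike
            spikes[t] = True
            v = v_reset           # membrane potential resets after spiking
    return spikes

rng = np.random.default_rng(0)
spike_train = lif_simulate(rng.uniform(0.0, 0.3, size=100))
print(f"{spike_train.sum()} spikes in 100 steps")
```

The appeal for low-power hardware is that downstream work happens only when spikes occur, rather than on every input.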
Spiking Neural Networks, where have I heard that before?
Dare I say, has SNN become ubiquitous?
With Pete's and Anil's love child "baby Akida" front and centre of this technological alchemy of modern times.
"It's good to be the king"
 
  • Like
  • Fire
Reactions: 14 users

Teach22

Regular
I have actually been a LTH for some time. I just do charts to help out people who don't know how to do TA. Yes, our share price is affected by manipulators etc., but that doesn't mean that charting can't help some people make decisions about their financial future. Yes, I trade the shares up and down and sell and buy etc. I do this because I, like everyone else here, want to make money, not watch it go down the drain.

I get that you're angry or whatever, but I'm not manipulating the charts as I don't write them… simply informing people so they can make a decision on their financial future… I want the company to succeed; I'm heavily invested in it.
Interestingly, there was a chartist, who, some 12-18 months ago, was considered something of a messiah on the BRN threads.
As the BRN SP went from 3c to over $2, the messiah was adored by many BRN holders.
Everyone loved this chartist; her charts were telling her that the BRN SP was going to keep going higher and higher. The crowd was delirious; charts were cool. Lots of holders wanted to learn more about TA.

But, like all good things, it came to an end.

All those that bought since that high are showing red on their portfolios (if they haven't sold). Holders have now realised that charts are no longer relevant to this company, so please move on unless your charts start telling you that we are heading for more blue sky.
 
  • Like
  • Haha
  • Love
Reactions: 14 users

Violin1

Regular
FYI there is a technical analysis thread.
Totally agree. And an obvious reason why nobody is over there reading!
 
  • Like
  • Haha
Reactions: 12 users
Interestingly, there was a chartist, who, some 12-18 months ago, was considered something of a messiah on the BRN threads.
As the BRN SP went from 3c to over $2, the messiah was adored by many BRN holders.
Everyone loved this chartist; her charts were telling her that the BRN SP was going to keep going higher and higher. The crowd was delirious; charts were cool. Lots of holders wanted to learn more about TA.

But, like all good things, it came to an end.

All those that bought since that high are showing red on their portfolios (if they haven't sold). Holders have now realised that charts are no longer relevant to this company, so please move on unless your charts start telling you that we are heading for more blue sky.
"Everyone loved this chartist"

I can personally attest to that being a false statement, Teach22.
 
  • Like
  • Haha
  • Fire
Reactions: 20 users

equanimous

Norse clairvoyant shapeshifter goddess
Spiking Neural Networks, where have I heard that before?
Dare I say, has SNN become ubiquitous?
With Pete's and Anil's love child "baby Akida" front and centre of this technological alchemy of modern times.
"It's good to be the king"
And we have the patents. Looking forward to your new movie too
 
  • Like
  • Haha
Reactions: 4 users

charles2

Regular
Preamble to what could/will be the most edifying 50 minutes you could ever spend digesting how small-company stocks are manipulated with impunity... some bankrupted... and retail investors similarly decimated.

Forget all your preconceived notions... they are the bare tip of the iceberg.

"My company Genius Group $GNS has hired ShareIntel CEO David Wenger & BuyIn.Net Tom Ronk to track illegal trading in our company shares. In this eye-opening video they share how they became the Top 2 trackers of Wall St illegal trading, why it's so hard to track trades, what illegal activity is hiding in our markets and who's hiding it. If you are a CEO or investor in a public company where you suspect foul play, this video is essential viewing to give you a full understanding of what you are up against and the steps you can take to protect your company and your shareholders. There's a movement against Wall St Fraud, and the tide is going out... "Only when the tide goes out do you discover who's been swimming naked.” - Warren Buffett.

It starts slowly... don't get antsy.

 
  • Like
  • Fire
  • Love
Reactions: 11 users

TopCat

Regular
GlaxoSmithKline have an address in Brentford….

This article is from a few years ago, but it appears GlaxoSmithKline are no strangers to neuromorphic computing. The attached diagrams are over my head so maybe someone else can make sense of them.
Edit: not sure why the link has superimposed itself onto one of the attachments?



In 2017, GlaxoSmithKline, the National Cancer Institute, and the Department of Energy formed the ATOM Consortium (Accelerating Therapies for Opportunities in Medicine). ATOM's ambitious project is to apply deep learning to drug discovery through high-performance computing at DOE labs such as Lawrence Livermore National Laboratory, scanning through big data on millions of molecules, identifying relationships in genomics, and applying them to new data. Drug discovery with deep learning frameworks requires profound amounts of large-scale big data for research on molecular structures. Oak Ridge National Laboratory, which specializes in DANNA (Dynamic Adaptive Neural Network Arrays), a neuromorphic architecture inspired by brain-like biological architectures that can compute millions of data points at low power cost using spiking neural networks, collaborates with the ATOM Consortium on drug discovery. Unlike many neuromorphic hardware implementations, which have a fixed number of neurons and static synapses, DANNA networks are programmable, with designs found through evolutionary optimization.
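The "evolutionary optimization" mentioned for DANNA refers to searching over network designs rather than training them by gradient descent. A toy sketch of that general idea follows, with a made-up fitness function; this is my own illustration, not Oak Ridge's actual algorithm:

```python
# Toy evolutionary search over a parameter vector: keep the fittest,
# clone them, mutate the clones. Illustrative only.
import numpy as np

def evolve(fitness, dim=8, pop=20, gens=50, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    population = rng.normal(size=(pop, dim))
    for _ in range(gens):
        scores = np.array([fitness(ind) for ind in population])
        parents = population[np.argsort(scores)[-pop // 4:]]   # fittest quarter
        children = parents[rng.integers(len(parents), size=pop)]
        population = children + sigma * rng.normal(size=(pop, dim))  # mutation
    return population[np.argmax([fitness(ind) for ind in population])]

# Stand-in fitness: how close the vector is to an arbitrary target.
target = np.linspace(-1, 1, 8)
best = evolve(lambda w: -np.sum((w - target) ** 2))
print(np.round(best, 2))  # approaches the target vector
```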


Stanford researchers applied machine learning and deep learning techniques with deep neural networks through multi-layer architectures. Stanford researchers have applied the one-shot learning paradigm with long short-term memory on TensorFlow with the DeepChem library. GlaxoSmithKline has agreed to share clinical trial data, chemical compounds and toxicology big data for a period of 15 years. GlaxoSmithKline is expected to revolutionize drug discovery in the next few years with supercomputing and deep learning.
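The one-shot learning paradigm mentioned here means classifying a new example from a single labelled example per class, typically by comparing learned embeddings. Here is a minimal sketch with random vectors standing in for learned molecular features; the Stanford work used LSTMs and DeepChem, not this:

```python
# Toy one-shot classification by nearest support embedding. Illustrative only.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_shot_classify(query, support):
    """support: dict mapping class label -> one embedding vector."""
    return max(support, key=lambda label: cosine(query, support[label]))

rng = np.random.default_rng(1)
# Hypothetical 8-dim embeddings standing in for learned molecular features.
support = {"active": rng.normal(size=8), "inactive": rng.normal(size=8)}
query = support["active"] + 0.1 * rng.normal(size=8)  # noisy copy of "active"
print(one_shot_classify(query, support))  # -> "active"
```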
 

Attachments

  • 2654607B-0CCF-460D-B741-656D93875DFF.jpeg (630.1 KB · Views: 69)
  • 9D007D7B-2A7A-41B2-8E88-BA3EC9C0E587.jpeg (588.3 KB · Views: 64)
Last edited:
  • Like
  • Fire
Reactions: 11 users

Tothemoon24

Top 20
So much happening in the tech world atm , sometimes I just want to scream “BRAINCHIP’ , my doctor said it’s not allowed due to the other mental health patients becoming confused 🤪
 
Last edited:
  • Like
  • Haha
Reactions: 20 users

Tothemoon24

Top 20
DefenseScoop
AI agents take control of modified F-16 fighter jet

DARPA's air combat evolution program aims to advance the Pentagon’s autonomous systems capabilities as the U.S. military pursues robotic wingmen and other drones.
By Jon Harper
February 14, 2023
The Variable In-flight Simulator Aircraft (VISTA) flies in the skies over Edwards Air Force Base, California, shortly after receiving its new paint scheme in early 2019. The aircraft was redesignated from NF-16D to the X-62A, June 14, 2021. F-16 AI agents developed under DARPA’s Air Combat Evolution (ACE) program controlled the X-62A during test flights over Edwards AFB, California, in December 2022. (Air Force photo by Christian Turner)
Artificial intelligence agents have demonstrated their ability to control a modified F-16 fighter jet during an initial round of test flights in California as the Defense Advanced Research Projects Agency moves forward with its Air Combat Evolution program, according to DARPA.
The ACE project aims to advance the Pentagon’s autonomous systems capabilities as the U.S. military pursues robotic wingmen and other drones. Industry participants for the recent tests included EpiSci, PhysicsAI, Shield AI and the Johns Hopkins Applied Physics Laboratory, which put their algorithms through their paces.
“In early December 2022, ACE algorithm developers uploaded their AI software into a specially modified F-16 test aircraft known as the X-62A or VISTA (Variable In-flight Simulator Test Aircraft), at the Air Force Test Pilot School (TPS) at Edwards Air Force Base, California, and flew multiple flights over several days. The flights demonstrated that AI agents can control a full-scale fighter jet and provided invaluable live-flight data,” DARPA said in a press release Monday.
“We conducted multiple sorties [takeoffs and landings] with numerous test points performed on each sortie to test the algorithms under varying starting conditions, against various simulated adversaries, and with simulated weapons capabilities,” ACE program manager Lt. Col. Ryan “Hal” Hefron said in a statement.
“We didn’t run into any major issues but did encounter some differences compared to simulation-based results, which is to be expected when transitioning from virtual to live. This highlights the importance of not only flight testing advanced autonomous capabilities but doing so on testbeds like VISTA, which allowed us to rapidly learn lessons and iterate at a much faster rate than with other air vehicles,” he added.
AI agents had previously defeated a human F-16 pilot during a series of simulations that were part of DARPA's AlphaDogfight Trials.
DARPA did not disclose any additional information about the different results that were found in the recent flight tests compared to previous simulations.
The agency noted that a human pilot was onboard the two-seat aircraft to take over if anything went awry while the AI agents were in control during the test flights.
Although the X-62A is a modified F-16, it can also be programmed to demonstrate the flight-handling characteristics of a variety of different aircraft types. And the VISTA will support a variety of programs, according to officials.
The platform was recently upgraded with what officials are calling a System for Autonomous Control of Simulation (SACS).
“What we’ve done with investments from DARPA, with investments from the [Air Force] Research Lab is put an autonomy core, kind of a brain, on there. That’s going to allow us to actually go fly autonomy [technology] and have a person still in the aircraft to intervene if we need to,” Maj. Gen. Evan Dertien, commander of the Air Force Test Center, told reporters during a media roundtable in September at AFA’s Air, Space and Cyber conference.
The VISTA is going to be very busy with flight testing, he noted.
“The X-62 is booked solid. We have a roadmap for the next probably two or three years of all the different programs it will support. We’re also looking at efforts to try to figure out how we would actually bring up more aircraft and get autonomy engines on to accelerate this. But as far as what we’ll continue to do — that that will probably evolve based on the data of what we do. But I think increasing capacity right now is one of our [desired] things,” he said.
“Eventually, hopefully, we can get some other aircraft modified with the autonomy core engine and start accelerating the pace of testing, look at teaming tactics, and get two ships and three ships and things like that going,” he added.
Air Force Secretary Frank Kendall has said that previous progress with the ACE initiative contributed to his decision to move forward with a “collaborative combat aircraft” program. That drone project is expected to receive significant funding in the fiscal 2024 budget, although Kendall has suggested that many aspects of the program will be classified.
“We’re heading down the path to have much more capability for uncrewed aircraft,” Air Force Chief of Staff Gen. Charles “CQ” Brown said Monday at a Brookings Institution event. “When you look at one of our operational imperatives — next-generation air dominance family of systems — we’re going down the path of collaborative combat aircraft.”
Officials envision teaming those next-gen drones with manned platforms such as the F-35 and a forthcoming NGAD fighter.
“As we look into our future budgets there’s three aspects of this. There’s the platform itself, there’s the autonomy that goes with it, and then there’s how we organize, train and equip to build the organizations to go [use that technology]. And we’re trying to do all those in parallel. So we are thinking through aspects” of that, Brown said. “I think you’ll see as we start looking at our future budgets and the analysis we’re doing as part of our operational imperatives that we are committed to more uncrewed capability.”


 
  • Like
  • Fire
Reactions: 15 users

Tothemoon24

Top 20
  • Like
  • Fire
  • Thinking
Reactions: 21 users

equanimous

Norse clairvoyant shapeshifter goddess
AI agents take control of modified F-16 fighter jet
And just a refresher to your post.


 
  • Like
  • Fire
Reactions: 15 users

Tothemoon24

Top 20

Attachments

  • 57572005-CBB0-4EB3-A5CE-1DCF20D27FC7.png (1.3 MB · Views: 226)
  • 8ECDA399-7E85-4B52-98C1-6992FB7265A5.jpeg (820.5 KB · Views: 219)
  • Like
  • Fire
  • Love
Reactions: 59 users

Deadpool

hyper-efficient Ai
And we have the patents. Looking forward to your new movie too
Yes, I'm thinking of doing a doco, "The Irrelevance of von Neumann in the Modern Era".
 
  • Like
  • Fire
  • Haha
Reactions: 11 users

equanimous

Norse clairvoyant shapeshifter goddess
  • Haha
  • Like
Reactions: 8 users

Steve10

Regular
A company similar to Prophesee is iniVation with their neuromorphic vision systems.


iniVation partnered with SynSense in 2019 to develop Speck, a low-power smart vision sensor for mobile & IoT devices.


Speck™ is a fully event-driven neuromorphic vision SoC. Speck™ is able to support large-scale spiking convolutional neural networks (sCNNs) with a fully asynchronous chip architecture. Speck™ is fully configurable, with a spiking neuron capacity of 320K. Furthermore, it integrates a state-of-the-art dynamic vision sensor (DVS), enabling a fully event-driven, real-time, highly integrated solution for various dynamic visual scenes. For classical applications, Speck™ can provide intelligence about the scene at only milliwatts, with a response latency of a few ms.
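For context on why event-driven sensors like the DVS in Speck are so frugal: compute scales with the number of pixels that change, not with the full frame. A toy illustration in Python, not SynSense's or Prophesee's actual pipeline:

```python
# Toy DVS-style event generation: emit events only where brightness changed.
import numpy as np

def frame_to_events(prev, curr, threshold=0.1):
    """Return (y, x, polarity) events where |brightness change| > threshold."""
    diff = curr - prev
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(y, x, 1 if diff[y, x] > 0 else -1) for y, x in zip(ys, xs)]

rng = np.random.default_rng(2)
prev = rng.random((64, 64))
curr = prev.copy()
curr[10:14, 20:24] += 0.5   # a small moving object brightens 16 pixels
events = frame_to_events(prev, curr)
print(f"{len(events)} events vs {prev.size} pixels in a dense frame")
```

A spiking network downstream then only does work for those 16 events instead of all 4,096 pixels, which is where the milliwatt power figures come from.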

Prophesee partnered with SynSense in 2021 to develop a one chip event based smart sensing solution for low power edge AI.

Prophesee partnered with BrainChip in 2022 to optimize computer vision AI performance & efficiency.

Prophesee CEO has mentioned BrainChip is a perfect fit for their event based camera vision sensor.

Qualcomm have recently partnered with Prophesee which has been working with Snapdragon processors since 2018.

Qualcomm mentioned in their recent presentation that Prophesee event-based cameras will be launched this year; however, there was no mention of SynSense's Speck.

It's a puzzle, this one. Unless Qualcomm plans to use Prophesee's Metavision event-based sensor only with their own processor suitable for neuromorphic SNNs, if they have one.

I am intrigued because the smartphone market dominated by Qualcomm would mean big revenue for BRN if Akida IP is embedded in their chips for Prophesee's event-based cameras. It took ARM nearly 10 years from when they started to get into smartphones.
 
  • Like
  • Love
  • Fire
Reactions: 40 users

Steve10

Regular
I think they will cure blindness in the future. They will develop a micro retina-type camera to be placed in the eyeball, which will somehow connect to the brain wirelessly via Bluetooth: one small surgery on the eye to install the micro camera, and another to embed a micro neural transmitter/receiver/processor in the brain.
 
  • Like
  • Fire
  • Thinking
Reactions: 17 users

mrgds

Regular
Seems like OpenAI has changed tack for ChatGPT-4.
I know they talk of their partnership with Cerebras ................. BUT


Is it just me , or , does anybody else find themselves saying,

"hey, thats what Akida can do/make better " ................... :sneaky:

It's the newest version of ChatGPT (4).

IE ............ SPARSITY = less computational power consumption (see the sketch after this list)
............. MULTIMODAL LANGUAGE MODEL = RT's continual emphasis on multi-modalities
..............ADVANTAGE OF FASTER CHIPS OR HARDWARE = have they found a better way to reduce the computational power issues?
...............SELF-HEALING SENSORS = BrainChip's catchcry ........... making sensors smart
................ARTIFICIAL INTELLIGENCE ON EDGE DEVICES = eliminating the need for large cloud servers
.................RESOURCE-CONSTRAINED ENVIRONMENTS = low power consumption, ? 6 mths on a "AA" battery
.................BIOLOGICAL BRAINS ABLE TO LEARN = one-shot learning
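On the sparsity point above: the saving comes from skipping multiply-accumulates wherever activations are zero, the same principle BrainChip describes for Akida. A toy sketch, illustrative only; nothing here is from OpenAI or BrainChip:

```python
# Toy activation sparsity: only non-zero inputs contribute any work.
import numpy as np

def sparse_matvec(weights, activations):
    """Multiply using only the columns whose activation is non-zero."""
    active = np.nonzero(activations)[0]
    return weights[:, active] @ activations[active], len(active)

rng = np.random.default_rng(3)
w = rng.normal(size=(16, 1024))
a = np.where(rng.random(1024) < 0.05, rng.normal(size=1024), 0.0)  # ~95% zeros
y, n_active = sparse_matvec(w, a)
print(f"computed with {n_active} of 1024 inputs")
```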

Check out this video if interested




Just wondering whether Elon has taken on board my numerous emails to him ...................:unsure:

AKIDA ( G ive it P atience T ime ) BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 25 users
Top Bottom