BRN Discussion Ongoing

Iseki

Regular
This article was posted online 11 hours ago.

I believe BrainChip sits at the intersection of AI and space technology at exactly the right point in time, especially if satellites are equipped with neuromorphic vision sensors and neuromorphic processors (i.e., AKIDA) for real-time detection, tracking, and interception as part of the proposed US Iron Dome defense system.

As rocket scientist Ari Sacher states: "If you can solve that problem in outer space, then you can use it on the ground for a whole bunch of other control problems; controlling fires, controlling electric grids, controlling everything… That's the secret: control."

If you click on the article linked below, you can check out the video with Ari Sacher's interview. At 3:21 he says: "I worked together with Raytheon. I was actually the manager of Raytheon from our company in a project called David's Sling. And Raytheon did a whole bunch of meaningful stuff. They designed the computer. They designed a whole bunch of other stuff. They have a tremendous cadre of scientists there. That was about 10 years ago. I don't believe anything has changed. The United States has some amazing scientists, and if the government decides to fund this to how much is necessary, then you guys are going to blow everything out of the water. I have absolutely no suspicion otherwise."

His company is Rafael Advanced Defense Systems which I posted about previously. Raytheon and Rafael Advanced Defense Systems have collaborated on multiple defense projects, particularly in missile and air defense systems.











Published February 16, 2025 7:00am EST

US Iron Dome needs to be 'far more complex' to deal with 'near-peer threats,' expert says

By Agustin Hays, FOX Business

US needs something 'far more complex' than Iron Dome, rocket scientist says

Rocket scientist Ari Sacher explains the capabilities of the Iron Dome missile defense system, why the U.S. needs something more complex and comments on the arms sale to Israel.
President Donald Trump is seeking to bolster the defense of the American homeland with a U.S.-style Iron Dome missile system. However, one expert believes that a system similar to Israel's is "not needed."

"So let me tell you at the outset, the president is using the term ‘Iron Dome’ as a metaphor," rocket scientist Ari Sacher said during an interview on FOX Business' "Mornings with Maria" Monday. "It's perfect for defending Israel from Gaza, Lebanon, it is not something that the United States needs very much."
In President Trump's first few weeks in office, he signed a slew of executive orders, with one focused on the construction of an American Iron Dome. The order addressed the need for the implementation of a next-generation missile defense shield to protect the homeland "against ballistic, hypersonic, advanced cruise missiles, and other next-generation aerial attacks," as well as to "further the goals of peace through strength."
Sacher explained that when it comes to missile defense, the U.S. needs a more extensive system than Israel's to grapple with distant adversaries.
Rocket scientist weighs in on Trump Iron Dome executive order

Rocket scientist Ari Sacher says an American Iron Dome should be different from Israel's system. (Getty Images)

"To defend the U.S. homeland, as the president wants to do, you need something completely different," he said. "You're defending against rockets not launched from Canada or Mexico… you're defending against rockets that are launched from North Korea, from China, from Russia, potentially, and you need something far more complex than [an] Iron Dome to shoot it down."
The rocket scientist, who has expertise in missile defense, further detailed how the system could look under President Trump.
"What the president is looking at is something that probably would be called space-based intercept. You bring up a whole bunch of interceptors into outer space, and the whole intercept will take place in outer space. So if you want to call it ‘Iron Dome’ or you want to call it ‘Fred,’ doesn't make a difference, it's not [an] Iron Dome."


Stuart Varney: Trump's 'Iron Dome' dream provoked a typical media response

'Varney & Co.' host Stuart Varney discusses President Donald Trump's plan for an American 'Iron Dome.'
However achieved, Sacher believes that the American Iron Dome's chances of success are "excellent," and that "the U.S. has a tremendous amount of engineers and gumption." The expert also pointed out the threats that U.S. missile defense could address with the more complex shield compared to that of the Israeli system.
"We're talking about Korea and points west, China's even farther. That's the threats America has to look at, our near-peer threats."
He continued, comparing those threats to those of the Middle East.
"Things like Gaza and Hezbollah, that's just too small," he said. "That's a minor league United States of America."
Sacher also revealed the key challenge when it comes to missile defense systems.

Trump’s ‘Iron Dome’ plan is being ‘overlooked’ by the media, expert says

8VC managing partner Joe Lonsdale joins ‘Mornings with Maria’ to provide analysis of President Donald Trump’s eye-opening plan to build an American ‘Iron Dome.’
"There's a whole new slew of technologies that are needed to do this sort of thing. [The] most difficult one is, believe it or not, not the interceptor, it's not the launcher. The most difficult thing is [not even] getting it into outer space. The most difficult thing is controlling everything," he stressed.
He broke down the different elements one needs to be aware of while operating the Iron Dome.

"It's understanding what we call sky picture," Sacher stressed. "You got to know when you're shooting an Iron Dome. You got to know who's firing on you, how many, which is a good guy, which is a bad guy. 'What's that 777 landing at the airport? Can't shoot that down.' Imagine doing all of that in outer space. And there's so much more to take care of and there's so much more that could go wrong, and you have to take account of all these things."

Israel's Iron Dome is not 100%, must get better with drones: Yuval Steinitz

Former Israel finance minister Yuval Steinitz discusses the technology behind the Iron Dome system as Hamas missile attacks continue on 'Cavuto: Coast to Coast.'
Emphasizing the importance of control, Sacher said that once the situation is resolved in space, the system can be applied for use on Earth.
"If you can solve that problem in outer space, then you can use it on the ground for a whole bunch of other control problems; controlling fires, controlling electric grids, controlling everything… That's the secret: control."


And yet RTX hasn't signed their subcontractor agreement to make a start on trialling akida, even though AFRL will pay them to give it a go. Have they lost it?
 
  • Like
  • Thinking
Reactions: 2 users

uiux

Regular
And yet RTX hasn't signed their subcontractor agreement to make a start on trialling akida, even though AFRL will pay them to give it a go. Have they lost it?

RTX are to be paid to trial Akida?



Where is that information from?
 
  • Like
Reactions: 17 users

MrNick

Regular
[Image: the dome from The Simpsons Movie]
 
  • Haha
Reactions: 12 users

TECH

Regular
And yet RTX hasn't signed their subcontractor agreement to make a start on trialling akida, even though AFRL will pay them to give it a go. Have they lost it?

Honestly Eye Sea Key,

Where do you come up with this information?

You are in fact correct in saying that AFRL are (in effect) paying a fee as part of the parent contract, which BrainChip won for US$1.8 million... none of us really knows whether it was BrainChip who negotiated the third-party fee of US$800,000, or whether it was imposed on BrainChip by AFRL as part of the parent contract.

My personal opinion is that AFRL told BrainChip who the third party was going to be, to sew the entire deal together. But as I have mentioned before, the fact that BrainChip released a statement saying that we (BrainChip) were committed to paying a third party US$800,000... well, am I the only one on this forum who thinks the terms had already been agreed to? Can you disclose a figure when the deal is still up in the air?

Your comment that the third party is RTX... that's pure speculation. Which division are you referring to, Eye Sea Key?

Collins Aerospace?

Just the opinion of a rusted-on shareholder... Tech.
 
  • Like
  • Thinking
Reactions: 17 users

uiux

Regular
Honestly Eye Sea Key,

Where do you come up with this information?

You are in fact correct in saying that AFRL are (in effect) paying a fee as part of the parent contract, which BrainChip won for US$1.8 million... none of us really knows whether it was BrainChip who negotiated the third-party fee of US$800,000, or whether it was imposed on BrainChip by AFRL as part of the parent contract.

My personal opinion is that AFRL told BrainChip who the third party was going to be, to sew the entire deal together. But as I have mentioned before, the fact that BrainChip released a statement saying that we (BrainChip) were committed to paying a third party US$800,000... well, am I the only one on this forum who thinks the terms had already been agreed to? Can you disclose a figure when the deal is still up in the air?

Your comment that the third party is RTX... that's pure speculation. Which division are you referring to, Eye Sea Key?

Collins Aerospace?

Just the opinion of a rusted-on shareholder... Tech.

It reads like Iseki is just making shit up so he can complain about it
 
  • Like
  • Haha
  • Fire
Reactions: 49 users

Iseki

Regular
Honestly Eye Sea Key,

Where do you come up with this information?

You are in fact correct in saying that AFRL are (in effect) paying a fee as part of the parent contract, which BrainChip won for US$1.8 million... none of us really knows whether it was BrainChip who negotiated the third-party fee of US$800,000, or whether it was imposed on BrainChip by AFRL as part of the parent contract.

My personal opinion is that AFRL told BrainChip who the third party was going to be, to sew the entire deal together. But as I have mentioned before, the fact that BrainChip released a statement saying that we (BrainChip) were committed to paying a third party US$800,000... well, am I the only one on this forum who thinks the terms had already been agreed to? Can you disclose a figure when the deal is still up in the air?

Your comment that the third party is RTX... that's pure speculation. Which division are you referring to, Eye Sea Key?

Collins Aerospace?

Just the opinion of a rusted-on shareholder... Tech.
Well, the award went to BRN because some fabulously well-heeled defense supplier had already done some work on neuromorphic analysis of radar-Doppler signals, and AFRL wants to see if it can be done in hardware. So everyone in the defense industry knows who it is.

Moreover, the $1.8 million is chicken feed, and I'm presuming at this stage RTX is refusing to sign because it is such a measly amount they quite frankly couldn't give a damn. (Quite possibly they asked for $18 million.) They'd probably just take that $800K off the next party bar tab and not have to do all that pesky grant acquittal stuff.

So, in answer to your question, I reckon it would be the RTX petty-cash division.

TBH, I have never been so humiliated as a shareholder, at least not since the NEUROBUS debacle, which has also ground to a halt with nothing new reported in a while. Let's face it: where are the photos of Sean throwing his hat in the air as images of a satellite actually working are beamed to Parkes? That is some of my evidence.
 
  • Like
  • Fire
Reactions: 4 users

uiux

Regular
Well, the award went to BRN because some fabulously well-heeled defense supplier had already done some work on neuromorphic analysis of radar-Doppler signals, and AFRL wants to see if it can be done in hardware. So everyone in the defense industry knows who it is.

Moreover, the $1.8 million is chicken feed, and I'm presuming at this stage RTX is refusing to sign because it is such a measly amount they quite frankly couldn't give a damn. (Quite possibly they asked for $18 million.) They'd probably just take that $800K off the next party bar tab and not have to do all that pesky grant acquittal stuff.

So, in answer to your question, I reckon it would be the RTX petty-cash division.

TBH, I have never been so humiliated as a shareholder, at least not since the NEUROBUS debacle, which has also ground to a halt with nothing new reported in a while. Let's face it: where are the photos of Sean throwing his hat in the air as images of a satellite actually working are beamed to Parkes? That is some of my evidence.



So now you are just making more shit up
 
  • Haha
  • Like
  • Fire
Reactions: 26 users

Iseki

Regular
It reads like Iseki is just making shit up so he can complain about it
TBH, you couldn't make this stuff up.

Win a grant, and the co-applicant won't sign for LOVE NOR MONEY.

Nowadays, I imagine our founder, Peter, sitting in one of those antique telephone booths (half desk, half uncomfortable chair, with a small compartment for the phone), talking with Sean about "shooting to the stars", while on the other end of the line, Sean is mostly focused on a natty new NVIDIA device that analyzes his golf swing and that can, at the flick of a switch, run highlights of the hit show SUCCESSION, which he will show at the upcoming board meeting.

Is it any wonder I'm complaining?!
 
  • Fire
Reactions: 1 users

uiux

Regular
TBH, you couldn't make this stuff up.

Win a grant, and the co-applicant won't sign for LOVE NOR MONEY.

Nowadays, I imagine our founder, Peter, sitting in one of those antique telephone booths (half desk, half uncomfortable chair, with a small compartment for the phone), talking with Sean about "shooting to the stars", while on the other end of the line, Sean is mostly focused on a natty new NVIDIA device that analyzes his golf swing and that can, at the flick of a switch, run highlights of the hit show SUCCESSION, which he will show at the upcoming board meeting.

Is it any wonder I'm complaining?!

How do you know if they have or haven't signed?
What are they exactly signing?
Who is the contractor?
Does it include an IP deal?
What instructions came from AFRL?
What radar device is it?
What does the 800k entail?


Spare me the fan fiction, Iseki - you have zero idea what you should even complain about regarding the AFRL grant. You don't even know if it relates to RTX as the contractor.


I'd be more humiliated by typing the shit you have in the last few posts than by anything else.
 
  • Like
  • Haha
  • Fire
Reactions: 51 users

Iseki

Regular
So now you are just making more shit up
The time is nigh to decide if your loyalty is to the founders' invention, or (exclusively) to someone given the job of commercializing that invention.
 
  • Fire
Reactions: 1 users

Iseki

Regular
How do you know if they have or haven't signed?
What are they exactly signing?
Who is the contractor?
Does it include an IP deal?
What instructions came from AFRL?
What radar device is it?
What does the 800k entail?


Spare me the fan fiction, Iseki - you have zero idea what you should even complain about regarding the AFRL grant. You don't even know if it relates to RTX as the contractor.


I'd be more humiliated by typing the shit you have in the last few posts than by anything else.
Uiux, I'm not going to argue with you.

You have done more for BrainChip than Stevens and Telson combined, though possibly not as much as Gelsinger, who has sent Intel broke.

Time will tell if the AFRL deal is a right royal stuff-up, or a stroke of genius.
 
Well, the award went to BRN because some fabulously well-heeled defense supplier had already done some work on neuromorphic analysis of radar-Doppler signals, and AFRL wants to see if it can be done in hardware. So everyone in the defense industry knows who it is.

Moreover, the $1.8 million is chicken feed, and I'm presuming at this stage RTX is refusing to sign because it is such a measly amount they quite frankly couldn't give a damn. (Quite possibly they asked for $18 million.) They'd probably just take that $800K off the next party bar tab and not have to do all that pesky grant acquittal stuff.

So, in answer to your question, I reckon it would be the RTX petty-cash division.

TBH, I have never been so humiliated as a shareholder, at least not since the NEUROBUS debacle, which has also ground to a halt with nothing new reported in a while. Let's face it: where are the photos of Sean throwing his hat in the air as images of a satellite actually working are beamed to Parkes? That is some of my evidence.
Well sell up and

 
  • Like
  • Haha
  • Fire
Reactions: 12 users

manny100

Top 20
One thing that seems certain is that the Navy is transitioning to AI at the Edge. See the previously posted Bascom Hunter/Navy transition posts.
It's likely US defense forces and security will transition also. Space as well.
I very much doubt RTX and Lockheed Martin will allow Bascom to eat the whole cake.
Did I hear Sean say during the podcast something about Bascom heading for great things???
 
  • Like
  • Love
Reactions: 15 users
@Dolci so glad I listened to you and bought when I did, but I wonder how many are still waiting for 0.15c

 
  • Like
  • Haha
  • Fire
Reactions: 10 users

uiux

Regular
One thing that seems certain is that the Navy is transitioning to AI at the Edge. See the previously posted Bascom Hunter/Navy transition posts.
It's likely US defense forces and security will transition also. Space as well.
I very much doubt RTX and Lockheed Martin will allow Bascom to eat the whole cake.
Did I hear Sean say during the podcast something about Bascom heading for great things???

The navy is transitioning to Terminator robots with neuromorphic processors:




FY 2024 Base Plans:

Cognitive Science for Human-Machine Teaming and Computational Neuroscience


Continue:
- Assess feasibility of incorporating realistic neural systems into autonomous systems for more robust on-board perception and intelligence.
- Conducting applied research on system interface designs and human-machine interaction methodologies that enable or enhance Naval Warfighter performance and human-machine teaming.
- Conduct applied research to develop agile humanoid robot teammates with enhancements including: (i) Embedding computer vision with visual-spatial reasoning; (ii) Auditory systems to enable human communication; and (iii) Neuromorphic (brain-like) processors.
- Conduct applied research to train mission-capable robots to perform complex manipulation tasks, integrated with the ability to recognize patterns and learn from data (self-learning).
- Investigate the effectiveness of incorporating vision and language processes in robots to facilitate human-robot team performance learning and communication
 
  • Like
  • Fire
  • Wow
Reactions: 37 users

manny100

Top 20
The navy is transitioning to Terminator robots with neuromorphic processors:




FY 2024 Base Plans:

Cognitive Science for Human-Machine Teaming and Computational Neuroscience


Continue:
- Assess feasibility of incorporating realistic neural systems into autonomous systems for more robust on-board perception and intelligence.
- Conducting applied research on system interface designs and human-machine interaction methodologies that enable or enhance Naval Warfighter performance and human-machine teaming.
- Conduct applied research to develop agile humanoid robot teammates with enhancements including: (i) Embedding computer vision with visual-spatial reasoning; (ii) Auditory systems to enable human communication; and (iii) Neuromorphic (brain-like) processors.
- Conduct applied research to train mission-capable robots to perform complex manipulation tasks, integrated with the ability to recognize patterns and learn from data (self-learning).
- Investigate the effectiveness of incorporating vision and language processes in robots to facilitate human-robot team performance learning and communication
I clicked on the link, searched for "neuromorphic", and it came up with eight references, a few of which I have posted below.
It appears the Navy is pretty much "all in". I have included a few below, but it's best for posters to look it up themselves.
No way is the Navy alone in the transition.
Also, page 32 of 77:
"- Continue to explore neuromorphic spiking neuron hardware designs based on brain models that are suitable for future edge computing and signal processing in small naval platforms.
- Continue to explore autonomous problem solving and curiosity-driven search for robust performance under unexpected conditions.
- Initiate research to identify, characterize and model adversarial AI.
- Initiate research exploring theory and algorithms for learning and decision making in multi-agent systems, particularly in adversarial situations."

Also, page 9 of 17:
"Science of Artificial Intelligence:
- Continue applied research on principled computational frameworks for integrating domain knowledge and machine learning for fast, robust learning of diverse, complex concepts and tasks with minimal supervision to analyze the sparse, noisy and unlabeled data of the Naval domain.
- Continue the application of new brain-inspired artificial intelligence algorithms and architectures for the development of compact neuromorphic hardware suitable for edge computing and signal processing in Naval platforms.
- Continue the use of Artificial Intelligence (AI) for enhanced collaborative complex decision-making and human-machine dialogue to increase the speed and quality of operational decisions.
- Continue research on embedding AI in robotic systems to enable human-machine collaboration and robot training for hazardous missions.
- Continue to integrate physical models with machine learning to enable predictive maintenance for autonomous Naval platforms and enable long-duration autonomous missions.
- Continue to conduct AI-based analysis of data from wearable sensors and task performance measures to monitor and optimize human performance.
- Continue research on the ability to enable a humanoid robot to adapt skills learned in one environment or context to new situations. Cues of the current context, including the environmental state or goals of the robot or its teammates, will modulate the execution of existing robotic skills, such as adjusting the robot's speed."

Also, page 10 of 17:
"- Initiate applied research to design embedded neuromorphic processors into intelligent autonomous systems to permit onboard analysis of target data to enable single-pass mine countermeasures missions.
- Initiate applied research to validate AI algorithms to provide distributed perception in networks of interacting autonomous agents in the presence of varying levels of reliability and trust at both the network and individual agent level.
- Initiate applied research on AI tools for multi-level optimization of shipyard maintenance scheduling to accelerate on-time delivery of ships out of maintenance and improve ship availability and fleet readiness.
- Initiate research techniques for training AI to perform tasks from human behavior and natural language instruction."
My bold above.
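
As an aside for readers unfamiliar with the term, the "neuromorphic spiking neuron hardware designs" mentioned in the page 32 excerpt refer to circuits built around spiking neuron models. Below is a minimal leaky integrate-and-fire neuron in Python, a textbook toy model for illustration only; it is not Akida's neuron model, it is not taken from any of the Navy documents, and all constants are invented.

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_rest=0.0, v_reset=0.0, v_threshold=1.0):
    """Leaky integrate-and-fire: the membrane potential leaks toward v_rest,
    integrates the input current, and emits a spike (1) whenever it crosses
    v_threshold, after which it is reset. Returns the spike train and the
    membrane trace. All parameters are illustrative defaults, not hardware values."""
    v = v_rest
    spikes, trace = [], []
    for i_t in input_current:
        # Euler step of  tau * dV/dt = -(V - v_rest) + I
        v += (dt / tau) * (-(v - v_rest) + i_t)
        if v >= v_threshold:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
        trace.append(v)
    return np.array(spikes), np.array(trace)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    current = 1.2 + 0.3 * rng.standard_normal(1000)   # noisy constant drive (arbitrary units)
    spikes, _ = lif_neuron(current)
    print("spikes emitted over 1 s of simulated time:", int(spikes.sum()))
```

The point of the model is simply that information is carried by sparse spike events rather than dense frame-by-frame values, which is why these designs are attractive for low-power edge processing.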
 
  • Like
  • Fire
  • Wow
Reactions: 37 users

cosors

👀
  • Like
  • Wow
  • Thinking
Reactions: 14 users

Taproot

Regular
The navy is transitioning to Terminator robots with neuromorphic processors:




FY 2024 Base Plans:

Cognitive Science for Human-Machine Teaming and Computational Neuroscience


Continue:
- Assess feasibility of incorporating realistic neural systems into autonomous systems for more robust on-board perception and intelligence.
- Conducting applied research on system interface designs and human-machine interaction methodologies that enable or enhance Naval Warfighter performance and human-machine teaming.
- Conduct applied research to develop agile humanoid robot teammates with enhancements including: (i) Embedding computer vision with visual-spatial reasoning; (ii) Auditory systems to enable human communication; and (iii) Neuromorphic (brain-like) processors.
- Conduct applied research to train mission-capable robots to perform complex manipulation tasks, integrated with the ability to recognize patterns and learn from data (self-learning).
- Investigate the effectiveness of incorporating vision and language processes in robots to facilitate human-robot team performance learning and communication
I reckon you would really enjoy this interview (if you haven't already seen it).
Palmer Luckey (Anduril founder), 32 years old.
Just took over the $22 billion IVAS contract from Microsoft four days ago.
Started Oculus when he was 19.
After watching this, I'm just praying that he's aware of BrainChip.
Would be nice to have this bloke on side.

 
  • Like
  • Wow
  • Fire
Reactions: 12 users

uiux

Regular
I reckon you would really enjoy this interview (if you haven't already seen it).
Palmer Luckey (Anduril founder), 32 years old.
Just took over the $22 billion IVAS contract from Microsoft four days ago.
Started Oculus when he was 19.
After watching this, I'm just praying that he's aware of BrainChip.
Would be nice to have this bloke on side.





US20230007162 - EVENT-BASED COMPUTATIONAL PIXEL IMAGERS

Applicants
Anduril Industries, Inc.

A computational pixel imaging device that includes an array of pixel integrated circuits for event-based detection and imaging. Each pixel may include a digital counter that accumulates a digital number, which indicates whether a change is detected by the pixel. The counter may count in one direction for a portion of an exposure and count in an opposite direction for another portion of the exposure. The imaging device may be configured to collect and transmit key frames at a lower rate, and collect and transmit delta or event frames at a higher rate. The key frames may include a full image of a scene, captured by the pixel array. The delta frames may include sparse data, captured by pixels that have detected meaningful changes in received light intensity. High speed, low transmission bandwidth motion image video can be reconstructed using the key frames and the delta frames.



From GPR:

The patent US20230007162 - Event-Based Computational Pixel Imagers describes an imaging system that integrates in-pixel signal processing, event-based detection, and multi-threaded computation. Unlike traditional CMOS or CCD image sensors that capture full frames at fixed intervals, this system detects and processes changes in light intensity at the pixel level. Each pixel contains a counter that increments and decrements based on detected intensity changes, allowing it to identify significant events while filtering out noise. The system employs a dual-frame acquisition strategy, capturing key frames at a lower frequency and delta frames at a higher frequency to efficiently track motion. Additionally, it incorporates infinite dynamic range counters, which utilize most significant bit (MSB) readout techniques to extend dynamic range without increasing counter bit-depth. This reduces data transmission requirements and enhances computational efficiency by prioritizing meaningful image data.


The architecture supports distributed in-pixel computation, including local filtering, thresholding, and adaptive exposure adjustments. Pixels can share data through orthogonal transfer mechanisms, enabling on-chip signal processing functions such as edge detection, temporal filtering, and convolution. The design also incorporates multi-threaded execution within the pixel array, allowing different processing tasks to run concurrently. The system's event-driven nature reduces power consumption and bandwidth usage by transmitting only data from pixels that detect significant intensity changes. These features align with neuromorphic imaging principles, as the system mimics biological vision by emphasizing changes in a scene rather than static information. However, the patent does not explicitly reference neuromorphic computing or spiking neural networks, instead focusing on the efficiency of event-based imaging and in-pixel digital processing.
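
For readers trying to picture how the key-frame/delta-frame scheme described above would behave, here is a minimal Python sketch of the general idea (my own illustration under assumptions, not code from the patent or from Anduril; the patent describes in-pixel hardware counters, whereas this is a software analogy, and the threshold, frame rate and array sizes below are invented for the example): full key frames are kept at a low rate, per-pixel signed changes are thresholded into sparse delta frames at a high rate, and the video is reconstructed by accumulating deltas onto the last key frame.

```python
import numpy as np

THRESHOLD = 8         # illustrative change threshold; the patent does not specify a value
KEY_FRAME_EVERY = 10  # emit a full key frame every N captures (made-up rate)

def encode(frames, threshold=THRESHOLD, key_every=KEY_FRAME_EVERY):
    """Turn a sequence of full frames into (kind, payload) pairs:
    'key'   -> a full image, stored at the low rate;
    'delta' -> a sparse list of (row, col, signed_change) for pixels whose
               accumulated change since their last transmitted value exceeds the threshold."""
    stream, reference = [], None
    for i, frame in enumerate(frames):
        frame = frame.astype(np.int32)
        if reference is None or i % key_every == 0:
            reference = frame.copy()
            stream.append(("key", frame.copy()))
            continue
        diff = frame - reference                      # signed change, like an up/down counter
        rows, cols = np.nonzero(np.abs(diff) >= threshold)
        events = [(int(r), int(c), int(diff[r, c])) for r, c in zip(rows, cols)]
        for r, c, d in events:                        # only reported pixels update the reference
            reference[r, c] += d
        stream.append(("delta", events))
    return stream

def decode(stream):
    """Rebuild full frames from key frames plus sparse delta frames."""
    frames, current = [], None
    for kind, payload in stream:
        if kind == "key":
            current = payload.copy()
        else:
            current = current.copy()
            for r, c, d in payload:
                current[r, c] += d
        frames.append(current)
    return frames

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.integers(0, 256, size=(32, 32))
    video = [base + (t * 3 if t % 4 else 0) * np.eye(32, dtype=int) for t in range(20)]
    recon = decode(encode(video))
    print("max reconstruction error:", max(int(np.abs(a - b).max()) for a, b in zip(video, recon)))
```

Because only the thresholded, sparse events are transmitted between key frames, the reconstruction error stays below the threshold while the data rate scales with scene activity rather than frame size, which is the bandwidth argument the abstract makes.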
 
  • Like
  • Fire
  • Love
Reactions: 20 users

FuzM

Member

US20230007162 - EVENT-BASED COMPUTATIONAL PIXEL IMAGERS

Applicants
Anduril Industries, Inc.

A computational pixel imaging device that includes an array of pixel integrated circuits for event-based detection and imaging. Each pixel may include a digital counter that accumulates a digital number, which indicates whether a change is detected by the pixel. The counter may count in one direction for a portion of an exposure and count in an opposite direction for another portion of the exposure. The imaging device may be configured to collect and transmit key frames at a lower rate, and collect and transmit delta or event frames at a higher rate. The key frames may include a full image of a scene, captured by the pixel array. The delta frames may include sparse data, captured by pixels that have detected meaningful changes in received light intensity. High speed, low transmission bandwidth motion image video can be reconstructed using the key frames and the delta frames.


From GPR:

The patent US20230007162 - Event-Based Computational Pixel Imagers describes an imaging system that integrates in-pixel signal processing, event-based detection, and multi-threaded computation. Unlike traditional CMOS or CCD image sensors that capture full frames at fixed intervals, this system detects and processes changes in light intensity at the pixel level. Each pixel contains a counter that increments and decrements based on detected intensity changes, allowing it to identify significant events while filtering out noise. The system employs a dual-frame acquisition strategy, capturing key frames at a lower frequency and delta frames at a higher frequency to efficiently track motion. Additionally, it incorporates infinite dynamic range counters, which utilize most significant bit (MSB) readout techniques to extend dynamic range without increasing counter bit-depth. This reduces data transmission requirements and enhances computational efficiency by prioritizing meaningful image data.


The architecture supports distributed in-pixel computation, including local filtering, thresholding, and adaptive exposure adjustments. Pixels can share data through orthogonal transfer mechanisms, enabling on-chip signal processing functions such as edge detection, temporal filtering, and convolution. The design also incorporates multi-threaded execution within the pixel array, allowing different processing tasks to run concurrently. The system's event-driven nature reduces power consumption and bandwidth usage by transmitting only data from pixels that detect significant intensity changes. These features align with neuromorphic imaging principles, as the system mimics biological vision by emphasizing changes in a scene rather than static information. However, the patent does not explicitly reference neuromorphic computing or spiking neural networks, instead focusing on the efficiency of event-based imaging and in-pixel digital processing.

Any chance "TENNs-PLEIADES: Building Temporal Kernels with Orthogonal Polynomials" be in play here?
 

Attachments

  • TENNs PLEIADES.pdf
    632 KB · Views: 77
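
For anyone wondering what "building temporal kernels with orthogonal polynomials" refers to, here is a rough Python sketch of the general idea (my own illustration, not code from the TENNs-PLEIADES paper; the paper's actual basis functions, normalisation and training setup may differ): a long temporal convolution kernel is expressed as a weighted sum of a few orthogonal polynomials (Chebyshev here), so only a handful of coefficients per kernel need to be learned rather than every individual tap.

```python
import numpy as np

def chebyshev_basis(num_polys, kernel_len):
    """Chebyshev polynomials T_0..T_{num_polys-1} sampled on [-1, 1].
    Rows are basis functions, columns are time steps."""
    t = np.linspace(-1.0, 1.0, kernel_len)
    basis = np.zeros((num_polys, kernel_len))
    basis[0] = 1.0
    if num_polys > 1:
        basis[1] = t
    for n in range(2, num_polys):
        basis[n] = 2.0 * t * basis[n - 1] - basis[n - 2]   # recurrence T_n = 2t*T_{n-1} - T_{n-2}
    return basis

def temporal_kernel(coeffs, kernel_len):
    """Build a length-`kernel_len` temporal kernel from polynomial coefficients.
    In a network, `coeffs` would be the learned parameters."""
    basis = chebyshev_basis(len(coeffs), kernel_len)
    return coeffs @ basis              # weighted sum of basis functions

def causal_temporal_conv(signal, kernel):
    """Causal 1-D convolution of a signal with the generated kernel."""
    pad = np.concatenate([np.zeros(len(kernel) - 1), signal])
    return np.array([pad[i:i + len(kernel)] @ kernel[::-1] for i in range(len(signal))])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    coeffs = rng.normal(size=4)        # 4 coefficients stand in for a 64-tap kernel
    kernel = temporal_kernel(coeffs, kernel_len=64)
    x = rng.normal(size=256)           # dummy input stream
    y = causal_temporal_conv(x, kernel)
    print(kernel.shape, y.shape)       # (64,) (256,)
```

Whether this technique is actually "in play" with the event-based imager patent quoted above is exactly the open question in the post; the sketch is only meant to show what the paper's title is describing.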
  • Like
Reactions: 2 users