7für7
Top 20
-10% in germanistan… what's up there?

Don't worry, today is the day! It is down 3.8% already; more is to come for sure.
Our share price....
A company listed on several exchanges has to have the same value on all of them.

-10% in germanistan… what's up there?
Evening Tothemoon24,

ITL ventures into neuromorphic computing
By Megan Saxton
U.S. ARMY ENGINEER RESEARCH AND DEVELOPMENT CENTER
Published Sept. 23, 2024
Updated: Sept. 23, 2024
In recent years, edge computing has revolutionized the technology landscape for users situated in remote areas or away from primary devices. By bringing computation and data storage closer to the location where it is needed, response times, reliability and performance are greatly improved, latency and bandwidth costs are reduced and privacy and security are enhanced. The U.S. Army Engineer Research and Development Center (ERDC) Information Technology Laboratory (ITL) Edge Computing Lab has long been on the cutting-edge of this field and is now exploring something new: neuromorphic computing.
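The latency argument above can be sketched with a toy round-trip model. All timing figures below are illustrative assumptions, not measurements from ERDC's Edge Computing Lab:

```python
# Toy latency model contrasting cloud vs. edge inference for a remote
# sensor. Parameter values are made up for illustration only.

def round_trip_ms(network_ms_one_way, compute_ms):
    """Total response time: transmit request, compute, return result."""
    return 2 * network_ms_one_way + compute_ms

# Cloud: fast datacenter compute, but a long backhaul link.
cloud = round_trip_ms(network_ms_one_way=120, compute_ms=5)
# Edge: slower local hardware, but the data barely travels.
edge = round_trip_ms(network_ms_one_way=2, compute_ms=30)

print(cloud, edge)  # 245 34 -- edge wins despite slower compute
```

The point of the sketch: once the network leg dominates, moving computation next to the sensor reduces response time even when the edge device computes more slowly.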
“Neuromorphic computing is a process in which computers are designed and engineered to mirror the structure and function of the human brain,” said Dr. Raju Namburu, ITL chief technology officer and a senior scientific technical manager. “Using artificial neurons and synapses, neuromorphic computers simulate the way our brains process information, allowing them to solve problems, recognize patterns and make decisions more quickly and efficiently than the traditional high-performance computing systems we use today.”
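The "artificial neurons and synapses" Namburu describes can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron. This is a toy sketch of the general concept, not BrainChip's or ERDC's actual implementation; all parameter values are assumptions:

```python
# Minimal leaky integrate-and-fire (LIF) neuron -- a toy illustration
# of how neuromorphic systems compute with sparse spike events rather
# than dense numeric activations. Threshold and leak are illustrative.

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Integrate input over time with leakage; emit a spike (1) and
    reset whenever the membrane potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for i in input_current:
        potential = potential * leak + i   # leaky integration
        if potential >= threshold:
            spikes.append(1)               # fire
            potential = 0.0                # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Constant drive of 0.3 per step: charge accumulates, leaks, and the
# neuron fires only occasionally -- work happens at spike events.
print(lif_neuron([0.3] * 10))  # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The sparsity is the efficiency story: downstream neurons only do work when a spike arrives, which is why neuromorphic hardware can run at very low power compared with conventional accelerators.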
The driving force behind ITL's research into this emerging technology is the U.S. military's need to know more, sooner, to allow rapid, decisive action on the multi-domain battlefield. The battlespace has become characterized by highly distributed processing; heterogeneous and mobile assets with limited battery life; communications-dominated but restricted network capacity; and time-critical operations in a rapidly changing hostile environment. Distributed, low-power edge processing is one of the essential technologies for maintaining overmatch in emerging operational and contested environments, as is the ability to take advantage of machine learning (ML) and generative artificial intelligence (AI).
“Overall, neuromorphic chips offer the DoD community a number of potential benefits including improved performance, resilience, cost-efficiency, security, privacy, power-efficiency, signal processing, ML capabilities and more,” said Dr. Ruth Cheng, a computer scientist in ITL’s Supercomputing Research Center. “By keeping an eye on developments in this technology, the DoD community can ensure it remains at the forefront of military and defense innovation.”
“Computations performed at the molecular, atomic, and neuro scales mimicking the human brain are showing tremendous viability,” added Namburu. “We just started this work on next generation advanced computing, which is significantly different from traditional computing systems historically used at ERDC. Neuromorphic computing represents a paradigm shift in computing, promising significant advancements in ML, generative AI, scientific applications and sensor processing compared to traditional computing. Moreover, neuromorphic chips emulate the brain's plasticity, enabling learning and adaptation over time, unlike traditional systems.”
Ongoing edge computing efforts include agnostic graphics processing unit (GPU) ray-tracing development, benchmarking deep neural networks, sensor-data management, ML for underwater invasive plants, railcar inspection, photogrammetry, reservoir frameworks, decentralized edge computing, bi-directional digital twins and algorithms for anomaly detection. ITL is also exploring emerging AI chips for edge computing, including novel algorithms and sustainable software.
“Overall, edge computing is helping to enable new use cases and provide better experiences to the users by making applications faster, more reliable and more secure,” said Cheng. “Neuromorphic chips are well-suited for edge computing, which is becoming increasingly important in military and defense applications, and ITL is already aiding in this process that will touch everything from lowering the cost of deployments by eliminating the need for expensive, high-powered servers and data centers to support of mobile and autonomous systems. This is the future.”
I met Bi-twins once. Only problem for me was they were a pigeon pair. Bolted real quick.

Evening Tothemoon24,
Good article.
Christ the yanks are slow to get their shite together, now they are talking about bi-directional digital twins.... kinky buggers.
Regards,
Esq.
Sean, in an interview/presentation around a year ago, said that if you buy a camera powered by Prophesee, you want to know it's got AKIDA in it.

I really don't know.
We do know that Sony and Prophesee went with SynSense for the lo-fi version (380*380 pixels) - the apocryphal low-hanging fruit.
From your June 2022 link:
“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”
“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee."
Akida IP includes more than the tape-out specs. It would also include, e.g., the copyright-protected software.
To the uninitiated, Anil's comments imply that the Prophesee data was applied to the Akida 1 SoC, whereas Luca's comments can be interpreted as encompassing the incorporation of Akida software, e.g., TeNNs simulation software, into the Prophesee Metavision software.
The patent application for TeNNs was filed three days after the article was published, so clearly BRN had been testing it beforehand as software. I think it is probable that TeNNs was used in tests on the Prophesee data, and that combining the Akida-based software IP with Metavision would have been the only available means of testing the Prophesee data against Akida 2, as we still have not seen the SoC. So it is not outside the bounds of possibility that TeNNs/Akida 2 software has been combined with Metavision.
This Monday awaits .... 30th September 2024 ... the last day in September and the end of this quarterly period. I would like to see some sort of actual ASX announcement, either Price Sensitive or at the very least a Non Price Sensitive Co Update, actually lodged with the ASX. I'm sick of Co newsletters and podcasts which imo only promote the IR department's fluffy PR stuff they want us to hear, so as to make us believe that things are supposedly progressing, even though there have been no substantial revenue streams nor any new IP agreements for some time now, especially during our current CEO's tenure. IMO, there are also way too many unsubstantiated, dot-joining posters' materials floating around here on this and other forums. However, imo this is understandable given the Co's seeming "Cone of Silence" stance and reluctance to formalise information for s/holder dissemination via formal ASX announcement, either Price Sensitive or Non Price Sensitive, as part of the Co's ongoing obligation to provide full and frank disclosure as part of their fiduciary duty to all s/holders.

Great September newsletter just received. Plenty going on.
Agree completely. Time to start making some proper progress announcements on engagements, even as non price sensitive ones, as that is their duty to shareholders.
Looks to be some very interesting camera technology coming up at VISION 2024.
LUCID to Unveil Latest GigE Vision Cameras and Advanced Sensing Technologies at VISION 2024
Richmond, BC, Canada – August 22, 2024 – LUCID Vision Labs, Inc., a leading designer and manufacturer of industrial cameras, will showcase a range of new GigE Vision cameras and advanced sensing technologies at VISION 2024, which takes place from October 8–10, 2024, in Stuttgart, Germany.
LUCID is set to introduce the first member of its intelligent vision camera family, the Triton® Smart camera featuring Sony’s IMX501 intelligent vision sensor with AI processing. The Triton Smart is an easy-to-use, cost-effective intelligent vision camera capable of outputting inference results alongside regular 12.3 MP images for every frame. Its on-sensor AI processing reduces data bandwidth, alleviates processing load on the host PC, and minimizes latency.
Expanding the Triton2 – 2.5GigE camera family, LUCID will showcase two new models for advanced sensing applications. The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection.
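The event-based sensing idea behind cameras like the Triton2 EVS can be illustrated with a toy sketch: instead of streaming full frames, an event sensor reports only the pixels whose brightness changed beyond a threshold. This is a conceptual illustration only, not the Sony IMX636 pipeline or PROPHESEE's Metavision SDK:

```python
# Toy illustration of event-based vision: compare two grayscale
# "frames" and emit (x, y, polarity) events only where brightness
# changed by at least `threshold`. Conceptual sketch, not a real
# sensor pipeline; the frame data and threshold are made up.

def frame_to_events(prev, curr, threshold=10):
    """Emit (x, y, polarity) events for sufficiently changed pixels."""
    events = []
    for y, (prow, crow) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(prow, crow)):
            if c - p >= threshold:
                events.append((x, y, 1))    # brightness increased
            elif p - c >= threshold:
                events.append((x, y, -1))   # brightness decreased
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 130], [ 80, 100]]
print(frame_to_events(prev, curr))  # [(1, 0, 1), (0, 1, -1)]
```

Because static scenes produce no events at all, the data rate tracks scene activity rather than frame rate, which is what makes event cameras attractive for high-speed detection and low-power motion analysis.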
Additionally, the new Triton2 4K line scan camera, equipped with Gpixel’s GL3504 image sensor, will be unveiled. Featuring 4096 (H) x 2 (V) at 3.5 μm pixels, this camera is ideal for high-speed, high-resolution imaging.
The Atlas10 – 10GigE camera family is welcoming a new high-resolution model featuring the 45-megapixel (8192 x 5460) onsemi XGS45000 CMOS global shutter image sensor, capable of running at 16 fps. This RDMA-enabled camera offers a unique combination of high resolution, high frame rate, and superior image quality, making it well-suited for applications such as flat panel inspection, aerial surveillance, mapping, and electronics inspection.
LUCID is also expanding its Helios®2 3D Time-of-Flight camera family with the introduction of the Helios2 Narrow Field-of-View (FoV) variant. This model integrates Sony’s DepthSense™ IMX556PLR back-illuminated ToF image sensor. It produces a tighter point cloud, and the narrower illumination area reduces the likelihood of multipath error, making it ideal for applications requiring precise 3D depth measurement in confined spaces.
As part of Industrial VISION Days 2024, organized by VDMA Machine Vision, LUCID’s Director of Product Management, Alexis Teissie, will present “The Benefits of RDMA for 10GigE Cameras and Beyond” on Wednesday, October 9th at 2:40 pm.
Stay tuned for additional product highlights to be unveiled on the show floor. Join LUCID at VISION 2024 from October 8–10, 2024, in Stuttgart, Germany at Booth 10E40.
Just got an offer to invest in SiFive at a 4.5 billion valuation. They have been kicking big goals while we’ve been struggling.
SiFive on LinkedIn: #riscv
Discover the world of #RISCV with SiFive’s HiFive Unmatched Rev. B development boards! Now available for just $299 for a limited time, grab yours today from…
www.linkedin.com
Tata Elxsi on LinkedIn: #tataelxsi #aatek #denso #robotics #automation #designdigital…
Tata Elxsi, in collaboration with AAtek Group and DENSO, proudly unveils the ‘Robotics and Automation - Innovation Lab,’ in Frankfurt, Germany, designed to…
www.linkedin.com