BRN Discussion Ongoing

M_C

Founding Member
Hi MC,

I just noticed this update on the NASA Space Station Status Report 10 May 2022.

(Extract Only)

SpaceBorne Computer-2 (SBC-2): In an attempt to recover functionality of one of the two SBC-2 server units, an inspection and replacement of the system's ethernet cable was performed. Spaceborne Computer-2 High Performance Commercial Off-The-Shelf (COTS) Computer System on the ISS (SBC-2) builds upon the successes of the Spaceborne Computer. The Spaceborne Computer explored how COTS can advance exploration by processing data significantly faster in space with edge computing and artificial intelligence (AI) capabilities. Spaceborne Computer-2 further tests additional techniques for recovering or mitigating errors in the extreme environment of unprotected solar radiation, galactic cosmic radiation (GCR) and other events. Additionally, Hewlett Packard Enterprise (HPE) works with the space community and the International Space Station-National Laboratory (ISS-NL) to test and demonstrate that current Earth-based data processing of ISS experimental data can be performed onboard during the anticipated 24-to-36-month mission of Spaceborne Computer-2.


Hey mate,

I have raised HPE's Spaceborne 2 in the past (on Hot Crapper), but I was duly shot down by both @Diogenese and @uiux sadly. Still, I find it hard to believe we aren't involved given our association with NASA and our CEO's history....perhaps a little time might reveal something more
 
  • Like
  • Fire
  • Love
Reactions: 10 users

somme

Member
I gave this investment advice over at the other place, and it is the only investment advice I will ever give anyone: you would be a fool not to take your profits from Brainchip and buy shares in the company that takes it over, if that were to occur. I do not even have to ask Blind Freddie for his opinion on this one. It is a complete no BRAINER.

My opinion only DYOR
FF

AKIDA BALLISTA
Hi Fact Finder,

There was also other financial advice you gave that I believe was more important.
That was to have your investments (BRN) in your SMSF.
Less or no tax on the gains.
I have done this and am way ahead, on paper.

thanks
 
  • Like
  • Love
  • Fire
Reactions: 9 users

Diogenese

Top 20
Did BrainChip just win an Emmy Award with GoPro and Socionext?



Logic as follows:

Samantha Hamilton is ex-GoPro and it seems that she is tight with the old crew (photo below)

Following her LinkedIn likes, GoPro just won an Emmy for their advanced image stabilisation (photo below)

Socionext are in partnership with GoPro and share the Emmy due to their contribution - camera sensors (photo below)

Socionext are also obviously involved with BrainChip


View attachment 6548
View attachment 6557


View attachment 6550
View attachment 6551

View attachment 6553

View attachment 6554
Hi tls,

A few days ago, I wondered aloud about the effect of a corrugated road on LiDAR ... and of course the answer is image stabilization.
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hey mate,

I have raised HPE's Spaceborne 2 in the past (on Hot Crapper), but I was duly shot down by both @Diogenese and @uiux sadly. Still, I find it hard to believe we aren't involved given our association with NASA and our CEO's history....perhaps a little time might reveal something more


Then you might also find this article VERRRY interesting!!

HPE_Spaceborne_Computer-2_in_International_Space_Station_Columbus_module._Credit-_NASA-scaled.jpeg
Hewlett Packard Enterprise’s Spaceborne Computer-2, sent to the International Space Station in February 2021, is linked to Microsoft’s Azure cloud through NASA and HPE ground stations. Credit: NASA



COLORADO SPRINGS – Hewlett Packard Enterprise, Microsoft and NASA will share details at the 37th Space Symposium on 24 research experiments completed to date on the International Space Station’s HPE Spaceborne Computer-2, including analysis of astronaut gloves that relies on artificial intelligence.
Since the Spaceborne Computer-2 was installed in ISS in May 2021, HPE has been working with Microsoft and NASA to demonstrate a variety of applications. Experiments conducted to date have focused on astronaut healthcare, image processing, natural disasters, 3D printing and 5G communications.
The astronaut glove experiment relies on artificial intelligence to analyze photos and videos of the gloves astronauts wear when repairing equipment and installing instruments outside ISS. NASA and Microsoft developed a glove-analyzer AI model that looks for signs of glove damage. When damage is detected, an annotated image is sent automatically to Earth for further review.
“By introducing edge computing and AI capabilities to the International Space Station with Spaceborne Computer-2, we have helped foster a growing, collaborative research community that shares a common goal to make scientific and engineering breakthroughs that benefit humankind, in space and here on Earth,” Mark Fernandez, HPE Spaceborne Computer-2 principal investigator, said in a statement.
Edge processing will become increasingly important for human space exploration because astronauts traveling to the moon, Mars and other deep space destinations will experience communications delays. If used wisely, AI, cloud computing and space-based edge processors could eliminate the need for astronauts to constantly send information to the ground for processing and analysis.

Another widely discussed experiment looks for mutations in astronaut DNA. Prior to the Spaceborne Computer-2, sending a 1.8 gigabit raw DNA sequence took more than 12 hours to deliver the data to researchers on the ground for processing. Now, the data can be processed on the space station in six minutes, compressed and sent to Earth in two seconds, according to an HPE April 4 news release.
The Spaceborne Computer-2 also is being used to test automatic interpretation of satellite imagery. NASA Jet Propulsion Laboratory researchers use a type of artificial intelligence called deep learning to automatically interpret data captured in orbit of land and structures after disasters like floods and hurricanes.
Another experiment by the Cornell University Fracture Group tested modeling software that simulates 3D printing of metal parts and predicts failure or deformation that could result. Tests on the Spaceborne Computer-2 validated the software.
Mobile network operator Cumucore tested various features of its 5G core network on Spaceborne Computer-2. The experiment indicated that installing 5G equipment on some satellites and spacecraft could enhance space-based communications.
The Spaceborne Computer-2, which HPE sent to orbit in February 2021 in a Northrop Grumman Cygnus capsule in collaboration with the ISS National Laboratory, is expected to remain on ISS for approximately two more years.
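
For anyone who wants to sanity-check the DNA example, the figures quoted above are enough for a rough back-of-the-envelope comparison. A minimal sketch using only those numbers; the implied downlink rate is my own inference, not a published HPE figure:

```python
# Rough comparison of ground processing vs on-station edge processing,
# using only the figures quoted in the article above.

RAW_BITS = 1.8e9            # "1.8 gigabit raw DNA sequence"
OLD_DOWNLINK_HOURS = 12     # "more than 12 hours" to deliver the raw data (lower bound)
NEW_PROCESS_MIN = 6         # on-station processing time
NEW_SEND_SEC = 2            # time to send the compressed result to Earth

old_total_s = OLD_DOWNLINK_HOURS * 3600
new_total_s = NEW_PROCESS_MIN * 60 + NEW_SEND_SEC

# Implied effective link rate for the raw data (an inference, not an HPE figure)
implied_rate_kbps = RAW_BITS / old_total_s / 1e3

print(f"Old workflow: ~{old_total_s / 3600:.0f} h (implied raw-data link ~{implied_rate_kbps:.0f} kbit/s)")
print(f"Edge workflow: ~{new_total_s / 60:.1f} min end to end")
print(f"Speed-up: roughly {old_total_s / new_total_s:.0f}x")
```

On those numbers the edge workflow is roughly two orders of magnitude faster, which is the whole point of putting the compute next to the sensor.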
 
  • Like
  • Fire
Reactions: 25 users

M_C

Founding Member
Then you might also find this article VERRRY interesting!!
I hear you mate! But when I posted about it on hc...........(My post on hc is the plane)

plane crash.gif
 
  • Haha
  • Like
  • Love
Reactions: 12 users
Mandatory reading IMO. Article written by René Torres, VP/General Manager of the IoT Sales and Marketing Group at Intel Corporation.


sponsored content

Enabling the era of intelligent edge​

René Torres

In a few years, we may be largely living “on the edge.” As the amount of data grows exponentially, there is a greater need for edge computing solutions to aid in real-time decision-making. Gartner predicts that by 2025, 75% of enterprise-generated data will be created and processed at the edge, outside of traditional centralized data centers or the cloud.
But with data centers and cloud computing traditionally supporting data flow, where does edge fit in? As a form of distributed computing, edge computing enables processing to happen where data is being generated. The convergence of 5G networks with edge computing means data is not only traveling faster, but can be quickly translated via media, inferencing and analytics into insights and action, enabling new, ultra-low latency applications to come to life. An autonomous vehicle that senses a pedestrian moving into the road may have less than a second to stop or swerve to avoid hitting them, and removing the latency caused by the data traveling to the cloud and back could literally be life-saving. Other benefits of analyzing data at the edge include stronger security protection of data, lower transportation costs, enhanced data quality and increased reliability, particularly in rural or remote places.
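
To put the pedestrian example in numbers, here is a minimal latency-budget sketch. Only the "less than a second" reaction window comes from the article; every component time below is an assumption for illustration:

```python
# Illustrative latency budget for the "pedestrian steps onto the road" example.
# All component times are assumed; only the ~1 s window is from the article.

REACTION_WINDOW_MS = 1000          # "less than a second to stop or swerve"

cloud_path = {
    "sensor_capture": 30,
    "uplink_to_cloud": 60,         # assumed network latency, vehicle to cloud
    "cloud_inference": 40,
    "downlink_to_vehicle": 60,
    "actuation": 200,              # assumed braking-system response
}

edge_path = {
    "sensor_capture": 30,
    "on_vehicle_inference": 25,    # assumed on-board accelerator inference time
    "actuation": 200,
}

for name, path in [("cloud", cloud_path), ("edge", edge_path)]:
    total = sum(path.values())
    margin = REACTION_WINDOW_MS - total
    print(f"{name:>5}: {total} ms used, {margin} ms of the reaction window left")
```

The exact figures will vary by network and vehicle; the point is simply that the cloud round trip eats into a window that the edge path leaves largely intact.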



A fireside chat with Intel (youtu.be)

The opportunities and challenges of edge adoption​

The ability to make fully autonomous vehicles a reality is just one example. Together, 5G and the intelligent edge will enable a new era of distributed intelligence that will transform all types of industries, from smart cities to health care. According to IDC, 77% of U.S. organizations regard edge as a strategic business investment. The need for edge solutions has also been accelerated by the pandemic due to trends like distributed workforces, the growth of remote environments and companies’ digital transformations.
At the same time, while realization of the value of the intelligent edge is growing, companies are struggling to find the right resources to move adoption forward. An IDG survey found that top challenges include identifying clear use cases, security, a lack of internal skills and cost. If the age of distributed intelligence is to reach its full potential, solution providers will need to rely on proven hardware and software platforms, supported by trusted partners with industry-specific experience.

The intelligent edge, practically applied​

On the factory floor, the convergence of technologies like edge, 5G, AI and automation is setting the stage for sectorwide Industry 4.0 transformation. Specifically, edge computing gives manufacturers access to real-time insights about operations, which allows them to automate control and monitoring processes, optimize logistics and anticipate and correct anomalies before they impact production. Modernizing the factory can be a complicated process due to the time and cost associated with replacing legacy systems with new technology and ensuring everything seamlessly integrates. However, edge computing with 5G can also provide greater flexibility to connect the factory in stages over time, since compute is localized. Intel is currently working with partners to build an end-to-end smart factory to demonstrate how, with a modular application environment, digitalization can happen at any scale.
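
As a toy illustration of the "anticipate and correct anomalies before they impact production" point, here is a minimal rolling z-score detector of the kind an edge node on a production line could run locally; the sensor stream and thresholds are invented for the example, and this is not any particular Intel product:

```python
from collections import deque
import math

def rolling_zscore_alerts(readings, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away from
    the rolling mean of the previous `window` samples."""
    history = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((v - mean) ** 2 for v in history) / window
            std = math.sqrt(var) or 1e-9          # avoid division by zero
            if abs(x - mean) / std > threshold:
                alerts.append((i, x))
        history.append(x)
    return alerts

# Synthetic vibration readings with one injected spike at index 60
stream = [1.0 + 0.01 * (i % 5) for i in range(100)]
stream[60] = 5.0
print(rolling_zscore_alerts(stream))   # -> [(60, 5.0)]
```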
In health care, edge computing could have a similarly transformative effect on patient care. In the near-term future, edge devices may help with sharing real-time data about patients’ vital signs as soon as they enter an ambulance. Instead of having to run an assessment and additional tests once the patient arrives at the hospital, doctors and clinicians would be able to utilize the data already gathered to begin care immediately, continuing to assess, analyze and adjust treatment through the operative and the post-operative phases and beyond to ensure care is always personalized and based on the most up-to-date information. This is a single example; already, edge devices are being used in many other areas of health care to aid with advanced remote patient monitoring, image-based diagnostics, medical equipment management and robotic surgery.



What’s next for intelligent edge adoption​

As companies’ edge innovations scale and mature, moving from POC to full-scale deployment, collaboration will play a large role in the success of projects. Solution providers are beginning to realize the value of partnering with technology providers with deep, purpose-built portfolios and industry experience to develop customized edge solutions that drive efficiencies and outcomes. Security will remain a key concern, with connected, intelligent devices making attractive targets for attackers interested in stealing data or disrupting the flow of operations. For that reason, providers should also seek platforms infused with silicon-level telemetry to improve the detection of advanced threats at every level.
Distributed computing will enable limitless potential by ensuring many everyday devices aren’t just connected, but intelligent. In this future, data will no longer be stored in centralized locations, but always moving to where it can provide the most value. The companies that master the manipulation of data this way will be the ones to unlock its true potential, unleashing a new wave of innovation.



René Torres
René Torres is currently the VP / General Manager of IOT Sales and Marketing Group at Intel Corporation. René has been with Intel for 20+ years having started in 1997. René has worked in multiple management roles including sales, marketing, product management, and platform application engineering for Intel Architecture and software solutions with a special focus on markets such as wireless communications infrastructure, networking and mobile client computing. Prior to his current role, René was the Director of Marketing & Platform Enabling for Software Defined Networking and Network Function Virtualization in the Data Center Group (DCG). René was also the chief of staff and technical assistant to Doug Davis, SVP and GM of the Internet of Things Group (IOTG). He has worked abroad under three expat assignments in Brazil, China and Germany. René has a BS in Business Administration from Pepperdine University and MBA in Global Management from the Thunderbird School of Global Management.
Great article. Having read it, watch Kristopher Carlson's recent presentation and/or read my following post from a few days ago:


“I listened to the Kristopher Carlson presentation regarding ADAS and enjoyed it very much. I do think the new slides they are using are easier to follow and they clearly recognise the importance of what Mercedes Benz has done for the Brainchip Brand.

My one concern about the Brainchip message has been the failure to fully capitalise on the killer advantage which AKIDA technology has over other technology revolutions in the past. Kristopher Carlson was hammering home this advantage.

Since time began there have been two barriers to the adoption of new ideas/technology:

1. Human barriers: Change comes with a cost to those who are required to embrace it. Those who have status and position because they understand the existing systems better than everyone else resist change, because it comes with the risk they will be outshone by others below them in the pecking order of society or the company in which they exist. It also comes with an additional workload, as they are required to keep up production while at the same time retiring the old, introducing the new and learning how it operates. If they can see retirement on the horizon, do they really want to go through all that, or let the next generation deal with it?

2. Capital cost thrown away: Change brings with it the need to throw away the existing physical system, which has a cost, and so on paper the numbers have to really stack up. If you have a physical system that has five years of use left in it, why change now, when the replacement item might not generate sufficient return in those five years to make a greater profit than you would have made had you retained the existing physical system? In this environment the above mentioned human barriers come into play and argue strongly the economic case for not adopting now but in the future. In this environment new technology is a very hard sell.

The killer advantage of which I speak is that these two barriers are non-existent where the AKIDA technology revolution is concerned.

Kristopher Carlson goes to great lengths to make this clear. He points out that you do not have to throw away any of your existing system and you do not need to know anything about neuromorphic computing to take advantage of the Brainchip AKIDA revolution. The more developed your system, the better. The more advanced your software solutions and algorithms, the better. When your system is complete, all you have to do is use MetaTF to convert your existing CNN to an SNN automatically (CNN2SNN), and AKD1000 is off and running, giving you all the advantages of low power, low latency, on-chip one-shot/few-shot learning and privacy.

No workers are made redundant and no capital costs are thrown away; in fact, AKIDA can extend the productive life of your existing system by making it more efficient and lower powered, and able to match the latest technology if not outperform it.

The absolute beauty of how Brainchip has set up its revolutionary model is that the close-to-retirement jaded techie in charge can actually go onto the Brainchip website, use MetaTF to run simulations of how their systems would be improved, and then take these improvements to the Head of Department in his coffee-stained shirt, with his jaded outlook, counting the days till he can turn off the alarm clock forever, and say "I can save us x dollars by simply doing a, b & c."


Kristopher Carlson is an academic, not a salesperson, but his excitement for the AKIDA technology achievement runs all the way through his presentation, and why shouldn't it? He and his colleagues have so much to be proud of where AKIDA is concerned.”
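
The description in the quote above boils down to: take an existing trained Keras CNN, quantize it, convert it with CNN2SNN, and run it on Akida. A minimal sketch of that flow, assuming the function names from BrainChip's published cnn2snn examples; newer MetaTF releases move quantization into a separate quantizeml package, so check the docs for your installed version rather than treating this as the definitive API:

```python
# Sketch of the MetaTF / CNN2SNN flow described in the quoted post above.
# Function names follow BrainChip's published cnn2snn examples; verify against
# the documentation for your installed MetaTF release.
import numpy as np
import tensorflow as tf
from cnn2snn import quantize, convert

# 1. Start from an existing, already-trained Keras CNN (nothing is thrown away).
keras_model = tf.keras.models.load_model("my_existing_cnn.h5")   # hypothetical file name

# 2. Quantize weights and activations so the network fits Akida's low-bit arithmetic.
quantized_model = quantize(keras_model, weight_quantization=4, activ_quantization=4)

# 3. Convert the quantized CNN into an Akida (event-based) model.
akida_model = convert(quantized_model)

# 4. Run inference; with hardware attached this maps onto an AKD1000 device,
#    otherwise it runs in software simulation.
dummy_batch = np.zeros((1, 224, 224, 3), dtype=np.uint8)          # placeholder input
predictions = akida_model.predict(dummy_batch)
```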

This is why Brainchip will beat the big guys at the edge. In the article Intel is playing the "trust the big guys" card, but, like the Joker, that card is not always in play if you choose the right game.

Brainchip have won the strategic game, and the reveals over the last five months prove my point.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 47 users

Labsy

Regular
Not to keep beating the lonely Apple drum on this forum, but I really feel Apple silicon is evolving to implement the Akida IP. Have a look at the Neural Engine...a machine learning controller...and now 16 cores on the A15...what's next? How far can they push the software? Surely an architectural evolution of their Arm processor to include Akida IP...
I don't really know what I'm talking about and don't truly understand the technology.
Just my opinion.
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Labsy

Regular
Not to keep beating the lonely Apple drum on this forum, but I really feel Apple silicon is evolving to implement the Akida IP. Have a look at the Neural Engine...a machine learning controller...and now 16 cores on the A15...what's next? How far can they push the software? Surely an architectural evolution of their Arm processor to include Akida IP...
I don't really know what I'm talking about and don't truly understand the technology.
Just my opinion.
My prediction...Apple's A15 Bionic gets an upgrade in 2023/2024
 
  • Like
  • Love
  • Fire
Reactions: 16 users

M_C

Founding Member
Not to keep beating the lonely Apple drum on this forum, but I really feel Apple silicon is evolving to implement the Akida IP. Have a look at the Neural Engine...a machine learning controller...and now 16 cores on the A15...what's next? How far can they push the software? Surely an architectural evolution of their Arm processor to include Akida IP...
I don't really know what I'm talking about and don't truly understand the technology.
Just my opinion.
You aren't alone @Labsy

agree.gif
 
  • Like
  • Love
  • Fire
Reactions: 22 users
Then you might also find this article VERRRY interesting!!
Is this the reason Elon Musk has not been to space? Could his DNA reveal he was not born on Earth???

"Another widely discussed experiment looks for mutations in astronaut DNA."

On a serious note, the AI glove analysis sounds like an obvious fit for a chip that can process multiple senses at ultra-low power at the extreme edge, without connectivity to either the spacecraft or Earth. Being outside the spacecraft to repair an aerial or receiver of some sort has been one reason I read astronauts were required to walk in space on past missions.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 8 users

Diogenese

Top 20
Then you might also find this article VERRRY interesting!!

Hi @MC, @Bravo,

HP have several AI NN patents. Here is a sample:
https://worldwide.espacenet.com/pat... nftxt = "neural network" AND nftxt = "space"

This one has a 2020 priority date:

US2022012573A1 NEURAL NETWORK ACCELERATORS

1652496586615.png


[0030] FIG. 2 illustrates a neural network-based computing system 200 for performing tensor operations, in accordance with an example of the present subject matter. The neural network-based computing system 200 (referred to as system 200) may include a neural network accelerator 202 with a plurality of computational units arranged in a hierarchical manner. In an example, the system 200 may further include tile unit(s) 204 which include a plurality of core(s) 206-1, 2, ..., N. The core(s) 206-1, 2, ..., N (collectively referred to as cores 206) may further include a plurality of matrix-vector multiplication units for performing matrix-vector multiplication. In an example, such matrix-vector multiplication units may be implemented using memristive crossbar arrays.
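
Mathematically, the "matrix-vector multiplication units ... implemented using memristive crossbar arrays" in that paragraph reduce to an analog dot product: the weights are stored as conductances and the inputs are applied as voltages. A minimal numpy sketch of the computation such a crossbar performs; the sizes and values are arbitrary and this is not HP's implementation:

```python
import numpy as np

# A memristive crossbar stores a weight matrix as conductances G (siemens).
# Applying input voltages V to the rows produces column currents I = G.T @ V
# (Ohm's law plus Kirchhoff's current law), i.e. an analog matrix-vector multiply.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # 4 rows x 3 columns of memristors (arbitrary)
V = np.array([0.2, 0.0, 0.5, 0.1])         # input voltages on the rows (arbitrary)

I = G.T @ V                                # each column current is a weighted sum of inputs
print(I)                                   # the "neural" matrix-vector product
```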


... and they've been on the memristive path for over a decade:

WO2009113993A1 NEUROMORPHIC CIRCUIT

1652497263296.png


Embodiments of the present invention are directed to neuromorphic circuits containing two or more internal neuron computational units. Each internal neuron computational unit includes a synchronization-signal input for receiving a synchronizing signal, at least one input for receiving input signals, and at least one output for transmitting an output signal. A memristive synapse connects an output signal line carrying output signals from a first set of one or more internal neurons to an input signal line that carries signals to a second set of one or more internal neurons.

I have a recollection of NASA saying that memristors were suitable for space because of their inherent immunity to radiation damage.

https://www.hpe.com/us/en/compute/hpc/supercomputing/spaceborne.html
HPE and NASA collaborate to put a commercial, off-the-shelf (COTS) supercomputer aboard the International Space Station (ISS).
The ultimate goal?
Prove the technology's ability to operate in the harsh conditions of space while performing data center-level compute processing at the edge.

For a COTS processor, the timetable is a little tight for Akida.

SPACEBORNE’S JOURNEY​

HPE and NASA collaborated to test if affordable, off the shelf servers could withstand the harsh conditions of space and provide reliable computing aboard the International Space Station (ISS).
Highlights of this collaborative journey include:

2017 HPE Spaceborne Computer Successfully Reaches and Powers Up in Space​

2018 HPE Spaceborne Computer Open for Supercomputing Use on the ISS​

2019 HPE Spaceborne Computer Returns to Earth​

2020 Next Version of HPE Spaceborne Computer Handed Over to NASA​

2021 Spaceborne Computer-2 Launched Into Space​



Here's one each so there's no squabbling:

1652497960745.png
1652497994470.png
 
  • Like
  • Haha
  • Love
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I hear you mate! But when I posted about it on hc...........(My post on hc is the plane)

View attachment 6560


Uh-oh! Then it looks like I'm toast too. 🥵

I just thought it all somehow tied back into this SBIR crapola because it says in regards to BrainChip "The system will be available as a “plug and play module” for all future spacecraft".🥴




Screen Shot 2022-05-14 at 1.06.16 pm.png
 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 13 users

ndefries

Regular
Hi Fact Finder,

There was also other Financial Advice that you gave that I believe was more important.
That was to have your investments ( BRN ) thanks in your SMSF.
Less or no tax on the gains.
I have done this and am way ahead, on paper.

thanks
Yep, 15 percent capital gains tax, or 10 percent if held for over 12 months. The government limits how much you can get into this great tax haven. Getting your super loaded with BRN at these prices will lead to a massive pot inside your super. Then you may get to a situation where you have more money in super than you need, for which there are strategies for getting your hands on it before 60. Hopefully a good problem to worry about later.
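
For what it's worth, the rates quoted make the comparison easy to put in numbers. A toy calculation only, using the 15%/10% super rates from the post and 47% as an assumed top marginal personal rate for contrast; it is not tax advice and ignores levies, offsets and individual circumstances:

```python
# Toy comparison of capital gains tax on the same gain in different structures,
# using the rates mentioned in the posts above. Not tax advice.
gain = 100_000                  # arbitrary example gain

smsf_short = gain * 0.15        # inside super, asset held < 12 months
smsf_long = gain * 0.10         # inside super, held > 12 months
personal_top = gain * 0.47      # assumed top marginal personal rate, for contrast

for label, tax in [("SMSF, held <12 months", smsf_short),
                   ("SMSF, held >12 months", smsf_long),
                   ("Top marginal rate (assumed)", personal_top)]:
    print(f"{label:<28} tax ${tax:,.0f}   kept ${gain - tax:,.0f}")
```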
 
  • Like
  • Love
Reactions: 12 users

Boab

I wish I could paint like Vincent
Yep, 15 percent capital gains tax, or 10 percent if held for over 12 months. The government limits how much you can get into this great tax haven. Getting your super loaded with BRN at these prices will lead to a massive pot inside your super. Then you may get to a situation where you have more money in super than you need, for which there are strategies for getting your hands on it before 60. Hopefully a good problem to worry about later.
Screen Shot 2021-02-15 at 8.58.44 am.png
 
  • Like
  • Love
Reactions: 9 users
Did BrainChip just win an Emmy Award with GoPro and Socionext?



Logic as follows:

Samantha Hamilton is ex GoPRO and it seems that she is tight with the old crew (photo below)

Following her LinkedIn likes GoPro just won an Emmy for their advanced image stabilisation (photo below)

Socionext are in partnership with GoPro and share the Emmy due to their contribution - camera sensors (photo below)

Socionext are also obviously involved with BrainChip


View attachment 6548
View attachment 6557


View attachment 6550
View attachment 6551

View attachment 6553

View attachment 6554

I just searched the forum and found this May 3 post by @Fullmoonfever in the SOCIONEXT thread - which has more info about the Socionext tech

Post in thread 'SOCIONEXT'
https://thestockexchange.com.au/threads/socionext.1436/post-58055
 
  • Like
  • Thinking
  • Fire
Reactions: 10 users

Slade

Top 20
Did BrainChip just win an Emmy Award with GoPro and Socionext?

70099298-FBC1-427B-95C0-24C25CED497B.jpeg


Logic as follows:

Samantha Hamilton is ex GoPRO and it seems that she is tight with the old crew (photo below)

Following her LinkedIn likes GoPro just won an Emmy for their advanced image stabilisation (photo below)

Socionext are in partnership with GoPro and share the Emmy due to their contribution - camera sensors (photo below)

Socionext are also obviously involved with BrainChip


View attachment 6548
View attachment 6557


View attachment 6550
View attachment 6551

View attachment 6553

View attachment 6554
The BrainChip and Socionext bit from BrainChip's website is new to me. I must be blind because I can't find it on the site. Where exactly is it?
 
  • Like
  • Sad
Reactions: 4 users
The BrainChip and Socionext bit from BrainChip's website is new to me. I must be blind because I can't find it on the site. Where exactly is it?

I found it via a Google search, so maybe it's been lost in the crossover from the old to the new website?

Link

March 23, 2020

BrainChip And Socionext Provide A New Low-Power Artificial Intelligence Platform For AI Edge Applications​


BrainChip and Socionext Provide a New Low-Power Artificial Intelligence Platform for AI Edge Applications ALISO VIEJO, Calif.–(BUSINESS WIRE)– BrainChip Holdings Ltd (ASX: BRN), a leading provider of ultra-low power high performance AI technology, today announced that Socionext Inc., a leader in advanced SoC solutions for video and imaging systems, will offer customers an Artificial Intelligence […]
Categories: Press

By Admin​


Share​


Desktop_Press-Releases_2x.png

BrainChip and Socionext Provide a New Low-Power Artificial Intelligence Platform for AI Edge Applications​

ALISO VIEJO, Calif.–(BUSINESS WIRE)– BrainChip Holdings Ltd (ASX: BRN), a leading provider of ultra-low power high performance AI technology, today announced that Socionext Inc., a leader in advanced SoC solutions for video and imaging systems, will offer customers an Artificial Intelligence Platform that includes the Akida SoC, an ultra-low power high performance AI technology.
BrainChip has developed an advanced neural networking processor that brings artificial intelligence to the edge in a way that existing technologies are not capable. This innovative, event-based, neural network processor is inspired by the event-based nature of the human brain. The resulting technology is high performance, small, ultra-low power and enables a wide array of edge capabilities that include local inference and incremental learning.
Socionext has played an important role in the implementation of BrainChip’s Akida IC, which required the engineering teams from both companies to work in concert. BrainChip’s AI technology provides a complete ultra-low power AI Edge Network for vision, audio, and smart transducers without the need for a host processor or external memory. The need for AI in edge computing is growing, and Socionext and BrainChip plan to work together in expanding this business in the global market.
Complementing the Akida SoC, BrainChip will provide training and technical customer support, including network simulation on the Akida Development Environment (ADE), emulation on a Field Programmable Gate Array (FPGA) and engineering support for Akida applications.
Socionext also offers a high-efficiency, parallel multi-core processor SynQuacerTM SC2A11 as a server solution for various applications. Socionext’s processor is available now and the two companies expect the Akida SoC engineering samples to be available in the third quarter of 2020.
In addition to integrating BrainChip’s AI technology in an SoC, system developers and OEMs may combine BrainChip’s proprietary Akida device and Socionext’s processor to create high-speed, high-density, low-power systems to perform image and video analysis, recognition and segmentation in surveillance systems, live-streaming and other video applications.
“Our neural network technology enables ultra-low power AI technology to be implemented effectively in edge applications”, said Louis DiNardo, CEO of BrainChip. “Edge devices have size and power consumption constraints that require a high degree of integration in IC solutions. The combination of BrainChip’s technology and Socionext’s ASIC expertise fulfills the requirements of edge applications. We look forward to working with the Socionext in commercial engagements.”
“As a leading provider of ASICs worldwide, we are pleased to offer our customers advanced technologies driving new innovations,” said Noriaki Kubo, Corporate Executive Vice President of Socionext Inc. “The Akida family of products allows us to stay at the forefront of the burgeoning AI market. BrainChip and Socionext have successfully collaborated on the Akida IC development and together, we aim to commercialize this product family and support our increasingly diverse customer base.”
About BrainChip Holdings Ltd (ASX: BRN)
BrainChip is a global technology company that has developed a revolutionary advanced neural networking processor that brings artificial intelligence to the edge in a way that existing technologies are not capable. The solution is high performance, small, ultra-low power and enables a wide array of edge capabilities that include continuous learning and inference. The company markets an innovative event-based neural network processor that is inspired by the spiking nature of the human brain and implements the network processor in an industry standard digital process. By mimicking brain processing BrainChip has pioneered an event domain neural network processor, called Akida™, which is both scalable and flexible to address the requirements in edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than transmission to the cloud or a datacenter. Akida is designed to provide a complete ultra-low power Edge AI network processor for vision, audio and smart transducer applications. The reduction in system latency provides faster response and a more power efficient system that can reduce the large carbon footprint of datacenters.
About Socionext Inc.
Socionext is a global, innovative enterprise that designs, develops and delivers System-on-Chip based solutions to customers worldwide. The company is focused on technologies that drive today’s leading-edge applications in consumer, automotive and industrial markets. Socionext combines world-class expertise, experience, and an extensive IP portfolio to provide exceptional solutions and ensure a better quality of experience for customers. Founded in 2015, Socionext Inc. is headquartered in Yokohama, and has offices in Japan, Asia, United States and Europe to lead its product development and sales activities. For more information, visit www.socionext.com.
Additional information is available at https://brainchipinc.com
Follow BrainChip on Twitter: https://www.twitter.com/BrainChip
Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006
View source version on businesswire.com: https://www.businesswire.com/news/home/20200323005183/en/
Roger Levinson
rlevinson@brainchipinc.com
+1 (949) 330-6750

Source: BrainChip
 
  • Like
  • Love
  • Fire
Reactions: 13 users

Violin1

Regular
Another week older and deeper in…volved !

For mine, the past week was another big one for ‘The Eyes‘. Their dot joining, analysis, research, news and reporting has again been first class. Some within this esteemed group are working twelve hour plus shifts and posting up to twenty comments each day. Those of us like me, who lack the wherewithal to be part of this, are indeed indebted to them for their unflagging efforts to uncover the classified manoeuverings of our battler.

This being the case, and in my ongoing attempt to generate healthy debate, I do have a couple of questions to pose on what appears to be, weather wise…a wonderful weekend ahead.

Having taken the time to read the proposed new company constitution and remuneration report, and with the AGM fast approaching, I’m wondering what you in TSE land think?

Although there has been much speculation about it for some time, we recently discovered that our company is indeed engaged in a partnership with ARM. As many have stated, this is a very important and welcome development... but it did not come from any formal announcement, a press release, or even a passing mention in an interview. Rather, it was discovered within the depths of our new website. Me myself personally wonders if you think this was the best way to handle such an important declaration?

It’s been another great week for accumulation by trading…see you at the AGM.
Howdy @Realinfo. I'll be demonstrating my childlike (not MC!) naivety by saying I'll support the positions and motions that the Board has put forward without hugely detailed analysis. My run through the EM and a look at the remuneration levels lead me to the conclusion that the Board:
1. Is well structured and has top talent
2. Is being reasonably conservative in its dealings with shareholder funds
3. Is clear about Constitution changes (nothing in them seems new, significant or worrying to me)

I may have raised an eyebrow at share allocations last time but looking at the track record over the last 18 months my eyebrow didn't even twitch this time. Sean's salary is very modest and the rest of his share allocations and performance arrangements align with driving the company commercialisation forward. Hearing him speak, I get the sense he wouldn't have it any other way - he's competitive, wants challenges and wants to be a leader and winner.

The Director fees are also very modest and share allocations look reasonable for where they're taking us over the next couple of years.

On the ARM partnership, I'm of the view that, while I would love regular ASX announcements to keep kicking the SP along, I accept the Board's strategy to limit those to $ material requirements for disclosure. To do otherwise sets us up for ASX demands for more and more detail that our NDA parties don't want. Other information does seem to come through in a couple of different ways but I'm not hassled by that. There would be so many variations in the nature of relationships that a standard mechanism would be difficult. If everything was rushed out what would the 1000 eyes have to do (Other than count money in the next few years...).

Looking forward to meeting people at the AGM.

A......B.........
 
  • Like
  • Love
  • Fire
Reactions: 35 users

Evermont

Stealth Mode
  • Like
  • Fire
  • Thinking
Reactions: 20 users
Remember, the $1.6 million cap is indexed, so if you are entering the retirement phase tomorrow I think the number is now around $1.7 million.

If you have an SMSF you can also have your spouse as a member, and you each have $1.7 million, so $3.4 million combined.

Also, as the law presently stands, if your individual account balance is say $2.7 million, meaning you have $1 million more than the tax-free amount, you can leave it in the fund earning and pay your tax at 15%, which is still a great outcome compared with your personal tax and capital gains tax rates.

If Brainchip eventually pays franked dividends you still get the benefit of the franking credits in the fund so it is all very nice.

Remember this, however: if you draw out everything in your fund the day before you die, subject possibly to capital gains tax on some part of those funds at 10%, you will basically be able to leave it all to your cat, dog, the charity of your choice or your children.

If, however, you forget to withdraw while alive, then in winding up your estate and calling in those funds from your super to be distributed in accordance with your will, they will be taxed at 47 cents in the dollar on their way out of the fund.

If you have a spouse living at the time of your death those funds will transfer to them tax free, but if they leave them in the fund and pass away, the 47 cent tax rule will apply.

There are special provisions covering dependant children.

So it is critical to take annual advice from your accountant as to the rules and if any have been changed and keep up to date with what political parties have in mind for your hard earned.
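
A toy illustration of the timing point, using only the figures quoted in this post (which simplify the real rules) and an arbitrary example balance; not tax or estate advice:

```python
# Toy illustration of the withdrawal-timing point made above, using only the
# figures quoted in the post (15%, 10%, 47%). Not tax or estate advice.
balance = 1_000_000                                  # arbitrary example balance

# Scenario A: everything withdrawn while alive, with (per the post) up to 10%
# capital gains tax possibly applying to some part of the balance (worst case).
withdrawn_while_alive = balance - balance * 0.10

# Scenario B: funds left in the fund and paid out through the estate, taxed
# (per the post) at 47 cents in the dollar on the way out.
paid_out_via_estate = balance - balance * 0.47

print(f"Withdrawn while alive (worst case per post): ${withdrawn_while_alive:,.0f}")
print(f"Left in fund, paid via estate (per post):    ${paid_out_via_estate:,.0f}")
```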

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 45 users