BRN Discussion Ongoing

dippY22

Regular
I don't mean to throw a turd in the punchbowl, but watch out for the U.S. markets this coming week. They WANT to keep falling. They NEED to keep falling. With a few of the big boys reporting next week, even good results from them could be punished.

The U.S. Fed is running the country's equities market show for a while more. I think the best that Brainchip could do in this environment is a "meh" market reaction. Not too hot and not too cold. Just give us some incremental improvement, hopefully: management reporting continued customer engagement progress and an overall thumbs-up report.

And then let the necessary passage of time put the worldwide macro events impacting markets everywhere right now (Ukraine, rising U.S. interest rates, inflation, at least in the U.S., worker shortages, etc.) behind us, and pray that no significant virus variants pop up to shut things down again.

Brainchip is still in the early stages, after all. I will always want them to under promise and over deliver.

Sorry everyone, but that's just my opinion on the week ahead. Regards, dippY
 
  • Like
  • Love
  • Fire
Reactions: 27 users

VictorG

Member
(quoting dippY22's post above)
Spot on dippY22, the headwinds are many. The coming weeks will test the nerves of the most seasoned trader.
We need to remind ourselves that the BRN story keeps getting better and will weather the coming storms.
 
  • Like
Reactions: 11 users

Slade

Top 20
These are exciting times. Every quarterly is like Xmas to me. I love the build up.
 
  • Like
  • Love
  • Haha
Reactions: 13 users
(quoting dippY22's post above)
I agree but remember the OECD has recently predicted that once again Australia will be the miracle economy and grow at more than 4% for the rest of this year.

When my son was in primary school he came home one day and asked his mother “When are you and Dad getting divorced?” She said “We are not.” He said “All my friends' parents are.”

His logic was flawless but things are much more complex underneath than they appear from the outside.

The ASX will react to what the US does because that is what it does.

The US reacts because their economy is not performing well.

Australia’s economy, according to the OECD, is a miracle, but just like my son, Aussie investors look at America from the outside and draw a false conclusion about Australia.

The fact that this occurs is what traders and institutions rely upon: safe volatility.

No real concern about fundamentals involved so even if we get stuck in a trade we are at no risk of losing the lot.

Hence they cultivate the idea that when the US sneezes, Australia catches a cold. It is massive institutionalised manipulation and Australia falls for it every single time.

I suspect that if Australia became wealthier than the US they would still engage in the same lie, and out of habit Australians would follow.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Thinking
Reactions: 19 users

Xray1

Regular
Hi Potato. :)
The start of next week may be somewhat influenced by the American and broader markets on Anzac Day.
As you know, their last run was a negative session pretty much everywhere (except Singapore) and the Yanks in particular often seem to give our general market a lead in sentiment and confidence.
I'm looking to top up at the moment, so selfishly am not fussed with the current depression in our share price as I know it to be a temporary phenomenon and a bit of lucky timing for me.
I was going to be buying some more around now whatever the price was, and I still think fair value atm is around the $1.50 mark and was quite prepared to pay that; however, I am also happy to be the recipient of all that hard work and risk the shorters have got going on, bagging a few extra bangs for my buck. :)
We here all know that we are on a good thing, that the company is kicking goals and that we are in good hands as far as management are concerned. But we also know that we are not operating in a vacuum. Wars, shifting geopolitical tensions, ongoing global pandemics, climate change ffs, let alone widespread supply chain issues and transitions to new energy sources, all have short- and long-term effects which are in the process of playing out.
Many subtle and gross influences, as well as big boy manipulation are playing on the old greed and fear levers that kick our share price around.
Things which I guess those who trade in the short term have to be more concerned with, at least in regard to the daily fluctuations of a particular share price.
I have chosen to invest, and so now have the luxury of being able to take a longer view and operate over a broader time scale and as far as I can see, all indications point in a very positive direction, for Brainchip.
The world needs what we have and we see, almost daily confirmation (as revealed here by the 1000 eyes) that we are getting noticed.
Traction will follow.
We have the right product, at the right price, at the right time, protected as we can be, by patent, commercially available now, with a pipeline of future enhanced versions of same, being finished as we speak or on test benches in labs awaiting their turn.
Much is going on in the labs and boardrooms of our customers and all the key players are doing their damnedest to keep it as secret squirrel as poss. for the sake of first mover advantage.
It will eventually all come out and as that occurs our share price will reflect it.
In the meantime, my strategy is to accumulate as much as possible and hold as long as I can in accord with my other general and financial considerations, in the expectation of future dividend payments, allowing me an ongoing income stream along with a legacy for my family and other worthy notables.
We are all impatient for the widespread adoption of our tech and the rewards which will flow, but ultimately, it will take as long as it's going to,
so I figure my best posture is to just relax and enjoy the journey.
It will be one of decades, not days. :)
AKIDA BALLISTA
AKIDA EVERYWHERE
GLTAH
Extremely well articulated, and what I envisage to be a summary of the sentiment we shareholders and posters hold dear in our own investment strategies and potential outcomes.
 
  • Like
  • Love
Reactions: 5 users
D

Deleted member 118

Guest
  • Haha
  • Like
Reactions: 8 users
(quoting FF's earlier post above)
Ask yourself: when was the last time you felt any sense of concern because the Japanese market took a fall? I never have, if I am completely honest. Never give it a thought, really.

Then ask yourself WHY, having regard to the following relative importance of Japan versus the US to the Australian economy:

[attached image]
 
  • Like
  • Fire
Reactions: 8 users

Slade

Top 20
  • Haha
  • Like
Reactions: 5 users
Posted this, then worked out @uiux had already done so, far more comprehensively, back in February.

I'm more of a simple dot joiner as I do not have a technical background.

I'll leave my post here too in case others missed the one from @uiux, like I did. Credit goes to @uiux.

Link here to @uiux’s post

[attached images]
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 30 users
D

Deleted member 118

Guest
He even looks like me.

Don’t know who I feel sorry for now.
 
  • Like
  • Haha
Reactions: 4 users

chapman89

Founding Member
  • Like
  • Fire
  • Thinking
Reactions: 22 users

Build-it

Regular
(quoting dippY22's post above)

Listening to Cramer, it always seems like I have to listen to his clips twice to understand what he is on about.

He certainly has an interesting way of presenting, but I suppose that can be the US way.

 
  • Like
Reactions: 3 users

Terroni2105

Founding Member
Further confirmation from Intel that they are way behind Akida.



Intel Labs lead Rich Uhlig offered two possibilities: integrating Loihi in a CPU for PCs to perform energy-efficient AI tasks and potentially offering its neuromorphic chips as a cloud service, although Uhlig was clear he wasn't firming actual product plans, just projecting what could theoretically happen in the future.

"Right now with Loihi, we're at that point where we think we're onto something, but we don't actually have product plans yet. We're sort of earlier on in that work stream," he said last month.


 
  • Like
  • Haha
  • Love
Reactions: 28 users
D

Deleted member 118

Guest
(quoting Terroni2105's post above)

If anyone from Intel is reading this

 
  • Haha
  • Like
Reactions: 23 users
Latest vacancy just saw while surfing.

Looking for a Junior role to get up to speed on Akida 1.0 but assist the research team in dev of Akida 2.0 :)




BrainChip, Inc.
Laguna Hills, CA
  • Posted: 1 day ago
  • Full-Time
Job Description

Come join the company leading the technological revolution in artificial intelligence. BrainChip is a global technology company producing a groundbreaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products.
We are the world’s first commercial producer of ultra-low-power and high-performance artificial intelligence technology processors that enable a wide array of applications such as self-driving cars, hearing aids, drones, and agricultural equipment. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry-standard digital process.
Our company was recognized as one of the “Startups Worth Watching in 2021” in EE Times’ annual Silicon 100 list of global semiconductor technologies and our founder was named the winner of the AI Hardware 2021 Innovator Award. We have offices in Laguna Hills, California; Toulouse, France; Hyderabad, India; and Perth, Australia. We are also publicly traded on the Australian Stock Exchange (BRN:ASX) and the OTC Market (BRCHF).
Job Title: Junior Machine Learning Engineer
Reports To: Manager of Applied Research
Department: R&D

SUMMARY:
The Junior Machine Learning Engineer’s primary role is to support the research team in developing commercial applications for BrainChip’s Akida Neuromorphic System-on-Chip (NSoC). Some of the target applications include work in computer vision (object classification/detection and face recognition), audio processing (keyword spotting), and sensor fusion. Additionally, this team member will support the research team’s algorithm development for the next version of the Akida NSoC.
ESSENTIAL JOB DUTIES AND RESPONSIBILITIES:
  • Quickly prototyping, training, and testing machine learning (ML) solutions using online code repositories, research publications, or customer specifications
  • Keeping abreast of developments in ML such as new development tools, libraries, and frameworks as well as new ML models/architectures, training techniques, and application pipelines
  • Participating in ML algorithm/hardware co-design tasks
  • Performing, documenting, and presenting detailed analyses related to ML algorithm development, software/hardware benchmarking, and application development
  • Obtaining a thorough understanding of the Akida 1.0 hardware device and associated software stack (MetaTF)
  • Interfacing with customers to discuss ML application goals, constraints, and opportunities
QUALIFICATIONS: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Education/Experience:
  • Bachelor’s Degree in Computer Science or equivalent.
  • Course work in machine learning and computer vision
  • Strong programming skills in Python
  • One year experience developing ML applications in either TensorFlow/Keras and/or PyTorch
  • Excellent communication skills
  • Experience in one or more of the following application fields: Image Processing/Computer Vision, ADAS, Anomaly Detection, Audio/Speech Processing, or Sensor Fusion.
Preferred Qualifications:
  • Two plus years’ experience developing ML applications in TensorFlow/Keras and PyTorch
  • Multi-project experience in object classification, object detection, face recognition, and/or keyword spotting
  • Knowledge of deep learning quantization techniques
  • Experience with Docker and Git
  • Experience with Scrum/Agile software development (e.g. Jira)
Benefits Offered:
· Competitive Pay
· Restricted Stock Units
· Bonus Pay up to 10% of annual salary
· 401K with matching
· Free Lunch Daily
· Flexible Work Schedule
· Paid Time Off
· Holiday Pay
· Company-paid Medical HMO, Dental PPO, and Vision Insurance
· Company-paid Life Insurance and AD&D
· Employee Assistance Program, Caregiver Support, Adoption Assistance Program
· Flexible Spending Account
· Health Savings Account
· Commuter Benefit Program
· Employee Discounts

BrainChip is an equal opportunity employer. All aspects of employment including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law.

BrainChip, Inc.​


Address​

Laguna Hills, CA
92653 USA

Industry​

Technology
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Shadow59

Regular
(quoting the job-ad post above)
I would love to see Nvidia advertising for system engineers with knowledge of Akida.
That would make me very happy.;)
 
  • Like
  • Haha
  • Love
Reactions: 22 users
(quoting the job-ad post above)
Another great pick up and highly significant. Thanks FMF.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 11 users
The following is for the visionaries who have stayed the course. Written in 2017, it makes clear how absolutely amazing it is that Brainchip is where it is today:

Neuromorphic Chips Are Destined for Deep Learning—or Obscurity​

Researchers in this specialized field have hitched their wagon to deep learning’s star​

LEE GOMES
29 MAY 2017
ILLUSTRATION: CHAD HAGEN
People in the tech world talk of a technology “crossing the chasm" by making the leap from early adopters to the mass market. A case study in chasm crossing is now unfolding in neuromorphic computing.
The approach mimics the way neurons are connected and communicate in the human brain, and enthusiasts say neuromorphic chips can run on much less power than traditional CPUs. The problem, though, is proving that neuromorphics can move from research labs to commercial applications. The field's leading researchers spoke frankly about that challenge at the Neuro Inspired Computational Elements Workshop, held in March at the IBM research facility at Almaden, Calif.
“There currently is a lot of hype about neuromorphic computing," said Steve Furber, the researcher at the University of Manchester, in England, who heads the SpiNNaker project, a major neuromorphics effort. “It's true that neuromorphic systems exist, and you can get one and use one. But all of them have fairly small user bases, in universities or industrial research groups. All require fairly specialized knowledge. And there is currently no compelling demonstration of a high-volume application where neuromorphic outperforms the alternative."
Other attendees gave their own candid analyses. Another prominent researcher, Chris Eliasmith of the University of Waterloo, in Ontario, Canada, said the field needs to meet the hype issue “head-on." Given that neuromorphics has generated a great deal of excitement, Eliasmith doesn't want to “fritter it away on toy problems": A typical neuromorphic demonstration these days will show a system running a relatively simple artificial intelligence application. Rudimentary robots with neuromorphic chips have navigated down a Colorado mountain trail and rolled over squares of a specific color placed in a pattern on the floor. The real test is for traditional companies to accept neuromorphics as a mainstay platform for everyday engineering challenges, Eliasmith said, but there is “tons more to do" before that happens.
Illustration: James Provost. Tiny Spikes: Two layers within a neural network contain groups of “neurons" with similar functions, indicated by color [blue, yellow, orange, and pink] in the illustration on the left. In the graphic on the right, those neurons are mapped to spiking neurons in an IBM TrueNorth chip. The spiking neurons are connected by gridlike “synapses" to other neurons in the same core, and to a row of inputs. Those inputs can generate spikes, which are then processed by the neural network.
The basic building block of neuromorphic computing is what researchers call a spiking neuron, which plays a role analogous to what a logic gate does in traditional computing. In the central processing unit of your desktop, transistors are assembled into different types of logic gates—AND, OR, XOR, and the like—each of which evaluates two binary inputs. Then, based on those values and the gate's type, each gate outputs either a 1 or a 0 to the next logic gate in line. All of them work in precise synchronization to the drumbeat of the chip's master clock, mirroring the Boolean logic of the software it's running.
The spiking neuron is a different beast. Imagine a node sitting on a circuit and measuring whatever spikes—in the form of electrical pulses—are transmitted along the circuit. If a certain number of spikes occur within a certain period of time, the node is programmed to send along one or more new spikes of its own, the exact number depending on the design of the particular chip. Unlike the binary, 0-or-1 option of traditional CPUs, the responses to spikes can be weighted to a range of values, giving neuromorphics something of an analog flavor. The chips save on energy in large part because their neurons aren't constantly firing, as occurs with traditional silicon technology, but instead become activated only when they receive a spiking signal.
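(For anyone who wants to see the mechanics, here is a minimal Python sketch of that description. The threshold and time window are invented for illustration and are not taken from TrueNorth, SpiNNaker, or Akida.)

```python
# Toy spiking neuron: it counts incoming spikes and fires only when enough
# of them arrive within a time window. Threshold and window are illustrative.

class SpikingNeuron:
    def __init__(self, threshold=3, window=10):
        self.threshold = threshold   # spikes needed before the neuron fires
        self.window = window         # time steps over which input spikes count
        self.recent_spikes = []      # time stamps of recent input spikes

    def receive(self, t, spike):
        """Process one time step; return True if the neuron emits a spike."""
        if spike:
            self.recent_spikes.append(t)
        # forget spikes that have fallen outside the window
        self.recent_spikes = [s for s in self.recent_spikes if t - s < self.window]
        if len(self.recent_spikes) >= self.threshold:
            self.recent_spikes.clear()   # reset after firing
            return True
        return False

# Unlike a logic gate, which evaluates its inputs on every clock tick,
# the neuron only does work when spikes actually arrive.
neuron = SpikingNeuron()
inputs = [1, 0, 1, 1, 0, 0, 1]
print([neuron.receive(t, s) for t, s in enumerate(inputs)])
# -> [False, False, False, True, False, False, False]
```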
“When we look at how neurons compute in the brain, there are concrete things we can learn"
A neuromorphic system connects these spiking neurons into complex networks, often according to a task-specific layout that programmers have worked out in advance. In a network designed for image recognition, for example, certain connections between neurons take on certain weights, and the way spikes travel between these neurons with their respective weights can be made to represent different objects. If one pattern of spikes appears at the output, programmers would know the image is of a cat; another pattern of spikes would indicate the image is of a chair.
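(Again purely as a toy illustration of that readout idea: the weights, spike counts, and class labels below are made up, and in a real network the weights are learned rather than hand-set.)

```python
import numpy as np

# Toy readout of a spiking network: input spike counts are combined through
# fixed connection weights, and whichever output unit accumulates the most
# activity is taken as the predicted class.

labels = ["cat", "chair"]
weights = np.array([[0.9, 0.1, 0.7],    # connections into the "cat" output
                    [0.2, 0.8, 0.3]])   # connections into the "chair" output

input_spike_counts = np.array([5, 1, 4])        # spikes seen on each input line
output_activity = weights @ input_spike_counts  # weighted spike totals per class
print(labels[int(np.argmax(output_activity))])  # -> "cat"
```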
Within neuromorphics, each research group has come up with its own design to make this possible. IBM's DARPA-funded TrueNorth neuromorphic chip, for example, does its spiking in custom hardware, while Furber's SpiNNaker (Spiking Neural Network Architecture) relies on software running on the ARM processors that he helped develop.
In the early days, there was no consensus on what neuromorphic systems would actually do, except to somehow be useful in brain research. In truth, spiking chips were something of a solution looking for a problem. Help, though, arrived unexpectedly from an entirely different part of the computing world.
Starting in the 1990s, artificial intelligence researchers made a number of theoretical advances involving the design of the “neural networks" that had been used for decades for computational problem solving, though with limited success. Emre Neftci, with the University of California, Irvine's Neuromorphic Machine Intelligence Lab, said that when combined with faster silicon chips, these new, improved neural networks allowed computers to make dramatic advances in classic computing problems, such as image recognition.
This new breed of computing tools used what's come to be called deep learning, and in the past few years, deep learning has basically taken over the computer industry. Members of the neuromorphics research community soon discovered that they could take a deep-learning network and run it on their new style of hardware. And they could take advantage of the technology's power efficiency: The TrueNorth chip, which is the size of a postage stamp and holds a million “neurons," is designed to use a tiny fraction of the power of a standard processor.
Those power savings, say neuromorphics boosters, will take deep learning to places it couldn't previously go, such as inside a mobile phone, and into the world's hottest technology market. Today, deep learning enables many of the most widely used mobile features, such as the speech recognition required when you ask Siri a question. But the actual processing occurs on giant servers in the cloud, for lack of sufficient computing horsepower on the device. With neuromorphics on board, say its supporters, everything could be computed locally.
Which means that neuromorphic computing has, to a considerable degree, hitched its wagon to deep learning's star. When IBM wanted to show off a killer app for its TrueNorth chip, it ran a deep neural network that classified images. Much of the neuromorphics community now defines success as being able to supply extremely power-efficient chips for deep learning, first for big server farms such as those run by Google, and later for mobile phones and other small, power-sensitive applications. The former is considered the easier engineering challenge, and neuromorphics optimists say commercial products for server farms could show up in as few as two years.
Unfortunately for neuromorphics, just about everyone else in the semiconductor industry—including big players like Intel and Nvidia—also wants in on the deep-learning market. And that market might turn out to be one of the rare cases in which the incumbents, rather than the innovators, have the strategic advantage. That's because deep learning, arguably the most advanced software on the planet, generally runs on extremely simple hardware.
“The neuromorphic approaches are interesting scientifically, but they are nowhere close on accuracy"
Karl Freund, an analyst with Moor Insights & Strategy who specializes in deep learning, said the key bit of computation involved in running a deep-learning system—known as matrix multiplication—can easily be handled with 16-bit and even 8-bit CPU components, as opposed to the 32- and 64-bit circuits of an advanced desktop processor. In fact, most deep-learning systems use traditional silicon, especially the graphics coprocessors found in the video cards best known for powering video games. Graphics coprocessors can have thousands of cores, all working in tandem, and the more cores there are, the more efficient the deep-learning network.
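(A rough sketch of the low-precision arithmetic Freund is describing, assuming simple per-tensor scaling; real frameworks calibrate scales per layer or per channel, so treat this as illustrative only.)

```python
import numpy as np

# Weights and activations are stored as int8, multiplied and accumulated in a
# wider integer type, then rescaled back to floating point.

rng = np.random.default_rng(0)
weights_fp = rng.normal(size=(4, 8)).astype(np.float32)
activations_fp = rng.normal(size=(8, 3)).astype(np.float32)

w_scale = np.abs(weights_fp).max() / 127
a_scale = np.abs(activations_fp).max() / 127
weights_q = np.round(weights_fp / w_scale).astype(np.int8)
activations_q = np.round(activations_fp / a_scale).astype(np.int8)

acc = weights_q.astype(np.int32) @ activations_q.astype(np.int32)  # accumulate wide to avoid overflow
result_int8_path = acc * (w_scale * a_scale)                       # dequantize back to float

result_fp32_path = weights_fp @ activations_fp
print(np.max(np.abs(result_int8_path - result_fp32_path)))  # small quantization error
```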
So chip companies are bringing out deep-learning chips that are made out of very simple, traditional components, optimized to use as little power as possible. (That's true of Google's Tensor Processing Unit, the chip the search company announced last year in connection with its own deep-learning efforts.) Put differently, neuromorphics' main competition as the platform of choice for deep learning is an advanced generation of what are essentially “vanilla" silicon chips.
Some companies on the vanilla side of this argument deny that neuromorphic systems have an edge in power efficiency. William J. Dally, a Stanford electrical engineering professor and chief scientist at Nvidia, said that the demonstrations performed with TrueNorth used a very early version of deep learning, one with much less accuracy than is possible with more recent systems. When accuracy is taken into account, he said, any energy advantage of neuromorphics disappears.
“People who do conventional neural networks get results and win the competitions," Dally said. “The neuromorphic approaches are interesting scientifically, but they are nowhere close on accuracy."
Indeed, researchers have yet to figure out simple ways to get neuromorphic systems to run the huge variety of deep-learning networks that have been developed on conventional chips. Brian Van Essen, at the Center for Applied Scientific Computing at the Lawrence Livermore National Laboratory, said his group has been able to get neural networks to run on TrueNorth but that the task of picking the right network and then successfully porting it over remains “a challenge." Other researchers say the most advanced deep-learning systems require more neurons, with more possible interconnections, than current neuromorphic technology can offer.
The neuromorphics community must tackle these problems with a small pool of talent. The March conference, the field's flagship event, attracted only a few hundred people; meetings associated with deep learning usually draw many thousands. IBM, which declined to comment for this article, said last fall that TrueNorth, which debuted in 2014, is now running experiments and applications for more than 130 users at more than 40 universities and research centers.
By contrast, there is hardly a Web company or university computer department on the planet that isn't doing something with deep learning on conventional chips. As a result, those conventional architectures have a robust suite of development tools, along with legions of engineers trained in their use— typical advantages of an incumbent technology with a large installed base. Getting the deep-learning community to switch to a new and unfamiliar way of doing things will prove extremely difficult unless neuromorphics can offer an unmistakable performance and power advantage.
Again, that's a problem the neuromorphics community openly acknowledges. If the presentations at the March conference frequently referred to the challenges that lie ahead for the field, most of them also offered suggestions on how to overcome them.
The University of Waterloo's Eliasmith, for example, said that neuromorphics must progress on a number of fronts. One of them is building more-robust hardware, with more neurons and interconnections, to handle more-advanced deep-learning systems. Also needed, he said, are theoretical insights about the inherent strengths and weaknesses of neuromorphic systems, to better know how to use them most productively. To be sure, he still believes the technology can live up to expectations. “We have been seeing regular improvements, so I'm encouraged," Eliasmith said.
Still, the neuromorphics community might find that its current symbiotic relationship with deep learning comes with its own hazards. For all the recent successes of deep learning, plenty of experts still question how much of an advance it will turn out to be.
Photo: University of Manchester. Building Blocks: The SpiNNaker project is constructing a machine with 50,000 of these specialized chips in hopes of creating a network of 1 billion “neurons."
Deep learning clearly delivers superior results in applications such as pattern recognition, in which one picture is matched to another picture, or for language translation. It remains to be seen how far the technique will take researchers toward the holy grail of “generalized intelligence," or the ability of a computer to have, like HAL 9000 in the film 2001: A Space Odyssey, the reasoning and language skills of a human. Deep-learning pioneer Yann LeCun compares AI research to driving in the fog. He says there is a chance that even armed with deep learning, AI might any day now crash into another brick wall.
That prospect caused some at the conference to suggest that neuromorphics researchers should persevere even if the technology doesn't deliver a home run for deep learning. Bruno Olshausen, director of the University of California, Berkeley's Redwood Center for Theoretical Neuroscience, said neuromorphic technology may, on its own, someday bring about AI results more sophisticated than anything deep learning ever could. “When we look at how neurons compute in the brain, there are concrete things we can learn," he said. “Let's try to build chips that do the same thing, and see what we can leverage out of them."
The SpiNNaker project's Furber echoed those sentiments when asked to predict when neuromorphics would be able to produce low-power components that could be used in mobile phones. His estimate was five years—but he said he was only 80 percent confident in that prediction. He added, however, that he was far more certain that neuromorphics would play an important role in studying the brain, just as early proponents thought it might.
However, there is a meta-issue hovering over the neuromorphics community: Researchers don't know whether the spiking behavior they are mimicking in the brain is central to the way the mind works, or merely one of its many accidental by-products. Indeed, the surest way to start an argument with a neuromorphics researcher is to suggest that we don't really know enough about how the brain works to have any business trying to copy it in silicon. The usual response you'll get is that while we certainly don't know everything, we clearly know enough to start.
It has often been noted that progress in aviation was made only after inventors stopped trying to copy the flapping wings of birds and instead discovered—and then harnessed—basic forces, such as thrust and lift. The knock against neuromorphic computing is that it's stuck at the level of mimicking flapping wings, an accusation the neuromorphics side obviously rejects. Depending on who is right, the field will either take flight and soar over the chasm, or drop into obscurity.
This article appears in the June 2017 print issue as “The Neuromorphic Chip's Make-or-Break Moment."
 
  • Like
  • Love
  • Fire
Reactions: 19 users
(quoting Shadow59's post above)
Nvidia have plans to recruit 3,000-plus additional engineers over the next 12 months, so you have over 3,000 chances to be made very happy. 😂🤣 FF
 
  • Like
  • Haha
  • Fire
Reactions: 10 users