BRN Discussion Ongoing

Slade

Top 20
To all our German members. I feel your pain.
 
  • Like
  • Love
Reactions: 7 users

Esq.111

Fascinatingly Intuitive.
  • Like
  • Fire
Reactions: 6 users

wilzy123

Founding Member
Afternoon Wilzy,

Cheers for that, it's a relief.

I wonder if anyone is still logged into their trading platforms... it would be interesting to see if there were any late trades.

Have a great weekend all.

Regards,
Esq.
 
  • Like
  • Love
Reactions: 15 users

Slade

Top 20
  • Like
  • Fire
Reactions: 2 users

wilzy123

Founding Member
  • Haha
  • Fire
  • Like
Reactions: 9 users

Slade

Top 20
  • Like
  • Love
  • Fire
Reactions: 14 users
A quick TSE search and I saw @TECH had some comms with this company a few months back.

Hmmm... wonder if anything happened between BRN and these guys, or if their tech will assist in accelerating the uptake & penetration of SNN into the mainstream?

I've always considered one of our bigger hurdles to be how easily others in the real world can understand & implement our SNN, and figured that was part of the Uni strategy now... to train the next gen of engineers to use Akida.

This, on face value (I only just saw it), could maybe assist industry by allowing easier integration.

Need to understand better if friend or foe :unsure: :LOL:

Robot looks familiar haha



USPTO Issues Landmark Patent to Artificial Intelligence Startup, ORBAI

NEWS PROVIDED BY
EIN Presswire
Nov 02, 2022, 11:21 AM ET

[Images: ORBAI NeuroCAD v5.0 toolchain; BICHNN SNN autoencoder; ORBAI SNNs scale exponentially ("AGI ETA going exponential")]

These innovations in Spiking Neural Networks are analogous to Nikola Tesla's innovations that made AC current practical and able to surpass DC current as the standard for electricity in the 1900s.”
— Brent Oster, CEO ORBAI

SANTA CLARA, CA, USA, November 2, 2022 /EINPresswire.com/ -- ORBAI, a Silicon Valley Startup, was issued a landmark AI Patent this week by the US Patent and Trademark Office, covering tools and methods for a revolutionary 3rd generation of AI based on an entirely new set of spiking neural network technologies that takes a huge leap past today's 2nd generation deep learning AI.

These technologies will enable more advanced AI applications, with conversational speech, human-like cognition, and planning and interaction with the real world, actively learning without supervision. They will find first use in smart devices, homes, and robotics, then in online professional services.

What we usually think of as Artificial Intelligence (AI) today, when we see human-like robots and holograms in our fiction, talking and acting like real people and having human-level or even superhuman intelligence and capabilities, is actually called Artificial General Intelligence (AGI), and it does NOT exist anywhere on earth yet. What the AI industry actually has today is much simpler and much more narrow Deep Learning (DL) that can only do some very specific tasks better than people and has fundamental limitations that will not allow it to become AGI. Without neural networks that have spatial-temporal processing and learn at runtime from new data, current Gen 2 Deep Neural Networks are unable to fully realize true computer vision, robust speech recognition, and interpretation of the environment and data inputs.

The patent covers ORBAI's NeuroCAD SNN Toolkit and processes for designing 3rd generation Spiking Neural Networks and training and evolving them towards applications in vision, speech, control, planning, and analytics. The Patent is Application No. 16/437,838, titled "Apparatus and method for utilizing a parameter genome characterizing neural network connections as a building block to construct a neural network with feedforward and feedback paths".

The methods in the patent solve the greatest obstacle to using Gen 3 SNNs, the one that has previously prevented their widespread adoption: training SNNs to do useful functions, as the standard backpropagation techniques used in deep learning do not work with SNNs because of their time-domain properties. The ORBAI patent implements training of the network using an SNN autoencoder with a feedback loop, all tuned by genetic algorithms. These innovations in Spiking Neural Networks are a major breakthrough, making it possible to train them for practical purposes for the first time. This accomplishment is analogous to Nikola Tesla's innovations that made AC current practical and able to surpass DC current as the standard for electricity in the 1900s.
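For readers wondering what a "parameter genome" evolved by genetic algorithms might look like in practice, here is a minimal, purely illustrative Python sketch (not ORBAI's NeuroCAD, and not an actual spiking network): a flat genome is decoded into encoder/decoder weights, scored on an autoencoder-style reconstruction error, and improved by selection, crossover and mutation instead of backpropagation. All names and numbers are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID = 8, 3                    # toy autoencoder: 8 inputs -> 3 hidden -> 8 outputs
GENOME_SIZE = N_IN * N_HID * 2        # encoder + decoder weights flattened into one genome

def decode(genome):
    # Unpack a flat "parameter genome" into encoder/decoder weight matrices.
    enc = genome[:N_IN * N_HID].reshape(N_IN, N_HID)
    dec = genome[N_IN * N_HID:].reshape(N_HID, N_IN)
    return enc, dec

def fitness(genome, x):
    # Negative reconstruction error of a rate-based stand-in for a spiking autoencoder.
    enc, dec = decode(genome)
    hidden = np.tanh(x @ enc)         # surrogate for spiking activity rates
    recon = hidden @ dec
    return -np.mean((recon - x) ** 2)

data = rng.integers(0, 2, size=(64, N_IN)).astype(float)    # toy binary patterns to reconstruct

pop = rng.normal(0.0, 0.5, size=(40, GENOME_SIZE))          # population of candidate genomes
for generation in range(200):
    scores = np.array([fitness(g, data) for g in pop])
    elite = pop[np.argsort(scores)[-10:]]                   # keep the 10 fittest genomes
    parents = elite[rng.integers(0, 10, size=(30, 2))]      # pick random parent pairs
    children = parents.mean(axis=1)                         # crossover: average the parents
    children += rng.normal(0.0, 0.1, children.shape)        # mutation
    pop = np.vstack([elite, children])

best = max(fitness(g, data) for g in pop)
print(f"best reconstruction error after evolution: {-best:.4f}")

The sketch only captures the shape of the approach the patent describes: selection pressure on a genome, rather than backpropagation through time, is what improves the network.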

By building on this NeuroCAD toolchain and SNN technology shaped by genetic algorithms, ORBAI is designing Artificial General Intelligence that will enable more advanced AI applications, with conversational speech, human-like cognition, and planning and interaction with the real world, learning without supervision. It will find first use in smart devices, homes, and robotics, then in online professional services and enhanced analytics, forecasting, and decision making capabilities in financial forecasting and enterprise software, all with an AGI at the core powering them.

ORBAI's business model is to license the development tools to customers and also work alongside 3rd party developers to create custom integrated AI solutions for customers. Customers will then have access to the core AGI as AI as a Service (AIaaS), enabling our developer network to connect to it with data and applications for various customer needs.

ORBAI is a California-based startup developing artificial general intelligence to power smart devices and intelligent online professional services (www.orbai.com) with core AGI technology that will be licensed to companies doing devices and AI professional services.
Brent Oster
ORBAI
brent.oster@orbai.com
 
  • Like
  • Fire
  • Thinking
Reactions: 10 users

FJ-215

Regular
That's a pretty high tax rate, maybe he had other income too?

My cousin is in the US; he makes some moolah and in our talks has mentioned his rates actually go down after a certain threshold. Up to $2m he pays about 26%, and above that the tax rate actually goes down. A bit different to what we have in Aus.

Anyways, done and dusted.
G'day BL,

Sounds like some of that trickle-down economics that past US pollies were so fond of. The cream rises to the top.........and you lot can eat cake..... or more likely, each other.
 
  • Like
Reactions: 3 users

Baisyet

Regular
Just curious about the shares disposed of by Sean. I am trying to get it right: when the board or the CEO buys shares it means confidence, right? That's what I hear all the time. So what does this mean for us, when 1,000 eyes are trying to do all they can to inform all of us? Thanks.
 
  • Like
Reactions: 3 users

Foxdog

Regular
Oh wow. OK, watch for the SP drop now that instos can lend some of their required holdings to shorters. I reckon it's back to the status quo here without any announcements. Back into the 60s for a bit, perhaps. Never mind though, our time is coming 👌
 
  • Like
  • Fire
Reactions: 4 users

Getupthere

Regular

Stop your public-cloud AI projects from dripping you dry

Last year, Andreessen Horowitz published a provocative blog post entitled “The Cost of Cloud, a Trillion Dollar Paradox.” In it, the venture capital firm argued that out-of-control cloud spending is resulting in public companies leaving billions of dollars in potential market capitalization on the table. An alternative, the firm suggests, is to recalibrate cloud resources into a hybrid model. Such a model can boost a company’s bottom line and free capital to focus on new products and growth.

Whether enterprises follow this guidance remains to be seen, but one thing we know for sure is that CIOs are demanding more agility and performance from their supporting infrastructure. That’s especially so as they look to use sophisticated and computing-intensive artificial intelligence/machine learning (AI/ML) applications to improve their ability to make real-time, data-driven decisions.

To this end, the public cloud has been foundational in helping to usher AI into the mainstream. But the factors that made the public cloud an ideal testing ground for AI (that is, elastic pricing, the ease of flexing up or down, among other factors) are actually preventing AI from realizing its full potential.

Here are some considerations for organizations looking to optimize the benefits of AI in their environments.

For AI, the cloud is not one-size-fits-all

Data is the lifeblood of the modern enterprise, the fuel that generates AI insights. And because many AI workloads must constantly ingest large and growing volumes of data, it’s imperative that infrastructure can support these requirements in a cost-effective and high-performance way.

When deciding how to best tackle AI at scale, IT leaders need to consider a variety of factors. The first is whether colocation, public cloud or a hybrid mix is best suited to meet the unique needs of modern AI applications.

While the public cloud has been invaluable in bringing AI to market, it doesn’t come without its share of challenges. These include:

Vendor lock-in: Most cloud-based services pose some risk of lock-in. However, some cloud-based AI services available today are highly platform-specific, each sporting its own particular nuances and distinct partner-related integrations. As a result, many organizations tend to consolidate their AI workloads with a single vendor. That makes it difficult for them to switch vendors in the future without incurring significant costs.

Elastic Pricing: The ability to pay only for what you use is what makes the public cloud such an appealing option for businesses, especially those hoping to reduce their CapEx spending. And consuming a public cloud service by the drip often makes good economic sense in the short term. But organizations with limited visibility into their cloud utilization all too often find that they are consuming it by the bucket. At that point it becomes a tax that stifles innovation.

Egress Fees: With cloud data transfers, customers don't pay for the data they send into the cloud. But getting that data out again incurs egress fees, which can quickly add up. For instance, disaster recovery systems will often be distributed across geographic regions to ensure resilience. That means that in the event of a disruption, data must be continually duplicated across availability zones or to other platforms. As a result, IT leaders are coming to understand that, past a certain point, the more data that's pushed into the public cloud, the more likely they are to be painted into a financial corner. (A back-of-the-envelope sketch after this list shows how quickly those fees compound.)

Data Sovereignty: The sensitivity and locality of the data is another crucial factor in determining which cloud provider would be the most appropriate fit. In addition, as a raft of new state-mandated data privacy regulations goes into effect, it will be important to ensure that all data used for AI in public cloud environments complies with the prevailing rules.
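To make the egress point above concrete, here is the promised back-of-the-envelope Python sketch. Every figure in it (the per-GB fee, the daily replication volume) is a made-up placeholder, not any provider's actual rate card.

# Hypothetical illustration of recurring egress costs for cross-region DR replication.
EGRESS_FEE_PER_GB = 0.09        # assumed $/GB for inter-region data transfer
DAILY_REPLICATION_GB = 500      # data copied to the DR region each day (assumed)
DAYS_PER_MONTH = 30

monthly_egress = DAILY_REPLICATION_GB * DAYS_PER_MONTH * EGRESS_FEE_PER_GB
print(f"monthly egress cost:    ${monthly_egress:,.0f}")        # $1,350 with these assumptions
print(f"first-year egress cost: ${monthly_egress * 12:,.0f}")   # $16,200

Even at these modest assumed volumes the fee recurs every month, which is why the article treats egress as a structural cost rather than a one-off.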

Three questions to ask before moving AI to the cloud

The economies of scale that public cloud providers bring to the table have made it a natural proving ground for today’s most demanding enterprise AI projects. That said, before going all-in on the public cloud, IT leaders should consider the following three questions to determine if it is indeed their best option.

At what point does the public cloud stop making economic sense?

Public cloud offerings such as AWS and Azure give users the ability to quickly and cheaply scale their AI workloads, since they only pay for what they use. However, these costs are not always predictable, especially since these data-intensive workloads tend to mushroom in volume as they voraciously ingest more data from different sources, such as when training and refining AI models. While "paying by the drip" is easier, faster and cheaper at a smaller scale, it doesn't take long for these drips to accumulate into buckets, pushing you into a more expensive pricing tier.

You can mitigate the cost of these buckets by committing to long-term contracts with volume discounts, but the economics of these multi-year contracts still rarely pencil out. The rise of AI Compute-as-a-Service outside the public cloud provides options for those who want the convenience and cost predictability of an OpEx consumption model with the reliability of dedicated infrastructure.
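As a rough illustration of that break-even logic, the sketch below compares a "pay by the drip" on-demand model whose rate climbs once usage passes a tier threshold (as the article describes) against a flat dedicated commitment, and reports which is cheaper at several usage levels. The rates, threshold and flat fee are invented placeholders, not vendor pricing.

# Toy break-even comparison: tiered on-demand GPU hours vs. a flat dedicated commitment.
# All numbers are assumptions for illustration only.
def on_demand_cost(gpu_hours: float) -> float:
    # Simple tiered on-demand pricing: the rate climbs once usage passes a threshold.
    base_rate, surge_rate, tier_threshold = 2.50, 3.25, 5_000   # $/GPU-hour, $/GPU-hour, hours
    if gpu_hours <= tier_threshold:
        return gpu_hours * base_rate
    return tier_threshold * base_rate + (gpu_hours - tier_threshold) * surge_rate

def dedicated_cost(gpu_hours: float) -> float:
    # Flat monthly commitment for reserved/dedicated capacity, independent of usage.
    return 18_000.0

for hours in (1_000, 4_000, 8_000, 12_000):
    od, ded = on_demand_cost(hours), dedicated_cost(hours)
    cheaper = "on-demand" if od < ded else "dedicated"
    print(f"{hours:>6} GPU-h/month: on-demand ${od:>9,.0f} vs dedicated ${ded:>9,.0f} -> {cheaper}")

Under these made-up numbers the crossover sits somewhere between 4,000 and 8,000 GPU-hours a month; the real exercise is plugging in your own workload profile and contract terms.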

Should all AI workloads be treated the same way?

It’s important to remember that AI isn’t a zero-sum game. There’s often room for both cloud and dedicated infrastructure or something in between (hybrid). Instead, start by looking at the attributes of your applications and data, and invest the time upfront in understanding the specific technology requirements for the individual workloads in your environment and the desired business outcomes for each. Then seek out an architectural model that enables you to match the IT resource delivery model that fits each stage of your AI development journey.

Which cloud model will enable you to deploy AI at scale?

In the land of AI model training, fresh data must be regularly fed into the compute stack to improve the prediction capabilities of the AI applications it supports. As such, the proximity of compute to data repositories has increasingly become an important selection criterion. Of course, not all workloads will require dedicated, persistent high-bandwidth connectivity. But for those that do, undue network latency can severely hamper their potential. Beyond performance issues, there is a growing number of data privacy regulations that dictate how and where certain data can be accessed and processed. These regulations should also be part of the cloud model decision process.

The public cloud has been essential in bringing AI into the mainstream. But that doesn’t mean it makes sense for every AI application to run in the public cloud. Investing the time and resources at the outset of your AI project to determine the right cloud model will go a long way towards hedging against AI project failure.

Holland Barry is SVP and field CTO at Cyxtera.
 
  • Like
  • Fire
Reactions: 7 users
And now for a serious and hopefully intelligent post.

I let Blind Freddie do the maths, so it is more reliable than my usual efforts. Assuming the following is correct, Brainchip is unlikely to be removed from the ASX200.

During the period 23.5.22 to 24.11.22 its average share price has been 91.6 cents, giving it an average market capitalisation of $1.58 billion.
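For anyone who wants to check Blind Freddie's arithmetic, those two figures jointly imply a share count of roughly 1.72 billion; the quick sketch below simply reverses the calculation (the share count is implied from the numbers above, not taken from a company filing, and the 60-cent line is just an example price).

AVG_PRICE_AUD = 0.916          # average share price, 23.5.22 to 24.11.22 (from above)
AVG_MARKET_CAP_AUD = 1.58e9    # average market capitalisation quoted above

implied_shares = AVG_MARKET_CAP_AUD / AVG_PRICE_AUD
print(f"implied shares on issue: {implied_shares / 1e9:.2f} billion")

# market capitalisation at any other price p is then simply p * implied_shares
print(f"market cap at $0.60:     ${0.60 * implied_shares / 1e9:.2f} billion")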

During the same period, and currently, based on the 4Cs and the half-yearly report, it has absolutely no liquidity issues and has a long cash runway.

So the FACTS, not emotion, support Brainchip remaining in the ASX200.

My opinion only DYOR
FF

AKIDA BALLISTA
For those who were actually concerned that BRN would be removed from the ASX200 in today's announcement, I have brought forward my earlier post in which I stated this was unlikely and set out the science against it occurring.

Not to boast, because there is nothing to boast about when you do your research and understand how your investment will be affected by the operation of the ASX Rules. It is the minimum you should do to protect your investment.

I am bringing it forward to put the rest of the 1,000 Eyes on notice that some among us may have motives other than promoting the interests of genuine shareholders in this great company.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 57 users

Teach22

Regular
Hi FF.

I know you have been quite strong in your opinion on how much weight an ASX announcement carries, such as Ford being an EAP: if they were no longer an EAP, we would need to be advised of that through another price-sensitive announcement. Otherwise it would be considered deception by Brainchip and potentially legal action could be taken against them. Forgive me if I have put words in your mouth, but I recall you saying something to that effect.

In saying that, a statement such as “(Akida) WILL be the computational foundation of smart sensing in all edge devices” is extremely strong in its language and, I assume, extremely intentional. How much weight does a statement like this hold? If we do not become that foundation (or things change and it looks like we won't), are there any grounds for legal action? If they no longer believe we will become this foundation at some point, should they be advising us of the change in their opinion? Because based on that statement, investing in Brainchip is a (pardon the pun) no-brainer!

For as long as I’ve been in the market, CEOs have been making statements like this.
It's why the disclaimer about forward-looking statements is there.
 
  • Like
  • Fire
Reactions: 8 users

dippY22

Regular
Another opening at the Brainchip marketing team:

"Come join the company leading the technological revolution in artificial intelligence. BrainChip is a global technology company producing a groundbreaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products.

We are the world’s first commercial producer of ultra-low-power and high-performance artificial intelligence technology processors that enable a wide array of applications such as self-driving cars, hearing aids, drones, and agricultural equipment. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry-standard digital process.

Our company was recognized as one of the “Startups Worth Watching in 2021” in EE Times’ annual Silicon 100 list of global semiconductor technologies and our founder was named the winner of the AI Hardware 2021 Innovator Award. We have offices in Laguna Hills, California; Toulouse, France; Hyderabad, India; and Perth, Australia. We are also publicly traded on the Australian Securities Exchange (BRN:ASX) and the OTC Market (BRCHF).

Job Title: Corporate Marketing Manager

Reports To: Chief Marketing Officer

Department: Marketing

SUMMARY:

A principal goal for the Corporate Marketing Manager is to develop, implement, and execute marketing programs for BrainChip, Inc. to improve market positioning, attract potential customers, and increase the value of existing ones.

ESSENTIAL JOB DUTIES AND RESPONSIBILITIES:

  • Finalize the completion and integration of BrainChip rebrand, ensuring that brand-compliant assets are integrated throughout the business
  • Manage the re-build of the company’s website based on rebrand, leveraging existing site infrastructure
  • Integrate site measurement and analytics
  • Integrate and optimize demand to lead generation programs
  • Manage content calendar for PR, whitepapers, blogs, social
  • Presentation and collateral development
  • Develop and execute marketing processes and deliverables
  • Collaborate with, and manage media organizations and advertising agencies
  • Work closely with sales and product delivery team to optimize sales enablement programs
  • Analyse data to evaluate the success of their marketing efforts and come up with new ideas to improve brand marketing and exposure
  • Travel, as required to Laguna office (20%). Travel to events and customers as needed (<5%)
QUALIFICATIONS:

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Education/Experience:

  • Bachelor of Science in Marketing or Business Administration. Master’s degree preferred.
  • At least 5 Years’ extensive knowledge of and experience with various marketing strategies and digital tools used to implement a campaign
  • Minimum of 5 Years’ experience developing marketing strategies for companies and products
Other Skills and Abilities:
  • Proven ability to manage complex marketing projects
  • Proven ability to build and maintain relationships with department heads and outside agencies
  • Excellent analytical skills to forecast and identify marketing trends and challenges
  • Excellent communication and presentation skills
  • Professional judgement and discretion
  • Experience leading creative development and campaign creation
  • Ability to effectively prioritize and manage multiple tasks to meet deadlines"

I went to the Brainchip Holdings website just now and did not see this position posted. I did see the longstanding "Regional Sales Manager" opening, which reports to the VP of worldwide sales. So where did you find this posting for a Corporate Marketing Manager? Regards, dippY
 
  • Like
Reactions: 1 users

cosors

👀
I wish you all a lovely weekend and hereby raise a toast to you. Let's see if it's good. Cheers!
[attached photo]
 
  • Like
  • Love
  • Fire
Reactions: 23 users

Evermont

Stealth Mode
I went to the Brainchip Holdings website just now and did not see this position posted. I did see the longstanding "Regional Sales Manager" opening, which reports to the VP of worldwide sales. So where did you find this posting for a Corporate Marketing Manager? Regards, dippY

 
  • Like
Reactions: 12 users

cosors

👀
I wish you all a lovely weekend and hereby raise a toast to you. Let's see if it's good. Cheers!
[attached photo]
Wow! You really know how to make wine. And only from this one vine. Wow again. Even the French can't do that with their Syrah (aka Shiraz), as far as I know. Hats off!
Small Pot 😅
 
  • Like
  • Love
  • Fire
Reactions: 14 users

Rskiff

Regular
Here is Rob Telson
 
  • Like
  • Love
  • Fire
Reactions: 32 users