BRN Discussion Ongoing


Translated from Korean to English with Google Translate.
Their SNN is also being compared against GPT-2.

Through this, it was possible to reduce the parameters of the large GPT-2 model from 708 million to 191 million, and the parameters of the T5 model used for translation from 402 million to 76 million. This compression reduced the power consumed by loading language model parameters from external memory by 70%. According to the researchers, the complementary transformer consumes 1/625th the power of an NVIDIA A100 GPU, while enabling high-speed operation of 0.4 seconds for language generation using the GPT-2 model and 0.2 seconds for language translation using the T5 model. Because of the parameter lightweighting, language generation accuracy worsened by 1.2 in perplexity (a branching-factor metric where lower means the language model was trained better), but the researchers explained that it is at a level where people will not find the generated sentences awkward to read. Going forward, the research team plans to expand neuromorphic computing to various application fields rather than limiting it to language models.
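The compression figures quoted above can be sanity-checked with simple arithmetic (a quick sketch using only the numbers reported in the article):

```python
# Back-of-the-envelope check of the compression figures quoted in the article.

gpt2_before, gpt2_after = 708e6, 191e6   # GPT-2 parameters before/after compression
t5_before, t5_after = 402e6, 76e6        # T5 parameters before/after compression

gpt2_reduction = 1 - gpt2_after / gpt2_before
t5_reduction = 1 - t5_after / t5_before

print(f"GPT-2 parameters cut by {gpt2_reduction:.0%}")  # ~73%
print(f"T5 parameters cut by {t5_reduction:.0%}")       # ~81%

# The claimed power ratio versus an NVIDIA A100:
a100_ratio = 1 / 625
print(f"Claimed power draw: {a100_ratio:.2%} of an A100")  # 0.16%
```

So the compressed GPT-2 keeps roughly 27% of the original parameters, which is consistent with the "just over a quarter" point made below.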
Sounds like a bit of a stretch, saying it's running GPT-2, when it's actually a compressed model of it, with just over a quarter of the parameters...

Would be more accurate, I think, to say it was running an SLM "derived" from GPT-2..
 
  • Like
Reactions: 2 users

rgupta

Regular
It definitely is, a tangled web out there Rgupta..

That's what makes the dot joining Fun, Frustrating and potentially Fruitless, all at the same time..
That is a screenshot of Dr Tony Lewis at the AGM, where Akida was compared with ChatGPT-2 and it was claimed our model is 5,000 times less power hungry and can run on the edge.
And then team BrainChip raised the last CR just to work on LLMs for neuromorphic chips.
I do not know why everyone is comparing with ChatGPT-2 while we already have ChatGPT-4 plus.
But again, for dot joining, yes, Dr Tony Lewis was saying almost the same thing.
 

Attachments

  • Screenshot_20240904-223843.png (1.1 MB)
  • Like
Reactions: 1 users
That is a screenshot of Dr Tony Lewis at the AGM, where Akida was compared with ChatGPT-2 and it was claimed our model is 5,000 times less power hungry and can run on the edge.
And then team BrainChip raised the last CR just to work on LLMs for neuromorphic chips.
I do not know why everyone is comparing with ChatGPT-2 while we already have ChatGPT-4 plus.
But again, for dot joining, yes, Dr Tony Lewis was saying almost the same thing.
It's to do with the number of parameters Rgupta.

ChatGPT4 is huge..


I was always told size didn't matter 😔..

20240904_221756.jpg



No chance at all of running anything like ChatGPT-4 outside of a data center with current technology.

Interestingly, this shows over twice the parameters for the full GPT-2 that KAIST claims it has (and they are running with just over a quarter of that).
 
Last edited:
  • Like
  • Haha
Reactions: 8 users

stockduck

Regular

"....
Being a business-centric device, Dell offers several security features on the Latitude 5455 including SafeBIOS, IR biometric login, optional fingerprint reader in the power button, camera privacy shutter, and optional security hardware authentication bundles.

Customers can also configure additional hardware security measures including a chassis intrusion switch, hard drive wipes, and tamper-evident packaging along with software measures such as Crowdstrike (ahem!), Secureworks, and Netskope.

The Latitude 5455 uses Qualcomm's FastConnect 7800 Wi-Fi 7 WLAN card with Bluetooth 5.4 for fast wireless networking. Additionally, users also get 2x USB4 Type-C 40 Gbps ports, 1x USB 3.2 Gen1 Type-A with Power Share, a microSD card reader, and a combo audio jack."


Well,..... that sounds interesting to me...what the hell is Dell doing?:rolleyes:
But there is no specific pointer to BrainChip IP, is there?

Here is a description in German...:


"...Dell is launching a new notebook with a computing chip from Qualcomm. With it, the company is likely aiming at customers who are looking for a notebook for more general tasks and who would like to benefit from a long battery life.
....
Users should benefit from various options, such as a search function with input in natural language or a special suppression of disturbing ambient noise.
....."

Translated with Google Translate.
 
  • Like
  • Fire
Reactions: 5 users

rgupta

Regular
It's to do with the number of parameters Rgupta.

ChatGPT4 is huge..


I was always told size didn't matter 😔..

View attachment 68940


No chance at all of running anything like ChatGPT-4 outside of a data center with current technology.

Interestingly, this shows over twice the parameters for the full GPT-2 that KAIST claims it has (and they are running with just over a quarter of that).
There is no doubt it is almost impossible to run a 1-trillion-parameter model on the edge. But do we need all trillion parameters all the time? It is just like having 10 wardrobes full of clothes when I can only wear a few items at a given time. Which raises the question of what is important in a given situation, in a given industry, etc.
There are a lot of instances where we may not need the full ChatGPT, and that is why industry-specific small language models are expected to be the future.
 
  • Like
Reactions: 5 users

Frangipani

Regular


498B4710-D0D6-45C3-98D3-52DF08175FE2.jpeg


 
  • Like
  • Love
  • Fire
Reactions: 39 users
  • Like
  • Fire
  • Love
Reactions: 26 users

Cgc516

Regular
The best we can do is send the shorters down to hell!
IMG_4964.jpeg
IMG_4965.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 22 users

keyeat

Regular
It's to do with the number of parameters Rgupta.

ChatGPT4 is huge..


I was always told size didn't matter 😔..

View attachment 68940


No chance at all of running anything like ChatGPT-4 outside of a data center with current technology.

Interestingly, this shows over twice the parameters for the full GPT-2 that KAIST claims it has (and they are running with just over a quarter of that).
"Bigger isn’t always better: How hybrid AI pattern enables smaller language models"

Why do you need a model that can tell you about Shakespeare when its main function is to summarise input data / find trends / forecast...?
 
  • Like
Reactions: 4 users

Labsy

Regular
What’s your final amount you are going for labsy?
Hey man, I'm just gobbling up small parcels sub 16c. I don't really have an end goal... Haven't sold a share since COVID. I suspect insto traders are passing large bundles to and fro and making pennies on the dollar with their algos. Possibly some retail traders also trading large amounts daily too. I'm just picking up when I can. Just in top 100 - 150 ATM... Comfortable here but actively accumulating, so who knows brother. ;)
 
  • Like
  • Fire
  • Love
Reactions: 22 users

Shadow59

Regular
Hey man, I'm just gobbling up small parcels sub 16c. I don't really have an end goal... Haven't sold a share since COVID. I suspect insto traders are passing large bundles to and fro and making pennies on the dollar with their algos. Possibly some retail traders also trading large amounts daily too. I'm just picking up when I can. Just in top 100 - 150 ATM... Comfortable here but actively accumulating, so who knows brother. ;)
I've been slowly accumulating too. Roughly what number of shares would put you in the top 150 or 100? I now have way more than I originally intended.
 
  • Like
Reactions: 7 users

wilzy123

Founding Member
I do not know why everyone is comparing with ChatGPT-2 while we already have ChatGPT-4 plus

Completely different models... completely different power/resource requirements... completely different applications. BrainChip are simply demonstrating Akida's capability and strengths in the large language model (language-based applications) space, using GPT-2, a small language model, as an example. It's not reasonable to compare GPT-2 and GPT-4 simply on the basis of how recently they were created - that is largely irrelevant.
 
  • Like
  • Love
Reactions: 10 users

Taproot

Regular
I've been slowly accumulating too. Roughly what number of shares would put you in the top 150 or 100? I now have way more than I originally intended.
The last time someone put up a list it was:

300: 525K
250: 625K
200: 800K
150: 1M
100: 1.3M


Might be a little outdated, but should give you a rough guide.
 
  • Like
  • Fire
  • Haha
Reactions: 18 users

MrNick

Regular
The UK is promoting this... should we? Shouldn't we? Seems tailor-made to park some 'brains' in this new Centre.

 
Last edited:
  • Like
  • Thinking
  • Love
Reactions: 11 users
Hey man, I'm just gobbling up small parcels sub 16c. I don't really have an end goal... Haven't sold a share since COVID. I suspect insto traders are passing large bundles to and fro and making pennies on the dollar with their algos. Possibly some retail traders also trading large amounts daily too. I'm just picking up when I can. Just in top 100 - 150 ATM... Comfortable here but actively accumulating, so who knows brother. ;)
Cool same same
But I have lost track of where I sit on the ladder
I was 190 at one stage
But gobbled up more so who knows
 
  • Fire
  • Like
Reactions: 3 users
The last time someone put up a list it was:

300: 525K
250: 625K
200: 800K
150: 1M
100: 1.3M


Might be a little outdated, but should give you a rough guide.
Well, I am then somewhere between 150 and 100.
I think I am happy with that.
 
  • Like
  • Fire
  • Wow
Reactions: 5 users

Labsy

Regular
Cool same same
But I have lost track of where I sit on the ladder
I was 190 at one stage
But gobbled up more so who knows
Awesome! 👍 Let's do this!!!
Next list we make is Australia's richest 200...🚀🚀
 
  • Like
  • Fire
  • Love
Reactions: 15 users
Awesome! 👍 Let's do this!!!
Next list we make is Australia's richest 200...🚀🚀
Definitely not an easy path
But hopefully it will be worth the wait and stress
 
  • Like
  • Fire
Reactions: 6 users

CHIPS

Regular
  • Like
  • Fire
  • Love
Reactions: 13 users

Gazzafish

Regular
Didn’t I read somewhere that we were involved with Siemens 😁. Only guessing but interesting 🙏


Extract: “
30 August 2024

Leveraging AI for Predictive Maintenance: The Future of Industrial Efficiency

In today’s rapidly evolving industrial landscape, the integration of artificial intelligence (AI) into maintenance operations has transformed the way we predict and prevent equipment failures.
The primary objective of AI in this context is to make the unpredictable predictable, streamlining processes that are otherwise difficult or time-consuming for humans to manage. Moreover, AI helps safeguard against the costly consequences of equipment failure, ensuring minimal disruption in operations.

The Role of Algorithms in Predictive Maintenance

AI-driven predictive maintenance relies on sophisticated algorithms that continuously monitor, analyze, and predict the condition of machinery. These algorithms process vast amounts of data from sensors and other sources to detect patterns that might indicate an impending failure. The insights gained from this data are then turned into actionable steps that maintenance teams can take to prevent downtime and optimize performance.
These AI systems work tirelessly behind the scenes, ensuring that facilities run smoothly with minimal interruptions. Whether it’s reducing unexpected downtime or fine-tuning the performance of machinery, AI’s potential to enhance operational efficiency is immense.
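As a rough illustration of the kind of algorithm described above (a minimal sketch, not any vendor's actual implementation), a rolling z-score over a sensor stream can flag readings that drift far from recent behaviour:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates from the rolling-window mean
    by more than `threshold` standard deviations."""
    history = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(readings):
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append(i)
        history.append(value)
    return flagged

# Simulated vibration readings with an injected spike at index 30
data = [1.0 + 0.01 * (i % 5) for i in range(60)]
data[30] = 5.0
print(detect_anomalies(data))  # → [30]
```

Real systems layer far more sophistication on top (seasonality, multivariate models, learned baselines), but the core idea is the same: turn a stream of sensor values into a short list of actionable alerts.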

The Evolution of Predictive Maintenance

Predictive maintenance is not a new concept, but it has evolved significantly over time. Traditionally, maintenance was a reactive process, where issues were addressed only after they occurred. However, with the advent of AI, predictive maintenance has become more proactive and data-driven.
The roots of predictive maintenance can be traced back to the early days of machinery, where experienced operators would rely on their senses to detect anomalies. Today, AI enhances this process by using sensors and IoT devices to continuously monitor equipment. This data is then processed using advanced algorithms to predict potential failures long before they happen.
In industries like aerospace, predictive maintenance has been a game-changer. The ability to monitor and predict the condition of complex machines has allowed companies to offer these machines as a service, ensuring reliability and efficiency. As AI continues to evolve, other industries are beginning to adopt these practices, learning from the successes and challenges faced by early adopters.

Challenges in Implementing Predictive Maintenance

Despite its potential, implementing AI-driven predictive maintenance comes with its own set of challenges. One of the primary hurdles is the scalability of these solutions. While many companies can implement predictive maintenance on a small scale, extending it to an entire enterprise with thousands of machines is a different story. This requires systems that are not only scalable but also vendor-agnostic and automated.
Another significant challenge is data management. Predictive maintenance relies on vast amounts of data, which must be collected, analyzed, and interpreted accurately. Many companies already have valuable data from their existing industrial control systems, but integrating this with new AI tools can be complex. Moreover, it’s crucial to combine machine data with human insights—information from maintenance staff on past repairs and interventions—to create a complete picture of machine health.
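To make that last point concrete (a hypothetical sketch; the machine names and record layout are invented for illustration), combining machine data with human maintenance records can be as simple as joining the two sources on a machine identifier:

```python
# Hypothetical example: attach each machine's repair history to its open alerts.

sensor_alerts = [
    {"machine": "press-01", "metric": "vibration", "severity": "high"},
    {"machine": "pump-07", "metric": "temperature", "severity": "medium"},
]

maintenance_log = {
    "press-01": ["2024-03-12 bearing replaced", "2024-07-02 belt tensioned"],
    "pump-07": [],
}

def machine_health_report(alerts, log):
    """Join live sensor alerts with past human interventions so maintenance
    staff see the complete picture of machine health in one place."""
    return [{**alert, "history": log.get(alert["machine"], [])} for alert in alerts]

for entry in machine_health_report(sensor_alerts, maintenance_log):
    print(entry["machine"], entry["severity"], f"({len(entry['history'])} past repairs)")
```

In production this join would run against a historian database and a CMMS rather than in-memory dicts, but the principle, enriching machine signals with human context, is the same.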

The Value of AI in Predictive Maintenance

The value of AI in predictive maintenance is immense, offering both cost avoidance and cost savings. For example, in high-stakes industries like automotive manufacturing, where downtime can cost millions of dollars per hour, the ability to prevent unexpected failures is invaluable.
Moreover, AI can help companies optimize their maintenance schedules, reducing the need for routine inspections and repairs. By focusing on predictive maintenance, organizations can achieve a full return on investment in as little as six months. This rapid ROI is a testament to the effectiveness of AI in reducing costs and improving operational efficiency.

Getting Started with AI-Driven Predictive Maintenance

For companies looking to embark on the journey of AI-driven predictive maintenance, the first steps involve understanding the specific needs of their operations and gathering relevant data. Engaging with experts in the field and learning from existing case studies can provide valuable insights into the potential benefits and challenges.
It’s also essential to think big—start with a small-scale implementation but have a plan for scaling up across the entire enterprise. Success in predictive maintenance requires not just the right technology but also a cultural shift within the organization. Maintenance teams must be trained and supported as they adapt to new tools and processes.
AI-driven predictive maintenance represents a significant leap forward in industrial efficiency. By harnessing the power of AI, companies can not only prevent costly equipment failures but also optimize their entire maintenance strategy. While challenges exist, the potential benefits make it a worthwhile investment for any organization looking to improve its operations and reduce costs.
As AI continues to advance, its role in predictive maintenance will only grow, helping industries of all kinds to achieve new levels of reliability, efficiency, and profitability.”
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 15 users