BRN Discussion Ongoing

On Saturday, they are releasing a commercially usable, downloadable version of a complete ChatGPT-style chat model. They had to move some components over to an Apache licence. It is optimized for the M1 chip but runs on everything. Any kid with a laptop can grab this entire stack, and he explains and documents it all.

Within 10 minutes, most people can download and run this thing. Next is text-to-video.

I thought this information was absolutely fascinating. ☄️☄️☄️
Hi Stuart,

I think this is a great find, though I don't think many others on here have understood its significance.

I just got around to watching the video you linked. As you say, they have essentially trained a chatbot similar to ChatGPT, which they call GPT4All. They took Meta's large language model (similar to GPT-3), available for research purposes, then fine-tuned it on ChatGPT prompts. In short, they created a smaller chatbot model that can run on someone's laptop and which, notably, is a 4-bit model.
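That 4-bit detail is the key to running on a laptop: each weight is stored in 4 bits instead of 16 or 32, cutting the memory footprint roughly 4-8x at a small accuracy cost. Here is a minimal sketch of the idea, assuming simple per-tensor min/max quantization (real schemes such as the block-wise ones used by llama.cpp are more refined, so this is illustrative only):

```python
import numpy as np

def quantize_4bit(weights):
    # Map float weights onto 16 integer levels (4 bits) via a per-tensor scale.
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 15.0          # 2**4 - 1 = 15 steps
    q = np.round((weights - w_min) / scale).astype(np.uint8)  # codes 0..15
    return q, scale, w_min

def dequantize_4bit(q, scale, w_min):
    # Reconstruct approximate float weights from the 4-bit codes.
    return q.astype(np.float32) * scale + w_min

w = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
q, scale, w_min = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale, w_min)
# Worst-case rounding error per weight is half a step, i.e. scale / 2.
```

A 7B-parameter model stored at 4 bits per weight needs roughly 3.5 GB, which is why these models squeeze into ordinary laptop RAM.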

Now, the limitation with using Meta's model is that it can't legally be used for commercial purposes. So in a few days' time they'll be releasing another small model trained on a large open-source dataset, which can be used commercially.

Unlike the big companies, they are releasing these models for everyone to use. Not only that, but they're releasing the instructions so that anyone can create their own models that can be used for commercial purposes.

This is just the start of almost anyone being able to make their own chatbot models for business opportunities. And as the guy says in the video, the chatbot they produced could potentially run on something as small as a Raspberry Pi. Others in the machine-learning community will eventually improve on these models and make them smaller and better, so hobbyists and the like will now have a way of making their own versions of this.

There's a good chance that big companies are already talking to Brainchip about using Akida for these purposes. But what this open sourcing of chatbots will do is enable more custom and innovative AI applications, and that will continue to grow. Since it can be done for less than $1,000, a lot of people will be able to create their own chatbots; it will no longer be limited to companies with millions of dollars.

If you were going to run a small chatbot on something like a Raspberry Pi, you'd ideally want an efficient, high-performance machine-learning chip so it wouldn't run too slowly. This is where I think Akida is in the right place at the right time.

Pure speculation, DYOR
 
  • Like
  • Fire
  • Love
Reactions: 34 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


Rob “likes” something from someone partnered with Qualcomm... I wonder what that means? 🤓
 
  • Like
Reactions: 17 users

Diogenese

Top 20
Rob “likes” something from someone partnered with Qualcomm... I wonder what that means? 🤓
I'm blown away by your by-line.
 
  • Like
  • Haha
Reactions: 7 users
Found this article below interesting regarding the future direction of Defense. Autonomous air to air refuelling from Airbus Defence and its subsidiary UpNext.
https://apple.news/AjxCBVG2_TiOgx-6EM0My2w

It mentions the system using augmented GPS, LiDAR and AI algorithms, "specifically by a computer where the artificial intelligence and cooperative control algorithms were running."

and later this year
“exploring the use of navigation sensors based on artificial intelligence and enhanced algorithms for autonomous formation flight.”

Also, at the beginning of the article, the 'Auto'Mate demonstrator technology': leveraging artificial intelligence and machine learning, the Auto'Mate system allowed the A310 to autonomously command the DT-25 target drones that were used for the tests.

Also seems they are using Luminar for the LiDAR.
 
  • Like
  • Fire
Reactions: 17 users
D

Deleted member 118

Guest
March 7, 2023
Products/Services / Press Releases
Expanded support menu and two new colours added for general sale
General purchase reservations open for "NICOBO", a robot that makes you smile
Panasonic Entertainment & Communication Co., Ltd. (hereinafter, Panasonic) has decided to sell "NICOBO", which was previously provided only to supporters who backed its crowdfunding campaign, and today began accepting purchase reservations. General purchases will be available on the official NICOBO website from May 16th.

Nikobo is a "weak robot" born from a project proposed by an employee who wanted to provide value in the form of "richness of mind." Unlike robots that perform tasks in place of humans, Nikobo does nothing itself; instead it amplifies the kindness and smiles of those around it, and Panasonic proposes it as a new, happier way for robots and humans to interact.

Along with this general sale, two new colours, smoke navy and shell pink, have been added alongside the existing stone grey Nikobo, aiming to reach a wider range of people with colours that blend well with interiors. Living with Nikobo requires purchasing the main unit plus a monthly fee, which allows Nikobo to adapt to life with its owner and continue to evolve. Purchase reservations are accepted on the official NICOBO website, and units will ship to those with reservations when sales begin in mid-May. Alongside the release, the support menu has also been strengthened so owners can live with Nikobo with peace of mind: a NICOBO CLINIC offers a health-check service and a knit-exchange service to swap Nikobo's knitwear for new, and a care plan offers discounts on NICOBO CLINIC services such as treatment costs when "hospitalisation" is required.

Panasonic's technology underpins the "weak robot" concept behind Nikobo, including noise-reduction technology for the voice recognition that is essential for communication, and information links to a smartphone application. Through the commercialisation of Nikobo, Panasonic will accelerate its efforts to create new value in "impression and comfort."

Sorry can’t translate, but up for presale now.

 
  • Like
Reactions: 5 users

Boab

I wish I could paint like Vincent
Sorry can’t translate, but up for presale now.

You can opt for the English version.
 
  • Like
  • Haha
Reactions: 3 users
D

Deleted member 118

Guest
  • Haha
  • Like
Reactions: 3 users

IloveLamp

Top 20

🤔🤔🤔

Image capture is done through a very advanced CMOS sensor and the rendering process is done at least 911 times faster compared to current consumer GPUs. A demo video shows that MetaVRain can render a model at 32.8 fps, while an RTX 2080 GPU renders only less than 3% of a frame in one second. The superior speeds further reduce the energy requirements for a single rendered frame by 26,400 times (133 mW) compared to consumer GPUs. Such specs enable real-time rendering for VR/AR applications using just a smartphone.
 
  • Like
  • Fire
  • Thinking
Reactions: 17 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 46 users

IloveLamp

Top 20
Gates has been a big believer of artificial intelligence and its related futuristic technologies, and recently declared that the “age of A.I. is here.” He said he thought OpenAI’s chatbot tool ChatGPT, released last year to great fanfare, was revolutionary and had the potential to effect far-reaching change in healthcare and education.

“Entire industries will reorient around it. Businesses will distinguish themselves by how well they use it,” Gates wrote of A.I. in a blog post earlier this month.

https://fortune.com/2023/03/31/bill-gates-autonomous-vehicle-self-driving-car-tipping-point/
 
  • Like
  • Fire
  • Thinking
Reactions: 11 users

GStocks123

Regular
  • Like
  • Fire
Reactions: 18 users

Makeme 2020

Regular

Way To Get Serious About Vlogging​

Couple a full-frame sensor with easy-to-use AI features and you get the most approachable camera for content creators, the ZV-E1.
BY JACKSON CHEN
MARCH 31, 2023
Sony ZV-E1 compact vlogging camera

Sony
Everybody’s a content creator these days, but not everyone’s content is created equal. Sony, for its part, wants to make it a lot easier for anyone trying to take vlogging or content creation more seriously. Its latest compact camera, the ZV-E1, comes out of its ZV lineup that specifically caters to content creators and vloggers. Notably, the ZV-E1 is the first in the lineup to offer a full-frame sensor.

Larger sensor aside, Sony packed a ton of easy-to-use features into the ZV-E1 that make getting into vlogging seem a lot less daunting. This latest compact camera is aimed at those looking to step up their vlogging game, but with a straightforward camera setup that still feels familiar. It’s not as high quality as some of Sony’s other flagship cameras, but the ZV-E1 certainly beats recording on a smartphone.
Sony ZV-E1 compact vlogging camera

The ZV-E1 will come in white and black.
SONY

PRO SENSOR, BEGINNER-FRIENDLY​

The ZV-E1 uses the same full-frame 12-megapixel sensor as Sony’s a7S III and FX3, which are much more expensive options for taking video. The compact ZV-E1 can equally shoot in 4K video at 60 fps in 10-bit 4:2:2 and you can get up to 120 fps when you shoot in 1080p resolution.
You’ll get more than 15 stops of dynamic range and a standard ISO range that goes up to 102,400 but can be expanded to 409,600 for both stills and video. The ZV-E1 has five-axis in-body stabilization for steady video, which you can further finetune with the several stabilization options in-camera.
Sony chose not to include an electronic viewfinder with the ZV-E1, so you’ll have to view everything through its three-inch LCD touchscreen. You can also set up the camera as a webcam by connecting it to a PC or smartphone.
Sony ZV-E1 compact vlogging camera

Like Sony’s other cameras, the ZV-E1 will have an articulating touchscreen.
SONY

GETTING IT ON THE FIRST TAKE​

Sony wanted to emphasize how easy it is to use the ZV-E1 by including a bunch of AI features that can be activated with a touch of a button. The new AI chip allows for a bunch of new features like multiple face recognition, auto framing, framing stabilizer, and auto microphone. All of these features were designed to make solo vlogging much more manageable. The ZV-E1 even has a real-time tracking feature that automatically keeps focus on the subject and auto framing, which crops the frame to keep the subject as the main focus of the shot.
If you want to add a little dramatic flair to your footage, the ZV-E1 can do that with its S-Cinetone feature. This feature gives your video a cinematic look in-camera and without post-processing, making your b-roll shots look good and easy to shoot.
Sony ZV-E1 compact vlogging camera

You can put any of your Sony E-mount lenses on the ZV-E1 or you can buy the 28-60mm lens that is bundled with the camera.
SONY

FLAGSHIP PRICES​

Sony is making the ZV-E1 available in black and white colorways in May 2023. If you want just the camera body itself, it’ll cost you $2,199.99. Sony is also planning to release a ZV-E1 kit that bundles a 28-60mm F4.5-5.6 zoom lens that will retail for $2,499.99.
It’s definitely a steep investment cost, but the full-frame sensor and the AI features warrant the price tag. As more people jump into the world of content creation, Sony is hoping to capture that demographic with its ZV-E1 that’s looking to be a go-to pick for many.
 
  • Like
  • Thinking
Reactions: 11 users
D

Deleted member 118

Guest

What an ugly camera and they look like they need a haircut
 
  • Like
  • Haha
Reactions: 8 users

Makeme 2020

Regular
Morning all..

Neuromorphic Computing Applications and Companies Creating Promising Hyper-Realistic Generative AI​

In developing AI models that can create images, videos, and other media that are virtually indistinguishable from real-world content, researchers and developers are exploring new applications that were once the stuff of science fiction. Let's look at some of the most promising neuromorphic computing applications currently in development and explore how they could transform the way we live, work, and play.

BrainChip​

One company at the forefront of this innovation is the Australian startup, BrainChip, which has developed a range of neuromorphic computing hardware and software solutions for edge devices.

At the heart of BrainChip's technology is the Akida Neuromorphic System-on-Chip (NSoC), which is specifically designed to perform pattern recognition and sensory processing tasks with high efficiency and low power consumption. This makes it an ideal solution for edge devices such as surveillance cameras and drones, which require advanced processing capabilities but have limited power and computational resources.

The Akida NSoC is based on a unique architecture that combines digital and analog processing elements, allowing it to perform complex computations while maintaining low power consumption. It also incorporates a spiking neural network (SNN) architecture, modelled on the behaviour of biological neurons and synapses.

BrainChip's technology has a range of potential applications in industries such as security and surveillance, as well as in autonomous vehicles and robotics. As a result of BrainChip's ability to perform advanced processing tasks with high efficiency and low power consumption, next-generation computing solutions are becoming more intelligent, efficient, and responsive.
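For anyone wondering what "spiking" actually means here: a spiking neuron accumulates input into a membrane potential that leaks over time, emits a discrete spike when the potential crosses a threshold, then resets. Computation only happens when spikes occur, which is where the low power consumption comes from. A toy leaky integrate-and-fire neuron, with made-up constants that have nothing to do with Akida's internals:

```python
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Toy leaky integrate-and-fire neuron.

    inputs: sequence of input currents, one per time step.
    Returns the list of time steps at which the neuron spiked.
    """
    v = 0.0          # membrane potential
    spikes = []
    for t, i_in in enumerate(inputs):
        v = leak * v + i_in      # leak toward rest, then integrate the input
        if v >= threshold:       # threshold crossing -> emit a spike
            spikes.append(t)
            v = 0.0              # reset after spiking
    return spikes
```

A burst of strong input makes the neuron fire, while a weak trickle leaks away and never does; that event-driven sparsity is exactly what SNN hardware exploits for efficiency.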
 
  • Like
  • Fire
  • Love
Reactions: 38 users

AARONASX

Holding onto what I've got
  • Haha
  • Like
Reactions: 6 users

Steve10

Regular
Chris from Ciovacco Capital has mentioned that the October 2022 market low & current rebound are very similar to the 1990 low & rebound. The charts have many similarities.

Also mentions that the post-Covid run should resume.

The markets had a big run from the October 1990 low until the March 2000 peak of dotcom bubble.

S&P500 went up x5.2 in approximately 9.5 years.

Nasdaq 100 went up x29.2 & Nasdaq Composite went up x15.4 in 9.5 years.

In Australia, market bottomed in January 1991 & peaked in February 2002.

The All Ordinaries went up x2.9 & ASX200 went up x2.8 in approximately 11 years.

Nasdaq had its biggest quarterly rise since the June 2020 quarter.

I compared the Nasdaq 100 rise from the March 2020 low following the Covid crash with the recent rise from the October 2022 low to determine the actual percentage rise at similar time frames following each low. The post-Covid rise, 1 quarter and 2 days after the March 2020 low, was +44.42%, while the recent rise from the October 2022 low was +23.4%, so the current recovery is rising at 52.7% of the rate of the post-Covid recovery. The US Fed is not pumping funds into the market now as it did post-Covid, so the market will rise more slowly.

I then compared the performance of XIJ & XTX to the Nasdaq & S&P500 post-Covid to determine whether the local XIJ & XTX indexes performed better or worse. I wanted to compare the ASX tech/growth sector to its US equivalent in lieu of using the All Ordinaries & ASX200.

From March 2020 low to peak performance was as follows:

Nasdaq went up x2.38

Nasdaq Comp went up x2.36

S&P500 went up x2.15

XIJ went up x2.85

XTX went up x2.8

Average of Nasdaq = x2.37 compared to average of XIJ/XTX = x2.825 thus +19.2% more upside locally with XIJ/XTX.

There is no data for XIJ/XTX from 1991 to calculate the actual performance to compare with US markets so I used post Covid performance to make comparisons. Post Covid data confirms the ASX tech/growth sectors perform better than US Nasdaq & Nasdaq Composite.
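The two headline ratios above can be checked directly from the numbers given in this post (a quick sanity check, nothing more):

```python
# Recovery pace: recent rise vs the post-Covid rise over a similar time frame.
post_covid_rise = 44.42   # % rise 1 quarter + 2 days after the March 2020 low
recent_rise = 23.4        # % rise over a similar span after the October 2022 low
recovery_rate = recent_rise / post_covid_rise * 100   # ~52.7%

# Relative upside: average local multiple vs average US multiple.
us_avg = (2.38 + 2.36) / 2       # Nasdaq 100 & Nasdaq Composite
local_avg = (2.85 + 2.80) / 2    # XIJ & XTX
extra_upside = (local_avg / us_avg - 1) * 100         # ~19.2%
```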

XIJ data is available from June 2001 which indicates it went up x4.15 from March 2003 to June 2007 peak in about 4.25 years following the dotcom bubble which is more than the x2.85 rise post Covid.

The big run in markets during 1990-2000 was due to dotcom bubble. The internet created the dotcom bubble.

I believe we may see something similar from 2022-2032 which will be referred to as the AI bubble. The emergence of ChatGPT recently has commenced the inflation of the AI sector. Nvidia is already trading at PE 159.43. Could get a x15-30 XIJ/XTX rise during the next decade & more for individual stocks.

BRN will most likely rise at least x60 during the next AI fuelled decade to circa $30 SP & $54B MC. Will most likely trade at PE 100+.

 
  • Like
  • Fire
  • Love
Reactions: 72 users

Tothemoon24

Top 20
Last edited:
  • Like
Reactions: 11 users

Dhm

Regular
Just watched the latest Ciovacco video. Last week's "Inflection Point" video was followed by a strong bullish move, still in its early stages, that most probably will continue. Ciovacco speaks in probabilities, not predictions, and the current move increases the probability of further bullish gains.



The S&P has broken to the upside, giving greater confidence to further moves.



Similar strong move with Semiconductors......




These aren't predictions, just improving odds of stronger moves to come.


 
  • Like
  • Love
  • Fire
Reactions: 20 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
They just destroyed my hope of buying more at a low price!

But if we get a slice of a billion here and a slice of a billion there, it wouldn't be too bad a start.
Re Valeo: so, the royalty on the orders worth 1 billion euros, using @Kachoo's suggested calculation below. Remember, this is for 2 contracts only.🥳







There has been a lot of speculation about whether we would get 10 cents or 30 cents royalty per product, but apparently, in the MegaChips deal, it is a percentage of the product's sales price on a sliding scale, ie, the more the customer sells, the smaller the percentage.

As an example, assume there is a high volume $10 product and a lower volume $100 product.

So, according to the sliding scale (set according to volume of sales), if it's 2% for the $10 product, we get 20 cents a product, whereas if it's 3% of a $100 product, we get $3 per product.

Remember, these are just example royalty rates and sales prices, not the real thing.
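That sliding-scale example can be written out as a tiny calculation. The tier boundaries and rates below are invented purely for illustration; the real MegaChips terms are not public:

```python
def royalty_per_unit(unit_price, units_sold, tiers):
    """Percentage-of-sale royalty with a volume-based sliding scale.

    tiers: list of (min_volume, rate) sorted by min_volume ascending;
    the higher the sales volume, the lower the applicable rate.
    """
    rate = tiers[0][1]
    for min_volume, tier_rate in tiers:
        if units_sold >= min_volume:
            rate = tier_rate
    return unit_price * rate

# Hypothetical tiers: 3% below 1M units sold, 2% at or above 1M units.
tiers = [(0, 0.03), (1_000_000, 0.02)]
high_volume = royalty_per_unit(10.0, 5_000_000, tiers)   # $0.20 per $10 product
low_volume = royalty_per_unit(100.0, 200_000, tiers)     # $3.00 per $100 product
```

This reproduces the worked example in the post: 2% of a high-volume $10 product gives 20 cents a unit, while 3% of a lower-volume $100 product gives $3 a unit.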
 
  • Like
  • Love
  • Fire
Reactions: 14 users

Townyj

Ermahgerd
  • Like
  • Haha
  • Fire
Reactions: 9 users