BRN Discussion Ongoing


Filobeddo

Guest
click the X to bugger off that giant ad

Haha No X on mine, you must have the deluxe

By the way, would like to see the other ex-defence ministers' disclosed holdings, Dutton, Pyne etc
 
  • Like
Reactions: 7 users
Haha No X on mine, you must have the deluxe

By the way, would like to see the other ex-defence ministers' disclosed holdings, Dutton, Pyne etc
 

Attachments

  • hey big spender.JPG
  • Like
  • Haha
Reactions: 3 users
(attached: mp.JPG)
 
  • Like
Reactions: 6 users

Filobeddo

Guest
Thanks

Wow, talk about tight, on her salary!
 
  • Like
  • Haha
Reactions: 5 users

JK200SX

Regular
In relation to the Citicorp Nominees topic, I've done some more investigation. On the surface it looks like they have taken large positions in many companies listed on the ASX:

MYER (screenshot attached)

RIO TINTO (screenshot attached)

JB HIFI (screenshot attached)

Wesfarmers (screenshot attached)

AGL (screenshot attached)

CBA (screenshot attached)

BRN (not current) (screenshot attached)


Additionally, HSBC, JP Morgan, BNP Paribas Nominees, and other "nominee" companies have taken up positions too in all of the examples shown above. Knowing this now, I can perhaps conclude that these types of investments are collective (i.e. SMSFs, managed funds, individual investors etc.), which is perhaps a good thing?
 

  • Like
  • Fire
  • Love
Reactions: 55 users

HopalongPetrovski

I'm Spartacus!
JK200SX said:
In relation to the Citicorp Nominees topic, I've done some more investigation. On the surface it looks like they have taken large positions in many companies listed on the ASX: MYER, RIO TINTO, JB HIFI, Wesfarmers, AGL, CBA and BRN (not current)...
(image attached)

Arise fellow Buff's!
It's even worse than we thought.
They're into everythink.
The rot go's deep and we must DO..................................... somethin'???
 
  • Haha
  • Like
Reactions: 6 users

Deleted member 118

Guest
JK200SX said:
In relation to the Citicorp Nominees topic, I've done some more investigation. On the surface it looks like they have taken large positions in many companies listed on the ASX: MYER, RIO TINTO, JB HIFI, Wesfarmers, AGL, CBA and BRN (not current)...


And are these all companies constantly attacked by the shorters on hot crapper?
 
  • Like
Reactions: 3 users

Xhosa12345

Regular
JK200SX said:
In relation to the Citicorp Nominees topic, I've done some more investigation. On the surface it looks like they have taken large positions in many companies listed on the ASX: MYER, RIO TINTO, JB HIFI, Wesfarmers, AGL, CBA and BRN (not current)...
(image attached)

Like I suggested....
 
  • Like
  • Haha
  • Love
Reactions: 8 users

Proga

Regular

Andrej Karpathy, Tesla Inc.’s top artificial-intelligence executive and an architect of its Autopilot self-driving system, is planning to depart the maker of electric vehicles.

The executive, who joined Tesla in 2017, announced his departure in a series of tweets on Wednesday. Karpathy led the computer-vision teams -- overseeing the technology at the core of Autopilot -- in addition to groups responsible for data labeling and deployment of the feature.
 
  • Like
  • Wow
  • Love
Reactions: 10 users

Diogenese

Top 20
Something I found that appears interesting but again not necessarily linked per se.

In some of the CNN2SNN documents we can also incorporate the AdaRound quantizer, which I believe is open source for anyone.

Appears it can be added as a calibration step after Akida does its stuff.

So, who owns / created AdaRound? ...Qualcomm.

Snip from a BRN doc and a short vid from Qualcomm. Appears to be a 2020 demo, but Qualcomm only uploaded it in Jan this year.

Maybe doesn't mean too much as a standalone, as AdaRound looks like it goes down to 8-bit where we go to 1-4, but from my basic understanding it assists tuning for us, is obviously compatible, and is probably a common bolt-on to different AI products?

The vid is interesting to see what AdaRound at 8-bit achieves, when I think, OK, now couple that with Akida first.

Spotted it from a July 22 post by Edge AI on Qualcomm AIMET, which can / does also use AdaRound.

Maybe @Diogenese could add some thoughts sometime?



(screenshots attached)

Looks like can't embed the vid here but link below.

Hi Fmf,

AIMET (AI Model Efficiency Toolkit) is used to compress the image model libraries used to provide the weights against which the input data (activations) is compared in Akida.

AIMET supports many features, such as Adaptive Rounding (AdaRound) and Channel Pruning, and the results speak for themselves. For example, AIMET’s data-free quantization (DFQ) algorithm quantizes 32-bit weights to 8-bits with negligible loss in accuracy. AIMET’s AdaRound provides state-of-the-art post-training quantization for 8-bit and 4-bit models, with accuracy very close to the original FP32 performance. AIMET’s spatial SVD plus channel pruning is another impressive example because it achieves a 50% MAC (multiply-accumulate) reduction while retaining accuracy within 1% of the original uncompressed model.

In May 2020, our Qualcomm Innovation Center (QuIC) open-sourced AIMET. This allows for collaboration with other ML researchers to continually improve model efficiency techniques that benefit the ML community.

There are many standard model libraries and many are in 32-bit format. AIMET can compress these to 8-bit format. AdaRound can optimize the compressed 8-bit format and 4-bit format models. Akida 1000 can utilize 1, 2, and 4-bit weights and activations.

BrainChip has developed some in-house model libraries adapted for use with Akida, but the production of model libraries is a very time-consuming task, so it is useful to be able to adapt existing model libraries for use with Akida.

Footnote: As I mentioned before, I think Akida could be adapted to 8-bits since moving from 1-bit to 4-bits already required a small MAC circuit. It would, however, require a lot more silicon footprint for the MAC circuit. The 8-bit configuration would use much more power and it would also be slower, so it would be used only where very high accuracy was required, remembering that " AIMET’s AdaRound provides state-of-the-art post-training quantization for 8-bit and 4-bit models, with accuracy very close to the original FP32 performance." Akida can be configured to use different bit levels between the different layers of the SNN.
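
To make the quantization step concrete, here is a minimal sketch in plain NumPy (it does not use the AIMET or Akida APIs, and the weight tensor is invented for illustration). It shows naive round-to-nearest uniform quantization of FP32 weights to n bits, which is the baseline that AdaRound improves on by learning, per weight, whether to round up or down so that the layer's output error, not just the weight error, is minimised.

```python
import numpy as np

def uniform_quantize(w, n_bits):
    """Round-to-nearest symmetric uniform quantization of a weight tensor.

    This is the naive baseline; AdaRound instead learns the up/down rounding
    decision for each weight to minimise the resulting layer output error.
    """
    levels = 2 ** (n_bits - 1) - 1          # e.g. 7 levels each side for 4-bit signed
    scale = np.max(np.abs(w)) / levels      # one scale factor per tensor
    q = np.clip(np.round(w / scale), -levels, levels)
    return q * scale                        # "fake-quantized" weights, back in float

rng = np.random.default_rng(0)
w_fp32 = rng.normal(size=(64, 64)).astype(np.float32)  # made-up layer weights

for bits in (8, 4, 2):
    err = np.mean((w_fp32 - uniform_quantize(w_fp32, bits)) ** 2)
    print(f"{bits}-bit round-to-nearest MSE: {err:.6f}")
```

Even this crude version shows why 8-bit is nearly lossless while 1-4 bit needs the smarter rounding and the per-layer bit choices discussed above.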
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 27 users

Doz

Regular
The below snapshot is a copy of yesterday's closing activity. We can clearly see the level of algorithmic trading, with the confirmed algo activity coloured yellow and the high likelihood of institutional unmarketable parcels coloured green. This clearly shows the limited buying and selling being conducted by retail shareholders, especially when you consider that the uncoloured will also include the odd pip trader. As Brainchip continues to conduct investor presentations globally, and with the knowledge of millions of shares being transferred into international holdings, in my opinion it is only a matter of time before demand starts to outstrip supply. GLTAH's


(attached: snapshot of yesterday's closing activity)
 
  • Like
  • Love
  • Fire
Reactions: 71 users
Diogenese said:
Hi Fmf,

AIMET (AI Model Efficiency Toolkit) is used to compress the image model libraries used to provide the weights against which the input data (activations) is compared in Akida...
Awesome thanks D...knew you'd expand on it.

If I understand then....

So essentially a bridging-type set-up that users can utilise to compress other, non-BRN libraries, making it easier to use those with Akida?

Expands our reach and use capabilities, so users aren't locked into only our much smaller (at this point) libraries.
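
If it helps to picture that end-to-end, a rough sketch of the Akida-side workflow is below. It assumes the legacy cnn2snn quantize/convert entry points and a stand-in Keras model; the exact function names, parameters and supported layers vary between releases, so treat it as illustrative rather than authoritative.

```python
# Illustrative only: follows the legacy cnn2snn API as I understand it,
# and names/parameters may differ in current MetaTF releases.
from tensorflow import keras
from cnn2snn import quantize, convert

# Stand-in for an existing pretrained 32-bit Keras model; in practice you
# would start from an Akida-compatible architecture.
model_fp32 = keras.applications.MobileNet(weights="imagenet")

# Post-training quantization, e.g. 4-bit weights and activations with
# 8-bit weights kept for the input layer.
model_q = quantize(model_fp32,
                   weight_quantization=4,
                   activ_quantization=4,
                   input_weight_quantization=8)

# Convert the quantized Keras model into one that runs on Akida.
model_akida = convert(model_q)
model_akida.summary()
```

A toolkit like AIMET would sit in front of this, shrinking a 32-bit library model to something low-bit-friendly before the Akida-specific quantize/convert step.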
 
  • Like
  • Fire
Reactions: 8 users

Diogenese

Top 20
I see Intellisense have a new solicitation proposal with NASA.

Unlike the NECR proposal, which is now Phase II, this one doesn't specify Akida; however, I've highlighted a section in red and, given their experience with us so far, you would have to expect the same path...no?




Proposal Information

Proposal Number: 22-1-H6.22-1780
Subtopic Title: Deep Neural Net and Neuromorphic Processors for In-Space Autonomy and Cognition
Proposal Title: Adaptive Deep Onboard Reinforcement Bidirectional Learning System

Small Business Concern

Firm: Intellisense Systems, Inc.
Address: 21041 South Western Avenue, Torrance, CA 90501
Phone: (310) 320-1827

Technical Abstract (Limit 2000 characters, approximately 200 words):
NASA is seeking innovative neuromorphic processing methods and tools to enable autonomous space operations on platforms constrained by size, weight, and power (SWaP). To address this need, Intellisense Systems, Inc. (Intellisense) proposes to develop an Adaptive Deep Onboard Reinforcement Bidirectional Learning (ADORBL) processor based on neuromorphic processing and its efficient implementation on neuromorphic computing hardware. Neuromorphic processors are a key enabler to the cognitive radio and image processing system architecture, which play a larger role in mitigating complexity and reducing autonomous operations costs as communications and control become complex. ADORBL is a low-SWaP neuromorphic processing solution consisting of multispectral and/or synthetic aperture radar (SAR) data acquisition and an onboard computer running the neural network algorithms. The implementation of artificial intelligence and machine learning enables ADORBL to choose processing configurations and adjust for impairments and failures. Due to its speed, energy efficiency, and higher performance for processing, ADORBL processes raw images, finds potential targets and thus allows for autonomous missions and can easily integrate into SWaP-constrained platforms in spacecraft and robotics to support NASA missions to establish a lunar presence, to visit asteroids, and to extend human reach to Mars. In Phase I, we will develop the CONOPS and key algorithms, integrate a Phase I ADORBL processing prototype to demonstrate its feasibility, and develop a Phase II plan with a path forward. In Phase II, ADORBL will be further matured, implemented on available commercial neuromorphic computing chips, and then integrated into a Phase II working prototype along with documentation and tools necessary for NASA to use the product and modify and use the software. The Phase II prototype will be tested and delivered to NASA to demonstrate for applications to CubeSat, SmallSat, and rover flights.




Potential NASA Applications (Limit 1500 characters, approximately 150 words):
ADORBL technology will have many NASA applications due to its low-SWaP and increased autonomy. It can be used to enable autonomous space operations beyond Low Earth Orbit to establish a lunar presence, visit asteroids, and extend human reach to Mars. ADORBL can be directly transitioned to the NASA Glenn Research Center to address the needs of the Cognitive Communications Project, the Human Exploration and Operations Mission Directorate (HEOMD) Space Communications and Navigation (SCaN) Program.




Potential Non-NASA Applications (Limit 1500 characters, approximately 150 words):
Commercial applications of ADORBL include remote sensing, geophysical and planetary surveying and prospecting, atmosphere, water, and land pollution monitoring, space flights and space exploration. Multispectral sensor data fusion can be used in aviation security and mine and explosives detection. Wider applications include machine vision, robotics, telemedicine, spectral medical imaging.




Duration: 6
Form Generated on 05/25/2022 15:44:17
 
  • Like
  • Fire
Reactions: 11 users

wilzy123

Founding Member
Doz said:
The below snapshot is a copy of yesterday's closing activity. We can clearly see the level of algorithmic trading, with the confirmed algo activity coloured yellow...
Nice one. Thanks :)
 
  • Like
Reactions: 13 users

Dang Son

Regular
JK200SX said:
In relation to the Citicorp Nominees topic, I've done some more investigation. On the surface it looks like they have taken large positions in many companies listed on the ASX: MYER, RIO TINTO, JB HIFI, Wesfarmers, AGL, CBA and BRN (not current)...
All these companies are having their SP manipulated by bots, but probably not only by Citicorp.
Looks like a takeover of Australian share equity, as provided for by current trading rules that are biased toward institutions.
 
Last edited:
  • Like
Reactions: 7 users

buena suerte :-)

BOB Bank of Brainchip
Doz said:
The below snapshot is a copy of yesterday's closing activity. We can clearly see the level of algorithmic trading, with the confirmed algo activity coloured yellow...
Wow... thanks for that Doz ... something to get my head around :)
 
  • Like
Reactions: 10 users

Doz

Regular
Just while I am banging on about the high level of algorithmic trading being conducted on BRN, the below image clearly shows that the same algo is being used on both sides of the ledger. I find this activity highly questionable; I would even go as far as claiming that it is manipulative and illegal, as it is in my opinion being conducted for a financial benefit. Maybe someone would like to raise the question with our authorities entrusted with maintaining a fair and equitable market. GLTAH's



(attached: Bot on both sides of ledger.png)
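
For anyone wanting to poke at this themselves, here is a small sketch of the kind of check described above: scan a course-of-trades extract for the same parcel size recurring on both the buy and the sell side. The data and column names below are invented for the example, and a recurring size on both sides is only a hint of algorithmic activity, not proof of manipulation.

```python
import pandas as pd

# Invented course-of-trades extract: parcel size in shares and reported side.
# Column names and values are made up for the example, not real BRN data.
trades = pd.DataFrame({
    "size": [57, 1200, 57, 57, 8400, 57, 57, 300, 57, 57],
    "side": ["B", "S", "S", "B", "B", "S", "B", "S", "S", "B"],
})

# Count how often each parcel size trades on each side of the ledger.
counts = pd.crosstab(trades["size"], trades["side"])

# Flag sizes that recur on BOTH the buy and sell side: the
# "same algo on both sides of the ledger" signature described above.
suspect = counts[(counts["B"] >= 3) & (counts["S"] >= 3)]
print(suspect)
```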
 
  • Like
  • Fire
  • Love
Reactions: 48 users