BRN Discussion Ongoing

HopalongPetrovski

I'm Spartacus!
Reactions: 7 users

Diogenese

Top 20
Your use of the term SPAC in connection with Brainchip in 2023, it having listed via a reverse takeover in 2015 to undertake fundamental research into neuromorphic computing, underpinned by worldwide patent portfolio protection dating from 2008, seems a very interesting approach.

“A special purpose acquisition company (SPAC) is a ‘blank check’ shell corporation designed to take companies public without going through the traditional IPO process.”

It becomes even more interesting when Brainchip officially declared it had moved from the research phase to being a company commercialising the IP which arose from this fundamental research, has in fact accounted for sales of that IP to two major semiconductor world players in Renesas and MegaChips, and has product coming to market in 2023 as a result from them as well as from Socionext, making your statement regarding income entirely misleading.

I have also read @chapman89 ’s posts, and it is also interesting that you have styled his narrative as making light (being flippant) of revenue in upcoming 4Cs, since, having read them myself, I obtained the exact opposite view of his intent.

I perhaps should be more charitable, but your decision to post charts on this thread and extol opinions based on them, when there is a dedicated space for chartists, combined with the above matters, gives rise to the need to ask: what is your intent here?

In the absence of an explanation, your failure to address another poster's reasonable question, regarding what you claimed your charts showed in 2021 prior to the Mercedes-Benz reveal, does suggest dishonest manipulation may be your stock in trade.

I do hope this is not the case because you have come to the wrong place.

My opinion only DYOR
FF

AKIDA BALLISTA
I do have this poster on Ignore, but @zeeb0t allows us to take a peek, and I notice that SL attributes the recent price rise to shorts, presumably covering their posteriors.

Friday's "shorts" are quite skimpy (T-Bar?) and don't leave much to the imagination.

Reactions: 18 users

VictorG

Member
Not likely, when it is such a short post, capitals are used, and this poster claims extensive market knowledge and experience and well knows the difference between a Spec and a SPAC. 😂🤣🤡😂🤣

My opinion only DYOR
FF

AKIDA BALLISTA
In the words of the immortal Bee Gees
"Where are the girls I left far behind,
the spicks and the specks,
of the girls on my mind".
Sorry, no mention of SPACS.

 
Reactions: 8 users
Exciting other applications for the technology here too


OTHER COMMERCIAL APPLICATIONS​


Because the Group’s technology is based on the analysis of reflectance spectra (colour) combined with images providing information about shape, several agricultural applications other than the discrimination of weeds from crop, in real time, become apparent.

Precise fertiliser application​

Plants, and more specifically plant leaves, frequently develop colour-based characteristics because of nutrient deficiencies in the soil they are growing in. Our technology could be immediately adapted to analyse and identify nutrient deficiencies at a plant-by-plant level within an agricultural environment, with immediate consequences for the application of fertilisers – that is, our technology could be the lead technology that allows for precise fertiliser application, tailored to optimise the soil in which individual plants are growing.

Grain sorting​

Spectral reflectance allows for the analysis of parts of the spectrum that are outside the range of light visible to humans. Both water levels and protein levels in cereal crops are important indicators as to the value of a grain crop. Our technology has the ability to identify and measure these variables, so a potential use for the technology would be for a grain-handler to deploy the technology to interrogate and classify varying grain grades based on objective criteria, or at a farmer level, allow for the sorting of grain, based on quality, as it is harvested.
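As a rough illustration of how such grading could work, here is a minimal sketch in Python. The band-to-property conversions and the moisture/protein thresholds are invented for illustration; they are not taken from the Group's system.

```python
# Hypothetical sketch: grading a single grain from two reflectance bands.
# Conversions and thresholds below are invented, not from Photonic Group.

def grade_grain(water_band: float, protein_band: float) -> str:
    """Classify one grain from normalised reflectance readings (0..1)."""
    moisture = 1.0 - water_band    # wetter grain absorbs more at the water band
    protein = protein_band         # assume reflectance scales with protein here
    if moisture > 0.14:
        return "feedstock"         # too wet to bin as milling grade
    if protein > 0.13:
        return "premium"
    return "standard"

grades = [grade_grain(w, p) for w, p in [(0.90, 0.15), (0.80, 0.10), (0.92, 0.08)]]
# grades == ["premium", "feedstock", "standard"]
```

In a real grain-handling deployment the thresholds would come from receival standards and the reflectance-to-property mapping from calibration, but the decision logic per grain stays this simple.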

Reduce herbicide resistance​

Another immediate use for the Group's technology is the spraying of all vegetative matter near railway lines. It is generally accepted that the constant and continued spraying of plants growing within a few metres of railway lines (for safety reasons, because left unchecked the plants can destabilise the ballast material that holds the lines) is a major cause of herbicide resistance in weeds. This spraying is currently undertaken manually, but with our technology it can be automated. Further, when linked to a GPS system, a precise record of plant location and herbicide application can be maintained, such that in a subsequent spray pass, if a plant is detected at an identical location as mapped previously, a different herbicide can be sprayed to the one used previously. In this way, future herbicide resistance can be minimised.
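The rotation idea above can be sketched in a few lines: keep a log keyed by a rounded GPS position, and if a plant reappears where one was already sprayed, move to a different product. The coordinates, the rounding used as a grid, and the product names are all invented for illustration.

```python
# Hypothetical sketch of herbicide rotation by logged GPS location.
# Grid rounding and product names are invented for illustration.

HERBICIDES = ["product_A", "product_B", "product_C"]

def choose_herbicide(lat: float, lon: float, spray_log: dict) -> str:
    key = (round(lat, 5), round(lon, 5))   # ~1 m grid cell
    last = spray_log.get(key)
    if last is None:
        choice = HERBICIDES[0]             # first strike at this spot
    else:
        # rotate so no location receives the same product twice running
        choice = HERBICIDES[(HERBICIDES.index(last) + 1) % len(HERBICIDES)]
    spray_log[key] = choice
    return choice

log = {}
first = choose_herbicide(-31.95001, 115.86001, log)   # "product_A"
repeat = choose_herbicide(-31.95001, 115.86001, log)  # "product_B" on the next pass
```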
The maintenance of open grassland in suburban parks and the elimination of weeds from pavements to airport runways are all potential areas of operation for our technology.

The perfect lawn​

Ultimately, our technology could be mounted on the front of a suburban lawn mower, with a small spray kit mounted on the rear, such that as the lawn is mown, individual weeds within the overall lawn are detected and killed.

"Grain sorting​

Spectral reflectance allows for the analysis of parts of the spectrum that are outside the range of light visible to humans. Both water levels and protein levels in cereal crops are important indicators as to the value of a grain crop. Our technology has the ability to identify and measure these variables, so a potential use for the technology would be for a grain-handler to deploy the technology to interrogate and classify varying grain grades based on objective criteria, or at a farmer level, allow for the sorting of grain, based on quality, as it is harvested."

You have no idea how long it takes to sort a million tonnes of wheat so that all the same size grains go into the same silo.🤡🤡🤡

Seriously though, who would have thought grain sorting at the individual grain level would ever be possible.

Could make a huge financial difference for farmers who, using such a system, can salvage grain that would otherwise have to go to feedstock when rain interferes with harvesting.

Revolutionising and creating industries that do not yet exist.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 36 users
Has anyone found another SPAC or SPEC stock with the following alliances after just 12 months of full-blown commercialisation, one that I can invest in on the ASX for under 80 cents a share, with product being released to market under licence during 2023 by Renesas, the number 3 automotive supplier of semiconductors (mainly MCUs) in the world, three places ahead of Bosch? By the way, did anyone know Renesas has produced over 40 billion MCUs since inception:

1. ARM

2. EDGE IMPULSE

3. INTEL

4. ISL

5. MEGACHIPS

6. MERCEDES BENZ

7. MOSCHIPS

8. NASA

9. NUMEN

10. NVISO

11. PROPHESEE

12. RENESAS

13. SiFIVE

14. SOCIONEXT

15. VALEO


16. VVDN

Take your time, do not answer all at once, as I need to write them down. I do not want to miss any, as each one would present an amazing opportunity to get in on the ground floor.


My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 66 users

Diogenese

Top 20
Exciting other applications for the technology here too

[quoting the "OTHER COMMERCIAL APPLICATIONS" post above in full: precise fertiliser application, grain sorting, reduced herbicide resistance, the perfect lawn]


US8179533B2 Sensing system and method for discriminating plant matter - Photonic Detection Systems

This is LiDAR:

A sensing system comprises a light source having three or more distinct wavelengths for illuminating a plurality of distinct areas in a field of view, a sensor for measuring the reflectance of the distinct areas at each of the distinct wavelengths, and an identifier for identifying at least one object in the field of view from the measured reflectance at each of the wavelengths.

1. A sensing system for discriminating plant matter comprising:

a light source comprising three or more lasers, each producing a laser beam of a different wavelength;

a collimator for collimating the laser beams from the plurality of lasers into a combined beam;

a splitter for splitting the combined beam into a plurality of beams each with the three or more wavelengths such that the beams are directed at distinct non-overlapping areas in a field of view;

a sensor for distinctly measuring the reflectance from each of the distinct non-overlapping areas at each of the distinct wavelengths; and

an identifier for identifying at least one plant type in the field of view from the measured reflectance at each of the wavelengths at each of the distinct non-overlapping areas.
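The final claim element (identifying a plant type from the reflectance measured at each wavelength) can be pictured as a nearest-signature lookup. The signatures and the reading below are invented for illustration; a real system would use trained references rather than a hand-written table.

```python
import math

# Toy illustration of the claim's identifier element: pick the stored
# signature nearest a measured reflectance triple. All numbers invented.

SIGNATURES = {
    "crop": (0.45, 0.60, 0.30),   # reflectance at the three laser wavelengths
    "weed": (0.30, 0.75, 0.20),
    "soil": (0.55, 0.50, 0.50),
}

def identify(measured: tuple) -> str:
    """Return the label whose signature is closest (Euclidean) to the reading."""
    return min(SIGNATURES, key=lambda name: math.dist(SIGNATURES[name], measured))

label = identify((0.32, 0.72, 0.22))   # closest to the "weed" signature
```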


... and a couple of more recent applications by a couple of the authors:

WO2020014727A1 A DETECTION SYSTEM FOR DETECTING MATTER AND DISTINGUISHING SPECIFIC MATTER FROM OTHER MATTER


The present disclosure provides a detection system for detecting matter and distinguishing specific matter from other matter. The detection system comprises a spectral analysis system configured to at least assist in determining whether matter comprises the specific matter based on an intensity of light reflected from the specific matter. The spectral analysis system comprises a light source capable of emitting light having at least one known wavelength or wavelength range. Further, the spectral analysis system comprises an optical element for directing the emitted light towards the matter. The spectral analysis system also comprises a detector for detecting light reflected from the matter, and a spatial analysis system. The spatial analysis system is configured to at least assist in determining whether the matter comprises the specific matter based on a shape of the matter. The spatial analysis system comprises an image capturing device for capturing an image of the matter. Further, the detection system comprises an outcome determination system arranged to receive a first input from the spectral analysis system and a second input from the spatial analysis system, and determine an outcome providing an indication of whether the matter is specific matter based on the first and second inputs.

[Same discussion of NNs as below]

WO2020014728A1 A DETECTION SYSTEM FOR DETECTING MATTER AND DISTINGUISHING SPECIFIC MATTER FROM OTHER MATTER





The present disclosure provides a detection system for detecting matter and distinguishing specific matter from other matter. The detection system comprises at least one light source arranged to emit one or more light beams having a known wavelength or wavelength range. Further, the detection system comprises at least one optical element configured to direct the one or more light beams onto a plurality of locations within an area of interest including the matter. The detection system also comprises a detector for detecting intensities of the one or more light beams reflected at the plurality of locations within the area of interest including the matter. In addition, the detection system comprises an outcome determination system. The system is arranged to obtain information indicative of at least a portion of a shape of at least some of the matter based on detected light intensities of the one or more light beams reflected at the plurality of locations. The system is also arranged to obtain information indicative of a spectral intensity distribution based on detected light intensities of the one or more light beams reflected at the plurality of locations. The outcome determination system is arranged to determine whether the matter is specific matter based on the information indicative of at least a portion of a shape of at least some of the matter and based on the information indicative of a spectral intensity distribution.


To train the system 116, a large set of training data is obtained by applying the detection system 100 to test-case scenarios. This involves setting up an area of interest having plant matter 115 including multiple crop plants 134 and weeds 136, emitting the array of N (rows) × M (columns) light beams towards the area of interest, and detecting the reflection of component light beams 130 using the detector 114. The output signal from the detector 114 comprises information that can be used as the input values Xᵢ of the system 116 (i.e. the input nodes 412 of the neural network 400). Random weights will be assigned to weighted nodes 420 by the processor 140, and output values Yⱼ (i.e. the output nodes 416) are calculated. Since it is known in the setup which plant matter corresponds to crop 134 and which corresponds to weed 136, it is also known which spots 138 of light beams in the array correspond to weed 136. This information represents the desired output of the training data, which is compared to the calculated output values. The neural network 400 adjusts the values of the weighted nodes 420 and repeats the process to converge the calculated output to the desired output. The system 116 can be trained further by rearranging the test setup and re-running with additional test data. Once the system 116 is trained, it can be applied to real-world scenarios.
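The training loop just described (random initial weights, forward pass, compare calculated output with known crop/weed labels, adjust, repeat) can be sketched with a single logistic unit standing in for the patent's full network. The reflectance data, learning rate and epoch count below are invented for illustration.

```python
import math
import random

random.seed(0)

# Invented training data: reflectance triples labelled 1 = weed, 0 = crop.
DATA = [((0.30, 0.75, 0.20), 1), ((0.32, 0.72, 0.22), 1),
        ((0.45, 0.60, 0.30), 0), ((0.47, 0.58, 0.28), 0)]

weights = [random.uniform(-0.5, 0.5) for _ in range(3)]   # random initial weights
bias = 0.0
LR = 0.5

def predict(x):
    """Forward pass: weighted sum through a sigmoid output node."""
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(2000):                        # repeat to converge toward the labels
    for x, target in DATA:
        error = predict(x) - target          # calculated output vs desired output
        for i in range(3):
            weights[i] -= LR * error * x[i]  # adjust the weighted connections
        bias -= LR * error
```

After training, a known weed triple scores above 0.5 and a known crop triple below it; the patent's network does the same thing with many more nodes and per-spot outputs.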
 
Reactions: 29 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Are you saying you can completely discount SynSense as being the provider, and if so, based on what evidence, given the following release:


My opinion only DYOR
FF

AKIDA BALLISTA

PS: AKIDA is better is not a sufficient answer.🤣😂🤣🪁🪁🪁

You know what I find really interesting @Fact Finder? Actually it wouldn't surprise me if you were telepathic but in case you're not, I'll tell you.😝

I find it VERY interesting that the partnership announcement that you posted only refers to "Speck" which is SynSense’s neuromorphic vision processor. Also, the eetimes article "Cars that think like you" only mentions BMW using Speck to "capture real-time visual information, recognize and detect objects, and perform other vision-based detection and interaction functions".

Intriguingly, I can't find "Xylo" mentioned anywhere in Googleland in relation to BMW. "Xylo" is SynSense’s neuromorphic chip for low-dimensional signal (audio) processing. Maybe it just doesn't have enough grunt for the voice control component?


PS: AKIDA is much betterer and more perfeck anyway. 😝😜🤪


 
Reactions: 34 users

TopCat

Regular
Don’t think anyone has gone down this rabbit hole yet? I looked into the DSTG Women in STEM Award - specifically what her paper was about

I couldn’t actually find the paper that won the award – 'An energy-efficient AkidaNet for morphologically similar weeds and crops recognition at the Edge' (co-authors Kevin Tsiknos, Kristofor Carlson, Selam Ahder) – but I did find another one that led to the outputs below

My search led me to the Australian company Photonic Group

Vi Nguyen Thanh Le has a journal article cited on their website





WHAT WE DO.​


PATENTED TECHNOLOGY TO DISTINGUISH ONE OBJECT FROM ANOTHER.

Our patented technology seeks to mimic the human eye as a mechanism for distinguishing one object from another in real time by using spectral reflectance data (colour) as well as images (shape) as a combined differentiator.


AGRICULTURAL SPRAYING – DIFFERENTIATE BETWEEN PLANTS AND WEEDS IN REAL TIME.​

Commercially, the Group is currently focused on deploying the technology within the agricultural sector where the accurate real time differentiation of one green plant from another has substantial commercial implications in terms of the reduction in herbicide application following the ability to distinguish one plant as desirable crop and not spray it, and another as an undesirable weed and to spray that plant in isolation.


OTHER APPLICATIONS.​

Our patent families encompass object differentiation using size, shape and colour, and accordingly we are of the opinion that this technology now truly does mimic the human eye and, as such, the technology has broad application in a multitude of commercial scenarios, some of which are described in the accompanying pages. However, we acknowledge that the number of potential applications for this new technology is vast, and should anyone believe that our technology has particular application in some specific field or endeavour, or would like to explore how our technology could be used or deployed in the future, either in isolation or teamed with some other technology, we would encourage that person to contact us to further discuss and evaluate the concept.


WHO WE ARE​


The purpose behind the formation of the Photonic Group was to determine if it was possible to create an automated detection system that used light to distinguish one plant from another.

Since that time, the Group has made several key discoveries leading to the lodgement of various patent families in various countries, including Australia, Canada, USA, and Europe.

In 2017, the Group realised that real-time identification using only one discrimination mechanism (spectral reflectance) did not, of itself, allow for the requisite discrimination in all instances encountered, so a decision was made to identify a suitable complementary detection technology that could be combined or hybridised with spectral reflectance to generate superior discrimination rates.

Imaging technology was found to be the best complementary technology and the system now developed uses a combination of image data and spectral reflectance data, collected simultaneously, with both data streams being blended and ultimately analysed via the application of artificial intelligence in our proprietary neural net.

Selective spraying using Photonic Group detection​

As a result of the work done, the Group has determined that the generation of spectral reflectance data by illuminating a target with a selection of specific laser wavelengths and the collection and analysis of that spectral reflectance data in real time, combined with image data collected at the same time does indeed enable the detection unit to distinguish one plant from another.

Having distinguished one plant from another, the system can then be programmed to make a range of decisions – within an agricultural environment, these decisions are typically Spray Plant A, ignore all other plants, or ignore Plant A, spray everything else; however, once the identification is made, the decisions and actions following from that identification are totally contained within the system programming.
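The two decision modes described can be captured in a one-function sketch. The plant labels and mode names are invented for illustration; the real system would work from its detection signatures and its own configuration.

```python
# Minimal sketch of the two programmed decision modes described above.
# Labels and mode names are illustrative only.

def spray_decision(detected: str, target: str, mode: str) -> bool:
    """Return True if the nozzle should fire for this detection."""
    if mode == "spray_target":    # "Spray Plant A, ignore all other plants"
        return detected == target
    if mode == "spare_target":    # "ignore Plant A, spray everything else"
        return detected != target
    raise ValueError(f"unknown mode: {mode}")

fire = spray_decision("ryegrass", "ryegrass", "spray_target")   # True
hold = spray_decision("wheat", "wheat", "spare_target")         # False
```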

As the US Marines have observed – “If you can see a target, you can hit it, and if you can hit it, you can kill it.”

Real time identification & spraying​

The initial step is the most difficult – the seeing of the target – what our detection unit does is provide a substitute for the human eye (but is not limited to the human eye limitations in terms of only using the visible light portion of the entire electromagnetic spectrum) to identify a target in real time. Once that identification is made, decisions and actions will follow, subject only to the pre-programmed instructions of the system.


A TECHNOLOGICAL BREAKTHROUGH​


The recently developed discrimination sensor has been termed the ‘Missing Link in Precision Agriculture’ and as such represents the future of real time weed / crop discrimination.

At its most basic it is a system that provides a farmer with real time discrimination between differing types of vegetation, typically discriminating between crop and weed.

The demand for such a system within the precision agricultural arena has been high, predominantly because of the costly outlay arising from the current practice of blanket spraying of pre- and post-emergent weeds; a practice that is now universally recognised as being highly inefficient, expensive and hazardous to both human and environmental health.

Another potential usage within agriculture is dealing with those weeds that are starting to show resistance to any herbicide applied in a blanket pre-emergent spray.

Because the herbicide applied using the Group’s technology is used precisely and sparingly, a second spray run can be done in the weeks following a blanket spray, and where viable plants (i.e. those starting to develop resistance to the herbicide used as a blanket spray) are detected, those plants can be re-sprayed using a more expensive, but more effective herbicide, thus eliminating from the farm’s seed bank any weeds developing resistance to the blanket spray herbicide.

Furthermore, because the decision to turn on a spray nozzle is made in real time, the precise location of where that nozzle is activated can be recorded by the technology, leading to the generation of ‘paddock maps’ showing the precise location and number of activations in a given area.

Should this information be passed back to a central location then, at the farm level, analysis of the paddock maps over time will allow farmers to see the impact of their spraying program, identify the direction and speed of spread of any invading weed, etc.

Analysis of multiple farms in the same region will inevitably lead to better regional agronomic information regarding the control of weeds across multiple farming properties.

HOW DOES IT WORK? (UNIT LEVEL)​


Each detection unit currently contains three lasers projecting light at three discrete and highly optimised wavelengths.

These lasers are sequentially switched on with each pulse of light passing through an optical cavity that generates multiple beams from each laser source. A linear photo detector imager records the intensities of the laser light reflected off any plants within view.

Simultaneously, a camera takes a series of images, and both the spectral reflectance data and the image data are combined; an on-board controller circuit then uses both data streams to calculate a signature and compares that signature to signatures stored in a database.

Should the signature match the profile of a pre-recorded weed in the database, the system generates a ‘positive strike’ signal that then results in a positive action occurring, such as a spray nozzle being activated and the weed being sprayed, or the position of the weed being logged using a dGPS system.


HOW DOES IT WORK? (SYSTEM LEVEL)​


Each detection unit covers a detection field of 500mm. Multiple detection units are mounted on a vehicle side by side to achieve the desired detection swathe – i.e. 4 units provide 2 metres coverage.

The vehicle is then driven forward at a relatively constant speed such that the units traverse and interrogate the terrain. Sensors detect the reflected laser intensities from the ground and vegetation, whilst images are being generated. The electronic system then processes the recorded data.

Once a target weed is detected a ‘positive strike’ signal is generated to activate a nozzle and spray the weed, or to log its precise position, or both.

If spraying is the selected outcome arising from a positive strike, then because the distance between any individual detection unit and its associated spray nozzle is known absolutely, and because the speed of the vehicle at that particular instant of time is also known, the system allows for an appropriate delay before activating the spray nozzle, so that the spray nozzle is only activated immediately in front of the weed.

These factors then allow the spray nozzle to remain open only whilst it is positioned above the weed and once the detection unit determines that it has transited the weed, it turns off the spray nozzle.

In this way, that which leaves the spray nozzle is applied only immediately before the detected weed and across it, and the nozzle is turned off immediately after the weed ceases to be detected – herbicide is thus precisely applied, minimising the volume of herbicide used per square metre and minimising the deleterious effects of excess herbicide coverage on the crop, on the soil and generally on the environment.
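The timing logic above reduces to two divisions: the nozzle sits a known distance behind the detector, so the firing delay and the open time follow directly from vehicle speed. The numeric values below are illustrative.

```python
# Worked sketch of the detector-to-nozzle timing described above.
# Offset, speed and weed length are illustrative figures.

def nozzle_timing(offset_m: float, speed_mps: float, weed_len_m: float):
    """Return (delay before opening, time held open), both in seconds."""
    delay = offset_m / speed_mps        # time for the weed to reach the nozzle
    open_time = weed_len_m / speed_mps  # nozzle stays open while over the weed
    return delay, open_time

delay, open_time = nozzle_timing(offset_m=0.5, speed_mps=2.0, weed_len_m=0.1)
# delay == 0.25 s, open_time == 0.05 s
```

Because speed appears in the denominator of both terms, the controller must re-read vehicle speed at each strike, which is exactly why the text stresses that speed "at that particular instant of time" is known.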

Because the herbicide is being applied so precisely, it is possible to envisage the Group’s sensor platform being used in row cropping scenarios, such as cotton, sugar-cane and similar, where the technology actually detects and sprays undesirable plants within the actual row, and not just in the area between rows.

FUTURISTIC​


Several technologies are now converging and one possible future within the agricultural sector would be the creation of a multiplicity of small, ground based, autonomous (or semi-autonomous) weeding devices, bearing a considerable likeness to domestic semi-autonomous vacuum cleaners.

These devices would, over time, map a particular area and / or have specified coordinates within which they operate on a 24 / 7 basis. Each device would be equipped with a Photonic Group sensor and would ‘patrol’ a broad-acre paddock or similar, constantly looking for plants that are designated as undesirable at that location (canola plants might not be regarded as undesirable, unless they were found in a field of barley, for example).

Once detected, the device could deploy one of many possible mechanisms to eradicate the undesirable plant that do NOT involve herbicides at all.

For example, the plant could be mechanically removed, or the device, fitted with solar cells, could position itself over the plant and apply boiling water, a high-voltage discharge, a flame, etc. All these measures do NOT involve herbicide as a killing mechanism, and all are far more environmentally friendly than continually applying increasingly expensive herbicides.


As an interim measure on the way to that future, the device could carry one or more herbicides, selecting which one to use based on prior logged activity, and could be programmed to return to a base station should its energy supply, or any of its payload herbicides, run low.

All instances of any intervention could be reported to a central information repository, allowing for real time analysis of activities undertaken to eradicate undesirable plants, in terms of location, frequency, and the like.
This environmentally friendly scenario ultimately rests on the ability of a machine to recognize ‘friend’ or ‘foe’ in real time.

The Photonic Group sensor platform is designed to allow machines to make this decision and once the decision is made, the consequences and down-stream outcomes of that decision can be readily programmed into the system.
I noticed in the article they had some patents in Canada. Precision.ai is a Canadian company developing weed spraying with drones


Precision AI has recently won an award for edge computing in California, but it’s not a term you hear often in agriculture. What is edge computing? And why does Precision AI use it? Here’s an easy-to-read breakdown:

Edge computing in a nutshell​

Imagine that the internet is a sphere, and your internet network is the center. The further you go from the center, the less internet you have. Devices that use edge computing are at the very EDGE of that sphere, and don’t necessarily need to connect to the internet or the cloud to execute complex tasks. Instead, they can be their very own data center, holding and processing most or all their data internally. For example: self-driving cars, in-hospital patient monitoring, or smart cameras.
There are a variety of reasons to use edge computing, most commonly to solve the problem of latency (the delay in moving data over a slow internet connection) when fast or data-heavy processing is required. Does slow internet and a lot of data sound like a familiar problem in rural agriculture to you?

2 reasons why edge computing is a necessity for our drones

1. Near instant weed identification​

When we began utilizing artificial intelligence, we found that the quality of images needed to accurately identify weeds paired with the sheer number of images taken for large fields meant we needed to process massive amounts of data in a short period of time. Some fields can amass over 32 terabytes of data. That can equal 34 days to upload, process, and download back into the system for chemical application. We can’t wait 34 days to make spray decisions!
To put it into perspective, 32 terabytes of data is equal to approximately 46 days of nonstop music, 6,300 movies, 5.1 million pictures, or 2.7 billion pages of Word documents.
Spraying even 3-4 days after weed mapping would no longer achieve optimal results. Waiting means weeds and crops have grown. Huge changes in weather patterns can happen, resulting in sub-optimal spraying conditions such as high winds or heavy rainfall. There needed to be a better way, and we have one.
By using edge computing, our cameras can image at the highest sub-millimeter resolution to accurately identify crop and weeds. No longer does it take days to offload the data, process it, and re-upload for spraying. Edge computing allows us to process that data in real-time onboard our drones for spraying in a single pass, 8x faster than the industry average.
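The arithmetic behind the "34 days" figure is easy to check. A quick sketch, assuming a hypothetical ~90 Mbps effective uplink (the article does not state a connection speed):

```python
# Back-of-envelope check of the upload-time figure quoted above.
# UPLINK_MBPS is an assumption, not a number from the article.
FIELD_DATA_TB = 32
UPLINK_MBPS = 90  # hypothetical rural uplink, megabits per second

bits = FIELD_DATA_TB * 1e12 * 8        # terabytes -> bits
seconds = bits / (UPLINK_MBPS * 1e6)   # transfer time at that rate
days = seconds / 86_400
print(f"about {days:.0f} days to move {FIELD_DATA_TB} TB at {UPLINK_MBPS} Mbps")
```

At that assumed rate the transfer alone takes roughly a month, which is consistent with the article's 34-day round trip once processing is included.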

2. Limited connectivity in the field​

We have all experienced being without cell service. One of the most common locations for having no cell or internet connection is in the middle of an agricultural field in a rural area. Oftentimes internet connectivity is spotty or non-existent. Trying to rely on a connection to effectively process images in real time while in these locations is a no-go from the start. This issue removes the option for cloud computing.
Edge computing eliminates the need for cell service in remote locations and brings data processing closer to the action. After our drones complete their surveying, it is possible to upload a much more compact data package. This maximizes your customized field data once you are back in an area of connectivity.
Agriculture spray decisions are made as part of a system. This system includes speed of decision, timeliness of decision, and accuracy of decision. It is time to get our heads out of the clouds and begin making those decisions from the ground up and sky down.
Have more questions about how Precision AI uses edge computing? Shoot us a message.
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Diogenese

Top 20
US8179533B2 Sensing system and method for discriminating plant matter - Photonic Detection Systems


This is LiDaR:

A sensing system comprises a light source having three or more distinct wavelengths for illuminating a plurality of distinct areas in a field of view, a sensor for measuring the reflectance of the distinct areas at each of the distinct wavelengths, and an identifier for identifying at least one object in the field of view from the measured reflectance at each of the wavelengths.

1. A sensing system for discriminating plant matter comprising:

a light source comprising three or more lasers, each producing a laser beam of a different wavelength;

a collimator for collimating the laser beams from the plurality of lasers into a combined beam;

a splitter for splitting the combined beam into a plurality of beams each with the three or more wavelengths such that the beams are directed at distinct non-overlapping areas in a field of view;

a sensor for distinctly measuring the reflectance from each of the distinct non-overlapping areas at each of the distinct wavelengths; and

an identifier for identifying at least one plant type in the field of view from the measured reflectance at each of the wavelengths at each of the distinct non-overlapping areas
.
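The identification step in claim 1 can be sketched as a nearest-signature lookup over per-wavelength reflectance measurements. The wavelength count matches the claim, but every reflectance value below is invented for illustration; the patent does not disclose actual signatures:

```python
import math

# Hypothetical reference signatures: mean reflectance at each of the
# three laser wavelengths (all values invented for illustration).
SIGNATURES = {
    "crop": (0.12, 0.45, 0.60),
    "weed": (0.10, 0.30, 0.75),
    "soil": (0.25, 0.28, 0.30),
}

def classify(measured):
    """Return the label whose stored signature is closest (Euclidean
    distance) to the measured reflectance triple."""
    return min(SIGNATURES, key=lambda k: math.dist(SIGNATURES[k], measured))

print(classify((0.11, 0.32, 0.72)))  # closest to the "weed" signature
```

A real identifier would be trained rather than hand-coded, but the principle of discriminating by per-wavelength reflectance is the same.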


... and a couple of more recent applications by a couple of the authors:

WO2020014727A1 A DETECTION SYSTEM FOR DETECTING MATTER AND DISTINGUISHING SPECIFIC MATTER FROM OTHER MATTER


The present disclosure provides a detection system for detecting matter and distinguishing specific matter from other matter. The detection system comprises a spectral analysis system configured to at least assist in determining whether matter comprises the specific matter based on an intensity of light reflected from the specific matter. The spectral analysis system comprises a light source capable of emitting light having at least one known wavelength or wavelength range. Further, the spectral analysis system comprises an optical element for directing the emitted light towards the matter. The spectral analysis system also comprises a detector for detecting light reflected from the matter and a spatial analysis system. The spatial analysis system is configured to at least assist in determining whether the matter comprises the specific matter based on a shape of the matter. The spatial analysis system comprises an image capturing device for capturing an image of the matter. Further, the spatial analysis system comprises an outcome determination system arranged to receive a first input from the spectral analysis system and a second input from the spatial analysis system, and determine an outcome providing an indication of whether the matter is specific matter based on the first and second inputs.

[Same discussion of NNs as below]

WO2020014728A1 A DETECTION SYSTEM FOR DETECTING MATTER AND DISTINGUISHING SPECIFIC MATTER FROM OTHER MATTER

The present disclosure provides a detection system for detecting matter and distinguishing specific matter from other matter. The detection system comprises at least one light source arranged to emit one or more light beams having a known wavelength or wavelength range. Further, the detection system comprises at least one optical element configured to direct the one or more light beams onto a plurality of locations within an area of interest including the matter. The detection system also comprises a detector for detecting intensities of the one or more light beams reflected at the plurality of locations within the area of interest including the matter. In addition, the detection system comprises an outcome determination system. The system is arranged to obtain information indicative of at least a portion of a shape of at least some of the matter based on detected light intensities of the one or more light beams reflected at the plurality of locations. The system is also arranged to obtain information indicative of a spectral intensity distribution based on detected light intensities of the one or more light beams reflected at the plurality of locations. The outcome determination system is arranged to determine whether the matter is specific matter based on the information indicative of at least a portion of a shape of at least some of the matter and based on the information indicative of a spectral intensity distribution.


To train the system 116, a large set of training data is obtained by applying the detection system 100 to test case scenarios. This involves setting up an area of interest having plant matter 115 including multiple crop plants 134 and weeds 136, emitting the array of N (rows) x M (columns) light beams towards the area of interest, and detecting the reflection of component light beams 130 using the detector 114. The output signal from the detector 114 comprises information that can be used as the input values Xi of the system 116 (i.e. input nodes 412 of the neural network 400). Random weights will be assigned to weighted nodes 420 by the processor 140, and output values Yj (i.e. the output nodes 416) are calculated. Since it is known in the setup which plant matter corresponds to crop 134 and which corresponds to weed 136, it is also known which spots 138 of light beams in the array correspond to weed 136. This information represents the desired output of the training data, which is compared to the calculated output values. The neural network 400 adjusts the values of the weighted nodes 420 and repeats the process to converge the calculated output to the desired output. The system 116 can be trained further by rearranging the test setup and re-running with additional test data. Once the system 116 is trained, it can be applied to real-world scenarios.
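The training procedure described in that passage is ordinary supervised learning: calculate outputs from the detector readings, compare them with the known crop/weed labels, adjust the weights, and repeat until the outputs converge. A minimal sketch of that loop with synthetic reflectance data (the means, counts and learning rate are all invented; this is not the patent's actual network):

```python
import math
import random

random.seed(0)

def sample(mean):
    # One simulated detector reading: reflected intensity at 3 wavelengths.
    return [random.gauss(m, 0.03) for m in mean]

# Known test setup: which spots are weed (label 1) and which are crop (0).
# The mean reflectances are invented for illustration.
data = ([(sample((0.10, 0.30, 0.75)), 1.0) for _ in range(50)] +   # weeds
        [(sample((0.10, 0.45, 0.60)), 0.0) for _ in range(50)])    # crops

w = [random.gauss(0, 1) for _ in range(3)]   # random initial weights
b = 0.0

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    z = max(-30.0, min(30.0, z))             # clamp to avoid overflow
    return 1 / (1 + math.exp(-z))            # calculated output in (0, 1)

for _ in range(500):                 # repeat until the calculated output
    for x, target in data:           # converges to the desired output
        err = predict(x) - target    # compare calculated vs desired
        for i in range(3):
            w[i] -= 0.5 * err * x[i]   # adjust the weights
        b -= 0.5 * err

accuracy = sum((predict(x) > 0.5) == bool(t) for x, t in data) / len(data)
print(f"training accuracy: {accuracy:.2%}")
```

A single weighted unit stands in here for the multi-layer network 400; the compare-and-adjust cycle is the same idea at any scale.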

Now if they could only do this with insects, we may see a revival in pollinators ... I can just see it now - a LiDaR-controlled fly-swat.
 
  • Haha
  • Like
  • Love
Reactions: 20 users
Have more questions about how Precision AI uses edge computing? Shoot us a message.
Hi @Diogenese are you game enough to "Shoot" these guys a message saying "Well I get that but does it really matter?" 😁😂🤣😎🤡
 
  • Haha
  • Like
Reactions: 10 users
Can someone please post the link to where brainchip appears on Intel site as partner please?
 
  • Like
Reactions: 5 users

TopCat

Regular
  • Like
  • Fire
  • Love
Reactions: 29 users

Boab

I wish I could paint like Vincent
Anyone found another SPAC or SPEC stock with the following alliances, after just 12 months of full-blown commercialisation, that I can invest in on the ASX for under 80 cents a share, with product being released to market under licence during 2023 by Renesas, the world's number 3 automotive supplier of semiconductors (mainly MCUs), three places ahead of Bosch? By the way, did anyone know Renesas has produced over 40 billion MCUs since inception:

1. ARM

2. EDGE IMPULSE

3. INTEL

4. ISL

5. MEGACHIPS

6. MERCEDES BENZ

7. MOSCHIPS

8. NASA

9. NUMEN

10. NVISO

11. PROPHESEE

12. RENESAS

13. SiFIVE

14. SOCIONEXT

15. VALEO


16. VVDN

Take your time, do not answer all at once, as I need to write them down. I do not want to miss any, as each one would present an amazing opportunity to get in on the ground floor.


My opinion only DYOR
FF

AKIDA BALLISTA
This is what I found
Tumbleweed.jpg
 
  • Haha
  • Like
  • Love
Reactions: 30 users
Don’t think anyone has gone down this rabbit hole yet? I looked into the DSTG Women in STEM Award - specifically what her paper was about

I couldn’t actually find the paper that won the award - 'An energy-efficient AkidaNet for morphologically similar weeds and crops recognition at the Edge' (co-authors Kevin Tsiknos, Kristofor Carlson, Selam Ahder) - but another one that led to the outputs below

My search led me to the Australian company Photonic Group

Vi Nguyen Thanh Le has a journal article cited on their website

PATENTED TECHNOLOGY TO DISTINGUISH ONE OBJECT FROM ANOTHER.​

Our patented technology seeks to mimic the human eye as a mechanism for distinguishing one object from another in real time by using spectral reflectance data (colour) as well as images (shape) as a combined differentiator.


AGRICULTURAL SPRAYING – DIFFERENTIATE BETWEEN PLANTS AND WEEDS IN REAL TIME.​

Commercially, the Group is currently focused on deploying the technology within the agricultural sector where the accurate real time differentiation of one green plant from another has substantial commercial implications in terms of the reduction in herbicide application following the ability to distinguish one plant as desirable crop and not spray it, and another as an undesirable weed and to spray that plant in isolation.



OTHER APPLICATIONS.​

Our patent families encompass object differentiation using size, shape and colour, and accordingly we are of the opinion that this technology now truly does mimic the human eye. As such, the technology has broad application in a multitude of commercial scenarios, some of which are described in the accompanying pages. However, we acknowledge that the number of potential applications for this new technology is vast, and should anyone believe that our technology has particular application in some specific field or endeavour, or would like to explore how our technology could be used or deployed in the future, either in isolation or teamed with some other technology, we would encourage that person to contact us to further discuss and evaluate the concept.


WHO WE ARE​


The purpose behind the formation of the Photonic Group was to determine if it was possible to create an automated detection system that used light to distinguish one plant from another.

Since that time, the Group has made several key discoveries leading to the lodgement of various patent families in various countries, including Australia, Canada, USA, and Europe.

In 2017, the Group realised that real-time identification using only one discrimination mechanism (spectral reflectance) did not, of itself, allow for the requisite discrimination in all instances encountered, so a decision was made to identify a suitable complementary detection technology that could be combined or hybridised with spectral reflectance to generate superior discrimination rates.

Imaging technology was found to be the best complementary technology and the system now developed uses a combination of image data and spectral reflectance data, collected simultaneously, with both data streams being blended and ultimately analysed via the application of artificial intelligence in our proprietary neural net.

Selective spraying using Photonic Group detection​

As a result of the work done, the Group has determined that the generation of spectral reflectance data by illuminating a target with a selection of specific laser wavelengths and the collection and analysis of that spectral reflectance data in real time, combined with image data collected at the same time does indeed enable the detection unit to distinguish one plant from another.

Having distinguished one plant from another, the system can then be programmed to make a range of decisions – within an agricultural environment, these decisions are typically 'spray Plant A, ignore all other plants' or 'ignore Plant A, spray everything else'. However, once the identification is made, the decisions and actions following from that identification are totally contained within the system programming.

As the US Marines have observed – “If you can see a target, you can hit it, and if you can hit it, you can kill it.”

Real time identification & spraying​

The initial step is the most difficult – the seeing of the target – what our detection unit does is provide a substitute for the human eye (but one not limited to the human eye's restriction to the visible light portion of the electromagnetic spectrum) to identify a target in real time. Once that identification is made, decisions and actions will follow, subject only to the pre-programmed instructions of the system.



A TECHNOLOGICAL BREAKTHROUGH​


The recently developed discrimination sensor has been termed the ‘Missing Link in Precision Agriculture’ and as such represents the future of real time weed / crop discrimination.

At its most basic it is a system that provides a farmer with real time discrimination between differing types of vegetation, typically discriminating between crop and weed.

The demand for such a system within the precision agricultural arena has been high, predominantly because of the costly outlay arising from the current practice of blanket spraying of pre and post emergent weeds; a practice that is now universally recognised as being highly inefficient, expensive and hazardous to both human and environmental health.

Another potential usage within agriculture is dealing with those weeds that are starting to show resistance to any herbicide applied in a blanket pre-emergent spray.

Because the herbicide applied using the Group’s technology is used precisely and sparingly, a second spray run can be done in the weeks following a blanket spray, and where viable plants (i.e. those starting to develop resistance to the herbicide used as a blanket spray) are detected, those plants can be re-sprayed using a more expensive, but more effective, herbicide, thus eliminating from the farm’s seed bank any weeds developing resistance to the blanket spray herbicide.

Furthermore, because the decision to turn on a spray nozzle is made in real-time, the precise location of where that nozzle is activated can be recorded by the technology, leading to the generation of ‘paddock maps’ showing the precise location and numbers of activation in a given area.

Should this information be passed back to a central location then, at the farm level, analysis of the paddock maps over time will allow farmers to see the impact of their spraying program, identify the direction and speed of spread of any invading weed, etc.

Analysis of multiple farms in the same region will inevitably lead to better regional agronomic information regarding the control of weeds across multiple farming properties.
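Building a paddock map from the strike log amounts to binning the recorded GPS positions into grid cells and counting activations per cell. A minimal sketch, with the coordinates and cell size invented for illustration:

```python
from collections import Counter

# Hypothetical strike log: GPS position recorded at each nozzle activation.
strikes = [(-31.9521, 115.8614), (-31.9522, 115.8616), (-31.9530, 115.8620)]

def paddock_map(strikes, cell_deg=0.001):
    """Bin strike positions into a coarse grid and count activations per
    cell, giving the location and number of activations in each area."""
    return Counter((round(lat / cell_deg), round(lon / cell_deg))
                   for lat, lon in strikes)

for (lat_cell, lon_cell), n in paddock_map(strikes).items():
    print(f"cell ({lat_cell}, {lon_cell}): {n} activation(s)")
```

Comparing maps from successive spray runs would then show the spread direction and density changes the passage describes.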

HOW DOES IT WORK? (UNIT LEVEL)​


Each detection unit currently contains three lasers projecting light at three discrete and highly optimised wavelengths.

These lasers are sequentially switched on with each pulse of light passing through an optical cavity that generates multiple beams from each laser source. A linear photo detector imager records the intensities of the laser light reflected off any plants within view.

Simultaneously, a camera takes a series of images, and both the spectral reflectance data and the image data are combined; an on-board controller circuit then uses both data streams to calculate a signature and compares that signature to signatures stored in a database.

Should the signature match the profile of a pre-recorded weed in the database, the system generates a ‘positive strike’ signal that then results in a positive action occurring, such as a spray nozzle being activated and the weed being sprayed, or the position of the weed being logged using a dGPS system.
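The 'positive strike' decision can be sketched as a tolerance match of the fused spectral-plus-shape signature against a weed database. The database entries, feature values and thresholds below are all invented for illustration:

```python
# Hypothetical weed database: (spectral signature, shape descriptor).
WEED_DB = {
    "wild radish": ((0.10, 0.30, 0.75), 0.8),
    "ryegrass":    ((0.09, 0.33, 0.70), 0.3),
}
SPEC_TOLERANCE = 0.05   # invented per-wavelength match tolerance
SHAPE_TOLERANCE = 0.2   # invented shape-descriptor tolerance

def positive_strike(spectral, shape):
    """Return the matched weed name if both the spectral signature and
    the image-derived shape match a database entry, else None (no spray)."""
    for name, (ref_spec, ref_shape) in WEED_DB.items():
        spec_err = max(abs(a - b) for a, b in zip(spectral, ref_spec))
        if spec_err < SPEC_TOLERANCE and abs(shape - ref_shape) < SHAPE_TOLERANCE:
            return name
    return None

print(positive_strike((0.11, 0.31, 0.73), 0.75))  # matches "wild radish"
```

Requiring both streams to agree is the hybridisation point: a spectral near-miss with the wrong shape (or vice versa) produces no strike.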

unit-level.jpg

HOW DOES IT WORK? (SYSTEM LEVEL)​


Each detection unit covers a detection field of 500mm. Multiple detection units are mounted on a vehicle side by side to achieve the desired detection swathe – i.e. 4 units provide 2 metres coverage.

The vehicle is then driven forward at a relatively constant speed such that the units traverse and interrogate the terrain. Sensors detect the reflected laser intensities from the ground and vegetation, whilst images are being generated. The electronic system then processes the recorded data.

Once a target weed is detected a ‘positive strike’ signal is generated to activate a nozzle and spray the weed, or to log its precise position, or both.

If spraying is the selected outcome arising from a positive strike then, because the distance between any individual detection unit and its associated spray nozzle is known absolutely, and because the speed of the vehicle at that particular instant is also known, the system allows for an appropriate delay before activating the spray nozzle, so that the nozzle is activated only immediately in front of the weed.

These factors then allow the spray nozzle to remain open only whilst it is positioned above the weed and once the detection unit determines that it has transited the weed, it turns off the spray nozzle.

In this way, that which is sprayed out of the spray nozzle is only sprayed immediately before the detected weed, across the detected weed and turned off immediately after the weed ceases to be detected – in this way, herbicide is precisely applied, minimising the volume of herbicide used per square metre and minimising the deleterious effects of excess herbicide coverage on the crop, on the soil and generally on the environment.
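The timing logic in the paragraphs above reduces to delay = sensor-to-nozzle distance divided by vehicle speed. A minimal sketch, with the distance and speed invented for illustration:

```python
def nozzle_delay_s(sensor_to_nozzle_m, speed_m_s):
    """Seconds to wait after detection so the nozzle opens right at the
    weed: the time the vehicle takes to carry the nozzle over the spot."""
    return sensor_to_nozzle_m / speed_m_s

# e.g. nozzle mounted 0.6 m behind the sensor, vehicle at 3 m/s (hypothetical)
delay = nozzle_delay_s(0.6, 3.0)
print(f"open nozzle {delay:.2f} s after detection")  # 0.20 s
```

The same calculation, run against the detection end time, gives the close instant, which is why the nozzle stays open only while it transits the weed.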
system-level.jpg

Because the herbicide is being applied so precisely, it is possible to envisage the Group’s sensor platform being used in row cropping scenarios, such as cotton, sugar-cane and similar, where the technology actually detects and sprays undesirable plants within the actual row, and not just in the area between rows.

FUTURISTIC​


Several technologies are now converging and one possible future within the agricultural sector would be the creation of a multiplicity of small, ground based, autonomous (or semi-autonomous) weeding devices, bearing a considerable likeness to domestic semi-autonomous vacuum cleaners.

These devices would, over time, map a particular area and / or have specified coordinates within which they operate on a 24 / 7 basis. Each device would be equipped with a Photonic Group sensor and would ‘patrol’ a broad-acre paddock or similar, constantly looking for plants that are designated as undesirable at that location (canola plants might not be regarded as undesirable, unless they were found in a field of barley, for example).

Once detected, the device could deploy one of many possible mechanisms to eradicate the undesirable plant that do NOT involve herbicides at all.

For example, the plant could be mechanically removed, or the device could be fitted with solar cells and generate boiling water, a high-voltage discharge positioned over the plant, a flame, etc. All these measures do NOT involve herbicide as a killing mechanism, and all are far more environmentally friendly than continually applying increasingly expensive herbicides.


As an interim measure on the way to that future, the device could carry one or more herbicides, selecting which one to use based on prior logged activity, and could be programmed to return to a base station when its energy supply, or any of its payload herbicides, runs low.

All instances of any intervention could be reported to a central information repository, allowing for real time analysis of activities undertaken to eradicate undesirable plants, in terms of location, frequency, and the like.
This environmentally friendly scenario ultimately rests on the ability of a machine to recognize ‘friend’ or ‘foe’ in real time.

The Photonic Group sensor platform is designed to allow machines to make this decision and once the decision is made, the consequences and down-stream outcomes of that decision can be readily programmed into the system.
Here is little microdot to go with your very large dots:

Adam Osseiran
School of Engineering - Edith Cowan University
Verified email at ecu.edu.au - Homepage
Electrical Engineering

And as we all know Adam Osseiran is head of the Brainchip Scientific Advisory Board.

We also know that he has worked on projects using AKIDA technology at Edith Cowan University and in consequence it seems likely that anyone studying at Edith Cowan University and working on neuromorphic edge solutions would at least be aware of Brainchip.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 31 users
I've come to the belief that we the members of this BRN forum and brainchip staff are the only surviving humans with intelligence and forethought left on this planet. Either that or I just know a lot of braindead people😩
 
Last edited:
  • Haha
  • Like
  • Fire
Reactions: 29 users

HopalongPetrovski

I'm Spartacus!
I've come to the belief that we the members of this BRN forum are the only surviving humans with intelligence and forethought left on this planet. Either that or I just know a lot of braindead people😩
Without wanting to piss in anyone's pocket, I also think that this is a pretty special group and again congratulate Zeeb0t for providing the space within which we can pursue our common interest (and for some passion) without all the distorting noise as previously experienced over at the other place. Thank you to all the genuine humans, doggo's and even the odd cat who support, visit and contribute here. It is a pleasure to know of you, even in this limited fashion.
GLTAH
 
  • Like
  • Love
  • Fire
Reactions: 56 users

McHale

Regular
Now if they could only do this with insects, we may see a revival in pollinators ... I can just see it now - a LiDaR-controlled fly-swat.
What about a varroa mite free bee hive ??
 
  • Like
  • Fire
  • Love
Reactions: 29 users
Without wanting to piss in anyone's pocket, I also think that this is a pretty special group and again congratulate Zeeb0t for providing the space within which we can pursue our common interest (and for some passion) without all the distorting noise as previously experienced over at the other place. Thank you to all the genuine humans, doggo's and even the odd cat who support, visit and contribute here. It is a pleasure to know of you, even in this limited fashion.
GLTAH
Yes, such a great group of people and animals 😄 A massive overload of great contributions to sift through; I don't think I've ever noticed a single post being moderated. Really pissed me off that on-topic juicy dots directly related to BRN got deleted from that festering sewage pipe of a chat site we used to frequent. Checking in on occasion, it resembles a mental asylum with free wifi access.
And it is a great pleasure to be amongst like minded people who have the same goal to further the knowledge on our investment in BRN.
 
  • Love
  • Like
  • Fire
Reactions: 29 users