BRN 0.00% 19.5¢ brainchip holdings ltd

2024 BrainChip Discussion, page-6961

    https://sites.google.com › view › s...
    Google Sites
    Telluride 2024 - SPA24: Neuromorphic systems for space applications
    Invited Speakers
    • Prof Matthew McHarg - US Air Force Academy
    • Dr Noor Qadri - US Naval Research Laboratory
    • Dr Paul Kirkland - University of Strathclyde
    • Dr Andrew Wabnitz - DSTG
    • Dr Damien Joubert - Prophesee
    • Laurent Hili - ESA
    Goals
    The use of neuromorphic engineering for applications in space is one of the most promising avenues to successful commercialisation of this technology. Our topic area focuses on the use of neuromorphic technologies to acquire and process space data captured from the ground and from space, and the exploration and exploitation of neuromorphic algorithms for space situational awareness, navigation, visual attention and control. The project combines the academic expertise of Western Sydney University in Australia (Misha Mahowald Prize 2019) and KTH in Sweden, as well as the applied industry expertise of fortiss in Germany and Neurobus in France. Neuromorphic computing is a particularly good fit for this domain due to its novel sensor capabilities, low energy consumption, its potential for online adaptation, and algorithmic resilience. Successful demonstrations and projects will substantially boost the visibility of the neuromorphic community as the domain is connected to prestigious projects around satellites, off-earth rovers, and space stations. Our goal is to expose the participants to a range of challenging real-world applications and provide them with the tools and knowledge to apply their techniques where neuromorphic solutions can shine.

    Projects
    • Algorithms for processing space-based data: The organisers will make a real-world space dataset available that was recorded from the ISS, specifically for the purpose of this workshop. In addition, data can be recorded with WSU's remote-controlled observatory network. There are exciting discoveries to be made using this unexplored data, especially when combined with neuromorphic algorithmic approaches.
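Raw event-camera recordings like these typically need denoising before any algorithmic work. As an illustrative sketch (not part of the workshop materials; the window size and time threshold are assumptions), here is a standard background-activity filter that keeps an event only if a spatial neighbour fired recently:

```python
import numpy as np

def denoise_events(events, shape=(128, 128), dt=10_000):
    """Background-activity filter for an event stream.
    events: iterable of (x, y, t) with t in microseconds, time-ordered.
    An event is kept only if some pixel in its 3x3 neighbourhood
    (including itself) fired within the last dt microseconds."""
    last = np.full(shape, -10**12, dtype=np.int64)  # last spike time per pixel
    kept = []
    for x, y, t in events:
        x0, x1 = max(x - 1, 0), min(x + 2, shape[1])
        y0, y1 = max(y - 1, 0), min(y + 2, shape[0])
        if (t - last[y0:y1, x0:x1]).min() <= dt:
            kept.append((x, y, t))
        last[y, x] = t
    return kept
```

Isolated single events (typical sensor noise) are dropped; correlated activity from real objects passes through.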

    • Processing space-based data using neuromorphic computing hardware: Using real-world data, from both space and terrestrial sensors, we will explore algorithms for star tracking, stabilisation, feature detection, and motion compensation on neuromorphic platforms such as Loihi, SpiNNaker, BrainChip, and Dynap. Given that the organisers are involved in multiple future space missions, the outcomes of this project may have a significant impact on those missions (and could even be flown to space!).
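As a minimal sketch of the star-tracking idea (my own toy example, not workshop code; the single-star assumption and threshold are illustrative), events can be accumulated into a count image and a star located by its intensity-weighted centroid:

```python
import numpy as np

def star_centroid(events, shape=(128, 128), min_count=3):
    """Accumulate (x, y) events into a count image and return the
    intensity-weighted centroid of pixels above min_count -- a toy
    stand-in for event-based star detection (assumes one star in view).
    Returns None if nothing exceeds the threshold."""
    img = np.zeros(shape, dtype=np.int32)
    for x, y in events:
        img[y, x] += 1
    ys, xs = np.nonzero(img >= min_count)
    if xs.size == 0:
        return None
    w = img[ys, xs].astype(float)           # weight by event count
    return float(xs @ w / w.sum()), float(ys @ w / w.sum())
```

A real tracker would run this per time window and feed the centroids to a stabilisation or attitude filter.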

    • Orbital collision simulator: The growing number of satellites in the Earth’s orbit (around 25,000 large objects) makes it increasingly likely that objects will collide, despite our growing Space Situational Awareness capabilities to monitor artificial satellites. The fast and chaotic cloud of flying debris created during such collisions can threaten more satellites and is very hard to track with conventional sensors. By smashing Lego “cubesats” in front of a neuromorphic camera, we can emulate satellite collisions and assess the ability of the sensor to track the pieces and predict where they will land. We will bring Lego kits to the workshop. Participants will need to design a “collider”, build cubesats with Lego bricks, record data, measure the position of fallen pieces, and write algorithms to process the data.
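One way to predict where a piece will land is a simple ballistic fit over its tracked positions. A hedged sketch (my illustration, assuming the event-based tracker already yields time-ordered (t, x, y) samples per fragment and drag-free motion):

```python
import numpy as np

def predict_landing_x(track, floor_y=0.0):
    """Fit ballistic motion to (t, x, y) samples from a tracker and
    predict the horizontal position where the fragment crosses floor_y.
    Horizontal motion is fit as linear, vertical as quadratic (gravity)."""
    t, x, y = (np.array(c, dtype=float) for c in zip(*track))
    px = np.polyfit(t, x, 1)               # x(t): constant horizontal velocity
    a, b, c = np.polyfit(t, y, 2)          # y(t): quadratic under gravity
    roots = np.roots([a, b, c - floor_y])
    # keep the real root that lies in the future of the last observation
    t_land = max(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > t[-1])
    return float(np.polyval(px, t_land))
```

With only a handful of samples this is noisy in practice, but it captures the task: track, fit, extrapolate.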

    • High-speed closed-loop tracking with neuromorphic sensors: Our motorised telescope can move at up to 50° per second and can be controlled by a simple API. The low latency of event cameras makes it possible to dynamically control the motors using visual feedback to keep a moving object (bird, drone, plane, satellite...) in the centre of the field of view. The algorithm can be tested remotely with the Astrosite observatory (located in South Australia) or with the telescope that we will bring to the workshop.
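The core of such a loop can be sketched as a proportional visual servo (my illustration; the gain, sensor resolution, and rate limit are assumptions, with the 50°/s cap taken from the mount spec above):

```python
def track_step(centroid, center=(320, 240), gain=0.2, max_rate=50.0):
    """One step of a proportional visual-servo loop: map the pixel error
    between the tracked object's centroid and the image centre to pan/tilt
    rate commands (deg/s), clipped to the mount's 50 deg/s limit."""
    ex = centroid[0] - center[0]            # horizontal pixel error
    ey = centroid[1] - center[1]            # vertical pixel error
    pan = max(-max_rate, min(max_rate, gain * ex))
    tilt = max(-max_rate, min(max_rate, gain * ey))
    return pan, tilt
```

Each new event-based centroid estimate yields fresh rate commands, so the loop rate is limited by the mount rather than by a frame clock.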

    • Navigation and landing: Prophesee’s GenX320 can be attached to a portable ST board and powered with a small battery. To simulate a landing of a probe on an extraterrestrial body, we attach the camera to an off-the-shelf drone for the exploration of ventral landing, optical flow and feature tracking scenarios, as well as predicting the distance to the ground to avoid dangerous landmarks.
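For the ventral-landing scenario, "predicting the distance to the ground" is often cast as estimating time-to-contact from optical-flow divergence. A minimal sketch (my own, assuming flow vectors are already extracted and the focus of expansion sits at the image origin):

```python
import numpy as np

def time_to_contact(points, flows):
    """Estimate time-to-contact from ventral optical flow.
    In a pure descent toward a plane at distance Z with speed Vz, flow
    is radial about the focus of expansion: v = D * p with D = Vz / Z,
    so tau = Z / Vz = 1 / D. D is fit by least squares over all
    (point, flow) pairs, stacked into flat vectors."""
    p = np.asarray(points, dtype=float).ravel()
    v = np.asarray(flows, dtype=float).ravel()
    D = (p @ v) / (p @ p)                   # scalar divergence estimate
    return 1.0 / D
```

Keeping tau constant during descent yields the smooth exponential deceleration used in ventral-landing studies.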

    • High-speed object avoidance: The goal is to work on an ultra-low-latency vision pipeline to avoid incoming objects in real-time, simulating threats in the form of orbital debris. This will involve a robotic element added to the orbital collision simulator.

    • Low-power visual attention in-space assembly using relative motion: In-space assembly of spacecraft requires precise pose estimation of neighbouring modules. This determines their relative position and attitude during proximity operations, crucial for successful assembly. Matching relative motion typically involves identifying the disparity in motion between neighbouring target modules or structures, enabling precise alignment and docking of spacecraft components through visual feedback. However, while event-based cameras are extremely sensitive to relative motion, the closer we match the motion of the target object, the fewer events the camera generates. This poses a problem precisely in the final stages of assembly. This project explores event-based visual attention, which reduces redundancy by focusing solely on relevant parts of the visual field, in combination with event-based relative motion estimation algorithms to initiate in-space assembly.

    • Optimal neural control in space: Control algorithms for satellites are abundant. They have to deliver incredible precision with a minuscule power budget. Biological neurons evolved to solve exactly this task with limited resources (https://authors.library.caltech.edu/53092/1/00080335.pdf), with the added benefit of redundancy for fault tolerance and radiation-proofing. This project explores optimal control algorithms for specific sub-problems, such as attitude control using a reaction wheel (https://en.wikipedia.org/wiki/Reaction_wheel). We will use state-of-the-art optimisation techniques to find neural solutions for conventional closed-loop control problems, and continuously compare the resource usage against algorithms running on conventional hardware. The neural controllers will be more efficient, but how far can we push the envelope?
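The conventional baseline that neural controllers would be compared against can be sketched as a PD reaction-wheel loop on a single axis (my illustration; gains, inertia, and the idealised dynamics are all assumptions):

```python
def simulate_attitude(theta0, kp=4.0, kd=3.0, inertia=1.0, dt=0.01, steps=2000):
    """Toy single-axis attitude control with a reaction wheel.
    The wheel is commanded a torque; the equal-and-opposite reaction
    torque u acts on the spacecraft body. PD feedback drives the body
    angle theta (rad) to zero; semi-implicit Euler integration."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        u = -kp * theta - kd * omega        # reaction torque on the body
        omega += (u / inertia) * dt
        theta += omega * dt
    return theta
```

A spiking or neural controller would replace the PD law while its energy per update is compared against this baseline on conventional hardware.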

    Materials, Equipment, and Tutorials:
    We're going to make available several pieces of equipment, including telescopes to record the night sky, different event cameras from Prophesee and iniVation, a Phantom high-frame-rate camera for comparison, and neuromorphic hardware such as BrainChip's Akida and SynSense's Speck. ICNS will also provide access to their Astrosite network of remote telescopes, as well as their new DeepSouth cluster.

    We will run hands-on sessions on neuromorphic sensing and processing in space, building on successful tutorials from space conferences, providing code and examples for projects, and training with neuromorphic hardware. Experts in space imaging, lightning, and high-speed phenomena detection will give talks, focusing on neuromorphic hardware's potential to address current shortcomings. The workshop will feature unique data from the International Space Station, provided by WSU and USAFA, marking its first public release and allowing participants to develop new algorithms for space applications and explore neuromorphic hardware's effectiveness in processing this data for future space missions.

    Additionally, various data collection systems will be available, including telescope observation equipment, long-range lenses, tripods, a Phantom high-speed camera, and WSU's Astrosite system for space observations. Neurobus will make available neuromorphic hardware on site. This setup facilitates experiments, specific data collection for applications, experimenting with closed-loop neuromorphic systems, and online participation in space observation topics due to the time difference.