Peer reviewed
Open access
D’Angelo, Giulia; Bartolozzi, Chiara
BMJ Open Ophthalmology, 03/2024, Volume 9, Issue Suppl 1. Journal article.
Introduction
Visual applications in robotics must meet strict requirements for power efficiency, low latency, and data processing capacity. Despite the remarkable performance achievements of traditional computer vision methods, they struggle to generalise effectively and often rely on vast datasets, increasing data processing and transfer. The proposed system leverages bioinspired visual attention mechanisms to process only relevant parts of the scene, further exploring event-based sensing and neuromorphic computing via Spiking Neural Networks (SNNs).

Aims
This scientific challenge aims to connect bioinspired hardware with biologically plausible algorithms, thereby showcasing the potential of spike-based implementations for online robotic visual applications.

Methods
The bioinspired saliency-based visual attention model processes events from event-driven cameras on the humanoid robot iCub, running on SpiNNaker neuromorphic hardware. Intensity, disparity, and motion are the bottom-up feature extraction channels competing for scene representation. These cues feed into a biologically plausible saliency-based proto-object model, grounded in Gestalt perceptual grouping theories, to detect only relevant parts of the scene. The model produces saliency maps in which salient areas represent regions potentially containing objects, called 'proto-objects'.

Results
The online system accurately generates saliency maps in ~16 ms, detecting salient proto-objects and disregarding clutter. The system has been validated qualitatively and quantitatively, achieving results comparable to the frame-based implementation in simple online office scenarios, as well as against ground-truth fixation maps from real human subjects (NUS3D dataset).

Conclusion
This project is the first significant step towards more complex real-world robotic vision applications, where bioinspiration sets the basis for fast, power-efficient online robotic applications and innovative computer vision approaches.
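The abstract describes bottom-up feature channels (intensity, disparity, motion) that are normalized and combined into a single saliency map whose peaks mark candidate proto-objects. The sketch below illustrates that combination step only, in conventional NumPy rather than the spiking/SpiNNaker implementation the authors use; the event accumulation, map sizes, and the simple min-max normalization are illustrative assumptions, not the paper's method.

```python
import numpy as np

def events_to_frame(events, shape=(16, 16)):
    """Accumulate (x, y) event coordinates into a 2D count map
    (a crude stand-in for one event-driven feature channel)."""
    frame = np.zeros(shape)
    for x, y in events:
        frame[y, x] += 1.0
    return frame

def normalize(m):
    """Scale a feature map to [0, 1]; flat maps stay all-zero."""
    rng = m.max() - m.min()
    return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

def saliency_map(feature_maps):
    """Normalize each channel, sum them, and renormalize —
    an Itti-Koch-style combination of competing feature maps."""
    return normalize(sum(normalize(m) for m in feature_maps))

# Toy usage: a burst of events at (10, 10) dominates the map,
# so the saliency peak (a 'proto-object' candidate) lands there.
events = [(10, 10)] * 5 + [(3, 4)]
intensity = events_to_frame(events)
motion = np.zeros((16, 16))          # a silent second channel
s = saliency_map([intensity, motion])
peak = np.unravel_index(np.argmax(s), s.shape)
print(peak)  # → (10, 10)
```

The design point mirrored here is that per-channel normalization lets weak channels contribute without one noisy channel drowning out the rest; the actual proto-object model additionally applies Gestalt grouping, which this sketch omits.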