Multi-modal novelty and familiarity detection

Panchev, Christo (2013) Multi-modal novelty and familiarity detection. BMC Neuroscience.

Full text not available from this repository.

Abstract

Journal special issue for the Computational Neuroscience Meeting (CNS 2013)

This work presents a further development of the architecture described in [1], in which top-down feature-based and spatial attention have been incorporated into a large-scale visual module, together with novelty and familiarity detectors based on the model presented in [2]. The detectors have been developed for the perceptual (visual and auditory) and motor modalities. In addition to the novelty/familiarity detection demonstrated in [2, 3], the architecture can partially recognise familiar features within each perceptual modality and, in a distributed fashion, activate the associated familiar features in another perceptual modality and/or in the corresponding motor programmes and affordances. The architecture is implemented on a mobile robot operating in a dynamic environment. The proposed distributed multi-modal familiarity detection integrated into the architecture improves recognition and action performance in a noisy environment, and it also contributes to the multi-modal association and learning of novel objects and actions.
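
The abstract describes, at a high level, per-modality novelty/familiarity detection combined with cross-modal activation of associated features. The following Python sketch is purely illustrative and is not the authors' implementation (which is based on the spiking-network models of [1, 2]); all class names, the cosine-similarity familiarity measure, and the thresholds are hypothetical, and it only shows the general data flow: each modality scores incoming features as familiar or novel, and co-occurring features are associated so that a familiar cue in one modality can recall linked features in the others.

import numpy as np


class ModalityFamiliarityDetector:
    """Stores feature prototypes for one modality (e.g. visual, auditory, motor)
    and scores incoming features as familiar or novel. Hypothetical sketch."""

    def __init__(self, threshold: float = 0.8):
        self.prototypes: list[np.ndarray] = []
        self.threshold = threshold  # similarity above this counts as familiar

    def familiarity(self, feature: np.ndarray) -> float:
        # Cosine similarity to the closest stored prototype (0 if none stored).
        if not self.prototypes:
            return 0.0
        sims = [
            float(feature @ p / (np.linalg.norm(feature) * np.linalg.norm(p) + 1e-9))
            for p in self.prototypes
        ]
        return max(sims)

    def observe(self, feature: np.ndarray) -> bool:
        """Return True if the feature is familiar; store novel features."""
        if self.familiarity(feature) >= self.threshold:
            return True
        self.prototypes.append(feature.copy())
        return False


class MultiModalArchitecture:
    """Ties per-modality detectors together: a familiar feature in one modality
    can activate the features it has been associated with in the others."""

    def __init__(self, modalities):
        self.detectors = {m: ModalityFamiliarityDetector() for m in modalities}
        self.associations = []  # list of {modality: feature} dicts seen together

    def perceive(self, inputs: dict) -> dict:
        familiar = {m: self.detectors[m].observe(f) for m, f in inputs.items()}
        # Co-occurring features are stored as an association so that a later
        # familiar cue in one modality can recall the linked features.
        self.associations.append(dict(inputs))
        return familiar

    def recall(self, modality: str, cue: np.ndarray) -> dict:
        """Given a familiar cue in one modality, return the associated features
        from the best-matching past co-occurrence in the other modalities."""
        best, best_sim = None, 0.0
        for assoc in self.associations:
            if modality not in assoc:
                continue
            p = assoc[modality]
            sim = float(cue @ p / (np.linalg.norm(cue) * np.linalg.norm(p) + 1e-9))
            if sim > best_sim:
                best, best_sim = assoc, sim
        if best is None or best_sim < self.detectors[modality].threshold:
            return {}
        return {m: f for m, f in best.items() if m != modality}


# Hypothetical usage: a familiar visual feature recalls the associated motor programme.
arch = MultiModalArchitecture(["visual", "auditory", "motor"])
arch.perceive({"visual": np.array([1.0, 0.0]), "motor": np.array([0.0, 1.0])})
recalled = arch.recall("visual", np.array([1.0, 0.0]))  # -> {"motor": array([0., 1.])}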

Item Type: Article
Subjects: Computing > Information Systems
Divisions: Faculty of Technology; Faculty of Technology > School of Computer Science
Depositing User: Glenda Young
Date Deposited: 12 Apr 2013 10:15
Last Modified: 15 Jun 2020 13:21
URI: http://sure.sunderland.ac.uk/id/eprint/3760