12 April 2004 Polymodal information processing via temporal cortex Area 37 modeling
A model of biological information processing is presented that consists of auditory and visual subsystems linked to temporal cortex and limbic processing. A biologically based algorithm is presented for fusing information sources of fundamentally different modalities. Proof of concept is outlined with a system that combines auditory input (musical sequences) and visual input (illustrations such as paintings) via a model of processing in Area 37 of the temporal cortex. The training data can be used to construct both a connectionist model, whose biological relevance is suspect yet which is still useful, and a biologically based model that achieves the same input-to-output map through biologically relevant means. The constructed models are able to create, from a set of auditory and visual cues, a combined musical/illustration output that shares many of the properties of the original training data. These algorithms are not dependent on these particular auditory/visual modalities and hence are of general use in the intelligent computation of outputs that require sensor fusion.
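The connectionist variant mentioned above can be sketched as a minimal shared-hidden-layer fusion network. The abstract does not publish the model's architecture, so every name, dimension, and weight below is an illustrative assumption: each modality's feature vector is projected into a common "Area 37"-like layer, and the fused representation is decoded into a single combined output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- the paper does not specify its architecture.
AUDIO_DIM, VISUAL_DIM, FUSED_DIM, OUT_DIM = 8, 12, 6, 10

# Random weights stand in for trained parameters; a real model would fit
# these to paired auditory/visual training data.
W_audio = rng.standard_normal((FUSED_DIM, AUDIO_DIM)) * 0.1
W_visual = rng.standard_normal((FUSED_DIM, VISUAL_DIM)) * 0.1
W_out = rng.standard_normal((OUT_DIM, FUSED_DIM)) * 0.1

def fuse(audio_cue, visual_cue):
    """Map two modality-specific cues to one combined output vector.

    Each modality is linearly projected into a shared hidden layer,
    the projections are summed and squashed with tanh, and the fused
    activity is decoded into a joint output.
    """
    fused = np.tanh(W_audio @ audio_cue + W_visual @ visual_cue)
    return W_out @ fused

combined = fuse(rng.standard_normal(AUDIO_DIM),
                rng.standard_normal(VISUAL_DIM))
print(combined.shape)  # (10,)
```

The same skeleton applies to any pair of modalities: only the input projections are modality-specific, which is what makes this style of fusion general-purpose.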
© (2004) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
James K. Peterson, "Polymodal information processing via temporal cortex Area 37 modeling", Proc. SPIE 5421, Intelligent Computing: Theory and Applications II, (12 April 2004); https://doi.org/10.1117/12.540423

