Object recognition by active fusion

Manfred Prantl, H. Borotschnig, H. Ganster, David Sinclair, Axel Pinz

    Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

    Abstract

    Today's computer vision applications often have to deal with multiple, uncertain, and incomplete visual information. In this paper, we apply a new method, termed 'active fusion', to the problem of generic object recognition. Active fusion provides a common framework for actively selecting and combining information from multiple sources in order to arrive at a reliable result at reasonable cost. In our experimental setup, we use a camera mounted on a 2 m by 1.5 m x/z-table observing objects placed on a rotating table. Zoom, pan, tilt, and aperture setting of the camera can be controlled by the system. We follow a part-based approach, decomposing objects into parts that are modeled as geons. The active fusion system starts from an initial view of the objects placed on the table and continuously tries to refine its current object hypotheses by requesting additional views. The implementation of active fusion on the basis of probability theory, Dempster-Shafer theory of evidence, and fuzzy set theory is discussed. First results demonstrating segmentation improvements by active fusion are presented.
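    The abstract names Dempster-Shafer theory of evidence as one basis for combining hypotheses from successive views. As a minimal illustrative sketch (not the paper's actual implementation), the following shows Dempster's rule of combination fusing belief masses over hypothetical object labels obtained from two camera views:

    ```python
    def combine_dempster(m1, m2):
        """Fuse two mass functions (dicts: frozenset of hypotheses -> mass)
        with Dempster's rule of combination."""
        combined = {}
        conflict = 0.0  # total mass assigned to contradictory pairs
        for focal1, mass1 in m1.items():
            for focal2, mass2 in m2.items():
                intersection = focal1 & focal2
                if intersection:
                    combined[intersection] = combined.get(intersection, 0.0) + mass1 * mass2
                else:
                    conflict += mass1 * mass2
        # Normalize by (1 - K), redistributing conflicting mass
        return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

    # Hypothetical evidence from two views about an object's identity:
    view1 = {frozenset({'cup'}): 0.6, frozenset({'cup', 'bowl'}): 0.4}
    view2 = {frozenset({'cup'}): 0.7, frozenset({'cup', 'bowl'}): 0.3}
    fused = combine_dempster(view1, view2)
    # Belief in 'cup' increases after fusing the second view.
    ```

    In an active-fusion loop, the system would request a new view whenever the fused masses remain too ambiguous, then combine the new evidence in the same way.
    
    
    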
    Original language: English
    Title of host publication: Intelligent robots and computer vision XV
    Place of publication: Bellingham, Wash.
    Publisher: SPIE
    Pages: 320-330
    Volume: 2904
    DOIs
    Publication status: Published - 1996
    Event: Intelligent Robots and Computer Vision - Boston, Mass., United States
    Duration: 19 Nov 1996 - 21 Nov 1996

    Publication series

    Name: SPIE Proceedings Series
    Publisher: SPIE

    Conference

    Conference: Intelligent Robots and Computer Vision
    Country/Territory: United States
    City: Boston, Mass.
    Period: 19/11/96 - 21/11/96
