Non-parametric context-based object classification in images
DOI: https://doi.org/10.5755/j01.itc.46.1.13610

Keywords: object classification, image context, image segmentation, natural images

Abstract
Segmentation and classification of objects in images is one of the most
important and yet one of the most complex problems in computer vision. Many
factors contribute to this complexity, the most significant being variations in
object appearance across viewpoints, changes in scene illumination, and the
typically large number of partially occluded objects in a scene. Humans deal
with these complications successfully and recognize objects in scenes with
seemingly little effort. To aid recognition, however, humans draw on different
sources of information, such as previous experience, the context of objects in
a scene, and rules learned in the past about how the physical world should
appear and behave. Computer vision tries to replicate the strategies used by
human vision and apply them to computer systems, with the aim of improving the
typical tasks of detecting, localizing, and classifying image objects. In this
work we propose a new model for natural-image object classification that uses
contextual information at the level of image segments. Context modeling is
largely independent of appearance-based classification, and the proposed model
allows existing systems to be easily extended with information from the global
and/or local context. Context modeling is based on non-parametric use of
appearance-based classification results, a novel approach compared to previous
systems that model context with a limited number of rules expressed through a
fixed set of parameters. Implementing the model resulted in a system that, in
our simulations, showed a stable improvement over appearance-based object
classification alone.
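To make the idea concrete, here is a minimal sketch of one plausible way to combine appearance-based classification with segment-level context. All function and variable names are illustrative assumptions, not the authors' implementation: appearance scores per segment are blended with context support computed non-parametrically from class co-occurrence counts over neighboring segments, rather than from a fixed set of fitted parameters.

```python
# Hedged sketch, not the paper's actual method: rescoring a segment's
# appearance-based class probabilities using co-occurrence context
# gathered non-parametrically (counted from training data) over
# neighboring segments.

def normalize(scores):
    """Rescale a class -> score dict so the values sum to 1."""
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

def contextual_rescore(appearance, neighbors, cooccur, alpha=0.5):
    """Blend appearance probabilities with context support.

    appearance: dict class -> P(class | segment appearance)
    neighbors:  list of appearance dicts for adjacent segments
    cooccur:    dict (c1, c2) -> co-occurrence frequency, counted
                from training images (non-parametric) rather than
                expressed through fitted model parameters
    alpha:      weight of the context term (assumed hyperparameter)
    """
    context = {}
    for c in appearance:
        support = 0.0
        for nb in neighbors:
            for c2, p in nb.items():
                support += p * cooccur.get((c, c2), 0.0)
        context[c] = support / max(len(neighbors), 1)
    blended = {c: (1 - alpha) * appearance[c] + alpha * context[c]
               for c in appearance}
    return normalize(blended)

# Toy example: an ambiguous segment next to a "water" segment.
seg = {"boat": 0.4, "car": 0.6}            # appearance alone prefers "car"
nbrs = [{"water": 0.9, "road": 0.1}]
co = {("boat", "water"): 0.8, ("car", "road"): 0.7,
      ("car", "water"): 0.05, ("boat", "road"): 0.1}
print(contextual_rescore(seg, nbrs, co))   # context flips the decision to "boat"
```

The design choice illustrated here is the one the abstract emphasizes: the context term reuses the appearance classifier's soft outputs for neighboring segments directly, so the context stage can be layered on top of an existing appearance-based system without retraining it.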
License
Copyright terms are indicated in the Republic of Lithuania Law on Copyright and Related Rights, Articles 4-37.