The perception process of an artificial system can be considered in two parts: active and passive. In a passive perception application, incoming data are organized using some form of fusion to represent information about the surroundings. This information, which can be thought of as an "environmental picture," is then passed through the system to its various components. Perception is considered passive when no feedback component is present to readjust or redirect the environmental picture.
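A minimal sketch of such a passive pipeline, assuming a simple confidence-weighted averaging as the fusion scheme (the function name, the weighting rule, and the sensor values below are illustrative, not from the source):

```python
def fuse(readings):
    """Confidence-weighted average of scalar sensor readings.

    Each reading is a (value, confidence) pair; the fused value stands in
    for the "environmental picture" handed to downstream components.
    """
    total_weight = sum(conf for _, conf in readings)
    return sum(value * conf for value, conf in readings) / total_weight

# Three hypothetical range sensors observing the same obstacle.
readings = [(2.0, 0.5), (2.2, 0.3), (1.9, 0.2)]
picture = fuse(readings)
# Note the one-way flow: nothing is fed back to the sensors, which is
# exactly what makes this perception passive.
```

Any other fusion rule (Kalman filtering, voting, grid mapping) could replace the weighted average; the defining property is the absence of a feedback path.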
An active perception component, introduced by Biel and Wide, may act as feedback within the perception system. Active perception may redirect specific sensing modules or adjust their settings. A biological example of active perception is vision: the eye compensates for luminance when detecting objects. Another example occurs in the olfactory sense, where a desensitization effect adjusts for persistent odors. Determining how the active perception module interacts with the sensing components may require a knowledge base. As described by Bajcsy, a top-down or bottom-up approach can be adopted when building an active perception system. In the bottom-up method, the system has no prior knowledge about the environment and must compare incoming data against a knowledge base. In a top-down approach, by contrast, the system has a predefined goal and searches the environment for that goal.
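The feedback idea can be sketched as a small control loop, loosely analogous to the eye compensating for luminance: the system adjusts a sensor gain after each observation so the processed signal stays near a target level. All names, constants, and the proportional update rule here are assumptions for illustration, not the authors' method:

```python
def adapt_gain(raw_signal, gain, target=1.0, rate=0.5):
    """One perceive-adjust cycle: sense, then feed a correction back
    into the sensing module's setting (the gain)."""
    observed = raw_signal * gain          # passive sensing step
    error = target - observed             # feedback signal
    return gain + rate * error / raw_signal  # active readjustment

# A bright, constant stimulus; the loop "desensitizes" toward it.
gain = 1.0
for _ in range(20):
    gain = adapt_gain(raw_signal=4.0, gain=gain)
# gain converges toward 0.25, so that 4.0 * gain is close to the target 1.0
```

The loop is what distinguishes this from the passive case: past observations and their consequences change how the sensor is configured for the next observation.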
Active perception works in cooperation with sensor fusion, or it can be present within the embedded algorithm collecting the data. In short, the use of active perception in a perceptual system is best summarized as the "intelligent goal-driven ability to make new decisions based on information feedback from past actions and consequences to the environment. It is also aimed for focusing the attention and weight of perception detectors based on internal drives and needs and considers the motivation of the system in order to generate decisions".