Perception

Source: Wikipedia, the free encyclopedia.

The Necker cube and Rubin vase can be perceived in more than one way.
Humans can make a very good guess about the underlying 3D shape's category, identity, and geometry given only a silhouette of that shape. Computer vision researchers have been able to build computational models of perception that exhibit similar behavior and are capable of generating and reconstructing 3D shapes from single- or multi-view depth maps or silhouettes.[1]
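
The caption above mentions computational models that reconstruct 3D shape from silhouettes. The cited work uses learned models; as a hedged, simplified illustration of the general idea only, the sketch below implements the classical visual-hull ("space carving") approach instead, with silhouettes and 3x4 camera projection matrices assumed as inputs.

    # Minimal visual-hull sketch: keep a voxel only if it projects inside
    # every input silhouette. Inputs (silhouettes, camera matrices, volume
    # bounds) are assumed for illustration and do not come from the article.
    import numpy as np

    def carve_visual_hull(silhouettes, projections, grid_min, grid_max, res=64):
        # Regular grid of candidate voxel centres in homogeneous coordinates.
        axes = [np.linspace(grid_min[i], grid_max[i], res) for i in range(3)]
        X, Y, Z = np.meshgrid(*axes, indexing="ij")
        pts = np.stack([X, Y, Z, np.ones_like(X)], axis=-1).reshape(-1, 4)

        occupied = np.ones(len(pts), dtype=bool)
        for sil, P in zip(silhouettes, projections):
            uvw = pts @ P.T                   # project voxels into this view
            uv = uvw[:, :2] / uvw[:, 2:3]     # perspective divide -> pixel coords
            u = np.round(uv[:, 0]).astype(int)
            v = np.round(uv[:, 1]).astype(int)
            h, w = sil.shape
            in_img = (u >= 0) & (u < w) & (v >= 0) & (v < h)
            in_sil = np.zeros(len(pts), dtype=bool)
            in_sil[in_img] = sil[v[in_img], u[in_img]]
            occupied &= in_sil                # carve away voxels outside this silhouette
        return occupied.reshape(res, res, res)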

Perception (from Latin perceptio, 'gathering, receiving') is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment. All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system. Vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.

Perception is not only the passive receipt of these signals, but is also shaped by the recipient's learning, memory, expectation, and attention. Sensory input is a process that transforms low-level information into higher-level information (e.g., extracting shapes for object recognition).[5] The process that follows connects a person's concepts and expectations (or knowledge) with restorative and selective mechanisms (such as attention) that influence perception.

Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness. Since the rise of experimental psychology in the 19th century, psychology's understanding of perception has progressed by combining a variety of techniques. Psychophysics quantitatively describes the relationships between the physical qualities of the sensory input and perception.[6] Sensory neuroscience studies the neural mechanisms underlying perception. Perceptual systems can also be studied computationally, in terms of the information they process. Perceptual issues in philosophy include the extent to which sensory qualities such as sound, smell or color exist in objective reality rather than in the mind of the perceiver.[4]
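
As a hedged illustration of such quantitative description (these specific laws are not discussed above), classical psychophysics relates perceived sensation magnitude S to physical stimulus intensity I through simple laws such as the Weber–Fechner law and Stevens' power law, where I_0 is the detection threshold and k and a are modality-dependent constants:

    S = k \ln(I / I_0)    % Weber–Fechner law: sensation grows logarithmically with intensity
    S = k I^a             % Stevens' power law: sensation grows as a power of intensity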

Although people traditionally viewed the senses as passive receptors, the study of illusions and ambiguous images has demonstrated that the brain's perceptual systems actively and pre-consciously attempt to make sense of their input.[4] There is still active debate about the extent to which perception is an active process of hypothesis testing, analogous to science, or whether realistic sensory information is rich enough to make this process unnecessary.[4]

The perceptual systems of the brain enable individuals to see the world around them as stable, even though the sensory information is typically incomplete and rapidly varying. Human and other animal brains are structured in a modular way, with different areas processing different kinds of sensory information. Some of these modules take the form of sensory maps, mapping some aspect of the world across part of the brain's surface. These different modules are interconnected and influence each other. For instance, taste is strongly influenced by smell.[7]

Process and terminology

The process of perception begins with an object in the real world, known as the distal stimulus or distal object.[3] By means of light, sound, or another physical process, the object stimulates the body's sensory organs. These sensory organs transform the input energy into neural activity, a process called transduction.[3][8] This raw pattern of neural activity is called the proximal stimulus.[3] These neural signals are then transmitted to the brain and processed.[3]
The resulting mental re-creation of the distal stimulus is the percept.

An ordinary shoe can illustrate the process of perception. The shoe itself is the distal stimulus. When light from the shoe enters a person's eye and stimulates the retina, that stimulation is the proximal stimulus.[9] The image of the shoe reconstructed by the brain of the person is the percept. Another example is a ringing telephone. The ringing of the phone is the distal stimulus. The sound stimulating a person's auditory receptors is the proximal stimulus. The brain's interpretation of this as the "ringing of a telephone" is the percept.

The different kinds of sensation (such as warmth, sound, and taste) are called sensory modalities or stimulus modalities.[8][10]

Bruner's model of the perceptual process

Psychologist Jerome Bruner developed a model of perception, in which people put "together the information contained in" a target and a situation to form "perceptions of ourselves and others based on social categories."[11][12] This model is composed of three stages:

  1. When people encounter an unfamiliar target, they are very open to the informational cues contained in the target and the situation surrounding it.
  2. The first stage does not give people enough information on which to base perceptions of the target, so they will actively seek out cues to resolve this ambiguity. Gradually, people collect some familiar cues that enable them to make a rough categorization of the target.
  3. People become less open and more selective, searching for further cues that confirm their categorization of the target. They actively ignore and even distort cues that violate their initial perceptions. Their perception becomes more selective and they finally paint a consistent picture of the target.

Saks and Johns' three components of perception

According to Alan Saks and Gary Johns, there are three components to perception:

  1. The Perceiver: a person whose awareness is focused on the stimulus, and thus begins to perceive it. Many factors may influence the perceptions of the perceiver; the three major ones are (1) motivational state, (2) emotional state, and (3) experience. All of these factors, especially the first two, greatly contribute to how the person perceives a situation. Oftentimes, the perceiver may employ what is called a "perceptual defense", where the person will only see what they want to see.
  2. The Target: the object of perception; something or someone who is being perceived. The amount of information gathered by the sensory organs of the perceiver affects the interpretation and understanding about the target.
  3. The Situation: the environmental factors, timing, and degree of stimulation that affect the process of perception. These factors may leave a single stimulus as merely a stimulus, rather than a percept that the brain interprets.

Multistable perception

Stimuli are not necessarily translated into a percept, and rarely does a single stimulus translate into a percept. An ambiguous stimulus may sometimes be transduced into one or more percepts, experienced randomly, one at a time, in a process termed multistable perception. The same stimuli, or the absence of them, may result in different percepts depending on a subject's culture and previous experiences.[14]

Ambiguous figures demonstrate that a single stimulus can result in more than one percept. For example, the Rubin vase can be interpreted either as a vase or as two faces. The percept can bind sensations from multiple senses into a whole. A picture of a talking person on a television screen, for example, is bound to the sound of speech from speakers to form a percept of a talking person.

Types of perception

Cerebrum lobes

Vision

In many ways, vision is the primary human sense. Light is taken in through each eye and focused in a way which sorts it on the retina according to direction of origin. A dense surface of photosensitive cells, including rods, cones, and intrinsically photosensitive retinal ganglion cells, captures information about the intensity, color, and position of incoming light. Some processing of texture and movement occurs within the neurons on the retina before the information is sent to the brain. In total, about 15 differing types of information are then forwarded to the brain proper via the optic nerve.[15]

The timing of perception of a visual event has been measured at points along the visual circuit. A sudden alteration of light at a spot in the environment first alters photoreceptor cells in the retina, which send a signal to the retinal bipolar cell layer, which, in turn, can activate a retinal ganglion neuron. A retinal ganglion cell is a bridging neuron that connects visual retinal input to the visual processing centers within the central nervous system.[16] Light-evoked activation occurs within about 5–20 milliseconds in a rabbit retinal ganglion cell,[17] whereas in a mouse retinal ganglion cell the initial spike occurs between 40 and 240 milliseconds after the light change.[18] The initial activation can be detected by an action potential spike, a sudden spike in the neuron's membrane electric voltage.

One visual perceptual event that has been measured in humans is the presentation of an anomalous word. If individuals are shown a sentence, presented as a sequence of single words on a computer screen, with a puzzling word out of place in the sequence, the perception of the puzzling word can register on an electroencephalogram (EEG). In one experiment, human readers wore an elastic cap with 64 embedded electrodes distributed over their scalp surface.[19] Within 230 milliseconds of encountering the anomalous word, the readers generated an event-related electrical potential alteration of their EEG at the left occipital-temporal channel, over the left occipital lobe and temporal lobe.

Sound

Anatomy of the human ear. (The length of the auditory canal is exaggerated in this image.)
  Brown is outer ear.
  Red is middle ear.
  Purple is inner ear.

Hearing (or audition) is the ability to perceive sound by detecting vibrations (i.e., sonic detection). Frequencies capable of being heard by humans are called audio or audible frequencies, the range of which is typically considered to be between 20 Hz and 20,000 Hz.[20] Frequencies higher than audio are referred to as ultrasonic, while frequencies below audio are referred to as infrasonic.

The auditory system includes the outer ear, which collects and filters sound waves; the middle ear, which transmits and amplifies the sound pressure; and the inner ear, which converts the sound into neural signals. By the ascending auditory pathway these signals are led to the primary auditory cortex within the temporal lobe of the human brain, where the auditory information arrives in the cerebral cortex and undergoes further processing.

Sound does not usually come from a single source: in real situations, sounds from multiple sources and directions are superimposed as they arrive at the ears. Hearing involves the computationally complex task of separating out sources of interest, identifying them and often estimating their distance and direction.[21]
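
The article does not name an algorithm for this separation task; as a hedged illustration only, the sketch below applies independent component analysis (one classical blind source separation technique) to synthetic mixtures standing in for what two ears or microphones might receive.

    # Blind source separation sketch using FastICA from scikit-learn.
    # The signals, mixing matrix, and two-sensor setup are illustrative
    # assumptions, not data from the article.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 8000)

    # Two hypothetical sources: a pure tone and a noisy square wave.
    s1 = np.sin(2 * np.pi * 440 * t)
    s2 = np.sign(np.sin(2 * np.pi * 7 * t)) + 0.1 * rng.standard_normal(len(t))
    S = np.c_[s1, s2]

    # Each sensor ("ear") receives a different weighted mixture of the sources.
    A = np.array([[1.0, 0.6],
                  [0.4, 1.0]])
    X = S @ A.T

    # Recover statistically independent components from the mixtures alone;
    # the estimates match the sources only up to scaling and ordering.
    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)
    print(S_est.shape)  # (8000, 2)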

Touch