Input-dependent neuronal representations of virtual environments in the hippocampus
Author: Haas, Olivia V.
Year: 2017
Language: English
Publisher: Universitätsbibliothek der Ludwig-Maximilians-Universität München
Citation: Haas, Olivia V. (2017): Input-dependent neuronal representations of virtual environments in the hippocampus. Dissertation, LMU München: Graduate School of Systemic Neurosciences (GSN)
Full text: Haas_Olivia_V.pdf (PDF, 253MB)
Abstract

Animals live in highly complex sensory environments that are represented across multiple sensory modalities. These multisensory neural representations allow animals to navigate successfully in space and to form associative memories critical to survival. The remembered location of a plentiful food source or of a predator can mean the difference between life and death. To form such survival-relevant associative memories across multiple sensory modalities, animals must be able to sense, encode, and integrate information from their immediate environment. The information gathered across multiple sensory systems must therefore be temporally correlated and converge within the brain. The mammalian hippocampus is one such structure where sensory information converges. Hippocampal place cells fire at a particular location within an animal's environment (the place field). As the animal moves through a place field, its action potentials occur at progressively earlier phases of the hippocampal theta oscillation (phase precession). Place fields encode specific positions in the animal's environment and therefore provide a suitable substrate for the integration of diverse sensory inputs with spatial information. Such an association would anchor sensory landmarks within the environment and would thus be crucial for successful spatial orientation and navigation.

To better understand how multisensory information is integrated during the formation of a neuronal place map, I performed in vivo extracellular recordings in the hippocampus of behaving Mongolian gerbils (Meriones unguiculatus). Specifically, I sought to understand the relative contributions of visual and locomotor inputs to place cell activity. Recordings were performed in a virtual-reality behavioural setup in which animals ran along a virtual linear corridor. Animal locomotion was measured with a tracking-ball treadmill. By altering the gain factor between the movement of the ball and the speed of the visual projection on a single-trial basis, I could decouple visually perceived movement from the animal's locomotion. These experiments showed that place cells in Cornu Ammonis area 3 (CA3) responded differentially to the closed-loop manipulation. One subset of place cells formed its place fields based on visual information within the virtual environment, independent of the distance travelled by the animal; such visually driven place fields occurred predominantly at visual texture changes within the virtual corridor. A second subset of place cells relied predominantly on locomotor inputs, as their place fields remained at the same running distance on a single-trial basis. This was confirmed in dark trials, during which the projection was switched off and animals had to rely on internal cues such as path integration and proprioceptive/motor-efference information. A third subset of cells exhibited two place fields, each driven by one of the two inputs. At the population level, both types of place fields formed input-specific maps that were represented simultaneously in the hippocampus. This notion is corroborated by the fact that place field firing adjusted on a single-trial basis, without requiring adaptive processes. Furthermore, I investigated whether overlapping input-specific place maps are integrated in a common processing frame. To this end, I analysed phase precession of overlapping place fields, which is thought to be crucial for encoding temporally ordered events compressed within a theta cycle.
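The central manipulation can be illustrated with a short sketch. The Python snippet below (all function and variable names are illustrative, not taken from the thesis) shows how a per-trial gain factor decouples position in the virtual corridor from the distance actually run on the tracking ball, and why a visually anchored place field and a locomotor-anchored field make different predictions under a gain change.

```python
import numpy as np

def virtual_position(ball_displacement_cm, gain):
    """Map treadmill (tracking-ball) displacement onto position in the
    virtual corridor. `gain` scales visual flow relative to locomotion;
    gain = 1.0 means the projection moves exactly as fast as the animal
    runs. Hypothetical names, not the thesis' actual code."""
    return gain * np.cumsum(ball_displacement_cm)

# One trial at normal gain, one at halved gain (the animal must run
# twice as far to traverse the same stretch of the virtual corridor).
step = np.full(1000, 0.2)                  # 0.2 cm of running per sample
pos_gain_1 = virtual_position(step, 1.0)   # visual coordinate, gain 1.0
pos_gain_05 = virtual_position(step, 0.5)  # visual coordinate, gain 0.5

# A "visual" field anchored at 60 cm in the corridor is reached after
# 60 cm of running at gain 1.0 but 120 cm at gain 0.5, whereas a
# "locomotor" field anchored at 60 cm of running distance is unaffected.
run_dist = np.cumsum(step)
print(run_dist[np.argmax(pos_gain_1 >= 60)])   # ~60 cm of running
print(run_dist[np.argmax(pos_gain_05 >= 60)])  # ~120 cm of running
```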
I found that theta-scale timing correlations shift with gain changes, which argues against mechanisms based exclusively on recurrent CA3 connections for memory formation. My results therefore indicate that the hippocampus preserves information from distinct processing streams to form coexisting spatial representations. These coexisting input-specific maps could then be associated at the population level to form multimodal memories. Such associations are likely to be formed from sensory information fed forward through the hippocampus, rather than via recurrent hippocampal connections. The hippocampus thus provides animals with several maps of their environment, each tied to a specific sensory input. Animals could then rely on the most appropriate map for a given situation; for example, locomotor information is more reliable than visual information when navigating in darkness.
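For the phase-precession analysis mentioned above, a standard approach is circular-linear regression of spike theta phase against position within the field (cf. Kempter et al., 2012). The sketch below (Python/NumPy; illustrative only and not necessarily the exact analysis used in the thesis) fits the precession slope by maximizing the mean resultant length of the phase residuals over a grid of candidate slopes.

```python
import numpy as np

def circular_linear_fit(pos_in_field, theta_phase_rad,
                        slope_grid=np.linspace(-4 * np.pi, 4 * np.pi, 2001)):
    """Fit a phase-precession slope by circular-linear regression: pick
    the slope (rad per unit position) that maximizes the mean resultant
    length of (phase - slope * position). A common method, sketched here
    under assumed conventions; not the thesis' actual analysis code."""
    best_slope, best_R = 0.0, -np.inf
    for a in slope_grid:
        resid = theta_phase_rad - a * pos_in_field
        R = np.abs(np.mean(np.exp(1j * resid)))    # mean resultant length
        if R > best_R:
            best_R, best_slope = R, a
    offset = np.angle(np.mean(np.exp(1j * (theta_phase_rad
                                           - best_slope * pos_in_field))))
    return best_slope, offset, best_R

# Synthetic example: spikes precess ~360 degrees across a normalized field.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)                         # position in field (0-1)
phase = (np.pi - 2 * np.pi * x + rng.normal(0, 0.6, x.size)) % (2 * np.pi)
print(circular_linear_fit(x, phase))               # slope near -2*pi rad/field
```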