Štoček, Fabián (2023): Disentangling multi sensory contribution to navigation in a rat using a novel freely moving virtual reality-enabled path integration task. Dissertation, LMU München: Graduate School of Systemic Neurosciences (GSN)
PDF: Stocek_Fabian.pdf (15 MB)
Abstract
The ability to assess the distance and direction traveled plays a vital role across the animal kingdom. Animals draw on many kinds of sensory systems to achieve this feat, combining their information into two frames of reference that can be used to retrieve a previously taken path: an allocentric reference frame, which relates objects in the environment to each other, and an egocentric reference frame, which relates a given object to the animal itself. The egocentric frame becomes especially important when external cues are limited or absent. Under such conditions, for example in darkness, animals rely on the ability to integrate their past path in order to return to a starting location. Understanding this process of path integration also helps to uncover the evolutionary basis of memory: the mechanism for planning and integrating information in the mind is thought to have evolved from the physical need to navigate the environment (György Buzsáki and E. I. Moser 2013), and there is considerable evidence for the similarity between physical and mental travel (György Buzsáki and James J. Chrobak 2005; Eichenbaum et al. 1999). Researchers have tried to understand the integration of egocentric cues using many animal behavioral assays. These assays have been hard to combine with recordings from the brain because they cannot be repeated over many trials within the same animal or lack the precision to measure behavior accurately. Many of them also offer no full control over visual or vestibular perturbations, which is needed to study the respective contributions of these inputs. Here I developed a novel path integration task and used a freely moving virtual reality system (Del Grosso and Sirota 2019) to enable variable homing locations within an arena and to implement perturbations of vestibular input and optic flow. The task structure can be split into path integration and goal-directed segments within a trial, is suitable for long-term extracellular recordings, and yields a high throughput of trials. I recorded over 50,000 trials and over 100,000 rears, which were essential for correctly characterizing the task, quantifying performance, and testing for the use of allocentric information. The flexible homing location and the binary response of the rat (rearing) allow for objective measures of accuracy in the path integration task. With freely moving virtual reality I could create conflicting information to study the contribution of each idiothetic sensory input to path integration. I show that transient visual or vestibular perturbations delivered during the task segment associated with path integration strongly deteriorate the animal's performance. Such perturbations are expected to disrupt the head-direction system, which is necessary for successful path integration, and therefore strongly suggest that the developed task is indeed purely path integration-dependent. I present a preliminary analysis of recordings of multiple place cells from the CA1 layer of the hippocampus, which provides a first glimpse of the spatial coding of the same cells during the different modes of navigation associated with distinct segments of the task. Finally, I discuss how the task can easily be adapted in the future to further investigate this complicated mechanism.
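To make the core computation concrete: path integration (dead reckoning) amounts to accumulating egocentric self-motion samples, step length and heading, into a running estimate of displacement from the start, so that the homing vector is simply the negative of that accumulated displacement. The sketch below is for illustration only; it is not code from the dissertation, and the function name and example data are hypothetical.

```python
# Minimal path-integration (dead reckoning) sketch, for illustration only.
# NOT the task or analysis code from the dissertation; step data and the
# function name are hypothetical.
import numpy as np

def integrate_path(step_lengths, headings_rad):
    """Accumulate self-motion samples into a net displacement from the start.

    step_lengths : per-sample distances traveled (e.g. cm per video frame)
    headings_rad : allocentric heading at each sample, in radians
    Returns the net (x, y) displacement vector from the starting point.
    """
    dx = np.sum(step_lengths * np.cos(headings_rad))
    dy = np.sum(step_lengths * np.sin(headings_rad))
    return np.array([dx, dy])

# Hypothetical outbound trajectory: the homing vector that would return the
# animal to its start is the negative of the integrated displacement.
steps = np.array([10.0, 12.0, 8.0])        # cm
headings = np.radians([0.0, 45.0, 90.0])   # degrees -> radians
homing_vector = -integrate_path(steps, headings)
print(homing_vector)  # direction and distance back to the start location
```

In this framing, a transient vestibular or optic-flow perturbation corrupts the heading estimate fed into the accumulation, which is why such perturbations are expected to degrade homing accuracy.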
| Document type: | Dissertations (Dissertation, LMU München) |
|---|---|
| Keywords: | neuroscience, path integration, VR, electrophysiology, multisensory, rat |
| Subject areas: | 500 Natural sciences and mathematics > 570 Life sciences, biology |
| Faculties: | Graduate School of Systemic Neurosciences (GSN) |
| Language of thesis: | English |
| Date of oral examination: | 12 July 2023 |
| First referee: | Sirota, Anton |
| MD5 checksum of the PDF file: | d48992fbc189aa528f6aba8259964251 |
| Call number of the printed copy: | 0001/UMC 30529 |
| ID code: | 32826 |
| Deposited on: | 17 Jul 2024 11:41 |
| Last modified: | 17 Jul 2024 11:41 |