
Medial temporal dynamics underlying spatial navigation in freely moving humans immersed in virtual, augmented, and real-world environments

Little is known about the mechanisms in the human brain that allow a person to keep track of their own location, or the locations of others, while freely moving through a shared environment. I will present recent findings from our research platform, which enables wireless recording of deep brain activity in human participants who can move freely while immersed in real, augmented, or virtual spatial environments. We find that specific patterns of oscillatory activity in the human medial temporal lobe are modulated by eye movements, context, one's own position, and the position of another person in a spatial environment, and that all of these effects further depend on the momentary task goal. Lastly, I will present updates on our ongoing aim to use mobile deep brain recording in multiple interacting participants navigating augmented or real-world environments.