Abstract
For visually guided navigation, the use of environmental cues is essential. In particular, detecting local boundaries that limit locomotion and estimating their locations are crucial. In a series of three fMRI experiments, we investigated whether there is a neural coding of navigational distance in the human visual cortex (both female and male participants). We used virtual reality software to systematically manipulate the distance from the viewer's perspective to different types of boundaries. Using multivoxel pattern classification with a linear support vector machine, we found that the occipital place area (OPA) is sensitive to navigational distance restricted by a transparent glass wall. Furthermore, the OPA was sensitive only to a non-crossable boundary, suggesting the importance of the functional constraint a boundary imposes. Together, these results suggest the OPA as a perceptual source of external environmental features relevant for navigation.
| Original language | English |
|---|---|
| Pages (from-to) | 3621-3630 |
| Number of pages | 10 |
| Journal | Journal of Neuroscience |
| Volume | 40 |
| Issue number | 18 |
| DOIs | |
| Publication status | Published - 2020 Apr 29 |
Bibliographical note
Funding Information: Received Aug. 12, 2019; revised Feb. 28, 2020; accepted Mar. 5, 2020. Author contributions: J.P. and S.P. designed research; J.P. and S.P. performed research; J.P. and S.P. contributed unpublished reagents/analytic tools; J.P. and S.P. analyzed data; J.P. and S.P. wrote the paper. This work was supported by a National Eye Institute Grant (R01EY026042), a National Research Foundation of Korea Grant (funded by MSIP-2019028919), and a Yonsei University Future-leading Research Initiative (2019-22-0217) to S.P. We thank the F.M. Kirby Research Center for Functional Brain Imaging at the Kennedy Krieger Institute, Johns Hopkins University. The authors declare no competing financial interests. Correspondence should be addressed to Soojin Park at soojin.park@yonsei.ac.kr. https://doi.org/10.1523/JNEUROSCI.1991-19.2020 Copyright © 2020 the authors
Publisher Copyright:
Copyright © 2020 the authors.
All Science Journal Classification (ASJC) codes
- Neuroscience (all)