A review of how the prefrontal cortex and high-level visual cortex interact during perception.
Kornblith, S., & Tsao, D. Y. (2017). How thoughts arise from sights: inferotemporal and prefrontal contributions to vision. Current Opinion in Neurobiology, 46, 208-218.
“The Functional Architecture of Cognition Is Rhythmic”. Indeed.
Helfrich, R.F., & Knight, R.T. (2016). Oscillatory Dynamics of Prefrontal Cognitive Control. Trends in Cognitive Sciences, 20(12), 916-930. https://doi.org/10.1016/j.tics.2016.09.007
Rhythmic coupling across the cortex underlies perception.
Helfrich, R.F., Huang, M., Wilson, G., & Knight, R.T. (2017). Prefrontal cortex modulates posterior alpha oscillations during top-down guided visual perception. PNAS, 114(35), 9457-9462 (published ahead of print August 14, 2017). https://doi.org/10.1073/pnas.1705965114
Distributed networks, not functional modules, are where it’s at in the 21st century.
This paper argues that decisions arise from repeated computations distributed across many brain regions, which fits with the distributed nature of neural coding.
Hunt, L.T., & Hayden, B.Y. (2017). A distributed, hierarchical and recurrent framework for reward-based choice. Nature Reviews Neuroscience, 18(3), 172-182. https://doi.org/10.1038/nrn.2017.7
For further reading:
Siegel, M., Buschman, T.J., & Miller, E.K. (2015). Cortical information flow during flexible sensorimotor decisions. Science, 19 June 2015: 1352-1355.
Neuron populations, not individual neurons, are where it’s at in the 21st century.
Arandia-Romero, I., Nogueira, R., Mochol, G., & Moreno-Bote, R. (2017). What can neuronal populations tell us about cognition? Current Opinion in Neurobiology, 46, 48-57. https://doi.org/10.1016/j.conb.2017.07.008
Come see Pavlov’s Dogz at Songbyrd DC on Sunday, Nov 12, during the SfN meeting.
9:30pm Songbyrd DC 11/12/17
We show how limitations in cognitive capacity (the number of thoughts you can hold in mind at the same time – very few) may be due to changes in rhythmic coupling between cortical areas. More specifically, feedback coupling breaks down when capacity is exceeded.
Pinotsis, D.A., Buschman, T.J., & Miller, E.K. Working Memory Load Modulates Neuronal Coupling.
New manuscript submitted to bioRxiv:
Neuronal rhythms orchestrate cell assemblies to distinguish perceptual categories
Morteza Moazami Goudarzi, Jason Cromer, Jefferson Roy, Earl K. Miller
Categories are reflected in the spiking activity of neurons. However, how neurons form ensembles for categories is unclear. To address this, we simultaneously recorded spiking and local field potential (LFP) activity in the lateral prefrontal cortex (lPFC) of monkeys performing a delayed match to category task with two independent category sets (Animals: Cats vs Dogs; Cars: Sports Cars vs Sedans). We found stimulus and category information in alpha and beta band oscillations. Different category distinctions engaged different frequencies. There was greater spike field coherence (SFC) in alpha (~8-14 Hz) for Cats and in beta (~16-22 Hz) for Dogs. Cars showed similar differences, albeit less pronounced: greater alpha SFC for Sedans and greater beta SFC for Sports Cars. Thus, oscillatory rhythms can help coordinate neurons into different ensembles. Engagement of different frequencies may help differentiate the categories.
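The spike-field coherence contrasts described above can be illustrated with a toy phase-locking calculation. This is my own minimal sketch, not the paper's analysis pipeline: it uses the magnitude of the mean unit phase vector (the LFP phase sampled at each spike time) as a simple SFC proxy, so a unit locked to the alpha rhythm scores near 1 and an unlocked unit scores near 0.

```python
import cmath
import math
import random

def spike_field_coherence(spike_times, lfp_phase):
    """Phase-locking proxy for spike-field coherence: the magnitude
    of the mean unit vector of the LFP phase at each spike time.
    1.0 = perfect locking; ~0 = no consistent spike-phase relation."""
    vectors = [cmath.exp(1j * lfp_phase(t)) for t in spike_times]
    return abs(sum(vectors) / len(vectors))

# Toy example: a 10 Hz ("alpha") LFP oscillation.
freq = 10.0
phase_at = lambda t: 2 * math.pi * freq * t  # instantaneous phase (radians)

random.seed(0)
# Locked unit: spikes near a fixed phase of each alpha cycle (small jitter).
locked = [n / freq + random.gauss(0.0, 0.005) for n in range(200)]
# Unlocked unit: spikes at random times, so phases are uniform.
unlocked = [random.uniform(0.0, 20.0) for _ in range(200)]

print(round(spike_field_coherence(locked, phase_at), 2))    # near 1.0
print(round(spike_field_coherence(unlocked, phase_at), 2))  # near 0.0
```

A category-selective ensemble riding an alpha rhythm would show the first pattern in the alpha band; one riding a beta rhythm would show it at ~16-22 Hz instead.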
A very interesting theory from Randy O’Reilly and crew. I don’t know how to summarize it better than they did in their abstract:
How does the neocortex learn and develop the foundations of all our high-level cognitive abilities? We present a comprehensive framework spanning biological, computational, and cognitive levels, with a clear theoretical continuity between levels, providing a coherent answer directly supported by extensive data at each level. Learning is based on making predictions about what the senses will report at 100 msec (alpha frequency) intervals, and adapting synaptic weights to improve prediction accuracy. The pulvinar nucleus of the thalamus serves as a projection screen upon which predictions are generated, through deep-layer 6 corticothalamic inputs from multiple brain areas and levels of abstraction. The sparse driving inputs from layer 5 intrinsic bursting neurons provide the target signal, and the temporal difference between it and the prediction reverberates throughout the cortex, driving synaptic changes that approximate error backpropagation, using only local activation signals in equations derived directly from a detailed biophysical model. In vision, predictive learning requires a carefully-organized developmental progression and anatomical organization of three pathways (What, Where, and What * Where), according to two central principles: top-down input from compact, high-level, abstract representations is essential for accurate prediction of low-level sensory inputs; and the collective, low-level prediction error must be progressively and opportunistically partitioned to enable extraction of separable factors that drive the learning of further high-level abstractions. Our model self-organized systematic invariant object representations of 100 different objects from simple movies, accounts for a wide range of data, and makes many testable predictions.
O’Reilly, R. C., Wyatte, D. R., & Rohrlich, J. (2017). Deep Predictive Learning: A Comprehensive Model of Three Visual Streams. arXiv preprint arXiv:1709.04654.
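The core predict-then-correct loop in the abstract — generate a prediction of the next sensory input, compare it with what actually arrives, and let the error drive local weight changes — can be caricatured with a one-weight delta rule. This is my own minimal sketch of error-driven predictive learning, not O'Reilly et al.'s model (which involves the pulvinar, deep cortical layers, and an approximation to error backpropagation):

```python
import random

def train_predictor(seq, epochs=300, lr=0.05):
    """Delta-rule predictor: at each step (one 'alpha cycle'),
    predict the next sensory value from the current one, then
    nudge the weights by the prediction error (target - prediction)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in zip(seq, seq[1:]):
            pred = w * x + b       # the generated prediction
            err = target - pred    # temporal difference: actual minus predicted
            w += lr * err * x      # local, error-driven weight update
            b += lr * err
    return w, b

random.seed(1)
# Noisy sequence generated by: next = 0.5 * current + 0.2 + noise
seq = [1.0]
for _ in range(200):
    seq.append(0.5 * seq[-1] + 0.2 + random.gauss(0.0, 0.1))

w, b = train_predictor(seq)
print(round(w, 2), round(b, 2))  # approximately recovers 0.5 and 0.2
```

The point of the toy is only that prediction error plus a local update rule suffices to internalize the dynamics that generate the input, which is the intuition the full three-pathway model builds on.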