Krishna, B. Suresh
Preferred name: Krishna, B. Suresh
Official name: Krishna, B. Suresh
Alternative name: Krishna, B. S.
Publications (showing 1–5 of 5)
2016, Journal Article
Yao, Tao; Treue, Stefan; Krishna, B. Suresh
An Attention-Sensitive Memory Trace in Macaque MT Following Saccadic Eye Movements
PLOS Biology 14(2): e1002390. DOI: 10.1371/journal.pbio.1002390. ISSN: 1545-7885. License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0). Repository: https://resolver.sub.uni-goettingen.de/purl?gro-2/8378
Abstract: We experience a visually stable world despite frequent retinal image displacements induced by eye, head, and body movements. The neural mechanisms underlying this remain unclear. One mechanism that may contribute is transsaccadic remapping, in which the responses of some neurons in various attentional, oculomotor, and visual brain areas appear to anticipate the consequences of saccades. The functional role of transsaccadic remapping is actively debated, and many of its key properties remain unknown. Here, recording from two monkeys trained to make a saccade while directing attention to one of two spatial locations, we show that neurons in the middle temporal area (MT), a key locus in the motion-processing pathway of humans and macaques, show a form of transsaccadic remapping called a memory trace. The memory trace in MT neurons is enhanced by the allocation of top-down spatial attention. Our data provide the first demonstration, to our knowledge, of the influence of top-down attention on the memory trace anywhere in the brain. We find evidence only for a small and transient effect of motion direction on the memory trace (and in only one of two monkeys), arguing against a role for MT in the theoretically critical yet empirically contentious phenomenon of spatiotopic feature-comparison and adaptation transfer across saccades. Our data support the hypothesis that transsaccadic remapping represents the shift of attentional pointers in a retinotopic map, so that relevant locations can be tracked and rapidly processed across saccades. Our results resolve important issues concerning the perisaccadic representation of visual stimuli in the dorsal stream and demonstrate a significant role for top-down attention in modulating this representation.

2016, Journal Article
Schwedhelm, Philipp; Krishna, B. Suresh; Treue, Stefan
An Extended Normalization Model of Attention Accounts for Feature-Based Attentional Enhancement of Both Response and Coherence Gain
PLOS Computational Biology 12(12): e1005225. DOI: 10.1371/journal.pcbi.1005225. ISSN: 1553-7358. License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0). Repository: https://resolver.sub.uni-goettingen.de/purl?gro-2/8382
Abstract: Paying attention to a sensory feature improves its perception and impairs that of others. Recent work has shown that a Normalization Model of Attention (NMoA) can account for a wide range of physiological findings and the influence of different attentional manipulations on visual performance. A key prediction of the NMoA is that attention to a visual feature like an orientation or a motion direction will increase the response of neurons preferring the attended feature (response gain) rather than increase the sensory input strength of the attended stimulus (input gain). This effect of feature-based attention on neuronal responses should translate to similar patterns of improvement in behavioral performance, with psychometric functions showing response gain rather than input gain when attention is directed to the task-relevant feature. In contrast, we report here that when human subjects are cued to attend to one of two motion directions in a transparent motion display, attentional effects manifest as a combination of input and response gain. Further, the impact on input gain is greater when attention is directed towards a narrow range of motion directions than when it is directed towards a broad range. These results are captured by an extended NMoA, which either includes a stimulus-independent attentional contribution to normalization or utilizes direction-tuned normalization. The proposed extensions are consistent with the feature-similarity gain model of attention and the attentional modulation in extrastriate area MT, where neuronal responses are enhanced and suppressed by attention to preferred and non-preferred motion directions respectively.
Suresh"],["dc.date.accessioned","2020-12-10T18:09:45Z"],["dc.date.available","2020-12-10T18:09:45Z"],["dc.date.issued","2018"],["dc.identifier.doi","10.1038/s41467-018-03398-3"],["dc.identifier.eissn","2041-1723"],["dc.identifier.purl","https://resolver.sub.uni-goettingen.de/purl?gs-1/15591"],["dc.identifier.uri","https://resolver.sub.uni-goettingen.de/purl?gro-2/73751"],["dc.language.iso","en"],["dc.notes.intern","DOI Import GROB-354"],["dc.notes.intern","Merged from goescholar"],["dc.rights","CC BY 4.0"],["dc.rights.uri","https://creativecommons.org/licenses/by/4.0"],["dc.title","Saccade-synchronized rapid attention shifts in macaque visual cortical area MT"],["dc.type","journal_article"],["dc.type.internalPublication","yes"],["dc.type.version","published_version"],["dspace.entity.type","Publication"]]Details DOI2014Journal Article [["dc.bibliographiccitation.artnumber","6"],["dc.bibliographiccitation.firstpage","1"],["dc.bibliographiccitation.issue","1"],["dc.bibliographiccitation.journal","Journal of vision"],["dc.bibliographiccitation.lastpage","18"],["dc.bibliographiccitation.volume","14"],["dc.contributor.author","Krishna, B. Suresh"],["dc.contributor.author","Ipata, Anna E."],["dc.contributor.author","Bisley, James W."],["dc.contributor.author","Gottlieb, Jacqueline"],["dc.contributor.author","Goldberg, Michael E."],["dc.date.accessioned","2019-07-09T11:41:14Z"],["dc.date.available","2019-07-09T11:41:14Z"],["dc.date.issued","2014"],["dc.description.abstract","Previous studies have shown that subjects require less time to process a stimulus at the fovea after a saccade if they have viewed the same stimulus in the periphery immediately prior to the saccade. This extrafoveal preview benefit indicates that information about the visual form of an extrafoveally viewed stimulus can be transferred across a saccade. Here, we extend these findings by demonstrating and characterizing a similar extrafoveal preview benefit in monkeys during a free-viewing visual search task. We trained two monkeys to report the orientation of a target among distractors by releasing one of two bars with their hand; monkeys were free to move their eyes during the task. Both monkeys took less time to indicate the orientation of the target after foveating it, when the target lay closer to the fovea during the previous fixation. An extrafoveal preview benefit emerged even if there was more than one intervening saccade between the preview and the target fixation, indicating that information about target identity could be transferred across more than one saccade and could be obtained even if the search target was not the goal of the next saccade. An extrafoveal preview benefit was also found for distractor stimuli. 
These results aid future physiological investigations of the extrafoveal preview benefit."],["dc.identifier.doi","10.1167/14.1.6"],["dc.identifier.pmid","24403392"],["dc.identifier.purl","https://resolver.sub.uni-goettingen.de/purl?gs-1/11872"],["dc.identifier.uri","https://resolver.sub.uni-goettingen.de/purl?gro-2/58381"],["dc.language.iso","en"],["dc.notes.intern","Merged from goescholar"],["dc.relation.issn","1534-7362"],["dc.rights","Goescholar"],["dc.rights.uri","https://goescholar.uni-goettingen.de/licenses"],["dc.subject.mesh","Animals"],["dc.subject.mesh","Attention"],["dc.subject.mesh","Fixation, Ocular"],["dc.subject.mesh","Fovea Centralis"],["dc.subject.mesh","Macaca mulatta"],["dc.subject.mesh","Male"],["dc.subject.mesh","Orientation"],["dc.subject.mesh","Psychomotor Performance"],["dc.subject.mesh","Saccades"],["dc.subject.mesh","Task Performance and Analysis"],["dc.subject.mesh","Visual Perception"],["dc.title","Extrafoveal preview benefit during free-viewing visual search in the monkey."],["dc.type","journal_article"],["dc.type.internalPublication","yes"],["dc.type.version","published_version"],["dspace.entity.type","Publication"]]Details DOI PMID PMC2016Journal Article [["dc.bibliographiccitation.artnumber","e18009"],["dc.bibliographiccitation.firstpage","1"],["dc.bibliographiccitation.journal","eLife"],["dc.bibliographiccitation.lastpage","12"],["dc.bibliographiccitation.volume","5"],["dc.contributor.author","Yao, Tao"],["dc.contributor.author","Ketkar, Madhura"],["dc.contributor.author","Treue, Stefan"],["dc.contributor.author","Krishna, B Suresh"],["dc.date.accessioned","2021-06-01T10:48:59Z"],["dc.date.available","2021-06-01T10:48:59Z"],["dc.date.issued","2016"],["dc.description.abstract","Maintaining attention at a task-relevant spatial location while making eye-movements necessitates a rapid, saccade-synchronized shift of attentional modulation from the neuronal population representing the task-relevant location before the saccade to the one representing it after the saccade. Currently, the precise time at which spatial attention becomes fully allocated to the task-relevant location after the saccade remains unclear. Using a fine-grained temporal analysis of human peri-saccadic detection performance in an attention task, we show that spatial attention is fully available at the task-relevant location within 30 milliseconds after the saccade. Subjects tracked the attentional target veridically throughout our task: i.e. they almost never responded to non-target stimuli. Spatial attention and saccadic processing therefore co-ordinate well to ensure that relevant locations are attentionally enhanced soon after the beginning of each eye fixation."],["dc.description.abstract","When we look at a scene, our gaze does not move continuously across it. Instead, our eyes move discontinuously, shifting gaze rapidly from point to point to focus on different locations in the scene. These eye movements are known as saccades, and during them the brain temporarily and selectively stops processing visual information. In the brain, a particular area of a scene is represented by different neurons before and after a saccade. Paying attention to a relevant location in a scene across an eye movement therefore requires the brain to shift its attentional effects from the neurons that represented that location in the scene before the saccade to the set of neurons that do so after the saccade. Ideally, this shift should happen rapidly and be synchronized with the eye movement. 
Exactly how long it takes for attention to emerge at a relevant location after a saccade was not clear because attention had not been recorded on a fine enough time-scale immediately after an eye movement. Yao et al. have now addressed this issue in a series of experiments that asked volunteers to focus their eyes on a fixed point. The volunteers had to follow the point with their eyes as it jumped to a new location, and at the same time had to look out for a change in the movement of a pattern of random dots. The results reveal that attention is fully available at the relevant location within 30 milliseconds after the saccade. In fact, the 30-millisecond delay in the emergence of attention matches the period during which vision is suppressed during a saccade. Thus, the change in the brain’s focus of attention coordinates with the saccadic eye movement to ensure that attention can be fixed on a relevant location as soon as possible after the eye movement ends. More studies are now needed to investigate how the brain coordinates its attention and eye-movement processes to synchronize the shift in attention with the eye movement."],["dc.identifier.doi","10.7554/eLife.18009"],["dc.identifier.gro","3151559"],["dc.identifier.purl","https://resolver.sub.uni-goettingen.de/purl?gs-1/14055"],["dc.identifier.uri","https://resolver.sub.uni-goettingen.de/purl?gro-2/86125"],["dc.language.iso","en"],["dc.notes.intern","DOI-Import GROB-425"],["dc.notes.intern","Merged from goescholar"],["dc.notes.status","final"],["dc.notes.submitter","chake"],["dc.relation.eissn","2050-084X"],["dc.relation.issn","2050-084X"],["dc.rights","CC BY 4.0"],["dc.rights.uri","https://creativecommons.org/licenses/by/4.0"],["dc.title","Visual attention is available at a task-relevant location rapidly after a saccade"],["dc.type","journal_article"],["dc.type.internalPublication","yes"],["dc.type.peerReviewed","no"],["dc.type.version","published_version"],["dspace.entity.type","Publication"]]Details DOI