
Mice that have learned to lick a waterspout in response to a visual signal can perform the action without seeing such a cue if the neurons activated by the signal are instead stimulated with light, according to a paper in Science today (July 18). Similar results were reported in a Cell paper last week (July 11). The approach taken in the papers could help researchers tackle questions of how perception is encoded in the brain, scientists say.

“[The approach] is a significant step forward in our ability to manipulate the activity of neurons in a specific manner,” says Shaul Hestrin, a behavioral neuroscientist at Stanford University who was not involved with either study, “and the results are very promising.”

The way the brain identifies and interprets external stimuli and then executes appropriate behaviors remains largely a black box. It is known, for example,...

Optogenetic techniques, in which neuronal cells can be activated on demand with light, could help to answer such questions, but until recently these techniques lacked specificity, activating multiple neighboring neurons together. The neurons that compose ensembles responding to a particular stimulus are not necessarily adjacent, and an individual neuron can behave quite differently from its neighbors. Progress in single-cell methods has addressed this issue, but as yet such advances have not enabled the elicitation of defined behaviors. To do so, says bioengineer Karl Deisseroth of Stanford University, who led the study published in Science, it seems stimulation of multiple individual cells within a network is needed.

To that end, Deisseroth’s team combined two-photon excitation, which enables light to be focused onto single cells, with a holographic illumination technique that essentially sculpts the light source in 3D such that multiple individual cells distributed across a small area of the brain can be hit at once.
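As a rough illustration of what computationally "sculpting" the light involves, the sketch below implements the classic Gerchberg-Saxton phase-retrieval loop, one standard way to compute the phase mask that a spatial light modulator can display to split a single beam into several focal spots. It is a minimal, hypothetical example, not the optics pipeline used in either paper; the array sizes, spot coordinates, and iteration count are all invented.

```python
# Illustrative sketch only: compute a phase mask that focuses one beam onto
# several target spots, as in holographic (SLM-based) photostimulation.
# Sizes, spot coordinates, and iteration count are hypothetical.
import numpy as np

def multi_spot_hologram(shape=(256, 256),
                        spots=((60, 80), (130, 200), (190, 40)),
                        n_iter=50, seed=0):
    """Gerchberg-Saxton loop: uniform illumination at the SLM plane,
    desired spot pattern at the focal plane (the two are related by a
    Fourier transform)."""
    rng = np.random.default_rng(seed)
    target = np.zeros(shape)
    for y, x in spots:                       # desired focal-plane intensity: bright points
        target[y, x] = 1.0
    target_amp = np.sqrt(target)

    field = np.exp(1j * rng.uniform(0, 2 * np.pi, shape))  # start with random phase
    for _ in range(n_iter):
        focal = np.fft.fft2(field)
        focal = target_amp * np.exp(1j * np.angle(focal))   # impose target amplitude
        field = np.fft.ifft2(focal)
        field = np.exp(1j * np.angle(field))                 # impose uniform SLM amplitude

    phase_mask = np.angle(field)             # what would be displayed on the SLM
    achieved = np.abs(np.fft.fft2(np.exp(1j * phase_mask))) ** 2
    return phase_mask, achieved

if __name__ == "__main__":
    phase, intensity = multi_spot_hologram()
    # Most of the diffracted energy should land on the requested spots.
    print("intensity at first spot:", intensity[60, 80])
    print("median background intensity:", np.median(intensity))
```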

Armed with this illumination technique and with a novel opsin—a light-activated ion channel called ChRmine, which the team selected and developed for its extreme sensitivity to red light and its powerful current generation—Deisseroth and colleagues set about testing their ability to mimic natural vision-induced neural activity in mice and drive a specific downstream behavior. They transduced mouse cortices with a vector encoding the ChRmine opsin and a fluorescent calcium indicator that glows green when cells are activated. They then trained thirsty mice to recognize a particular visual cue—contrasting bars moving horizontally or vertically on a screen—that indicated the availability of water in a spout, which the mice proceeded to lick.

Via windows in the skulls of head-fixed mice, the team recorded the cue-induced ensemble activity (green glowing cells) in the animals’ visual cortices. Then, using holographic illumination, they activated ChRmine in a tiny subset of the same neurons and successfully elicited the spout-licking behavior in the complete absence of a visual cue.
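For a sense of what the target-selection step might look like computationally, the sketch below is a minimal, hypothetical example, not the published analysis: it converts raw calcium-indicator traces to ΔF/F, flags neurons whose activity during the cue exceeds baseline, and ranks them so a small subset can be chosen for holographic stimulation. The variable names, thresholds, and synthetic data are all assumptions.

```python
# Hypothetical sketch of selecting photostimulation targets from calcium
# imaging data; not the authors' published analysis. Thresholds and the
# trial structure are invented.
import numpy as np

def deltaf_over_f(traces, baseline_frames):
    """traces: (n_neurons, n_frames) raw fluorescence; F0 taken from pre-cue frames."""
    f0 = traces[:, baseline_frames].mean(axis=1, keepdims=True)
    return (traces - f0) / f0

def cue_responsive_neurons(dff, cue_frames, baseline_frames, z_thresh=2.0):
    """Flag neurons whose mean cue-period dF/F exceeds baseline by z_thresh SDs."""
    base_mean = dff[:, baseline_frames].mean(axis=1)
    base_std = dff[:, baseline_frames].std(axis=1) + 1e-9
    cue_mean = dff[:, cue_frames].mean(axis=1)
    z = (cue_mean - base_mean) / base_std
    return np.where(z > z_thresh)[0], z

def pick_stim_targets(responsive_idx, z_scores, n_targets=20):
    """Choose the most strongly cue-driven cells as stimulation targets."""
    order = responsive_idx[np.argsort(z_scores[responsive_idx])[::-1]]
    return order[:n_targets]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_neurons, n_frames = 500, 120
    baseline = np.arange(0, 40)          # pre-cue frames
    cue = np.arange(40, 80)              # frames while the visual cue is on screen
    raw = rng.normal(100, 5, (n_neurons, n_frames))
    raw[:50, 40:80] += 30                # synthetic: 50 neurons respond to the cue
    dff = deltaf_over_f(raw, baseline)
    responsive, z = cue_responsive_neurons(dff, cue, baseline)
    targets = pick_stim_targets(responsive, z, n_targets=20)
    print(f"{responsive.size} cue-responsive neurons; stimulating {targets.size} of them")
```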

Rafael Yuste, a neuroscientist at Columbia University, was not involved with the Science paper but authored the recent Cell paper in which similar results were obtained with a different opsin. Both papers “show the power of two-photon optogenetic holography as a method to ‘play the piano’ with neural circuits with single-cell precision and to specifically manipulate behavior,” he writes in an email to The Scientist. They also “demonstrate that neuronal ensembles are functional units of visual perception.”

While “there is still a lot to learn [regarding] . . . the question of how neuronal activity generates perception,” says Hestrin, the new method certainly “gets us closer to that goal.”

J.H. Marshel et al., “Cortical layer–specific critical dynamics triggering perception,” Science, doi:10.1126/science.aaw5202, 2019.

L. Carrillo-Reid et al., “Controlling visually guided behavior by holographic recalling of cortical ensembles,” Cell, 178:1–11, 2019. 
