Instructor: Vadim Arshavsky
Summary: We are all familiar with the metaphor comparing the eye to a photographic camera. Indeed, both rely on refraction and lenses to form images. What is perhaps less appreciated is that the eye functions as a digital camera. Information about the surrounding world reaches the back of the eye in the form of photons of varying wavelengths, which are absorbed by the rod and cone photoreceptor cells of the retina. The light-evoked electrical signals produced by photoreceptors are then processed by a network of retinal neurons, so that information about each point in visual space becomes digitized and reaches the brain through multiple channels, each reporting a different feature of the visual world (brightness, contrast, color, motion, etc.).
In this module, we will follow each step of this analog-to-digital transition by discussing critical experimental papers in three areas: phototransduction (the transformation of a light signal into an electrical signal); the functioning of the first synapse in the retina; and the splitting of visual information into multiple channels, each carried by a highly specialized type of retinal ganglion cell. Our goal will be to integrate the findings of molecular, cellular, and electrophysiological studies into a single big picture of how the retina works.