310 CATALYZING INQUIRY
A second example of a neural prosthesis is a retinal prosthesis intended to restore some visual
function when the retina of the eye no longer works. In one variant, a light-sensitive microchip is implanted into the back
of the eye. Light striking the microchip (which has thousands of individual sensors) generates electrical
signals that travel through the optic nerve to the brain and are interpreted as an image.^19 In another
variant, the retina is bypassed entirely through the use of a camera mounted on a pair of eyeglasses to
capture and transmit a light image via a radio signal to a chip implanted near the ganglion cells, which
send nerve impulses to the brain.^20 In a third variant, an implanted microfluidic chip that controls the flow
of neurotransmitters translates digital images into neurochemical signals that provide meaningful visual
information to the brain. The microfluidic chip has a two-dimensional array of small controllable pores,
corresponding to pixels in an image. An image is created by the selective drip of neurotransmitters onto
specific bipolar cells, which are the cells that carry retinal information to the brain.^21
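The pixel-to-pore mapping in the third variant can be sketched in a few lines of code. This is purely an illustrative model, not the device's actual control logic; the function name, the threshold value, and the example image are all invented for the sketch:

```python
# Illustrative sketch: each pore in the 2-D array corresponds to one pixel,
# and a pore releases neurotransmitter when its pixel is bright enough.
# The threshold and all names here are hypothetical.

def pores_to_open(image, threshold=0.5):
    """Return (row, col) coordinates of pores whose corresponding pixel
    intensity is high enough to trigger a neurotransmitter release."""
    opened = []
    for r, row in enumerate(image):
        for c, intensity in enumerate(row):
            if intensity >= threshold:  # brighter pixel -> open pore
                opened.append((r, c))
    return opened

# A 3x3 "image" with one bright region in the upper-left corner.
frame = [
    [0.9, 0.8, 0.1],
    [0.7, 0.2, 0.0],
    [0.1, 0.0, 0.0],
]
print(pores_to_open(frame))  # -> [(0, 0), (0, 1), (1, 0)]
```

In the real device each opened "pore" would drip neurotransmitter onto the bipolar cells beneath it, so a bright region of the scene maps to a spatially matching pattern of chemical stimulation.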
A third example of work in this area is that of Musallam et al., who have demonstrated the feasibil-
ity of a neural interface that enables a monkey to control the movement of a cursor on a computer screen
by thinking about a goal the monkey would like to achieve and assigning a value to that goal.^22 The
interesting twist in this work is its reliance on signals from parts of the brain related to higher-order
(“cognitive”) functions for movement planning to control a prosthetic device. (Previous
studies have relied on lower-level signals from the motor cortex.^23)
The advantage of using higher-level cognitive signals is that they capture information about the
monkey’s goal (moving the cursor) and preferences (the destination on the screen the monkey wants).
Musallam et al. point out that once the signals associated with the subject’s goals are decoded, a smart
external device can perform the lower-level computations necessary to achieve the goals. For example,
a smart robotic arm would be able to understand what the intended goal of an arm movement is and
then compute—on its own—the trajectory needed to move the arm to that position. Furthermore, the
abstract nature of a cognitive command would allow it to be used for the control and operation of a
number of different devices. If higher-level signals associated with speech or emotion could be decoded,
it would become possible to record thoughts from speech areas (reducing the need for
cumbersome letter boards and time-consuming spelling programs) or to provide online indications of a
patient’s emotional state.
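The division of labor that Musallam et al. describe, in which the decoded signal supplies only the goal and a smart external device computes the low-level movement itself, can be sketched as follows. This is an assumption-laden illustration, not the authors' implementation: the minimum-jerk profile used here is a standard model of smooth reaching movements, and the function name and parameters are invented for the sketch.

```python
# Illustrative sketch: the brain contributes only the decoded goal position;
# the "smart" controller computes the trajectory on its own.

def minimum_jerk_trajectory(start, goal, steps):
    """Interpolate from start to goal along a minimum-jerk time profile,
    which begins and ends at rest (zero velocity and acceleration)."""
    path = []
    for i in range(steps + 1):
        t = i / steps                          # normalized time in [0, 1]
        s = 10 * t**3 - 15 * t**4 + 6 * t**5   # minimum-jerk blend factor
        point = tuple(a + s * (b - a) for a, b in zip(start, goal))
        path.append(point)
    return path

# Decoded cognitive signal: "move the cursor to (8, 6)".
waypoints = minimum_jerk_trajectory(start=(0.0, 0.0), goal=(8.0, 6.0), steps=4)
print(waypoints[0], waypoints[-1])  # -> (0.0, 0.0) (8.0, 6.0)
```

Because the goal is abstract, the same decoded command could drive a cursor, a robotic arm, or any other effector simply by swapping in a different trajectory generator, which is the reusability advantage the text describes.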
A fourth example is provided by Theodore Berger of the University of Southern California, who is
attempting to develop an artificial hippocampus—a silicon implant that will behave neuronally in a
manner identical to the brain tissue that it replaces.^24 The hippocampus is the part of the brain respon-
sible for encoding experiences so that they can be stored as long-term memories elsewhere in the brain;
without the hippocampus, a person is unable to store new memories but can recall ones stored prior to
its loss. Because the manner in which the hippocampus stores information is unknown, Berger’s ap-
proach is based on designing a chip that can provide the identical input-output response. The input-
19. N.S. Peachey and A.Y. Chow, “Subretinal Implantation of Semiconductor-based Photodiodes: Progress and Challenges,” Journal of Rehabilitation Research and Development 36(4):371-376, 1999.
20. W. Liu, E. McGucken, M. Clements, S.C. DeMarco, K. Vichienchom, C. Hughes, et al., “Multiple-Unit Artificial Retina Chipset System to Benefit the Visually Impaired,” to be published in IEEE Transactions on Rehabilitation Engineering. Available at http://www.icat.ncsu.edu/projects/retina/files/MARC_system_paper.pdf.
21. B. Vastag, “Future Eye Implants Focus on Neurotransmitters,” Journal of the American Medical Association 288(15):1833-1834, 2002.
22. S. Musallam, B.D. Corneil, B. Greger, H. Scherberger, and R.A. Andersen, “Cognitive Control Signals for Neural Prosthetics,” Science 305(5681):258-262, 2004. A Caltech press release of July 8, 2004, available at http://pr.caltech.edu/media/Press_Releases/PR12553.html, describes this work in more popular terms.
23. J. Wessberg, C.R. Stambaugh, J.D. Kralik, P.D. Beck, M. Laubach, J.K. Chapin, J. Kim, S.J. Biggs, M.A. Srinivasan, and M.A.L. Nicolelis, “Real-Time Prediction of Hand Trajectory by Ensembles of Cortical Neurons in Primates,” Nature 408(6810):361-365, 2000. Similar work on rats is described in J.K. Chapin, K.A. Moxon, R.S. Markowitz, and M.A.L. Nicolelis, “Real-Time Control of a Robot Arm Using Simultaneously Recorded Neurons in the Motor Cortex,” Nature Neuroscience 2(7):664-670, 1999.
24. R. Merritt, “Nerves of Silicon: Neural Chips Eyed for Brain Repair,” EE Times, March 17, 2003 (10:37 a.m. EST), available at http://www.eetimes.com/story/OEG20030317S0013.