Catalyzing Inquiry at the Interface of Computing and Biology



  • Spatial registration and alignment of instruments, the surgeon’s hands, and the digital body being
    operated on. The surgeon must see an instrument move to the position in the body to which he or she
    has moved it. When a cutting motion is made, the tissue should split in the right place and to the
    right extent.

  • The different feel and texture of tissue depending on whether the instrument is a scalpel or a finger. A
    digital human for surgical use must provide appropriate force feedback (“haptic capability”) to the
    surgeon so that, for example, cutting into soft tissue feels different from cutting into bone.

  • Incorporation of gravity in the model. Many organs consist of soft tissue that is easily deformed
    under pressure from instruments and touch. Just as important, tissues are subject to gravitational
    forces that change their shape as their orientation changes (the breast of a woman lying on her back
    has an entirely different shape than when she lies on her side).
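As a minimal illustration of the gravity requirement, the toy model below relaxes a chain of spring-connected point masses under gravity; all parameters (node count, stiffness, mass, step size) are invented and bear no relation to real tissue or to any surgical simulator. Reorienting the gravity vector changes the equilibrium shape, just as reorienting a patient changes the shape of soft tissue.

```python
# Toy sketch only: a chain of point masses joined by springs, anchored at
# one end, relaxed under gravity by simple gradient descent. Rotating the
# gravity vector changes the equilibrium shape, as it would for soft tissue.
# All parameters are hypothetical.

def relax_chain(n_nodes=10, k=50.0, mass=0.1, g=(0.0, -9.81),
                rest_len=1.0, iters=5000, step=0.001):
    # Start with a horizontal chain; node 0 is fixed (the anchor).
    pos = [[i * rest_len, 0.0] for i in range(n_nodes)]
    for _ in range(iters):
        for i in range(1, n_nodes):
            fx, fy = mass * g[0], mass * g[1]   # gravity on this node
            for j in (i - 1, i + 1):            # spring forces from neighbors
                if 0 <= j < n_nodes:
                    dx = pos[j][0] - pos[i][0]
                    dy = pos[j][1] - pos[i][1]
                    dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
                    f = k * (dist - rest_len)   # Hooke's law
                    fx += f * dx / dist
                    fy += f * dy / dist
            pos[i][0] += step * fx              # overdamped relaxation step
            pos[i][1] += step * fy
    return pos

# The free end sags when gravity points "down" ...
down = relax_chain(g=(0.0, -9.81))
# ... and the chain merely compresses along its axis when the
# "patient" is reoriented so gravity points along the chain.
side = relax_chain(g=(-9.81, 0.0))
```

A real simulator would of course use a volumetric finite-element or mass-spring mesh with tissue-specific material parameters; the point here is only that the same geometry settles into different shapes under different gravity directions.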


Some first steps have been taken in many of these areas. For example, a project at the Ohio
Supercomputer Center (OSC) in 1996 sought to develop a virtual reality-based simulation of regional
anesthesia that employed haptic techniques to simulate the resistance felt when an injection is given in
a certain area (Box 9.4).
A second example is work in computational anatomy, one application of which has sought to
characterize the structure of human brains in a formal manner. Structure is interesting to neuroscientists
because of a presumed link between physical brain structure and neurological function. Through
mathematical transformations that deform one structure into another, it is possible to develop metrics
that characterize how structurally different two brains are. These metrics can then be correlated
with the neurological functions of which each brain is capable (Box 9.5). Such metrics
can also be used to distinguish normal from diseased states that are reflected anatomically.
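The idea of a single number that measures how structurally different two shapes are can be sketched with a toy metric. The symmetric Chamfer distance below is an illustrative stand-in only, far simpler than the diffeomorphic deformation metrics actually used in computational anatomy, and the "outlines" are just a circle and an ellipse rather than brain structures.

```python
# Toy illustration of a shape-difference metric: a symmetric Chamfer
# distance between two outlines, standing in for the deformation-based
# metrics of computational anatomy. Shapes and metric are illustrative.
import math

def chamfer(points_a, points_b):
    """Average nearest-neighbor distance, symmetrized over both shapes."""
    def one_way(src, dst):
        total = 0.0
        for (x1, y1) in src:
            total += min(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                         for (x2, y2) in dst)
        return total / len(src)
    return 0.5 * (one_way(points_a, points_b) + one_way(points_b, points_a))

# Two hypothetical "outlines": a circle and a slightly elongated ellipse.
angles  = [2 * math.pi * i / 100 for i in range(100)]
circle  = [(math.cos(t), math.sin(t)) for t in angles]
ellipse = [(1.2 * math.cos(t), math.sin(t)) for t in angles]

print(chamfer(circle, circle))   # identical shapes -> 0.0
print(chamfer(circle, ellipse))  # small positive "structural distance"
```

Identical shapes yield a distance of zero, and the distance grows as one shape is deformed away from the other, which is the essential property such a metric must have before it can be correlated with function or with disease.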


Box 9.4
A Virtual Reality Simulation of Regional Anesthesia

A collaborative effort between researchers at the Ohio State University Hospitals, Immersion Corporation, and
the Ohio Supercomputer Center has led to the creation of a virtual reality simulator that enables anesthesiol-
ogists-in-training to practice in a realistic environment the injection of a local anesthetic into the epidural
space of the spinal column. The system includes a workstation capable of stereo display, a real-time spatial
volume renderer, a voice-activated interface, and most importantly, a one-dimensional haptic probe capable
of simulating the resistive forces of penetrated tissues.

Although this procedure appears simple, it is in fact a delicate manual operation that requires the placement
of a catheter into a small epidural space using only haptic cues (i.e., cues based on tactile sensations of
pressure) to guide the needle. By feeling the resistive forces of the needle passing through various tissues, the
anesthesiologist must maneuver the tip of the needle into the correct space without perforating or damaging
the spinal cord in the process.

The system is designed to enable the trainee to practice the procedure on a variety of datasets representative
of what he or she might encounter with real patients; that is, the pressure profile as a function of needle
penetration varies from patient to patient. By training in this environment, the trainee can gain proficiency
in the technique without putting patients at risk.

SOURCE: L. Hiemenz, J.S. McDonald, D. Stredney, and D. Sessanna, “A Physiologically Valid Simulator for Training Residents to Perform
an Epidural Block,” Proceedings of the 15th Southern Biomedical Engineering Conference, March 29-31, 1996, Dayton, OH. See also
http://www.osc.edu/research/Biomed/past_projects/anesthesia/index.shtml.
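The per-patient pressure-profile idea in Box 9.4 can be sketched as a lookup from needle depth to the resistive force the haptic probe should render. The tissue layers, depths, and force values below are invented for illustration and are not taken from the OSC simulator.

```python
# Hypothetical sketch of a per-patient "pressure profile": resistive force
# rendered by a haptic probe as a piecewise function of needle depth.
# Layer names, depths (mm), and forces (N) are invented for illustration.

PATIENT_A = [  # (layer, depth in mm where layer ends, resistive force in N)
    ("skin",            3.0, 2.0),
    ("fat",            15.0, 0.8),
    ("ligament",       25.0, 3.5),
    ("epidural space", 27.0, 0.2),  # sudden loss of resistance
]

def resistive_force(depth_mm, profile):
    """Force the haptic probe should render at a given needle depth."""
    for _layer, end_depth, force in profile:
        if depth_mm <= end_depth:
            return force
    return 0.0  # past the modeled layers

# Entering the ligament feels stiff; reaching the epidural space feels
# like a sudden give, the cue the trainee must learn to recognize.
```

Swapping in a different profile table is all it takes to present the trainee with a different "patient," which is the training variety the box describes.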