IMAGE: NEEMO-7 Mission Commander Robert Thirsk and Dr. Craig McKinley
NEEMO-7 Mission Commander Robert Thirsk and Dr. Craig McKinley of the Centre for Minimal Access Surgery at St. Joseph's Healthcare Hamilton, Ontario, work near the Aquarius Underwater Laboratory.
NEEMO 7 Journals

NEEMO 7, Topside Team
Days 5-7, October 15-17, 2004


Today our Aquanauts got to try something a little different: driving a remotely operated rover across the sea floor. This little robot, a VGTV-Xtreme, was quite the hit. It's about as big as a laptop computer case, and runs on two treads like a miniature tank. The operator controls it from a small control unit with a joystick and video screen that shows what the camera on the rover is seeing. Something very similar to this was used to look for survivors after 9/11.
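The journal doesn't describe the VGTV-Xtreme's control scheme in detail, but a tracked vehicle steered from a single joystick is commonly driven with "arcade-style" mixing, where the stick's fore/aft and left/right axes are combined into left and right tread speeds. A minimal sketch, purely illustrative and not the rover's actual firmware:

```python
# Illustrative sketch only: assumes a common "arcade" joystick-to-tread
# mixing scheme, not the VGTV-Xtreme's actual control logic.

def joystick_to_treads(forward, turn):
    """Map joystick axes (each in [-1, 1]) to left/right tread speeds.

    forward: the stick's fore/aft axis; turn: its left/right axis.
    Returns (left, right) tread speeds, each clamped to [-1, 1].
    """
    left = forward + turn
    right = forward - turn
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(left), clamp(right)

# Full forward: both treads at full speed.
print(joystick_to_treads(1.0, 0.0))   # (1.0, 1.0)
# Stick hard right with no forward input: treads run in opposite
# directions, so the vehicle spins in place like a tank.
print(joystick_to_treads(0.0, 1.0))   # (1.0, -1.0)
```

Opposite-direction treads are what let a small tracked robot like this turn in place on the sea floor instead of needing room to arc around.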

We sent it down to the habitat and got it hooked up, but the control panel wouldn't come on. We noticed that the touch pad controls were all depressed by the increased pressure inside Aquarius. It quickly became apparent that the only solution would be to somehow get air underneath the touch pad, so that the pressure on each side of each button could equalize. Wasting no time, our resident surgeon Craig got out a small suturing needle and deftly lanced each button pad. Voila! Problem solved. We started this mission planning to do surgery with robots; suddenly we were doing surgery on robots.
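A rough back-of-envelope calculation shows why the buttons stayed pinned down. Seawater adds roughly one atmosphere of pressure for every 33 feet of depth, and Aquarius sits around 60 feet down (the exact depth here is an assumption), so a button pad sealed at surface pressure has a substantial net load pressing in on it:

```python
# Back-of-envelope sketch of why the sealed touch-pad buttons stayed
# depressed. Assumes Aquarius at roughly 60 ft depth (approximate), with
# seawater adding about 1 atm of pressure per 33 ft.

ATM_PA = 101_325                    # 1 atmosphere in pascals
depth_ft = 60                       # assumed depth of the habitat
habitat_atm = 1 + depth_ft / 33     # ambient pressure inside, in atmospheres

# Pressure difference across a button pad with 1 atm of air trapped behind it:
delta_pa = (habitat_atm - 1) * ATM_PA
print(f"habitat pressure = {habitat_atm:.1f} atm")
print(f"net load = {delta_pa / 1000:.0f} kPa pressing each button inward")
```

Lancing each pad lets habitat air in behind the button, so the pressure on both sides equalizes and that net load drops to zero.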

This done, the crew started taking turns driving it. Our topside team was there on scuba to evaluate and document how it operated underwater. Later, while the aquanauts were out working on Waterlab, the ExPOC in Houston took control and drove it around for a while. Among other things, they were able to take it over to the area the aquanauts were working in and see them on the camera. Just think: the ground control team had the capability to tell the camera to "go stand over there" to give them a better view of the crew they were watching over and the work they were doing. This capability is likely to be a very common thing on future exploration missions. And you were there when we did it for the first time… Our thanks to the gang from American Standard Robotics, who graciously supported our evaluation of this capability.

Two of the major Center for Minimal Access Surgery (CMAS) objectives were performed this weekend. The first was a hand vein suture experiment. In spaceflight, astronauts are subjected to a zero-gravity environment that could potentially cause minor or severe body trauma, and a number of life-threatening conditions, such as deep lacerations or severed arteries, may require suturing to control bleeding. The NEEMO crew tested their knot-tying skills as they attempted to suture a simulated severed vessel for the first time, a task that may very well need to be performed by an astronaut in spaceflight someday. Luckily, they did not do this alone: from Hamilton, Ontario, Dr. Anvari tele-mentored them, guiding them step by step as they tied the sutures.

Working in an extreme environment sometimes calls for the assistance of a robotic device controlled by someone located some distance away; this is called tele-manipulation. To give the operator the ability to feel, these robotic devices employ a technology called haptics: the science of applying touch (tactile) sensation and control to interaction with robotic devices. Using special input/output devices, operators can receive feedback from a robotic device in the form of felt sensations in the hand. Quantifying the usefulness and limitations of haptics was the second CMAS objective. But there is a downside to this type of technology: time delays can degrade haptic feedback to the point where the user cannot control the device. On this mission the crew has been evaluating a new technology called TiDeC, a time delay compensator designed to allow a haptic-enabled device to be controlled from a distance of nearly 1,300 miles. From Hamilton, Dr. Anvari used TiDeC-assisted haptics to guide the crew through a series of tasks, with each side able to feel the other's every move, even the mistakes.
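TiDeC's actual algorithm isn't described in this journal, but the general idea behind delay compensation can be sketched simply: rather than acting on a stale measurement, the controller estimates what the signal is doing *now* by extrapolating the trend in the delayed samples. The linear predictor below is a generic illustration of that idea, not TiDeC itself:

```python
# Generic sketch of time-delay compensation for a remote feedback signal.
# This is NOT TiDeC's algorithm (which the journal doesn't describe); it
# just illustrates the core idea: predict the present value of a delayed
# measurement by extrapolating its recent trend forward past the delay.

def predict_present(samples, delay_steps):
    """Linearly extrapolate delayed samples forward by delay_steps.

    samples: delayed measurements, newest last (needs at least 2 points).
    delay_steps: how many sample intervals late the newest measurement is.
    """
    slope = samples[-1] - samples[-2]          # most recent trend
    return samples[-1] + slope * delay_steps   # extrapolate past the delay

# A force reading rising 0.5 N per step, delivered 3 steps late:
delayed = [1.0, 1.5, 2.0]
print(predict_present(delayed, 3))  # estimates 3.5 N as the value "now"
```

A real compensator would be far more sophisticated (filtering noise, bounding the prediction, keeping the closed loop stable), but even this toy version shows why compensation matters: with a naive linear extrapolation the operator reacts to an estimate of the present contact force rather than one several round-trips old.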

We have a distinguished VIP visiting our team in the ExPOC on Monday to observe firsthand. We'll tell you how it went in tomorrow's Topside Report.

Curator: Kim Dismukes | Responsible NASA Official: John Ira Petty | Updated: 10/18/2004
Web Accessibility and Policy Notices