As we discussed in the first part of this two-part series, mobile devices are already entering the world of the surgeon. Currently, it is mostly downloadable apps that promise to help surgeons with the informational portions of their tasks, such as tracking the cases they have done (e.g., SurgiChart) or helping with the consent process (e.g., Surgery Risk).
While apps dedicated to the technical aspects of surgery, such as the excellent AO Surgery Reference, are becoming available, in the future we will see the iPad (or its brethren) actually in the operating room. Why? Because the iPad has many characteristics that make it a great advanced surgical instrument.
First is its small size. Every modern operating room has stacks of electronic equipment hanging from the ceiling or in large cabinets for patient monitoring and for controlling in-field devices. Since the iPad already supports a bevy of standard wireless communication protocols, many of these large boxes’ functions could likely be off-loaded to an iPad with clever engineering. One immediate advantage is that the iPad could be brought directly into the operative field in a small sterile bag, where the touch interface makes it possible to change settings even with gloved hands. This means that the surgeon or assistant could manipulate the controls rather than asking the room staff to make every little change. The long battery life means that an iPad can make it through even a long surgical day without needing a recharge (at least not before the surgeons do!)
Even more intriguing are the possibilities for user interfaces that incorporate the built-in sensors in the iPad, specifically the accelerometer and gyroscope. These sensors have made the iPod Touch a massive success in mobile gaming by allowing players to interact with games by turning and twisting their devices. Could they also lead to new types of assistive surgical instruments? Read on as we explore these ideas.
[Ed. This article, and the first part of this series, is co-published in the Journal of Surgical Radiology]
The most basic and fundamental function of mobile devices is communication. In the surgical realm, this often means sharing visual information, since intraoperative observations are critical to understanding surgical options and prognosis. This extends even to the preoperative setting, where the ability to quickly consult a colleague about physical findings could help guide surgical planning. The built-in FaceTime videoconferencing feature of the iPad 2 and iPhone 4 is as simple as it gets. This is exactly the use Armstrong et al described in a brief report on sharing the appearance of pre- and post-operative wounds, in which two surgeons discussed the appearance of an extremity and whether surgery was indicated, as seen in the image below. If the Wi-Fi network is encrypted, the communication is likely HIPAA compliant.
Sharing intraoperative images with patients and other professionals is also a feature of SurgiChart, as our recent review showed. eGoWorks is an iOS app for sharing endoscopic images stored on Envisionier’s web-based endoscopic image server of the same name. For more robust video sharing, JEMS offers a system in which multiple mobile clients can view the same video stream (image below). The stream is encrypted, and mobile clients are available for both iOS and Android devices. A video server is required, which can take in up to four different video inputs. This means that the same surgical procedure can be viewed in multiple locations, even in different cities, making it potentially a great tool for teaching.
While the iPad could be used to view a remote surgery, it may be put to even better use in the immediate operative field. One use could be to follow the patient’s vital signs. The AirStrip platform already allows for live remote monitoring of fetal heart tracings and is deployed in multiple hospitals. Now also available is remote monitoring of vital signs, such as EKG tracings, blood pressure and oxygen saturation. While this is mostly the responsibility of the anesthesiologist, in cardiac procedures especially the surgeon is also acutely interested in the heart’s electrical activity and blood pressures. Thus, large and expensive plasma screens are often hung from the ceiling in cardiac operating rooms. An iPad in the operative field running AirStrip may be a far more economical alternative.
Another interesting possibility is to use an iPad as a microscope. Surgeons are used to wearing “loupes”, eyeglasses with built-in magnification, when working on small structures such as nerves and vessels. Other times, large and expensive floor microscopes are brought sterile-wrapped into the operative field. However, there may be times when an inexpensive video microscope is all that is needed. One university project, “CellScope”, demonstrated the feasibility of attaching a magnifier to a cell phone camera to make an inexpensive 45x microscope that can be used in rural settings or less developed countries. Instead of using the built-in camera, if the image sensor were at the tip of a thin extension tube, as seen below, it could be brought into the field and the video transmitted wirelessly to a nearby iPad. This would then function as the surgeon’s eyes, allowing her to peer into poorly lit body cavities and magnifying small objects.
The accelerometer inside many smartphones is sensitive to changes in position, such as rotating the device between portrait and landscape. It also allows software to use the device as a sort of “level” to determine when objects are parallel to each other or to the ground, as demonstrated by the popular iPhone app “iHandy”. This capability is exploited by the “Scoliogauge” app to help orthopedic surgeons measure the extent of spine curvature in scoliosis. By having the patient bend forward and placing the iPhone on their back, the angle of trunk rotation can be measured. This rotation corresponds to the degree of curvature in scoliosis. You can read more about this in our recent interview with the orthopedic surgeon and app developer Matt Ockendon.
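To illustrate the idea behind such level apps, here is a minimal sketch (not Scoliogauge’s actual code — the function name and axis convention are illustrative) of how a tilt angle can be recovered from raw accelerometer readings, assuming the device is held still so that gravity is the only measured acceleration:

```python
import math

def inclination_deg(ax: float, ay: float, az: float) -> float:
    """Tilt of the device's y-axis relative to horizontal, in degrees.

    (ax, ay, az) are raw accelerometer readings in g's. With the
    device at rest, the sensor measures only gravity, so the tilt
    can be recovered from the ratio of the axis components.
    """
    return math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))

# Device lying flat -- gravity entirely along z -- reads zero tilt:
print(inclination_deg(0.0, 0.0, -1.0))                     # 0.0
# Gravity split evenly between y and z is a 45-degree tilt:
print(round(inclination_deg(0.0, 0.7071, -0.7071), 1))     # 45.0
```

In a real app, the readings would be smoothed over time before display, since a handheld sensor is noisy.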
The same concept has been used to guide surgeons planning reconstructive eyelid surgery, as reported by Mezzana et al, who used an iPhone intraoperatively to ensure that the eyelids are parallel during oculoplastic surgery, as seen below.
The app is called Laser Level, and it was originally designed for home decorators to determine when wall hangings are perfectly level. The app overlays two “laser” lines in real time onto the camera image; these turn green when they are parallel, as seen in the image above. The authors reported 100% interobserver reliability as well as “perfect alignment of the lateral canthal position after surgery verified by manual level assessment” in all nine of their patients.
Intraoperative “navigation” refers to a group of technologies that are used to assist surgeons in locating deep anatomic structures or accurately placing implants inside patients. In orthopedics, navigation has been shown to increase the accuracy of placement of knee replacement prostheses. However, one consistent complaint among surgeons has been the inconvenience of assembling the required transmitters and registering them onto the display sensors while in the operating room. Now Brainlabs, in partnership with implant maker Smith & Nephew, has developed a system that uses an iPod Touch to replace almost the entire device.
As can be seen in the image above, the reflectors and the user interface are incorporated into the iPod Touch which is inside a case and brought sterile into the field. The alignment of the implant is read directly off the screen. The device is awaiting FDA approval in the USA and a demo is available in the iTunes App Store.
Augmented Reality in the Operating Room
Object recognition software is becoming increasingly sophisticated, even on mobile devices. A great example is the remarkable iPhone application MagicPlan, which can almost “magically” draw the floor plan of a room by using the iPhone camera to determine the distance to each corner. The user points the iPhone camera at each corner of the room and the app draws green lines, which are approved by the user. Once all the corners are registered, the floor plan, with actual distances, is shown.
This type of technology is termed “augmented reality” and generally defined as the practice of combining or superimposing computer generated data onto live-obtained images. Familiar examples include televised football where a virtual first down line is superimposed onto the field or televised soccer where advertisements seem to be displayed on the walls enclosing the playing field.
The medical uses for augmented reality are just being explored. One app currently available measures the angle between bones on an x-ray. The app is called Hallux, and it is specifically designed to guide surgeons planning reconstructive foot surgery.
In this case, the software combines readings from the iPhone accelerometer with the user-determined positions of the bones, marked using the on-screen alignment guide, to read the angle between the first two metatarsal foot bones. You can read more about it in our more detailed app review.
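The underlying geometry is straightforward. As a rough sketch (the function name and point format are illustrative, not taken from Hallux), the angle between two bone axes, each defined by a pair of user-placed points on the radiograph, could be computed like this:

```python
import math

def angle_between_deg(p1, p2, q1, q2):
    """Unsigned angle in degrees between line p1->p2 and line q1->q2.

    Each line stands in for a bone axis marked by two (x, y) points
    on the x-ray image.
    """
    vx, vy = p2[0] - p1[0], p2[1] - p1[1]
    wx, wy = q2[0] - q1[0], q2[1] - q1[1]
    dot = vx * wx + vy * wy          # proportional to cos(angle)
    cross = vx * wy - vy * wx        # proportional to sin(angle)
    return abs(math.degrees(math.atan2(cross, dot)))

# Two axes that diverge slightly, roughly like the first and second
# metatarsals on a foot x-ray:
print(round(angle_between_deg((0, 0), (0, 10), (0, 0), (2, 10)), 1))  # 11.3
```

Using atan2 on the cross and dot products avoids the precision problems of acos near 0 and 180 degrees.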
Much of surgery has to do with imaging deeper structures, such as organs and bones. Sophisticated image processing techniques make it possible to render three dimensional images, even color-coded by organ. These types of images can be invaluable for identifying pathology and planning complex surgery.
The above rendering was produced by Dr. Maki Sugimoto, a hepatobiliary surgeon who is pioneering methods of incorporating advanced imaging techniques into surgery. In this example, he uses an overhead projector to overlay the 3D image onto the actual patient during surgery, as seen below. He uses anatomic landmarks, such as the navel and the iliac crests, to properly align and scale the image.
His group reported their findings in seven surgeries, including three cholecystectomies, two gastrectomies and two colectomies. They found that live image registration helped the surgeon locate small objects and tumors intraoperatively that would otherwise have been difficult to find, and potentially helped avoid intraoperative injury.
An even more useful and convenient method of combining real and computer generated images during surgery, I propose, would be if this “overlay” image could be displayed on an iPad. In this concept, the three dimensional image would be manipulated in real time by pointing the iPad at different portions of the body, giving the sense of being able to peer inside the body. A method of simulating a live three-dimensional view on the iPad was demonstrated by the Engineering Human-Computer Interaction Research Group in France (below).
By using the iPad’s accelerometer, different aspects of a computer-generated 3D object are displayed as the iPad is tilted (video). This simulates, without glasses, the familiar stereoscopic 3D displays that rely on polarizing glasses. The group even demonstrated that, using the front-facing camera and face detection algorithms, the same 3D experience can be simulated as the user looks at the iPad from different angles. A demo version of the software is available for download in the iTunes App Store.
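A hedged sketch of the core idea, assuming a simple layered renderer (all names and the gain constant here are illustrative, not from the group’s actual software): map the device’s tilt to a shift of the rendered viewpoint, so that layers at different depths move by different amounts, which is what creates the illusion of looking “around” the object.

```python
def parallax_shift(pitch_deg, roll_deg, depth, gain=1.5, max_px=80.0):
    """Screen-space (dx, dy) shift in pixels for a layer at a given depth.

    pitch_deg / roll_deg come from the accelerometer. Layers at the
    screen plane (depth 0.0) stay fixed while the farthest layer
    (depth 1.0) shifts fully with the tilt; the clamp keeps extreme
    tilts from pushing the scene off-screen.
    """
    clamp = lambda v: max(-max_px, min(max_px, v))
    return clamp(roll_deg * gain * depth), clamp(pitch_deg * gain * depth)

# A layer at the screen plane never moves...
print(parallax_shift(10.0, 20.0, depth=0.0))   # (0.0, 0.0)
# ...while the deepest layer shifts fully with the tilt:
print(parallax_shift(10.0, 20.0, depth=1.0))   # (30.0, 15.0)
```

The face-detection variant works the same way, except the offset is driven by the viewer’s head position in the front camera rather than by device tilt.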
Going further, perhaps the remarkable technology developed by Microsoft for its Kinect could be incorporated into surgery. The Kinect was developed as a gaming tool to allow players to interact with objects on the screen by moving their own bodies. It works by projecting a dense yet invisible mesh of infrared dots into a room and, by rapidly reading the reflection of the infrared light, it can track the movement of people. The image below is from Matt Cutts’ blog.
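As a toy illustration of what can be done with the resulting depth map (this is a deliberately naive sketch, not how the Kinect software actually works), the pixel nearest to the sensor is a crude proxy for a hand reaching toward it:

```python
def nearest_point(depth_mm):
    """Find the closest valid reading in a 2D depth map.

    depth_mm is a grid of per-pixel distances in millimeters, with 0
    meaning 'no reading' (the infrared dot was not reflected back).
    Returns (distance, x, y), or None if there are no valid pixels.
    """
    best = None
    for y, row in enumerate(depth_mm):
        for x, d in enumerate(row):
            if d > 0 and (best is None or d < best[0]):
                best = (d, x, y)
    return best

# A 3x3 toy depth map: the 'hand' at (1, 1) is closest, at 800 mm.
frame = [
    [0,    1500, 1500],
    [1500,  800, 1500],
    [1500, 1500,    0],
]
print(nearest_point(frame))  # (800, 1, 1)
```

Real skeletal tracking is far more sophisticated, segmenting the depth map into body parts frame by frame, but the principle of reasoning over per-pixel distances is the same.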
The Kinect has already been “hacked” to work inside an operating room by allowing a surgeon to manipulate the display of CT & MRI images from across the room, hands-free.
From there, it is not hard to imagine the same technology being used to track the movement of a surgeon’s hands or instruments. What could be a great leap forward would be to combine the position of the surgeon’s instruments with live-registered three dimensional anatomy images and thus simulate the instruments inside the body. A Kinect-like device could “watch” the surgeon’s hands and instruments outside the body while the iPad is pointed at different parts of the patient to virtually peer inside and show the instruments.
The iPad has the potential to be a game changer in surgery because of its small size, built-in sensors and wireless networking capabilities. The only restriction is the imagination of future surgical innovators. Even if the above predictions fail to materialize, it is safe to say that what the future holds is only barely imagined today.
Armstrong, D. G., Giovinco, N., Mills, J. L., & Rogers, L. C. (2011). FaceTime for Physicians: Using Real Time Mobile Phone-Based Videoconferencing to Augment Diagnosis and Care in Telemedicine. Eplasty, 11, e23.
Mezzana, P., Scarinci, F., & Marabottini, N. (2011). Augmented Reality in Oculoplastic Surgery: First iPhone Application. Plastic and Reconstructive Surgery, 127(3), 57e–58e.
Sugimoto, M., et al. (2010). Image overlay navigation by markerless surface registration in gastrointestinal, hepatobiliary and pancreatic surgery. Journal of Hepato-Biliary-Pancreatic Sciences, 17(5), 629–636.