Tyler Wortman was a PhD student at MIT when his adviser, Alex Slocum, came to him with a project idea.
Edward Perez, a dermatologist friend of Slocum’s, explained that skin cancer detection is as much an art as a science. Everyone uses visual cues for diagnosis, but the best specialists incorporate touch as well. Just as many doctors use palpation to diagnose other forms of cancer, the best dermatologists feel for changes in tissue stiffness to differentiate malignant tumors from harmless lesions.
While a ton of research has focused on using visual cues to automatically diagnose skin cancer (especially the “ABCDs”: asymmetry, border irregularity, color variation, and diameter), Wortman and Slocum wondered about building a device that could “feel” the difference in stiffness associated with malignant tumors.
Wortman, a mechanical engineer by training, knew stiffness was a simple property linking force and deflection: apply a known force, measure the resulting deflection, done.
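In the simplest linear model, that relationship is just k = F / x: a lesion that deflects 0.5 mm under a 1 N load has an effective stiffness of 2 N/mm (numbers purely illustrative). Since malignant tissue tends to be stiffer than the surrounding skin, it deflects less under the same force.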
But how do you measure stiffness without spending an arm and a leg on a robotic contraption that applies physical force to areas of the skin?
Even more troubling, existing research indicated that measuring stiffness at a single point was insufficient; the best solutions produce a stiffness map of an entire area of tissue.
Wortman decided the most cost-effective way to apply controllable forces to tissue was with a vacuum: by encasing the tissue in an airtight enclosure and evacuating the air in a controlled manner, he could set the pressure difference across the skin, and with it the net force (the pressure difference times the area of the chamber opening).
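As a back-of-the-envelope sketch of that calculation (the aperture size, pressures, and deflection below are invented placeholders, not the dimensions of Wortman’s device):

    % Net force on the skin from a partial vacuum (all values illustrative)
    d_aperture = 0.020;                  % chamber opening diameter, m (assumed)
    A = pi * (d_aperture / 2)^2;         % aperture area, m^2
    P_ambient = 101.3e3;                 % atmospheric pressure, Pa
    P_chamber =  91.3e3;                 % evacuated chamber pressure, Pa
    F = (P_ambient - P_chamber) * A;     % net force drawing the tissue upward, N

    deflection = 0.5e-3;                 % deflection reported by the optics, m
    k = F / deflection;                  % effective stiffness, N/m

Stepping the chamber through several pressure levels yields a series of force-deflection pairs, so stiffness can be estimated at every point the optical system can see.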
The next issue was measuring deflection. While he experimented with different types of sensors, including an intriguing inductive distance sensor from Texas Instruments, he ultimately decided that measuring deflection optically, with a structured-light setup, struck the right balance between cost and resolution.
By projecting a pattern of dots onto the tissue from an offset angle, he could observe how each dot’s position shifted and triangulate its (X, Y, Z) coordinates, building up a surface contour of the skin.
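A minimal sketch of that triangulation, assuming an idealized pinhole camera and a single dot whose projection angle is known (the focal length, baseline, and angle below are placeholders; a real system would use calibrated values for every dot in the pattern):

    % Triangulate one dot's 3-D position from where it lands in the image.
    f     = 800;            % camera focal length, pixels (assumed)
    b     = 0.05;           % camera-to-projector baseline along X, m (assumed)
    theta = deg2rad(20);    % this dot's projection angle from vertical (assumed)

    u = 42;  v = -17;       % observed dot position, pixels from image center

    % Camera ray:    X = Z*u/f,  Y = Z*v/f
    % Projector ray: X = b - Z*tan(theta)
    % Setting the two X expressions equal and solving for depth:
    Z = b * f / (u + f * tan(theta));
    X = Z * u / f;
    Y = Z * v / f;
    fprintf('Dot at (%.2f, %.2f, %.2f) mm\n', 1e3*X, 1e3*Y, 1e3*Z);

As the tissue deflects, each dot slides across the image; repeating this calculation for every dot in every frame is what builds the surface contour.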
Implementation
With the design sketched out, the next challenge was to actually implement a prototype.
Most structured-light research systems use off-the-shelf projectors that would be far too bulky and expensive to integrate into his prototype. Instead, he used a COB LED, gobo plate, and a standard M12 camera lens to illuminate a dot pattern carved into the gobo and project it out into space.
He used an off-the-shelf camera designed for computer vision work that had good support for MATLAB.
He used an off-the-shelf industrial pressure sensor with a 0-to-5 V analog output, along with a 12 V vacuum pump.
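Converting that analog output back into a pressure reading is a one-line linear mapping; the full-scale range below is an assumption, since the article doesn’t name the exact sensor:

    % Map the sensor's 0-to-5 V output to engineering units
    v_sensor       = 2.3;    % measured sensor output, V
    full_scale_kPa = 100;    % sensor full-scale range, kPa (assumed)
    pressure_kPa   = (v_sensor / 5.0) * full_scale_kPa;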
Wortman decided the prototype would be too unwieldy to use if the dermatologist needed to continually go back and forth between the patient and the computer to start data collection, so he also integrated a simple push button into the design to trigger capture.
The main snag he hit was figuring out how to integrate all of these subsystems. He considered using an Arduino, but he didn’t know the best way to interface it with MATLAB, and he wanted to provide a seamless experience for the dermatologist who would evaluate the system on patients.
To this end, he decided to use a Treehopper board to tie the components together. MATLAB can directly call into .NET and Java assemblies, so bringing Treehopper into his MATLAB workflow was seamless. And because Treehopper boards enumerate as native USB devices rather than old-school COM ports, Wortman’s MATLAB code could detect the board instantly, without any user configuration.
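As a sketch of what that integration can look like from MATLAB (the assembly path, the pin assignments Pins.Item(0) and Pins.Item(1), and the active-low button wiring are all assumptions; the calls mirror Treehopper’s .NET API, but treat this as an outline rather than Wortman’s actual code):

    % Load the Treehopper .NET assembly (path is an assumption)
    NET.addAssembly('C:\Treehopper\Treehopper.dll');

    % The board enumerates as a native USB device, so there is no COM port
    % to hunt for; just grab the first Treehopper attached to the system:
    board = Treehopper.ConnectionService.Instance.GetFirstDeviceAsync().Result;
    board.ConnectAsync().Wait();

    pressurePin = board.Pins.Item(0);       % pressure sensor input (assumed pin)
    pressurePin.Mode = Treehopper.PinMode.AnalogInput;

    buttonPin = board.Pins.Item(1);         % push button input (assumed pin)
    buttonPin.Mode = Treehopper.PinMode.DigitalInput;

    % Block until the dermatologist presses the button, then sample the
    % chamber pressure and kick off a structured-light capture:
    while buttonPin.DigitalValue            % assumes an active-low button
        pause(0.01);
    end
    v = pressurePin.AnalogVoltage;          % sensor voltage at the pin
    fprintf('Capture triggered; pressure sensor reads %.2f V\n', v);

Because the device shows up as an ordinary USB peripheral, none of this requires the user to pick a port or configure anything, which is exactly the seamless experience Wortman was after.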
He designed a custom PCB to bring all of these components together, and he was off to the races. He’s currently investigating commercialization of the project, and his preliminary design work will appear in the June 2018 issue of the Journal of Medical Devices; check out a PDF of the article here.