Colorectal cancer kills some 500,000 people every year. It is the second most common form of cancer in women and the third most common in men.
Some 80 per cent of these cancers start as colorectal polyps, which are relatively easy to spot and remove. But conventional colonoscopies are time-consuming, invasive and expensive, so various groups are looking for better ways to do the job.
One of the more promising options is colon capsule endoscopy. This involves a tiny digital camera, a light, a transmitter and a battery in a capsule which the patient swallows. As the capsule passes through the patient’s digestive system, it transmits images wirelessly to a recording device that the patient carries.
That’s handy for the patient, who can continue his or her routine more or less as usual. But it’s not so good for medical staff, who have to analyse the images later. With the camera taking pictures at anything up to 30 frames per second, that can mean long hours studying the images for every patient.
Today, Alexander Mamonov at the University of Texas at Austin and a few pals unveil an algorithm that can do the job automatically. This program examines each image in the sequence for the tell-tale signs of a polyp and flags up potential candidates for more detailed analysis.
Mamonov and co’s algorithm uses two techniques for spotting polyps. One key difference between a polyp and healthy tissue is that a polyp protrudes from its surroundings. So the algorithm homes in on protrusions in an attempt to spot frames that contain polyps.
This is no simple task. The difficulty is to differentiate between a polyp and the many natural folds that occur in healthy tissue. To do this, the algorithm measures the curvature of the tissue using a sphere-fitting technique. The radius of the sphere that best fits the tissue fold is then a measure of the curvature.
Mamonov and co can then set a threshold curvature above which the frames are flagged for further investigation.
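The sphere-fitting idea can be sketched in a few lines of code. The sketch below fits a sphere to a patch of 3D surface points by linear least squares and flags the patch when the best-fit radius falls below a threshold (a small radius means a tightly curved, polyp-like protrusion). The `max_radius` value and the helper names are illustrative assumptions, not the paper’s actual parameters or pre-processing pipeline.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit to an (n, 3) array of surface points.

    Uses the algebraic sphere equation
        x^2 + y^2 + z^2 + D*x + E*y + F*z + G = 0,
    which makes the fit a linear least-squares problem in (D, E, F, G).
    Returns (centre, radius).
    """
    pts = np.asarray(points, dtype=float)
    A = np.hstack([pts, np.ones((len(pts), 1))])
    b = -np.sum(pts ** 2, axis=1)
    (D, E, F, G), *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = -0.5 * np.array([D, E, F])
    radius = np.sqrt(centre @ centre - G)
    return centre, radius

def looks_like_polyp(patch_points, max_radius=5.0):
    """Flag a patch as polyp-like when its best-fit sphere is tight.

    A small radius corresponds to high curvature, i.e. a protrusion.
    The 5.0 threshold is a made-up illustrative value, not the one
    Mamonov and co use.
    """
    _, radius = fit_sphere(patch_points)
    return radius < max_radius
```

In practice the threshold would be tuned on labelled frames so that natural tissue folds, which curve more gently, stay below the flagging criterion.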
Another important feature of polyps is their texture, which tends to be much rougher than that of healthy tissue. So the algorithm automatically discards frames that have too little texture in them. However, this process is confounded by bubbles and froth, which can make an image appear far more textured than the tissue really is. So the algorithm also discards frames with too much texture.
The result is a process that assesses each frame according to two criteria.
So how well does this algorithm work? Mamonov and co have put it through its paces on a data set consisting of almost 19,000 images, 230 of which contain polyps. In this test, the algorithm detected polyps correctly 47 per cent of the time (with a low rate of false positives).
That performance needs to be put in context. The images are not always straightforward to examine, mainly because of shadows cast by the natural curves and folds in colons. These shadows can easily obscure polyps, making them hard to spot.
However, most polyps appear in several frames as the capsule moves through the digestive system, and this provides several opportunities to spot each one.
So a much better way to measure the performance of the algorithm is to look at its ability to spot polyps in the sequence of images in which they appear, rather than in each frame.
By this measure, the algorithm achieves a much more respectable recognition rate of 81 per cent. The 230 images of interest actually contained only 16 different polyps, of which the algorithm successfully spotted 13. “The algorithm correctly detects 13 out of 16 polyps in at least one frame of each corresponding sequence,” say Mamonov and co.
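The difference between the two measures is easy to make concrete. In the sketch below, detections are grouped by polyp, and a polyp counts as found if it is flagged in at least one frame of its sequence; the toy data is invented for illustration, not taken from the paper.

```python
def per_frame_recall(detections):
    """Fraction of polyp-bearing frames that were flagged.

    `detections` maps a polyp id to the list of per-frame boolean
    results for the frames in which that polyp appears.
    """
    flags = [flag for frames in detections.values() for flag in frames]
    return sum(flags) / len(flags)

def per_polyp_recall(detections):
    """Fraction of polyps flagged in at least one of their frames.

    This is the sequence-level measure by which Mamonov and co report
    13 of 16 polyps detected.
    """
    hits = [any(frames) for frames in detections.values()]
    return sum(hits) / len(hits)

# Hypothetical example: three polyps, six polyp-bearing frames.
toy = {
    "polyp_a": [False, True, False],  # caught in one of three frames
    "polyp_b": [False, False],        # missed entirely
    "polyp_c": [True],                # caught in its only frame
}
```

Here the per-frame recall is 2/6 but the per-polyp recall is 2/3, which is why the sequence-level figure can be so much higher than the frame-level one.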
That’s not perfect, of course. “While our approach is by no means an ultimate solution of the automated polyp detection problem, the achieved performance makes this work an important step towards a fully automated polyp detection procedure,” they say.
That’s an honest assessment. What Mamonov and co have developed is a useful stepping stone towards the automated detection of polyps, a goal that has the potential to save countless lives in future.