Using selfies to detect cancer

August 29, 2017

The BiliScreen smartphone app used with a 3-D printed box that helps control lighting conditions to detect signs of jaundice in a person’s eye (Image by Dennis Wise/University of Washington)

Pose, snap, post. In the last few years, taking selfies has evolved from a curious social media-focused practice to a ubiquitous automatic reaction of global proportions. Now, researchers from the University of Washington believe they can turn the public’s prevailing penchant for self-portraits into something that could detect pancreatic cancer.

Pancreatic cancer is a very aggressive form of cancer. It is estimated to have caused 411,600 deaths in 2015 and is the fourth most common cause of cancer deaths in North America. It has one of the worst survival rates because its symptoms typically appear late in the progression of the disease.

Early detection can give patients a better chance at survival, and researchers at UW are working on a mobile app that lets people snap smartphone selfies that could aid in screening for pancreatic cancer and other diseases.

One of the symptoms of pancreatic cancer is jaundice.

“Jaundice is only recognizable to the naked eye in severe stages, but a ubiquitous test using computer vision and machine learning can detect milder forms of jaundice,” according to a study by researchers Alex Mariakakis, Megan A. Banks, Lauren Phillipi, Lei Yu, James Taylor, and Shwetak Patel. “We propose BiliScreen, a smartphone app that captures pictures of the eye and produces an estimate of a person’s bilirubin level, even at levels normally undetectable by the human eye.”

 


BiliScreen is a smartphone camera app which uses machine learning to estimate the extent of jaundice in the sclera – the white outer layer of the eyeball.

The researchers focused on measuring jaundice in the eye because the sclera looks much the same regardless of race, unlike skin tone, which varies widely across ethnicities.

Jaundice in the skin and eyes is caused by a buildup of bilirubin in the blood. Bilirubin is a substance found in bile. It is produced when the liver breaks down old red blood cells.
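How might an app turn that colour change into a number? The paper’s actual model isn’t described in this article, but the general idea of mapping colour features measured on the sclera to a bilirubin estimate can be sketched with an off-the-shelf regression model. In the illustrative Python below, the colour features, the random-forest model and the synthetic training data are all assumptions, not BiliScreen’s real pipeline.

```python
# Illustrative sketch only: regress a bilirubin estimate (mg/dl) from colour
# features of the sclera. Features, model choice and synthetic data are
# assumptions, not BiliScreen's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def sclera_colour_features(sclera_pixels):
    """sclera_pixels: (N, 3) array of RGB values sampled from the sclera."""
    mean_rgb = sclera_pixels.mean(axis=0)
    # Crude "yellowness" cue: the blue channel drops relative to red and
    # green as bilirubin yellows the sclera.
    yellowness = (mean_rgb[0] + mean_rgb[1]) / 2.0 - mean_rgb[2]
    return np.append(mean_rgb, yellowness)

# Synthetic stand-in for training data: pixel samples paired with
# lab-measured total serum bilirubin (TSB) values in mg/dl.
rng = np.random.default_rng(0)
tsb = rng.uniform(0.5, 15.0, size=200)
pixel_sets = [np.clip(rng.normal([200, 190, 170 - 5 * t], 10.0, size=(50, 3)), 0, 255)
              for t in tsb]

X = np.array([sclera_colour_features(p) for p in pixel_sets])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, tsb)

estimate = model.predict([sclera_colour_features(pixel_sets[0])])[0]
print(f"estimated bilirubin: {estimate:.2f} mg/dl (lab value {tsb[0]:.2f})")
```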

Examining two methods for color normalization: (top left) a box similar to a head-mounted VR display that controls the amount of light that reaches the eyes, and (bottom left) paper glasses that provide colored squares for calibration. (University of Washington)

Jaundice is not apparent to the trained naked eye until roughly 3.0 mg/dl, according to the researchers. However, bilirubin levels greater than 1.3 mg/dl “warrant clinical concern.”

“There exists a detection gap between 1.3 and 3.0 mg/dl that is missed by clinicians unless a TSB (total serum bilirubin test) is requested, which is rarely done without due cause,” according to the study. “We hypothesize that diagnoses can be made much earlier and lead to better outcomes with a system that is precise enough to distinguish between bilirubin levels within and outside of those bounds.”
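For concreteness, the thresholds quoted above translate into a simple decision rule; the snippet below is only an illustration of those bounds, not part of the app.

```python
# Illustrative decision rule based on the thresholds quoted in the study:
# bilirubin above 1.3 mg/dl warrants clinical concern, but jaundice is not
# apparent to the naked eye until roughly 3.0 mg/dl.
def classify_bilirubin(level_mg_dl):
    if level_mg_dl < 1.3:
        return "below the level considered clinically concerning"
    if level_mg_dl < 3.0:
        return "clinically concerning, but usually invisible to the naked eye"
    return "clinically concerning and visible as jaundice to the naked eye"

for level in (0.8, 2.0, 4.5):
    print(f"{level} mg/dl -> {classify_bilirubin(level)}")
```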

The researchers said the BiliScreen app is able to snap pictures of the eye and produce an estimate of a person’s bilirubin level, even at levels “normally undetectable by the human eye.”

In a clinical study of 70 people, the researchers found that BiliScreen achieves a Pearson correlation coefficient of 0.89 and a mean error of -0.09 ± 2.76 mg/dl in predicting a person’s bilirubin level. With the box accessory, BiliScreen identifies cases of concern with a sensitivity of 89.7 per cent and a specificity of 96.8 per cent, according to the researchers.
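Those figures of merit are standard ones; the sketch below shows how they would be computed from paired app estimates and lab-measured TSB values. The numbers here are invented, and treating 1.3 mg/dl as the cut-off for a “case of concern” is an assumption based on the threshold quoted earlier.

```python
# How the reported metrics relate to paired measurements. The data below are
# invented; the 1.3 mg/dl cut-off for "cases of concern" is assumed from the
# threshold quoted in the study.
import numpy as np
from scipy.stats import pearsonr

lab_tsb = np.array([0.8, 1.0, 1.6, 2.4, 3.5, 5.0, 0.9, 7.2])   # ground truth, mg/dl
app_est = np.array([0.7, 1.2, 1.2, 2.1, 3.9, 4.6, 1.1, 6.8])   # app-style estimates

r, _ = pearsonr(lab_tsb, app_est)
errors = app_est - lab_tsb
print(f"Pearson r = {r:.2f}, mean error = {errors.mean():.2f} ± {errors.std(ddof=1):.2f} mg/dl")

concern = lab_tsb > 1.3        # who actually warrants clinical concern
flagged = app_est > 1.3        # who the app would flag
sensitivity = (flagged & concern).sum() / concern.sum()       # true-positive rate
specificity = (~flagged & ~concern).sum() / (~concern).sum()  # true-negative rate
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```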

Because different ambient lighting can change the apparent colour of the same scene, the researchers are evaluating two accessories that control for ambient lighting conditions.

One of the accessories is a head-worn box similar to commercially available virtual reality headpieces that attach to smartphones. The box is designed to block out ambient light. It uses the phone’s camera flash to control lighting.

The other accessory is a pair of paper glasses, similar to the 3D glasses worn by movie watchers, printed with coloured squares that serve as a reference for colour calibration.
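The article does not describe the calibration math, but a common way to exploit known colour patches like those on the glasses is to fit a colour-correction transform between the colours the camera actually recorded for the patches and the colours they are known to have, and then apply that transform to the sclera pixels. The least-squares sketch below is a generic version of that idea, not necessarily what BiliScreen does.

```python
# Generic colour-normalization sketch: fit a 3x3 correction matrix that maps
# the RGB values the camera observed for known calibration squares onto their
# true reference colours, then apply it to other pixels in the image.
import numpy as np

reference_rgb = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 255]], float)
observed_rgb  = np.array([[230, 30, 25], [20, 210, 40], [30, 25, 220], [235, 240, 230]], float)

# Solve observed @ M ≈ reference in the least-squares sense.
M, *_ = np.linalg.lstsq(observed_rgb, reference_rgb, rcond=None)

def correct(pixels_rgb):
    """Apply the fitted correction to an (N, 3) array of RGB pixels."""
    return np.clip(pixels_rgb @ M, 0, 255)

sclera_pixels = np.array([[210, 200, 150], [215, 205, 160]], float)
print(correct(sclera_pixels))
```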

Because the sclera does not have a predefined shape, BiliScreen also requires an additional segmentation step to isolate the sclera in each image.
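The study’s segmentation approach isn’t detailed in this article. As a rough illustration only, sclera pixels tend to be bright and weakly coloured, so a crude mask can be built with simple thresholds in HSV space; the synthetic image below stands in for a real eye photo.

```python
# Rough illustration of sclera segmentation, not the paper's method:
# threshold bright, low-saturation pixels in HSV space.
import cv2
import numpy as np

# Synthetic stand-in for a cropped eye photo (BGR): skin-toned background,
# a bright low-saturation "sclera" disc, and a dark "iris" in the middle.
eye_bgr = np.full((120, 160, 3), (80, 100, 140), dtype=np.uint8)
cv2.circle(eye_bgr, (80, 60), 40, (225, 235, 240), -1)   # sclera-like disc
cv2.circle(eye_bgr, (80, 60), 15, (60, 70, 40), -1)      # iris/pupil

hsv = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2HSV)

# Sclera pixels are bright (high value) and weakly coloured (low saturation).
mask = cv2.inRange(hsv, np.array([0, 0, 150]), np.array([180, 60, 255]))

# Remove small speckles with a morphological opening.
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

sclera_pixels = eye_bgr[mask > 0]    # (N, 3) BGR samples for colour analysis
print("sclera pixels found:", len(sclera_pixels))
```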

The team hopes to continue building improved models of BiliScreen and to conduct a longer-term study that captures trends in bilirubin levels over time.

The study was funded by the National Science Foundation and the Coulter Foundation.

The paper will be presented on September 13 at UbiComp 2017, the Association for Computing Machinery’s International Joint Conference on Pervasive and Ubiquitous Computing.
