Study Supports Use of a Smartphone App for Dermatologic Diagnosis

A machine learning-based mobile phone app may serve as a diagnostic support tool for a range of common skin conditions in patients of color.

Study data published in the Journal of the European Academy of Dermatology and Venereology support the feasibility of a machine learning-based mobile phone app for the diagnosis of common dermatologic diseases in patients of color.

Investigators trained a convolutional neural network-based algorithm with clinical images from 40 different skin conditions. The algorithm was used to generate a mobile health (mHealth) smartphone app. When given a clinical lesion image, the app produces a list of the most likely dermatologic diagnoses.

The performance of this app was tested in a clinical validation study of patients receiving dermatologic care at tertiary health centers in India. Dermatologists were asked to upload a photograph of a single lesion from each patient to the mHealth app. The app’s diagnostic analysis of the lesion was compared with a consensus diagnosis made by 2 board-certified physicians, who were blinded to the app’s output. Model and app performance was measured in terms of sensitivity, specificity, and area under the curve (AUC).

The study enrolled 5014 patients: 3699 from tertiary care, 383 from urban private practice, and 932 from rural primary care. The most common diagnoses were acne (n=592), tinea cruris (n=545), eczema (n=432), and psoriasis (n=394). For the diagnosis of 40 skin conditions, the model displayed a top-1 overall accuracy of 75.07%: for about 75% of patients, the “most probable” diagnosis given by the model matched the true diagnosis. The top-3 accuracy was 89.62%, indicating that nearly 90% of patients were correctly diagnosed by one of the first 3 options presented by the app.
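For readers unfamiliar with the metrics, top-1 and top-3 accuracy are typically computed by checking whether the true diagnosis appears among a model’s top-ranked predictions. The following sketch uses invented toy data, not the study’s dataset, and illustrative function names:

```python
# Illustrative sketch of top-k accuracy; diagnoses and counts are
# invented toy data, not taken from the study.

def top_k_accuracy(ranked_predictions, truths, k):
    """Fraction of cases whose true diagnosis appears in the
    model's top-k ranked predictions."""
    hits = sum(truth in ranked[:k]
               for ranked, truth in zip(ranked_predictions, truths))
    return hits / len(truths)

# Toy example: ranked differentials for 4 lesions
preds = [
    ["acne", "rosacea", "eczema"],
    ["psoriasis", "eczema", "tinea corporis"],
    ["eczema", "psoriasis", "acne"],
    ["vitiligo", "melasma", "pityriasis alba"],
]
truths = ["acne", "eczema", "tinea cruris", "vitiligo"]

print(top_k_accuracy(preds, truths, 1))  # 0.5
print(top_k_accuracy(preds, truths, 3))  # 0.75
```

In this toy example, top-3 accuracy exceeds top-1 accuracy because one lesion’s true diagnosis is ranked second rather than first, mirroring the gap between the study’s 75.07% and 89.62% figures.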

Mean AUC was 0.90 ± 0.07, indicating excellent discrimination between the actual diagnosis and other conditions. Mean top-1 specificity was 99.11%, suggesting a very low chance of false positives. Minor differences in overall accuracy were observed between facility types, with urban practices showing the highest accuracy and tertiary care centers showing the lowest accuracy. Top-1 and top-3 sensitivity was high for most diseases.
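In a multi-class setting like this one, per-disease sensitivity and specificity are conventionally derived one-vs-rest: sensitivity is the fraction of patients with a given disease whose top prediction matches it, and specificity is the fraction of patients without the disease who are not assigned it. A minimal sketch, using invented counts rather than the study’s data:

```python
# One-vs-rest sensitivity and specificity for a single diagnosis.
# All diagnoses and labels below are invented toy data.

def sens_spec(top1_preds, truths, disease):
    """Return (sensitivity, specificity) for one disease,
    treating all other diagnoses as the negative class."""
    tp = sum(p == disease and t == disease for p, t in zip(top1_preds, truths))
    fn = sum(p != disease and t == disease for p, t in zip(top1_preds, truths))
    tn = sum(p != disease and t != disease for p, t in zip(top1_preds, truths))
    fp = sum(p == disease and t != disease for p, t in zip(top1_preds, truths))
    return tp / (tp + fn), tn / (tn + fp)

preds  = ["acne", "acne", "eczema", "psoriasis", "acne"]
truths = ["acne", "eczema", "eczema", "psoriasis", "acne"]
print(sens_spec(preds, truths, "acne"))
```

Here sensitivity for "acne" is 1.0 (both true acne cases are caught), while one eczema case mislabeled as acne lowers the specificity, which is the kind of false positive the study’s 99.11% figure indicates is rare.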

The app was most sensitive to hidradenitis suppurativa, basal cell carcinoma, vitiligo, acne, alopecia, molluscum contagiosum, melasma, ichthyosis, and tinea corporis. Sensitivities were lower for Bowen disease (squamous cell carcinoma in situ), bullous pemphigoid, candidiasis, fixed drug eruption, eczema, lichen sclerosus, melanocytic nevus, rosacea, and tinea pedis.

As a study limitation, investigators noted that the app did not integrate patient medical history, which may have limited diagnostic accuracy for conditions with a long clinical history and nondermatologic symptoms.

This large-scale clinical validation study supports the feasibility and utility of a machine learning-driven app for the diagnosis of common dermatologic conditions in skin of color, the researchers concluded. “Future versions of such apps need to refine the image-based diagnosis with automated analyses of patient metadata,” study authors wrote.

Disclosure: Several study authors declared affiliations with the pharmaceutical industry. Please see the original reference for a full list of authors’ disclosures.

Reference

Pangti R, Mathur J, Chouhan V, et al. A machine learning-based, decision support, mobile phone application for diagnosis of common dermatological diseases. J Eur Acad Dermatol Venereol. Published online September 29, 2020. doi:10.1111/jdv.16967