Citation: SHEN JL, GUAN AH, WANG XY, et al. Facial color-preserving generative adversarial network-based privacy protection of facial diagnostic images in traditional Chinese medicine. Digital Chinese Medicine, 2025, 8(4): 455-466. DOI: 10.1016/j.dcmed.2025.12.002

Facial color-preserving generative adversarial network-based privacy protection of facial diagnostic images in traditional Chinese medicine

  • Objective To develop a facial image generation method based on a facial color-preserving generative adversarial network (FCP-GAN) that effectively decouples identity features from diagnostic facial complexion characteristics in traditional Chinese medicine (TCM) inspection, thereby addressing the critical challenge of privacy preservation in medical image analysis.
    Methods A facial image dataset was constructed from participants at Nanjing University of Chinese Medicine between April 23 and June 10, 2023, using TCM full-body inspection data acquisition equipment under controlled illumination. The proposed FCP-GAN model was designed to achieve the dual objectives of removing identity features and preserving facial colors through three key components: (i) a multi-space combination module that comprehensively extracts color attributes from the red, green, blue (RGB); hue, saturation, value (HSV); and Lab color spaces; (ii) a generator incorporating an efficient channel attention (ECA) mechanism to enhance the representation of diagnostically critical color channels; and (iii) a dual-loss function that combines an adversarial loss for de-identification with a dedicated color preservation loss. The model was trained and evaluated using a stratified 5-fold cross-validation strategy and compared against four baseline generative models: conditional GAN (CGAN), deep convolutional GAN (DCGAN), dual discriminator CGAN (DDCGAN), and medical GAN (MedGAN). Performance was assessed in terms of image quality (peak signal-to-noise ratio, PSNR; structural similarity, SSIM), distribution similarity (Fréchet inception distance, FID), privacy protection (face recognition accuracy), and diagnostic consistency (mean squared error, MSE; Pearson correlation coefficient, PCC).
    Results The final analysis included facial images from 216 participants. Compared with baseline models, FCP-GAN achieved superior performance, with PSNR = 31.02 dB and SSIM = 0.908, representing improvements of 1.21 dB in PSNR and 0.034 in SSIM over the strongest baseline (MedGAN). The FID value (23.45) was also the lowest among all models, indicating superior distributional similarity to real images. Ablation studies showed that the multi-space feature fusion and the ECA mechanism contributed significantly to these performance gains. The stratified 5-fold cross-validation confirmed the model's robustness, with results reported as mean ± standard deviation (SD) across all folds. The model effectively protected privacy by reducing face recognition accuracy from 95.2% (original images) to 60.1% (generated images). Critically, it maintained high diagnostic fidelity, as evidenced by a low MSE (< 0.051) and a high PCC (> 0.98) for key TCM facial features between original and generated images.
    Conclusion The FCP-GAN model provides an effective technical solution for ensuring privacy in TCM diagnostic imaging, removing identity features while preserving clinically vital facial color features. This study offers significant value for developing intelligent and secure TCM telemedicine systems.
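The multi-space combination module described in the Methods extracts color attributes from the RGB, HSV, and Lab spaces. As a minimal illustration of what such a per-pixel extraction might look like (the function names and the use of stdlib `colorsys` plus a standard sRGB → XYZ → CIE Lab conversion under a D65 white point are assumptions, not the paper's implementation):

```python
import colorsys


def _srgb_to_linear(c):
    # Undo the sRGB gamma curve before the linear XYZ transform.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4


def rgb_to_lab(r, g, b):
    """sRGB components in [0, 1] -> CIE Lab (D65 reference white)."""
    r, g, b = (_srgb_to_linear(c) for c in (r, g, b))
    # Linear sRGB -> XYZ (D65 matrix).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    xn, yn, zn = 0.95047, 1.0, 1.08883  # D65 white point

    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)


def multi_space_features(r, g, b):
    """Concatenate RGB, HSV, and Lab attributes for one pixel,
    mirroring the idea of a multi-space combination module."""
    return (r, g, b) + colorsys.rgb_to_hsv(r, g, b) + rgb_to_lab(r, g, b)
```

Concatenating the three representations gives the generator redundant views of the same complexion information, so color shifts that are subtle in one space (e.g. a small hue change) remain prominent in another.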
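The ECA mechanism in the generator reweights feature channels after squeezing each channel to a single descriptor. A minimal sketch of that squeeze-and-gate pattern, assuming circular padding and uniform (untrained) convolution weights purely for illustration; the actual model would learn the kernel weights:

```python
import math


def eca_reweight(feature_maps, kernel_size=3):
    """ECA-style channel attention sketch: global-average-pool each
    channel, run a 1D convolution across channel descriptors, gate
    with a sigmoid, and rescale the channels by their gates."""
    # Squeeze: one descriptor per channel (global average pooling).
    desc = [sum(ch) / len(ch) for ch in feature_maps]
    c = len(desc)
    half = kernel_size // 2
    # Uniform kernel stands in for learned weights (assumption).
    w = [1.0 / kernel_size] * kernel_size
    gates = []
    for i in range(c):
        s = sum(w[j] * desc[(i - half + j) % c] for j in range(kernel_size))
        gates.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid gate in (0, 1)
    # Excite: rescale each channel by its attention gate.
    return [[v * g for v in ch] for ch, g in zip(feature_maps, gates)]
```

Because the 1D convolution spans only neighboring channel descriptors, ECA captures local cross-channel interaction with very few parameters, which is why it suits emphasizing a handful of diagnostically critical color channels.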
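Of the evaluation metrics listed above, PSNR, MSE, and PCC have simple closed forms. A self-contained sketch of those three (operating on flattened pixel or feature sequences; SSIM and FID are omitted because they require windowed statistics and a pretrained Inception network, respectively):

```python
import math


def mse(x, y):
    """Mean squared error between two equal-length sequences."""
    return sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)


def psnr(x, y, max_val=1.0):
    """Peak signal-to-noise ratio in dB; higher means closer images."""
    m = mse(x, y)
    return float("inf") if m == 0 else 10 * math.log10(max_val ** 2 / m)


def pcc(x, y):
    """Pearson correlation coefficient between two feature sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

In the reported results, MSE and PCC are computed between TCM color features of the original and generated images, so a low MSE together with a PCC near 1 indicates that de-identification left the diagnostic complexion signal essentially unchanged.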
