Intuitive Assessment of Mortality Based on Facial Characteristics: Behavioral, Electrocortical, and Machine Learning Analyses
Twelve self-identified intuitives viewed 404 facial photographs (50% of deceased individuals, 50% of living) balanced across 8 visual characteristics. Overall accuracy was 53.6% versus the 50% chance level (p = 0.005), and 5 of 12 participants were individually significant. Performance was best for recent deaths (56.8%, p < 0.002) compared with old (51.7%) and very old (50.2%) deaths. 32-channel EEG revealed an early visual ERP difference (~100 ms, right parieto-occipital) between correct and incorrect classifications of deceased photographs (cluster-corrected p < 0.05). Machine-learning classifiers (random forest, logistic regression) trained on 11 image features failed to exceed chance, arguing against simple visual cues as the basis of performance. Results suggest some individuals can weakly discriminate mortality status from facial photographs via an unknown mechanism.
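The per-participant significance claim rests on comparing a binary hit count against a 50% chance rate over 404 trials. A minimal sketch of such a test, using an exact two-sided binomial test implemented with only the standard library (the specific hit count of 230 below is illustrative, not a reported result, and the abstract does not state which test the authors actually used):

```python
from math import comb

def binom_two_sided_p(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test p-value for the symmetric
    p = 0.5 case: double the smaller tail probability, capped at 1."""
    lower = sum(comb(n, i) for i in range(0, k + 1)) * p**n
    upper = sum(comb(n, i) for i in range(k, n + 1)) * p**n
    return min(1.0, 2 * min(lower, upper))

# One participant viewing 404 photos at a 50% base rate (trial count
# from the abstract); 230/404 correct (~56.9%) is a hypothetical score.
print(binom_two_sided_p(230, 404))  # well below 0.05

# A score exactly at chance (202/404) is not significant.
print(binom_two_sided_p(202, 404))
```

Pooling all 12 participants into one 4,848-trial test would be anti-conservative if responses are correlated within participants, which may be why per-participant tests are reported separately.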