Abstract
Social interaction supports brain health and recovery after neurological injury, yet no validated tool exists for measuring it in real time in individuals with or without neurological deficits. We developed SocialBit, a lightweight, privacy-preserving machine learning algorithm that detects social interactions from ambient audio features on a commercial smartwatch. In a prospective validation study, we evaluated SocialBit against minute-by-minute human-coded ground truth from a livestream in 153 hospitalized stroke patients who wore the device for up to 8 days, yielding 88,918 min of observation. Stroke severity and cognition in these patients spanned broad clinical ranges (NIH Stroke Scale 0-25; Montreal Cognitive Assessment 8-30), and 24 patients had aphasia of diverse subtypes, including severe presentations. SocialBit achieved high overall performance (sensitivity 0.87, specificity 0.88, area under the curve 0.94) and maintained accuracy in patients with language deficits (AUC 0.93). Despite its lower temporal sampling rate, SocialBit produced interaction frequency distributions closely matching minute-by-minute human coding, and performance was robust across environments and interaction types. Of clinical relevance, SocialBit showed that patients with more severe strokes engaged in less social interaction, paralleling human-coded results. SocialBit is thus an accurate digital biomarker of social interaction with potential applications in remote monitoring and clinical trials.