Brain and body are fundamentally intertwined. A recent study by Cazettes et al. reveals how latent decision variables in the mouse brain can be read out from facial expressions. The finding shows how covert cognitive computations ‘leak’ into the periphery, and it opens new opportunities for tracking cognitive processes through bodily measurements.

This work matters because it shifts how we study cognition. If peripheral signals such as whisker twitches, eye movements, or subtle facial muscle changes reliably mirror internal computations, scientists can track mental states during natural behavior. That opens paths to studying learning, attention, and spontaneous decision-making in animals and people as they behave freely, rather than while restrained by lab equipment.

For human potential, these findings suggest new tools for inclusive research and real-world applications: imagine monitoring attention or learning progress through noninvasive sensors in classrooms, clinics, or everyday life. Follow the full article to see how covert brain activity shows up on the face, and what that might mean for expanding access to cognitive measurement and supporting diverse learners and patients.