Task-driven neural network models predict neural dynamics of proprioception. (https://pubmed.ncbi.nlm.nih.gov/38518772/)

The researchers wanted to understand how the brain knows where our body parts are without looking at them, a sense called proprioception. They focused on two stations along the proprioceptive pathway: the cuneate nucleus in the brainstem, and area 2 of somatosensory cortex.

To figure this out, they used a musculoskeletal model of the arm to simulate the signals that muscle receptors send to the brain during movement. They then trained artificial neural networks on these simulated muscle signals, with each network learning a different candidate task, and compared 16 such task hypotheses to see which one best predicted recorded brain activity. A simplified sketch of this recipe is shown below.
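For readers who want a concrete picture, here is a minimal sketch in Python. This is not the authors' code: all data are synthetic placeholders, and the single task shown here (predicting hand position from muscle-like signals) stands in for the paper's 16 hypotheses. The scoring step, linearly regressing the network's internal activity onto mock "neural" recordings, reflects the usual way such task-driven models are compared against real data.

```python
# A minimal, hypothetical sketch of the task-driven modeling recipe:
# 1) simulate muscle-spindle-like inputs, 2) train a network on a candidate
# task, 3) score the hypothesis by how well the network's hidden activity
# linearly predicts "recorded" neural activity. All data here are synthetic.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for muscle signals: 39 muscle channels driven by a
# shared hidden "limb state" (the paper derives these from a musculoskeletal
# arm model; here they are just structured random data).
n_samples, n_muscles = 2000, 39
latent = rng.normal(size=(n_samples, 5))              # hidden limb state
spindles = latent @ rng.normal(size=(5, n_muscles))   # muscle inputs
spindles += 0.1 * rng.normal(size=spindles.shape)     # sensor noise

# One candidate task hypothesis: predict hand position from muscle signals.
hand_pos = latent @ rng.normal(size=(5, 3))

# Mock "recorded" neural activity (standing in for cuneate nucleus or
# area 2 firing rates), also driven by the same hidden limb state.
neural = np.maximum(latent @ rng.normal(size=(5, 40)), 0)

X_tr, X_te, y_tr, y_te, n_tr, n_te = train_test_split(
    spindles, hand_pos, neural, random_state=0)

# Task-driven model: trained on the task only, never on the neural data.
net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
net.fit(X_tr, y_tr)

def hidden(model, X):
    # Hidden-layer activations (one layer, ReLU = MLPRegressor's default).
    return np.maximum(X @ model.coefs_[0] + model.intercepts_[0], 0)

# Score the hypothesis: ridge-regress activations onto neural activity and
# report held-out R^2, i.e. how "brain-like" the trained network's code is.
reg = Ridge(alpha=1.0).fit(hidden(net, X_tr), n_tr)
print("neural predictivity (R^2):", reg.score(hidden(net, X_te), n_te))
```

In the paper, this comparison is repeated for each of the 16 task hypotheses, and the task whose trained networks best predict the recordings is taken as the best account of what those brain areas compute.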

They found that these trained networks could predict how neurons in the cuneate nucleus and area 2 respond during limb movements, just from the simulated muscle signals. Networks trained to estimate the limb's state, its position and movement, matched the recorded activity best, suggesting that these brain areas work out where our arms are even when we are not looking at them. The researchers also found that neural activity was modulated during voluntary movements, which they interpret as the brain sending top-down signals that tell these areas to pay extra attention when we move on purpose.

Marin Vargas A, Bisi A, Chiappa AS, Versteeg C, Miller LE, Mathis A. Task-driven neural network models predict neural dynamics of proprioception. Cell. 2024 Mar 28;187(7):1745-1761.e19. doi: 10.1016/j.cell.2024.02.036. Epub 2024 Mar 21. PMID: 38518772.