Led by researchers in CMU's
Dietrich College of Humanities and Social Sciences, the study had a
group of actors look at words such as anger, disgust, envy, fear,
happiness, lust, pride, sadness, and shame. As they did so, the actors
tried to bring themselves into each emotional state. An fMRI scanner
monitored their brain activity, and a computer modeled the resulting patterns.
Based on these scans, the
computer model could then correctly guess the emotion of the actors when
they were shown a series of evocative photos. Each emotion essentially
had a neural signature. The patterns of brain activity the computer
learned were not limited to those individuals. Trained only on the
actors' scans, the model could correctly identify the
emotions of a new test subject who had not participated in the earlier
trials.
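
To make the cross-subject claim concrete, here is a minimal sketch of that kind of decoding pipeline. The article does not describe the method, so the choices below are assumptions for illustration: each scan is summarized as a vector of voxel activations, a simple Gaussian Naive Bayes classifier is used, generalization to new people is checked by leaving one subject out at a time, and the data are synthetic stand-ins.

```python
# Illustrative sketch only: synthetic "fMRI" features and a simple classifier,
# not the study's actual data or model.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_trials, n_voxels, n_emotions = 10, 18, 500, 9

# Each emotion gets a faint activation pattern shared across subjects,
# buried under subject- and trial-specific noise.
emotion_patterns = rng.normal(size=(n_emotions, n_voxels))
X, y, groups = [], [], []
for subject in range(n_subjects):
    for trial in range(n_trials):
        emotion = trial % n_emotions
        X.append(emotion_patterns[emotion] + rng.normal(scale=3.0, size=n_voxels))
        y.append(emotion)
        groups.append(subject)
X, y, groups = np.array(X), np.array(y), np.array(groups)

# Leave-one-subject-out: train on all but one subject, test on the held-out
# subject, mirroring the claim that the signatures transfer to new people.
scores = cross_val_score(GaussianNB(), X, y, groups=groups,
                         cv=LeaveOneGroupOut())
print(f"Mean held-out-subject accuracy: {scores.mean():.2f} "
      f"(chance = {1 / n_emotions:.2f})")
```

If the shared emotion patterns really are present in the data, accuracy on held-out subjects rises well above the one-in-nine chance level, which is the essence of the result described above.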