Mind-reading is becoming a reality. No, really.


niremetal

http://www.economist.com/node/21534748

IF YOU think the art of mind-reading is a conjuring trick, think again. Over the past few years, the ability to connect first monkeys and then men to machines in ways that allow brain signals to tell those machines what to do has improved by leaps and bounds. In the latest demonstration of this, just published in the Public Library of Science, Bin He and his colleagues at the University of Minnesota report that their volunteers can successfully fly a helicopter (admittedly a virtual one, on a computer screen) through a three-dimensional digital sky, merely by thinking about it. Signals from electrodes taped to the scalp of such pilots provide enough information for a computer to work out exactly what the pilot wants to do.

That is interesting and useful. Mind-reading of this sort will allow the disabled to lead more normal lives, and the able-bodied to extend their range of possibilities still further. But there is another kind of mind-reading, too: determining, by scanning the brain, what someone is actually thinking about. This sort of mind-reading is less advanced than the machine-controlling type, but it is coming, as three recently published papers make clear. One is an attempt to study dreaming. A second can reconstruct a moving image of what an observer is looking at. And a third can tell what someone is thinking about.

I think the coolest one is this:

Unlike Dr Dresler, who focused on the sensorimotor cortex, which controls movement, Dr Gallant and his team looked at the visual cortex. Their method depended on the brute power of modern computing. They compared the film trailers frame by frame with fMRI images recorded as those trailers were being watched, and looked for correlations between the two. They then fed their computer 5,000 hours of clips from YouTube, a video-sharing website, and asked it to predict, based on the correlations they had discovered, what the matching fMRI pattern would look like.

Having done that, they each endured a further two hours in the machine, watching a new set of trailers. The computer looked at the reactions of their visual cortices and picked, for each clip, the 100 bits of YouTube footage whose corresponding hypothetical fMRI pattern best matched the real one. It then melded these clips together to produce an estimate of what the real clip looked like. As the pictures above show, the result was often a recognisable simulacrum of the original. It also moved (watch at gallantlab.org) in the same way as the clip it was based on.
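The pipeline the article describes has three steps: fit a model mapping clip features to fMRI responses, predict the hypothetical fMRI pattern for each clip in a large library, then reconstruct a new clip by averaging the library clips whose predicted patterns best match the observed one. A minimal sketch of that idea follows; every name, dimension, and the simple linear encoding model here are illustrative assumptions, not the Gallant lab's actual method or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (tiny) dimensions; the real study used far more data.
n_voxels = 50     # fMRI voxels in the visual cortex
n_features = 20   # motion-energy-like features per video clip
n_train = 200     # clips with both features and recorded fMRI
n_library = 1000  # library clips (features only, no fMRI)

# Step 1: fit an encoding model on clips watched in the scanner,
# mapping clip features -> voxel responses (plain least squares here).
X_train = rng.normal(size=(n_train, n_features))
true_W = rng.normal(size=(n_features, n_voxels))   # synthetic ground truth
Y_train = X_train @ true_W + 0.1 * rng.normal(size=(n_train, n_voxels))
W, *_ = np.linalg.lstsq(X_train, Y_train, rcond=None)

# Step 2: predict the hypothetical fMRI pattern for every library clip.
X_library = rng.normal(size=(n_library, n_features))
Y_library_pred = X_library @ W

def reconstruct(y_observed, Y_pred, clips, k=100):
    """Step 3: rank library clips by correlation between their predicted
    fMRI pattern and the observed one, then average the top k clips."""
    yc = y_observed - y_observed.mean()
    Pc = Y_pred - Y_pred.mean(axis=1, keepdims=True)
    corr = (Pc @ yc) / (np.linalg.norm(Pc, axis=1) * np.linalg.norm(yc) + 1e-12)
    top_k = np.argsort(corr)[::-1][:k]
    return clips[top_k].mean(axis=0)   # the "melded" estimate

# Toy "clips": each library clip is just a flat pixel vector here.
clips = rng.normal(size=(n_library, 64))
x_new = rng.normal(size=n_features)          # features of a held-out clip
y_new = x_new @ true_W                       # its (noise-free) fMRI response
estimate = reconstruct(y_new, Y_library_pred, clips)
print(estimate.shape)
```

The averaging step is why the reconstructions look like blurry composites of the original: it trades sharpness for robustness, since no single library clip matches the observed pattern exactly.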

Check out the full results from the test. Cool stuff.
