
Turning dreams into reality?

October 21, 2011


Recording our dreams while asleep and then watching them as movies when awake – surely an idea many of us have fantasized about. An experiment conducted at the University of California, Berkeley now suggests that this may not be as far-fetched as it sounds.

The Berkeley researchers reconstructed movies of visual experiences of people from their measured brain activity. In essence, they were able to ‘see’ what the subjects had seen by just monitoring the activity in their visual cortex, an area in the back of the brain responsible for processing visual information.

Using functional magnetic resonance imaging (fMRI), the scientists scanned the visual cortex while three subjects watched hours of pre-determined movie clips (the control data set). Active neurons consume more energy and thus induce an increase in the local flow of blood carrying oxygen-rich haemoglobin.

fMRI can detect the difference in magnetic properties of oxygenated and de-oxygenated haemoglobin, the so-called Blood-Oxygen-Level-Dependent (BOLD) signal. Typically, BOLD signals vary slowly with time, with changes occurring on the timescale of several seconds. In order to track the fast dynamic changes in neural activity that occur while watching movies, the scientists used sophisticated signal and image processing techniques to decode the signals recorded during the control data set of movies. The decoding method was able to identify the specific movie stimulus that induced a particular BOLD signal with more than 75 percent accuracy.
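The identification step can be illustrated in miniature. The toy sketch below (not the authors' actual pipeline – the numbers, noise model, and correlation-based matching are all assumptions for illustration) picks the candidate stimulus whose predicted response pattern best correlates with a noisy "measured" pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: predicted BOLD response patterns for each of 20
# candidate clips (rows), plus a noisy "measurement" of one of them.
n_clips, n_voxels = 20, 500
predicted = rng.standard_normal((n_clips, n_voxels))
true_clip = 7
measured = predicted[true_clip] + 0.5 * rng.standard_normal(n_voxels)

# Identify the clip whose predicted pattern correlates best with the
# measurement -- the "match predicted to measured activity" idea in miniature.
corrs = [np.corrcoef(measured, p)[0, 1] for p in predicted]
best = int(np.argmax(corrs))
print(best)
```

With enough voxels and modest noise, the correct clip wins by a wide margin, which is why identification accuracies well above chance are achievable.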

The researchers created a ‘dictionary’ of fundamental brain activity patterns associated with unique shapes, edges and motion. It was found that different voxels (small volume elements) of the visual cortex are sensitive to different kinds of visual stimuli. For example, voxels responding to direct, head-on views were able to track only static or slow-moving images. On the other hand, voxels responding to peripheral views preferred to track high-speed motion.

A random library of clips (separate from the control set) was built from 5000 hours of YouTube videos, and the ‘dictionary’ of basic patterns was used to predict the brain activity evoked by these clips. A new test set of movies was then shown to the subjects, and their brain activity was recorded using fMRI. Statistical techniques were used to identify 100 clips from the library whose predicted brain activity was most similar to the measured brain activity for each clip in the test set. These 100 clips were averaged together to generate a reconstruction of the visual experience of the subjects corresponding to each test clip.
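The final averaging step described above can be sketched with a toy example. This is not the authors' implementation – the library size, frame representation, and similarity scores are stand-ins invented for illustration – but it shows the mechanics of averaging the best-matching clips into a single blurry reconstruction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical miniature: each library "clip" is an 8x8 grayscale frame,
# and each has a similarity score saying how well its predicted brain
# activity matched the measured activity for one test clip.
library = rng.random((1000, 8, 8))          # stand-in for the clip library
scores = rng.random(1000)                   # stand-in similarity scores

top = np.argsort(scores)[-100:]             # indices of the 100 best matches
reconstruction = library[top].mean(axis=0)  # average them into one frame

print(reconstruction.shape)  # (8, 8)
```

Averaging many roughly-matching clips washes out their differences, which is one reason the published reconstructions look soft rather than sharp.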

The video at the top of this post shows some of the reconstructions – while they are not exactly HD quality, the qualitative features of the images are captured quite well. The authors propose improving the quality of the reconstruction by using a larger library of clips to select from. They further suggest the tantalizing prospect of employing their technique to decode dynamic involuntary subjective mental states such as dreaming or hallucinating.


Nishimoto, S., Vu, A., Naselaris, T., Benjamini, Y., Yu, B., & Gallant, J. (2011). Reconstructing visual experiences from brain activity evoked by natural movies. Current Biology, 21(19), 1641–1646. DOI: 10.1016/j.cub.2011.08.031


