BRAVE NEW WORLD: Researchers Control Multiple Drones At Once With The Human Mind

Source – jneurosci.org

“…The rise in brain-computer interfaces sets the stage for embeddable devices which will decode human thoughts and translate them into robotic actions, which is what the Arizona researchers have accomplished. The Mind Drone has countless applications but also many consequences. The team’s goal over the next couple of years is to have a hybrid team of ground vehicles, mobile robots, and aerial vehicles”:

(BEAST TECH Mind Control: Researchers Control Multiple Drones At Once With The Human Mind)

Researchers from Arizona State University can control multiple drones at the same time with the human mind, wirelessly. This is not the only recent development in mind control and brain-computer interfaces, however. Watch the video below to see it in action.

Mind control is a dangerous field, and with advancements in technology it is no longer a fringe topic; it is now the focal point of a governmental project for mapping the human mind. Researchers from Arizona State University have created algorithms that let an operator control up to four drones at a single time using thought alone.

The rise in brain-computer interfaces sets the stage for embeddable devices which will decode human thoughts and translate them into robotic actions, which is what the Arizona researchers have accomplished. The Mind Drone has countless applications but also many consequences. The team’s goal over the next couple of years is to have a hybrid team of ground vehicles, mobile robots, and aerial vehicles.

An Arizona State University researcher named Panagiotis Artemiadis has come up with a way to use the human mind to control robotic drones. An operator simply puts on a skullcap fitted with 128 electrodes and wired directly to a computer. Brain activity is recorded, and every time the operator has a thought or moves a hand, the corresponding areas light up on the computer display.
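At its core, a setup like this maps multichannel brain recordings onto a small set of drone commands. The toy sketch below illustrates that mapping with a simple least-squares decoder on synthetic data; the channel count matches the 128-electrode cap described above, but the features, command names, and decoder are illustrative assumptions, not the ASU team’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for EEG features: 128 electrode channels, one feature
# vector per short time window (real decoders use far richer features).
n_channels, n_windows = 128, 200
features = rng.normal(size=(n_windows, n_channels))

# Pretend each training window carries one of four intended commands
# (the command names here are hypothetical).
commands = np.array(["spread", "gather", "ascend", "descend"])
labels = rng.integers(0, 4, size=n_windows)

# A linear decoder: least-squares weights from features to one-hot labels.
onehot = np.eye(4)[labels]
weights, *_ = np.linalg.lstsq(features, onehot, rcond=None)

# Decoding a new window: pick the command with the highest score.
new_window = rng.normal(size=n_channels)
decoded = commands[np.argmax(new_window @ weights)]
```

A real system would add signal filtering, artifact rejection, and a properly validated classifier, but the shape of the problem is the same: brain activity in, a discrete robotic action out.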

Decoding human thoughts and actions gives rise to technology that can control those thoughts and actions. The mind drone is only one of several new technological advances; the machine that can read human thought is another. The University of Oregon recently announced that it has developed a system that can read people’s thoughts via brain scans and rebuild the faces they were visualizing in their heads. Plus, IARPA recently began seeking future-predicting technology and algorithms which will pave the way for thought crimes.

http://www.jneurosci.org/content/36/22/6069.short?sid=39264297-467f-403a-9806-429798f2237c

Related…

Reconstructing Perceived and Retrieved Faces from Activity Patterns in Lateral Parietal Cortex – By Hongmi Lee and Brice A. Kuhl

Abstract

Recent findings suggest that the contents of memory encoding and retrieval can be decoded from the angular gyrus (ANG), a subregion of posterior lateral parietal cortex. However, typical decoding approaches provide little insight into the nature of ANG content representations. Here, we tested whether complex, multidimensional stimuli (faces) could be reconstructed from ANG by predicting underlying face components from fMRI activity patterns in humans. Using an approach inspired by computer vision methods for face recognition, we applied principal component analysis to a large set of face images to generate eigenfaces. We then modeled relationships between eigenface values and patterns of fMRI activity. Activity patterns evoked by individual faces were then used to generate predicted eigenface values, which could be transformed into reconstructions of individual faces. We show that visually perceived faces were reliably reconstructed from activity patterns in occipitotemporal cortex and several lateral parietal subregions, including ANG. Subjective assessment of reconstructed faces revealed specific sources of information (e.g., affect and skin color) that were successfully reconstructed in ANG. Strikingly, we also found that a model trained on ANG activity patterns during face perception was able to successfully reconstruct an independent set of face images that were held in memory. Together, these findings provide compelling evidence that ANG forms complex, stimulus-specific representations that are reflected in activity patterns evoked during perception and remembering.
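The eigenface step the abstract describes is standard principal component analysis over a set of face images: each face becomes a short vector of scores on shared “eigenfaces,” and a face can be rebuilt from those scores. The sketch below shows that decomposition and reconstruction on synthetic data; the image size, image count, and component count are arbitrary assumptions, and the fMRI-to-score regression the paper adds on top is not modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a face image set: 100 "images" of 32x32 pixels,
# flattened to row vectors (the study used real face photographs).
n_images, n_pixels = 100, 32 * 32
faces = rng.normal(size=(n_images, n_pixels))

# Eigenfaces via PCA: center the images, then take the top principal
# components of the image set via SVD.
mean_face = faces.mean(axis=0)
centered = faces - mean_face
_, _, vt = np.linalg.svd(centered, full_matrices=False)
n_components = 20
eigenfaces = vt[:n_components]            # each row is one eigenface

# Any face is summarized by its scores on the eigenfaces...
scores = centered @ eigenfaces.T          # shape (n_images, n_components)

# ...and can be approximately reconstructed from those scores alone.
reconstructed = scores @ eigenfaces + mean_face
```

In the study, a model predicts the score vector from fMRI activity patterns rather than computing it from the image, and the final line above is then enough to turn predicted scores back into a face image.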

SIGNIFICANCE STATEMENT Neuroimaging studies have consistently implicated lateral parietal cortex in episodic remembering, but the functional contributions of lateral parietal cortex to memory remain a topic of debate. Here, we used an innovative form of fMRI pattern analysis to test whether lateral parietal cortex actively represents the contents of memory. Using a large set of human face images, we first extracted latent face components (eigenfaces). We then used machine learning algorithms to predict face components from fMRI activity patterns and, ultimately, to reconstruct images of individual faces. We show that activity patterns in a subregion of lateral parietal cortex, the angular gyrus, supported successful reconstruction of perceived and remembered faces, confirming a role for this region in actively representing remembered content.

