A brain implant that "reads thoughts" lets people control a computer

The device will be presented at the 74th annual meeting of the American Academy of Neurology, held in Seattle from April 2 to April 7, 2022. The creators said they tested the implant and confirmed that it is safe, and that it will help people with paralysis use a computer for everyday tasks.

ALS (amyotrophic lateral sclerosis) is a progressive neurodegenerative disease that affects nerve cells in the brain and spinal cord. People with ALS lose the ability to control muscle movements, often leading to complete paralysis.

People with ALS eventually lose the ability to move their limbs, so they cannot control devices such as a phone or computer. We decided to create our own brain-computer interface device: it receives electrical signals from the brain and allows people to control a computer with their mind.

Bruce Campbell, MD, Professor at the University of Melbourne in Australia

In the study, four people with ALS underwent a procedure to implant the device into the brain. The brain-computer interface is inserted through one of the two jugular veins in the neck and guided into a major blood vessel in the brain. The device is made of a mesh material, carries 16 sensors, and can expand to line the vessel wall.

The device is connected to an electronic sensor in the chest, which then transmits brain activity from the motor cortex, the area that generates the signals for movement.

The researchers observed the participants for a year and concluded that the device is safe: there were no serious side effects resulting in disability or death, and the sensor remained exactly where it had been installed throughout the year.

The authors also examined whether participants could use the brain-computer interface to perform common tasks. All volunteers were first trained to operate the technology with an eye-tracking device, which helps the computer determine what a person is looking at and execute the corresponding command.

The researchers also reported that a decoder developed during the study allowed one participant to operate a computer independently, without an eye tracker. The machine learning decoder was trained as follows: when a researcher asked the participant to attempt a movement, for example tapping a foot or straightening a knee, the decoder analyzed the resulting signals and learned to convert them into computer navigation commands.
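The general idea behind such a decoder can be illustrated with a toy sketch: signal power from the sensors is compared against a threshold to distinguish "movement attempted" from "rest", and the result is mapped to a navigation command. This is purely an illustrative assumption; the study's actual signal processing, model, and parameters are not described in the article, and every name and number below is hypothetical.

```python
import random

random.seed(0)             # reproducible simulated data

SENSOR_COUNT = 16          # the implant described above has 16 sensors
THRESHOLD = 0.5            # assumed decision threshold (illustrative)

def band_power(samples):
    """Mean squared amplitude: a crude proxy for signal power."""
    return sum(s * s for s in samples) / len(samples)

def decode(window):
    """Map one window of per-sensor samples to a navigation command.

    `window` is a list of SENSOR_COUNT channels, each a list of samples.
    Returns "click" when the average power crosses the threshold,
    otherwise "rest".
    """
    avg_power = sum(band_power(ch) for ch in window) / len(window)
    return "click" if avg_power > THRESHOLD else "rest"

# Simulated data: an attempted movement raises signal amplitude.
rest_window = [[random.uniform(-0.3, 0.3) for _ in range(50)]
               for _ in range(SENSOR_COUNT)]
move_window = [[random.uniform(-1.5, 1.5) for _ in range(50)]
               for _ in range(SENSOR_COUNT)]

print(decode(rest_window))  # low power: "rest"
print(decode(move_window))  # high power: "click"
```

A real decoder would replace the fixed threshold with a classifier trained per participant, but the pipeline shape, features in, discrete command out, is the same.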
