
Tuesday, October 25, 2022

Researchers can decode thoughts from a distance.




New research suggests that the internal language of the brain is universal, and that researchers can decode thoughts without even touching the person. That could make it possible, in principle, to project other people's thoughts and memories onto a computer screen. A system based on MRI (magnetic resonance imaging) could, speculatively, be mounted in drones.

Drones equipped with hypothetical nano-scale MRI sensors could then fly around the head of a targeted person and send the information to computers over the internet. Systems like these would allow a new type of BCI (brain-computer interface), with no need for special equipment such as surgically implanted microchips.


These new sensors would not require any special skills to operate.


The new BCI interface is easy to use: wearing the system is like wearing a hat.

BCI systems could use the regular internet for remotely controlling human-looking robots, and human-looking robots could do the same things as real humans.

The system can send a carrier wave through a person's brain, and a sensor on the other side receives the EEG. Alternatively, the user wears only a bandanna containing the sensors, and the system then has to interpret what the person is thinking. The idea is that this new type of BCI is as easy to put on and take off as a hat. New jet fighters are also planned to have BCI-based interfaces, raising the possibility that pilots could fly an entire plane using brain waves.

So the idea is that the pilot simply sits in the chair and controls the systems through sensors in the helmet. Or a pilot sitting on the ground controls a human-looking robot that sits in the cockpit. Because this kind of system interacts directly with the brain, there would be no difference between reality and virtual reality for the user, and the same capability could be used to create so-called false memories. If the system were installed in the aircraft's dashboard, it could send a carrier wave through the pilot's head to receivers in the seat.
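The helmet scenario above can be sketched in code. This is a purely illustrative toy, assuming nothing about any real avionics or BCI product: it reduces one window of EEG samples to an amplitude feature and maps that to an on/off control command. The threshold, window contents, and command names are all made-up assumptions; a real decoder would need trained classifiers, not a simple threshold.

```python
# Toy sketch: decoding one window of EEG samples (microvolts) into a
# simple control command. Threshold and commands are illustrative only.

def rms(samples):
    """Root-mean-square amplitude of one EEG window."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def decode_command(window, threshold=20.0):
    """Return 'engage' when the window's amplitude crosses the
    threshold, otherwise 'idle'."""
    return "engage" if rms(window) > threshold else "idle"

# A quiet window versus a high-amplitude window.
quiet = [2.0, -1.5, 1.0, -2.0]
active = [30.0, -28.0, 35.0, -31.0]
print(decode_command(quiet))   # idle
print(decode_command(active))  # engage
```

In a real system the interesting work is in the feature extraction and classification; the point here is only the shape of the loop from signal window to command.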

Operators could also control things like human-looking robots using the BCI, which is one of the most radical operating concepts in the world. If a system in which humans control computers with their brain waves becomes reality, it could revolutionize robotics. The new quantum sensors are far more accurate than old-fashioned radio telescopes, and that makes it possible to create systems that can read EEG from a long distance.


https://www.bbc.com/news/business-62289737


https://www.englishforums.com/news/controlling-machines-using-mind/


https://www.sciencealert.com/new-technique-for-decoding-peoples-thoughts-can-now-be-done-from-a-distance


https://link.springer.com/chapter/10.1007/978-3-030-72254-8_20


https://en.wikipedia.org/wiki/Brain%E2%80%93computer_interface


Image: https://www.sciencealert.com/new-technique-for-decoding-peoples-thoughts-can-now-be-done-from-a-distance


https://designandinnovationtales.blogspot.com/

Tuesday, September 7, 2021

Wireless nano-scale neural sensors allow a better BCI (Brain-Computer Interface)



The new nano-scale neural sensors allow a new and more effective BCI (brain-computer interface) than old-fashioned single-chip systems ever could. The BCI could be one of the biggest advances in computing ever invented: a system that turns thought into physical action.

The nano-scale BCI chips can be positioned on the membranes over certain brain areas. When the microchips sit on the brain membranes, the surgical operation is easier to perform; connecting the chips directly to the neural structure would require a brain surgeon.

If the microchips rest on the brain membranes, the surgeons never touch the neural tissue when implanting them, so installing them does not need a brain surgeon. In this multi-chip BCI, an independently operating microchip sits over each brain area and can use neural electricity as a power source.

With a small, thin microchip over every brain area, such as the motor area, the EEG from each area can be separated better than with a single microchip. Another advantage is that the microchips need not be placed inside the brain: they can be EEG sensors operating wirelessly through the skull or skin. If those microchips have accurate enough sensors, they can be positioned under the skin of the head.

Alternatively, they can be positioned over the membranes that protect the brain and neural tissue. The system can communicate with every type of device using Bluetooth or some other mobile communication protocol. To control robots and other systems with a BCI, we must first create neural access to the computer, and then use the BMI (brain-machine interface) through the computer interface.
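One concrete way to picture the wireless link above is as a stream of labeled per-area readings. The sketch below is a made-up message format, not any real implant or Bluetooth protocol: each hypothetical chip reports one EEG sample tagged with its brain area, and the readings are bundled into a JSON message for transport. The area names, field names, and JSON encoding are all assumptions for illustration.

```python
# Hypothetical message format for a multi-chip BCI: each chip over a
# brain area contributes one labeled EEG sample (microvolts), bundled
# with a timestamp into a single JSON message. Illustrative only.
import json

def pack_reading(timestamp_ms, readings):
    """Bundle per-area EEG samples into one JSON message string."""
    return json.dumps({"t": timestamp_ms, "eeg": readings})

def unpack_reading(message):
    """Decode a message back into (timestamp_ms, readings)."""
    data = json.loads(message)
    return data["t"], data["eeg"]

msg = pack_reading(1000, {"motor": 12.5, "visual": -3.2})
t, eeg = unpack_reading(msg)
print(t, eeg["motor"])  # 1000 12.5
```

The design point the post is making carries over: because each area has its own chip, the receiving computer gets cleanly separated channels instead of one mixed signal.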

Using a BCI requires that the system receives accurate EEG curves. An independent microchip over every brain area makes it possible to get very accurate information about the electrical activity of each individual area. And the BCI could also be used for two-way communication between the computer and the user.

Microchips placed over the sensory brain areas could transmit data into the nervous system, letting the computer interact with the human mind without limits. But how would the BCI input data to the computer? The idea is to use the same brain areas that make it possible to write words.

Writing is like speaking, but during writing the movements reserved for the mouth are transferred to the hands and fingers. So the system records the EEG from the brain lobes that produce the words before the hands write them. Those EEG pulses can be transferred to a computer, where the EEG curves are transformed into letters and sent to a modified text-to-speech application. That application then sends the text to the interface that operates the computer, or the robots connected to it.
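The pipeline described above (EEG window → letter → text → speech stage) can be sketched as a few composed functions. Everything here is a stand-in assumption: the "classifier" is a stub lookup table keyed on a fake frequency feature, and the speech stage just formats a string. A real system would need a trained model on real EEG features, which is exactly the hard part.

```python
# Toy sketch of the post's pipeline: EEG windows are classified into
# letters, joined into text, and handed to a text-to-speech stage.
# The classifier is a stub lookup table, not a real model.

def classify_window(feature):
    """Stub classifier: maps a dominant-frequency feature (Hz) to a
    letter. Unknown features become '?'. Table is illustrative."""
    table = {10: "h", 12: "i"}
    return table.get(feature, "?")

def decode_text(features):
    """Run the classifier over a sequence of window features."""
    return "".join(classify_window(f) for f in features)

def speak(text):
    """Stand-in for the modified text-to-speech application."""
    return f"[speaking] {text}"

letters = decode_text([10, 12])
print(speak(letters))  # [speaking] hi
```

The structure matters more than the stub: each stage (feature extraction, classification, text assembly, output) can be replaced independently, which is why the post's idea of routing the final text to either a speech application or a robot interface is plausible as an architecture.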

https://scitechdaily.com/wireless-microscale-neural-sensors-enable-next-generation-brain-computer-interface-system/


Image: https://scitechdaily.com/wireless-microscale-neural-sensors-enable-next-generation-brain-computer-interface-system/


