
Tuesday, September 5, 2023

The neural network is the mind of the tiger.


The difference between traditional computers and neural networks is the role of the CPU. In a traditional computer, the CPU (Central Processing Unit) does all the work, and that limits computing power. In a neural network, the CPU only preprocesses the data that the senses feed into it. In other words, the CPU recognizes the situation and then forwards it to a sub-processing unit. The receiving subsystem is responsible for that kind of case, which means it has databases that hold reactions for certain types of events.

In a neural network, all sub-networks work with sorted data, and they are meant to respond to the things that the senses send to the network. Each subnetwork has its own operational or response area, and the CPU acts like a router: the system picks out certain details from the data and then routes the information to the right subnetwork. In real life, a neural network could consist of multiple subnetworks that act as AI-based event handlers, roughly as in the sketch below.
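A minimal sketch of that router idea, written in Python with invented handler names (nothing here comes from a real system):

```python
# The central unit only recognizes what kind of event arrived and forwards it
# to the subnetwork that is responsible for that kind of case.

def classify(event: dict) -> str:
    # Stand-in for a real recognizer: here we just read the sensor type.
    return event.get("sensor", "unknown")

# Hypothetical subnetworks, keyed by the kind of event they handle.
SUBNETWORKS = {
    "camera": lambda e: f"vision subnetwork handled {e['value']}",
    "microphone": lambda e: f"audio subnetwork handled {e['value']}",
}

def route(event: dict) -> str:
    kind = classify(event)                  # recognize the situation
    handler = SUBNETWORKS.get(kind)         # find the responsible subsystem
    if handler is None:
        return "no subnetwork registered for this kind of event"
    return handler(event)                   # resend the event to the subnetwork

print(route({"sensor": "camera", "value": "object ahead"}))
```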

A complete neural network requires that every sub-network also has a physical system that runs it. In that case the system has multiple workstations running the AI-based subsystems, which reduces the load on the top-level CPU. If the system uses multiple independently operating workstations, it can handle multiple problems at the same time. Error detection becomes possible when the neural network runs two or more independently operating data-handling units at the same time and compares their results.
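A rough sketch of that error-detection idea, assuming the two "workstations" are simply two independently written functions whose results are compared:

```python
from concurrent.futures import ThreadPoolExecutor

def unit_a(task: int) -> int:
    return task * task        # first independent data-handling unit

def unit_b(task: int) -> int:
    return task ** 2          # second, independently implemented unit

def run_with_error_detection(task: int) -> int:
    # Both units process the same task at the same time.
    with ThreadPoolExecutor(max_workers=2) as pool:
        a = pool.submit(unit_a, task)
        b = pool.submit(unit_b, task)
        result_a, result_b = a.result(), b.result()
    # A disagreement between the independent units signals an error.
    if result_a != result_b:
        raise RuntimeError("units disagree: possible fault")
    return result_a

print(run_with_error_detection(7))   # -> 49
```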






In an ideal situation, the CPU of the neural network works as a router that sends information to independently operating subsystems. Each independent subsystem has its own physical computer and its own CPU. The common CPU of those networked workstations sends a message to a certain subunit, the AI-based subunit carries out the mission, and the result is sent back to the CPU.

The shape of the neural network is free. It can be a drone swarm, or it can be a large number of workstations in some warehouse. A neural network can collect information from multiple sources, and its power lies in sharing data between multiple computers. The neural network doesn't necessarily learn anything, but it can collect data from very large areas.

A learning neural network follows certain parameters when carrying out its duty. One such parameter can be which drone has operated the longest time in a certain area. The system can store information about the routes and other settings that the drones use, and then copy the longest-standing drone's operational profile to the other drones, as in the sketch below.
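A hedged sketch of that parameter, with invented field names: the drone with the longest operating time in the area is picked, and its operational profile is copied to the rest of the swarm.

```python
# Hypothetical swarm data: operating hours in the area plus each drone's profile.
drones = [
    {"id": "d1", "hours_in_area": 4.5, "profile": {"altitude_m": 80, "route": "A"}},
    {"id": "d2", "hours_in_area": 9.1, "profile": {"altitude_m": 60, "route": "B"}},
    {"id": "d3", "hours_in_area": 2.0, "profile": {"altitude_m": 90, "route": "C"}},
]

# The longest-standing drone becomes the model for the others.
best = max(drones, key=lambda d: d["hours_in_area"])

for drone in drones:
    if drone is not best:
        drone["profile"] = dict(best["profile"])   # copy the proven profile

print(best["id"], [d["profile"]["route"] for d in drones])   # d2 ['B', 'B', 'B']
```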

Of course, the system must track multiple parameters, such as when the drones were seen and where the first impact came from. The system interconnects the data from all members of the drone swarm and then tries to create an ideal flight path to the target. Here we must remember that the same drone that is used to deliver pizza can deliver at least hand grenades. The neural network can interconnect multiple systems such as surveillance cameras, satellites, and drone swarms.

Tuesday, August 29, 2023

Integrated AI is the tool of the next generation.


Do you know what a kernel is? The kernel is the boundary program that connects hardware to software in a computer. It is the code that microchips and other components use when they interact with each other.

Integrated AI means that the AI is integrated into the microchips or their control program. In other words, the kernel turns into AI, and that could mean that a ChatGPT-type program is programmed straight into the microchips.


That allows the computer to program other computers and robots, and that ability makes this kind of combination super hot.


ChatGPT is a chatbot, which means it does not control robots itself. But the system can create program code that controls robots while they are operating. This ability makes robots flexible, adaptive tools, because the AI can modify their control code at any time, so the robot can adapt to its environment.

A flexible neural network is like a brain. The thing that separates human brains from animal brains is the number of different skills. Computers act like human brains: they store the data they need in records or databases. The system can make it easier to find the necessary data records by sorting them under certain topics.

A neural network-based computer architecture works in a similar way to a single computer that stores data on a hard disk. The difference is that the system uses multiple computers and hard drives to store information. In the ideal case, every database has its own physical device. That makes the system effective, because it can share the work across multiple CPUs in multi-level coordination.


There are no limits to the neural network's size.


Multi-level coordination is important for physical systems like robots.


When the system switches on, every database tells the CPU that handles its operations where that CPU can find it. That data is stored in the routers. There might be multiple CPUs, and each of them is responsible for one skill area that the robot has. There could be a CPU with certain database connections reserved for emergencies, and a hierarchy in the CPU network whose purpose is to save the robot if it slips on the floor.

Artificial reflexes require an integrated, microchip-based database that helps the robot react to things like slipping on the floor.


So the "skills of the left hand" are under one topic. Like in human brains, all information is stored in cells that are databases. In computer-based neural networks, the computer stores all information in databases with the same purpose as memory cells. When the system requires some skills it must first recognize the situation. The computer searches databases that information matches with situations that sensors are telling it. And can give responses for actions that happen around the computer. 

A multi-layer system is the ultimate tool for that kind of architecture. In a multi-layer system, the CPU that controls the senses is different from the CPU that controls the movements. The interconnected database network might contain millions of databases in subnetworks, and in an ideal situation, each database has its own physical processor that interconnects the records with each other.

In a multi-layer system there must be some kind of artificial reflexes. When a robot slips on the floor, there must be data that the robot can use immediately, the moment it sees that its acceleration sensors are out of balance. The data that makes the robot put a hand out to stop the fall is easy to store straight in the microchips.
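A minimal sketch of such a reflex, assuming a single imbalance value from the acceleration sensors and a made-up threshold:

```python
from typing import Optional

TILT_LIMIT = 0.4   # hypothetical imbalance threshold

# Pre-stored reactions, standing in for data kept directly in the microchip.
REFLEXES = {"falling": "put a hand out to stop the fall"}

def check_reflex(imbalance: float) -> Optional[str]:
    # React immediately, without consulting the higher-level system.
    if imbalance > TILT_LIMIT:
        return REFLEXES["falling"]
    return None   # sensors in balance: normal processing continues

print(check_reflex(0.7))   # -> put a hand out to stop the fall
```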

When the system requires some information or code, it must find the right databases. If the computer has to search all databases one by one, that takes time, so there must be a better mode.

In the active model, each database reports to the CPU (Central Processing Unit) that it is ready. During that process the database can say: "I'm database 3 that controls left-hand movements, and the CPU can find me behind connection number 3."

That lets the routers route data between the CPU and the left hand's database. In network-based systems there might be thousands of databases, and each database can hold thousands of records. It is possible that every database has its own central processing unit that searches for the right records, and then those databases can be interconnected, roughly as in the sketch below.
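A rough sketch of that announcement and routing, with invented connection numbers: each database registers its topic and connection once, and afterwards the CPU reads the routing table instead of searching every database one by one.

```python
ROUTING_TABLE = {}

def announce(topic: str, connection: int) -> None:
    # "I'm the database that controls left-hand movements,
    #  and the CPU can find me behind connection number 3."
    ROUTING_TABLE[topic] = connection

def lookup(topic: str) -> int:
    # No one-by-one search: the routing table points straight at the database.
    return ROUTING_TABLE[topic]

announce("left_hand_movements", 3)
announce("emergencies", 1)

print(lookup("left_hand_movements"))   # -> 3
```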

