
Kafka and data handling processes. (Grey persons in grey offices)




The offices are always grey. Everybody who works there is grey, and their mission is to control the data. 

When we watch the film "Kafkaesque" above this text, we might realize that data handling resembles bureaucracy. God is the entity that controls the data process and creates the protocol for handling data. For the system, the superuser, who has the right to terminate the system, is that god. In this metaphor, the computer is Kafka's bureau. 

We face the problem of row-type computing. The event handler is like a man walking along the rows: it must always deliver its data, regardless of whether that data contains enough information or not. Once the event handler has taken the data, there is no way to stop the operation until the process ends. 

Order is the key element in computing. If there is chaos in the system, nothing can be done. The answer to the chaos problem is simple: somewhere in the system sits the master controller. The superuser uses the "master's voice", calling "everybody halt" and ordering the data handlers to read their data at the same time. Or the order can be given in a form where the data handlers read the things that the data handling form requires. 

That means everybody should check whether they have all the necessary papers with them. If every single paper carries a serial number, it is easy to issue new papers when some of those files are missing. And if somebody delivers data out of that bureau, the controller knows exactly which paper was delivered to the street. Because every data handler handles an individual data type, it is also possible to track the person who delivered unauthorized data. 
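The serial-number idea above can be sketched in a few lines. This is a minimal illustration, not any specific protocol: each "paper" carries a number, so the controller can name exactly which ones never arrived.

```python
# Minimal sketch: each record carries a serial number, so a missing
# record can be identified exactly instead of resending everything.
def find_missing(received_serials, expected_count):
    """Return the serial numbers that never arrived."""
    expected = set(range(expected_count))
    return sorted(expected - set(received_serials))

# Papers 0..4 were sent, but paper 2 was dropped on the way.
print(find_missing([0, 1, 3, 4], 5))  # → [2]
```

Because the gap is identified by number, only paper 2 needs to be reissued, not the whole stack.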

If the system uses two lines, it is easy to see differences in the data. In the two-line data handling process every data handler has a pair, and if the data that those handlers carry differs, there is an error. Two-line data handling works so that when the router brings data into the system, it also duplicates that data. 

That makes it possible to detect errors by comparing the data structures for differences. In that system everything is doubled, including the rowing rooms. In each room the data is delivered to the desk, and then the superuser compares the papers or data tables and finds out whether they are identical. If they are identical, there are no errors on the data lines. The data travels in rows, and each paper is combined into the same entirety, like the frames of a film. 
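The two-line process described above amounts to simple duplicate-and-compare error detection. Here is a hedged sketch of that idea, with the "router" and "superuser" roles reduced to two small functions invented for illustration:

```python
# Sketch of dual-channel error detection: the "router" duplicates each
# message, both copies travel independently, and the "superuser" at the
# desk compares them. Any difference signals a transmission error.
def duplicate(message):
    # The router doubles the data onto two "lines".
    return message, message

def check(copy_a, copy_b):
    # The superuser compares the two papers; identical means no error.
    return copy_a == copy_b

a, b = duplicate(b"hello")
print(check(a, b))         # → True (copies identical, no error)
print(check(a, b"hellp"))  # → False (copies differ: error detected)
```

Note that this scheme only detects errors; with just two copies there is no way to tell which line was corrupted, only that they disagree.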

If the data travels in a row with many curves, the other data handlers can check whether the data that a handler carries is right and has the right form. But that requires that every data handler has the same data; in that case the system can check the traveling data. The check can be made so that every data handler calls out, at the same time, what kind of data should be on their papers. 

Or the persons can send the number of papers that were delivered back to the beginning. That is called a checksum. The checksum tells whether some papers dropped to the floor. And if every paper is numbered, the people immediately know which paper is missing. So if something is missing, the data handler can ask for the missing paper from the colleague who stands at the door. That means there is no need to remake the entire paper stack; only the missing papers need to be made again. 
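The checksum-plus-serial-number combination can be sketched as follows. This is an illustrative toy, not a real transfer protocol; it uses the standard CRC-32 from Python's `zlib` as the checksum, and the paper contents are made up:

```python
# Sketch: a checksum summarises a stack of numbered "papers" so the
# receiver can tell whether anything was lost, and the serial numbers
# tell exactly which paper to ask for again.
import zlib

def make_stack(papers):
    # Attach a serial number to each paper, plus a checksum over the stack.
    numbered = list(enumerate(papers))
    checksum = zlib.crc32(b"".join(papers))
    return numbered, checksum

def verify(numbered, checksum):
    # Recompute the checksum from what actually arrived.
    return zlib.crc32(b"".join(p for _, p in numbered)) == checksum

stack, cs = make_stack([b"page-a", b"page-b", b"page-c"])
print(verify(stack, cs))   # → True: nothing dropped
stack.pop(1)               # paper #1 falls on the floor
print(verify(stack, cs))   # → False: the checksum no longer matches
missing = sorted({0, 1, 2} - {n for n, _ in stack})
print(missing)             # → [1]: only this paper needs resending
```

The checksum answers "was something lost?", and the serial numbers answer "which one?", so only the missing paper has to be remade.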

If all the data handlers talk at different times, the result is chaos, and nobody can check the data. The man who keeps himself hungry is the data handler that is not used: useless data is removed, and its mark grows weaker and weaker the longer the data stands unused. Data security might look very well done if there are many doors between the observer and the data. But if those doors are not locked, they are useless. 

We can think of the data traveling in the system as black and white bugs. The system is like a giant labyrinth that must know which way to route those bits of data. At every corner of the labyrinth stands a router, the person who knows where to guide each bit of data. When data travels in a mixed form, it looks like a mess or chaos. But when the routers sort it, the data turns into an understandable form. 
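The router at each corner of the labyrinth is essentially a lookup table from a destination to the next direction. The sketch below is purely illustrative; the destinations and corridor names are invented for this example:

```python
# Sketch of a router at one labyrinth corner: a routing table maps each
# data item's destination to the next corridor to take. Destinations
# and corridor names here are made up for illustration.
ROUTES = {"archive": "left", "printer": "right", "desk": "straight"}

def route(packet):
    destination = packet["to"]
    # Data with an unknown destination is dropped rather than misrouted.
    return ROUTES.get(destination, "drop")

print(route({"to": "printer", "data": b"..."}))  # → right
print(route({"to": "nowhere", "data": b"..."}))  # → drop
```

Each corner only needs to know its own table; the mixed, chaotic-looking stream becomes ordered because every router makes one small sorting decision at a time.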
