
AI requires control. But when we want to control something, we must have the right arguments to justify that control.

Non-democratic governments use the same algorithms that are used to detect online fraudsters to search for people who resist the government. AI-based programming tools can create new operating systems for missiles. And they can be used to create malware for hackers, some of whom work for governments.

We are living in a time when a single person can hold ultimate power in their hands. The internet and the media are turning into tools through which that ultimate power can be exercised. And there is the possibility that hackers will capture the homepages of news agencies. The person who does such things does not have to be a government chief; an ordinary hacker can capture and change information on government homepages. The problem is that we are waking up to a situation where, in some countries, the purpose of the media is to support the government.

Another thing is that we face a situation where some hackers operate under government control. That means they have permission to do their work. And in countries where censorship isolates people behind an internet firewall that allows only internal communication, the position of a government hacker offers free use of the internet, a luxury for people who cannot even watch Western music videos.

We see arguments against AI all the time, but the biggest and most visible arguments are not the things we should actually fear. We should be afraid of things like AI-controlled weapons, and we must understand that robots and AI democratize warfare. The biggest countries are not always the winners. Ukraine would have lost to Russia many times over without robots.

We do not understand that the same system that delivers pizza by drone can be used to drop hand grenades on enemy positions. That technology is deadly in the wrong hands. But is the thing we call a "threat" really the threat that terrorists can send a drone to drop grenades in a public place, or is it that we can no longer predict the winner of a conflict as easily as before? If we support the wrong side, that causes problems.

AI is a game changer in warfare, and that is why we must control it. In the same way, we should start to control advanced manufacturing tools like 3D printers. 3D printers can make guns. Those guns may not have the same quality as Western army weapons, but criminals can still use those tools.

And when we see the quality of Russian military armament, we can imagine a 3D printer making a gun of the same quality as the guns the Russian military uses. But is that really the reason we resist this technology? Or is it that if we transfer all practical work to robots, we no longer have human henchmen?

In that case we must admit that it is not cool to be the boss of robots. Robots are tools. They are machines, which means that yelling at a robot is not the same as yelling at a human henchman. Robots do not care what kind of social skills people have. Yelling at a robot is the same as yelling at a drill or a wrench.



Another thing, when we talk about AI and algorithms, is that the Russian, Chinese, Iranian, and other non-democratic governments use the same algorithms that are used to detect online fraudsters to search for people who dare to resist the government.
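A minimal, hypothetical sketch of that point in Python: the routine, the watchlists, and the threshold below are all invented for illustration, but they show how the exact same code path can flag "fraud" or "dissent" depending only on the list the operator feeds it.

    # Hypothetical sketch: the same flagging routine serves two very different masters.
    def flag_accounts(posts_by_user, watchlist, threshold=3):
        """Return users whose posts mention watchlist terms at least 'threshold' times."""
        flagged = []
        for user, posts in posts_by_user.items():
            hits = sum(term in post.lower() for post in posts for term in watchlist)
            if hits >= threshold:
                flagged.append(user)
        return flagged

    fraud_terms = {"fake invoice", "refund scam"}        # an anti-fraud team's list
    dissent_terms = {"protest", "strike", "opposition"}  # a censorship office's list
    # flag_accounts(posts, fraud_terms) and flag_accounts(posts, dissent_terms)
    # run exactly the same algorithm; only the operator's intent differs.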

Then we must realize that automated, AI-based coding tools make it possible to create ultimate hacking tools. Those tools can be used to create computer viruses that take nuclear power plants under their control. Professional nuclear security experts say that it is impossible to take a nuclear power plant over remotely: there is always a local manual switch.

That switch drops the control rods into the nuclear reactor. But we must understand that if the emergency shutdown switch is inside the reactor hall, and the protective water layer that absorbs nuclear radiation is lost, turning the reactor off becomes impossible. So the switch must be outside the reactor room, where the operator can still reach it. Even a small error in the drawings can cause the emergency system not to operate as it should.
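As a toy illustration of that design argument (not a real safety system; the names and logic are invented), a simple check in Python shows why a manual shutdown switch placed inside the reactor hall fails exactly when it is needed most:

    # Toy model only: where the manual shutdown switch sits decides whether it is usable.
    def scram_reachable(switch_location, hall_accessible):
        """Can an operator still trigger the manual shutdown?"""
        if switch_location == "control_room":
            return True              # reachable no matter what happens in the hall
        return hall_accessible       # inside the hall: usable only if the hall is safe

    # Accident scenario: shielding water lost, the reactor hall is off-limits.
    print(scram_reachable("control_room", False))   # True  - shutdown still possible
    print(scram_reachable("reactor_hall", False))   # False - the flaw described above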

That is what makes AI an ultimate tool. AI can control things and verify that the people responsible for making sure everything is done right are actually doing their jobs. It can search for weaknesses in those drawings. But if its databases are corrupted, that can turn AI into the worst nightmare we have ever seen.

We must control AI development better. But the arguments people make are the wrong ones. The worst cases are the free online AI applications that can generate any program people dare to ask for. That kind of AI-based system can turn into a virus or malware generator that can infect any system in the world. And in some scenarios, a hacker who does not even know what system they are inside can cause nuclear destruction or even start a nuclear war.

If a hacker accidentally slips into a nuclear command system and thinks it is some kind of game, that can cause a situation where the system opens fire with nuclear warheads. One possible scenario is that the hacker simply cross-links a computer game to a nuclear command system. Or the hacker accidentally adjusts the speed of the centrifuges that separate nuclear material for use in nuclear power plants. In that case the system can produce fuel that is enriched too far, and that can cause the nuclear reactor to melt down.
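A hedged sketch of that last scenario (all numbers and names are made up, not real plant data): a well-behaved controller clamps the centrifuge speed to a safe band, and a compromised controller that silently drops the clamp lets the process run far outside it.

    # Illustration only - the safe band is an invented figure.
    SAFE_RPM_RANGE = (50_000, 70_000)

    def safe_controller(requested_rpm):
        low, high = SAFE_RPM_RANGE
        return max(low, min(high, requested_rpm))    # clamp the request to the safe band

    def compromised_controller(requested_rpm):
        return requested_rpm                         # the malware removed the clamp

    print(safe_controller(95_000))         # 70000 - the request is limited
    print(compromised_controller(95_000))  # 95000 - the machine runs out of bounds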

AI is the ultimate tool for making life easier, but the same tool is the ultimate weapon in the wrong hands. So the ultimate tool can turn into the ultimate enemy. Yet when we look at the arguments against AI, the complaint is not that AI can create ultimate cyberweapons or control armies. The argument is that AI takes jobs away from bosses and does those jobs better than humans do.

In the same way as many times before, privacy and other things like legislation are used as arguments against AI. Rules, prohibitions, and similar measures are artificial tools, and they are very weak tools if the argument is only that people should follow them because that guarantees their privacy. Privacy and data security are the things that push people back to paper dictionaries and books, because information is more secure when a person cannot use things like automated translation programs.

The fact is that AI requires control, but the argument must be something other than that prohibiting AI development, or the use of AI tools, protects the position of the human boss. The fact is this: things like privacy are small things compared with next-generation AI that can create software automatically. Privacy is important, but how private can our lives really be? We can see things like whether a person is under guardianship just by looking at their ID papers.

Things like working days at the office are always justified by appealing to social relationships. But how many words do you actually say to other people during your working day? When we face new things, we must realize that nothing is black and white. Some things always cause problems, and new things always cause resistance. And of course, somebody can turn a food delivery robot into a killer robot by equipping it with machine guns.

Those delivery tools give somebody access to our home address. But in the same way, if we use a courier service to bring food to us, we must give out our home address, and there is the possibility that the food courier is a drug addict. That always causes problems with data security. But we do not care about those things, because there is a human on the other side. And maybe that is what makes AI frightening: AI is not something we can punish. We cannot mirror how good we are against a robot.

Maybe the threat that we see when we talk about robot couriers and AI is that we lose something sacred. We lose the object that is worse than we are. Robots are like dolls: we can say anything we want to a robot, and the robot is always our henchman. And that is one of the things that makes AI frightening. We think of AI as a henchman, and what happens if we lose a chess game to it?


