Removing unnecessary work makes high-power computers more effective.



A new kind of data analysis boosts calculation power. The idea is to use tensors that connect the operands. If we multiply anything by zero, the answer is zero. Even if the calculation inside the brackets is very complicated, the whole expression collapses when it is multiplied by zero: (X+Y+Z)*0 = 0. This rule scales to all calculations; the result is always zero.

That means a bracketed expression multiplied by zero is zero, so the system can leave it without attention. In AI research, a zero value can mean that the system removes the calculation, or that it waits for the next data packet to get more information. The values zero and one can act as control signals: on zero the system waits, and on one or more it continues to the next operand.
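As a minimal sketch of that zero-skipping idea, assuming a simple scalar stream (the function names and the placeholder workload below are illustrative, not taken from any of the cited articles):

```python
def evaluate(scale, expensive_term):
    """Skip the bracketed calculation entirely when its multiplier is zero."""
    if scale == 0:
        return 0                      # (X + Y + Z) * 0 == 0: never evaluate X + Y + Z
    return scale * expensive_term()   # only now pay for the heavy work

def process_stream(packets):
    """Zero acts as a 'wait' signal; one or more means continue to the next operand."""
    for value in packets:
        if value == 0:
            continue                  # wait for the next data packet
        yield evaluate(value, lambda: sum(range(10_000)))

print(list(process_stream([0, 2, 0, 1])))
```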

Machine learning requires that the computer can use multiple data sources, which means handling very large data masses. One way to make that operation easier is to remove unnecessary data from the mass. The way researchers can do that is simple: they remove all zeros from the operands. If the result of some calculation is zero, the result is "empty". This data-handling protocol can also be used in robotics to make a robot's reactions faster.
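One simple way to drop the zeros is to store only the nonzero entries together with their positions, a coordinate-style sparse format. This sketch is purely illustrative; it is not the MIT/NVIDIA hardware technique quoted below:

```python
def to_sparse(dense):
    """Keep only (index, value) pairs for the nonzero entries."""
    return {i: v for i, v in enumerate(dense) if v != 0}

def sparse_dot(a_sparse, b_dense):
    """Dot product that touches only the stored nonzeros."""
    return sum(v * b_dense[i] for i, v in a_sparse.items())

vector = [0, 0, 3, 0, 5, 0, 0, 2]
sparse = to_sparse(vector)            # {2: 3, 4: 5, 7: 2}
print(sparse_dot(sparse, [1] * 8))    # 10, computed with only 3 multiplications
```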

Suppose a robot has three computers that act like the human brain: two main computers, and a third computer that makes sure the system cannot get stuck if the two main data-handling units return different answers. The third computer selects which of the differing answers to use, but that is important only when the two main components disagree. So if the system removes from the third computer all cases where the two main units get the same answer, that removes unnecessary work.
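A minimal sketch of that arrangement (the function names are hypothetical, and real redundant control systems are far more involved):

```python
def redundant_compute(main_a, main_b, tiebreaker, task):
    """Run the two main units; consult the third computer only on disagreement."""
    answer_a = main_a(task)
    answer_b = main_b(task)
    if answer_a == answer_b:
        return answer_a                      # agreement: no work for the third unit
    answer_c = tiebreaker(task)              # disagreement: third unit breaks the tie
    if answer_c == answer_a:
        return answer_a
    if answer_c == answer_b:
        return answer_b
    raise RuntimeError("all three units disagree")  # no majority: escalate
```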


"MIT and NVIDIA researchers have created two techniques to enhance sparse tensor processing, improving performance and energy efficiency in AI machine-learning models. These techniques optimize zero value handling, with HighLight accommodating a variety of sparsity patterns and Tailors and Swiftiles maximizing on-chip memory utilization through “overbooking.” The developments offer significant speed and energy usage improvements, enabling more specialized yet flexible hardware accelerators." (ScitechDaily.com/New Techniques From MIT and NVIDIA Revolutionize Sparse Tensor Acceleration for AI)



"Components of the Cauchy stress tensor in Cartesian coordinates" (Wikipedia, Tensor). 



"Innovative new chip technology integrates data storage and processing, significantly boosting efficiency and performance. Inspired by the human brain, these chips, expected to be market-ready in three to five years, require interdisciplinary collaboration to meet industry security standards." (ScitechDaily.com/Innovative new chip technology integrates data storage and processing, significantly boosting efficiency and performance. Inspired by the human brain, these chips, expected to be market-ready in three to five years, require interdisciplinary collaboration to meet industry security standards."

Wikipedia defines a tensor like this: 

"In mathematics, a tensor is an algebraic object that describes a multilinear relationship between sets of algebraic objects related to a vector space. Tensors may map between different objects such as vectors, scalars, and even other tensors. There are many types of tensors, including scalars and vectors (which are the simplest tensors), dual vectors, multilinear maps between vector spaces, and even some operations such as the dot product. Tensors are defined independent of any basis, although they are often referred to by their components in a basis related to a particular coordinate system; those components form an array, which can be thought of as a high-dimensional matrix." (Wikipedia/Tensor)
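As a concrete illustration of that definition (using NumPy for the arrays; the matrix values here are an arbitrary example):

```python
import numpy as np

# A rank-2 tensor can be represented by its components in a basis: a matrix.
T = np.array([[2.0, 0.0],
              [1.0, 3.0]])

v = np.array([1.0, 2.0])

# Acting as a multilinear map, T takes a vector to another vector...
print(T @ v)            # [2. 7.]

# ...while the dot product is itself a tensor taking two vectors to a scalar.
dot = np.eye(2)
print(v @ dot @ v)      # 5.0, the same as np.dot(v, v)
```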


So if the two main computers get the same answer, it's not necessary to send that case to the third computer at all. 


Removing unnecessary work makes high-power computers more effective. However, the problem is how the system selects the unnecessary work. The system must have certain parameters to detect whether an operation is unnecessary. 

One method is to find the zeros in the data mass. If a zero scales itself over a calculation or a series of calculations, the operation is empty. The system can remove empty operations, because they cause no action, and that saves time and work in complicated, heavy calculations. 
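A minimal sketch of pruning such empty operations from a work queue before any of them run (the task structure is hypothetical):

```python
# Each pending operation is (coefficient, task); a zero coefficient makes it empty.
pending = [(0, "simulate_block_A"), (2, "simulate_block_B"), (0, "simulate_block_C")]

# Prune before dispatch: empty operations never reach the compute units.
work = [(c, task) for c, task in pending if c != 0]
print(work)   # [(2, 'simulate_block_B')]: only one of three operations survives
```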

Things like quantum computers require high-power binary computers. Data is driven to the qubits by AI-based binary computers that control the qubits and their states. The quantum computer alone is useless: it requires a binary layer that drives the data input into the quantum system. And if those systems operate without friction, that revolutionizes computing. 

If we think that an AI or some other computing system has two sides that work on the same problem, zero can mean that there is no conflict. In this model, two computers or data-handling lines work on the same problem, and when they present their results the system subtracts one solution from the other. 

If the answer is zero, the system has no conflict between the answers. That supports the compilation of the data that sensors bring into the system, as well as effective data storage. 
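In code, that subtraction test could look like this minimal sketch (a tolerance is added for floating-point results, which the text does not mention):

```python
def no_conflict(result_a, result_b, tolerance=1e-9):
    """Zero difference between the two lines' results means no conflict."""
    return abs(result_a - result_b) <= tolerance

line_a = 41.99999999975
line_b = 42.0
if no_conflict(line_a, line_b):
    print("agreement: store the result")          # difference is effectively zero
else:
    print("conflict: escalate to the third unit")
```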

In a secured system, the data storage can record information such as when a file was opened and what the file involves. The system can keep two backup copies of each file, and if somebody tries to change the data in those files, or the file itself, the system reports that to superior officials who accept or deny the changes. 
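A minimal sketch of that audit idea, using content hashes to detect changes against the two backup copies (the file paths and the approval step are hypothetical):

```python
import hashlib
from pathlib import Path

def fingerprint(path):
    """Hash a file's contents so any change is detectable."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def check_against_backups(original, backup_1, backup_2):
    """Compare the working file with its two backup copies."""
    current = fingerprint(original)
    if current == fingerprint(backup_1) == fingerprint(backup_2):
        return "unchanged"
    # A mismatch means someone changed the file: report it for approval.
    return "changed: notify superior officials to accept or deny"
```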


https://scitechdaily.com/new-techniques-from-mit-and-nvidia-revolutionize-sparse-tensor-acceleration-for-ai/

https://scitechdaily.com/twice-as-powerful-next-gen-ai-chip-mimics-human-brain-for-power-savings/


https://en.wikipedia.org/wiki/Cauchy_stress_tensor

https://en.wikipedia.org/wiki/Tensor
