
Sunday, December 11, 2022

AI will destroy the net as we know it.



Artificial intelligence-based search engines are already in use. Those systems measure how long each user spends on a web page. The effectiveness of AI depends on the mass and quality of information that it can use. The number of clicks indicates that a page is interesting. But the time that users spend on those pages tells more about how interesting they really are.
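As a rough sketch of that scoring idea, the snippet below blends click counts with average dwell time. All names, URLs, and weights here are hypothetical, chosen only to show the two signals side by side:

# A minimal sketch, assuming the engine logs clicks and total dwell time
# per page. Dwell time gets more weight than raw clicks, as argued above.
from dataclasses import dataclass

@dataclass
class PageStats:
    url: str
    clicks: int           # how many times users opened the page
    total_seconds: float  # total time users spent on the page

def interest_score(page: PageStats, dwell_weight: float = 0.8) -> float:
    """Blend click count with average dwell time per visit."""
    if page.clicks == 0:
        return 0.0
    avg_dwell = page.total_seconds / page.clicks
    return (1 - dwell_weight) * page.clicks + dwell_weight * avg_dwell

pages = [
    PageStats("https://example.com/a", clicks=1000, total_seconds=15000),
    PageStats("https://example.com/b", clicks=200, total_seconds=24000),
]
for p in sorted(pages, key=interest_score, reverse=True):
    print(p.url, round(interest_score(p), 1))

Here page "b" can outrank page "a" even with fewer clicks, because its visitors stay five times longer per visit.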

If web stores deliver information about how many clicks the system gets and how many purchases the customers make, that can turn the AI into a very impressive tool. The AI can connect information from different types of search results so that the system can find what attracts the user.

But the new type of search engine can generate the page that it shows. When the user inputs parameters to the AI, the system can find the home pages where that type of data is. Then the system can simply connect the paragraphs one after another, and the AI can put the source of each text after those paragraphs. That kind of tool will destroy the regular internet.

That kind of system makes traditional links unnecessary. And that means the indexing system of the search engines becomes unnecessary, so it doesn't bring money to Google anymore. But at this point, the AI uses the indexes as its tool. There can be parameters that forbid the AI from using certain results. Then the system can sort the source paragraphs into the order that the user wants.
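A rough sketch of that assembled page: the engine picks matching paragraphs from indexed pages, appends the source after each one, and lets the user block sources and choose the order. The corpus and the matching rule below are hypothetical stand-ins for a real index:

# A minimal sketch, assuming a tiny in-memory "index" of paragraphs.
corpus = [
    {"url": "https://site-a.example", "paragraph": "Fuzzy logic handles vague inputs."},
    {"url": "https://site-b.example", "paragraph": "Search indexes map words to pages."},
    {"url": "https://site-c.example", "paragraph": "Fuzzy logic suits chat-like queries."},
]

def build_page(query: str, blocked: frozenset = frozenset(),
               reverse_order: bool = False) -> str:
    hits = [d for d in corpus
            if query.lower() in d["paragraph"].lower() and d["url"] not in blocked]
    if reverse_order:
        hits.reverse()  # stand-in for whatever order the user asked for
    # Each paragraph is followed by its source, as described above.
    return "\n\n".join(f'{d["paragraph"]}\n(source: {d["url"]})' for d in hits)

print(build_page("fuzzy logic", blocked=frozenset({"https://site-c.example"})))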

Maybe in the future, Google will offer a choice so that users can select between AI-based search and traditional search. Fuzzy logic is the thing that makes AI better at handling things like chat pages. If the user wants some precise information, AI would not be the best possible way to search for it.

But as I wrote earlier, the AI can put the sources below the texts from which it took the information. Or the search engine can offer the possibility to jump between a link list and the AI-created web page.


These kinds of things are the tools of the next generation. 


The AI can be an excellent painter, but the user gives the parameters that it can use.


If we want to make AI make paintings, we can just find a series of paintings or images on the net. Then we need a tool similar to the scissors in a graphics program. After that, we can simply select or surround objects using that tool and then name each object for the AI. The AI uses those objects as the parameters for making a painting that is perfect for the customer.

The names of those objects are free-form, but those images form the matrix that the AI uses to select details for the image that it will make. The user can simply set those selected details as the parameters that the AI must use. Then the AI can search for similar things on the net.

And after that, it will connect those images or parameters into a whole. The user can order the system to use a certain number of human characters. But the AI can also calculate the average number of human figures in the images that it uses. The AI can also select the colors that are most pleasing to humans. That kind of thing lets AI create very attractive art.
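A rough sketch of that parameter flow: the named cutouts become the parameters of a hypothetical generator. The file names, object names, and the generator itself are placeholders; only the way the user's selections turn into constraints is shown:

# A minimal sketch, assuming the user has already cut out and named objects.
named_objects = {
    "person": ["cutout_01.png", "cutout_07.png"],
    "tree":   ["cutout_03.png"],
    "sky":    ["cutout_04.png"],
}

def build_request(objects: dict[str, list[str]],
                  people_count: int | None = None,
                  palette: str = "warm, pleasing colors") -> dict:
    """Turn the named cutouts into parameters for a hypothetical generator."""
    # If the user doesn't fix the number of people, fall back to the
    # count found in the reference images, as the text suggests.
    if people_count is None:
        people_count = len(objects.get("person", []))
    return {
        "details": sorted(objects),  # which named objects must appear
        "references": objects,       # cutouts used to match the style
        "people": people_count,
        "palette": palette,
    }

print(build_request(named_objects, people_count=2))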


https://fin.afterdawn.com/uutiset/2022/12/10/tekoaly-voi-tuhota-googlen


Image: Pinterest


Wednesday, January 5, 2022

And then the dawn of machine learning.


Image: Pinterest


Machine learning, or autonomously learning machines, is the newest and most effective form of artificial intelligence. Machine learning means that the machine can autonomously increase its data mass, sort the data, and make connections between databases. That ability makes machine learning somewhat unpredictable. And that kind of thing makes robots multi-use systems that can do the same things as humans.

The reflex robot is a very fast-reacting machine. Its limited operational field guarantees that it does not need a very large number of databases. And that means the system does not have to search through many databases, which makes it very fast. But if it goes outside its field, it will be helpless.

When we think of robots that can do only one thing, like playing tennis, they can react very fast in every situation that is connected with tennis. There is a limited number of databases, and that means the robot acts very fast.
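A rough sketch of that reflex idea: with one narrow task, the whole "database" fits in a single lookup table, so reacting is one dictionary access. The stimuli and responses below are hypothetical:

# A minimal sketch of a single-task reflex table.
tennis_reflexes = {
    "ball_left":  "step_left",
    "ball_right": "step_right",
    "ball_short": "run_forward",
    "ball_lob":   "move_back",
}

def react(stimulus: str) -> str:
    # Outside its narrow field the robot is helpless, as noted above.
    return tennis_reflexes.get(stimulus, "no_action")

print(react("ball_left"))      # step_left
print(react("dog_on_court"))   # no_action: outside the operational field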

When a robot or AI makes a decision, it systematically searches every single database. And if there are details that match the observed situation, that match activates the database or command series stored there. But the thing that makes this type of computer program very complicated is that when the number of stored actions increases, the system slows down.

If we want to make a robot that can perform multiple actions, that requires multiple databases. And searching every database for a match to the situation takes a certain time. So complicated actions require complicated database structures. Compiling complex databases takes time because every computer has its limits. And in the case of a street-operating robot, the system compiles data that its sensors transmit to its computers.
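A rough sketch of that slowdown: a multi-purpose robot must scan every database for a match, so the worst-case decision time grows with the number of databases. The databases here are hypothetical dictionaries; the timing loop just makes the linear growth visible:

# A minimal sketch: decision time grows with the number of databases.
import time

def make_databases(n: int) -> list[dict]:
    # Each "database" maps observed situations to command series.
    return [{f"situation_{i}_{j}": f"commands_{i}_{j}" for j in range(100)}
            for i in range(n)]

def decide(databases: list[dict], observation: str) -> str | None:
    for db in databases:          # systematically search every database
        if observation in db:     # a matching detail activates the commands
            return db[observation]
    return None

for n in (10, 100, 1000):
    dbs = make_databases(n)
    start = time.perf_counter()
    for _ in range(1000):
        decide(dbs, "situation_that_never_matches")  # worst case: scan all
    print(n, "databases:", round(time.perf_counter() - start, 4), "s")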

So the conditions that this kind of system must handle might involve unexpected variables like fog or rain. And for those cases, the system needs fuzzy logic to solve problems. In that case, only the frames of the cases are stored in the databases by the system's creators, and the system combines those frames with the data sent by the sensors.
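A rough sketch of that frame idea: the creators store only the membership curve ("how foggy is foggy?"), and the system combines it with live sensor readings. The curve shape and the thresholds below are hypothetical illustration values:

# A minimal sketch of fuzzy logic combining a stored frame with sensor data.
def fog_density(visibility_m: float) -> float:
    """Membership in 'foggy': 1.0 below 50 m, 0.0 above 1000 m."""
    if visibility_m <= 50:
        return 1.0
    if visibility_m >= 1000:
        return 0.0
    return (1000 - visibility_m) / 950

def safe_speed(visibility_m: float, max_speed: float = 40.0) -> float:
    # The frame says "slow down in fog"; the sensor data says how much.
    return max_speed * (1.0 - 0.7 * fog_density(visibility_m))

for v in (30, 200, 1500):
    print(f"visibility {v} m -> drive at {safe_speed(v):.1f} km/h")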


The waiter robot can be used as an example of machine learning.


A good example of a learning machine is a waiter robot that learns the customers' wishes. The robot stores the customer's face in its memory when it asks whether the customer wants coffee or tea. Then the robot asks "anything else?", and in that case, the robot can introduce the menu.

And then the customer can make an order. There are certain parameters in the algorithm, and those are stored in the waiter robot's memory. The robot of course stores that data in the database. The reason for that is simple: the crew requires that information so that they can make the right things for the customer. But that data can also be used to calculate how many items the average customer orders after the question "anything else?".

The robot can also store faces in the database so that it can calculate how often each person visits the cafeteria. Then the robot can simply store the orders under the customer's face, and it learns how often a person orders something. If some customer always orders certain products, the robot can send a pre-order to the kitchen so that they can prepare that type of order. When a customer visits often and orders the same thing every time, the robot can start to say "do you want the same as usual?". For that, the system requires a parameter that defines how often in a certain time is "often". That was an example of a learning system.
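A rough sketch of that waiter robot: orders are stored under each customer's face ID, and a frequency parameter decides when "often" is often enough to offer "the same as usual". The face IDs and the threshold are hypothetical illustration values:

# A minimal sketch of the learning waiter robot described above.
from collections import Counter

class WaiterRobot:
    def __init__(self, usual_threshold: int = 3):
        self.orders: dict[str, Counter] = {}    # face_id -> order counts
        self.usual_threshold = usual_threshold  # how many repeats is "often"

    def record_order(self, face_id: str, item: str) -> None:
        self.orders.setdefault(face_id, Counter())[item] += 1

    def greet(self, face_id: str) -> str:
        history = self.orders.get(face_id)
        if history:
            item, count = history.most_common(1)[0]
            if count >= self.usual_threshold:
                return f"Do you want the same as usual ({item})?"
        return "Coffee or tea? Anything else?"

robot = WaiterRobot()
for _ in range(3):
    robot.record_order("face_42", "cappuccino")
print(robot.greet("face_42"))  # repeat customer: offers the usual
print(robot.greet("face_99"))  # new customer: standard greeting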


