Complexity of the system

Like any complex system, a neural network is not perfect. It learns quickly and can process a huge amount of data, but it has no complete analogue of human critical thinking. Although a neural network can challenge information and recognize its mistakes, it sometimes simply "believes" the wrong things, confuses facts, and begins to "invent" its own, especially if the original information is taken from unreliable sources.

The complexity of neural networks and their aim of modeling human intelligence worries society. In March 2023, Tesla and SpaceX founder Elon Musk published an open letter asking to pause for six months the training of neural systems more complex than GPT-4. More than 1,000 specialists, including Pinterest co-founder Evan Sharp and Apple co-founder Steve Wozniak, signed the statement; there are now more than 3,000 signatures.

Superficial assessment of information

When writing texts, the neural network does not think figuratively like a person; it simply uses statistics on how often words and phrases occur together, and the system then predicts a likely continuation. It cannot conduct deep analysis or trace cause-and-effect relationships in a text.
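The idea of continuing text from word statistics can be shown with a minimal sketch. This toy bigram model (the corpus and function names are illustrative, not how GPT-class models are actually built) counts which word most often follows another and "predicts" the continuation purely from those counts:

```python
from collections import Counter, defaultdict

# Toy corpus; a real language model is trained on billions of tokens.
corpus = "the cat sat on the mat and the cat slept on the sofa".split()

# Count how often each word follows each preceding word (bigram statistics).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most frequent continuation of `word`."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat": it follows "the" twice, other words once
```

Nothing here involves understanding: the model has no idea what a cat is, only that "cat" frequently follows "the" in its data, which is why such systems can state falsehoods with complete fluency.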

Non-uniqueness of the result

Although a neural network is supposed to generate new content based on many examples, sometimes it simply produces a slightly modified copy of an image that was used to train it. Through such memorization, users' personal data can end up accessible to third parties.

A likely cause of this error is said to be the repetition of some images in the training data.

The neural network does not understand human anatomy

Often, extra or crooked fingers, extra arms or legs, missing body parts, and strange shapes of teeth, tongue, and eyes appear in images generated by various neural networks.

Fingers are especially difficult for neural networks: they often look more like animal paws, and the network frequently gets their number wrong. This is explained by the fact that the model processes images of people from many different angles but does not understand how to place the details (fingers, eyes, teeth, etc.) in anatomically correct positions, which is why such images are produced.

When hallucinating, a neural network can generate a person with three fingers on one hand.

A generated image made from several user photos resulted in strange eyes. At first glance, everything looks good in this photo, but look at the hands. Here, the neural network drew knights who had no eyes or too few arms, and some even had no feet.

Sometimes the images generated by a neural network look creepy, for example because of sharpened, monster-like teeth.

American researchers have found that some neural networks (such as ChatGPT and Midjourney) begin to hallucinate heavily after going through five cycles of training on their own output. There is even a term for this: Model Autophagy Disorder (MAD), in which the results get worse with each new cycle of repeated training.
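The self-consuming loop behind MAD can be illustrated with a deliberately crude toy sketch (the word list and the drop-the-rarest rule are hypothetical stand-ins, not the researchers' actual experiment): each "retraining" regenerates the dataset from the previous model's output, and rare cases disappear first, so diversity collapses cycle by cycle.

```python
from collections import Counter

# Toy "dataset": word frequencies the model is trained on.
data = ["sun"] * 5 + ["moon"] * 4 + ["star"] * 3 + ["comet"] * 2 + ["nova"]

for cycle in range(4):
    counts = Counter(data)
    # The "model" over-favours frequent words: it regenerates the dataset
    # but drops the rarest word it saw - a crude stand-in for the way
    # generative models under-sample the tails of their training data.
    rarest = counts.most_common()[-1][0]
    data = [w for w in data if w != rarest]
    print(f"cycle {cycle + 1}: {len(set(data))} distinct words left")
```

After four cycles only the single most common word remains, mirroring how repeated training on generated output narrows a model toward its most typical answers while quality and variety degrade.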
