Critical thinking is the best tactic against deepfakes

The realistic photos and sound recordings that artificial intelligence can generate may be the latest threat to democracy, but they are only the newest entry in a long line of deceptive techniques. And developing AI tools to debunk rumors, or training the public to spot fake media, is not the best way to combat so-called deepfakes. The better tactic may be to encourage some well-known habits of critical thinking: paying attention, considering the sources of our information, and questioning ourselves.

Some of these critical-thinking methods fall into the "System 2" category of slow, deliberate thinking described in Daniel Kahneman's book "Thinking, Fast and Slow." AI is good at deceiving "System 1," the fast, reflexive mode in which our minds jump to conclusions — fabricated content suggesting, for example, that President Joe Biden has misrepresented his priorities or his political record. Research has shown that people are easily taken in by fake photos, in part because AI-generated faces can look more polished and symmetrical than real ones; images produced by AI are often rated more attractive and more trustworthy than real photographs, while false claims spread on social media frequently go unchallenged.

Social scientists have reached a disturbing finding: people are more likely to believe fabricated news after "researching" it with a Google search. That is not proof that looking into the facts harms the public or democracy. The problem, rather, was that much of this searching lacked an open mind: people hunted for confirming evidence regardless of a claim's accuracy, and the internet offers an abundance of "evidence" for both what is true and what is false. Genuine research requires asking whether there is any reason to trust a particular source. Is it a news outlet with a good reputation? An expert who has earned the public's trust?
Genuine fact-checking also means considering the possibility that what you want to believe is false. A lack of reliable evidence is one of the most common reasons rumors circulate on the platform X without ever appearing in mainstream media.

Filippo Menczer, a professor of informatics and director of the Observatory on Social Media at Indiana University, said AI has made it cheaper and easier than ever to use social media to promote fake news sites, by fabricating personas that appear to comment convincingly on reports. His group has for years studied the accounts known as bots that spread false stories; they can exert a psychological pull through a kind of herd effect, making it appear that many people agree with a person or an idea. Early bots were crude, but now they can seem to carry on long, detailed, realistic discussions, Menczer told me.

Still, this is just a new tactic in a long-standing struggle. "You don't really need advanced tools to produce misinformation," said Gordon Pennycook, a professor of psychology at Cornell University. People have managed to mislead others with Photoshop, or simply with real photos presented out of context, such as passing off images from Syria as scenes from Gaza.

I discussed with Pennycook the problem of too much trust versus too little. While a shortage of trust carries its own dangers, we agreed that excessive credulity poses the greater threat. What we really need, Pennycook believes, is to prompt people to ask the right questions. "When people share things on social media, they often don't even think about whether they're true," he said. "They think more about how sharing them will affect their image." The actor Mark Ruffalo might have avoided embarrassment had he kept that tendency in mind.
Most of this misinformation is nothing new; what has changed lately is how easy it has become to produce.