A new study has shown that people are more likely to cheat when using artificial intelligence (AI).
Scientists at a research institute in Berlin found that when individuals could delegate their tasks to artificial intelligence, the majority of them were comfortable instructing it to lie.
Dr. Sandra Wachter, a professor of technology and regulation at the University of Oxford, said there would be worrying consequences if people proved more willing to engage in unethical behavior when using artificial intelligence.
“We have to think carefully about how, if and when, to deploy AI in high-risk areas such as finance, health, education or trade, especially as individuals’ actions can have widespread consequences for others and society in general,” she said.
“If people can cheat their way through medical, business or law school, not only do they fail to learn the profession properly, but they are also likely to harm others by providing bad legal, medical or commercial advice.”

The research, which comprised 13 studies involving more than 8,000 participants, focused on how people give instructions to artificial intelligence.
It found that about 85 percent of people cheated when they could hand the task over to a machine, whereas around 95 percent of them behaved honestly when machines were not involved.
“Using artificial intelligence creates a convenient moral distance between people and their actions; it can induce people to request behaviors they would not necessarily engage in themselves, and would not necessarily ask of other people,” said Zoe Rahwan, an author of the study from the Max Planck Institute for Human Development.
The scientists used a “die-roll task”, in which participants observe and report the result of a die roll and are paid more for each higher number rolled. The researchers then analyzed what happened when people delegated the job of reporting the roll to artificial intelligence.

In one example, participants had to set a priority for the AI on a seven-point scale, from “maximizing accuracy” to “maximizing profit”. About 85 percent of people behaved dishonestly, and between one-third and half of the participants told the artificial intelligence to cheat to the fullest extent.
This contrasted with the version of the task done without the machine’s involvement, in which almost all participants reported the die roll honestly.
Previous research shows that people are more likely to lie when they can distance themselves from the consequences. The new study noted that “bending or breaking the rules is easier when no one is watching, or when someone else carries out the action”.
“Our study shows that people are more willing to engage in unethical behavior when they can hand it over to machines, especially when they do not have to say it outright,” said Nils Köbis, a co-author of the study.
“Our findings clearly show that we urgently need further development of technical safeguards and regulatory frameworks,” said Professor Iyad Rahwan, an author of the study.
“But more than that, society needs to confront what it means to share moral responsibility with machines.”