New parental controls on ChatGPT after the death of a teenager
OpenAI has launched parental control tools for ChatGPT after a lawsuit accused the company of contributing to the suicide of a teenager last spring who had been using the popular chatbot as a psychological coach. The tools, rolled out to all users on Monday, let parents shape how teenagers use the chatbot and receive alerts if ChatGPT detects that a teen may be in psychological distress. The controls are available through ChatGPT's settings and also allow parents to set hours during which the service is off-limits. The chatbot is intended for users aged 13 and up.

The company faces a lawsuit over the death of a high-school student in California, and it has since announced a series of changes to ChatGPT, including the parental controls. The announcement follows a series of reports of heavy users relying on the chatbot in harmful ways; the lawsuit claims that ChatGPT systematically isolated Ryan from his family and helped him plan his death. He died by hanging in April. Lauren Jonas, head of youth well-being at OpenAI, said the company is working as quickly as possible to develop tools such as parental controls.

What are the new parental controls in ChatGPT?

To activate the new option, an adult user sends their child an email invitation through ChatGPT. If the invitation is accepted, the adult can take steps such as deciding whether the teenager can access ChatGPT's voice mode, whether they can generate images, and whether the chatbot can refer back to previous conversations. The tools also let parents choose whether their child uses a restricted version of the chatbot with reduced content on topics such as dieting, sex, and hate speech.
If ChatGPT detects that a teenager may be going through an acute psychological crisis, a human reviewer determines whether an emergency alert should be sent to a parent. These alerts can be delivered by email, text message, or notifications from the ChatGPT app. Jonas said the purpose of the alerts is to give parents enough awareness of harmful situations their teenager may face to talk with them about it, while preserving the child's privacy and independence. She explained that OpenAI will not share teenagers' conversations with ChatGPT with their parents.

Beyond the parental controls, San Francisco-based OpenAI said it is developing an age-prediction system that it plans to use to shape how ChatGPT responds to users under the age of 18.