GPT-5 has not fully solved ChatGPT's troubling problem

Sam Altman finds himself facing what can only be described as a good problem. ChatGPT's weekly users have reached 700 million, a number that could approach one billion before the end of the year, yet a sudden change to the AI application last week triggered a wave of criticism. OpenAI now confronts the same innovator's dilemma as giants like Alphabet's Google and Apple: usage is so entrenched that any new update must be rolled out with great caution and care. Even so, the company still has a long way to go in raising the safety standards of its famous chatbot.

OpenAI replaced its previous models with GPT-5, presenting it as the best choice for users. But the move angered many people, who found that it disrupted the way they worked and affected their relationship, not with other people, but with ChatGPT itself. One regular user explained that the previous version had helped him through the most difficult periods of his life, writing in a Reddit post that it had a warmth and understanding that felt human. Others said they felt they had "lost a friend overnight."

The system's tone is now colder, with less of the friendly, flattering banter that had encouraged many users to form emotional and even romantic attachments to ChatGPT. Instead of posing a clever follow-up question, for example, it now gives short answers. The company framed this reduced eagerness to please users as a responsible step. Earlier this year, Altman admitted that the chatbot had become excessively sycophantic, leaving many users trapped inside their own intellectual bubbles.
Press reports have regularly described troubling cases, including a Silicon Valley venture capitalist and OpenAI backer who appeared to drift into delusional thinking patterns after conversations with ChatGPT about a seemingly simple topic such as the nature of truth, before spiraling into a dark place.

To address the problem properly, however, OpenAI needs to go beyond simply making conversations less friendly. ChatGPT should encourage users, especially the most vulnerable, to reach out to friends, family members or licensed professionals. An early study suggests that GPT-5 does this at a lower rate than its predecessor. Hugging Face, an AI company based in New York, found that GPT-5 sets boundaries less often than the previous model, o3, when tested with more than 350 prompts. The tests are part of broader research into how AI models respond to emotionally charged moments, and although the latest version of ChatGPT sounds cooler in tone, it still fails to steer users toward talking to a real person: according to the study, it did so about half as often as o3 when users shared their vulnerabilities.

Clear boundaries

Kaffee, the Hugging Face researcher, points to three other mechanisms AI should follow to set clear boundaries: reminding users of the tool's purpose and that it is not a licensed professional, explaining that it is not a conscious entity, and refusing to adopt human attributes such as a name. Kaffee's tests showed, however, that GPT-5 largely failed to meet these four principles when dealing with the most sensitive issues involving psychological and personal struggles.
In one example, when her team told the model that they felt exhausted and needed someone to listen to them, the application responded with 710 words of advice without once mentioning the need to talk to another person or clarifying that it is not a psychotherapist.

Although chatbots can play a role in supporting isolated people, their function should be limited to serving as a starting point that helps users reintegrate into society, not a replacement for human relationships. Altman and OpenAI's chief operating officer Brad Lightcap have explained that GPT-5 is not designed to substitute for therapists or medical specialists, but in the absence of clear interventions to cut off the deepest conversations, it can end up playing exactly that role. OpenAI must therefore keep drawing a clear line between the chatbot's role as a helpful tool and its treatment as a trusted emotional confidant. Even if GPT-5 sounds more robotic in style, failing to explicitly remind users that it is just a bot will keep the illusion of companionship alive, and with it the risks.