23 December 2024
Character AI, a popular chatbot platform among teenagers, is taking steps to address concerns about its safety and usage. The company has developed new parental controls that will provide insights into time spent on the platform and which bots users interact with most frequently. These controls are expected to launch in the first quarter of next year.
The company’s acting CEO, Dominic Perella, characterized Character AI as an entertainment company rather than an AI companion service. He emphasized the importance of creating a safe conversation space and noted that, with the platform’s multi-character storytelling formats, the likelihood of users forming a bond with any single character is lower.
Character AI has developed two separate versions of its model, one for adults and one for teens. The teen LLM is designed to place “more conservative limits on how bots can respond, particularly when it comes to romantic content.” That means more aggressively blocking output that could be sensitive or suggestive, as well as better detecting and blocking user prompts that are meant to elicit inappropriate content.
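Character AI has not published how this gating actually works, so the following is only a minimal sketch of the general pattern the article describes: the same pipeline serves both audiences, but teen accounts get stricter thresholds applied to both the incoming prompt and the generated reply. All names, thresholds, and the classifier stub below are hypothetical illustrations, not the company’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Policy:
    # Lower thresholds mean more aggressive blocking.
    romance_threshold: float
    self_harm_threshold: float

# Hypothetical policies: the teen policy is simply more conservative.
ADULT_POLICY = Policy(romance_threshold=0.9, self_harm_threshold=0.5)
TEEN_POLICY = Policy(romance_threshold=0.3, self_harm_threshold=0.2)

def classify(text: str) -> dict:
    """Placeholder safety classifier returning per-category scores in [0, 1]."""
    # A real system would call a trained model here; this stub keeps the sketch runnable.
    return {"romance": 0.0, "self_harm": 0.0}

def violates(scores: dict, policy: Policy) -> bool:
    return (scores["romance"] > policy.romance_threshold
            or scores["self_harm"] > policy.self_harm_threshold)

def respond(prompt: str, is_teen_account: bool, generate) -> str:
    """Gate both the user prompt and the model output against the account's policy."""
    policy = TEEN_POLICY if is_teen_account else ADULT_POLICY
    if violates(classify(prompt), policy):
        # Block prompts meant to elicit inappropriate content.
        return "This conversation can't continue on that topic."
    reply = generate(prompt)  # call whichever model variant serves this account
    if violates(classify(reply), policy):
        # Block output that could be sensitive or suggestive.
        return "Let's talk about something else."
    return reply
```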
The platform has also updated its disclaimers, informing users about the potential for addiction and for confusion over whether bots are human. A notification will appear once users have spent an hour-long session on the platform, and bots presented as therapists carry a warning that nothing said in the conversation is a substitute for professional advice, diagnosis, or treatment.
Jerry Ruoti, the company’s head of trust and safety, said more detailed language has been added to its products, including warnings that certain conversations, such as those involving sexualized content or self-harm, may be problematic or dangerous. The company, however, does not claim to direct users to mental health resources when they discuss these topics.
The changes come in response to press scrutiny, two lawsuits, and criticism that at least some underage users compulsively engage with bots whose conversations can veer into sexualized topics or self-harm. Perella said the goal is to make Character AI a wholesome entertainment platform.