The United States Senate has taken a pivotal step toward bolstering the safety of children online with the passage of the Kids Online Safety Act (KOSA). The primary objective of KOSA is to shield minors under the age of 17 from a range of online threats. The legislation covers large social media platforms, digital gaming services, and virtual reality sites that host user-generated content.
Once signed by the President, the provisions of KOSA, which has been folded into the broader KOSPA bill package, are expected to take effect within 12 to 18 months. These directives require digital platforms to proactively protect minors from numerous forms of harm, including:
1. Mental health disorders consistent with evidence-informed medical information, such as anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
2. Usage patterns indicative of or encouraging addiction-like behaviors.
3. Exposure to physical violence, online bullying, and the harassment of minors.
4. Risks of sexual exploitation and abuse.
5. Exposure to the marketing and promotion of narcotics (as defined under the Controlled Substances Act), tobacco products, gambling, or alcohol.
6. Exposure to predatory, unfair, or deceptive marketing tactics, or other financial harms.
The act mandates that digital companies respond swiftly to imminent threats to minors. Larger entities with more than ten million users must address reported concerns within 10 days, while smaller firms have a 21-day window. Less severe issues, such as minor bullying or microaggressions, may therefore not be addressed immediately.
KOSA imposes a default-settings mandate, whereby platforms must automatically activate the highest level of safety measures for minors. It also gives parents greater control over account management and settings. Restrictions are imposed on targeted advertising, direct communications with minors, geolocation services, automatic content recommendations and playback features, and time spent on the platforms.
To tackle issues related to opaque algorithmic processes, the Federal Trade Commission (FTC) gains the authority to take legal action against organizations that fail to disclose details of their algorithmic content-ranking systems. This provision aims to compel transparency from companies, requiring them to reveal how they use user data and behavior patterns to drive virality and potentially addictive usage. Additionally, service providers must offer an algorithmic option that does not rely on user-specific data to determine what content is displayed.
Moreover, the legislation tasks the National Academy of Sciences with conducting five studies and submitting them to the FTC within the first year, covering:
1. Mental health issues like anxiety, depression, eating disorders, and suicidal behaviors.
2. Substance use disorders and the consumption of narcotics, tobacco products, gambling, or alcohol by minors.
3. The risks posed by sexual exploitation and abuse.
4. The addiction-like use of social media and the design elements that contribute to the overuse and harm of social media.
Notably, however, the National Academy will not release the data from these studies to the public.
As digital dynamics continue to shape the lives of minors, it is essential to stay informed about measures like KOSA designed to protect them. Parents are also encouraged to take proactive steps to monitor their children's online interactions. Some may consider devices with limited internet functionality, such as basic cell phones, as a way to distance their kids from online threats. As the digital landscape evolves, this legislation reflects a growing recognition of the need for comprehensive strategies to ensure the online safety and wellbeing of children.



