It's clearer than ever: Social media is dangerous for kids. | Opinion
We as consumers and Meta as a company have been warned repeatedly about the dangers of social media use. Will things change?
When the person who designed doomscrolling tells us that his most ubiquitous creation is dangerous, we ought to listen.
Thankfully, a jury in California did listen when Aza Raskin, inventor of the infinite scroll, testified in a deposition that social media platforms such as Facebook and Instagram are designed to be addictive.
In the California case, decided March 25, jurors found Meta, the owner of Facebook and Instagram, and Google, which operates YouTube, liable for harming the mental health of a woman who began using the companies’ products as a child.
A day earlier, in New Mexico, a jury ruled that Meta must pay $375 million for failing to adequately protect young users from child predators.
We know social media is destructive. These cases should be a reminder.

The California case, in particular, may have enormous (and expensive) ramifications for tech companies because of one word: addiction.
The legal contention, which the jury in California affirmed, is that Meta and Google have pushed products on children that are designed to compel their use to the point of harm, much like tobacco companies for generations hooked cigarette smokers on nicotine.
Other lawsuits against social media companies are pending across the country. The cases decided in California and New Mexico may set earth-rattling precedents for tech companies.
In the California case, the plaintiff’s attorneys presented company documents at trial showing that corporate executives knew their social platforms were harmful to children. The attorneys also argued that features such as autoplay videos reinforced users’ compulsive behavior.
In a damning deposition, Raskin said that tech companies’ use of the infinite scrolling feature he developed now leaves him in a “sick place in the pit of my stomach.”
One of the saddest things about these court decisions is that they aren’t the least bit shocking. We as consumers and Meta as a company have been warned repeatedly about the dangers of social media use, especially among teenagers.
Still, little has changed, at least in the United States. Australia in December banned children under 16 from using social media. Other countries, including the United Kingdom, are considering similar bans.
Can we trust social media giants going forward?

But whether legally forbidding kids from hopping on Instagram will actually help is uncertain.
Instead, the financial, political and social pressures of losing high-profile court cases ‒ and the hidden facts they drag into the light ‒ could be our best hope for true change.
I doubt that Instagram, TikTok, Snapchat and other social platforms will disappear anytime soon. What we need then is a reboot ‒ fixes that put brakes on features that prompt kids to waste hours on the platforms and place stronger safeguards against bullying and exploitation.
This, after all, may be the last chance to set things right before more revolutionary changes hit our homes and families. Google and Meta are among the companies leading the charge into the age of artificial intelligence. If they mishandle AI like they have fumbled social media, the consequences could be even more devastating.
Facebook founder Mark Zuckerberg long told his employees to “move fast and break things.” That should never have included breaking our children.

Tim Swarens is a former deputy opinion editor of USA TODAY and former opinion editor of The Indianapolis Star.