On January 31, 2024, the United States Senate Judiciary Committee grilled Meta (Facebook) CEO Mark Zuckerberg, publicly shaming him. Those in the room said a sense of change was palpable. In the viewing gallery sat parents wearing black, holding pictures of their children, all dead because of social media. These parents had come to support the Kids Online Safety Act (KOSA), legislation meant to prevent more deaths.
In 2019, 12-year-old Matthew Minor suffocated while attempting the TikTok “blackout challenge,” in which participants choke themselves until they black out. In 2020, 19-year-old Devin Norring died after buying what he believed was Percocet for pain relief on Snapchat; the pill was a lethal dose of fentanyl. In 2023, 14-year-old Adriana Kuch died by suicide after intense cyberbullying, which reached her only at school through her friends’ phones; her parents kept phones and screens away from her at home. Hundreds of children in the US have died in similar incidents.
Yet those deaths represent the tiniest traceable sliver of damage that social media has caused to young people. That trackable body count misses many other deaths, primarily caused by the gradual accumulation of loneliness and depression that screens themselves generate. Digital signals are just bad for the human nervous system, period.
How many more deaths from screens might we be missing? Before we return to the drama between alpha businessmen and politicians, we need to know what actually happened. And before any person or company is accused of systematic harm, one needs proof.
I intend to calculate the number of people killed by screens the same way one calculates deaths from cigarettes or pollution. So please indulge me in some data-scientist geekery before I return to Zuckerberg’s surprising, correct testimony. After that, I will lay out what I would have said had I been in his place, what I did say to a Vermont Senate committee, and what administrators say to defend the indefensible.
The damage screens cause
Is it possible to disentangle the damage caused by smartphones from that caused by social media? Probably not, because so much of social media appears on smartphones. From a data point of view, that’s good; we can use either one to roughly estimate the other. So first, we’ll see how screens hurt people one by one and hour by hour. Next, we’ll see how the introduction of smartphones increased suicides. Then we’ll combine those observations into an estimate of deaths caused by social media combined with phones.
The first data observation is that hours-per-day screen use directly correlates with all kinds of loneliness and mental distress, including suicidality and actual suicide. The more hours screens are used, the worse the mental problems get: year-over-year data on specific demographic slivers of young people indicate that they get roughly 1% worse per hour of screens per day.
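To make that rough dose-response concrete, here is a back-of-envelope sketch. It is only my own illustration of a strictly linear reading of the figure above, not a published model, and the example hour counts are arbitrary:

```python
# Illustrative only: a linear reading of the rough "1% worse per hour of
# screens per day" relationship described above. The linear form and the
# example hour counts are my own simplification, not a published model.

HARM_PER_DAILY_HOUR = 0.01  # ~1% additional distress per daily hour of screens

def excess_distress(hours_per_day: float) -> float:
    """Fractional worsening relative to a low-screen baseline (rough linear model)."""
    return HARM_PER_DAILY_HOUR * hours_per_day

for hours in (2, 5, 8):
    print(f"{hours} h/day of screens -> roughly {excess_distress(hours):.0%} worse")
```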
Evidence like this shows up everywhere, undisputed in general but loudly disputed by whichever app is currently under fire. Once you know that brains need continuous three-dimensional space, the damage done by twinkly, two-dimensional screens makes sense.
The second observation is that the introduction of smartphones around 2010 coincided with a roughly 50% spike in teen suicides for both girls and boys in the US, an elevation that has persisted to the current day. The point was first made by social scientist Jean Twenge in 2020 and has been validated ever since. No other force penetrated teen lives so dramatically at that moment, so a safe estimate is that phones and the stresses brought by social media are the root of teen mental anguish. This neutral line of reasoning concludes that, statistically, about 2,000 extra teen suicides per year in the US are caused by smartphones.
Over the whole US population, an extra 15,000 suicides per year (approximately one-third of the annual total) are caused in aggregate by smartphones and social media. That is a kind of annual body count. On one hand, it is a small fraction of the statistical half-million people killed each year by cigarettes. But the social media body count is vast by other standards. For example, roughly 50 years ago, the Ford Motor Company was sued and even indicted for reckless homicide over a car the automakers knew to be dangerous, the Ford Pinto, which killed an estimated 27 people in fuel-tank fires. A few dozen fatalities is plenty to answer for, but nowhere close to tens of thousands.
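For readers who want to check the arithmetic, here is the whole calculation as a short sketch. It uses only the figures already quoted in this article; the “implied” baselines it prints are simply what those figures entail, not independent statistics:

```python
# Back-of-envelope consistency check using only the figures quoted above.
# The "implied" baselines below are what those figures entail, not CDC data.

spike = 0.50                 # ~50% rise in teen suicides after ~2010
extra_teen = 2_000           # extra US teen suicides per year attributed to phones
implied_teen_baseline = extra_teen / spike
print(f"Implied pre-2010 teen baseline: ~{implied_teen_baseline:,.0f} per year")

extra_all_ages = 15_000      # extra US suicides per year, all ages
share_of_total = 1 / 3       # "approximately one-third" of the annual total
implied_annual_total = extra_all_ages / share_of_total
print(f"Implied total US suicides: ~{implied_annual_total:,.0f} per year")
```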
Proper responses: Executive Officer vs. Algorithm Officer
Every parent knows that phones and social media are bad for children. The Senate committee room was full of such parents, looking to Zuckerberg to put a human face on the thing that had killed their kids. He was there to answer for those lives.
In the US, there is a ritual when a great sin has been committed: Someone must be fired for the offense. Minutes earlier, committee member Senator Josh Hawley had proved that Facebook knew its product was killing children, yet did nothing to stop it. Hawley then bluntly confronted Zuckerberg.
Hawley: Had Facebook fired anyone for killing kids?
Zuckerberg: No. [He repeated this over and over.]
Hawley: Had Facebook compensated any grieving families?
Zuckerberg: No. Our job is to build industry-leading tools.
Hawley went at it like a boxer, relentlessly hammering Zuckerberg about the lack of firings and his fussy deference to the privacy of employee records. He ignored Zuckerberg’s blather about building “industry-leading tools.” Finally, having destabilized his adversary, Hawley insisted that Zuckerberg physically turn around and face the crowd of grieving mothers.
Zuckerberg performed his sacrificial role ably, showing an earnest public face for the trillion-dollar Facebook. His face looked honest as he enunciated the bland apology, “No one should have to go through the things your families have suffered….” Masterful bearing, plenty of empathy and no acceptance of responsibility. That is another American ritual. It would have been a perfect CEO performance, except that it tapered into a final shout-out for his beloved industry-leading tools: “…and this is why we have invested so much in industry-leading efforts….”
Zuckerberg was only doing his job by dodging the blame. In the US, the job of CEO is to defend shareholder value, which in this case means building software that leads the industry in making money, and to defend that source of revenue against attackers. CEOs are meant to pay their owners, not admit the entire enterprise is wrong.
What would I have said had I been in his place? Not as a Chief Executive Officer who moves fast and breaks things, but as Chief Algorithm Officer, i.e., Chief Truth-Teller? What truths about technology could I express that Senators and grieving parents should know? I would say:
“My Honored Senators and parents everywhere, we humans are the victims of our own success, our hands so skilled they fill the world with captivating things, our eyes so innocent they follow eagerly. That statement, combining our ability to be charmed with our ability to manufacture charming things, is not just self-evidently true; it is also the text of the peer-reviewed conclusion of a theoretical neuroscience research paper in the most prestigious journal possible, as conveyed by the journal’s founder. My wife Criscillia and I spent two years alone, unpaid, writing that paper and sealing a scientific truth into the permanent record, forever.
“Our paper, ‘Sensory Metrics of Neuromechanical Trust,’ in effect proves that all sensory systems — as a general mathematical principle — are subject to self-reinforcing informational addictions, such as screen addiction or social media addiction. Like our tongue’s attraction to sugar, brains are attracted to special rare things, but are at the same time damaged by too much of them. Media companies did read our paper the week it came out in 2017, yet have stayed mum ever since.
“The natural and mathematical splendor of the human sensorimotor system, and its tragic capture by digital dazzlement, is the message of ‘Sensory Metrics’ and also our message for you. Human brains need attention for our own needs, to trust our senses and ourselves. That’s how brains must work, especially children’s brains. Attention is meant to be used, not captured. Even a few distractions can be dangerous, but to let a corporation or machine make money stealing attention wholesale is theft of the most destructive sort, taking not your belongings but your mind.
“It is true these screens and apps kill children, so they must stop, and fast. That is why you should pass KOSA, the Kids Online Safety Act, which might have saved these children, and will save many more.
“Bills like KOSA are a necessary stopgap, but cannot be the final word. The laws of attention and economics are continuous, naturally capturing or flowing around regulatory boundaries. The problem is like nailing jelly to a wall: As long as software is allowed to look for a connection between human attention and profit, it will find it. And software is getting better every year, as Silicon Valley boasts. As a general rule, each single step inside a killer app is functional, legal and profitable. That’s why those steps are there. Get a child’s attention, keep it by keeping just one step ahead, show things the child likes, get paid by showing related things, algorithmically select and pair child with message with ad, refresh once a second…each of these steps is standard, rational business and engineering. That’s how software is supposed to work. The so-called ‘industry-leading tools’ that Facebook claims to make would have to impede such programs. Fortunately for Facebook, the tools themselves don’t actually work, so they, in fact, do lead the industry by protecting profit through fig-leaf control.
“So as long as computerized advertising or attention-grabbing remains legal, it will undermine human attention and mental health. There is no single piece that can be regulated to stop the collective damage. In the long run, harvesting attention is like harvesting organs. Profit must be prohibited or people will die.
“There is, in fact, a software tool technologists could build to substantially reduce that human collateral damage. Much like Facebook’s tools, and much like the dashboard tools I invented and built for businesses myself, it would run continuous statistical algorithms fed by real-time metrics, optimally fit to historical trends. The data is there, the math is there, the computing power is there, the chance for infinite transparency is there. Version 1.0 could be done in a year. The only difference is what this new tool would measure. Instead of the profit, user value and time-on-device that tools measure now, this new tool would measure the body count and misery index: a set of meters and dials of despair. How many hours of human life are reduced per dollar of advertising profit? This tool would show the only cost/benefit analysis that matters: the cost in human life offsetting the benefit of algorithmic profit. It would use and show real live data and straightforward metrics, sensory metrics, of neuromechanical trust. This tool would show the world what only executives know now. This new tool would lead the industry, but in the right direction.
“These software tool-builders are my tribe. I love being a scientist and technologist. To ask questions no one ever asked, to answer them, to build cool stuff which never before existed. It means hanging out with fun, sharp people, differently brilliant, but in ways we mutually respect. We can save the world if you invite us to.
“But only you, Honorable Senators, can make it happen. The market can’t. Our current paymasters have their own paymasters, and therefore they shackle us to wooden metrics pulling galleys of ads the wrong direction, against the needs of real live kids. Only human people, acting on behalf of human people over and above machines, can compel all of us collectively to do the job that we so want to do. The market can’t undo its damage; you can.”
That is what I would have said to the US Senators in place of Mr. Zuckerberg, had I been invited to the US Senate.
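What might the misery dashboard described in that imagined testimony look like in skeleton form? Below is a minimal, hypothetical sketch, nothing more than my own illustration: every field name and number is invented, and the harm estimate would in reality have to come from epidemiological excess-risk models like the ones discussed earlier.

```python
# Hypothetical sketch of the "misery dashboard" described above, not a real
# tool from any company. All field names and figures are invented placeholders.

from dataclasses import dataclass

@dataclass
class PlatformSnapshot:
    """One reporting period of a platform's own metrics (all values hypothetical)."""
    ad_revenue_usd: float   # advertising profit booked this period
    user_hours: float       # total hours of attention captured
    harm_hours: float       # estimated hours of healthy human life lost,
                            # e.g. from excess-risk models like those above

def misery_index(snap: PlatformSnapshot) -> float:
    """Hours of human life reduced per dollar of advertising profit."""
    return snap.harm_hours / snap.ad_revenue_usd

def despair_dial(snap: PlatformSnapshot) -> float:
    """Fraction of captured attention hours that convert into estimated harm."""
    return snap.harm_hours / snap.user_hours

# Made-up numbers, purely to show what the dashboard would display:
period = PlatformSnapshot(ad_revenue_usd=1e9, user_hours=5e9, harm_hours=2e7)
print(f"{misery_index(period):.3f} hours of life lost per ad dollar")
print(f"{despair_dial(period):.2%} of captured hours converted into harm")
```

The arithmetic is trivial; the point of the sketch is the choice of what gets measured.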
Instead, I had been invited to testify to a smaller committee: one in the Senate of the state of Vermont, which was considering banning cell phones in schools. The committee was deciding whether even to pass the proposal, S.284, to the full Senate for debate. I was one of four people giving testimony, mine delivered remotely from California.
The best testimony came from the principal of a private school, who explained how much happier and more productive everyone on his campus, students and teachers alike, had become since the school banned phones. Every person testifying brought a different perspective: how distracting phones are to all children, how socially disruptive, how addictive, how phones enable cyberbullying. They gave reason after reason, all the way up to how phones cause mental illness and suicide. Proponents showed Senators a graph illustrating both of those horrors rising about 50% (as I mentioned earlier) in the years since smartphones became widespread.
In particular, the rise in attempted suicide from 4% to 7% means an extra 3 percentage points per year that wouldn’t be there otherwise, presumably due to phones and online activity. Vermont’s teenage population is about 60,000, so those 3 percentage points translate to 1,800 extra attempted suicides per year. Those additional almost-deaths are what the bill’s proponents are fighting.
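Written out as the same kind of back-of-envelope sketch, using only the figures cited above:

```python
# Vermont back-of-envelope: extra attempted suicides implied by a rise
# from 4% to 7% of teens, given the ~60,000 teen population cited above.

teen_population = 60_000
rate_before, rate_after = 0.04, 0.07

extra_attempts = (rate_after - rate_before) * teen_population
print(f"~{extra_attempts:,.0f} extra attempted suicides per year")  # ~1,800
```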
I didn’t have those numbers when I testified. In the moment, I emphasized three firm points, not of opinion but of undisputed scientific truth. First, all these facts about screen damage are absolutely true, interrelated and explained by the simplest scientific understanding possible: that humans are three-dimensional creatures for whom screens are alien. That means these problems are rooted in human biology and won’t go away or get better. Second, this quantified perspective comes courtesy of our paper, which is scientifically perfect: public, peer-reviewed, based on undisputed principles and itself undisputed for over seven years. Third, only a legal structure committed to actual scientific truth, rather than to a legalistic set of rules and targets, has the power to stop technology from destroying kids’ brains. For man-made law to work, natural law must trump it. Any law worth passing must first and foremost aim to protect children’s health, regardless of how bad actors learn to bend the rules.
The Vermont Senators presumably did not look into that scientific proof. Within 24 hours, the proposed ban on phones was watered down into a proposal that, in two years, the Secretary of Education would gather lots of information and generate an overall policy. This would enable local schools to set their own individual policies: which phones, where, what times, what punishments, etc. That is, wait two years and begin all over, but with much less urgency. In general, people trying to prevent real deaths don’t just ask for studies.
That attitude was in the air already. In the week before our hearing, Vermont media carried quotes from representatives of school principals and the state health department saying a ban on phones in schools was too much. Their statements sounded practical, the kind of sensible thing a good administrator says. But translated into common sense, they don’t add up.
Below are six such quotes and my translations, which I passed among the Vermont enthusiasts. The silver lining is that each dumb statement has the same structure: inflicting health problems now is okay because it’s cheaper and more administratively convenient. To these administrators, trading known-effective health measures for short-term convenience is a theme reliable enough to lean on constantly. It doesn’t actually work, but they think it does because they measure the savings, not the damage.
Translating administrative nonsense
“Jay Nichols, executive director of the Vermont Principals’ Association, says he understands the negative impact of social media on young minds. As he put it Feb. 2, the association is ‘on the front line of the negative impacts of digital addiction to social media.’ However, he said it does not support S.284.
“‘Already, most schools have social media and cell phone access completely or significantly restricted during the school day,’ he said. ‘Providing the mental health resources that students need when they need them is probably a better approach to addressing mental health needs in students than banning cell phones and social media from schools from our perspective.’”
In other words, we recognize that phones cause mental health problems, but we’re happy to allow that because we already have systems for addressing and maybe even fixing mental health problems later.
“Nichols called the opt-out element of the bill unreasonable. He told committee members that providing paper copies of digital materials is ‘a huge burden to schools and is not necessary,’ saying later that ‘it’s not appropriate to allow students to simply opt out of learning how to use technology in today’s world.’”
In other words, even though reading on paper causes less eye strain and fewer headaches, and is 400% better for comprehension, it’s slightly more expensive. So we will force harmful yet ineffective screens on all students, always.
“‘We completely support the idea of minimizing and reducing exposure to social media while in school,’ [Vermont Health Commissioner Mark] Levine told members of the Senate Committee on Education. But he said the bill felt ‘unrealistic’ and a bit ‘heavy handed’ — possibly even ‘accusatory’ and ‘disempowering.’”
In other words, we know screens, phones and social media are bad for kids. But a bigger risk than children’s health is that administrators might be unfairly accused of disapproving of something someone likes.
“While Vermont’s youth may experience negative impacts due to social media, Dr. Levine said the most marginalized youth — who experience social isolation at ‘much higher’ rates than average — can find ‘hope and community’ online.”
In other words, phone and screen use increases social isolation for children everywhere, except for the most isolated kids who paradoxically will benefit from them.
“Levine added that he would like to see a focus on ‘health education’ that could give youth the skills to navigate the complexities of the digital world.”
In other words, phone and screen use is so unhealthy, we want to train them in managing these unhealthy things by doing them even more.
[Lee Thompson-Kolar edited this piece.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.