Sharenthood by Leah A. Plunkett Book Summary
Sharenthood: Why We Should Think Before We Talk About Our Kids Online by Leah A. Plunkett
Recommendation
Law professor Leah A. Plunkett explores the risky digital behavior of “sharents” – parents, educators and caregivers who digitally disclose children’s personal information, often without understanding the range of consequences. “Sharenting” intrudes on kids’ privacy and sense of self, threatens their safety, and hinders their opportunities as they enter adulthood. Plunkett also describes how current laws enable and contribute to the sharenting problem, and proposes ways that parents and regulators might make the digital world safer for children.
Take-Aways
- Divulging children’s digital data robs them of privacy, thwarts their ability to form their own identities and agency, and limits their prospects.
- Young people’s digital data records can start forming even before conception, and follow them into adulthood.
- Children’s data attract online predators, including child pornographers and identity thieves.
- Data brokers and other entities may use children’s data for troubling, albeit legal, activities.
- Online privacy laws stem from myths about the family paradigm and child development, and allow adults to monetize children’s personal lives without consent.
- Legal reforms and new approaches to digital sharing should incorporate the values of “play, forget, connect and respect.”

Sharenthood Book Summary
Divulging children’s digital data robs them of privacy, thwarts their ability to form their own identities and agency, and limits their prospects.
When parents, educators and other caregivers share information about children digitally, they typically don’t intend to do harm. However, this behavior deprives their charges of a private, protected childhood – a necessity for forming a sense of self and autonomy. Exposing a person’s childhood and adolescence online compromises the vital processes of play, experimentation and learning through trial and error.
“How can our kids and teens discover who they are when we adults are tracking them, analyzing them and attempting to decide for them – based on the data we gather – who they are and should become?”
Current laws give parents, schools and the government decision-making power over the digital privacy of youth up to age 18. In the bleakest scenarios, this has led to negligent – though not currently illegal – parental oversharing. Even well-meaning caregivers habitually exchange personal information about their charges for free or inexpensive tech access and services. They often don’t fully understand the consequences, and the children never sign on to this deal. Once tech providers have kids’ data, they can commodify or use that data in myriad, long-term ways.
Young people’s digital data records can start forming even before conception, and follow them into adulthood.
“Sharenting” occurs when parents, educators and caregivers use digital pathways to publish, store or otherwise disclose a child’s private data. By sharenting, adults create extensive digital records about children’s personal lives, including educational, social, psychological and behavioral details. Consider the representative scenario of “Tommy S.” Just as author Mark Twain’s creation Tom Sawyer was a child of the American frontier, the modern-day Tommy S. is a child of the cyberspace frontier. Tommy’s parents begin forming his digital data file even before conception: They use an app and a fertility-tracking bracelet to improve their chances of conceiving. Once Tommy arrives, his proud parents post a birth announcement on Facebook, detailing his full name, birth date and measurements. They continue to share images of his early milestones on their social media accounts and store their expanding photo library on a cloud-based server. They use a Nest Cam to monitor him in his crib and a range of smart devices to track his breathing, sleep and other activities.
“Well before Tommy takes a single step, his digital data travel to thousands, likely tens of thousands, of human and machine users.”
When Tommy starts day care, the provider sends his parents updates through a childcare app. In grade school, Tommy uses various types of educational technology, or ed tech, as well as digital accounts for school lunches and busing. His high school provides him with a laptop, and his teachers, counselors and school administrators use various ed-tech apps to record his grades and assignments, update his health records and document test results. His extracurricular activities, such as camps and sports, leave another digital trail.
When Tommy cuts class or when teachers catch him smoking at school, these missteps become part of his digital record. Eventually, a more serious infraction lands him in juvenile court, and the judge’s ruling is entered into a court database. Tommy’s distraught mother seeks advice and support by posting about his rebellious behavior on Facebook. When Tommy reaches adulthood, he’ll be able to view some of his digital history. However, much of it will remain invisible to him; it now lies in the hands of data brokers and deep-web entities.
Children’s data attract online predators, including child pornographers and identity thieves.
Digital technology can enhance the lives of children and families. For example, if your child has a chronic illness, a social media support group can offer information, advice and empathy. In school, ed tech customized to your child’s specific needs may enrich learning. The potential to boost efficiency, broaden your geographical reach and hone high-demand digital skills makes digital tech appealing. However, be wary of the technology’s dark side.
“We don’t let our kids go outside, yet we let the outside world into our most intimate spaces via digital technologies.”
Children’s data attract child pornographers, identity thieves, trolls and bullies. For instance, child pornographers may criminally repurpose an innocent post of your preschooler splashing in the tub. Sharenting may enable predators and stalkers or provide entrée for sex traffickers. Identity thieves often target children because the young have no credit history, making fraud harder to detect. The social media tradition of posting birth announcements – with a child’s full name, birth date and birthplace – aids cyberthieves in faking credit applications.
Data brokers and other entities may use children’s data for troubling, albeit legal, activities.
Once parents share kids’ information online, private parties, companies and other organizations have few legal restrictions on how they use it. Privacy policies tend to change frequently and quietly. Data-collecting entities may analyze, aggregate, reshape and reshare children’s data with third parties for indefinite activities. The Children’s Online Privacy Protection Act (COPPA) restricts how private companies use data from children younger than age 13. However, this law grants power of consent to parents, thus enabling sharenting. When parents – or, in some cases, educators – opt in, the children’s data fly loose in the cybersphere and beyond.
“We want our kids to grow up finding treasure within themselves rather than being mined as part of the adult world’s digital gold rush.”
Data brokers operate with little regulation, and their identities often remain unknown. You don’t know who’s gathering or buying your data, whether your data contain errors, or how people or machines are using it. Children’s data are highly valuable in the credit, insurance, education and employment spheres. Colleges look at applicants’ social media profiles alongside educational records. Employers screen social media profiles when making hiring decisions. Insurance companies crunch digital data to inform risk and premium calculations. In the public sphere, the government engages in digital monitoring, surveillance and data-based policing. Such a digital legacy often exists without the subject’s knowledge and leaves little room for young people to define their own identities or tell their own stories.
Online privacy laws stem from myths about the family paradigm and child development, and allow adults to monetize children’s personal lives without consent.
Three legal myths enable and embolden sharents and perpetuate their risky practices. The first myth assumes that parents know what’s best for their kids and act accordingly. This paradigm claims that the government safeguards youth by protecting parental control – interceding only in cases of abuse, divorce or similar scenarios. However, parents are failing to protect their children in the digital world. They often don’t understand how using digital technology affects their kids’ lives. They provide consent without grasping the legalities around privacy policies, terms of use or third-party activities. Finding, reading and understanding all that fine print is a Herculean task.
“Parents are supposed to stand sentry over the castles that are their homes. But kids today no longer live in a world of brick-and-mortar places with definite boundaries.”
The second legal myth presumes that vigilantly monitoring and disciplining children protects them – and society – from the mistakes and misbehavior of immaturity. The law fails to recognize that children learn through play and exploration, and that making mistakes is essential to their development. While the legal system doesn’t punish children to the degree that it does adults, the harsh consequences can make it difficult for youths to learn from their mistakes.
“We wouldn’t let parents, teachers or other trusted adults send children to work each day in a factory, around the clock, where the children’s activities were monetizable for the adults’ benefit. But effectively, we are doing that now with children’s data.”
The third legal myth avers that parents will guard their kids from child labor. Though the Fair Labor Standards Act of 1938 protects children from factory work, it fails to shield them from the monetization of their digital data and private lives. That’s because federal child-labor laws have long included exemptions for parents and family businesses, and many state laws emulate these exemptions. Moreover, marketplace laws have an antiquated understanding of labor: They view the youths and adults who exchange personal information for digital services as consumers of a service, not as data sellers or laborers. As a result, “commercial sharents” can monetize their families’ lives by creating revenue-generating digital content for public consumption. They profit from marketing agreements, endorsements or partnerships like the YouTube Partner Program. Primarily, commercial sharents build narratives around life phases, family activities and various causes. These narratives blend performance with authenticity and often unfold at the children’s expense. Yet no laws require children’s consent for this commercial activity.
Legal reforms and new approaches to digital sharing should incorporate the values of “play, forget, connect and respect.”
Making the web safer for children means honoring the four values of play, forget, connect and respect. Start by adopting child-centered approaches that protect kids’ play from digital exposure. Adults can do this by making conscious, informed choices about digitally sharing children’s information and experiences. Social media platforms could add tools that facilitate safe online behavior, such as a feature that asks parents to consider the consequences before posting about a child. Law reforms could prohibit organizations and private companies from using children’s digital data in hiring and school application decisions.
“The digital world needs a protected place for childhood to play in the same way we try to protect brick-and-mortar playgrounds and classrooms – by making them experimental, iterative, inclusive and equitable.”
The digital world’s memory is limitless, and children’s silly, thoughtless, embarrassing words and actions can haunt them into adulthood. Thus, the European Union’s General Data Protection Regulation provides a legal “right to erasure” – honoring people’s right to forget, and to have the internet forget, certain behavior and information. However, US children have no such legal protection. New social media features and other technology could be part of the solution, but Silicon Valley strongly resists such efforts, which would require considerable work, resources and restructuring. Instead, a “right to respond” might be more workable: A centralized oversight bureau could give young adults the opportunity to request information and correct their digital dossiers. Regulators could also restrict how long third parties may continue to use someone’s childhood data.
“The sharing we most need to do is with our children, not about them.”
A parent’s wish to connect with other parents online isn’t detrimental; the harm comes from connecting irresponsibly. Parents and other caregivers must make mindful choices about what they share, when, with whom and how. The American Academy of Pediatrics recommends creating a family media plan. Families would also benefit from a three-part data privacy plan: pledging to honor privacy, researching and understanding the fine print of user agreements for online services, and establishing patterns of responsible family behavior online. Finally, respect children’s digital data as a valuable resource. Instead of thoughtlessly trading that data for digital services, examine – to the extent possible – whether the gains merit sharing a child’s data. User agreements often make it difficult to discern exactly what you’re giving up in the exchange, but regulatory initiatives and pressure on data firms could lead to clearer disclosure language.
About the Author

Leah A. Plunkett is an associate dean and legal skills professor at the University of New Hampshire’s Franklin Pierce School of Law, as well as a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. Her experiences as a legal aid lawyer representing young clients helped inform her book Sharenthood.