Baby’s Digital Footprint: A New Parent (and Technologist) Grapples with Protecting His Child’s Privacy Online
Article In The Thread
July 13, 2023
As a new parent in 2023, I take an astonishing number of baby pictures. I probably snapped more photos during the first three months of my baby’s life than my parents took in the first three years of mine. Lacking the constraints of film that my parents faced, I document moments both meaningful and mundane. But as someone who thinks a lot about internet privacy issues, figuring out how to share the good ones has turned out to be another new hurdle of early parenthood: I now have to wrestle with what it means to create and contain a digital footprint for someone who can’t reliably hold a rattle.
Ubiquitous social media, countless parenting apps, and internet-connected baby gadgetry are forcing me to reflect on what I’m actually worried about, what I can realistically do to limit data collection about my baby, and where I may just have to compromise.
As with many things in parenting, I’m not convinced there is only one right way to approach online privacy for your child. It’s complicated, and I’m certainly not going to tell other parents they are doing it wrong. New parents are overwhelmed with decisions that need to be made, and people have different relationships to their online privacy. In any case, I don’t think absolutist approaches are nearly as helpful as those rooted in harm reduction, and harm reduction requires actually considering what the possible harms are.
When I think about the significance of baby data, I consider two sets of questions: (1) the technical questions about what kind of data is being created, where it is stored, by whom, for how long, and for what purposes; and (2) the questions about the more direct social impacts of all this content as kids pass through the phases of childhood and adolescence.
Frankly, the technical questions aren’t strictly about the technology or types of data being collected, but the ecosystem in which it is stored and used. In what seems like a good description of my nominally technical worries, writer Ted Chiang suggests that “our fears or anxieties about technology are best understood as fears or anxiety about how capitalism will use technology against us....Technology and capitalism have been so closely intertwined that it’s hard to distinguish the two.”
In the end, the technical worry isn’t about posting the photo; it’s about how that photo will be used by the platform I post it to.
When pictures get posted to social media, they get run through that platform’s artificial intelligence (AI) filters. For example, Facebook knows who is in a photo because it runs the photo through a facial recognition AI. For that to work, every face in the picture has to be detected and compared to Facebook’s database of faces. It’s hard to know what Facebook is doing with that data, and even if Facebook acted with the best intentions and tried to exclude babies from its face database, it would still have to train its AI models to find and exclude baby faces, and that most certainly wouldn’t be perfect.
The worry isn’t limited to big platforms. It is common for tech companies of all sizes to gather massive amounts of data on individuals, and this new economy did not draw a boundary at the nursery door.
A few years ago, some colleagues and I worked on a security review of a smart baby monitor, which included a broad look at the features baby monitors now contain. A baby monitor may now include options that go well beyond old-school audio monitors or more recent generations of video monitors: monitoring of the baby’s breathing, movement, and vitals (like heart rate and temperature) is all available. The ability to record video and save it to the cloud is a pretty standard feature, with some baby monitor companies offering to analyze the footage for you.
As I wrote at the time, we don’t know what will happen to all this data. But given the frequency of data breaches, there is a real chance that companies holding this data will eventually lose control of it, whether through buyouts, technical failure, legal action, or hacking.
Given that uncertainty, the only sure strategy is not to feed any data about my baby to the internet. But that is hardly practical, even if it were possible at all. For now, I’ve adopted a very limited approach for my own baby—sharing mostly through Signal groups and text messages—and while we’ve found okay ways to share photos with grandparents, I’ll admit it can be a bit of a hassle.
Beyond the risk of someone who can’t eat solid food yet becoming the victim of a data breach, or having their face used to train AI models, there’s the parenting question of what I might be forcing my kid to live down. Growing up, I went through periods when I didn’t want my parents showing people photo albums with pictures of me as a small child. That was much easier to understand and control when all those pictures lived under the same roof I did. I’m not sure how I would have managed that embarrassment if those photos were just out there for peers to find, worse yet accompanied by the cutesy things my parents said about them at the time.
I worry about what will happen when kids are old enough to understand and think about their own online data trail. How do we expect them to contend with possibly thousands of pictures that were posted before they could ask for them not to be?
Accepting a certain fatalism that their kid’s face will eventually be scanned by Facebook (or others), many of my friends who are also parents have found a middle ground. One approach I heard from a friend seems like a solid method to me: sharing photos to a tightly controlled group where all posts get deleted after thirty days. Plenty of time for sharing and connecting, but no mountain of pictures to sort through.
Being fatalistic about your baby winding up in big data sets is probably the most realistic response. Right now, I can ask the folks around me not to post pictures of my child to the internet, but the more I go out into the world, the less control I have. The more I think about it, the more I find places where compromise may be the best or only option. Am I going to turn down a spot at daycare because they post pictures online, or use some app I’ve never heard of? Not in the middle of this childcare crisis. Do I really want to be the person who stops another parent from posting pictures of their kid’s birthday party because my kid was there? No, of course not.
The reality is that none of us can predict all the ways in which the data collected about babies will ultimately be used. Tackling this through legislation has shown itself to be fraught anyway. Technology and legal experts, myself included, often find ourselves opposing laws intended to protect children, because they can both harm children’s privacy rights and create huge unintended consequences for other internet users. If there is a policy solution, it is certainly neither a quick nor an easy one to develop.
Considering what data is collected about babies is ultimately yet another thing that new parents must figure out. While we probably can’t stop all instances of data collection, and it is even harder to figure out where that data goes, we certainly should talk more openly about these questions, and I encourage all parents to think deeply about the possible long-term effects of a seemingly transient social media post.
But we should not put the responsibility entirely on parents. Doing so obscures the fact that parents can’t actually get all the information needed to make a truly informed decision. The tech industry makes vast sums of money from data it gathers about people, and has generally felt little need to explain what it collects or what it does with it. With this business philosophy extended to my nursery, I live in a reality where my only option is a series of unsatisfying compromises.