A Technological Future for the Family
A few thoughts on "A New Technology Agenda for the Right"
A couple weeks ago, a number of authors I respect published A Future for the Family: A New Technology Agenda for the Right. Click here to read it.
They say:
A new era of technological change is upon us. It threatens to supplant the human person and make the family functionally and biologically unnecessary. But this anti-human outcome is not inevitable. Conservatives must welcome dynamic innovation, but they should oppose the deployment of technologies that undermine human goods.
These are strong words. Does this era of technology really threaten to supplant the human person, making us unnecessary? I think so. The threat exists. And that’s not simply because of social media, which is undoubtedly part of the concern. Every university professor I speak to is frustrated with students using generative AI (e.g., ChatGPT): students are bypassing learning for the sake of a grade. The rise of self-driving cars (Waymo’s vehicles have logged over 25 million driverless miles) raises questions about the necessity of human drivers. AI’s pattern recognition threatens to make some doctors redundant. And when tablets pervade the classroom in the name of “efficiency gains,” teachers, with all their human biases, can appear redundant, harmful, or both. These are real threats.
My tech futurist readers will push back. They’ll say that self-driving cars are a technological marvel which should be celebrated. They will displace some jobs, yes, but we still need humans to make self-driving technologies. New tech necessitates new jobs. Doctors and teachers can work alongside AI, like a co-pilot, rather than be replaced. And of course, the classic argument is made that 95% of us used to be farmers, but we’re getting along just fine after the Industrial Revolution.
So should we be pro-tech or anti-tech? Most people recognize there needs to be some middle ground between rapid technological innovation on the one hand, and the limitation of technological intrusion into daily life on the other. No serious person really thinks Zoom calls are better than in-person gatherings, or that teachers should be replaced by machines. We recognize the need for balance.
And that’s why the principles drafted by these authors are so good. They encourage “dynamic innovation” while seeking to protect our humanity from the cult of technological progress and unchecked adoption.
First, look at the verbs chosen for the beginning of each of the ten principles:
Respect…
Support…
Protect…
Work…
Oppose…
Legislate…
Favor…
Favor…
Accelerate…
Launch…
On these verbs alone, you catch a sense of what these authors are about. It’s not merely being against something; it’s being for something. Amid the much-discussed meaning crisis of the 21st century (Justin Brierley, for example, has spoken about this on his podcast), we need to state what we will prioritize. What are we for?
Many people want “progress” — but toward what? Hence, the ten principles offer what the authors want to conserve, which is defined by the nouns which follow the verbs.
Respect the natural cycle of mortality…
Support women…
Protect human sexuality…
Work to wrest childhood…
Oppose the political economy of addiction…
Legislate toward a restored republican culture in a digital age…
Favor technologies that enhance local and familial autonomy…
Favor technologies that enhance human skill and improve worker satisfaction…
Accelerate the transition to a new household economy…
Launch projects that encourage man’s cultivation of the natural world…
What is this set of principles about? People. As Melvin Kranzberg said, “Technology is a very human activity.” And so this Technology Agenda provides a set of principles for human persons. (It is hard to summarize all that I mean by “human persons,” but I follow the insights of Andy Crouch’s book The Life We’re Looking For, where he says persons are heart-soul-mind-strength complexes designed for love.)
What does it mean to desire technology built for persons? The way most people think about “putting people first” may not align with the anthropology embedded in these principles. Some see themselves the way they see their social media avatar: the body as a garment that can be changed to match an inner desire. While some aspects of our bodies can be altered (we cut our hair and clip our nails), people are not simply inner selves, autonomous and atomized. Our bodies matter, and so do the relationships our bodies have with others. I say this as a Christian. Scripture teaches me to respect our God-given bodies. We are more than matter: God’s “very good” verdict comes after the creation of embodied persons, body and soul together (Genesis 1:26-31). Christians honor God with our bodies; our bodies are temples of God.
I also say these things as someone who experiences reality daily. My body is part of who I am; it is silly to say otherwise. I am male, not female. I am a child of my parents, and I have grown beyond childhood. These are realities. I am not an ever-changing video game character on a screen. I exist as a human, and that has consequences: each of us has different responsibilities and abilities depending on our bodies. As Samuel James discussed in Digital Liturgies, modern technologies can misdirect us away from the realities embedded in persons. These principles help elevate the realities of personhood above the fog of digital disruption.

Unlike some AI Ethics statements, this Technology Agenda has specific policy proposals in mind. For example:
“remove screens from the center of the classroom while restoring physical books and the mechanical arts.”
“Encourage the growing market of smart devices that offer tools for productivity and connectivity only.” (Note: this is a particular problem in Canada, since “dumb phones” like Wisephone are not available here. Though alternatives like Brick are effective substitutes.)
“protect privacy by blocking the transformation of everyday appliances into surveillance systems”
“require platforms to build robust tools that give users transparency and choices about the algorithms that construct their feeds”
“Favor technologies that enhance local and familial autonomy through right-to-repair laws, open-source software, and open-platform designs, all of which make technology less reliant on distant power centers.”
“Oppose the imposition of universal technological changes, such as the EV mandate, that undercut the capacity and responsibility of local actors.”
“Dismantle government incentives that push the American people toward artificial or virtual substitutes to embodied life, such as subsidies for lab-grown meat and a liability regime that punishes embodied industries and activities.”
Amen! These are good proposals. A few gave me pause at first, but on a second reading I found I agreed. Take lab-grown meat: it could have a medical benefit in certain circumstances for specific illnesses. The ketogenic diet, for instance, is increasingly used as a medical treatment for a set of illnesses, and following it can be difficult without the food industry developing products to meet that need. But the Technology Agenda opposes “subsidies,” not research. And the agenda is centered on America, whose medical system is designed for competition, allowing the best products to succeed. The popularity of the keto diet illustrates this dynamic: companies already make keto foods without a subsidy, because the market demands them.
There are a few things that these principles don’t say:
“pause the research and development of AI models unless certain criteria are met”
“ban chatbots that impersonate human interaction”
“ban TikTok”
“require age verification for social media, p*rn, and online gambling”
I would be open to arguments for all four of these, and would likely be in favor if they were properly defined. When I meet with two of the authors later this week (we’ll be recording an episode on these principles for What Would Jesus Tech), I’ll ask whether they believe they went far enough with their proposals.
I also have other questions. Why isn’t God mentioned? Why isn’t the church? I think this is simply the nature of a document written for US policymakers rather than primarily for Christians. As a Christian, I’m in favor of these family-centric policies and principles, though I think good Christians can disagree with these proposals while remaining faithful to the core aspects of our faith. With that in mind, I can see the logic of keeping Christian language out of a document that nevertheless stems from Christian principles. I also respect that positioning these principles as “conservative” and “for the Right” will help spread them further. This is an important clarifying set of guidelines, not the final solution.
If you haven’t already, please read the principles for yourself and share them with others.
The real question isn’t whether the future will have technology. It’s whether we want that future to have the family as the fundamental institution of society.
We don’t have to be passive. We shouldn’t be passive. We should build a future where innovation serves families, respecting how we were made and who we were created to be.
Below are the authors of the New Technology Agenda. Two are on Substack.
Michael Toscano, executive director of the Institute for Family Studies; director, Family First Technology Initiative.
Brad Littlejohn, fellow, the Ethics and Public Policy Center’s Technology and Human Flourishing Project.
Clare Morell, fellow, the Ethics and Public Policy Center; director, Technology and Human Flourishing Project; author of the forthcoming book The Tech Exit (June 2025).
Jon Askonas, assistant professor of politics, The Catholic University of America; senior fellow, Foundation for American Innovation.
Emma Waters, senior research associate in the DeVos Center for Life, Religion, and Family at The Heritage Foundation.