If you’re anything like my co‑workers, there’s a good chance you watched the Super Bowl. Even though I wasn’t rooting for either team, my wife and I watched the game (minus the halftime show). One thing that stood out to me was the commercials; some, particularly from Ring and Google, felt creepier than in years past. Those ads didn’t just make me uncomfortable as a consumer; they reminded me why privacy matters so much in the work we do as ministries.
In my personal life, I’ve tried to maintain a balance between convenience and privacy. That’s why you won’t find Alexa, Google, or Ring devices in my home. As AI has become more common, maintaining that balance has grown more difficult, both personally and professionally, and those Super Bowl commercials brought that tension into sharp focus for me.
CIM has a long history of working with missionaries who serve in countries that are not friendly to the U.S. or to Christianity. Because of that, we’ve always taken data security seriously. Historically, that meant protecting information from hostile governments. Today, that responsibility has expanded well beyond its original scope. This concern isn’t theoretical, and I was reminded of it firsthand when I recently tested a new AI tool.
When Customer Data Becomes Training Data
As many of you know, I like to test new AI systems. Recently, I tried Gemini for the first time. Within 24 hours, I received an email from Google explaining that my chats and shared content, including files, audio, transcripts, browsing activity, and location information, may be saved and reviewed by trained service providers to improve Google services, including generative AI models.
After reading that, I went through every setting I could find to limit data sharing and then stopped using Gemini. In all my years of reading AI privacy statements, this was the first time I recall seeing a company explicitly state that user data may be reviewed by individuals.
Google isn’t alone in this shift. Around the same time, I learned about changes to Starlink’s privacy policy made shortly before SpaceX’s acquisition of xAI (Grok). The updated policy allows collected data to be used to train AI models, using a broad definition that includes communications, uploaded files, and inferred personal information. Taken together, these changes point to a larger trend: more companies treating customer data as training material by default.
“Once data is exposed, there’s no undo button.”
To help you out, I’ve created a webpage that CIM will keep updated with steps to opt out of data collection. And remember: Microsoft 365 Copilot does provide your ministry with Enterprise Data Protection and is currently running a promo that ends March 31.
May the Gospel of Jesus be the message that is not kept private,
Jonathan Meester, VP & Chief Technologist, Computers in Ministry
