A few years ago, I walked into a confessional while traveling in another country. I was in a busy European capital, but the church was quiet, perhaps because it was the middle of the week. It was an old Gothic-style Catholic church, not one of those modern buildings. Its thick stone construction did an excellent job of muffling the noise of the outside world.
The priest was waiting, scrolling on a handheld device, maybe reading, praying, or just passing time. Before beginning, I asked if he wouldn’t mind placing the device outside. He did, without hesitation, though slightly surprised by the request. We spoke about it briefly afterward. I never thought much of it—until recently.
You see, I’ve never brought a smartphone into a confessional, a medical office, or any other room where sensitive conversations take place. Maybe that’s generational. I grew up with fax machines and typewriters still in the background, even as the digital age arrived and began erasing privacy.
I’m no Luddite. I appreciate good tech. Tools can make us better. But they must remain tools, not proxies for discernment, trust, or discretion.
Lately, I’ve noticed a troubling trend: clients—and sometimes potential clients—activating AI tools like ChatGPT during private video conferences or even face-to-face meetings. Yes, I still prefer in-person meetings. Sometimes these apps are passively transcribing; other times, they’re actively queried for legal insight. Often, my fellow organics don’t realize the consequences.
First, these tools aren’t secure. Data shared with them is often stored, processed, and potentially used to train future systems. Second, using such apps while speaking to an attorney risks waiving attorney-client privilege. Confidentiality is the cornerstone of that privilege, and these systems are third parties.
Finally, these tools aren’t lawyers. They don’t offer legal advice; they generate responses based on patterns in data. Professional codes of conduct do not bind them. Some experts are even exploring ways to make them sentient. But remember: we know who the trustworthy source of life is, and it is not a machine, nor will it ever be. For now, and hopefully for a long time to come, no app owes you a duty of care, and you can’t sue one when it gets things wrong.
In our human rights work—often in places where the rule of law is more aspirational than real—we’ve used advanced technologies to support independent lawyers and activists. But we’ve also seen the same tools weaponized by authoritarian regimes. What once existed only in the dreams of totalitarians is now scaled and seamless. These technologies aren’t neutral; they amplify the intent of those who wield them.
If we thought the Cold War-era surveillance systems of the Eastern Bloc were intrusive, today’s technologies make them look almost primitive. Governments—and bureaucracies of all kinds—have practiced surveillance in one form or another for centuries. But today, it’s not just states collecting information.
The largest gatherers of data are corporations. And the most powerful surveillance tool is the smartphone in your pocket. You’re not just using it—you are the product. Every tap, swipe, and search generates data you are, in theory, voluntarily handing over, allowing others to profile and influence you in ways that are often invisible.
During the Cold War, totalitarian regimes built vast security apparatuses to control their populations. In Romania under Nicolae Ceaușescu, the Securitate maintained such tight control that even typewriter imprints were registered with the state to trace anonymous writings.
In Communist Cuba, since the 1960s and to this day, the security services and the Communist Party monitor individuals from birth to death, sometimes even before. Detailed records are kept on nearly every aspect of a person’s life: education, employment, religious activity, political leanings, family ties, and foreign contacts. Surveillance is not just a tactic; it’s a way of life, woven into the very structure of the state.
Stalin’s Soviet Union used the NKVD, later the KGB, to root out dissent and enforce conformity through mass surveillance and forced confessions. East Germany’s Stasi monitored everything: recording conversations, collecting scent samples from citizens, and recruiting friends and family as informants.
Hungary’s ÁVH and Yugoslavia’s UDBA followed suit. These systems were terrifyingly effective, yet they operated without the digital reach and data-mining capabilities we have today. With artificial intelligence, geolocation, and biometric profiling, the tools of control are no longer confined to authoritarian states. They are global, fast, and increasingly invisible.
Under the National Socialists in Germany, surveillance and control were systematized to a terrifying degree. The Gestapo cultivated a vast network of informants, and even casual remarks could result in arrest, forced confessions, or execution. Dissent wasn’t just dangerous; it was often fatal.
The information collected by the Nazis was so detailed and meticulously cataloged that it enabled the orchestration of the Holocaust, during which millions were murdered. Bureaucrats and police documented every stage, from identification and property seizure to transportation and extermination, with chilling administrative precision. It was genocide by paperwork.
For clients, especially those already burdened with stress, uncertainty, or genuine fear, these modern applications can introduce a potentially damaging layer of noise. Whether the issue is family, business, or something even more personal, people in vulnerable moments are already managing a flood of information and emotion. Turning to an AI tool in those moments may feel like clarity, but it’s often a false sense of control. These tools don’t understand nuance, urgency, or stakes. They don’t know you. And they bear no consequences when they fail you.
That’s why we, as lawyers, must be even more vigilant. In an age when technology is shaping expectations—and sometimes muddying the waters—we must be the steady hands. It’s our responsibility to ensure that client conversations remain protected, that advice is grounded in genuine understanding, and that we help clients navigate not only their legal issues but also the tools they’re tempted to trust.
Diligence today doesn’t just mean knowing the law. The practice of law has never been only about statutes and precedent. It’s about discerning problems, both visible and hidden, and working closely with each client to determine what truly serves their best interests. In an age of powerful new technologies, it also means, more than ever, safeguarding the space where the law meets the human condition.
That said, this is also an exciting time for law, for medicine, for humanity. These tools, if used wisely, can make us sharper, more informed, maybe even more just. They have the potential to help good lawyers become even better ones. But just as a lawyer should never submit a brief drafted by AI without checking every citation and argument, clients and the public shouldn’t mistake outputs for counsel or accuracy.
The key is discernment. These are still early days. In some cases, these tools create more work, not less. But they hold promise. We need to remember what they are: tools. Not stand-ins for wisdom, not replacements for trust, and certainly not substitutes for a conversation held in confidence, human to human.
As we step deeper into this new technological era, we must not forget the lessons of history. Modern surveillance may not always wear a uniform or bear a government seal, but it is no less real. The challenge ahead is not simply to resist these tools, but to use them with awareness, humility, and a sense of accountability. In doing so, we preserve the human space that liberty requires.