January 27, 2026
Murali Mani has spent most of his life as an engineer, from his Ph.D. on the geometry of motion to his work integrating chipsets into early HDTVs. But about a decade into his career, he shifted from physical mechanics to the invisible architecture of privacy engineering, where the problems were thornier and the stakes often higher.
As a senior privacy officer for a global health care company that worked on clinical trials, and later as a compliance leader for a medical device division and a genetic testing startup, he built protections around health data, some of the most sensitive information there is, with safeguards that still allowed doctors and researchers to unlock insights for new treatments without putting patient privacy at risk.
For Mani, the goal of privacy engineering has always been about building trust and protecting people while enabling innovation. Today he is a vice president for Privacy, AI and Data Responsibility at Mastercard, responsible for protecting the company’s transaction data and other sensitive information as artificial intelligence makes raw data more powerful — and more revealing.
As the world has gone digital, data is no longer simply a by-product of life. It is life: our purchases, our habits, our movements, our identities. As a result, the job of privacy engineers has become more important and exponentially harder. Data moves faster, crosses borders more easily and can be combined in ways that reveal far more than anyone ever intended, because AI systems can detect patterns invisible to humans.
Governments around the world are also tightening regulations on how that information is used and where it can be processed, raising global debates about privacy, security and sovereignty. But protecting people’s information is not just about compliance — it’s also about anticipating how data might be used, misused or misunderstood.
While Mani works with teams throughout the business — with product and software developers, data scientists, AI experts and lawyers — he approaches data privacy as an engineer, finding novel ways to embed privacy into the machinery of Mastercard.
And, as might be expected with an engineer, he uses an automotive metaphor to explain his role: “It’s all about helping the teams who are actually managing the data, talking to them and implementing the controls,” he says. “Imagine the product team is building a vehicle, with the latest engine and technology features, and as a privacy person, I am giving them standard safety features, such as a seat belt and sideview mirror. But sometimes it’s an airbag and antilock braking — so that everyone is better protected.”
After leaving the genetic testing startup, Mani was looking for a new challenge. He was intrigued by Mastercard’s investment in privacy technologies, especially its work on data anonymization through Trūata, a Dublin-based “data trust” that started as a joint venture with IBM and is now an integrated part of Mastercard’s enterprise-wide data resources. (The 20-minute commute to Mastercard’s Purchase, New York, headquarters from his Westchester County home didn’t hurt either.)
Payments data, Mani quickly learned, is extraordinarily powerful, but also unique in its makeup. In 2024, Mastercard processed 159 billion transactions, and that information is anonymized and aggregated when harnessed for data insights. But Mastercard also uses techniques like tokenization — replacing the credit card account number with a unique placeholder — so the card data can’t be traced back to individuals if hacked.
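The substitution idea behind tokenization can be sketched in a few lines. This is purely illustrative, not Mastercard's implementation: real payment tokenization involves format-preserving tokens, hardened vault infrastructure and strict key management, but the core idea is a random placeholder that carries no information about the card number itself.

```python
import secrets


class TokenVault:
    """Illustrative token vault: maps card numbers (PANs) to random tokens.

    Only the vault, kept in a hardened environment, can reverse the
    mapping; a stolen token on its own reveals nothing about the card.
    """

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Reuse the existing token so the same card always maps to one token
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = secrets.token_hex(8)  # random: unrelated to the PAN's digits
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Reversal is only possible with access to the vault itself
        return self._token_to_pan[token]
```

Because the token is generated randomly rather than derived from the card number, an attacker who exfiltrates tokenized transaction data cannot work backward to the underlying accounts.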
These are examples of privacy controls, which come in two broad categories. Technical controls are built into the systems themselves — like de-identifying data before it’s used. Administrative controls rely on people and processes, such as training employees to recognize when something looks wrong. And in some situations, both are required.
Mani’s job is to create and embed these controls so the de-identified data can be accessed quickly and handled securely and in compliance with a host of national and international regulations.
For example, purpose limitation — using data only for the reason for which it was collected — is tricky to implement. Sometimes it’s enforced through training. Increasingly, though, companies are turning to technical controls that prevent data from being used for unauthorized purposes. “You can create data as a product,” Mani says, “and the platform would prevent you from using it for a different purpose.”
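A minimal sketch of what such a platform-level control might look like, with hypothetical purpose names: each data product is registered with the purposes it may serve, and the platform rejects any query made for a purpose outside that list.

```python
class PurposeViolation(Exception):
    """Raised when data is requested for an unapproved purpose."""


class DataProduct:
    """Hypothetical purpose-limited data product.

    The allowed purposes are fixed when the product is created, so the
    enforcement is technical rather than a matter of analyst discipline.
    """

    def __init__(self, records, allowed_purposes):
        self._records = list(records)
        self._allowed = frozenset(allowed_purposes)

    def query(self, purpose: str):
        if purpose not in self._allowed:
            raise PurposeViolation(f"'{purpose}' is not an approved purpose")
        return list(self._records)
```

Under this design, a dataset collected for fraud detection simply cannot be read by, say, a marketing pipeline: the request fails at the platform layer before any data is returned.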
Mastercard engineers are developing software tools for data profiling, scanning massive datasets to determine their origin, sensitivity and characteristics, much like how a blood test reveals what’s happening inside the body. The company also maintains separate identified and de-identified databases, ensuring that analysts can never access both at the same time, a safeguard against re-identification.
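In spirit, a data-profiling pass can be as simple as pattern-matching a column's values against known sensitive formats. The detectors below are assumptions for illustration only; production tools use far richer heuristics, such as Luhn checksum validation for card numbers and trained classifiers.

```python
import re

# Illustrative detectors for two sensitive data types (assumed patterns)
PATTERNS = {
    "card_number": re.compile(r"^\d{13,19}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}


def profile_column(values):
    """Guess a column's sensitivity by pattern-matching its values.

    Returns the best-matching label, or "unclassified" when fewer than
    80% of the values fit any known sensitive pattern.
    """
    hits = {
        name: sum(bool(pattern.match(str(v)) ) for v in values)
        for name, pattern in PATTERNS.items()
    }
    best = max(hits, key=hits.get)
    return best if hits[best] >= len(values) * 0.8 else "unclassified"
```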
Today, Mani explains, one of the biggest challenges in global privacy is data localization — laws that require data derived locally to stay within a particular country’s borders. Data engineers are working on tools to tag data with dozens of attributes that will allow Mastercard to enforce those rules automatically. In the future, data could also be tagged to reflect contractual requirements and customer preferences, allowing, for example, an open banking account holder to grant, revoke or renew time-limited consents for sharing account or transaction data with third parties.
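The tag-and-enforce pattern might be sketched like this, with country codes and policy rules that are purely illustrative: each record carries an origin attribute attached at ingestion, and a routing check refuses to process localized data outside its home country.

```python
from dataclasses import dataclass

# Hypothetical policy: data tagged with these origin countries must be
# processed in-country (the set is illustrative, not a statement of law)
LOCALIZED = {"IN", "RU", "ID"}


@dataclass(frozen=True)
class TaggedRecord:
    payload: dict
    origin_country: str  # one of many attributes a tagging tool might attach


def may_process(record: TaggedRecord, processing_region: str) -> bool:
    """Return True if the record may be processed in the given region."""
    if record.origin_country in LOCALIZED:
        return processing_region == record.origin_country
    return True  # non-localized data may move freely in this sketch
```

Because the check reads only the tag, the same enforcement logic works no matter how many attributes are eventually attached, which is what makes automatic enforcement feasible at scale.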
“Implementing the controls at scale,” he says, “allows us to use analytics at scale.”
Mastercard is constantly exploring other privacy-preserving technologies, such as synthetic data — artificially generated datasets that mimic real data without any connection to underlying customer information. Synthetic data is useful for demos, testing and evaluating third-party tools, though Mani cautions that AI modelers prefer real-world training data.
So-called clean rooms allow Mastercard and its partners to combine data temporarily, run analytics and then delete the data afterward. Multiparty computation techniques let companies derive insights from these combined datasets without sharing the underlying information with other partners.
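One classic multiparty computation building block is additive secret sharing, sketched here with small integers as a stand-in for real inputs: each party splits its value into random shares that individually reveal nothing, and only the combined total is ever reconstructed. This is a textbook illustration of the principle, not a description of any particular partner deployment.

```python
import random

MODULUS = 2**61 - 1  # arithmetic is done modulo a large prime


def share(value: int, n_parties: int) -> list[int]:
    """Split a value into n additive shares that sum to it mod MODULUS.

    Any n-1 shares together are uniformly random, so no subset short of
    all of them learns anything about the original value.
    """
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares


def secure_sum(party_values: list[int]) -> int:
    """Compute the sum of every party's input without revealing any input.

    Each party distributes one share to every participant; each participant
    publishes only the sum of the shares it holds, and adding those local
    totals recovers the global sum.
    """
    n = len(party_values)
    all_shares = [share(v, n) for v in party_values]
    # Participant i holds the i-th share from every party
    local_totals = [
        sum(all_shares[j][i] for j in range(n)) % MODULUS for i in range(n)
    ]
    return sum(local_totals) % MODULUS
```

Real deployments compute far richer statistics than a sum, but the same property holds: the insight emerges from the combined data while each partner's raw inputs stay private.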
If privacy was already complex, AI is turning it into three-dimensional chess. Traditional analytics might categorize cardholders by how often they use their cards. AI can detect intricate behavioral patterns, signals that humans would never think to look for. That power raises the risk of re-identification and what Mani calls the “creepiness factor.”
“AI could find all kinds of complicated signals that we don’t even know about,” he says, citing a case where a major retailer’s data analytics software was able to infer that women were pregnant and estimate their due dates based on seemingly innocuous purchases like unscented lotion.
Mastercard’s AI teams review every use case and apply strict controls to those that are approved. High-risk applications are stopped before they reach production. And built into every algorithm are transparency (showing how an AI system works and what data it uses), observability (monitoring how it behaves to find and fix problems) and tools to detect bias, so people can trust its outputs.
For all the complexity of building privacy controls at global scale, what Mani loves about his job is the people around him. “Mostly that I’m working with these brilliant people and I’m learning something new every day,” he says. “And I’m able to contribute in that environment and create new ideas and also help to protect privacy at the same time.”
On the eve of Data Privacy Day, January 28, Mani’s advice for anyone worried about their own digital trail is far simpler than the systems he designs: “Keep a low profile,” he says, including on social media and search engines, which minimizes the breadcrumbs you leave online. Turn off the cookies you don’t need; don’t broadcast your whereabouts; reduce the surface area of what the world can learn about you.
It’s the same philosophy that underpins Mastercard’s approach to data: Trust is built by embedding security, integrity and accountability into the mechanics of its network — trust that is engineered and also earned.