Hiring managers expect privacy experts to be jacks of all trades: they must have legal experience, technical expertise, and knowledge of frameworks and controls in addition to privacy-specific knowledge.1 As technology and enterprises have evolved, ethics has also increasingly fallen within the purview of privacy teams. The National Institute of Standards and Technology (NIST) Privacy Workforce Taxonomy, which was recently open for initial public draft, contains many ethics- and fairness-related tasks, knowledge, and skills.2 The privacy profession is evolving to include multiple components of ethics, and privacy professionals must incorporate ethics-related considerations into their daily work.
Why Does Ethics Matter?
New technologies, such as fitness trackers and voice assistants, can make life easier by providing health-related insights and anticipating the user’s needs. These devices have become commonplace in many homes: In the United States, households have an average of 21 connected devices.3 However, as these technologies become ubiquitous, the ethical creation and application of these devices becomes crucial. For example, health trackers need to collect and process health-related information, and voice assistants must continuously listen for user commands. Surveillance issues and potential harm from these devices increase exponentially when they become a fixture in people’s lives.
Additionally, artificial intelligence (AI) has the potential for application in life-or-death situations; it can help doctors review diagnostic tests and even identify potential drug interactions.4 If devices leveraging AI are trained on biased or poor-quality data, the outputs are likely to be biased, which could potentially lead to lethal consequences. Developing AI models with consideration for ethics is paramount given the current and potential applications of this technology.
In the context of technology, ethics includes (but is not limited to):
- Responsible technology development and deployment—This includes accommodating privacy by design and security by design in development, and forecasting potential harmful use of the technology once deployed. Guardrails to prevent misuse should be incorporated.
- Bias and fairness—Bias can manifest in myriad ways: data bias, algorithmic bias, and sampling bias are just a few types. Bias should be addressed across all projects, but projects involving automated decision making warrant particular attention.
- Data quality—High-quality data is paramount to developing ethical technology. Data regarding user needs, preferences, and ethical standards can inform future development efforts. AI systems trained on poor or incomplete datasets will lead to bad, incomplete outputs.
- Accessibility—User abilities can shape how people interact with technology. Designing in an accessible manner ensures that technology can be adopted equitably across a variety of abilities.
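To make the data-quality point above concrete, the following Python sketch shows the kind of basic checks (missing values, duplicate records) that should precede training or analysis. The field names and records are purely illustrative assumptions, not drawn from any real system.

```python
# Minimal sketch of basic data-quality checks on a hypothetical dataset.
# Field names and records below are illustrative assumptions.

records = [
    {"user_id": 1, "age": 34, "consent": True},
    {"user_id": 2, "age": None, "consent": True},   # missing value
    {"user_id": 2, "age": None, "consent": True},   # exact duplicate
    {"user_id": 3, "age": 29, "consent": False},
]

def quality_report(rows):
    """Count rows containing missing (None) fields and exact duplicate rows."""
    missing = sum(1 for r in rows if any(v is None for v in r.values()))
    seen, duplicates = set(), 0
    for r in rows:
        key = tuple(sorted(r.items()))  # hashable fingerprint of the row
        duplicates += key in seen
        seen.add(key)
    return {"rows": len(rows), "missing": missing, "duplicates": duplicates}

print(quality_report(records))
# → {'rows': 4, 'missing': 2, 'duplicates': 1}
```

Checks such as these are deliberately simple; real pipelines would also validate ranges, types, and representativeness, but even this level of scrutiny catches defects that would otherwise propagate into model outputs.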
Why Privacy Professionals?
Privacy professionals already largely deal with ethics-related issues. Privacy goals and objectives often involve upholding data subject rights. Privacy by design, the integration of privacy into the entire engineering process, relies on respecting the end user. While privacy can be a competitive advantage and support enterprise objectives, the core intent of privacy is less about revenue and more about protecting individual rights.
Privacy professionals are likely familiar with privacy-related laws and regulations. Many enterprises understand responsible technology as being fully compliant with existing regulations.5 While laws and regulations may be a component of a responsible technology program, they should not be the be-all and end-all objective: technology evolves faster than laws and regulations, and only meeting compliance mandates does not ensure ethical behavior.6 Regardless, privacy professionals’ experience with laws and regulations provides them with a solid foundation to ensure ethical outcomes.
Concerns
While privacy professionals may be more qualified to address ethical issues than other staff, there are some reasons to be concerned with this delegation of responsibilities. Forty-eight percent of privacy professionals in ISACA’s State of Privacy 2025 survey believe that their privacy budget will decrease in the next 12 months, and 43% believe their privacy budget is currently underfunded.7 Potential cuts to privacy budgets could mean that ancillary work, such as ethics, could take a backseat to compliance efforts. The ultimate impact could mean that enterprises operate without consideration for ethics simply due to limited privacy resources.
Additionally, many privacy professionals do not have expertise in all the disciplines that ethics encompasses. For example, the NIST Privacy Workforce Taxonomy draft includes a task about evaluating “AI systems in regards to disability inclusion, including consideration of disability status in bias testing, and discriminatory screen out processes that may arise from non-inclusive design or deployment decisions.”8 While accessibility in design is an important issue, privacy professionals may not have expertise in this field. Concerns about inclusion are often the purview of user interface/user experience (UI/UX) staff. Resources such as the Web Accessibility Initiative (WAI)9 can provide privacy professionals with a good starting point for accessibility evaluations, but additional study and research may be needed for more in-depth testing.
Addressing issues of bias may require knowledge of data science and statistics. Some privacy professionals may have this statistical knowledge, but theoretical knowledge about bias may not suffice when trying to identify and remediate biases.
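As a minimal illustration of the statistical side of this work, the Python sketch below computes a disparate-impact ratio, a common rule-of-thumb fairness check sometimes called the four-fifths rule. The applicant groups, outcomes, and 0.8 threshold are illustrative assumptions rather than anything prescribed by the taxonomy draft.

```python
# Minimal sketch: disparate-impact ("four-fifths rule") check.
# Groups, outcomes, and the 0.8 threshold are illustrative assumptions.

def selection_rate(outcomes):
    """Fraction of positive outcomes (1 = selected) in a group."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group selection rate to the higher one."""
    rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical automated-screening outcomes for two applicant groups
group_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 70% selected
group_b = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]  # 30% selected

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate-impact ratio: {ratio:.2f}")
if ratio < 0.8:  # common four-fifths rule of thumb
    print("Potential adverse impact; further review warranted")
```

A check like this is only a starting point: knowing why a ratio falls below the threshold, and how to remediate the underlying data or model, is where the deeper data science expertise the article describes becomes necessary.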
Privacy professionals may also be expected to have at least a foundational understanding of how the technology works. The NIST Privacy Workforce Taxonomy draft has multiple tasks addressing bias in AI systems, which implies that privacy professionals understand bias and the workings of AI. Note that bias and ethical technology are not limited to AI: ethics should play a role in the entire technology landscape.
Anyone tasked with the ethical development and deployment of technology must understand how the technology operates and how that operation might impact human rights.
Preparing for This Evolution
Privacy professionals need to be compliance experts, technical experts, and ethics experts. While education and hands-on experience are valuable, privacy professionals should not neglect the importance of skills typically considered to be “soft skills.” Agility, strong communication skills, and critical thinking abilities are prerequisites for anyone hoping to work on responsible and ethical technology development. Given that much of the risk around emerging technologies is unknown, privacy professionals must think creatively to anticipate and address the potential harm that these technologies could cause.
Privacy professionals cannot address ethical technology on their own; thus, collaboration will be vital. UI/UX colleagues, lawyers, and data scientists are just a few roles that can support the development of ethical technology. Leveraging the expertise of other departments is vital to ensure that ethical considerations in development and deployment are considered from multiple angles. Additionally, hiring managers may want to consider hiring privacy professionals who have less-traditional backgrounds: business ethics, design experience, and creativity may become more highly sought after in light of a growing emphasis on ethics.
As with all digital trust professions, curiosity and a desire to learn are vital for privacy professionals. Understanding how emerging technology works can shed light on how to develop it responsibly, while learning about ethics can ensure that new initiatives do not violate human rights and aim to improve the human condition; ideally, privacy professionals will understand both. Dedicating even just an hour a week to proactively seek out articles, books, and other resources about ethics in technology can help privacy professionals develop safe, responsible, and fair tools and systems.
Endnotes
1 ISACA, State of Privacy 2025, January 2025
2 National Institute of Standards and Technology (NIST) Privacy Workforce Public Working Group, NIST Privacy Workforce Taxonomy Initial Public Draft, 21 November 2024
3 Blinder, C.; Velasquez, V.; “Average Number of Smart Devices in a Home 2025,” Consumer Affairs, 23 April 2024
4 Kazi, S.; Beder, C.; “Safely and Responsibly Using Emerging Health Technology,” ISACA, 25 October 2024
5 MIT Technology Review Insights, “The State of Responsible Technology,” 11 January 2023
6 Kazi, S.; “Privacy in Practice: Ethics vs. Compliance,” ISACA Journal, vol. 1, 2025
7 ISACA, State of Privacy 2025
8 NIST Privacy Workforce Public Working Group, NIST Privacy Workforce Taxonomy Initial Public Draft
9 Web Accessibility Initiative (WAI)
Safia Kazi, AIGP, CIPT
Is a privacy professional practices principal at ISACA. In this role, she focuses on the development of ISACA’s privacy-related resources, including books, white papers, and review manuals. Kazi has worked at ISACA for more than a decade, previously working on the ISACA Journal and developing the award-winning ISACA Podcast.