Women working in cybersecurity and GRC often see the industry from a different vantage point, and from there a pattern becomes clear: many of the biggest risks organizations face today are not technical at all.
They are structural. Cultural. Embedded in governance frameworks that were never designed around how real people think, work, and make decisions.
This International Women’s Day, under the IWD 2026 theme of Give to Gain, we asked five women across the cybersecurity and GRC landscape a simple question: What knowledge are you choosing to give forward? What lessons, hard truths, or perspectives do you wish someone had shared earlier in your career?
Their answers reflect years of experience navigating security programs, compliance expectations, and organizational realities. But more importantly, they reveal something larger about where the field is heading.
The insights they share cut through the noise. They challenge assumptions the industry has quietly accepted for years and point toward a more thoughtful, human-centered approach to governance, risk, and compliance.

As Mariana Padilla shares, that instinct to shrink, to become palatable, isn’t just a personal struggle. It’s also a professional one with real consequences for the field. When the people who see things differently stay quiet, the frameworks they might have challenged stay unchallenged. And in GRC, unchallenged assumptions have a habit of becoming systemic blind spots.
The biggest risk you’re not measuring
Many GRC programs still treat human behavior as a variable to control. Train employees. Run phishing simulations. Track click rates. Repeat.
Yet the same problems keep resurfacing. Audit fatigue grows. Compliance starts feeling like paperwork. Security culture struggles to take root. These outcomes are often blamed on employees, but the real issue is structural. Most governance frameworks were not designed around how people actually think, process information, or react under pressure.
At the same time, AI is accelerating the pace and scale of modern threats, yet many organizations are still trying to close this human gap with the same compliance approaches that fell short before.
Lisa Ventura, CEO and Founder of the AI and Cyber Security Association, captures the core issue when she says the biggest risk ahead is “the continued failure to design GRC frameworks around how human beings actually think, feel, and process information.”
Her perspective highlights an important shift. Compliance cultures work when people feel safe to speak up, question decisions, and report problems early. When employees feel psychological safety, they are far more likely to surface risks before they escalate.

Designing compliance around how people actually work, communicate, and respond under pressure is not a soft idea. It is a practical way to build compliance environments that genuinely hold up.
The future of compliance is not simply more controls. It is organizations where doing the right thing feels possible for everyone who works within them.
Cut the noise. Prioritise what actually matters.
One of the most persistent failure modes in cybersecurity is tool-first thinking: the belief that a new platform, a new automation layer, a new detection capability will solve what is fundamentally a judgment problem.
The practitioners who cut through this are the ones who have learned to distinguish signal from noise: to ask not “what does this tool surface?” but “what are our highest-risk parameters, and are we covered there?” That’s a harder question, and a more honest one, especially when talent is scarce and budgets are constrained.

This connects directly to what it means to view security through a business lens by treating resource constraints not as obstacles to good security, but as the conditions within which good security decisions actually have to be made. Technology, in that frame, becomes an enabler of sharper judgment rather than a substitute for it.

The knowledge that gets passed forward through mentorship and sponsorship isn’t usually a technical skill. It’s the ability to reason about a problem in a way that holds up when the commercial pressure is high, the talent pool is thin, and the vendor landscape is louder than ever.
Governance as the civilisational response
There is a tendency to treat GRC as compliance overhead, the paperwork that sits between an organisation and its actual work. But that framing misses something important about the moment the industry is operating in.
AI at scale, financial systems moving at machine speed, surveillance infrastructure expanding faster than the legal frameworks designed to govern it. The stakes are shifting, and fast.
Governance, in this context, isn’t administrative. It’s the mechanism through which organisations decide what kind of entity they’re going to be.
It’s where values become controls, and where accountability is either built in or quietly discarded.
As Michelle Finneran Dennedy of Abaxx Technologies says, “GRC professionals didn’t choose the arena, but we’re in it now: AI technology at scale, money at speed, laws in flux, accountability under pressure, surveillance everywhere, politics on fire.”

That framing, governance as stepping into the arena, is a significant recasting of what GRC professionals actually do. Not compliance overhead. Not box-ticking. A deliberate, values-driven intervention into how systems of consequence are built and run.
It demands practitioners who are willing to hold the line even when the commercial pressure runs the other direction.
What giving forward actually looks like
The IWD 2026 theme of Give to Gain has an obvious individual dimension of sharing knowledge, mentoring the next practitioner, and opening doors. But the five perspectives in this piece point to something more structural: the industry itself is better when the people who see its blind spots say so out loud.
Giving forward, in this field, looks like naming the failure in GRC design that everyone quietly knows but few put in writing. It looks like calling out tool sprawl when the harder question is about prioritisation.
It looks like resisting tool-first thinking at the moment when it would be easiest to defer to the platform. It looks like refusing to shrink and choosing to create your own opportunities when none are offered. And it looks like holding governance to a higher standard, not because of regulatory obligation, but because the stakes genuinely demand it.
Each of these women is passing something forward that their field needs more of. The gain is not just theirs. It belongs to everyone who comes after them.