Quick take
Stop treating security as a gate at the end of the pipeline. Embed it into how your team thinks, designs, and ships. This requires specific policies, visible leadership, and the willingness to slow down a release when the answer to “what happens if this gets compromised?” is “I don’t know.”
Security culture isn’t a project
I’ve watched teams try to buy their way into security. They purchase a scanner, run a penetration test once a year, and call it done. Then a junior engineer hardcodes an API key, pushes it to a public repo, and suddenly the scanner doesn’t matter.
Security culture is the thing that prevents that push in the first place. Not a tool. A habit.
At the fintech startup, we handle financial data. Market signals, user portfolios, payment information. There’s no version of “we’ll fix it later” that regulators or users will accept. Security had to be foundational from the start, not bolted on after the first scare.
What NATO Cyber Defense taught me
The single biggest lesson I took from NATO Cyber Defense exercises wasn’t technical. It was organizational. The teams that performed well weren’t the ones with the best tools. They were the ones where every person understood the threat model and acted accordingly without waiting for permission.
That stuck with me. Security at scale is a culture problem, not an engineering problem.
The policies I actually enforce
Theory is cheap. Here are the specific rules I’ve implemented across teams.
No secrets in code. Ever. We use environment variables and a secrets manager. The CI pipeline fails if it detects anything that looks like a key or credential in the codebase. This is automated, not optional, and not overridable without a written justification that I personally review.
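The shape of that CI check can be sketched in a few lines. This is an illustration, not our actual pipeline: the patterns below are deliberately simple examples (real deployments use a dedicated scanner with a much larger ruleset), and the function names are mine.

```python
import re

# Illustrative pre-merge secret check. Patterns are examples only --
# a real scanner ships hundreds of rules and entropy heuristics.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # shape of an AWS access key ID
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    re.compile(r"(?i)(api[_-]?key|secret|token)\s*[:=]\s*['\"][A-Za-z0-9/+=_-]{16,}['\"]"),
]

def scan_text(text: str) -> list[str]:
    """Return the patterns that matched; an empty list means the text looks clean."""
    return [p.pattern for p in SECRET_PATTERNS if p.search(text)]

def scan_files(paths: list[str]) -> int:
    """Exit code for the CI job: non-zero if any file trips a pattern."""
    failed = False
    for path in paths:
        with open(path, encoding="utf-8", errors="ignore") as f:
            if scan_text(f.read()):
                failed = True
                print(f"{path}: possible hardcoded credential")
    return 1 if failed else 0
```

Wired into CI, the non-zero exit fails the build before the commit can merge, which is what makes the rule automated rather than aspirational.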
Every pull request gets a security question. Not a full threat model. Just one question: “What is the worst thing that happens if this code is exploited?” If the author can’t answer it, the PR doesn’t merge. This forces engineers to think about attack surface as part of their daily work, not as a separate exercise.
Least privilege by default. New services start with zero permissions and add only what they need. New employees get read access to the repositories they work on and nothing else. Escalation requires a request and a reason. I review access quarterly and revoke anything that isn’t actively justified.
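The quarterly review can be largely mechanical. A sketch of the idea, with invented field names rather than any real IAM API: pull every grant with the date it was last exercised, and flag anything idle past the review window for revocation.

```python
from datetime import date, timedelta

# Hypothetical access-review sweep. "grants" and its fields are
# illustrative; the real data would come from your IAM system.
REVIEW_WINDOW = timedelta(days=90)

def stale_grants(grants: list[dict], today: date) -> list[dict]:
    """Return grants that have not been used within the review window."""
    return [g for g in grants if today - g["last_used"] > REVIEW_WINDOW]

grants = [
    {"user": "alice", "resource": "payments-repo", "last_used": date(2024, 5, 1)},
    {"user": "bob",   "resource": "signals-db",    "last_used": date(2024, 1, 10)},
]
for g in stale_grants(grants, today=date(2024, 6, 1)):
    print(f"revoke candidate: {g['user']} -> {g['resource']}")
```

The human part of the review is deciding whether a flagged grant is still justified; the machine part is making sure nothing escapes the question.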
Dependency updates aren’t optional. We track dependencies weekly. Known vulnerabilities get patched within 48 hours for critical severity, one week for high. This is a policy, not a suggestion. I’ve delayed feature work to meet these windows.
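The patch windows are simple enough to encode directly, which keeps them out of the realm of debate. A minimal sketch, assuming severity labels of "critical" and "high" as in the policy above:

```python
from datetime import datetime, timedelta

# The patch-window policy as data: critical within 48 hours,
# high within one week. Anything else is handled case by case.
PATCH_WINDOWS = {
    "critical": timedelta(hours=48),
    "high": timedelta(days=7),
}

def patch_deadline(reported_at: datetime, severity: str):
    """Deadline for patching a vulnerability, or None if no hard window applies."""
    window = PATCH_WINDOWS.get(severity.lower())
    return reported_at + window if window else None
```

Computing the deadline when the report lands, rather than when someone gets around to triage, is the point: the clock starts immediately.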
Incident response is rehearsed. We run a tabletop exercise every quarter. Not a checkbox drill. A real scenario where I throw a curveball halfway through to see how the team adapts. The ones who have done this three or four times respond to real incidents with calm instead of panic.
The hard part is consistency
Any team can write a security policy document. The hard part is enforcing it on the days when you’re behind on a deadline and the shortcut is right there.
I’ve blocked releases. I’ve told product managers that a feature would ship a week late because the authentication flow wasn’t reviewed. Those conversations are uncomfortable. They are also the moments that define whether your culture is real or performative.
Engineers watch what leadership does, not what leadership says. If the CTO merges a PR that skips the security review because “we need to ship,” every engineer on the team learns that security is negotiable. It takes one exception to undo months of habit-building.
Security champions scale your coverage
I can’t review every line of code. No security team can. The answer isn’t to hire more security people. It’s to make every engineer a little bit dangerous.
At the fintech startup, I designated one engineer on each team as a security champion. Not a full-time role. Maybe 10-15% of their time. They attend a monthly session where we review recent vulnerabilities in our stack, discuss attack patterns relevant to fintech, and update our threat model.
These champions become the first line of defense in code reviews. They catch the obvious issues before they reach me, and they raise the questions that nobody else on the team would think to ask.
The key is that being a security champion is respected. It counts in performance reviews. It isn’t extra work on top of their real job. It’s part of their real job.
Make the secure path the easy path
If doing the right thing is harder than doing the wrong thing, people will do the wrong thing. This isn’t a moral failing. It’s human behavior.
We built internal libraries that handle authentication, input validation, and encryption correctly. When an engineer needs to call an external API, they use our wrapper that handles TLS verification, credential injection, and audit logging automatically. The secure path is also the path with less code to write.
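To make the idea concrete, here is a minimal sketch of what such a wrapper looks like. The real internal library isn’t public, so every name here (the functions, the `EXTERNAL_API_TOKEN` variable) is illustrative. What matters is the shape: TLS verification stays on, the credential is injected from the environment, and every outbound call is audit-logged.

```python
import json
import logging
import os
import urllib.request

# All external calls funnel through this module, so the secure
# behaviors below apply everywhere without any per-call effort.
audit_log = logging.getLogger("external_api_audit")

def build_request(url: str, payload: dict, token: str) -> urllib.request.Request:
    """Construct a POST request with the bearer credential injected."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def call_external_api(url: str, payload: dict) -> bytes:
    """POST to an external API; the caller never touches the credential."""
    token = os.environ["EXTERNAL_API_TOKEN"]  # from the secrets manager, never the codebase
    req = build_request(url, payload, token)
    audit_log.info("external call: POST %s", url)
    # urlopen verifies TLS certificates by default, and the wrapper
    # deliberately exposes no switch to turn that off.
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()
```

Note what the caller’s code looks like: one function call, no credential handling, no TLS configuration. The secure path really is the path with less code to write.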
We templated our infrastructure so that new services deploy with network isolation, encrypted storage, and logging enabled by default. An engineer has to go out of their way to deploy something insecure. That friction is intentional.
Incidents are data, not blame
When something goes wrong, and it will, the response determines whether your culture strengthens or collapses.
We run blameless postmortems. Not because blame feels bad, but because blame destroys information flow. If engineers fear punishment, they hide mistakes. Hidden mistakes compound. The breach that takes down a company is rarely the first failure. It’s the tenth failure that nobody reported because the first person who reported one got burned.
Every postmortem produces exactly two outputs: a timeline of what happened, and a list of specific changes to prevent recurrence. Not “be more careful.” Specific changes. A new automated check. A revised permission. A policy update.
Security is a daily decision
Security culture is discipline. It’s the same decision made correctly a thousand times, even when it’s inconvenient. Especially when it’s inconvenient.
You don’t build it with a training video or an annual audit. You build it by making security part of every design discussion, every code review, every deployment decision. You build it by enforcing the standards you set, including on yourself.
Discipline over heroics. Every time.