We’ve seen the headlines. Companies invest millions in the latest security solutions, tick every box on their compliance auditor's list, and still get breached. So if the current "best practices" aren't making companies truly secure, what are we missing?
Our CEO, Björn Orri Guðmundsson, recently sat down with cybersecurity veteran David Jacoby on our "Hack & Tell" podcast to find out. David has been in the game since the days of dial-up modems and BBSes, and he has a refreshingly blunt take on why our approach to security isn't quite cutting it.
David’s journey didn't start in a university lecture hall. It started with a broken manual. Back in the day, he tried to use American "phreaking" guides to hack Swedish phone systems. They didn't work. That failure sparked a curiosity that grew into a lifelong obsession with hacking.
But David has some thoughts on the term “hacker”. He doesn’t think of it as a dirty word, but also not as a job description. It’s a mindset. It’s about being the person who understands how the machine works better than the person who built it. He describes it as “controlling technology instead of letting it control you.”
He also believes that one of the biggest issues in the industry is the obsession with hacker terminology such as "Red Teaming" and pretending that hackers are either movie villains or vigilantes. David argues we need to grow up and start using vocabulary that actually describes the profession. He suggests we call professional hackers "Digital Inspectors" instead.
Think of it like a fire inspector. They don't just tell you the building burned down. They check if the sprinklers work and if the exit signs are lit. A good security test should be a "control function", which documents what is strong just as much as what is broken. If your security team is just trying to "win," they aren't actually helping you stay safe.
This discussion of control and inspection naturally brings up a major point of failure in modern workplaces: the standard-issue employee computer. Most people overestimate the security of their computers, often assuming they're safer than their mobile devices. But David points out that our computers haven't fundamentally changed in 20 years, whereas mobile devices are always improving. We give "normal" users administrative rights, access to PowerShell, and a file system they don't even know how to use. This is like giving them a Formula 1 car to go get groceries.
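To make the admin-rights point concrete, here is a minimal sketch of a check a company could run on its fleet to see how many "normal" users are driving that Formula 1 car. The helper name `running_with_admin_rights` is hypothetical, and the sketch assumes either a POSIX or Windows host:

```python
import ctypes
import os

def running_with_admin_rights() -> bool:
    """Return True if this process holds the elevated rights
    that most everyday users never actually need."""
    if os.name == "nt":
        # Windows: ask the shell whether the user is an administrator.
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    # POSIX: effective UID 0 means root.
    return os.geteuid() == 0

print(running_with_admin_rights())
```

An inventory script that flags every workstation where this returns `True` for a non-IT user is a small first step toward the least-privilege setup David describes.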
David envisions a future where 90% of employees ditch the wide-open laptop for sandboxed devices like iPads or Chromebooks. They are easier to manage, harder to break, and significantly cut down the "attack surface" that hackers love to exploit.
If a salesperson can do their job on a tablet, why give them a laptop that acts as a gateway to your entire network?
The conversation about devices quickly pivots from hardware to a larger institutional failure, which is compliance. There’s a big difference between checking a box and achieving actual safety. Just because you have an ISO certificate doesn't mean you're secure. Compliance proves you have a process, but it doesn't prove that your "seatbelt" isn't made of paper.
True security requires understanding your specific "threat model." You need to know exactly what you have to lose and how someone might actually take it, regardless of what the regulatory checkbox says.
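A threat model doesn't need to be elaborate to be useful. The sketch below, with entirely hypothetical entries and a deliberately crude impact-times-likelihood score, shows the shape of the exercise: name what you have, who might want it, and how they'd take it, then rank:

```python
# A threat model in its simplest form: what you have, who wants it,
# and how they'd take it. All entries are hypothetical examples.
threats = [
    {"asset": "customer database", "actor": "ransomware crew",
     "path": "phished admin credentials", "impact": 9, "likelihood": 7},
    {"asset": "build pipeline", "actor": "supply-chain attacker",
     "path": "compromised dependency", "impact": 8, "likelihood": 4},
    {"asset": "office Wi-Fi", "actor": "drive-by opportunist",
     "path": "weak WPA2 passphrase", "impact": 3, "likelihood": 5},
]

# Rank by a simple risk score so attention follows the actual threat
# model, not the compliance checklist.
for t in sorted(threats, key=lambda t: t["impact"] * t["likelihood"], reverse=True):
    score = t["impact"] * t["likelihood"]
    print(f'{score:>3}  {t["asset"]}: {t["actor"]} via {t["path"]}')
```

Even a table this small forces the question the checkbox never asks: which of these would actually hurt us most?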
Just as with compliance, much of our security awareness training is also designed to check a box rather than make companies more secure. Phishing simulations are a perfect example of this. They check who clicks a link and then "punishes" the clickers with more training. But it’s an unfair game. If a hacker has enough context, they will trick someone eventually.
Instead of measuring how many people click a link, we should be measuring our processes.
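One way to picture the shift is in the metrics themselves. The sketch below uses a hypothetical phishing-simulation log and computes process-oriented numbers (did anyone report it, and how fast?) alongside the usual click rate:

```python
from datetime import timedelta
from statistics import median

# Hypothetical phishing-simulation log: who received the lure, whether
# they clicked, and how long until they reported it (None = never).
events = [
    {"user": "alice", "clicked": False, "reported_after": timedelta(minutes=4)},
    {"user": "bob",   "clicked": True,  "reported_after": None},
    {"user": "carol", "clicked": True,  "reported_after": timedelta(minutes=12)},
    {"user": "dave",  "clicked": False, "reported_after": None},
]

def process_metrics(events):
    """Measure the process (reporting), not just the people (clicking)."""
    total = len(events)
    click_rate = sum(e["clicked"] for e in events) / total
    reports = [e["reported_after"] for e in events if e["reported_after"]]
    report_rate = len(reports) / total
    median_report = median(reports) if reports else None
    return click_rate, report_rate, median_report

click_rate, report_rate, median_report = process_metrics(events)
print(f"click rate:    {click_rate:.0%}")    # → 50%
print(f"report rate:   {report_rate:.0%}")   # → 50%
print(f"median report: {median_report}")     # → 0:08:00
```

A rising report rate and a shrinking time-to-report tell you the organization is getting healthier, even if someone still clicks.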
Most people update their password by just changing a "1" to a "2" and adding an exclamation mark at the end. It makes us feel safe, but it’s a gift to hackers. David and Björn discussed a world where we stop obsessing over passwords and start focusing on identity instead. How does this look?
When MFA is the default everywhere, the entire "business" of phishing for credentials starts to crumble. We aren't quite there yet, but the goal is to make the "human error" of a leaked password a thing of the past.
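For the curious, the most common second factor (the six-digit code from an authenticator app) is not magic; it's the TOTP scheme from RFC 6238, which can be sketched with nothing but the Python standard library. This is a minimal illustration of the standard algorithm, not anything specific to the podcast:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    # Dynamic truncation: pick 4 bytes at an offset taken from the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: secret "12345678901234567890", T = 59s.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))  # → 94287082
```

Because the code is derived from a shared secret plus the current time, a phished password alone gets the attacker nothing after the 30-second window closes.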
But even with the best locks in the world, a house is only as secure as the spare key you forgot under the mat three years ago. David shared a story about a "Holy Grail" server. This was a high-value target protected by every modern defense imaginable, from strict network gates to MFA.
But he did eventually find an entry point. That was a forgotten, powered-off version of that same server sitting in the virtual environment. Because it was "retired," it wasn't encrypted or monitored.
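The defense against this class of mistake is boring but effective: continuously cross-reference everything that exists against everything your controls cover. A minimal sketch, with hypothetical inventory data and helper name `shadow_assets`:

```python
# Hypothetical inventory: every VM image still present in the virtual
# environment, plus the sets your monitoring and encryption tools cover.
vm_inventory = [
    {"name": "crm-prod",       "state": "running"},
    {"name": "crm-prod-old",   "state": "powered_off"},  # the forgotten copy
    {"name": "build-agent-07", "state": "running"},
]
monitored = {"crm-prod", "build-agent-07"}
encrypted = {"crm-prod", "build-agent-07"}

def shadow_assets(inventory, monitored, encrypted):
    """Flag images that still exist but fell out of every control."""
    return [
        vm["name"]
        for vm in inventory
        if vm["name"] not in monitored or vm["name"] not in encrypted
    ]

print(shadow_assets(vm_inventory, monitored, encrypted))  # → ['crm-prod-old']
```

Anything this check flags is exactly the kind of unencrypted, unmonitored copy David found: retire it properly or bring it back under the controls.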
Because the retired copy sat outside every control, compromising it was straightforward. It’s a classic reminder that hackers don't always break in through the front door; they often find an old opening you forgot to delete.
So, inspired by David's insights, what should companies actually focus on?
Technology alone can't solve a human problem. We need better tools, sure, but we also need better conversations. Security is a journey of staying humble and constantly inspecting the controls we've put in place.
This article summarizes key takeaways from David Jacoby's appearance on the "Hack & Tell" podcast.
Watch the full episode on YouTube:
Or listen on Spotify.