Media contact

Richard Buckland
School of Computer Science and Engineering
02 9385 4063

When it comes to computer security, there’s no silver bullet. There’s no way to be certain that a system can never be hacked. In fact, at current rates of growth anyone born today will – at some point by the age of 30 – have been hacked, scammed, had their identity stolen, or their smartphone (or future equivalent) compromised. Cybercrime affects everybody.

The attack on the Australian Bureau of Statistics’ Census website appears to have been a simple Denial of Service (DoS) attack; such attacks are not about extracting data from a site, but are designed to overwhelm it with too much traffic in the hope of rendering it non-responsive. Typically, the attacker uses thousands of previously hacked computers which – unbeknownst to their owners – bombard the target website with data. Like a flood of hoax calls to a talkback show, it becomes hard for legitimate callers to get through.

There’s no shame in being the victim of a DoS attack – it has happened to the most secure websites. What is concerning is not being prepared for one, especially for such a tempting target as, say, a national online census about which privacy concerns had been voiced for months before the day arrived.

The ABS are superb statisticians who are highly regarded worldwide. They’re not – and should not be expected to be – cyber security engineers. They were unlucky: if their collection site hadn’t been attacked in that particular way at that particular time, it’s likely no-one would have realised how flawed their security design was. Typically, no-one notices security risk because there’s no consequence unless you are unlucky and weaknesses get discovered in a public way. As the business magnate Warren Buffett has said, “Only when the tide goes out do you discover who’s been swimming naked.”

Yes, they should have had proper filtering in place and multiple back-up systems on standby, these should have been properly stress-tested, and a whole range of other technical measures could have been taken. But that’s not the problem, it’s just a symptom. The problem is that someone signed it off saying, “This is secure. Don’t worry. This is fine. Trust us”. Demonstrably, it wasn’t.

What’s needed is not for heads to roll at the ABS, but for everyone in government and industry – not just the ABS – to rethink computer security. We need more openness, public scrutiny and discussion, rather than “Trust us” assurances that are unrealistic and build a false sense of security. That’s more like trying to sell us something than trying to address a legitimate security concern.

Hackers were once communities of young people who enjoyed breaking into systems to show off or thumb their noses at authority. That changed in the early 2000s, when banking began to go online. Once money was involved, criminal cyber gangs soon arose. Whereas before you could only be robbed by criminals in your city or neighbourhood, now anyone in the world can rob you.

So much of our lives is online now: we can shop, bank, video call anywhere in the world, watch movies and download music. It’s incredibly convenient, but it also makes us more vulnerable. There’s no way to have one without the other: the only way to be totally safe from cybercrime is not to be online. And that’s unrealistic.

There’s no magic technical solution that will forever beat cybercrime or hacking. Computers and the systems they form are so complex, with many millions of lines of code and places where things can go wrong, that they are simply too complex to make foolproof, let alone bug-proof. There will always be vulnerabilities to exploit; as each is found and closed off, others will be discovered.

Instead of thinking of cybersecurity as experts building walls to keep unconcerned citizens safe, we need to think of it like living with cars. We can design safety features that make cars safer, reduce fatalities and minimise traffic accidents, but we will never eliminate all accidents. We citizens, as pedestrians, have also learned to be aware of cars: when we go out into the street, we are mindful of cars and act defensively around them.

That’s what we need to do with cyber security. We shouldn’t just rely on our engineers to build the safest cars, or our bureaucrats to create the best safety standards – yes, that reduces risk, but doesn’t eliminate it. We too need to teach ourselves and our families a mindset that has us vigilant when we’re online. Just as we teach our children how to be mindful pedestrians, to understand cars and act defensively, so too we need to teach them to be conscious and informed about life online.

Most importantly, government and industry need to abandon the culture of “Trust us” secrecy that surrounds cyber security. In the United States, it is mandatory to report cyber security breaches. Where once U.S. companies were hacked and hid it, now they have to disclose it; this has led to a major improvement in cyber security across the board, to the extent that the U.S. is now a leader.

We need the same openness laws in Australia. Because, while cyber security is costly, the risk of being hacked is invisible – until you get hacked. If everyone is required to be open about cyber security incidents, we’ll all be a lot safer in the end.

Richard Buckland is Associate Professor in Computer Security, Cybercrime and Cyberterror at UNSW, and a Board Director of the Australian Computer Society.

This opinion piece was first published in The Sydney Morning Herald.