I saw a story pop up this week which made a bunch of headlines and, upon sharing it, also sparked some vigorous debate. It all had to do with a 19-year-old bloke in Canada downloading some publicly accessible documents which, as it later turned out, shouldn't have been publicly accessible. Let's start with this video as it pretty succinctly explains the issue in consumer-friendly terms:
VIDEO: Nova Scotia's government is accusing a 19-year-old of breaching their government website's security ~ Privacy experts disagree.
— Brett Ruskin (@Brett_CBC) April 13, 2018
Oh, and here's how the teen did it: pic.twitter.com/FQ2qXJoP89
So the crux of the matter seems to be that the guy pulled down a bunch of files by enumerating through file names. Allegedly, he didn't realise the publisher had never intended for that data to be public and, as I later put it, this was his mistake:
At face value, this sounds like a pretty innocent mistake. There are plenty of crossroads many of us face where similar mistakes can easily be made. https://t.co/86uaq8IShw
— Troy Hunt (@troyhunt) April 18, 2018
The crossroads I referred to in that tweet reflects the fact that many of us working in this space are often faced with a decision: having identified that data is accessible in this fashion (we've discovered a URL parameter can be modified to pull another resource), do we proceed with accessing more data or stop there? Everyone will likely agree with everything I've written so far, so let's start getting into the more contentious side of things, beginning with the view in defence of the young bloke:
This was public data. Whether it was intended to be public or not does not change the fact that it was published to a location which exposed it to the world without any requirement for authorisation whatsoever. His "crime" was simply to use the technology as it was designed to work. There was a lot of support for this position:
If something can be publicly accessed, then it IS public. Intended or not - that's on the publisher.
— Viesturs Kavacs (@VKMKD) April 18, 2018
The counterargument is that this was not simply a case of the guy following links and landing somewhere the site operator didn't intend people to find; this was parameter tampering. He manipulated the URL such that it exposed resources beyond the ones he organically found by browsing the site, and that is exploiting a known vulnerability. In fact, it was up there in the OWASP Top 10 until last year (when it was merged into "broken access control") and it's referred to as an insecure direct object reference.
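To make the mechanics concrete, here's a minimal sketch of that pattern. The endpoint and document IDs are entirely hypothetical and have nothing to do with the actual Nova Scotia site; the point is simply that the server trusts whatever identifier appears in the URL without checking whether the requester should be allowed to see that record:

```python
# A minimal sketch of an insecure direct object reference - hypothetical
# endpoint and IDs, purely for illustration.
import urllib.request

BASE_URL = "https://example.gov/foi/documents/{doc_id}"  # hypothetical endpoint

def fetch_document(doc_id: int) -> bytes:
    # Whatever integer is substituted into the path comes back, because the
    # server performs no authorisation check on who is asking for it.
    with urllib.request.urlopen(BASE_URL.format(doc_id=doc_id)) as response:
        return response.read()

# "Parameter tampering" is then nothing more than changing the number:
# fetch_document(1234) returns your document, fetch_document(1235) someone else's.
```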
Seeing legal action appear as a result of enumerating through URLs is not unprecedented. In 2011, Patrick Webster identified a weakness in First State Superannuation's web portal which allowed him to access 770k financial records belonging to other customers. The cops subsequently turned up on his doorstep and took his computer equipment away. The previous year, Andrew Auernheimer (AKA "weev") found he could enumerate IDs in AT&T's iPad enrolment API such that he managed to obtain 114k records of other subscribers. He was subsequently charged and found guilty of identity fraud and conspiracy to access a computer without authorisation.
Now, at this stage you may well say "Yeah, but Patrick and weev knew they were exploiting a security weakness whilst the young Canadian guy simply thought he was accessing material that was intended to be public", to which I would wholeheartedly agree. These cases are very different in intent and, assuming we can take the news reports at face value, the charges against him are totally out of line. However, much of the defence I've seen for the guy's actions centred on the premise that if there's no protection on the data (as was the case with Patrick and weev), then it's fair game. That's something I vehemently disagree with.
Last year I did a talk at the local AusCERT conference titled The Responsibility of Disclosure: Playing nice and staying out of prison. I've embedded a video of that below, deep-linked precisely to the point where I talk about the ethics of probing away at direct object reference vulnerabilities, and it's worth watching just a few minutes of it here:
The key takeaway here is that in terms of vulnerabilities, once you "plus 1" an ID in a URL and pull someone else's record, that's it - you're done. You've proven the risk. For example, when I was investigating the vulnerability in Nissan's LEAF a couple of years ago, once I found one other vehicle via an exposed VIN, that was it. I could have pulled hundreds or thousands of other vehicles' data, but to what effect? Some people will argue that you won't be taken seriously unless you make a big impact by pulling a heap of data, but is that worth ending up in the same boat as any of the three guys mentioned above? No, it's not, especially when there are numerous other ways to highlight the vulnerability.
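If you were writing that check up as a proof of concept, it might look something like the sketch below. Again, the endpoint and record IDs are hypothetical assumptions purely for illustration; the important part is that it stops after a single adjacent record confirms the flaw:

```python
# A sketch of the "plus 1" test described above - hypothetical endpoint and
# record IDs, shown only to illustrate where a researcher should stop.
import urllib.error
import urllib.request

BASE_URL = "https://api.example.com/customers/{record_id}"  # hypothetical endpoint

def access_control_broken(my_record_id: int) -> bool:
    # Request the record immediately after your own. If it comes back
    # successfully, the missing authorisation check is proven - there's no
    # need to enumerate hundreds or thousands more records.
    try:
        with urllib.request.urlopen(BASE_URL.format(record_id=my_record_id + 1)) as response:
            return response.status == 200
    except urllib.error.HTTPError:
        return False
```

One successful response is all the evidence a report needs; pulling anything beyond that only increases your exposure without strengthening the case.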
Now, to the question posed in the title, is any of this "hacking"? Frankly, I don't think it matters what term you put on it and you could argue it either way: it was by no means a sophisticated attack and it's something even my 8-year-old son could do, but it did also result in access to material which wasn't intended to be accessible. If I had to take a side, I'd say "no hack" simply because the intent wasn't there, but equally I'd argue that the other two examples I've given could be construed as hacks because the intent was clearly to access data that both parties knew was meant to be protected.
In summary, improperly secured publicly facing data shouldn't be viewed as a free-for-all. There are many cases where those accessing it know damn well it's not intended to be exposed in that fashion and, indeed, there are many precedents of very unpleasant legal consequences as a result. But that doesn't seem to be what this case is about and, assuming there's not some major piece of the story missing from the reporting, the young guy is getting a pretty raw deal. In this case, I think this is a much fairer comment on the whole thing:
"An organisation failed to keep data private, so instead let's just punish the poor sod who happened to discover that" is quite messed up frankly.
— @_neonsea_, April 18, 2018