Over the last few weeks I’ve been working on a piece with 4 Corners titled In Google We Trust, which went to air last night. For international readers (or local folk who just don’t watch the ABC), 4 Corners has been around for decades and has always delivered high quality journalism on thoroughly investigated stories, without the sensationalism we’ve become used to in many other current affairs programs. Seeing it all come together, it was obvious that everyone involved was genuinely interested in the detail of the story and worked extremely hard to get the facts right and avoid anything that could be misconstrued, regardless of how good a story it may have made.
In short, the piece looks at the digital footprints we’re leaving behind us through the use of apps and through external tracking, such as the roads authority, the police and shopping centres covertly monitoring devices. All of these pieces of data help build a picture that can be very revealing about an individual, and whilst the individual pieces of the puzzle may not be too sensitive in and of themselves, it’s when you tie them all together that things get interesting.
Here’s the main story:
And an extended interview with me:
Let me also add to or clarify a few points:
- Terms and conditions hide nasties: We all probably know this already, but terms and conditions are massive and we have absolutely no idea what’s in them before we agree to them (check out this short video I did a couple of years ago on Apple’s). However, we do know that they often make provisions for organisations to do things like share our data with partners or use it in other ways we wouldn’t normally consent to.
- Intercepting traffic: When I captured Alexi’s traffic and found risks in apps, I was simply proxying the data through Fiddler running on my laptop. This required access to his device in order to point it at the proxy, but it’s entirely representative of what an attacker can see from any point on the network they have access to. Later on there’s a more comprehensive capture of the family’s mobile traffic, where the Fiddler root certificate is trusted for the purpose of monitoring where requests are being sent and what data they contain (there’s a short sketch of how this style of interception works after this list).
- Sydney Roosters: There was absolutely no protection of credit card details on the checkout page of their store. The page accepted credit card details and other personally identifiable information over an insecure connection and then posted them to an HTTP address with no encryption whatsoever. It’s very unusual to see such a massive oversight, and indeed PCI DSS explicitly requires that cardholder data be encrypted when sent across open, public networks. The Roosters have since implemented SSL on the site (note that the certificate is valid from September 4, which is when they resolved this). There’s a sketch after this list of just how easily unencrypted traffic like this can be read off the network.
- Westfield: The original post I wrote on the parking situation nearly two years ago is Find my car, find your car, find everybody’s car; the Westfield’s iPhone app privacy smorgasbord. As the article said, they did fix the flaw very quickly after I raised it (actually they simply turned the service off which was the best thing to do under the circumstances).
- Non-personally identifiable information: This term is regularly used to describe data which an organisation believes can’t be tied explicitly back to an individual. Except it almost certainly can. In the story they talk about the MAC address of your personal device, which is a unique identifier that only your device has. It doesn’t say “Troy’s iPhone”, but there are many, many ways to resolve the MAC back to me. It may not be personally identifiable on its own, but make no mistake that it is a piece of personal data that may be tied back to your identity (see the probe request sketch after this list for how easily that identifier is collected).
- Police tracking: The automatic number plate recognition technology is very cool, no doubt, as is the ability to automatically identify vehicles where officers need to, uh, “have a little chat” with the driver. It has enormous relevance in real time where officers need it. However, the relevance beyond this is questionable, particularly when you’re talking about all vehicles it observes being recorded – more than 208 million records of them. The other thing is that they’re storing photos, not just OCR’d plate numbers, so now you have hundreds of millions of pics with geotags stored for years – and the police won’t say why. But don’t worry, they’re “stored in a separate database”…
- Meta data: This loosely defined term is being used more and more frequently to describe “data that describes data”. In the context of information security and personal data, I’ve seen meta data used to describe information such as which phone numbers are calling which other phone numbers, but it’s “meta” because it doesn’t actually contain the contents of the phone call. It’s a slippery slope – regardless of the term you whack on it, the sort of information being described as “meta data” can be extremely revealing, particularly when combined with other data about the individual(s) involved (the last sketch after this list shows just how much a handful of call records can reveal on their own).
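On the interception point: Fiddler isn’t scriptable in Python, but here’s a minimal sketch of exactly the same idea using mitmproxy (a different proxy tool) – once the device is pointed at the proxy and trusts its root certificate, every request’s destination and body are right there to read.

```python
# log_requests.py – run with: mitmproxy -s log_requests.py
# A minimal sketch of proxy-based traffic inspection (mitmproxy rather than Fiddler).
# The device must be pointed at the proxy and trust its root certificate for HTTPS.
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # Show where the request is going...
    print(flow.request.method, flow.request.pretty_url)
    # ...and what data it's carrying (first 200 characters of the body, if any)
    body = flow.request.get_text(strict=False)
    if body:
        print(body[:200])
```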
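On the Roosters point: the practical consequence of posting a checkout form to an HTTP address is that anyone on the network path can read the card data in the clear, no proxy or certificate tricks required. Here’s a rough sketch using the scapy packet library; the form field name is hypothetical and you’d need root/admin privileges on a network you’re actually permitted to monitor.

```python
# A rough sketch of passively reading unencrypted HTTP POST bodies off the wire.
# Requires the scapy package and root/admin privileges; only run on a network you own.
from scapy.all import sniff, Raw

def show_cleartext(pkt):
    payload = bytes(pkt[Raw]).decode("ascii", errors="ignore")
    # "cardnumber" is a hypothetical form field name, purely for illustration
    if "POST" in payload or "cardnumber" in payload.lower():
        print(payload)

# Anything sent to port 80 is plain text: no keys, no certificates, nothing to break
sniff(filter="tcp port 80", lfilter=lambda p: p.haslayer(Raw), prn=show_cleartext)
```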
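On the MAC address point: phones broadcast that identifier in Wi-Fi probe requests whenever the radio is on, and anyone in range can record it – that’s essentially what the shopping centre tracking relies on. A minimal sketch, again with scapy; the interface name is a placeholder and the wireless card needs to be in monitor mode.

```python
# A minimal sketch of logging the MAC addresses nearby devices broadcast in Wi-Fi probe requests.
# Assumes a wireless card in monitor mode; "wlan0mon" is a placeholder interface name.
from scapy.all import sniff, Dot11ProbeReq

seen = set()

def log_mac(pkt):
    mac = pkt.addr2  # the transmitting device's MAC address
    if mac not in seen:
        seen.add(mac)
        print("Device observed:", mac)

sniff(iface="wlan0mon", lfilter=lambda p: p.haslayer(Dot11ProbeReq), prn=log_mac)
```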
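And on meta data: here’s a toy example of why “it’s only metadata” doesn’t mean much. Given nothing but who called whom, when and for how long (the records below are entirely fabricated), a few lines of analysis surface relationships and routines without a single word of conversation content.

```python
# A toy illustration of how call metadata alone (no content) reveals relationships and routines.
# The records below are entirely made up for the example.
from collections import Counter

calls = [
    # (caller, callee, hour of day, duration in seconds)
    ("0411 000 001", "0411 000 002", 23, 1800),
    ("0411 000 001", "0411 000 002", 23, 2400),
    ("0411 000 001", "0411 000 003", 9, 120),
    ("0411 000 001", "0411 000 002", 22, 2100),
]

pairs = Counter((caller, callee) for caller, callee, _, _ in calls)
for (caller, callee), count in pairs.most_common():
    late_night = sum(1 for c in calls if (c[0], c[1]) == (caller, callee) and c[2] >= 22)
    print(f"{caller} -> {callee}: {count} calls, {late_night} of them late at night")
```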