Security by Obscurity in the Age of AI

My headline is borrowed from my colleague Andrew Welch, the master of words.

In the digital age, data is the new currency, and how we manage, protect, and leverage it can determine the success or failure of organisations. But what if I told you that a long-standing practice in data protection, known as 'Security by Obscurity,' is more like hiding a spare key under a doormat than installing a lock? Let's delve into this concept and its implications.

1. The Era of Obscurity
For years, companies have amassed vast amounts of data, much of it stored away in seldom-accessed databases or in formats not readily usable or queryable by the average person. This practice of keeping data out of sight has historically been considered a form of security. After all, if critical information is not where you expect to find it, or not in an immediately usable state, it is less likely to be seen by people it was not intended for.

2. The Paradigm Shift with AI
Artificial intelligence (AI) and large language models (LLMs) have dramatically shifted the landscape. These technologies can sift through large, unstructured datasets, interpret them, and make the information accessible and useful. What was once protected by obscurity can suddenly become transparent, searchable, and analysable. This capability exposes a significant flaw in the security-by-obscurity strategy: once obscure data becomes easily accessible, it can also become a liability.

3. The Double-Edged Sword
This newfound accessibility is a double-edged sword. On one hand, businesses can leverage AI to gain insights from old datasets, potentially driving innovation and improving decision-making. On the other, if sensitive information is merely obscured rather than adequately secured, it becomes low-hanging fruit for anyone using AI tools to find and exploit it.

4. AI as the Great Revealer
AI doesn’t just use data; it reveals connections and patterns that are not obvious to human analysts. This means that information previously thought to be safely obscured might now be connected with other data points, leading to unintended disclosures. For example, AI can link disparate pieces of anonymised data to de-anonymise individuals, leading to privacy breaches. There is a massive difference between PI data and PII data (but that is for another post).
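The linkage risk above doesn't even need sophisticated AI; a simple join on shared quasi-identifiers is enough to re-identify people. A minimal sketch, using entirely hypothetical records and field names:

```python
# Linkage attack sketch: a medical dataset with names stripped, and a
# public register with names. Each looks harmless alone; joined on the
# shared quasi-identifiers, the diagnoses become attributable to people.
# All names, postcodes, and records here are made up for illustration.

anonymised_medical = [
    {"postcode": "SW1A 1AA", "birth_year": 1975, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "M1 1AE", "birth_year": 1982, "sex": "M", "diagnosis": "diabetes"},
]

public_register = [
    {"name": "Jane Doe", "postcode": "SW1A 1AA", "birth_year": 1975, "sex": "F"},
    {"name": "John Roe", "postcode": "M1 1AE", "birth_year": 1982, "sex": "M"},
]

def link(records, register):
    """Join the two datasets on the quasi-identifiers they share."""
    keys = ("postcode", "birth_year", "sex")
    matches = []
    for r in records:
        for p in register:
            if all(r[k] == p[k] for k in keys):
                matches.append({"name": p["name"], "diagnosis": r["diagnosis"]})
    return matches

for match in link(anonymised_medical, public_register):
    print(match["name"], "->", match["diagnosis"])
```

The point is that "anonymised" is only as strong as the rarest combination of attributes left in the data; what AI changes is the scale and ease at which such joins can be discovered across messy, unstructured sources.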

5. Moving Beyond Obscurity
The lesson here is clear: security by obscurity is insufficient in the age of AI. Organisations must adopt robust security measures that do not rely solely on data obscurity. This includes employing encryption, rigorous access controls, and ongoing data integrity checks. Moreover, transparent data governance policies are needed to protect data and regulate how AI interacts with it. 👇
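To make the "access controls" point concrete: instead of trusting that nobody will find a field, you deny it explicitly. A minimal sketch of role-based field filtering, where the roles, fields, and policy are all hypothetical:

```python
# Role-based access control sketch: each role is explicitly granted a
# set of fields, and everything not granted is withheld by default.
# The roles and policy below are illustrative, not a real schema.

POLICY = {
    "analyst": {"postcode", "birth_year"},          # aggregate analysis only
    "clinician": {"name", "birth_year", "diagnosis"},  # direct care
}

def read_record(role, record):
    """Return only the fields the role is allowed to see (deny by default)."""
    allowed = POLICY.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "Jane Doe", "postcode": "SW1A 1AA",
          "birth_year": 1975, "diagnosis": "asthma"}

print(read_record("analyst", record))
```

The design choice worth noting is the default: an unknown role gets an empty set, so new data fields are invisible until someone consciously grants them, which is the opposite posture to obscurity.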

Get in touch to find out how I can help.
