
Staring Down Data Dystopia: Contact Tracing and Data Privacy

Nullafi
May 24, 2020

We Live In An Unprecedented Time. In Terms of Data.

We live in an unprecedented time. I know, you already know that. But consider this: in less than 20 years we have effectively adopted technology that, 20 years ago, existed only in the minds of futurists and sci-fi authors. We have moved from desk-mounted PCs to laptops, and now phones and mobile devices are our most prolific means of access and communication. We stand on the threshold of quantum computing and AI-driven capabilities, and, among other things, we have collectively adopted biometrics as a primary means of identifying ourselves to technology. We also now face a once-in-a-century (hopefully once-in-a-millennium) virus that has forced 10 years’ worth of technology advancement to take place in 3 months. The intersection of technology and virology is pushing us collectively toward the dystopia that those early science fiction writers warned might result from exactly this type of scenario. Honestly, half of the social-apocalypse movies out there start out just like 2020 did. Modern life is changing because of the unforeseen and unpredicted collision of these two forces, and because of the speed at which we are allowing these coming “innovations” to happen.

For a second, let’s also think about the immediate issue of the rush to COVID-19 tracking and the mountain of technology issues that results from it. Governments across the globe and massive private companies are “miraculously” developing, deploying, and mandating the use of tracking applications. At any other time in history, these applications would, at the very least, have caused privacy advocates to bristle at their impact, while security experts would have been screaming about the speed at which they were developed, certain that security infrastructure was an afterthought and that failure was all but guaranteed. Still other leaders and influencers would have fought tooth and nail against the tracking and the potential human rights violations these types of apps might enable. But as it stands today, all is relatively quiet. These applications are being deployed and users are being enrolled. Their data is being used, and they are being geographically tracked to within 6 feet of other humans. Biometrics, including temperature, and other personal information are being mined. And yet little pushback is being shouted from the rooftops by experts. Perhaps we are simply boiling the frog slowly enough that it is unaware of the coming calamity, or perhaps privacy and security experts are afraid of being portrayed as unsupportive of society’s desire to move beyond the COVID crisis.

Fool us once, shame on you. Fool us twice, shame on us.

Fool us countless times, and we hand you our most intimate and sensitive information...?

The same companies that have been the poster children for application security failures, data breaches, lack of encryption, and violations of trust and user privacy laws are the ones developing and deploying the vast surveillance apparatus being touted as “a necessary part” of a “return to normal.” Perhaps we need this capability to return to normal, and maybe we need it to stave off future pandemics. Fine. But we also need secure, protected, and private applications, purpose-built with a focus on protecting and validating those data communications, to make those COVID tracking systems useful and safe. Would you accept and drive a vehicle, however well equipped with seatbelts and airbags, built by a company that had proven time and time again that it could not roll a car off the manufacturing line without it bursting into flames as soon as the engine was tested?

In literature, one message presents itself again and again: if humanity is shown a brighter future, we will all rush toward it. Like lemmings following those running ahead of them, we see a piece of a solution, see its potential benefits, and sprint toward the cliff, blatantly ignoring the massive billboard in front of us that reads “GIANT CLIFF: If you run here, you will fall off like everyone running in front of you.” But we sprint along at breakneck speed because we have been sold a promise, a promise made by organizations that routinely break this very promise; a promise that has never truly considered the long-term security, data, encryption, or privacy measures necessary to prevent real life from imitating the art of all those sci-fi dystopian narratives.

It doesn't have to be this way. 

We do not have to swallow this bitter pill, one that stinks of careless security and data stewardship and ignores the lessons that two decades of security failures should have taught us. We have the means to employ new approaches to these problems, and we can mandate that our data be made more secure by default. If we do anything less, we accept the potential for a dystopian future in which our most sensitive personal data is never again ours to control. The time to act and demand true data anonymity and security is now.

 

Kill the data, change the game.

 


What Forbes Thinks of Nullafi:

"One of the most interesting startups at RSA was Nullafi, who specializes in a novel API-based data security technology that combines data aliasing, vaulting, encryption, and monitoring to create an advanced data protection platform that makes hacked data useless to hackers. What makes Nullafi noteworthy is how they’ve been able to build a data architecture that protects legacy and new infrastructures while making the original data impossible for a hacker to reverse engineer and gain access to."
