Torn Fabrics of Online-User Rights

Privacy rights cannot be framed in the chiaroscuro lighting that has previously been used to showcase the cans and cannots granted to citizens. Extrapolating property rights into the virtual domain falls short of properly capturing the perpetual, shifting bilateral partnership that users and companies enter into when data is exchanged. Whereas land can be purchased for a price and a clear boundary established between both parties, data is a dynamically compounding asset, and one that a firm must use in building a house of cards: its dataset.

Consider a divorce with no prenuptial agreement: some couples split their assets 50/50, not merely what's in their pockets when they sign the divorce papers. There is an intrinsic value that accrues over time and is profitable to both parties, so it is unfair to demand a complete disintegration of those gains. You can deactivate your Facebook account at any time, but doing so doesn't require you to break off the friendships you've made there.

Giving users rights that may seem 'righteous', such as the ability to eradicate all traces of previously stored data, can leave firms vulnerable, even crippled, once that data is no longer available.

There is some middle ground in traditional contractual understandings, such as identity protection, that could be used to build legal frameworks for these data transactions. However, these pre-existing legislative frameworks fail to acknowledge the asymmetry in capital opportunity: it is available only to the firm in possession of the user's data, never to the user.

Herein lies the key pain point: users can't profit from their data while firms can. With this in mind, I think the guiding principle for online user data rights should be protecting the integrity of communication between both parties.

To implement this, here are a few ideas I currently have on online user rights:

  1. At the lowest level, data needs to be properly contained and anonymously labelled. Data should be treated almost like a radioactive commodity, given how explosive it can be for both parties. As such, an independent authority should be established to oversee firms and certify that they are storing data securely and that it is non-identifiable; its purpose would be similar to that of Occupational Health and Safety standards. Importantly, transgressions should be punished in proportion to the number of users affected. Again, data is an asset whose value compounds and inflates further as more associations can be constructed, so traditional punishment methods generally fail to bring proportionate justice to those responsible. Many regulators make the mistake of making pervasive crimes a cheap endeavour: Facebook was fined $122 million (USD) for misleading EU regulators on how it would manage WhatsApp's data, despite paying $19 billion for the app (a fine of just 0.64% of the sale price). More recently, Facebook was fined a trivial £500,000 by the UK's Information Commissioner's Office for its mismanagement of data in the Cambridge Analytica affair. As an exemplar, Google's $6.8 billion fine for anti-competitive tactics shows how mis-stepping companies such as these (with market values nearing $1 trillion) should be handled. A rough sketch of this proportionality idea follows the list below.
  2. Data is handled by programmers and data scientists; they're the foot-soldiers working in hand-to-hand combat. Infantrymen are trained in first aid with an emphasis on blood loss and pain management, and it was because of improved training during the Second Iraq War that 99% of patients who reached a hospital within 60 minutes survived. These same soldiers were also trained in Hearts and Minds tactics. Irrespective of your opinion on military deployments to the Middle East, military training now encompasses a broad range of skills because of the complexity of today's battlefields. Programmers are likewise not just 'foot-soldiers', so we need to redefine what it means to be a programmer. The Russian-led Facebook and Google ads during the 2016 Presidential Election, the sabotage of Iran's and North Korea's nuclear/space programmes, the power outage in Ukraine, and the firewalls across China and Egypt, along with the host of other cyber-crimes committed every few seconds, are all done at the fingertips of programmers. An ambitious and competent programmer could well be one of the most dangerous weapons on this planet. With that in mind, programmers need to understand the severity and consequences of their actions. Conscious programming is a derivative of a larger global shift in mindset, with society seemingly embracing a greater sense of responsibility for caretaking and protective actions that reduce harm to future generations. Given the proliferation and accessibility of machine-learning models today, along with the ease with which advancements are communicated via the Internet, it should be evident that programmers are more than just 'programmers'; in many ways they are guardians. Whether society requires them to complete a set of certified courses, classes on legislative rights, or simply extensive background checks is a tactical discussion for later.
  3. Unlike sharing data via an API request (like Spotify connecting to your Facebook account), selling data should be strictly governed, if not outlawed. Once competent and efficient mechanisms are available to trace how data is used (a theoretical and technical nightmare, but an evident necessity, which hopefully means a solution eventually exists), then perhaps regulation around selling data can be loosened. In the United States, users are opted in by default: a criminally unfair position for the many users who are not only bewildered by pages of terminology but joining an app in a time-sensitive scenario. When was the last time you had a few hours to read over a privacy policy before downloading an app you needed? Think Uber, Facebook, Instagram, Tinder. Consequently, opt-out programmes should be reversed to opt-in, as they are in Australia; a minimal sketch of what an opt-in default looks like in code is included after this list.
  4. Until a mechanism is developed that can properly audit a user's data usage amongst third parties and present those details transparently to users, users' claims to a share of profits will collapse into an abyss of non-actionable demands. Expecting firms to put a capital amount on the intrinsic value a user's dataset offers another firm would likely open the initial firm to a slew of class actions with unending claims of discrimination, racism, sexism, and so on. However, the need for such tracking measures becomes irrelevant if selling data is outlawed. A sketch of what such an audit trail might record also follows this list.
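
To make point 1 concrete, here is a minimal sketch in Python. The first lines simply verify the 0.64% figure cited above; the `proportional_fine` function, its base amount, and its exponent are purely hypothetical, one assumed way a penalty could scale superlinearly with the number of users affected.

```python
# Sanity-check of the proportionality figure cited above (amounts in USD).
whatsapp_price = 19_000_000_000   # Facebook's purchase price for WhatsApp
eu_fine = 122_000_000             # fine for misleading EU regulators

print(f"Fine as a share of the sale price: {eu_fine / whatsapp_price:.2%}")  # 0.64%


# Hypothetical: a penalty that grows superlinearly with the number of users
# affected, reflecting the idea that a dataset's value (and the harm of its
# misuse) compounds as more associations can be drawn. The base amount and
# exponent are illustrative assumptions only.
def proportional_fine(base_per_user: float, users_affected: int,
                      compounding: float = 1.2) -> float:
    return base_per_user * users_affected ** compounding


# For the roughly 87 million users reported in the Cambridge Analytica case:
print(f"${proportional_fine(1.0, 87_000_000):,.0f}")  # ~$3.4 billion, not £500,000
```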
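
Point 3's opt-in default is also easy to sketch. The `ConsentSettings` class and its field names are hypothetical, for illustration only; the point is simply that every data use defaults to off and must be switched on by an explicit user action.

```python
from dataclasses import dataclass


@dataclass
class ConsentSettings:
    """Hypothetical consent record: every data use defaults to False
    (opt-in), never True (opt-out)."""
    share_with_partners: bool = False
    targeted_advertising: bool = False
    sell_aggregated_data: bool = False


# A new user starts with nothing enabled; consent is an explicit action.
new_user = ConsentSettings()
assert not any(vars(new_user).values())

# Opting in is a deliberate, recorded choice, not a buried default.
new_user.targeted_advertising = True
```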
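
And for point 4, a rough idea of what an audit trail might record. The `DataAccessEvent` structure and all names here are assumptions; a real mechanism would need to be tamper-evident and span third parties, which is exactly the nightmare acknowledged above.

```python
import datetime
from dataclasses import dataclass


@dataclass(frozen=True)
class DataAccessEvent:
    record_id: str   # anonymised user identifier, never a real name
    accessor: str    # firm or third party that used the data
    purpose: str     # declared purpose of the access
    timestamp: datetime.datetime


# Append-only log: entries are never mutated or deleted, so users (or an
# independent authority) could later audit exactly how their data was used.
access_log: list[DataAccessEvent] = []


def record_access(record_id: str, accessor: str, purpose: str) -> None:
    access_log.append(DataAccessEvent(
        record_id, accessor, purpose,
        datetime.datetime.now(datetime.timezone.utc),
    ))


record_access("u-8f3a", "analytics-partner.example", "ad-conversion modelling")
```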

Hopefully you enjoyed reading; let me know if you have any suggestions!

Cheers 🙂
