Are we there yet? Consumer Empowerment, I mean.
Surely there are difficult hurdles to be overcome before we can even speak of true Consumer Empowerment?
Hurdles such as how our data is collected, enriched for segmentation or profiling purposes, and shared with third parties are still being debated.
An unfair balance persists as technology continues to evolve while legislation plays catch-up. In the meantime, while some progress is being made, such as state legislation against revenge porn and actors like Google taking a stance by banning such listings, many data gatherers, hoarders, and leechers continue to align their strategies squarely with their own best interests.
“I think a defining moral issue of the next decade will be that nobody should know more about your life than you do” said Alistair Croll, founder of Solve For Interesting, in his keynote address at the Strata + Hadoop World conference in San Jose, California in February.
For an industry that touts itself as being able to predict whether you’ll live happily ever after with your better half, that’s quite a shot in the predictive sound bite’s foot!
Having said that, Alistair has a point, as usual, and it’s about choice: the opportunity to deliberately choose how your data is collected, accessed, and shared. And that is exactly what MyPermissions is all about: it lets you see which third-party mobile applications are accessing your data and which permissions are being requested when you click that “I agree” button.
This is the first step in the path towards true Customer Empowerment.
Customer Empowerment starts with an equilibrium: citizens, consumers if you prefer, understanding in full transparency what type of data is being shared, with which legal entity, and how that data is then passed along.
True transparency also requires alerts so that, when an application is updated, you understand which new permissions come along with those cool new features. MyPermissions offers empowerment only up to a point, as mobile applications work differently depending on the type of phone you use.
That’s where not only legislative bodies but also industry standards should ideally at some point align, helping us non-technical mortals understand the consequences of our actions as we leave an increasingly vast digital footprint.
The Privacy “sector” is increasingly talking about creating balanced value propositions when exchanging data for free products.
The issue, however, lies in defining that exact balance: what might seem a good deal to some might not to others, as Privacy is highly context sensitive and evolves according to age, gender, and cultural background, as well as the types of data shared and the companies involved.
At the beginning of 2014, the Boston Consulting Group came out with a report called Data Privacy by the Numbers, corroborating that “a company’s starting point matters” while also contradicting the myth that younger generations care less about Privacy.
As citizens continue to look to the law to define clear boundaries between acceptable and unacceptable behavior, the data market still fails to understand such context-driven boundaries. Additionally, a lack of education among consumers, compounded by a lack of transparency from the data industry, fuels fear and disgruntlement over data uses.
Big Data, supported by increased mobile use, is Privacy-neutral in itself yet can be extremely beneficial to society as a whole. It is within the very context of those data uses that the Privacy debate sits, balancing customer expectations against technological capabilities.
Customer Expectations related to Data Privacy depend on education and understanding of technology.
From there, it relies on shared transparency about data uses and incentives from companies in order to build Consumer Trust. If the right balance is not struck, consumers “dirty” databases with false data, and data quality suffers as they hedge against what they consider to be intrusive data collection methods.
One might argue that there is no incentive for companies to become more transparent, or even go as far as stating that “algorithmic transparency raises intellectual property issues”.
Yet if we are not careful, Big Data, for all its promises, might follow the same path of discontentment as CRM some decades ago: “Garbage in, garbage out”.
Our methods of collecting data have become less prone to error. Let us not fall into the data greed trap that will drive us towards a similar outcome.
As companies increasingly build Big Data teams to collect and integrate data from the Internet, mobile, or, if you prefer, the IoT, accountability mechanisms need to be put in place in order to understand the Data Privacy and Consumer Trust equation. Data minimization, security, choice, and transparency are all part of that equation.
If not, with the European Data Protection Regulation around the corner requiring companies to at least better document their data practices, erosion of that long-sought-after customer trust might become a reality.
Customer expectations of Privacy are a moving target that will require a lot of education. Starting with transparency about how technology works and what data can be stored, shared, and calculated, companies can take the first steps towards meeting those expectations, using tools such as MyPermissions’ Risk Index.
The next level is indeed Consumer Empowerment.
Consumer Empowerment is, after all, the very idea behind the W3C’s Do Not Track (DNT) working group.
Heavily resisted, DNT never achieved consensus and should be seen today as “a mechanism for transparency and enforcement of potentially divergent user preferences”. The very fact that the Internet industry could not align on best practices pushed consumers towards alternatives: ad blocking and bad data.
Consumers today simply confirm the Dalai Lama’s observation: “A lack of transparency results in distrust and a deep sense of insecurity”.