The need for technologies that enhance our autonomy as well as our privacy
Innovations in Privacy Enhancing Technologies have been around for a while, but consumers’ loss of autonomy in digital markets has been overlooked. Could Autonomy Enhancing Technologies be the answer?
Consumers are routine users of curated and friction-free services, all automated by data and sophisticated user interfaces. But these convenient and fluid experiences are accompanied by concerns about security, privacy, lack of accountability and loss of control.
Policy makers have given the security and privacy part a lot of attention, not least because they don’t want these worries to impact trust and uptake. And in the bold spirit of innovation, they’ve considered the viability of tools like Privacy Enhancing Technologies that “support the protection of personal data…through either reducing personal data, or preventing undesired personal data processing” (UK Data Strategy, 2020).
Less scrutiny has been given to the loss of personal autonomy and control in digital environments. This loss is sometimes hard to pin down exactly, but there is a growing sense of anxiety about being somehow manipulated, about losing control, and about losing that thing we call autonomy: the ability to make your own decisions without being controlled by anyone else. Online, this is usually felt as an inability to do or know something: not knowing why you are offered a particular product or price; not being able to undo data relationships with companies; not knowing how decisions are being made about you.
It’s not surprising that this makes people uneasy. Humans are driven by a quest for meaning and mastery: we want to be independent individuals who can make sense of the world and make decisions for ourselves. Automated technologies that process data behind the scenes to take decisions on our behalf are bound to have negative repercussions, no matter how convenient the service or how spot-on the recommendation. The CMA’s new programme on algorithms and consumer harms gets close to thinking through some of these problems, naming them as manipulation and proposing changes to the design architecture that drives them.
There could also be a role for what I’m calling Autonomy Enhancing Technologies, a new range of emerging technologies and concepts which have the broad purpose of restoring autonomy to human intention, choices and action, for example:
P(algorithms), or personal-algorithms, or even ‘palgos’: a type of software running on fully transparent algorithms that individuals switch on or off, and that are run on their behalf either by themselves or by someone they choose to trust.
GLIAnet: a personal AI described as a trust intermediary (or ‘trustmediary’, if you like a portmanteau), which would act in line with people’s intentions and goals when interacting online.
Moonsift: a recommendation engine for online fashion where, instead of being targeted, consumers organise and communicate their preferences by actively choosing what they want to see. This is one example of a vendor-relationship-management tool, which enables people to take control of their relationships with commercial organisations.
RadicalXChange: a (conceptual) data coalition that combines people’s data, enabling them to have greater collective bargaining power for services of their choosing. This has much in common with the model of bottom-up data trusts, and I’ve included it for its potential to return power, and therefore some control, to people.
These technologies demonstrate that other models for organising consumer interactions in online markets are available, models that could answer the challenge of a loss of control and autonomy. Of course, such technologies in isolation won’t be enough to restore autonomy; that requires a much bigger reset of how we govern technology. But while the debates on how to deal with market dominance, privacy and surveillance-based business models continue across the world, the need for autonomy protection and autonomy enhancing tools should be recognised.