
The ethics behind defaults

In the software world, defaults can be defined as the shortest path to a destination: choices pre-decided for users by the creators of an application or platform, arguably in their best interest, to relieve users of extra mental processing. They often come disguised as pre-populated inputs and pre-checked boxes. But is that all there is to defaults?

Veethika · Published in Prototypr · 6 min read · Feb 21, 2021

When paired with relevant behavioural insights, the same defaults that masquerade as plain pre-filled inputs on a web interface can be used as a tool for steering human actions. Sometimes for good, but not always. Many governments are known to have employed behavioural insights to improve their public-facing policies for good. My personal favourite example is the use of the anchoring effect in deposit-limit tools to influence the decisions of gamblers in Great Britain. Another popular example is when 401(k) plans for US citizens were changed from a voluntary option to auto-enrolment, because it was observed that (most) people were incapable of weighing the immediate gratification of spending the money against the long-term benefit of saving it for retirement.

However, the same smart persuasion can also be used to discreetly package a set of consents carefully placed in 8-point text in the `Terms and Conditions Agreement`. There are also tools available for measuring how persuasive certain settings are over human behaviour, and if you put all the pieces together and let your imagination get ahead of you, things don't look so good. Do they?

🌑 The dark side of default

In 2015, news surfaced about a Chrome extension, Marauders Map, that had been using data from Facebook Messenger to locate users in real time with an accuracy of up to a few metres. The extension exploited Facebook's default location settings, which came enabled on both the Android and iOS apps. The incident exposed a major flaw in Messenger's default permissions. To quote the creator of the extension:

“The main problem is that every time you open your phone and send a single message it’s so easy to forget about your location data being attached to it. Furthermore, it seems so harmless to attach a location with a single message, but the problem is, over time the information from these messages adds up.”

As data replaces oil as the most valuable resource, such incidents are on a perpetual rise, with no hint of receding in the near future. The most threatening revelation of the damage that such under-the-hood data collection can inflict is the not-so-old exposé on Cambridge Analytica, accused of building voter profiles by unethically harvesting data from Facebook. The chain of events set off a few years ago is finally taking its toll on Facebook users' trust in the platform, and this is proving to be a hurdle in the success of the company's AR glasses project.

According to another article in Fast Company, the redesign of Instagram could be an attempt to serve surveillance capitalism in disguise. With its existing computational abilities, the amount of data the platform could harvest about users from the content on their profiles is unimaginable, yet believable. And once again, default settings, and the algorithms that recommend those defaults, take centre stage in this idea.

The trust that users place in platforms, products and services might gradually be eroding, but it isn't eroding fast enough to keep the damage at bay.

A recent study that investigated how awareness of [ad] targeting affects individuals' intentions towards the advertised product found that a majority of users do not want targeted ads. The pervasiveness of the medium and the technology leaves them feeling vulnerable and violated. This result strongly hints that most software products do not ship with default settings optimised for, and aligned with, the average user's expectations.

Most such products flaunt their highly configurable settings and the choices they provide to their users. But reality tells a very different story.

Users are gullible, and often assume good faith when presented with default settings. According to a small study conducted by Jared Spool and his team, only a very small subset (<5%) of users make the effort to change the settings that come out of the box. Looked at the other way: more than 95% of users go with the options that came pre-decided for them, options that potentially control their workflow. As experience designers, it shouldn't come as a surprise to us that our work has an immediate and direct impact on the lives of our users, both online and offline. The decisions made in the comfort of our workstations have a far-reaching impact, and this adds enormous gravity to our responsibilities.
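That asymmetry is easy to picture in code. The sketch below is purely illustrative (the class and field names are invented, and the 95/5 split comes from the study above): whatever value ships as the default is, in effect, the choice made on behalf of every user who never opens the settings screen.

```python
from dataclasses import dataclass

@dataclass
class OptOutSettings:
    """Hypothetical settings object; field names are illustrative."""
    attach_location: bool = True   # data flows unless the user disables it
    targeted_ads: bool = True

@dataclass
class OptInSettings:
    """Same settings with privacy-first defaults: nothing flows until asked for."""
    attach_location: bool = False
    targeted_ads: bool = False

# If only ~5 in 100 users ever change a setting, the default decides for the rest.
total_users, customisers = 100, 5
inherit_default = total_users - customisers
print(f"{inherit_default} of {total_users} users live with the designer's choice")
```

The two classes differ by a single boolean per field, yet for roughly 95 of those 100 users they describe two entirely different products.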

🧜🏽 Make defaults inclusive

If we look at the everyday things around us (the cabinets in the kitchen, the furniture in our living room, the tools in our workshops, the clothes we wear), they come optimised for the physical capabilities of (almost) everyone in our family. Makes you wonder how the measurements for these are decided, doesn't it? The process is known as anthropometry. The dataset produced by this process is then used to set measurement standards, ideally for a particular geography.

The three basic approaches to applying this data are:

  1. Design for average
  2. Design for adjustability
  3. Design for extremes

Now let's circle back to digital platforms and products. When it comes to privacy and security, we could follow the same principles as anthropometry and cater to: a) the average: users who do not appreciate pervasiveness; b) the extremes: users who have no awareness of pervasive technology and its scope; and c) leave some room for adjustability, for users who want to optimise further, to make the system sound somewhat reasonable.
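A minimal sketch of what that three-way split could look like in a settings model. All names and defaults here are invented for illustration, not taken from any real product:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Design for the average: pervasive collection is off by default,
    # matching what most users say they expect.
    personalised_ads: bool = False
    location_history: bool = False
    # Design for the extremes: users with no awareness of pervasive tech
    # get the strictest protection without having to understand it.
    third_party_cookies: str = "block"   # strictest of {"allow", "block"}
    # Design for adjustability: explicit per-setting overrides for users
    # who want to loosen (or tighten) individual defaults.
    overrides: dict = field(default_factory=dict)

    def effective(self, key: str):
        """Return the user's override if present, else the safe default."""
        return self.overrides.get(key, getattr(self, key))

settings = PrivacySettings()
settings.overrides["personalised_ads"] = True   # an adjusting user opts in
print(settings.effective("personalised_ads"))   # → True
print(settings.effective("location_history"))   # → False
```

The point of the `overrides` layer is that adjustability is additive: a user's deliberate choice wins, but everyone who never touches the settings inherits the protective default rather than the permissive one.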

🌥 Silver lining?

Screenshot from https://www.mozilla.org/en-US/firefox/new/

Mozilla, known for its privacy-first approach and its efforts to make the web a safe space for users, claims that in the latest release of its Firefox browser, the default settings collect as little information about users as is required to get the job done. The punchline for the browser reads:

No shady privacy policies or back doors for advertisers. Just a lightning fast browser that doesn’t sell you out.

Recently, WhatsApp made it mandatory for users (outside the European region) to agree to an updated policy which implied that it would share some data with its parent company, Facebook. This stance forced more individuals to read and talk about privacy policies in the tech space, and as a result, Signal, a platform popular for its privacy-centred design, saw a soaring influx of users (almost) overnight, which even led to some downtime (as you'd know, when an app scales in India, it doesn't go easy!).

The online comparison between Signal and WhatsApp gained so much traction that, to counter the ubiquitous WhatsApp ads, a Signal user created an ad campaign that went viral and was even mistaken for an official Signal campaign, yet it precisely captured users' real expectations of a messaging platform.

Hopefully, in the future, more and more applications and platforms will follow suit and, instead of providing us with shady defaults camouflaged in bulky configuration options, go for the right defaults: ones that respect our privacy and safeguard our personal data.
