Despite the increasing use of algorithms to tailor customer experiences, many business leaders remain convinced that customers may not trust, or may even resent, algorithms. For every Netflix, which proudly acknowledges that the offerings a subscriber sees are driven by data-informed decisions about what they have chosen to watch in the past, there is a voice-activated assistant like Alexa, with a human voice and name overlaid on data-driven responses.
Algorithms increasingly drive what customers see.
Companies Often Believe That Customers Won’t Like Data Science…
A recent article in Harvard Business Review, for example, discusses the use of algorithms at Stitch Fix, a successful online clothing service that sends clothing selections to its subscribers, who can keep and pay for the items or send them back. Its founder is known for modeling the business on Netflix and emphasizes the company’s reliance on data science. In fact, in an earlier piece for HBR, she noted that “data science isn’t woven into our culture; it is our culture.” It’s a surprising statement for an industry that relies so heavily on personal taste.
The company uses algorithms to choose the items of clothing it sends out, but customers are also told that stylists make the choices, and they in fact receive a note from a stylist the company employs. But is such a combination, which tends to minimize the contribution of algorithms, necessary? Might customers be told what HBR readers were told: that data science is crucial to what they’re seeing?
…but Research Indicates People Trust Algorithms
The authors of the HBR article ultimately think it is not. They designed an experiment to test how much people trust algorithms. Participants were asked to make forecasts on topics ranging from the order of songs on the Billboard Hot 100 to sanctions against cyber attacks. After making a prediction, they received both algorithmically generated advice and advice from other people.
Ultimately, participants tended to rely more on algorithmically driven advice than on advice from other people. The implication? People are fine with receiving algorithmic advice, clearly labeled as such.
In a later study, the researchers sampled professional forecasters working for the U.S. government. These forecasters, too, tended to rely more on algorithms than on another person’s input. Only when forced to choose between an algorithm’s forecast and their own judgment did algorithms lose favor: the forecasters preferred their own judgment.
Given the rising use of algorithms for everything from choosing which job candidates to interview to predicting healthcare usage, the finding that people are comfortable knowing algorithms are driving choices, and even tend to prefer algorithmic advice, is important. The researchers did find that people tend to ignore algorithms after seeing one make a mistake, or when differing tastes and values play a role, such as in selecting jokes or debating the ethical implications of a social issue.
Ultimately, the authors suggest that businesses may not need to spend heavily on making algorithms seem “human” or on disguising their use. Increasingly, customers know that algorithms exist and appreciate their role in driving choices.