In the fall of 2016, Airbnb, the online home rental business, announced a series of changes to address a growing problem: virtual bias.
Companies such as Airbnb and Uber are facing increasing criticism amid allegations of racism and discrimination on the part of drivers and passengers, renters and property owners.
The steps taken by Airbnb are a good first attempt and may serve as a framework for others. Below is a rundown of where things stand on online businesses and bias, and of the way forward.
The Airbnb Response
As public allegations of bias began to grow, Airbnb pledged to get tougher when it came to hosts who may discriminate against potential guests based on race, gender identity, national origin, age, religion, disability or sexual orientation.
The company pledged to reject bias, reminding platform users that, in doing so, they pledge “to treat all fellow members of this community … with respect, and without judgment or bias.”
Further, with guidance from former U.S. Attorney General Eric Holder, now an advisor to the company, Airbnb released a new nondiscrimination policy. The company is encouraging hosts to allow more bookings that do not require pre-approval. And if a host rejects a guest by claiming a lack of availability, Airbnb will automatically block those dates on the host’s calendar.
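The date-blocking rule is simple to picture in code. The sketch below is a minimal illustration of the logic, not Airbnb’s actual system; the calendar representation and function name are hypothetical.

```python
from datetime import date, timedelta

def block_declined_dates(host_calendar: dict, check_in: date, check_out: date) -> None:
    """When a host declines a request citing unavailability, mark each
    requested night as blocked so it cannot be offered to another guest.
    (Hypothetical sketch; the calendar is a simple date -> status dict.)"""
    night = check_in
    while night < check_out:  # check-out day itself is not a booked night
        host_calendar[night] = "blocked"
        night += timedelta(days=1)

# Example: a host declines a two-night stay, claiming the dates are taken.
calendar = {}
block_declined_dates(calendar, date(2016, 11, 4), date(2016, 11, 6))
print(calendar)  # both nights, Nov 4 and Nov 5, are now blocked
```

The point of the rule is the incentive it creates: a host cannot tell one guest the dates are unavailable and then accept another guest for the same dates.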
The company will also experiment with reducing the emphasis on guest photos in favor of other identifying information.
Bias a Common Problem Among Online Platforms
Research conducted by faculty at Harvard Business School and Boston University revealed that guests on Airbnb with “black-sounding names” were 16 percent less likely to be accepted as guests than those with “white-sounding names.”
The problems were not unique to Airbnb; researchers found similar bias at websites for ride sharing, dog walking, and freelance work. While most companies ignore the issue, some are responsive, the researchers state. For example, eBay, the online auction site, has in the past hired social psychologists to determine whether male sellers got better prices than female sellers for the same products.
The researchers offered two principles for platform developers to adopt within their business strategy to help curb bias.
Principle 1: Recognize the Potential for Discrimination
Platforms should make sure they know the demographics of their users, particularly gender and race. Regular reporting can track the success rates of users of different identity types on the platform, which can help determine whether bias is an issue. Without that data, it is much harder to determine whether a problem exists and, if so, where.
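The kind of regular reporting the researchers describe can start as a simple aggregation of transaction outcomes by group. The sketch below is a hypothetical illustration (the data schema and the group labels are invented for the example), using toy numbers that mirror the kind of gap the Harvard/BU study reported.

```python
from collections import defaultdict

def acceptance_rates(requests):
    """Aggregate booking requests into per-group acceptance rates.
    Each request is a (group, accepted) pair, where 'group' is whatever
    demographic segment the platform tracks. Hypothetical schema."""
    totals = defaultdict(int)
    accepted = defaultdict(int)
    for group, was_accepted in requests:
        totals[group] += 1
        if was_accepted:
            accepted[group] += 1
    return {g: accepted[g] / totals[g] for g in totals}

# Illustrative, fabricated sample: group B is accepted noticeably less often.
sample = ([("A", True)] * 50 + [("A", False)] * 50
          + [("B", True)] * 42 + [("B", False)] * 58)
print(acceptance_rates(sample))  # → {'A': 0.5, 'B': 0.42}
```

A dashboard built on a report like this would surface a disparity long before it becomes a public allegation.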
Principle 2: Be Willing to Experiment
Platforms should test design choices to see which ones invite or facilitate discrimination. In particular, platforms should ask the following design questions:
- Is there too much information available that can lead to discrimination, such as pictures or last names?
- Can transaction processes be more automated?
- Can non-discrimination policies be more prominent?
- Are your algorithms constructed to identify bias?
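On the last question, one standard way an algorithm can flag possible bias is a two-proportion z-test on acceptance rates: it asks whether an observed gap between two groups is larger than chance alone would explain. The sketch below is an illustration of that statistical technique, not any specific platform’s detector; the function name and the input counts are hypothetical.

```python
from math import sqrt

def disparity_z(accepted_a, total_a, accepted_b, total_b):
    """Two-proportion z-statistic comparing acceptance rates between two
    user groups. A large absolute value suggests the gap between groups
    is unlikely to be random noise and deserves investigation."""
    rate_a = accepted_a / total_a
    rate_b = accepted_b / total_b
    pooled = (accepted_a + accepted_b) / (total_a + total_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (rate_a - rate_b) / std_err

# Hypothetical counts: 50% vs. 42% acceptance over 1,000 requests each.
z = disparity_z(500, 1000, 420, 1000)
print(round(z, 2))  # → 3.59, well beyond the usual 1.96 significance cutoff
```

A monitoring job could run a check like this per market or per listing category and alert when the statistic crosses a threshold.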
While these actions will not eliminate bias entirely, they are positive moves that can reduce incidents and demonstrate an earnest commitment to addressing the issue.