Is Amazon same-day delivery service racist?

Amazon doesn't offer same-day delivery to some poorer, black neighborhoods. Is this a result of relying too much on big data to make same-day shipping area decisions?

By Bamzi Banchiri, Staff

For many black consumers and especially those who have experienced discrimination in retail shops, Amazon presents a good option for shopping without being followed around or singled out.

Or is it?

A few years ago, the Seattle-based e-commerce company launched a same-day delivery service that allows some customers to receive their products on the same day, eliminating one of the advantages that retail stores held over Amazon. The service, which is now available in 27 US metro areas, covers about 1,000 cities across the country. It's part of the Amazon Prime $99 annual membership, which includes two-day free shipping and other benefits such as Prime video and Prime music. The service can also include same-day free shipping for products over $35 in areas where it is available.

But a Bloomberg analysis of the same-day shipping services reveals a racial bias: Amazon's same-day shipping service often excludes predominantly black neighborhoods in six major cities.

In New York, for example, the service excludes the Bronx, which is mostly inhabited by black and Hispanic residents. In Atlanta, same-day delivery covers the northern half of the city, which is predominantly white, and leaves out the southern part, which has a predominantly black population. In Boston, the neighborhood of Roxbury has no service but is bordered on all sides by same-day-delivery zones. Roxbury is a predominantly black neighborhood, with 62.29 percent African Americans, 15.83 percent Caucasians, and 1.92 percent Asians, according to Areavibes.com.

Unlike some organizations that have made delivery decisions based on the racial composition of neighborhoods, Amazon's delivery decisions aren't inherently biased, note Bloomberg reporters David Ingold and Spencer Soper. Amazon officials say its delivery decisions aren't based on the ethnic composition of a neighborhood but on several factors, including the concentration of Prime members and the proximity of the area to Amazon's warehouses.

But as Tech Insider writes, the data-and-algorithm-driven system that Amazon uses to determine the numbers for Prime memberships may be reinforcing the bias.

"The problem with this thinking is it ignores the realities that there are biases involved in building any data-driven analysis, biases involved in what data gets included in the analysis, and biases inherent to a world scarred by centuries of ongoing racism and other bias," writes Rafi Letzter, a technology reporter for Tech Insider. "The algorithms don't self-assemble. People make them."

Sorelle Friedler, a computer science professor at Haverford College who studies data bias, told the Wall Street Journal: "As soon as you try to represent something as complex as a neighborhood with a spreadsheet based on a few variables, you've made some generalizations and assumptions that may not be true, and they may not affect all people equally. If you aren't purposefully trying to identify it and correct it, this bias is likely to creep into your outcomes."
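Friedler's point can be made concrete with a toy sketch. The neighborhood names and numbers below are entirely hypothetical, and the rule shown is an illustration of the general pitfall, not Amazon's actual system: an eligibility test that never looks at race, only at variables like subscriber density and warehouse distance, can still sort neighborhoods along racial lines when those variables correlate with demographics.

```python
# Hypothetical data: (name, Prime members per 1,000 residents,
# miles to nearest warehouse, percent black residents).
# None of these figures are real.
neighborhoods = [
    ("Northside",  120,  8, 10),
    ("Southside",   40,  9, 85),
    ("Riverview",   35,  6, 62),
    ("Oak Suburb", 150, 12,  5),
]

def eligible(prime_density, warehouse_miles):
    # A facially race-blind rule: serve areas with enough
    # subscribers that sit close enough to a warehouse.
    return prime_density >= 50 and warehouse_miles <= 15

served   = [n for n, d, m, _ in neighborhoods if eligible(d, m)]
excluded = [n for n, d, m, _ in neighborhoods if not eligible(d, m)]

print(served)    # → ['Northside', 'Oak Suburb']
print(excluded)  # → ['Southside', 'Riverview']
```

In this toy example the excluded areas happen to be the majority-black ones, even though race never appears in the rule, because historical disparities in income and subscription rates feed the inputs. That is the "bias creeping into outcomes" Friedler describes.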

Big data has become a powerful predictive tool that companies and organizations rely on to make better and more efficient decisions. But the approach is coming under increasing scrutiny as several cases have revealed that it can magnify bias.

An often-cited 2013 study by Harvard professor Latanya Sweeney found that searching for 2,000 "racially associated names" also produced accompanying Google ad results, driven by an algorithm, indicating that the person might have a criminal record. Sweeney writes:

While it may be good business for Amazon to maximize profits and roll out services to the parts of a city where there are more customers with more income, the strictly algorithm-driven approach also makes for poor public relations, observes Tech Insider: