Let the data drive your decisions.
This has been something of a mantra for technical people, and even many business people, over the last twenty or so years. The allure of business intelligence is harnessing large amounts of data to make decisions rooted in a rational analysis of what has actually happened. Many companies tout "data-driven decisions" as a path to success.
However, what about when the data is flawed, when deliberate or inadvertent actions give us data that isn't as pure as we expect? In the last week we have seen many protests and complaints about the ways many people feel they have been unfairly treated by police. That brutality, particularly against African Americans in the US, has been a problem for decades. Some of that is due to human biases and beliefs. However, technology plays a part as well, and will for some time to come.
I have watched as algorithms have been used in sentencing, and I've questioned their use, as have others. There is this idea that computers will be fairer, looking at inputs and making decisions unencumbered by human biases. The problem is that the humans who program the systems might have some bias. Perhaps more disconcerting is that the data used to train these systems is likely biased as well.
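To illustrate that last point, here is a minimal, hypothetical sketch. The data, groups, and labels below are entirely made up; the point is only that a "neutral" model trained on skewed historical labels faithfully reproduces the skew.

```python
# Hypothetical sketch: a model trained on biased historical labels
# reproduces that bias, even though the algorithm itself is "neutral".
from collections import defaultdict

# Synthetic historical records of (group, recorded label). Suppose
# group "B" was historically over-labeled as high risk relative to "A".
history = (
    [("A", "low")] * 80 + [("A", "high")] * 20 +   # 20% labeled high risk
    [("B", "low")] * 50 + [("B", "high")] * 50      # 50% labeled high risk
)

# "Training": estimate P(high risk | group) from the recorded labels.
counts = defaultdict(lambda: {"high": 0, "total": 0})
for group, label in history:
    counts[group]["total"] += 1
    if label == "high":
        counts[group]["high"] += 1

def predicted_risk(group):
    c = counts[group]
    return c["high"] / c["total"]

# The model learns exactly what the labels tell it: otherwise-identical
# individuals from different groups receive different risk scores.
print(predicted_risk("A"))  # 0.2
print(predicted_risk("B"))  # 0.5
```

The algorithm here contains no explicit prejudice; the disparity comes entirely from the training data, which is precisely the concern with historical policing and sentencing records.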
There is also the concern that as technology advances, it can be put to new uses, perhaps in ways its inventor regrets. Oppenheimer regretted the violent use of his work, and I wonder if some technology inventors will feel the same way. Surveillance technology is a good example: it can help retail companies prevent theft, but it can also be used in ways that enhance and reinforce bias in police work. No matter how you feel about the technology, there are moral questions of privacy and prejudgment worth debating.
The last couple of weeks have saddened, upset, and angered me at different times. I am also confused and concerned, unsure of how to discuss and debate these topics. My position shifts slightly with each new story and piece of information, as it should; I learn more, and my views grow and change, shaped by what touches me. I do worry about how we will use data in the future, and how it can be abused. More data can help improve our world in many ways, but the potential for abuse is high, and I believe we need governance, transparency, and an independent appeal process for those who are wrongly impacted.