This is really interesting: a computer chip that doesn’t always perform mathematical calculations exactly. That sounds crazy, and in some cases it is. Most of us want math to be exact and to work perfectly every time we run a calculation. Can you imagine if Excel didn’t always add up the totals in your expense report according to the rules of addition?
However, in more complex operations, such as pattern matching for images, we might not want exact calculations. If two radar images (one problem from the article) don’t match exactly at the binary level, would we want a search for similar images to discard one? After all, exactness and perfect matching are great in some areas, such as financial accounting, but in others, such as imaging, there is plenty of noise that isn’t important to the content of the image.
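To make the idea concrete, here is a toy sketch (my own illustration, not anything from the article) of how a fuzzy image comparison might tolerate noise: reduce each image to a tiny perceptual hash, then compare hashes within a tolerance instead of demanding bit-for-bit equality. The "images" here are just short lists of grayscale values.

```python
# Hypothetical fuzzy matching sketch: an average hash plus a tolerance,
# instead of exact binary comparison.

def average_hash(pixels):
    """Hash a grayscale image (list of 0-255 values): one bit per pixel,
    set when that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def similar(img1, img2, tolerance=2):
    """Treat two images as a match when their hashes differ in at most
    `tolerance` bits: noise is allowed, overall structure must agree."""
    return hamming_distance(average_hash(img1), average_hash(img2)) <= tolerance

# Two captures of the same scene, differing slightly from sensor noise,
# and a third image with a different structure.
radar_a = [200, 198, 50, 52, 201, 49, 199, 51]
radar_b = [202, 196, 48, 55, 198, 47, 203, 50]  # same scene, noisy copy
radar_c = [50, 52, 200, 198, 49, 201, 51, 199]  # bright/dark regions swapped

print(similar(radar_a, radar_b))  # True: noise ignored, scenes match
print(similar(radar_a, radar_c))  # False: structurally different
```

An exact binary comparison would reject `radar_b` even though it depicts the same scene as `radar_a`; the hash-with-tolerance approach keeps it while still rejecting the genuinely different `radar_c`.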
Do we care? Perhaps. As data professionals, I suspect we will get more and more data of disparate types, including images, that will become part of our databases. There will be clients that need to search and query this data, which means we may need fuzzy search tools that work well.
New search tools will likely mean learning more about how to tune queries, or even indexes, that deal with data in a way that isn’t exact. I suspect there will be opportunities for those who learn to handle these types of problems effectively. There might even be really good paychecks that come along with those jobs.
It’s a bit disconcerting to think that we might want computer applications that don’t work exactly as we expect with regard to calculations. However, I think the science of uncertainty will become more relevant and valuable to us as data professionals as we deal with lots of data that is more complex than a simple addition problem.