The events of recent years have shown that with great technology comes great benefit, and great risk. Even the best services and tools can be used in ways that have the opposite consequences of what was intended. And that risk gets compounded once the genie is out of the bottle; putting it back becomes extremely hard. In most instances it is not even possible.
For that reason we need to design products, services and tools in a different way. Just as we have long made security a key component of how we think about designing systems, we should also make what I would call the flipside a key consideration: How could this be exploited to evil ends, and what can we build into the product or service that will help prevent that?
I think this is both a necessary thing to do and a potential gamechanger for many. Trust has eroded in a lot of the platforms and companies that have struggled with 'doing no evil', and tomorrow's winners will be those that serve an entirely good purpose and – by design – prevent evil exploitation.
Launching an entirely new research area into machine behaviour, as suggested by MIT Media Lab, seems like an obviously good idea. Because the more we leave to machines, the better we need to understand the decisions those machines make, the rationale behind them and the impact they will have on our outcomes.
Forcing ourselves to understand machine behaviour may also be the best backstop we have against machines completely taking over in a 'Terminator'-like scenario. Because even if we agree we should never get to that point, I am not confident that this isn't exactly what could happen a few decades from now (though without necessarily resulting in the dystopian scenarios Hollywood likes to present on the big screen).
It would also inject some much needed 'softer' fields of study into the world of engineering and computing, which I think we need. Not so much to keep things in check as to make sure that we really utilize technology to help us solve really big problems with a massive impact – while we, humans, remain firmly in control.