Manual labour is difficult and tiring. Emotional labour is also difficult and tiring, especially for companies that are more concerned with liability and efficiency than with the emotional well-being of their contractors, or soon-to-be-former contractors.
Yet over the last few years, the AI hype machine has started to falter, and the mystique surrounding AI has waned. That doesn't mean the technology will go away; far from it. However, our understanding of it, and more importantly our relationship with it, is starting to evolve substantially.
Yet what if our perception of predictions was itself biased in favour of the predictions? What if we become so wedded to a prediction that, much like target fixation, the act of predicting influences the outcome being predicted?
What if you could choose your own algorithm? Not just configure a single algorithm that governs a social media platform or website, but actually bring your own algorithm, or select an independent one, to sift through and organize the information available on a platform?
The consequence of believing the myth that data (or AI) is neutral is that it discourages people from asking what agenda that data serves, or what biases it may carry.