Learning is what makes us human. Our ability to adapt and change has given us an evolutionary advantage when facing adversity or threats. While this is particularly relevant during the current pandemic, I wonder how this crisis will impact our learning institutions.
Yet over the last few years, the AI hype machine has started to falter, and the mystique surrounding AI has waned. That doesn’t mean the technology will go away; far from it. However, our understanding of it, and more importantly our relationship with it, is starting to evolve substantially.
Yet what if our perception of predictions were itself biased in favour of the predictions? What if we become so wedded to a prediction that, much like target fixation, the act of predicting influences the outcome itself?
What if you could choose your own algorithm? Not just configure a single algorithm that governs a social media platform or website, but actually bring your own algorithm, or select an independent one, to sift through and organize the information available on a platform?
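To make the idea a little more concrete, here is a minimal sketch, in Python, of what "bring your own algorithm" might look like: the platform exposes its pool of posts and accepts whatever ranking function the user supplies or selects. Everything here (the `Post` structure, `build_feed`, the example algorithms) is a hypothetical illustration, not the API of any existing platform.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Post:
    author: str
    text: str
    timestamp: float
    likes: int


# A "bring your own algorithm" hook: the platform hands over the raw pool of
# posts and lets the user plug in whichever ranking function they prefer.
RankingAlgorithm = Callable[[List[Post]], List[Post]]


def build_feed(posts: List[Post], algorithm: RankingAlgorithm) -> List[Post]:
    """Assemble a feed using a user-chosen algorithm instead of a fixed one."""
    return algorithm(posts)


# Two interchangeable algorithms a user might select, or replace entirely.
def chronological(posts: List[Post]) -> List[Post]:
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)


def most_liked(posts: List[Post]) -> List[Post]:
    return sorted(posts, key=lambda p: p.likes, reverse=True)


if __name__ == "__main__":
    pool = [
        Post("alice", "Hello world", 1_700_000_000.0, likes=3),
        Post("bob", "Fresh take", 1_700_000_900.0, likes=1),
        Post("carol", "Popular post", 1_699_999_000.0, likes=42),
    ]
    # The same platform data, organized by whichever algorithm you bring.
    for post in build_feed(pool, most_liked):
        print(post.author, post.likes)
```

The design choice being sketched is that the platform becomes a data provider, while the logic that decides what you see belongs to you, or to an independent party you trust.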
As with any powerful tool, or in this case research methodology, there is a responsibility to act ethically, or morally, while acknowledging that ethics and morals are themselves subjective.