Koreo Futures: Exploring the Future of Meaningful Work

The rise of emerging technologies is disrupting current models of living, influencing the way we expose, share and communicate our daily lives. Currently, discussion around the future seems to take two distinct avenues: for some, it’s utopian, while for others, it’s dystopian.

Koreo is a talent consultancy based in London dedicated to social change. Their work revolves around amplifying impact through developing people, organisations and networks. Like us at SIX, they tap into the potential of networks and connections to leverage social innovation.

At this recent evening event, Koreo offered a radically different way of discussing the future, one which enabled participants to think and work in active ways to achieve a fair, tangible future for society.

I left the evening empowered, but also critical of the fixed discourse on the future we face today. It prompted me to reflect on the image of the future that is so commonplace, and encouraged me to think beyond the boundaries, to question: how can we ensure that our futures are fairer, more open and more inclusive for everybody whose voices are not heard in today’s discussion on tomorrow?

The evening began with interactive experiences that encouraged participants to explore the possibilities of emerging technologies as tools of empowerment. The interactive experience featured Preloaded’s virtual reality recreation of Amedeo Modigliani’s studio. Pepper, a humanoid robot, and Paro, a therapeutic seal robot, were also available for visitors to interact with. Exposure to these sophisticated machines allowed participants to imagine what coexistence with these technologies might be like in the future.

The event also featured an exhibition by the finalists of the Koreo Prize 2017. The displayed work focussed on six pressing social issues facing the UK today (gender equality, social mobility, community resilience, food security, social housing and wellbeing) and proposed forward-thinking solutions. For instance, all of the food and refreshments provided for the event were prepared from food that would otherwise have gone to waste. Such detailed execution allowed participants to fully immerse themselves in the experience of the coming-together of social and technological fields.

The panel discussion featured two speakers: Brittany Smith, Senior Policy Analyst at DeepMind, and Campbell Robb, Chief Executive at the Joseph Rowntree Foundation. The major points of discussion were:

1. Technologies may evolve, but power structures won’t

There’s a widely discussed debate on whether emerging technologies such as blockchain or open data could potentially lead to more equality and fairness. Campbell Robb argued that the future will remain as uneven as it has always been: “when we think about the future of work, we need to remember that the work is already broken.” It is easy to imagine what we want the future of work to look like for ourselves: a promotion, a salary raise, a new business opportunity. We then realise that the majority of the world’s population don’t even have that luxury. “You’re not the future of everybody’s work,” he claimed. In fact, much of the conversation about the future of work, home, education and health is owned by those who already occupy the space. So what can we do? “Prepare for the future of work now” was the message communicated to the audience. And to do this, we must recognise who is not represented and whose voices aren’t heard. Recognising the absence allows us to then identify our next steps, and whether technologies can address these inequalities.

2. Technologies aren’t neutral

As a policy analyst at DeepMind, Brittany Smith’s concern focussed on the ethical implications of artificial intelligence. Her statement that “AI should remain under meaningful human control” provided a glimpse into the complexity of AI work. She also spoke about examining the existing power structures of today.

Technologies aren’t fair unless we design them to be. AI, as it stands, is ultimately created by humans with their own belief systems, ideologies, ingrained preferences and biases. One particular example came from the US: data algorithms used in the criminal justice system have been shown to over-predict the risk of recidivism for African Americans, and under-predict it for white citizens. Biased data sets will only produce biased results, even when they’re applied to new technologies. Without critically analysing the raw data, technologies risk perpetuating the long-standing social challenges we already face today.

3. Inclusion and diversity

There is a lot of work to be done if positive changes are to be brought about in the world. “What change do we want to see in the world?” should always be the question raised, for new technologies are means to an end, not ends in themselves. Brittany Smith emphasised that AI in the future of work needs to consider privacy, transparency and fairness, and the only way to ensure these principles are coded into practice is to make the process more inclusive. “Diversity is key,” she commented, for the only way people can get out of their institutional silos is to make the space more diverse and inclusive.

4. Examining the “nots” and the “nons”

There exists within this debate a number of moral questions around the necessity of AI. Often it is a matter of asking the right moral questions, such as “should it?” and “do we want to?” Digital technologies aren’t a necessity for successful innovation, so why default to them? We often talk about what AI can do, but sometimes the conversations are even more valuable when we talk about what it cannot do. To return to an earlier point: do we even need recidivism risk algorithms in the first place? Critical questions like these are important in uncovering the tacit knowledge that goes into the making of these digital inventions.

5. Exercise your power

So what can users do? Brittany Smith reminded participants that as users, we have the power not to participate. Instead of giving their choice and power away, users should think more critically about when and how they participate. If you’re invited to a space that only reinforces the dominant discourse (of technologies, futures of work, gender), then perhaps that invitation could be passed on to somebody whose voice isn’t heard.

6. Change the success measure

Undoubtedly, if success measures are framed by quantitative factors, then economic value is likely to outweigh social value, or at best be weighted equally with it. Having said that, both Campbell Robb and Brittany Smith suggested an alternative framework: what if a success measure for companies was “radical transparency” (i.e. forcing companies to establish open, more transparent relationships with their customers)? Rather than treating social good as an optional extra, it should be ingrained as part of everyday organisational practice.

It goes without saying that the problems society faces today – climate change, social injustice, resource scarcity, mass migration, political mistrust, nuclear waste – will, for better or worse, continue to be our major concerns in the future. Unfortunately, new technologies won’t solve these challenges unless we take ownership and responsibility to do something now. Bringing positive change to the world means starting the process by addressing what kind of impact we want to make and who will benefit. In imagining the future, we must never forget that a large percentage of the population is missing from this dialogue. Only by tackling existing challenges, questioning the status quo and acting proactively can we begin creating a future which does not limit itself within rigid parameters.