How do you successfully implement a digital strategy? What are the pitfalls? And what is the danger of relying too much on algorithms? We get answers to these questions from Etienne Fouarge, co-founder and director of the data-consulting company LARKinfolab, and Roman Briker, Assistant Professor in Organisational Behaviour & HRM at Maastricht University School of Business and Economics.
“In this digital age, practically every company requires a digital strategy. Digitalisation can support existing business operations, for example by improving certain business processes, such as self-service checkouts in supermarkets.” Fouarge shares an example: “Think of Picnic with its delivery service, which has no physical branches and has caused disruption in the supermarket landscape. New technologies often lead to completely new business models that shake up an entire sector.”
Strategy implementation as the decisive factor
“Existing companies are forced by digitalisation to rethink their business model. The transition from a traditional to a digital environment also requires a cultural shift, and getting people on board with this change is the most difficult aspect. The human and organisational factor is decisive in the success of a new, digital strategy. Of the questions of why, what and how you are going to change, the ‘how’ is the most important. It is not so much the companies with the best strategy, but the companies that best implement their strategy, that are successful in the digital age,” says Fouarge.
Race with the machines
“People are, wrongly, afraid of being replaced by machines. In this light, people sometimes speak of a ‘race against the machines’. I prefer to speak of a ‘race with the machines’. In 2005 there was a chess tournament in which each team could consist of any number of people and computers. It was won by two amateur chess players and their computers: they closed the gap with the grandmasters through their expertise in working with computers. The best results are achieved by a combination of human and machine. This also applies to business,” explains Fouarge.
Who can and wants to work with artificial intelligence?
“That cooperation between humans and artificial intelligence – hybrid intelligence – requires different skills from employees,” says Roman Briker. “Some people are afraid that they will be replaced by Artificial Intelligence (AI). In my opinion, the question of whether you will be replaced is irrelevant. It is better to ask yourself what it takes to successfully work with AI. Incidentally, not everyone can or wants to work with AI. It’s interesting to ask what type of person and what skills are needed for successful collaboration,” he continues.
Algorithm aversion
“In today’s job market, qualities like reliability, diligence and punctuality are key to success. Research shows that skills like these become less important in an AI environment, because an algorithm masters them far better than a human does. Socio-emotional skills, for example, become much more important instead. There is also a phenomenon called algorithm aversion: people are reluctant to let algorithms make important decisions. Even when an algorithm is proven to perform better than a human, people still tend to leave the decision to a human,” says Briker.
Don’t blindly trust algorithms
“In principle, AI is a positive development. But if it moves from an assisting to an advisory role, there is a danger that we will blindly follow its advice – the opposite of algorithm aversion. This is particularly dangerous when it comes to moral issues. Ideally, an algorithm should be less biased than a human. But if an algorithm discriminates, the consequences are greater than if a human does, for example when hiring people. If a company has worked mainly with men for thirty years, an algorithm trained on that history is more likely to predict that a man is more suitable for the job. And even if AI is less biased than a human, its speed and scale can have enormous consequences. Human intervention is needed to understand and interpret data, and to look at social and demographic factors, among other things,” warns Briker.

“The question for now and for the future is: how can people and AI work together and establish a relationship based on trust? For this, it is important that people understand how an algorithm works and can at least influence it,” he concludes.
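To make the hiring-bias mechanism Briker describes concrete, here is a minimal, purely illustrative sketch. It uses synthetic data and a standard scikit-learn classifier; the data, feature names and model choice are assumptions for demonstration only and are not taken from the interviewees. It shows how a model trained on historical decisions that favoured men ends up scoring a man higher than an equally skilled woman.

```python
# Simplified illustration: a classifier trained on historically skewed hiring data
# learns to prefer the over-represented group, even when only skill should matter.
# All data here is synthetic and exists purely for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Features: gender (1 = man, 0 = woman) and a skill score unrelated to gender.
gender = rng.integers(0, 2, size=n)
skill = rng.normal(0.0, 1.0, size=n)

# Historical hiring decisions: driven by skill, but with a built-in preference
# for men (the "thirty years of mostly hiring men" from the interview).
hired = (skill + 1.0 * gender + rng.normal(0.0, 0.5, size=n)) > 1.0

X = np.column_stack([gender, skill])
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill, differing only in gender.
man, woman = [[1, 0.5]], [[0, 0.5]]
print("P(hire | man):  ", model.predict_proba(man)[0, 1])
print("P(hire | woman):", model.predict_proba(woman)[0, 1])
# The model scores the man noticeably higher purely because of the historical
# skew in the training data, not because he is more suitable for the job.
```

Run at speed and scale, a model like this would reproduce the historical skew across every application it screens, which is exactly why Briker argues that people need to understand how an algorithm works and be able to intervene.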