Rachel Neaman: Digital Transformation Specialist
AiLab interviewed Rachel Neaman, the Director of Neaman Consulting and non-executive Board Member for the Campaign for Social Science, UKCloud Health, DigitalHealth.London and Digital Leaders. Rachel has also featured in Computer Weekly’s list of Most Influential Women in UK IT for 2016, 2017 and 2018.
We met up in Adelaide, South Australia, whilst Rachel was visiting from the UK, to find out about Rachel's extensive leadership, as well as her insights and work in digital transformation, inclusion, diversity and social good.
This interview took place in September 2018 with Dr. John Flackett.
AiLab: Hi Rachel. Can you tell us a bit about yourself and your current role?
RN: Hi John. I have been working with all forms of digital technology for over 25 years, although I am not a technologist. I actually started my career in communications, and my interest in digital technologies and interfaces began when websites stopped being just a marketing front-end and started to become a tool for digital engagement. Since then the world has changed radically, and what I am interested in now is the pace of technological change and the effect it is having on society. For instance: What is the human impact of technological innovation? What are the consequences of the way we are using technology today and of the speed with which it is dominating our world?
AiLab: Is this what led you into the AI and emerging technology space?
RN: Absolutely. I worked in government for over a decade and was the Head of Digital at the Department of Health in the UK for a number of years. I was interested in how digital could support people to self-manage their health conditions and how it could augment the work that clinical staff do. From there I became the Chief Executive of an organisation called GO ON UK [Ed: now known as Doteveryone], founded by Martha Lane Fox, which focused on digital inclusion and on ensuring every citizen had at least a basic level of digital skills. At this time, the UK government was embracing 'digital by default' and more and more public services were being moved online. I started to be concerned for those people who couldn’t access those services online for a variety of reasons. The irony is that the people who most need those public services are in the groups most likely to be digitally excluded. I spent a lot of time thinking about how technology can be used to augment and support people, rather than threaten the way they live. AI is a great example of this.
AiLab: Is this the aspect of AI that you are most interested in?
RN: Yes, I am interested in how we can ensure that AI remains an ethical tool and an augmentative tool, rather than something that works against us.
AiLab: There is a lot of hype around AI at the moment. What do you think people should be aware of in regard to the hype?
RN: I think people need to gain a better understanding of what Artificial Intelligence actually is. There is an awful lot of hype and also misconception, which I think is largely caused by the language around how AI is described. The phrase 'Artificial Intelligence' makes people think of science fiction utopias or dystopias with super-robots taking over the world. Actually, AI can just be rule-based algorithms doing what we tell them to do, and their accuracy and utility depend on how well they have been programmed and the data they use. The processing power that AI provides is fantastic and fabulous – it transforms repetitive and boring processing jobs in a second, which is superb. What AI isn’t though, is some form of superior intelligence that is going to subjugate humans into an underclass.
AiLab: Do you think more education around AI is required to try and mitigate the misconceptions?
RN: Yes. Given that AI and machine learning are starting to dominate our world, we should ensure people are educated about how to live alongside these technologies, as well as look at the skills people need to survive in this kind of world. These are life skills, aptitudes and capabilities that can help people to understand the new world they are living in. We should be asking: How do you help people to become resilient to change? How do you help people adapt to and keep up with constant change?
The reality is that we will no longer have a ‘job for life’ and very little around us will stay the same for long. As a result, people will constantly need to refresh their skills – not only for the workplace, but also for their leisure time, for interacting with public services, for filling out tax returns, applying for a passport or driving licence, banking, shopping and so on. It’s not just the role of the formal education system to support these new life skills; they are something that people need to keep refreshing throughout their lives.
Of course, it’s hugely important that the formal education system gets a handle on this and starts to teach kids to be enquiring, to problem-solve, to think critically, use judgement, be able to tell fact from fiction, and to understand if something is fake news. We need to inculcate that from a very young age, but equally important is how we support people at the age of 45+ that are already in jobs when automation comes along and totally changes their role.
I don’t subscribe to the idea that the dominance of AI means we’ll all be unemployed; we’ll have different jobs and we need to be planning for and thinking about that now. How can we augment our inherent human capabilities to work alongside machines and increase the value we bring? Machines can do repetitive processing tasks far more quickly and accurately than people can, but there are still things that people can do far better than machines. What we need to work towards is an environment where human beings and machines work together as an augmented capability that has more value than either on its own.
AiLab: As a specialist in digital transformation, skills and inclusion, do you think we are making progress in these areas and where do you see the landscape heading?
RN: I think we are still very much in the early stages of this. We’re in the foothills and not at the summit – which is inevitable as the world is changing so quickly. We need to keep some perspective and accept we’re not going to get it all right today, because we don't know what’s coming down the track tomorrow. We have to learn to accept that we can’t be in control of innovation all of the time. We do need our institutions, decision-makers and regulators to be agile enough to adapt to change far more quickly than they have to date. The way our institutions, law-making and regulatory frameworks are set up is not suitable for the world we now live in.
AiLab: Is this more important because technology connects people across the world?
RN: Yes, very much so. Technology is making our world much smaller. I’m currently thousands of miles away from home and in a different timezone, but just by using my smartphone I can see and talk to my husband anytime I like, check my dog is not chewing the carpet, or take part in a meeting in real-time back in the office. However, there is a context to how we do and don’t use technology and local conditions and culture play an enormous part in this. At the same time, at a macro level, we are seeing major global implications – in terms of geopolitics, economics, and social change – as a result of our hyper-connected world.
AiLab: I’m glad you brought up local conditions. For instance, isn’t it difficult to get digital inclusion if there is little to no internet access in a rural area, versus high-speed internet available within a city?
RN: Yes, inclusion gets me really worked up. The UK for instance has worked hard on this, and 9 out of 10 households are internet connected. However, what about the 10% of households that are not? What about those people who live in a rural community where there is absolutely no connectivity at all? If we are living in a world dominated by technology, then access to technology is a utility in the same way that people need access to electricity and water. There are certain conditions of living that we expect everyone to have, and access to the internet should be one of those, because it plays such a dominant role in all our lives – but it is still not seen like that, despite political rhetoric. From a commercial point of view, it’s not in the interests of a large company to invest in connecting a tiny rural community. It’s not profitable. So what do people do who live in isolated and remote areas? What do their kids do who rely on the internet for their school work?
Internet access is a lifeline and a utility, yet the inequalities it creates are often driven by commercial interests rather than by logic. Inclusion is absolutely critical and access to connectivity is a hugely important factor. However, people also require the skills and confidence to be able to use the technology once they do have access to it.
The other issue that’s often forgotten is affordability; for example, if people are struggling to get by then they won’t want to spend money on locked-in contracts – so then what do they do? We have to ensure universal access, affordability and skills as a baseline before we start worrying about ultra-fast speeds and how sophisticated the technology is. These are really basic hygiene factors to put in place before people can start to make the most of technology to improve their personal, social, economic, political and cultural lives.
AiLab: Great point – as the world is moving into a distributed workforce, social and economic inclusion is really important. How do you see this fitting in with future jobs?
RN: Digital is completely changing the workplace and the workforce, and we all need to understand and be able to use digital technologies in a way we haven't before. It’s not just technical teams that need digital skills. For example, you may work in a warehouse, but you still need to use a stock control system. There is a whole issue here around leadership and ensuring that senior leaders really understand what digital means for their business strategy, rather than delegating responsibility and saying 'the IT team can sort that out'. Similarly, technical staff have to understand the business and its needs in order to ensure that whatever they are implementing and supporting is fit for purpose. So there is a changing relationship with 'what’s our strategy, how do we deliver it and what’s the outcome?'
One thing we’ve seen in the UK is that small and micro businesses are unable to afford the time or resources to keep training their workforce in new digital ways of doing things. Technology enables businesses to adopt new ways of working, but although this may make their organisation more profitable and may improve their customer relationships, the workforce can’t take the time out to do the training because they are trying to keep the business going. So it can be too glib to say small businesses need to invest more in training their workers. These sorts of issues affect people at the sharp end in a way that governments, regulators and big businesses don’t notice and don’t realise – this is when inequality starts to take hold.
Technology is a great democratiser and the internet is the most democratic thing that we could possibly have. Yet it creates divides, because we don’t necessarily have the cultural conditions that allow everybody to benefit at the same pace and in the same way. There is more potential for a digital divide now. The impact if you are not digitally included or able to make the most of technology is far greater today than it ever was before. So that divide is getting much deeper and the inequalities within society are becoming exacerbated rather than resolved through technology.
AiLab: What do you think can be done about the digital divide?
RN: Well, it’s a cultural and social issue as much as it is a technology issue, and is very much about the society we choose to live in and how we run that society. How do we reinvent our world for the technology that can better enable it? What are the norms, the ethics, the standards and regulations we need? Technology itself is a neutral thing – it’s neither good nor bad. It’s how it is developed, deployed and marketed that can have negative or unintended consequences. Therefore, it’s important that we know who is consciously taking the political, ethical and cultural decisions about our technology-enabled future. Otherwise it will be left to the large tech companies to shape our world. We are beginning to see progress; in the UK, the government has established a new Centre for Data Ethics and Innovation, and many other not-for-profit, civil society and academic organisations are beginning to look at these issues.
How do we start to codify, and I will use the ‘R’ word – 'regulate' – for this world, but in a way that is positive and constructive, rather than punitive and negative? I don’t have the answers – I wish I did and I don’t think anyone does yet. Plenty of people are now thinking about these issues which is a great step forward.
AiLab: What are your thoughts on realistically addressing these issues?
RN: I think we need to start by acknowledging the reality that we are living in, so there is a bit of ‘getting real’ about our world. Realising that we are not in control of everything in our environment, but that technological innovation is accelerating and affecting how we live whether we like it or not. I’m really interested in this whole area and spoke about this topic in a recent talk. For example, there are positive and negative consequences of all technology and innovation. Big data is fantastic because it can be used to make connections between things and helps with amazing medical research for instance, but if businesses start investing in the use of big data, they can become more vulnerable to hackers and cyber threats. We also have fake news, filter bubbles and echo chambers; people think that their world view and what they are reading is balanced, but of course it can be just reinforcing their own point of view.
There is a real need for people to understand the reality of this world and that technology is the enabling medium and will do whatever we ask and tell it to do. We’ll continue to innovate and push the boundaries of what technology can and can’t do, but we’ve got to put in place cultural standards around how we use it. It starts with awareness and understanding, but it also involves a change in the education system. In addition, and perhaps more importantly, a change in people’s expectations of how their life is going to develop. We are beginning to become more conscious and more aware of the implications of the technological world. For instance, people are now ‘detoxing’ from social media and realising that it’s not essential to have 24/7 information.
The other element that is becoming increasingly understood is the impact of our use of technology on our brains. The whole dopamine effect – the instant gratification that you get from somebody instantly ‘liking’ your tweet or Instagram post – is in itself addictive: it triggers a dopamine release and physically changes the transmitters in your brain. So the implications are quite shocking if using some digital technologies is actually changing the way our brains function.
AiLab: Indeed and if AI is being used to drive that, then we have machines changing the way we act. People don’t necessarily realise they are being micro-targeted?
RN: Exactly and everyone using this technology has to understand that. It’s a bit like driving a car; drivers do need to have some knowledge of how the car works. For example, where to put the petrol in, how to pump up the tyres, etc. However, many people use technology without knowing where the ‘petrol’ goes and trusting it blindly. It takes something like the Facebook and Cambridge Analytica scandal for people to realise that these technologies are complex. Not everyone will necessarily understand all the facts behind these stories, but it creates a moment in time when people realise ‘gosh’, this is not quite what it seems. Taken to the other extreme however, we’re also seeing a ‘techlash’ – people now saying they don’t trust any social media companies and that new technology is all terrible – and that is not right either. So, at what point should people take personal responsibility for understanding what they are using?
AiLab: So there are two issues really – one is the frameworks and policy that protect citizens. The other is that we also need to take personal responsibility to ensure we are doing the right thing and we are educated about it?
RN: Yes, so it comes back to education – but not just in terms of the formal education system. To say that you only need to be educated between the ages of 4-16 is just nonsense. As a child and young person you get bombarded with learning for those intense 12 years. But what happens after that? We need to encourage much greater lifelong learning and who takes responsibility for that? Is it the workplace? Is it a local community? Is it the government? I don’t think there has ever been a time before where this has been so necessary. So are we going to see refresher courses in life skills as changes to how we live, work and play start to become more frequent – who knows? I do think we’re on the cusp of a real revolution in learning and training, which has to be driven by technology; both in the way we access and provide the learning and teaching. How do you equip a workforce and a society to remain relevant as the world around it moves on once individuals are out of the formal education system?
AiLab: With things changing so quickly how difficult is this? For example, providing certification for learning when it is distributed and would you even need formal qualifications?
RN: This is why I think we’re about to see a complete revolution in the way that education is being undertaken. It should be much more about personalised learning and what is relevant to that individual and the context in which they live. What is the value of knowledge today and what is important? We need to be equipping people with the skills to live and adapt to an uncertain and changing world. Given the rise in AI and machine learning, the attributes that humans need to remain relevant and viable are the ‘softer’ skills that are currently not universally taught in formal educational environments.
AiLab: Is there anything else you would like to share with AiLab readers?
RN: There is so much to say on all of this. My overriding sense is that society is still feeling its way and is still inevitably ignorant of the consequences and impact of the technology we are using and developing. How do we make incremental and positive steps forward when so much remains unknown? I guess that in 50 years' time, people will look back at 2018 and think how earnest and excited we all were about technology, yet how little we understood either how it worked or the impact we were creating for future generations.
AiLab: Thanks so much for your time Rachel. Enjoy the rest of your time in Australia and have a good trip back to the UK. Look forward to catching up again over there.
RN: Thanks John. Look forward to it.
This interview is copyright to koolth pty ltd & AiLab © 2018 and may not be reproduced in whole or part without the express permission of AiLab.