Ten ways from fear to human tech

The digital revolution has given us so much, but has brought with it some serious downsides – online safety fears, addictive devices, surveillance and an increasingly uncertain future. Eva Appelbaum and Jess Tyrrell share 10 challenges and 10 ideas for tech change, from our new Power & Responsibility discussion paper.

Big tech – the five top-performing tech stocks Facebook, Apple, Amazon, Microsoft and Google – has without doubt amassed enormous power over the last 20 years.

With great power comes great responsibility, and pressure is mounting on these and other technology companies to do more to mitigate the often unintended but still harmful consequences their products and services have on society.

In our discussion paper, drafted for DigitalAgenda’s Power & Responsibility summit in October 2018, we outline what needs to change and present some ideas to make this happen. 

For many years much of the world has been mesmerised by the rise of digital technology. “Disrupt” has been a positive battle cry. We lived in a state of tech worship, devouring products, sharing data, looking on uncritically as successful tech founders became the heroes of the modern world.

In the era of tech worship, tech companies developed largely outside regulatory constraints, the most successful of them amassing huge fortunes and power. Some of these visionaries would prefer to live in a tech utopia, a place with no governments, no checks on capitalism, where engineers rule.

But the tide has started to turn, and we have seen a growing backlash – tech fear – unleashed by many concurrent triggers: the exposure of ‘fake news’, data breaches, growing discontent with wealth inequality, warnings of mass job losses to AI, frustration over tax evasion and the unaccountable power of Silicon Valley billionaires.

In the UK, legislators are strengthening their response: home secretary Sajid Javid, for example, has called on social media companies to do more to tackle online child abuse, or face regulation. Regulation looks inevitable.

In the US, the Cambridge Analytica showdown has been followed by another major data breach at Facebook, subjecting Mark Zuckerberg and his company’s operations to further scrutiny.

Even those who were previously champions of digital technology, indeed responsible for much of its very infrastructure – Tim Berners-Lee, Jaron Lanier, even Elon Musk – have themselves sounded notes of caution.

Neither tech fear nor tech worship does us much good.

We put forward another option, in which the benefits of technology are balanced against the public goods and social protections integral to serving our human needs, and in which the industry engages with society, meets its responsibilities and upholds collective systems. We call this a state of human tech: unlike tech utopia, it balances power and responsibility between technology creators and collective systems.

We may only be in the foothills of the digital age, but we have moved firmly beyond its heady early days, and it is high time for an inclusive and meaningful debate about how we want technology to serve us.

So, what can we do about it? The first step is to admit we have a problem.

We have identified 10 key challenges across the individual, society and business spheres. This list is by no means exhaustive, but we highlight these as a useful place to start. The challenges range from social media harms and tech addiction to echo chambers, privacy, automation and monopoly power.

The greatest of these is the existential threat posed by the future of AI, which goes to the heart of who we are.

Alongside the challenges, we publish a list of 10 ideas for change. There is a positive way forward. We suggest the focus be on improvements in the fields of education, regulation and design.

Technology has moved quicker than our ability to make sense of it, to frame it, to apply values to it. The industry itself is not incentivised to look for evidence, or to research scenarios, that may threaten its business interests.

But human tech will demand that it does. Responsible companies will engage with this agenda, and we believe there are many commercial advantages in doing so.

A new narrative of power and responsibility should interpret the future optimistically and harness the creativity of innovators, while ensuring that an overriding ‘duty of care’ for humans and society is woven in.

In particular, we call on tech companies to adopt a more mature attitude to regulation. The right regulation can stimulate innovation rather than stifle it. By working more closely with regulators, companies can build public trust, develop better products and win over consumers.

It is no longer acceptable for the disruption caused by new technology simply to happen to us, with no agency to shape the outcomes. It is in all our interests for the industry, innovators, educators, regulators, civil society and governments to work together to create our collective future.

The sophistication of the discussion around how tech is affecting us has grown rapidly over the past few months. We look forward to discussing how we in the UK can move beyond tech worship and tech fear, from a myopic tech utopia to a fully fledged human tech.

Our starting point is setting out 10 challenges we face – and 10 opportunities for change. They are listed below – and explained in more detail in our discussion paper – open for comments until 31 October 2018. Read the full paper and add your comments here.

10 challenges we face

  1. It’s not safe enough online
  2. Not paying tax is unfair
  3. Power sits in the hands of monopolies
  4. The echo chamber dominates
  5. We must face up to tech addiction
  6. Social media impacts mental health
  7. Big brother is watching us
  8. Business models are growing inequality
  9. Automation makes future work uncertain
  10. Existential risks are real

10 opportunities for change

  1. Prepare citizens of the future
  2. Re-skill workers through lifelong learning
  3. A public health approach to tech risks
  4. Design innovative regulation with industry
  5. Use cities as test beds
  6. Create an overarching AI ethics committee
  7. Set up a commission into technology regulation
  8. Create a Hippocratic oath for data scientists
  9. Devise a code of practice for product designers
  10. Redesign data-led business models

We want to keep the discussion around this paper open to the community: comments close on 31 October 2018.

Eva Appelbaum is co-founder at the Arc Group, @evaapp. Jess Tyrrell is a digital advisor, @jesstyrr

#PowerResponsibility18
