
Mozilla Research Grant secured for project building shared understanding around trust and distrust

last modified Jul 12, 2018 03:12 PM
abstract image of bubbles — vulnerable and linked, like ‘chains of trust’

The Trust & Technology Initiative has been awarded a Mozilla Research grant for a new project to help bridge sectors and disciplines when exploring issues of trust around internet technologies. We are very grateful for Mozilla's support, which will enable us to create lively and accessible materials describing different facets of trust, distrust, mistrust and trustworthiness, and we'll be sharing everything we create under an open licence.

Why does this matter?

To build internet technologies which work well for society, we need to have more effective collaboration across different disciplines and sectors, connecting technology development, social science and humanities, policy-makers and more. Today, we lack the shared understanding and terminology to make this work around one of the most critical concepts: trust.

Trust in technology and the organisations which make it is essential for a future in which the internet is inclusive, supportive, diverse, and benefits everyone. A good understanding of trust and trustworthiness, and also distrust and the dynamics of trust, is essential. Trust is the basis on which all people, and organisations, engage and transact online — whether this is for work, play, civic and democratic duty, learning or caring. It is critical that the diverse groups working to build next generation internet systems and governance understand trust, and are able to discuss aspects of it when designing technology and internet infrastructure, and the communities and organisations which build and operate it. This enables the design of good governance structures and the creation of appropriate accountability for connected systems.

This project seeks to build a shared understanding of trust and distrust and the dynamics around them, by surveying research across disciplines, and identifying and showcasing relatable case studies and examples which highlight behaviours, attitudes and challenges around real human experiences of trust and distrust.

Mozilla is an excellent partner to have for this work.

As the Mozilla Manifesto notes, recent internet developments include both aspects of its early promise and challenging problems. The future internet needs to respect people and communities more than it does today, tackling inequalities arising from information power and surveillance, and it needs to be built to be more accountable. Accountability means good governance, transparency in some areas, and a balance of power with means of protection and redress for those who need it. Mozilla identifies this as the need to “recognize and steward the Internet as a critical public resource.” In democracies, the tension between power, trust and distrust is essential to creating institutions that can wield power and yet be held to account. Should we build governance structures for the internet that reflect this for the new power centres online, which are often not publicly owned or controlled but instead held by dominant corporations or powerful founding individuals? To even consider this, we need to examine trust and power - and to do so across boundaries.

Trust and distrust, and the dynamics around them, are a key part of the human experience. To create what Mozilla describes as “an Internet that truly puts people first, where individuals can shape their own experience and are empowered, safe and independent” demands that we have an appreciation for trust, to enable greater reliance on trustworthy systems, and detection of those which are not.

This is not simply a technology challenge. It is a challenge about society, organisations and people. To address it requires interdisciplinary and cross-sector collaboration - bringing together work on next-generation technologies and their underpinnings; the many dimensions of trust at individual, organisational and societal levels; and the legal, ethical and political frameworks impacting both trust and technology.

Such collaboration is extremely hard, demanding that fundamental differences in understanding, values and work practices are bridged. We propose in this project to address one small but important piece of that - to build shared understanding around concepts of trust.

The Trust & Technology Initiative has been learning from researchers in and around Cambridge since our work started earlier this year. In our discussions with scholars across the political and social sciences, arts and humanities, and with computer scientists and technologists, we've found that there are often real misconceptions about basic ideas of trust, which impede collaboration.

Technology developers (and politicians) often assume that trust can be built, and that more trust is necessarily good. But psychologists and political scientists see this as naive. Taking an example from outside technology: democratic institutions are designed for a dynamic in which the public often will not trust politicians (partly because of the power they wield), so checks and balances are built around the changing trust/distrust landscape. Philosopher Onora O'Neill suggests we think more about trustworthiness and less about trust. She notes the difference between what people say about trust and what they do (which renders surveys of trust ineffective).

In this project we will explore existing academic research, and also explore the use of concepts of trust, distrust, and trustworthiness in technology development, media and policy, in order to fill the understanding gap.

This will include topics such as trust in media online, challenges with networked trust, conspiracy theories, the tensions between trust and distrust in power relations, trust making and breaking in cooperative activities, cons and scams, trust in open source communities and collectives as alternatives to corporates for technology provision, how concepts of trust and confidence translate (or don't!) across languages, and how trust issues drive people to adapt technology to their needs.

We’re not seeking to create a single universal definition of trust. Such a thing is impossible; cultures and communities have such deep and varied experiences of trust that there is no single shared definition that would make sense. Instead, we’re looking to illustrate the different facets of experience and understanding around trust, distrust and mistrust, to help communities involved in building, designing, governing and evaluating internet technologies to better understand each other.

We’ll be working in the open as much as possible on this project, and will be gathering ideas for content and feedback on draft work from others to make sure our outputs are useful for different groups. Follow Dr Laura James or the Trust & Technology Initiative on Twitter to stay up to date, and get in touch if you would like to be more actively involved.


About us

The Trust & Technology Initiative brings together and drives forward interdisciplinary research from Cambridge and beyond to explore the dynamics of trust and distrust in relation to internet technologies, society and power; to better inform trustworthy design and governance of next generation tech at the research and development stage; and to promote informed, critical, and engaging voices supporting individuals, communities and institutions in light of technology’s increasing pervasiveness in societies.  

Mailing list

Keep up to date with our news, events, and activities.

Sign up here