Talking about trust

Dr Laura James
Trust & Technology Initiative
 

The Trust & Technology Initiative has been awarded a Mozilla Research grant to help bridge sectors and disciplines exploring issues of trust around internet technologies. The project seeks to build a shared understanding of trust and distrust, and the dynamics around them, by identifying and showcasing relatable case studies and examples which highlight behaviours, attitudes and challenges from real human experiences of trust and distrust.

Why does this matter?

To build internet technologies which work well for society, we need to have more effective collaboration across different disciplines and sectors, connecting technology development, social science and humanities, policy-makers and more. Today, we lack the shared understanding and terminology to make this work around one of the most critical concepts: trust.

Trust in technology, and in the organisations which make it, is essential for a future in which the internet is inclusive, supportive, diverse, and benefits everyone. That requires a good understanding of trust and trustworthiness, and also of distrust and the dynamics of trust. Trust is the basis on which people and organisations engage and transact online, whether for work, play, civic and democratic duty, learning or caring. It is critical that the diverse groups working to build next-generation internet systems and governance understand trust, and are able to discuss aspects of it when designing technology and internet infrastructure, and the communities and organisations which build and operate them. This understanding enables the design of good governance structures and the creation of appropriate accountability for connected systems.

Recent internet developments reflect both the internet's early promise and some challenging problems. The future internet needs to respect people and communities more than it does today, tackling the inequalities created by information power and surveillance, and it needs to be more accountable. Accountability means good governance, transparency in some areas, and a balance of power with means of protection and redress for those who need them. Should we build internet governance structures that reflect this, designed for the new power centres online, which are often not publicly owned or controlled but held by dominant corporations or powerful founding individuals? To even consider this, we need to examine trust and power, and to do so across boundaries.

Trust and distrust, and the dynamics around them, are a key part of the human experience. We need a better appreciation of trust to enable greater reliance on trustworthy systems, and detection of those which are not. This is not simply a technology challenge. It is a challenge about society, organisations and people, and addressing it requires interdisciplinary and cross-sector collaboration.

The Trust & Technology Initiative has been learning from researchers in and around Cambridge. In our discussions with scholars across the political and social sciences, arts and humanities, and computer science, and with technologists, we have found real misconceptions about basic ideas of trust, and these misconceptions impede collaboration.

Technology developers (and politicians) often assume that trust can be built, and that more trust is necessarily good. Psychologists and political scientists see this as naive. Taking an example outside technology: democratic institutions are designed for a dynamic in which the public often will not trust politicians (partly because of the power they wield), so checks and balances are built around the changing trust/distrust landscape. Trustworthiness – that technology, systems and organisations are honest, competent and reliable – is a more valuable concept than trust.

We’re exploring trust in media online, challenges with networked trust, conspiracy theories, the tensions between trust and distrust in power relations, trust making and breaking in cooperative activities, cons and scams, trust in open source communities and collectives as alternatives to corporates for technology provision, how concepts of trust and confidence translate (or don’t!) across languages, and how trust issues drive people to adapt tech to their needs.

We’re not seeking to create a single universal definition of trust. Such a thing is impossible; cultures and communities have such deep and varied experiences of trust that there is no single shared definition that would make sense. Instead, we’re looking to illustrate the different facets of experience and understanding around trust, distrust and mistrust, to help communities involved in building, designing, governing and evaluating internet technologies to better understand each other.
