 

Cambridge Perspectives on Trust and Technology

In the run-up to our launch event, we asked researchers from Cambridge to give us their thoughts on trust and technology. These short pieces present their perspectives.

(You can also download a PDF of the complete set of articles, which launch attendees received on paper.)

  


Talking About Trust

Dr Laura James

Trust & Technology Initiative; Department of Computer Science and Technology

The Trust & Technology Initiative has been awarded a Mozilla Research grant to help bridge sectors and disciplines in exploring issues of trust around internet technologies. The project seeks to build a shared understanding of trust and distrust, and of the dynamics around them, by identifying and showcasing relatable case studies and examples that highlight behaviours, attitudes and challenges in real human experiences of trust and distrust.

Read more >

 


Trust, Technology and Truth-Claims

Dr Ella McPherson

Trust & Technology Initiative; Department of Sociology

My research focuses on the production, evaluation and contestation of truth-claims in the digital age, and my path into this tangled topic is the empirical case of human rights fact-finding.  Because it is so high-risk and so contested, this practice is a canary in the coalmine for wider professions and publics struggling to get to grips with the new information order.  Indeed, human rights practitioners working with digital evidence were sounding alarm bells about fake news well before the problem became mainstream and are at the cutting edge of verification methodologies.  A concern with trust (and with the associated concepts of trustworthiness and credibility) is at the centre of their work – and thus it is at the centre of mine.

Read more > 

 


Fundamentally more secure computer systems: the CHERI approach

Prof Simon Moore

Trust & Technology Initiative; Department of Computer Science and Technology

In collaboration with SRI International (California), members of the Computer Architecture and Security groups in the Cambridge Computer Laboratory have spent over eight years exploring fundamentally more secure ways of building computer systems.  Starting with a conventional microprocessor, we have augmented the hardware/software interface with new compartmentalisation primitives that allow security-critical properties of software to be better represented.  This means the hardware better understands the software it is running, and so is better able to run the code as the programmer intended, not as an attacker has tricked it into behaving.  Compartmentalisation allows software to exploit the principle of least privilege, a fundamental idea in computer security dating back to the 1960s but poorly supported by earlier computers.
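As a loose illustration of that least-privilege idea (a sketch only, not CHERI's actual API; the packet structure and function names below are hypothetical), consider a parser for untrusted input that is handed only the slice of memory it needs, so a bug in the parser cannot reach the security-critical fields next to it. On a conventional machine this boundary is merely a convention; on a CHERI system the narrowed pointer becomes a hardware capability whose bounds and permissions are checked on every access.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical record: the header is security-critical, the payload is
     * attacker-influenced input. */
    struct packet {
        uint32_t session_id;    /* must not be corrupted by the parser */
        uint8_t  payload[256];  /* untrusted bytes to be parsed        */
    };

    /* The untrusted component receives ONLY the payload slice (least
     * privilege).  On conventional hardware this is just a convention;
     * with CHERI the pointer would carry hardware-enforced bounds, so a
     * stray write past buf + len traps instead of silently overwriting
     * the neighbouring session_id. */
    static int parse_payload(const uint8_t *buf, size_t len) {
        size_t count = 0;
        for (size_t i = 0; i < len; i++) {
            if (buf[i] == 0)          /* toy "parse": count zero bytes */
                count++;
        }
        return (int)count;
    }

    int main(void) {
        struct packet p = { .session_id = 42, .payload = { 0 } };
        /* Delegate only the memory the parser needs, not the whole packet. */
        int zeros = parse_payload(p.payload, sizeof p.payload);
        printf("parsed %d zero bytes; session %u intact\n",
               zeros, (unsigned)p.session_id);
        return 0;
    }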

Read more >

 


The Political Economy of Trust

Prof John Naughton

Trust & Technology Initiative; CRASSH

Much of the discussion of trustworthy technology is understandably focussed on the technology itself. But this ignores the fact that the kit doesn’t exist in a vacuum. Digital technology is now part of the everyday lives of four billion people and in the process has raised clear questions of trust, reliability, integrity, dependability, equity and control. Some of these issues stem from technical characteristics of the equipment; others stem from the fallibility or ignorance of users; but a significant proportion come from the fact that network technology is deployed by global corporations with distinctive business models and strategic interests which are not necessarily aligned with either the public interest or the wellbeing of users.

Read more >

 


Compliant and Accountable Systems

Dr Jat Singh

Trust & Technology Initiative; Department of Computer Science and Technology

The “Compliant and Accountable Systems” research group takes an interdisciplinary (tech-legal) approach towards issues of governance, control, agency, accountability and trust regarding emerging technologies.

Read more >

 


AI Trust & Transparency with the Leverhulme Centre for the Future of Intelligence

Adrian Weller

Trust & Technology Initiative; Department of Engineering 

This project is developing processes to ensure that AI systems are transparent, reliable and trustworthy. As AI systems are widely deployed in real-world settings, it is critical for us to understand the mechanisms by which they take decisions, when they can be trusted to perform well, and when they may fail. This project addresses these goals in three strands.

Read more >

 


Prof David de Cremer

Judge Business School

Even though email has repeatedly been predicted to die out, it remains one of the most commonly used communication channels within organizations. Recent research indicates that with the rise of open workspaces – aimed at promoting more face-to-face communication – the use of email has been increasing again. Email thus remains an important communication tool, primarily because it helps distribute information among the different parties involved. Transparency is key to ensuring that this communication technology is trusted by all recipients. At the same time, email offers several different ways to communicate: it is very easy to add people in cc and bcc, which means email can also be turned into a more strategic tool of communication. Although these options are easy to select, the consequences can be detrimental to teamwork.

Read more >

 


Digital Trust Dissonance: when you’ve got them by the app, their clicks and minds will follow

Richard Dent

Department of Sociology

Debate about trust in big tech has recently been re-ignited by Lina Khan’s 93-page article arguing that Amazon’s business practices may require antitrust intervention. Meanwhile, the average Amazon customer loves their Prime account – including Khan’s husband, who is a regular user. Rachel Botsman asked “Who Can You Trust?” in her 2017 book, subtitled “How Technology Brought Us Together – and Why It Could Drive Us Apart“. I would argue that neither has happened yet. Instead, many individuals have entered a psychological state that one might call ‘digital trust dissonance’.

Read more >

 


Digital Voice, Media and Power in Africa

Dr Stephanie Diepeveen

Department of Politics and International Studies (POLIS)

Digital media appear to disrupt power in profound and polarising ways: opening up new channels for voice, and also bringing unprecedented forms of surveillance by state and private actors. In Africa, the stakes are high. The 2017 elections in Kenya bring these two sides into sharp relief. A critical and satirical political commentary of Kenyans on Twitter (#KOT) contrasted with attempts at control and surveillance as the governing coalition contracted the behavioural insights/marketing firm Cambridge Analytica, ‘fake news’ was reportedly rife, and the technology manager at the electoral commission was murdered a week prior to election day.

Read more >

 


Govtech Requires Many Relationships of Trust

Dr Tanya Filer

Bennett Institute for Public Policy

Governments around the world are beginning to support the growth of domestic Govtech, or government technology, industries. Govtech companies—typically start-ups and SMEs—seek to serve the public sector as a client, maximising the efficiency of its public service provision. For governments, the development and sustainable growth of the Govtech industry holds a double allure: the promise of economic growth, as the global Govtech market courts valuations of $400 billion annually; and the possibility of innovating the domestic public sector at a moment when the institution of government is at a crisis point—frequently perceived as retrograde and excessively bureaucratic. My research explores policies for building sustainable and citizen-centred Govtech ecosystems. It finds three relationships of trust—belief in the reliability and capacity of others—to be crucial to this effort.

Read more >

 


TRVE Data: Secure and Resilient Collaborative Applications

Dr Martin Kleppmann

Department of Computer Science and Technology

Cloud-based collaboration tools such as Google Docs, Evernote, iCloud and Dropbox are very convenient for users, but problematic from a security point of view. At present, most such services are provided by companies through a centralised server infrastructure, which is vulnerable to operational mistakes by the service provider, security breaches, and cyberattacks.

Read more >

 


The Autonomous City

Dr Ian Lewis

Department of Computer Science and Technology 

We have a broad range of research in the Department of Computer Science and Technology tackling issues and opportunities arising from the global densification of populations into large urban centres. Our Adaptive Cities Programme is designed to exploit high-volume sensor deployments, collecting urban data in real time and acting upon it. The use of the word ‘Adaptive’ (we could have chosen ‘Future’) emphasises that we are collecting the data because we are likely to want to do something about it. For example, traffic congestion might be eased by changing the signalling, and similar considerations apply to air quality, waste collection, power distribution and other infrastructure areas.

Read more >

 


Trust, Evidence and Local Democracy: how Cambridgeshire County Council has bridged the gap

Ian Manning

Cambridgeshire County Council

Cambridgeshire County Council needs the trust of its residents; it needs to be able to back up decisions it makes with expert evidence, and show it's open to criticism - and we've done that, in a way that no one else has. 

Read more >


Giving Voice to Digital Democracies

Dr Marcus Tomalin

CRASSH

This exciting new project will begin on 1st October 2018, and it is one of the inaugural projects for the Centre for the Humanities and Social Change that is based at CRASSH. The Centre forms part of the Humanities and Social Change International Foundation. The project will explore the profound social changes induced by Artificial Intelligence (AI) and Information and Communications Technology (ICT) in modern digital democracies.

Read more >

About us

The Trust & Technology Initiative brings together and drives forward interdisciplinary research from Cambridge and beyond to explore the dynamics of trust and distrust in relation to internet technologies, society and power; to better inform trustworthy design and governance of next generation tech at the research and development stage; and to promote informed, critical, and engaging voices supporting individuals, communities and institutions in light of technology’s increasing pervasiveness in societies.

Find out more > 

Mailing list

Sign up to the Trust & Technology mailing list to keep up to date with our news, events, and activities.

Sign up to our mailing list >