Citizens’ trust in their usefulness and good intentions is a crucial condition for the acceptance of digital infrastructure, platforms, services and participatory processes. Although the public generally seems to trust manufacturers, large services and the authority of politicians and scientific experts, there are also grounds for involving citizens more strongly in their oversight. From an individual perspective, deeper trust might be shaped by a healthy balance of confidence and of opportunities to test, and possibly falsify, distrust. Repeated leaks, privacy breaches and data scandals – set against weak state regulation and soft self-governance in the private sector – show the challenges that arise from a lack of governance. If institutions that are considered trustworthy persistently fail to meet the expectations placed in them, a fundamentally trusting attitude can turn into categorical distrust. As a consequence, and considering the risks and technical implications of digitalisation, we need a form of critically considered and trustworthy governance that is broadly supported in society.
Who should be responsible for such governance? While people feared the computer state during the 1980s, they now fear big data platforms. “In the EU-27, more than one in five respondents (23 %) do not want to share any of these data with public administration, and 41 % do not want to share these data with private companies” (FRA, 2020). According to the portal lobbyfacts.eu, the tech lobby is one of the most active in Brussels (the umbrella organisation Digitaleurope alone has 14 lobbyists, and Google alone had 230 meetings with the European Commission in 2018). Yet the demand to regulate or domesticate these platforms according to democratic principles is not going away. To do this, we need new or differently organised institutions.
Institutions with justified credibility. Paradoxically, trustworthy institutions enable people to develop trust in these organisations or in other people, while also offering a space for practising critical thinking (or a healthy level of distrust). Through such institutions, people in modern societies are able to trust strangers – the basic condition for large democracies. Institutions serve as a matching space (or a man in the middle) between diverse people and interests. They offer citizens a space where they can experience their common interest, mediated through the institutions’ inscribed purpose and earned credibility: “It is this implied normative meaning of institutions and the moral plausibility I assume it will have for others which allows me to trust those that are involved in the same institutions – although they are strangers and not personally known to me” (Offe, 1999, p. 70).
Whom do we trust?
A study by the EU Fundamental Rights Agency found that 55% of people in the EU fear that criminals could gain access to their personal information. Around one third have concerns about advertisers (31%) and foreign governments (30%). Around one quarter of respondents are sceptical towards their country’s intelligence services (26%), and one fifth towards governments (20%); 17% share concerns regarding law enforcement agencies and employers (FRA, 2020).
With regard to technology companies, the question arises: how can I trust seemingly non-existent institutions? It is often impossible to meet or speak with concrete persons, and companies and providers are not investing in more visibility, responsibility and accountability. As a result, the man in the middle is increasingly fading away, and citizens need to draw trust from a generalised belief in the adequacy and reliability of technological systems. “If the recorded individual has come into full view, the recording individual has faded into the background, arguably to the point of extinction” (Fourcade/Healy, 2017, p. 11).
The Edelman Trust Barometer observes that, in the ethical domain in particular, civil society is perceived as highly credible and trustworthy, while companies are perceived above all as competent. The challenge for the state, the media and civil society is therefore to gain more digital competence – and, for civil society in particular, to bring clear ethical positions into the debates, regulations and governance (Edelman Trust Barometer, 2020, p. 20).
Extending digital competence of independent organisations
Since the digital sphere affects people in very different roles – as producers of data and content, as consumers, as employees or as (digitally) civically engaged citizens – NGOs need to recognise the digital as a natural field of action and to increase their co-creation competence.
The EU’s Digital Markets Act/Digital Services Act package explicitly mentions the important role of trusted flaggers in monitoring platforms. Organisations recognised as such can draw attention to legal violations on platforms, and their reports are processed with priority. In this way, civil society in particular supports the supervision and self-regulation of platforms, often on behalf of vulnerable groups. Other instruments to monitor, regulate or increase accountability could be ombudspersons or platform councils. Public–civic partnership is also gaining in importance; this could be reflected in better collaboration (for example, in accompanying smart city projects or unlocking public data for non-profit purposes) or in hybrid organisations (why not, for example, learn from the success of the German Stiftung Warentest and use a data protection foundation to provide independent and broad information about digital offerings and products?).
To do this successfully, however, digital civil society needs the necessary resources – or must build them up in unequal competition with industry and research. Still, the approach is a step in the right direction and shows that a digitally competent civil society is part of the critical infrastructure of a post-digital society.
The EU Fundamental Rights Agency points out – in relation to the monitoring and control of facial recognition technologies – the importance of independent bodies committed first and foremost to fundamental rights: “An important way to promote compliance with fundamental rights is oversight by independent bodies” (FRA, 2019, p. 21). This implies including civil society in a structured way in such bodies, but also in arbitration bodies and in decision-making or rule-setting processes.
More cross-sectoral representation and pluralism in the supervision, governance and standard-setting of the digital sphere would also acknowledge different perspectives in society and moderate and integrate them better.
Edelman (2020). Edelman Trust Barometer 2020.
European Union Agency for Fundamental Rights (FRA, 2019). Facial recognition technology: Fundamental rights considerations in the context of law enforcement. Luxembourg: Publications Office of the European Union. https://doi.org/10.2811/52597
European Union Agency for Fundamental Rights (FRA, 2020). Your rights matter: Data protection and privacy – Fundamental Rights Survey. Luxembourg: Publications Office of the European Union. https://doi.org/10.2811/292617
Fourcade, M., & Healy, K. (2017). Seeing like a market. Socio-Economic Review, 15(1), 9–29. https://doi.org/10.1093/ser/mww033
Offe, C. (1999). How can we trust our fellow citizens? In M. E. Warren (Ed.), Democracy and Trust (pp. 42–87). Cambridge: Cambridge University Press.
Activism & Participation – Digital Transformation in Learning for Active Citizenship
This text was published within the framework of the project DIGIT-AL – Digital Transformation Adult Learning for Active Citizenship, and has been slightly updated.
Elisa Rapetti and Ricardo Vieira Caldas: Activism and Participation (2020). Part of the reader: Smart City, Smart Teaching: Understanding Digital Transformation in Teaching and Learning. DARE Blue Lines, Democracy and Human Rights Education in Europe, Brussels 2020.