For the better part of the last few decades, the Internet has been idealised as a free, open, interoperable, secure and resilient global network. Although these may have been the intentions when it was designed, today's emerging technologies, alongside the new opportunities they provide, also bring an array of challenges for human rights, democratic processes and privacy. Biometrics, artificial intelligence, smart cities, the IoT and fifth-generation networks are only examples of technologies that can advance digital transformation but may also be applied in an adversarial way, for example to enable and enhance so-called digital authoritarianism. The collection and misuse of citizens’ data is breaking down traditional notions of privacy. Some countries impose censorship and conduct surveillance, using technology to achieve their political goals. Digital tools are facilitating the control of citizens. Governments are deploying sophisticated tools and microtargeting to spread propaganda, often augmented by AI, in order to foster political divisions and disseminate fake news.

Therefore, like-minded countries have to stand together for the respect of democratic values, privacy and freedom of expression, and ensure that the Internet does not become a Trojan horse used for increased control, oppression and influence over societies’ behaviour. Technology and online platforms must serve the public good and empower citizens to make their own social, political and economic choices free from manipulation, surveillance, spyware and censorship. If democracy as we know it (and as we want it to be) is to survive the digital age, real solutions to the problem of the adversarial and abusive use of technologies must be found.

Nowadays, the discussion on surveillance practices has largely narrowed to the topic of contact-tracing apps related to the current COVID-19 pandemic. The number of such apps around the world has sharply increased over the past month, and the rush to digitise contact tracing is on. The apps are designed to help slow the spread of the virus by tracking the network of contacts between individuals and notifying them once they have been in close proximity to an infected person.
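To make the mechanism concrete, the decentralised variant of this idea (in the spirit of designs such as DP-3T and the Google–Apple exposure notification framework) can be sketched as follows. This is an illustrative simplification under assumed class and method names, not any real app's protocol: real systems derive identifiers cryptographically, rotate keys, and run over Bluetooth.

```python
import secrets

class Device:
    """Illustrative sketch of a decentralised proximity-notification scheme."""

    def __init__(self):
        self.my_tokens = []        # random tokens this device has broadcast
        self.heard_tokens = set()  # tokens received from nearby devices

    def broadcast(self):
        # Emit a fresh random token; no identity or location is included.
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def hear(self, token):
        # Record tokens seen nearby; they are stored only on this device.
        self.heard_tokens.add(token)

    def check_exposure(self, published_tokens):
        # Matching happens locally: the server never learns who met whom.
        return bool(self.heard_tokens & set(published_tokens))

# Alice and Bob are in proximity and exchange tokens; Carol is elsewhere.
alice, bob, carol = Device(), Device(), Device()
bob.hear(alice.broadcast())

# After a positive diagnosis, Alice voluntarily publishes her tokens.
published = alice.my_tokens

assert bob.check_exposure(published)        # Bob is notified of exposure
assert not carol.check_exposure(published)  # Carol never met Alice
```

The key privacy property illustrated here is that only random tokens ever leave a device, so exposure matching can happen on each user's phone rather than on a central server.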

But whatever good intentions developers might have, these tools can also have limitations from a privacy point of view. Based on statistics,[1] it appears that almost one quarter of the apps have no privacy policy. More than half do not disclose how long they will store users’ data and have no publicly stated anonymity measures. Even when apps do have these measures in place, they are exposed to hacking and could tempt some governments to abuse the data and monitor people beyond the pandemic period.

The aim of the first webinar in the series of Road to CYBERSEC Task Force Meetings was to discuss the following questions with participants: how to deploy new disruptive technologies without crossing the invisible line of privacy intrusion; how to make sure that like-minded countries set up tech policies in the right way; how important trust in the technology itself, or trust in government, is when deploying intrusive solutions; and what it takes to build technology for good and to gain public acceptance for it in our democratic societies.

The following aspects were highlighted: the changing nature of surveillance; contact tracing in the times of COVID-19 and the importance of the narrative; political leadership and accountability; cooperation, multi-stakeholder dialogue and an interdisciplinary approach; and authoritarian practices in developing countries.

Each of these topics is elaborated below. As one of the main goals of the meeting was to clarify the challenges and key points in the discussion on surveillance, more questions than answers often appeared. We believe that this approach stimulates and serves public debate by underscoring the need for a multi-stakeholder approach and by highlighting the key aspects for further discussion.


Human–machine interactions, the increasing amount of digital data and the resulting information are changing the face of humanity. The lack of understanding of how these interactions drive surveillance activities, bringing an enormous intrusion into our lives, is becoming a cause of great concern. Societies are aware of the dual-use nature of emerging technologies. In times of a health crisis, such as the COVID-19 pandemic, digital tools face the challenging task of effectively stopping the spread of the virus without endangering citizens’ rights and privacy. Among like-minded countries, there is a consensus on the importance of personal liberty in the sense of the capacity to live our lives according to reasons and motives that are our own. Fundamental freedoms, and resistance against attempts to diminish them, are the core values our society is based on. However, the nature of cyberspace and our increased digital presence have taken away a substantial amount of our autonomy (whether we live in a democratic society or under an authoritarian regime) – the autonomy that allows us to tell right from wrong. With emerging technologies, the meaning of surveillance has been fundamentally redefined. The tools are becoming ever more powerful, enabling information to be collected, stored and then connected together on an unprecedented scale. It is high time to discuss what controls we need to put in place to ensure privacy, security and the fundamental right to autonomy, while at the same time exploiting the potential of technology for the common good and productivity growth.


The current pandemic has demonstrated the need for states across the world to intervene in unprecedented ways, redefining state power in ways that might seem frightening. Many governments are now using digital tracing tools in order to protect the well-being of their people and to limit the spread of the virus. Their main objectives are to help manage the risk for individuals and to provide decision-makers with useful information on what steps should be taken in regard to the pandemic. However, one can notice worried voices and conspiracy theories about nations trying to build a mass surveillance tool. As one of our experts stated, the truth is rather more boring: contact-tracing apps are not a one-way highway to a surveillance state. They are instead a careful and transparent experiment, open to independent technical critique from security experts and cryptographers. The majority of the contact-tracing apps that have appeared in democratic countries are voluntary and do not collect or use any participants’ location data, which is seldom adequately underlined in public discourse. The question of the proper narrative therefore remains one of the most crucial ones – how the use of seemingly the same technology differs between democratic and non-democratic countries.


Contact-tracing apps are an interesting example and case study not only of the challenges that technology brings but also of the uses of technology to deal with emergency situations. One of these challenges is the question of national responsibilities versus an international approach. Health is a domain that national authorities tend to guard within their own framework of activities. To be successful, cooperation must therefore be bottom-up (one example of such cooperation, which brought together national authorities, is the common EU toolbox to support contact tracing in the EU’s fight against COVID-19). The second issue is the need for an interdisciplinary approach and a wider debate, which is crucial in the governance of cutting-edge technologies – a matter less and less of technology itself and much more a social and economic one in general. In the case of contact-tracing applications, the relevant stakeholders include, among others, public health authorities, technology specialists, privacy experts, ethicists and sociologists. We must always think about the whole array of aspects and impacts a given technology might have.

There is also a need to formulate principles that are more practical, reasonable and feasible, and that hinder or even prevent surveillance practices. While developing a proper model of governance for technological tools, it is worth making use of both social and technological constructs at the same time. An example of this approach is the Estonian health records system. Estonia has a centralised, state-run health system accessible to authorised individuals, and citizens are notified once someone has accessed their health records (the technological construct). Such notifications deter unnecessary access by exerting social pressure and promoting widely accepted rules (the social construct).
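The "notify on access" pattern described above can be sketched in a few lines. This is a hypothetical illustration of the general mechanism, not Estonia's actual system; all names and structures are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class HealthRecordStore:
    """Hypothetical store where every read leaves a trace for the patient."""
    records: dict = field(default_factory=dict)        # patient_id -> record
    notifications: dict = field(default_factory=dict)  # patient_id -> messages

    def read(self, patient_id, accessor_id):
        # Technological construct: authorised access is possible, but it
        # always generates a notification visible to the record's owner.
        record = self.records[patient_id]
        ts = datetime.now(timezone.utc).isoformat()
        msg = f"{accessor_id} accessed your record at {ts}"
        self.notifications.setdefault(patient_id, []).append(msg)
        return record

store = HealthRecordStore(records={"patient-1": {"blood_type": "A+"}})
store.read("patient-1", "dr-smith")

# Social construct: the patient sees exactly who looked at the record
# and can challenge any access that seems unnecessary.
assert len(store.notifications["patient-1"]) == 1
```

The design point is that neither construct works alone: the log is only a deterrent because citizens see it, and the social pressure only exists because the log is technically guaranteed.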


Another challenge is that we find ourselves in a situation where, besides the activities carried out by international bodies (such as the EU toolbox created by the European Commission), technology providers (currently, for example, Google and Apple) are positioning themselves as gatekeepers of how this technology will work most effectively, how it will be deployed in an interoperable way, and how it can be shared. This raises the challenge of dividing responsibilities for digital transformation between the public and private sectors. The debate on cooperation between technology providers and national authorities is therefore ongoing and will probably crop up again for other emerging technologies in the future. Whom do we want to take the lead over the technological tools that are ever more present in our daily lives and to which we are giving more and more of our data – the European Union, the technology providers, or perhaps a new coalition that we will create?

Another consideration in this regard is accountability and the question: who will be responsible if something goes wrong? Against whom can citizens seek redress? The more technology providers set themselves up as rule-makers and gatekeepers of how we use technology in sensitive areas (such as contact tracing), the more issues arise about how to regulate them and how to hold them accountable. And although there are existing frameworks for holding national authorities accountable, we still lack clarity on how to hold big tech accountable. This is also a leadership challenge that will have to be addressed in the near future.


Recent reports indicate that the export of digital technology from authoritarian countries often goes hand in hand with the export of anti-democratic practices. In 2019, for instance, Chinese technicians were found working directly with government security forces in Uganda and Serbia to install advanced facial recognition cameras for surveillance purposes. The struggle between the digital autocracy model and the digital democracy model is set to intensify in the years to come. The questions arise: how can we counter this risk of an irreversible divide, and how can we protect developing countries – often looking for cheaper equipment in order to digitise quickly – from importing damaging governance practices as well? A narrative that prioritises security and its importance must be adopted. There is a need to embed in technology itself measures of privacy by design and cybersecurity by default that strongly influence the whole ecosystem. Developing countries should be approached with attractive counter-offers, and the ongoing development of certification schemes in the European Union might be a good opportunity to widely promote solutions and tools that offer security and privacy from the start.


Join us at the panel discussion at CYBERSEC GLOBAL 2020, which will further explore the challenges and risks associated with the development of surveillance technologies.


[1] COVID-19 Digital Rights Tracker (data as of 12 May 2020).
