Public Values in a Digital Society – Adessium’s partnership with SIDN Fund
Adessium is committed to an open and democratic society. The increasing digitalization of our society has put individuals’ information position under pressure, and with it the functioning of democracy. A variety of developments play a role in this, such as the effective dissemination of disinformation and the microtargeting of individuals with political advertising. Increasingly complex algorithms determine which information we see, while we have little or no ability to influence this.
Within the scope of the open call for proposals, “Public Values in a Digital Society”, we joined forces with SIDN Fund to look for innovative, scalable solutions that strengthen online news users’ control and improve individuals’ information position in the democratic process.
SIDN Fund focuses on innovative Internet projects that contribute to a stronger, more secure Internet and skilled Internet users. This partnership was unique for us, since we normally don’t work with open calls; SIDN Fund, on the other hand, has a lot of experience in this area. The collaboration was very instructive thanks to their broad network of experts and their expertise in evaluating the more technical proposals.
The call generated a mix of projects that focus not only on research and advocacy but also on practical applications and tools for individual users. The projects address political microtargeting, resilience to online hate and manipulation, diversity in news algorithms, and education in a digital society.
In early 2020, we started with a session in which the nine organizations involved presented their plans to one another. Later in the year, the organizations shared their progress with each other. This mutual exchange of knowledge strengthened the individual projects and fostered collaboration between them.

Political microtargeting
Platforms gather our data and use this information to sell custom advertising space to third parties. A large digital advertising industry makes microtargeting possible, allowing various parties to deliver political messages to narrowly defined audiences. Taking a stand against this is difficult due to insufficient legislation and regulation and the absence of transparency and accountability mechanisms at Internet companies.
For the project Personal data for political purposes, Tactical Tech commissioned research on the situation in the Netherlands. Although there are examples of targeted online political campaigns, there is no indication that political parties in the Netherlands are involved in large-scale manipulation of citizens through targeted advertisements. What the research did show, however, is that Dutch legislation imposes few if any transparency requirements on political parties. Tactical Tech’s Data Detox Kit provides information on political parties’ online strategies and on how you as a citizen can defend yourself against these practices.
Tech companies aren’t very transparent about how political parties use their platforms. Two projects in this call researched political advertisements on tech platforms. AlgorithmWatch’s Towards a monitoring of Instagram focuses on the popular social media platform. Together with media partners NOS, Pointer and De Groene Amsterdammer, AlgorithmWatch called on people to share their Instagram data via a browser add-on. In the run-up to the Dutch parliamentary elections in March 2021, this will enable them to research the effects of political messages on Instagram.
Who Targets Me is studying political advertisements on Facebook with its project Who’s using Facebook ads to win your vote. The organization has already researched elections in several countries. Within this project, it will also study political advertisements placed in the run-up to the upcoming Dutch parliamentary elections.

Resilience to online hate and manipulation
Basic digital skills are important, but they are not enough to equip us for the bigger problems created by the Internet and data, such as hate and manipulation on social media. Increasingly complex algorithms make it hard for individuals to understand what exactly happens to their data, let alone verify it for themselves. Two projects focus on strengthening the position of Internet users against manipulation and online hate.
Bits of Freedom conducted research for a Short Manipulation Course on the ways tech companies manipulate our information landscape. In its “taxonomy of online platforms”, it distinguishes five forms of manipulation: profiling, prioritization, censorship, dark patterns and self-manipulation. The Short Manipulation Course is an online tool that educates a broad audience on these five forms of manipulation and on what people can do to counter them. The research also forms the foundation for a report on the effects of online manipulation aimed at policymakers, activists and researchers.
Online hate campaigns targeting social changemakers are a specific problem in the digital landscape. The campaign platform DeGoedeZaak commissioned a study for the project More power to you! How social influencers can face down hate. Various social changemakers were interviewed to gain insight into the strategies used by “trolls”. The research showed that online hate campaigns are often organized in nature. This insight underpins a “troll toolkit” containing tools and strategies to support changemakers facing attacks from Internet trolls.
Diversity in news algorithms
The media feed our world view, and a diverse news offering is vitally important to our democracy. Much of the news we are served online is the result of recommendation systems based on algorithms. One problem is that these systems often make recommendations based on what holds someone’s attention as long as possible or what people have been interested in in the past. This can lead to a one-sided supply of news and “filter bubbles”. The American elections and Brexit have made it clear that this can have major societal repercussions. When news recommendations are done right, however, they can be used to broaden people’s view of the world.
In its project Algorithms for freedom of expression and a well-informed public, the University of Amsterdam’s Institute for Information Law (IViR) is studying how the diversity of the outcomes of news algorithms can be increased. With the aid of a “diversity toolkit”, news organizations can make their news recommendations measurably more diverse.
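To give a sense of what “measurably more diverse” could mean in practice, here is a minimal sketch of one common way to quantify the diversity of a recommendation list: normalized Shannon entropy over the topics (or sources) of the recommended articles. The metric, topic labels and function name are illustrative assumptions for this sketch, not IViR’s actual toolkit.

```python
# Minimal illustrative sketch (not IViR's toolkit): score how evenly a set of
# recommended articles is spread over topics, using normalized Shannon entropy.
import math
from collections import Counter

def diversity_score(recommended_topics):
    """Return 0 when every recommendation shares one topic, 1 when the
    recommendations are spread evenly over the topics present."""
    counts = Counter(recommended_topics)
    if len(counts) <= 1:
        return 0.0
    total = sum(counts.values())
    entropy = -sum((c / total) * math.log(c / total) for c in counts.values())
    return entropy / math.log(len(counts))  # normalize by the maximum entropy

# A feed dominated by one topic scores lower than an evenly mixed feed.
print(diversity_score(["politics"] * 5 + ["sports"]))                  # lower
print(diversity_score(["politics", "economy", "culture", "climate"]))  # 1.0
```

A newsroom could track a score like this over time to check whether changes to its recommender actually broaden what readers are shown.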
News users currently have hardly any influence over the recommendations they receive. KU Leuven’s Institute for Media Studies is looking into how to give users more choice between different news recommendation algorithms. Its project Who would you like to be guided by? is developing a concept built around “recommendation personas”, where each persona represents a specific recommendation algorithm. This enables the user to choose “the Expert”, for example, for more in-depth and background information, or “the Challenger” if they want to see articles outside their substantive and ideological comfort zone.
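As a rough illustration of the persona idea, the sketch below models each persona as an interchangeable ranking strategy the reader can select. The article attributes, persona logic and names are assumptions made for this example, not the project’s actual design.

```python
# Illustrative sketch of "recommendation personas": each persona is a named
# ranking strategy the user can switch between. Attributes and logic are
# assumptions for this example, not KU Leuven's actual implementation.
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    depth: float    # 0 = quick read, 1 = in-depth background piece (assumed)
    leaning: float  # -1..1, ideological position of the piece (assumed)

def expert(articles, user_leaning):
    # "The Expert": prioritize in-depth and background pieces.
    return sorted(articles, key=lambda a: a.depth, reverse=True)

def challenger(articles, user_leaning):
    # "The Challenger": prioritize articles far from the user's own position.
    return sorted(articles, key=lambda a: abs(a.leaning - user_leaning), reverse=True)

PERSONAS = {"expert": expert, "challenger": challenger}

def recommend(persona, articles, user_leaning):
    # The reader, not the platform, decides which algorithm ranks the feed.
    return PERSONAS[persona](articles, user_leaning)
```

Switching personas changes the ranking without any change to the underlying pool of articles, which is essentially the extra choice the project wants to give news users.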
Education in a digital society
Young people are the citizens of the future. Today’s youth are growing up with the Internet and social media. How can they be prepared for the possibilities and pitfalls of the digital society? SkillsDojo’s video class series Ethics of artificial intelligence, data and democracy uses DIY, maker and programming projects to introduce children aged ten to fourteen to artificial intelligence and encourages them to think about the social consequences of this technology.
Entering into a dialog with people who hold different opinions is an essential skill for a healthy democracy. Yet today’s online landscape tends to amplify the contrasts between people. Civinc’s digital education pilot Vox Pop Academy MBO, aimed at intermediate vocational education (MBO) students, facilitates online discussions between young people with different views.
In conclusion
The SIDN Fund partnership has been very valuable for us and has led to support for several promising projects. It has also provided us with more insight into civil society organizations and initiatives committed to a safe and democratic digital society. We have applied the knowledge we have gained through this partnership to further refine our annual plans on this theme.
About this program
Adessium contributes to the creation and implementation of the conditions that are necessary for a responsible digital society. We enable civil society organizations to offer constructive criticism on how we digitalize our society. We support organizations that sound the alarm when fundamental civil rights are at risk, and also back initiatives that can outline responsible technological alternatives.
Through this call, Adessium supported the projects initiated by AlgorithmWatch, Bits of Freedom, Civinc, DeGoedeZaak, KU Leuven’s Institute for Media Studies, UvA’s Institute for Information Law (IViR), SkillsDojo, Tactical Tech and Who Targets Me.