Vacancy for a Post-Doctoral Researcher in Computer Science and Artificial Intelligence

Published 10 March 2021

Within the DILEMA Project, the Asser Institute invites applications for a

Post-Doctoral Researcher
in Computer Science and Artificial Intelligence

Full-time (38 hours per week)
Starting date: 1 September 2021

The post-doctoral researcher will join the NWO-funded interdisciplinary research project DILEMA on Designing International Law and Ethics into Military Artificial Intelligence.

Project description

The DILEMA project explores interdisciplinary perspectives on military applications of artificial intelligence (AI), with a focus on legal, ethical, and technical approaches to safeguarding human agency over military AI. It analyses, in particular, the subtle ways in which AI can affect or reduce human agency, and seeks to ensure compliance with international law and accountability by design. The research team investigates why it is essential to safeguard human agency over certain functions and activities, where it is most critical to maintain the role of human agents in order to ensure legal compliance and accountability, and how to technically ensure that military technologies are designed and deployed in line with ethical and legal frameworks. More information about the project is available on the project’s website.

Within this project, the Asser Institute invites applications for a post-doctoral researcher with a background in computer science, to conduct research on how to integrate legal norms and ethical values in the design of autonomous technologies, and how to implement monitoring and enforcement mechanisms that guarantee compliance with these norms and values. The post-doctoral researcher will engage in fundamental and applied research in order to develop methods and prototypical tools for the integration of international legal norms in military technologies, as well as standards and processes to verify and certify compliance of AI systems with international law. Furthermore, the post-doctoral researcher will test the extent to which optimising systems in line with international norms can converge with the goal of safe and effective AI technologies. The post-doctoral researcher will work in collaboration with other members of the DILEMA research team, including a post-doctoral researcher in ethics of technology and a PhD researcher in international law, and be connected with relevant research groups of the University of Amsterdam.

Tasks

  • Conduct fundamental and applied research within the DILEMA project;
  • Publish articles in high-level academic journals;
  • Develop and apply engineering methods on normative AI design and certification;
  • Present research at academic conferences;
  • Engage with project partners and relevant private and public stakeholders;
  • Assist in the organization of research and dissemination activities within the DILEMA project;
  • Work together with the other team members of the DILEMA project;
  • Contribute actively to the Asser research community.

Requirements

The successful candidate will meet the following requirements:

  • A completed PhD in computer science or artificial intelligence;
  • Familiarity with both data-driven and knowledge-driven AI methods;
  • A track record of academic publications;
  • A strong research interest in responsible innovation and governance of technologies;
  • Experience or interest in interdisciplinary research;
  • Ability to work both independently and as part of a team;
  • Fluency in English.

Terms of employment

  • The employee will be appointed at the University of Amsterdam and seconded to the Asser Institute.
  • The appointment is for two years.
  • The salary will be in accordance with the salary scales of the Collective Labour Agreement of Dutch Universities, scale 11, ranging from €3,746 to €5,127 gross per month (full-time), depending on the actual level of education and experience. Secondary benefits at Dutch universities are attractive and include 8% holiday pay and an 8.3% end-of-year bonus.


The deadline to apply to this position has elapsed.