The text on this page was automatically translated and hence may differ from the original. No rights can be derived from this translation.
Algorithms can contribute to our society and our shared values, but only if they are developed and deployed responsibly. Last year, in collaboration with Utrecht University, we developed the 'Impact Assessment Human Rights & Algorithms (IAMA)', which can assist (public) organisations in this endeavor.
The IAMA tool helps organisations to weigh the role of algorithms carefully and to develop and deploy them in a responsible manner.
There is also growing political and administrative support for embedding this necessary and desired deliberation institutionally. In spring 2022, the House of Representatives passed a motion making the use of an IAMA mandatory for government bodies.
At the European level, legislation on Artificial Intelligence (AI) is also being developed. In the future, conducting an adequate impact assessment of algorithms (with a certain risk profile) may become mandatory for companies as well.