Postdoc position in Responsible AI (f/m/d)
Key details of the position
Employer | Universität Duisburg-Essen
Postal code | 47057
City | Duisburg
Federal state | Nordrhein-Westfalen
Posted on | 09.05.2025
Remote option? | -
Home-office option? | -
Part-time? | -
Full-time? | -
Apprenticeship? | -
Internship? | -
Permanent? | -
Fixed-term? | -

Job description
As part of the Research Alliance Ruhr, the Research Centre for Trustworthy Data Science and Security unites the expertise of its member universities to bridge the gaps between Psychology & Social Sciences, Artificial Intelligence & Machine Learning, Data Science & Statistical Learning, Law, Cybersecurity & Privacy, and more.

Postdoc position in Responsible AI (f/m/d) (salary 14 TV-L, 100 %)

Appointment date: as soon as possible
Contract duration: 3 years
Position type: 100 percent of a full-time position (part-time is possible)
Application deadline: 05.06.2025

The Compliant & Accountable Systems Group seeks a post-doctoral researcher to work on issues relating to responsible AI. Our group takes an interdisciplinary, socio-technical approach, exploring the technical, legal, policy, social, and user dimensions of emerging technologies to improve their governance and help ensure they are appropriate, secure, safe, accountable, and aligned with the public good.

About the Role

We are looking for a post-doctoral researcher to investigate how AI and other emerging technologies can be more appropriately designed, deployed, used, governed, and challenged. This could involve analysing and measuring how systems operate in practice, conducting empirical research into their use and impact, or developing tools, technical mechanisms, and governance/regulatory strategies that support effective, real-world accountability, transparency, and oversight. You will join a supportive interdisciplinary research environment and contribute to addressing real-world challenges in understanding and shaping how emerging technologies are approached, developed, used, and governed in ways aligned with the public good and responsive to individual and broader societal needs.

Research Areas and Directions

We welcome applicants with a wide range of interests and backgrounds relevant to responsible AI.
Potential topics include (but are not limited to):
- Transparency, oversight, and contestability in AI systems
- Accountability in algorithmic supply chains and infrastructures
- Technical tools for logging, tracing, and system scrutiny
- Governance of general-purpose and open-source AI models
- Data rights, privacy, and control in AI-driven environments
- Participatory and human-centred methods for responsible technology
- Legal and policy frameworks for AI accountability and risk mitigation

The research will be shaped according to the selected candidate's strengths and interests and may involve self-initiated directions aligned with the group's broader themes.

Who We Are Looking For

We welcome applicants from a wide range of disciplines, including computer science (e.g. systems, AI/ML, HCI, security, privacy), law (e.g. data protection, technology regulation, liability, digital rights), and public policy, to name a few.

You should have:
- A PhD degree (completed, near completion, or equivalent experience) in a relevant discipline
- A demonstrated interest in responsible technology and/or AI governance
- Excellent analytical, writing, and communication skills in English
- The ability to work independently and collaboratively in an interdisciplinary setting

Whether your strengths lie in technical development, empirical investigation, legal analysis, socio-technical system design, or somewhere else, we encourage you to apply.

Responsibilities
- Lead a defined research direction aligned with the group's themes, working with a high degree of independence, initiative, and leadership.
- Conduct research using methods appropriate to the project and your background, including empirical, qualitative, quantitative, or systems-based approaches, and develop tools, prototypes, or other artefacts to explore and demonstrate key ideas.
- Publish and present research at top-tier venues and contribute to the academic community.
- Support the supervision, instruction, mentoring, and development of students and junior researchers.
- Collaborate with researchers across disciplines and institutions, contributing to a supportive and engaged research environment.
- Contribute to shaping new research agendas, funding proposals, and collaborative projects within and beyond the group.
- Drive broader impact activities, including engagement with policy, industry, or the public where appropriate.

Opportunities and Environment

Successful candidates will join the Compliant & Accountable Systems Group, which is part of the broader RC-Trust: Research Centre for Trustworthy Data Science and Security. RC-Trust is a multi-university collaboration focused on developing trustworthy intelligent systems through a human-centred, interdisciplinary approach. Work will also involve collaboration with the University of Cambridge.

Candidates will have access to:
- A highly interdisciplinary and collaborative research environment
- Opportunities to engage with external stakeholders and contribute to policy discussions
- Travel opportunities for conferences, workshops, and research collaborations
- A network of researchers and industry partners working on AI trust and governance

What we offer

This position is embedded in a creative, dynamic, and internationally renowned research environment. Your research will play a crucial role in developing our new Research Centre and promoting trustworthy technology to the general public. Our extensive international network of researchers and industry partners supports a smooth transition into your next career step, whether in academia or at international research institutions. We prioritise a family-friendly work-life balance, offering options for flexible working hours and part-time home-office arrangements.
How to Apply

Applicants must submit the following documents (as separate PDF files):
- A CV detailing academic qualifications, relevant experience, and publications (if applicable)
- A one-page cover letter outlining your motivation for applying and how your background aligns with the position
- A one-page research statement outlining your research interests within the themes of the position

Incomplete applications will not be reviewed. The appointment will be made based on academic merit, experience, and overall alignment with the group's research goals. As such, fit with the group's needs and culture, and the potential to contribute meaningfully to ongoing or emerging research directions, will be important factors in the final selection.

Please send your application by email, stating the reference number (221-25), by 05.06.2025 to recruit-compacctsys@rc-trust.ai.

For informal inquiries about research specifics, please contact Prof. Jat Singh at jat@rc-trust.ai; for questions about the application process, please contact recruit-compacctsys@rc-trust.ai.

The University of Duisburg-Essen aims to promote the diversity of its members (see https://www.uni-due.de/diversity). It seeks to increase the proportion of women on its academic staff and therefore strongly encourages qualified women to apply. Women with equal qualifications will be given preferential treatment in accordance with the NRW State Equality Act. Applications from suitable severely disabled persons and those of equal status according to § 2 Abs. 3 SGB IX are welcome.