TIAGo robot helping in the kitchen of the Cognitive Service Robotics Apartment lab

Researchers around the world can access our Virtual Research and Training Building.

© Bielefeld University/Patrick Pollmeier

Breaking new ground in AI-enabled robotics

Our Vision: The Joint Research Center on Cooperative and Cognition-enabled AI (CoAI JRC) is Germany’s leading center for interdisciplinary research and innovation on cooperative and cognition-enabled artificial intelligence: AI with human-centered, embodied, real-world agency, capable of acting together with humans in a meaningful and goal-directed manner.

The central hypothesis of CoAI is that AI systems need to be equipped with powerful cognitive, reasoning, communicative, and interaction abilities to act successfully in the physical and social world. They have to understand and reason about their own actions and those of others as the basis for providing contextualized support for the well-being of humans and society at large.

Robot arm helping a human with an injured hand

AI that works cooperatively in human physical and social environments

Our Mission: The researchers of CoAI JRC join forces to break new ground in the communication and interaction of humans and robots by developing AI systems that have a deep and actionable understanding of how to perform everyday joint tasks in natural human environments. The AI systems targeted by CoAI act successfully in the physical and social world, adapting and learning new skills by cooperating and communicating with humans, eventually becoming competent and trustworthy partners.

Discover our Virtual Research and Training Building

Openness is the most efficient and sustainable approach to successful innovation in today’s fast-moving AI research environment. This is why the CoAI Center aims to provide external researchers with convenient access to a Virtual Research and Training Building (ViB), which will enable scientists from anywhere in the world to work as if they were physically present in our labs. In the ViB, members of the international CoAI research community have access to AI-ready digital twin robots, research laboratories, and environments to conduct research in AI-enabled robotics, human-robot interaction, education science, and developmental robotics. We also support a variety of open data and open software tools, such as the cognitive robot architecture CRAM.

Operating a digital twin PR2 in a virtual kitchen lab
© University of Bremen/Patrick Pollmeier

News & Events

Katharina Rohlfing and Britta Wrede (left to right)
© Susanne Freitag / Michael Adamski

Building trust in science for UNESCO World Science Day

News
28.11.23

KnowRob featured in “up2date” Magazine

The KnowRob software developed by CoAI JRC member Michael Beetz and colleagues is featured in the current issue of the University of Bremen’s online magazine up2date. KnowRob is an innovative open-source knowledge representation and reasoning framework designed to help household robots perform everyday tasks, such as preparing a meal, in natural human task settings. KnowRob […]
Uncategorized
14.11.23

Making AI accessible for companies and research institutions

CoAI JRC partner Paderborn University is a member of the new Service Centre WestAI, one of four service centres across Germany that have started work to make it easier for companies and research institutions to access AI applications and to provide support with their implementation. To this end, methods are being developed to train comprehensive […]
News
27.10.23

Robots with Common Sense

Professor Michael Beetz from the University of Bremen and Professor Philipp Cimiano from the University of Bielefeld have established a long-term collaboration with the goal of making knowledge from many different sources accessible in a machine-readable format.

The CoAI JRC Is Part of a Thriving AI Research Ecosystem

The CoAI JRC is embedded in a network of independent projects, research initiatives, and partners in industry and the public sector that collectively form a thriving ecosystem of research in AI and robotics.

Incremental permutation feature importance (iPFI): towards online explanations on data streams (2023)

Authors: Fabian Fumagalli, Maximilian Muschalik, Eyke Hüllermeier, Barbara Hammer
Published at: Machine Learning (Volume: 112)

Perceived realism of haptic rendering methods for bimanual high force tasks: original and replication study (2023)

Authors: Mario Lorenz, Andrea Hoffmann, Maximilian Kaluschke, Taha Ziadeh, Nina Pillen, Magdalena Kusserow, Jérôme Perret, Sebastian Knopp, André Dettmann, Philipp Klimant, Gabriel Zachmann, Angelika C. Bullinger
Published at: Scientific Reports (Volume: 13)

IngridKG: A FAIR Knowledge Graph of Graffiti (2023)

Authors: Mohamed Ahmed Sherif, Ana Alexandra Morim da Silva, Svetlana Pestryakova, Abdullah Fathi Ahmed, Sven Niemann, Axel-Cyrille Ngonga Ngomo
Published at: Scientific Data (Volume: 10)

“I do not know! but why?” — Local model-agnostic example-based explanations of reject (2023)

Authors: André Artelt, Roel Visser, Barbara Hammer
Published at: Neurocomputing (Volume: 558)

Actuator-level motion and contact episode learning and classification using adaptive resonance theory (2023)

Authors: Vinzenz Bargsten, Frank Kirchner
Published at: Intelligent Service Robotics (Volume: 16)

Gamified environmental multi-criteria decision analysis: information on objectives and range insensitivity bias (2023)

Authors: Alice H. Aubert, Judit Lienert, Bettina von Helversen
Published at: International Transactions in Operational Research (Volume: 30)

Model-based explanations of concept drift (2023)

Authors: Fabian Hinder, Valerie Vaquet, Johannes Brinkrolf, Barbara Hammer
Published at: Neurocomputing (Volume: 555)

How Observers Perceive Teleport Visualizations in Virtual Environments (2023)

Authors: Roland Fischer, Marc Jochens, Rene Weller, Gabriel Zachmann
Published at: Proceedings - SUI 2023: ACM Symposium on Spatial User Interaction

The 5th Workshop on Modeling Socio-Emotional and Cognitive Processes from Multimodal Data in the Wild (MSECP-Wild) (2023)

Authors: Bernd Dudzik, Tiffany Matej Hrkalovic, Dennis Küster, David St-Onge, Felix Putze, Laurence Devillers
Published at: ACM International Conference Proceeding Series

AQ-GT: a Temporally Aligned and Quantized GRU-Transformer for Co-Speech Gesture Synthesis (2023)

Authors: Hendric Voß, Stefan Kopp
Published at: ACM International Conference Proceeding Series