Project

Psychological Explorations of Artificial Intelligence

Workgroup: Perception and Action Lab
Duration: 07/2023–open
Funding: IWM budget resources
Project description

Generative Artificial Intelligence is capable of generating texts or images based on verbal prompts. Given its universal range of application fields and its human-like output quality, interaction with generative AI is becoming increasingly similar to interaction with other humans. How does interaction with generative AI affect human behavior, understanding, and trust, and how can these insights be used to optimize human-machine collaboration?


One subproject compares the problem-solving strategies of humans and generative AI. When a problem can be solved both by adding and by deleting elements, humans tend to add elements (additive solution strategies). We investigate whether the solution strategies of humans and generative AI differ across various verbal and visual problems.
Another subproject investigates to what extent humans trust the verbal output of generative AI. It focuses on areas in which GPT produces incorrect output. We examine whether human trust in true and false output is comparable, and how more appropriate trust levels could be elicited if generative AI were capable of expressing uncertainty about its own output.
Another subproject builds on the ability of generative AI to compose summaries of texts. This is applied to the use of educational videos: learners may receive a written summary of the video parts they have already seen (e.g., after pressing the pause button). Effects on learning are investigated.
A great strength of generative AI is its autonomy. However, this makes it challenging for users to obtain controlled results that precisely match their expectations. In another subproject, we will therefore investigate how well humans succeed in reproducing given images by formulating prompts. We expect insights into users' understanding of how generative AI works and into how quickly users learn to deal with this challenge.

Cooperations
  • Dr. Frank Papenmeier, University of Tübingen (Department of Psychology)