Digitalization permeates numerous areas of life and continually offers new opportunities, in private as well as in professional life. Apps are used, for example, to fight pandemics or to supplement physiotherapy; robots and other digital assistants support work in companies. However, users are not always enthusiastic about new technologies and the changes that accompany them from the very start, even when they would objectively benefit from using these technologies.
A look at the news makes one thing clear: all over the world, people take to the streets for a wide variety of reasons. Their passionate protest for (or against) a cause often results from social influence, which nowadays is frequently exerted through emotionalized communication on the Internet. But what exactly motivates people to participate in demonstrations, sign petitions, or otherwise engage collectively?
In times of fake news, it is particularly important to understand when and why people believe unconfirmed or merely suspected information. Here, we address the question of how people deal with possible causal explanations (e.g., in news headlines) that are, to date, unconfirmed. When are (confirmed) facts valued and treated differently than mere suspicions? And when do the different degrees of certainty of explanations become blurred?
Successful cooperation often requires mutual trust. This is all the more true for cooperation between groups. When the impression of the outgroup is shaped mostly by prejudice rather than by knowledge, means of building trust are needed. But how can trust in an outgroup be enhanced? Within the scope of this dissertation project, we investigate the conditions under which communication increases intergroup trust.
Numerous conspiracy theories circulate online about climate change, the effects of vaccinations, and other topics of societal relevance. Such conspiracy theories are often extremely popular, but at the same time they can be dangerous for society, as they can lead to less political and personal engagement and to reduced trust both in general and in authorities. Despite their popularity, little is yet known about the relationship between conspiratorial thinking and social influence, that is, about the social factors that play a role in the development and persistence of conspiracy theories. The current research project aims to better understand this relationship and to examine ways to limit the belief in, and impact of, conspiracy theories.
In newspapers, television, and on the internet, reports on conflicts between groups are frequent. These reports often – intentionally and unintentionally – elicit negative emotions vis-à-vis the other group, which then further fuel the conflict. Based on video clips from media coverage and texts, this project investigates how dealing with these negative emotions affects empathy with and willingness to help members of an opposing group.
Universities and organizations alike often communicate social norms to their members. These norms imply expected types of behavior. In recent years, ‘excellence’ has become increasingly important: numerous universities and organizations emphasize, for instance on their websites or internal communication platforms, the importance of excellent performance and the premium quality of their products. How do members respond to such norms of excellence?
In many situations, groups play an important role: members of a team work on projects collaboratively, students form learning groups, and members of online groups discuss issues that are important to them. In this context, this dissertation project investigates two key questions: How do group members react when another member of their group does not fulfill their expectations regarding appropriate behavior? And what determines which reaction they show?
Social power characterizes many situations in which people exchange knowledge (e.g., across hierarchies in organizations). Power can tempt people to focus on personal benefits, hindering collaboration. Yet at times it is precisely those high in power who feel responsible and look after others' interests. When and why is this the case? Which conditions promote responsibility among power-holders?
Artificial intelligence (AI) is increasingly involved in written online communication and is already influencing the form of the messages we receive. Some of these AI-based programs target interpersonal processes and perceptions (e.g., helping a message’s sender make a positive impression on the recipient) through modifications of message language. However, as AIs have gained a degree of autonomy in their (suggested) modifications, the line between human and technological factors in written online communication has become blurred. In fact, the actual effects of such programs on their users (i.e., the senders of AI-supported messages), on the communication process, and on the recipients of such modified messages remain largely unclear.
Humans often tend to treat technical systems as social actors and ascribe human-like characteristics to them (e.g., when telling a computer to work faster). With the ongoing introduction of artificial intelligence (AI), this tendency is likely to increase, as technical systems become ever more capable (e.g., of solving complex problems or adapting to individual users) and are often even explicitly designed to appear human-like.
Social media not only make information more accessible, but also encourage the spread of conspiracy theories. One area in which conspiracy theories are associated with negative consequences is vaccination. Today, infection rates for diseases such as measles and mumps are rising again in many industrialized countries, a trend associated with declining vaccination rates. This decline is encouraged by increasing anti-vaccination activism, which is often based on conspiracy theories. In this project, we investigate the influence of belief in conspiracy theories in the context of vaccination and how to counteract it.