Original Research
A collection of recent research and white papers by Take This staff, produced independently and in collaboration with expert academic and industry partners.
Featured Research:
Potential misalignment between service providers (i.e., companies) and users about what constitutes harm has received some attention in social media spaces, but there remains limited research on whether industry professionals and players agree on the severity of different types of harms in gaming spaces. To gain insight into these lingering questions and potential discrepancies, we conducted an online survey evaluating perceptions of the severity and prevalence of a range of online harms among game players and trust and safety professionals from the gaming industry.
Player safety is of utmost importance to game makers. However, revenue-based arguments are often the most influential in enacting meaningful policy change. To help build a stronger business case for prioritizing resources to address toxicity, we surveyed 2,408 players about their trust and safety expectations and experiences with hate and harassment in gaming spaces.
The game industry has struggled to effectively mitigate various forms of disruptive behavior in games. Peer-to-peer social disruptions are of particular concern, such as sharing hate speech, harassing other players, and propagating extremist rhetoric. Most players have witnessed the expression of hate speech and have witnessed or been a direct target of harassment, with a significant proportion experiencing sustained harassment over time. The prevalence and intensity of these experiences are magnified among marginalized communities, such as women and people of color. Extremist rhetoric is also widespread, with ideologies such as misogyny, racism, Islamism, white supremacy, and white nationalism reported as relatively commonplace occurrences.
Online harassment is a relatively commonplace occurrence in the video game industry and player communities. A lack of diversity has unquestionably contributed to the high incidence of such harassment. In this paper, we take an exploratory approach, via a cross-sectional observational study, to evaluate the harassment of game industry professionals on social media.
Hate, harassment, and other forms of so-called “toxicity” are colloquially discussed as normalized activities in gaming spaces. However, several challenges have limited researchers’ ability to assess this normalization in terms of the prevalence, nature, and embeddedness of these deviant practices. This work addresses those challenges directly and assesses the rates of dark participation, its mental health impact, player mitigation strategies, and player perceptions around the cultural normalization of these actions within gaming communities. The results provide empirical support for high rates of dark participation in games and a range of mental health consequences of these actions, as well as player endorsement of these behaviors as culturally justified and accepted within gaming spaces.
This article provides the first comprehensive cataloging and overview of dark participation in games. This includes defining these behaviors, cataloging their variants, and discussing their social and psychological impact and their potential underpinnings.
Concerns about the extremist exploitation of digital games have been increasing in recent years. These worries center on both direct (targeted recruitment efforts) and indirect (normalization of extremist sentiment in gaming spaces) radicalization of game players. Given the centrality of digital games in today’s entertainment ecosystem, these concerns are warranted: digital games are now a larger industry than television, film, and music combined, and nearly one-third of the world plays video games.
A collaboration between Take This and the University of Texas at Austin, this research paper explores the role of identity fusion as one potential mechanism through which extremist ideology may permeate through game players and gaming communities.
While not often discussed in public spheres, there seems to be a widely held assumption that hate and harassment in gaming spaces (often discussed under the umbrella of ‘toxicity’) remain a cornerstone of these communities because so-called “toxic players” are believed to constitute a significant proportion of consumers. This idea has fueled an unsubstantiated fear that improved moderation of these behaviors would essentially moderate out a core game-playing demographic. But is this the case?
The Game Awards Future Class represents the best and brightest in games. In its inaugural year, the 2021 Future Class organized a “Class Project” to draw attention to a pressing or challenging issue within games and propose solutions for the future. This year, the Future Class focused on gender-based discrimination, a growing topic of concern within the games industry.
Although most game players use voice chat to communicate within gaming spaces, it has received little attention in moderation efforts. As a consequence, it remains unclear to what extent verbal communication reflects the same levels of hate and harassment that have been documented in text-based communication within gaming spaces.
In 2016, Take This released its first white paper, Crunch Hurts, outlining the harmful effects of long-term crunch culture on game developers.
Designing Games to Challenge the Stigma Around Mental Health was written by Take This intern and York University PhD student Matthew Whitby and our research director, Dr. Rachel Kowert. It is a practical tool to help game designers build games that actively challenge the stigma around mental health, or at the very least do not contribute further to it.
In collaboration with industry insiders and clinical experts, Take This released a State of the Industry white paper in July 2019 outlining areas of opportunity and concern in the industry, and identifying areas of new research and work for industry working groups and partners, including the Fair Play Alliance, IGDA Foundation, and Take This-convened working groups in HR and game play.