
    A Cross-Game Look at Transgender Representation in Video Games

    Despite a history of tracking, analyzing, and discussing transgender representation in media, research on video games is often left out. In this project, I analyzed 63 games released from 1988–2019 and documented on the LGBTQ Game Archive as having transgender characters. A content survey revealed four overarching trends in how video games represent transgender characters: dysphoria/physical transition, mentally ill killers, trans shock/reveal, and ambiguity. I also demonstrated how transgender representation in video games manifests in ways similar to film and television. Three of the four trends have been studied repeatedly in media studies, but the fourth and largest trend, gender ambiguity, remains understudied. Research on transgender representation in video games mostly focuses on explicit representation. The findings show, however, that even where explicit representation is absent, transness is often included in media implicitly, in the form of gender ambiguity.

    Review: Intersectional Tech: Black Users in Digital Gaming

    Review: Intersectional Tech: Black Users in Digital Gaming, by Kishonna L. Gray. 2020. Louisiana State University Press. xiii + 195 pp.

    "Should We Unban, Chat?": Communal Content Moderation on Twitch

    Employing digital ethnographic and interview methods alongside thematic analysis, I examined communal content moderation practices on Twitch in “unban appeal streams.” Streamers use humor and unban appeal streams as opportunities to make entertaining content, but some also use these streams to educate and reform offenders. A feedback loop exists between Twitch and its most popular streamers, allowing Twitch and its users to hold each other accountable. However, platforms like Twitch have centralized actors that hold more power than their users, making conversation between the two difficult. With desires for a more communal, user-driven platform in mind, I offer suggestions for improving protections for marginalized users experiencing online harassment and abuse, as well as for implementing justice theories into content moderation systems.

    How Transgender People and Communities Were Involved in Trans Technology Design Processes

    Trans technology – technology created to help address challenges that trans people face – is an important area for innovation that can help improve marginalized people's lives. We conducted 104 interviews with 115 creators of trans technology to understand how they involved trans people and communities in design processes. We describe projects that used human-centered design processes, as well as design processes that involved trans people in smaller ways, including gathering feedback from users, conducting user testing, or the creators being trans themselves. We show how involving trans people and communities in design is vital for trans technologies to realize their potential for addressing trans needs. Yet we highlight a frequent gap between trans technology design and deployment, and discuss ways to bridge this gap. We argue for the importance of involving community in trans technology design to ensure that trans technology achieves its promise of helping address trans needs and challenges.
    Funding: National Science Foundation; Riecker Undergraduate Research Fund; Center for the Education of Women+; UMSI REMS Program. Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/176366/1/HaimsonHowTransgenderPeople.pdf

    (In)visible moderation: A digital ethnography of marginalized users and content moderation on Twitch and Reddit

    Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts are suspended, the processes governing content moderation are largely invisible, making content moderation bias difficult to assess. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. On Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content, and at other times allowing for increased platform accountability.
    Funding: National Science Foundation. Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/174108/1/ThachInvisibleModeration.pdf