Research Article
DOI: 10.1145/3491102.3517505

Designing Word Filter Tools for Creator-led Comment Moderation

Published: 29 April 2022

Abstract

Online social platforms centered around content creators often allow comments on content, where creators can then moderate the comments they receive. As creators can face overwhelming numbers of comments, with some of them harassing or hateful, platforms typically provide tools such as word filters for creators to automate aspects of moderation. From needfinding interviews with 19 creators about how they use existing tools, we found that they struggled with writing good filters as well as organizing and revising their filters, due to the difficulty of determining what the filters actually catch. To address these issues, we present FilterBuddy, a system that supports creators in authoring new filters or building from pre-made ones, as well as organizing their filters and visualizing what comments are captured by them over time. We conducted an early-stage evaluation of FilterBuddy with YouTube creators, finding that participants see FilterBuddy not just as a moderation tool, but also a means to organize their comments to better understand their audiences.
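The word filters described in the abstract can be thought of as named lists of phrases matched against incoming comments, with a comment flagged when any phrase in any filter matches. As an illustrative sketch only (this is not the paper's FilterBuddy implementation; the `WordFilter` class, filter names, and phrase lists below are hypothetical):

```python
import re

class WordFilter:
    """A named list of phrases; a comment is caught if any phrase matches."""

    def __init__(self, name, phrases):
        self.name = name
        # \b word boundaries avoid substring false positives
        # (the classic "Scunthorpe problem"); matching is case-insensitive.
        self.pattern = re.compile(
            r"\b(" + "|".join(re.escape(p) for p in phrases) + r")\b",
            re.IGNORECASE,
        )

    def catches(self, comment):
        return self.pattern.search(comment) is not None

# Hypothetical filters a creator might author or adopt from pre-made lists.
filters = [
    WordFilter("insults", ["idiot", "loser"]),
    WordFilter("spam", ["free giveaway", "click my link"]),
]

def moderate(comment):
    """Return the names of all filters that catch this comment."""
    return [f.name for f in filters if f.catches(comment)]
```

Grouping phrases into named filters, as sketched here, is what enables the organization and visualization features the abstract describes: each caught comment can be attributed to the specific filter (and phrase) responsible.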

Supplementary Material

MP4 File (3491102.3517505-video-figure.mp4)
Video Figure




    Published In

    CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
    April 2022, 10459 pages
    ISBN: 9781450391573
    DOI: 10.1145/3491102
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. FilterBuddy
    2. YouTube
    3. content creators
    4. content moderation
    5. human-computer integration
    6. online harassment
    7. platform governance

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    CHI '22: CHI Conference on Human Factors in Computing Systems
    April 29 – May 5, 2022, New Orleans, LA, USA

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


    Cited By

    • (2024) Online knowledge production in polarized political memes: The case of critical race theory. New Media & Society. https://doi.org/10.1177/14614448241252591
    • (2024) Adopting Third-party Bots for Managing Online Communities. Proc. ACM Hum.-Comput. Interact. 8, CSCW1, 1–26. https://doi.org/10.1145/3653707
    • (2024) Opportunities, tensions, and challenges in computational approaches to addressing online harassment. In Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1483–1498. https://doi.org/10.1145/3643834.3661623
    • (2024) Labeling in the Dark: Exploring Content Creators' and Consumers' Experiences with Content Classification for Child Safety on YouTube. In Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1518–1532. https://doi.org/10.1145/3643834.3661565
    • (2024) AppealMod: Inducing Friction to Reduce Moderator Workload of Handling User Appeals. Proc. ACM Hum.-Comput. Interact. 8, CSCW1, 1–35. https://doi.org/10.1145/3637296
    • (2024) Third-Party Developers and Tool Development For Community Management on Live Streaming Platform Twitch. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–18. https://doi.org/10.1145/3613904.3642787
    • (2024) Community Begins Where Moderation Ends: Peer Support and Its Implications for Community-Based Rehabilitation. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–18. https://doi.org/10.1145/3613904.3642675
    • (2024) A Browser Extension for in-place Signaling and Assessment of Misinformation. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–21. https://doi.org/10.1145/3613904.3642473
    • (2024) Behind the Pup-ularity Curtain: Understanding the Motivations, Challenges, and Work Performed in Creating and Managing Pet Influencer Accounts. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3613904.3642367
    • (2024) Bystanders of Online Moderation: Examining the Effects of Witnessing Post-Removal Explanations. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–9. https://doi.org/10.1145/3613904.3642204
