Authors: Cook, D.; Zilka, M.; DeSandre, H.; Giles, S.; Weller, A.; Maskell, S.
Date available: 2022-11-04
Date issued: 2022
Citation: Cook, D., Zilka, M., DeSandre, H., Giles, S., Weller, A., & Maskell, S. (2022). Can We Automate the Analysis of Online Child Sexual Exploitation Discourse? arXiv preprint arXiv:2209.12320.
URL: https://arxiv.org/pdf/2209.12320.pdf
Handle: http://hdl.handle.net/11212/5609
Abstract: Social media's growing popularity raises concerns around children's online safety. Interactions between minors and adults with predatory intentions are a particularly grave concern. Research into online sexual grooming has often relied on domain experts to manually annotate conversations, limiting both scale and scope. In this work, we test how well automated methods can detect conversational behaviors and replace an expert human annotator. Informed by psychological theories of online grooming, we label 6772 chat messages sent by child-sex offenders with one of eleven predatory behaviors. We train bag-of-words and natural language inference models to classify each behavior, and show that the best-performing models classify behaviors in a manner that is consistent with, but not on par with, human annotation.
Language: en
Keywords: online safety; social media; child sexual exploitation; online grooming; manipulation; detection
Title: Can We Automate the Analysis of Online Child Sexual Exploitation Discourse?
Type: Article
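
As a rough illustration of the per-message classification setup described in the abstract, the sketch below trains a bag-of-words (TF-IDF) classifier that assigns one behavior label to each chat message. It is a minimal sketch under assumptions: the scikit-learn pipeline, the label names, and the placeholder messages are illustrative choices and do not reproduce the paper's dataset, label taxonomy, or models.

```python
# Minimal sketch (not the paper's implementation): a bag-of-words (TF-IDF)
# classifier assigning one behavior label per chat message.
# All data and label names below are placeholders, not the paper's taxonomy.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Placeholder (message, behavior-label) pairs standing in for the
# expert-annotated conversations used in the paper.
messages = [
    "placeholder text about building trust",
    "placeholder text about keeping secrets",
    "placeholder text about gaining trust quickly",
    "placeholder text about keeping things secret",
    "placeholder text about trust and friendship",
    "placeholder text about secrets and silence",
]
labels = ["behavior_a", "behavior_b", "behavior_a",
          "behavior_b", "behavior_a", "behavior_b"]

X_train, X_test, y_train, y_test = train_test_split(
    messages, labels, test_size=0.33, random_state=0, stratify=labels
)

# TF-IDF bag-of-words features feeding a linear classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)

# Per-behavior precision/recall, the kind of score that would then be
# compared against human annotation.
print(classification_report(y_test, model.predict(X_test), zero_division=0))
```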