
    Sexual images depicting children: the EU legal framework and online platforms’ policies

    Sexual(ised) images of children may often be posted or shared on social network sites or content-sharing platforms. While such material may be the result of abuse or coercion, evidence shows that it may often be linked to contemporary forms of sexual exploration or intimate communication among underage peers. The aim of this paper is to explore the boundaries of the EU legal and policy framework regulating online platforms' liability for hosting or failing to remove such imagery. Examining popular online platforms' policies against their legal responsibility to contribute to the fight against illegal child sexual abuse material (CSAM) reveals a tendency of online intermediaries to restrict more content than the law requires of them. However justifiable this 'better safe than sorry' approach might be, it sparks additional controversy in relation to children's agency. Navigating between the protection and the freedom of children is an inherently difficult balancing exercise when the issue at stake touches on elements such as gender, morality, and culture, and it does not lend itself to easy public policy or private industry solutions. In the absence of clear and sufficient policy guidelines, online platforms have no choice but to shape their policies around popular cultural norms, their business plans, and their own understanding of how sensitive content should be handled.

    Report: 3rd International Press Freedom Seminar: off/online intimidation of journalists

    The annual International Press Freedom Seminar was organised for the third time by the Faculty of Law and Criminology and the Faculty of Political and Social Sciences of Ghent University, Belgium. The seminar brought together speakers from different backgrounds: journalists, academics, and civil society organisations supporting and monitoring the protection of journalists. They shared their insights into practices of intimidation of journalists and the mechanisms that offer (legal) protection against such practices.

    The EU Approach to Safeguard Children’s Rights on Video-Sharing Platforms: Jigsaw or Maze?

    Children are keen consumers of audiovisual media content. Video-sharing platforms (VSPs), such as YouTube and TikTok, offer a wealth of child-friendly or child-appropriate content but also content which, depending on the age of the child, might be considered inappropriate or potentially harmful. Moreover, such VSPs often deploy algorithmic recommender systems to personalise the content that children are exposed to (e.g., through auto-play features), leading to concerns about diversity of content or spirals of content related to, for instance, eating disorders or self-harm. This article explores the responsibilities towards children that existing, recently adopted, and proposed EU legislation imposes on VSPs. Instruments that we investigate include the Audiovisual Media Services Directive, the General Data Protection Regulation, the Digital Services Act, and the proposal for an Artificial Intelligence Act. Based on a legal study of policy documents, legislation, and scholarship, this contribution investigates to what extent this legislative framework sets obligations for VSPs to safeguard children’s rights and discusses how these obligations align across different legislative instruments.

    Towards a legal qualification of online sexual acts in which children are involved: constructing a typology

    In today’s society, the widespread use of digital technologies among children is indisputable. They learn to navigate the digital environment from a young age by using a plethora of online communication platforms and mobile applications. The use of and interaction with digital media and services touches on a broad variety of aspects of one’s (private) life, including the development of (online) sexual behaviour. This article constructs a typology of online sexual acts in which children are involved, as a step towards their legal qualification.