
    Honeypots and honeynets: issues of privacy

    Honeypots and honeynets are popular tools in the areas of network security and network forensics. Their deployment and use are influenced by a number of technical and legal issues that need to be carefully considered. In this paper, we outline the privacy issues of honeypots and honeynets with respect to their technical aspects. The paper discusses the legal framework of privacy and the legal grounds for data processing. We also discuss IP addresses, which are considered personal data under EU law. The analysis of legal issues is based on EU law and is supported by discussions of privacy and related issues.

    Burning Bridges: Facial Recognition Technology and Public Space Surveillance in the Modern State

    Live automated facial recognition technology, rolled out in public spaces and cities across the world, is transforming the nature of modern policing. In R (on the application of Bridges) v Chief Constable of South Wales Police, decided in August 2020 ('Bridges'), the first successful legal challenge to automated facial recognition technology worldwide, the Court of Appeal in the United Kingdom held that the use of automated facial recognition technology by the South Wales Police was unlawful. This landmark ruling may set a precedent and influence future policy on facial recognition in many countries. The Bridges decision imposes some limits on the previously unconstrained police discretion over whom to target and where to deploy the technology. Yet, while the decision demands a clearer legal framework to limit the discretion of police who use such technology, it does not, in principle, oppose the use of facial recognition technology for mass surveillance in public places, nor for monitoring political protests. On the contrary, the Court accepted that the use of automated facial recognition in public spaces to identify very large numbers of people and to track their movements is proportionate to law enforcement goals. Thus, the Court dismissed the wider impact and significant risks posed by using facial recognition technology in public spaces; it underplayed the heavy burden placed on democratic participation and on the rights to freedom of expression and association, which require collective action in public spaces. Neither did the Court demand transparency about the technologies used by police forces, which are often shielded behind 'trade secrets' claims by the corporations that produce them, nor did it act to prevent fragmentation and inconsistency between local police forces' rules and regulations on automated facial recognition technology, which leaves the law less predictable. Thus, while the Bridges decision is reassuring and demands change in the discretionary approaches of UK police in the short term, its long-term impact in burning bridges between the expanding public space surveillance infrastructure and the modern state is less certain.

    Technology and the Judicial Role
