
    Hackers: a case-study of the social shaping of computing

    The study is an examination of hacking, placing the act in the context of theories of technological change. The account of hacking is used to substantiate those theories that emphasise the societal shaping of technology over the notion of technological determinism. The evolution of hacking is traced, showing how it reflects changing trends in the nature of information: the most vivid of these is the conceptualisation of information known as 'cyberspace'. Instead of simply cataloguing the impact of technical changes within computing, and the effects they have had upon information, the study shows how technical change takes place in a process of negotiation and conflict between groups. The two main groups analysed are those of the Computer Underground (CU) and the Computer Security Industry (CSI). The experiences and views of both groups are recounted in what constitute internalist and externalist accounts of hacking and its significance. The internalist account is the evidence provided by hackers themselves. It addresses such issues as what motivates the act of hacking; whether there is an identifiable hacking culture; and why it is an almost exclusively male activity. The externalist account contains the perceptions of hacking held by those outside the activity. The state of computing's security measures and its vulnerability to hacking are described, and evidence is provided of the extent to which hacking gives rise to technical knowledge that could be of potential use in fixing security weaknesses. The division within the CSI between those broadly cooperative with hackers and those largely hostile to them is examined, and the reasons why hacking knowledge is not generally utilised are explored. Hackers are prevented from gaining legitimacy within computing in a process referred to as 'closure'. Examples include hackers being stigmatised through the use of analogies that compare their computing activities to conventional crimes such as burglary and trespass. Stigmatisation is carried out by the CSI, who use it in a process of professional boundary formation to distinguish themselves from hackers. It is also used by other authority figures such as Members of Parliament, whose involvement in the process of closure takes the form of the anti-hacking legislation they have passed, an analysis of which concludes this study.

    Wide spectrum attribution: Using deception for attribution intelligence in cyber attacks

    Modern cyber attacks have evolved considerably. The skill level required to conduct a cyber attack is low: computing power is cheap, and targets are diverse and plentiful. Point-and-click crimeware kits are widely circulated in the underground economy, while source code for sophisticated malware such as Stuxnet is available for all to download and repurpose. Despite decades of research into defensive techniques, such as firewalls, intrusion detection systems, anti-virus, code auditing, etc., the quantity of successful cyber attacks continues to increase, as does the number of vulnerabilities identified. Measures to identify perpetrators, known as attribution, have existed for as long as there have been cyber attacks. The most actively researched technical attribution techniques involve the marking and logging of network packets. These techniques are performed by network devices along the packet's journey, which most often requires modification of existing router hardware and/or software, or the inclusion of additional devices. These modifications require wide-scale infrastructure changes that are not only complex and costly, but invoke legal, ethical and governance issues. The usefulness of these techniques is also often questioned, as attack actors use multiple stepping stones, often innocent systems that have been compromised, to mask the true source. As such, this thesis identifies that no publicly known previous work has been deployed on a wide-scale basis in the Internet infrastructure. This research investigates the use of an often overlooked tool for attribution: cyber deception. The main contribution of this work is a significant advancement in the field of deception and honeypots as technical attribution techniques. Specifically, the design and implementation of two novel honeypot approaches: i) Deception Inside Credential Engine (DICE), which uses policy and honeytokens to identify adversaries returning from different origins, and ii) Adaptive Honeynet Framework (AHFW), an introspection-based, adaptive honeynet framework that uses actor-dependent triggers to modify the honeynet environment, engaging the adversary and increasing the quantity and diversity of interactions. The two approaches are based on a systematic review of the technical attribution literature, which was used to derive a set of requirements for honeypots as technical attribution techniques. Both approaches lead the way for further research in this field.
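    To make the honeytoken idea concrete, the minimal sketch below shows how planting unique fake credentials per leak channel can attribute a later login attempt back to the channel it was leaked through. This is only an illustration of the general technique, not the thesis's DICE implementation; the class, function names and origin labels are hypothetical.

```python
import hashlib
import secrets
from datetime import datetime, timezone

# Illustrative only: a toy honeytoken credential store in the spirit of a
# DICE-like approach (names and structure are hypothetical, not the thesis code).
# Each planted credential is unique to the channel it was leaked through, so a
# later login attempt using that credential links the attacker to that origin.

class HoneytokenStore:
    def __init__(self):
        self._tokens = {}  # credential hash -> origin label

    def plant(self, origin: str) -> tuple[str, str]:
        """Generate a unique fake username/password pair tagged with its leak origin."""
        username = f"svc_{secrets.token_hex(4)}"
        password = secrets.token_urlsafe(12)
        key = hashlib.sha256(f"{username}:{password}".encode()).hexdigest()
        self._tokens[key] = origin
        return username, password

    def check_login(self, username: str, password: str) -> str | None:
        """Return the origin label if this login reuses a planted honeytoken."""
        key = hashlib.sha256(f"{username}:{password}".encode()).hexdigest()
        origin = self._tokens.get(key)
        if origin is not None:
            print(f"[{datetime.now(timezone.utc).isoformat()}] "
                  f"honeytoken reuse detected; credential originally leaked via: {origin}")
        return origin

# Example: plant one credential per leak channel, then attribute a later reuse.
store = HoneytokenStore()
user, pwd = store.plant("pastebin-dump-2024-03")
store.check_login(user, pwd)  # attributes the attempt to "pastebin-dump-2024-03"
```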

    “Be a Pattern for the World”: The Development of a Dark Patterns Detection Tool to Prevent Online User Loss

    Dark Patterns are designed to trick users into sharing more information or spending more money than they had intended, by configuring online interactions to confuse users or put pressure on them. They are highly varied in form and are therefore difficult to classify and detect. This research accordingly develops a framework for the automated detection of potential instances of web-based dark patterns, and from there a software tool that detects and highlights these patterns as a defensive aid.
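    As an illustration of what automated detection of potential dark-pattern instances might look like, the sketch below flags textual cues of two common families (false urgency and confirmshaming) using simple rules. The rule set, category names and phrases are assumptions for illustration only and are not the framework or tool developed in this research.

```python
import re

# Illustrative sketch only: a tiny rule-based scanner for textual cues of two
# common dark-pattern families. Real detection would need page structure,
# layout and interaction context, not just text matching.

RULES = {
    "false_urgency": [
        r"only \d+ left",
        r"offer ends in \d+ (minutes|hours)",
        r"\d+ (people|others) are looking at this",
    ],
    "confirmshaming": [
        r"no thanks, i (don'?t|do not) want to save",
        r"i prefer to pay full price",
    ],
}

def scan_page_text(text: str) -> list[dict]:
    """Return potential dark-pattern matches with their category and matched snippet."""
    findings = []
    for category, patterns in RULES.items():
        for pattern in patterns:
            for match in re.finditer(pattern, text, flags=re.IGNORECASE):
                findings.append({"category": category, "snippet": match.group(0)})
    return findings

sample = "Hurry! Only 3 left in stock. [No thanks, I don't want to save money]"
for hit in scan_page_text(sample):
    print(f"{hit['category']}: \"{hit['snippet']}\"")
```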

    Technical Debt is an Ethical Issue

    We introduce the problem of technical debt, with particular focus on critical infrastructure, and put forward our view that this is a digital ethics issue. We propose that the software engineering process must adapt its current notion of technical debt – which focuses on technical costs – to include the potential cost to society if the technical debt is not addressed, and the cost of analysing, modelling and understanding this ethical debt. Finally, we provide an overview of the development of educational material – based on a collection of technical debt case studies – in order to teach about technical debt and its ethical implications.
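    One way the proposal could be made concrete is sketched below: a technical-debt record extended so that the cost of inaction covers an estimated societal cost alongside the engineering cost. This is a minimal sketch under stated assumptions, not the authors' model; the fields, weights and figures are hypothetical illustrations.

```python
from dataclasses import dataclass

# A minimal sketch, not the authors' model: extend a technical-debt record so
# that the exposure of leaving it unaddressed includes an expected cost to
# society, not only the engineering cost. All values below are hypothetical.

@dataclass
class DebtItem:
    description: str
    remediation_cost: float          # estimated engineering cost to fix (e.g. person-days)
    societal_cost_if_unfixed: float  # estimated harm to affected users/public if ignored
    likelihood_of_harm: float        # 0.0-1.0, chance the debt actually causes that harm

    def total_exposure(self) -> float:
        """Technical cost plus expected ethical (societal) cost of inaction."""
        return self.remediation_cost + self.likelihood_of_harm * self.societal_cost_if_unfixed

item = DebtItem(
    description="Unpatched legacy auth service in a water-treatment control network",
    remediation_cost=40.0,
    societal_cost_if_unfixed=5000.0,
    likelihood_of_harm=0.05,
)
print(f"Total exposure: {item.total_exposure():.1f} cost units")
```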

    Minding the Gap: Computing Ethics and the Political Economy of Big Tech

    In 1988 Michael Mahoney wrote that “[w]hat is truly revolutionary about the computer will become clear only when computing acquires a proper history, one that ties it to other technologies and thus uncovers the precedents that make its innovations significant” (Mahoney, 1988). Today, over thirty years after this was written, we are living in the middle of the information age, and computing technology is constantly transforming modern living in revolutionary ways, to such a degree that it gives rise to many ethical considerations, dilemmas and social disruptions. To explore the myriad issues associated with the ethical challenges of computing through the lens of political economy, it is important to explore the history and development of computer technology.

    Critique of Fantasy, Vol. 2

    "In The Contest between B-Genres, the “Space Trilogy” by J.R.R. Tolkien’s friend and colleague C.S. Lewis and the roster of American science fictions that Gotthard GĂŒnther selected and glossed for the German readership in 1952 demarcate the ring in which the contestants face off. In carrying out in fiction the joust that Tolkien proclaimed in his manifesto essay “On Fairy-Stories,” Lewis challenged the visions of travel through time and space that were the mainstays of modern science fiction. In the facing corner, GĂŒnther recognized in American science fiction the first stirrings of a new mythic storytelling that would supplant the staple of an expiring metaphysics, the fairy-story basic to Tolkien and Lewis’s fantasy genre. The B-genres science fiction and fantasy were contemporaries of cinema’s emergence out of the scientific and experimental study and recording of motion made visible. In an early work like H.G. Wells’s The Time Machine, which Tolkien credited as work of fantasy, the transport through time – the ununderstood crux of this literary experiment – is conveyed through a cinematic–fantastic component in the narrative, reflecting optical innovations and forecasting the movies to come. Although the historical onset of the rivalry between the B-genres is packed with literary examples, adaptation (acknowledged or not) followed out the rebound of wish fantasy between literary descriptions of the ununderstood and their cinematic counterparts, visual and special effects. The arrival of the digital relation out of the crucible of the unknown and the special effect seemed at last to award the fantasy genre the trophy in its contest with science fiction. And yet, although science fiction indeed failed to predict the digital future, fantasy did not so much succeed as draw benefit from the mere resemblance of fantasying to the new relation. While it follows that digitization is the fantasy that is true (and not, as Tolkien had hoped, the Christian Gospel), the newly renewed B-genre without borders found support in another revaluation that was underway in the other B-genre. Once its future orientation was “history,” science fiction began indwelling the ruins of its faulty forecasts. By its new allegorical momentum, science fiction supplied captions of legibility and history to the reconfigured borderlands it cohabited with fantasy. The second volume also attends, then, to the hybrids that owed their formation to these changes, both anticipated and realized. Extending through the topography of the borderlands, works by J.G. Ballard, Ursula Le Guin, and John Boorman, among others, occupy and cathect a context of speculative fiction that suspended and blended the strict contest requirements constitutive of the separate B-genres

    Ethical research in public policy.

    Public policy research is research for a purpose, guided by a distinctive range of normative considerations. The values are the values of public service; the work is generally done in the public domain; and the research is an intrinsic part of the democratic process, which depends on deliberation and accountability. Conventional representations of ethical research typically focus on ‘human subjects’ research, which raises different kinds of ethical issues from those raised by public policy research. Existing research ethics advice does not address the issues surrounding public policy research. Such research is typically concerned with collective action and the work of institutions, and the central guiding principles are not about responsibility to research participants but about duties to the public, as seen in principles of beneficence, citizenship, empowerment and the democratic process.

    Ethical Evidence and Policymaking

    EPDF and EPUB available Open Access under CC-BY-NC-ND licence. This important book offers practical advice for using evidence and research in policymaking. The book has two aims. First, it builds a case for ethics and global values in research and knowledge exchange, and second, it examines specific policy areas and how evidence can guide practice. The book covers important policy areas including the GM debate, the environment, Black Lives Matter and COVID-19. Each chapter assesses the ethical challenges, the status of evidence in explaining or describing the issue, and possible solutions to the problem. The book will enable policymakers and their advisors to seek evidence for their decisions from research that has been conducted ethically and with integrity.

    Proceedings of the ETHICOMP 2022: Effectiveness of ICT ethics - How do we help solve ethical problems in the field of ICT?

    This ETHICOMP is again organized in exceptional times. The two previous conferences were forced online because of the COVID-19 pandemic, but it was decided that this one would be held physically or cancelled, as the need for real encounters and discussion between people is an essential part of doing philosophy. We need the possibility of meeting people face to face, and even though some presentations were given remotely – because of insurmountable travel problems for some authors – we managed to hold a real, physical conference, even if the number of participants was smaller than at previous conferences. The need for ETHICOMP is underlined by the way the world is portrayed to us today. Truthfulness and argumentation seem to be replaced by lies, strategic games, hate and disrespect for humanity in personal, societal and even global communication. ETHICOMP is often referred to as a community, and it is therefore important that we, as a community, protect what ETHICOMP stands for. We need to seek goodness and be able to argue for what that goodness is. This leads us towards Habermas's communicative action and discourse ethics, which encourage open and respectful discourse between people (see e.g. Habermas 1984; 1987; 1996). However, this does not mean that we need to accept everything and everybody. We need to defend truthfulness and equality, and demand them from others too. There are situations in which some people should be removed from discussions if they reject the demands of discourse: by giving voice to claims that have no respect for argumentation, lack respect for human dignity, or are not open to mutual understanding (or at least to the possibility of it), we cannot have meaningful communication. This is visible in communication at all levels today, and it should not be accepted but resisted. That is the duty of us all.