
    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only sources of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks forms a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.

    FISA Reform

    Congress and the Executive Branch are poised to take up the issue of FISA reform in 2014. What has been missing from the discussion is a comprehensive view of ways in which reform could be given effect—i.e., a taxonomy of potential options. This article seeks to fill the gap. The aim is to deepen the conversation about abeyant approaches to foreign intelligence gathering, to allow fuller discussion of what a comprehensive package could contain, and to place initiatives that are currently under consideration within a broader, over-arching framework. The article begins by considering the legal underpinnings of, and challenges to, the President's Surveillance Program. It then examines how technology has altered the types of information available, as well as methods of transmission and storage. The article builds on this to develop a taxonomy for how a statutory approach to foreign intelligence gathering could be given force. It divides foreign intelligence gathering into two categories: front-end collection and back-end analysis and use. Each category contains a counterpoise structured to ensure the appropriate exercise of Congressionally-mandated authorities. For the front-end, this means balancing the manner of collection with requirements for approval. For the back-end, this means offsetting implementation with transparency and oversight. The article then considers the constituent parts of each category.

    An examination into the role of knowledge management and computer security in organizations

    Organisations develop their computer security procedures based on external guidelines such as ISO 17799, with very little provision to incorporate organisational knowledge in their security procedures. While these external guidelines make recommendations as to how an organisation should develop and implement best practices in computer security, they often fail to provide a mechanism that links the security process to organisational knowledge. The result is that security policies, procedures and controls are often implemented that are neither strong nor consistent with the organisation's objectives. This study examined the role of Knowledge Management in organisational Computer Security in 19 Australian SMEs. The study determined that although the role of knowledge management in organisational computer security is currently limited, there appears to be evidence to argue that applying knowledge management systems to organisational computer security development and management processes will considerably enhance performance and reduce costs. The study concludes that future research is warranted on how existing computer security standards and practices can be improved to allow stronger integration with organisational knowledge through the application of knowledge management systems.

    Regulating Data as Property: A New Construct for Moving Forward

    The global community urgently needs precise, clear rules that define ownership of data and express the attendant rights to license, transfer, use, modify, and destroy digital information assets. In response, this article proposes a new approach for regulating data as an entirely new class of property. Recently, European and Asian public officials and industries have called for data ownership principles to be developed, above and beyond current privacy and data protection laws. In addition, official policy guidance and legal proposals have been published that offer to accelerate realization of a property rights structure for digital information. But how can ownership of digital information be achieved? How can those rights be transferred and enforced? Those calls for data ownership emphasize the impact of ownership on the automotive industry and the vast quantities of operational data which smart automobiles and self-driving vehicles will produce. We looked at how, if at all, the issue was being considered in consumer-facing statements addressing the data collected by such vehicles. To formulate our proposal, we also considered continued advances in scientific research, quantum mechanics, and quantum computing which confirm that information in any digital or electronic medium is, and always has been, physical, tangible matter. Yet, to date, data regulation has sought to adapt legal constructs for “intangible” intellectual property or to express a series of permissions and constraints tied to specific classifications of data (such as personally identifiable information).
We examined legal reforms recently approved by the United Nations Commission on International Trade Law to enable transactions involving electronic transferable records, as well as prior reforms adopted in the United States Uniform Commercial Code and Federal law to enable similar transactions involving digital records that were, historically, physical assets (such as promissory notes or chattel paper). Finally, we surveyed prior academic scholarship in the U.S. and Europe to determine whether the physical attributes of digital data had been previously considered in the vigorous debates on how to regulate personal information, or the extent, if at all, to which the solutions developed for transferable records had been considered for larger classes of digital assets. Based on the preceding, we propose that regulation of digital information assets, and clear concepts of ownership, can be built on existing legal constructs that have enabled electronic commercial practices. We propose a property rules construct under which a right to own digital information clearly arises upon creation (whether by keystroke or machine), and suggest when and how that right attaches to specific data through the exercise of technological controls. This construct will enable faster, better adaptations of new rules for the ever-evolving portfolio of data assets being created around the world. This approach will also create more predictable, scalable, and extensible mechanisms for regulating data and is consistent with, and may improve the exercise and enforcement of, rights regarding personal information. We conclude by highlighting existing technologies and their potential to support this construct, and begin an inventory of the steps necessary to proceed further with this process.

    SecMon: End-to-End Quality and Security Monitoring System

    Voice over Internet Protocol (VoIP) is becoming a more available and popular way of communicating for Internet users. This also applies to Peer-to-Peer (P2P) systems, and merging the two has already proven successful (e.g. Skype). Although the existing VoIP standards provide for security and Quality of Service (QoS), these features are usually optional and supported by only a limited number of implementations. As a result, the lack of mandatory and widely applicable QoS and security guarantees makes contemporary VoIP systems vulnerable to attacks and network disturbances. In this paper we address these issues and propose the SecMon system, which simultaneously provides a lightweight security mechanism and improves the quality parameters of the call. SecMon is intended especially for VoIP service over P2P networks; its main advantage is that it provides authentication, data integrity services, adaptive QoS and (D)DoS attack detection. Moreover, the SecMon approach is a low-bandwidth solution that is transparent to users and possesses a self-organizing capability. The above-mentioned features are accomplished mainly by utilizing two information hiding techniques: digital audio watermarking and network steganography. These techniques create covert channels that serve as transport channels for lightweight QoS measurement results. These metrics are aggregated in a reputation system that enables best route path selection in the P2P network. The reputation system also helps to mitigate (D)DoS attacks, maximize performance and increase transmission efficiency in the network.
    Comment: Paper was presented at the 7th international conference IBIZA 2008: On Computer Science - Research And Applications, Kazimierz Dolny, Poland, 31.01-2.02.2008; 14 pages, 5 figures.
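    The reputation-based route selection described in the abstract above can be sketched as follows. This is a minimal illustrative sketch, not code from the SecMon paper: the class names, metric thresholds (400 ms delay, 5% loss) and equal weighting are assumptions chosen for the example.

    ```python
    from dataclasses import dataclass


    @dataclass
    class PathReputation:
        """Exponentially smoothed reputation for one P2P route."""
        score: float = 0.5   # neutral starting reputation in [0, 1]
        alpha: float = 0.3   # smoothing weight given to each new measurement

        def report(self, delay_ms: float, loss_rate: float) -> None:
            # Map raw QoS measurements (as carried over the covert channel)
            # to a [0, 1] quality sample: lower delay and loss score higher.
            delay_q = max(0.0, 1.0 - delay_ms / 400.0)   # assumed: 400 ms ~ unusable
            loss_q = max(0.0, 1.0 - loss_rate / 0.05)    # assumed: 5% loss ~ unusable
            sample = 0.5 * delay_q + 0.5 * loss_q
            self.score = (1 - self.alpha) * self.score + self.alpha * sample


    def best_path(paths: dict[str, PathReputation]) -> str:
        """Select the route with the highest aggregated reputation."""
        return max(paths, key=lambda name: paths[name].score)


    paths = {"via-peer-A": PathReputation(), "via-peer-B": PathReputation()}
    paths["via-peer-A"].report(delay_ms=80, loss_rate=0.01)   # healthy path
    paths["via-peer-B"].report(delay_ms=350, loss_rate=0.04)  # degraded path
    print(best_path(paths))  # via-peer-A
    ```

    The exponential smoothing keeps the reputation responsive to fresh measurements while damping single-packet outliers, which matches the abstract's goal of lightweight, continuous monitoring.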

    Fostering the Biosecurity Norm: Biosecurity Education for the Next Generation of Life Scientists

    Sustainable education on biosecurity and dual use for life scientists is increasingly recognised as an important element of broader efforts to achieve biosecurity. To address this issue, a joint project between the Landau Network-Centro Volta and the Bradford Disarmament Research Centre has been initiated to analyse what currently exists in terms of biosecurity and dual use education, and also how such education can be most effectively achieved in a sustainable fashion. The purpose of this paper is to elaborate on the findings of a survey on the extent of, and attitudes to, biosecurity and dual use education in European universities, and to outline the educational activities undertaken through a network of contacts built through the survey and some of the conclusions drawn from engagement with this network. The paper also outlines the development and optimization of an Educational Module Resource intended to support lecturers in the improvement and implementation of educational material related to biosecurity and dual use. This is further expanded by the authors' experience derived from implementation tests conducted at universities around Europe, in which material was tested with students and faculty members. To date, the main results from this evaluative process are that students and faculty members are generally unaware of biosecurity and dual use concerns, but nonetheless appear interested in discussing these topics and have initiated challenging debates on the importance of balancing factors such as security, research, secrecy and development. However, serious efforts to develop and promulgate education more broadly across the life science community will require concerted actions which look at education but also at other mutually reinforcing intervention points such as funding bodies, authors and publishers.
    Moreover, in the longer term it will also be necessary to develop new mechanisms and metrics to determine success in these activities and to ensure that educational activities are contributing, along with other legal and regulatory measures, to mitigating the challenge of potential misuse of the life sciences in the 21st century.
