
    The Advanced Framework for Evaluating Remote Agents (AFERA): A Framework for Digital Forensic Practitioners

    Digital forensics experts need a dependable method for evaluating evidence-gathering tools. Limited research and resources challenge this process, and the lack of multi-endpoint data validation hinders reliability in distributed digital forensics. A framework was designed to evaluate distributed agent-based forensic tools while enabling practitioners to self-evaluate and demonstrate evidence reliability as required by the courts. Grounded in Design Science, the framework features guidelines, data, criteria, and checklists, and expert review enhances its quality and practicality.

    On the genesis of computer forensis

    This thesis presents a coherent set of research contributions to the new discipline of computer forensis. It analyses the emergence of computer forensis and defines the challenges facing the discipline, carries forward research advances in conventional methodology, introduces a novel approach to using virtual environments in forensis, and systemises the computer forensis body of knowledge, leading to the establishment of a tertiary curriculum. The emergence of computer forensis as a separate discipline of science was triggered by the evolution and growth of computer crime. Computer technology has reached a stage at which a conventional, mechanistic approach to collecting and analysing data is insufficient: the existing methodology must be formalised and must embrace technologies and methods that enable the inclusion of transient data and live-system analysis. Further work is crucial to incorporate advances in related disciplines such as computer security and information systems audit, as well as developments in operating systems that make computer forensics concerns inherent in their design. For example, it is proposed that some of the features offered by persistent systems could be built into conventional operating systems to make illicit activities easier to identify and analyse. The analysis of permanent data storage is fundamental to computer forensics practice; very little is finalised, and much remains to be discovered, in conventional computer forensics methodology. This thesis contributes to the formalisation and improved integrity of forensic handling of data storage by: formalising methods for data collection and analysis in the NTFS (Microsoft file system) environment; presenting a safe methodology for handling data backups in order to avoid information loss where Alternate Data Streams (ADS) are present; and formalising methods of hiding and extracting hidden and encrypted data.
A significant contribution of this thesis is in the application of virtualisation (simulation of a computer in a virtual environment created by the underlying hardware and software) to computer forensics practice. Computer systems are not easily analysed for forensic purposes, and it is demonstrated that virtualisation applied in computer forensics allows for more efficient and accurate identification and analysis of evidence. A new method is proposed in which two environments used in parallel can bring faster, verifiable results that do not depend on proprietary, closed-source tools, and may lead to a gradual shift from commercial Windows software to open source software (OSS). The final contribution of this thesis is systemising the body of knowledge in computer forensics, a necessary condition for it to become an established discipline of science. This systemisation led to the design and development of a tertiary curriculum in computer forensics, illustrated here with a case study of the computer forensics major for the Bachelor of Computer Science at the University of Western Sydney. All genesis starts as an idea. A natural part of the scientific research process is replacing previous assumptions, concepts, and practices with new ones that better approximate the truth. This thesis advances the computer forensis body of knowledge in areas crucial to the further development of the discipline. Please note that the appendices to this thesis consist of separately published items which cannot be made available due to copyright restrictions; these items are listed in the PDF attachment for reference purposes.
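The integrity principle behind the thesis's formalised data-handling methods, that a forensic copy must be verifiably lossless, is commonly enforced with cryptographic hashes computed before and after acquisition. A minimal sketch of that practice (file names and the use of SHA-256 are illustrative assumptions, not details from the thesis):

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verified_copy(src: str, dst: str) -> str:
    """Copy src to dst and confirm the copy is bit-identical.

    A digest mismatch signals possible information loss during
    the copy, which would undermine evidentiary integrity.
    """
    before = sha256_of(src)
    shutil.copyfile(src, dst)
    after = sha256_of(dst)
    if before != after:
        raise ValueError("digest mismatch: copy is not forensically sound")
    return after

# Illustrative use: a temporary file stands in for an evidence image.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "evidence.img")
dst = os.path.join(workdir, "evidence_copy.img")
with open(src, "wb") as f:
    f.write(b"example disk image contents")
digest = verified_copy(src, dst)
```

Note that a byte-for-byte file copy like this does not preserve NTFS Alternate Data Streams, which is exactly the pitfall the thesis's backup methodology addresses.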

    Incorporation of therapeutic effect of daylight in the architectural design of in-patient rooms to reduce patient length of stay (LoS) in hospitals

    An individual's biological need for light differs from its merely visual purpose, such as viewing objects, working, or moving about. Lack of adequate daylight for biological stimulation can lead to health problems, e.g. an imbalanced circadian rhythm. Daylight is especially important for hospital patients, who are mostly physically and/or psychologically stressed. Since many patients stay indoors around the clock, they are vulnerable to the lack of daylight that health requires. Hence, for hospital patients, daylight can be a strong therapeutic environmental design element that supports good health and accelerates clinical recovery. The complex relationship between the daylight environment and individuals' responses is not fully understood, and contradictory results debated by previous researchers have made the implementation of daylighting strategies in the architectural design of hospital in-patient rooms difficult, particularly for therapeutic purposes. Strong evidence needs to be established to give both architects and policy makers the confidence to use daylight therapeutically, and the integration of the therapeutic effect of daylight into in-patient room architecture is necessary as well. This thesis provides information (with examples) for architects on incorporating the therapeutic effect of daylight into the design of in-patient rooms to reduce patient length of stay (LoS) in hospitals. A triangulation research method was applied: theories were developed qualitatively and tested quantitatively. A literature review was carried out to establish the potential effect of daylight on patient health. Retrospective field investigations were conducted to establish the quantitative relationship between daylight intensity and patient LoS inside in-patient rooms by developing Multiple Linear Regression (MLR) models under a general hospital environment.
Guided by the goal of enhancing the therapeutic benefit of daylight for hospital patients, informed by the literature and verified with the field investigation data, a daylight design concept (sky window configurations) was developed and evaluated in a prospective simulation study, where it performed better than traditional standard hospital window configurations. A dynamic annual Climate-Based Daylight Modelling (CBDM) method that couples the RADIANCE (backward) raytracer with a daylight coefficient approach and the Perez all-weather sky luminance model (i.e. DAYSIM) was used for the simulation analysis. The thesis develops strategies for architects to incorporate the therapeutic effect of daylight in the architectural design of hospital in-patient rooms, including guidelines to support architectural decisions in conflicting situations and to identify the range of daylight intensities within which patient LoS is expected to be reduced. The strategies also consider ultraviolet radiation (UVR) protection and discuss the challenges climate change poses for daylight researchers. The thesis contributes to knowledge by establishing strong evidence of a quantitative relationship between daylight and LoS, and by presenting new architectural forms for hospital in-patient rooms as one possible way to incorporate the therapeutic effect of daylight effectively. It is expected that the research will encourage and help architects and policy makers to incorporate the therapeutic effect of daylight in the design of hospital in-patient rooms efficiently.
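The MLR step described above can be illustrated with a small numerical sketch. All data and coefficients below are synthetic, invented purely for demonstration; the thesis's actual models were fit to field-investigation data under a general hospital environment.

```python
import numpy as np

# Synthetic illustration: daylight intensity (lux) and patient age as
# predictors of length of stay (days). Values are invented, not from
# the thesis's field data.
rng = np.random.default_rng(0)
n = 200
daylight = rng.uniform(100, 2000, n)   # mean daylight illuminance in the room
age = rng.uniform(20, 80, n)           # patient age in years
# Assume LoS falls with daylight and rises with age, plus noise.
los = 10.0 - 0.002 * daylight + 0.05 * age + rng.normal(0.0, 0.5, n)

# Fit LoS ~ b0 + b1*daylight + b2*age by ordinary least squares.
X = np.column_stack([np.ones(n), daylight, age])
coef, *_ = np.linalg.lstsq(X, los, rcond=None)
b0, b1, b2 = coef
# b1 < 0 would indicate that more daylight is associated with shorter stays.
```

The sign and magnitude of the daylight coefficient are what such a model interrogates; the thesis additionally identifies the range of daylight intensities over which the reduction in LoS is expected.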

    IPCFA: A Methodology for Acquiring Forensically-Sound Digital Evidence in the Realm of IaaS Public Cloud Deployments

    Cybercrimes and digital security breaches are on the rise: savvy businesses and organizations of all sizes must ready themselves for the worst. Cloud computing has become the new normal, opening even more doors for cybercriminals to commit crimes that are not easily traceable. The pace of technology adoption exceeds the speed at which the cybersecurity community and law enforcement agencies (LEAs) can devise countermeasures to investigate and prosecute such criminals. While presenting defensible digital evidence in courts of law is already complex, it gets more complicated when the crime is tied to public cloud computing, where storage, network, and computing resources are shared and dispersed over multiple geographical areas. Investigating such crimes involves collecting evidence from the public cloud in a manner sound enough for court. Digital evidence admissibility in U.S. courts is governed predominantly by the Federal Rules of Evidence and the Federal Rules of Civil Procedure. Evidence authenticity can be challenged under the Daubert test, which evaluates the forensic process that produced the presented evidence. Existing digital forensics models, methodologies, and processes have not adequately addressed crimes that take place in the public cloud. Only in late 2020 did the Scientific Working Group on Digital Evidence (SWGDE) publish a document shedding light on best practices for collecting evidence from cloud providers. Yet SWGDE's publication does not address the gap between the technology and the legal system when it comes to evidence admissibility: the document is high level, focusing on law enforcement processes such as issuing subpoenas and preservation orders to the cloud provider. This research proposes IaaS Public Cloud Forensic Acquisition (IPCFA), a methodology for acquiring forensically-sound evidence from public cloud IaaS deployments.
IPCFA focuses on bridging the gap between the legal and technical sides of evidence authenticity to help produce admissible evidence that can withstand scrutiny in U.S. courts. Grounded in design science research (DSR), the methodology is rigorously evaluated using two hypothetical scenarios for crimes that take place in the public cloud. The first scenario takes place in AWS and is walked through hypothetically; the second demonstrates IPCFA's applicability and effectiveness on Azure Cloud. Both cases are evaluated using a rubric built from the federal and civil digital evidence requirements and international best practices for digital evidence, showing the effectiveness of IPCFA in generating cloud evidence sound enough to be considered admissible in court.
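A rubric evaluation of the kind described can be sketched as a weighted checklist: each admissibility criterion is checked off and the score is the fraction of total weight satisfied. The criteria names and weights below are illustrative placeholders, not the actual rubric built in this research.

```python
# Hypothetical admissibility rubric: each criterion is a yes/no check
# with a weight reflecting its importance. Criteria and weights are
# invented for illustration only.
RUBRIC = {
    "acquisition process documented": 3,
    "hashes recorded at acquisition": 3,
    "chain of custody maintained": 3,
    "method peer-reviewed and testable (Daubert)": 2,
    "tool error rate known": 1,
}

def score(evidence_checks: dict) -> float:
    """Score evidence against the rubric as a value in [0, 1]."""
    total = sum(RUBRIC.values())
    met = sum(w for crit, w in RUBRIC.items() if evidence_checks.get(crit))
    return met / total

# Example: an acquisition that satisfied everything except the
# error-rate criterion.
checks = {c: True for c in RUBRIC}
checks["tool error rate known"] = False
s = score(checks)
```

The same pattern generalizes to any checklist-based evaluation: the weights encode which requirements (e.g. the Daubert factors) matter most to admissibility.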

    Designing and Operating Safe and Secure Transit Systems: Assessing Current Practices in the United States and Abroad, MTI Report 04-05

    Public transit systems around the world have for decades served as a principal venue for terrorist acts. Today, transit security is widely viewed as an important public policy issue and is a high priority at most large transit systems and at smaller systems operating in large metropolitan areas. Research on transit security in the United States has mushroomed since 9/11; this study is part of that new wave of research. This study contributes to our understanding of transit security by (1) reviewing and synthesizing nearly all previously published research on transit terrorism; (2) conducting detailed case studies of transit systems in London, Madrid, New York, Paris, Tokyo, and Washington, D.C.; (3) interviewing federal officials here in the United States responsible for overseeing transit security and transit industry representatives both here and abroad to learn about efforts to coordinate and finance transit security planning; and (4) surveying 113 of the largest transit operators in the United States. 
Our major findings include: (1) the threat of transit terrorism is probably not universal—most major attacks in the developed world have been on the largest systems in the largest cities; (2) this asymmetry of risk does not square with fiscal politics that seek to spread security funding among many jurisdictions; (3) transit managers are struggling to balance the costs and (uncertain) benefits of increased security against the costs and (certain) benefits of attracting passengers; (4) coordination and cooperation between security and transit agencies are improving, but far from complete; (5) enlisting passengers in surveillance has benefits, but fearful passengers may stop using public transit; (6) the role of crime prevention through environmental design in security planning is waxing; and (7) given the uncertain effectiveness of antitransit terrorism efforts, the most tangible benefits of increased attention to and spending on transit security may be a reduction in transit-related person and property crimes.

    Web attack risk awareness with lessons learned from high interaction honeypots

    Master's thesis, Segurança Informática (Information Security), Universidade de Lisboa, Faculdade de Ciências, 2009. With the evolution of web 2.0, the majority of enterprises deploy their business over the Internet using web applications. These applications carry important data with crucial requirements such as confidentiality, integrity and availability. The loss of those properties directly influences the business, putting it at risk. Risk awareness provides the know-how necessary to act towards its mitigation.
In this thesis a collection of high-interaction web honeypots is deployed, using multiple applications and diverse operating systems, in order to analyse attacker behaviour. The use of virtualization environments, along with widely used honeypot monitoring tools, provides the forensic information that helps the research community study the modus operandi of attackers, gather the latest exploits and malicious tools, and develop adequate safeguards against the majority of attacking techniques. Using the detailed attack information gathered with the web honeypots, attacker behaviour is classified into different attack profiles in order to analyse the risk mitigation safeguards needed to deal with business losses. Different security frameworks commonly used by enterprises are analysed to evaluate the benefits of honeypot security concepts in responding to each framework's requirements and consequently mitigating the risk.
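The profiling step, classifying observed requests into attack profiles, can be sketched with simple signature matching over honeypot logs. The profile names, patterns, and log lines below are illustrative assumptions; the thesis derives its profiles from the actual forensic data the honeypots collected.

```python
import re

# Illustrative signatures for a few common web attack profiles.
# Real honeypot analysis would use far richer behavioural features.
PROFILES = [
    ("sql_injection", re.compile(r"union\s+select|' or 1=1|--", re.I)),
    ("path_traversal", re.compile(r"\.\./|%2e%2e%2f", re.I)),
    ("scanner", re.compile(r"nikto|nmap|sqlmap", re.I)),
]

def classify(request_line: str) -> str:
    """Return the first matching attack profile, or 'unclassified'."""
    for name, pattern in PROFILES:
        if pattern.search(request_line):
            return name
    return "unclassified"

# Hypothetical log lines of the kind a web honeypot might record.
log = [
    "GET /index.php?id=1' OR 1=1-- HTTP/1.1",
    "GET /../../etc/passwd HTTP/1.1",
    "GET / HTTP/1.1 User-Agent: sqlmap/1.7",
    "GET /about.html HTTP/1.1",
]
labels = [classify(line) for line in log]
```

Counting how often each profile appears then indicates which safeguards mitigate the largest share of observed attacks, which is the risk-prioritisation idea the thesis pursues.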