
    A Threat to Cyber Resilience: A Malware Rebirthing Botnet

    This paper presents a threat to cyber resilience in the form of a conceptual model of a malware rebirthing botnet which can be used in a variety of scenarios. It can be used to collect existing malware and rebirth it with new functionality and signatures that will avoid detection by AV software and hinder analysis. The botnet can then use the customized malware to target an organization with an orchestrated attack from the member machines in the botnet for a variety of malicious purposes, including information warfare applications. Alternatively, it can also be used to inject known malware signatures into otherwise non-malicious code and traffic, overloading the sensors and processing systems employed by intrusion detection and prevention systems to create a denial of confidence in the sensors and detection systems. This could be used as a force multiplier in asymmetric warfare applications to create confusion and distraction whilst attacks are made on other defensive fronts.

    Accuracy when inferential statistics are used as measurement tools

    BACKGROUND: Inferential statistical tests that approximate measurement are called acceptance procedures. The procedure includes type 1 error (falsely rejecting the null hypothesis) and type 2 error (failing to reject the null hypothesis when the alternative should be supported). This approach involves repeated sampling from a distribution with established parameters such that the probabilities of these errors can be ascertained. With low error probabilities the procedure has the potential to approximate measurement. How closely this procedure approximates measurement was examined. FINDINGS: A Monte Carlo procedure set the type 1 error at p = 0.05 and the type 2 error at either p = 0.20 or p = 0.10 for effect size values of d = 0.2, 0.5, and 0.8. The resultant values are approximately 15% and 6.25% larger than the effect sizes entered into the analysis for type 2 error rates of p = 0.20 and p = 0.10, respectively. CONCLUSIONS: Acceptance procedures approximate values wherein a decision could be made. In a health district, a deviation at a particular level could signal a change in health. The approximations could be reasonable in some circumstances, but if more accurate measures are desired a deviation could be reduced by the percentage appropriate for the power. The tradeoff for such a procedure is an increase in the type 1 error rate and a decrease in the type 2 error rate.
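
    The inflation described in the findings can be reproduced with a short Monte Carlo simulation. The sketch below is illustrative only and is not the authors' code: it draws repeated two-group samples, keeps only those in which a t test rejects the null hypothesis at alpha = 0.05 (the "acceptance" step), and reports how far the average significant effect size sits above the true value. The sample size of 64 per group is a hypothetical choice giving roughly 80% power for d = 0.5.

        import numpy as np
        from scipy import stats

        # Illustrative sketch (not the authors' code): estimate how far the average
        # effect size of accepted (statistically significant) samples sits above the
        # true effect size that generated the data.
        def significant_d_inflation(true_d, n_per_group, n_sims=20000, alpha=0.05, seed=1):
            rng = np.random.default_rng(seed)
            kept = []
            for _ in range(n_sims):
                control = rng.normal(0.0, 1.0, n_per_group)
                treatment = rng.normal(true_d, 1.0, n_per_group)
                t_stat, p_value = stats.ttest_ind(treatment, control)
                if p_value < alpha and t_stat > 0:  # keep only significant samples
                    pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
                    kept.append((treatment.mean() - control.mean()) / pooled_sd)
            mean_d = float(np.mean(kept))
            return mean_d, 100.0 * (mean_d - true_d) / true_d

        # 64 per group: hypothetical choice giving roughly 80% power for d = 0.5
        mean_d, pct_over = significant_d_inflation(true_d=0.5, n_per_group=64)
        print(f"mean significant d = {mean_d:.3f}  ({pct_over:.1f}% above the true d)")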

    Significance Testing Needs a Taxonomy: Or How the Fisher, Neyman-Pearson Controversy Resulted in the Inferential Tail Wagging the Measurement Dog

    Accurate measurement and a cutoff probability with inferential statistics are not wholly compatible. Fisher understood this when he developed the F test to deal with measurement variability and to make judgments on manipulations that may be worth further study. Neyman and Pearson focused on modeled distributions whose parameters were highly determined and concluded that inferential judgments following an F test could be made with accuracy because the distribution parameters were determined. Neyman and Pearson’s approach in the application of statistical analyses using alpha and beta error rates has played a dominant role in guiding inferential judgments, appropriately in highly determined situations and inappropriately in scientific exploration. Fisher tried to explain the different situations but, in part due to some obscure wording, generated a long-standing dispute that has left the importance of Fisher’s p < .05 criterion not fully understood and a general endorsement of the Neyman and Pearson error rate approach. Problems were compounded when power calculations based on effect sizes from significant results entered into exploratory science. To understand in a practical sense when each approach should be used, a dimension reflecting varying levels of certainty or knowledge of population distributions is presented. The dimension provides a taxonomy of statistical situations and appropriate approaches by delineating four zones that represent how well the underlying population of interest is defined, ranging from exploratory situations to highly determined populations.

    The Precision of Effect Size Estimation From Published Psychological Research: Surveying Confidence Intervals

    Confidence interval (CI) widths were calculated for reported Cohen's d standardized effect sizes and examined in two automated surveys of published psychological literature. The first survey reviewed 1,902 articles from Psychological Science. The second survey reviewed a total of 5,169 articles from across the following four APA journals: Journal of Abnormal Psychology, Journal of Applied Psychology, Journal of Experimental Psychology: Human Perception and Performance, and Developmental Psychology. The median CI width for d was greater than 1 in both surveys. Hence, CI widths were, as Cohen (1994) speculated, embarrassingly large. Additional exploratory analyses revealed that CI widths varied across psychological research areas and that CI widths were not discernibly decreasing over time. The theoretical implications of these findings are discussed along with ways of reducing the CI widths and thus improving the precision of effect size estimation.
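
    To see why reported CI widths for d so readily exceed 1, the arithmetic can be sketched directly. The snippet below uses the common large-sample approximation to the standard error of Cohen's d; this approximation and the cell sizes shown are assumptions for illustration, not the exact method or data of the survey.

        import math

        # Approximate full width of a 95% confidence interval for Cohen's d,
        # using the common large-sample standard-error approximation.
        def ci_width_for_d(d, n1, n2, z=1.96):
            se = math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
            return 2 * z * se

        # Hypothetical but typical cell sizes for an experimental psychology study
        print(ci_width_for_d(d=0.5, n1=20, n2=20))    # ~1.26: wider than the effect itself
        print(ci_width_for_d(d=0.5, n1=100, n2=100))  # ~0.56: still broad at 100 per group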

    Goal-orientated cognitive rehabilitation for dementias associated with Parkinson's disease―A pilot randomised controlled trial

    OBJECTIVE: To examine the appropriateness and feasibility of cognitive rehabilitation for people with dementias associated with Parkinson's in a pilot randomised controlled study. METHODS: This was a single-blind pilot randomised controlled trial of goal-oriented cognitive rehabilitation for dementias associated with Parkinson's. After goal setting, participants were randomised to cognitive rehabilitation (n = 10), relaxation therapy (n = 10), or treatment-as-usual (n = 9). Primary outcomes were ratings of goal attainment and satisfaction with goal attainment. Secondary outcomes included quality of life, mood, cognition, health status, everyday functioning, and carers' ratings of goal attainment and their own quality of life and stress levels. Assessments were at 2 and 6 months following randomisation. RESULTS: At 2 months, cognitive rehabilitation was superior to treatment-as-usual and relaxation therapy for the primary outcomes of self-rated goal attainment (d = 1.63 and d = 1.82, respectively) and self-rated satisfaction with goal attainment (d = 2.04 and d = 1.84). At 6 months, cognitive rehabilitation remained superior to treatment-as-usual (d = 1.36) and relaxation therapy (d = 1.77) for self-rated goal attainment. Cognitive rehabilitation was superior to treatment-as-usual and/or relaxation therapy in a number of secondary outcomes at 2 months (mood, self-efficacy, social domain of quality of life, carers' ratings of participants' goal attainment) and at 6 months (delayed recall, health status, quality of life, carer ratings of participants' goal attainment). Carers receiving cognitive rehabilitation reported better quality of life and health status, and lower stress, than those allocated to treatment-as-usual. CONCLUSIONS: Cognitive rehabilitation is feasible and potentially effective for dementias associated with Parkinson's disease.

    Lessons Learned from an Investigation into the Analysis Avoidance Techniques of Malicious Software

    This paper outlines a number of key lessons learned from an investigation into the techniques malicious executable software can employ to hinder digital forensic examination. Malware signature detection has been recognised by researchers to be far less than ideal. Thus, the forensic analyst may be required to manually analyse suspicious files. However, in order to hinder the forensic analyst, hide its true intent, and avoid detection, modern malware can be wrapped with packers or protectors and layered with a plethora of anti-analysis techniques. This requires the forensic analyst to develop static and dynamic analysis skills tailored to navigating a hostile environment. To this end, the analyst must understand the anti-analysis techniques that can be employed and how to mitigate them, the limitations of existing tools and how to extend them, and how to employ an appropriate analysis methodology to uncover the intent of the malware.
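
    As one concrete example of the static triage skills the paper calls for, the sketch below flags potentially packed or protected executables by measuring the byte entropy of PE sections; sections approaching 8 bits per byte are commonly compressed or encrypted. The pefile dependency, the 7.2-bit threshold, and the sample path are assumptions for illustration, not tools or values taken from the paper.

        import math
        from collections import Counter

        import pefile  # third-party PE parser, assumed available

        HIGH_ENTROPY = 7.2  # illustrative threshold in bits/byte

        def shannon_entropy(data: bytes) -> float:
            """Shannon entropy of a byte string, in bits per byte."""
            if not data:
                return 0.0
            total = len(data)
            return -sum((c / total) * math.log2(c / total) for c in Counter(data).values())

        def flag_packed_sections(path: str) -> None:
            """Report sections whose entropy suggests a packer or protector was used."""
            pe = pefile.PE(path)
            for section in pe.sections:
                name = section.Name.rstrip(b"\x00").decode(errors="replace")
                entropy = shannon_entropy(section.get_data())
                marker = "suspicious" if entropy >= HIGH_ENTROPY else "ok"
                print(f"{name:<10} entropy={entropy:.2f} bits/byte  [{marker}]")

        # Hypothetical usage:
        # flag_packed_sections("suspect.exe")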

    Malware Forensics: Discovery of the Intent of Deception

    Malicious software (malware) has a wide variety of analysis avoidance techniques that it can employ to hinder forensic analysis. Although legitimate software can incorporate the same analysis avoidance techniques to provide a measure of protection against reverse engineering and to protect intellectual property, malware invariably makes much greater use of such techniques to make detailed analysis labour intensive and very time consuming. Analysis avoidance techniques are so heavily used by malware that their detection could be a very good indicator of the presence of malicious intent. However, analysis tools tend to focus on hiding their own presence from the malware rather than on detecting and recording the analysis avoidance techniques it employs. In addition, the coverage of anti-anti-analysis techniques in common tools and plugins falls well short of the number of analysis avoidance techniques that exist. The purpose of this paper is to suggest that the discovery of the intent of deception may be a very good indicator of an underlying malicious objective of the software under investigation.

    Real-time imaging of Leishmania mexicana-infected early phagosomes: a study using primary macrophages generated from green fluorescent protein-Rab5 transgenic mice

    The small GTPase Rab5 is a key regulator of endosome/phagosome maturation and in intravesicular infections marks a phagosome stage at which decisions over pathogen replication or destruction are integrated. It is currently unclear whether Leishmania-infected phagosomes uniformly pass through a Rab5+ stage on their intracellular path to compartments with late endosomal/early lysosomal characteristics. Differences in routes and final compartments could have consequences for accessibility to antileishmanial drugs. Here, we generated a unique gfp-rab5 transgenic mouse model to visualize Rab5 recruitment to early parasite-containing phagosomes in primary host cells. Using real-time fluorescence imaging of phagosomes carrying Leishmania mexicana, we determined that parasite-infested phagosomes follow a uniform sequence of transient Rab5 recruitment. Residence in Rab5+ compartments was much shorter than for phagosomes harboring latex beads. Furthermore, a comparative analysis of parasite life-cycle stages and mutants deficient in lpg1, the gene encoding the enzyme required for synthesis of the dominant surface lipophosphoglycan, indicated that parasite surface ligands and host cell receptors modulate pathogen residence times in Rab5+ phagosomes, but, as far as tested, had no significant effect on intracellular L. mexicana survival or replication.—Lippuner, C., Paape, D., Paterou, A., Brand, J., Richardson, M., Smith, A. J., Hoffmann, K., Brinkmann, V., Blackburn, C., Aebischer, T. Real-time imaging of Leishmania mexicana-infected early phagosomes: a study using primary macrophages generated from green fluorescent protein-Rab5 transgenic mice.