
    Proposing a secure component-based-application logic and system’s integration testing approach

    Software engineering has moved from traditional methods of building enterprise applications to component-based development for distributed systems. Component-based methods have matured over the last few years for the design and rapid development of such systems, yet practical e-commerce distributed systems remain a highly rated target for intruders despite the security features deployed in them. Most research has focused on web application services, which account for a large share of present software; component-based software in the middle tier, which implements application logic and enables rapid development, also opens opportunities for security breaches. This paper focuses on a pressing issue: the weakest link in component-based distributed systems, logical attacks, which cannot be detected by any intrusion detection system within middle-tier e-commerce distributed applications. We propose an approach for securely designing application logic for distributed systems that addresses this logical vulnerability issue.
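    As a hedged illustration of the kind of logic flaw this line of work targets (the order structure, prices, and helper names below are invented for the sketch, not taken from the paper): a middle-tier handler that trusts a client-supplied total is open to a logical attack that looks syntactically valid and so escapes intrusion detection, whereas recomputing the value server-side closes the gap.

```python
# Hypothetical sketch of a middle-tier logic flaw and its secure counterpart.
# Catalog prices, order structure, and charge() are invented for illustration.

CATALOG = {"widget": 25.00, "gadget": 40.00}

def charge(customer, amount):
    # Placeholder for the real payment component.
    return f"charged {customer} ${amount:.2f}"

def insecure_checkout(order):
    # Logic flaw: the client-supplied "total" is trusted as-is.
    # A request with total=0.01 is still well-formed, so signature-based
    # intrusion detection never fires.
    return charge(order["customer"], order["total"])

def secure_checkout(order):
    # Secure design: recompute the total from server-side catalog data
    # and reject any order whose claimed total disagrees.
    expected = sum(CATALOG[item] * qty for item, qty in order["items"].items())
    if abs(expected - order["total"]) > 0.005:
        raise ValueError("order total does not match catalog prices")
    return charge(order["customer"], expected)
```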

    Observing the clouds : a survey and taxonomy of cloud monitoring

    This research was supported by a Royal Society Industry Fellowship and an Amazon Web Services (AWS) grant. Date of acceptance: 10/12/2014. Monitoring is an important aspect of designing and maintaining large-scale systems. Cloud computing presents a unique set of challenges to monitoring, including on-demand infrastructure, unprecedented scalability, rapid elasticity and performance uncertainty. There is a wide range of monitoring tools originating from cluster and high-performance computing, grid computing and enterprise computing, as well as a series of newer bespoke tools designed exclusively for cloud monitoring. These tools share a number of common elements and designs, which address the demands of cloud monitoring to varying degrees. This paper performs an exhaustive survey of contemporary monitoring tools, from which we derive a taxonomy that examines how effectively existing tools and designs meet the challenges of cloud monitoring. We conclude by examining the socio-technical aspects of monitoring, and investigate the engineering challenges and practices behind implementing monitoring strategies for cloud computing.
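    A hedged sketch of the kind of lightweight, push-based agent that bespoke cloud monitoring tools are built around (the collector URL, payload schema, and interval are assumptions, not any surveyed tool's API; reading /proc assumes a Linux host): it samples a couple of cheap host metrics and pushes them to a collector, the pattern that rapid elasticity tends to favor over heavyweight pull-based polling.

```python
# Minimal push-based monitoring probe, for illustration only.
# Collector URL and payload schema are hypothetical; /proc assumes Linux.
import json
import time
import urllib.request

COLLECTOR_URL = "http://collector.example:8080/metrics"  # assumed endpoint

def sample_metrics():
    # Read two cheap host-level metrics; real agents sample many more.
    with open("/proc/loadavg") as f:
        load1 = float(f.read().split()[0])
    with open("/proc/meminfo") as f:
        mem_total_kb = int(f.readline().split()[1])
    return {"ts": time.time(), "load1": load1, "mem_total_kb": mem_total_kb}

def push(metrics):
    data = json.dumps(metrics).encode()
    req = urllib.request.Request(COLLECTOR_URL, data=data,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    while True:
        push(sample_metrics())
        time.sleep(30)  # short, cheap probes suit elastic, short-lived instances
```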

    Supervisory Control System Architecture for Advanced Small Modular Reactors


    Cybersecurity for Manufacturers: Securing the Digitized and Connected Factory

    As manufacturing becomes increasingly digitized and data-driven, manufacturers will find themselves at serious risk. Although there has yet to be a major successful cyberattack on a U.S. manufacturing operation, threats continue to rise. The complexities of multi-organizational dependencies and data management in modern supply chains mean that vulnerabilities are multiplying. There is widespread agreement among manufacturers, government agencies, cybersecurity firms, and leading academic computer science departments that U.S. industrial firms are doing too little to address these looming challenges. Unfortunately, manufacturers in general do not see themselves as being at particular risk. This lack of recognition of the threat may represent the greatest risk of cybersecurity failure for manufacturers. Public and private stakeholders must act before a significant attack on U.S. manufacturers provides a wake-up call. Cybersecurity for the manufacturing supply chain is a particularly serious need. Manufacturing supply chains are connected, integrated, and interdependent; security of the entire supply chain depends on security at the local factory level. Increasing digitization in manufacturing, especially with the rise of Digital Manufacturing, Smart Manufacturing, the Smart Factory, and Industry 4.0, combined with broader market trends such as the Internet of Things (IoT), exponentially increases connectedness. At the same time, the diversity of manufacturers, from large, sophisticated corporations to small job shops, creates weakest-link vulnerabilities that can be addressed most effectively by public-private partnerships. Experts consulted in the development of this report called for more holistic thinking in industrial cybersecurity: improvements to technologies, management practices, workforce training, and learning processes that span units and supply chains. Solving the emerging security challenges will require commitment to continuous improvement, as well as investments in research and development (R&D) and threat-awareness initiatives. This holistic thinking should be applied across interoperating units and supply chains. National Science Foundation, Grant No. 1552534. https://deepblue.lib.umich.edu/bitstream/2027.42/145442/1/MForesight_CybersecurityReport_Web.pd

    Command & Control: Understanding, Denying and Detecting - A review of malware C2 techniques, detection and defences

    In this survey, we first briefly review the current state of cyber attacks, highlighting significant recent changes in how and why such attacks are performed. We then investigate the mechanics of malware command and control (C2) establishment: we provide a comprehensive review of the techniques used by attackers to set up such a channel and to hide its presence from the attacked parties and the security tools they use. We then switch to the defensive side of the problem, and review approaches that have been proposed for the detection and disruption of C2 channels. We also map such techniques to widely adopted security controls, emphasizing gaps or limitations (and success stories) in current best practices. Comment: Work commissioned by CPNI, available at c2report.org. 38 pages. Listing abstract compressed from version appearing in report.
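    One family of detection techniques surveys like this cover is traffic analysis for beaconing: implants that check in with their C2 server at near-regular intervals stand out statistically from human-driven traffic. A hedged sketch of such a heuristic (the thresholds and input format are illustrative, not drawn from the report):

```python
# Toy beaconing detector: flags (source, destination) pairs whose connection
# inter-arrival times are suspiciously regular. Thresholds are illustrative.
from collections import defaultdict
from statistics import mean, pstdev

def detect_beacons(connections, min_events=10, max_cv=0.1):
    """connections: iterable of (timestamp, src, dst), timestamps in seconds."""
    by_pair = defaultdict(list)
    for ts, src, dst in connections:
        by_pair[(src, dst)].append(ts)

    suspects = []
    for pair, times in by_pair.items():
        if len(times) < min_events:
            continue
        times.sort()
        gaps = [b - a for a, b in zip(times, times[1:])]
        avg = mean(gaps)
        # Coefficient of variation: low values mean near-constant intervals,
        # typical of automated check-ins rather than interactive traffic.
        if avg > 0 and pstdev(gaps) / avg < max_cv:
            suspects.append((pair, avg))
    return suspects
```

    Real detectors also have to tolerate the jitter and sleep randomization that attackers add precisely to defeat this kind of regularity check.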

    Automated histopathological analyses at scale

    Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2017. Cataloged from PDF version of thesis. Includes bibliographical references (pages 68-73). Histopathology is the microscopic examination of processed human tissues to diagnose conditions like cancer, tuberculosis, anemia and myocardial infarctions. The diagnostic procedure is, however, very tedious, time-consuming and prone to misinterpretation. It also requires highly trained pathologists, making it unsuitable for large-scale screening in resource-constrained settings, where experts are scarce and expensive. In this thesis, we present a software system for automated screening, backed by deep learning algorithms. This cost-effective, easily scalable solution can be operated by minimally trained health workers and would extend the reach of histopathological analyses to settings such as rural villages, mass-screening camps and mobile health clinics. With metastatic breast cancer as our primary case study, we describe how the system could be used to test for the presence of a tumor, determine the precise location of a lesion, and assess the severity stage of a patient. We examine how the algorithms are combined into an end-to-end pipeline for use by hospitals, doctors and clinicians on a Software as a Service (SaaS) model. Finally, we discuss potential deployment strategies for the technology, as well as an analysis of the market and distribution chain in the specific case of the current Indian healthcare ecosystem. by Mrinal Mohit. S.M.
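    A hedged sketch of the patch-based analysis such pipelines typically perform (the tile size, threshold, and stand-in classifier below are assumptions, not the thesis's configuration): a whole-slide image is cut into tiles, each tile is scored for tumor probability, and the scores are reassembled into a heatmap that supports both a presence test and coarse lesion localization.

```python
# Illustrative patch-based scoring pipeline for a whole-slide image (WSI).
# The classifier is a stand-in; a real system would use a trained CNN.
import numpy as np

TILE = 256  # assumed patch size in pixels

def score_patch(patch):
    # Placeholder for a trained model returning P(tumor) for one patch.
    # Here: a trivial darkness proxy so the sketch runs end to end.
    return float(patch.mean() < 100) * 0.9

def heatmap(slide, tile=TILE):
    """slide: 2D grayscale numpy array standing in for one WSI level."""
    rows, cols = slide.shape[0] // tile, slide.shape[1] // tile
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            patch = slide[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
            scores[r, c] = score_patch(patch)
    return scores

def analyze(slide, threshold=0.5):
    probs = heatmap(slide)
    present = bool((probs > threshold).any())                  # tumor presence test
    location = np.unravel_index(probs.argmax(), probs.shape)   # coarse lesion locus
    return present, location, probs
```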

    Efficiency and Automation in Threat Analysis of Software Systems

    Context: Security is a growing concern in many organizations. Industries developing software systems plan for security early on to minimize expensive code refactorings after deployment. In the design phase, teams of experts routinely analyze the system architecture and design to find potential security threats and flaws. After the system is implemented, the source code is often inspected to determine its compliance with the intended functionalities. Objective: The goal of this thesis is to improve the performance of security design analysis techniques (in the design and implementation phases) and to support practitioners with automation and tool support. Method: We conducted empirical studies to build an in-depth understanding of existing threat analysis techniques (a systematic literature review and controlled experiments). We also conducted empirical case studies with industrial participants to validate our attempt at improving the performance of one technique. Further, we validated our proposal for automating the inspection of security design flaws by organizing workshops with participants (under controlled conditions) and subsequent performance analysis. Finally, we relied on a series of experimental evaluations to assess the quality of the proposed approach for automating security compliance checks. Findings: We found that the eSTRIDE approach can help focus the analysis and produce twice as many high-priority threats in the same time frame. We also found that reasoning about security in an automated fashion requires extending the existing notations with more precise security information. In a formal setting, the minimal model extensions for doing so include security contracts for system nodes handling sensitive information; the formal analysis can, to some extent, provide completeness guarantees. For a graph-based detection of flaws, the minimal required model extensions include data types and security solutions. In such a setting, the automated analysis can help reduce the number of overlooked security flaws. Finally, we suggest defining a correspondence mapping between the design model elements and the implemented constructs. We found that such a mapping is a key enabler for automatically checking the security compliance of the implemented system with the intended design. The key to achieving this is two-fold. First, a heuristics-based search is paramount to limit the manual effort required to define the mapping. Second, it is important to analyze the implemented data flows and compare them to the data flows stipulated by the design.
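    As a hedged illustration of the graph-based flaw detection described above (the flow representation, labels, and the single rule are invented for this sketch, not the thesis's notation): once a design model is extended with data types and security solutions, it can be traversed automatically to flag flows that carry sensitive data without any protecting mechanism attached.

```python
# Toy graph-based check over a design-level data-flow model.
# Model structure, labels, and the rule are illustrative only.

# Each flow: (source, target, data_type, security_solutions)
FLOWS = [
    ("browser", "web_app", "credentials", {"tls"}),
    ("web_app", "analytics", "credentials", set()),   # flaw: unprotected sensitive flow
    ("web_app", "database", "order_record", {"tls"}),
]

SENSITIVE_TYPES = {"credentials", "payment_data"}

def find_flaws(flows):
    """Flag flows that carry sensitive data types with no security solution attached."""
    flaws = []
    for src, dst, data_type, solutions in flows:
        if data_type in SENSITIVE_TYPES and not solutions:
            flaws.append(f"{src} -> {dst}: '{data_type}' flows without protection")
    return flaws

if __name__ == "__main__":
    for flaw in find_flaws(FLOWS):
        print(flaw)
```

    A compliance check in the same spirit would then compare flows like these, recovered from the implementation, against the flows stipulated by the design model.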