    A Method for Visualizing the Structural Complexity of Organizational Architectures

    To achieve high levels of performance and efficiency, contemporary aerospace systems must become increasingly complex. While complexity management traditionally focuses on a product’s components and their interconnectedness, organizational representation is just as essential to complexity analysis. This thesis addresses the organizational aspect of complexity through an Organizational Complexity Metric (OCM) that aids complexity management. The OCM extends Sinha’s structural complexity metric for product architectures into a metric that can be applied to organizations. Using nested numerical design structure matrices (DSMs), a compact visual representation of organizational complexity was developed. Within the nested numerical DSM, existing organizational datasets are used to quantify the complexity of both organizational system components and their interfaces. The OCM was applied to a hypothetical system example as well as an existing aerospace organizational architecture. In developing the OCM, this thesis assumed that each dataset was collected in a statistically sufficient manner and correlates reasonably with system complexity. The thesis recognizes that human factors are not fully represented and aims to provide a platform for expansion: before a true organizational complexity metric can be applied to real systems, additional human considerations must be incorporated. These considerations differ from organization to organization and should be accounted for before implementation in a working system. The visualization of organizational complexity uses a color gradient to show the relative complexity density of different parts of the organization.
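    Sinha's structural complexity metric, which the OCM extends, combines three terms: summed component complexities, summed interface complexities, and a topological factor based on the graph energy of the DSM's binary adjacency matrix (C = C1 + C2·C3). Below is a minimal sketch of that baseline product-architecture computation; the alpha/beta values and the example DSM are hypothetical, and the nested organizational datasets that distinguish the OCM are not modeled here.

```python
import numpy as np

def structural_complexity(alpha, beta, A):
    """Sinha's structural complexity metric: C = C1 + C2 * C3.

    alpha : (n,) component complexities
    beta  : (n, n) interface complexities
    A     : (n, n) binary DSM adjacency matrix (A[i, j] = 1 if i and j interact)
    """
    n = len(alpha)
    C1 = alpha.sum()                                 # summed component complexity
    C2 = (beta * A).sum()                            # summed interface complexity
    E_A = np.linalg.svd(A, compute_uv=False).sum()   # graph energy: sum of singular values
    C3 = E_A / n                                     # topological complexity per component
    return C1 + C2 * C3

# Hypothetical 3-component system connected in a chain: 0-1-2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
alpha = np.array([1.0, 2.0, 1.5])   # assumed component complexities
beta = 0.1 * np.ones((3, 3))        # assumed uniform interface complexity
print(structural_complexity(alpha, beta, A))
```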

    AFMB-Net: DeepFake Detection Network Using Heart Rate Analysis

    With advances in deepfake generation technology, it is becoming increasingly difficult to detect deepfakes. Deepfakes can be used for many malicious purposes, such as blackmail, political manipulation, and social media disinformation, which can spread misinformation widely and harm the reputation of an individual or an institution. It has therefore become important to identify deepfakes effectively. While many machine learning techniques exist for this task, they cannot keep pace with the rapidly improving GAN technology used to generate deepfakes. Our project aims to identify deepfakes using machine learning combined with heart rate analysis. The heart rate identified by our model is unique to each individual and cannot be spoofed or imitated by a GAN, making the approach resilient to improving GAN technology. To solve the deepfake detection problem, we employ various machine learning models along with heart rate analysis to detect deepfakes.
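    The abstract does not detail AFMB-Net's architecture, but the heart-rate cue it relies on is typically recovered via remote photoplethysmography (rPPG): subtle periodic color changes in facial skin caused by blood flow. The sketch below illustrates that general idea with a naive FFT-based pulse estimate from a stack of face crops; the function name, frame rate, and band limits are illustrative assumptions, not the paper's method.

```python
import numpy as np

def estimate_bpm(face_frames, fps=30.0):
    """Estimate pulse rate from face crops via a naive rPPG pipeline.

    face_frames : (T, H, W, 3) RGB frames of the facial region
    Returns the dominant frequency in the physiological pulse band, in beats/min.
    """
    # Spatially average the green channel, which carries the strongest pulse signal.
    signal = face_frames[:, :, :, 1].mean(axis=(1, 2)).astype(float)
    signal -= signal.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to ~42-240 bpm; a GAN rarely synthesizes a consistent periodic
    # component in this band, which is the detection cue.
    band = (freqs >= 0.7) & (freqs <= 4.0)
    pulse_freq = freqs[band][np.argmax(spectrum[band])]
    return pulse_freq * 60.0

# Synthetic demo: frames whose brightness pulses at 1.2 Hz (72 bpm).
t = np.arange(150) / 30.0
frames = (128 + 10 * np.sin(2 * np.pi * 1.2 * t))[:, None, None, None] * np.ones((1, 8, 8, 3))
print(estimate_bpm(frames))  # ~72.0
```

    A real/fake classifier could then threshold on, or learn from, the consistency of this recovered signal across facial regions.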

    Multisensor Data Fusion for Cultural Heritage Assets Monitoring and Preventive Conservation

    This paper presents the first phase of an ongoing interdisciplinary research project aimed at codifying procedures for the control and non-destructive analysis of the conservation status of cultural heritage (CH) artefacts to guide preventive preservation actions. It specifically reports the results of an experiment aimed at defining the procedural phases of the semantic-informative enrichment of a digital architectural model, in which the morpho-metric components acquired with instrumental survey techniques are linked with cognitive and technical aspects (microclimatic, material, and geometric deviation data), with the aim of making the model a support for simulating scenarios connected to preventive preservation programmes. The research was carried out on the church of San Michele Arcangelo in Padula, which is affected by plaster detachment from the frescoes on the intrados of its vaulted systems. The work was conceived to support a mainly qualitative assessment of a possible relationship between micro-environmental variations and visually perceived degradation phenomena, providing a first indication of the conservation status of the investigated surfaces. The analyses were conducted through algorithms and are therefore repeatable and objective. Moreover, because these processes were applied to the models derived from the architectural survey, they made the most of those outputs. By combining the algorithmic manipulation of the digital representations with the specialist’s necessary critical interpretation of the data, it was possible to direct some actions of immediate intervention and to guide the most appropriate choices for subsequent, more targeted diagnostics, reducing damage to the historical heritage.
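    One of the quantitative layers mentioned, geometric deviation data, is commonly computed as nearest-neighbour distances between point clouds surveyed at different epochs, flagging surfaces (such as a vault intrados) whose deviation exceeds a tolerance. The sketch below shows that generic computation with a k-d tree; the epoch arrays and the 5 mm threshold are hypothetical, and the paper's actual algorithms are not specified in the abstract.

```python
import numpy as np
from scipy.spatial import cKDTree

def deviation_map(reference, current, threshold=0.005):
    """Per-point geometric deviation between two survey point clouds.

    reference, current : (N, 3) and (M, 3) arrays of XYZ coordinates (metres)
    Returns nearest-neighbour distances and a mask of points whose deviation
    exceeds the tolerance (here an assumed 5 mm).
    """
    distances, _ = cKDTree(reference).query(current)
    return distances, distances > threshold

# Hypothetical epochs: a flat patch that has locally bulged by 8 mm,
# e.g. plaster beginning to detach from a vault intrados.
rng = np.random.default_rng(0)
ref = np.column_stack([rng.uniform(0, 1, 1000),
                       rng.uniform(0, 1, 1000),
                       np.zeros(1000)])
cur = ref.copy()
cur[:50, 2] += 0.008                 # local out-of-plane displacement
dist, flagged = deviation_map(ref, cur)
print(flagged.sum(), "points exceed tolerance")
```

    Deviation values of this kind, together with microclimatic readings, are the sort of attributes that can be attached to the enriched digital model described above.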