66,040 research outputs found

    Data engineering and best practices

    Get PDF
    Bologna Master's in Data Analytics for Business. This report presents the results of a study on the current state of data engineering at LGG Advisors. Analyzing existing data, we identified several key trends and challenges facing data engineers in this field. Our study's key findings include a lack of standardization and best practices for data engineering processes, a growing need for more sophisticated data management, analysis, and security tools, and a shortage of trained and experienced data engineers to meet the increasing demand for data-driven solutions. Based on these findings, we recommend several steps that LGG Advisors can take to improve its data engineering capabilities, including investing in training and education programs, adopting best practices for data management and analysis, and collaborating with other organizations to share knowledge and resources. Data security is also an essential concern for data engineers, as data breaches can have significant consequences for organizations, including financial losses, reputational damage, and regulatory penalties. In this thesis, we will review and evaluate some of the best software tools for securing data in data engineering environments. We will discuss these tools' key features and capabilities, as well as their strengths and limitations, to help data engineers choose the best software for protecting their data. The tools we will consider include encryption software, access control systems, network security tools, and data backup and recovery solutions. We will also discuss best practices for implementing and managing these tools to ensure data security in data engineering environments. We engineer data using intuition and rules of thumb; many of these rules are folklore, and given the rapid pace of technological change they must be constantly reevaluated.
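    As a concrete illustration of the encryption-software category surveyed in the thesis, the sketch below encrypts a small record batch at rest with Python's cryptography package. The library choice, the inline key handling, and the data are assumptions made here for illustration, not tools or practices evaluated in the study.

```python
# Minimal sketch, assuming the "cryptography" package as one example of
# encryption software for data at rest; keys would normally come from a
# key-management system, not be generated inline.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"customer_id,revenue\n42,1000\n"   # example record batch
token = cipher.encrypt(plaintext)               # ciphertext written to storage
restored = cipher.decrypt(token)                # pipeline decrypts on read

assert restored == plaintext
```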

    The Impact of Changing Requirements

    Get PDF
    The fundamental purpose of an Engineering Change Proposal (ECP) is to change the requirements of a contract. To build in flexibility, the acquisition practice is to estimate a dollar value to hold in reserve after the contract is awarded. There appears to be no empirically based method in the literature for estimating this ECP withhold. Using the Cost Assessment Data Enterprise (CADE) database, 533 contracts were randomly selected to build two regression models: one to predict the likelihood of a contract experiencing an ECP, and the other to determine the expected median percent increase in baseline contract cost if an ECP was likely. Results suggest that this two-step approach works well over a managed portfolio of contracts in contrast to three investigated rules of thumb. Significant drivers are the basic contract cost and the number of contract line items.
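    The two-step structure described above lends itself to a simple sketch: a classification model for the probability that a contract sees an ECP, followed by a median (quantile) regression fit only on contracts that experienced one. The feature set, model forms, and synthetic data below are illustrative assumptions, not the study's actual CADE models.

```python
# Hedged sketch of a two-step ECP withhold estimate, assuming scikit-learn
# and synthetic stand-ins for the drivers named in the abstract:
# basic contract cost and number of contract line items (CLINs).
import numpy as np
from sklearn.linear_model import LogisticRegression, QuantileRegressor

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.lognormal(mean=16, sigma=1, size=500),   # basic contract cost ($)
    rng.integers(1, 60, size=500),               # number of CLINs
])
had_ecp = rng.integers(0, 2, size=500)           # 1 if the contract saw an ECP
pct_increase = rng.gamma(2.0, 3.0, size=500)     # % cost growth (used only where an ECP occurred)

# Step 1: likelihood that a contract experiences an ECP.
step1 = LogisticRegression(max_iter=1000).fit(np.log(X), had_ecp)

# Step 2: median % increase, fit only on contracts that actually had an ECP.
mask = had_ecp == 1
step2 = QuantileRegressor(quantile=0.5, alpha=0.0).fit(np.log(X[mask]), pct_increase[mask])

new_contract = np.log([[5.0e6, 12]])             # hypothetical $5M contract with 12 CLINs
p_ecp = step1.predict_proba(new_contract)[0, 1]
withhold_pct = p_ecp * step2.predict(new_contract)[0]
print(f"P(ECP) = {p_ecp:.2f}, suggested withhold ~ {withhold_pct:.1f}% of baseline cost")
```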

    Planetary Entry Probe Dataset: Analysis and Rules of Thumb for Future Missions

    Full text link
    Since the beginning of robotic interplanetary exploration nearly six decades ago, successful atmospheric entry has been accomplished at Venus, Earth, Mars, Jupiter, and Titan. More entry probe missions are planned to Venus, Titan, and Uranus in the next decade. Atmospheric entry subjects the vehicle to rapid deceleration and aerothermal loads for which the vehicle must be designed in order to deliver its robotic instruments into the atmosphere. The design of planetary probes and their mission architecture is complex, and involves various engineering constraints such as peak deceleration, heating rate, heat load, and communications, which must be satisfied within the budget and schedule of cost-constrained mission opportunities. Engineering design data from previous entry probe missions serve as a valuable reference for designing future missions. The present study compiles an augmented version of the blue book entry probe dataset, performs a comparative analysis of the entry conditions, and provides engineering rules of thumb for the design of future missions. Using the dataset, the study proposes a new empirical correlation which aims to more accurately predict the thermal protection system mass fraction for high heat load conditions during entry and aerocapture at Uranus and Neptune.
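    The abstract does not give the form or coefficients of the new correlation, so the sketch below only illustrates how a thermal protection system (TPS) mass fraction correlation of the common power-law form f_TPS = a * Q^b can be fit to heat load data. The data points and resulting coefficients are made up for illustration.

```python
# Illustrative power-law fit of TPS mass fraction vs. integrated heat load Q.
# All values are hypothetical; they are not mission data or the paper's correlation.
import numpy as np

q_data = np.array([5e3, 2e4, 8e4, 2e5, 5e5])       # heat load, J/cm^2 (made up)
f_data = np.array([0.07, 0.14, 0.24, 0.33, 0.45])  # TPS mass fraction (made up)

# Fit log f = log a + b log Q, i.e. f = a * Q**b
b_fit, log_a = np.polyfit(np.log(q_data), np.log(f_data), 1)
a_fit = np.exp(log_a)
print(f"fitted correlation: f_TPS ~ {a_fit:.4f} * Q^{b_fit:.3f}")
print(f"predicted TPS mass fraction at Q = 1e5 J/cm^2: {a_fit * 1e5**b_fit:.2f}")
```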

    Algorithm Selection Framework for Cyber Attack Detection

    Full text link
    The number of cyber threats against both wired and wireless computer systems and other components of the Internet of Things continues to increase annually. In this work, an algorithm selection framework is employed on the NSL-KDD data set and a novel paradigm of machine learning taxonomy is presented. The framework uses a combination of user input and meta-features to select the best algorithm to detect cyber attacks on a network. Performance is compared between a rule-of-thumb strategy and a meta-learning strategy. The framework removes the conjecture of the common trial-and-error approach to algorithm selection. The framework recommends five algorithms from the taxonomy. Both strategies recommend a high-performing algorithm, though not the best-performing one. The work demonstrates the close connection between algorithm selection and the taxonomy on which it is premised.
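    The contrast drawn above between the two strategies can be sketched compactly: a rule-of-thumb selector returns a fixed default, while a meta-learner maps dataset meta-features to the algorithm that performed best on previously benchmarked data. The meta-features, candidate algorithms, and toy benchmark results below are assumptions for illustration, not the paper's framework or its NSL-KDD results.

```python
# Hedged sketch of rule-of-thumb vs. meta-learning algorithm selection,
# assuming scikit-learn and made-up meta-feature data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rule_of_thumb(meta_features):
    """Fixed heuristic: always recommend a common default detector."""
    return "random_forest"

# Meta-features of previously benchmarked datasets:
# [number of rows, number of features, minority-class fraction]
past_meta = np.array([
    [1e4, 40, 0.20],
    [5e5, 120, 0.02],
    [2e3, 10, 0.45],
    [1.2e5, 41, 0.10],
])
past_best = ["knn", "random_forest", "naive_bayes", "random_forest"]  # best detector per dataset

meta_learner = RandomForestClassifier(n_estimators=50, random_state=0)
meta_learner.fit(np.log1p(past_meta), past_best)

new_dataset = np.log1p([[1.25e5, 41, 0.08]])   # NSL-KDD-like shape, hypothetical imbalance
print("rule of thumb :", rule_of_thumb(new_dataset))
print("meta-learning :", meta_learner.predict(new_dataset)[0])
```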

    A Semantic Approach To Autonomous Mixing

    Get PDF

    CLASSIFICATION OF FEATURE SELECTION BASED ON ARTIFICIAL NEURAL NETWORK

    Get PDF
    Pattern recognition (PR) is central to a variety of engineering applications. For this reason, it is vital to develop efficient pattern recognition systems that facilitate decision making automatically and reliably. In this study, a PR system based on a computational intelligence approach, namely an artificial neural network (ANN), is implemented after selection of the best feature vectors. A framework has been developed to determine the best eigenvectors, which we name 'eigenpostures', of four main human postures, specifically standing, squatting/sitting, bending, and lying, based on the rules of thumb of Principal Component Analysis (PCA). All three PCA rules, namely the KG rule, cumulative variance, and the scree test, suggest retaining only 35 principal components, or 'eigenpostures'. Next, these 'eigenpostures' are statistically analyzed via Analysis of Variance (ANOVA) prior to classification, so that the most relevant components of the selected eigenpostures can be determined. Both categories of 'eigenpostures', before and after ANOVA, served as inputs to the ANN classifier to verify the effectiveness of feature selection based on statistical analysis. The results confirm that the statistical analysis enables effective selection of eigenpostures for classification of the four types of human postures.
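    The pipeline described above, PCA-based 'eigenpostures', a variance-based retention rule, ANOVA scoring of the retained components, and an ANN classifier, can be sketched as follows. The synthetic data, the 95% cumulative-variance threshold, the number of ANOVA-selected components, and the network size are illustrative assumptions, not the study's settings.

```python
# Minimal sketch of the eigenposture pipeline, assuming scikit-learn and
# synthetic 4-class data in place of the posture feature vectors.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=400, n_features=100, n_informative=20,
                           n_classes=4, random_state=0)

pipeline = make_pipeline(
    PCA(n_components=0.95, random_state=0),   # retain components for 95% cumulative variance
    SelectKBest(f_classif, k=10),             # ANOVA F-test picks the most relevant components
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
)
pipeline.fit(X, y)
print("training accuracy:", pipeline.score(X, y))
```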

    Model-based spacecraft and mission design for the evaluation of technology

    Get PDF
    In order to meet the future vision of robotic missions, engineers will face intricate mission concepts, new operational approaches, and technologies that have yet to be developed. The concept of smaller, model-driven projects helps this transition by including life-cycle cost as part of the decision-making process. For example, since planetary exploration missions have cost ceilings and short development periods, heritage flight hardware is utilized. However, conceptual designs that rely solely on heritage technology will result in estimates that may not be truly representative of the actual mission being designed and built. The Laboratory for Spacecraft and Mission Design (LSMD) at the California Institute of Technology is developing integrated concurrent models for mass and cost estimation. The purpose of this project is to quantify the infusion of specific technologies, where the data would be useful in guiding technology developments leading up to a mission. This paper introduces the design-to-cost model used to determine the implications of various technologies on the spacecraft system in a collaborative engineering environment. In addition, the benefits of new or advanced technologies for future deep space missions are compared.
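    To make the mass-and-cost trade concrete, the toy rollup below compares a heritage option against an advanced-technology option for a few subsystems. The subsystem list, masses, and parametric cost factors are invented placeholders; they are not the LSMD model or its data.

```python
# Toy design-to-cost rollup: sum subsystem masses, apply a rough parametric
# cost density, and compare heritage vs. advanced-technology options.
from dataclasses import dataclass

@dataclass
class Subsystem:
    name: str
    mass_kg: float
    cost_per_kg_musd: float   # very rough cost density, $M per kg (assumed)

def rollup(subsystems):
    mass = sum(s.mass_kg for s in subsystems)
    cost = sum(s.mass_kg * s.cost_per_kg_musd for s in subsystems)
    return mass, cost

heritage = [
    Subsystem("power (heritage solar array)", 85.0, 0.6),
    Subsystem("telecom (heritage X-band)", 40.0, 0.8),
    Subsystem("structure", 120.0, 0.3),
]
advanced = [
    Subsystem("power (advanced array)", 55.0, 0.9),    # lighter but costlier per kg
    Subsystem("telecom (optical comm)", 30.0, 1.4),
    Subsystem("structure", 110.0, 0.3),
]

for label, option in [("heritage", heritage), ("advanced", advanced)]:
    mass, cost = rollup(option)
    print(f"{label:>8}: {mass:6.1f} kg, ~${cost:5.1f}M")
```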

    Discrete-event simulation of process control in low volume high value industries

    Get PDF
    This paper presents a new method of process control for set-up dominant processes. This new method, known as the Set-up Process Algorithm (SUPA), was compared with existing industrial practices and statistical techniques in the literature. To test the method's robustness, a generic discrete-event simulation model was built and used to test four different statistical approaches to process control. It was concluded that SUPA offers a method of process control for set-up dominant processes that is easier to apply than classically derived SPC approaches, by using simple rules and a traffic-light system based on the design specification. Simulation analysis shows that SUPA is more sensitive at detecting an incapable process, since it monitors more units when a process is less capable, and is more sensitive than PRE-Control at detecting mean shifts in a process. SUPA is also a nonparametric methodology and is therefore robust against processes with non-Gaussian distributions.
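    The traffic-light idea referred to above can be sketched as a simple zoning rule: each measured unit is classified against the design specification, and the set-up is approved only after enough consecutive in-zone results, so a less capable process naturally gets monitored over more units. The zone boundaries and approval rule below are assumptions for illustration, not the published SUPA parameters.

```python
# Hedged sketch of a traffic-light set-up approval rule based on the design
# specification; the green-zone width and consecutive-green requirement are assumed.
def zone(value, lsl, usl, green_fraction=0.5):
    """Classify a measurement into green/amber/red zones within the spec band."""
    if not lsl <= value <= usl:
        return "red"                               # outside specification
    centre = (lsl + usl) / 2
    half_green = green_fraction * (usl - lsl) / 2
    return "green" if abs(value - centre) <= half_green else "amber"

def setup_approved(measurements, lsl, usl, greens_required=3):
    """Approve the set-up only after enough consecutive green units; any red rejects it."""
    consecutive = 0
    for m in measurements:
        z = zone(m, lsl, usl)
        if z == "red":
            return False
        consecutive = consecutive + 1 if z == "green" else 0
        if consecutive >= greens_required:
            return True
    return False   # not enough evidence yet: keep monitoring more units

# Example: specification 10.0 +/- 0.2 mm
print(setup_approved([10.05, 9.98, 10.02], lsl=9.8, usl=10.2))   # approved
print(setup_approved([10.15, 10.18, 9.95], lsl=9.8, usl=10.2))   # still monitoring
```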