Rotorcraft technology at Boeing Vertol: Recent advances
An overview is presented of key accomplishments in rotorcraft development at Boeing Vertol. Projects of particular significance include high-speed rotor development and the Model 360 Advanced Technology Helicopter. Areas addressed in the overview are: advanced rotors with reduced noise and vibration, 3-D aerodynamic modeling, flight control and avionics, active control, automated diagnostics and prognostics, composite structures, and drive systems.
GTTC Future of Ground Testing Meta-Analysis of 20 Documents
National research, development, test, and evaluation ground testing capabilities in the United States are at risk. There is a lack of vision and consensus on what is and will be needed, contributing to a significant threat that ground test capabilities may not be able to meet the national security and industrial needs of the future. To support future decisions, the AIAA Ground Testing Technical Committee's (GTTC) Future of Ground Test (FoGT) Working Group selected and reviewed 20 seminal documents related to the application and direction of ground testing. Each document was reviewed, with its main points collected and organized into sections in the form of a gap analysis: current state, future state, major challenges/gaps, and recommendations. This paper includes key findings and selected commentary by an editing team.
DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY
In order to accurately predict the costs of the thousands of interim products assembled in shipyards, skilled engineers must develop detailed Gantt charts for each interim product separately, which takes many hours. It would therefore be helpful to develop a prediction tool that estimates the cost of interim products accurately and quickly without the need for skilled engineers; this would drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard, or for any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are performed to make the data "user-friendly" for later prediction processing and for the development of both accurate and robust models. The support vector machine is demonstrated to be the better model when the number of tuples is low; however, as the number of tuples increases beyond 10,000, the artificial neural network model is recommended.
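The workflow the abstract describes, pre-process the data, then pick a model family based on dataset size, can be sketched in plain Python. The min-max scaling step and the exact 10,000-tuple cut-off are illustrative assumptions standing in for the paper's unspecified pre-processing and tuning; the paper itself trains real SVM and ANN models rather than returning labels:

```python
def min_max_scale(rows):
    """Rescale each feature column to [0, 1], one common way to make
    raw cost data "user-friendly" for later model training (an
    illustrative stand-in for the paper's pre-processing step)."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [
        tuple((v - l) / (h - l) if h > l else 0.0
              for v, l, h in zip(row, lo, hi))
        for row in rows
    ]

def recommend_model(n_tuples, threshold=10_000):
    """Model choice reported in the study: SVM was the better cost
    predictor on smaller datasets, while an ANN was recommended once
    the tuple count exceeded roughly 10,000."""
    return "SVM" if n_tuples <= threshold else "ANN"
```

For example, three interim-product feature rows scale to the unit interval column by column, and a 5,000-tuple dataset would be routed to the SVM while a 20,000-tuple dataset would be routed to the ANN.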
Automatic synthesis of analog layout: a survey
A review of recent research in the automatic synthesis of physical geometry for analog integrated circuits is presented. In the introduction, the difficulties involved in analog layout as opposed to digital layout are explained. A review of the literature then follows. Emphasis is placed on the exposition of general methods for addressing problems specific to analog layout, with the details of specific systems given only when they serve to illustrate these methods well. The conclusion discusses remaining problems and offers a prediction as to how technology will evolve to solve them. It is argued that although progress has been and will continue to be made in the automation of analog IC layout, due to fundamental differences between the nature of analog and digital IC design, the level of automation of the former should not be expected to reach that of the latter any time soon.
Integrating Aircraft Cost Modeling into Conceptual Design
The article presents cost modeling results from applying the Genetic-Causal cost modeling principle. Industrial results from redesign are also presented to verify the opportunity for early-concept cost optimization by using Genetic-Causal cost drivers to guide the conceptual design process for structural assemblies. Acquisition cost is considered through modeling of the recurring unit cost and non-recurring design cost. Operational cost is modeled relative to acquisition cost and fuel burn for predominantly metal or composite designs. The main contribution of this study is the application of the Genetic-Causal principle to cost modeling, helping to understand how conceptual design parameters impact cost, and linking that to customer requirements and life-cycle cost.
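The cost decomposition the abstract outlines, acquisition cost built from recurring unit cost plus non-recurring design cost, with operational cost modeled relative to acquisition cost and fuel burn, can be written out as a minimal sketch. The amortisation over fleet size, the linear operational-cost form, and the `alpha` coefficient are all assumptions for illustration; the paper does not publish its model equations in this abstract:

```python
def acquisition_cost(recurring_unit, non_recurring_design, fleet_size):
    """Acquisition cost per aircraft: recurring unit cost plus the
    non-recurring design cost amortised over the fleet (amortisation
    over fleet size is an assumed scheme, not taken from the paper)."""
    return recurring_unit + non_recurring_design / fleet_size

def operational_cost(acquisition, fuel_burn, fuel_price, alpha=0.05):
    """Operational cost modeled relative to acquisition cost and fuel
    burn, as the abstract states; the linear form and the annual
    acquisition-related share `alpha` are illustrative assumptions."""
    return alpha * acquisition + fuel_burn * fuel_price
```

Under this sketch, a design change that lowers the recurring unit cost propagates into both acquisition and operational cost, which is the kind of conceptual-design-to-life-cycle-cost link the study investigates.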
EARLY-WARNING PREDICTION FOR MACHINE FAILURES IN AUTOMATED INDUSTRIES USING ADVANCED MACHINE LEARNING TECHNIQUES
This Culminating Experience Project explores the use of machine learning algorithms to detect machine failure. The research questions are: Q1) How does the quality of input data, including issues such as outliers and noise, impact the accuracy and reliability of machine failure prediction models in industrial settings? Q2) How does the integration of SMOTE with feature engineering techniques influence the overall performance of machine learning models in detecting and preventing machine failures? Q3) How do different machine learning algorithms perform in predicting machine failures, and which algorithm is the most effective? The research findings are: Q1) Effective outlier handling is vital for predictive maintenance: the variable distributions initially showed a right-skewed pattern but, after rectification, became more centralized, with correlations between specific sensors showing potential for further exploration. Q2) Data balancing through SMOTE and feature engineering is essential due to the rarity of actual failure instances. Substantial challenges are observed when predicting 'Failure' instances, with a lower true positive rate (73%), resulting in low precision (0.02) and recall (0.73) for 'Failure' predictions. This is further reflected in the low F1-score (0.03) for 'Failure', indicating a trade-off between precision and recall. Despite a commendable overall accuracy of 94%, the class imbalance within the dataset (92,200 'Running' instances vs. 126 'Failure' instances) remains a contributing factor to the model's limitations. Q3) Machine learning algorithm performance varies, with CatBoost excelling in accuracy and failure detection. The choice of algorithm and continuous model refinement are critical for enhanced predictive accuracy in industrial contexts. The main conclusions are: Q1) Addressing outliers in data preprocessing significantly enhances the accuracy of machine failure prediction models.
Q2) The project addresses the issue of failure-class imbalance: only 0.14% of the dataset represents actual failures, while 99.86% pertains to non-failure data. This extreme class disparity can produce biased models that underperform on underrepresented classes, a common problem in machine learning. Q3) CatBoost outperforms the other algorithms in predicting machine failures, with an accuracy of 92% and a high rate of correct failure detections (99%); further exploration of diverse data and algorithms is needed for tailored industrial applications. Future research areas include advanced outlier handling, sensor relationships, and data balancing for improved model accuracy. Addressing rare failures, enhancing model performance, and exploring diverse machine learning algorithms are critical for advancing predictive maintenance.
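SMOTE, the balancing technique the project applies to its 126-vs-92,200 class split, oversamples the minority class by interpolating between a minority sample and one of its nearest minority-class neighbours. The stdlib sketch below shows only that core interpolation idea; production libraries such as imbalanced-learn add many refinements, and the data here is made up for illustration:

```python
import random

def smote(minority, k=3, n_new=5, seed=42):
    """Generate n_new synthetic minority samples: pick a minority point,
    find its k nearest minority neighbours (squared Euclidean distance),
    and interpolate a random fraction of the way toward one of them."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # k nearest neighbours of `base` within the minority class only
        neighbours = sorted(
            (p for p in minority if p is not base),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(base, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(base, nb)))
    return synthetic
```

Because each synthetic point lies on a segment between two real minority samples, the new samples stay inside the minority class's region of feature space rather than being naive duplicates, which is why SMOTE tends to help recall on rare-failure classes like the one in this project.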
Towards a semantic Construction Digital Twin: directions for future research
As the Architecture, Engineering and Construction sector embraces the digital age, the processes involved in the design, construction and operation of built assets are increasingly influenced by technologies dealing with value-added monitoring of data from sensor networks, the management of this data in secure and resilient storage systems underpinned by semantic models, and the simulation and optimisation of engineering systems. Aside from enhancing the efficiency of the value chain, such information-intensive models and associated technologies play a decisive role in minimising the lifecycle impacts of our buildings. While Building Information Modelling provides procedures, technologies and data schemas enabling a standardised semantic representation of building components and systems, the concept of a Digital Twin conveys a more holistic socio-technical and process-oriented characterisation of the complex artefacts involved, leveraging the synchronicity of cyber-physical bi-directional data flows. Moreover, BIM lacks semantic completeness in areas such as control systems, including sensor networks, social systems, and urban artefacts beyond the scope of buildings, thus requiring a holistic, scalable semantic approach that factors in dynamic data at different levels. The paper reviews the multi-faceted applications of BIM during the construction stage and highlights limits and requirements, paving the way to the concept of a Construction Digital Twin. A definition of that concept is then given, described in terms of underpinning research themes, while elaborating on areas for future research.