
    A quantitative analysis of parametric CAD model complexity and its relationship to perceived modeling complexity

    Digital product data quality and reusability have proven to be critical aspects of the Model-Based Enterprise, enabling the efficient design and redesign of products. The extent to which a history-based parametric CAD model can be edited or reused depends on the geometric complexity of the part and the procedure employed to build it. As a prerequisite for defining metrics that can quantify the quality of the modeling process, it is necessary to have CAD datasets that are sorted and ranked according to the complexity of the modeling process. In this paper, we examine the concept of perceived CAD modeling complexity, defined as the degree to which a parametric CAD model is perceived as difficult to create, use, and/or modify by expert CAD designers. We present a novel method to integrate pairwise comparisons of CAD modeling complexity made by experts into a single metric that can be used as ground truth. Next, we discuss a comprehensive study of quantitative metrics which are derived primarily from the geometric characteristics of the models and the graph structure that represents the parent/child relationships between features. Our results show that the perceived CAD modeling complexity metric derived from experts' assessments correlates particularly strongly with graph-based metrics. The Spearman coefficients for five of these metrics suggest that they can be effectively used to study the parameters that influence the reusability of models and as a basis to implement effective personalized learning strategies in online CAD training scenarios.
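    The abstract does not specify how the pairwise expert judgements are aggregated, but one standard way to turn such comparisons into a single ground-truth score is a Bradley-Terry model, and the reported Spearman coefficients can then be computed from ranks. A minimal sketch, with made-up data and illustrative function names (not taken from the paper):

```python
def bradley_terry(n_items, wins, iters=500):
    """MM updates for the Bradley-Terry model. wins[(i, j)] = number of
    times model i was judged more complex than model j. Returns one latent
    'perceived complexity' score per model."""
    p = [1.0] * n_items
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            num = sum(wins.get((i, j), 0) for j in range(n_items) if j != i)
            den = sum((wins.get((i, j), 0) + wins.get((j, i), 0)) / (p[i] + p[j])
                      for j in range(n_items) if j != i)
            new_p.append(num / den if den else p[i])
        s = sum(new_p)                       # renormalise to fix the scale
        p = [x * n_items / s for x in new_p]
    return p

def spearman(xs, ys):
    """Spearman rank correlation (no tie handling in this sketch)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0] * len(v)
        for rank, idx in enumerate(order):
            r[idx] = rank + 1
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Toy data: experts compared three CAD models; model 2 is judged hardest.
wins = {(1, 0): 8, (0, 1): 2, (2, 1): 8, (1, 2): 2, (2, 0): 9, (0, 2): 1}
scores = bradley_terry(3, wins)
graph_metric = [4, 11, 25]   # hypothetical graph-based metric per model
rho = spearman(scores, graph_metric)
```

    Here the latent scores play the role of the perceived-complexity ground truth, and `rho` measures how well a candidate quantitative metric reproduces the experts' ranking.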

    Effectiveness of CAD-CAM Application for the Design Development and Implementation of Maintenance Tools

    This study aims to evaluate the effectiveness of using Computer-Aided Design and Computer-Aided Manufacturing (CAD-CAM) applications in the design development and implementation of maintenance tools. The use of CAD-CAM has become a major trend in the modern manufacturing industry because it provides advantages such as time efficiency, high precision, and increased productivity. However, it is important to assess the true effectiveness of this technology in the context of maintenance tool development to fully understand its potential benefits. A literature review was used to compile an in-depth synthesis of recent research and publications related to the use of CAD-CAM in the design development and implementation of maintenance tools. A number of case studies and field experiments were also included in the analysis to provide further insight into the application of this technology in various industrial environments. The results of the analysis show that the use of CAD-CAM in the development of maintenance tool designs has brought significant positive changes. This technology can reduce development cycle time, enable model-based engineering, and improve modeling accuracy. In addition, CAD-CAM facilitates better collaboration between design teams, engineers, and other stakeholders, which contributes to improving the quality of the final product. However, despite the many benefits offered by CAD-CAM, there are also challenges that need to be overcome to increase the effectiveness of its use, including high initial investment costs, the need for higher technical skills, and complex integration with existing infrastructure. In conclusion, CAD-CAM has proven effective in the design development and implementation of maintenance tools. With this technology, companies can increase operational efficiency, improve product quality, and gain a competitive advantage in the market. By understanding the benefits and challenges of its use, professionals can more effectively adopt CAD-CAM and fully exploit its potential in the modern manufacturing industry.

    Designing a Framework for Exchanging Partial Sets of BIM Information on a Cloud-Based Service

    The rationale behind this research study was based on the recognised difficulty of exchanging data at element or object level due to the incompatibility of hardware and software. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. The only way that software file exchanges between two applications can produce consistent data and change management results for large projects is through a building model repository. The overall aim of this thesis was to design and develop an integrated process that would advance key decisions at an early design stage through faster information exchanges during collaborative work. In the construction industry, Building Information Modeling is the most integrated shared model across all disciplines. It is based on a manufacturing-like process in which standardised deliverables are used throughout the life cycle, with effective collaboration as its main driving force. However, the dilemma is how to share these properties of BIM applications on a single platform asynchronously. Cloud Computing is a centralized heterogeneous network that enables different applications to be connected to each other. The methodology used in the research was based on triangulation of data, incorporating a mixture of both quantitative and qualitative analysis techniques. The results identified the need to re-engineer Simplified Markup Language in order to exchange partial data sets of intelligent object architecture on an integrated platform. The designed and tested prototype produced findings that enhanced project decisions at a relatively early design stage, improved communication and collaboration techniques, and strengthened cross-discipline coordination.

    A framework for whole lifecycle cost of long-term digital preservation

    Digital preservation, also known as digital curation, is the active management of digital information over time to ensure its accessibility and usability. Digital preservation is nowadays an active area of research, for many reasons: the rapid evolution of technology, which also results in the rapid obsolescence of old technologies; degradation of physical records; constantly increasing volumes of digital information; and, importantly, the fact that it has started to become a legal obligation in many countries. This research project aims to develop an innovative framework to estimate the costs of long-term digital preservation. The framework can generate a cost model that quantifies costs within different business sectors while capturing the impact of obsolescence and uncertainty on the predicted cost. Case studies from the financial, healthcare and clinical trials sectors are used to prove the framework concept. Those sectors were chosen because between them they share all file types that are required to be preserved, and all are either obliged by European or local laws, e.g. the EU Data Retention Directive (2006/24/EC) and/or the UK Data Retention Regulations 2014 No. 2042, or are interested in preserving their digital assets. The framework comprises three phases: assessing digital preservation activities; cost analysis and expansion; and cost estimation. The framework integrates two processes that enable the user to reach a more accurate cost estimate: a process for identifying uncertainties within digital preservation activities, and a cost modelling process. In the framework, cloud computing was used as an example of storage and compute technologies. This research project combined different methodological techniques, starting with a thorough literature review covering digital preservation and cost modelling.
The literature review was followed by a combination of qualitative and quantitative approaches, using semi-structured interviews to collect data from industry experts. Industry experts were chosen from companies, firms and government bodies working with or researching digital preservation. Finally, the results were validated against real-life case studies from businesses in the selected sectors and by expert judgement. Comparing the output of the framework to real-life case studies demonstrated how companies and firms that aim to preserve their digital assets can use it to accurately predict the future costs of such an investment. By applying industrially based cost modelling approaches, the framework generates a cost model that produces single-point and three-point cost estimates, together with an obsolescence taxonomy, an uncertainty-identification process, and a quantification of the impact of uncertainty and obsolescence on the cost prediction. These outputs give decision makers quantifiable information about their future investment while remaining clear to understand and easy to amend. The framework thus provides firms with a long-term total-cost prediction solution for digital preservation, helping, guiding and adding insight into its added value.
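    The thesis's own cost model is not reproduced here, but the difference between the single-point and three-point estimates it mentions can be illustrated with the common beta-PERT formulas. The figures below are made-up inputs, not data from the case studies:

```python
def pert(optimistic, most_likely, pessimistic):
    """Beta-PERT three-point estimate: expected cost and standard deviation."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std = (pessimistic - optimistic) / 6
    return mean, std

# Hypothetical yearly preservation-storage cost estimates (in EUR):
single_point = 100_000                          # one analyst's best guess
mean, std = pert(80_000, 100_000, 150_000)      # three-point estimate
# Unlike the single-point figure, the three-point estimate yields a spread,
# so an uncertainty such as format obsolescence can be expressed as a
# range around the expected cost rather than a single number.
low, high = mean - std, mean + std
```

    A decision maker then sees both an expected cost and a plausible range, which is the kind of quantifiable uncertainty information the framework aims to provide.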

    Technologies and Applications for Big Data Value

    This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions in major industrial areas. The book starts with an introductory chapter that provides an overview by positioning the following chapters in terms of their contributions to technology frameworks which are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, "Technologies and Methods", contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, "Processes and Applications", details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the European data community's nucleus, bringing together businesses and leading researchers to harness the value of data for the benefit of society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, and machine learning and AI; and second, practitioners and industry experts engaged in data-driven systems, software design and deployment projects who are interested in employing these advanced methods to address real-world problems.