
    Meso-scale FDM material layout design strategies under manufacturability constraints and fracture conditions

    In the manufacturability-driven design (MDD) perspective, manufacturability of the product or system is the most important of the design requirements. In addition to being able to ensure that complex designs (e.g., topology optimization) are manufacturable with a given process or process family, MDD also helps mechanical designers to take advantage of unique process-material effects generated during manufacturing. One of the most recognizable examples of this comes from the scanning-type family of additive manufacturing (AM) processes; the most notable and familiar member of this family is the fused deposition modeling (FDM) or fused filament fabrication (FFF) process. This process works by selectively depositing uniform, approximately isotropic beads or elements of molten thermoplastic material (typically structural engineering plastics) in a series of pre-specified traces to build each layer of the part. There are many interesting 2-D and 3-D mechanical design problems that can be explored by designing the layout of these elements. The resulting structured, hierarchical material (which is both manufacturable and customized layer-by-layer within the limits of the process and material) can be defined as a manufacturing process-driven structured material (MPDSM). This dissertation explores several practical methods for designing these element layouts for 2-D and 3-D meso-scale mechanical problems, focusing ultimately on design-for-fracture. Three different fracture conditions are explored: (1) cases where a crack must be prevented or stopped, (2) cases where the crack must be encouraged or accelerated, and (3) cases where cracks must grow in a simple pre-determined pattern. Several new design tools were developed and refined to support the design of MPDSMs under fracture conditions, including a mapping method for the FDM manufacturability constraints, three major literature reviews, the collection, organization, and analysis of several large (qualitative and quantitative) multi-scale datasets on the fracture behavior of FDM-processed materials, some new experimental equipment, and a fast and simple g-code generator based on commercially-available software. The refined design method and rules were experimentally validated using a series of case studies (involving both design and physical testing of the designs) at the end of the dissertation. Finally, a simple design guide was developed from the results of this project for practicing engineers who are not experts in advanced solid mechanics or process-tailored materials.
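
    The g-code generator itself builds on commercially-available software and is not described in detail in the abstract; as a minimal sketch of what raster-style trace generation involves, assuming a rectangular region, a fixed bead width, and hypothetical feed/extrusion parameters:

    # Minimal sketch of raster-style FDM trace generation for one layer.
    # All parameters are hypothetical; the dissertation's actual generator
    # builds on commercial slicing software and is not reproduced here.

    def raster_layer_gcode(width_mm, height_mm, bead_w, z_mm, feed=1800, e_per_mm=0.05):
        """Yield G-code lines for a simple back-and-forth raster fill."""
        lines = [f"G1 Z{z_mm:.3f} F{feed}"]   # move to layer height
        e = 0.0                                # cumulative extrusion
        y, direction = 0.0, 1
        while y <= height_mm:
            x_start, x_end = (0.0, width_mm) if direction > 0 else (width_mm, 0.0)
            lines.append(f"G0 X{x_start:.3f} Y{y:.3f}")          # travel move
            e += width_mm * e_per_mm
            lines.append(f"G1 X{x_end:.3f} Y{y:.3f} E{e:.4f}")   # extruding move
            y += bead_w                        # step over by one bead width
            direction *= -1                    # alternate raster direction
        return lines

    for line in raster_layer_gcode(20.0, 10.0, bead_w=0.48, z_mm=0.2)[:6]:
        print(line)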

    Technical Dimensions of Programming Systems

    Programming requires much more than just writing code in a programming language. It is usually done in the context of a stateful environment, by interacting with a system through a graphical user interface. Yet this wide space of possibilities lacks a common structure for navigation. Work on programming systems fails to form a coherent body of research, making it hard to improve on past work and advance the state of the art. In computer science, much has been said and done to allow comparison of programming languages, yet no similar theory exists for programming systems; we believe that programming systems deserve a theory too. We present a framework of technical dimensions which capture the underlying characteristics of programming systems and provide a means for conceptualizing and comparing them. We identify technical dimensions by examining past influential programming systems and reviewing their design principles, technical capabilities, and styles of user interaction. Technical dimensions capture characteristics that may be studied, compared, and advanced independently. This makes it possible to talk about programming systems in a way that can be shared and constructively debated rather than relying solely on personal impressions. Our framework is derived using a qualitative analysis of past programming systems. We outline two concrete ways of using our framework. First, we show how it can be used to analyze a recently developed novel programming system. Then, we use it to identify an interesting unexplored point in the design space of programming systems. Much research effort focuses on building programming systems that are easier to use, accessible to non-experts, moldable, and/or powerful, but such efforts are disconnected. They are informal, guided by the personal vision of their authors, and thus can only be evaluated and compared on the basis of individual experience using them. By providing foundations for more systematic research, we can help programming systems researchers to stand, at last, on the shoulders of giants.
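
    The framework itself is qualitative, but purely as an illustration of how named dimensions allow systems to be compared side by side, one could encode systems as positions along dimensions; the dimension names and scores below are invented for the example, not taken from the paper:

    # Illustrative only: encoding programming systems as positions along named
    # technical dimensions so they can be compared side by side. Dimension
    # names and ordinal values here are invented, not taken from the paper.
    from dataclasses import dataclass, field

    @dataclass
    class SystemProfile:
        name: str
        dimensions: dict = field(default_factory=dict)  # dimension -> ordinal 0..4

    def compare(a: SystemProfile, b: SystemProfile):
        """Report where two systems differ along shared dimensions."""
        for dim in sorted(set(a.dimensions) & set(b.dimensions)):
            if a.dimensions[dim] != b.dimensions[dim]:
                print(f"{dim}: {a.name}={a.dimensions[dim]} vs {b.name}={b.dimensions[dim]}")

    smalltalk = SystemProfile("Smalltalk", {"liveness": 4, "self-sustainability": 4})
    unix_c = SystemProfile("UNIX+C", {"liveness": 1, "self-sustainability": 2})
    compare(smalltalk, unix_c)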

    Efficiency measurement based on novel performance measures in total productive maintenance (TPM) using a fuzzy integrated COPRAS and DEA method

    Total Productive Maintenance (TPM) has been widely recognized as a strategic tool and lean manufacturing practice for improving manufacturing performance and sustainability, and it has therefore been successfully implemented in many organizations. The evaluation of TPM efficiency can assist companies in improving their operations across a variety of dimensions. This paper proposes a comprehensive and systematic framework for the evaluation of TPM performance. The proposed total productive maintenance performance measurement system (TPM PMS) is divided into four phases: i) the design of new performance measures, ii) the evaluation of the new performance measures, iii) the implementation of the new performance measures to evaluate TPM performance, and iv) the review of the TPM PMS. In the design phase, different types of performance measures impacting TPM are defined and analyzed by decision-makers. In the evaluation phase, novel performance measures are evaluated using the Fuzzy COmplex PRoportional ASsessment (FCOPRAS) method. In the implementation phase, a modified fuzzy data envelopment analysis (FDEA) is used to distinguish efficient from inefficient TPM performance under the novel performance measures. In the review phase, TPM performance is periodically monitored, and the proposed TPM PMS is reviewed to ensure the successful implementation of TPM. A real-world case study from an international manufacturing company operating in the automotive industry is presented to demonstrate the applicability of the proposed TPM PMS. The main findings show that the proposed TPM PMS makes it possible to measure TPM performance with different indicators, especially soft (e.g., human-related) ones, and supports decision-makers by comparing the TPM performance of production lines, thus prioritizing the most important preventive/predictive decisions and actions, particularly for lines that are ineffective in TPM program implementation. The system can therefore be considered a powerful monitoring tool and a reliable source of evidence for making the implementation process of TPM more efficient in a real-world production environment.
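
    As a rough sketch of the COPRAS ranking step, simplified to crisp values by defuzzifying triangular fuzzy ratings first; the weights, ratings, and criteria below are invented, and the full FCOPRAS procedure in the paper is more involved:

    # Simplified COPRAS ranking sketch. Triangular fuzzy ratings (l, m, u) are
    # defuzzified to crisp scores; all weights and scores are invented.
    import numpy as np

    def defuzzify(tfn):                      # centroid of a triangular fuzzy number
        l, m, u = tfn
        return (l + m + u) / 3.0

    # rows: alternatives (e.g., production lines), cols: criteria
    scores = np.array([[defuzzify(t) for t in row] for row in [
        [(5, 7, 9), (1, 3, 5), (3, 5, 7)],
        [(3, 5, 7), (3, 5, 7), (5, 7, 9)],
        [(1, 3, 5), (5, 7, 9), (1, 3, 5)],
    ]])
    weights = np.array([0.5, 0.2, 0.3])      # criterion weights, sum to 1
    benefit = np.array([True, False, True])  # False = cost criterion

    d = weights * scores / scores.sum(axis=0)        # weighted normalized matrix
    s_plus = d[:, benefit].sum(axis=1)               # benefit contributions
    s_minus = d[:, ~benefit].sum(axis=1)             # cost contributions
    q = s_plus + s_minus.sum() / (s_minus * (1.0 / s_minus).sum())
    utility = 100 * q / q.max()                      # 100 marks the best alternative
    print(np.round(utility, 1))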

    Exploring the Training Factors that Influence the Role of Teaching Assistants to Teach to Students With SEND in a Mainstream Classroom in England

    With inclusive education having become increasingly valued over the years, the training of Teaching Assistants (TAs) is now more important than ever, given that they work alongside pupils with special educational needs and disabilities (hereinafter SEND) in mainstream education classrooms. The current study explored the training factors that influence the role of TAs in teaching SEND students in mainstream classrooms in England during their one-year training period. This work aimed to increase understanding of how the training of TAs is seen to influence the development of their personal knowledge and professional skills. The study has significance for our comprehension of the connection between the TAs’ training and the quality of education in the classroom. In addition, this work investigated whether there is a correlation between the teaching experience of TAs and their background information, such as gender, age, grade level taught, years of teaching experience, and qualification level. A critical realist theoretical approach was adopted for this two-phase study, whose phases drew on adaptive theory and grounded theory respectively. The multi-method project featured 13 case studies, each of which involved a trainee TA, his/her college tutor, and the classroom teacher who was supervising the trainee TA. The analysis was based on semi-structured interviews, various questionnaires, and non-participant observation for each of these case studies during the TA’s one-year training period. The primary analysis was completed by comparing the various kinds of data collected from the participants in the first and second data collection stages of each case. Further analysis involved cross-case analysis using a grounded theory approach, which made it possible to draw conclusions and put forth several core propositions. Compared with previous research, the findings of the current study reveal many implications for the training and deployment conditions of TAs, challenge the prevailing approaches in many respects, and offer more diversified, enriched, and comprehensive explanations of the critical pedagogical issues.

    neuroAIx-Framework: design of future neuroscience simulation systems exhibiting execution of the cortical microcircuit model 20× faster than biological real-time

    Introduction: Research in the field of computational neuroscience relies on highly capable simulation platforms. With real-time capabilities surpassed for established models like the cortical microcircuit, it is time to conceive next-generation systems: neuroscience simulators providing significant acceleration, even for larger networks with natural density, biologically plausible multi-compartment models and the modeling of long-term and structural plasticity.
    Methods: Stressing the need for agility to adapt to new concepts or findings in the domain of neuroscience, we have developed the neuroAIx-Framework consisting of an empirical modeling tool, a virtual prototype, and a cluster of FPGA boards. This framework is designed to support and accelerate the continuous development of such platforms driven by new insights in neuroscience.
    Results: Based on design space explorations using this framework, we devised and realized an FPGA cluster consisting of 35 NetFPGA SUME boards.
    Discussion: This system functions as an evaluation platform for our framework. At the same time, it resulted in a fully deterministic neuroscience simulation system surpassing the state of the art in both performance and energy efficiency. It is capable of simulating the microcircuit with 20× acceleration compared to biological real-time and achieves an energy efficiency of 48 nJ per synaptic event.
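
    As a back-of-the-envelope consistency check on these figures: only the 20× acceleration and the 48 nJ value come from the abstract; the microcircuit size and firing rate below are rough literature values assumed purely for illustration.

    # Back-of-the-envelope check of the 48 nJ/synaptic-event figure. The
    # microcircuit parameters are rough, assumed literature values; only the
    # 20x acceleration and 48 nJ figures come from the abstract.
    N_SYNAPSES = 0.3e9          # ~3e8 synapses in the cortical microcircuit model
    MEAN_RATE_HZ = 3.0          # assumed mean presynaptic firing rate
    ACCELERATION = 20           # from the abstract: 20x biological real-time
    E_PER_EVENT_J = 48e-9       # from the abstract: 48 nJ per synaptic event

    events_per_s_bio = N_SYNAPSES * MEAN_RATE_HZ        # events per biological second
    events_per_s_hw = events_per_s_bio * ACCELERATION   # hardware throughput
    implied_power_w = events_per_s_hw * E_PER_EVENT_J
    print(f"~{events_per_s_hw:.2e} events/s -> ~{implied_power_w:.0f} W cluster power")
    # ~1.80e+10 events/s -> ~864 W, i.e. roughly 25 W per board across 35 boards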

    Latent Dirichlet allocation method-based nowcasting approach for prediction of silver price

    Silver is a metal that offers significant value to both investors and companies. The purpose of this study is to estimate the price of silver. The estimation incorporates the frequency of Google Trends searches for words that affect the silver price, with the aim of obtaining a more accurate estimate. First, using the Latent Dirichlet Allocation (LDA) method, the keywords to be analyzed in Google Trends were collected from various articles on the Internet. Mining data from Google Trends, combined with the information obtained by LDA, is the new approach this study takes to predict the price of silver. No study has been found in the literature that has adopted this approach to estimate the price of silver. The estimation was carried out with Random Forest Regression, Gaussian Process Regression, Support Vector Machine, Regression Trees, and Artificial Neural Network methods. In addition, ARIMA, one of the traditional methods widely used in time series analysis, was used to benchmark the accuracy of the methodology. The best MSE, 0.000227131 ± 0.0000235205, was obtained by the Regression Trees method. This score indicates that estimating the price of silver from Google Trends data for keywords selected with the LDA method would be a valid technique.
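
    A minimal sketch of the two-stage idea follows, with a toy corpus and random numbers standing in for the real article corpus, Google Trends series, and price history:

    # Sketch of the two-stage approach: (1) LDA surfaces keywords from articles
    # about silver, (2) a regressor maps keyword search-interest series to price.
    # Corpus, trend series, and prices below are placeholder stand-ins.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.ensemble import RandomForestRegressor

    docs = ["silver price rises on inflation fears",
            "industrial demand lifts silver and gold",
            "dollar strength pressures precious metals"]
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Top words per topic become candidate Google Trends queries.
    terms = vec.get_feature_names_out()
    keywords = {terms[i] for comp in lda.components_ for i in comp.argsort()[-3:]}
    print("candidate queries:", sorted(keywords))

    # Stage 2: regress price on weekly search interest for those keywords
    # (random numbers here in place of real Trends/price series).
    rng = np.random.default_rng(0)
    trends = rng.random((52, len(keywords)))      # 52 weeks x keyword interest
    price = rng.random(52)                        # weekly silver price
    model = RandomForestRegressor(random_state=0).fit(trends[:-4], price[:-4])
    print("held-out predictions:", model.predict(trends[-4:]).round(3))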

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting, and they could benefit complex sectors that have only scarce data with which to predict business viability. To begin the execution of the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which were organised into a risk taxonomy. Labour was the most commonly reported top challenge; therefore, research was conducted into lean principles for improving productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects. This enabled flexible computation without precise production or financial data, improving the accuracy of economic estimation. The model assessed two VPF cases (one in the UK and another in Japan), demonstrating the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and assessed the viability of the UK and Japan cases. The environmental impact assessment model was then developed, allowing VPF operators to evaluate their carbon footprint compared to traditional agriculture using life-cycle assessment. I explore strategies for net-zero carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is considered. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
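
    As an illustration of the kind of risk-empowered financial screening the DSS is meant to support, here is a minimal Monte Carlo NPV sketch; every distribution and figure below is an invented placeholder, not thesis data:

    # Minimal Monte Carlo sketch: propagate uncertain inputs through a simple
    # NPV model to quantify downside risk. All numbers are invented placeholders.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000                                           # simulation draws

    capex = rng.triangular(0.8e6, 1.0e6, 1.5e6, N)        # build cost, GBP
    yield_kg = rng.triangular(40e3, 60e3, 70e3, N)        # annual crop yield, kg
    price_kg = rng.normal(6.0, 0.8, N)                    # sale price, GBP/kg
    opex = rng.triangular(0.25e6, 0.3e6, 0.45e6, N)       # annual operating cost

    annual_cash = yield_kg * price_kg - opex
    years, rate = 10, 0.08
    discount = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
    npv = -capex + annual_cash * discount

    print(f"P(NPV < 0) ≈ {np.mean(npv < 0):.1%}")         # downside risk
    print(f"median NPV ≈ £{np.median(npv):,.0f}")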

    The determinants of value addition: a critical analysis of the global software engineering industry in Sri Lanka

    It was evident from the literature that the perceived value delivery of the global software engineering industry is low due to various factors. Therefore, this research examines global software product companies in Sri Lanka to explore the software engineering methods and practices that increase value addition. The overall aim of the study is to identify the key determinants of value addition in the global software engineering industry and to critically evaluate their impact, helping software product companies maximise value addition and ultimately assure the sustainability of the industry. An exploratory research approach was used initially, since findings would emerge as the study unfolded. A mixed-methods design was employed because the literature alone was inadequate to investigate the problem effectively and formulate the research framework. Twenty-three face-to-face online interviews were conducted with subject matter experts covering all the disciplines in the targeted organisations; these were combined with the literature findings as well as the outcomes of market research conducted by both government and non-government institutes. Data from the interviews were analysed using NVivo 12. The findings of the existing literature were verified through the exploratory study, and the outcomes were used to formulate the questionnaire for the public survey. After cleansing the total responses received, 371 responses were used for data analysis in SPSS 21 with an alpha level of 0.05. An internal consistency test was performed before the descriptive analysis. After assuring the reliability of the dataset, correlation tests, multiple regression tests, and analysis of variance (ANOVA) tests were carried out to meet the research objectives. Five determinants of value addition were identified, along with the key themes for each area: staffing, delivery process, use of tools, governance, and technology infrastructure. Cross-functional and self-organised teams built around the value streams, a properly interconnected software delivery process with the right governance in the delivery pipelines, the right selection of tools, and the provision of the right infrastructure increase value delivery. Conversely, the constraints on value addition are poor interconnection of internal processes, rigid functional hierarchies, inaccurate selection and use of tools, inflexible team arrangements, and inadequate focus on technology infrastructure. The findings add to the existing body of knowledge on increasing value addition through effective processes, practices, and tools, and on the impact of applying them inaccurately in the global software engineering industry.
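
    As a sketch of the internal consistency step reported above, a Cronbach's alpha computation on placeholder data standing in for the 371 Likert-scale survey responses:

    # Sketch of the reliability (internal consistency) check: Cronbach's alpha
    # for a Likert-scale questionnaire before further analysis. Data are random
    # placeholders standing in for the 371 survey responses.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x questionnaire items (Likert scores)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    base = rng.integers(1, 6, size=(371, 1))             # shared response tendency
    responses = np.clip(base + rng.integers(-1, 2, size=(371, 8)), 1, 5)
    print(f"alpha = {cronbach_alpha(responses):.2f}")    # >= 0.7 usually acceptable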