    Uncertainty and Interpretability Studies in Soft Computing with an Application to Complex Manufacturing Systems

    In systems modelling and control theory, the benefits of applying neural networks have been extensively studied, particularly in manufacturing processes such as the prediction of mechanical properties of heat-treated steels. However, modern industrial processes usually involve large amounts of data and a range of non-linear effects and interactions that can hinder model interpretation. For example, in steel manufacturing it is vital to understand the complex mechanisms by which the heat treatment process generates the final mechanical properties. This knowledge is not available from numerical models, so an experienced metallurgist estimates the model parameters needed to obtain the required properties. Such human knowledge and perception can be imprecise, leading to cognitive uncertainty such as vagueness and ambiguity when making decisions. In system classification, this may translate into a system deficiency - for example, small changes in the input attributes may result in a sudden and inappropriate change of class assignment. To address this issue, practitioners and researchers have developed systems that are functionally equivalent to both fuzzy systems and neural networks. Such systems provide a morphology that mimics the human ability to reason via the qualitative aspects of fuzzy information rather than by its quantitative analysis. Furthermore, these models are able to learn from data sets and to describe the associated interactions and non-linearities in the data. However, like neural networks, a neuro-fuzzy system may suffer from a loss of interpretability and transparency when making decisions, mainly due to the application of adaptive approaches for its parameter identification. Since the RBF-NN can be treated as a fuzzy inference engine, this thesis presents several methodologies that quantify different types of uncertainty and their influence on the interpretability and transparency of the RBF-NN during its parameter identification. In particular, three kinds of uncertainty source related to the RBF-NN are studied, namely entropy, fuzziness and ambiguity. First, a methodology based on Granular Computing (GrC), neutrosophic sets and the RBF-NN is presented. The objective of this methodology is to quantify the hesitation produced during granular compression at the low level of interpretability of the RBF-NN via the use of neutrosophic sets. This study also aims to enhance the distinguishability, and hence the transparency, of the initial fuzzy partition. The effectiveness of the proposed methodology is tested against a real case study for the prediction of the properties of heat-treated steels. Secondly, a new Interval Type-2 Radial Basis Function Neural Network (IT2-RBF-NN) is introduced as a new modelling framework. The IT2-RBF-NN takes advantage of the functional equivalence between type-1 FLSs and the RBF-NN to construct an Interval Type-2 Fuzzy Logic System (IT2-FLS) that is able to deal with linguistic uncertainty and perceptions in the RBF-NN rule base. This gives rise to different combinations when optimising the IT2-RBF-NN parameters. Finally, a twofold study for uncertainty assessment at the high level of interpretability of the RBF-NN is provided. On the one hand, the first study proposes a new methodology to quantify (a) the fuzziness and (b) the ambiguity at each RU during the formation of the rule base, via the use of neutrosophic set theory. The aim of this methodology is to calculate the fuzziness associated with each rule, and then the ambiguity related to each normalised consequence of the fuzzy rules, arising from rule overlapping and from one-to-many decision choices respectively. On the other hand, the second study proposes a new methodology to quantify the entropy and fuzziness that arise from the redundancy phenomenon during parameter identification. To conclude this work, the experimental results obtained by applying the proposed methodologies to two well-known benchmark data sets and to the prediction of mechanical properties of heat-treated steels led to the publication of three articles in two peer-reviewed journals and one international conference.
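    The functional equivalence invoked above can be made concrete with a small sketch. The Python fragment below is not taken from the thesis; the centres, widths and weights are purely illustrative. It reads a Gaussian RBF-NN forward pass as a fuzzy inference engine, where each basis function acts as one rule and the normalised activations play the role of normalised firing strengths, and adds a simple linear index of fuzziness of the kind such uncertainty analyses build on.

```python
# Minimal sketch (not the thesis code): a Gaussian RBF-NN forward pass viewed as a
# simple fuzzy inference engine, plus a linear index of fuzziness over the
# normalised firing strengths. Centres, widths and weights are illustrative.
import numpy as np

def rbf_fuzzy_inference(x, centres, widths, weights):
    """Each RBF unit acts as one fuzzy rule; its Gaussian activation is the firing strength."""
    # Firing strength of each rule (product of Gaussian memberships per input dimension)
    dists = np.sum((x - centres) ** 2 / (2.0 * widths ** 2), axis=1)
    firing = np.exp(-dists)
    # Normalised firing strengths (normalised rule consequences)
    norm_firing = firing / (np.sum(firing) + 1e-12)
    # Defuzzified output: weighted average of rule consequents
    y = np.dot(norm_firing, weights)
    return y, norm_firing

def fuzziness(mu):
    """Linear index of fuzziness: 0 for crisp memberships, maximal at mu = 0.5."""
    return 2.0 * np.mean(np.minimum(mu, 1.0 - mu))

# Toy example: 3 rules in a 2-dimensional input space
centres = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
widths  = np.array([[0.5, 0.5], [0.5, 0.5], [0.5, 0.5]])
weights = np.array([0.1, 0.8, 0.3])

y, mu = rbf_fuzzy_inference(np.array([0.9, 0.8]), centres, widths, weights)
print(y, fuzziness(mu))
```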

    Quality by design approach for tablet formulations containing spray coated ramipril by using artificial intelligence techniques

    Different software programs based on mathematical models have been developed to aid the product development process. Recent developments in mathematics and computer science have resulted in new programs based on artificial neural network (ANN) techniques, which have been used to develop and formulate pharmaceutical products. In this study, intelligent software was used to predict the relationship between the materials used in tablet formulation and the tablet specifications, and to obtain detailed information about the interactions between the formulation parameters and the specifications. The input data were generated from historical data and from the results obtained by analyzing tablets produced with different formulations. The relative significance of the inputs on various outputs, such as assay, dissolution in 30 min and crushing strength, was investigated using artificial neural networks (ANNs), neurofuzzy logic and genetic programming (FormRules, INForm ANN and GEP). This study indicated that ANN and GEP can be used effectively for optimizing formulations and that GEP can be evaluated statistically because of the openness of its equations. Additionally, FormRules was very helpful for teasing out the relationships between the inputs (formulation variables) and the outputs.
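    As a rough illustration of how the relative significance of formulation inputs on an output such as 30-minute dissolution could be estimated, the sketch below uses a small scikit-learn MLP with permutation importance as a stand-in for the commercial tools (FormRules, INForm ANN, GEP) used in the study; the input names and data are hypothetical.

```python
# Illustrative sketch only: a scikit-learn MLP plus permutation importance as a
# stand-in for the commercial ANN/neurofuzzy/GEP tools named in the abstract.
# Column names and data are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
inputs = ["binder_pct", "disintegrant_pct", "compression_force", "coating_level"]
X = rng.uniform(0, 1, size=(120, len(inputs)))                 # formulation variables (scaled)
y = 60 + 25 * X[:, 1] - 15 * X[:, 2] + rng.normal(0, 2, 120)   # e.g. dissolution at 30 min (%)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X_tr, y_tr)

# Relative significance of each input: drop in score when that input is shuffled
imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
for name, mean_drop in sorted(zip(inputs, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {mean_drop:.3f}")
```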

    Smart process manufacturing for formulated products

    We outline the smart manufacturing challenges for formulated products, which are typically multicomponent, structured, and multiphase. These challenges predominate in the food, pharmaceutical, agricultural and specialty chemicals, energy storage and energetic materials, and consumer goods industries, and are driven by fast-changing customer demand and, in some cases, a tight regulatory framework. This paper discusses progress in smart manufacturing in these industries, namely digitalization and the use of large datasets with predictive models and solution-finding algorithms. While some progress has been achieved, there is a strong need for further demonstrations of model-based tools on realistic problems, in order to establish their benefits and highlight any systemic weaknesses.

    Mind the Gap: Developments in Autonomous Driving Research and the Sustainability Challenge

    Scientific knowledge on autonomous-driving technology is expanding at a faster-than-ever pace. As a result, the likelihood of incurring information overload is particularly notable for researchers, who can struggle to overcome the gap between information processing requirements and information processing capacity. We address this issue by adopting a multi-granulation approach to latent knowledge discovery and synthesis in large-scale research domains. The proposed methodology combines citation-based community detection methods and topic modeling techniques to give a concise but comprehensive overview of how the autonomous vehicle (AV) research field is conceptually structured. Thirteen core thematic areas are extracted and presented by mining the large, data-rich environments resulting from 50 years of AV research. The analysis demonstrates that this research field is strongly oriented towards examining the technological developments needed to enable the widespread rollout of AVs, whereas it largely overlooks the wide-ranging sustainability implications of this sociotechnical transition. On account of these findings, we call for a broader engagement of AV researchers with the sustainability concept and we invite them to increase their commitment to conducting systematic investigations into the sustainability of AV deployment. Sustainability research is urgently required to produce an evidence-based understanding of what new sociotechnical arrangements are needed to ensure that the systemic technological change introduced by AV-based transport systems can fulfill societal functions while meeting the urgent need for more sustainable transport solutions.
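    The methodology described here combines citation-based community detection with topic modelling. The sketch below illustrates that general pipeline using networkx Louvain communities and scikit-learn LDA on placeholder papers and abstracts; it is not the authors' code, and all identifiers and texts are invented for the example.

```python
# Hedged sketch of the general pipeline described above (not the authors' code):
# detect citation-based communities, then run topic modelling within each community.
import networkx as nx
from networkx.algorithms.community import louvain_communities
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# 1) Citation network: nodes are papers, edges are citation links (placeholders)
citations = [("p1", "p2"), ("p1", "p3"), ("p2", "p3"), ("p4", "p5"), ("p5", "p6")]
G = nx.Graph(citations)  # an undirected view is enough for community detection

# 2) Community detection via Louvain modularity optimisation
communities = louvain_communities(G, seed=0)

# 3) Topic modelling on the abstracts within each community
abstracts = {
    "p1": "lidar perception object detection", "p2": "sensor fusion perception",
    "p3": "deep learning object detection",    "p4": "travel demand mode choice",
    "p5": "public acceptance survey",          "p6": "travel behaviour survey",
}
for i, comm in enumerate(communities):
    docs = [abstracts[p] for p in comm if p in abstracts]
    vec = CountVectorizer()
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=1, random_state=0).fit(X)
    top_terms = lda.components_[0].argsort()[::-1][:3]
    print(f"community {i}: {[vec.get_feature_names_out()[j] for j in top_terms]}")
```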

    HIERARCHICAL-GRANULARITY HOLONIC MODELLING

    This thesis introduces an agent-based systems engineering approach, named Hierarchical-Granularity Holonic Modelling, to support intelligent information processing at multiple granularity levels, with a focus on complex hierarchical systems. Nowadays, due to the ever-growing complexity of information systems and processes, there is an increasing need for a simple, self-modular computational model able to manage data and perform information granulation at different resolutions (i.e., both spatial and temporal). The current literature does not provide such a methodology. To cite a relevant example, the object-oriented paradigm is suitable for describing a system at a given representation level; nevertheless, further design effort is needed if a more synthetic or more analytical view of the same system is required. In the literature, the agent paradigm represents a viable solution for complex systems modelling; in particular, Multi-Agent Systems have been applied successfully in a wide variety of distributed intelligence settings. Current agent-oriented implementations, however, suffer from an apparent dichotomy between agents as intelligent entities and agents' structures as superimposed hierarchies of roles within a given organization. Agent architectures are often rigid and require intensive re-engineering when the underpinning ontology is updated to accommodate new design criteria. The latest stage in the evolution of modelling frameworks is represented by Holonic Systems, based on the notions of 'holon' and 'holarchy' (i.e., a hierarchy of holons). A holon, just like an agent, is an intelligent entity able to interact with the environment and to take decisions to solve a specific problem. Unlike an agent, however, a holon has the noteworthy property of being a whole and a part at the same time. This is reflected at the organizational level: holons in a holarchy function first as autonomous wholes in supra-ordination to their parts, secondly as dependent parts in sub-ordination to controls at higher levels, and thirdly in coordination with their local environment. These ideas were originally devised by Arthur Koestler in 1967. Since then, Holonic Systems have gained more and more credit in various fields such as Biology, Ecology, Theory of Emergence and Intelligent Manufacturing. Compared with these disciplines, however, fewer works on Holonic Systems can be found within the general framework of Artificial and Computational Intelligence. Moreover, the gap between theoretical models and actual implementations remains wide. In this thesis, starting from Koestler's original idea, we devise a novel agent-inspired model that merges intelligence with the holonic structure at multiple hierarchical-granularity levels. This is made possible by a recursive, rule-based knowledge representation, which allows the holonic agent to carry out both operating and learning tasks across a hierarchy of granularity levels. The proposed model can be used directly in hardware/software applications, endowing systems and software engineers with a modular and scalable approach when dealing with complex hierarchical systems. To support our claims, exemplar experiments of our proposal are presented and prospective implications are discussed.
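    As a rough illustration of the whole/part duality and the recursive rule-based representation described above, the Python sketch below (illustrative only, not the thesis implementation; all names are invented) models a holon that first applies its own rules and, failing that, delegates to its parent in the holarchy.

```python
# Illustrative sketch, not the thesis implementation: a holon is simultaneously a whole
# (it aggregates sub-holons) and a part (it reports to a parent), with a rule base that
# is consulted recursively across granularity levels.
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Callable, Optional

Rule = Callable[[dict], Optional[str]]  # maps an observation to an action, or None

@dataclass
class Holon:
    name: str
    rules: list[Rule] = field(default_factory=list)
    parts: list["Holon"] = field(default_factory=list)
    parent: Optional["Holon"] = None

    def add_part(self, child: "Holon") -> None:
        child.parent = self
        self.parts.append(child)

    def decide(self, observation: dict) -> str:
        # Try local rules first (the holon acting as an autonomous whole)...
        for rule in self.rules:
            action = rule(observation)
            if action is not None:
                return action
        # ...then delegate upward (the holon acting as a dependent part).
        if self.parent is not None:
            return self.parent.decide(observation)
        return "no-action"

# Toy holarchy: a plant-level holon supervising a cell-level holon
plant = Holon("plant", rules=[lambda obs: "reschedule" if obs.get("load", 0) > 0.9 else None])
cell = Holon("cell", rules=[lambda obs: "slow-down" if obs.get("temp", 0) > 80 else None])
plant.add_part(cell)

print(cell.decide({"temp": 95}))                # handled locally by the cell
print(cell.decide({"temp": 20, "load": 0.95}))  # escalated to the plant level
```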

    Dynamic Optical Coherence Tomography in Dermatology

    Optical coherence tomography (OCT) is a non-invasive imaging technology that may be applied to the diagnosis of non-melanoma skin cancer and has recently been shown to improve the diagnostic accuracy of basal cell carcinoma. Technical developments of OCT continue to expand the applicability of OCT to different neoplastic and inflammatory skin diseases. Of these developments, dynamic OCT (D-OCT), based on speckle variance OCT, is of special interest, as it allows the in vivo evaluation of blood vessels and their distribution within specific lesions, providing additional functional information and consequently a greater density of data. In an effort to assess the potential of D-OCT for future scientific and clinical studies, we have therefore reviewed the literature and preliminary unpublished data on the visualization of the microvasculature using D-OCT. Information on D-OCT in skin cancers including melanoma, as well as in a variety of other skin diseases, is presented in an atlas. Possible diagnostic features are suggested, although these require additional validation.
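    The speckle-variance principle underlying D-OCT can be sketched in a few lines: repeated B-scans of the same tissue location are compared, and pixels whose intensity decorrelates between frames (e.g., flowing blood) show high variance. The fragment below is a minimal illustration on synthetic data, not a clinical processing pipeline.

```python
# Minimal sketch of the speckle-variance idea behind D-OCT (not a clinical
# implementation): given N repeated B-scans of the same slice, the per-pixel
# intensity variance across frames highlights moving scatterers such as blood cells.
import numpy as np

def speckle_variance(bscans: np.ndarray) -> np.ndarray:
    """bscans: array of shape (n_frames, depth, width) with repeated B-scan intensities."""
    return np.var(bscans, axis=0)  # high variance ~ flow, low variance ~ static tissue

# Synthetic example: static background plus a small fluctuating 'vessel' region
rng = np.random.default_rng(0)
frames = np.full((8, 64, 64), 0.5) + rng.normal(0, 0.01, (8, 64, 64))  # static speckle
frames[:, 30:34, 30:34] += rng.normal(0, 0.3, (8, 4, 4))               # decorrelating flow

sv = speckle_variance(frames)
print(sv[32, 32] > sv[10, 10])  # True: the vessel region shows higher variance
```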