
    Influence of the ratio on the mechanical properties of epoxy resin composite with diapers waste as fillers for partition panel application

    Materials play a significant role in the domestic economy and in defense, and their importance has grown with the rapid advance of science and technology. New materials are at the core of new technologies, and materials science, power technology and data science are three pillars of modern science and technology. The properties of a partition panel made from recycled diaper waste depend on the origin of the waste deposits and their chemical constituents. This study presents the influence of the mixing ratio on the mechanical properties of diaper-waste polymers reinforced with an epoxy binder matrix for partition panel applications. The aim of this study was to investigate how different ratios of diaper-waste polymer to epoxy matrix affect the mechanical properties and morphology. The polymers used as reinforcing material included polypropylene, polystyrene, polyethylene and superabsorbent polymer (SAP). The tensile and bending resistance indicated that a diaper-waste polymer ratio of 0.4 is the optimum for fabricating the partition panel; samples at this ratio showed the highest modulus of elasticity, 76.06 MPa. A correlation between the microstructural analysis using scanning electron microscopy (SEM) and the mechanical properties of the material is discussed.

    Multi-agent knowledge integration mechanism using particle swarm optimization

    This is the post-print version of the final paper published in Technological Forecasting and Social Change; copyright © 2011 Elsevier B.V. Unstructured group decision-making faces several central difficulties: unifying the knowledge of multiple experts in an unbiased manner, computational inefficiency, and the lack of an established means of storing such unified knowledge for later use. The storage difficulties stem from the integration of the logic underlying multiple experts' decision-making processes and the structured quantification of the impact of each opinion on the final product. To address these difficulties, this paper proposes a novel approach called the multiple agent-based knowledge integration mechanism (MAKIM), in which a fuzzy cognitive map (FCM) is used as a knowledge representation and storage vehicle. In this approach, particle swarm optimization (PSO) adjusts the causal relationships and causality coefficients from the perspective of global optimization. Once an optimized FCM is constructed, an agent-based model (ABM) is applied to the inference of the FCM to solve the real-world problem. The final aggregate knowledge is stored in FCM form and is used to produce proper inference results for other target problems. To test the validity of the approach, MAKIM was applied to a real-world group decision-making problem, an IT project risk assessment, and was found to be statistically robust. Funding: Ministry of Education, Science and Technology (Korea).
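
    The paper describes MAKIM only at the level above; as a rough illustration of the core mechanism, here is a minimal Python sketch in which PSO tunes the causality coefficients of a small FCM so that its steady state matches an expert-supplied target activation vector. The function names, the sigmoid squashing function and all PSO hyperparameters are illustrative assumptions, not details of the published method.

        import numpy as np

        def fcm_infer(W, x0, steps=20):
            # Iterate the FCM update x <- sigmoid(W @ x) toward a steady state.
            x = x0.copy()
            for _ in range(steps):
                x = 1.0 / (1.0 + np.exp(-W @ x))
            return x

        def pso_fit_fcm(x0, target, n_particles=30, iters=200, seed=0):
            # PSO over flattened weight matrices; fitness is the squared error
            # between the map's steady state and the experts' consensus state.
            rng = np.random.default_rng(seed)
            n = len(x0)
            pos = rng.uniform(-1.0, 1.0, (n_particles, n * n))  # coefficients in [-1, 1]
            vel = np.zeros_like(pos)
            pbest, pbest_err = pos.copy(), np.full(n_particles, np.inf)
            gbest, gbest_err = pos[0].copy(), np.inf
            for _ in range(iters):
                for i in range(n_particles):
                    err = np.sum((fcm_infer(pos[i].reshape(n, n), x0) - target) ** 2)
                    if err < pbest_err[i]:
                        pbest_err[i], pbest[i] = err, pos[i].copy()
                    if err < gbest_err:
                        gbest_err, gbest = err, pos[i].copy()
                # standard velocity update toward personal and global bests
                r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
                vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, -1.0, 1.0)
            return gbest.reshape(n, n), gbest_err

        x0 = np.array([0.5, 0.2, 0.8, 0.1])      # initial concept activations
        target = np.array([0.9, 0.3, 0.7, 0.2])  # hypothetical consensus steady state
        W, err = pso_fit_fcm(x0, target)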

    How active perception and attractor dynamics shape perceptual categorization: A computational model

    We propose a computational model of perceptual categorization that fuses elements of grounded and sensorimotor theories of cognition with dynamic models of decision-making. We assume that category information consists in anticipated patterns of agent–environment interaction that can be elicited through overt or covert (simulated) eye movements, object manipulation, etc. This information is first encoded when category information is acquired, and then re-enacted during perceptual categorization. Perceptual categorization consists in a dynamic competition between attractors that encode the sensorimotor patterns typical of each category; action-prediction success counts as "evidence" for a given category and contributes to falling into the corresponding attractor. The evidence-accumulation process is guided by an active perception loop, and the active exploration of objects (e.g., visual exploration) aims at eliciting expected sensorimotor patterns that count as evidence for the object category. We present a computational model incorporating these elements and describing action prediction, active perception, and attractor dynamics as key elements of perceptual categorization. We test the model in three simulated perceptual categorization tasks, and we discuss its relevance for grounded and sensorimotor theories of cognition. Peer reviewed.
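
    As a loose illustration of the dynamic competition described above (not the authors' actual model), the following Python sketch uses a leaky-competing-accumulator race as a stand-in for the attractor dynamics: each category's drift rate is its action-prediction success, and mutual inhibition yields winner-take-all settling into one "attractor". All names and parameter values are illustrative assumptions.

        import numpy as np

        def categorize(prediction_errors, n_steps=500, dt=0.01,
                       inhibition=0.8, noise=0.05, threshold=1.0, seed=0):
            # Leaky competing accumulators: low prediction error for a
            # category means high drift (more accumulated "evidence").
            rng = np.random.default_rng(seed)
            drift = 1.0 - np.asarray(prediction_errors)
            x = np.zeros(len(drift))
            for t in range(n_steps):
                inhib = inhibition * (x.sum() - x)   # inhibition from competitors
                dx = (drift - x - inhib) * dt + noise * np.sqrt(dt) * rng.standard_normal(len(x))
                x = np.maximum(x + dx, 0.0)          # activations stay non-negative
                if x.max() >= threshold:             # settled into an attractor
                    return int(np.argmax(x)), t
            return int(np.argmax(x)), n_steps

        # e.g. active exploration yields low prediction error for category 1
        print(categorize([0.6, 0.1, 0.5]))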

    Auditing Symposium XIII: Proceedings of the 1996 Deloitte & Touche/University of Kansas Symposium on Auditing Problems

    Meeting the challenge of technological change -- A standard setter's perspective / James M. Sylph, Gregory P. Shields
    Technological change -- A glass half empty or a glass half full: Discussion of "Meeting the challenge of technological change" and "Business and auditing impacts of new technologies" / Urton Anderson
    Opportunities for assurance services in the 21st century: A progress report of the Special Committee on Assurance Services / Richard Lea
    Model of errors and irregularities as a general framework for risk-based audit planning / Jere R. Francis, Richard A. Grimlund
    Discussion of "A model of errors and irregularities as a general framework for risk-based audit planning" / Timothy B. Bell
    Framing effects and output interference in a concurring partner review context: Theory and exploratory analysis / Karla M. Johnstone, Stanley F. Biggs, Jean C. Bedard
    Discussant's comments on "Framing effects and output interference in a concurring partner review context: Theory and exploratory analysis" / David Plumlee
    Implementation and acceptance of expert systems by auditors / Maureen McGowan
    Discussion of "Opportunities for assurance services in the 21st century: A progress report of the Special Committee on Assurance Services" / Katherine Schipper
    CPAS/CCM experiences: Perspectives for AI/ES research in accounting / Miklos A. Vasarhelyi
    Discussant comments on "The CPAS/CCM experiences: Perspectives for AI/ES research in accounting" / Eric Denna
    Digital analysis and the reduction of auditor litigation risk / Mark Nigrini
    Discussion of "Digital analysis and the reduction of auditor litigation risk" / James E. Searing
    Institute of Internal Auditors: Business and auditing impacts of new technologies / Charles H. Le Grand

    New Trends in the Use of Artificial Intelligence for the Industry 4.0

    Industry 4.0 is based on the cyber-physical transformation of the processes, systems and methods applied in the manufacturing sector, and on their autonomous and decentralized operation. Industry 4.0 reflects that the industrial world is at the beginning of the so-called Fourth Industrial Revolution, characterized by massive interconnection of assets and the integration of human operators with the manufacturing environment. In this regard, data analytics and, specifically, artificial intelligence are the vehicular technologies towards the next generation of smart factories. The chapters in this book cover a diversity of current and new developments in the use of artificial intelligence in the industrial sector, seen from the point of view of the Fourth Industrial Revolution: cyber-physical applications, artificial intelligence technologies and tools, the Industrial Internet of Things, and data analytics. The book contains high-quality chapters presenting original research results and literature reviews of exceptional merit. Its aim is thus to contribute to the literature on the topic and to acquaint readers with current and new trends in the use of artificial intelligence for Industry 4.0.

    Data Mining in Smart Grids

    Effective smart grid operation requires rapid decisions in a data-rich but information-limited environment. In this context, grid sensor data streaming cannot provide system operators with the information needed to act within the time frames necessary to minimize the impact of disturbances. Even where fast models can convert the data into information, the smart grid operator must deal with the challenge of not having a full understanding of the context of that information, and therefore the information content cannot be used with a high degree of confidence. To address this issue, data mining has been recognized as the most promising enabling technology for improving decision-making processes, providing the right information at the right moment to the right decision-maker. This Special Issue focuses on emerging methodologies for data mining in smart grids and addresses many relevant topics, ranging from methods for uncertainty management to advanced dispatching. It covers not only methodological breakthroughs and roadmaps for implementing the methodology, but also the much-needed sharing of best practices. Topics include, but are not limited to, the following:
    Fuzziness in smart grids computing
    Emerging techniques for renewable energy forecasting
    Robust and proactive solutions for optimal smart grid operation
    Fuzzy-based smart grid monitoring and control frameworks
    Granular computing for uncertainty management in smart grids
    Self-organizing and decentralized paradigms for information processing

    Machine learning-based automated segmentation with a feedback loop for 3D synchrotron micro-CT

    The development of third-generation synchrotron light sources laid the foundation for investigating the 3D structure of opaque samples at micrometer resolution and beyond. This led to the development of synchrotron X-ray micro-computed tomography, which fostered the creation of imaging facilities for studying samples of the most diverse kinds, e.g., model organisms, to better understand the physiology of complex living systems. Modern control systems and robotics enabled the full automation of X-ray imaging experiments and the on-line calibration of the experimental setup's parameters. Advances in digital detector systems brought improvements in resolution, dynamic range, sensitivity and other essential properties. These improvements considerably increased the throughput of the imaging process, but the experiments in turn began to generate substantially larger data volumes of up to tens of terabytes, which were subsequently processed manually. These technical advances thus paved the way for more efficient high-throughput experiments that study large numbers of samples and produce datasets of higher quality. The scientific community therefore has a strong need for an efficient, automated workflow for X-ray data analysis that can handle such a data load and deliver valuable insights to domain experts. Existing solutions for such a workflow are not directly applicable to high-throughput experiments, since they were developed for ad-hoc scenarios in medical imaging; they are neither optimized for high-throughput data streams nor able to exploit the hierarchical nature of samples. The main contribution of this work is a new automated analysis workflow suited to the efficient processing of heterogeneous X-ray datasets of a hierarchical nature. The workflow builds on improved methods for data preprocessing, registration, localization and segmentation. Every stage of the workflow that includes a training phase can be automatically fine-tuned to find the best hyperparameters for the specific dataset. For the analysis of fibrous structures in samples, a new, highly parallelizable 3D orientation-analysis method was developed, based on a novel concept of emitted rays, enabling more precise morphological analysis. All developed methods were thoroughly validated on synthetic datasets to quantitatively assess their applicability under various imaging conditions. The workflow was shown to be capable of processing a series of datasets of a similar kind. Furthermore, efficient CPU/GPU implementations of the workflow and its methods are presented and made available to the community as modules for the Python language. The automated analysis workflow was successfully applied to micro-CT datasets acquired in high-throughput X-ray experiments in developmental biology and materials science. In particular, it was applied to the analysis of medaka fish datasets, enabling automated segmentation and subsequent morphological analysis of the brain, liver, head kidney and heart. In addition, the developed 3D orientation-analysis method was employed in the morphological analysis of polymer scaffold datasets to steer a fabrication process toward desirable properties.
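
    The dissertation's workflow is far richer (learned segmentation, registration, ray-based orientation analysis), but the per-stage auto-tuning idea can be sketched minimally: score a stage's hyperparameter against a small annotated reference and feed the score back to pick the best value. The Python/NumPy toy below does this for a simple thresholding stage; every function, value and the synthetic volume are illustrative assumptions, not the dissertation's code.

        import numpy as np

        def segment(volume, threshold):
            # Toy segmentation stage: binarize the CT volume at a threshold.
            return volume > threshold

        def dice(pred, ref):
            # Overlap score between a candidate segmentation and a reference.
            inter = np.logical_and(pred, ref).sum()
            return 2.0 * inter / (pred.sum() + ref.sum() + 1e-9)

        def tune_threshold(volume, ref, candidates):
            # Feedback loop: score each hyperparameter on an annotated sample
            # and keep the best, mimicking the workflow's per-stage tuning.
            scores = [dice(segment(volume, t), ref) for t in candidates]
            best = int(np.argmax(scores))
            return candidates[best], scores[best]

        # synthetic micro-CT-like sample: a bright spherical organ in noise
        rng = np.random.default_rng(1)
        z, y, x = np.mgrid[:64, :64, :64]
        ref = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 15 ** 2
        volume = ref * 0.8 + rng.normal(0, 0.15, ref.shape)

        t, score = tune_threshold(volume, ref, np.linspace(0.2, 0.7, 11))
        print(f"best threshold {t:.2f}, Dice {score:.3f}")

    In the actual workflow the tuned stages are trained models rather than a fixed threshold, but the same score-and-select loop applies to any hyperparameter with an annotated reference to score against.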