
    Effect of Extrusion Parameters on Properties of Powder Coatings Determined by Infrared Spectroscopy

    In polymer extrusion, compounding is a continuous mixing process that is also used to produce highly reactive powder coatings. A premixed batch of powder coating is added to the feeding section and extruded, preferably with a co-rotating twin-screw extruder. One essential parameter in the processing of highly reactive materials is the melt temperature: if it is too high, pre-reactions occur during the extrusion process, which may cause high rejection rates. We studied the melt temperature of an epoxy/carboxyl-based powder coating using a retractable thermocouple at three axial positions along the barrel of a ZSK34 co-rotating twin-screw extruder. The influence of different processing conditions on the reactivity of a highly reactive powder coating was examined by infrared spectroscopy and differential scanning calorimetry. Furthermore, the specific energy input and the color change in the finished powder coating at different processing points were investigated. Multivariate data analysis was used to correlate mid-infrared spectra, melt temperatures, specific energy inputs, enthalpies of reaction and changes in color.
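
    A brief illustration of the multivariate step: the sketch below (Python, scikit-learn) shows how a partial least squares model could relate mid-infrared spectra to measured melt temperatures. The array shapes, placeholder values and number of latent variables are assumptions for illustration, not details taken from the study.

        # Sketch only: correlating MIR spectra with melt temperature via PLS.
        # `spectra` and `melt_temp` are random placeholders; real data would be
        # an (n_samples, n_wavenumbers) absorbance matrix and thermocouple
        # readings from the extruder barrel.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        spectra = rng.normal(size=(60, 800))         # placeholder MIR spectra
        melt_temp = rng.normal(110.0, 5.0, size=60)  # placeholder melt temperatures, degC

        pls = PLSRegression(n_components=5)          # number of latent variables: a free choice
        r2 = cross_val_score(pls, spectra, melt_temp, cv=5, scoring="r2")
        print(f"cross-validated R^2: {r2.mean():.2f}")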

    Afterburner Performance of Circular V-Gutters and a Sector of Parallel V-Gutters for a Range of Inlet Temperatures to 1255 K (1800 F)

    Combustion tests of two V-gutter types were conducted in a 19.25-in. diameter duct using vitiated air. Fuel spraybars were mounted in line with the V-gutters. Combustor length was set by flame-quench water sprays, which were part of a calorimeter for measuring combustion efficiency. Although the levels of performance of the parallel and circular array afterburners were different, the trends with geometry variations were consistent. Therefore, parallel arrays can be used for evaluating V-gutter geometry effects on combustion performance. For both arrays, the highest inlet temperature produced combustion efficiencies near 100 percent. A 5-in. spraybar-to-V-gutter spacing gave higher efficiency and better lean blowout performance than a spacing twice as large. Gutter durability was good.
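
    As a rough illustration of how a water-spray calorimeter yields combustion efficiency, the sketch below (Python) forms the ratio of measured to ideal heat release. The heating value and all flow numbers are assumed placeholders; the report's actual heat-balance terms are not reproduced here.

        # Sketch only: combustion efficiency from a calorimeter heat balance.
        # All numbers are illustrative assumptions, not data from the report.
        FUEL_LHV = 43.0e6   # lower heating value of the fuel, J/kg (assumed)

        def combustion_efficiency(m_dot_fuel, q_water, q_sensible):
            """Measured heat release divided by the ideal heat release.

            m_dot_fuel : fuel mass flow, kg/s
            q_water    : heat absorbed by the flame-quench water sprays, W
            q_sensible : sensible enthalpy rise of the gas stream, W
            """
            q_ideal = m_dot_fuel * FUEL_LHV
            return (q_water + q_sensible) / q_ideal

        print(combustion_efficiency(m_dot_fuel=0.5, q_water=1.9e7, q_sensible=1.5e6))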

    Quantum Optical Experiments Modeled by Long Short-Term Memory

    We demonstrate how machine learning is able to model experiments in quantum physics. Quantum entanglement is a cornerstone for upcoming quantum technologies such as quantum computation and quantum cryptography. Of particular interest are complex quantum states with more than two particles and a large number of entangled quantum levels. Given such a multiparticle high-dimensional quantum state, it is usually impossible to reconstruct an experimental setup that produces it. To search for interesting experiments, one thus has to randomly create millions of setups on a computer and calculate the respective output states. In this work, we show that machine learning models can provide significant improvement over random search. We demonstrate that a long short-term memory (LSTM) neural network can successfully learn to model quantum experiments by correctly predicting output state characteristics for given setups without the necessity of computing the states themselves. This approach not only allows for faster search but is also an essential step towards automated design of multiparticle high-dimensional quantum experiments using generative machine learning models.
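
    To make the modeling step concrete, here is a minimal sketch (Python, PyTorch) of an LSTM that maps a tokenized setup sequence to one predicted output-state characteristic. The vocabulary, sequence encoding and layer sizes are assumptions; the paper's actual setup encoding and prediction targets are not reproduced.

        # Sketch only: an LSTM predicting a state characteristic from a setup,
        # encoded as a sequence of integer tokens (one per optical element).
        import torch
        import torch.nn as nn

        class SetupLSTM(nn.Module):
            def __init__(self, vocab_size=32, embed_dim=16, hidden_dim=64):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, embed_dim)
                self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
                self.head = nn.Linear(hidden_dim, 1)

            def forward(self, tokens):                     # tokens: (batch, seq_len)
                x = self.embed(tokens)
                _, (h_n, _) = self.lstm(x)                 # final hidden state
                return torch.sigmoid(self.head(h_n[-1]))  # one probability per setup

        model = SetupLSTM()
        setups = torch.randint(0, 32, (8, 12))   # 8 random setups, 12 elements each
        print(model(setups).shape)               # -> torch.Size([8, 1])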

    Improved Quantification of Important Beer Quality Parameters based on Non-linear Calibration Methods applied to FT-MIR Spectra

    During the production process of beer, it is of utmost importance to guarantee a high consistency of beer quality. For instance, bitterness is an essential quality parameter which has to be controlled within specifications from the very beginning of the production process, in the unfermented beer (wort) as well as in final products such as beer and beer mix beverages. Nowadays, analytical techniques for quality control in beer production are mainly based on manual supervision, i.e. samples are taken from the process and analyzed in the laboratory. This typically demands significant effort from lab technicians while covering only a small fraction of the samples, which leads to significant costs for breweries. Fourier transform mid-infrared (FT-MIR) spectroscopy was used in combination with non-linear multivariate calibration techniques to overcome (i) the time-consuming off-line analyses in beer production and (ii) the known limitations of standard linear chemometric methods, such as partial least squares (PLS), for important quality parameters [1][2] such as bitterness, citric acid, total acids, free amino nitrogen, final attenuation and foam stability. The calibration models are established with enhanced non-linear techniques based (i) on a new piece-wise linear version of PLS that employs fuzzy rules for locally partitioning the latent variable space and (ii) on extensions of support vector regression variants (ε-PLSSVR and ν-PLSSVR) that overcome high computation times in high-dimensional problems and the time-intensive, often inappropriate manual setting of the kernel parameters. Furthermore, we introduce a new model selection scheme based on bagged ensembles in order to improve robustness, and thus the predictive quality, of the final models. The approaches are tested on real-world calibration data sets for wort and beer mix beverages and compared to linear methods: they show a clear out-performance in most cases and are able to meet the model quality requirements defined by the experts at the beer company.
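
    For readers unfamiliar with the linear/non-linear contrast, the sketch below (Python, scikit-learn) pits a plain PLS baseline against an RBF-kernel SVR on placeholder spectra. It does not reimplement the paper's fuzzy piece-wise PLS or the ε-PLSSVR/ν-PLSSVR variants; data shapes and hyper-parameters are assumptions.

        # Sketch only: linear PLS vs. non-linear SVR calibration on spectra.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(120, 500))         # placeholder FT-MIR spectra
        bitterness = rng.normal(20.0, 3.0, size=120)  # placeholder bitterness values

        models = {
            "PLS (linear)": PLSRegression(n_components=8),
            "eps-SVR (non-linear)": make_pipeline(StandardScaler(),
                                                  SVR(kernel="rbf", epsilon=0.1)),
        }
        for name, model in models.items():
            r2 = cross_val_score(model, spectra, bitterness, cv=5, scoring="r2")
            print(f"{name}: mean cross-validated R^2 = {r2.mean():.2f}")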

    Smart Cities and Cyber Security: Are We There Yet? A Comparative Study on the Role of Standards, Third Party Risk Management and Security Ownership

    Smart cities have brought a variety of benefits aiming to revolutionise people’s lives. These include, but are not limited to, increasing economic efficiency, reducing cost and decreasing environmental impact. However, the smart city itself is still in its infancy. As it relies heavily on technologies, it opens doors to cyber attackers and criminals, which can lead to significant losses. An outstanding problem concerns the social and organisational aspects of smart city security, resulting from the competing interests of different parties, high levels of interdependence, and social and political complexity. Our review shows that current standards and guidelines have not clearly defined the roles and responsibilities of different parties, and a common understanding of key security requirements is not shared between them. This research assessed smart cities and their cyber security measures, with a particular focus on technical standards and the regulatory framework. It comprehensively reviewed 93 security standards and guidance documents, then performed a comparative case study of the Barcelona, Singapore and London smart cities covering their governance models, security measures, technical standards and third party management. Based on the review and the case study, this research concluded with a recommended framework encompassing technical standards, governance input, regulatory framework and compliance assurance to ensure that security is observed at all layers of the smart city.

    Calibration Model Maintenance in Melamine Resin Production: Integrating Drift Detection, Smart Sample Selection and Model Adaptation

    The physico-chemical properties of Melamine Formaldehyde (MF) based thermosets are largely influenced by the degree of polymerization (DP) in the underlying resin. On-line supervision of the turbidity point by means of vibrational spectroscopy has recently emerged as a promising technique to monitor the DP of MF resins. However, spectroscopic determination of the DP relies on chemometric models, which are usually sensitive to drifts caused by instrumental and/or sample-associated changes occurring over time. In order to detect the time point when drifts start causing prediction bias, we here explore a universal drift detector based on a faded version of the Page-Hinkley (PH) statistic, which we test on three data streams from an industrial MF resin production process. We employ committee disagreement (CD), computed as the variance of model predictions from an ensemble of partial least squares (PLS) models, as a measure of sample-wise prediction uncertainty, and use the PH statistic to detect changes in this quantity. We further explore supervised and unsupervised strategies for (semi-)automatic model adaptation upon detection of a drift. For the former, manual reference measurements are requested whenever statistical thresholds on Hotelling’s T² and/or Q-residuals are violated. Models are subsequently re-calibrated using weighted partial least squares in order to increase the influence of newer samples, which increases the flexibility when adapting to new (drifted) states. Unsupervised model adaptation is carried out exploiting the dual antecedent-consequent structure of a recently developed fuzzy systems variant of PLS termed FLEXFIS-PLS. In particular, antecedent parts are updated while maintaining the internal structure of the local linear predictors (i.e. the consequents). We found improved drift detection capability of the CD compared to Hotelling’s T² and Q-residuals when used in combination with the proposed PH test. Furthermore, we found that active selection of samples by active learning (AL) for subsequent model adaptation is advantageous compared to passive (random) selection in cases where a drift leads to persistent prediction bias, allowing more rapid adaptation at lower reference measurement rates. Fully unsupervised adaptation using FLEXFIS-PLS could improve predictive accuracy significantly for light drifts but was not able to fully compensate for prediction bias in the case of a significant lack of fit w.r.t. the latent variable space.
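
    The drift-detection core is compact enough to sketch. Below is a plain Page-Hinkley test (Python) fed with committee-disagreement values; the thresholds are assumptions, and the faded variant of the statistic used in the paper is not included.

        # Sketch only: Page-Hinkley test on a stream of committee disagreements
        # (e.g. the variance of predictions from an ensemble of PLS models).
        class PageHinkley:
            def __init__(self, delta=0.005, lam=0.1):
                self.delta, self.lam = delta, lam   # tolerance and alarm threshold (assumed)
                self.mean, self.n = 0.0, 0
                self.cum, self.cum_min = 0.0, 0.0

            def update(self, x):
                """Feed one value; return True when a drift is flagged."""
                self.n += 1
                self.mean += (x - self.mean) / self.n    # running mean
                self.cum += x - self.mean - self.delta   # cumulative deviation
                self.cum_min = min(self.cum_min, self.cum)
                return self.cum - self.cum_min > self.lam

        ph = PageHinkley()
        stream = [0.01] * 50 + [0.08] * 20   # disagreement jumps at sample 50
        for t, x in enumerate(stream):
            if ph.update(x):
                print(f"drift flagged at sample {t}")
                break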

    Microscopic Analysis of Thermodynamic Parameters from 160 MeV/n - 160 GeV/n

    Microscopic calculations of central collisions between heavy nuclei are used to study fragment production and the creation of collective flow. It is shown that the final phase space distributions are compatible with the expectations from a thermally equilibrated source, which in addition exhibits a collective transverse expansion. However, the microscopic analyses of the transient states in the reaction stages of highest density and during the expansion show that the system does not reach global equilibrium. Even if a considerable amount of equilibration is assumed, the connection of the measurable final state to the macroscopic parameters, e.g. the temperature, of the transient "equilibrium" state remains ambiguous.
    Comment: 13 pages, LaTeX, 8 postscript figures, Proceedings of the Winter Meeting in Nuclear Physics (1997), Bormio (Italy).
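
    The ambiguity noted at the end has a simple quantitative face. A commonly used approximation for heavy fragments (standard in the heavy-ion literature, not a formula taken from this paper) describes transverse-mass spectra from a thermal source with collective transverse flow:

        % Apparent (blueshifted) temperature of a thermal source with flow;
        % a textbook approximation, not a result from the paper.
        \[
          \frac{dN}{m_T\, dm_T} \propto e^{-m_T/T_{\mathrm{eff}}},
          \qquad
          T_{\mathrm{eff}} \approx T + \tfrac{1}{2}\, m\, \langle\beta_t\rangle^{2},
        \]
        % A fitted slope T_eff alone cannot separate the true temperature T
        % from the collective flow velocity <beta_t>, which is one face of
        % the ambiguity discussed in the abstract.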