
    Statistical Methodologies of Functional Data Analysis for Industrial Applications

    This thesis stands as one of the first attempts to connect the statistical object-oriented data analysis (OODA) methodologies with industry. Indeed, the aim of this thesis is to develop statistical methods to tackle industrial problems through the OODA paradigm. The new framework of Industry 4.0 requires factories equipped with sensors and advanced acquisition systems that acquire data with a high degree of complexity. OODA is particularly suited to this increasing complexity because it considers each statistical unit as an atom, or data object, assumed to be a point in a well-defined mathematical space. This idea allows one to deal with complex data structures by changing the resolution of the analysis: whereas standard methods take a vector of numbers as the atom, the focus now shifts to methodologies in which the objects of analysis are whole, complex entities. In particular, this thesis focuses on functional data analysis (FDA), a branch of OODA that takes as its atoms functions defined on compact domains. The cross-fertilization of FDA methods into industrial applications is developed in three parts in this dissertation. The first part presents methodologies developed to solve specific applied problems. A first substantial portion of this part focuses on profile monitoring methods applied to ship CO₂ emissions. A second portion deals with the problem of predicting the mechanical properties of an additively manufactured artifact from the particle size distribution of the powder used in its production. A third portion addresses cluster analysis for the quality assessment of metal-sheet spot welds in the automotive industry, based on observations of dynamic resistance curves.
Stimulated by these challenges, the second part of this dissertation turns towards a more methodological line that addresses the notion of interpretability for functional data. In particular, two new interpretable estimators of the coefficient function in the function-on-function linear regression model are proposed, named S-LASSO and AdaSS, respectively. Moreover, a new method, referred to as SaS-Funclust, is presented for sparse clustering of functional data; it aims to classify a sample of curves into homogeneous groups while jointly detecting the most informative portions of the domain. In the last part, two ongoing lines of research on FDA methods for industrial applications are presented. The first concerns a new robust nonparametric functional ANOVA method (Ro-FANOVA) for testing differences among group functional means that is robust to outliers, with an application to additive manufacturing. The second sketches a new methodological framework for real-time profile monitoring.
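The atom-of-analysis idea at the heart of FDA can be made concrete with a minimal sketch. All data here are hypothetical and the treatment is deliberately naive (real FDA work, including the methods in this thesis, would use basis expansions and smoothing rather than raw grids): each statistical unit is a whole discretized curve, and summaries such as the functional mean are computed over curves rather than over scalars.

```python
# Minimal illustration of the FDA "atom" idea: each statistical unit is a
# whole curve f_i(t), here discretized on a shared grid of m points.
# Hypothetical toy data; not taken from the thesis.

def functional_mean(curves):
    """Pointwise mean curve of a sample of discretized functions."""
    m = len(curves[0])
    return [sum(c[j] for c in curves) / len(curves) for j in range(m)]

def l2_distance(f, g, dt):
    """Approximate L2 distance between two curves via a Riemann sum."""
    return (sum((a - b) ** 2 for a, b in zip(f, g)) * dt) ** 0.5

# Three toy "atoms": curves observed on the grid t = 0.0, 0.1, ..., 0.9
grid = [j / 10 for j in range(10)]
dt = 0.1
sample = [
    [t ** 2 for t in grid],          # base curve
    [t ** 2 + 0.1 for t in grid],    # shifted up
    [t ** 2 - 0.1 for t in grid],    # shifted down
]
mean_curve = functional_mean(sample)
# The offsets cancel, so the functional mean coincides with t**2.
```

The point of the sketch is only that distances, means and (by extension) clustering or regression are defined between whole curves, which is what distinguishes FDA from vector-based multivariate statistics.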

    RRS James Cook Cruise 30, 26 Dec 2008-30 Jan 2009. Antarctic Deep Water Rates of Export (ANDREX)

    This report describes scientific activities on RRS James Cook cruise 30, “ANDREX”, westwards from 30°E and in the vicinity of latitude 60°S, between late December 2008 and late January 2009. The cruise was terminated about halfway through by a medical emergency. Hydrographic work comprised 27 CTD/LADCP stations. Water samples were captured for measurement of salinity, dissolved oxygen, inorganic nutrients, oxygen isotope fraction, chlorofluorocarbons and sulphur hexafluoride, dissolved inorganic carbon and alkalinity, helium / tritium / noble gases and radiocarbon. Underway measurements comprised navigation, currents (ADCP), meteorology, and sea surface temperature and salinity. The remainder of the hydrographic section was executed a year later on RRS James Clark Ross, cruise JR239

    Deducing water parameters in rivers via statistical modelling

    Advanced monitoring of water quality, in order to perform a real-time hazard analysis upstream of Water Treatment Works (WTW), has become a necessity, both to give warning of any contamination and to avoid downtime of the WTW. Downtime can be a major contributor to risk, and any serious accident causes a significant loss of customer and investor confidence. This has challenged the industry to become more efficient, integrated and attractive, with benefits for its workforce and for society as a whole. The reality is that water companies are not yet prepared to invest heavily in trials before another company announces success in implementing a new monitoring strategy, and this has slowed the development of the water industry. This research has taken the theoretical idea that advanced online monitoring techniques would benefit the water industry a step further, demonstrating by means of a state-of-the-art assessment, usability trials, case studies and demonstrations that the barriers to mainstream adoption can be overcome. The findings of this work have been presented in four peer-reviewed papers. The research undertaken has shown that turbidity levels in rivers can be estimated from the river's mean flow rate, using either a Doppler ultrasound device for real-time readings or past performance history. In both cases, the turbidity level can in turn help estimate both the colour and conductivity levels of the river in question. Recalibration of the equations used is a prerequisite, as each individual river has its own unique “fingerprint”
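The kind of statistical relationship described above can be sketched as a simple least-squares fit of turbidity against mean flow rate. The numbers below are made up purely for illustration, and the closed-form line fit stands in for whatever model the thesis actually calibrates; the abstract's own caveat applies, namely that each river needs its own recalibrated “fingerprint” equation.

```python
# Toy sketch: ordinary least squares fit of turbidity (NTU) against mean
# river flow rate (m^3/s). All numbers are hypothetical.

def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept a, slope b

flow = [5.0, 8.0, 12.0, 20.0, 31.0]     # mean flow rate, hypothetical
turbidity = [2.1, 3.0, 4.2, 6.5, 9.8]   # measured turbidity, hypothetical

a, b = fit_line(flow, turbidity)

def predict_turbidity(q):
    """Estimate turbidity from a flow-rate reading using the fitted line."""
    return a + b * q
```

Once such a relation is calibrated for a given river, a real-time flow reading (e.g. from a Doppler ultrasound device) yields an immediate turbidity estimate, which is the operational idea the abstract describes.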

    Data analysis and data assimilation of Arctic Ocean observations

    Thesis (Ph.D.), University of Alaska Fairbanks, 2019. Arctic-region observations are sparse and represent only a small portion of the physical state of nature. It is therefore essential to maximize the information content of observations and observation-conditioned analyses whenever possible, including the quantification of their accuracy. The four largely disparate works presented here emphasize observation analysis and assimilation in the context of the Arctic Ocean (AO). These studies focus on the relationship between observational data/products, numerical models based on physical processes, and the use of such data to constrain and inform those products/models to different ends. The first part comprises Chapters 1 and 2, which revolve around oceanographic observations collected during the International Polar Year (IPY) program of 2007-2009. Chapter 1 validates pan-Arctic satellite-based sea surface temperature and salinity products against these data to establish important estimates of product reliability in terms of bias and bias-adjusted standard errors. It establishes practical regional reliability for these products, which are often used in modeling and climatological applications, and provides some guidance for improving them. Chapter 2 constructs a gridded full-depth snapshot of the AO during the IPY to visually outline recent, previously documented AO watermass distribution changes by comparing it to a historical climatology of the latter 20th century derived from private Russian data. It provides an expository review of literature documenting major AO climate changes and augments them with additional changes in freshwater distribution and sea surface height in the Chukchi and Bering Seas. The last two chapters present work focused on the application of data assimilation (DA) methodologies, and constitute the second part of this thesis, focused on the synthesis of numerical modeling and observational data.
Chapter 3 presents a novel approach to sea ice model trajectory optimization whereby spatially variable sea ice rheology parameter distributions provide the additional model flexibility needed to assimilate observable components of the sea ice state. The study employs a toy 1D model to demonstrate the practical benefits of the approach and serves as a proof of concept to justify the considerable effort needed to extend the approach to 2D. Chapter 4 combines an ice-free model of the Chukchi Sea with a modified ensemble filter to develop a DA system suitable for operational forecasting and for monitoring the region in support of oil spill mitigation. The method improves the assimilation of non-Gaussian asynchronous surface current observations beyond the traditional approach.
Chapter 1: Sea-surface temperature and salinity product comparison against external in situ data in the Arctic Ocean -- Chapter 2: Changes in Arctic Ocean climate evinced through analysis of IPY 2007-2008 oceanographic observations -- Chapter 3: Toward optimization of rheology in sea ice models through data assimilation -- Chapter 4: Ensemble-transform filtering of HFR & ADCP velocities in the Chukchi Sea -- General conclusion
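The update step underlying an ensemble-filter DA system of the kind described in Chapter 4 can be sketched in a few lines. This is the textbook stochastic ensemble Kalman filter for a directly observed scalar state, not the modified ensemble-transform filter of the thesis, and it makes the Gaussian assumptions that the thesis works to relax.

```python
import random

# Textbook stochastic EnKF analysis step for a scalar state observed
# directly (H = identity). Generic illustration only: the thesis modifies
# an ensemble-transform filter to handle non-Gaussian, asynchronous
# surface-current observations, which this sketch does not attempt.

def enkf_update(ensemble, y_obs, obs_var, rng):
    """Return the analysis ensemble after assimilating one observation."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)  # scalar Kalman gain
    # Each member assimilates an independently perturbed observation,
    # which keeps the analysis ensemble spread statistically consistent.
    return [x + gain * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(1.0, 1.0) for _ in range(200)]   # forecast ensemble
posterior = enkf_update(prior, 2.0, 0.25, rng)       # observation y = 2.0
```

The analysis ensemble mean is pulled toward the observation and the ensemble spread shrinks, exactly the behavior a DA cycle relies on between forecast steps.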

    Environmental factors controlling lake diatom communities: a meta-analysis of published data

    Diatoms play a key role in the development of quantitative methods for environmental reconstruction in lake ecosystems. Diatom-based calibration datasets developed during the last decades allow the inference of past limnological variables such as TP, pH or conductivity and provide information on the autecology and distribution of diatom taxa. However, little is known about the relationships between diatoms and climatic or geographic factors. The response of surface sediment diatom assemblages to abiotic factors is usually examined using canonical correspondence analysis (CCA) and subsequent forward selection of variables based on Monte Carlo permutation tests, which show the set of predictors best explaining the distributions of diatom species. The results reported in 40 previous studies using this methodology in different regions of the world are re-analyzed in this paper. Bi- and multivariate statistics (canonical correlation and two-block partial least-squares) were used to explore the correspondence between physical, chemical and physiographical factors and the variables that explain most of the variance in the diatom datasets. Results show that diatom communities respond mainly to chemical variables (pH, nutrients), with lake depth being the most important physiographical factor. However, the relative importance of certain parameters varied along latitudinal and trophic gradients. Canonical analyses demonstrated a strong concordance with regard to the predictor variables and the amount of variance they captured, suggesting that, on a broad scale, lake diatoms give a robust indication of past and present environmental conditions.
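The bi-block canonical correlation analysis mentioned above can be sketched with a standard numerical recipe: the canonical correlations between two centred data blocks are the singular values of Qx·Qy, where Qx and Qy are orthonormal bases for the blocks' column spaces. The data below are synthetic, standing in for an "environmental" block and a "diatom" block that share a latent driver; none of it comes from the meta-analysis itself.

```python
import numpy as np

# Canonical correlations between two blocks of variables via QR + SVD.
# Synthetic illustration of the bi-block analyses used in the paper.

def canonical_correlations(X, Y):
    """Canonical correlations of two (n_samples x p) data blocks."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    # Singular values of Qx.T @ Qy are the canonical correlations.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

rng = np.random.default_rng(0)
latent = rng.normal(size=200)                       # shared driver, e.g. pH
X = np.column_stack([latent + 0.1 * rng.normal(size=200),
                     rng.normal(size=200)])         # "environment" block
Y = np.column_stack([2.0 * latent + 0.1 * rng.normal(size=200),
                     rng.normal(size=200)])         # "assemblage" block
rho = canonical_correlations(X, Y)
# rho[0] is close to 1 because both blocks share the latent driver;
# rho[1] is small because the remaining columns are independent noise.
```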

    Sensitivity Study on Canadian Air Quality Measurements from Geostationary Orbit

    Tropospheric Emissions: Monitoring of Pollution (TEMPO) is a satellite-based remote sensing air quality instrument destined for geostationary orbit over North America beginning in 2022. TEMPO will take hourly measurements with unprecedented resolution, which will greatly benefit air quality forecasting, monitoring of emission sources, and health impact studies related to air quality. The field of regard of TEMPO contains a significant portion of Canada, including regions of particular interest such as major population centers and the Alberta oil sands. However, the standard retrieval algorithms that will be used to process TEMPO data do not explicitly account for some of the challenges that measurements over Canada present, such as pervasive snow cover, shallow lines of sight, and limited daylight hours. With the ultimate goal of creating new or optimized algorithms that address these challenges and allow Canada to take full advantage of TEMPO, the standard retrieval algorithms for nitrogen dioxide and ozone have been replicated and studied. These algorithms use differential optical absorption spectroscopy (DOAS), the technique that will be used to create the standard TEMPO products, and they will serve as a baseline for comparison with future algorithms. The SASKTRAN radiative transfer framework, developed at the University of Saskatchewan, has been used to calculate air mass factors, key quantities in the DOAS-style retrieval, using three complementary methods that agree with one another. End-to-end retrievals modelled after cutting-edge algorithms used by modern instruments have been implemented and used to conduct a preliminary sensitivity study that quantifies the major sources of uncertainty in DOAS retrievals using synthetic TEMPO measurements
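The air mass factor at the heart of a DOAS retrieval can be illustrated with the simplest textbook approximation: a purely geometric AMF that ignores scattering. This is emphatically not what the thesis computes (SASKTRAN is used precisely because the geometric picture breaks down for the shallow lines of sight and bright snow surfaces discussed above), but it shows why high-latitude viewing geometry inflates slant paths.

```python
import math

# Geometric air mass factor (AMF): the slant-path enhancement through an
# absorbing layer for given solar and viewing zenith angles, neglecting
# scattering. DOAS converts a retrieved slant column density (SCD) to a
# vertical column density (VCD) as VCD = SCD / AMF. Real AMFs come from a
# radiative transfer model (e.g. SASKTRAN in the thesis).

def geometric_amf(sza_deg, vza_deg):
    """Sum of secants of solar and viewing zenith angles."""
    return (1.0 / math.cos(math.radians(sza_deg))
            + 1.0 / math.cos(math.radians(vza_deg)))

def slant_to_vertical(scd, sza_deg, vza_deg):
    """Convert a slant column density to a vertical column density."""
    return scd / geometric_amf(sza_deg, vza_deg)

# Overhead sun, nadir view: light crosses the layer twice, so AMF = 2.
# A winter high-latitude sun (large solar zenith angle) gives a much
# larger AMF, one reason retrievals over Canada are challenging.
```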

    Information systems for regional development planning


    Chemometrics Methods Applied to Non-Selective Signals in Order to Address Mainly Food, Industrial and Environmental Problems

    Chemometrics is a chemical discipline that uses mathematical and statistical methods to extract useful information from multivariate chemical data. Chemometrics is also applied to correlate quality parameters or physical properties with analytical instrument data, as in calculating pH from a measurement of hydrogen ion activity or applying a Fourier-transform interpolation to a spectrum. The aim of this thesis project is to develop chemometric strategies for the elaboration and interpretation of non-selective complex data in order to solve real problems in the food, industrial and environmental fields
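A core chemometric operation on non-selective multivariate signals is dimension reduction, for instance principal component analysis of a set of spectra. The sketch below uses synthetic spectra and a plain SVD; it is a generic illustration of the kind of data this abstract describes, not a reproduction of the strategies the thesis develops.

```python
import numpy as np

# PCA of synthetic "spectra" via SVD: a standard chemometric first step
# for compressing non-selective signals (many correlated wavelength
# channels) into a few latent components. All data are synthetic.

rng = np.random.default_rng(1)
wavelengths = np.linspace(0.0, 1.0, 50)
band = np.exp(-((wavelengths - 0.5) / 0.1) ** 2)   # one absorption band
conc = rng.uniform(0.1, 1.0, size=20)              # analyte "concentrations"
# 20 samples x 50 channels: band scaled by concentration, plus noise
spectra = np.outer(conc, band) + 0.01 * rng.normal(size=(20, 50))

centred = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)   # variance fraction per component
scores = U[:, 0] * s[0]               # PC1 scores for each sample
# With a single underlying band, PC1 captures nearly all the variance,
# and its scores track the concentrations (up to sign and centring).
```

In a calibration setting the scores (or a regression on them, e.g. PCR or PLS) would then be correlated with the quality parameter of interest, which is the link between instrument data and properties that the abstract describes.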