1,769 research outputs found

    Advanced glycation end products and age-related diseases in the general population

    Get PDF
    In this thesis, epidemiological, nutritional, and gut microbiome-related studies are presented to illustrate the relation of advanced glycation end products (AGEs) with age-related diseases. The studies are embedded in the Rotterdam Study, a cohort of middle-aged and elderly adults from the Dutch general population. The amount of skin AGEs, measured as skin autofluorescence (SAF), was used as a proxy for the long-term AGE burden. Chapter 1 gives an overview of the whole thesis (Section 1.1) and a brief introduction to AGEs and their implications in disease pathophysiology. Chapter 2 focuses on the interplay of skin AGEs with clinical and lifestyle factors, and Chapter 3 concerns the link of skin and dietary AGEs with age-related diseases. Chapter 4 discusses the interpretations and implications of the findings, major methodological considerations, and pressing questions for future research.

    Tools for efficient Deep Learning

    Get PDF
    In the era of Deep Learning (DL), there is fast-growing demand for building and deploying Deep Neural Networks (DNNs) on various platforms. This thesis proposes five tools that address the challenges of designing DNNs that are efficient in time, resources, and power consumption. We first present Aegis and SPGC, which improve the memory efficiency of DL training and inference. Aegis makes mixed precision training (MPT) more stable through layer-wise gradient scaling; empirical experiments show that Aegis can improve MPT accuracy by up to 4%. SPGC focuses on structured pruning: replacing standard convolution with group convolution (GConv) to avoid irregular sparsity. SPGC formulates GConv pruning as a channel permutation problem and proposes a novel heuristic polynomial-time algorithm; common DNNs pruned by SPGC achieve up to 1% higher accuracy than prior work. This thesis also addresses the gap between DNN descriptions and executables with Polygeist for software and POLSCA for hardware. Novel techniques, e.g. statement splitting and memory partitioning, are explored and used to extend polyhedral optimisation. Polygeist speeds up sequential and parallel software execution by 2.53x and 9.47x on Polybench/C, and POLSCA achieves a 1.5x speedup over hardware designs generated directly from high-level synthesis on Polybench/C. Moreover, this thesis presents Deacon, a framework that generates FPGA-based DNN accelerators with streaming architectures and advanced pipelining techniques to handle heterogeneous convolutions and residual connections. Deacon provides fine-grained pipelining, graph-level optimisation, and heuristic exploration by graph colouring. Compared with prior designs, Deacon improves resource/power efficiency by 1.2x/3.5x for MobileNets and 1.0x/2.8x for SqueezeNets. All these tools are open source, and some have already gained public engagement. We believe they can make efficient deep learning applications easier to build and deploy.
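    Aegis's actual method is not detailed in the abstract; the following is a minimal NumPy sketch of the general idea behind layer-wise gradient scaling in mixed precision training. All names and gradient values here are hypothetical illustrations, not taken from the thesis:

    ```python
    import numpy as np

    def to_fp16(x):
        """Round-trip through half precision, as MPT does for gradients."""
        return x.astype(np.float16).astype(np.float32)

    def scaled_fp16_grad(grad_fp32, scale):
        """Scale before the fp16 cast and unscale after, so tiny gradients
        that would flush to zero in fp16 survive the round trip."""
        return to_fp16(grad_fp32 * scale) / scale

    # Per-layer gradients with very different magnitudes (made-up values).
    grads = {"conv1": np.array([1e-8, -3e-8]), "fc": np.array([0.5, -0.25])}

    # A single global scale must accommodate the largest layer; choosing a
    # scale per layer lets each layer use the fp16 dynamic range well.
    FP16_MAX = 65504.0
    for name, g in grads.items():
        layer_scale = FP16_MAX / (2 * np.abs(g).max())  # simple per-layer choice
        naive = to_fp16(g)                    # underflows for tiny gradients
        scaled = scaled_fp16_grad(g, layer_scale)
        print(name, naive, scaled)
    ```

    The per-layer scale choice above is only one plausible heuristic; the point is that an unscaled fp16 cast loses gradients below the fp16 subnormal range, while a layer-wise scale preserves them.
    
    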

    Explainable temporal data mining techniques to support the prediction task in Medicine

    Get PDF
    In the last decades, the increasing amount of data available in all fields has raised the need to discover new knowledge and explain the hidden information found. On one hand, the rapid increase of interest in, and use of, artificial intelligence (AI) in computer applications has raised a parallel concern about its ability (or lack thereof) to provide understandable, or explainable, results to users. In the biomedical informatics and computer science communities, there is considerable discussion about the "un-explainable" nature of artificial intelligence, where algorithms and systems often leave users, and even developers, in the dark with respect to how results were obtained. Especially in the biomedical context, the necessity to explain the results of an artificial intelligence system is legitimate given the importance of patient safety. On the other hand, current database systems enable us to store huge quantities of data, and their analysis through data mining techniques makes it possible to extract relevant knowledge and useful hidden information. Relationships and patterns within these data could provide new medical knowledge. The analysis of such healthcare/medical data collections could greatly help to observe the health conditions of the population and extract useful information that can be exploited in the assessment of healthcare/medical processes. In particular, the prediction of medical events is essential for preventing disease, understanding disease mechanisms, and increasing patient quality of care. In this context, an important aspect is to verify whether the database content supports the capability of predicting future events. In this thesis, we start by addressing the problem of explainability, discussing some of the most significant challenges that need to be addressed with scientific and engineering rigor in a variety of biomedical domains.
    We analyze the "temporal component" of explainability, detailing different perspectives such as the use of temporal data, the temporal task, temporal reasoning, and the dynamics of explainability with respect to the user perspective and to knowledge. Starting from this panorama, we focus on two temporal data mining techniques. The first, based on trend abstractions, starts from the concept of Trend-Event Pattern and, moving through the concept of prediction, proposes a new kind of predictive temporal pattern, namely Predictive Trend-Event Patterns (PTE-Ps); the framework combines complex temporal features to extract a compact and non-redundant predictive set of patterns composed of such features. The second, based on functional dependencies, proposes a methodology for deriving a new kind of approximate temporal functional dependency, called Approximate Predictive Functional Dependencies (APFDs), based on a three-window framework. We then discuss the concept of approximation, the data complexity of deriving an APFD, two new error measures, and the quality of APFDs in terms of coverage and reliability. Exploiting these methodologies, we analyze intensive care unit data from the MIMIC dataset.
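    The abstract does not give the APFD definitions themselves. As a rough illustration of the underlying notion, the sketch below computes the classical g3-style error of a plain (non-temporal) approximate functional dependency, i.e. the minimum fraction of rows that must be removed so the dependency holds exactly; this is the classical idea that APFDs extend to a temporal, predictive setting. The data and attribute names are toy examples:

    ```python
    from collections import Counter, defaultdict

    def afd_error(rows, lhs, rhs):
        """g3-style error of the dependency lhs -> rhs: the minimum fraction
        of rows to remove so that each lhs value maps to a single rhs value."""
        groups = defaultdict(Counter)
        for row in rows:
            key = tuple(row[a] for a in lhs)
            groups[key][tuple(row[a] for a in rhs)] += 1
        # Keep, per lhs group, only the most common rhs value; the rest violate.
        kept = sum(c.most_common(1)[0][1] for c in groups.values())
        return 1 - kept / len(rows)

    # Toy patient records (hypothetical): does drug determine outcome?
    rows = [
        {"drug": "A", "outcome": "improved"},
        {"drug": "A", "outcome": "improved"},
        {"drug": "A", "outcome": "worsened"},   # one violating row
        {"drug": "B", "outcome": "stable"},
    ]
    print(afd_error(rows, ["drug"], ["outcome"]))  # 1 of 4 rows violates: 0.25
    ```

    An exact dependency has error 0; an approximate dependency is accepted when the error stays below a chosen threshold.
    
    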

    Re-evaluation of the risks to public health related to the presence of bisphenol A (BPA) in foodstuffs

    Get PDF
    Publisher Copyright: © 2023 European Food Safety Authority. EFSA Journal published by Wiley-VCH GmbH on behalf of European Food Safety Authority. In 2015, EFSA established a temporary tolerable daily intake (t-TDI) for BPA of 4 μg/kg body weight (bw) per day. In 2016, the European Commission mandated EFSA to re-evaluate the risks to public health from the presence of BPA in foodstuffs and to establish a tolerable daily intake (TDI). For this re-evaluation, a pre-established protocol was used that had undergone public consultation. The CEP Panel concluded that it is Unlikely to Very Unlikely that BPA presents a genotoxic hazard through a direct mechanism. Taking into consideration the evidence from animal data and support from human observational studies, the immune system was identified as most sensitive to BPA exposure. An effect on Th17 cells in mice was identified as the critical effect; these cells are pivotal in cellular immune mechanisms and involved in the development of inflammatory conditions, including autoimmunity and lung inflammation. A reference point (RP) of 8.2 ng/kg bw per day, expressed as a human equivalent dose, was identified for the critical effect. Uncertainty analysis assessed a probability of 57–73% that the lowest estimated Benchmark Dose (BMD) for other health effects was below the RP based on Th17 cells. In view of this, the CEP Panel judged that an additional uncertainty factor (UF) of 2 was needed for establishing the TDI. Applying an overall UF of 50 to the RP, a TDI of 0.2 ng BPA/kg bw per day was established. Comparison of this TDI with the dietary exposure estimates from the 2015 EFSA opinion showed that both the mean and the 95th percentile dietary exposures in all age groups exceeded the TDI by two to three orders of magnitude. Even considering the uncertainty in the exposure assessment, the exceedance is so large that the CEP Panel concluded there is a health concern from dietary BPA exposure. Peer reviewed.
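    The TDI derivation quoted above is simple arithmetic and can be checked directly. The values are taken from the abstract; the rounding of 0.164 up to 0.2 is the Panel's own step, not a computation:

    ```python
    # Reference point (human equivalent dose) and uncertainty factor from the text.
    rp_ng_per_kg_bw_day = 8.2   # RP for the Th17 critical effect
    overall_uf = 50             # overall uncertainty factor (includes the extra UF of 2)

    tdi = rp_ng_per_kg_bw_day / overall_uf
    print(tdi)  # 0.164 ng/kg bw per day, stated in the opinion as 0.2
    ```
    
    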

    University of Windsor Graduate Calendar 2023 Spring

    Get PDF
    https://scholar.uwindsor.ca/universitywindsorgraduatecalendars/1027/thumbnail.jp

    Implementation of a real time Hough transform using FPGA technology

    Get PDF
    This thesis is concerned with the modelling, design and implementation of efficient architectures for performing the Hough Transform (HT) on mega-pixel resolution real-time images using Field Programmable Gate Array (FPGA) technology. Although the HT has been around for many years and a number of algorithms have been developed, it still remains a significant bottleneck in many image processing applications. While the basic idea of the HT is to locate parameterizable curves in an image, e.g. straight lines, polynomials or circles, in a suitable parameter space, the research presented in this thesis focuses only on the location of straight lines in binary images. The HT algorithm uses an accumulator array (accumulator bins) to detect the existence of a straight line in an image. As the image needs to be binarized, a novel generic synchronization circuit for windowing operations was designed to perform edge detection. An edge detection method of special interest, the Canny method, is used, and its design and implementation in hardware is achieved in this thesis. As each image pixel can be processed independently, parallel processing can be performed. However, the main disadvantage of the HT is its large storage and computational requirements. This thesis presents new, state-of-the-art hardware implementations that minimize the computational cost, using the Hybrid-Logarithmic Number System (Hybrid-LNS) to calculate the HT for fixed bit-width architectures. It is shown that using the Hybrid-LNS the computational cost is minimized while the precision of the HT algorithm is maintained. Advances in FPGA technology now make it possible to implement functions such as the HT in reconfigurable fabrics. Methods for storing large arrays on FPGAs are presented, where data from a 1024 x 1024 pixel camera are processed at a rate of up to 25 frames per second.
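    As a rough software sketch of the accumulator idea described above (an illustration of the standard line-detecting HT, not the thesis's FPGA design; function and variable names are made up):

    ```python
    import numpy as np

    def hough_lines(edge_img, n_theta=180):
        """Accumulate votes in (rho, theta) space for each edge pixel.
        Each set pixel votes for every line rho = x*cos(t) + y*sin(t)
        that could pass through it; peaks in the array indicate lines."""
        h, w = edge_img.shape
        diag = int(np.ceil(np.hypot(h, w)))          # max possible |rho|
        thetas = np.deg2rad(np.arange(n_theta))
        acc = np.zeros((2 * diag, n_theta), dtype=np.uint32)  # rho offset by diag
        ys, xs = np.nonzero(edge_img)
        for x, y in zip(xs, ys):
            rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
            acc[rhos + diag, np.arange(n_theta)] += 1
        return acc, thetas, diag

    # A horizontal line y = 2 in a binary image: all 8 of its pixels vote for
    # the accumulator bin at rho = 2, theta = 90 degrees.
    img = np.zeros((8, 8), dtype=np.uint8)
    img[2, :] = 1
    acc, thetas, diag = hough_lines(img)
    print(acc[2 + diag, 90], acc.max())
    ```

    The accumulator here is the structure whose storage and per-pixel update cost dominate the hardware design; the thesis's Hybrid-LNS approach targets the cos/sin multiply-accumulate inside the voting loop.
    
    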

    University of Windsor Graduate Calendar 2023 Winter

    Get PDF
    https://scholar.uwindsor.ca/universitywindsorgraduatecalendars/1026/thumbnail.jp