
    Coherent and non-coherent processing of multiband radar sensor data

    Increasing resolution is an attractive goal for all types of radar sensor applications. Obtaining high radar resolution is strongly tied to the usable signal bandwidth. The currently available frequency bands, however, restrict the available bandwidth and consequently the achievable range resolution. As more sensors become available, e.g. on automotive platforms, methods of combining information from sensors operating in different, not necessarily overlapping, frequency bands are of interest. It will be shown that it is possible to benefit from perceiving the same radar scene with two or more sensors in distinct frequency bands. Beyond ordinary sensor fusion methods, radar information can be combined more effectively if one compensates for the lack of mutual coherence, thus taking advantage of phase information.

    At high frequencies, complex scatterers can be approximately modeled as a group of single scattering centers with constant delay and slowly varying amplitude, i.e. a set of complex exponentials buried in noise. Eigenanalysis algorithms are well known for their ability to resolve complex exponentials better than classical spectral analysis methods; they exploit the statistical properties of these signals to estimate their frequencies. Here, two main approaches to extending the statistical analysis to data collected in two different subbands are presented. The first method relies on the band gap information (and therefore requires coherent data collection) and achieves increased resolution compared with the single-band case. The second approach does not use the band gap information and represents a robust way to process radar data collected with incoherent sensors. By combining the information obtained with these two approaches, a robust estimator of the target locations with increased resolution can be built.
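    As an illustrative sketch of the eigenanalysis idea the abstract refers to (all function names and parameters below are my own, not from the paper), the following single-band MUSIC estimator resolves two complex exponentials spaced closer than the classical Fourier limit:

    ```python
    import numpy as np

    def music_freqs(x, n_sources, n_grid=2048):
        """Estimate normalized frequencies of complex exponentials in noise
        with a single-band MUSIC (subspace / eigenanalysis) estimator."""
        m = len(x) // 2                                   # snapshot length
        # Overlapping snapshots yield a sample covariance estimate
        snapshots = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        R = snapshots.conj().T @ snapshots / snapshots.shape[0]
        eigvals, eigvecs = np.linalg.eigh(R)              # ascending order
        noise_space = eigvecs[:, :m - n_sources]          # noise subspace
        freqs = np.linspace(-0.5, 0.5, n_grid, endpoint=False)
        steering = np.exp(2j * np.pi * np.outer(np.arange(m), freqs))
        # Pseudospectrum peaks where steering vectors are orthogonal
        # to the noise subspace
        p = 1.0 / np.linalg.norm(noise_space.conj().T @ steering, axis=0) ** 2
        local_max = np.flatnonzero((p > np.roll(p, 1)) & (p > np.roll(p, -1)))
        top = local_max[np.argsort(p[local_max])[-n_sources:]]
        return np.sort(freqs[top])

    # Two closely spaced scatterers at normalized frequencies 0.20 and 0.22,
    # separated by less than the Fourier resolution limit 1/m = 1/32
    rng = np.random.default_rng(0)
    n = np.arange(64)
    x = (np.exp(2j * np.pi * 0.20 * n) + np.exp(2j * np.pi * 0.22 * n)
         + 0.05 * (rng.standard_normal(64) + 1j * rng.standard_normal(64)))
    print(music_freqs(x, 2))
    ```

    Extending such an estimator across two subbands, with or without the band gap information, is the actual subject of the abstract; the single-band version above is only the starting point.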

    Lung Segmentation from Chest X-rays using Variational Data Imputation

    Pulmonary opacification is inflammation of the lungs caused by many respiratory ailments, including the novel coronavirus disease 2019 (COVID-19). Chest X-rays (CXRs) with such opacifications render regions of the lungs imperceptible, making it difficult to perform automated image analysis on them. In this work, we focus on segmenting lungs from such abnormal CXRs as part of a pipeline aimed at automated risk scoring of COVID-19 from CXRs. We treat the high-opacity regions as missing data and present a modified CNN-based image segmentation network that utilizes a deep generative model for data imputation. We train this model on normal CXRs with extensive data augmentation and demonstrate that it extends to cases with extreme abnormalities.
    Comment: Accepted to be presented at the first Workshop on the Art of Learning with Missing Values (Artemiss) hosted by the 37th International Conference on Machine Learning (ICML). Source code, training data and the trained models are available here: https://github.com/raghavian/lungVAE
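    The core idea, treating opaque pixels as missing data and imputing them before downstream analysis, can be illustrated with a deliberately crude stand-in. The sketch below (my own toy code, not the paper's variational model) marks a saturated patch as missing and fills it by iterative neighbour averaging, a diffusion-style inpainting where the paper would use a learned generative imputer:

    ```python
    import numpy as np

    def impute_missing(img, mask, n_iter=200):
        """Fill masked (missing) pixels by iterative 4-neighbour averaging,
        a crude diffusion-inpainting stand-in for learned imputation."""
        out = img.copy()
        out[mask] = out[~mask].mean()          # initialize with global mean
        for _ in range(n_iter):
            avg = (np.roll(out, 1, 0) + np.roll(out, -1, 0)
                   + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 4.0
            out[mask] = avg[mask]              # update only missing pixels
        return out

    # Synthetic "image": smooth ramp with an opaque (saturated) patch
    img = np.linspace(0.0, 1.0, 64)[None, :] * np.ones((64, 64))
    corrupted = img.copy()
    corrupted[20:40, 20:40] = 1.0              # opacification: clipped region
    mask = np.zeros_like(img, dtype=bool)
    mask[20:40, 20:40] = True
    restored = impute_missing(corrupted, mask)
    # Mean reconstruction error inside the patch (small after imputation)
    print(np.abs(restored - img)[mask].mean())
    ```

    The paper's contribution is replacing this hand-crafted fill with a deep generative model trained on normal CXRs, so the imputed content is anatomically plausible rather than merely smooth.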

    From Euclidean Geometry to Knots and Nets

    This document is the Accepted Manuscript of an article accepted for publication in Synthese. Under embargo until 19 September 2018. The final publication is available at Springer via https://doi.org/10.1007/s11229-017-1558-x. This paper assumes the success of arguments against the view that informal mathematical proofs secure rational conviction in virtue of their relations with corresponding formal derivations. This assumption entails a need for an alternative account of the logic of informal mathematical proofs. Following examination of case studies by Manders, De Toffoli and Giardino, Leitgeb, Feferman and others, this paper proposes a framework for analysing those informal proofs that appeal to the perception or modification of diagrams or to the inspection or imaginative manipulation of mental models of mathematical phenomena. Proofs relying on diagrams can be rigorous if (a) it is easy to draw a diagram that shares or otherwise indicates the structure of the mathematical object, (b) the information thus displayed is not metrical and (c) it is possible to put the inferences into systematic mathematical relation with other mathematical inferential practices. Proofs that appeal to mental models can be rigorous if the mental models can be externalised as diagrammatic practice that satisfies these three conditions. Peer reviewed.

    The "Artificial Mathematician" Objection: Exploring the (Im)possibility of Automating Mathematical Understanding

    Reuben Hersh confided to us that, about forty years ago, the late Paul Cohen predicted to him that at some unspecified point in the future, mathematicians would be replaced by computers. Rather than focus on computers replacing mathematicians, however, our aim is to consider the (im)possibility of human mathematicians being joined by “artificial mathematicians” in the proving practice—not just as a method of inquiry but as a fellow inquirer.

    Suppression of grating lobes for MMW sparse array setups

    For arrays, the placement of the individual elements determines the angular resolution and the unambiguity interval. The width of the total array determines the resolution capability: the farther apart the elements are placed, the more space in the Fourier domain is covered by the measurement, and the better the resolution in the time domain. On the other hand, the density of the elements affects the angular interval in which objects can be detected unambiguously. For objects within the unambiguous interval, grating lobes appear outside this area, while objects outside it produce grating lobes in the interval of interest. In this paper, the properties of arrays regarding resolution and unambiguity interval are discussed, and methods for the suppression of ambiguous grating lobes are suggested. One approach to suppressing the influence of the grating lobes lies in the evaluation of different frequency bands.
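    A minimal sketch (function names and parameters are my own) of how sparse element spacing produces grating lobes, and of the multi-band suppression idea the abstract mentions, which works because the grating-lobe positions depend on the spacing in wavelengths while the main lobe does not:

    ```python
    import numpy as np

    def array_factor(n_elem, d_lambda, sin_theta):
        """Normalized broadside power pattern of a uniform linear array
        with n_elem elements spaced d_lambda wavelengths apart."""
        n = np.arange(n_elem)
        af = np.exp(2j * np.pi * d_lambda * np.outer(n, sin_theta)).sum(axis=0)
        return np.abs(af / n_elem) ** 2

    sin_theta = np.linspace(-1.0, 1.0, 4001)
    dense = array_factor(8, 0.5, sin_theta)    # lambda/2 spacing: unambiguous
    sparse = array_factor(8, 2.0, sin_theta)   # 2*lambda spacing: grating
                                               # lobes at sin(theta) = m/2
    # A second frequency band shifts the grating lobes (position ~ 1/d_lambda)
    # but leaves the main lobe at broadside, so averaging the two
    # patterns suppresses the ambiguous lobes.
    combined = 0.5 * (sparse + array_factor(8, 2.5, sin_theta))
    ```

    Plotting `dense`, `sparse` and `combined` against `sin_theta` shows the full-height grating lobes of the sparse array at sin θ = ±0.5 collapsing to roughly half power in the two-band average, while the main lobe at broadside is unchanged.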

    Relevant issues in tumor regression grading of histopathological response to neoadjuvant treatment in adenocarcinomas of the esophagus and gastroesophageal junction

    Multimodality treatment combining surgery and oncologic treatment has become widely applied in the curative treatment of esophageal and gastroesophageal junction adenocarcinoma. There is a need for a standardized tumor regression grade scoring system for clinically relevant effects of neoadjuvant treatment. Numerous tumor regression grading systems are in use, and there is no international standardization. This review has found nine different international systems currently in use. These systems all differ in detail, which inhibits valid comparisons of results between studies. Tumor regression grading in esophageal and gastroesophageal junction adenocarcinoma needs to be improved and standardized. To achieve this goal, we have invited a large group of international esophageal and gastroesophageal junction adenocarcinoma pathology experts to perform a structured review in the form of a Delphi process. The aims of the Delphi process include specifying the details for the handling of the surgical specimen and defining the details of, and the reporting from, the agreed histological tumor regression grade system, including resected lymph nodes. The second step will be to perform a validation study of the agreed tumor regression grading system to ensure scientifically robust inter- and intra-observer variability, and to incorporate the consented tumor regression grading system in clinical studies to assess its predictive and prognostic role in the treatment of esophageal and gastroesophageal junction adenocarcinomas. The ultimate aim of the project is to improve survival in esophageal and gastroesophageal adenocarcinoma by increasing the quality of tumor regression grading, which is a key component in treatment evaluation and future studies of individualized treatment of esophageal cancer.

    An Internet-Based Tool for Use in Assessing the Likely Effect of Intensification on Losses of Nitrogen to the Environment

    The EU Nitrates, Habitats and National Emission Ceilings directives and the Kyoto Protocol mean that agricultural losses of NO3, NH3 and N2O are under scrutiny by national and international environmental authorities. When farmers wish to intensify their operations, the authorities must assess the likely environmental impact of the change in operation. The FARM-N internet tool was developed to help farmers and authorities agree on how the farm will be structured and managed in the future, and to provide an objective assessment of the environmental losses that will result.

    Recent progress in random metric theory and its applications to conditional risk measures

    The purpose of this paper is to give a selective survey on recent progress in random metric theory and its applications to conditional risk measures. This paper includes eight sections. Section 1 is a longer introduction, which gives a brief introduction to random metric theory, risk measures and conditional risk measures. Section 2 gives the central framework in random metric theory, topological structures, important examples, the notions of a random conjugate space and the Hahn-Banach theorems for random linear functionals. Section 3 gives several important representation theorems for random conjugate spaces. Section 4 gives characterizations for a complete random normed module to be random reflexive. Section 5 gives hyperplane separation theorems currently available in random locally convex modules. Section 6 gives the theory of random duality with respect to the locally $L^{0}$-convex topology and in particular a characterization for a locally $L^{0}$-convex module to be $L^{0}$-pre-barreled. Section 7 gives some basic results on $L^{0}$-convex analysis together with some applications to conditional risk measures. Finally, Section 8 is devoted to extensions of conditional convex risk measures, which shows that every representable $L^{\infty}$-type of conditional convex risk measure and every continuous $L^{p}$-type of conditional convex risk measure ($1\leq p<+\infty$) can be extended to an $L^{\infty}_{\mathcal{F}}(\mathcal{E})$-type of $\sigma_{\epsilon,\lambda}(L^{\infty}_{\mathcal{F}}(\mathcal{E}), L^{1}_{\mathcal{F}}(\mathcal{E}))$-lower semicontinuous conditional convex risk measure and an $L^{p}_{\mathcal{F}}(\mathcal{E})$-type of $\mathcal{T}_{\epsilon,\lambda}$-continuous conditional convex risk measure ($1\leq p<+\infty$), respectively.
    Comment: 37 pages
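    For readers unfamiliar with the objects in Section 8, the following states the standard axioms of a conditional convex risk measure as they usually appear in the literature (a background definition, not a quotation from this paper): a map $\rho\colon L^{\infty}(\mathcal{E}) \to L^{\infty}(\mathcal{F})$, for sub-$\sigma$-algebras $\mathcal{F} \subseteq \mathcal{E}$, satisfying

    ```latex
    \begin{align*}
    &\text{Monotonicity:} && X \le Y \implies \rho(X) \ge \rho(Y),\\
    &\text{Translation invariance:} && \rho(X + m) = \rho(X) - m
        \quad \text{for all } m \in L^{\infty}(\mathcal{F}),\\
    &\text{Conditional convexity:} && \rho(\lambda X + (1-\lambda) Y)
        \le \lambda\rho(X) + (1-\lambda)\rho(Y)\\
    &&& \quad \text{for all } \lambda \in L^{\infty}(\mathcal{F}),\ 0 \le \lambda \le 1.
    \end{align*}
    ```

    The extension results of Section 8 concern enlarging the domain of such a $\rho$ from $L^{\infty}$- or $L^{p}$-type spaces to the corresponding random-module-valued spaces while preserving the appropriate continuity.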

    Is the Total Amount as Important as Localization and Type of Collagen in Liver Fibrosis Attributable to Steatohepatitis?

    Is liver fibrosis just liver fibrosis? Or do the subtype of collagen, its spatial localization in the liver, its cell of origin, and the time point at which it is synthesized also matter? This matters, since the various collagen subtypes hold different informative value regarding reparative processes in the liver, and collagens have also emerged as important signaling molecules (1). Novel data have challenged our perception of liver fibrosis and collagens, which may have important implications for the development of new biomarkers and anti-fibrotic interventions. Traditional histological analysis of liver biopsies using histochemical collagen stains, such as Masson's trichrome or Sirius red, groups all triple-helical collagen structures into one gross bucket. Importantly, these stains ignore many other quantitatively minor but nonetheless functionally and structurally relevant collagen and non-collagen extracellular matrix (ECM) components.