
    L1-norm Regularized L1-norm Best-fit line problem

    Background: Conventional Principal Component Analysis (PCA) is a widely used technique for reducing data dimension. PCA finds linear combinations of the original features that capture maximal variance via Singular Value Decomposition (SVD). However, SVD is sensitive to outliers and often leads to high-dimensional results. To address these issues, we propose a new method to estimate the best-fit one-dimensional subspace, called the l1-norm Regularized l1-norm Best-Fit Line. Methods: We describe a method to fit a lower-dimensional subspace by approximately solving a non-linear, non-convex, non-smooth optimization problem, the l1-regularized l1-norm Best-Fit Line problem, which minimizes a combination of the l1 error and an l1 regularization term. The procedure can be performed simply using ratios and sorting. We also present applications in video surveillance, where our methodology allows background subtraction in the presence of jitter, illumination changes, and clutter. Results: We compared our performance with SVD on synthetic data. The numerical results show that our algorithm finds a better principal component than SVD from grossly corrupted data in terms of discordance. Moreover, our algorithm provides a sparser principal component than SVD. However, we expect it to be faster in a multi-node environment. Conclusions: This paper proposes a new algorithm that generates a sparse best-fit subspace robust to outliers. The subspaces found on non-contaminated data differ little from those of traditional PCA. When subspaces are estimated from contaminated data, the algorithm attains both smaller discordance and lower dimension than traditional PCA.
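    To make the "ratios and sorting" idea concrete, below is a minimal Python sketch of an (unregularized) l1-norm best-fit line through the origin. The anchor-coordinate formulation, the weighted-median step, and the function names are illustrative assumptions in the spirit of the procedure described above, not the authors' exact l1-regularized algorithm.

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: minimizes sum_i w_i * |values_i - m| over m."""
    order = np.argsort(values)                 # the sorting step
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

def l1_best_fit_line(X):
    """Best-fit line through the origin under an l1 fit criterion (sketch).

    For each candidate anchor coordinate j, set v_j = 1 and take the
    projection coefficient of point i to be x_ij.  Every other component
    v_k is then the weighted median of the ratios x_ik / x_ij with
    weights |x_ij|, which minimizes sum_i |x_ik - v_k * x_ij|.
    The anchor with the smallest total l1 error wins.
    """
    n, p = X.shape
    best_err, best_v = np.inf, None
    for j in range(p):
        xj = X[:, j]
        mask = xj != 0                          # ratios undefined where x_ij = 0
        if not mask.any():
            continue
        v = np.zeros(p)
        v[j] = 1.0
        for k in range(p):
            if k != j:
                v[k] = weighted_median(X[mask, k] / xj[mask], np.abs(xj[mask]))
        err = np.abs(X - np.outer(xj, v)).sum() # total l1 residual
        if err < best_err:
            best_err, best_v = err, v / np.linalg.norm(v)
    return best_v, best_err

# Example: a dominant direction plus a few gross outliers
rng = np.random.default_rng(0)
X = np.outer(rng.normal(size=200), [1.0, 0.5, 0.0]) + 0.05 * rng.normal(size=(200, 3))
X[:5] += 10 * rng.normal(size=3)
v, err = l1_best_fit_line(X)
print("estimated direction:", np.round(v, 3))
```

    Sorting enters only through the weighted median, which is the exact minimizer of each one-dimensional l1 subproblem; the paper's l1 penalty on the direction is omitted here.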

    Normal subgroups of diffeomorphism and homeomorphism groups of R^n and other open manifolds

    We determine all the normal subgroups of the group of C^r diffeomorphisms of R^n, r = 1, 2, ..., infinity, except when r = n+1 or n = 4, and also of the group of homeomorphisms of R^n (r = 0). We also study the group A_0 of diffeomorphisms of an open manifold M that are isotopic to the identity. If M is the interior of a compact manifold with nonempty boundary, then the quotient of A_0 by the normal subgroup of diffeomorphisms that coincide with the identity near a given end e of M is simple. Comment: This version corrects an error regarding the diffeomorphism groups of R^1, since the line has two ends, in contrast to higher dimensions.

    Peak Effect in Superconductors: Melting of Larkin Domains

    Motivated by the recent observations of the peak effect in high-$T_c$ YBCO superconductors, we reexamine the origin of this unusual phenomenon. We show that the mechanism based on the $k$-dependence (nonlocality) of the vortex-lattice tilt modulus $C_{44}(\mathbf{k})$ cannot account for the essential feature of the peak effect. We propose a scenario in which the peak effect is related to the melting of Larkin domains. In our model, the rise of critical current with increasing temperature is a result of a crossover from the Larkin pinning length to the length scale set by thermally excited free dislocations. Comment: 13 pages, 2 figures, REVTeX

    Study of near consensus complex social networks using Eigen theory

    This paper extends the definition of an exact consensus complex social network to that of a near consensus complex social network. A near consensus complex social network is a social network with nontrivial topological features in which the steady-state values of the decision certitudes of the majority of the nodes are either higher or lower than a threshold value. Using eigen theory, the relationships among the vectors representing the steady-state values of the decision certitudes of the nodes, the influence weight matrix, and the set of vectors representing the initial values of the decision certitudes of the nodes that satisfy a given near-consensus specification are characterized.
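    As a concrete illustration of this setting (not the paper's exact formulation), the sketch below assumes a DeGroot-style update in which the vector of decision certitudes evolves as x_{t+1} = W x_t for a row-stochastic influence weight matrix W; the steady state is then governed by the dominant eigenvector of W, and a near-consensus check asks whether a majority of steady-state certitudes lie on the same side of a threshold. The matrix, threshold, and majority fraction used here are illustrative assumptions.

```python
import numpy as np

def steady_state_certitudes(W, x0, iters=1000, tol=1e-10):
    """Iterate x_{t+1} = W x_t (assumed DeGroot-style update) to a fixed point.

    W  : row-stochastic influence weight matrix (n x n)
    x0 : initial decision certitudes of the n nodes, values in [0, 1]
    The limit corresponds to the projection of x0 onto the dominant
    eigenspace of W (eigenvalue 1 for a row-stochastic matrix).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_next = W @ x
        if np.max(np.abs(x_next - x)) < tol:
            return x_next
        x = x_next
    return x

def is_near_consensus(x_ss, threshold=0.5, majority=0.8):
    """Near consensus: a majority of nodes end on the same side of the threshold."""
    above = np.mean(x_ss > threshold)
    return max(above, 1.0 - above) >= majority

# Illustrative 4-node network with a weakly connected dissenting node
W = np.array([[0.6, 0.2, 0.2, 0.0],
              [0.2, 0.6, 0.2, 0.0],
              [0.2, 0.2, 0.5, 0.1],
              [0.0, 0.0, 0.1, 0.9]])
x0 = np.array([0.9, 0.8, 0.7, 0.1])
x_ss = steady_state_certitudes(W, x0)
print(np.round(x_ss, 3), is_near_consensus(x_ss))
```

    Exact consensus would require all steady-state certitudes to coincide; the near-consensus predicate above only requires a majority to clear (or fall below) the threshold.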

    Repeat prescribing of medications: a system-centred risk management model for primary care organisations

    Rationale, aims and objectives: Reducing preventable harm from repeat medication prescriptions is a patient safety priority worldwide. In the United Kingdom, the number of repeat prescription items issued has doubled in the last 20 years, from 5.8 to 13.3 items per patient per annum. This has significant resource implications and consequences for avoidable patient harm. We therefore aimed to test a risk management model to identify, measure, and reduce repeat prescribing system risks in primary care. Methods: All 48 general medical practices in National Health Service (NHS) Lambeth Clinical Commissioning Group (an inner-city area of south London, England) were recruited. Multiple interventions were implemented, including educational workshops, a web-based risk monitoring system, and external reviews of repeat prescribing system risks by clinicians. Data were collected via documentation reviews and interviews and subjected to basic thematic and descriptive statistical analyses. Results: Across the 48 participating general practices, 62 unique repeat prescribing risks were identified on 505 occasions (e.g., practices frequently experiencing difficulty interpreting medication changes on hospital discharge summaries), equating to a mean of 8.1 risks per practice (range: 1-33; SD = 7.13). Seven hundred sixty-seven system improvement actions were recommended across 96 categories (e.g., alerting hospitals to illegible writing and delays with discharge summaries), with a mean of 15.6 actions per practice (range: 0-34; SD = 8.0). Conclusions: The risk management model tested uncovered important safety concerns and facilitated the development and communication of related improvement recommendations. System-wide information on hazardous repeat prescribing and how it could be mitigated is very limited. The approach reported here may have the potential to close this gap and improve the reliability of general practice systems and patient safety, which should be of high interest to primary care organisations internationally.

    Promoting hydrogen production and minimizing catalyst deactivation from the pyrolysis-catalytic steam reforming of biomass on nanosized NiZnAlOx catalysts

    Hydrogen production from the thermochemical conversion of biomass was carried out with nano-sized NiZnAlOx catalysts using a two-stage fixed-bed reactor system. The gases derived from the pyrolysis of wood sawdust in the first stage were catalytically steam reformed in the second stage. The NiZnAlOx catalysts were synthesized by a co-precipitation method with different Ni molar fractions (5, 10, 15, 25 and 35%) and a constant Zn:Al molar ratio of 1:4. The catalysts were characterized by a wide range of techniques, including N2 adsorption, SEM, XRD, TEM, and temperature-programmed oxidation (TPO) and reduction (TPR). Fine metal particles of around 10–11 nm in size were obtained, and the catalysts had high stability, which improved the dispersion of active centers during the reaction and promoted catalyst performance. The yield of gas increased from 49.3 to 74.8 wt.%, and the volumetric concentration of hydrogen increased from 34.7 to 48.1 vol.%, when the Ni loading was increased from 5 to 35%. Meanwhile, the CH4 fraction decreased from 10.2 to 0.2 vol.% and the C2–C4 fraction was reduced from 2.4 to 0.0 vol.%. During the reaction, the crystal size of all catalysts was successfully maintained at around 10–11 nm with low catalyst coke formation (particularly for the 35NiZn4Al catalyst, where negligible coke was found), and no obvious catalyst sintering was detected. The efficient production of hydrogen from the thermochemical conversion of renewable biomass indicates that the two-stage reaction system with the NiZnAl metal oxide catalyst prepared in this work is a promising sustainable route for generating hydrogen from biomass.

    Autophagy-Derived Alzheimer’s Pathogenesis


    Cross laminated plywood construction of a free-form roof

    The paper discusses the design, fabrication and assembly of a free-form roof structure constructed using a hybrid method of cross-laminating structural plywood. The research outlines the roof's design, fabrication and engineering workflow within an integrated fabrication environment led by the architect in collaboration with the engineer. This workflow built resilience into the design and manufacturing process. The digital fabrication of the structure was developed in a single parametric model informed by three sets of physical prototypes. By incorporating as-built site information through digital scans, the assembly process was enhanced with feedback to ensure precision in manufacturing, thereby demonstrating care in construction through digital technology.