Conservative and aggressive rough SVR modeling
Support vector regression provides an alternative to neural networks for modeling non-linear real-world patterns. Rough values, with a lower and an upper bound, are needed whenever the variables under consideration cannot be represented by a single value. This paper describes two approaches to modeling rough values with support vector regression (SVR). One approach is conservative in nature: it attempts to ensure that the predicted high value does not exceed the upper bound and that the predicted low value does not fall below the lower bound. In contrast, we also propose an aggressive approach, seeking a predicted high that is not less than the upper bound and a predicted low that is not greater than the lower bound. The proposal is shown to use ϔ-insensitivity to provide a more flexible version of lower and upper possibilistic regression models. The usefulness of our work is demonstrated by modeling the rough pattern of a stock market index, which can be taken advantage of by both conservative and aggressive traders.
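A minimal sketch of the underlying idea, assuming scikit-learn is available: fit one ϔ-SVR to the upper series and one to the lower series of a rough variable, then check how often the predicted band stays inside the true band (the conservative criterion) versus encloses it (the aggressive criterion). The data are synthetic, and plain symmetric ϔ-SVR is used for illustration; the paper's one-sided conservative/aggressive constraints would require a custom formulation.

```python
# Illustrative sketch: two eps-SVRs for the high and low bounds of a
# rough (interval-valued) variable. Data and hyperparameters are
# assumptions, not the paper's setup.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 200).reshape(-1, 1)
low = np.sin(X).ravel() - 0.3 + 0.05 * rng.standard_normal(200)
high = np.sin(X).ravel() + 0.3 + 0.05 * rng.standard_normal(200)

svr_low = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, low)
svr_high = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, high)

pred_low, pred_high = svr_low.predict(X), svr_high.predict(X)

# Conservative: predicted band stays inside the observed band.
conservative = np.mean((pred_low >= low) & (pred_high <= high))
# Aggressive: predicted band encloses the observed band.
aggressive = np.mean((pred_low <= low) & (pred_high >= high))
```

A conservative trader would prefer a model scoring high on the first fraction; an aggressive trader, on the second.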
Fast Cross-Validation via Sequential Testing
With the increasing size of today's data sets, finding the right parameter
configuration in model selection via cross-validation can be an extremely
time-consuming task. In this paper we propose an improved cross-validation
procedure which uses nonparametric testing coupled with sequential analysis to
determine the best parameter set on linearly increasing subsets of the data. By
eliminating underperforming candidates quickly and keeping promising candidates
as long as possible, the method speeds up the computation while preserving the
capability of the full cross-validation. Theoretical considerations underline
the statistical power of our procedure. The experimental evaluation shows that
our method reduces the computation time by a factor of up to 120 compared to a
full cross-validation with a negligible impact on the accuracy.
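The elimination loop can be sketched as follows, under simplifying assumptions: candidates are evaluated on linearly growing data subsets, and a simple gap-to-best threshold stands in for the paper's nonparametric sequential test.

```python
# Sketch of cross-validation via sequential elimination on growing
# subsets. The gap-to-best rule (tol) is an illustrative stand-in for
# the nonparametric sequential test used in the paper.
import numpy as np

def sequential_cv(X, y, candidates, evaluate, steps=4, tol=0.05):
    """candidates: parameter configs; evaluate(config, X, y) -> CV error."""
    n = len(y)
    alive = list(candidates)
    for step in range(1, steps + 1):
        m = n * step // steps                      # linearly increasing subset
        scores = {c: evaluate(c, X[:m], y[:m]) for c in alive}
        best = min(scores.values())
        # Drop underperformers quickly; keep promising candidates.
        alive = [c for c in alive if scores[c] <= best + tol]
        if len(alive) == 1:
            break
    return alive[0]

# Toy usage: the "error" of a candidate slope is its distance to the truth.
X = np.arange(100.0)
y = 2.0 * X
best = sequential_cv(X, y, [0.5, 1.0, 2.0, 3.0],
                     lambda c, X, y: abs(c - 2.0))
# best == 2.0
```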
Example-based learning for single-image super-resolution and JPEG artifact removal
This paper proposes a framework for single-image super-resolution and JPEG artifact removal. The underlying idea is to learn a map from input low-quality images (suitably preprocessed low-resolution or JPEG encoded images) to target high-quality images based on example pairs of input and output images. To retain the complexity of the resulting learning problem at a moderate level, a patch-based approach is taken such that kernel ridge regression (KRR) scans the input image with a small window (patch) and produces a patch-valued output for each output pixel location. These constitute a set of candidate images, each of which reflects different local information. An image output is then obtained as a convex combination of candidates for each pixel based on estimated confidences of candidates. To reduce the time complexity of training and testing for KRR, a sparse solution is found by combining the ideas of kernel matching pursuit and gradient descent. As a regularized solution, KRR leads to better generalization than simply storing the examples, as is done in existing example-based super-resolution algorithms, and results in much less noisy images. However, this may introduce blurring and ringing artifacts around major edges, as sharp changes are penalized severely. A prior model of a generic image class which takes into account the discontinuity property of images is adopted to resolve this problem. Comparison with existing super-resolution and JPEG artifact removal methods shows the effectiveness of the proposed method. Furthermore, the proposed method is generic in that it has the potential to be applied to many other image enhancement applications.
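The learner applied patch-wise above is standard kernel ridge regression, which admits a closed-form solution alpha = (K + lambda*I)^(-1) y. A minimal NumPy sketch, with random 2-D vectors standing in for image patches (data, bandwidth, and regularization strength are illustrative assumptions):

```python
# Minimal kernel ridge regression with an RBF kernel -- the base learner
# the paper applies per patch. Inputs here are synthetic stand-ins.
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise squared distances, then Gaussian kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3, gamma=0.5):
    # Closed form: alpha = (K + lam*I)^(-1) y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma=0.5):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

X = np.random.default_rng(1).uniform(-1, 1, (50, 2))  # stand-in "patches"
y = np.sin(X[:, 0]) + X[:, 1] ** 2                    # stand-in targets
alpha = krr_fit(X, y)
pred = krr_predict(X, alpha, X)
```

The regularizer lam is what gives KRR the smoothing (and hence generalization) behavior the abstract contrasts with raw example storage; the paper's sparse kernel-matching-pursuit variant replaces the dense solve shown here.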
Sustainable Structural Design for High-Performance Buildings and Infrastructures
Exceptional design loads on buildings and structures may have different causes, including high-strain natural hazards, man-made attacks and accidents, and extreme operational conditions. All of these aspects can be critical for specific structural typologies and/or materials that are particularly sensitive. Dedicated and refined methods are thus required for design, analysis, and maintenance over structures' expected lifetimes. Major challenges are related to the structural typology and material properties. Further issues arise from the need to mitigate or retrofit existing structures, or from the optimal and safe design of innovative materials/systems. Finally, in some cases, no design recommendations are available, and thus experimental investigations can play a key role in the overall process. For this SI, we have invited scientists to focus on recent advancements and trends in the sustainable design of high-performance buildings and structures. Special attention has been given to materials and systems, but also to buildings and infrastructures that can be subjected to extreme design loads, as in the case of exceptional natural events or unfavorable ambient conditions. The assessment of hazard and risk associated with structures and civil infrastructure systems is important for the preservation and protection of built environments. New procedures, methods, and more precise rules for the safety design and protection of sustainable structures are, however, needed.
Feedback control of gas metal arc braze-welding using thermal signals
In serial manufacturing processes, localized energy sources (e.g. plasma cutters, arc welders, or water jets) induce material geometry transformations that yield a desired product. Simple parameter control of these energy sources does not necessarily ensure an optimal or successful part because of disturbances in the manufacturing process (material and temperature variations, etc.). Currently, control in manufacturing is based on statistical process control, where large databases for the manufacturing of a fixed process are available and have been compiled over several manufacturing runs. In the absence of a statistical database, and with the increased need for improved monitoring and throughput, there is a need for active process control in manufacturing. In this work, Gas Metal Arc Braze-Welding (GMABW) will serve as a test-bed for the implementation of model predictive control (MPC) for a serial manufacturing process.
This dissertation investigates the integration of real time modeling of the temperature field with control algorithms to control the evolving temperature field in the
braze-welded base metal. Fundamental problems involving MPC that are addressed are modeling techniques to calculate temperature fields with reduced computational requirements and control algorithms that utilize the thermal models directly to inform the controller.
The dissertation first outlines analytical and computational thermal models and compares them with experimental data. A thermal model based on a metamodeling approach is used as the plant model for a classical control system, and control parameters are found. Various techniques for dealing with signal noise encountered during experimentation are investigated. A proportional controller is implemented in the experimental setup that applies feedback control of the braze-welding process using thermal signals. A novel approach to MPC is explored by using a metamodel as the plant model for the braze-welding process and having the temperature trajectory dictated by the metamodel in the steady-state region of the weld. Lastly, future work and extensions of this research are outlined.
Mechanical Engineering
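A toy sketch of the proportional-feedback idea described above: a first-order thermal plant driven by a P controller on the measured temperature. All parameters (gains, time constants, reference temperature) are illustrative assumptions, not values from the dissertation.

```python
# Toy stand-in for feedback control on a thermal signal: a first-order
# plant dT/dt = -a*(T - T_amb) + b*u with proportional control
# u = kp*(T_ref - T), integrated by explicit Euler.
def simulate_p_control(T_ref=800.0, T0=25.0, kp=0.8, a=0.1, b=1.0,
                       dt=0.1, steps=200):
    T_amb, T = 25.0, T0
    for _ in range(steps):
        u = kp * (T_ref - T)                   # feedback on the thermal signal
        T += dt * (-a * (T - T_amb) + b * u)   # plant update
    return T

T_final = simulate_p_control()
```

The run converges but settles below the 800-degree reference, illustrating the steady-state offset inherent to pure proportional control; model-based approaches such as MPC address this by predicting the plant's trajectory rather than reacting only to the instantaneous error.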
Abstracts of Papers, 89th Annual Meeting of the Virginia Academy of Science, May 25-27, 2011, University of Richmond, Richmond VA
Full abstracts of the 89th Annual Meeting of the Virginia Academy of Science, May 25-27, 2011, University of Richmond, Richmond VA.
Front Lines of Thoracic Surgery
Front Lines of Thoracic Surgery collects up-to-date contributions on some of the most debated topics in today's clinical practice of cardiac, aortic, and general thoracic surgery, and anesthesia, as viewed by authors personally involved in their evolution. The strong and genuine enthusiasm of the authors was clearly perceptible in all their contributions, and I am sure it will further stimulate the reader to understand their messages. Moreover, the strict adherence of the authors' original observations and findings to the evidence base proves that facts are the best guarantee of scientific value. This is not a standard textbook where the whole discipline is organically presented; rather, the authors' contributions are simply listed under the pertinent subclasses of thoracic surgery. I am sure that this original and very promising editorial format, which has free availability at its core, further increases this book's value, and that it will be of interest to healthcare professionals and scientists dedicated to this field.
PRIMARY AND SECONDARY PREVENTION OF HEPATITIS C VIRUS AMONG RURAL APPALACHIAN PEOPLE WHO USE DRUGS
Hepatitis C virus (HCV) remains a major cause of morbidity and mortality worldwide, with 3% of the global population chronically infected. Clinical impacts in the United States are projected to increase for two decades, and mortality attributed to HCV now exceeds HIV. Injection drug use (IDU) is the most common route of transmission in the developed world. Advances in treatment offer hope of mitigating HCV impacts, but substantial barriers obstruct people who inject drugs (PWID) from receiving care, particularly in medically underserved regions including Central Appalachia. This study assessed IDU paraphernalia sharing longitudinally over 24 months in a sample of 283 rural PWID recruited by respondent-driven sampling. Medical follow-up among 254 seropositive participants was also assessed using discrete-time survival analysis.
HCV-positive screening was associated with reduced IDU sharing frequency 18 months after testing compared to seronegative participants (adjusted OR [aOR]=1.4, 95% confidence interval [CI]: 1.0–1.9), but this effect was not sustained. HCV-positive participants were less likely to cease IDU 6 months after testing (aOR=0.4, 95% CI: 0.2–0.7). Predictors negatively associated with decreased IDU sharing included recent unprotected sex, sedative use, and frequency of prescription opioid IDU; protective associations included female gender and religious affiliation. IDU cessation was negatively associated with ever being incarcerated, recent unprotected sex with PWID, heavy alcohol use, lifetime use of OxyContin®, and baseline frequency of prescription opioid IDU; protective associations included number of dependents, receiving disability payments, and substance abuse treatment. Drug-specific associations decreasing IDU cessation included recent illicit use of OxyContin®, other oxycodone, and cocaine.
150 of 254 (59%) seropositive participants saw a clinician after HCV-positive screening and counseling, 35 (14%) sought treatment, and 21 (8%) received treatment. Positive predictors of following up with a clinician after testing and counseling included health insurance, internet access, past substance abuse treatment, generalized anxiety disorder, and recent marijuana use. Factors decreasing the odds of follow-up included major depression, lifetime illicit methadone use, and recent legal methadone use. These analyses shed valuable light on determinants of behavior impacting primary and secondary HCV prevention. Integrated, multidisciplinary approaches are recommended to meaningfully impact epidemic levels of HCV among rural PWID in Eastern Kentucky.
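The discrete-time survival approach used for the follow-up analysis estimates, for each observation period, the hazard of the event among those still at risk. A minimal sketch on synthetic data (all numbers are invented, not the study's data):

```python
# Sketch of discrete-time hazard estimation: for each 6-month period,
# hazard = events in that period / participants still at risk.
# Event times and indicators below are synthetic illustrations.
periods = [1, 2, 2, 3, 4, 4, 1, 3, 4, 2]   # last period each person was observed
events  = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # 1 = followed up that period, 0 = censored

hazard = []
for t in range(1, 5):
    at_risk = sum(p >= t for p in periods)                       # risk set at period t
    n_events = sum(1 for p, e in zip(periods, events) if p == t and e == 1)
    hazard.append(n_events / at_risk)
```

In practice the study regresses these period-specific hazards on covariates (insurance, internet access, depression, and so on), typically via logistic regression on a person-period data set.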
Perceptual video quality and quality of experience for adaptive video streaming
We live in a world where images and videos dominate our everyday lives. Every day, an enormous amount of video data is shared in social media and consumer applications, while video streaming is becoming a new form of digital entertainment. Large-scale video streaming on demand has become possible thanks to numerous engineering achievements in fields such as video compression, high-speed computation, and display technologies. Nevertheless, the skyrocketing needs for bandwidth and network resources consumed by video applications challenge modern video content delivery. Since the available bandwidth resources are limited, streaming service providers have to mediate between operation costs, bandwidth efficiency, and maximizing user quality of experience. However, these goals are inherently conflicting and require knowledge of how user quality of experience is affected by network-induced changes in video quality. Being able to understand and predict user quality of experience, and to perceptually optimize rate allocation, can lead to better network utilization, reduced costs for service providers, and improved user satisfaction. The goal of this dissertation is to study and predict user quality of experience in video streaming applications by exploiting perceptual video quality and human behavioral responses to streaming-related video impairments. To this end, I present the details of three large-scale subjective video studies which target video streaming under multiple viewing conditions, such as display device, session duration, content characteristics, and network/buffer conditions. By analyzing how humans react to changes in visual quality and streaming video impairments, I also design numerous video quality and quality of experience prediction models that can be used to evaluate both the overall and the continuous-time perceived video quality.
Throughout this dissertation, my goal is to perceptually optimize various stages of the video streaming pipeline, such as video encoding and video quality control, as well as client-based rate adaptation. Ultimately, I envision that the outcome of this dissertation can be useful for video streaming applications at global scale.
Electrical and Computer Engineering
Intelligence in 5G networks
Over the past decade, Artificial Intelligence (AI) has become an important part of our daily lives; however, its application to communication networks has been partial and unsystematic, with uncoordinated efforts that often conflict with each other. Providing a framework to integrate the existing studies and to actually build an intelligent network is a top research priority. In fact, one of the objectives of 5G is to manage all communications under a single overarching paradigm, and the staggering complexity of this task is beyond the scope of human-designed algorithms and control systems.
This thesis presents an overview of all the necessary components to integrate intelligence in this complex environment, with a user-centric perspective: network optimization should always have the end goal of improving the experience of the user. Each step is described with the aid of one or more case studies, involving various network functions and elements.
Starting from perception and prediction of the surrounding environment, the first core requirements of an intelligent system, this work gradually builds its way up to showing examples of fully autonomous network agents which learn from experience without any human intervention or pre-defined behavior, discussing the possible application of each aspect of intelligence in future networks.