
    Evaluations of User-Driven Ontology Summarization

    Ontology summarization has been found useful for facilitating ontology engineering tasks in a number of different ways. Recently, it has been recognised as a means to facilitate ontology understanding and thereby support tasks such as ontology reuse during ontology construction. Among the works in the literature, not only are distinct methods used to summarize ontologies, but different measures are also deployed to evaluate the summarization results. Without a set of common evaluation measures in place, it is not possible to compare performance and therefore to judge the effectiveness of those summarization methods. In this paper, we investigate the applicability of evaluation measures from the ontology evaluation and summary evaluation domains to ontology summary evaluation. Based on those measures, we evaluate the performance of existing user-driven ontology summarization approaches.
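    A minimal sketch of one family of measures from the summary evaluation domain: agreement between an automatically selected set of key concepts and a user-provided reference summary, scored with precision, recall and F1. The concept names are illustrative, and the paper surveys a broader set of measures than this single example.
```python
# Illustrative only: set-overlap scoring of an ontology summary against a
# user-provided reference summary. Concept names are hypothetical.
def precision_recall_f1(system_summary, reference_summary):
    system, reference = set(system_summary), set(reference_summary)
    tp = len(system & reference)                       # concepts agreed upon
    precision = tp / len(system) if system else 0.0
    recall = tp / len(reference) if reference else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

if __name__ == "__main__":
    system = {"Person", "Organization", "Event", "Location"}
    reference = {"Person", "Organization", "Publication", "Project", "Event"}
    p, r, f = precision_recall_f1(system, reference)
    print(f"precision={p:.2f} recall={r:.2f} F1={f:.2f}")
```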

    Statistically Stable Estimates of Variance in Radioastronomical Observations as Tools for RFI Mitigation

    A selection of statistically stable (robust) algorithms for calculating data variance has been made. Their properties have been analyzed via computer simulation. These algorithms would be useful if adopted in radio astronomy observations in the presence of strong sporadic radio frequency interference (RFI). Several observational results are presented here to demonstrate the effectiveness of these algorithms in RFI mitigation.
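    A minimal sketch of the idea under stated assumptions: the specific robust estimators below (MAD- and IQR-based) are common choices used here for illustration, not necessarily the ones studied in the paper. On Gaussian noise contaminated by sporadic strong outliers, the ordinary sample variance is inflated while the robust estimates stay close to the true value.
```python
import numpy as np

def variance_classical(x):
    """Ordinary sample variance; sensitive to strong outliers."""
    return np.var(x, ddof=1)

def variance_mad(x):
    """Variance from the median absolute deviation: sigma ~= 1.4826 * MAD for Gaussian data."""
    mad = np.median(np.abs(x - np.median(x)))
    return (1.4826 * mad) ** 2

def variance_iqr(x):
    """Variance from the interquartile range: sigma ~= IQR / 1.349 for Gaussian data."""
    q75, q25 = np.percentile(x, [75, 25])
    return ((q75 - q25) / 1.349) ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.normal(0.0, 1.0, size=10_000)           # system noise, true variance = 1
    rfi = np.zeros_like(clean)
    hits = rng.random(clean.size) < 0.01                # 1% sporadic RFI bursts (assumed)
    rfi[hits] = rng.normal(0.0, 30.0, size=hits.sum())  # strong interference
    data = clean + rfi

    print("classical:", variance_classical(data))  # strongly inflated by RFI
    print("MAD-based:", variance_mad(data))        # close to 1
    print("IQR-based:", variance_iqr(data))        # close to 1
```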

    A Study of Archiving Strategies in Multi-Objective PSO for Molecular Docking

    Molecular docking is a complex optimization problem aimed at predicting the position of a ligand molecule in the active site of a receptor with the lowest binding energy. This problem can be formulated as a bi-objective optimization problem by minimizing the binding energy and the Root Mean Square Deviation (RMSD) difference in the coordinates of ligands. In this context, the SMPSO multi-objective swarm-intelligence algorithm has shown remarkable performance. SMPSO is characterized by having an external archive used to store the non-dominated solutions, which also serves as the basis of the leader selection strategy. In this paper, we analyze several SMPSO variants based on different archiving strategies on a benchmark of molecular docking instances. Our study reveals that SMPSOhv, which uses a hypervolume-contribution-based archive, shows the best overall performance. (Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.)
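    A minimal sketch of the archiving idea behind an SMPSOhv-style variant for a bi-objective minimization problem: a bounded archive keeps only non-dominated points and, when full, discards the point with the smallest exclusive hypervolume contribution. This is an illustration under assumed capacity and reference point, not the jMetal implementation used in the paper.
```python
from typing import List, Tuple

Point = Tuple[float, float]  # (f1, f2), both objectives minimized

def dominates(a: Point, b: Point) -> bool:
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def hv_contributions(front: List[Point], ref: Point) -> List[float]:
    """Exclusive hypervolume contribution of each point in a 2-D non-dominated front."""
    order = sorted(range(len(front)), key=lambda i: front[i][0])  # sort by f1 ascending
    contrib = [0.0] * len(front)
    for k, i in enumerate(order):
        f1, f2 = front[i]
        right_f1 = front[order[k + 1]][0] if k + 1 < len(order) else ref[0]
        upper_f2 = front[order[k - 1]][1] if k > 0 else ref[1]
        contrib[i] = max(right_f1 - f1, 0.0) * max(upper_f2 - f2, 0.0)
    return contrib

class HypervolumeArchive:
    def __init__(self, capacity: int, ref: Point):
        self.capacity, self.ref, self.points = capacity, ref, []

    def add(self, p: Point) -> bool:
        if any(dominates(q, p) or q == p for q in self.points):
            return False                      # p is dominated or a duplicate
        self.points = [q for q in self.points if not dominates(p, q)]
        self.points.append(p)
        if len(self.points) > self.capacity:  # prune the worst hypervolume contributor
            c = hv_contributions(self.points, self.ref)
            self.points.pop(c.index(min(c)))
        return True

if __name__ == "__main__":
    archive = HypervolumeArchive(capacity=5, ref=(10.0, 10.0))
    for p in [(1, 9), (2, 7), (3, 5), (4, 4), (5, 3), (6, 2), (7, 1), (2.5, 6)]:
        archive.add((float(p[0]), float(p[1])))
    print(sorted(archive.points))  # at most 5 well-spread non-dominated points
```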

    The Critical Coupling Likelihood Method: A new approach for seamless integration of environmental and operating conditions of gravitational wave detectors into gravitational wave searches

    Any search effort for gravitational waves (GW) using interferometric detectors like LIGO needs to be able to identify if and when noise is coupling into the detector's output signal. The Critical Coupling Likelihood (CCL) method has been developed to characterize potential noise coupling and, in the future, aid GW search efforts. By testing two hypotheses about pairs of channels, CCL is able to distinguish undesirable coupled instrumental noise from potential GW candidates. Our preliminary results show that CCL can associate up to ~80% of observed artifacts with SNR ≥ 8 to local noise sources, while reducing the duty cycle of the instrument by ≲15%. An approach like CCL will become increasingly important as GW research moves into the Advanced LIGO era, going from the first GW detection to GW astronomy. Comment: submitted CQ
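    A minimal sketch of the two-hypothesis idea for a channel pair, not the CCL statistic defined in the paper: compare the likelihood of the observed number of coincident triggers under "no coupling" (coincidences occur only by chance) versus "coupling" (an elevated coincidence probability). All rates and counts below are illustrative assumptions.
```python
from math import comb, log

def binomial_log_likelihood(k, n, p):
    """log P(k successes in n trials) under a binomial model."""
    return log(comb(n, k)) + k * log(p) + (n - k) * log(1 - p)

def coupling_log_likelihood_ratio(n_aux, n_coinc, p_accidental, p_coupled):
    """log L(coupled) - log L(uncoupled) for n_coinc of n_aux auxiliary-channel
    triggers that also have a detector-output trigger within the coincidence window."""
    return (binomial_log_likelihood(n_coinc, n_aux, p_coupled)
            - binomial_log_likelihood(n_coinc, n_aux, p_accidental))

if __name__ == "__main__":
    # Hypothetical numbers: 500 auxiliary-channel triggers, 60 coincident with
    # output artifacts; accidental coincidence probability 0.02 (e.g. from
    # time-shifted data), coupled hypothesis assumes 0.10.
    llr = coupling_log_likelihood_ratio(n_aux=500, n_coinc=60,
                                        p_accidental=0.02, p_coupled=0.10)
    print("log likelihood ratio:", round(llr, 2))  # large positive -> evidence of coupling
```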

    Boundaries of Siegel Disks: Numerical Studies of their Dynamics and Regularity

    Siegel disks are domains around fixed points of holomorphic maps in which the maps are locally linearizable (i.e., become a rotation under an appropriate change of coordinates which is analytic in a neighborhood of the origin). The dynamical behavior of the iterates of the map on the boundary of the Siegel disk exhibits strong scaling properties which have been intensively studied in the physical and mathematical literature. In the cases we study, the boundary of the Siegel disk is a Jordan curve containing a critical point of the map (we consider critical maps of different orders), and there exists a natural parametrization which transforms the dynamics on the boundary into a rotation. We compute this parametrization numerically and use methods of harmonic analysis to compute the global Hölder regularity of the parametrization for different maps and rotation numbers. We find that the regularity of the boundaries and the scaling exponents are universal numbers in the sense of renormalization theory (i.e., they do not depend on the map when the map ranges in an open set), and depend only on the order of the critical point of the map on the boundary of the Siegel disk and on the tail of the continued fraction expansion of the rotation number. We also discuss some possible relations between the regularity of the parametrization of the boundaries and the corresponding scaling exponents. © 2008 American Institute of Physics.
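    A minimal sketch of one harmonic-analysis estimate of this kind: reading off a global Hölder exponent from the decay of dyadic (Littlewood-Paley) frequency blocks of a periodic function. The construction of the actual Siegel-disk boundary parametrization is beyond this sketch, so the method is exercised on a Weierstrass-type test function whose exponent (alpha = 0.5) is known; the paper's exact numerical procedure may differ.
```python
import numpy as np

def weierstrass(x, alpha=0.5, b=2, terms=15):
    """Periodic test function with global Holder exponent ~ alpha."""
    return sum(b ** (-n * alpha) * np.cos(b ** n * x) for n in range(terms))

def holder_exponent(samples, j_min=3, j_max=10):
    """Fit alpha from ||Delta_j f||_inf ~ 2**(-j*alpha), Delta_j = dyadic frequency block."""
    coeffs = np.fft.rfft(samples)
    js, norms = [], []
    for j in range(j_min, j_max + 1):
        lo, hi = 2 ** j, min(2 ** (j + 1), coeffs.size)
        block = np.zeros_like(coeffs)
        block[lo:hi] = coeffs[lo:hi]           # keep only frequencies in [2^j, 2^(j+1))
        norms.append(np.max(np.abs(np.fft.irfft(block, samples.size))))
        js.append(j)
    slope, _ = np.polyfit(js, np.log2(norms), 1)
    return -slope

if __name__ == "__main__":
    N = 2 ** 16
    x = 2 * np.pi * np.arange(N) / N
    print("estimated alpha:", holder_exponent(weierstrass(x)))  # ~0.5
```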

    Foodmaster and three stories

    Graduation date: 2004. The purpose of this thesis was to create a sustained piece of fiction that both represented my growth as a writer throughout my time at Oregon State University and wove together a mixture of imagination, language, and creativity. My hope was to write a novella that incorporated and drew from themes including work, community, and family relationships, and that was also an exploration of the very structure and form of literary fiction. After completing the novella, I found that similar themes continued to appear within my fiction during my ongoing growth as a writer. What I ended up with was a novella and a collection of related stories that reflected the influences of my advisor Tracy Daugherty and his tutelage, the courses that I took at this university and my undergraduate university, and my own personal history. This thesis was written over a two-year period, during which drafts of this novella and stories were written and rewritten. Each story and chapter was submitted to a writing workshop, read and edited by my major and minor advisor, and carefully reworked and redrafted after much scrutiny and attention. During the course of writing this thesis, many things influenced me, the most prominent being the world of fiction that existed all around me. I was influenced by fiction that I was reading in my course work, such as Donald Barthelme and Philip Roth, but writers that I had grown up with, like Edgar Allan Poe and Ray Bradbury, also influenced me. Beyond the world of published fiction, I found not only influence but, more importantly, inspiration in the work and criticism of the writers and students within the Creative Writing Program here at Oregon State University. The end result of these two years of work, study, writing, and criticism was a piece of fiction that I am proud of and plan to publish. This collection of fiction represents not only a sustained study of the craft of creative writing but also serves as an exploration of my own voice and style, and an awakening of my identity as a fiction writer.

    Quantification of depth of anesthesia by nonlinear time series analysis of brain electrical activity

    We investigate several quantifiers of the electroencephalogram (EEG) signal with respect to their ability to indicate depth of anesthesia. For 17 patients anesthetized with Sevoflurane, three established measures (two spectral and one based on the bispectrum), as well as a phase-space-based nonlinear correlation index, were computed from consecutive EEG epochs. In the absence of an independent way to determine anesthesia depth, the standard was derived from measured blood plasma concentrations of the anesthetic via a pharmacokinetic/pharmacodynamic model for the estimated effective brain concentration of Sevoflurane. In most patients, the highest correlation is observed for the nonlinear correlation index D*. In contrast to the spectral measures, D* is found to decrease monotonically with increasing (estimated) depth of anesthesia, even when a "burst-suppression" pattern occurs in the EEG. The findings show the potential for applications of concepts derived from the theory of nonlinear dynamics, even if little can be assumed about the process under investigation. Comment: 7 pages, 5 figures
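    A minimal sketch of the kind of phase-space quantifier involved: a delay-embedding correlation sum over an EEG epoch. The paper's index D* is specific to that work; what follows is a generic Grassberger-Procaccia-style correlation sum, and the embedding dimension, delay, radius, sampling rate and test signal are all illustrative assumptions.
```python
import numpy as np

def delay_embed(x, dim=5, tau=4):
    """Build delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def correlation_sum(x, radius, dim=5, tau=4, theiler=10):
    """Fraction of embedded point pairs closer than `radius`,
    excluding temporally close pairs (Theiler window)."""
    emb = delay_embed(x, dim, tau)
    n = len(emb)
    count, total = 0, 0
    for i in range(n):
        j0 = i + theiler
        if j0 >= n:
            break
        d = np.max(np.abs(emb[j0:] - emb[i]), axis=1)  # Chebyshev distance
        count += int(np.sum(d < radius))
        total += n - j0
    return count / total if total else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(4000) / 250.0                          # 16 s at 250 Hz (assumed)
    epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # toy "EEG"
    for r in (0.2, 0.5, 1.0):
        print(f"C(r={r}): {correlation_sum(epoch, r):.4f}")
```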

    A Comparison of Qualifications-Based Selection and Best-Value Procurement for Construction Manager/General Contractor Highway Construction

    Faster project delivery and the infusion of contractor knowledge into design are the primary drivers for choosing construction manager/general contractor (CM/GC) project delivery. This paper focuses on the use of qualifications-based selection (QBS) and best-value (BV) procurement approaches, how and why agencies use each, and their associated opportunities and obstacles. Data for this study were obtained from a majority of the federally funded CM/GC projects completed between 2005 and 2015. The findings are that BV and QBS project characteristics and performance show no statistically significant difference. The choice of BV or QBS coincides with the agency's stage of organizational development in CM/GC and the influence of non-agency stakeholders on the CM/GC process. When agencies and the local industry are new to CM/GC, they were found to use BV, as it is closer to the traditional procurement culture and is perceived to result in a fair market project price. Alternatively, agencies and local industry partners with an established history of using CM/GC were found to choose QBS. The low level of design at the time of procurement means that assumptions relating to risk, production rates, materials sources, etc. may be too preliminary to secure a reliable price. The use of BV procurement was found to pose a risk to innovation and to increase negotiation effort. Qualitative trends from the project data, interviews, and literature point to agencies using QBS for the majority of CM/GC projects and BV on CM/GC projects with lesser complexity or more highly developed designs at the time of selection.

    Acute effect of meal glycemic index and glycemic load on blood glucose and insulin responses in humans

    OBJECTIVE: Foods with contrasting glycemic index (GI), when incorporated into a meal, are able to differentially modify glycemia and insulinemia. However, little is known about whether this depends on the size of the meal. The purposes of this study were: i) to determine whether the differential impact on blood glucose and insulin responses induced by contrasting GI foods is similar when provided in meals of different sizes, and ii) to determine the relationship between the total meal glycemic load and the observed serum glucose and insulin responses. METHODS: Twelve obese women (BMI 33.7 ± 2.4 kg/m²) were recruited. Subjects received 4 different meals in random order. Two meals had a low glycemic index (40–43%) and two had a high glycemic index (86–91%). Both meal types were given in two meal sizes, with energy supply corresponding to 23% and 49% of predicted basal metabolic rate. Thus, meals with three different glycemic loads (95, 45–48 and 22 g) were administered. Blood samples were taken before and after each meal to determine glucose, free fatty acid, insulin and glucagon concentrations over a 5-h period. RESULTS: An almost 2-fold higher serum glucose and insulin incremental area under the curve (AUC) over 2 h was observed for the high- versus low-glycemic index meals of the same size (p < 0.05); however, for the serum glucose response in small meals this was not significant (p = 0.38). Calculated meal glycemic load was associated with the 2 and 5 h serum glucose (r = 0.58, p < 0.01) and insulin (r = 0.54, p < 0.01) incremental and total AUC. In fact, when comparing the two meals with similar glycemic load but differing carbohydrate amount and type, very similar serum glucose and insulin responses were found. No differences were observed in serum free fatty acid and glucagon profiles in response to meal glycemic index. CONCLUSION: This study showed that foods of contrasting glycemic index induced a proportionally comparable difference in serum insulin response when provided in both small and large meals. The same was true for the serum glucose response, but only in large meals. Glycemic load was useful in predicting the acute impact on blood glucose and insulin responses within the context of mixed meals.
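    A minimal sketch of the two quantities the study relates: meal glycemic load (GL = sum of GI_i × available carbohydrate_i / 100 over the foods in the meal) and the incremental area under the curve (iAUC) of postprandial glucose, computed by the trapezoidal rule above the fasting baseline. The food values and glucose profile below are illustrative, not data from the study.
```python
import numpy as np

def meal_glycemic_load(items):
    """items: list of (glycemic_index_percent, available_carbohydrate_grams)."""
    return sum(gi * carbs / 100.0 for gi, carbs in items)

def incremental_auc(times_min, glucose, baseline=None):
    """Positive incremental AUC above baseline (trapezoidal rule).
    Segments below baseline contribute zero, as in common iAUC definitions."""
    glucose = np.asarray(glucose, float)
    baseline = glucose[0] if baseline is None else baseline
    above = np.clip(glucose - baseline, 0.0, None)
    return np.trapz(above, np.asarray(times_min, float))

if __name__ == "__main__":
    # Hypothetical high-GI meal: white bread (GI ~ 90, 40 g carbs) + juice (GI ~ 70, 25 g carbs)
    print("GL:", meal_glycemic_load([(90, 40), (70, 25)]))  # grams of glucose equivalent

    t = [0, 15, 30, 45, 60, 90, 120]            # minutes after the meal
    g = [5.0, 6.2, 7.8, 7.1, 6.4, 5.6, 5.1]     # serum glucose, mmol/L (illustrative)
    print("2-h iAUC:", incremental_auc(t, g), "mmol/L*min")
```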

    Epidemiological Surveillance of Birth Defects Compatible with Thalidomide Embryopathy in Brazil

    The thalidomide tragedy of the 1960s resulted in thousands of children being born with severe limb reduction defects (LRD), among other malformations. In Brazil, babies are still born with thalidomide embryopathy (TE) because of the prevalence of leprosy, the availability of thalidomide, and deficiencies in the control of drug dispensation. Our objective was to implement a system of proactive surveillance to identify birth defects compatible with TE. Over one year, newborns with LRD were assessed in the Brazilian hospitals participating in the Latin-American Collaborative Study of Congenital Malformations (ECLAMC). A phenotype of LRD called the thalidomide embryopathy phenotype (TEP) was established for surveillance. Children with TEP born between 2000 and 2008 were monitored, and during the 2007–2008 period we clinically investigated all cases with TEP in greater detail (proactive period). The period from 1982 to 1999 was defined as the baseline period for the cumulative sum statistics. The frequency of TEP during the surveillance period, at 3.10/10,000 births (95% CI: 2.50–3.70), was significantly higher than that observed in the baseline period (1.92/10,000 births; 95% CI: 1.60–2.20), and was not uniformly distributed across the different Brazilian regions. During the proactive surveillance (2007–2008), two cases of suspected TE were identified, although the two mothers had denied use of the drug during pregnancy. Our results suggest that TEP has probably increased in recent years, which coincides with the period of greater thalidomide availability. Our proactive surveillance identified two newborns with suspected TE, proving to be a sensitive tool to detect TE. The high frequency of leprosy and the widespread use of thalidomide reinforce the need for continuous monitoring of TEP across Brazil.
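    A minimal sketch of cumulative sum monitoring of this kind: a Poisson CUSUM of TEP counts against a baseline rate. The baseline and alternative rates echo the abstract (1.92/10,000 and 3.10/10,000 births), while the monthly birth counts, case counts, and decision threshold h are illustrative assumptions; the paper's exact CUSUM design may differ.
```python
import math

def poisson_cusum(counts, births, rate0=1.92e-4, rate1=3.10e-4, h=4.0):
    """Return the CUSUM path and the first period (1-based) that signals, if any."""
    s, path, signal_at = 0.0, [], None
    for i, (x, n) in enumerate(zip(counts, births), start=1):
        lam0, lam1 = rate0 * n, rate1 * n                       # expected counts this period
        k = (lam1 - lam0) / (math.log(lam1) - math.log(lam0))   # Poisson CUSUM reference value
        s = max(0.0, s + (x - k))
        path.append(s)
        if signal_at is None and s > h:
            signal_at = i
    return path, signal_at

if __name__ == "__main__":
    births = [25_000] * 12                              # monitored births per month (assumed)
    counts = [4, 5, 3, 6, 7, 5, 8, 9, 6, 10, 8, 9]      # observed TEP cases (illustrative)
    path, first = poisson_cusum(counts, births)
    print("CUSUM path:", [round(v, 2) for v in path])
    print("signal at period:", first)
```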