
    Semantic Transformation of Web Services

    Web services have become the predominant paradigm for the development of distributed software systems. Web services provide the means to modularize software in a way that functionality can be described, discovered and deployed in a platform-independent manner over a network (e.g., intranets, extranets and the Internet). The representation of web services in current industrial practice is predominantly syntactic in nature, lacking the fundamental semantic underpinnings required to fulfill the goals of the emerging Semantic Web. This paper proposes a framework aimed at (1) modeling the semantics of syntactically defined web services through a process of interpretation, (2) scoping the derived concepts within domain ontologies, and (3) harmonizing the semantic web services with the domain ontologies. The framework was validated through its application to web services developed for a large financial system. The worked example presented in this paper is extracted from the semantic modeling of these financial web services.

    The origin of the excess transit absorption in the HD 189733 system: planet or star?

    We have detected excess absorption in the emission cores of Ca II H&K during transits of HD 189733b for the first time. Using observations of three transits, we investigate the origin of the absorption, which is also seen in Hα and the Na I D lines. Applying differential spectrophotometry methods to the Ca II H and Ca II K lines combined, using respective passband widths of Δλ = 0.4 and 0.6 Å, yields excess absorption of td = 0.0074 ± 0.0044 (1.7σ; Transit 1) and 0.0214 ± 0.0022 (9.8σ; Transit 2). Similarly, we detect excess Hα absorption in a passband of width Δλ = 0.7 Å, with td = 0.0084 ± 0.0016 (5.2σ) and 0.0121 ± 0.0012 (9.9σ). For both lines, Transit 2 is thus significantly deeper. Combining all three transits for the Na I D lines yields excess absorption of td = 0.0041 ± 0.0006 (6.5σ). By considering the time series observations of each line, we find that the excess apparent absorption is best recovered in the stellar reference frame. These findings lead us to postulate that the main contribution to the excess transit absorption in the differential light curves arises because the normalizing continuum bands form in the photosphere, whereas the line cores contain a chromospheric component. We cannot rule out that part of the excess absorption signature arises from the planetary atmosphere, but we present evidence which casts doubt on recent claims to have detected wind motions in the planet's atmosphere in these data.
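
    As an illustration of the differential spectrophotometry step described above, a minimal sketch follows, assuming a wavelength-calibrated spectral time series shifted to the stellar rest frame; the function and variable names, the continuum-band placement, and the band widths other than those quoted in the abstract are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def band_flux(wave, flux, center, width):
    """Sum the flux falling in a passband of the given width (Angstroms)."""
    mask = np.abs(wave - center) < width / 2.0
    return np.sum(flux[mask])

def excess_absorption(wave, spectra, in_transit, core_center, core_width,
                      cont_centers, cont_width=1.0):
    """Excess transit depth t_d of a line core relative to continuum bands.

    spectra    : (n_exposures, n_pixels) spectral time series
    in_transit : boolean mask flagging in-transit exposures
    """
    core = np.array([band_flux(wave, s, core_center, core_width) for s in spectra])
    cont = np.array([sum(band_flux(wave, s, c, cont_width) for c in cont_centers)
                     for s in spectra])
    ratio = core / cont                       # differential light curve
    ratio /= np.median(ratio[~in_transit])    # normalize to out-of-transit level
    return 1.0 - np.mean(ratio[in_transit])   # excess absorption depth t_d

# e.g. Halpha with the 0.7 A core band quoted above and two flanking
# continuum bands (hypothetical placements):
# td = excess_absorption(wave, spectra, in_transit, 6562.8, 0.7,
#                        cont_centers=[6558.0, 6567.0])
```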

    An Evaluation of The Effectiveness of Adaptive Histogram Equalization for Contrast Enhancement

    Adaptive Histogram Equalization (AHE), a method of contrast enhancement which is sensitive to local spatial information in an image, has been proposed as a solution to the inability of ordinary display devices to depict the full dynamic intensity range of some medical images. This method is automatic, reproducible, and simultaneously displays most of the information contained in the grey-scale contrast of the image. However, it has not been known whether the use of AHE causes a loss of diagnostic information relative to the commonly used method of intensity windowing. In the current work, AHE and intensity windowing are compared using psychophysical observer studies. In studies performed at North Carolina Memorial Hospital, experienced radiologists were shown clinical CT images of the chest. Into some of the images, appropriate artificial lesions were introduced; the physicians were then shown the images processed with both AHE and intensity windowing. They were asked to assess the probability that a given image contained the artificial lesion, and their accuracy was measured. The results of these experiments showed that for this particular diagnostic task, there was no significant difference in the ability of the two methods to depict luminance contrast; thus, further evaluation of AHE using controlled clinical trials is indicated.
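
    For readers unfamiliar with the two methods being compared, a minimal sketch follows using scikit-image's contrast-limited AHE implementation alongside simple intensity windowing; the window bounds, tile size, and clip limit are illustrative assumptions, not the settings used in the study.

```python
import numpy as np
from skimage import exposure

def intensity_window(img, lo, hi):
    """Linearly map the window [lo, hi] to [0, 1], clipping values outside it."""
    return np.clip((img.astype(float) - lo) / (hi - lo), 0.0, 1.0)

# img: 2-D array of CT values (e.g. Hounsfield units); placeholder data here
img = np.random.normal(40.0, 200.0, (512, 512))

# Simple intensity windowing with a hypothetical soft-tissue window
windowed = intensity_window(img, lo=-160.0, hi=240.0)

# Adaptive histogram equalization (contrast-limited variant): equalizes within
# local regions of size kernel_size rather than over the whole image
rescaled = exposure.rescale_intensity(img, out_range=(0.0, 1.0))
ahe = exposure.equalize_adapthist(rescaled, kernel_size=64, clip_limit=0.01)
```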

    Application of the Semi-Empirical Force-Limiting Approach for the CoNNeCT SCAN Testbed

    The semi-empirical force-limiting vibration method was developed and implemented for payload testing to limit the structural impedance mismatch (high force) that occurs during shaker vibration testing. The method has since been extended for use in analytical models. The Space Communications and Navigation Testbed (SCAN Testbed) project, known at NASA as the Communications, Navigation, and Networking re-Configurable Testbed (CoNNeCT), utilized force-limiting testing and analysis following the semi-empirical approach. This paper presents the steps in performing a force-limiting analysis and then compares the results to test data recovered during the CoNNeCT force-limiting random vibration qualification test, which took place at the NASA Glenn Research Center (GRC) Structural Dynamics Laboratory (SDL) from December 19, 2010 to January 7, 2011. A compilation of lessons learned and considerations for future force-limiting tests is also included.
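
    To make the approach concrete, here is a minimal sketch of the semi-empirical force limit itself, following the standard formulation (force spectral density S_FF = C^2 * m0^2 * S_AA up to the payload's fundamental resonance f0, rolling off above it); the constant C^2, rolloff exponent, and example spectrum are illustrative assumptions, not the CoNNeCT values.

```python
import numpy as np

def semi_empirical_force_limit(freq, S_aa, m0, f0, C2=2.0, n=2.0):
    """Semi-empirical force spectral density limit S_FF(f).

    freq : frequencies [Hz]
    S_aa : acceleration specification [g^2/Hz] at those frequencies
    m0   : total payload mass (force units follow the mass units)
    f0   : fundamental resonance frequency of the payload [Hz]
    C2   : semi-empirical constant, typically derived from flight/test data
    n    : rolloff exponent above f0
    """
    freq = np.asarray(freq, dtype=float)
    S_ff = C2 * m0**2 * np.asarray(S_aa, dtype=float)
    above = freq > f0
    S_ff[above] *= (f0 / freq[above])**n   # roll the limit off above resonance
    return S_ff

# Hypothetical flat 0.04 g^2/Hz random vibration spec, 100 lb payload, 80 Hz resonance
freq = np.array([20.0, 50.0, 100.0, 200.0, 500.0, 1000.0, 2000.0])
S_ff = semi_empirical_force_limit(freq, np.full_like(freq, 0.04), m0=100.0, f0=80.0)
```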

    An Optical/NIR Exploration of Forming Cluster Environments at High Redshift with VLT, Keck, and HST

    The past decade has witnessed immense progress in the understanding of the early stages of cluster formation, from both a theoretical and an observational perspective. During this time, samples of forming clusters at higher redshift, termed "protoclusters", once comprised of a heterogeneous mix of serendipitous detections or detections arising from dedicated searches around rare galaxy populations, have begun to compete with lower-redshift samples both in numbers and in the homogeneity of the detection methods. Much of this progress has come from optical/near-infrared (NIR) imaging and spectroscopic campaigns designed to target large numbers of typical galaxies to exquisite depth. In this poster talk I will focus on observations from VIMOS on the VLT, MOSFIRE/DEIMOS on Keck, and a 50-orbit Cycle 29 HST WFC3/G141 grism campaign taken as part of the Charting Cluster Construction with VUDS and ORELSE (C3VO) survey. These observations, combined with novel mapping and search techniques, have uncovered a large number of "protostructures" at 2 < z < 5 that appear to resemble clusters and groups forming in the early universe. I will discuss the development of the methods for finding, confirming, and characterizing proto-clusters and proto-groups in our sample, as well as groups and clusters at intermediate redshifts. Several case studies of spectroscopically confirmed massive proto-clusters with a diverse set of properties will be presented. I will finally discuss constraints on the relationship between star formation rate, stellar mass, and galaxy density at these redshifts.
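
    As a toy illustration of the kind of density mapping mentioned above (the survey's actual mapping and search techniques are more sophisticated and are not specified in this abstract), the following sketch computes a simple galaxy overdensity map from positions in one redshift slice using a Gaussian kernel density estimate; all names and grids are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

def overdensity_map(ra, dec, grid_ra, grid_dec):
    """Galaxy overdensity delta = Sigma/<Sigma> - 1 over a coordinate grid.

    ra, dec : galaxy coordinates (degrees) within one redshift slice
    """
    kde = gaussian_kde(np.vstack([ra, dec]))
    gra, gdec = np.meshgrid(grid_ra, grid_dec)
    sigma = kde(np.vstack([gra.ravel(), gdec.ravel()])).reshape(gra.shape)
    return sigma / np.median(sigma) - 1.0  # peaks flag candidate protostructures
```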

    Visual Dependency and Dizziness after Vestibular Neuritis

    Symptomatic recovery after acute vestibular neuritis (VN) is variable, with around 50% of patients reporting long-term vestibular symptoms; hence, it is essential to identify factors related to poor clinical outcome. Here we investigated whether excessive reliance on visual input for spatial orientation (visual dependence) was associated with long-term vestibular symptoms following acute VN. Twenty-eight patients with VN and 25 normal control subjects were included. Patients were enrolled at least 6 months after the acute illness. Recovery status was not a criterion for study entry, allowing recruitment of patients with a full range of persistent symptoms. We measured visual dependence with a laptop-based Rod-and-Disk Test and severity of symptoms with the Dizziness Handicap Inventory (DHI). The third of patients showing the worst clinical outcomes (mean DHI score 36–80) had significantly greater visual dependence than normal subjects (mean error 6.35° vs. 3.39°, p = 0.03). Asymptomatic patients and those with minor residual symptoms did not differ from controls. Visual dependence was associated with high levels of persistent vestibular symptoms after acute VN. Over-reliance on visual information for spatial orientation is one characteristic of poorly recovered vestibular neuritis patients. This finding may be clinically useful given that visual dependence may be modified through rehabilitation desensitization techniques.
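
    A minimal sketch of the kind of group comparison reported above, assuming each subject's visual dependence is summarized as the mean unsigned rod-setting error in degrees; the per-subject scores below are fabricated placeholders, and Welch's t-test stands in for whatever statistics the authors actually used.

```python
import numpy as np
from scipy import stats

def visual_dependence(rod_settings_deg):
    """Mean unsigned error of rod settings relative to true vertical (degrees)."""
    return float(np.mean(np.abs(rod_settings_deg)))

# Hypothetical per-subject scores: worst-outcome patient tercile vs. controls
patients = np.array([5.1, 7.2, 6.0, 8.3, 5.9, 6.8, 4.9, 6.6, 7.0])
controls = np.array([3.2, 2.9, 4.1, 3.6, 3.0, 3.8, 3.3, 3.5])

t_stat, p_value = stats.ttest_ind(patients, controls, equal_var=False)  # Welch's t-test
```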

    Neuronal ribosomes exhibit dynamic and context-dependent exchange of ribosomal proteins

    Owing to their morphological complexity and dense network connections, neurons modify their proteomes locally, using mRNAs and ribosomes present in the neuropil (tissue enriched for dendrites and axons). Although ribosome biogenesis largely takes place in the nucleus and perinuclear region, neuronal ribosomal protein (RP) mRNAs have been frequently detected remotely, in dendrites and axons. Here, using imaging and ribosome profiling, we directly detected the RP mRNAs and their translation in the neuropil. Combining brief metabolic labeling with mass spectrometry, we found that a group of RPs rapidly associated with translating ribosomes in the cytoplasm and that this incorporation was independent of canonical ribosome biogenesis. Moreover, the incorporation probability of some RPs was regulated by location (neurites vs. cell bodies) and by changes in the cellular environment (following oxidative stress). Our results suggest new mechanisms for the local activation, repair and/or specialization of the translational machinery within neuronal processes, potentially providing neuronal synapses with a rapid means to regulate local protein synthesis.

    Functional connectome of brainstem nuclei involved in autonomic, limbic, pain and sensory processing in living humans from 7 Tesla resting state fMRI

    Despite remarkable advances in mapping the functional connectivity of the cortex, the functional connectivity of subcortical regions is understudied in living humans. This is the case for brainstem nuclei that control vital processes, such as autonomic, limbic, nociceptive and sensory functions, because of the lack of precise localization of brainstem nuclei, of adequate sensitivity and resolution in the deepest brain regions, and of processing optimized for the brainstem. To close the gap between the cortex and the brainstem, in 20 healthy subjects we computed a correlation-based functional connectome of 15 brainstem nuclei involved in autonomic, limbic, nociceptive, and sensory function (superior and inferior colliculi, ventral tegmental area-parabrachial pigmented nucleus complex, microcellular tegmental nucleus-parabigeminal nucleus complex, lateral and medial parabrachial nuclei, vestibular and superior olivary complex, superior and inferior medullary reticular formation, viscerosensory motor nucleus, raphe magnus, pallidus, and obscurus, and parvicellular reticular nucleus – alpha part) with the rest of the brain. Specifically, we exploited 1.1 mm isotropic resolution 7 Tesla resting-state fMRI, ad hoc coregistration and physiological noise correction strategies, and a recently developed probabilistic template of brainstem nuclei. Further, we used 2.5 mm isotropic resolution resting-state fMRI data acquired on a 3 Tesla scanner to assess the translatability of our results to conventional datasets. We report highly consistent correlation coefficients across subjects, confirming the available literature on autonomic, limbic, nociceptive and sensory pathways, as well as high interconnectivity within the central autonomic network and the vestibular network. Interestingly, our results showed evidence of vestibulo-autonomic interactions, in line with previous work. Comparison of 7 Tesla and 3 Tesla findings showed high translatability of results to conventional settings for brainstem-cortical connectivity and good, yet weaker, translatability for brainstem-brainstem connectivity. The brainstem functional connectome might bring new insight into the understanding of autonomic, limbic, nociceptive and sensory function in health and disease.
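
    A minimal sketch of the correlation-based connectome computation described above, assuming nucleus and target time series have already been extracted from the preprocessed fMRI; the array shapes and function names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def seed_connectome(seed_ts, target_ts):
    """Pearson correlation of each brainstem nucleus with each target region.

    seed_ts   : (n_timepoints, n_seeds) time series, e.g. 15 brainstem nuclei
    target_ts : (n_timepoints, n_targets) time series for the rest of the brain
    returns   : (n_seeds, n_targets) correlation matrix
    """
    s = (seed_ts - seed_ts.mean(0)) / seed_ts.std(0)
    t = (target_ts - target_ts.mean(0)) / target_ts.std(0)
    return (s.T @ t) / seed_ts.shape[0]

def group_average(per_subject_r):
    """Fisher z-transform, average across subjects, back-transform to r."""
    z = np.arctanh(np.clip(per_subject_r, -0.999999, 0.999999))
    return np.tanh(z.mean(axis=0))
```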

    Structural Dynamic Assessment of the GN2 Piping System for NASA's New and Powerful Reverberant Acoustic Test Facility

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) led the design and build of new world-class vibroacoustic test capabilities at NASA GRC's Plum Brook Station in Sandusky, Ohio, USA, from 2007 to 2011. SAIC-Benham has completed construction of a new reverberant acoustic test facility to support the future testing needs of NASA's space exploration program and commercial customers. The large Reverberant Acoustic Test Facility (RATF) is approximately 101,000 cubic feet in volume and was designed to operate at a maximum empty-chamber acoustic overall sound pressure level (OASPL) of 163 dB. This combination of size and acoustic power is unprecedented amongst the world's known active reverberant acoustic test facilities. Initial checkout acoustic testing was performed in March 2011 by SAIC-Benham at test levels up to 161 dB OASPL. During testing, several branches of the gaseous nitrogen (GN2) piping system, which supplies the fluid to the noise-generating acoustic modulators, failed at the T-junctions connecting the 12 in. supply line to their respective 4 in. branch lines. The problem was initially detected when the oxygen sensors in the horn room indicated a lower than expected oxygen level, from which GN2 leaks in the piping system were inferred. In subsequent follow-up inspections, cracks were identified in the failed T-junction connections through non-destructive evaluation. Through structural dynamic modeling of the piping system, the root cause of the T-junction failures was determined. The structural dynamic assessment identified several possible corrective design improvements to the horn room piping system. The effectiveness of the chosen design repairs was subsequently evaluated in September 2011 during acoustic verification testing to 161 dB OASPL.

    Negotiating the Web Science Curriculum through Shared Educational Artefacts

    EXTENDED ABSTRACT The far-reaching impact of the Web on society is widely recognised and acknowledged. The interdisciplinary study of this impact has crystallised in the field of study known as Web Science. However, defining an agreed, shared understanding of what constitutes Web Science requires complex negotiation and translation of understandings across component disciplines, national cultures and educational traditions. Some individual institutions have already established particular curricula, and discussions in the Web Science Curriculum Workshop series have marked the territory to some extent. This paper reports on a process being adopted across a consortium of partners to systematically create a shared understanding of what constitutes Web Science. It records and critiques the processes instantiated to agree a common curriculum, and presents a framework for future discussion and development.

    The need to study the Web in its complexity, development and impact led to the creation of Web Science. Web Science is inherently interdisciplinary. Its goal is to: a) understand the mechanisms of Web growth; b) create approaches that allow new, more powerful and more beneficial mechanisms to occur. Teaching Web Science is a unique experience because the emerging discipline combines two essential features. On the one hand, the analysis of microscopic laws extrapolated to the macroscopic realm generates observed behaviour. On the other hand, languages and algorithms on the Web are built to produce novel desired computer behaviour that must be put in context. Finding a suitable curriculum that is different from the study of language, algorithms, interaction patterns and business processes is thus an important and challenging task, for the simple reason that we believe the future of sociotechnical systems will lie in their innovative power (inventing new ways to solve problems) rather than in their capacity to optimize current practices.

    The Web Science Curriculum Development (WSCD) Project focuses European expertise in this interdisciplinary endeavour, with the ultimate aim of designing a joint master's programme in Web Science between the partner universities. The process of curriculum definition is being addressed using a negotiation process which mirrors the web science and engineering approach described by Berners-Lee (figure 1 below). The process starts on the engineering side (right). From the technical design point of view, the consortium is creating an open repository of shared educational artefacts using EdShare [1] (based on EPrints) to collect or reference the whole range of educational resources being used in our various programmes. Socially, these resources will be annotated against a curriculum categorization [2], which is itself subject to negotiation and change, currently via a wiki. This last process is represented by complexity and collaboration at the bottom of the diagram. The resources necessarily extend beyond artefacts used in the lecture and seminar room, encompassing artefacts associated with the administrative and organisational processes necessary to assure the comparability of the educational resources and underwrite the quality standards of the associated awards.

    Figure 1: Web Science and Engineering Approach (see http://www.w3.org/2007/Talks/0314-soton-tbl/#%2811%29)

    From the social point of view, the contributions will be discussed and peer reviewed by members of the consortium. Our intention is that by sharing the individual components of the teaching and educational process, and quality assuring them by peer review, we will provide concrete examples of our understanding of the discipline. However, as Berners-Lee observes, it is in the move from the micro to the macro that the magic (complexity) is involved. The challenge for our consortium, once our community repository is adequately populated, is to involve the wider community in the contribution, discussion and annotation that will lead to a negotiated and agreed, yet evolving, curriculum for Web Science.

    Others have worked on using community approaches to developing curricula. For example, the Computer Science community maintains a repository of existing syllabi [3] that enables designers of new courses to understand how others have approached the problem, and the Information Science community is using a wiki [4] to enable the whole community to contribute to the dynamic development of its curriculum. What makes this project unique is that rather than taking a top-down, structured approach to curriculum definition, it takes a bottom-up approach, using the actual teaching materials as the basis on which to iteratively negotiate and refine the definition of the curriculum.