Suitability of Fast Healthcare Interoperability Resources (FHIR) for Wellness Data
Wellness data generated by patients using smartphones and portable devices can be a key part of Personal Health Record (PHR) data and offers healthcare providers patient health information on a daily basis. Prior research has identified the potential for improved communication between healthcare provider and patient. However, the practice of sharing patient-generated wellness data has not been widely adopted by the healthcare sector, one reason being a lack of interoperability that prevents successful integration of such device-generated data into PHR and Electronic Health Record (EHR) systems. To address the interoperability issue, it is important to ensure that wellness data can be supported by healthcare information exchange standards. This study uses Fast Healthcare Interoperability Resources (FHIR), which is widely expected to become the future healthcare information exchange standard, to assess the technical feasibility of exchanging patient-generated wellness data.

A conceptual data model of wellness data was developed and evaluated against the FHIR standard. The model contained blood glucose readings, blood pressure readings, and Body Mass Index (BMI) data, and could be extended to accept other types of wellness data. The wellness data model was packaged in the official FHIR resource Observation. The study demonstrated the flexibility of adding new wellness-related data elements to Observation, met FHIR's requirements for including data elements useful in the self-management of chronic diseases, and showed the potential for sharing such data with healthcare provider systems.
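As a rough illustration of how a single patient-generated reading might be packaged in an Observation resource, the Python sketch below builds a minimal FHIR Observation for a blood glucose value. It is not taken from the study: the LOINC code, UCUM unit, and field choices are illustrative assumptions, and a real implementation would follow the study's conceptual data model and profiling.

import json

def glucose_observation(patient_id, mg_per_dl, when):
    """Minimal FHIR Observation for a patient-generated blood glucose reading.

    The LOINC code and UCUM unit are illustrative; the study's actual
    Observation profile may differ.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "2339-0",  # Glucose [Mass/volume] in Blood (illustrative)
                "display": "Glucose [Mass/volume] in Blood",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": when,
        "valueQuantity": {
            "value": mg_per_dl,
            "unit": "mg/dL",
            "system": "http://unitsofmeasure.org",
            "code": "mg/dL",
        },
    }

if __name__ == "__main__":
    print(json.dumps(glucose_observation("example", 95, "2024-01-15T08:30:00Z"), indent=2))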
Resonant Scattering of Surface Plasmon Polaritons by Dressed Quantum Dots
The resonant scattering of surface plasmon-polariton waves by embedded
semiconductor quantum dots above the dielectric/metal interface is explored in
the strong-coupling regime. In contrast to non-resonant scattering by a
localized dielectric surface defect, a strong resonant peak in the scattering
field spectrum is predicted and accompanied by two side valleys. The peak
height depends nonlinearly on the amplitude of surface plasmon-polariton waves,
reflecting the feedback dynamics from a photon-dressed electron-hole plasma
inside the quantum dots. This unique behavior in the scattering field peak
strength is correlated with the occurrence of a resonant dip in the absorption
spectrum of surface plasmon-polariton waves due to the interband photon-dressing
effect. Our result on the scattering of surface plasmon-polariton waves may be
experimentally observable and applied to spatially selective illumination and
imaging of individual molecules.
Comment: 15 pages, 3 figures
Controlling quantum-dot light absorption and emission by a surface-plasmon field
The possibility of controlling the probe-field optical gain, absorption
switching, and photon conversion by a surface-plasmon-polariton near field is
explored for a quantum dot above the surface of a metal. In contrast to the
linear response in the weak-coupling regime, the calculated spectra show an
induced optical gain and a triply-split spontaneous emission peak resulting
from the interference between the surface-plasmon field and the probe or
self-emitted light field in such a strongly-coupled nonlinear system. Our
result on the control of the mediated photon-photon interaction, very similar
to the `gate' control in an optical transistor, may be experimentally
observable and applied to ultra-fast intrachip/interchip optical interconnects,
improvement in the performance of fiber-optic communication networks, and the
development of optical digital computers and quantum communications.
Comment: 7 pages, 15 figures
New metrics for prioritized interaction test suites
Combinatorial interaction testing has been well studied in recent years and has been widely applied in practice. It generally aims at generating an effective test suite (an interaction test suite) in order to identify faults that are caused by parameter interactions. Due to some constraints in practical applications (e.g. limited testing resources), for example in combinatorial interaction regression testing, prioritized interaction test suites (called interaction test sequences) are often employed. Consequently, many strategies have been proposed to guide interaction test suite prioritization. It is, therefore, important to be able to evaluate the different interaction test sequences created by different strategies. A well-known metric is the Average Percentage of Combinatorial Coverage (APCCλ for short), which assesses the rate at which a given interaction test sequence S covers the interactions of a strength λ (the level of interaction among parameters). However, APCCλ has two drawbacks: firstly, it imposes two requirements (that all test cases in S be executed, and that all possible λ-wise parameter-value combinations be covered by S); and secondly, it can only use a single strength λ (rather than multiple strengths) to evaluate the interaction test sequence, which means that its evaluation is not comprehensive. To overcome the first drawback, we propose an enhanced metric, the Normalized APCCλ (NAPCC), to replace APCCλ. Additionally, to overcome the second drawback, we propose three new metrics: the Average Percentage of Strengths Satisfied (APSS); the Average Percentage of Weighted Multiple Interaction Coverage (APWMIC); and the Normalized APWMIC (NAPWMIC). These metrics comprehensively assess a given interaction test sequence by considering interaction coverage at different strengths. Empirical studies show that the proposed metrics can be used to distinguish different interaction test sequences, and hence to compare different test prioritization strategies.
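To make the rate-of-coverage idea behind these metrics concrete, the Python sketch below computes the cumulative fraction of λ-wise parameter-value combinations covered after each test case and averages that curve over the sequence. This is a deliberately simplified, assumption-laden illustration (it normalizes by the combinations the sequence eventually covers, in the spirit of NAPCC, and is not the paper's exact formula), but it shows why two orderings of the same test cases can receive different scores.

from itertools import combinations

def coverage_curve(test_sequence, strength):
    # Cumulative fraction of distinct strength-wise parameter-value combinations
    # covered after each test, relative to all combinations the full sequence covers.
    all_covered = set()
    for test in test_sequence:
        for positions in combinations(range(len(test)), strength):
            all_covered.add((positions, tuple(test[p] for p in positions)))
    covered, curve = set(), []
    for test in test_sequence:
        for positions in combinations(range(len(test)), strength):
            covered.add((positions, tuple(test[p] for p in positions)))
        curve.append(len(covered) / len(all_covered))
    return curve

def average_coverage(test_sequence, strength):
    # APCC/NAPCC-style score: higher means combinations are covered earlier.
    curve = coverage_curve(test_sequence, strength)
    return sum(curve) / len(curve)

if __name__ == "__main__":
    # Three binary parameters; the same four tests in two different orders.
    order_a = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
    order_b = [(0, 0, 0), (1, 1, 1), (0, 0, 1), (1, 1, 0)]
    print(average_coverage(order_a, 2))  # 0.65
    print(average_coverage(order_b, 2))  # 0.675 -- pairs are covered slightly earlier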
Dissociation and Degradation of Thiol-Modified DNA on Gold Nanoparticles in Aqueous and Organic Solvents
This document is the Accepted Manuscript version of a Published Work that appeared in final form in Langmuir, copyright © American Chemical Society, after peer review and technical editing by the publisher. To access the final edited and published work see http://dx.doi.org/10.1021/la200241d

Gold nanoparticles functionalized with thiol-modified DNA have been widely used in making various nanostructures, colorimetric biosensors, and drug delivery vehicles. Over the past 15 years, significant progress has been made to improve the stability of such functionalized nanoparticles. The stability of the gold–thiol bond in this system, however, has not been studied in a systematic manner. Most information on the gold–thiol bond was obtained from the study of self-assembled monolayers (SAMs). In this study, we employed two fluorophore-labeled, thiol-modified DNAs and studied the long-term stability of the thiol–gold bond as a function of time, salt, temperature, pH, and organic solvent. We found that the bond spontaneously dissociated under all tested conditions. The dissociation was favored at high salt, high pH, and high temperature, and little DNA degradation was observed in our system. Most organic solvents showed a moderate protection effect on the gold–thiol bond. The stability of the gold–thiol bond in the DNA system was also compared with that in SAMs. While there are many similarities, we also observed opposite trends for the salt and ethanol effects. This study suggests that purified DNA-functionalized gold nanoparticles should be freshly prepared and used within a day or two. Long-term storage should be carried out at relatively low temperature in low-salt and slightly acidic buffers.
TransformCode: A Contrastive Learning Framework for Code Embedding via Subtree Transformation
Large-scale language models have made great progress in the field of software
engineering in recent years. They can be used for many code-related tasks such
as code clone detection, code-to-code search, and method name prediction.
However, these large-scale language models, which operate on individual code
tokens, have several drawbacks: they are usually large, heavily dependent on
labels, and require substantial computing power and time to fine-tune on new
datasets. Furthermore, code embedding should be performed on the entire code
snippet rather than on each code token, because token-level encoding inflates
the number of model parameters, many of which store information of little
relevance. In this paper, we propose a novel framework, called TransformCode,
that learns code embeddings in a contrastive manner. The framework uses the
Transformer encoder as an integral part of the model. We also introduce a novel
data augmentation technique called abstract syntax tree transformation: This
technique applies syntactic and semantic transformations to the original code
snippets to generate more diverse and robust anchor samples. Our proposed
framework is both flexible and adaptable: It can be easily extended to other
downstream tasks that require code representation such as code clone detection
and classification. The framework is also very efficient and scalable: It does
not require a large model or a large amount of training data, and can support
any programming language. Finally, our framework is not limited to unsupervised
learning, but can also be applied to some supervised learning tasks by
incorporating task-specific labels or objectives. To explore the effectiveness
of our framework, we conducted extensive experiments on different software
engineering tasks, using different programming languages and multiple datasets.
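As a generic sketch of the contrastive objective such a framework might optimize (not the paper's exact model), the Python/PyTorch code below computes an NT-Xent loss between embeddings of code snippets and their transformed variants; the encoder and the AST transformation itself are left abstract, and all names are illustrative.

import torch
import torch.nn.functional as F

def nt_xent_loss(anchor, positive, temperature=0.1):
    # NT-Xent contrastive loss over a batch of (anchor, positive) embedding pairs:
    # each anchor's positive is the embedding of its AST-transformed variant,
    # and every other snippet in the batch serves as a negative.
    a = F.normalize(anchor, dim=1)
    p = F.normalize(positive, dim=1)
    reps = torch.cat([a, p], dim=0)              # (2B, dim)
    sim = reps @ reps.t() / temperature          # scaled cosine similarities
    n = a.size(0)
    self_mask = torch.eye(2 * n, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(self_mask, float("-inf"))   # drop self-similarity
    # Row i's positive sits at row i + n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(sim.device)
    return F.cross_entropy(sim, targets)

if __name__ == "__main__":
    torch.manual_seed(0)
    # Stand-ins for encoder outputs of 8 snippets and their transformed variants.
    z_orig = torch.randn(8, 128)
    z_transformed = torch.randn(8, 128)
    print(nt_xent_loss(z_orig, z_transformed).item())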
Theory of Myelin Coiling
A new model is proposed to explain coiling of myelins composed of fluid
bilayers. This model allows the constituent bilayer cylinders of a myelin to be
non-coaxial and the bilayer lateral tension to vary from bilayer to bilayer.
The calculations show that a myelin would bend or coil to lower its free energy
when the bilayer lateral tension is sufficiently large. From a mechanical point
of view, the proposed coiling mechanism is analogous to the classical Euler
buckling of a thin elastic rod under axial compression. The analysis of a
simple two-bilayer case suggests that a bilayer lateral tension of about 1
dyne/cm can easily induce coiling of myelins of typical lipid bilayers. This
model signifies the importance of bilayer lateral tension in determining the
morphology of myelinic structures.
Comment: 17 pages, 8 figures, submitted to Eur. Phys. J.
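For reference, the classical Euler instability the abstract invokes is the textbook buckling of a pinned elastic rod; the model's actual coiling threshold, set by the bilayer lateral tension, is analogous but is not reproduced here.

% Euler's critical load for a pinned-pinned rod of length L and bending
% stiffness EI: the rod buckles once the axial compressive force exceeds
F_c = \frac{\pi^2 E I}{L^2}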