Stability of Diluted Adenosine Solutions in Polyolefin Infusion Bags
Background
Intravenous or intracoronary adenosine is used in the cardiac catheterization laboratory to achieve maximal coronary blood flow and determine fractional flow reserve.
Objective
To determine the stability of adenosine 10 and 50 µg/mL in either 0.9% sodium chloride injection or 5% dextrose injection in polyolefin infusion bags stored at two temperatures: refrigeration (2°C-8°C) or controlled room temperature (20°C-25°C).
Methods
Adenosine 10 µg/mL and 50 µg/mL solutions were prepared in 50 mL polyolefin infusion bags containing 0.9% sodium chloride injection or 5% dextrose injection and stored at controlled room temperature or under refrigeration. Each combination of concentration, diluent, and storage condition was prepared in triplicate. Samples were assayed using stability-indicating, reversed-phase high-performance liquid chromatography at time 0 (immediately after preparation) and at 24 hours, 48 hours, 7 days, and 14 days. Stability was defined as retaining 90% to 110% of the initial adenosine concentration. The samples were also visually inspected against a light background for clarity, color, and the presence of particulate matter.
Results
After 14 days, all samples retained 99% to 101% of the initial adenosine concentration. No appreciable change in pH or visual appearance was observed. The stability data indicated no significant loss of drug to chemical degradation or physical interaction during storage.
Conclusion
Adenosine solutions of 10 and 50 µg/mL were stable for at least 14 days in 50 mL polyolefin infusion bags of 0.9% sodium chloride injection or 5% dextrose injection stored under controlled room temperature or refrigerated conditions.
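As a minimal illustration of the acceptance criterion used in this study, the Python sketch below applies the 90%-110% retention window to a series of assay values; all concentrations are hypothetical, not the study's measurements.

    # Sketch of the stability criterion: a sample is stable while the assayed
    # concentration stays within 90%-110% of the time-0 value.
    # All values below are hypothetical, in ug/mL.
    def percent_of_initial(initial, measured):
        return 100.0 * measured / initial

    def is_stable(initial, measured, low=90.0, high=110.0):
        return low <= percent_of_initial(initial, measured) <= high

    assays = {"0 h": 10.0, "24 h": 10.1, "48 h": 9.9, "7 d": 10.0, "14 d": 9.95}
    for timepoint, conc in assays.items():
        pct = percent_of_initial(assays["0 h"], conc)
        print(timepoint, f"{pct:.1f}%",
              "stable" if is_stable(assays["0 h"], conc) else "out of range")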
Analysis of a Change in Forestry Regime: The Case of New Brunswick in the 1980s
In this study of the New Brunswick forestry policy system, a system that has been in place for over twenty-five years, the authors examine the government’s rationale for its development. Few studies have investigated the factors that contributed to the implementation of this plan. The research in this article is based on the legislative restrictions imposed on forestry in the province, relevant publications, and interviews with experts in the field. Results indicate that in the early 1980s there was general dissatisfaction with the system in place. Both the public and legislators perceived that an eventual shortage of lumber products would occur due to inappropriate cutting practices, wastage, and insufficient silvicultural practices for renewing the resource. This article also shows how the government shifted at least some of the burden of managing the exploitation of Crown lands to big business, while maintaining a general preoccupation with long-term management. Résumé: New Brunswick’s forestry regime is already more than twenty-five years old. What prompted the government to put such a regime in place? Few studies dwell on the reasons behind this new regime. These reasons are both contextual and structural: the perception of an eventual shortage, by the population as much as by legislators; the inappropriate application of clear-cutting according to some authors, combined with the ravages of the spruce budworm, burned-over areas, and the waste of woody material in the forest; and finally, insufficient silvicultural practices and general dissatisfaction with the regime at the turn of the 1980s. It is also worth highlighting the State’s disengagement from forest management, which transferred the costs to big business despite a general preoccupation with long-term management. This article thus draws on the results of a case study supported by methodological triangulation, taking into account legislative hansards, texts from the period in question, and interviews with practitioners in the field.
Galactic Cosmic Rays from PBHs and Primordial Spectra with a Scale
We consider the observational constraints from the detection of antiprotons
in the Galaxy on the amount of Primordial Black Holes (PBH) produced from
primordial power spectra with a bumpy mass variance. Though essentially
equivalent at the present time to the constraints from the diffuse γ-ray
background, they provide a largely independent approach and should improve
appreciably in the near future. We discuss the resulting constraints on
inflationary parameters using a Broken Scale Invariance (BSI) model as a
concrete example. Comment: 10 pages, 3 figures. Version accepted in Phys. Lett. B, conclusions unchanged.
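For context, the link between a bumpy mass variance and the PBH abundance is usually made through a Press-Schechter-type estimate; a standard form (not quoted in this abstract, and assuming Gaussian fluctuations with collapse threshold \delta_c) is

    % PBH mass fraction at formation from the mass variance \sigma(M)
    % of the primordial spectrum, for Gaussian fluctuations.
    \beta(M) \simeq \operatorname{erfc}\!\left( \frac{\delta_c}{\sqrt{2}\,\sigma(M)} \right)

so a bump in \sigma(M) translates directly into an enhanced PBH fraction at the corresponding mass scale.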
Virtual Battlespace Scenario Encoding for Reuse, Phase I Report
The United States Army and United States Marine Corps employ the Virtual
Battlespace 2 (VBS2) commercial game for small unit training. Each service has
significant investment in training scenarios constructed using VBS2 tools and conforming
to the vendor's particular data formats. To move toward improved interoperability, to
gain greater fiscal flexibility in addressing the statutory intent for open competition and
affordability, and to protect the investment made in models, terrain, and other elements of
training scenarios that are separate and distinct from the virtual and gaming environments
in which the simulation executes, open standards need to be applied in place of
proprietary commercial off-the-shelf architectures. In the current (and foreseeable)
environment of constrained budgets, it is ever more critical that the services protect and
enhance their investments in simulation systems used for training and other purposes.
Expanding capabilities for open scenario interchange will improve scenario reuse while
creating greater opportunities for simulation data interchange and open competition for
future gaming capabilities. The Extensible Markup Language (XML) is a widespread approach to describing
data format and content to support efficient data processing and data interchange across
systems. This report describes initial application of XML technologies to the
representation of VBS2 scenario data, demonstrating feasibility for the capture and
exchange of VBS2 scenario data. The report provides a plan of action for a follow-on
phase of the effort to expand the representation and for use with other XML-based
standards, such as the Military Scenario Definition Language (MSDL), to create
opportunities for broader interchange of scenario data across a variety of combat
simulations.
Commander, Marine Corps Systems Command (DC SIAT); Marine Corps Systems Command, Modeling and Simulation Organization. Approved for public release; distribution is unlimited.
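As a concrete illustration of the kind of encoding the report describes, the sketch below builds a small XML scenario fragment with Python's standard xml.etree.ElementTree. The element and attribute names (scenario, unit, position, equipment) are hypothetical; they are not the VBS2 data format or the MSDL schema.

    # Hypothetical sketch of encoding a training-scenario entity as XML using
    # Python's standard library. Names are illustrative only.
    import xml.etree.ElementTree as ET

    scenario = ET.Element("scenario", name="patrol_example")
    unit = ET.SubElement(scenario, "unit", id="rifle_squad_1", side="BLUFOR")
    ET.SubElement(unit, "position", lat="34.05", lon="-117.25")
    ET.SubElement(unit, "equipment", type="rifle")

    # Serialize to an XML string that any conforming parser can exchange.
    print(ET.tostring(scenario, encoding="unicode"))

Once scenario content lives in a vendor-neutral representation like this, it can be validated, transformed, and exchanged across simulations independently of any one gaming environment.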
Transient Accelerated Expansion and Double Quintessence
We consider Double Quintessence models for which the Dark Energy sector
consists of two coupled scalar fields. We study in particular the possibility
of transient acceleration in these models. In both Double Quintessence models
studied here, it is shown that if acceleration occurs, it is necessarily
transient. We also consider the possibility of transient acceleration in two
one-field models, the Albrecht-Skordis model and the pure exponential.
Using separate conservative constraints (marginalizing over the other
parameters) on the effective equation of state, the relative density of the
Dark Energy, and the present age of the universe, we
construct scenarios with a transient acceleration that has already ended at the
present time, and even with no acceleration at all, but a less conservative
analysis using the CMB data rules out the last possibility. The scenario with a
transient acceleration that has ended by today can be implemented for an
appropriate range of cosmological parameters. Comment: Version accepted in
Phys. Rev. D, 22 pages, 10 figures, 4 tables.
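For reference, the acceleration these scenarios switch on and off is governed by the standard Friedmann criterion; assuming a positive total energy density, a minimal LaTeX statement is

    % Expansion accelerates only while the total equation of state
    % is below -1/3.
    \frac{\ddot a}{a} = -\frac{4\pi G}{3}\,(\rho_{\rm tot} + 3 p_{\rm tot}) > 0
    \quad\Longleftrightarrow\quad
    w_{\rm tot} \equiv \frac{p_{\rm tot}}{\rho_{\rm tot}} < -\tfrac{1}{3}

Transient acceleration then corresponds to w_tot dipping below -1/3 for a finite interval only before rising above it again.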
Quantum error correction benchmarks for continuous weak parity measurements
We present an experimental procedure to determine the usefulness of a
measurement scheme for quantum error correction (QEC). A QEC scheme typically
requires the ability to prepare entangled states, to carry out multi-qubit
measurements, and to perform certain recovery operations conditioned on
measurement outcomes. As a consequence, the experimental benchmark of a QEC
scheme is a tall order because it requires the conjunction of many elementary
components. Our scheme opens the path to experimental benchmarks of individual
components of QEC. Our numerical simulations show that certain parity
measurements realized in circuit quantum electrodynamics are on the verge of
being useful for QEC.
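As a toy illustration of how parity outcomes condition a recovery operation, the Python sketch below runs the three-qubit bit-flip code with projective parities over classical bits; it is a deliberate caricature, not the continuous weak-measurement circuit-QED scheme the abstract describes.

    # Toy parity-based error correction on the three-qubit bit-flip code.
    def syndromes(bits):
        """Parities of qubit pairs (0,1) and (1,2)."""
        return bits[0] ^ bits[1], bits[1] ^ bits[2]

    def correct(bits):
        """Flip the single qubit identified by the two parity outcomes."""
        s01, s12 = syndromes(bits)
        flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get((s01, s12))
        if flip is not None:
            bits[flip] ^= 1
        return bits

    # A logical 0 encoded as [0, 0, 0] with a bit flip on qubit 1:
    print(correct([0, 1, 0]))  # -> [0, 0, 0]

The point the abstract makes is that benchmarking each such ingredient (here, the parity measurement) can be done separately, without assembling the full preparation-measurement-recovery chain at once.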
Characterization of bioaerosols from dairy barns : reconstructing the puzzle of occupational respiratory diseases by using molecular approaches
To understand the etiology of exposure-related diseases and to establish standards for reducing the risks associated with working
in contaminated environments, the exact nature of the bioaerosol components must be defined. Molecular biology tools were
used to evaluate airborne bacterial and, for the first time, archaeal content of dairy barns. Three air samplers were tested in each
of the 13 barns sampled. Up to 10⁶ archaeal and 10⁸ bacterial 16S rRNA genes
per m³ of air were detected. Archaeal methanogens, mainly Methanobrevibacter
species, were represented. Saccharopolyspora rectivirgula, the causative agent
of farmer’s lung, was quantified at up to 10⁷ 16S rRNA genes per m³ of air. In
addition, a wide variety of bacterial agents were present in our air samples
within the high airborne bioaerosol concentration range. Despite recommendations regarding hay preservation and baling
conditions, farmers still develop an S. rectivirgula-specific humoral immune response, suggesting intense and continuous exposure.
Our results demonstrate the complexity of bioaerosol components in dairy barns, which could play a role in occupational
respiratory diseases.
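Concentrations like those above come from scaling qPCR copy counts by the volume of air drawn through the sampler. A minimal sketch of that arithmetic, with hypothetical numbers (the study's actual samplers, reaction counts, and volumes may differ):

    # Scale qPCR gene-copy counts to an airborne concentration.
    def copies_per_m3(copies_per_reaction, reactions_per_sample, air_volume_m3):
        """Total gene copies in the sample divided by the air volume sampled."""
        return copies_per_reaction * reactions_per_sample / air_volume_m3

    # E.g. 2.5e5 copies per reaction, 40 reaction-equivalents per sample,
    # and 10 m3 of air sampled -> 1e6 copies per m3.
    print(f"{copies_per_m3(2.5e5, 40, 10):.2e} copies/m3")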
Modeling Expeditionary Advanced Base Operations in the Combined Arms Analysis Tool for the 21st Century (COMBATXXI)
The United States Marine Corps (USMC) is undergoing organizational and operational changes to adapt to new warfighting requirements in
today’s world. The USMC Force Design 2030 describes new concepts, such as Expeditionary Advanced Base Operations (EABO), with a focus on
reconnaissance/counter-reconnaissance and maritime interdiction. To examine and evaluate new concepts of operation, force structures, weapon
systems, tactics, techniques, and procedures, as well as other adaptations for such operations, the USMC requires models and simulations that
can represent the full range of variations related to these expected changes. The Combined Arms Analysis Tool for the 21st Century
(COMBATXXI) is a combat simulation jointly developed by the USMC and the US Army to support modeling and analysis. Developed over the past
20 years, COMBATXXI possesses many of the fundamental capabilities needed to study these new concepts but currently lacks realistic
representation in some key areas, such as maritime surface combatants needed for examining critical aspects of the new role of maritime
interdiction. Such representation requires platform identification, targeting, and assessment of damage that can lead to determination of their
continued ability to perform operational missions. The purpose of this study is to examine new warfighting concepts related to EABO and to
identify relevant modeling approaches using the COMBATXXI simulation. The study describes a modeling approach, initial implementation of that
approach in COMBATXXI, and preliminary evaluation of the utility of the model for supporting scenarios and studies relevant to the new USMC
concepts of operation. The study concludes with recommendations for follow-on work to further improve or employ the developed capability.
Marine Corps Combat Development Command, Operations Analysis Directorate, Capabilities Development and Integration. Approved for public release; distribution is unlimited.
Ten Little Tinder Hacks: Algorithms in the Service of a Speculative Economy of Romantic and Sexual Encounters
The mediation of love and sexuality has often relied on dating advice. This paper analyses a corpus of dating guides concerning the popular app Tinder. Our analysis suggests that the practice of algorithmic blackboxing, that is, characterizing algorithms as black boxes, is central to these guides, which rely on the alleged opacity of Tinder’s algorithms to praise certain products, advice, or services that aim to facilitate matchmaking. The practice of algorithmic blackboxing primarily encourages users to hack Tinder. In this context, the notion of hack stresses the importance for users of modulating their behaviours in ways that make them more algorithmically recognisable, especially in relation to their gender. Thus, we suggest that Tinder’s algorithms participate in reproducing a speculative dating economy.