Shock Temperature of Stainless Steel and a High Pressure - High Temperature Constraint on Thermal Diffusivity of Al_2O_3
Time-dependent shock temperatures were measured for stainless steel (SS) films in contact with transparent anvils. The anvil/window material was the same as the driver material so that there would be symmetric heat flow from the sample. Inferred Hugoniot temperatures, T_h, of 5800–7500 K at 232–321 GPa are consistent with previous measurements in SS. Temperatures at the film-anvil interface (T_i), which are more directly measured than T_h, indicate that T_i did not decrease measurably during the approximately 250 ns that the shock wave was in the Al_2O_3 or LiF anvils. Thus an upper bound is obtained for the thermal diffusivity of Al_2O_3 at the metal/anvil interface at 230 GPa and 6000 K of κ ≤ 0.00096 cm^2/s. This is a factor of 17 lower than previously calculated values, resulting in a decrease of the inferred T_h by 730 K. The observed shock temperatures are combined with temperatures calculated from measured Hugoniots and are used to calculate thermal conductivities of Al_2O_3. We also note that, since there was no measurable intensity decrease while the shock wave propagated through the window, Al_2O_3 remained transparent in the shocked state. Thus sapphire is a good window material, to at least 250 GPa, for shock temperature measurements on metals.
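As a back-of-envelope illustration (not a calculation from the paper), the reported diffusivity bound and the ~250 ns observation window imply a characteristic thermal diffusion length L = sqrt(κt) at the interface:

```python
import math

# Illustrative estimate only: characteristic thermal diffusion length
# during the observation window, using the reported upper bound on the
# diffusivity of shocked Al2O3.
kappa = 0.00096   # cm^2/s, reported upper bound on thermal diffusivity
t = 250e-9        # s, approximate time the shock wave was in the anvil

L_cm = math.sqrt(kappa * t)   # diffusion length L = sqrt(kappa * t)
L_nm = L_cm * 1e7             # 1 cm = 1e7 nm

print(f"diffusion length ~ {L_nm:.0f} nm")  # prints: diffusion length ~ 155 nm
```

Heat penetrates only ~150 nm of anvil in that time, which is consistent with the interface temperature showing no measurable decrease over the window.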
Automated grading system for evaluation of ocular redness associated with dry eye
Background: We have observed that dry eye redness is characterized by a prominence of fine horizontal conjunctival vessels in the exposed ocular surface of the interpalpebral fissure, and have incorporated this feature into the grading of redness in clinical studies of dry eye. Aim: To develop an automated method of grading dry eye-associated ocular redness in order to expand on the clinical grading system currently used. Methods: Ninety-nine images from 26 dry eye subjects were evaluated by five graders using a 0–4 (in 0.5 increments) dry eye redness (Ora Calibra™ Dry Eye Redness Scale [OCDER]) scale. For the automated method, the OpenCV computer vision library was used to develop software for calculating redness and horizontal conjunctival vessels (noted as “horizontality”). From the original photograph, the region of interest (ROI) was selected manually using the open-source ImageJ software. Total average redness intensity (Com-Red) was calculated as a single-channel 8-bit image as R − 0.83G − 0.17B, where R, G and B were the respective intensities of the red, green and blue channels. The location of vessels was detected by normalizing the blue channel and selecting pixels with an intensity of less than 97% of the mean. The horizontal component (Com-Hor) was calculated by the first-order Sobel derivative in the vertical direction, and the score was calculated as the average blue-channel image intensity of this vertical derivative. Pearson correlation coefficients, accuracy and concordance correlation coefficients (CCC) were calculated after regression and standardized regression of the dataset. Results: The agreement (both Pearson’s and CCC) among investigators using the OCDER scale was 0.67, while the agreement of investigator to computer was 0.76. A multiple regression using both redness and horizontality improved the agreement CCC from 0.66 and 0.69 to 0.76, demonstrating the contribution of vessel geometry to the overall grade.
Computer analysis of a given image has 100% repeatability and zero variability from session to session. Conclusion: This objective means of grading ocular redness in a unified fashion has potential significance as a new clinical endpoint. In comparisons between computer and investigator, computer grading proved to be more reliable than another investigator using the OCDER scale. The best-fitting model based on the present sample, and usable for future studies, was C4 = −12.24 + 2.12·C2HOR + 0.88·C2RED, where C4 is the predicted investigator grade, and C2HOR and C2RED are logarithmic transformations of the computer-calculated parameters Com-Hor and Com-Red. Considering its superior repeatability, computer-automated grading might be preferable to investigator grading in multicentered dry eye studies, in which the subtle differences in redness incurred by treatment have historically been difficult to define.
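The two image scores described above can be sketched in a few lines of NumPy. This is a minimal illustration of the stated channel weighting and vertical-derivative idea, not the authors' software: a plain finite difference stands in for the Sobel operator, and the 97%-of-mean vessel thresholding step is omitted.

```python
import numpy as np

def com_red(rgb):
    """Average redness per the paper's weighting: mean of R - 0.83G - 0.17B."""
    r, g, b = (rgb[..., k].astype(float) for k in (0, 1, 2))
    return float(np.mean(r - 0.83 * g - 0.17 * b))

def com_hor(rgb):
    """Horizontality score: mean absolute first-order vertical derivative of
    the blue channel (horizontal vessels produce strong vertical gradients).
    A finite difference stands in for the Sobel operator here."""
    b = rgb[..., 2].astype(float)
    return float(np.mean(np.abs(np.diff(b, axis=0))))

# Toy inputs: a uniform red patch, and the same patch with horizontal
# stripes added in the blue channel (mimicking horizontal vessels).
uniform = np.zeros((6, 6, 3), dtype=np.uint8)
uniform[..., 0] = 200
stripes = uniform.copy()
stripes[::2, :, 2] = 150

print(com_red(uniform))                    # 200.0
print(com_hor(uniform), com_hor(stripes))  # 0.0 vs a large stripe response
```

The horizontal stripes score high on `com_hor` while a flat patch scores zero, which is the geometric cue the regression model exploits.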
Recycling bins, garbage cans or think tanks? Three myths regarding policy analysis institutes
The phrase 'think tank' has become ubiquitous – overworked and underspecified – in the political lexicon. It is entrenched in scholarly discussions of public policy as well as in the 'policy wonk' talk of journalists, lobbyists and spin-doctors. This does not mean that there is an agreed definition of think tank or a consensual understanding of their roles and functions. Nevertheless, the majority of organizations with this label undertake policy research of some kind. The idea of think tanks as a research communication 'bridge' presupposes that there are discernible boundaries between (social) science and policy. This paper will investigate some of these boundaries. The frontiers are not only organizational and legal; they also exist in how the 'public interest' is conceived by these bodies and their financiers. Moreover, the social interactions and exchanges involved in 'bridging' themselves muddy the conception of 'boundary', allowing analysis to go beyond the dualism imposed by seeing science on one side of the bridge and the state on the other, and to address the complex relations between experts and public policy.
Identifying house price diffusion patterns among Australian state capital cities
Prior research supports the proposition that house price diffusion shows a ripple effect along the spatial dimension. That is, house price changes in one region are reflected in subsequent house price changes in other regions, showing certain linkages among regions. Using a vector autoregression model and the impulse response function, this study investigates house price diffusion among Australia's state capital cities, examining the response of one market to innovations in other markets and determining the lagged terms at which the other markets' responses reach their maximum absolute value. The results show that the most influential subnational markets in Australia are not Sydney but rather Canberra and Hobart, while the Darwin market plays the role of a buffer. The safest markets are Sydney and Melbourne. This study helps to predict house price movement trends in the eight capital cities.
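The VAR/impulse-response machinery the study relies on can be sketched with a toy two-city example (hypothetical coefficients, not the paper's data or specification). For a VAR(1), x_t = A·x_{t-1} + e_t, the response of market i at horizon h to a unit shock in market j is the (i, j) entry of A^h, and the "lagged term of the maximum absolute response" is simply the horizon that maximizes it:

```python
import numpy as np

# Hypothetical VAR(1) coefficient matrix for two cities (illustration only).
A = np.array([[0.5, 0.3],
              [0.2, 0.6]])

def impulse_responses(A, horizons):
    """Return IRF matrices Psi_h = A^h for h = 0..horizons (VAR(1) case)."""
    psi = [np.eye(A.shape[0])]
    for _ in range(horizons):
        psi.append(A @ psi[-1])
    return psi

irf = impulse_responses(A, 10)

# Horizon at which city 0's response to a unit shock in city 1 peaks:
lag = max(range(11), key=lambda h: abs(irf[h][0, 1]))
print(lag)  # 2: the cross-market response builds up before dying out
```

With these coefficients the cross response rises from 0.3 at lag 1 to 0.33 at lag 2 and then decays, so the peak lag is 2; the study performs the analogous computation for all city pairs.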
Depoliticisation, Resilience and the Herceptin Post-code Lottery Crisis: Holding Back the Tide
This article:
Covers new empirical terrain in the study of depoliticisation, with an in-depth case study of health technology regulation;
Analyses depoliticisation from a novel analytical perspective, examining how depoliticised institutions are resilient to external pressure for politicisation;
Posits a distinctive framework for analysing resilience, drawing on cognate literatures on policy networks and agencification;
Raises interesting and distinctive questions about the nature of depoliticisation in advanced liberal democracies, arguing it is more contested than commonly acknowledged.
Depoliticisation as a concept offers distinctive insights into how governments attempt to relieve political pressures in liberal democracies. Analysis has examined the effects of depoliticisation tactics on the public, but not how those tactics are sustained during moments of political tension. Drawing on policy networks and agencification literatures, this article examines how these tactics are resilient against pressure for politicisation. Using an in-depth case study of the controversial appraisal of the cancer drug Herceptin in 2005/6 by the National Institute for Health and Clinical Excellence (NICE), the article examines how ‘resilient’ NICE was to external politicisation. It is argued that NICE was resilient because it was effectively ‘insulated’ by formal procedures and informal norms of deference to scientific expertise. This mechanism is termed ‘institutional double glazing’. The conclusion suggests developments to the conceptual and methodological framework of depoliticisation, and highlights theoretical insights into the nature of ‘anti-politics’ in contemporary democracies.
Diagnostic accuracy of Doppler ultrasound technique of the penile arteries in correlation to selective arteriography
In 63% of 265 patients with erectile dysfunction, a relevant arterial inflow disturbance was found by Doppler ultrasound examination. Correlation between Doppler and arteriography in 58 patients showed an accuracy of 95% in detecting penile arteries and an accuracy of 91% in discovering a pathological arterial pattern (arterial anomaly or arteriosclerotic obstruction). In 15 patients the arterial inflow was additionally measured by Doppler ultrasound after intracavernosal injection of vasoactive drugs (IIVD) (7.5 mg papaverine and 0.25 mg phentolamine). Measurement after IIVD proved more reliable than in the flaccid state and markedly facilitated localization and assessment of pathological changes of the cavernosal arteries.
Individualization as driving force of clustering phenomena in humans
One of the most intriguing dynamics in biological systems is the emergence of
clustering, the self-organization into separated agglomerations of individuals.
Several theories have been developed to explain clustering in, for instance,
multi-cellular organisms, ant colonies, bee hives, flocks of birds, schools of
fish, and animal herds. A persistent puzzle, however, is clustering of opinions
in human populations. The puzzle is particularly pressing if opinions vary
continuously, such as the degree to which citizens are in favor of or against a
vaccination program. Existing opinion formation models suggest that
"monoculture" is unavoidable in the long run, unless subsets of the population
are perfectly separated from each other. Yet, social diversity is a robust
empirical phenomenon, although perfect separation is hardly possible in an
increasingly connected world. Considering randomness did not overcome the
theoretical shortcomings so far. Small perturbations of individual opinions
trigger social influence cascades that inevitably lead to monoculture, while
larger noise disrupts opinion clusters and results in rampant individualism
without any social structure. Our solution of the puzzle builds on recent
empirical research, combining the integrative tendencies of social influence
with the disintegrative effects of individualization. A key element of the new
computational model is an adaptive kind of noise. We conduct simulation
experiments to demonstrate that with this kind of noise, a third phase besides
individualism and monoculture becomes possible, characterized by the formation
of metastable clusters with diversity between and consensus within clusters.
When clusters are small, individualization tendencies are too weak to prohibit
a fusion of clusters. When clusters grow too large, however, individualization
increases in strength, which promotes their splitting. (Comment: 12 pages, 4 figures)
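The mechanism described above can be illustrated with a toy simulation (an assumed minimal sketch, not the authors' exact model): agents drift toward the mean opinion of peers within a confidence bound, and each agent's noise grows with the size of its like-minded neighborhood, so that individualization strengthens as clusters grow.

```python
import numpy as np

rng = np.random.default_rng(0)

N, EPS, MU, STEPS = 100, 0.15, 0.3, 5000
opinions = rng.random(N)  # continuous opinions in [0, 1]

for _ in range(STEPS):
    i = rng.integers(N)
    near = np.abs(opinions - opinions[i]) < EPS   # like-minded peers
    # Integrative social influence: move toward the local mean opinion.
    opinions[i] += MU * (opinions[near].mean() - opinions[i])
    # Adaptive individualization: noise scales with the cluster's size,
    # so large clusters destabilize while small ones can still fuse.
    opinions[i] += rng.normal(0.0, 0.01 * near.sum() / N)
    opinions[i] = min(1.0, max(0.0, opinions[i]))
```

With fixed noise this kind of model collapses to monoculture or scatters into individualism; making the noise amplitude depend on cluster size is what opens the third, metastable-cluster phase the abstract describes.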
Crystal structure and assembly of the functional Nanoarchaeum equitans tRNA splicing endonuclease
The RNA splicing and processing endonuclease from Nanoarchaeum equitans (NEQ) belongs to the recently identified (αβ)2 family of splicing endonucleases that require two different subunits for splicing activity. The N. equitans splicing endonuclease comprises the catalytic subunit (NEQ205) and the structural subunit (NEQ261). Here, we report the crystal structure of the functional NEQ enzyme at 2.1 Å resolution, containing both subunits, as well as that of the NEQ261 subunit alone at 2.2 Å. The functional enzyme resembles previously known α2 and α4 endonucleases but forms a heterotetramer: a dimer of two heterodimers of the catalytic subunit (NEQ205) and the structural subunit (NEQ261). Surprisingly, NEQ261 alone forms a homodimer, similar to the previously known homodimer of the catalytic subunit. The homodimers of isolated subunits are inhibitory to heterodimerization, as illustrated by a covalently linked catalytic homodimer that had no RNA cleavage activity upon mixing with the structural subunit. Detailed structural comparison reveals a more favorable hetero- than homodimerization interface, thereby suggesting a possible regulation mechanism of enzyme assembly through available subunits. Finally, the uniquely flexible active site of the NEQ endonuclease provides a possible explanation for its broader substrate specificity.
Humanized Mouse Model of Ovarian Cancer Recapitulates Patient Solid Tumor Progression, Ascites Formation, and Metastasis
Ovarian cancer is the most common cause of death from gynecological cancer. Understanding the biology of this disease, particularly how tumor-associated lymphocytes and fibroblasts contribute to the progression and metastasis of the tumor, has been impeded by the lack of a suitable tumor xenograft model. We report a simple and reproducible system in which the tumor and tumor stroma are successfully engrafted into NOD-scid IL2Rγnull (NSG) mice. This is achieved by injecting tumor cell aggregates derived from fresh ovarian tumor biopsy tissues (including tumor cells, and tumor-associated lymphocytes and fibroblasts) i.p. into NSG mice. Tumor progression in these mice closely parallels many of the events that are observed in ovarian cancer patients. Tumors establish in the omentum, ovaries, liver, spleen, uterus, and pancreas. Tumor growth is initially slow and progressive within the peritoneal cavity, with the ultimate development of tumor ascites, spontaneous metastasis to the lung, increasing serum and ascites levels of CA125, and the retention of tumor-associated human fibroblasts and lymphocytes that remain functional and responsive to cytokines for prolonged periods. With this model one will be able to determine how fibroblasts and lymphocytes within the tumor microenvironment may contribute to tumor growth and metastasis, and it will be possible to evaluate the efficacy of therapies that are designed to target these cells in the tumor stroma.
Physics, Topology, Logic and Computation: A Rosetta Stone
In physics, Feynman diagrams are used to reason about quantum processes. In
the 1980s, it became clear that underlying these diagrams is a powerful analogy
between quantum physics and topology: namely, a linear operator behaves very
much like a "cobordism". Similar diagrams can be used to reason about logic,
where they represent proofs, and computation, where they represent programs.
With the rise of interest in quantum cryptography and quantum computation, it
became clear that there is an extensive network of analogies between physics,
topology, logic and computation. In this expository paper, we make some of
these analogies precise using the concept of "closed symmetric monoidal
category". We assume no prior knowledge of category theory, proof theory or
computer science. (Comment: 73 pages, 8 encapsulated PostScript figures)
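The core vocabulary of the paper's analogy can be made concrete in a few lines of code. This is an assumed illustration (the function names are my own, and the paper itself works category-theoretically, not in any programming language): programs play the role of morphisms, sequential composition is categorical composition, running two programs side by side on a pair is the tensor product, and exchanging the halves of a pair is the symmetry (braiding).

```python
def compose(g, f):
    """Categorical composition g . f : run f, then g."""
    return lambda x: g(f(x))

def tensor(f, g):
    """Monoidal tensor f (x) g : act independently on the halves of a pair."""
    return lambda pair: (f(pair[0]), g(pair[1]))

def swap(pair):
    """The symmetry of the symmetric monoidal structure."""
    return (pair[1], pair[0])

inc = lambda n: n + 1
double = lambda n: 2 * n

print(compose(double, inc)(3))      # (3 + 1) * 2 = 8
print(tensor(inc, double)((3, 3)))  # (4, 6)
print(swap((1, 2)))                 # (2, 1)
```

In the Rosetta-stone reading, the same three operations reappear as sequential/parallel composition of quantum processes, gluing/disjoint union of cobordisms, and cut/conjunction in proof theory.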