704 research outputs found
Refractory oxide insulated thermocouple designed and analyzed for high temperature applications
Study establishes design criteria for constructing a high temperature thermocouple to measure nuclear fuel pin temperatures. The study included a literature search to determine the compatibility of materials useful for thermocouples, a hot zone error analysis, and a prototype design for hot junction and connector pin connections
Ocular sarcoidosis and tuberculous lymphadenopathy: coincidence or real association
Tuberculosis and sarcoidosis share similar histopathologic findings and occasionally occur in association with each other. Tuberculosis should always be ruled out before a diagnosis of sarcoidosis, but the diagnosis is often complicated, especially in extrapulmonary cases. Here we present a case of bilateral vitreous hemorrhage with uveitis. Ocular sarcoidosis was initially diagnosed based on the characteristic ocular findings; negative results on chest radiography, tuberculosis culture, and polymerase chain reaction of aqueous; and the simultaneous presence of the panda and lambda signs on gallium-67 scans. The ocular condition improved after pars plana vitrectomy and systemic steroid therapy. However, TB lymphadenopathy, with no recurrent ocular inflammation, was found 6 years later. The patient then received anti-TB treatment for 6 months. During this period the eyes remained quiet except for cataract progression and glaucoma controlled with two medications. In conclusion, TB can occur coincidentally or in association with sarcoidosis, so continued follow-up is important for patients with ocular sarcoidosis
How do I know what my theory predicts?
To get evidence for or against a theory relative to the null hypothesis, one needs to know what the theory predicts. The amount of evidence can then be quantified by a Bayes factor. Specifying the sizes of the effect one's theory predicts may not come naturally, but I show some ways of thinking about the problem, some simple heuristics that are often useful when one has little relevant prior information. These heuristics include the room-to-move heuristic (for comparing mean differences), the ratio-of-scales heuristic (for regression slopes), the ratio-of-means heuristic (for regression slopes), the basic-effect heuristic (for analysis of variance effects), and the total-effect heuristic (for mediation analysis)
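The calculation this abstract gestures at can be sketched numerically: once a heuristic has fixed a rough scale for the predicted effect, a common choice (not necessarily the paper's exact calculator) is a half-normal model of the theory's predictions, with the Bayes factor obtained by integrating the likelihood over that prior. All numbers and function names below are illustrative assumptions.

```python
import numpy as np

def normal_pdf(x, mu, sd):
    """Normal density, vectorized over x or mu."""
    return np.exp(-((x - mu) ** 2) / (2 * sd ** 2)) / (sd * np.sqrt(2 * np.pi))

def bayes_factor(mean_obt, se_obt, predicted_sd):
    """BF for H1 (half-normal prior on the effect, scale predicted_sd) vs H0 (effect = 0).

    The data are summarized as an observed mean difference and its standard error.
    """
    effects = np.linspace(0.0, 5.0 * predicted_sd, 2000)  # plausible effects under H1
    de = effects[1] - effects[0]
    prior = 2.0 * normal_pdf(effects, 0.0, predicted_sd)  # half-normal density on [0, inf)
    likelihood = normal_pdf(mean_obt, effects, se_obt)    # P(data | effect)
    p_h1 = np.sum(prior * likelihood) * de                # marginal likelihood under H1
    p_h0 = normal_pdf(mean_obt, 0.0, se_obt)              # likelihood under the null
    return p_h1 / p_h0

# e.g. an observed difference of 5 units (SE 2), theory's predicted scale 5 units
bf = bayes_factor(5.0, 2.0, 5.0)  # > 1: evidence for the theory over the null
```

A difference well away from zero on the predicted scale yields BF > 1 (evidence for the theory); a difference near zero yields BF < 1 (evidence for the null).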
Is the quantum world composed of propensitons?
In this paper I outline my propensiton version of quantum theory (PQT). PQT is a fully micro-realistic version of quantum theory that provides a very natural possible solution to the fundamental wave/particle problem and is, as a result, free of the severe defects of orthodox quantum theory (OQT). PQT makes sense of the quantum world. It recovers all the empirical success of OQT and is, furthermore, empirically testable (although not as yet tested). I argue that Einstein almost put forward this version of quantum theory in 1916/17 in his papers on spontaneous and induced radiative transitions, but retreated from doing so because he disliked the probabilistic character of the idea. Subsequently, the idea was overlooked because debates about quantum theory polarised into the Bohr/Heisenberg camp, which argued for the abandonment of realism and determinism, and the Einstein/Schrödinger camp, which argued for the retention of realism and determinism; as a result, no one pursued the most obvious option of retaining realism but abandoning determinism. It is this third, overlooked option that leads to PQT. PQT has implications for quantum field theory, the standard model, string theory, and cosmology. The really important point, however, is that it is experimentally testable. I indicate two experiments in principle capable of deciding between PQT and OQT
The Velvet Cage of Educational Con(pro)sumption
In the year that George Ritzer publishes the ninth edition of The McDonaldization of Society, moving his famous theory firmly Into the Digital Age, critical educator Petar Jandrić and sociologist Sarah Hayes invited George to a dialogue on the digital transformation of McDonaldization and its relationship to consumer culture. In this article, George first traces for us the origins of his theory, which has endured for four decades. A key dimension of McDonaldization is the 'iron cage' of control, via rationalization, that was once contained within physical sites of bricks and mortar. Increasingly now, we encounter a 'velvet cage' in sites of digital consumption, at the hands of non-human technologies that threaten human labour and autonomy. Exploited as unpaid con(pro)sumers, we labour to provide information for corporate digital billionaires, keeping McDonaldization alive, well, and even more predominant in augmented settings, including Higher Education, in the form of the McUniversity. With the rise of prosuming machines such as blockchain and bitcoin, which can both produce and consume without intervention from human prosumers, George concludes that prosumer capitalism will explode in unprecedented and unpredictable directions in the years to come
Design and Implementation of an Underwater Sound Recording Device
To monitor the underwater sound and pressure waves generated by anthropogenic activities such as underwater blasting and pile driving, an autonomous system was designed to record underwater acoustic signals. The underwater sound recording device (USR) allows for connections of two hydrophones or other dynamic pressure sensors, filters high frequency noise out of the collected signals, has a gain that can be independently set for each sensor, and allows for 2 h of data collection. Two versions of the USR were created: a submersible model deployable to a maximum depth of 300 m, and a watertight but not fully submersible model. Tests were performed on the USR in the laboratory using a data acquisition system to send single-frequency sinusoidal voltages directly to each component. These tests verified that the device operates as designed and performs as well as larger commercially available data acquisition systems, which are not suited for field use. On average, the designed gain values differed from the actual measured gain values by about 0.35 dB. A prototype of the device was used in a case study to measure blast pressures while investigating the effect of underwater rock blasting on juvenile Chinook salmon and rainbow trout. In the case study, maximum positive pressure from the blast was found to be significantly correlated with frequency of injury for individual fish. The case study also demonstrated that the device withstood operation in harsh environments, making it a valuable tool for collecting field measurements
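The bench verification described above reduces to comparing measured against designed gain in decibels. A minimal sketch of that comparison, with hypothetical voltages chosen only to illustrate a sub-decibel deviation like the reported ~0.35 dB (the function name and numbers are assumptions, not the authors' procedure):

```python
import math

def gain_db(v_out, v_in):
    # Voltage gain expressed in decibels
    return 20.0 * math.log10(v_out / v_in)

# Hypothetical bench check: a channel designed for 20 dB (x10) gain,
# driven with a 1.0 V single-frequency sinusoid, measured at 10.4 V output
measured = gain_db(10.4, 1.0)
deviation = measured - 20.0  # measured minus designed gain, in dB
```

Driving each channel with a known sinusoid and computing this deviation per channel is how a sub-decibel agreement between designed and actual gain can be established.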
Local Hidden Variables Underpinning of Entanglement and Teleportation
Entangled states whose Wigner functions are non-negative may be viewed as being accounted for by local hidden variables (LHV). Recently, there were studies of Bell's inequality violation (BIQV) for such states in conjunction with the well-known theorem of Bell that precludes BIQV for theories that have LHV underpinning. We extend these studies to teleportation, which is also based on entanglement. We investigate if, to what extent, and under what conditions teleportation may be accounted for via LHV theory. Our study allows us to expose the role of various quantum requirements, e.g., the uncertainty relation among non-commuting operators and the no-cloning theorem, which forces the complete elimination of the teleported state at its initial port.
Comment: 24 pages, 1 figure, accepted Found. Phys.
Free Will in a Quantum World?
In this paper, I argue that Conway and Kochen's Free Will Theorem (1, 2), which concludes that quantum mechanics and relativity entail freedom for the particles, does not change the situation in favor of a libertarian position as they would like. In fact, the theorem more or less implicitly assumes that people are free, and thus it begs the question. Moreover, it proves neither that if people are free, so are particles, nor that the property people possess when they are said to be free is the same as the one particles possess when they are claimed to be free. I then analyze the Free State Theorem (2), which generalizes the Free Will Theorem without the assumption that people are free, and I show that it does not prove anything about free will, since the notion of freedom for particles is either inconsistent or does not concern our common understanding of freedom. In both cases, the Free Will Theorem and the Free State Theorem do not provide any enlightenment on the constraints physics can pose on free will
'A habitual disposition to the good': on reason, virtue and realism
Amidst the crisis of instrumental reason, a number of contemporary political philosophers including Jürgen Habermas have sought to rescue the project of a reasonable humanism from the twin threats of religious fundamentalism and secular naturalism. In his recent work, Habermas defends a post-metaphysical politics that aims to protect rationality against encroachment while also accommodating religious faith within the public sphere. This paper contends that Habermas' post-metaphysical project fails to provide a robust alternative either to the double challenge of secular naturalism and religious fundamentalism or to the ruthless instrumentalism that underpins capitalism. By contrast with Habermas and also with the 'new realism' of contemporary political philosophers such as Raymond Geuss or Bernard Williams, realism in the tradition of Plato and Aristotle can defend reason against instrumental rationality and blind belief by integrating it with habit, feeling and even faith. Such metaphysical-political realism can help develop a politics of virtue that goes beyond communitarian thinking by emphasising plural modes of association (not merely 'community'), substantive ties of sympathy and the importance of pursuing goodness and mutual flourishing
Assessing architectural evolution: A case study
This is the post-print version of the article. The official published version can be accessed from the link below. Copyright @ 2011 Springer.
This paper proposes to use a historical perspective on generic laws, principles, and guidelines, like Lehman's software evolution laws and Martin's design principles, in order to achieve a multi-faceted process and structural assessment of a system's architectural evolution. We present a simple structural model with associated historical metrics and visualizations that could form part of an architect's dashboard. We perform such an assessment for the Eclipse SDK, as a case study of a large, complex, and long-lived system for which sustained effective architectural evolution is paramount. The twofold aim of checking generic principles on a well-known system is, on the one hand, to see whether there are certain lessons that could be learned for best practice of architectural evolution, and on the other hand, to get more insights about the applicability of such principles. We find that while the Eclipse SDK does follow several of the laws and principles, there are some deviations, and we discuss areas of architectural improvement and limitations of the assessment approach
- …