Silicon carbide, a semiconductor for space power electronics
After many years of promise as a high temperature semiconductor, silicon carbide (SiC) is finally emerging as a useful electronic material. The significant recent progress behind this emergence has come in crystal growth and device fabrication technology. High quality single-crystal SiC wafers, up to 25 mm in diameter, can now be produced routinely from boules grown by a high temperature (2700 K) sublimation process. Device fabrication processes, including chemical vapor deposition (CVD), in situ doping during CVD, reactive ion etching, oxidation, metallization, etc., have been used to fabricate p-n junction diodes and MOSFETs. The diode has been operated to 870 K and the MOSFET to 770 K.
Silicon carbide, an emerging high temperature semiconductor
In recent years, the aerospace propulsion and space power communities have expressed a growing need for electronic devices capable of sustained high temperature operation. Applications for high temperature electronic devices include development instrumentation within engines, engine control and condition-monitoring systems, and power conditioning and control systems for space platforms and satellites. Other earth-based applications include deep-well drilling instrumentation, nuclear reactor instrumentation and control, and automotive sensors. To meet these needs, the High Temperature Electronics Program at the Lewis Research Center is developing silicon carbide (SiC) as a high temperature semiconductor material. Research is focused on developing the crystal growth, characterization, and device fabrication technologies necessary to produce a family of silicon carbide electronic devices and integrated sensors. The progress made in developing silicon carbide is presented, and the challenges that lie ahead are discussed.
Boarding is Associated with Reduced Emergency Department Efficiency that is not Mitigated by a Provider in Triage
Introduction: Boarding of patients in the emergency department (ED) is associated with decreased ED efficiency. The provider-in-triage (PIT) model has been shown to improve ED throughput, but it is unclear how these improvements are affected by boarding. We sought to assess the effects of boarding on ED throughput and whether implementation of a PIT model mitigated those effects. Methods: We performed a multi-site retrospective review of 955 days of ED operations data at a tertiary care academic ED (AED) and a high-volume community ED (CED) before and after implementation of PIT. Key outcome variables were door-to-provider time (D2P), total length of stay of discharged patients (LOSD), and boarding time (admit request to ED departure [A2D]). Results: Implementation of PIT was associated with a decrease in median D2P of 22 minutes (43%) at the AED (p < 0.01) and 18 minutes (31%) at the CED (p < 0.01). LOSD also decreased by 19 minutes (5.9%) at the AED and 8 minutes (3.3%) at the CED (p < 0.01). After adjusting for variations in daily census, the effect of boarding (A2D) on D2P and LOSD was unchanged, despite the implementation of PIT. At the AED, every 7.7 minutes of boarding increased median D2P by one additional minute (p < 0.01), and every four minutes of boarding increased median LOSD by one minute (p < 0.01). At the CED, every 7.1 minutes of boarding added one additional minute to D2P (p < 0.01), and every 4.8 minutes of boarding added one minute to median LOSD (p < 0.01). Conclusion: In this retrospective, observational multicenter study, ED operational efficiency improved with the implementation of a PIT model but worsened with boarding; the PIT model did not mitigate the effects of boarding. This suggests that PIT is associated with increased efficiency of ED intake and throughput, but boarding continues to have the same effect on ED efficiency regardless of upstream efficiency measures designed to minimize its impact.
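The reported slopes translate directly into expected extra delay per minute of boarding. As a rough illustration only (the helper below and its names are hypothetical and this is plain arithmetic on the abstract's point estimates, not the study's regression model), the marginal effects can be applied like this:

```python
# Illustrative sketch: applies the abstract's reported marginal effects
# (one minute of added delay per X minutes of boarding). These are point
# estimates quoted from the abstract, not a predictive model.
SLOPES = {
    "AED": {"D2P": 1 / 7.7, "LOSD": 1 / 4.0},  # delay min per boarding min
    "CED": {"D2P": 1 / 7.1, "LOSD": 1 / 4.8},
}

def added_delay_minutes(boarding_minutes: float, site: str) -> dict:
    """Estimated additional median delay implied by the reported slopes."""
    return {metric: boarding_minutes * slope
            for metric, slope in SLOPES[site].items()}

print(added_delay_minutes(120, "AED"))
# {'D2P': 15.58..., 'LOSD': 30.0} -- two hours of boarding adds roughly
# 16 minutes to median door-to-provider time and 30 minutes to LOSD.
```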
Parallel resampling in the particle filter
Modern parallel computing devices, such as the graphics processing unit (GPU), have gained significant traction in scientific and statistical computing. They are particularly well suited to data-parallel algorithms such as the particle filter, or more generally Sequential Monte Carlo (SMC), which are increasingly used in statistical inference. SMC methods carry a set of weighted particles through repeated propagation, weighting, and resampling steps. The propagation and weighting steps are straightforward to parallelise, as they require only independent operations on each particle. The resampling step is more difficult, as standard schemes require a collective operation, such as a sum, across particle weights. Focusing on this resampling step, we analyse two alternative schemes that do not involve a collective operation (Metropolis and rejection resamplers) and compare them to standard schemes (multinomial, stratified, and systematic resamplers). We find that, in certain circumstances, the alternative resamplers can perform significantly faster on a GPU, and to a lesser extent on a CPU, than the standard approaches. Moreover, in single precision, the standard approaches are numerically biased for upwards of hundreds of thousands of particles, while the alternatives are not. This is particularly important given greater single- than double-precision throughput on modern devices, and the consequent temptation to use single precision with a greater number of particles. Finally, we provide auxiliary functions useful for implementation, such as for the permutation of ancestry vectors to enable in-place propagation.
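To make the distinction concrete, here is a minimal NumPy sketch (an illustration under our own assumptions, not the paper's GPU implementation) contrasting a standard systematic resampler, which needs a collective prefix sum over the weights, with a Metropolis resampler, in which each particle draws its ancestor independently using only pairwise weight ratios. The chain length `steps` is an assumed placeholder; the paper analyses how this parameter trades bias against runtime.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Standard systematic resampler: requires a collective operation,
    the cumulative (prefix) sum over all particle weights."""
    n = len(weights)
    cdf = np.cumsum(weights)
    cdf /= cdf[-1]                               # normalise
    u = (rng.random() + np.arange(n)) / n        # one stratified draw per slot
    return np.searchsorted(cdf, u)               # ancestor indices

def metropolis_resample(weights, rng, steps=32):
    """Metropolis resampler: each particle runs a short Metropolis chain
    over particle indices, accepting a proposed ancestor j over the
    current k with probability min(1, w_j / w_k). Only pairwise weight
    ratios are used, so no cross-particle reduction is needed."""
    n = len(weights)
    ancestors = np.arange(n)
    for _ in range(steps):
        proposals = rng.integers(0, n, size=n)   # uniform index proposals
        # u < w_j / w_k  is evaluated as  u * w_k < w_j
        accept = rng.random(n) * weights[ancestors] < weights[proposals]
        ancestors = np.where(accept, proposals, ancestors)
    return ancestors

rng = np.random.default_rng(1)
w = rng.random(10_000)
print(systematic_resample(w, rng)[:5], metropolis_resample(w, rng)[:5])
```

In this vectorised form the independence shows up as the absence of any cross-particle reduction inside the Metropolis loop; on a GPU each chain would simply run in its own thread.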
Poems
THE GHOSTS OF A CARTOON LIFE for Beverly Bourne, CRO-KIL
Reviving a Natural Right: The Freedom of Autonomy Amendment
Michael Anthony Lawrence
Something is wrong in twenty-first century America when it comes to recognizing certain “self-evident truths” of freedom identified in its founding document nearly 230 years ago. In particular, the United States today fails to uphold the core principle of the Declaration of Independence that “all men are created equal: that they are endowed by their Creator with certain inalienable rights: that among these are life, liberty and the pursuit of happiness.” This description was more than just an accidental well-turned phrase: as the historical record demonstrates, it represented the very foundation of Revolutionary political theory and was intended to draw strict boundaries around the proper reach of government. America of the early twenty-first century is a place where oppressive state constitutional amendments discriminate against same-sex couples; where compassionate end-of-life choice is illegal in 49 states and the one state where it is legal is being sued by the U.S. government; where tens of thousands are in prison for possessing or using marijuana; where a woman’s right to maintain control over her own reproductive decisions hangs by a thread; and where religious freedom is under relentless attack. How is it that Tocqueville’s prediction of a “wholly new species of oppression…, [where] the democratic government, acting in response to the will of the majority, … create[s] a society with a network of … [rules] that none can escape” has indeed come to pass?
This essay explores the nature of the right of “freedom of autonomy,” several present-day applications, and the right’s historical foundations, then asserts that nothing short of a constitutional amendment prohibiting federal and state government from abridging any person’s individual freedom of autonomy on matters of natural private concern will suffice to protect the right as it was envisioned at the time of America’s founding and reaffirmed during Reconstruction. Now is the time for change.