8,187 research outputs found

    The Trawler

    Two poems by Australian poet Anthony Lawrence.

    Silicon carbide, a semiconductor for space power electronics

    After many years of promise as a high temperature semiconductor, silicon carbide (SiC) is finally emerging as a useful electronic material. Recent significant progress leading to this emergence has been in the areas of crystal growth and device fabrication technology. High quality single-crystal SiC wafers, up to 25 mm in diameter, can now be produced routinely from boules grown by a high temperature (2700 K) sublimation process. Device fabrication processes, including chemical vapor deposition (CVD), in situ doping during CVD, reactive ion etching, oxidation, metallization, etc., have been used to fabricate p-n junction diodes and MOSFETs. The diode was operated up to 870 K and the MOSFET up to 770 K.

    Silicon carbide, an emerging high temperature semiconductor

    In recent years, the aerospace propulsion and space power communities have expressed a growing need for electronic devices that are capable of sustained high temperature operation. Applications for high temperature electronic devices include development instrumentation within engines, engine control and condition monitoring systems, and power conditioning and control systems for space platforms and satellites. Other earth-based applications include deep-well drilling instrumentation, nuclear reactor instrumentation and control, and automotive sensors. To meet the needs of these applications, the High Temperature Electronics Program at the Lewis Research Center is developing silicon carbide (SiC) as a high temperature semiconductor material. Research is focused on developing the crystal growth, characterization, and device fabrication technologies necessary to produce a family of silicon carbide electronic devices and integrated sensors. The progress made in developing silicon carbide is presented, and the challenges that lie ahead are discussed.

    Parallel resampling in the particle filter

    Modern parallel computing devices, such as the graphics processing unit (GPU), have gained significant traction in scientific and statistical computing. They are particularly well-suited to data-parallel algorithms such as the particle filter, or more generally Sequential Monte Carlo (SMC), which are increasingly used in statistical inference. SMC methods carry a set of weighted particles through repeated propagation, weighting and resampling steps. The propagation and weighting steps are straightforward to parallelise, as they require only independent operations on each particle. The resampling step is more difficult, as standard schemes require a collective operation, such as a sum, across particle weights. Focusing on this resampling step, we analyse two alternative schemes that do not involve a collective operation (Metropolis and rejection resamplers), and compare them to standard schemes (multinomial, stratified and systematic resamplers). We find that, in certain circumstances, the alternative resamplers can perform significantly faster on a GPU, and to a lesser extent on a CPU, than the standard approaches. Moreover, in single precision, the standard approaches are numerically biased for upwards of hundreds of thousands of particles, while the alternatives are not. This is particularly important given greater single- than double-precision throughput on modern devices, and the consequent temptation to use single precision with a greater number of particles. Finally, we provide auxiliary functions useful for implementation, such as for the permutation of ancestry vectors to enable in-place propagation. Comment: 21 pages, 6 figures.
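
    The resampling step described above is the part that normally needs a collective operation (a sum or prefix sum over all particle weights). As a rough illustration of the collective-free idea, the following is a minimal NumPy sketch of a Metropolis-style resampler in the spirit of the scheme analysed in the paper: each particle runs a short chain over candidate ancestor indices and accepts a proposal using only a pairwise weight ratio, so no sum over all weights is required. The function name and the chain length are illustrative assumptions, not the paper's tuned choices.

        import numpy as np

        def metropolis_resample(weights, steps=16, rng=None):
            # Sketch of a Metropolis resampler: each particle runs a short chain
            # over ancestor indices, accepting a proposed index j over the current
            # index k with probability min(1, w[j] / w[k]). Only pairwise weight
            # ratios are used, so no collective sum over the weights is needed.
            # `steps` is an illustrative chain length, not a tuned value.
            rng = np.random.default_rng() if rng is None else rng
            n = len(weights)
            ancestors = np.arange(n)                    # each particle starts at its own index
            for _ in range(steps):
                proposals = rng.integers(0, n, size=n)  # independent uniform proposals per particle
                u = rng.random(n)
                accept = u * weights[ancestors] <= weights[proposals]
                ancestors = np.where(accept, proposals, ancestors)
            return ancestors

        # Example: 8 particles with skewed weights; heavier particles are selected more often.
        w = np.array([0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03])
        print(metropolis_resample(w))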

    Poems

    THE GHOSTS OF A CARTOON LIFE for Beverly Bourne, CRO-KIL

    Reviving a Natural Right: The Freedom of Autonomy Amendment

    Reviving a Natural Right: The Freedom of Autonomy Amendment, by Michael Anthony Lawrence. Something is wrong in twenty-first century America when it comes to recognizing certain “self-evident truths” of freedom identified in its founding document nearly 23 decades ago. In particular, the United States today fails to uphold the core principle of the Declaration of Independence that “all men are created equal: that they are endowed by their Creator with certain inalienable rights: that among these are life, liberty and the pursuit of happiness.” This description was more than just an accidental well-turned phrase – as demonstrated by the historical record, it represented the very foundation of the Revolutionary political theory, and was intended to draw strict boundaries for the proper reach of government. America of the early twenty-first century is a place where oppressive state constitutional amendments discriminate against same-sex couples; where compassionate end-of-life choice is illegal in 49 states and where the one state where it is legal is being sued by the U.S. government; where tens of thousands are in prison for possessing or using marijuana; where a woman’s right to maintain control over her own reproductive decisions hangs by a thread; and where religious freedom is under relentless attack. How is it that Tocqueville’s prediction of a “wholly new species of oppression…, [where] the democratic government, acting in response to the will of the majority, … create[s] a society with a network of … [rules] that none can escape” has indeed come to pass? This essay explores progressively the nature of the right of “freedom of autonomy,” several present-day applications and the right’s historical foundations, then asserts that nothing short of a constitutional amendment prohibiting federal and state government from abridging any person’s individual freedom of autonomy on matters of natural private concern will suffice in protecting the right as it was envisioned at the time of America’s founding and reaffirmed in the Reconstruction. Now is the time for change.