262 research outputs found

    Estimating Visits to Denali National Park and Preserve


    Archaeological Watercraft: A Review and Critical Analysis of the Practice

    Vestiges of humankind’s long-term interaction with the earth’s rivers, lakes, oceans and seas lie beneath water, sand, soils and sediments in the form of archaeological waterlogged wooden ships and boats. These quintessential maritime artifacts, and the connections formed between humans and watercraft, create an extensive artifact biography revealing a host of physical and metaphysical meanings. Over the last nine millennia, watercraft have acted as containers, vessels, a means of conveyance, a bridge, a home, a factory, a prison, a fortress or a lifeboat. They are emblematic of individuals and nations. Deep-seated in many cultural beliefs, they are integral aspects of birth or renewal, and often a critical component for reaching the afterlife. All of these factors motivate individuals to save archaeological watercraft when they are discovered in the course of academic search and survey or of civil/commercial excavation. Although attempts by professional conservators to stabilize waterlogged wooden watercraft first occurred in the early 1860s with the finds from Nydam Bog (Denmark), methodological and philosophical approaches changed little from then until after World War II. Following the war, a combination of new products, dissatisfaction with previously tried methods, and a shift in attitudes towards preserving representations of the past ushered in a new era in the conservation of antiquities, and of watercraft in particular. Over the last 60 years, incremental advances have been made in the conservation techniques applied to waterlogged archaeological wood and wooden structures. Investigations that once focused primarily on maintaining the dimensional stability of the object now share considerably more time with inquiries into the state of artifacts once stabilized and in storage or on display. The archaeological remains of one watercraft, La Belle, discovered in the shallow waters of Matagorda Bay, Texas, in 1995, provide a case study in this dissertation to address some of the issues surrounding the conservation of waterlogged ships and boats.

    Single crystal growth and anisotropic magnetic properties of Li2Sr[Li1−xFexN]2

    Up to now, investigation of the physical properties of ternary and higher nitridometalates was severely hampered by challenges concerning phase purity and crystal size. Employing a modified lithium flux technique, we are now able to prepare sufficiently large single crystals of the highly air- and moisture-sensitive nitridoferrate Li2Sr[Li1−xFexN]2 for anisotropic magnetization measurements. The magnetic properties are most remarkable: large anisotropy and coercivity fields of 7 T at T = 2 K indicate a significant orbital contribution to the magnetic moment of iron. Altogether, the novel growth method opens a route towards interesting phases in the comparatively recent research field of nitridometalates and should be applicable to various other materials. Comment: 10 pages, 5 figures, published open access in Inorganics, minor typos corrected.

    Upconversion detector for range-resolved DIAL measurement of atmospheric CH4

    We demonstrate a robust, compact, portable and efficient upconversion detector (UCD) for a differential absorption lidar (DIAL) system designed for range-resolved methane (CH4) atmospheric sensing. The UCD is built on an intracavity pump system that mixes a 1064 nm pump laser with the lidar backscatter signal at 1646 nm in a 25-mm-long periodically poled lithium niobate crystal. The upconverted signal at 646 nm is detected by a photomultiplier tube (PMT). The UCD, with a noise equivalent power of around 127 fW/√Hz, outperforms a conventional InGaAs-based avalanche photodetector when both are used for DIAL measurements. Using the UCD, CH4 DIAL measurements have been performed, yielding differential absorption optical depths with relative errors of less than 11% at ranges between 3 km and 9 km.
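
    The range-resolved differential absorption optical depth underlying such a measurement comes from the standard two-wavelength DIAL ratio of on- and off-line backscatter powers at two range gates. A minimal numpy sketch follows; the signal shapes, the 50 m gate spacing and the absorption coefficient are illustrative assumptions, not values from the paper:

```python
import numpy as np

def daod(p_on, p_off, i1, i2):
    """Differential absorption optical depth between range gates i1 and i2,
    from the two-wavelength DIAL ratio of backscatter powers."""
    return 0.5 * np.log((p_off[i2] * p_on[i1]) / (p_off[i1] * p_on[i2]))

# Illustrative signals: 1/r^2 backscatter with extra on-line CH4 absorption
# (all numbers are made up for this sketch).
r = np.arange(0.0, 10_000.0, 50.0)          # range gates [m]
p_off = np.exp(-1e-4 * r) / (r + 1.0)**2    # off-line backscatter power
p_on = p_off * np.exp(-2 * 3e-5 * r)        # on-line: two-way CH4 absorption

i1, i2 = np.searchsorted(r, [3_000.0, 9_000.0])
print(f"DAOD between 3 km and 9 km: {daod(p_on, p_off, i1, i2):.3f}")
```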

    Semi-classical Monte Carlo algorithm for the simulation of X-ray grating interferometry.

    Traditional simulation techniques such as wave-optics methods and Monte Carlo (MC) particle transport cannot model both interference and inelastic scattering phenomena within one framework. Based on the quantum-mechanical rules for calculating probabilities, we propose a new semi-classical MC algorithm for efficient and simultaneous modeling of scattering and interference processes. The similarities to MC particle transport allow the implementation as a flexible, object-oriented C++ extension of EGSnrc, a well-established MC toolkit. In addition to the previously proposed Huygens-principle-based transport through optics components, new variance reduction techniques for the transport through gratings are presented as transport options to achieve the improvements in speed and memory cost necessary for an efficient exploration (system design, dose estimation) of the medical implementation of X-ray grating interferometry (GI), an emerging imaging technique currently the subject of tremendous efforts towards clinical translation. The feasibility of simulating interference effects is confirmed in four academic cases and an experimental table-top GI setup. Comparison with conventional MC transport shows that the deposited-energy features of EGSnrc are conserved.
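
    The interference half of such an algorithm boils down to summing complex path amplitudes à la Huygens while keeping particle-transport-style bookkeeping. The toy sketch below propagates wavelets from the open slits of a single grating to a detector; the X-ray energy, grating pitch and geometry are illustrative assumptions and this is not the EGSnrc extension itself:

```python
import numpy as np

# Toy Huygens propagation: each open grating slit emits a secondary wavelet;
# detector intensity is |sum of complex amplitudes|^2.
wavelength = 6e-11                  # ~20 keV X-ray [m], assumed
k = 2 * np.pi / wavelength
pitch, n_slits, z = 4e-6, 200, 0.5  # grating pitch [m], slit count, distance [m]

slits = (np.arange(n_slits) - n_slits / 2) * pitch   # open-slit positions [m]
x_det = np.linspace(-1e-4, 1e-4, 2001)               # detector coordinates [m]

# Paraxial path length from each slit to each detector pixel.
path = z + (x_det[:, None] - slits[None, :])**2 / (2 * z)
intensity = np.abs(np.exp(1j * k * path).sum(axis=1))**2   # Huygens sum
print("fringe visibility:",
      (intensity.max() - intensity.min()) / (intensity.max() + intensity.min()))
```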

    Auto-commissioning of a Monte Carlo electron beam model with application to photon MLC shaped electron fields.

    OBJECTIVE Presently, electron beam treatments are delivered using dedicated applicators. An alternative is to use the already installed photon multileaf collimator (pMLC), enabling efficient electron treatments. Currently, the commissioning of beam models is a manual and time-consuming process. In this work, an auto-commissioning procedure is developed for the part of the Monte Carlo (MC) beam model representing the beam above the pMLC, for TrueBeam systems with electron energies from 6 to 22 MeV. APPROACH The analytical part of the electron beam model includes a main source representing the primary beam and a jaw source representing the head-scatter contribution, each consisting of an electron and a photon component, while MC radiation transport is performed for the pMLC. The auto-commissioning of this analytical part relies on information pre-determined from MC simulations, in-air dose profiles and absolute dose measurements in water for different field sizes and source-to-surface distances (SSDs). For validation, calculated and measured dose distributions in water were compared for different field sizes, SSDs and beam energies for eight TrueBeam systems. Furthermore, a sternum case in an anthropomorphic phantom was considered, and calculated and measured dose distributions were compared at different SSDs. MAIN RESULTS Instead of the manual commissioning taking up to several days of calculation time and several hours of user time, the auto-commissioning is carried out in a few minutes. Measured and calculated dose distributions generally agree within 3% of maximum dose or 2 mm. The gamma passing rates for the sternum case ranged from 96% to 99% (3% (global)/2 mm criteria, 10% threshold). SIGNIFICANCE The auto-commissioning procedure was successfully implemented and applied to eight TrueBeam systems. The newly developed user-friendly auto-commissioning procedure allows efficient commissioning of an MC electron beam model and eases the use of advanced electron radiotherapy utilizing the pMLC for beam shaping.
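
    An auto-commissioning step of this kind typically reduces to fitting a few free parameters of the analytical source model to measured in-air profiles. A minimal scipy sketch follows; the two-Gaussian source shape and all parameter names are assumptions made for illustration, not the published model:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_source_profile(x, w_main, sigma_main, sigma_jaw):
    """In-air profile as a weighted sum of a narrow primary (main) source
    and a broad head-scatter (jaw) source, both modeled as Gaussians."""
    main = np.exp(-0.5 * (x / sigma_main)**2)
    jaw = np.exp(-0.5 * (x / sigma_jaw)**2)
    return w_main * main + (1.0 - w_main) * jaw

# Synthetic "measurement" standing in for an in-air profile scan [cm].
x = np.linspace(-10, 10, 81)
measured = two_source_profile(x, 0.85, 2.5, 6.0)
measured += np.random.default_rng(0).normal(0, 0.002, x.size)

popt, _ = curve_fit(two_source_profile, x, measured,
                    p0=[0.7, 2.0, 5.0], bounds=([0, 0.1, 0.1], [1, 10, 20]))
print("fitted [w_main, sigma_main, sigma_jaw]:", np.round(popt, 3))
```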

    Efficiency enhancements of a Monte Carlo beamlet based treatment planning process: implementation and parameter study.

    OBJECTIVE The computational effort to perform beamlet calculation, plan optimization and final dose calculation in a treatment planning process (TPP) generating intensity-modulated treatment plans is enormous, especially if Monte Carlo (MC) simulations are used for dose calculation. The goal of this work is to improve the computational efficiency of a fully MC-based TPP for static and dynamic photon, electron and mixed photon-electron treatment techniques by implementing multiple methods and studying the influence of their parameters. APPROACH A framework is implemented that calculates MC beamlets efficiently in parallel on each available CPU core. The user can specify the desired statistical uncertainty of the beamlets, a fractional sparse dose threshold below which beamlet doses are discarded when saving in a sparse format, and minimal distances to the PTV surface beyond which 2×2×2 = 8 (medium) or even 4×4×4 = 64 (large) voxels are merged. The compromise between final plan quality and the computational efficiency of beamlet calculation and optimization is studied for several parameter values to find a reasonable trade-off. For this purpose, four clinical cases and one academic case with different treatment techniques are considered. MAIN RESULTS Setting the statistical uncertainty to 5% (photon beamlets) and 15% (electron beamlets), the fractional sparse dose threshold relative to the maximal beamlet dose to 0.1%, and the minimal distances to the PTV for medium and large voxels to 1 cm and 2 cm, respectively, does not lead to substantial degradation in final plan quality; only OAR sparing is slightly degraded. Furthermore, computation times are reduced by about 58% (photon beamlets), 88% (electron beamlets) and 96% (optimization) compared with using 2.5% (photon beamlets) and 5% (electron beamlets) statistical uncertainty and neither the sparse format nor voxel merging. SIGNIFICANCE Several methods are implemented that improve the computational efficiency of beamlet calculation and plan optimization in a fully MC-based TPP without substantial degradation in final plan quality.
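
    The two storage tricks, thresholded sparse beamlets and distance-dependent voxel merging, are straightforward to sketch. This toy numpy/scipy version shows the mechanics only; the grid size, dose shape and distance map are illustrative assumptions:

```python
import numpy as np
from scipy.sparse import csr_matrix

def sparsify_beamlet(dose, frac_threshold=1e-3):
    """Zero voxels below a fraction of the beamlet maximum (0.1% here, as an
    example), then store the flattened dose as a sparse row."""
    d = dose.copy()
    d[d < frac_threshold * d.max()] = 0.0
    return csr_matrix(d.reshape(1, -1))

def merge_voxels(dose, dist_to_ptv, min_dist=1.0, block=2):
    """Average block**3 voxel groups whose minimum distance to the PTV
    surface exceeds min_dist [cm]; keep full resolution near the PTV."""
    out = dose.copy()
    nz, ny, nx = dose.shape
    for z in range(0, nz - block + 1, block):
        for y in range(0, ny - block + 1, block):
            for x in range(0, nx - block + 1, block):
                sl = (slice(z, z + block), slice(y, y + block), slice(x, x + block))
                if dist_to_ptv[sl].min() > min_dist:
                    out[sl] = dose[sl].mean()
    return out

dose = np.exp(-((np.indices((8, 8, 8)) - 3.5)**2).sum(axis=0) / 4.0)
dist = np.linspace(0, 4, 8)[None, None, :] * np.ones((8, 8, 8))
print("sparse nonzeros:", sparsify_beamlet(dose).nnz, "of", dose.size)
print("dose unchanged near PTV:",
      np.allclose(merge_voxels(dose, dist)[:, :, :2], dose[:, :, :2]))
```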

    Impact of random outliers in auto-segmented targets on radiotherapy treatment plans for glioblastoma.

    AIMS To save time and obtain more consistent contours, fully automatic segmentation of targets and organs at risk (OAR) is a valuable asset in radiotherapy. Though current deep learning (DL) based models are on par with manual contouring, they are not perfect, and typical errors, such as false positives, occur frequently and unpredictably. While it is possible to handle these for OARs, it is far from straightforward for target structures. To tackle this problem, in this study we analyzed the occurrence and the possible dose effects of automated delineation outliers. METHODS First, a set of controlled experiments on synthetically generated outliers was performed on the CT of a glioblastoma (GBM) patient. We analyzed the dosimetric impact of outliers with different locations, shapes, absolute sizes and sizes relative to the main target, resulting in 61 simulated scenarios. Second, multiple segmentation models were trained on a U-Net network based on 80 training sets consisting of GBM cases with annotated gross tumor volume (GTV) and edema structures. On 20 test cases, 5 different trained models and a majority voting method were used to predict the GTV and edema. The number of outliers in the predictions was determined, as well as their size and distance from the actual target. RESULTS We found that plans containing outliers result in an increased dose to healthy brain tissue. The extent of the dose effect depends on the relative size, the location and the distance to the main targets and involved OARs. Generally, the larger the absolute outlier volume and the distance to the target, the higher the potential dose effect. For 120 predicted GTV and edema structures, we found 1887 outliers. After construction of the planning target volume (PTV), 137 outliers remained, with a mean distance to the target of 38.5 ± 5.0 mm and a mean size of 1010.8 ± 95.6 mm³. We also found that majority voting of DL results is capable of reducing outliers. CONCLUSIONS This study shows that there is a severe risk of false positive outliers in current DL predictions of target structures, and that these errors have an evident detrimental impact on the dose and therefore could affect treatment outcome.
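
    Majority voting over several model predictions and connected-component analysis of the result are the usual mechanics behind such an outlier count. A minimal scipy sketch follows; the voting rule, the outlier definition (every component except the largest) and the toy masks are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def majority_vote(masks):
    """Voxel-wise majority vote over binary segmentation masks."""
    return (np.stack(masks).sum(axis=0) * 2 > len(masks)).astype(np.uint8)

def count_outliers(mask):
    """Count connected components other than the largest one; a simple
    working definition of a false-positive outlier."""
    _, n = ndimage.label(mask)
    return max(n - 1, 0)

# Toy example: 5 model predictions of one target, one with a spurious blob.
base = np.zeros((32, 32, 32), dtype=np.uint8)
base[12:20, 12:20, 12:20] = 1                 # "true" target volume
preds = [base.copy() for _ in range(5)]
preds[0][2:4, 2:4, 2:4] = 1                   # distant false-positive outlier

print("outliers in model 0:", count_outliers(preds[0]))                       # -> 1
print("outliers after majority vote:", count_outliers(majority_vote(preds)))  # -> 0
```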

    Robust optimization and assessment of dynamic trajectory and mixed-beam arc radiotherapy: a preliminary study.

    Dynamic trajectory radiotherapy (DTRT) and dynamic mixed-beam arc therapy (DYMBARC) exploit non-coplanarity and, for DYMBARC, simultaneously optimized photon and electron beams. Margin concepts to account for set-up uncertainties during delivery are ill-defined for electron fields. We develop robust optimization for DTRT and DYMBARC and compare dosimetric plan quality and robustness for both techniques and both optimization strategies for four cases. Approach. Cases for different treatment sites and clinical target volume (CTV) to planning target volume (PTV) margins, m, were investigated. Dynamic gantry-table-collimator photon paths were optimized to minimize PTV/organ-at-risk (OAR) overlap in the beam's-eye view and to minimize potential photon multileaf collimator (MLC) travel. For DYMBARC plans, non-isocentric partial electron arcs or static fields with shortened source-to-surface distance (80 cm) were added. Direct aperture optimization (DAO) was used to simultaneously optimize MLC-based intensity modulation for both photon and electron beams, yielding deliverable PTV-based DTRT and DYMBARC plans. Robust-optimized plans used the same paths/arcs/fields. DAO with stochastic programming was used for set-up uncertainties with equal weights in all translational directions and magnitude δ such that m = 0.7δ. Robustness analysis considered random errors in all directions, with or without an additional systematic error in the worst 3D direction for the adjacent OARs. Main results. The electron contribution was 7%-41% of the target dose, depending on the case and optimization strategy for DYMBARC. All techniques achieved similar CTV coverage in the nominal (no error) scenario. OAR sparing was overall better in the DYMBARC plans than in the DTRT plans, and DYMBARC plans were generally more robust to the considered uncertainties. OAR sparing was better in the PTV-based plans than in the robust-optimized plans for OARs abutting or overlapping the target volume, but it was more affected by uncertainties. Significance. Better plan robustness can be achieved with robust optimization than with margins. Combining electron arcs/fields with non-coplanar photon trajectories further improves robustness and OAR sparing.
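
    Robust optimization via stochastic programming typically replaces the nominal objective with an expectation over set-up-error scenarios, each scenario carrying its own dose-influence matrix. The toy sketch below shows that structure; the scenario count, equal weights, random matrices and the quadratic objective are illustrative assumptions, not the published formulation:

```python
import numpy as np

rng = np.random.default_rng(3)
n_vox, n_beamlets = 200, 40
prescribed = np.ones(n_vox)

# One dose-influence matrix per set-up-error scenario (nominal + 6 shifts);
# in practice these come from shifting the patient model before dose calc.
scenarios = [rng.random((n_vox, n_beamlets)) * 0.1 for _ in range(7)]
weights = np.full(7, 1 / 7)            # equal scenario weights

def robust_objective(fluence):
    """Stochastic-programming objective: expected quadratic deviation from
    the prescription over all scenarios."""
    return sum(w * np.mean((D @ fluence - prescribed)**2)
               for w, D in zip(weights, scenarios))

# Simple projected gradient descent on the non-negative fluence weights.
fluence = np.ones(n_beamlets)
for _ in range(200):
    grad = sum(2 * w * D.T @ (D @ fluence - prescribed) / n_vox
               for w, D in zip(weights, scenarios))
    fluence = np.maximum(fluence - 0.5 * grad, 0.0)
print("robust objective after optimization:", round(robust_objective(fluence), 4))
```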

    Technical note: A collision prediction tool using Blender.

    Non-coplanar radiotherapy treatment techniques on C-arm linear accelerators have the potential to reduce dose to organs at risk in comparison with coplanar treatment techniques. Accurately predicting possible collisions between gantry, table and patient during treatment planning is needed to ensure patient safety. We offer a freely available collision prediction tool using Blender, a free and open-source 3D computer graphics software toolset. A geometric model of a C-arm linear accelerator, including a library of patient models, is created inside Blender. Based on this model, collision predictions can be used both to calculate collision-free zones and to check treatment plans for collisions. The tool is validated for two setups with the same table position, once with and once without a full-body phantom. For this, each gantry-table angle combination at a 2° resolution is manually checked for collision interlocks on a TrueBeam system and compared with the simulated collision predictions. For the collision check of a treatment plan, the tool outputs the minimal distance between the gantry, table and patient model and a video of the movement of the gantry and table, which is demonstrated for one use case. A graphical user interface allows user-friendly input of the table and patient specification for the collision prediction tool. The validation resulted in a true positive rate of 100%, i.e. the ratio between the number of correctly predicted collision gantry-table combinations and the number of all measured collision gantry-table combinations, and a true negative rate of 89%, i.e. the ratio between the number of correctly predicted collision-free combinations and the number of all measured collision-free combinations. A collision prediction tool is successfully created and is able to produce maps of collision-free zones and to test treatment plans for collisions, including visualisation of the gantry and table movement.
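
    Inside Blender, a pairwise mesh-collision test of this kind can be written with the mathutils BVH-tree module. The sketch below is a guess at how such a check might look, to be run from Blender's Python console; the object names "Gantry" and "Table", the rotation axis and the 2° sampling are assumptions, not the published tool:

```python
import bpy
from math import radians
from mathutils.bvhtree import BVHTree

def world_bvh(obj, depsgraph):
    """BVH tree of an object's evaluated mesh, transformed to world space."""
    eval_obj = obj.evaluated_get(depsgraph)
    mesh = eval_obj.to_mesh()
    verts = [eval_obj.matrix_world @ v.co for v in mesh.vertices]
    polys = [p.vertices[:] for p in mesh.polygons]
    eval_obj.to_mesh_clear()
    return BVHTree.FromPolygons(verts, polys)

def collides(a, b, depsgraph):
    """True if the triangle meshes of a and b overlap anywhere."""
    return len(world_bvh(a, depsgraph).overlap(world_bvh(b, depsgraph))) > 0

gantry = bpy.data.objects["Gantry"]    # assumed object names in the scene
table = bpy.data.objects["Table"]

collision_free = []
for g in range(0, 360, 2):             # 2 degree gantry sampling, as in the text
    gantry.rotation_euler = (0.0, radians(g), 0.0)   # assumed rotation axis
    bpy.context.view_layer.update()
    depsgraph = bpy.context.evaluated_depsgraph_get()
    if not collides(gantry, table, depsgraph):
        collision_free.append(g)

print("collision-free gantry angles [deg]:", collision_free)
```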