126 research outputs found

    Solvent and temperature effects on fluorescence emission of europium beta-diketonates


    Liquid laser cavities

    Liquid laser cavities have plenum chambers at the ends of the capillary cell, terminated in transparent optical flats. Using these cavities, several new europium chelates and a terbium chelate provide laser action in solution at room temperature.

    Laser action from a terbium beta-ketoenolate at room temperature

    Laser action is achieved at room temperature in a solution of a terbium tris chelate in acetonitrile or p-dioxane. After precipitation, the microcrystals of the hydrated tris chelate are filtered, washed in distilled water, and dried; they show no signs of deterioration after storage.

    An advanced 10.6-micron laser communication experiment

    Carbon dioxide laser capability for high-data-rate intersatellite communication

    QuakeFlow: A Scalable Machine-learning-based Earthquake Monitoring Workflow with Cloud Computing

    Earthquake monitoring workflows are designed to detect earthquake signals and to determine source characteristics from continuous waveform data. Recent developments in deep-learning seismology have improved tasks within earthquake monitoring workflows, allowing fast and accurate detection of up to orders of magnitude more small events than are present in conventional catalogs. To facilitate the application of machine-learning algorithms to large volumes of seismic records, we developed a cloud-based earthquake monitoring workflow, QuakeFlow, that applies multiple processing steps to generate earthquake catalogs from raw seismic data. QuakeFlow uses a deep learning model, PhaseNet, for picking P/S phases and a machine learning model, GaMMA, for phase association with approximate earthquake location and magnitude. Each component in QuakeFlow is containerized, allowing straightforward updates to the pipeline with new deep learning/machine learning models, as well as the addition of new components such as earthquake relocation algorithms. We built QuakeFlow on Kubernetes so that it auto-scales for large datasets and is easy to deploy on cloud platforms, enabling large-scale parallel processing. We used QuakeFlow to process three years of continuous archived data from Puerto Rico and found more than a factor of ten more events, occurring on much the same structures as previously known seismicity. We applied QuakeFlow to monitor frequent earthquakes in Hawaii and found over an order of magnitude more events than are in the standard catalog, including many events that illuminate the deep structure of the magmatic system. We also added Kafka and Spark streaming to deliver real-time earthquake monitoring results. QuakeFlow is an effective and efficient approach both for improving real-time earthquake monitoring and for mining archived seismic data sets.