359 research outputs found

    Using high resolution optical imagery to detect earthquake-induced liquefaction: the 2011 Christchurch earthquake

    Using automated supervised methods with satellite and aerial imagery for liquefaction mapping is a promising step toward providing detailed, region-scale maps of liquefaction extent immediately after an earthquake. The accuracy of these methods depends on the quantity and quality of training samples and on the number of available spectral bands. Digitizing a large number of high-quality training samples from an event may not be feasible in the timeframe required for rapid response, because the training pixels for each class must be typical of that class and accurately represent its spectral diversity. To perform automated classification for liquefaction detection, we need to understand how to build an optimal and accurate training dataset. Using multispectral optical imagery from the 22 February 2011 Christchurch earthquake, we investigate the effects of the quantity of high-quality training pixels and the number of spectral bands on the performance of a pixel-based parametric supervised maximum likelihood classifier for liquefaction detection. We find that liquefaction surface effects are bimodal in spectral signature, owing to the difference in water content between the two modes, and should therefore be classified as either wet liquefaction or dry liquefaction. Using 5-fold cross-validation, we evaluate the performance of the classifier on training datasets of 50, 100, 500, 2000, and 4000 pixels. We also investigate the effect of adding spectral information, first by adding only the near-infrared (NIR) band to the visible red, green, and blue (RGB) bands, and then by using all eight available spectral bands of the WorldView-2 satellite imagery. We find that the classifier achieves high accuracies (75%–95%) with the 2000-pixel dataset that includes the RGB+NIR bands; increasing to a 4000-pixel dataset and/or eight spectral bands may therefore not be worth the required time and cost.
We also investigate the accuracy of the classifier on aerial imagery with the same number of training pixels and either RGB or RGB+NIR bands, and find that accuracies are higher for satellite imagery given the same number of training pixels and the same spectral information. The classifier identifies dry liquefaction with higher user accuracy than wet liquefaction across all evaluated scenarios. To improve classification performance for wet liquefaction detection, we also investigate adding geospatial information in the form of building footprints. We find that using a building footprint mask to remove buildings from the classification process increases wet liquefaction user accuracy by roughly 10%.
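A parametric maximum likelihood classifier of the kind described in this abstract models each class as a multivariate Gaussian in spectral space and assigns each pixel to the class with the highest likelihood. The sketch below is a minimal illustration with entirely synthetic RGB+NIR pixel values; the class means, spreads, and sample counts are assumptions for demonstration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pixels in 4 bands (R, G, B, NIR); values illustrative.
def make_class(mean, n=200):
    return rng.normal(mean, 8.0, size=(n, 4))

X = np.vstack([
    make_class([90, 85, 80, 40]),     # "wet liquefaction": dark, low NIR
    make_class([180, 170, 160, 120]), # "dry liquefaction": bright ejecta
    make_class([60, 110, 50, 200]),   # vegetated background: high NIR
])
y = np.repeat([0, 1, 2], 200)

def fit_ml(X, y):
    """Per-class Gaussian parameters: mean vector and covariance matrix."""
    return [(X[y == c].mean(axis=0), np.cov(X[y == c], rowvar=False))
            for c in np.unique(y)]

def predict_ml(params, X):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    scores = []
    for mu, cov in params:
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = X - mu
        scores.append(-0.5 * (np.einsum('ij,jk,ik->i', d, inv, d) + logdet))
    return np.argmax(scores, axis=0)

# 5-fold cross-validation over the pooled training pixels
idx = rng.permutation(len(y))
folds = np.array_split(idx, 5)
accs = []
for k in range(5):
    test = folds[k]
    train = np.concatenate([folds[j] for j in range(5) if j != k])
    params = fit_ml(X[train], y[train])
    accs.append((predict_ml(params, X[test]) == y[test]).mean())
print(f"mean 5-fold accuracy: {np.mean(accs):.2f}")
```

With well-separated synthetic classes the cross-validated accuracy is near 1.0; on real imagery the spectral overlap between wet liquefaction and other dark surfaces is what drives the accuracy differences the abstract reports.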

    Preparing Community-Oriented Teachers: Reflections from a Multicultural Service-Learning Standpoint

    The Banneker History Project (BHP) reconstructed the history of a local, segregated school. The Benjamin Banneker School served African American youth from 1913 to 1951. Oral histories from surviving alumni as well as primary documents from the period were sought. This article focuses on the ways that one group of participants, 24 preservice teachers of color, experienced and interpreted the BHP. Data are reported in response to three questions: (a) Whose community does service learning serve? (b) What meanings do preservice teachers make of culturally responsive teaching? and (c) Does a community orientation count in teacher education? The author reflects on and draws insights from these data, and considers the implications of this effort for community-oriented teacher education.

    Man and His World


    The Banneker History Project: Historic Investigation of a Once Segregated School

    The Banneker History Project was a service learning project in which students investigated the history of the Benjamin Banneker School, a segregated school that operated from 1915 to 1951 in a Midwestern college community. This article discusses the research these students conducted and the perceptions they adopted as a result of their work.

    Model Validation of Recent Ground Motion Prediction Relations for Shallow Crustal Earthquakes in Active Tectonic Regions

    Recent earthquake ground motion prediction relations, such as those developed from the Next Generation Attenuation of Ground Motions (NGA) project in 2008, have established a new baseline for the estimation of ground motion parameters such as peak ground acceleration (PGA), peak ground velocity (PGV), and spectral acceleration (Sa). When these models were published, very little was written about model validation or prediction accuracy. We perform statistical goodness-of-fit analyses to quantitatively compare the predictive abilities of these recent models. The prediction accuracy of the models is compared using several testing subsets of the master database used to develop the NGA models. In addition, we perform a blind comparison of the new models with previous, simpler models, using ground motion records from the two most recent earthquakes of magnitude 6.0 or greater to strike mainland California: (1) the 2004 M 6.0 Parkfield earthquake and (2) the 2003 M 6.5 San Simeon earthquake. By comparing the predictor variables and performance of different models, we discuss the sources of uncertainty in the estimates of ground motion parameters and offer recommendations for model development. This paper presents a model validation framework for assessing the prediction accuracy of ground motion prediction relations and for aiding in their future development.
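As a simplified illustration of the kind of goodness-of-fit comparison described above, the sketch below scores two made-up attenuation relations against synthetic ln(PGA) observations. All functional forms, coefficients, and the scatter term are assumptions for demonstration; they are not the NGA models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical recordings: magnitudes and distances (km); values illustrative.
M = rng.uniform(5.5, 7.5, 300)
R = rng.uniform(1.0, 200.0, 300)

# Synthetic "observed" ln(PGA) with aleatory scatter (sigma = 0.6)
ln_pga = 1.0 + 0.9 * M - 1.2 * np.log(R + 10.0) + rng.normal(0, 0.6, 300)

def model_a(M, R):
    # Candidate relation matching the data-generating form
    return 1.0 + 0.9 * M - 1.2 * np.log(R + 10.0)

def model_b(M, R):
    # Simpler candidate that ignores near-field saturation
    return 0.5 + 1.0 * M - 1.0 * np.log(R + 1.0)

def goodness_of_fit(pred, obs, sigma=0.6):
    """Mean residual (bias), residual std, and normal log-likelihood."""
    res = obs - pred
    loglik = -0.5 * np.sum((res / sigma) ** 2 + np.log(2 * np.pi * sigma**2))
    return res.mean(), res.std(), loglik

for name, m in [("A", model_a), ("B", model_b)]:
    bias, std, ll = goodness_of_fit(m(M, R), ln_pga)
    print(f"model {name}: bias={bias:+.2f}, std={std:.2f}, loglik={ll:.0f}")
```

In a real validation study the residuals would be computed against recorded ground motions and each model's own published sigma, but the ranking logic (bias, scatter, likelihood) is the same.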

    A Practical Approach for Implementing the Probability of Liquefaction in Performance Based Design

    Empirical Liquefaction Models (ELMs) are the usual approach for predicting the occurrence of soil liquefaction. These ELMs are typically based on in situ index tests, such as the Standard Penetration Test (SPT) and Cone Penetration Test (CPT), and are broadly classified as deterministic and probabilistic models. A deterministic model provides a “yes/no” response to the question of whether or not a site will liquefy. However, Performance-Based Earthquake Engineering (PBEE) requires an estimate of the probability of liquefaction (PL), which is a quantitative and continuous measure of the severity of liquefaction. Probabilistic models are better suited for PBEE but are still not consistently used in routine engineering applications, primarily because of the limited guidance regarding which model to use and the difficulty in interpreting the resulting probabilities. The practical implementation of a probabilistic model requires a threshold of liquefaction (THL). Researchers who have used probabilistic methods have either chosen a subjective THL or used the established deterministic curves to develop the THL. In this study, we compare the predictive performance of the various deterministic and probabilistic ELMs within a quantitative validation framework. We incorporate estimated costs associated with risk as well as with risk mitigation to interpret PL using precision and recall, and to compute the optimal THL using a Precision-Recall (P-R) cost curve. We also provide P-R cost curves for the popular probabilistic models developed using Bayesian updating for SPT and CPT data by Cetin et al. (2004) and Moss et al. (2006), respectively. These curves should be immediately useful to a geotechnical engineer who needs to choose the optimal THL that incorporates the costs associated with the risk of liquefaction and the costs associated with mitigation.
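The expected-cost logic behind choosing a THL can be sketched as a simplified analogue of the P-R cost-curve idea. In the sketch below, the probabilities, outcome labels, and cost ratio are all hypothetical; a real application would use PL values from a probabilistic ELM such as Cetin et al. (2004):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical calibration set: predicted P_L and observed liquefaction (0/1).
n = 1000
p_l = rng.uniform(0, 1, n)
liquefied = (rng.uniform(0, 1, n) < p_l).astype(int)  # well-calibrated toy data

C_MITIGATE = 1.0   # assumed cost of mitigating a flagged site
C_FAILURE = 10.0   # assumed cost of an unmitigated site that liquefies

def expected_cost(thl):
    """Total cost of mitigating all flagged sites plus missed failures."""
    flagged = p_l >= thl
    fn = np.sum(~flagged & (liquefied == 1))  # missed liquefaction cases
    return C_MITIGATE * flagged.sum() + C_FAILURE * fn

thresholds = np.linspace(0.01, 0.99, 99)
costs = np.array([expected_cost(t) for t in thresholds])
best = thresholds[costs.argmin()]

# Precision and recall at the cost-optimal threshold
flagged = p_l >= best
tp = np.sum(flagged & (liquefied == 1))
precision, recall = tp / flagged.sum(), tp / liquefied.sum()
print(f"optimal THL ~ {best:.2f}, precision={precision:.2f}, recall={recall:.2f}")
```

For perfectly calibrated probabilities the optimal threshold lands near the cost ratio C_MITIGATE/C_FAILURE (here 0.1): mitigation pays off whenever the expected failure cost exceeds the mitigation cost.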

    The Objective-Subjective Dichotomy and its Use in Describing Probability

    This article deals with the nature of the objective-subjective dichotomy, first from a general historical point of view, and then with regard to the use of these terms over time to describe theories of probability. The different (metaphysical and epistemological) meanings of “objective” and “subjective” are analyzed, and then used to show that all probability theories can be divided into three broad classes.

    Cogitator : a parallel, fuzzy, database-driven expert system

    The quest to build anthropomorphic machines has led researchers to focus on knowledge and the manipulation thereof. Recently, the expert system was proposed as a solution, working well in small, well-understood domains. However, these initial attempts highlighted the tedious process of building systems that display intelligence, most notably the Knowledge Acquisition Bottleneck. Attempts to circumvent this problem have led researchers to propose the use of machine-learning databases as a source of knowledge, and utilising databases as sources of knowledge has in turn led to the development of Database-Driven Expert Systems. Furthermore, it has been ascertained that powerful computation is a requisite for intelligent systems. In response to these problems and proposals, a new type of database-driven expert system, Cogitator, is proposed. It is shown to circumvent the Knowledge Acquisition Bottleneck and to possess many other advantages over both traditional expert systems and connectionist systems, while suffering no serious disadvantages.

    Site Response at Treasure and Yerba Buena Islands, California


    Optimization of Volatile Fatty Acid Formation by Fermentation of Primary Sludge

    The Orange County (North Carolina) Water and Sewer Authority (OWASA) operates the Mason Farm Wastewater Treatment Plant to achieve the removal of biochemical oxygen demand (BOD), nitrification of ammonium-nitrogen, and biological phosphorus removal (BPR). Future discharge permits may require total nitrogen removal, which would be accomplished by incorporating denitrification into the biological treatment process. One means of providing the carbon sources necessary for denitrification would be to provide volatile fatty acids (VFAs) from the fermentation of primary sludge. Although OWASA already ferments primary sludge at the Mason Farm Plant, the purpose of this study was to evaluate the conditions under which VFA production could be maximized during fermentation. Fermentation involves the conversion of organic compounds to volatile fatty acids (VFAs) by microorganisms under anaerobic conditions. When primary sludge is fermented, the principal compounds available for fermentation are solids such as cellulose and other polysaccharides, proteins, and lipids or fats. Before these solids can be fermented, however, they are first broken down to low-molecular-weight products that are then converted to VFAs by fermentative bacteria. Anaerobic bioreactors generally contain complex communities of microorganisms. If the residence time of the organisms in such reactors is long enough, microorganisms that convert acetic acid and other compounds to methane (methanogens) will normally be present. Thus, methanogens can serve as a sink for the VFAs that are produced by the fermentative bacteria. Optimizing the production of VFAs in a fermenter therefore involves balancing conditions that maximize hydrolysis and fermentation reactions against those that minimize methanogenesis.
The most common means of accomplishing this in the field is to operate sludge fermenters at residence times that are too short to permit significant growth of the slow-growing methanogens. In addition to residence time, other factors such as pH might influence the rate and extent of fermentation. For example, the optimum pH for growth of methanogens is near neutral, so the acidic conditions often associated with fermentation could help minimize methanogenesis. Methanogens are also known to be very sensitive to oxygen, so intermittent exposure to air might also reduce methanogenesis in fermentation processes. Our review of the literature indicates that the effects of pH and oxygen on the fermentation of primary sludge are not well understood. In this study the effects of pH and oxygen addition on VFA formation from primary sludge were evaluated in batch anaerobic incubations. In addition, the effect of hydraulic residence time in a continuous-flow reactor was also evaluated. The combined results suggested that pH is the most important variable influencing VFA formation. In general, pH near or below 5.0 led to maximum VFA formation (between 0.4 and 0.5 g VFAs as chemical oxygen demand per gram volatile solids fed).
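Expressing measured VFA concentrations as chemical oxygen demand, as in the yield figure quoted above, is a simple stoichiometric conversion: each acid's COD equivalent is the mass of O2 needed to fully oxidize one gram of it. The sketch below uses the standard oxidation stoichiometry for the common acids; the example concentrations and the volatile solids loading are illustrative only:

```python
# COD equivalents: g O2 demanded per g of acid, from oxidation stoichiometry
COD_PER_G = {
    "acetic": 64.0 / 60.05,      # CH3COOH  + 2 O2   -> 2 CO2 + 2 H2O
    "propionic": 112.0 / 74.08,  # C2H5COOH + 3.5 O2 -> 3 CO2 + 3 H2O
    "butyric": 160.0 / 88.11,    # C3H7COOH + 5 O2   -> 4 CO2 + 4 H2O
}

def vfa_yield(conc_g_per_L, vs_fed_g_per_L):
    """Total VFAs expressed as COD, per gram of volatile solids fed."""
    vfa_cod = sum(COD_PER_G[acid] * c for acid, c in conc_g_per_L.items())
    return vfa_cod / vs_fed_g_per_L

# Hypothetical batch result (concentrations in g/L are illustrative):
y = vfa_yield({"acetic": 2.0, "propionic": 1.0, "butyric": 0.5}, 10.0)
print(f"yield = {y:.2f} g VFA-COD per g VS fed")  # falls in the 0.4-0.5 range
```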