    UNDERSTANDING AND DEVELOPING LITERACY IN THE ESL/EFL CLASSROOM

    Abstract: Literacy in the Australian Curriculum is defined in terms of the important role language plays in the construction of meaning in diverse social and cultural contexts and situations (ACARA, 2013; Halliday & Hasan, 1985; Vygotsky, 1976). To this end, this paper briefly reviews some relevant pathways to literacy, namely the teaching and learning cycle that informs literacy pedagogy, and the language-learning theories that underpin accepted literacy approaches. In addition, the paper addresses an important link to literacy, the intercultural understanding general capability, through a brief review of Golding's (2005) suggestions for developing ‘a thinking classroom’. This is set alongside the views of other researchers (Pohl, 2000; Pope & Denicolo, 2001; Arthur, 2005; Lipson, 2006) who likewise stress the importance of student and teacher wellbeing in the classroom. It is hoped that the discussions covered in this paper provide a useful review of the relevant literature and related findings. Keywords: literacy, intercultural understanding, text types, genre theory

    Seismic performance of buried electrical cables: evidence-based repair rates and fragility functions

    The fragility of buried electrical cables is often neglected in earthquake risk assessment, but significant damage to cables was observed during the 2010–2011 Canterbury earthquake sequence in New Zealand. This study estimates Poisson repair rates, similar to those already established for pipelines, using damage data retrieved from part of the electric power distribution network in the city of Christchurch. The functions have been developed separately for four seismic hazard zones: no liquefaction, all liquefaction effects, liquefaction-induced settlement only, and liquefaction-induced lateral spread. In each zone six different intensity measures (IMs) are tested, including peak ground velocity as a measure of ground shaking and five metrics of permanent ground deformation: vertical differential, horizontal, maximum, vector mean and geometric mean. The analysis confirms that the vulnerability of buried cables is influenced more by liquefaction than by ground shaking, and that lateral spread causes more damage than settlement alone. In areas where lateral spreading is observed, the geometric mean permanent ground deformation is identified as the best-performing of the IMs tested, considering both variance explained and uncertainty. In areas where only settlement is observed, there is only a moderate correlation between repair rate and vertical differential permanent ground deformation, but the estimated model error is relatively small and so the model may be acceptable. In general, repair rates in the zone where no liquefaction occurred are very low, and it is possible that repairs recorded in this area result from misclassification of hazard observations, either in the raw data or due to the approximations of the geospatial analysis. Along with hazard intensity, insulation material is identified as a critical factor influencing cable fragility, with paper-insulated lead-covered armoured cables experiencing considerably higher repair rates than cross-linked polyethylene cables. The analysis shows no trend between cable age and repair rates, and the differences in repair rates between conducting materials are shown not to be significant. In addition to repair rate functions, an example fragility curve suite for cables is presented, which may be more useful for analysis of network connectivity, where cable functionality is of more interest than the number of repairs. These functions are among the first produced for the prediction of damage to buried cables.
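
    The Poisson framing above implies a simple way to turn a fitted repair-rate function into a failure probability for an individual cable. A minimal sketch follows, assuming an illustrative power-law form RR = a * PGD^b (repairs per km); the coefficients and function names are placeholders, not the paper's fitted values, which are developed separately per hazard zone and IM.

        import numpy as np

        def repair_rate(pgd_m, a=0.5, b=1.1):
            # Illustrative power-law repair-rate function, RR = a * PGD^b,
            # in repairs per km; a and b are placeholder coefficients.
            return a * np.asarray(pgd_m, dtype=float) ** b

        def prob_at_least_one_repair(pgd_m, length_km):
            # Under the Poisson assumption, the probability of at least
            # one repair along a cable of length L is 1 - exp(-RR * L).
            return 1.0 - np.exp(-repair_rate(pgd_m) * length_km)

        # Example: a 2 km cable in a lateral-spread zone with a
        # geometric-mean PGD of 0.4 m
        print(prob_at_least_one_repair(0.4, 2.0))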

    Evaluating desktop methods for assessing liquefaction-induced damage to infrastructure for the insurance sector

    The current method used by insurance catastrophe models to account for liquefaction simply applies a factor to shaking-induced losses based on liquefaction susceptibility. There is a need for more sophisticated methods, but they must be compatible with the data and resource constraints that insurers work within. This study compares five models: liquefaction potential index (LPI) calculated from shear-wave velocity; two implementations of the HAZUS software methodology; and two models based on USGS remote sensing data. Data from the September 2010 and February 2011 Canterbury (New Zealand) earthquakes are used to compare observed liquefaction occurrences to predictions from these models using binary classification performance measures. The analysis shows that the best-performing model is LPI, although the correlation with observations is only moderate, and statistical techniques for binary classification models indicate that the model is biased towards positive predictions of liquefaction occurrence.
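
    As a concrete illustration of binary classification performance measures for such models, the sketch below computes the two hit rates plus the Matthews correlation coefficient, a balanced score that exposes a bias towards positive predictions. The function name and the specific choice of measures are illustrative assumptions; the paper may use a different set.

        import numpy as np

        def classification_scores(observed, predicted):
            # Confusion-matrix counts for binary liquefaction forecasts
            # (True = liquefaction observed/predicted at a site)
            obs = np.asarray(observed, dtype=bool)
            pred = np.asarray(predicted, dtype=bool)
            tp = np.sum(obs & pred)
            tn = np.sum(~obs & ~pred)
            fp = np.sum(~obs & pred)
            fn = np.sum(obs & ~pred)
            tpr = tp / (tp + fn)   # fraction of liquefied sites caught
            tnr = tn / (tn + fp)   # fraction of non-liquefied sites caught
            # Matthews correlation coefficient: stays near zero for a
            # model that scores well only by over-predicting liquefaction
            mcc = (tp * tn - fp * fn) / np.sqrt(
                float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
            return {"TPR": tpr, "TNR": tnr, "MCC": mcc}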

    Evaluating Simplified Methods for Liquefaction Assessment for Loss Estimation

    Currently, some catastrophe models used by the insurance industry account for liquefaction by applying a simple factor to shaking-induced losses. The factor is based only on local liquefaction susceptibility, which highlights the need for a more sophisticated approach to incorporating the effects of liquefaction in loss models. This study compares 11 unique models, each based on one of three principal simplified liquefaction assessment methods: liquefaction potential index (LPI) calculated from shear-wave velocity, the HAZUS software method, and a method created specifically to make use of USGS remote sensing data. Data from the September 2010 Darfield and February 2011 Christchurch earthquakes in New Zealand are used to compare observed liquefaction occurrences to forecasts from these models using binary classification performance measures. The analysis shows that the best-performing model is the LPI calculated using known shear-wave velocity profiles, which correctly forecasts 78% of sites where liquefaction occurred and 80% of sites where liquefaction did not occur, when the threshold is set at 7. However, these data may not always be available to insurers. The next best model is also based on LPI but uses shear-wave velocity profiles simulated from the combination of USGS VS30 data and empirical functions that relate VS30 to average shear-wave velocities at shallower depths. This model correctly forecasts 58% of sites where liquefaction occurred and 84% of sites where liquefaction did not occur, when the threshold is set at 4. These scores increase to 78% and 86%, respectively, when forecasts are based on liquefaction probabilities that are empirically related to the same values of LPI. This model is potentially more useful for insurance since the input data are publicly available. HAZUS models, which are commonly used in studies where no local model is available, perform poorly and incorrectly forecast 87% of sites where liquefaction occurred, even at optimal thresholds. This paper also considers two models (HAZUS and EPOLLS) for estimating the scale of liquefaction in terms of permanent ground deformation, but finds that both perform poorly, with correlations between observations and forecasts lower than 0.4 in all cases. These models therefore potentially provide negligible additional value to loss estimation analysis outside the regions for which they were developed.
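
    The threshold values quoted above (7 for the measured-profile LPI model, 4 for the simulated-profile one) suggest a simple calibration loop: sweep candidate LPI thresholds and score the resulting binary forecasts. A sketch under that assumption follows; the selection criterion used here (maximising the sum of the two correct-forecast rates) is one plausible choice, not necessarily the paper's.

        import numpy as np

        def forecast_rates(lpi, observed, threshold):
            # Predict liquefaction wherever LPI >= threshold, then report
            # the fraction of liquefied and non-liquefied sites forecast
            # correctly (the two percentages quoted in the abstract).
            pred = np.asarray(lpi) >= threshold
            obs = np.asarray(observed, dtype=bool)
            return np.mean(pred[obs]), np.mean(~pred[~obs])

        def best_threshold(lpi, observed, candidates=range(1, 16)):
            # One plausible criterion: maximise the sum of the two rates.
            return max(candidates,
                       key=lambda t: sum(forecast_rates(lpi, observed, t)))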

    BEA: An efficient Bayesian emulation-based approach for probabilistic seismic response

    This paper presents an advanced Bayesian emulation-based approach (hereafter BEA) that allows a reduced number of analyses to be carried out to compute the probabilistic seismic response and fragility of buildings. The BEA, which is a version of kriging, uses a mean function as a first approximation of the expected Engineering Demand Parameter given Intensity Measure (EDP|IM) relationship and then models the approximation errors as a Gaussian Process (GP). A main advantage of the BEA is its flexibility, as it does not impose a fixed mathematical form on the EDP|IM relationship (unlike other approaches such as the standard cloud method). In addition, BEA makes fewer assumptions than standard methods and provides improved characterization of uncertainty. This paper first presents the BEA and then assesses its computational efficiency compared to the standard cloud method. This is done through the creation of EDP|IM relationships and fragility functions using the outputs of nonlinear dynamic and nonlinear static analyses for two case-study buildings representing Pre- and Special-Code seismic vulnerability classes. The nonlinear dynamic and static analysis methods represent different levels of accuracy (i.e., high and low fidelity, respectively). The BEA and standard cloud methods are compared in their ability to recreate three “pseudo-realities”, each represented by an artificially generated EDP|IM relationship derived from a large set of analysis runs. Several input configurations are tested, including reduced sets of training inputs (analysis runs), training inputs of high and low fidelity, two sampling processes for these inputs (random and stratified sampling), and two different IM representations. The results demonstrate that BEA yields both improved accuracy in mean estimates and smaller uncertainty bounds compared to the cloud method. The improved performance of the BEA is maintained for all “pseudo-realities” tested, regardless of whether it is trained with high- or low-fidelity analysis data, with the improvement particularly pronounced when the advanced IM INp is used. Good accuracy can be achieved with BEA even with reduced samples, yielding a 25% saving in the number of analyses required to generate the EDP|IM relationship. Finally, the use of BEA drastically improves both the accuracy and efficiency of the resultant seismic fragility functions.
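
    The two-step structure described above (a mean function as a first approximation of EDP|IM, with a Gaussian Process on the residual errors) can be sketched with off-the-shelf tools. The sketch below assumes a linear mean trend in log space and uses scikit-learn's GaussianProcessRegressor; the training data, kernel choice and trend form are illustrative placeholders, not the paper's.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Placeholder training data: a handful of (IM, EDP) pairs from
        # nonlinear analyses, worked in log space as is common for EDP|IM
        x = np.log([0.1, 0.2, 0.35, 0.5, 0.8, 1.2]).reshape(-1, 1)
        y = np.log([0.002, 0.005, 0.010, 0.018, 0.040, 0.090])

        # Step 1: mean function -- a linear fit in log space as the first
        # approximation of ln(EDP) given ln(IM)
        coef = np.polyfit(x.ravel(), y, 1)

        # Step 2: a Gaussian Process models the approximation errors
        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                      normalize_y=True)
        gp.fit(x, y - np.polyval(coef, x.ravel()))

        # Predict the EDP (and its uncertainty) at a new IM level
        x_new = np.log([[0.6]])
        resid, sd = gp.predict(x_new, return_std=True)
        print(np.exp(np.polyval(coef, x_new.ravel()) + resid), sd)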

    Tuberculosis/HIV/AIDS coinfection in Porto Alegre, RS/Brazil - invisibility and silencing of the most affected groups

    OBJECTIVE: To analyze how belonging to certain social groups contributes to constituting the vulnerabilities associated with illness due to tuberculosis/HIV/AIDS coinfection. METHODOLOGY: This is a qualitative study carried out in the city of Porto Alegre, state of Rio Grande do Sul, in regions of high social vulnerability. Twenty coinfected people were interviewed in specialized health services between August and December 2016. The analysis was based on the frameworks The Sound of Silence and Vulnerability and Human Rights. RESULTS: Socioeconomic conditions were decisive in constituting the conditions of vulnerability. Processes that render people invisible and silence their voices, in a scenario marked by economic, racial, and gender inequalities, meant that their health needs were neither understood nor effectively taken into account in the services' actions. FINAL CONSIDERATIONS: The more effectively strategies legitimize the voices and address the needs of those affected by coinfection, the greater the chances that programmatic responses to the problem will be successful.

    Interdependence and dynamics of essential services in an extensive risk context: a case study in Montserrat, West Indies

    The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems-thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards, in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard conditions, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41% and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth, context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.
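
    The interdependence classification lends itself to a simple graph representation in which each relationship is an edge tagged with its type, and the quoted 41% growth is a comparison of edge counts between hazard states. The sketch below, using networkx, is purely illustrative: the sectors, relationship types and edge lists are invented stand-ins, not the study's data.

        import networkx as nx

        # Invented stand-in edge lists: (from-sector, to-sector, type)
        quiescent = [("power", "healthcare", "physical"),
                     ("water", "healthcare", "physical"),
                     ("healthcare", "transport", "logistical")]
        heightened = quiescent + [
            ("monitoring", "healthcare", "information"),
            ("healthcare", "water", "cooperation")]

        def build(edges):
            g = nx.MultiDiGraph()  # directed; sectors can share many links
            for src, dst, kind in edges:
                g.add_edge(src, dst, kind=kind)
            return g

        g0, g1 = build(quiescent), build(heightened)
        growth = (100 * (g1.number_of_edges() - g0.number_of_edges())
                  / g0.number_of_edges())
        print(f"connections increase by {growth:.0f}%")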

    Effect of defect size on P-S-N curves in Very-High-Cycle Fatigue

    It is well known that internal defects play a key role in the Very-High-Cycle Fatigue (VHCF) response of metallic materials. VHCF failures generally nucleate from internal defects, whose size strongly affects the material strength and life. Therefore, S-N curves in the VHCF regime are defect-size dependent, and the scatter of fatigue data is significantly influenced by the statistical distribution of defect size within the material. The present paper proposes an innovative approach for the statistical modeling of Probabilistic-S-N (P-S-N) curves in the VHCF regime. The proposed model considers conditional P-S-N curves that depend on a specific value of the initial defect size. From the statistical distribution of the initial defect size, marginal P-S-N curves are estimated, and the effect of the risk-volume on the VHCF response is also modeled. Finally, the paper reports a numerical example that quantitatively illustrates the concepts of conditional and marginal P-S-N curves and shows the effect of the risk-volume on the VHCF response.
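
    The relationship between conditional and marginal P-S-N curves can be illustrated by Monte Carlo mixing: sample initial defect sizes from their statistical distribution, evaluate a conditional life model at each sample, and read quantiles off the pooled lives. The sketch below assumes a Gumbel (largest extreme value) defect-size distribution and a Basquin-type conditional life model with lognormal scatter; all parameters and functional forms are illustrative placeholders, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative Gumbel distribution for the initial defect size
        # sqrt(area), in micrometres (placeholder parameters); a larger
        # risk-volume would sample more, and hence larger, extreme defects
        defects_um = rng.gumbel(loc=25.0, scale=8.0, size=100_000)

        def conditional_log_life(stress_mpa, sqrt_area_um, sigma=0.10):
            # Conditional log10 fatigue life given stress amplitude and
            # defect size: a Basquin-type trend whose level degrades with
            # defect size, plus lognormal scatter (illustrative form)
            mu = (18.0 - 4.0 * np.log10(stress_mpa)
                       - 1.5 * np.log10(sqrt_area_um))
            return rng.normal(mu, sigma)

        # Marginal P-S-N at one stress level: mix the conditional model
        # over the defect-size distribution, then read off life quantiles
        lives = conditional_log_life(400.0, defects_um)
        print(np.percentile(lives, [10, 50, 90]))  # P10/P50/P90 log10-life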