
    Examining the cognitive costs of counterfactual language comprehension: Evidence from ERPs

    Recent empirical research suggests that understanding a counterfactual event (e.g. ‘If Josie had revised, she would have passed her exams’) activates mental representations of both the factual and counterfactual versions of events. However, it remains unclear when readers switch between these models during comprehension, and whether representing multiple ‘worlds’ is cognitively effortful. This paper reports two ERP studies in which participants read contexts that set up a factual or counterfactual scenario, followed by a second sentence describing a consequence of this event. Critically, this sentence included a noun that was either consistent or inconsistent with the preceding context, and either included a modal verb to indicate reference to the counterfactual world or not (thus referring to the factual world). Experiment 2 used adapted versions of the materials from Experiment 1, together with measures of individuals’ working memory capacity, to examine the degree to which representing multiple versions of a counterfactual situation makes heavy demands on cognitive resources. Results showed that when reference to the counterfactual world was maintained by the ongoing discourse, readers correctly interpreted events according to the counterfactual world (i.e. showed a larger N400 for inconsistent than consistent words). In contrast, when cues referred back to the factual world, readers showed no difference between consistent and inconsistent critical words, suggesting that they simultaneously compared information against both possible worlds. These results support previous dual-representation accounts for counterfactuals, and provide new evidence that linguistic cues can guide the reader in selecting which world model to evaluate incoming information against. Crucially, we reveal evidence that maintaining and updating a hypothetical model over time relies upon the availability of cognitive resources.

    Size-resolved aerosol fluxes above a temperate broadleaf forest

    Aerosol fluxes were measured by eddy correlation for 8 weeks of the summer and fall of 2011 above a temperate broadleaf forest in central Ontario, Canada. These size-resolved measurements apply to particles with optical diameters between 50 and 500 nm and are the first ones reported above a temperate deciduous forest. The particle spectrometer was located on top of the flux tower in order to reduce signal dampening in the tube and thus maximize measurement efficiency. The 8-week data set extends into autumn, capturing leaf senescence and loss, offering a rare opportunity to investigate the influence of leaf area index on particle transfer. A distinct, size-dependent pattern of emission and deposition is highlighted: the smallest particles (dp < 100 nm) are preferentially emitted, while larger particles (dp > 100 nm) are preferentially deposited (62% of the time). For the size bins with detection efficiency above 50% (68–292 nm), the median transfer velocity for each bin varies between +1.34 and −2.69 mm s−1 and is equal to −0.21 mm s−1 for the total particle count. The occurrence of the upward fluxes shows a marked diurnal pattern. Possible explanations for these upward fluxes are proposed. The measurements, and their comparison with an existing model, highlight some of the key drivers of particle transfer onto a broadleaf forest: particle size, friction velocity, leaf area index and atmospheric stability. We are grateful to the Haliburton Forest staff and owner for their support, as well as Ting Zheng and Jing Ming Chen (Dept of Geography, Univ. of Toronto) for sharing the TRAC instrument LAI data. The UHSAS and SMPS instruments were contributed by the Canadian Aerosol Research Network, funded by the Canada Foundation for Innovation.
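    The flux and transfer-velocity figures quoted above follow directly from the eddy-correlation definitions: the flux is the covariance of the fluctuations of vertical wind speed and particle concentration, and the transfer velocity is that flux divided by the mean concentration (negative values indicating deposition). The sketch below illustrates this calculation on synthetic data; the variable names, sampling rate and numbers are assumptions for illustration, not this study's processing chain.

```python
import numpy as np

def eddy_flux_and_transfer_velocity(w, c):
    """Eddy-covariance particle flux and transfer velocity.

    w : vertical wind speed samples (m s-1)
    c : particle number concentration samples (cm-3), same timestamps as w

    Flux F = <w'c'> (covariance of the fluctuations); F < 0 means deposition.
    Transfer velocity v_t = F / <c>, in m s-1.
    """
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    w_fluct = w - w.mean()            # Reynolds decomposition: remove the means
    c_fluct = c - c.mean()
    flux = np.mean(w_fluct * c_fluct)  # cm-3 m s-1
    v_t = flux / c.mean()              # m s-1
    return flux, v_t

# Example: 30 min of 10 Hz synthetic data for one size bin
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18000)
c = 250.0 + 5.0 * rng.normal(0.0, 1.0, 18000) - 20.0 * w   # anti-correlated -> deposition
F, vt = eddy_flux_and_transfer_velocity(w, c)
print(f"flux = {F:.2f} cm^-3 m s^-1, v_t = {vt * 1000:.2f} mm s^-1")
```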

    Seismic Risk Analysis of Revenue Losses, Gross Regional Product and Transportation Systems

    Natural threats like earthquakes, hurricanes or tsunamis have shown serious impacts on communities. In the past, major earthquakes in the United States like Loma Prieta 1989 and Northridge 1994, or recent events in Italy like the L’Aquila 2009 or Emilia 2012 earthquakes, emphasized the importance of preparedness and awareness to reduce social impacts. Earthquakes impacted businesses and dramatically reduced the gross regional product. Seismic hazard is traditionally assessed using Probabilistic Seismic Hazard Analysis (PSHA). PSHA represents the hazard well at a specific location, but it is unsatisfactory for spatially distributed systems. Scenario earthquakes overcome this problem by representing the actual distribution of shaking over a spatially distributed system. The performance of distributed productive systems during the recovery process needs to be explored. Scenario earthquakes have been used to assess the risk in bridge networks and the social losses in terms of gross regional product reduction. The proposed method for scenario earthquakes has been applied to a real case study: Treviso, a city in the north-east of Italy. The method requires three models: a representation of the sources (Italian Seismogenic Zonation 9), an attenuation relationship (Sabetta and Pugliese 1996) and a model of the occurrence rate of magnitudes (Gutenberg-Richter). A methodology has been proposed to reduce thousands of scenarios to a subset consistent with the hazard at each location. Earthquake scenarios, along with the Monte Carlo method, have been used to simulate business damage. The response of business facilities to earthquakes has been obtained from fragility curves for precast industrial buildings. Furthermore, from the business damage the reduction of productivity has been simulated using economic data from the national statistical service and a proposed piecewise “loss of functionality” model. To simulate the economic process in the time domain, an innovative business recovery function has been proposed. The proposed method has been applied to generate scenario earthquakes at the locations of bridges and business areas. The proposed selection methodology has been applied to reduce 8000 scenarios to a subset of 60. Subsequently, these scenario earthquakes have been used to calculate three system performance parameters: the risk in transportation networks, the risk in terms of business damage and the losses of gross regional product. A novel model for the business recovery process has been tested. The proposed model has been used to represent the business recovery process and simulate the effects of government aid allocated for reconstruction. The proposed method has efficiently modeled the seismic hazard using scenario earthquakes. The scenario earthquakes presented have been used to assess possible consequences of earthquakes in seismic-prone zones and to increase preparedness. Scenario earthquakes have been used to simulate the effects on the economy of the impacted area; a significant gross regional product reduction has been shown, up to 77% for an earthquake with a 0.0003 probability of occurrence. The results showed that the limited funds available after a disaster can be distributed in a more efficient way.
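    The Monte Carlo scenario-generation step described above draws magnitudes from a Gutenberg-Richter occurrence model before the attenuation relationship turns each event into a shaking field. A minimal sketch of the magnitude-sampling part is below, using a truncated Gutenberg-Richter distribution with inverse-transform sampling; the b-value and magnitude bounds are illustrative assumptions, and the thesis's source-model and Sabetta-Pugliese parameters are not reproduced.

```python
import numpy as np

def sample_magnitudes(n, b=1.0, m_min=4.5, m_max=7.0, seed=0):
    """Draw n magnitudes from a truncated Gutenberg-Richter distribution.

    The G-R law log10 N(>=m) = a - b*m implies an exponential magnitude
    distribution; truncating at m_max and inverting the CDF gives the
    sampler below. b, m_min, m_max are illustrative values only.
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 1.0, n)
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))   # normalizing constant of the truncated CDF
    return m_min - np.log(1.0 - u * c) / beta

# 8000 candidate scenario magnitudes, matching the size of the pre-selection set
mags = sample_magnitudes(8000)
print(f"min {mags.min():.2f}, median {np.median(mags):.2f}, max {mags.max():.2f}")
```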

    The diversity and distribution of multihost viruses in bumblebees

    The bumblebees (genus Bombus) are an ecologically and economically important group in decline. Their decline is driven by many factors, but parasites are believed to play a role. This thesis examines the factors that influence the diversity and distribution of multihost viruses in bumblebees using molecular and modelling techniques. In Chapter 2, I performed viral discovery to isolate new multihost viruses in bumblebees. I investigated factors that explain prevalence differences between different host species using co-phylogenetic models. I found that related hosts are infected with similar viral assemblages, related viruses infect similar host assemblages, and related hosts are on average infected with related viruses. Chapter 3 investigated the ecology of four of the novel viruses in greater detail. I applied a multivariate probit regression to investigate the abiotic factors that may drive infection. I found that precipitation may have a positive or negative effect depending on the virus. We also observe a strong non-random association between two of the viruses. The novel viruses have considerably more diversity than the previously known viruses. Chapter 4 investigated the effect of pesticides on viral and non-viral infection. I exposed Bombus terrestris colonies to field-realistic doses of the neonicotinoid pesticide clothianidin in the laboratory, to mimic the pulsed exposure of crop blooms. I found some evidence for a positive effect of uncertain size on the infection rate of pesticide-exposed colonies relative to non-exposed colonies, a potentially important result. Chapter 5 explored the evolution of avirulent multihost digital organisms across fluctuating fitness landscapes within a discrete sequence space. Consistent with theory, I found that evolution across a fluctuating discrete landscape leads to a faster rate of adaptation, greater diversity and greater specialism or generalism, depending on the correlation between the landscapes. A large range of factors are found to be important in the distribution of infection and diversity of viruses, and we find evidence for abiotic, biotic and anthropogenic factors all playing a role.

    A holistic evaluation concept for long-term structural health monitoring

    [no abstract]

    Economic Statistics [Економічна статистика]

    The tutorial contains lecture notes on the general theory of statistics, including the grouping of statistical data, absolute, relative and average values, statistical distributions, sample surveys, correlation and regression analysis, estimation, time series, and indexes and their use in economic and statistical research. Typical worked examples on the material are presented in the text, and practical problems are given at the end of each chapter to consolidate the theoretical notes. Designed for bachelor-level students in specialties 073 "Management" and 072 "Finance, Banking and Insurance".

    Modeling Censored Data Using Mixture Regression Models with an Application to Cattle Production Yields

    This research develops a mixture regression model that is shown to have advantages over the classical Tobit model in model fit and predictive tests when data are generated from a two-step process. Additionally, the model is shown to allow for flexibility in distributional assumptions while nesting the classic Tobit model. A simulated data set is utilized to assess the potential loss in efficiency from model misspecification, assuming the Tobit and a zero-inflated log-normal distribution, which is derived from the generalized mixture model. Results from the simulations center on the finding that the proposed zero-inflated log-normal model clearly outperforms the Tobit model when data are generated from a two-step process. When data are generated from a Tobit model, forecasts are more accurate when utilizing the Tobit model; however, the Tobit model is shown to be a special case of the generalized mixture model. The empirical model is then applied to evaluating mortality rates in commercial cattle feedlots, both independently and as part of a system including other performance and health factors. This particular application is hypothesized to be more appropriate for the proposed model due to the high degree of censoring and the skewed nature of mortality rates. The zero-inflated log-normal model clearly models and predicts with more accuracy than the Tobit model. Keywords: censoring, livestock production, Tobit, zero-inflated, Bayesian, livestock production/industries.
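    The zero-inflated log-normal alternative to the Tobit treats the data as a two-part mixture: a point mass at zero with probability p and a log-normal density for the strictly positive observations. A minimal sketch of that likelihood is below; the parameter names and toy data are assumptions for illustration, and the paper's Bayesian estimation and covariate structure are not reproduced.

```python
import numpy as np
from scipy import stats

def zi_lognormal_loglik(y, p, mu, sigma):
    """Log-likelihood of a zero-inflated log-normal model.

    y     : observations (e.g. mortality rates), many of them exactly zero
    p     : probability of a structural zero
    mu    : mean of log(y) for the positive component
    sigma : standard deviation of log(y) for the positive component
    """
    y = np.asarray(y, dtype=float)
    zero = y == 0
    ll_zero = np.sum(zero) * np.log(p)                     # contribution of the zeros
    ll_pos = np.sum(np.log1p(-p)                            # log(1 - p) for each positive obs
                    + stats.lognorm.logpdf(y[~zero], s=sigma, scale=np.exp(mu)))
    return ll_zero + ll_pos

# Toy example: ~30% structural zeros, log-normal positives
rng = np.random.default_rng(1)
y = np.where(rng.uniform(size=500) < 0.3, 0.0, rng.lognormal(-3.0, 0.5, 500))
print(zi_lognormal_loglik(y, p=0.3, mu=-3.0, sigma=0.5))
```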

    Effect of antiscalants during eutectic freeze crystallization of a reverse osmosis retentate

    Eutectic Freeze Crystallization (EFC) is a separation technique which involves simultaneous crystallization of water and solute under eutectic conditions. It can be applied to the treatment of various industrial aqueous streams containing dissolved organic and inorganic contaminants, such as reverse osmosis (RO) retentate brine streams. Since antiscalants are dosed into RO feed streams, they become concentrated in the retentate brine stream and could have an undesirable effect on the crystallization kinetics of both ice and salt in EFC. In this study, the impact of a phosphonate antiscalant on the kinetic processes of nucleation and growth in EFC was investigated. Firstly, the effect of the antiscalant on the thermodynamic phase equilibria of a binary Na2SO4 aqueous solution was experimentally determined. The effect of the antiscalant on the nucleation and growth rates of both ice and salt in a continuous EFC process was then established for antiscalant concentrations of 200, 350 and 500 mg/L. Product quality parameters such as the crystal size distribution (CSD), morphology and purity of the crystals were also measured, since they are directly affected by the kinetic rate processes investigated.
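    One common way to extract nucleation and growth rates from a measured CSD in a continuous crystallizer is the ideal MSMPR population balance, n(L) = n0·exp(-L/(G·tau)), so that ln n is linear in crystal size L. The sketch below illustrates that approach under the MSMPR assumption; the abstract does not state the kinetic analysis actually used in this study, and the numbers are synthetic.

```python
import numpy as np

def msmpr_kinetics(L, n, residence_time):
    """Estimate growth rate G and nucleation rate B0 from a measured CSD.

    Assumes the ideal MSMPR population balance n(L) = n0 * exp(-L / (G * tau)):
    a straight-line fit of ln(n) vs L gives slope -1/(G*tau) and intercept ln(n0),
    and the nucleation rate is B0 = n0 * G.

    L : crystal sizes (m); n : population density (# m-3 m-1); residence_time : s
    """
    slope, intercept = np.polyfit(L, np.log(n), 1)
    G = -1.0 / (slope * residence_time)   # growth rate, m s-1
    n0 = np.exp(intercept)                # population density of nuclei
    B0 = n0 * G                           # nucleation rate, # m-3 s-1
    return G, B0

# Illustrative synthetic CSD (not measurements from this study)
L = np.linspace(20e-6, 300e-6, 15)
tau = 1800.0                              # 30 min residence time
n = 1e12 * np.exp(-L / (2e-8 * tau))      # consistent with G = 20 nm/s
print(msmpr_kinetics(L, n, tau))
```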

    Coastal plants for biofuel production and coastal preservation

    Sustainable and renewable biofuels as well as coastal preservation are important to the State of Louisiana, which is losing its coastline at a rate of up to 100 square kilometers per year. This has important implications for other coastal areas worldwide. By managing water hyacinth in canals and lakes in coastal Louisiana, the biomass of this fast-growing aquatic plant can reduce coastal erosion by absorbing wave energy and remediate waste water through bioabsorption of contaminants, while also providing a source of biofuel. This research has shown that coastal vegetation can play a part in lessening the impact of storms by reducing wave energy by up to 14%. Floating booms can hold water hyacinth in place along coastal canals so that it can be contained for growth and harvesting while providing this protection. Under average growing conditions in Louisiana, water hyacinth produced 2.4 to 2.6 metric tons of hydrated biomass per hectare per day. In addition, this research found that the plant has a fermentable glucose and xylose content in excess of 48% by dry weight, which is suitable for bioethanol production. Its rapid growth rate combined with its fermentable sugar concentration makes water hyacinth a viable candidate for use as a source of biofuel and for coastal preservation. Engineered barges fitted with loading mechanisms and harvesting systems were designed to contain and harvest water hyacinth in Louisiana’s coastal canals and to produce biofuel from the harvested material. Harvesting and growth-site accessibility, design for transportation, and proximity to coastal ethanol production facilities were integral to the design. Carbon-neutral fuels are an important consideration related to environmental sustainability concerns. As the State of Louisiana is losing coastal wetlands, the combination of erosion control with biofuel production will be a great benefit to the state and to other coastal areas of the world.
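    As a rough sense of why the reported biomass yield and sugar content matter, theoretical ethanol yield can be estimated from the stoichiometric factor of 0.511 kg ethanol per kg fermentable sugar. The dry-matter fraction and process efficiency in the sketch below are assumed values for illustration only, not measurements from this work; only the wet-biomass yield and the 48% sugar figure come from the abstract.

```python
def ethanol_yield_per_hectare_day(wet_tons=2.5, dry_fraction=0.06,
                                  sugar_fraction=0.48, process_eff=0.85):
    """Rough theoretical ethanol yield from harvested water hyacinth.

    wet_tons       : hydrated biomass, t ha-1 day-1 (2.4-2.6 reported)
    dry_fraction   : assumed dry-matter share of fresh biomass (illustrative)
    sugar_fraction : fermentable glucose + xylose per unit dry weight (>= 0.48 reported)
    process_eff    : assumed fraction of the stoichiometric maximum achieved
    Stoichiometric maximum: 0.511 kg ethanol per kg fermentable sugar.
    """
    dry_tons = wet_tons * dry_fraction            # t dry matter per ha per day
    sugar_tons = dry_tons * sugar_fraction        # t fermentable sugar
    ethanol_tons = sugar_tons * 0.511 * process_eff
    return ethanol_tons * 1000.0 / 0.789          # litres (ethanol density ~0.789 kg/L)

print(f"~{ethanol_yield_per_hectare_day():.0f} L ethanol per hectare per day")
```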