
    Language games

    2013 Spring. Includes bibliographical references. The complex nature of language has interested me for as long as I can remember: how we experience it and how it affects our lives in both personal and public ways. This fascination was the spark for a thesis body of work that considers Ludwig Wittgenstein's "language game" in the context of contemporary discourse. In Philosophical Investigations, Wittgenstein coins the term, noting that it is "meant to bring into prominence the fact that the speaking of language is part of an activity, or of a form of life." This idea, that we activate language as we speak it, is the cornerstone of my personal exploration of the written and spoken word as a medium and the foundation of this thesis body of work.

    The Continuing Validity of the Electoral College: A Quantitative Confirmation

    In recent years, efforts to undermine or discard the Electoral College have gained substantial momentum, leading to a need for objective answers about how the system affects presidential elections. Using accessible quantitative techniques, this article answers three essential questions about the purposes and effects of the Electoral College using a unique approach that measures the electoral system's success and potential in terms that correspond to its raison d'être, parameterizing the problem in terms of satisfaction and population instead of voters. This article dispenses with arcane, voter-based statistical models. It recognizes the Electoral College as a discrete mathematical system and applies more appropriate descriptive and predictive techniques to election data. The result is that the system's effect on elections is quantified, related to historical data, and reliably forecast for the foreseeable future. This is the type of substantive analysis long needed to confirm or disprove the system's merits. Part I first examines records of the Constitutional Convention to determine the Framers' purpose in choosing the algorithm they did. Concluding that their purpose was to provide a president who would be representative of people across the country, the article proceeds to examine whether the system has achieved their goal. Part II describes the discrepancies there have been between the popular and electoral vote in order to fairly characterize the basis for controversy. It then proposes and applies a framework for assessing whether the Electoral College results in an effective expression of the will and interests of the People that is consistent with legitimate governance. Part III concludes with a mathematical analysis that proves that there are specific, calculable limitations on the size and distribution of a prevailing minority and illustrates that there is a continuing likelihood that winning candidates will be selected by states comprising a majority of the population.
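The minority-winner limitation analyzed in Part III can be illustrated with a toy computation. The state populations and electoral vote counts below are made up for illustration only, and the greedy coalition builder is merely a heuristic, not the exact bound the article derives:

```python
# Hypothetical illustration: smallest population share that can still win
# an electoral majority, using made-up state data (not real census figures).

# (state, population, electoral_votes) -- toy numbers for illustration only
states = [
    ("A", 10_000_000, 12),
    ("B", 6_000_000, 8),
    ("C", 3_000_000, 5),
    ("D", 2_000_000, 4),
    ("E", 1_000_000, 3),
]

total_pop = sum(p for _, p, _ in states)
total_ev = sum(ev for _, _, ev in states)
majority_ev = total_ev // 2 + 1

# Greedy heuristic: favor states with the most electoral votes per resident.
by_leverage = sorted(states, key=lambda s: s[2] / s[1], reverse=True)

won_pop = won_ev = 0
coalition = []
for name, pop, ev in by_leverage:
    if won_ev >= majority_ev:
        break
    # Assume the candidate carries each state with the barest majority.
    won_pop += pop // 2 + 1
    won_ev += ev
    coalition.append(name)

print(coalition, won_ev, round(won_pop / total_pop, 3))
```

With these toy numbers a coalition of the four smallest states wins 20 of 32 electoral votes while representing barely a quarter of the population, which is the kind of calculable limitation the article quantifies.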

    Characterization, Depolymerization And Fractionation Of Alkali Lignin

    Lignin may serve as a potential source of renewable chemicals and as a possible wealth of materials for replacing petroleum-based fuels and petrochemicals. Lignin is a plant component that constitutes the second most common natural polymer on earth, behind only cellulose, and the most common natural polymer with an aromatic network. Technical lignins (isolated from chemical processing of raw lignin) are produced as waste in the papermaking and biorefinery industries; U.S. waste lignin is estimated at about 24 million tons yearly, more than the estimated 10.5 million tons of plastics discarded annually. The exact structures of natural lignin and technical lignins are still not known, so research continues on characterization of the many forms of technical lignins, which can differ substantially. In this work, we developed a gel permeation chromatography (GPC) method by HPLC with a variable-wavelength UV-Vis detector; this was applied to raw lignin and technical lignins in order to establish a feasible method of determining molecular weights for a polymer that is insoluble in a pure aqueous or a pure organic solvent. Characterization of lignin was continued with a modified Folin-Ciocalteu (FC) method for quantification of phenolic hydroxyl groups in lignin model compounds and technical lignins. Additionally, the four factors of the experiment were statistically evaluated using a 2⁴ full factorial design of experiment with ANOVA, giving information on the main influences and interactions of the method. Fractionation of lignins was carried out by preparative size exclusion chromatography. Further analysis of the molecular weight distribution in the individual fractions was performed by electrospray ionization high-resolution time-of-flight mass spectrometry (ESI HR TOF-MS), thermal carbon analysis (TCA), and thermal desorption-pyrolysis-gas chromatography-mass spectrometry (TD-Py-GC-MS). Additional information about phenolic and aliphatic hydroxyl groups was obtained through ³¹P NMR analysis of phosphitylated standards and lignin samples. Oxidative depolymerization of alkali lignin was accomplished by adding hydrogen peroxide to a water matrix at various percentages (v/v), with variation of added methanol as a co-solvent. Lignin samples with initial pH values of 3, 7, and 11 were evaluated for wt% of solubilized (depolymerized) material under two sets of filtration and analyzed for pH change as well. Depolymerization was also done through subcritical water (SW) treatment of alkali lignin. TCA and TD-Py-GC-MS analyses of 300 °C SW samples were performed as described above, with a mass range for MS analysis of m/z 10-550; the lower limit of this range allowed monitoring of noncondensable gases (H2O, N2, O2, CO2). In addition, a novel method of mass balance was implemented through normalization of TCA and TD-Py-GC-MS data. SW-treated samples were compared to untreated lignin profiles to determine the predominant species yielded at each temperature fraction. The process of condensation with concomitant gas formation through the temperature fractions was monitored through elemental analysis as C/H and C/O ratios. In summary, GPC method development allowed a determination of THF:water ratios which in turn led to complete solubilization in extraction solvents. FC method development resulted in a quantitative phenolic OH count per nmol carbon in whole technical lignins and solubilized alkali lignin samples. The fractionation methodology was found to effectively limit MW ranges within individual fractions, although not to the extent expected; both high-MW and low-MW compounds outside the expected ranges were found in every fraction. Oxidation of lignin by hydrogen peroxide did show depolymerization of samples, but this may have been due primarily to thermal effects. Peroxide reactions resulted in excessive ring-opening, which in turn allowed a large amount of condensation, an actual increase in MW, and a loss of solubilized material due to filtration of condensed material. Additionally, the lignin in basic and acidic solutions showed a very noticeable buffering effect. Subcritical water treatment of lignin samples resulted in a good mass balance for depolymerized materials in the liquid fraction; the extent of degradation was found to be more extensive than initially thought when examining the GPC profiles.
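As a rough sketch of how a 2⁴ full factorial evaluation yields main effects, the following computes the effect of each factor as the mean response at its high level minus the mean at its low level. The factor names and response values are hypothetical placeholders, not data from this work:

```python
# Sketch of main-effect estimation for a 2^4 full factorial design.
# Factor names and responses below are illustrative assumptions.
from itertools import product

factors = ["reagent_vol", "incubation_time", "temperature", "pH"]

# All 16 low (-1) / high (+1) combinations, with one made-up response each.
runs = list(product([-1, 1], repeat=4))
responses = [
    5.1, 5.4, 6.0, 6.2, 5.2, 5.5, 6.1, 6.4,
    5.3, 5.6, 6.3, 6.5, 5.4, 5.8, 6.4, 6.7,
]

def main_effect(i):
    """Average response at the high level minus average at the low level."""
    hi = [y for run, y in zip(runs, responses) if run[i] == 1]
    lo = [y for run, y in zip(runs, responses) if run[i] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: round(main_effect(i), 3) for i, name in enumerate(factors)}
print(effects)
```

An ANOVA would then partition the run-to-run variance across these main effects and their interactions to judge which are statistically significant.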

    Will Teachers Implement Instruction Aligned to the Common Core State Standards?: Utilizing a Predictive Model

    The purpose of this study was to investigate the use of the Theory of Planned Behavior (TPB) as a predictive model for secondary (i.e., grades six through twelve) teachers' intent to implement instruction aligned to the Common Core State Standards. Two differing TPB models were investigated using regression analysis. The first model included the TPB elements of attitude, subjective norms, and perceived behavioral control (i.e., self-efficacy), while the second model added two further measures: perceived knowledge and accurate knowledge. Because no measure of secondary teachers' sense of efficacy for literacy instruction existed to capture that construct in the TPB theoretical model, a scale was created and an initial validation study was conducted on it. Overall, subjective norms were a significant predictor of secondary teachers' intent to implement literacy instruction across both TPB models. Sense of efficacy was a significant contributor in the original model, yet it did not demonstrate significance in the second model once knowledge was entered. Perceived knowledge was a significant predictor in the second model.
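A minimal sketch of the two-model regression comparison, using synthetic data in which subjective norms dominate by construction (mirroring the reported result). The variable names follow the abstract, but the data and the plain least-squares fit are illustrative assumptions, not the study's actual analysis:

```python
# Sketch: nested regression models for intent, as in the TPB comparison.
# All data here is synthetic; coefficients are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200

attitude = rng.normal(size=n)
subjective_norms = rng.normal(size=n)
self_efficacy = rng.normal(size=n)
perceived_knowledge = rng.normal(size=n)
accurate_knowledge = rng.normal(size=n)

# Simulated intent: subjective norms carry the largest weight.
intent = (0.6 * subjective_norms + 0.3 * self_efficacy
          + 0.2 * perceived_knowledge + rng.normal(scale=0.5, size=n))

def ols_r2(predictors, y):
    """Fit OLS with an intercept via least squares and return R^2."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_model1 = ols_r2([attitude, subjective_norms, self_efficacy], intent)
r2_model2 = ols_r2([attitude, subjective_norms, self_efficacy,
                    perceived_knowledge, accurate_knowledge], intent)
print(round(r2_model1, 3), round(r2_model2, 3))
```

Comparing the two R² values shows how much explanatory power the knowledge measures add once the original TPB predictors are in the model.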

    Evaluation of an Elementary School Wellness Concept in Rural East Tennessee.

    Parents of elementary school children in the small, rural area of Unicoi County, TN were surveyed to determine their attitudes toward health, nutrition, and Unicoi County Schools' Wellness Policy. Elementary school classrooms were randomly chosen to receive surveys for the children's parents to return by mail. Data were compiled and analyzed using SPSS software. Over 99% of parents stated that nutrition education in schools was important, and 96% stated schools played an important role in their child's health. The assessment provided meaningful data and laid groundwork for future nutrition education programs. The research showed that rural, lower-income parents are supportive of positive nutritional changes in schools.

    How Immigration Policy Affects Migratory Flows and Immigrant Experiences: A Comparative Analysis of Policy Impacts on Northern Triangle and Venezuelan Immigrants in the United States

    In the past two decades, the US has experienced a large influx of immigrants from Venezuela and the Northern Triangle countries of Guatemala, Honduras, and El Salvador. Due to these unprecedented increases, there have been numerous notable shifts in immigration control policy between the presidencies of Barack Obama and Donald Trump. Generally, policies under Obama were favorable and reflected pro-immigrant rhetoric, while Trump took a drastic turn toward restrictionist, unfavorable policies. This study aims to examine the impact of immigration policy on migratory flows and the immigrant experience in the US. Using both quantitative and qualitative methods, I examine data on the migratory inflows and outflows of these groups, as well as oral history interviews and newspaper discourse, to determine the effects of immigration policy on migratory flows and the socioeconomic integration of these immigrant groups in the US. The results indicate that favorable policies generally coincided with increases in migratory flows while unfavorable policies coincided with decreases. With regard to the immigrant experience, favorable policies positively impacted the socioeconomic integration process and unfavorable policies negatively impacted it. This study may provide valuable insight for makers of immigration control policy and allow for a deeper understanding of the effects of immigration policies.

    Low temperature survival of 'Redhaven' peach floral buds on selected rootstocks

    A Thesis Presented to the Faculty of the Graduate School at the University of Missouri in Partial Fulfillment of the Requirements for the Degree Master of Science. Thesis supervisor: Dr. Michele Warmund. The relative cold tolerance of 'Redhaven' peach floral buds grafted onto various rootstocks was evaluated at selected dates from November 2011 to March 2013. Budwood was collected from trees in coordinated rootstock trials at New Franklin, MO and Clemson, SC for artificial freezing tests in late fall, mid-winter, and early spring. Samples were cooled at 3 °C/h and thawed, and 'Redhaven' floral bud T50 values for each rootstock were calculated from the number of dead buds per test temperature. Although winter temperatures were unseasonably warm during this study, 'Redhaven' floral buds varied in cold tolerance among the rootstocks grown in Missouri in February 2012 and March 2013. In February 2012, 'Redhaven' floral buds on trees with KV010-127 and HBOK 32 rootstocks were the most cold tolerant, but in March 2013, those on Guardian rootstock were the hardiest. For South Carolina, 'Redhaven' floral buds on trees with Lovell and Viking rootstocks were the hardiest in January 2012, the only sampling date on which T50 values differed among rootstocks. When data were pooled from both locations, mean 'Redhaven' floral bud T50 values were always lower in Missouri than in South Carolina at similar collection periods. Also, buds from trees on Lovell, Guardian, Bright's Hybrid #5, and HBOK 32 rootstocks were hardier than those on Controller 5 and Mirobac rootstocks. Includes bibliographical references (pages 36-42).
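The T50 calculation described above (the temperature killing 50% of floral buds) can be sketched as a simple interpolation between the two test temperatures that bracket half mortality. The mortality counts below are hypothetical, not data from the trials:

```python
# Sketch of a T50 calculation: the temperature at which 50% of floral buds
# are killed, interpolated from dead-bud counts at each test temperature.
# The observations below are made-up illustrative values.

# (test temperature in degrees C, dead buds out of 10 sampled)
observations = [(-10, 0), (-15, 2), (-20, 4), (-25, 8), (-30, 10)]

def t50(obs, n_buds=10):
    """Linearly interpolate the temperature giving 50% bud kill."""
    for (t_hi, d_hi), (t_lo, d_lo) in zip(obs, obs[1:]):
        f_hi, f_lo = d_hi / n_buds, d_lo / n_buds
        if f_hi <= 0.5 <= f_lo:
            # Interpolate between the bracketing temperatures.
            return t_hi + (0.5 - f_hi) * (t_lo - t_hi) / (f_lo - f_hi)
    raise ValueError("50% mortality not bracketed by the data")

print(t50(observations))
```

A lower (more negative) T50 indicates hardier buds, which is why the pooled Missouri T50 values being lower than South Carolina's signals greater cold tolerance.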

    Research, integrity and ethics: to be addressed distinctly but jointly

    Title from the title screen (viewed April 23, 2009). Includes bibliography and webography.

    Analysis of Artifacts Inherent to Real-Time Radar Target Emulation

    Executing high-fidelity tests of radar hardware requires real-time, fixed-latency target emulation. Because fundamental radar measurements occur in the time domain, real-time fixed-latency target emulation is essential to producing an accurate representation of a radar environment. Radar test equipment is further constrained by the application-specific minimum delay to a target of interest, a parameter that limits the maximum latency through the target emulator algorithm. These time constraints on radar target emulation result in imperfect DSP algorithms that generate spectral artifacts. Knowledge of the behavior and predictability of these spectral artifacts is the key to identifying whether a particular suite of hardware is sufficient to execute tests for a particular radar design. This work presents an analysis of the design considerations required for development of a digital radar target emulator. Further considerations include how the spectral artifacts inherent to the algorithms change with respect to the radar environment, and an analysis of how effectively various DSP algorithms can be used to produce an accurate representation of simple target scenarios. This work presents a model representative of natural target motion, a model representative of the side effects of digital target emulation, and finally a true HDL simulation of a target.
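A minimal sketch of fixed-latency target emulation as a digital delay line illustrates one artifact source of the kind this work analyzes: quantizing the round-trip delay to the sample clock introduces a range error. All parameters below are illustrative assumptions, not values from the work:

```python
# Minimal sketch: digital target emulation as a fixed delay line, with the
# round-trip delay quantized to the sample clock. The quantization residue
# is one source of emulation artifacts. Parameters are illustrative.

fs = 100e6            # sample rate (Hz), assumed
c = 3e8               # propagation speed (m/s)
target_range = 1501   # meters, assumed

round_trip_s = 2 * target_range / c
delay_samples_exact = round_trip_s * fs
delay_samples = round(delay_samples_exact)  # quantized to the sample clock

# Residual timing error introduced by quantization -> range error artifact.
range_error_m = (delay_samples - delay_samples_exact) / fs * c / 2

def emulate(tx, delay):
    """Return tx delayed by an integer number of samples (zero-padded)."""
    return [0.0] * delay + list(tx)

echo = emulate([1.0, 0.5, 0.25], delay_samples)
print(delay_samples, round(range_error_m, 3))
```

Real emulators use fractional-delay filters to reduce this residue, at the cost of added latency and filter-dependent spectral artifacts, which is exactly the trade space the analysis above addresses.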