
    Philosophy of Technology Assumptions in Educational Technology Leadership

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how those assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. The subjects were technology directors and instructional technology specialists from school districts, and data collection involved interviews and a written questionnaire. Three broad philosophy of technology views were widely held by participants: an instrumental view of technology, technological optimism, and a technological determinist perspective that sees technological change as inevitable. Technology leaders were guided by two main approaches to technology decision making that stood in cognitive dissonance with each other, represented by the categories Educational goals and curriculum should drive technology and Keep up with technology (or be left behind). The researcher concluded that, as leaders deal with their perceived experience of the inevitability of technological change and their concern for preparing students for a technological future, the core category Keep up with technology (or be left behind) is given the greater weight in technology decision making. A risk is that this can, on occasion, mean adopting technology for the sake of technology, without aligning the implementation with educational goals.

    Examining Philosophy of Technology Using Grounded Theory Methods

    A qualitative study was conducted to examine the philosophy of technology of K-12 technology leaders and to explore the influence of their thinking on technology decision making. The research design aligned with Corbin and Strauss's grounded theory methods, and I proceeded from a research paradigm of critical realism. The subjects were school technology directors and instructional technology specialists, and data collection consisted of interviews and a written questionnaire. Data analysis employed grounded theory methods including memo writing, open and axial coding, constant comparison, purposive and theoretical sampling, and theoretical saturation of categories. Three broad philosophy of technology views were widely held by participants: an instrumental view of technology, technological optimism, and a technological determinist perspective that saw technological change as inevitable. Technology leaders were guided by two main approaches to technology decision making, represented by the categories Educational goals and curriculum should drive technology and Keep up with technology (or be left behind). The core category and central phenomenon that emerged was that technology leaders approached technology leadership by placing greater emphasis on keeping up with technology, influenced by an ideological orientation to technological change and a concern for preparing students for a technological future.

    Questioning Technological Determinism through Empirical Research

    Using qualitative methods, the author sought to better understand how philosophical assumptions about technology affect the thinking, and influence the decision making, of educational technology leaders in their professional practice. One research question focused on whether assumptions of technological determinism were present in leaders' thinking and influenced the decisions they make. The core category that emerged from data analysis, Keep up with technology, was interpreted as a manifestation of the technological imperative, an assumption associated with the philosophical perspective of technological determinism. The article presents a literature review and critique of philosophical issues surrounding technological determinism. Data analysis led to the conclusion that technology leaders working in K-12 education place weighted priority on the technological imperative, and that there is philosophical tension between Keep up with technology and a concurrently held perspective based on the logic of the instrumental view of technology. The findings suggest that different accounts of technological determinism, including Bimber's three accounts (normative, nomological, and unintended consequences), figure significantly in the thinking of participants. School technology leaders placed priority on embracing technological change, sometimes adopting technology for its own sake.

    The fragment effect: an innovative new approach to apatite (U-Th)/He thermochronology

    The uniquely low temperature sensitivity of the apatite (U-Th)/He system makes it an invaluable tool for studying shallow crustal processes that are not accessible through other techniques. Major advancements in both the theoretical and practical aspects of the technique have taken place over the past decade or so; however, routine application of the technique is often held back by the perceived problem of single grain age 'over dispersion', particularly when applied to old, slowly cooled geological settings. There persists a misconception that age dispersion is indicative of a problem with the apatite (U-Th)/He system. A significant component of single grain age dispersion is inherent to the natural system, and is therefore beneficial to reconstructing robust thermal histories. Variations in crystal grain size, accumulated amounts of radiation damage, and changes to the helium concentration gradient within a grain due to fragmentation all contribute positively to age dispersion. Other, imposed factors such as crystal zoning and 4He implantation (which are undesirable) can also contribute to dispersion; however, in the vast majority of cases their effects are negligible and merely add noise to the inherent natural dispersion signal. The Ballachulish Igneous Complex (BIC) in western Scotland has been used as a case study to demonstrate the range of age dispersion that should be expected when analysing large numbers of single grain aliquots per sample. Where 20+ grains are analysed, total dispersion will often be well in excess of 100% for old, slowly cooled samples; indeed, dispersion in excess of 200% is possible. Such dispersion will often be a consequence of outlying or apparently anomalous ages; however, such ages should not be discounted unless there is sound analytical justification for doing so. Apparently anomalous ages will often be 'swallowed up' by the data if more, or even differently sized or shaped, grains are analysed. Due to the competing effects of the three main causes of inherent natural dispersion, it should not be expected that large, well-dispersed data sets will show any significant correlation between single grain age and either grain size or eU concentration. However, a lack of correlation does not indicate poor-quality data. Brown, Beucher and co-workers (Brown et al., 2013; Beucher et al., 2013) proposed a new modelling approach to account for the common occurrence of broken crystals in apatite separates, demonstrating that the additional inherent natural age dispersion arising from analysing fragments can be exploited when reconstructing thermal histories. A new inversion technique, HelFRAG, was developed, based on a finite-length cylinder diffusion model. The model is computationally demanding, so sampling-based inversion methods requiring many forward-model simulations become less practical. Consequently, an approximation of the finite cylinder diffusion model has been incorporated into the modelling software QTQt (Gallagher, 2012). Here, the approximation, QFrag, is demonstrated to return results comparable to the full HelFRAG inversion technique when given the same synthetic data set, enabling more routine application of the fragment model. Both QFrag and HelFRAG modelling techniques have been used to model the new BIC AHe dataset. The purpose is twofold: to demonstrate the importance of the fragment model with a real dataset, and to provide a new thermochronological interpretation for the BIC.
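    As a rough illustration of the dispersion figures quoted above, consider the following minimal Python sketch. The abstract does not define its dispersion metric, so the sketch shows two common conventions (standard deviation and total range, each relative to the mean age); the single-grain ages are invented for illustration and are not data from the BIC study.

        import numpy as np

        # Hypothetical single-grain AHe ages (Ma) for one old, slowly
        # cooled sample; illustrative values only, not BIC data.
        ages = np.array([120.0, 145.0, 180.0, 210.0, 260.0, 310.0, 395.0])

        mean_age = ages.mean()

        # Convention 1: standard deviation of single-grain ages as a
        # percentage of the mean age (a coefficient of variation).
        cv_dispersion = 100.0 * ages.std(ddof=1) / mean_age

        # Convention 2: total spread (max - min) relative to the mean,
        # the sense in which dispersion can exceed 100-200%.
        range_dispersion = 100.0 * (ages.max() - ages.min()) / mean_age

        print(f"mean age         : {mean_age:.0f} Ma")
        print(f"CV dispersion    : {cv_dispersion:.0f}%")
        print(f"range dispersion : {range_dispersion:.0f}%")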
    When using this dataset, modelling samples individually shows only subtle differences (if any) between modelling broken grains correctly as fragments and modelling them incorrectly as whole grains. A far greater difference in model output is seen when only 3-6 grains are modelled rather than 20+, irrespective of whether fragments are treated correctly. When multiple samples are modelled together in a vertical profile, the fragment effect becomes much more important: a very different thermal history interpretation arises when broken grains are modelled incorrectly as whole grains rather than as fragments. The new thermal history interpretation for the BIC involves a four-stage cooling history from the time of intrusion (c. 424 Ma). Very rapid cooling and uplift occurred immediately after intrusion, over the first c. 20 Myr of the history (Phase 1), bringing the complex from c. 10 km depth to within 2-3 km of the surface. There followed much slower continued uplift between c. 404 Ma and c. 300 Ma, resulting in up to 1 km of denudation (Phase 2). Over the next c. 150 Myr only a small amount of uplift occurred; however, the geothermal gradient increased towards the end of this period, suggesting crustal thinning (Phase 3). A final, rapid period of cooling and uplift occurred at c. 140 Ma, bringing the top of the profile very near to the surface (Phase 4). No significant denudation has occurred since the end of this rapid uplift phase (tens to hundreds of metres at most). The first two phases of cooling are interpreted as the final stages of the Caledonian orogeny, with erosion-driven isostatic uplift causing continued denudation after the cessation of collisional tectonics. The end of Phase 3 and the subsequent rapid uplift (Phase 4) are interpreted as the beginnings of crustal thinning and continental rifting, which ultimately led to the opening of the North Atlantic Ocean.
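    For readers unfamiliar with the chronometer underlying these thermal histories, the following hedged sketch solves the standard (U-Th)/He age equation numerically. The decay constants are standard values; the parent and daughter amounts are invented for illustration and do not come from this study.

        import math

        # Decay constants (yr^-1) for 238U, 235U, 232Th (standard values).
        L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11

        def he_produced(t, u238, th232):
            """Radiogenic 4He produced in t years (same units as parents)."""
            u235 = u238 / 137.88  # natural 238U/235U abundance ratio
            return (8 * u238 * (math.exp(L238 * t) - 1)
                    + 7 * u235 * (math.exp(L235 * t) - 1)
                    + 6 * th232 * (math.exp(L232 * t) - 1))

        def he_age(he, u238, th232, tol=1.0):
            """Invert the age equation for t (years) by bisection."""
            lo, hi = 0.0, 4.6e9
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if he_produced(mid, u238, th232) < he:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        # Illustrative amounts (mol); not measured values.
        t = he_age(he=2.0e-13, u238=1.0e-12, th232=5.0e-13)
        print(f"(U-Th)/He age ~ {t / 1e6:.0f} Ma")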

    Moles

    The South Carolina Department of Natural Resources published guides to many threatened animals living in the state. This guide gives information about moles, including description, status, habitat, conservation challenges and recommendations, and measures of success.

    Antenna design for microwave hepatic ablation using an axisymmetric electromagnetic model

    BACKGROUND: An axisymmetric finite element method (FEM) model was employed to demonstrate important techniques used in the design of antennas for hepatic microwave ablation (MWA). To effectively treat deep-seated hepatic tumors, these antennas should produce a highly localized specific absorption rate (SAR) pattern and be efficient radiators at approved generator frequencies. METHODS AND RESULTS: As an example, a double-slot choked antenna for hepatic MWA was designed and implemented using FEMLAB™ 3.0. DISCUSSION: This paper emphasizes the importance of factors that can affect simulation accuracy, including boundary conditions, the dielectric properties of liver tissue, and mesh resolution.
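    Because the design goal above is a highly localized SAR pattern, a short hedged sketch of how SAR is conventionally evaluated from a solved electric field may be useful. The formula SAR = sigma * |E|^2 / (2 * rho) assumes a time-harmonic field given as a peak amplitude; the material constants are approximate literature values for liver near 2.45 GHz, and the small field grid is invented, not output from the paper's FEMLAB model.

        import numpy as np

        sigma_liver = 1.69   # electrical conductivity of liver, ~2.45 GHz (S/m)
        rho_liver = 1060.0   # density of liver tissue (kg/m^3)

        # Invented peak |E| magnitudes (V/m) on a tiny axisymmetric (r, z)
        # grid, standing in for values exported from an FEM solution.
        E_peak = np.array([[8.0e3, 4.0e3],
                           [3.0e3, 1.5e3]])

        # SAR (W/kg) for a time-harmonic field expressed as peak amplitude.
        sar = sigma_liver * E_peak**2 / (2.0 * rho_liver)
        print(sar)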

    Harold M. Frost, William F. Neuman Awardee 2001. J Musculoskel Neuron Interact 2001; 2(2):117-119

    Tribute to Harold M. Frost, honorary president of ISMNI, who received the William F. Neuman Award from the American Society for Bone and Mineral Research in October 2001.

    The minimal preprocessing pipelines for the Human Connectome Project

    The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low-level tasks, including spatial artifact and distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high-quality data offered by the HCP. The final standard space makes use of the recently introduced CIFTI file format and the associated grayordinate spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP's acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements to the pipelines.
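    For investigators who plan to work with the grayordinate outputs described above, the following is a minimal sketch of reading a CIFTI-2 dense time series in Python with nibabel. It is not part of the HCP pipelines themselves, and the filename is hypothetical.

        import nibabel as nib  # assumes nibabel with CIFTI-2 support (>= 2.4)

        # Hypothetical dense time series file; HCP functional outputs use
        # the CIFTI-2 .dtseries.nii format over ~91k grayordinates.
        img = nib.load("sub-01_task-rest_Atlas.dtseries.nii")

        data = img.get_fdata()               # (timepoints, grayordinates)
        series_ax = img.header.get_axis(0)   # time axis
        brain_ax = img.header.get_axis(1)    # grayordinates (BrainModelAxis)

        print(data.shape)
        print("TR (s):", series_ax.step)
        # List the anatomical structures the grayordinates cover
        # (cortical surfaces plus subcortical volume structures).
        for name, _slc, _bm in brain_ax.iter_structures():
            print(name)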