
    Argumentation in school science: Breaking the tradition of authoritative exposition through a pedagogy that promotes discussion and reasoning

    The value of argumentation in science education has become internationally recognised and has been the subject of many research studies in recent years. Successful introduction of argumentation activities in learning contexts involves extending teaching goals beyond the understanding of facts and concepts, to include an emphasis on cognitive and metacognitive processes, epistemic criteria and reasoning. The authors focus on the difficulties inherent in shifting a tradition of teaching from one dominated by authoritative exposition to one that is more dialogic, involving small-group discussion based on tasks that stimulate argumentation. The paper builds on previous research on enhancing the quality of argument in school science to focus on how argumentation activities have been designed, with appropriate strategies, resources and modelling, for pedagogical purposes. The paper analyses design frameworks, their contexts and lesson plans, to evaluate their potential for enhancing reasoning through foregrounding the processes of argumentation. Examples of classroom dialogue where teachers adopt the frameworks/plans are analysed to show how argumentation processes are scaffolded. The analysis shows that several layers of interpretation are needed and that these layers need to be aligned for successful implementation. The analysis serves to highlight the potential and limitations of the design frameworks.

    Overfeeding, Autonomic Regulation and Metabolic Consequences

    The autonomic nervous system plays an important role in the regulation of body processes in health and disease. Overfeeding and obesity (a disproportional increase of the fat mass of the body) are often accompanied by alterations in both sympathetic and parasympathetic autonomic functions. The overfeeding-induced changes in autonomic outflow occur with typical symptoms such as adiposity and hyperinsulinemia. There might be a causal relationship between autonomic disturbances and the consequences of overfeeding and obesity. Therefore studies were designed to investigate autonomic functioning in experimentally and genetically hyperphagic rats. Special emphasis was given to the processes that are involved in the regulation of peripheral energy substrate homeostasis. The data revealed that overfeeding is accompanied by increased parasympathetic outflow. Typical indices of vagal activity (such as the cephalic insulin release during food ingestion) were increased in all our rat models for hyperphagia. Overfeeding was also accompanied by increased sympathetic tone, reflected by enhanced baseline plasma norepinephrine (NE) levels in both VMH-lesioned animals and rats rendered obese by hyperalimentation. Plasma levels of NE during exercise were, however, reduced in these two groups of animals. This diminished increase in the exercise-induced NE outflow could be normalized by prior food deprivation. It was concluded from these experiments that overfeeding is associated with increased parasympathetic and sympathetic tone. In models for hyperphagia that display a continuously elevated nutrient intake such as the VMH-lesioned and the overfed rat, this increased sympathetic tone was accompanied by a diminished NE response to exercise. This attenuated outflow of NE was directly related to the size of the fat reserves, indicating that the feedback mechanism from the periphery to the central nervous system is altered in the overfed state.

    The Hubble Constant

    I review the current state of determinations of the Hubble constant, which gives the length scale of the Universe by relating the expansion velocity of objects to their distance. There are two broad categories of measurements. The first uses individual astrophysical objects which have some property that allows their intrinsic luminosity or size to be determined, or allows the determination of their distance by geometric means. The second category comprises the use of the all-sky cosmic microwave background, or correlations between large samples of galaxies, to determine information about the geometry of the Universe and hence the Hubble constant, typically in combination with other cosmological parameters. Many, but not all, object-based measurements give H_0 values of around 72-74 km/s/Mpc, with typical errors of 2-3 km/s/Mpc. This is in mild discrepancy with CMB-based measurements, in particular those from the Planck satellite, which give values of 67-68 km/s/Mpc and typical errors of 1-2 km/s/Mpc. The size of the remaining systematics indicates that accuracy rather than precision is the remaining problem in a good determination of the Hubble constant. Whether a discrepancy exists, and whether new physics is needed to resolve it, depends on details of the systematics of the object-based methods, and also on the assumptions about other cosmological parameters and which datasets are combined in the case of the all-sky methods. Comment: Extensively revised and updated since the 2007 version; accepted by Living Reviews in Relativity as a major (2014) update of LRR 10, 4, 200
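The tension described in this abstract can be made concrete with the Hubble law v = H_0 d. The sketch below is illustrative and not from the review itself; the recession velocity is hypothetical, and the two H_0 values are the representative figures quoted above.

```python
# Illustrative sketch (not from the review): the Hubble law v = H0 * d
# relates a galaxy's recession velocity to its inferred distance.

def hubble_distance_mpc(velocity_km_s: float, h0_km_s_mpc: float) -> float:
    """Distance in Mpc inferred from recession velocity via v = H0 * d."""
    return velocity_km_s / h0_km_s_mpc

v = 7000.0  # km/s, a hypothetical recession velocity
d_local = hubble_distance_mpc(v, 73.0)  # object-based H0 quoted in the abstract
d_cmb = hubble_distance_mpc(v, 67.5)    # Planck-like H0 quoted in the abstract
print(f"{d_local:.1f} Mpc (H0=73) vs {d_cmb:.1f} Mpc (H0=67.5)")
```

The roughly 8% spread between the two H_0 ranges translates directly into an 8% spread in inferred distances, which is why the abstract stresses accuracy of the object-based calibrations over raw precision.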

    The mutation of the library at the start of the 21st century (La mutación de la biblioteca en los inicios del siglo XXI)

    The role played by libraries nowadays has been frequently questioned in view of the technological revolution currently underway, which would seem to threaten the basic function libraries had up until now. However, what we are actually seeing is a trend in which a growing number of increasingly complex libraries are being built. Therefore, it is of interest to take a look at where architecture in this sector is heading. The truth is, a change can be observed in what is considered to be the model role of libraries today, which in turn leads to a different response in terms of their architecture.
The social changes that have taken place as a result of developments in technology have played both a direct and indirect role in such a change of model and represent the challenges facing current architecture. Given the uncertainty that exists these days due to the constant changes occurring as a result of the rapid evolution of new technologies and of society’s new demands, it has become essential to build a flexibility component into all new library design concepts, while shying away from the classic solution of uniform spaces with little plastic variety or expression. On the other hand, the effects of globalisation present at all scales of life on this planet inevitably arise and tend to drive everything towards a common universality. So it is interesting to study the architecture of libraries from different cultural perspectives, firstly in Western culture, given the important role it has played in the origin of this process, and subsequently from the viewpoint of Oriental culture, practically defined as the antithesis of the former. On the basis of the above premises, this paper seeks to stand as the initial contact in a line of research that identifies new ways of applying flexibility into the architecture of libraries so that they do not become obsolete in terms of their design, as well as identifying the globalisation processes that seek to reduce current reality through homogenisation procedures. Therefore, it aims to identify hybridisation processes in current architecture that serve to ascertain whether distinctions occur between different cultures and to assess the extent to which local issues find a place in iconic buildings representing each of those cultures. 
    By means of the methodology used, based on individual case studies and comparative analysis of each, certain traits have been revealed that show an evolution in contemporary architecture, in which greater importance is given in this new scenario to flexibility and the ability to apply it in different ways, as well as acknowledging the aforementioned hybridization processes, enabling differences between the two cultures to be identified and thus any narrow-minded view of globalization as a unilateral phenomenon to be dismantled.

    MICE: The muon ionization cooling experiment. Step I: First measurement of emittance with particle physics detectors

    Copyright © 2011 APS. The Muon Ionization Cooling Experiment (MICE) is a strategic R&D project intended to demonstrate the only practical solution to providing the high-brilliance beams necessary for a neutrino factory or muon collider. MICE is under development at the Rutherford Appleton Laboratory (RAL) in the United Kingdom. It comprises a dedicated beamline to generate a range of input muon emittances and momenta, with time-of-flight and Cherenkov detectors to ensure a pure muon beam. The emittance of the incoming beam will be measured in the upstream magnetic spectrometer with a scintillating-fiber tracker. A cooling cell will then follow, alternating energy loss in liquid-hydrogen (LH2) absorbers with re-acceleration in RF cavities. A second spectrometer, identical to the first, and a second muon-identification system will measure the outgoing emittance. In the 2010 run at RAL the muon beamline and most detectors were fully commissioned, and a first measurement of the emittance of the muon beam with particle-physics (time-of-flight) detectors was performed. The analysis of these data was recently completed and is discussed in this paper. Future steps for MICE, where beam emittance and emittance reduction (cooling) are to be measured with greater accuracy, are also presented. This work was supported by NSF grant PHY-0842798.

    Characteristic Evolution and Matching

    I review the development of numerical evolution codes for general relativity based upon the characteristic initial value problem. Progress in characteristic evolution is traced from the early stage of 1D feasibility studies to 2D axisymmetric codes that accurately simulate the oscillations and gravitational collapse of relativistic stars and to current 3D codes that provide pieces of a binary black hole spacetime. Cauchy codes have now been successful at simulating all aspects of the binary black hole problem inside an artificially constructed outer boundary. A prime application of characteristic evolution is to extend such simulations to null infinity where the waveform from the binary inspiral and merger can be unambiguously computed. This has now been accomplished by Cauchy-characteristic extraction, where data for the characteristic evolution is supplied by Cauchy data on an extraction worldtube inside the artificial outer boundary. The ultimate application of characteristic evolution is to eliminate the role of this outer boundary by constructing a global solution via Cauchy-characteristic matching. Progress in this direction is discussed. Comment: New version to appear in Living Reviews 2012. arXiv admin note: updated version of arXiv:gr-qc/050809

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. Comment: Replaced with published version. Added journal reference and DOI.
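The efficiencies quoted above are ratios of selected to total probe counts, and a standard way to attach an uncertainty to such a ratio is a binomial (Wilson score) confidence interval. The sketch below is a generic statistical illustration with hypothetical counts, not CMS data or the collaboration's actual procedure (which relies on tag-and-probe methods on Z decays).

```python
import math

def wilson_interval(passed: int, total: int, z: float = 1.96):
    """Wilson score interval (default 95%) for an efficiency passed/total."""
    p = passed / total
    denom = 1.0 + z * z / total
    center = (p + z * z / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total)) / denom
    return center - half, center + half

# Hypothetical counts: 9700 muons identified out of 10000 probes
lo, hi = wilson_interval(9700, 10000)
print(f"efficiency = 0.9700, 95% CI = [{lo:.4f}, {hi:.4f}]")
```

Unlike the naive Gaussian interval, the Wilson interval stays inside [0, 1] even for efficiencies near 100%, which matters for selections like those described here.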

    X-ray emission from the Sombrero galaxy: discrete sources

    We present a study of discrete X-ray sources in and around the bulge-dominated, massive Sa galaxy, Sombrero (M104), based on new and archival Chandra observations with a total exposure of ~200 ks. With a detection limit of L_X = 1E37 erg/s and a field of view covering a galactocentric radius of ~30 kpc (11.5 arcminutes), 383 sources are detected. Cross-correlation with Spitler et al.'s catalogue of Sombrero globular clusters (GCs) identified from HST/ACS observations reveals 41 X-ray sources in GCs, presumably low-mass X-ray binaries (LMXBs). We quantify the differential luminosity functions (LFs) for both the detected GC and field LMXBs, whose power-law indices (~1.1 for the GC LF and ~1.6 for the field LF) are consistent with previous studies of elliptical galaxies. With precise sky positions of the GCs without a detected X-ray source, we further quantify, through a fluctuation analysis, the GC LF at fainter luminosities down to 1E35 erg/s. The derived index rules out a faint-end slope flatter than 1.1 at a 2 sigma significance, contrary to recent findings in several elliptical galaxies and the bulge of M31. On the other hand, the 2-6 keV unresolved emission places a tight constraint on the field LF, implying a flattened index of ~1.0 below 1E37 erg/s. We also detect 101 sources in the halo of Sombrero. The presence of these sources cannot be interpreted as galactic LMXBs, whose spatial distribution empirically follows the starlight. Their number is also higher than the expected number of cosmic AGNs (52+/-11 [1 sigma]) whose surface density is constrained by deep X-ray surveys. We suggest that either the cosmic X-ray background is unusually high in the direction of Sombrero, or a distinct population of X-ray sources is present in the halo of Sombrero. Comment: 11 figures, 5 tables, ApJ in press.
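As a hedged illustration of the kind of power-law fit behind the quoted LF indices (not the authors' actual procedure, which also folds in detection limits and a fluctuation analysis), the slope alpha in dN/dL ∝ L^(-alpha) above a completeness limit can be estimated with the standard Pareto maximum-likelihood estimator:

```python
import math
import random

def powerlaw_index_mle(lums, l_min):
    """MLE for alpha in dN/dL ∝ L^(-alpha), using sources with L >= l_min."""
    sample = [l for l in lums if l >= l_min]
    return 1.0 + len(sample) / sum(math.log(l / l_min) for l in sample)

# Toy check: draw luminosities from a known alpha = 1.6 power law via
# inverse-CDF sampling, then recover the index.
rng = random.Random(1)
alpha_true = 1.6
l_min = 1e37  # erg/s, the detection limit quoted in the abstract
lums = [l_min * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
        for _ in range(50000)]
print(round(powerlaw_index_mle(lums, l_min), 2))  # should land near 1.6
```

The estimator is unbiased for large samples; real LF fits must additionally model incompleteness near the detection limit, which is where the paper's fluctuation analysis comes in.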

    Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV

    The azimuthal anisotropy of charged particles in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS detector at the LHC over an extended transverse momentum (pt) range up to approximately 60 GeV. The data cover both the low-pt region associated with hydrodynamic flow phenomena and the high-pt region where the anisotropies may reflect the path-length dependence of parton energy loss in the created medium. The anisotropy parameter (v2) of the particles is extracted by correlating charged tracks with respect to the event plane reconstructed by using the energy deposited in forward-angle calorimeters. For the six bins of collision centrality studied, spanning the range of 0-60% most-central events, the observed v2 values are found to first increase with pt, reaching a maximum around pt = 3 GeV, and then to gradually decrease to almost zero, with the decline persisting up to at least pt = 40 GeV over the full centrality range measured. Comment: Replaced with published version. Added journal reference and DOI.
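The event-plane extraction described above can be caricatured in a few lines: track angles follow dN/dphi ∝ 1 + 2 v2 cos(2(phi − Psi)), and the observed v2 is the average of cos(2(phi − Psi)) over tracks. The toy below uses a hypothetical v2 of 0.1 and a known event-plane angle; it is not CMS code and ignores the event-plane resolution correction applied in the real analysis.

```python
import math
import random

def v2_event_plane(phis, psi):
    """Observed v2 = <cos(2*(phi - psi))> relative to the event plane psi."""
    return sum(math.cos(2.0 * (p - psi)) for p in phis) / len(phis)

def sample_phis(n, v2_true, psi, rng):
    """Rejection-sample track angles from dN/dphi ∝ 1 + 2*v2*cos(2*(phi - psi))."""
    phis = []
    while len(phis) < n:
        phi = rng.uniform(0.0, 2.0 * math.pi)
        weight = 1.0 + 2.0 * v2_true * math.cos(2.0 * (phi - psi))
        if rng.uniform(0.0, 1.0 + 2.0 * v2_true) < weight:
            phis.append(phi)
    return phis

rng = random.Random(42)
phis = sample_phis(200_000, 0.10, 0.3, rng)
print(round(v2_event_plane(phis, 0.3), 3))  # close to the input v2 = 0.10
```

In the measurement itself, Psi is not known but reconstructed from forward calorimeter energy, so the raw average must be divided by an event-plane resolution factor; the toy skips that step by using the true angle.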