59 research outputs found

    Perforación profunda en el lago de Chalco: Reporte técnico

    This paper presents a summary of the coring operations undertaken to recover the full lacustrine sedimentary sequence of Lake Chalco. Geophysical surveys were used to determine the distribution and thickness of the lake sediments and to select the drilling site. Resonance frequencies derived from passive-seismic H/V spectral ratios were mapped as isofrequency contours that defined an area where lake sediments and granular volcanic material reach up to 300 m in thickness. Electromagnetic soundings showed two changes in electrical resistivity related to changes in the composition of the sedimentary column: a first increase between 100 and 120 m depth, associated with more abundant volcaniclastic material, and a second between 330 and 400 m depth, associated with basalt flows. Three wells were drilled with continuous core recovery, reaching depths of 420 m in well A, 310 m in well B and 520 m in well C. Samples for geomicrobiological and metagenomic analyses were collected during drilling. In total, 1152 m of sediment core were recovered, to a maximum depth of 520 m, with recovery of the sedimentary column ranging between 88 and 92 % in the three wells. Magnetic susceptibility analyses of the three sequences indicate that the upper 260 m are mostly lake sediments, that sediments between 260 and 300 m are coarser, and that below 300 m the material is predominantly volcaniclastic. Analysis of the Lake Chalco sedimentary sequence, which spans the last ~300,000 years, will document and extend knowledge of climate variability in the area, the paleoenvironmental history of the region, the closure history of the basin, the development of the lacustrine system and the recurrence of volcanic activity in the basin. 
Study of the physical properties of this sequence will also be important for modeling seismic-wave propagation and basin structure, and for improving models of the land subsidence that this region experiences.
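The link between the H/V resonance peak and sediment thickness can be illustrated with the standard quarter-wavelength relation H = Vs / (4·f0). The sketch below is illustrative only: the shear-wave velocity and resonance frequency are assumed example values, not measurements from the report.

```python
def sediment_thickness(vs_avg, f0):
    """Quarter-wavelength estimate of sediment thickness.

    vs_avg : assumed average shear-wave velocity of the soft fill (m/s)
    f0     : fundamental resonance frequency from the H/V spectral peak (Hz)
    """
    return vs_avg / (4.0 * f0)

# Illustrative numbers (not from the survey): a soft lacustrine fill with
# Vs ~ 300 m/s resonating at f0 ~ 0.25 Hz implies roughly 300 m of sediment,
# of the same order as the ~300 m thickness mapped at Chalco.
print(sediment_thickness(300.0, 0.25))  # → 300.0
```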

    The parent–infant dyad and the construction of the subjective self

    Developmental psychology and psychopathology have in the past been more concerned with the quality of self-representation than with the development of the subjective agency that underpins our experience of feeling, thought and action, a key function of mentalisation. This review begins by contrasting a Cartesian view of pre-wired introspective subjectivity with a constructionist model based on the assumption of an innate contingency detector that orients the infant towards aspects of the social world that react congruently and in a specifically cued, informative manner, one that expresses and facilitates the assimilation of cultural knowledge. Research on the neural mechanisms associated with mentalisation, and on the social influences on its development, is reviewed. It is suggested that the infant focuses on the attachment figure as a source of reliable information about the world. The construction of the sense of a subjective self is then an aspect of acquiring knowledge about the world through the caregiver's pedagogical communicative displays, which in this context focus on the child's thoughts and feelings. We argue that a number of possible mechanisms, including the complementary activation of attachment and mentalisation, the disruptive effect of maltreatment on parent-child communication, and the biobehavioural overlap of cues for learning and cues for attachment, may have a role in ensuring that the quality of the relationship with the caregiver influences the development of the child's experience of thoughts and feelings.

    Hepcidin and Hfe in iron overload in beta-thalassemia

    Hepcidin (HAMP) negatively regulates iron absorption by degrading the iron exporter ferroportin in enterocytes and macrophages. We showed that mice with beta-thalassemia intermedia (th3/+) have increased anemia and iron overload; however, their hepcidin expression is relatively low compared to their iron burden. We also showed that the iron metabolism gene Hfe is down-regulated in concert with hepcidin in th3/+ mice. These observations suggest that low hepcidin levels are responsible for abnormal iron absorption in thalassemic mice and that down-regulation of Hfe might be involved in the pathway that controls hepcidin synthesis in beta-thalassemia. Therefore, these studies suggest that increasing hepcidin and/or Hfe expression could be a strategy to reduce iron overload in these animals. The goal of this paper is to review recent findings that correlate hepcidin, Hfe, and iron metabolism in beta-thalassemia and to discuss potential novel therapeutic approaches based on these discoveries.

    Sample return missions to minor bodies

    Mark Burchell and the organizers of an RAS Specialist Discussion Meeting in May 2012 argue, on the basis of the Stardust and Hayabusa missions, that collecting samples from asteroids and comets offers a potentially rich scientific return – and one that can be exploited for many years to come.

    A lattice-preserving multigrid method for solving the inhomogeneous Poisson equations used in image analysis

    Abstract. The inhomogeneous Poisson (Laplace) equation with internal Dirichlet boundary conditions has recently appeared in several applications ranging from image segmentation [1, 2, 3] to image colorization [4], digital photo matting [5, 6] and image filtering [7, 8]. In addition, the problem we address may also be considered as the generalized eigenvector problem associated with Normalized Cuts [9], the linearized anisotropic diffusion problem [10, 11, 8] solved with a backward Euler method, visual surface reconstruction with discontinuities [12, 13] or optical flow [14]. Although these approaches have demonstrated quality results, the computational burden of finding a solution requires an efficient solver. Design of an efficient multigrid solver is difficult for these problems due to unpredictable inhomogeneity in the equation coefficients and internal Dirichlet boundary conditions with unpredictable location and value. Previous approaches to multigrid solvers have typically employed either a data-driven operator (with fast convergence) or the maintenance of a lattice structure at coarse levels (with low memory overhead). In addition to memory efficiency, a lattice structure at coarse levels is also essential to taking advantage of the power of a GPU implementation [15,16,5,3]. In this work, we present a multigrid method that maintains the low memory overhead (and GPU suitability) associated with a regular lattice while benefiting from the fast convergence of a data-driven coarse operator.
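For orientation, a minimal geometric multigrid V-cycle for the 1D constant-coefficient Poisson equation is sketched below, with weighted-Jacobi smoothing, full-weighting restriction and linear-interpolation prolongation on a regular lattice. This is a textbook baseline, not the authors' method, which additionally handles unpredictable inhomogeneous coefficients and internal Dirichlet conditions while keeping a lattice at coarse levels.

```python
import numpy as np

def smooth(u, f, h, sweeps=3, w=2.0 / 3.0):
    # weighted-Jacobi relaxation for -u'' = f (second-order central differences)
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def v_cycle(u, f, h):
    if len(u) == 3:                    # coarsest lattice: one unknown, solve exactly
        u[1] = 0.5 * h * h * f[1]
        return u
    u = smooth(u, f, h)                # pre-smoothing
    r = np.zeros_like(u)               # residual r = f - A u
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    rc = np.zeros((len(u) - 1) // 2 + 1)
    # full-weighting restriction onto the coarse lattice
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h)   # coarse-grid error correction
    e = np.zeros_like(u)               # linear-interpolation prolongation
    e[2:-1:2] = ec[1:-1]
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return smooth(u + e, f, h)         # post-smoothing

# Model problem: -u'' = pi^2 sin(pi x) on [0, 1], u(0) = u(1) = 0,
# whose exact solution is u(x) = sin(pi x).
n = 65
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
f = np.pi ** 2 * np.sin(np.pi * x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
print(float(np.max(np.abs(u - np.sin(np.pi * x)))))  # error at the discretization level
```

A data-driven coarse operator would replace the fixed restriction/prolongation stencils above with ones built from the (inhomogeneous) fine-level coefficients, which is where the convergence-versus-lattice trade-off discussed in the abstract arises.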