
    The new urban paradigm

    This paper argues for a new urban model that harnesses the power of cities to curb global warming. Such a model tackles fundamental management challenges in the energy, building, and transport sectors to promote the growth of diverse and compact cities. It is essential for meeting complex urban challenges, such as promoting a cohesive social life and a competitive economic base while simultaneously preserving the agricultural and natural systems crucial to soil, energy, and material resources. With most of the population living in urban areas, the G20 should recognize the key role that cities play in addressing global challenges such as climate change, and improved measures taken by cities should be treated as an indispensable part of the solution. The G20 Development Working Group, Climate Sustainability Working Group, and Energy Transitions Working Group should incorporate an urban approach into discussions related to climate change.

    Affiliations: Lanfranchi, Gabriel (Centro de Implementación de Políticas Públicas para la Equidad y el Crecimiento; Argentina); Herrero, Ana Carolina (Centro de Implementación de Políticas Públicas para la Equidad y el Crecimiento; Argentina); Rueda Palenzuela, Salvador (Agencia Ecología Urbana Barcelona; España); Camilloni, Ines Angela (Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Ciudad Universitaria. Centro de Investigaciones del Mar y la Atmósfera. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Centro de Investigaciones del Mar y la Atmósfera; Argentina); Bauer, Steffen (German Development Institute; Alemania)

    Social Determinants, Urban Planning, and Covid-19 Response: Evidence From Quito, Ecuador

    Covid-19 has put urban planning systems around the world to the test. How cities are designed and managed is being observed, analyzed, and even questioned through the lens of the pandemic. Density and poverty have been two fundamental aspects to manage in the pandemic in cities of the Global South, which face this challenge alongside pre-pandemic planning problems. In the city of Quito, Ecuador, the response to the pandemic has been coordinated through regulations issued by the national emergency operations center, and case counts have been recorded per parish. The objective of this research is to determine whether there is a relationship between Covid-19 cases, poverty, and population density at the parish level for the canton of Quito. The results show no such correlation. What they do show is that, given both the difficulties of responding to the pandemic and the city's planning structure, another type of characterization of the territory (for example, by scenarios or by situations) is needed, one that can respond to the needs of the most vulnerable groups. Another observable result was that the gap between urban planning and management instruments and the complexity of territorial needs contributes to the polarization of local government approaches, which undermines continuity and coherence in urban planning.
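    For illustration, a parish-level test of the kind described could be run as below. This is a minimal sketch, assuming a hypothetical input file and column names (quito_parishes.csv, covid_cases, poverty_rate, population_density); it is not the authors' actual dataset or analysis code. A rank correlation is used because parish-level counts are typically non-normal.

```python
# Minimal sketch of a parish-level correlation test (hypothetical data file
# and column names, not the authors' actual dataset).
import pandas as pd
from scipy.stats import spearmanr

parishes = pd.read_csv("quito_parishes.csv")  # one row per parish

for predictor in ["poverty_rate", "population_density"]:
    rho, p = spearmanr(parishes["covid_cases"], parishes[predictor])
    print(f"covid_cases vs {predictor}: rho={rho:.2f}, p={p:.3f}")
```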

    Analysis and intervention proposal for the hospital classrooms of Aragón.

    The aim of this work was to develop an intervention proposal, in a hospital context, aimed at children aged 8 to 11. It promotes full inclusion and attention to the diversity of children who, for health-related reasons, find themselves in a setting outside the school, with all the difficulties that situation entails. The proposed intervention is dynamic, is integrated into the student's reality, and seeks above all to promote autonomy and creativity. The goal is to foster the integral development of hospitalized children, giving continuity to their education by delivering subjects and content in a playful, recreational way, in an attempt to improve their quality of life and emotional well-being and, as far as possible, to prevent school failure.

    Accounting for rupture directivity in ShakeMap

    Rapid and accurate information about ground shaking following an earthquake is necessary for emergency response planning. A prompt strategy is to contour the real data recorded at the stations. However, only a few regions, e.g. Japan and Taiwan, have instrumental coverage good enough to produce shaking maps relying almost entirely on real data. ShakeMap was conceived to "fill" the data gap and produce stable contours using ground motion predictive equations (GMPEs) and site effects. For regions where data coverage is sparse, the interpolation plays a crucial role, and the choice of GMPE can strongly affect the quality of the ground shaking estimate. However, GMPEs derive from empirical regressions describing the average behavior of the ground shaking, and they tend to mask, when present, specific trends due to multidimensional effects such as the asymmetry of the rupture process (directivity effect). Thus, ShakeMaps for large events may not faithfully reproduce the ground motion in the near source if determined without rupture-related parameters. One way to improve the ShakeMap prediction is to modify the ground motion modeling to better explain the ground motion variability. To this purpose, the empirical model can be refined with information about the rupture process (Spagnuolo, PhD thesis, 2010), in this case using the directivity term defined by Spudich and Chiou (Earthquake Spectra, 2008). The aim of this work is to quantify the effectiveness of refined GMPEs in improving the performance of ShakeMap. We quantify the agreement of this new GMPE with the recorded data and make inferences about the reliability of the new ShakeMap. The test focuses on the degradation of the ShakeMap as the number of observations is reduced, and on quantifying the improvement due to the directivity term. To conduct the test properly, we investigate two well-recorded events from Japan: the 2008 Iwate-Miyagi (M7) and the 2000 Tottori (M6.6) events. This work is part of the DPC-INGV S3 project (2007-09), as described in the companion abstract Ameri et al. (ESC2010).
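    To make the idea of a directivity-refined GMPE concrete, the sketch below adds a directivity correction in log space to a toy attenuation model. All coefficients are illustrative placeholders; the functional form (a linear term in a directivity predictor such as the isochrone directivity parameter, IDP) only loosely mirrors models like Spudich and Chiou (2008) and is not their actual parameterization.

```python
import numpy as np

def gmpe_median_ln(mag, rrup):
    """Toy GMPE: ln(PGA in g) from magnitude and rupture distance (km).
    Coefficients are illustrative, not a published model."""
    return -1.0 + 0.9 * mag - 1.3 * np.log(rrup + 10.0)

def directivity_adjusted_ln(mag, rrup, idp, a=-0.5, b=0.5):
    """Add a directivity term (additive in log space) of the generic form
    f_D = a + b * IDP, where IDP summarizes rupture geometry relative to
    the site. a and b are placeholders, not published coefficients."""
    return gmpe_median_ln(mag, rrup) + (a + b * idp)

# Same site, with and without forward directivity (PGA in g):
print(np.exp(gmpe_median_ln(7.0, 20.0)))
print(np.exp(directivity_adjusted_ln(7.0, 20.0, idp=1.5)))
```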

    Protein Thermodynamic Destabilization in the Assessment of Pathogenicity of a Variant of Uncertain Significance in Cardiac Myosin Binding Protein C.

    In the era of next generation sequencing (NGS), genetic testing for inherited disorders identifies an ever-increasing number of variants whose pathogenicity remains unclear. These variants of uncertain significance (VUS) limit the reach of genetic testing in clinical practice. VUS for hypertrophic cardiomyopathy (HCM), the most common familial heart disease, constitute over 60% of the missense-variant entries in the ClinVar database. We have studied a novel VUS (c.1809T>G-p.I603M) in the most frequently mutated gene in HCM, MYBPC3, which codes for cardiac myosin-binding protein C (cMyBPC). Our determination of pathogenicity integrates bioinformatics evaluation and functional studies of RNA splicing and protein thermodynamic stability. In silico prediction and mRNA analysis indicated no alteration of RNA splicing induced by the variant. At the protein level, the p.I603M mutation maps to the C4 domain of cMyBPC. Although the mutation does not greatly perturb the overall structure of the C4 domain, the stability of C4 I603M is severely compromised, as detected by circular dichroism and differential scanning calorimetry experiments. Given the highly destabilizing effect of the mutation on the structure of C4, we propose reclassification of variant p.I603M as likely pathogenic. Looking into the future, the workflow described here can be used to refine the assignment of pathogenicity of variants of uncertain significance in MYBPC3.

    Funding: J.A.C. was funded by the Ministerio de Ciencia, Innovación y Universidades (MCNU) through grants BIO2017-83640-P (AEI/FEDER, UE) and RYC-2014-16604, the European Research Area Network on Cardiovascular Diseases (ERA-CVD/ISCIII, MINOTAUR, AC16/00045), the Comunidad de Madrid (P2018/NMT-4443), and the CNIC-Severo Ochoa intramural grant program (03-2016 IGP). The CNIC is supported by the Instituto de Salud Carlos III (ISCIII), MCNU, and the Pro CNIC Foundation, and is a Severo Ochoa Center of Excellence (SEV-2015-0505). G.F. was funded by the Ministero dell'Istruzione, dell'Università e della Ricerca-Rome PS35-126/IND.S
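    The thermodynamic part of such an assessment typically compares unfolding free energies of wild-type and mutant domains. The sketch below uses the standard Gibbs-Helmholtz relation with DSC-derived parameters (melting temperature Tm, unfolding enthalpy dHm, heat-capacity change dCp); the numerical values are illustrative placeholders, not the measured values from this study.

```python
import numpy as np

def dG_unfold(T, Tm, dHm, dCp):
    """Gibbs-Helmholtz: unfolding free energy (kJ/mol) at temperature T (K),
    from melting temperature Tm (K), unfolding enthalpy at Tm dHm (kJ/mol),
    and heat-capacity change dCp (kJ/mol/K)."""
    return dHm * (1 - T / Tm) - dCp * ((Tm - T) + T * np.log(T / Tm))

# Illustrative (not measured) parameters for a wild-type vs mutant domain:
T = 298.15
ddG = dG_unfold(T, Tm=330.0, dHm=300.0, dCp=5.0) - \
      dG_unfold(T, Tm=315.0, dHm=250.0, dCp=5.0)
print(f"Destabilization ddG at 25 C: {ddG:.1f} kJ/mol")
```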

    Testing the improvement of ShakeMaps using finite-fault models and synthetic seismograms

    The ShakeMap package uses empirical ground motion prediction equations (GMPEs) to estimate ground motion where recorded data are not available. Recorded and estimated values are then interpolated to produce a shaking map for the considered event. However, GMPEs account only for average characteristics of the source and wave propagation processes. Within the framework of the DPC-INGV S3 project (2007-09), we evaluate whether the inclusion of directivity effects in GMPEs (companion paper Spagnuolo et al., 2010) or the use of synthetic seismograms from finite-fault rupture models may improve the ShakeMap evaluation. An advantage of using simulated motions from kinematic rupture models is that source effects, such as rupture directivity, are directly included in the synthetics. This is particularly interesting in Italy, where the regional GMPEs, based on a small number of near-source records for moderate-to-large earthquakes, are not reliable for estimating ground motion in the vicinity of the source. In this work we investigated whether and how synthetic seismograms generated with finite-fault models can be used in place of (or in addition to) GMPEs within the ShakeMap methodology. We assumed descriptions of the rupture model with gradually increasing detail, from a simple point source to a kinematic rupture history obtained from inversion of strong-motion data. Depending on the available information, synthetic seismograms are calculated with methods that account for the different degrees of approximation in source properties. We chose the Mw 6.9 2008 Iwate-Miyagi (Japan) earthquake as a case study. This earthquake was recorded by a very large number of stations, and the corresponding ShakeMap relies almost entirely on the recorded ground motions. Starting from this ideal case, we removed a number of stations in order to evaluate the deviations from the reference map and the sensitivity of the map to the number of stations used. The removed data are then substituted with synthetic values calculated assuming different source approximations, and the resulting maps are compared to the original ones (containing observed data only). The use of synthetic seismograms computed for finite-fault rupture models produces, in general, an improvement of the calculated ShakeMaps, especially when synthetics are used to integrate real data. When real data are not available and ShakeMap is estimated using GMPEs only, the improvement from adding simulated values depends on the considered strong-motion parameters.
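    The station-removal experiment described above can be sketched as a simple loop: keep a random subset of the real recordings, fill the removed sites with synthetic values, and score the result against the full-data reference. The scoring metric below (mean absolute log residual on PGA) is one reasonable choice, not necessarily the authors'; the data structures are hypothetical stand-ins for the real station tables.

```python
import numpy as np

rng = np.random.default_rng(0)

def removal_test(recorded_pga, synthetic_pga, keep_fraction):
    """Sketch of the station-removal test. Inputs are dicts mapping
    station id -> PGA; removed stations are filled with synthetic values
    and the filled map is scored against the full-data reference."""
    ids = list(recorded_pga)
    n_keep = int(keep_fraction * len(ids))
    kept = set(rng.choice(ids, n_keep, replace=False))
    filled = {s: recorded_pga[s] if s in kept else synthetic_pga[s]
              for s in ids}
    # Mean absolute log residual vs the all-data reference map values
    return np.mean([abs(np.log(filled[s]) - np.log(recorded_pga[s]))
                    for s in ids])

# Example with toy values for three stations (PGA in g):
rec = {"ST1": 0.30, "ST2": 0.12, "ST3": 0.05}
syn = {"ST1": 0.25, "ST2": 0.15, "ST3": 0.04}
print(removal_test(rec, syn, keep_fraction=0.33))
```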

    Guest-responsive polaritons in a porous framework: chromophoric sponges in optical QED cavities

    Introducing porous material into optical cavities is a critical step toward the utilization of quantum-electrodynamical (QED) effects for advanced technologies, e.g. in the context of sensing. We demonstrate that crystalline, porous metal–organic frameworks (MOFs) are well suited for the fabrication of optical cavities. Going beyond the functionalities offered by other materials, they allow for the reversible loading and release of guest species into and out of optical resonators. For an all-metal mirror-based Fabry–Perot cavity we achieve strong coupling (∼21% Rabi splitting). This value is remarkably large, considering that the high porosity of the framework reduces the density of optically active moieties by ∼60% relative to the corresponding bulk structure. Such a strong response of a porous chromophoric scaffold could only be realized by employing silicon-phthalocyanine (SiPc) dyes designed to undergo strong J-aggregation when assembled into a MOF. Integration of the SiPc MOF as the active component of the optical microcavity was realized using a layer-by-layer method. This new functionality opens up the possibility of reversibly and continuously tuning QED devices and of using them as optical sensors.
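    For context, a standard cavity-QED relation (the usual collective Tavis-Cummings picture, not taken from the paper itself) explains why porosity matters here: the vacuum Rabi splitting grows with the square root of the density of coupled dipoles, so reducing the chromophore density directly limits the attainable splitting. The ∼21% figure refers to the splitting relative to the exciton transition energy.

```latex
% Collective coupling of N dipoles in a mode volume V (standard relation):
\[
  \hbar\Omega_R = 2\hbar g_0 \sqrt{N} \;\propto\; \sqrt{N/V},
  \qquad
  \frac{\hbar\Omega_R}{E_{\mathrm{exc}}} \approx 0.21 .
\]
% Strong coupling requires the splitting to exceed the losses,
% with \kappa and \gamma the cavity and exciton linewidths:
\[
  \Omega_R > \tfrac{1}{2}(\kappa + \gamma).
\]
```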

    Candidate LBV stars in galaxy NGC 7793 found via HST photometry + MUSE spectroscopy

    Only about 19 Galactic and 25 extragalactic bona fide luminous blue variables (LBVs) are known to date. This incomplete census limits our understanding of this crucial phase of massive star evolution, which leads to the formation of heavy binary black holes via the classical channel. With large samples of LBVs one could better determine the duration and maximum stellar luminosity which characterize this phase. We search for candidate LBVs (cLBVs) in a new galaxy, NGC 7793. For this purpose, we combine high spatial resolution images from two Hubble Space Telescope (HST) programs with optical spectroscopy from the Multi Unit Spectroscopic Explorer (MUSE). By combining PSF-fitting photometry measured on F547M, F657N, and F814W images with restrictions on point-like appearance (at HST resolution) and Hα luminosity, we find 100 potential cLBVs, 36 of which fall in the MUSE fields. Five of the latter 36 sources are promising cLBVs which have MV ≤ −7 and a combination of: Hα with a P-Cygni profile; no [O I]λ6300 emission; weak or no [O III]λ5007 emission; large [N II]/Hα relative to H II regions; and [S II]λ6716/[S II]λ6731 ∼ 1. It is not clear whether these five cLBVs are isolated from O-type stars, which would favour the binary formation scenario of LBVs. Our study, which covers approximately one quarter of the optical disc of NGC 7793, demonstrates how, by combining the above HST surveys with multi-object spectroscopy from 8-m class telescopes, one can efficiently find large samples of cLBVs in nearby galaxies.
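    The selection described above amounts to a sequence of cuts on a candidate table. The sketch below applies them to a hypothetical catalog; the file name, column names, and all numeric thresholds except MV ≤ −7 (which is stated in the abstract) are placeholders, not the authors' actual pipeline.

```python
import pandas as pd

# Hypothetical candidate catalog (file and column names are placeholders)
cands = pd.read_csv("ngc7793_candidates.csv")

mask = (
    (cands["M_V"] <= -7.0)                     # luminosity cut from the text
    & cands["halpha_pcygni"]                   # Halpha shows a P-Cygni profile
    & (cands["OI_6300_flux"] <= 0.0)           # no [O I] 6300 emission
    & (cands["OIII_5007_flux"] < 0.1)          # weak/no [O III] 5007 (placeholder threshold)
    & (cands["NII_Ha"] > cands["NII_Ha_HII"])  # enhanced [N II]/Ha vs nearby H II regions
    & cands["SII_ratio"].between(0.9, 1.1)     # [S II] 6716/6731 ~ 1 (placeholder window)
)
print(cands[mask])
```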

    An Open System for Social Computation

    Part of the power of social computation comes from using the collective intelligence of humans to tame the aggregate uncertainty of (otherwise) low-veracity data obtained from human and automated sources. We have witnessed a surge in the development of social computing systems but, ironically, there have been few attempts to generalise across this activity so that the creation of the underlying mechanisms themselves can be made more social. We describe a method for achieving this by standardising patterns of social computation via lightweight formal specifications (we call these social artifacts) that can be connected to existing internet architectures via a single model of computation. Upon this framework we build a mechanism for extracting provenance meta-data across social computations.
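    To give a feel for what a lightweight specification with built-in provenance might look like, here is a minimal sketch of a "social artifact" as a data structure. The field names and methods are illustrative assumptions, not the paper's actual formalism.

```python
from dataclasses import dataclass, field

@dataclass
class SocialArtifact:
    """Illustrative stand-in for a lightweight formal specification of a
    social computation pattern; the schema is hypothetical, not the
    authors' actual formalism."""
    name: str
    roles: list[str]                 # who may participate, e.g. "worker", "requester"
    protocol: list[tuple[str, str]]  # ordered (role, action) interaction steps
    provenance: list[dict] = field(default_factory=list)  # meta-data log

    def record(self, role: str, action: str, **payload) -> None:
        # Append provenance meta-data for each step of the computation
        self.provenance.append({"role": role, "action": action, **payload})

# Usage: a toy crowdsourced-labelling pattern
task = SocialArtifact(
    name="label_images",
    roles=["requester", "worker"],
    protocol=[("requester", "post_task"), ("worker", "submit_label")],
)
task.record("worker", "submit_label", item="img_042", label="cat")
print(task.provenance)
```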