
    Skipping the co-expression problem: the new 2A "CHYSEL" technology

    The rapid progress in the field of genomics is increasing our knowledge of multi-gene diseases. However, any realistic hope of gene therapy for these diseases must first address the problem of co-ordinately co-expressing several transgenes. Currently, the use of internal ribosomal entry sites (IRESs) is the strategy chosen by many researchers to ensure co-expression. The large size of IRESs (~0.5 kb) and the difficulty of ensuring well-balanced co-expression have prompted several researchers to imitate a co-expression strategy used by many viruses: expressing several proteins as a polyprotein. A small peptide of 18 amino acids (2A) from the foot-and-mouth disease virus (FMDV) is being used to avoid the need for proteinases to process the polyprotein. FMDV 2A is introduced as a linker between two proteins to allow autonomous intra-ribosomal self-processing of polyproteins. Recent reports have shown that this sequence is compatible with different sub-cellular targeting signals and can be used to co-express up to four proteins from a single retroviral vector. This short peptide provides a tool for co-expressing multiple proteins from a single vector, a useful technology for those working with heteromultimeric proteins, biochemical pathways or combined/synergistic phenomena.

    Sequential epidemic spread between agglomerates of self-propelled agents in one dimension

    Motile organisms can form stable agglomerates such as cities or colonies. In the outbreak of a highly contagious disease, the control of large-scale epidemic spread depends on factors such as the number and size of agglomerates, the travel rate between them, and the disease recovery rate. While the emergence of agglomerates permits early interventions, it also explains why real epidemics last longer. In this work, we study the spread of susceptible-infected-recovered (SIR) epidemics in one-dimensional spatially structured systems. By working in one dimension, we mimic microorganisms in narrow channels and establish a necessary foundation for future investigation in higher dimensions. We employ a model of self-propelled particles that spontaneously form multiple clusters. As the rate of stochastic reorientation decreases, clusters become larger and less numerous. Besides examining the time evolution averaged over many epidemics, we show how the final number of ever-infected individuals depends non-trivially on single-individual parameters. In particular, the number of ever-infected individuals first increases with the reorientation rate, since particles escape sooner from clusters and spread the disease. At higher reorientation rates, travel between clusters becomes too diffusive and the clusters too small, decreasing the number of ever-infected individuals.
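    The ingredients described above, self-propelled motion with stochastic reorientation, cluster formation by exclusion, and SIR disease states, can be combined in a few lines. The following is a minimal sketch under assumed parameters (a discrete-time lattice version with illustrative rates alpha, beta and gamma), not the authors' exact model.

```python
import numpy as np

# Minimal sketch (not the authors' exact model): run-and-tumble particles on a
# 1D periodic lattice with hard-core exclusion, carrying SIR disease states.
rng = np.random.default_rng(0)

L_sites, N = 200, 80           # lattice size, number of particles (assumed)
alpha = 0.02                   # stochastic reorientation (tumble) probability per step
beta, gamma = 0.3, 0.01        # infection and recovery probabilities per step
steps = 5000

pos = rng.choice(L_sites, size=N, replace=False)   # distinct lattice sites
vel = rng.choice([-1, 1], size=N)                  # self-propulsion direction
state = np.zeros(N, dtype=int)                     # 0 = S, 1 = I, 2 = R
state[rng.integers(N)] = 1                         # single initial infection

occupied = np.zeros(L_sites, dtype=bool)
occupied[pos] = True

for _ in range(steps):
    # Tumbling: each particle reverses direction with probability alpha
    flip = rng.random(N) < alpha
    vel[flip] *= -1

    # Motion with exclusion: move only if the target site is empty; jamming
    # against blocked neighbours is what lets clusters form and persist
    for i in rng.permutation(N):
        target = (pos[i] + vel[i]) % L_sites
        if not occupied[target]:
            occupied[pos[i]] = False
            occupied[target] = True
            pos[i] = target

    # SIR dynamics: infected particles infect susceptible nearest neighbours,
    # then recover with probability gamma
    infected = np.where(state == 1)[0]
    for i in infected:
        for d in (-1, 1):
            nbr = np.where(pos == (pos[i] + d) % L_sites)[0]
            if nbr.size and state[nbr[0]] == 0 and rng.random() < beta:
                state[nbr[0]] = 1
    state[infected[rng.random(infected.size) < gamma]] = 2

print("ever-infected individuals:", np.sum(state > 0))
```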

    Artificial iris performance for smart contact lens vision correction applications

    This paper presents the simulated performance assessment of an artificial iris embedded in a scleral contact lens using real data from an aniridia patient. The artificial iris is based on guest-host liquid crystal cells (GH-LCD) that actively modify the transmittance of the lens and the effective pupil size. Experimental validation of the GH-LCD spectrum and iris contrast (determined to be 1:2.1) enabled the development of optical models that include the effect of a small pupil on image quality and visual quality in an optical system with aniridia characteristics. Visual simulations at different light conditions (high/low photopic and mesopic) demonstrated the theoretical capacity of the customized artificial-iris smart contact lens to expand the depth of focus and decrease the optical aberrations (in particular, the spherical aberration). The visual modelling suggests a maximum depth-of-focus value for a 2-mm pupil diameter for both eyes as follows: 3 D (1,000 cd/m²), 2 D (10 cd/m²) and 0.75 D (1 cd/m²). This work demonstrates the beneficial optical effects of an active artificial iris, based on visual simulations in response to different light levels, and enables further experimental investigation on patients to validate the dynamic light attenuation and visual performance of smart contact lenses with GH-LCD.

    Damage-driven strain localisation in networks of fibres: A computational homogenisation approach

    In many applications, such as textiles, fibreglass, paper and several kinds of biological fibrous tissues, the main load-bearing constituents at the micro-scale are arranged as a fibre network. In these materials, rupture is usually driven by micro-mechanical failure mechanisms, and strain localisation due to progressive damage evolution in the fibres is the main cause of macro-scale instability. We propose a strain-driven computational homogenisation formulation based on a Representative Volume Element (RVE), within a framework in which micro-scale fibre damage can lead to macro-scale localisation phenomena. The mechanical stiffness considered here for the fibrous structure is due to: i) an intra-fibre mechanism in which each fibre is axially stretched and, as a result, can suffer damage; ii) an inter-fibre mechanism in which the stiffness results from the variation of the relative angle between pairs of fibres. The homogenised tangent tensor, which comes from the contribution of these two mechanisms, is required to detect the so-called bifurcation point at the macro-scale through the spectral analysis of the acoustic tensor. This analysis can precisely determine the instant at which the macro-scale problem becomes ill-posed. At such a point, the spectral analysis provides information about the macro-scale failure pattern (unit normal and crack-opening vectors). Special attention is devoted to presenting the theoretical fundamentals rigorously in the light of variational formulations for multi-scale models. The impact of a recently derived, more general boundary condition for fibre networks is also assessed in the context of materials undergoing softening. Numerical examples demonstrating the suitability of the present methodology are presented and discussed.
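    The spectral check described above can be sketched numerically. In a 2D plane setting the acoustic tensor built from a homogenised tangent tensor C is Q_ik(n) = n_j C_ijkl n_l, and the macro-scale problem becomes ill-posed when the smallest eigenvalue of Q(n) reaches zero for some unit normal n. The code below is a simplified stand-in for the paper's formulation (the isotropic tangent and the angle sweep are illustrative assumptions), returning the most critical normal.

```python
import numpy as np

# Loss-of-ellipticity check via the acoustic tensor Q_ik(n) = n_j C_ijkl n_l
# (assumed 2D setting; C is a homogenised tangent given as a 2x2x2x2 array).
def localisation_check(C, n_angles=720):
    thetas = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    min_eig, crit_normal = np.inf, None
    for th in thetas:
        n = np.array([np.cos(th), np.sin(th)])
        Q = np.einsum("j,ijkl,l->ik", n, C, n)          # acoustic tensor
        lam = np.linalg.eigvalsh(0.5 * (Q + Q.T)).min() # smallest eigenvalue
        if lam < min_eig:
            min_eig, crit_normal = lam, n
    # Localisation (ill-posedness) is flagged when min_eig drops to zero;
    # crit_normal is then the predicted macro-scale failure-pattern normal.
    return min_eig, crit_normal

# Usage with an illustrative isotropic elastic tangent (stays elliptic).
E, nu = 1.0, 0.3
lam_, mu = E * nu / ((1 + nu) * (1 - 2 * nu)), E / (2 * (1 + nu))
I = np.eye(2)
C = (lam_ * np.einsum("ij,kl->ijkl", I, I)
     + mu * (np.einsum("ik,jl->ijkl", I, I) + np.einsum("il,jk->ijkl", I, I)))
eig_min, n_crit = localisation_check(C)
print("min eigenvalue of Q(n):", eig_min, "critical normal:", n_crit)
```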

    Territorio, crimen, comunidad. Heterogeneidad del homicidio en Medellín

    This book is part of the research programme on the political economy of the periphery that the CAP of Universidad EAFIT has been pursuing for a decade. The result of this effort is the book collection to which this text belongs, which attests to the relevance of academic studies of the peripheral realities of the city and the region. The collection comprises six volumes: Economía Criminal en Antioquia: Narcotráfico (2011); Informalidad e ilegalidad en la explotación del ORO y la MADERA en Antioquia (2012); Economía criminal y poder político (2013); Oro como fortuna. Instituciones, capital social y gobernanza de la minería aurífera colombiana (2014); Nuevas modalidades de captación de rentas ilegales en Medellín (2014); and Territorio, crimen, comunidad. Heterogeneidad del homicidio en Medellín (2015). These titles provide academic tools for addressing, critically and constructively, the phenomena that shape the reality of the city and the country. Beyond describing and analysing the facts, we are convinced that academia has a responsibility to point out alternatives for decision-making processes. Contents: Una mirada panorámica al lugar y a los actores -- El contexto de los polígonos del homicidio en Medellín -- Priorización de medidas para la aplicación del plan de Garantías de No Repetición en Medellín -- Una aproximación cuantitativa al homicidio en Medellín -- Aprendizajes y ejercicios de la violencia homicida -- Más allá de las normas de papel y de sangre: Análisis de la incidencia de las reglas formales e informales en la variación del homicidio en los polígonos de Medellín -- Las comunidades conjugan los verbos contener y resistir -- El que no oye consejos, no llega a viejo. Recomendaciones de política pública -- Método para la definición de polígonos de concentración de homicidios en Medellín

    Modeling of urban environmental problems using geographic information systems and multivariate methods

    This work presents a methodological proposal for modeling urban environmental problems. It combines Geographic Information Systems and numerical techniques in order to generate probable impact scenarios. A specific application is presented as an example, oriented to creating susceptibility maps of landslide risk scenarios in an intermediate Colombian city. More specifically, the necessity of finding transversal methods to analyze the complex realities that constitute urban environments is discussed; one of these (among many) is the phenomenon of landslides in Andean cities. On that basis, a methodology based on multifactorial cause analysis is proposed that draws on past experiences to propose possible future scenarios, relying on Geographic Information Systems to manage the information and on Artificial Neural Networks to classify the data.
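    As a rough illustration of the GIS-plus-neural-network classification step, the sketch below trains a small multilayer perceptron on synthetic per-cell factor layers and produces a per-cell susceptibility score. The factor names (slope, rainfall, soil-weakness index), their values and the labelling rule are hypothetical, not the study's actual inputs; a real workflow would use raster layers exported from a GIS and a landslide inventory as labels. It assumes scikit-learn is available.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for GIS factor layers: one row per raster cell.
rng = np.random.default_rng(1)
n_cells = 5000
slope = rng.uniform(0, 45, n_cells)          # degrees (hypothetical layer)
rainfall = rng.uniform(500, 3000, n_cells)   # mm/year (hypothetical layer)
soil_weakness = rng.uniform(0, 1, n_cells)   # dimensionless index (hypothetical)
X = np.column_stack([slope, rainfall, soil_weakness])

# Synthetic "past events" label: steeper, wetter, weaker cells slide more often.
risk = 0.02 * slope + 0.001 * rainfall + 2.0 * soil_weakness
y = (risk + rng.normal(0, 1, n_cells) > np.percentile(risk, 80)).astype(int)

# Train an ANN classifier on the factor layers and past-event labels.
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
model.fit(X, y)

# Per-cell susceptibility score: probability of the landslide class, which a
# GIS would then map back onto the raster to build the scenario map.
susceptibility = model.predict_proba(X)[:, 1]
print("mean predicted susceptibility:", susceptibility.mean())
```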

    Design of a Multipurpose Photonic Chip Architecture for THz Dual-Comb Spectrometers

    In this work, we present a multipurpose photonic integrated circuit capable of generating complex multiheterodyne dual-comb (DC) THz signals. Our work focuses on translating the functionality of an electro-optic tunable DC system into a photonic chip employing standard building blocks to ensure the scalability and cost efficiency of the integrated device. The architecture we analyse for integration is based on three stages: a seed comb, a mode selection stage and a DC stage. This final DC stage includes a frequency shifter, a key element to improve the final detection of the THz signals and obtain real-time operation. This investigation covers three key aspects: (1) a solution for comb-line selection on GHz-spaced combs using OIL or OPLL on photonic chips is studied and evaluated, (2) a simple and versatile scheme to produce a frequency shift is proposed, using the double-sideband suppressed-carrier modulation technique and an asymmetric Mach-Zehnder interferometer to filter out one of the sidebands, and (3) a multipurpose architecture is presented that can offer a versatile, effective device, moving from application-specific PICs to general-purpose PICs. Using the building blocks (BBs) available from an InP-based foundry, our simulations yield a high-quality frequency-shifted dual-comb signal with a side-mode suppression ratio of around 21 dB, and 41 dB after photodetection with an intermediate frequency of 1 MHz. We tested our system by generating a dual comb with 10 kHz of frequency spacing and an OOK modulation at 5 Gbps, which can be down-converted to the THz range by a square-law detector. It is also important to note that the presented architecture is multipurpose and can also be applied to THz communications. This design is a step towards enabling a commercial THz photonic chip for multiple applications such as THz spectroscopy, THz multispectral imaging and THz telecommunications, and offers the possibility of being fabricated in a multi-project wafer. This research was supported by Instituto Tecnológico Metropolitano, Universidad Carlos III de Madrid, the EU H2020 Celta project under Grant Agreement 675683, by the Spanish Ministry of Economy and Competitiveness under Project TEC2017-86271-R and by the ATTRACT project funded by the EC under Grant Agreement 777222.
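    The frequency-shifter principle in aspect (2), double-sideband suppressed-carrier (DSB-SC) modulation followed by removal of one sideband, can be illustrated with a baseband toy model. The sketch below uses illustrative low frequencies (not the chip's optical or THz values) and an ideal spectral mask standing in for the asymmetric Mach-Zehnder filter.

```python
import numpy as np

# Toy DSB-SC frequency shifter: modulating a tone at f_carrier by a tone at
# f_shift creates sidebands at f_carrier +/- f_shift with the carrier
# suppressed; keeping only one sideband yields a shifted tone.
fs = 1e6                          # sample rate, Hz (illustrative)
t = np.arange(0, 0.02, 1 / fs)
f_carrier, f_shift = 50e3, 1e3    # tone to be shifted and desired shift (assumed)

carrier = np.cos(2 * np.pi * f_carrier * t)
dsb_sc = carrier * np.cos(2 * np.pi * f_shift * t)   # sidebands at f_carrier +/- f_shift

# Ideal removal of the lower sideband (stands in for the asymmetric MZI filter).
spectrum = np.fft.rfft(dsb_sc)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spectrum[freqs < f_carrier] = 0.0
shifted = np.fft.irfft(spectrum, n=t.size)

peak = freqs[np.argmax(np.abs(np.fft.rfft(shifted)))]
print(f"output tone at ~{peak:.0f} Hz (expected {f_carrier + f_shift:.0f} Hz)")
```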

    Georges Lemaître: la armonía entre ciencia y fe

    This article describes the main stages in the life and work of Georges Lemaître, a 20th-century Catholic priest and mathematical physicist. His two main contributions to scientific cosmology were his explanation of the galactic redshift as a result of the expansion of the universe, and his proposal of an explosive origin at a particular time in the past history of the cosmos. Professor Mariano Artigas was one of the first to introduce in Spain Lemaître's views on the relations between science and faith, which combined a necessary separation of the two disciplines without renouncing the discernment of subtle and profound interactions.