180 research outputs found

    Relationship between vestibular hair cell loss and deficits in two anti-gravity reflexes in the rat.

    The tail-lift reflex and the air-righting reflex in rats are anti-gravity reflexes that depend on vestibular function. To begin identifying their cellular basis, this study examined the relationship between reflex loss and the graded lesions caused in the vestibular sensory epithelia by varying doses of an ototoxic compound (IDPN). After ototoxic exposure, we recorded these reflexes using high-speed video. The recordings were used to obtain objective measures of the reflexes: the minimum angle formed by the nose, the back of the neck, and the base of the tail during the tail-lift maneuver, and the time to right in the air-righting test. The vestibular sensory epithelia were then collected from the rats and used to estimate the loss of type I hair cells (HCI), type II hair cells (HCII), and all hair cells (HC) in both central and peripheral parts of the crista, utricle, and saccule. As expected, tail-lift angles decreased and air-righting times increased while the numbers of HCs remaining in the epithelia decreased in a dose-dependent manner. The results demonstrated a greater sensitivity of HCI than HCII to IDPN ototoxicity, as well as a relative resiliency of the saccule compared to the crista and utricle. Comparing the functional measures with the cell counts, we observed that loss of the tail-lift reflex associates better with HCI loss than with HCII loss. In contrast, most HCI in the crista and utricle were lost before air-righting times increased. These data suggest that these reflexes depend on the function of non-identical populations of vestibular HCs.
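The tail-lift angle described above is a vertex angle over three tracked landmarks. A minimal sketch of how it could be computed from 2-D landmark coordinates (function names and the frame structure are illustrative assumptions, not the study's actual analysis pipeline):

```python
import math

def angle_at_vertex(nose, neck, tail_base):
    """Angle (degrees) at the neck, between the neck->nose and neck->tail_base vectors."""
    v1 = (nose[0] - neck[0], nose[1] - neck[1])
    v2 = (tail_base[0] - neck[0], tail_base[1] - neck[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))

def tail_lift_angle(frames):
    """Minimum nose-neck-tail angle across the frames of one tail-lift maneuver."""
    return min(angle_at_vertex(n, k, t) for n, k, t in frames)
```

A healthy rat extends its body during the lift (angle near 180 degrees), so lower minimum angles indicate a weaker reflex, matching the dose-dependent decrease reported above.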

    Computational Characterization of Activities and Learners in a Learning System

    For a technology-based learning system to be able to personalize its learning process, it must characterize the learners. This can be achieved by storing information about them in a feature vector. The aim of this research is to propose such a system. In our proposal, the students are characterized based on their activity in the system, so learning activities also need to be characterized. The vectors are data structures formed by numerical or categorical variables such as learning style, cognitive level, knowledge type, or the history of the learner's actions in the system. The learner's feature vector is updated considering the results and the timing of the activities performed by the learner. A use case is also presented to illustrate how the variables can be used to achieve different effects on the learning of individuals through the use of instructional strategies. The most valuable contribution of this proposal is that students are characterized based on their activity in the system instead of on self-reporting. Another important contribution is the practical nature of the vectors, which allows them to be computed by an artificial intelligence algorithm.
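A learner feature vector of the kind described, mixing numerical and categorical variables and updated from activity results and timing, might be sketched as follows (the field names and the exponential-moving-average update rule are illustrative assumptions, not the paper's specification):

```python
from dataclasses import dataclass, field

@dataclass
class LearnerVector:
    learning_style: str = "visual"               # categorical variable
    cognitive_level: float = 0.0                 # numerical variable in [0, 1]
    history: list = field(default_factory=list)  # learner's past actions in the system

    def update(self, activity_id: str, score: float, seconds: float, alpha: float = 0.2):
        """Blend a new activity result into the cognitive level and log the action."""
        self.cognitive_level = (1 - alpha) * self.cognitive_level + alpha * score
        self.history.append((activity_id, score, seconds))

lv = LearnerVector()
lv.update("quiz-01", 0.8, 120.0)
```

Because every field is a plain numerical or categorical value, the vector can be fed directly to a machine-learning model, which is the practical property the abstract highlights.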

    Face-to-Face vs On-line: An analysis of Profile, Learning, Performance and Satisfaction among Post Graduate Students

    The aim of this study is to explore the differences between face-to-face and on-line students in a postgraduate education program. The variables considered are the postgraduate student's profile, competences and learning outcomes, academic performance, and satisfaction. The sample was composed of 47 students (64% face-to-face). Analyses of variance (ANOVA) and Student's t-tests were performed using SPSS Statistics 22.0. Results showed differences in all variables: (i) regarding student profile, face-to-face students were younger and from a broader range of nationalities; (ii) both student profiles showed positive and significant differences between their pre-post competences, learning outcomes, and self-evaluation scores in several of the program's courses, and there were significant differences when considering specific courses and profiles; (iii) face-to-face students obtained better grades in 4 out of 7 courses of the postgraduate program; (iv) finally, face-to-face students reported higher satisfaction and a more positive perception of the teaching methodologies than on-line students. Theoretical and practical implications for improving specific teaching methodologies for on-line students are discussed.

    Trabajando las emociones en el aula desde la inteligencia emocional

    This work comprises two main parts. The first is a theoretical foundation, including a brief historical overview of the concept of emotional intelligence, the theory of multiple intelligences, the benefits and competences of emotional education, and its implementation in the classroom. The second part presents an intervention proposal consisting of a series of sessions for working on the different aspects of emotional intelligence in the classroom with primary education students.

    Direct Evidence of Internalization of Tau by Microglia in Vitro and in Vivo

    The microtubule-associated protein (MAP) tau plays a critical role in the pathogenesis of tauopathies. Excess tau can be released into the extracellular medium in a physiological or pathological manner and internalized by surrounding neurons, a process that contributes to the spread of this protein throughout the brain. Such spreading may correlate with the progression of the abovementioned diseases. In addition to neurons, tau can be internalized into other cells. Here we demonstrate that microglia take up tau in vitro and in vivo. In this regard, microglia from primary cultures internalized soluble (human recombinant tau42) and insoluble (homogenates derived from human AD brain) tau in vitro. Furthermore, using stereotaxic injection of tau in mice in vivo, we show that murine microglia internalize human tau. In addition, we demonstrate, for the first time, that microglia colocalize with various forms of tau in postmortem brain tissue of patients with Alzheimer's disease and non-demented control subjects. Our data reveal a potential role of microglia in the internalization of tau that might be relevant for the design of strategies to enhance the clearance of extracellular tau in neurodegenerative diseases characterized by the accumulation of this protein. Funding: Spanish Ministry of Health, the Comunidad de Madrid, the Centro de Investigación Biomédica en Red sobre Enfermedades Neurodegenerativas (CIBERNED, ISCIII), and the Alzheimer's Association. Peer Reviewed

    Virtualizing super-computation on-board UAS

    Unmanned aerial systems (UAS, also known as UAV, RPAS, or drones) have great potential to support a wide variety of aerial remote sensing applications. Most UAS work by acquiring data using on-board sensors for later post-processing; some require the gathered data to be downlinked to the ground in real time. However, depending on the volume of data and the cost of the communications, the latter option is not sustainable in the long term. This paper develops the concept of virtualizing super-computation on-board UAS as a method to ease operation by facilitating the downlink of high-level information products instead of raw data. Exploiting recent developments in miniaturized multi-core devices is the way to speed up on-board computation, provided the hardware satisfies size, power, and weight constraints. Several technologies are appearing with promising results for high-performance computing on unmanned platforms, such as the 36 cores of the TILE-Gx36 by Tilera (now EZchip) or the 64 cores of the Epiphany-IV by Adapteva. The strategy for virtualizing super-computation on board includes benchmarking for hardware selection, the software architecture, and the communications-aware design. A parallelization strategy is given for the 36-core TILE-Gx36 for a UAS in a fire mission or similar target-detection applications. The results are obtained for payload image processing algorithms and determine in real time the data snapshot to gather and transfer to the ground according to the needs of the mission, the processing time, and the watts consumed. Postprint (published version)
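The paper's TILE-Gx36 parallelization strategy is not reproduced here; as a generic illustration only, fanning per-tile target detection out across workers so that only high-level detections (not raw frames) need downlinking can be sketched like this (the detector, tile format, and thresholding are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def detect_hotspots(tile):
    """Toy per-tile detector: flag the tile if any pixel exceeds its threshold."""
    tile_id, pixels, threshold = tile
    return tile_id, any(p > threshold for p in pixels)

def process_frame(tiles, workers=4):
    """Fan tiles out across workers; keep only the flagged tile ids for downlink."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [tid for tid, hit in pool.map(detect_hotspots, tiles) if hit]
```

The design point mirrors the abstract: the expensive per-tile work runs on board in parallel, and only the small list of flagged tiles is transferred to the ground.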

    Experimental investigation of turbulence level in enhanced heat exchangers with active insert devices

    This work presents a visualization study carried out on a dynamic insert device. The flow pattern is obtained using the Particle Image Velocimetry (PIV) technique. The insert device is moved alternately along a tube and consists of several circular elements, each with six circumferentially distributed holes, mounted on a shaft with a pitch of 5D. The assembly is moved alternately along the axial direction by a hydraulic cylinder. The increase in the turbulence level of the flow is analyzed and related to the heat transfer augmentation. Using the PIV technique with water as the test fluid, the two-dimensional pattern of the turbulent flow is obtained on the two symmetry planes of the device: through the hole centers and between holes. The results establish the flow pattern along the devices. With the scraper static, experiments are carried out at three different Reynolds numbers ranging from 4000 to 6323. In dynamic conditions, the Reynolds number is kept constant at 7400 while the scraper velocity is varied at ratios of 0.5, 1, and 2 relative to the bulk velocity of the flow. Co-current and counter-current directions of the scraping device are analyzed, and the results show the effect of the insert device movement on flow behavior.
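For reference, the Reynolds numbers quoted above follow the standard pipe-flow definition Re = rho * u * D / mu. A small sketch (the density and viscosity defaults are typical values for water near room temperature, and the example velocity and diameter are illustrative, not the study's conditions):

```python
def reynolds(velocity, diameter, density=998.0, viscosity=1.0e-3):
    """Pipe-flow Reynolds number Re = rho * u * D / mu, all quantities in SI units."""
    return density * velocity * diameter / viscosity

# e.g. water at 0.2 m/s in an 18 mm tube lands near the low end of the tested range
re = reynolds(0.2, 0.018)
```

Values in the 4000-7400 range reported above are well past the laminar-turbulent transition (roughly Re > 4000 in pipe flow), consistent with the turbulent patterns the PIV measurements resolve.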

    A literature review on thermal comfort performance of parametric façades

    Thermal performance is a major aspect of the building envelope and is receiving increasing attention globally. Nowadays, parametric design methods are used in building envelope design, such as façade design, to optimize building envelopes, which can affect thermal performance and energy consumption. Moreover, new technologies applied to building design have not only changed the appearance of cities but also increased occupant comfort. This paper presents a systematic review of tools and techniques used in recent years to improve thermal comfort for residents by applying parametric design panels to a second-skin façade. It collects and synthesizes the most relevant evidence and methodologies. In this paper, 30 articles are analyzed and classified by methodology, year, and climate zone. Results suggest that simulation is the most accurate approach compared with the other methodologies. Peer Reviewed. Postprint (published version)

    Modelo de aprendizaje personalizado y adaptativo

    For years we have watched our society change hand in hand with the evolution of Information Technologies (IT). We find ourselves in a constantly changing environment in which information is continually renewed, leading us toward dynamic, continuous learning whose spatial and temporal barriers are disappearing, toward global learning. Education is therefore immersed in a process of change, a transformation to meet the new characteristics and needs that society presents in this new environment: a true digital transformation. It is a different form of learning in which, moreover, educational spaces are becoming delocalized. In this transformation process, the potential and rapid growth of IT can play a crucial role and form the basis for a true evolution of learning in this digital society. However, these expectations have not been met: the use of IT in education is not achieving the expected effect and is not contributing to a true transformation of the learning process. Among other reasons, IT is being used as a mere complementary tool for conventional teaching, when we should be using it to deepen the learning process. To achieve significant change we must go beyond instrumental use. For this reason, we propose an adaptive and personalized learning model to serve as the basis for a learning system that covers the needs detected in the digital society without neglecting intentional educational objectives. We have named this model CALM, an acronym for Customized Adaptive Learning Model. 
    The model adapts the learning flow to the characteristics and state of each learner and seeks to increase their motivation by offering them autonomy in their own learning process, in a continuous cycle of improvement. All of this is designed and supervised at all times by the teacher, whose role we consider crucial in this process. In CALM, the course is divided into competences (the knowledge, skills, and aptitudes that learners will acquire) arranged as a directed graph or, as we call it, a competence map. These competences are developed by performing activities, and the system itself, through what we call the selection engine, assigns each learner at each moment the activity it considers most appropriate. The teacher, in turn, designs the whole proposal, creating the competences, configuring the map, and adding the activities. The teacher can then supervise the process of all learners at any time, analyzing their progress and state both collectively and individually, and manage it through a key factor we introduce into the model: instructional strategies. Through them, the teacher can guide the model (the selection engine) in the selection of activities, so that, although the engine dynamically analyzes each learner's characteristics using artificial intelligence techniques to assign an activity, the teaching strategy determines the final decision according to the criteria the teacher considers appropriate, at both the individual and the group level. With CALM, we have proposed a basis for building an intelligent learning system that covers the educational needs of today's society through adaptive and personalized learning, always keeping teaching objectives in mind.
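The competence map described above is a directed graph whose edges encode prerequisite relations. A minimal sketch of how a selection engine might pick the next activity from such a map (the competence names, graph encoding, and first-match rule are illustrative assumptions, not CALM's actual engine):

```python
# Competence map: each competence lists the competences that must be mastered first.
PREREQS = {
    "fractions": [],
    "ratios": ["fractions"],
    "percentages": ["fractions", "ratios"],
}

def unlocked(mastered):
    """Competences not yet mastered whose prerequisites are all mastered."""
    return [c for c, pre in PREREQS.items()
            if c not in mastered and all(p in mastered for p in pre)]

def next_activity(mastered, activities):
    """Pick the first activity (id, target-competence pairs) that targets an unlocked competence."""
    open_comps = set(unlocked(mastered))
    for act, comp in activities:
        if comp in open_comps:
            return act
    return None
```

In CALM's terms, the instructional strategy set by the teacher would replace the simple first-match rule here, overriding which of the unlocked competences the engine targets for a given learner or group.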

    Adaptive learning based on competences and activities

    Faced with traditional one-size-fits-all learning, we propose an open, collaborative, flexible, and scalable adaptive learning model based on information technologies. The central elements of the model are the concepts of competence and learning activity, and it is structured around three main components: the teaching board (for course design based on competences and activities), the student workspace (in which the training activities are carried out and the state of competences and activities is maintained), and the selection engine (responsible for selecting activities according to the student's learning progress). The model allows customization of content, adapted to each user's level of knowledge and progress, through different learning itineraries chosen by the user. It incorporates the concepts of refreshment, reinforcement, and freedom of choice, so that students are given autonomy.