3,868 research outputs found

    Cognición y representación interna de entornos dinámicos en el cerebro de los mamíferos

    Unpublished doctoral thesis, Universidad Complutense de Madrid, Facultad de Ciencias Biológicas, defended 07/05/2021. Time is one of the most prominent dimensions that organize reality. Paradoxically, the temporal features of the natural world contain large amounts of redundant information, and yet internal coding of time in the brain seems to be crucial for anticipating time-changing, dynamic hazards.
    Allocating significant brain resources to processing the spatiotemporal aspects of complex environments should apparently be incompatible with survival, which requires fast and accurate responses. Nonetheless, animals make decisions under pressure and within narrow time windows. How does the brain achieve this? An effort to resolve this complexity-velocity trade-off led to a hypothesis called time compaction, which states that the brain does not encode time explicitly but embeds it into space. Theoretically, time compaction can significantly simplify internal representations of the environment and hence ease the brain workload devoted to planning and decision-making. Time compaction also provides an operational framework that aims to explain how perceived and produced dynamic situations are cognitively represented, in the form of spatial predictions or compact internal representations (CIRs) that can be stored in memory and used later to guide behaviour and generate action. Although successfully implemented in robots, time compaction still lacked assessment of its biological soundness as an actual cognitive mechanism in the brain...

    Test exploration and validation using transaction level models

    The complexity of the test infrastructure and test strategies in systems-on-chip approaches that of the functional design space. This paper presents test design space exploration and the validation of test strategies and schedules using transaction level models (TLMs). Since many aspects of testing involve the transfer of a significant amount of test stimuli and responses, the communication-centric view of TLMs suits this purpose exceptionally well.

    Functional tissue engineering of human heart valve leaflets


    A Test Vector Minimization Algorithm Based On Delta Debugging For Post-Silicon Validation Of Pcie Rootport

    In silicon hardware design, such as the design of PCIe devices, design verification is an essential part of the design process, whereby the devices are subjected to a series of tests that verify their functionality. However, manual debugging is still widely used in post-silicon validation and is a major bottleneck in the validation process, because a large number of test vectors has to be analyzed, which slows the process down. To solve this problem, a test vector minimizer algorithm is proposed to eliminate redundant test vectors that do not contribute to the reproduction of a test failure, hence improving debug throughput. The proposed methodology is inspired by the Delta Debugging algorithm, which has been used in automated software debugging but not in post-silicon hardware debugging. The minimizer operates on the principle of binary partitioning of the test vectors, iteratively running each subset (or its complement) on a post-silicon System-Under-Test (SUT) to identify and eliminate redundant test vectors. Test results using test vector sets containing deliberately introduced erroneous test vectors show that the minimizer is able to isolate the erroneous test vectors. In test cases containing up to 10,000 test vectors, the minimizer requires about 16ns per test vector when only one erroneous test vector is present. In a test case with 1000 vectors including erroneous vectors, the same minimizer requires about 140μs per injected erroneous test vector. Thus, the minimizer's CPU consumption is significantly smaller than the typical runtime of a test on the SUT. The factors that most significantly impact the performance of the algorithm are the number of erroneous test vectors and the distribution (spacing) of those vectors; the effects of the total number of test vectors and of the position of the erroneous vectors are relatively minor.
    The minimization algorithm was therefore most effective in cases with only a few erroneous test vectors within a large set of test vectors.
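The binary-partitioning loop described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the paper's implementation: the `fails` callback stands in for re-running a vector subset on the post-silicon SUT, and the PCIe-specific harness is omitted.

```python
def minimize(vectors, fails):
    """Delta-debugging-style reduction of a test vector list.

    `fails(subset)` reruns the subset on the system under test and
    returns True if the failure still reproduces (hypothetical hook).
    """
    n = 2  # current number of partitions
    while len(vectors) >= 2:
        chunk = max(len(vectors) // n, 1)
        subsets = [vectors[i:i + chunk] for i in range(0, len(vectors), chunk)]
        for s in subsets:
            complement = [v for v in vectors if v not in s]
            if fails(s):                      # failure isolated inside one subset
                vectors, n = s, 2
                break
            if complement and fails(complement):  # subset s was redundant
                vectors, n = complement, max(n - 1, 2)
                break
        else:
            if n >= len(vectors):             # finest granularity reached
                break
            n = min(n * 2, len(vectors))      # refine the partitioning
    return vectors

# toy SUT: the failure reproduces whenever vector 7 is in the run
print(minimize(list(range(10)), lambda s: 7 in s))  # → [7]
```

On the toy example the minimizer discards the nine redundant vectors and returns only the failure-inducing one, mirroring how redundant stimuli are pruned before manual debug.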

    New Techniques to Reduce the Execution Time of Functional Test Programs

    The compaction of test programs for processor-based systems is of utmost practical importance: Software-Based Self-Test (SBST) is nowadays increasingly adopted, especially for in-field test of safety-critical applications, and both the size and the execution time of the test are critical parameters. However, while compacting the size of binary test sequences has been thoroughly studied over the years, reducing the execution time of test programs is still a rather unexplored area of research. This paper describes a family of algorithms able to automatically enhance an existing test program, reducing the time required to run it and, as a side effect, its size. The proposed solutions are based on instruction removal and restoration, which is shown to be computationally more efficient than instruction removal alone. Experimental results demonstrate the compaction capabilities and allow analysis of the computational costs and effectiveness of the different algorithms.
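The removal-and-restoration idea can be sketched as follows, simplified here to single-instruction granularity; the `coverage` callback is a hypothetical stand-in for the fault simulation the paper's algorithms would run, and the paper's actual block-level strategies differ.

```python
def compact(program, coverage):
    """Tentatively remove each instruction; restore it if coverage drops.

    `coverage(prog)` is assumed to return the fault coverage the program
    achieves (in practice, via fault simulation; any callable works here).
    """
    target = coverage(program)        # coverage the full program achieves
    kept = list(program)
    i = 0
    while i < len(kept):
        candidate = kept[:i] + kept[i + 1:]   # remove instruction i
        if coverage(candidate) >= target:     # removal is harmless
            kept = candidate                  # keep it removed
        else:
            i += 1                            # restore: instruction i is needed
    return kept

# toy coverage metric: count which essential instructions remain
prog = ["nop", "load", "nop", "test", "store"]
cov = lambda p: len(set(p) & {"load", "test", "store"})
print(compact(prog, cov))  # → ['load', 'test', 'store']
```

The sketch makes the trade-off visible: every tentative removal costs one coverage evaluation, which is why the paper's combined removal-and-restoration schemes, which avoid re-evaluating from scratch, pay off.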

    March Test Generation Revealed

    Memory testing commonly faces two issues: the characterization of detailed and realistic fault models and the definition of time-efficient test algorithms. Among the different types of algorithms proposed for testing static random access memories, march tests have proven to be faster, simpler, and regularly structured. The majority of the published march tests have been manually generated. Unfortunately, the continuous evolution of memory technology introduces new classes of faults, such as dynamic and linked faults, and makes the task of handwriting test algorithms harder and less likely to yield optimal results. Although some researchers have published handmade march tests able to deal with new fault models, a comprehensive methodology to automatically generate march tests addressing both classic and new fault models is still an open issue. This paper proposes a new polynomial algorithm to automatically generate march tests. The formal model adopted to represent memory faults allows the definition of a general methodology to deal with static, dynamic, and linked faults. Experimental results show that the new automatically generated march tests reduce the test complexity and, therefore, the test time, compared to the well-known state of the art in memory testing.
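The regular structure that makes march tests attractive is easy to see in code: a march test is just a list of march elements, each an address-order plus a read/write sequence. Below, the classic manually derived March C- is written as data and run against a toy memory model; the stuck-at fault model is an illustrative assumption, not the paper's formal fault representation.

```python
# March C- as a list of march elements: (address order, operations).
# 'r0' = read expecting 0, 'w1' = write 1, and so on.
MARCH_C_MINUS = [
    ("up",   ["w0"]),
    ("up",   ["r0", "w1"]),
    ("up",   ["r1", "w0"]),
    ("down", ["r0", "w1"]),
    ("down", ["r1", "w0"]),
    ("up",   ["r0"]),
]

def run_march(memory, elements):
    """Apply a march test to a dict-like memory; return faulty addresses."""
    faults = set()
    for direction, ops in elements:
        addrs = sorted(memory)
        if direction == "down":
            addrs = addrs[::-1]
        for a in addrs:
            for op in ops:
                kind, val = op[0], int(op[1])
                if kind == "w":
                    memory[a] = val
                elif memory[a] != val:    # read mismatch: fault detected
                    faults.add(a)
    return faults

class StuckAt1(dict):
    """Toy fault model: writes to address 3 always store 1."""
    def __setitem__(self, a, v):
        super().__setitem__(a, 1 if a == 3 else v)

print(run_march({a: 0 for a in range(8)}, MARCH_C_MINUS))   # fault-free
print(run_march(StuckAt1({a: 0 for a in range(8)}), MARCH_C_MINUS))
```

Because each element sweeps the whole address space with a fixed operation sequence, test length grows only linearly with memory size, which is exactly the property automatic generation methods aim to preserve.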

    Morphometric Analysis to Characterize the Differentiation of Mesenchymal Stem Cells into Smooth Muscle Cells in Response to Biochemical and Mechanical Stimulation

    The morphology and biochemical phenotype of cells are closely linked. This relationship is important in progenitor cell bioengineering, which generates functional, tissue-specific cells from uncommitted precursors. Advances in biofabrication have demonstrated that cell shape can regulate cell behavior and alter phenotype-specific functions. Establishing accessible and rigorous techniques for quantifying cell shape will therefore facilitate assessment of cellular responses to environmental stimuli, and will enable more comprehensive understanding of developmental, pathological, and regenerative processes. For progenitor cells being induced into specific lineages, this ability becomes a pertinent means for validating their degree of differentiation and may lead to novel strategies for controlling cell phenotype. In our approach, we used the differentiation of adult human mesenchymal stem cells (MSCs) into smooth muscle cells (SMCs) as a model system to investigate the relationship between cell shape and phenotype. These cell types are responsive to mechanical and biochemical stimuli and the shape of SMCs is a recognized marker of differentiated state, providing a system in which morphological and biochemical phenotype are both understood and inducible. By applying exogenous stimuli, we changed cell shape and examined the corresponding cellular phenotype. In the first Aim, we applied stretch to MSCs on 2D collagen sheets to promote differentiation. Using mathematical shape factors, we quantified the morphological changes in response to defined stretch parameters. In the second Aim, we investigated the use of input energy as a means of controlling cell shape and corresponding differentiation. We examined how combinations of stretch parameters that produce equal energy input impacted morphology, and postulated that cell shape is a function of energy input. 
    In the third Aim, we translated our method of quantifying shape factors into 3D culture, and validated the method by investigating the differentiation of MSCs into SMCs by mechanical and growth factor stimulation. We used the shape factors to quantify morphological differences and compared these changes to biochemical markers. Our results demonstrate that mechanical stretch influences multiple aspects of MSC phenotype, including cell morphology. Shape factors described these changes objectively and quantitatively, and enabled the identification of relationships between SMC shape and differentiated state. Similar morphological responses could be induced using different combinations of stretch parameters that resulted in equal energy input. Cell shape followed a linear relationship with energy input despite the variance introduced by using MSCs from different patients. Only one SMC gene marker directly exhibited this relationship; however, partial least squares regression analysis revealed that other genes were also associated with shape factors. Translation of the shape quantification method into 3D systems revealed that while the additional dimensionality hindered comparison of morphology between 2D and 3D samples, these shape factors were still applicable within 3D systems. Differences in cell morphology caused by growth factors and mechanical stretch in 3D constructs were elucidated by shape analysis, and these phenotypic changes were corroborated through biochemical assays. Taken together, these results validate the use of cell shape as a means of characterizing phenotype and the process of progenitor cell differentiation. The automated method we developed generates a robust set of morphological parameters that provide a way to characterize the differentiation of MSCs into SMCs.
    This work has implications for our understanding of the relationship between cell morphology and phenotype, and may lead to new ways to control and improve differentiation efficiency in a variety of cell and tissue systems. PhD, Biomedical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/145833/1/brandanw_1.pd
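Two of the kinds of mathematical shape factors the abstract refers to, circularity and bounding-box aspect ratio, can be computed from a cell outline with elementary geometry. This is an illustrative stand-in only; the thesis's automated image-based pipeline and its exact set of shape factors are not reproduced here.

```python
import math

def shape_factors(outline):
    """Compute circularity and aspect ratio from a closed (x, y) outline."""
    closed = list(zip(outline, outline[1:] + outline[:1]))
    # shoelace formula for the enclosed area
    area = abs(sum(x0 * y1 - x1 * y0
                   for (x0, y0), (x1, y1) in closed)) / 2
    perimeter = sum(math.dist(p, q) for p, q in closed)
    circularity = 4 * math.pi * area / perimeter ** 2   # 1.0 for a circle
    xs, ys = zip(*outline)
    aspect = (max(xs) - min(xs)) / (max(ys) - min(ys))  # bounding-box ratio
    return circularity, aspect

# a regular 64-gon approximating a circle scores close to (1.0, 1.0),
# while elongated outlines yield lower circularity and higher aspect
circle = [(math.cos(t * math.tau / 64), math.sin(t * math.tau / 64))
          for t in range(64)]
c, a = shape_factors(circle)
```

Spindle-shaped, contractile SMCs would score low circularity and high aspect ratio on factors like these, which is why such descriptors can track differentiation state.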

    Diagnostic techniques in deflagration and detonation studies

    Advances in experimental, high-speed techniques can be used to explore the processes occurring within energetic materials. This review describes techniques used to study a wide range of processes: hot-spot formation, ignition thresholds, deflagration, sensitivity and, finally, the detonation process. As this is a wide field, the focus will be on small-scale experiments and quantitative studies. It is important that such studies are linked to predictive models, which inform the experimental design process. The range of stimuli includes thermal ignition, drop-weight, Hopkinson Bar and Plate Impact studies. Studies made with inert simulants are also included, as these are important in differentiating between reactive response and purely mechanical behaviour.