
    Functional Generative Design: An Evolutionary Approach to 3D-Printing

    Consumer-grade 3D printers are widely available, but their ability to print complex objects is limited. Therefore, new designs need to be discovered that serve the same function but are printable. A representative such problem is to produce a working, reliable mechanical spring. The proposed methodology for discovering solutions to this problem consists of three components: first, an effective search space is learned through a variational autoencoder (VAE); second, a surrogate model for functional designs is built; and third, a genetic algorithm is used to simultaneously update the hyperparameters of the surrogate and to optimize the designs using the updated surrogate. Using a car-launcher mechanism as a test domain, spring designs were 3D-printed and evaluated to update the surrogate model. Two experiments were then performed. In the first, the initial set of designs for the surrogate-based optimizer was selected randomly from the training set used to train the VAE, which resulted in an exploitative search behavior. In the second, the initial set was composed of designs selected more uniformly from the same training set, and a more explorative search behavior was observed. Both experiments showed that the methodology generates interesting, successful, and reliable spring geometries that are robust to the noise inherent in the 3D printing process. The methodology can be generalized to other functional design problems, thus making consumer-grade 3D printing more versatile. Comment: 8 pages, 12 figures, GECCO'1
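
    As a rough illustration of the three-component loop the abstract describes, the sketch below evolves latent design vectors with a genetic algorithm whose fitness comes from a surrogate fitted to (noisy) physical evaluations. The decoder stand-in, the RBF surrogate, the toy fitness function, and all parameter values are assumptions for illustration only; the joint tuning of surrogate hyperparameters by the GA is omitted.

```python
# Hedged sketch of surrogate-assisted evolutionary search over a learned
# latent design space, loosely following the three components in the
# abstract (VAE search space, surrogate model, genetic algorithm).
# All names, the toy fitness, and the RBF surrogate are illustrative
# assumptions, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 8          # dimensionality of the (hypothetical) VAE latent space


def decode(z):
    """Stand-in for the VAE decoder: latent vector -> printable design."""
    return z  # a real decoder would emit a spring geometry


def evaluate_printed_design(z):
    """Stand-in for 3D-printing and physically testing a design (noisy)."""
    return -np.sum((z - 0.5) ** 2) + rng.normal(scale=0.05)


class RBFSurrogate:
    """Tiny radial-basis-function regressor used as the surrogate."""

    def __init__(self, gamma=1.0):
        self.gamma, self.X, self.w = gamma, None, None

    def fit(self, X, y):
        K = np.exp(-self.gamma * np.square(X[:, None] - X[None]).sum(-1))
        self.X = X
        self.w = np.linalg.solve(K + 1e-6 * np.eye(len(X)), y)

    def predict(self, X):
        K = np.exp(-self.gamma * np.square(X[:, None] - self.X[None]).sum(-1))
        return K @ self.w


def ga_step(pop, fitness, mut_sigma=0.1):
    """One generation: keep the fitter half, recombine, and mutate."""
    order = np.argsort(-fitness)
    parents = pop[order[: len(pop) // 2]]
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(LATENT_DIM) < 0.5
        child = np.where(mask, a, b) + rng.normal(scale=mut_sigma, size=LATENT_DIM)
        children.append(child)
    return np.array(children)


# Initial designs: here sampled randomly (the "exploitative" variant in the
# abstract); a more uniform spread over the training set gave the
# "explorative" behaviour.
archive_X = rng.normal(size=(20, LATENT_DIM))
archive_y = np.array([evaluate_printed_design(decode(z)) for z in archive_X])

pop = rng.normal(size=(30, LATENT_DIM))
for generation in range(10):
    surrogate = RBFSurrogate(gamma=0.5)
    surrogate.fit(archive_X, archive_y)
    pop = ga_step(pop, surrogate.predict(pop))
    # "Print" (here: simulate) the surrogate's best candidate and add the
    # real measurement back into the archive so the surrogate improves.
    best = pop[np.argmax(surrogate.predict(pop))]
    archive_X = np.vstack([archive_X, best])
    archive_y = np.append(archive_y, evaluate_printed_design(decode(best)))

print("best measured fitness:", archive_y.max())
```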

    Bio-inspired 0.35μm CMOS Time-to-Digital Converter with 29.3ps LSB

    A time-to-digital converter (TDC) integrated circuit is introduced in this paper. It is based on a chain of delay elements composing a regular, scalable structure. The scheme is analogous to the sound-direction-sensitive neural system found in the barn owl. The circuit occupies a small silicon area, and its direct mapping from time to a position code makes conversion rates of up to 500 Msps possible. A specialty of the circuit is its structural and functional symmetry; therefore, the roles of the start and stop signals are interchangeable. In other words, negative delay is acceptable: the circuit has no dead-time problems. These are benefits of the biological model of auditory scene representation in the bird's brain. The prototype chip is implemented in 0.35 μm CMOS and achieves less than 30 ps single-shot resolution in measurements. Hungarian National Research Foundation TS4085
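
    The conversion principle can be illustrated behaviourally: two edges propagate toward each other along a symmetric delay chain, and the cell where they coincide is the output code, so swapping start and stop merely mirrors the code. The chain length, per-cell delay, and coincidence rule in the sketch below are assumptions for illustration, not the fabricated 0.35 μm circuit.

```python
# Hedged behavioural sketch of the barn-owl-inspired (Jeffress-style)
# time-to-position conversion described in the abstract.  The parameters
# are illustrative assumptions, not measurements of the real chip.
TAU_PS = 29.3           # assumed per-cell delay, in picoseconds
N_CELLS = 64            # assumed chain length


def meeting_cell(start_ps: float, stop_ps: float) -> int:
    """Cell index where the two counter-propagating edges coincide."""
    best_cell, best_skew = 0, float("inf")
    for cell in range(N_CELLS + 1):
        t_start = start_ps + cell * TAU_PS              # start: left to right
        t_stop = stop_ps + (N_CELLS - cell) * TAU_PS    # stop: right to left
        if abs(t_start - t_stop) < best_skew:
            best_cell, best_skew = cell, abs(t_start - t_stop)
    return best_cell


# The structure is symmetric: swapping start and stop mirrors the code
# around the centre cell, so negative intervals (stop before start) are
# representable and there is no dead time in this idealised model.
centre = meeting_cell(0.0, 0.0)
print(meeting_cell(0.0, 120.0) - centre)    # positive interval -> code above centre
print(meeting_cell(120.0, 0.0) - centre)    # swapped edges     -> mirrored code
```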

    Transformer Networks for Trajectory Forecasting

    Most recent successes in forecasting people's motion are based on LSTM models, and most recent progress has been achieved by modelling the social interaction among people and the interaction of people with the scene. We question the use of LSTM models and propose the novel use of Transformer Networks for trajectory forecasting. This is a fundamental switch from the sequential step-by-step processing of LSTMs to the attention-only memory mechanisms of Transformers. In particular, we consider both the original Transformer Network (TF) and the larger Bidirectional Transformer (BERT), state-of-the-art on all natural language processing tasks. Our proposed Transformers predict the trajectories of the individual people in the scene. These are "simple" models because each person is modelled separately, without any complex human-human or human-scene interaction terms. In particular, the TF model without bells and whistles yields the best score on the largest and most challenging trajectory forecasting benchmark, TrajNet. Additionally, its extension, which predicts multiple plausible future trajectories, performs on par with more engineered techniques on the 5 datasets of ETH + UCY. Finally, we show that Transformers may deal with missing observations, as may be the case with real sensor data. Code is available at https://github.com/FGiuliari/Trajectory-Transformer. Comment: 18 pages, 3 figure
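
    A minimal sketch of the per-person, attention-only idea follows, assuming PyTorch and made-up layer sizes; the encoder-only layout, dimensions, and the single-shot prediction head are illustrative assumptions, not the authors' released model at the URL above.

```python
# Hedged sketch of a "simple" per-person Transformer forecaster: each
# pedestrian's observed positions are embedded, passed through a standard
# Transformer encoder, and a linear head predicts future displacements.
import torch
import torch.nn as nn


class TrajectoryTransformer(nn.Module):
    def __init__(self, obs_len=8, pred_len=12, d_model=64, nhead=4, layers=3):
        super().__init__()
        self.embed = nn.Linear(2, d_model)                       # (x, y) -> d_model
        self.pos = nn.Parameter(torch.zeros(obs_len, d_model))   # learned positions
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.head = nn.Linear(d_model, pred_len * 2)             # future (x, y) steps
        self.pred_len = pred_len

    def forward(self, obs):                                # obs: (B, obs_len, 2)
        h = self.encoder(self.embed(obs) + self.pos)       # (B, obs_len, d_model)
        out = self.head(h[:, -1])                          # read out last time step
        return out.view(-1, self.pred_len, 2)              # (B, pred_len, 2)


# Toy usage: 16 pedestrians, 8 observed frames each, predict 12 frames;
# note there is no social or scene term, each person is handled separately.
model = TrajectoryTransformer()
observed = torch.randn(16, 8, 2)
predicted = model(observed)
print(predicted.shape)                                     # torch.Size([16, 12, 2])
```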

    MonetDB/XQuery: a fast XQuery processor powered by a relational engine

    Relational XQuery systems try to re-use mature relational data management infrastructures to create fast and scalable XML database technology. This paper describes the main features, key contributions, and lessons learned while implementing such a system. Its architecture consists of (i) a range-based encoding of XML documents into relational tables, (ii) a compilation technique that translates XQuery into a basic relational algebra, (iii) a restricted (order) property-aware peephole relational query optimization strategy, and (iv) a mapping from XML update statements into relational updates. Thus, this system implements all essential XML database functionalities (rather than a single feature), such that we can learn from the full consequences of our architectural decisions. While implementing this system, we had to extend the state of the art with a number of new technical contributions, such as the loop-lifted staircase join and efficient relational query evaluation strategies for XQuery theta-joins with existential semantics. These contributions, as well as the architectural lessons learned, are also deemed valuable for other relational back-end engines. The performance and scalability of the resulting system are evaluated on the XMark benchmark up to data sizes of 11 GB. The performance section also provides an extensive benchmark comparison with all major XMark results published previously, which confirms that the goal of purely relational XQuery processing, namely speed and scalability, was met.
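
    Component (i), the range-based document encoding, can be sketched as follows: each XML node becomes one relational tuple, and an axis step such as descendant reduces to a range predicate over that table, which is the property the staircase join exploits. The column names, the tiny document, and the pre/size/level layout below are illustrative assumptions, not the system's actual storage format.

```python
# Hedged sketch of a range-based ("pre/size/level") encoding of an XML
# document into a relational table.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<site><people><person><name>Ann</name></person></people>"
    "<regions><item/></regions></site>"
)


def encode(node, level=0, pre=0, rows=None):
    """Depth-first traversal assigning (pre, size, level, tag) per node."""
    if rows is None:
        rows = []
    my_row = [pre, 0, level, node.tag]      # size filled in after recursion
    rows.append(my_row)
    next_pre = pre + 1
    for child in node:
        rows, next_pre = encode(child, level + 1, next_pre, rows)
    my_row[1] = next_pre - pre - 1          # size = number of descendants
    return rows, next_pre


table, _ = encode(doc)
for pre, size, level, tag in table:
    print(f"pre={pre} size={size} level={level} tag={tag}")

# A descendant step from a context node c becomes a range scan:
#   pre(c) < pre(d) <= pre(c) + size(c)
ctx = table[1]                               # the <people> node
descendants = [r for r in table if ctx[0] < r[0] <= ctx[0] + ctx[1]]
print([r[3] for r in descendants])           # ['person', 'name']
```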

    Reducing CSF partial volume effects to enhance diffusion tensor imaging metrics of brain microstructure

    Technological advances over recent decades now allow for in vivo observation of human brain tissue through the use of neuroimaging methods. While this field originated with techniques capable of capturing macrostructural details of brain anatomy, modern methods such as diffusion tensor imaging (DTI) that are now regularly implemented in research protocols have the ability to characterize brain microstructure. DTI has been used to reveal subtle micro-anatomical abnormalities in the prodromal phase of various diseases and also to delineate “normal” age-related changes in brain tissue across the lifespan. Nevertheless, imaging artifact in DTI remains a significant limitation for identifying true neural signatures of disease and brain-behavior relationships. Cerebrospinal fluid (CSF) contamination of brain voxels is a main source of error on DTI scans that causes partial volume effects and reduces the accuracy of tissue characterization. Several methods have been proposed to correct for CSF artifact, though many of these methods introduce new limitations that may preclude certain applications. The purpose of this review is to discuss the complexity of signal acquisition as it relates to CSF artifact on DTI scans and to review methods of CSF suppression in DTI. We will then discuss a technique that has been recently shown to effectively suppress the CSF signal in DTI data, resulting in fewer errors and improved measurement of brain tissue. This approach and related techniques have the potential to significantly improve our understanding of “normal” brain aging and neuropsychiatric and neurodegenerative diseases. Considerations for next-level applications are discussed.
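
    To make the partial volume effect concrete, the small numerical sketch below mixes a tissue compartment and a free-water (CSF) compartment in one voxel and shows how the apparent diffusivity recovered by a naive mono-exponential fit rises with the CSF fraction. The b-value and diffusivities are typical textbook numbers, not values taken from this review.

```python
# Hedged numerical sketch of CSF partial volume inflating a diffusion
# metric: two-compartment voxel signal, then a naive mono-exponential fit.
import numpy as np

B_VALUE = 1000.0        # s/mm^2, common DTI shell
D_TISSUE = 0.7e-3       # mm^2/s, typical parenchymal diffusivity (assumed)
D_CSF = 3.0e-3          # mm^2/s, free water at body temperature (assumed)

for csf_fraction in (0.0, 0.1, 0.3, 0.5):
    # mixed voxel signal relative to the b=0 signal
    signal = ((1 - csf_fraction) * np.exp(-B_VALUE * D_TISSUE)
              + csf_fraction * np.exp(-B_VALUE * D_CSF))
    apparent_D = -np.log(signal) / B_VALUE     # naive mono-exponential fit
    print(f"CSF fraction {csf_fraction:.1f} -> apparent D {apparent_D:.2e} mm^2/s")
```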

    Compressing DNA sequence databases with coil

    Background: Publicly available DNA sequence databases such as GenBank are large, and are growing at an exponential rate. The sheer volume of data being dealt with presents serious storage and data communications problems. Currently, sequence data is usually kept in large "flat files," which are then compressed using standard Lempel-Ziv (gzip) compression – an approach which rarely achieves good compression ratios. While much research has been done on compressing individual DNA sequences, surprisingly little has focused on the compression of entire databases of such sequences. In this study we introduce the sequence database compression software coil. Results: We have designed and implemented a portable software package, coil, for compressing and decompressing DNA sequence databases based on the idea of edit-tree coding. coil is geared towards achieving high compression ratios at the expense of execution time and memory usage during compression – the compression time represents a "one-off investment" whose cost is quickly amortised if the resulting compressed file is transmitted many times. Decompression requires little memory and is extremely fast. We demonstrate a 5% improvement in compression ratio over state-of-the-art general-purpose compression tools for a large GenBank database file containing Expressed Sequence Tag (EST) data. Finally, coil can efficiently encode incremental additions to a sequence database. Conclusion: coil presents a compelling alternative to conventional compression of flat files for the storage and distribution of DNA sequence databases having a narrow distribution of sequence lengths, such as EST data. Increasing compression levels for databases having a wide distribution of sequence lengths is a direction for future work.
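
    The general idea behind edit-based coding of a sequence database can be sketched as follows: near-duplicate sequences are stored as short edit scripts against a previously seen reference rather than as full strings. The clustering, reference selection, and entropy coding that coil actually performs are omitted here, and difflib opcodes are only a stand-in for its edit-tree representation.

```python
# Hedged sketch of storing similar DNA sequences as edit scripts against a
# reference instead of as full strings.
import difflib

database = [
    "ACGTACGTACGTTTGACCA",
    "ACGTACGTACGTTTGACCT",    # one substitution vs. the first sequence
    "ACGTACGGACGTTTGACCA",    # one substitution elsewhere
]

reference = database[0]
encoded = [("raw", reference)]
for seq in database[1:]:
    ops = difflib.SequenceMatcher(a=reference, b=seq).get_opcodes()
    # keep only the non-matching regions as (operation, positions, new text)
    edits = [(tag, i1, i2, seq[j1:j2]) for tag, i1, i2, j1, j2 in ops if tag != "equal"]
    encoded.append(("delta", edits))

print(encoded[1])   # e.g. ('delta', [('replace', 18, 19, 'T')])


def decode(entry, reference):
    """Rebuild a sequence from its edit script against the reference."""
    kind, payload = entry
    if kind == "raw":
        return payload
    out, cursor = [], 0
    for tag, i1, i2, new_text in payload:
        out.append(reference[cursor:i1])    # copy the unchanged stretch
        out.append(new_text)                # apply the edit
        cursor = i2
    out.append(reference[cursor:])
    return "".join(out)


assert all(decode(e, reference) == s for e, s in zip(encoded, database))
```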