
    DEAR project: Lunar dust surface interactions, risk and removal investigations

    The DEAR project (Dusty Environment Application Research) investigates the interaction between lunar regolith and the surfaces and components relevant for lunar exploration. It builds on the TUBS regolith simulant, which is representative of lunar soils in chemistry, particle size, and particle shape, to study regolith transport, adhesion, and cleaning strategies. The simulant is applied to thermal, structural, optical-sensor, sealing, and other astronautic systems, providing input for requirements, justification, and verification. The key applications are split into two groups: human-spaceflight regolith investigations (wrinkled surfaces with random movement) and hardware surfaces (flat materials with defined movement). The paper provides an overview of the DEAR project, including a discussion of first results, in particular the effect of vibration, shock, and micro-vibration on regolith-bearing surfaces. The investigation is intended to enable a better understanding of regolith layer interaction and release mechanisms, as well as of potential cross-contamination and cleaning strategies. The experimental work is complemented by simulations of regolith motion, with surface-plasma interactions as a parameter. The project is funded and supported by the European Space Agency (ESA). DEAR specifically addresses the development and testing of lunar dust removal strategies for optics, mechanisms, and human-spaceflight hardware (e.g., space suits). As the Moon's regolith is known to be highly abrasive, electrically chargeable, and potentially chemically reactive, lunar dust may reduce the performance of hardware such as cameras, thermal control surfaces, and solar cells, and can cause malfunctions of seals for on/off mechanisms or space suits. Of particular interest are risk assessment, avoidance, and cleaning techniques such as the use of electric fields to remove lunar dust from surfaces. Representative dust (e.g., regolith analogues of landing sites of interest) is used in a dedicated test setup to evaluate the risks and effects of lunar dust. We describe designs and methods developed by the DEAR consortium to deal with regolith-related issues, in particular an electrode design to deflect regolith particles, cleaning of astronautical systems with CO2, the design of a robotic arm for testing within the DEAR chamber, regolith removal via shock, and regolith interaction with cleanroom textiles.
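    To make the electric-field cleaning idea concrete, the sketch below simulates a charged regolith grain driven along a surface by a travelling-wave electric field, the basic principle behind electrodynamic dust shields. This is not the DEAR consortium's electrode design; the particle charge, mass, field amplitude, electrode pitch, and drive frequency are all illustrative assumptions.

```python
# Minimal sketch (not the DEAR electrode design): a charged regolith
# grain transported by a travelling-wave electric field, the principle
# behind electrodynamic dust removal. All parameter values are assumed.
import numpy as np

Q = 1e-14        # particle charge [C] (assumed)
M = 1e-12        # particle mass [kg] (assumed, roughly a 10 um grain)
E0 = 1e5         # tangential field amplitude [V/m] (assumed)
WAVELEN = 1e-3   # electrode pitch / field wavelength [m] (assumed)
FREQ = 50.0      # drive frequency [Hz] (assumed)

def field(x: float, t: float) -> float:
    """Tangential component of a travelling-wave field E(x, t)."""
    k = 2.0 * np.pi / WAVELEN
    w = 2.0 * np.pi * FREQ
    return E0 * np.cos(k * x - w * t)

def simulate(t_end: float = 0.2, dt: float = 1e-5) -> float:
    """Semi-implicit Euler integration of the grain's tangential motion.

    Adhesion and gravity are neglected in this 1D illustration.
    """
    x, v = 0.0, 0.0
    for step in range(int(t_end / dt)):
        a = Q * field(x, step * dt) / M   # acceleration from qE
        v += a * dt
        x += v * dt
    return x

print(f"net displacement after 0.2 s: {simulate() * 1e3:.2f} mm")
```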

    An interactive editor for curve-skeletons: SkeletonLab

    Curve-skeletons are powerful shape descriptors able to provide higher-level information on the topology, structure, and semantics of a given digital object. Their range of application is wide and encompasses computer animation, shape matching, modelling, and remeshing. While a universally accepted definition of the curve-skeleton is still lacking, there are currently many algorithms for curve-skeleton computation (skeletonization), as well as different techniques for building a mesh around a given curve-skeleton (inverse skeletonization). Despite their widespread use, automatically extracted skeletons usually need to be processed before they can be used in further stages of a pipeline, due to differing requirements. We present an advanced tool, named SkeletonLab, that provides simple interactive techniques to rapidly and automatically edit and repair curve-skeletons generated with the different techniques proposed in the literature, as well as to handcraft them. The aim of the tool is to allow trained practitioners to manipulate the curve-skeletons obtained with skeletonization algorithms so that they fit their specific pipelines, or to explore the requirements of newly developed techniques.
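    As background for what such an editor manipulates, the sketch below shows a typical curve-skeleton representation (a graph of 3D nodes, each carrying the radius of its maximal inscribed ball) together with one common repair operation, pruning short terminal branches. These are not SkeletonLab's internal data structures; the class and method names are hypothetical.

```python
# Minimal sketch (not SkeletonLab's internals): a curve-skeleton as a
# graph of 3D nodes with maximal-ball radii, plus one typical repair
# operation: pruning spurious short terminal branches.
from collections import defaultdict

class CurveSkeleton:
    def __init__(self):
        self.pos = {}                # node id -> (x, y, z)
        self.radius = {}             # node id -> maximal-ball radius
        self.adj = defaultdict(set)  # node id -> neighbouring node ids

    def add_node(self, nid, xyz, r):
        self.pos[nid], self.radius[nid] = xyz, r

    def add_edge(self, a, b):
        self.adj[a].add(b)
        self.adj[b].add(a)

    def prune_short_branches(self, min_len: int):
        """Remove terminal branches with fewer than min_len nodes."""
        changed = True
        while changed:
            changed = False
            for nid in [n for n, nb in self.adj.items() if len(nb) == 1]:
                branch = self._walk_branch(nid)
                if len(branch) < min_len:
                    for n in branch:
                        self._remove(n)
                    changed = True

    def _walk_branch(self, leaf):
        """Collect nodes from a leaf up to the first junction (degree > 2)."""
        branch, prev, cur = [leaf], None, leaf
        while len(self.adj[cur]) <= 2:
            nxt = [n for n in self.adj[cur] if n != prev]
            if not nxt:
                break
            prev, cur = cur, nxt[0]
            if len(self.adj[cur]) > 2:
                break
            branch.append(cur)
        return branch

    def _remove(self, nid):
        for nb in self.adj.pop(nid, set()):
            self.adj[nb].discard(nid)
        self.pos.pop(nid, None)
        self.radius.pop(nid, None)
```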

    The great time series classification bake off: a review and experimental evaluation of recent algorithmic advances

    In the last five years, a large number of new time series classification algorithms have been proposed in the literature. These algorithms have been evaluated on subsets of the 47 data sets in the University of California, Riverside time series classification archive. The archive has recently been expanded to 85 data sets, over half of which were donated by researchers at the University of East Anglia. Aspects of previous evaluations have made comparisons between algorithms difficult: for example, several different programming languages were used, experiments involved a single train/test split, and some used normalised data whilst others did not. The relaunch of the archive provides a timely opportunity to thoroughly evaluate algorithms on a larger number of data sets. We have implemented 18 recently proposed algorithms in a common Java framework and compared them against two standard benchmark classifiers (and against each other) by performing 100 resampling experiments on each of the 85 data sets. We use these results to test several hypotheses relating to whether the algorithms are significantly more accurate than the benchmarks and than each other. Our results indicate that only 9 of these algorithms are significantly more accurate than both benchmarks, and that one classifier, the Collective of Transformation Ensembles, is significantly more accurate than all of the others. All of our experiments and results are reproducible: we release all of our code, results, and experimental details, and we hope these experiments form the basis for more rigorous testing of new algorithms in the future.
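    The evaluation protocol described above (repeated stratified resamples per data set, then pairwise significance testing over the per-data-set accuracies) can be sketched in a few lines. The paper's framework is in Java; the Python version below uses a 1-NN classifier and synthetic data purely as illustrative stand-ins for the 18 algorithms and the 85 archive data sets.

```python
# Minimal sketch of the resampling evaluation protocol: repeated
# stratified train/test resamples per data set, mean accuracy, and a
# Wilcoxon signed-rank test over data sets. Classifier and data are
# illustrative stand-ins, not the paper's algorithms or archive.
import numpy as np
from scipy.stats import wilcoxon
from sklearn.base import clone
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.neighbors import KNeighborsClassifier

def resample_accuracy(clf, X, y, n_resamples=100, test_size=0.3, seed=0):
    """Mean accuracy of clf over repeated stratified resamples."""
    splitter = StratifiedShuffleSplit(
        n_splits=n_resamples, test_size=test_size, random_state=seed)
    accs = []
    for train_idx, test_idx in splitter.split(X, y):
        model = clone(clf).fit(X[train_idx], y[train_idx])
        accs.append(model.score(X[test_idx], y[test_idx]))
    return float(np.mean(accs))

def compare(clf_a, clf_b, datasets):
    """Wilcoxon signed-rank test on per-data-set mean accuracies."""
    acc_a = [resample_accuracy(clf_a, X, y) for X, y in datasets]
    acc_b = [resample_accuracy(clf_b, X, y) for X, y in datasets]
    _, p = wilcoxon(acc_a, acc_b)
    return p

# Synthetic data sets standing in for the archive:
rng = np.random.default_rng(0)
datasets = [(rng.normal(size=(60, 100)), rng.integers(0, 2, 60))
            for _ in range(5)]
p = compare(KNeighborsClassifier(1), KNeighborsClassifier(5), datasets)
print(f"Wilcoxon p-value over data sets: {p:.3f}")
```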

    Part-based segmentation by skeleton cut space analysis

    We present a new method for part-based segmentation of voxel shapes that uses medial surfaces to define a segmenting cut at each medial voxel. The cut has several desirable properties: smoothness, tightness, and orientation with respect to the shape's local symmetry axis, which make it a good segmentation tool. We next analyze the space of all cuts created for a given shape and detect the cuts which make good segment borders. Our method is robust to noise, pose-invariant, independent of the shape's geometry and genus, and simple to implement. We demonstrate our method on a wide selection of 3D shapes.
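    To illustrate the flavour of cut-space analysis, the sketch below reduces the idea to one dimension: candidate cuts along a skeleton are scored by their cross-sectional area, and segment borders are placed at pronounced local minima, i.e. where a cut is tightest. This is a simplification of the paper's full 3D analysis, and the prominence threshold is an assumption.

```python
# Minimal 1D sketch (not the paper's full 3D cut-space analysis):
# segment borders are cuts whose cross-sectional area is a pronounced
# local minimum along the skeleton.
import numpy as np
from scipy.signal import find_peaks

def segment_borders(cut_areas, prominence_frac=0.2):
    """Indices of cuts that make good segment borders.

    cut_areas: cut area at each successive skeleton voxel. A border is
    a local minimum whose prominence exceeds a fraction of the overall
    area range (assumed threshold).
    """
    areas = np.asarray(cut_areas, dtype=float)
    prominence = prominence_frac * (areas.max() - areas.min())
    # Local minima of the area profile = peaks of its negation.
    minima, _ = find_peaks(-areas, prominence=prominence)
    return minima

# Toy example: a dumbbell-like shape, thick-thin-thick along the skeleton.
t = np.linspace(0, 1, 200)
areas = 1.0 + 0.8 * np.cos(2 * np.pi * 2 * t)   # two constrictions
print("border cuts at skeleton indices:", segment_borders(areas))
```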

    Enhanced DTI tracking with adaptive tensor interpolation

    A novel tensor interpolation method is introduced that allows Diffusion Tensor Imaging (DTI) streamline tracking to overcome low-anisotropy regions and permits branching of trajectories, using information gathered from the neighbourhood of the low-anisotropy voxels met during tracking. The interpolation is performed in Log-Euclidean space and collects directional information in a spherical neighbourhood of the voxel in order to reconstruct a tensor with a higher linear diffusion coefficient than the original. The weight of the contribution of a given neighbouring voxel is proportional to its linear diffusion coefficient and inversely proportional to a power of the spatial Euclidean distance between the two voxels; this inverse power law provides our method with robustness against noise. In order to resolve multiple fibre orientations, we divide the neighbourhood of a low-anisotropy voxel into sectors and compute an interpolated tensor in each sector. The tracking then continues along the main eigenvector of the reconstructed tensors. We test our method on artificial, phantom, and brain data, and compare it with (a) standard streamline tracking, (b) the Tensorlines method, (c) streamline tracking after an interpolation method based on bilateral filtering, and (d) streamline tracking using moving-least-squares regularisation. The new method compares favourably with these methods on artificial data sets. The proposed approach makes it possible to explore a DTI data set to locate singularities, as well as to enhance deterministic tractography techniques. In this way it immediately yields results closer to those provided by more powerful but computationally far more demanding methods that are intrinsically able to resolve crossing fibres, such as probabilistic tracking or high angular resolution diffusion imaging.
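    The core of the method, a weighted mean of neighbouring tensors in Log-Euclidean space with weights proportional to the linear diffusion coefficient and inversely proportional to a power of the distance, can be sketched directly from the abstract. In the sketch below, the use of Westin's cl measure for the linear diffusion coefficient and the exponent p = 2 are illustrative assumptions; the paper may use different choices.

```python
# Minimal sketch of the weighted Log-Euclidean tensor interpolation
# described above: w = cl / d**p, with tensors averaged in log space.
# Westin's cl measure and p = 2 are assumptions for illustration.
import numpy as np

def logm_spd(T):
    """Matrix logarithm of a symmetric positive-definite tensor."""
    vals, vecs = np.linalg.eigh(T)
    return vecs @ np.diag(np.log(vals)) @ vecs.T

def expm_sym(L):
    """Matrix exponential of a symmetric tensor."""
    vals, vecs = np.linalg.eigh(L)
    return vecs @ np.diag(np.exp(vals)) @ vecs.T

def linear_coefficient(T):
    """Westin's cl = (l1 - l2) / (l1 + l2 + l3), l1 >= l2 >= l3."""
    l3, l2, l1 = np.linalg.eigvalsh(T)   # eigvalsh returns ascending order
    return (l1 - l2) / (l1 + l2 + l3)

def interpolate(tensors, positions, centre, p=2.0):
    """Weighted Log-Euclidean mean of the tensors in a neighbourhood."""
    log_sum, w_sum = np.zeros((3, 3)), 0.0
    for T, pos in zip(tensors, positions):
        d = np.linalg.norm(np.asarray(pos) - np.asarray(centre))
        if d == 0.0:
            continue                      # skip the centre voxel itself
        w = linear_coefficient(T) / d**p  # inverse power-law weighting
        log_sum += w * logm_spd(T)
        w_sum += w
    return expm_sym(log_sum / w_sum)
```

    Tracking would then continue along the principal eigenvector of the returned tensor; computing one interpolated tensor per angular sector of the neighbourhood, rather than one overall, is what allows branching at fibre crossings.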

    79.4: Light-rolls: high throughput manufacture for LED lighting and displays

    A high-throughput, roll-to-roll technology based on three key manufacturing platforms (3-D MEMS, fluidic chip assembly, and ink-jet printing) is presented. The target applications for the technology are displays and lighting. In particular, flexible, pixelated LED displays can be produced, based on the integration of LED chips onto flexible polymer substrates. A preliminary evaluation of the manufacturing platforms and initial technology-validator demonstrators will be presented.