
    Exact Cover with light

    We suggest a new optical solution for solving the YES/NO version of the Exact Cover problem by using the massive parallelism of light. The idea is to build an optical device which generates all possible solutions of the problem and then picks the correct one. In our case the device has a graph-like representation, and light traverses it by following the routes given by the connections between nodes. The nodes are connected by arcs in a special way which lets us generate all possible covers (exact or not) of the given set. To select the correct solution, we assign to each item of the set to be covered a special integer number. These numbers represent delays induced in the light as it passes through arcs. The solution is represented as a subray arriving at a certain moment in the destination node, which tells us whether an exact cover exists or not.
    Comment: 20 pages, 4 figures, New Generation Computing, accepted, 200
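    The delay-based selection described in the abstract can be sketched conventionally: if item i is assigned delay B^i for a base B larger than the number of subsets, then a collection of subsets is an exact cover exactly when its summed delay equals the sum of all item delays, with no digit collisions. The following brute-force Python sketch (a hypothetical electronic analogue, not the authors' optical device) illustrates the arithmetic:

    ```python
    from itertools import combinations

    def has_exact_cover(universe, subsets):
        """Brute-force analogue of the optical delay trick: item i gets
        delay B**i, so a collection of subsets is an exact cover iff its
        summed delay equals the target (every item counted exactly once)."""
        items = sorted(universe)
        B = len(subsets) + 1  # base large enough that delay sums cannot collide
        delay = {item: B ** k for k, item in enumerate(items)}
        target = sum(delay[i] for i in items)
        subset_delay = [sum(delay[i] for i in s) for s in subsets]
        for r in range(len(subsets) + 1):
            for combo in combinations(range(len(subsets)), r):
                if sum(subset_delay[j] for j in combo) == target:
                    return True
        return False
    ```

    In the optical setting all subset combinations are explored in parallel by the light rays; here they are enumerated explicitly, so the sketch runs in exponential time, as expected for an NP-complete problem.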

    Stellar intensity interferometry: Optimizing air Cherenkov telescope array layouts

    Kilometric-scale optical imagers appear feasible to realize by intensity interferometry, using telescopes primarily erected for measuring Cherenkov light induced by gamma rays. Planned arrays envision 50--100 telescopes, distributed over some 1--4 km². Although array layouts and telescope sizes will primarily be chosen for gamma-ray observations, their interferometric performance may also be optimized. Observations of stellar objects were numerically simulated for different array geometries, yielding signal-to-noise ratios for different Fourier components of the source images in the interferometric (u,v)-plane. Simulations were made for layouts actually proposed for future Cherenkov telescope arrays, and for subsets with only a fraction of the telescopes. All large arrays provide dense sampling of the (u,v)-plane due to the sheer number of telescopes, irrespective of their geographic orientation or stellar coordinates. However, for improved coverage of the (u,v)-plane and a wider variety of baselines (enabling better image reconstruction), an exact east-west grid should be avoided for the numerous smaller telescopes, and repetitive geometric patterns avoided for the few large ones. Sparse arrays become severely limited by a lack of short baselines, and to cover astrophysically relevant dimensions between 0.1--3 milliarcseconds at visible wavelengths, baselines between pairs of telescopes should cover the whole interval 30--2000 m.
    Comment: 12 pages, 10 figures; presented at the SPIE conference "Optical and Infrared Interferometry II", San Diego, CA, USA (June 2010)
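    The baseline-coverage criterion in the last sentence can be checked directly from a proposed layout: compute all pairwise telescope separations and see how well they fill the 30--2000 m interval. A minimal Python sketch, assuming only an array of 2-D ground coordinates (the logarithmic binning is an illustrative stand-in for the paper's full (u,v)-plane signal-to-noise simulation):

    ```python
    import numpy as np

    def baseline_lengths(positions):
        """Pairwise telescope separations (metres) from an (N, 2) array
        of east/north ground coordinates."""
        pos = np.asarray(positions, dtype=float)
        diffs = pos[:, None, :] - pos[None, :, :]   # all pairwise offsets
        d = np.sqrt((diffs ** 2).sum(-1))
        iu = np.triu_indices(len(pos), k=1)         # each pair counted once
        return d[iu]

    def coverage_fraction(baselines, lo=30.0, hi=2000.0, nbins=20):
        """Fraction of logarithmic length bins in [lo, hi] containing at
        least one baseline -- a crude proxy for (u,v)-plane sampling."""
        edges = np.logspace(np.log10(lo), np.log10(hi), nbins + 1)
        counts, _ = np.histogram(baselines, bins=edges)
        return (counts > 0).mean()
    ```

    A regular grid scores poorly here because its repeated spacings pile many baselines into a few bins, which is the abstract's argument against exact east-west grids and repetitive patterns.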

    Approximate Cover of Strings

    Regularities in strings arise in various areas of science, including coding and automata theory, formal language theory, combinatorics, molecular biology and many others. A common notion to describe regularity in a string T is a cover, which is a string C for which every letter of T lies within some occurrence of C. The alignment of the cover repetitions in the given text is called a tiling. In many applications finding exact repetitions is not sufficient, due to the presence of errors. In this paper, we use a new approach for handling errors in coverable phenomena and define the approximate cover problem (ACP), in which we are given a text that is a sequence of some cover repetitions with possible mismatch errors, and we seek a string that covers the text with the minimum number of errors. We first show that the ACP is NP-hard, by studying the cover-size relaxation of the ACP, in which the requested size of the approximate cover is also given with the input string. We show this relaxation is already NP-hard. We also study two further relaxations of the ACP, which we call the partial-tiling relaxation and the full-tiling relaxation, in which a tiling of the requested cover is also given with the input string. A given full tiling retains all the occurrences of the cover before the errors, while in a partial tiling there can be additional occurrences of the cover that are not marked by the tiling. We show that the partial-tiling relaxation has polynomial time complexity and give experimental evidence that the full-tiling relaxation also has polynomial time complexity. The study of these relaxations, besides shedding further light on the complexity of the ACP, also involves a deep understanding of the properties of covers, yielding some key lemmas and observations that may be helpful for a future study of regularities in the presence of errors.
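    The definitions above are easy to make concrete: given a text, a candidate cover, and a tiling (a list of occurrence start positions), one can verify that the tiling covers every position and count the mismatch errors. The following Python checker is a hypothetical helper illustrating the terminology, not code from the paper:

    ```python
    def tiling_errors(text, cover, tiling):
        """Count mismatch errors of covering `text` with `cover` at the
        occurrence start positions in `tiling`. A text position covered by
        several occurrences is an error if it disagrees with the cover at
        any of them; an uncovered position makes the tiling invalid."""
        n, m = len(text), len(cover)
        covered = [False] * n
        errors = set()
        for start in tiling:
            assert 0 <= start <= n - m, "occurrence must fit inside the text"
            for j in range(m):
                covered[start + j] = True
                if text[start + j] != cover[j]:
                    errors.add(start + j)
        assert all(covered), "a tiling must cover every text position"
        return len(errors)
    ```

    For example, covering "abaab" with "ab" at positions 0, 2 and 3 yields one error (position 3 disagrees with the occurrence starting at 2). The ACP itself asks for the cover string minimizing this count over all valid tilings, which is what makes the general problem NP-hard.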

    Reflections on Interdisciplinary Collaboration between Sociology and the Exact sciences

    While sociology's original ambition to constitute itself as an encyclopaedia of the social sciences has largely failed (owing to the obligation to restrict its scope through disciplinary specialization), the discipline has been more successful as a key actor in interdisciplinary and trans-disciplinary encounters covering a wide range of domains that link the exact and social sciences, or the nature-culture divide at the foundation of modern epistemology. Besides providing the much-needed human and social dimension in the applied sciences relating to man (medicine, engineering, agriculture), or the intervention dimension that the pure sciences can usher in, sociology has a long history of interrogating the social (organizational, relational) and cultural (symbolic) context of the production of scientific knowledge. A discipline with a vocation towards fulfilling the aspirations of the pure and applied dimensions of science, sociology hopes both to gain from the advances of the other exact sciences and to contribute reciprocally to their development. This presentation hopes to throw light on these preoccupations by exploring the bases (philosophical, social) of this imperative, as well as the problems faced in, and the obstacles that still hinder, the emergence of interdisciplinary/multidisciplinary/trans-disciplinary practice.
    Key words: sociology, exact sciences, collaboration, intersections, reciprocity, reflexivity

    Greenhouse application of light-drone imaging technology for assessing weeds severity occurring on baby-leaf red lettuce beds approaching fresh-cutting

    Aim of study: For greenhouse cultivation of baby-leaf lettuces, the absence of weeds is a mandatory quality requirement. One of the most promising and innovative technologies in weed research is the use of Unmanned Aerial Vehicles (drones) equipped with acquisition systems. The aim of this study was to estimate the exact weed amount on baby-sized red lettuce beds using a light drone equipped with an RGB microcamera.
    Area of study: Trials were performed at a specialized organic farm in Eboli (Salerno, Italy), under a polyethylene multi-tunnel greenhouse.
    Material and methods: The acquired RGB images were processed with specific algorithms distinguishing weeds from the crop, estimating the surface covered by weeds and the severity of weed contamination in terms of biomass. A regression between the percentage of the surface covered by weeds (with respect to the total image surface) and the weight of weeds (with respect to the total harvested biomass) was calculated.
    Main results: The regression between the total cover values of the 25 calibration images and the total measured weight showed a significant linear correlation. Digital monitoring was able to capture accurately the highly variable weed coverage which, among the different grids positioned under real cultivation conditions, ranged from 0 to 16.4% of the total cultivated surface.
    Research highlights: In a precision weed management context, aiming to improve management and decrease the use of pesticides, this study provided an estimation of the exact weed amount on baby-sized red lettuce beds using a light drone.
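    The image-processing step can be illustrated with a standard vegetation index: on red-lettuce beds, green weeds separate from the reddish crop under an excess-green index ExG = 2G - R - B. This Python sketch is a generic stand-in for the study's specific algorithms, and the threshold value is an illustrative assumption, not taken from the paper:

    ```python
    import numpy as np

    def weed_cover_fraction(rgb, threshold=20):
        """Fraction of pixels classified as weed in an (H, W, 3) RGB image.
        Uses the excess-green index ExG = 2G - R - B; pixels above
        `threshold` (illustrative value) are counted as green weed."""
        img = rgb.astype(np.int32)              # avoid uint8 overflow
        exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]
        weed_mask = exg > threshold
        return weed_mask.mean()
    ```

    The resulting per-image cover percentage is the quantity that the study regressed against the harvested weed biomass of the calibration grids.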

    CORRELATIVE LIGHT MICROSCOPY AND X-RAY MICROTOMOGRAPHY OF GROUND SECTIONS OF MINERALISED TISSUES

    Starting from scratch, if one wanted to correlate light microscopical (LM) and X-ray microtomographic (XMT, micro-CT) findings from the mineralized tissues - bone and calcified cartilage in the skeleton and dentine, enamel, and cementum in teeth - one could simply examine the same resin-embedded sample, with at least one flat surface, by confocal scanning reflection and/or fluorescence light microscopy and XMT. However, we are frequently presented with ready-made 'ground' sections mounted in Canada balsam or DPX on 25mm wide, ~1mm thick glass slides with 0.17mm cover slips. Many such preparations are historical or are valuable in representing archival material from rare diseases or endangered species: all are inconvenient in form for XMT. We endeavored to economize on X-ray beam time by scanning a 25mm thick stack of slides, separating the relevant data for each sample and making exact matches with transmitted ordinary and polarized light microscopy images. Samples were selected to represent a wide range of sizes and skeletal and dental tissue types, including human femoral bone, human permanent teeth, dog carnassial tooth, narwhal mandible, black rhinoceros molars, sperm whale cementum and dentine, African elephant ivory, and prairie marmot molars. XMT was conducted using the QMUL Mucat-2 system [1], nominal voxel size 20μm, 90kV, 24 hours. Analysis used the in-house software TomView, along with ImageJ [2] and Drishti [3]. In each case we were able to match the XMT and light microscopy images. We can now report mineralization densities for all the calcified tissues in the context of classical light microscopy imagery.
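    The matching step - finding which XMT slice corresponds to a given light-microscopy image - can be sketched with normalized cross-correlation. This Python function is a generic illustration of such slice matching, not the TomView/ImageJ workflow the study actually used:

    ```python
    import numpy as np

    def best_matching_slice(lm_image, xmt_volume):
        """Return (index, score) of the XMT slice most similar to a
        light-microscopy image, scored by normalized cross-correlation
        (invariant to brightness offset and contrast scaling)."""
        def ncc(a, b):
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())
        scores = [ncc(np.asarray(lm_image, float), np.asarray(s, float))
                  for s in xmt_volume]
        return int(np.argmax(scores)), max(scores)
    ```

    Brightness and contrast invariance matters here because LM grey levels and XMT attenuation values live on entirely different scales, even when the underlying structures coincide.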