
    Classification of Big Point Cloud Data Using Cloud Computing

    Point cloud data plays a significant role in various geospatial applications, as it conveys plentiful information that can be used for different types of analysis. Semantic analysis, one important example, aims to label points as belonging to different categories; in machine learning, this problem is called classification. In addition, processing point data is becoming increasingly challenging due to the growing data volume. In this paper, we address point data classification in a big data context. The popular cluster computing framework Apache Spark is used throughout the experiments, and the promising results suggest a great potential of Apache Spark for large-scale point data processing.
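    A minimal sketch of this kind of Spark-based point classification, assuming a CSV of labelled points; the column names, the random forest model, and the file path are illustrative and not taken from the paper:

        # Illustrative only: classify per-point records with Spark MLlib.
        # Column names, model choice, and input path are assumptions.
        from pyspark.sql import SparkSession
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.classification import RandomForestClassifier

        spark = SparkSession.builder.appName("PointCloudClassification").getOrCreate()

        # One point per row: coordinates, per-point attributes, and a class label.
        points = spark.read.csv("points.csv", header=True, inferSchema=True)

        # Collect the per-point attributes into a single feature vector.
        assembler = VectorAssembler(
            inputCols=["x", "y", "z", "intensity", "return_number"],
            outputCol="features")
        train = assembler.transform(points)

        # Train a classifier and label the points; the paper's model may differ.
        model = RandomForestClassifier(labelCol="label", featuresCol="features").fit(train)
        model.transform(train).select("x", "y", "z", "prediction").show(5)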

    A new framework for interactive segmentation of point clouds

    Point cloud segmentation is a fundamental problem in point processing. Segmenting a point cloud fully automatically is very challenging due to the properties of point clouds as well as the differing requirements of individual users. In this paper, an interactive segmentation method for point clouds is proposed. Only two strokes need to be drawn intuitively to indicate the target object and the background, respectively. The drawn strokes are sparse and do not necessarily cover the whole object. Given the strokes, a weighted graph is built and the segmentation is formulated as a minimization problem, which is solved efficiently using the max-flow/min-cut algorithm. In the experiments, mobile mapping data of a city area is utilized. The resulting segmentations demonstrate the efficiency of the method, which can potentially be applied to general point clouds.
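    A minimal sketch of stroke-seeded graph-cut segmentation on a point cloud, assuming a k-nearest-neighbour graph with Gaussian edge weights and the networkx max-flow/min-cut solver; these choices are illustrative, not the paper's implementation:

        # Illustrative only: segment a point cloud from two sparse strokes
        # by solving a max-flow/min-cut problem on a weighted k-NN graph.
        import numpy as np
        import networkx as nx
        from scipy.spatial import cKDTree

        def segment(points, fg_idx, bg_idx, k=8, sigma=0.05, big=1e9):
            """points: (N, 3) array; fg_idx/bg_idx: point indices touched by the strokes."""
            g = nx.Graph()
            # Neighbour links, weighted by a Gaussian of the point-to-point distance.
            dists, nbrs = cKDTree(points).query(points, k=k + 1)
            for i in range(len(points)):
                for d, j in zip(dists[i, 1:], nbrs[i, 1:]):
                    g.add_edge(i, int(j), capacity=float(np.exp(-d**2 / (2 * sigma**2))))
            # Terminal links: tie the stroke points to the object/background terminals.
            for i in fg_idx:
                g.add_edge("OBJ", int(i), capacity=big)
            for i in bg_idx:
                g.add_edge("BGD", int(i), capacity=big)
            # The minimum cut separates the target object from the background.
            _, (obj_side, _) = nx.minimum_cut(g, "OBJ", "BGD")
            return sorted(i for i in obj_side if isinstance(i, int))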

    NoSQL for Storage and Retrieval of Large LiDAR Data Collections

    Developments in LiDAR technology over the past decades have made LiDAR a mature and widely accepted source of geospatial information. This in turn has led to an enormous growth in data volume. The central idea behind a file-centric storage of LiDAR point clouds is the observation that large collections of LiDAR data are typically delivered as large collections of files, rather than as single files of terabyte size. This split of the dataset, commonly referred to as tiling, was usually done to accommodate a specific processing pipeline, so it makes sense to preserve it. A document-oriented NoSQL database can easily emulate this data partitioning by representing each tile (file) as a separate document. The document stores the metadata of the tile, while the actual files are stored in a distributed file system emulated by the NoSQL database. We demonstrate the use of MongoDB, a highly scalable document-oriented NoSQL database, for storing large LiDAR files. MongoDB, like any NoSQL database, allows queries on the attributes of a document; as a special feature, it also supports spatial queries, so we can query on the bounding boxes of the LiDAR tiles. Inserting and retrieving files in the cloud-based database is compared with native file system and cloud storage transfer speeds.
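    A minimal sketch of the document-per-tile idea with pymongo and GridFS, assuming tile coordinates are given in WGS84 longitude/latitude so that a 2dsphere index applies; the field names and the GeoJSON bounding-box encoding are illustrative, not the authors' schema:

        # Illustrative only: one metadata document per LiDAR tile, the raw file
        # in GridFS, and spatial queries against the tile bounding boxes.
        from pymongo import MongoClient, GEOSPHERE
        import gridfs

        db = MongoClient("mongodb://localhost:27017")["lidar"]
        fs = gridfs.GridFS(db)        # emulated file store for the raw tiles
        tiles = db["tiles"]           # one metadata document per tile (file)
        tiles.create_index([("bbox", GEOSPHERE)])

        def bbox_polygon(min_x, min_y, max_x, max_y):
            # GeoJSON polygon (closed ring) describing a tile's footprint.
            return {"type": "Polygon", "coordinates": [[
                [min_x, min_y], [max_x, min_y], [max_x, max_y],
                [min_x, max_y], [min_x, min_y]]]}

        def insert_tile(path, min_x, min_y, max_x, max_y):
            with open(path, "rb") as f:
                file_id = fs.put(f, filename=path)        # raw file into GridFS
            tiles.insert_one({"file_id": file_id, "filename": path,
                              "bbox": bbox_polygon(min_x, min_y, max_x, max_y)})

        def tiles_intersecting(min_x, min_y, max_x, max_y):
            # Spatial query: which tiles overlap the requested region?
            region = bbox_polygon(min_x, min_y, max_x, max_y)
            return tiles.find({"bbox": {"$geoIntersects": {"$geometry": region}}})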

    Sideloading - Ingestion of Large Point Clouds into the Apache Spark Big Data Engine

    In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore natural to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by existing big data frameworks; instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method by loading billions of points into a commodity hardware compute cluster, and we discuss the implications for scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
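    A minimal sketch of the sideloading idea, assuming the laspy library as the single-CPU point cloud reader; the file paths and the per-point tuple layout are illustrative, not the paper's implementation:

        # Illustrative only: the driver distributes file paths; each executor
        # parses its files with an ordinary single-threaded LAS/LAZ reader.
        from pyspark.sql import SparkSession

        sc = SparkSession.builder.appName("PointCloudIngestion").getOrCreate().sparkContext

        def read_points(path):
            import laspy                     # imported on the executor
            with laspy.open(path) as reader:
                las = reader.read()
                for x, y, z in zip(las.x, las.y, las.z):
                    yield (float(x), float(y), float(z))

        # A map-style operation spreads the binary parsing across the cluster.
        paths = sc.parallelize(["tiles/tile_001.las", "tiles/tile_002.las"], numSlices=2)
        points = paths.flatMap(read_points)
        print(points.count())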

    The ultraviolet spectrum of HH 24A and its relation to optical spectra

    The spectrum of the brightest part (HH 24A) of the complex Herbig-Haro object HH 24 in the short-wavelength UV range was studied. The object is of special interest since it is known that, in the optical range, the continuum is due to dust-scattered light originating in a young stellar object, while the shock-excited emission lines are formed in HH 24A itself. The spectrum shows only a continuum or a quasi-continuum and is comparable neither to that of a typical high-excitation object like HH 1 or HH 2 nor to that of a low-excitation object like HH 3 or HH 47.

    Historical roots of Agile methods: where did “Agile thinking” come from?

    The appearance of Agile methods has been the most noticeable change to software process thinking in the last fifteen years [16], but in fact many of the “Agile ideas” have been around since the 1970s or even before. Many studies and reviews of Agile methods ascribe their emergence to a reaction against traditional methods. In this paper, we argue that although Agile methods are new as a whole, they have strong roots in the history of software engineering. In addition to the iterative and incremental approaches that have been in use since 1957 [21], people who criticised the traditional methods suggested alternative approaches that were actually Agile ideas, such as responding to change, customer involvement, and valuing working software over documentation. The authors of this paper believe that education about the history of Agile thinking will help to develop a better understanding of these methods as well as promote their use. We therefore present and discuss the reasons behind the development and introduction of Agile methods, as a reaction to traditional methods, as a result of people's experience, and in particular focusing on reusing ideas from history.

    Determination of Strong-Interaction Widths and Shifts of Pionic X-Rays with a Crystal Spectrometer

    Pionic 3d-2p atomic transitions in F, Na, and Mg have been studied using a bent crystal spectrometer. The pionic atoms were formed in the production target placed in the external proton beam of the Space Radiation Effects Laboratory synchrocyclotron. The observed energies and widths of the transitions are E=41679(3) eV and Γ=21(8) eV, E=62434(18) eV and Γ=22(80) eV, and E=74389(9) eV and Γ=67(35) eV in F, Na, and Mg, respectively. The results are compared with calculations based on a pion-nucleus optical potential.

    Nuclear deformation and neutrinoless double-$\beta$ decay of $^{94,96}$Zr, $^{98,100}$Mo, $^{104}$Ru, $^{110}$Pd, $^{128,130}$Te and $^{150}$Nd nuclei in mass mechanism

    The $(\beta^{-}\beta^{-})_{0\nu}$ decay of $^{94,96}$Zr, $^{98,100}$Mo, $^{104}$Ru, $^{110}$Pd, $^{128,130}$Te and $^{150}$Nd isotopes for the $0^{+}\to 0^{+}$ transition is studied in the Projected Hartree-Fock-Bogoliubov framework. In our earlier work, the reliability of the HFB intrinsic wave functions participating in the $\beta^{-}\beta^{-}$ decay of the above-mentioned nuclei was established by obtaining an overall agreement between the theoretically calculated spectroscopic properties, namely yrast spectra, reduced $B(E2: 0^{+}\to 2^{+})$ transition probabilities, quadrupole moments $Q(2^{+})$, gyromagnetic factors $g(2^{+})$, as well as half-lives $T_{1/2}^{2\nu}$ for the $0^{+}\to 0^{+}$ transition, and the available experimental data. In the present work, we study the $(\beta^{-}\beta^{-})_{0\nu}$ decay for the $0^{+}\to 0^{+}$ transition in the mass mechanism and extract limits on the effective mass of light as well as heavy neutrinos from the observed half-lives $T_{1/2}^{0\nu}(0^{+}\to 0^{+})$ using nuclear transition matrix elements calculated with the same set of wave functions. Further, the effect of deformation on the nuclear transition matrix elements required to study the $(\beta^{-}\beta^{-})_{0\nu}$ decay in the mass mechanism is investigated. It is noticed that the deformation effect on the nuclear transition matrix elements is of approximately the same magnitude in the $(\beta^{-}\beta^{-})_{2\nu}$ and $(\beta^{-}\beta^{-})_{0\nu}$ decays.
    Comment: 15 pages, 1 figure

    A high-reflectivity high-Q micromechanical Bragg-mirror

    We report on the fabrication and characterization of a micromechanical oscillator consisting only of a free-standing dielectric Bragg mirror with high optical reflectivity and high mechanical quality. The fabrication technique is a hybrid approach involving laser ablation and dry etching. The mirror has a reflectivity of 99.6%, a mass of 400 ng, and a mechanical quality factor Q of approximately 10^4. Using this micromirror in a Fabry-Perot cavity, a finesse of 500 has been achieved. This is an important step towards designing tunable high-Q, high-finesse cavities on chip.
    Comment: 3 pages, 2 figures