    Early postnatal development of the lumbar vertebrae in male Wistar rats: double staining and digital radiological studies

    The aim of the study was to evaluate the physiological developmental changes of the lumbar vertebrae of male rats during the first 22 days after birth. The morphology and mineralisation of the lumbar vertebrae were evaluated using double staining and a digital radiography system, which allowed vertebral width and optical density to be determined. Pup weight, crown-rump length, body mass index and vertebral width increased during the postnatal period and correlated significantly with age. Bone mineralisation, as measured by optical density, did not show any significant differences. The complete fusion of the primary ossification centres proceeded in a cranio-caudal direction and started on day 19 after parturition but was incomplete by day 22. It can be concluded that, unlike the significant age-related increase in vertebral size, mineralisation was only slightly elevated during the evaluated postnatal period. The method described is supplementary to alizarin red S staining, as it provides both qualitative and quantitative data on mineralisation in a manner similar to micro-computed tomography, but it does not allow three-dimensional or microarchitectural examination.
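    A minimal sketch of the age-correlation step described above, using synthetic illustrative measurements (the study's actual data are not reproduced here); the variable names and values are assumptions for demonstration only.

```python
# Sketch: Pearson correlation between postnatal age and vertebral width.
# The values are synthetic placeholders, not the study's measurements.
import numpy as np

age_days = np.array([1, 4, 7, 10, 13, 16, 19, 22])            # postnatal day
vertebral_width_mm = np.array([1.1, 1.4, 1.8, 2.1, 2.5, 2.8, 3.1, 3.4])

# Pearson correlation coefficient between age and vertebral width
r = np.corrcoef(age_days, vertebral_width_mm)[0, 1]
print(f"Pearson r = {r:.3f}")
```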

    DATA OPTIMIZATION FOR 3D MODELING AND ANALYSIS OF A FORTRESS ARCHITECTURE

    Thanks to the recent worldwide spread of drones and to the development of structure-from-motion photogrammetric software, UAV photogrammetry is becoming a convenient and reliable way to produce 3D documentation of built heritage. Hence, UAV photogrammetric surveying is nowadays a common and fairly standard tool for producing 3D models of relatively large areas. However, when such areas are large, a significant part of the generated point cloud is often of minor interest. Given the need to deal efficiently with storing, processing and analyzing the produced point cloud, an optimization step should be considered in order to reduce redundancy, in particular in the parts of the model that are of minor interest. Although this can be done by manually selecting such parts, automatic selection is clearly a much more viable way to speed up the final model generation. Motivated by the recent development of many semantic classification techniques, the aim of this work is to investigate point cloud optimization based on semantic recognition of the different components of the photogrammetric 3D model. The Girifalco Fortress (Cortona, Italy) is used as the case study for this investigation. The rationale of the proposed methodology is to preserve high point density in the areas of the model that describe the fortress, whereas point density is dramatically reduced in vegetated and soil areas. Thanks to the implemented automatic procedure, the size of the point cloud in the considered case study was reduced by a factor of approximately five. It is worth noting that this result was obtained while preserving the original point density on the fortress surfaces, hence ensuring the same capability for geometric analysis as the original photogrammetric model.
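    A hedged sketch of the class-aware reduction idea, not the authors' actual pipeline: points labelled as the structure of interest keep full density, while all other classes are voxel-downsampled. The labels, class ids, and voxel size below are illustrative assumptions.

```python
# Class-aware point cloud thinning: full density for the target class,
# voxel downsampling elsewhere. Labels and parameters are assumptions.
import numpy as np

def voxel_downsample(points, voxel):
    """Keep one representative point per voxel of edge length `voxel`."""
    keys = np.floor(points / voxel).astype(np.int64)
    # np.unique on voxel keys returns the index of one point per occupied voxel
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[idx]

def class_aware_reduce(points, labels, keep_class, voxel=0.5):
    keep = points[labels == keep_class]                 # e.g. the fortress
    rest = voxel_downsample(points[labels != keep_class], voxel)
    return np.vstack([keep, rest])

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(100_000, 3))
lbl = rng.integers(0, 2, size=100_000)                  # 0 = terrain, 1 = fortress
reduced = class_aware_reduce(pts, lbl, keep_class=1)
print(len(pts), "->", len(reduced))
```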

    Pragmatic markers in Hungarian: Some introductory remarks

    Large data sets and their study using Douglas-Peucker method

    Technological progress in measurement automation in geodesy creates new and interesting research problems. One of them is the existence of excessive observational data sets: the new technologies produce very large data sets, which may cause problems for users because they create a heavy processing load. Uploading them to suitable software and the subsequent processing is very time consuming. Thus, there is a need to resample and reorganize the data sets before processing, in order to considerably reduce their size without losing essential information. In this article, the Douglas-Peucker method was used to reduce the size of the data sets. The proposed algorithm was applied and tested on a fragment of a data set containing the results of bottom measurements of the Świnoujście-Szczecin Channel. Based on the statistical analysis presented in the paper, a set of parameters can also serve as initial information for further study.
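    For reference, a standard recursive Douglas-Peucker implementation for a 2D profile; the tolerance and the sample polyline are arbitrary examples, not the channel data used in the paper.

```python
# Standard Douglas-Peucker reduction of a 2D polyline.
import numpy as np

def douglas_peucker(points, eps):
    """Keep only points deviating more than `eps` from the chord."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    chord_len = np.hypot(chord[0], chord[1])
    rel = points - start
    if chord_len == 0.0:
        dists = np.hypot(rel[:, 0], rel[:, 1])
    else:
        # perpendicular distance to the start-end chord (2D cross product)
        dists = np.abs(chord[0] * rel[:, 1] - chord[1] * rel[:, 0]) / chord_len
    i = int(np.argmax(dists))
    if dists[i] > eps:
        left = douglas_peucker(points[: i + 1], eps)
        right = douglas_peucker(points[i:], eps)
        return np.vstack([left[:-1], right])    # pivot appears in both halves
    return np.vstack([start, end])

profile = np.array([[0, 0], [1, 0.1], [2, -0.1], [3, 5], [4, 6], [5, 7]])
print(douglas_peucker(profile, eps=0.5))
```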

    The Impact of Optimizing the Number of Points of ALS Data Set on the Accuracy of the Generated DTM

    Airborne laser scanning technology delivers the result of a survey in the form of a point cloud. In order to construct a digital terrain model, it is necessary to perform filtration, which consists of separating the data reflecting the relief from the data reflecting situational details. In view of the very large amount of data in a survey data set, as well as the time consumption and difficulty of automatic point cloud filtration, an optimization algorithm can be applied to reduce the size of the point cloud while deriving a digital terrain model. This study presents the stages of processing an airborne laser scanning point cloud using filtration and optimization. The filtration was carried out using the adaptive TIN model and the method of robust moving surfaces, while the optimization was carried out with an existing algorithm for reducing the size of the survey data set. The effect of reducing the size of the data set on the accuracy of the generated DTM was tested, and empirical and numerical tests were performed.
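    As a much-simplified stand-in for ground filtration (not the adaptive TIN or robust moving-surface methods themselves), the sketch below labels a point as ground if it lies close to the lowest elevation in its grid cell; the cell size and height threshold are illustrative assumptions.

```python
# Simplified grid-based ground labelling, an assumption-laden stand-in
# for the paper's adaptive TIN / robust moving-surface filtration.
import numpy as np

def grid_lowest_filter(xyz, cell=5.0, dz=0.3):
    """Label a point as ground if it is within `dz` of its cell minimum."""
    keys = np.floor(xyz[:, :2] / cell).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    # per-cell minimum elevation via scatter-min
    zmin = np.full(inv.max() + 1, np.inf)
    np.minimum.at(zmin, inv, xyz[:, 2])
    return xyz[:, 2] - zmin[inv] <= dz

rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(50_000, 3))
ground = grid_lowest_filter(pts)
print(f"{ground.sum()} of {len(pts)} points labelled as ground")
```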

    Point cloud unification with optimization algorithm

    Terrestrial laser scanning is a technology that makes it possible to obtain three-dimensional data - an accurate representation of reality. During scanning, not only the desired objects are measured but also many additional elements. Therefore, unnecessary data is removed, which has an impact on the efficiency of point cloud processing. This can happen while single point clouds are displayed - the user decides what to delete and does it manually - or by using tools provided in software dedicated to point cloud processing. In Leica Geosystems Cyclone, the software used in the tests presented here, the user can apply tools for, e.g., merging or unification of point clouds. Both change separate point clouds into one point cloud; however, unification can be executed with reduction - low, medium, high, highest, or no reduction at all. It should be noted that the modeled objects may have a complex structure, and unification with a selected type of reduction can have a very big impact on the modeling result. In such a situation it is desirable to apply different types of reduction. In this article, the authors propose applying an optimization algorithm to unified point clouds. Unification performed in Leica Geosystems Cyclone (v.7.3.3) merges point clouds and reduces the number of points; the point elimination is determined mainly by the spacing between points. This may lead to the loss of important points representing essential elements of the scanned objects or area. Applying an optimization algorithm, especially for complex objects, may help to reduce the number of points without losing the information necessary for proper modeling.
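    A hedged sketch of spacing-driven reduction of the kind the abstract attributes to unification: a point is kept only if no previously kept point lies within a minimum spacing. This illustrates the principle, not Cyclone's actual algorithm; the spacing value is an arbitrary example.

```python
# Greedy minimum-spacing thinning with a uniform grid for neighbour lookup.
# An illustration of spacing-based reduction, not Cyclone's implementation.
import numpy as np

def thin_by_spacing(points, d_min):
    """Keep a point only if no already-kept point is within d_min."""
    grid = {}                                # voxel key -> kept indices inside
    kept = []
    for i, p in enumerate(points):
        k = tuple((p // d_min).astype(int))
        conflict = False
        # with voxel size d_min, only the 27 surrounding voxels can
        # contain a kept point closer than d_min
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in grid.get((k[0] + dx, k[1] + dy, k[2] + dz), ()):
                        if np.linalg.norm(points[j] - p) < d_min:
                            conflict = True
        if not conflict:
            grid.setdefault(k, []).append(i)
            kept.append(i)
    return points[kept]

rng = np.random.default_rng(2)
pts = rng.uniform(0, 10, size=(3_000, 3))
print(len(thin_by_spacing(pts, d_min=0.5)), "of", len(pts), "points kept")
```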

    Proposed Technology of lidar data processing to build DTM

    Light Detection and Ranging (LiDAR) is a sensing technology which has application in building the Digital Terrain Model (DTM). A point cloud generated from laser scanning makes up a so-called large dataset, which is difficult and sometimes even impossible to use directly. Importing a LiDAR point cloud into appropriate software and processing it is time-consuming and demands high computing power. Therefore, it is advisable to optimize the volume of the observation results which make up the point cloud. The following paper presents the operation of a modified algorithm for optimizing the number of points in a large dataset [Błaszczak W., 2006]. The optimization involves reduction and uses existing cartographic generalization methods. The optimized dataset was filtered, and during this process the points representing the terrain were separated from the data representing non-ground elements. Filtration was carried out with the application of a proposed new method that takes into account a trend line in search belts and the laser power used to register the points. The optimized and filtered data set was then used to build a DTM. The results obtained encourage further detailed study of a theoretical and empirical character.
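    A much-simplified illustration of the trend-line-in-search-belts idea (the intensity-based criterion is omitted): points are grouped into belts along one axis, a least-squares trend line is fitted per belt, and points far from the trend are rejected as non-ground. Belt width and threshold are illustrative assumptions.

```python
# Belt-wise trend-line filtering: a simplified sketch, not the paper's method.
import numpy as np

def belt_trend_filter(xyz, belt=10.0, dz=0.5):
    """Keep points within `dz` of the least-squares trend line of their belt."""
    keep = np.zeros(len(xyz), dtype=bool)
    belt_id = np.floor(xyz[:, 0] / belt).astype(int)
    for b in np.unique(belt_id):
        sel = belt_id == b
        x, z = xyz[sel, 0], xyz[sel, 2]
        a1, a0 = np.polyfit(x, z, 1)         # trend line z = a1*x + a0
        keep[sel] = np.abs(z - (a1 * x + a0)) <= dz
    return keep

rng = np.random.default_rng(3)
pts = rng.uniform(0, 100, size=(20_000, 3))
mask = belt_trend_filter(pts)
print(f"{mask.sum()} of {len(pts)} points kept as ground candidates")
```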

    Analysis of lignite deposit parameters for the purpose of a planned power plant

    Power plants producing energy from lignite are tied to the locations of lignite extraction, that is, to lignite deposits. Unlike hard coal, raw lignite, because of its properties and water content, cannot be transported over long distances. Therefore, power plants built in the neighborhood of a deposit have to be adjusted to the properties of the lignite available in that deposit. This article analyzes the parameters of a lignite deposit for the purposes of a planned power plant. Three lignite parameters were selected: ash content Ad [%], calorific value Qir [kJ/kg], and total sulfur content Std [%]. Based on information from the geological documentation and the deposit development project, the analysis calculates statistics of these coal quality parameters as a function of the progress of the planned exploitation. In addition to the mean values, the magnitudes of possible estimation errors, resulting from the variability of the deposit as well as from imperfect information about it, are presented.
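    An illustrative sketch of the reported statistics: the running mean and the standard error of the mean of one quality parameter, tracked as exploitation progresses block by block. The calorific values below are synthetic, not data from the analyzed deposit.

```python
# Running mean and standard error of a coal quality parameter as mining
# progresses. The Qir values are synthetic placeholders, not deposit data.
import numpy as np

q_i_r = np.array([8900., 9100., 8700., 9300., 8800., 9000., 9200., 8600.])

n = np.arange(1, len(q_i_r) + 1)
running_mean = np.cumsum(q_i_r) / n
# standard error of the mean over the blocks mined so far (defined from n=2)
sem = np.array([q_i_r[:k].std(ddof=1) / np.sqrt(k) if k > 1 else np.nan
                for k in n])
for k, m, s in zip(n, running_mean, sem):
    print(f"after block {k}: mean Qir = {m:7.1f} kJ/kg, SEM = {s:6.1f}")
```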