
    Deglaciation constraints in the Parâng Mountains, Southern Romania, using surface exposure dating

    Cosmogenic nuclide surface exposure ages have been widely used to constrain glacial chronologies across Europe. This paper brings new evidence that the Romanian Carpathians sheltered mountain glaciers in their upper valleys and cirques until the end of the last glaciation. Twenty-four 10Be surface exposure ages were obtained from boulders on moraine crests in the central area of the Parâng Mountains, Southern Carpathians. The exposure ages were used to constrain the timing of deglaciation events during the Late Glacial. The lowest boulders yielded an age of 13.0 ± 1.1 ka (1766 m), and final deglaciation occurred at 10.2 ± 0.9 ka (2055 m). The timing of the Late Glacial events and of complete deglaciation reported in this study is consistent with, and confirms, previously reported deglaciation ages within the Carpathians and the surrounding European region.
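    For context, under the simplest no-erosion model a surface exposure age follows directly from the measured nuclide concentration, the local production rate, and the decay constant. The sketch below is a generic illustration with hypothetical input values; it is not the calibration or scaling scheme used in the paper.

```python
import numpy as np

# 10Be decay constant (1/yr) from its half-life of 1.387 Myr.
LAMBDA_10BE = np.log(2) / 1.387e6

def exposure_age(N, P):
    """Exposure age (yr) under a no-erosion model, from nuclide
    concentration N (atoms/g) and local production rate P (atoms/g/yr),
    inverting N = (P / lambda) * (1 - exp(-lambda * t))."""
    return -np.log(1.0 - N * LAMBDA_10BE / P) / LAMBDA_10BE

# Hypothetical inputs; prints roughly 13 ka.
print(exposure_age(N=1.3e5, P=10.0) / 1e3, "ka")
```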

    Subjective Annotation for a Frame Interpolation Benchmark using Artefact Amplification

    Current benchmarks for optical flow algorithms evaluate the estimation either directly, by comparing the predicted flow fields with the ground truth, or indirectly, by using the predicted flow fields for frame interpolation and then comparing the interpolated frames with the actual frames. In the latter case, objective quality measures such as the mean squared error are typically employed. However, it is well known that for image quality assessment, the actual quality experienced by the user cannot be fully deduced from such simple measures. Hence, we conducted a subjective quality assessment crowdsourcing study for the interpolated frames provided by one of the optical flow benchmarks, the Middlebury benchmark. We collected forced-choice paired comparisons between interpolated images and the corresponding ground truth. To increase the sensitivity of observers when judging minute differences in paired comparisons, we introduced a new method to the field of full-reference quality assessment, called artefact amplification. From the crowdsourcing data, we reconstructed absolute quality scale values according to Thurstone's model. As a result, we obtained a re-ranking of the 155 participating algorithms with respect to the visual quality of the interpolated frames. This re-ranking not only shows the necessity of visual quality assessment as another evaluation metric for optical flow and frame interpolation benchmarks; the results also provide ground truth for designing novel image quality assessment (IQA) methods dedicated to the perceptual quality of interpolated images. As a first step, we proposed such a new full-reference method, called WAE-IQA. By weighting the local differences between an interpolated image and its ground truth, WAE-IQA performed slightly better than the currently best FR-IQA approach from the literature.
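    As background, Thurstone's Case V model reconstructs absolute scale values from forced-choice paired comparisons by converting preference proportions to z-scores and averaging them. The sketch below illustrates that idea under standard simplifying assumptions (clipped proportions, zero-anchored scale); it is not the exact reconstruction pipeline used in the study.

```python
import numpy as np
from scipy.stats import norm

def thurstone_case_v(wins):
    """Scale values from a pairwise win-count matrix, where
    wins[i, j] counts how often stimulus i beat stimulus j."""
    trials = wins + wins.T
    p = np.where(trials > 0, wins / np.maximum(trials, 1), 0.5)
    p = np.clip(p, 0.01, 0.99)     # avoid infinite z-scores
    z = norm.ppf(p)                # Case V: equal comparison variances
    scores = z.mean(axis=1)        # average advantage over all rivals
    return scores - scores.min()   # anchor the scale at zero
```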

    SUR-Net: Predicting the Satisfied User Ratio Curve for Image Compression with Deep Learning

    The Satisfied User Ratio (SUR) curve for a lossy image compression scheme, e.g., JPEG, characterizes the probability distribution of the Just Noticeable Difference (JND) level, the smallest distortion level that can be perceived by a subject. We propose the first deep learning approach to predict such SUR curves. Instead of directly regressing the SUR curve itself for a given reference image, our model is trained on pairs of images, original and compressed. Relying on a Siamese Convolutional Neural Network (CNN), feature pooling, a fully connected regression head, and transfer learning, we achieved good prediction performance. Experiments on the MCL-JCI dataset showed a mean Bhattacharyya distance between the predicted and the original JND distributions of only 0.072.
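    For reference, the Bhattacharyya distance between two discrete JND distributions, and the SUR curve as the complement of the JND CDF, can be computed as follows. This is a generic sketch, not code from the paper, and the function names are illustrative.

```python
import numpy as np

def bhattacharyya_distance(p, q, eps=1e-12):
    """Distance between two discrete distributions over JND levels."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    bc = np.sum(np.sqrt(p * q))    # Bhattacharyya coefficient in [0, 1]
    return -np.log(max(bc, eps))

def sur_curve(jnd_pmf):
    """SUR(d) = P(JND > d), i.e. one minus the JND CDF."""
    return 1.0 - np.cumsum(jnd_pmf)
```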

    Regime Switch and Effect on Per Capita Food Security Issues in South Africa

    This paper examines whether the food security situation in South Africa is sensitive to past and present governance systems. The study reviews the performance of key indicators: per capita land utilization, the price index, and consumption of a major staple food commodity (maize) in the pre- and post-apartheid periods. It also aims to validate the application of population growth and food advocacy theories to South African food security. A time series analysis involving variables such as per capita land cultivation, consumption (tons), and price (per ton) of maize over the period 1970 to 2010 was conducted. A threshold autoregressive (TAR) model was used to capture the per capita food security status of South Africans and to monitor trends under the apartheid and post-apartheid eras. We found a declining trend in per capita land cultivation and mixed results for per capita consumption of maize. The study revealed that population growth in South Africa has not been harnessed and that there is a possibility of worsening food security in the country. A long-run effect between the variables was established. The study recommends per capita targeting policy strategies for the improvement of staple food production and dietary balancing to ensure sustainable food security.
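    As an illustration of the TAR approach, a two-regime TAR(1) splits observations according to whether a lagged value of the series is below or above a threshold and fits a separate AR(1) in each regime. The sketch below is a minimal hypothetical version; the threshold and delay are assumed, not the study's estimates.

```python
import numpy as np

def fit_tar1(y, threshold, delay=1):
    """Two-regime TAR(1): separate least-squares AR(1) fits depending on
    whether the delayed series is at/below or above the threshold."""
    y = np.asarray(y, dtype=float)
    y_lag, y_cur = y[:-delay], y[delay:]
    coeffs = {}
    for name, mask in [("low", y_lag <= threshold), ("high", y_lag > threshold)]:
        X = np.column_stack([np.ones(mask.sum()), y_lag[mask]])
        coeffs[name], *_ = np.linalg.lstsq(X, y_cur[mask], rcond=None)
    return coeffs  # {"low": [intercept, slope], "high": [intercept, slope]}
```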

    DeepFL-IQA: Weak Supervision for Deep IQA Feature Learning

    Multi-level deep features have been driving state-of-the-art methods for aesthetics and image quality assessment (IQA). However, most IQA benchmarks are comprised of artificially distorted images, for which features derived from ImageNet under-perform. We propose a new IQA dataset and a weakly supervised feature learning approach to train features more suitable for IQA of artificially distorted images. The dataset, KADIS-700k, is far more extensive than similar works, consisting of 140,000 pristine images and 25 distortion types, totaling 700k distorted versions. Our weakly supervised feature learning is designed as multi-task learning, using eleven existing full-reference IQA metrics as proxies for differential mean opinion scores. We also introduce a benchmark database, KADID-10k, of artificially degraded images, each subjectively annotated by 30 crowd workers. We make use of our derived image feature vectors for (no-reference) image quality assessment by training and testing a shallow regression network on this database and five other benchmark IQA databases. Our method, termed DeepFL-IQA, performs better than other feature-based no-reference IQA methods and also better than all tested full-reference IQA methods on KADID-10k. For the other five benchmark IQA databases, DeepFL-IQA matches the performance of the best existing end-to-end deep learning-based methods on average. (Dataset URL: http://database.mmsp-kn.d)
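    A weak-supervision setup of this kind can be sketched as a shared feature trunk with one regression head per proxy metric, trained jointly. The PyTorch module below is a hypothetical illustration of the multi-task idea, not the DeepFL-IQA architecture; the feature and hidden dimensions are assumed.

```python
import torch
import torch.nn as nn

class MultiTaskProxyHead(nn.Module):
    """Shared trunk with one regression head per proxy FR-IQA metric."""
    def __init__(self, feat_dim=2048, n_metrics=11):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(feat_dim, 512), nn.ReLU())
        self.heads = nn.ModuleList(nn.Linear(512, 1) for _ in range(n_metrics))

    def forward(self, feats):
        h = self.shared(feats)
        # One predicted proxy score per metric: shape (batch, n_metrics).
        return torch.cat([head(h) for head in self.heads], dim=1)
```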

    Improving Link Reliability through Network Coding in Cooperative Cellular Networks

    The paper proposes an XOR-based network-coded cooperation protocol for the uplink transmission of relay-assisted cellular networks, together with an algorithm for the selection and assignment of the relay nodes. The performance of the cooperation protocol is expressed in terms of the network decoder outage probability and the Block Error Rate of the cooperating users. These performance indicators are analyzed theoretically and by computer simulations. The relay node assignment is based on the optimization, according to several criteria, of the graph that describes the cooperation cluster formed after an initial selection of the relay nodes. The graph optimization is performed using genetic algorithms adapted to the topology of the cooperation cluster and the optimization criteria considered.
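    The core idea of XOR network coding is easy to sketch: the relay transmits the XOR of two users' packets, and a receiver that has already decoded one of them recovers the other with a second XOR. The toy example below assumes equal-length packets and error-free reception; it is illustrative only.

```python
def xor_combine(pkt_a: bytes, pkt_b: bytes) -> bytes:
    """Combine two equal-length packets into one network-coded packet."""
    assert len(pkt_a) == len(pkt_b), "toy example assumes equal lengths"
    return bytes(a ^ b for a, b in zip(pkt_a, pkt_b))

# Relay broadcasts A XOR B; the base station, having decoded A, recovers B.
pkt_a, pkt_b = b"user-A-block", b"user-B-block"
coded = xor_combine(pkt_a, pkt_b)         # relay transmission
recovered_b = xor_combine(coded, pkt_a)   # (A ^ B) ^ A == B
assert recovered_b == pkt_b
```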

    Quantitative biological studies at cellular and sub-cellular level

    Ph.D. thesis, University of Missouri-Columbia, 2007. We developed translational magnetic tweezers for quantitative measurements at the cellular and sub-cellular level. In the cellular studies, multiple membrane tethers were extracted simultaneously under constant force, transduced to eukaryotic cells through magnetic beads attached to their membranes. The tethers were characterized in terms of viscoelastic parameters, and the contribution of the actin cytoskeleton to the process of tether formation was investigated. The membrane tether system was also used to test the applicability of the Crooks fluctuation theorem (a recent finding in non-equilibrium thermodynamics) at the mesoscopic level. In the sub-cellular studies, the cytoplasmic viscoelastic coefficients of mouse oocytes were determined using magnetic beads trapped in their cytoskeletal mesh. We found that cryopreservation altered all the viscoelastic parameters. We demonstrated that reversible disassembly of the actin cytoskeleton with latrunculin A before cryopreservation increased the number of survivors and preserved their viscoelastic parameters. This finding promoted latrunculin A as a candidate cryoprotective agent.
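    Viscoelastic parameters from constant-force bead or tether experiments are commonly extracted by fitting a creep response. The Kelvin-Voigt fit below is a generic illustration under assumed units and an assumed force; it is not necessarily the specific model used in the dissertation.

```python
import numpy as np
from scipy.optimize import curve_fit

FORCE = 10e-12  # constant force in newtons (assumed for illustration)

def kelvin_voigt_creep(t, k, eta):
    """Displacement x(t) under constant force for a spring k (N/m) in
    parallel with a dashpot eta (N*s/m): x = (F/k)(1 - exp(-k*t/eta))."""
    return (FORCE / k) * (1.0 - np.exp(-k * t / eta))

# With measured time points t (s) and bead displacements x (m):
# (k_fit, eta_fit), _ = curve_fit(kelvin_voigt_creep, t, x, p0=[1e-6, 1e-6])
```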