
    Emerging technologies for the non-invasive characterization of physical-mechanical properties of tablets

    The density, porosity, breaking force, viscoelastic properties, and the presence or absence of structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms such as tablets. Irregularities in these attributes may influence drug product functionality, so accurate and efficient characterization of these properties is critical for the successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods, which suffer from several limitations, such as a lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for the non-invasive characterization of the physical-mechanical properties of tablets. These techniques include near-infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of tablet development and manufacturing at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools that support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development of the instrumentation and software, together with further application studies, is necessary for their adoption in the routine analysis and monitoring of tablet physical-mechanical properties.
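    Two of the attributes listed above, breaking force and density, are usually reported through derived quantities (tensile strength and porosity). The sketch below illustrates those standard relationships (the Fell-Newton tensile strength formula for a flat-faced cylindrical tablet and porosity as a void fraction); the numerical values are hypothetical examples and are not taken from the paper.

```python
# Minimal sketch: derived physical-mechanical attributes of a cylindrical tablet.
# The formulas are standard (Fell-Newton tensile strength, porosity from densities);
# the numeric values are hypothetical examples, not data from the paper.
import math


def tensile_strength(breaking_force_n: float, diameter_mm: float, thickness_mm: float) -> float:
    """Fell-Newton tensile strength (MPa) for a flat-faced cylindrical tablet."""
    d = diameter_mm * 1e-3   # convert mm to m
    t = thickness_mm * 1e-3
    return 2.0 * breaking_force_n / (math.pi * d * t) / 1e6  # Pa -> MPa


def porosity(bulk_density: float, true_density: float) -> float:
    """Porosity as the void fraction of the compact."""
    return 1.0 - bulk_density / true_density


if __name__ == "__main__":
    print(f"Tensile strength: {tensile_strength(120.0, 10.0, 3.5):.2f} MPa")
    print(f"Porosity: {porosity(1.25, 1.55):.1%}")
```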

    SAR data compression: Application, requirements, and designs

    The feasibility of reducing data volume and data rate is evaluated for the Earth Observing System (EOS) Synthetic Aperture Radar (SAR). All elements of the data stream, from the sensor downlink to the electronic delivery of browse data products, are explored. The factors influencing the design of a data compression system are analyzed, including the signal data characteristics, the image quality requirements, and the throughput requirements. The conclusion is that little or no reduction can be achieved in the raw signal data using traditional data compression techniques (e.g., vector quantization, adaptive discrete cosine transform) because of the phase errors they induce in the output image. After image formation, however, a number of techniques are effective for data compression.
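    To illustrate why lossy compression of the raw signal is problematic, the sketch below quantizes synthetic complex raw samples and measures the phase error this introduces; the signal statistics and the 3-bit quantizer are assumptions chosen for illustration and do not reflect the EOS SAR design.

```python
# Sketch: phase error introduced by uniformly quantizing complex raw samples.
# Synthetic Gaussian data and a 3-bit quantizer are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
raw = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)  # stand-in for complex raw echoes


def uniform_quantize(x: np.ndarray, bits: int) -> np.ndarray:
    """Quantize real and imaginary parts independently with a uniform mid-rise quantizer."""
    step = 2.0 * np.max(np.abs(np.r_[x.real, x.imag])) / (2 ** bits)
    q = lambda v: (np.floor(v / step) + 0.5) * step
    return q(x.real) + 1j * q(x.imag)


quantized = uniform_quantize(raw, bits=3)
phase_error = np.angle(quantized * np.conj(raw))  # per-sample phase error in radians
print(f"RMS phase error: {np.degrees(np.sqrt(np.mean(phase_error ** 2))):.1f} deg")
```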

    Rapid deconvolution of low-resolution time-of-flight data using Bayesian inference

    The deconvolution of low-resolution time-of-flight data has numerous advantages, including the ability to extract additional information from the experimental data. We augment the well-known Lucy-Richardson deconvolution algorithm with various Bayesian prior distributions and show that a prior on the second differences of the signal outperforms the standard Lucy-Richardson algorithm, accelerating the rate of convergence by more than a factor of four while preserving the peak amplitude ratios of a similar fraction of the total peaks. A novel stopping criterion and boosting mechanism are implemented to ensure that these methods converge to a similar final entropy and that local minima are avoided. An improvement in mass resolution by a factor of two allows more accurate quantification of the spectra. The general method is demonstrated in this paper through the deconvolution of fragmentation peaks of the 2,5-dihydroxybenzoic acid matrix and the benzyltriphenylphosphonium thermometer ion, following femtosecond ultraviolet laser desorption.
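    For orientation, a minimal sketch of the standard (unregularized) Lucy-Richardson update for a one-dimensional spectrum is shown below. The Bayesian second-difference prior, stopping criterion and boosting described in the abstract are not reproduced here, and the kernel and test signal are synthetic assumptions.

```python
# Minimal sketch of the standard Lucy-Richardson deconvolution for a 1D spectrum.
# The Bayesian second-difference prior, stopping rule and boosting from the paper
# are NOT implemented here; the kernel and test signal are synthetic assumptions.
import numpy as np


def lucy_richardson_1d(observed: np.ndarray, kernel: np.ndarray, iterations: int = 200) -> np.ndarray:
    """Standard multiplicative Lucy-Richardson update for a 1D spectrum."""
    kernel = kernel / kernel.sum()              # instrument response must be normalized
    mirror = kernel[::-1]
    estimate = np.full(observed.shape, observed.mean(), dtype=float)
    for _ in range(iterations):
        blurred = np.convolve(estimate, kernel, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # guard against division by zero
        estimate *= np.convolve(ratio, mirror, mode="same")
    return estimate


if __name__ == "__main__":
    truth = np.zeros(200)
    truth[[60, 70, 140]] = [1.0, 0.6, 0.8]                    # sharp "fragment peaks"
    kernel = np.exp(-0.5 * (np.arange(-15, 16) / 4.0) ** 2)   # Gaussian instrument response
    observed = np.convolve(truth, kernel / kernel.sum(), mode="same")
    recovered = lucy_richardson_1d(observed, kernel)
    print("Largest recovered peaks at indices:", np.sort(np.argsort(recovered)[-3:]))
```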

    Dynamic image data compression in spatial and temporal domains : theory and algorithm

    Author name used in this publication: Dagan Feng. Version of Record.

    The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Recent and forthcoming advances in instrumentation, and giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data sets create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have been addressed by other disciplines such as applied mathematics, statistics and machine learning and have been utilized by other sciences such as space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found only in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments, some "old" and some new. These are generally unknown to most of the astronomical community, but are vital to the analysis and visualization of complex datasets and images. In order for astronomers to take advantage of the richness and complexity of the new era of data, and to be able to identify, adopt, and apply new solutions, the astronomical community needs a certain degree of awareness and understanding of the new concepts. One of the goals of this paper is to help bridge the gap between applied mathematics, artificial intelligence and computer science on the one side and astronomy on the other.
    Comment: 24 pages, 8 figures, 1 table. Accepted for publication in "Advances in Astronomy", special issue "Robotic Astronomy".

    On the validation of solid mechanics models using optical measurements and data decomposition

    Engineering simulation has a significant role in the design and analysis of most engineered products at all scales and is used to provide elegant, light-weight, optimized designs. A major step in achieving high confidence in computational models with good predictive capabilities is model validation. It is normal practice to validate simulation models by comparing their numerical results to experimental data. However, current validation practices tend to focus on identifying hot-spots in the data and checking that the experimental and modeling results are in satisfactory agreement in these critical zones. Often the comparison is restricted to a single point, or a few points, where the maximum stress/strain is predicted by the model. The objective of the present paper is to demonstrate a step-by-step approach for performing model validation by combining full-field optical measurement methodologies with computational simulation techniques. Two important issues of the validation procedure are discussed, i.e. effective techniques for performing data compression using the principles of orthogonal decomposition, and methodologies to quantify the quality of simulations and make decisions about model validity. An I-beam with open holes under three-point bending is selected as an exemplar of the methodology. Orthogonal decomposition with Zernike shape descriptors is performed to compress large amounts of numerical and experimental data in selected regions of interest (ROI), reducing their dimensionality while preserving information; and different comparison techniques, including traditional error norms, a linear comparison methodology and a concordance correlation coefficient, are used in order to make decisions about the validity of the simulation.
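    As an illustration of the final comparison step, the sketch below computes Lin's concordance correlation coefficient between two vectors of shape descriptors (experimental versus simulated). The Zernike decomposition itself is not reproduced, and the descriptor values are hypothetical.

```python
# Sketch: Lin's concordance correlation coefficient between experimental and
# simulated shape-descriptor vectors. The descriptor values are hypothetical;
# the Zernike decomposition step from the paper is not reproduced here.
import numpy as np


def concordance_ccc(x: np.ndarray, y: np.ndarray) -> float:
    """Lin's concordance correlation coefficient (measures agreement, not just correlation)."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    covariance = np.mean((x - mx) * (y - my))
    return 2.0 * covariance / (vx + vy + (mx - my) ** 2)


experimental = np.array([0.82, -0.11, 0.35, 0.04, -0.27])  # hypothetical descriptor coefficients
simulated = np.array([0.79, -0.09, 0.38, 0.06, -0.25])
print(f"Concordance: {concordance_ccc(experimental, simulated):.3f}")
```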

    Environmental monitoring: landslide assessment and risk management (Test site: Vernazza, Cinque Terre Natural Park)

    Natural disasters, whether of meteorological origin, such as cyclones, floods, tornadoes and droughts, or of geological nature, such as earthquakes, volcanoes and landslides, are well known for their devastating impacts on human life, the economy and the environment. Over recent decades, people and societies have become more vulnerable; although the frequency of natural events may be constant, human activities contribute to their increased intensity. Indeed, every year millions of people are affected by natural disasters globally and, in the last decade alone, more than 80% of all disaster-related deaths were caused by natural hazards. This PhD work is part of a broader effort to develop and support methodologies that improve the management of environmental emergencies. In particular, it focuses on environmental monitoring and disaster risk management, a systematic approach to identifying, assessing and reducing the potential risks produced by a disaster. This method (Disaster Risk Management) aims to reduce socio-economic vulnerabilities and deals with both natural and man-made events. Within the thesis, slope movements are evaluated in particular. Slope failures are generally not as costly as earthquakes or major floods, but they are more widespread and, over the years, may cause more property loss than any other geological hazard. In many developing regions slope failures have a continuing and serious impact on the social and economic structure. The Italian territory, specifically, has always been subject to instability phenomena because of its geological and morphological characteristics and because of "extreme" weather events that recur more frequently than in the past, in connection with climate change. After earthquakes, these disasters currently cause the largest number of victims and the greatest damage to settlements, infrastructure and the historical and cultural environment. Urban development, especially in recent decades, has increased both the assets at risk and the extent of unstable areas, often because poorly designed human interventions have destabilized places previously considered "safe". Prevention is therefore essential to minimize the damage caused by landslides. The objectives of the research were to investigate the different techniques and assess their potential, in order to identify the most appropriate instrument for landslide hazard assessment as the best compromise between the time needed to perform the analysis and the expected results. The aim is to evaluate which methodologies are best suited to each scenario, taking into consideration both achievable accuracies and time constraints. The strengths, weaknesses and limitations inherent in each methodology are carefully considered. The characteristics of geographic, or geospatial, information technologies facilitate the integration of scientific, social and economic data, opening up interesting possibilities for monitoring, assessment and change detection activities, thus enabling better informed interventions in human and natural systems. This is an important factor for the success of emergency operations and for developing valuable natural disaster preparedness, mitigation and prevention systems.
    The test site was the municipality of Vernazza, which in October 2011 was subject to an extreme rainfall event that triggered a series of landslides along the Vernazzola stream, aggravating the flood that affected the watercourse.

    A multi-objective performance optimisation framework for video coding

    Digital video technologies have become an essential part of the way visual information is created, consumed and communicated. At the same time, the unprecedented growth of digital video has made competition for bandwidth resources fierce, highlighting a critical need for optimising the performance of video encoders. This creates a dual optimisation problem, in which the objective is to reduce buffer and memory requirements while maintaining the quality of the encoded video. Additionally, analysis of existing video compression techniques showed that the operation of video encoders requires the optimisation of numerous decision parameters to achieve the best trade-offs between factors that affect visual quality, given the resource limitations arising from operational constraints such as memory and complexity. The research in this thesis has focused on optimising the performance of the H.264/AVC video encoder, a process that involved finding solutions for multiple conflicting objectives. As part of this research, an automated tool has been developed for optimising video compression to achieve an optimal trade-off between bit rate and visual quality, given maximum allowed memory and computational complexity constraints, across a diverse range of scene environments. The evaluation of this optimisation framework has highlighted the effectiveness of the developed solution.
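    To illustrate the kind of trade-off such a framework searches over, the sketch below selects Pareto-optimal encoder configurations from (bit rate, distortion) pairs subject to memory and complexity caps; the configurations and constraint values are hypothetical and are not the H.264/AVC decision parameters optimised in the thesis.

```python
# Sketch: selecting Pareto-optimal encoder configurations under resource caps.
# The configurations and constraint values are hypothetical illustrations, not
# the actual H.264/AVC decision parameters optimised in the thesis.
from dataclasses import dataclass


@dataclass
class Config:
    name: str
    bitrate_kbps: float   # lower is better
    distortion: float     # quality loss score, lower is better
    memory_mb: float      # buffer/memory footprint
    complexity: float     # relative encoding cost


def pareto_front(configs, max_memory_mb, max_complexity):
    """Keep feasible configurations not dominated in both bit rate and distortion."""
    feasible = [c for c in configs
                if c.memory_mb <= max_memory_mb and c.complexity <= max_complexity]
    front = []
    for c in feasible:
        dominated = any(
            o.bitrate_kbps <= c.bitrate_kbps and o.distortion <= c.distortion
            and (o.bitrate_kbps < c.bitrate_kbps or o.distortion < c.distortion)
            for o in feasible)
        if not dominated:
            front.append(c)
    return front


configs = [
    Config("fast", 900, 0.045, 48, 1.0),
    Config("balanced", 700, 0.038, 64, 1.8),
    Config("slow", 620, 0.036, 96, 3.2),
]
for c in pareto_front(configs, max_memory_mb=80, max_complexity=2.0):
    print(c.name, c.bitrate_kbps, c.distortion)
```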