996 research outputs found
X-Ray Diffraction
A Parkland College A with Honors project, this paper reviews X-ray diffraction, describing the process, its developers, and its uses.
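The core relation behind X-ray diffraction is Bragg's law, n·λ = 2d·sin θ, which links the wavelength and diffraction angle to the spacing of crystal planes. A minimal illustration (the wavelength and angle below are example values, not taken from the paper):

```python
import math

def bragg_spacing(wavelength_angstrom, theta_deg, order=1):
    # Bragg's law: n * wavelength = 2 * d * sin(theta),
    # solved here for the interplanar spacing d.
    return order * wavelength_angstrom / (2 * math.sin(math.radians(theta_deg)))

# Example: Cu K-alpha radiation (1.5406 angstrom) diffracted at theta = 19.1 degrees.
d = bragg_spacing(1.5406, 19.1)
print(round(d, 3))  # interplanar spacing in angstroms, about 2.354
```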
Effects of Retarder on Cement
Applying chemical analysis skills gained from taking a chemistry class, the author reflects on his summer employment at a chemical plant, where he tested the effect tartaric acid had on one of the company's products, Redline Speed Crete.
Stratigraphy of the Garden City formation in northeastern Utah, and its trilobite faunas
This is a report on the stratigraphy and the trilobite faunas of the Garden City formation as exposed in portions of northeastern Utah and southeastern Idaho.
Clustering for Classification
Advances in technology have provided industry with an array of devices for collecting data. The frequency and scale of data collection mean that many large datasets are now being generated. To find patterns in these datasets it would be useful to apply modern classification methods such as support vector machines. Unfortunately these methods are computationally expensive (quadratic in the number of data points), so they cannot be applied directly.
This thesis proposes a framework whereby a variety of clustering methods can be used to summarise datasets, that is, to reduce them to a smaller but still representative dataset so that these advanced methods can be applied. It compares the results of using this framework against random selection on a large number of classification and regression problems. Results show that the clustered datasets are on average fifty per cent smaller than the originals with no loss of classification accuracy, which is significantly better than random selection. They also show that there is no free lunch: for each dataset it is important to choose the clustering method carefully.
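The summarise-then-classify idea can be sketched with plain NumPy: cluster each class separately, keep only the centroids as a labelled surrogate training set, and fit the expensive classifier to that. The k-means reduction and nearest-centroid classifier below are stand-ins for the thesis's clustering methods and SVM, chosen only to keep the sketch self-contained:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Plain Lloyd's algorithm: returns k centroids summarising X.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids

def summarise(X, y, k_per_class):
    # Cluster each class separately so every centroid inherits a clean label.
    Xs, ys = [], []
    for c in np.unique(y):
        cents = kmeans(X[y == c], k_per_class, seed=int(c))
        Xs.append(cents)
        ys.append(np.full(len(cents), c))
    return np.vstack(Xs), np.concatenate(ys)

# Two well-separated Gaussian blobs, 500 points each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(5, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)

# 1000 points reduced to 20 labelled centroids (a 50x reduction).
Xr, yr = summarise(X, y, k_per_class=10)

# Nearest-centroid stand-in for the downstream classifier.
pred = yr[((X[:, None, :] - Xr[None, :, :]) ** 2).sum(-1).argmin(1)]
print(len(Xr), (pred == y).mean())
```

On this easy synthetic problem the 50x-smaller surrogate set classifies the original points essentially without loss, which is the behaviour the thesis reports on real datasets.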
A hybrid error control and artifact detection mechanism for robust decoding of H.264/AVC video sequences
This letter presents a hybrid error control and artifact detection (HECAD) mechanism which can be used to enhance the error resilient capabilities of the standard H.264/advanced video coding (AVC) codec. The proposed solution first exploits the residual source redundancy to recover the most likely H.264/AVC bitstream. If error recovery is unsuccessful, the remaining corrupted slices are then passed through a pixel-level artifact detection mechanism to detect the visually impaired macroblocks to be concealed. The proposed HECAD algorithm achieves overall peak signal-to-noise ratio gains between 0.4 dB and 4.5 dB relative to the standard, with no additional bandwidth requirement. The cost of this solution translates into a marginal increase in the complexity of the decoder. In addition, this method can be applied in conjunction with other error resilient strategies and scales well with different encoding configurations.
peer-reviewed
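PSNR, the quality metric quoted in this and the following abstracts, is the standard log-scale measure of reconstruction fidelity. A minimal implementation for 8-bit frames (the formula is standard; the frames here are synthetic):

```python
import numpy as np

def psnr(ref, rec, peak=255.0):
    # Peak signal-to-noise ratio in dB between a reference frame and its
    # reconstruction; higher is better, identical frames give infinity.
    mse = np.mean((ref.astype(np.float64) - rec.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Synthetic example: a uniform error of 16 grey levels on every pixel.
ref = np.zeros((16, 16), dtype=np.uint8)
rec = np.full((16, 16), 16, dtype=np.uint8)
print(round(psnr(ref, rec), 2))  # about 24.05 dB
```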
Accurate modelling of Ka-band videoconferencing systems based on the quality of experience
This work formed part of the project TWISTER, which was financially supported under the European Union 6th Framework Programme (FP6). The authors are solely responsible for the contents of the paper, which does not represent the opinion of the European Commission.
Ka-band satellite multimedia communication networks play an important role because of their capability to provide the required bandwidth in remote parts of the globe. In practice, however, their design complexity leads to poor design and performance degradation: acceptable end-user satisfaction can only be guaranteed at extremely low bit error rates, a constraint emphasised by the vulnerability of compressed video content to transmission errors and often impossible to verify during the service development phase. A novel discrete event simulation model is presented which provides performance estimation for such systems based on subjective measurement of the quality of experience. The authors show that the proposed model reduces implementation cost and is flexible enough to be used for different network topologies around the globe.
peer-reviewed
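The link between raw bit error rate and usable video is easy to see with a toy simulation: with independent bit errors, a packet survives only if every one of its bits does, so even tiny BERs corrupt a measurable fraction of packets. The packet size and BER below are illustrative values, not figures from the paper:

```python
import random

def corrupted_packet_count(n_packets, bits_per_packet, ber, seed=0):
    # With independent bit errors, P(packet corrupted) = 1 - (1 - ber)^bits.
    p_corrupt = 1 - (1 - ber) ** bits_per_packet
    rng = random.Random(seed)
    return p_corrupt, sum(rng.random() < p_corrupt for _ in range(n_packets))

# A BER of 1e-6 still corrupts roughly 1% of 10,000-bit packets.
p, n_bad = corrupted_packet_count(100_000, 10_000, 1e-6)
print(round(p, 4), n_bad)
```

This is why compressed video, where a single corrupted packet can damage a whole slice, forces service designers toward extremely low BERs.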
A support vector machine approach for detection and localization of transmission errors within standard H.263++ decoders
Wireless multimedia services are becoming increasingly popular, boosting the need for better quality-of-experience (QoE) with minimal costs. The standard codecs employed by these systems remove spatio-temporal redundancies to minimize the bandwidth required. However, this increases the exposure of the system to transmission errors, thus presenting a significant degradation in perceptual quality of the reconstructed video sequences. A number of mechanisms were investigated in the past to make these codecs more robust against transmission errors. Nevertheless, these techniques achieved little success, forcing the transmission to be held at lower bit-error rates (BERs) to guarantee acceptable quality. This paper presents a novel solution to this problem based on the error detection capabilities of the transport protocols to identify potentially corrupted groups-of-blocks (GOBs). The algorithm uses a support vector machine (SVM) at its core to localize visually impaired macroblocks (MBs) that require concealment within these GOBs. Hence, this method drastically reduces the region to be concealed compared to state-of-the-art error resilient strategies, which assume a packet loss scenario. Testing on a standard H.263++ codec confirms that a significant gain in quality is achieved, with error detection rates of 97.8% and peak signal-to-noise ratio (PSNR) gains of up to 5.33 dB. Moreover, most of the undetected errors produce minimal visual artifacts and are thus of little influence on the perceived quality of the reconstructed sequences.
peer-reviewed
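The classification step can be illustrated with a minimal linear SVM trained by Pegasos-style subgradient descent on the hinge loss. The two "macroblock features" used here (synthetic stand-ins for, say, boundary-discontinuity and texture measures) are hypothetical; the paper's actual feature set and kernel are not reproduced:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    # Pegasos-style stochastic subgradient descent on the hinge loss.
    # Labels y must be in {-1, +1}; the bias is folded in as a constant feature.
    Xa = np.hstack([X, np.ones((len(X), 1))])
    rng = np.random.default_rng(seed)
    w, t = np.zeros(Xa.shape[1]), 0
    for _ in range(epochs):
        for i in rng.permutation(len(Xa)):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            if y[i] * (Xa[i] @ w) < 1:  # margin violated: shrink and correct
                w = (1 - eta * lam) * w + eta * y[i] * Xa[i]
            else:  # margin satisfied: shrink only (regularisation)
                w = (1 - eta * lam) * w
    return w

# Synthetic features: clean macroblocks cluster low, impaired ones high.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(1, 0.8, (200, 2)), rng.normal(4, 0.8, (200, 2))])
y = np.array([-1] * 200 + [1] * 200)

w = train_linear_svm(X, y)
Xa = np.hstack([X, np.ones((len(X), 1))])
acc = (np.sign(Xa @ w) == y).mean()
print(acc)
```

On well-separated synthetic features the learned hyperplane classifies nearly all macroblocks correctly; the paper's 97.8% detection rate suggests its real feature space is similarly discriminative.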
Robust decoder-based error control strategy for recovery of H.264/AVC video content
Real-time wireless conversational and broadcasting multimedia applications offer particular transmission challenges, as reliable content delivery cannot be guaranteed. The undelivered and erroneous content causes significant degradation in quality of experience. The H.264/AVC standard includes several error resilient tools to mitigate this effect on video quality. However, the methods implemented by the standard are based on a packet-loss scenario, where corrupted slices are dropped and the lost information concealed. Partially damaged slices still contain valuable information that can be used to enhance the quality of the recovered video. This study presents a novel error recovery solution that relies on a joint source-channel decoder to recover only feasible slices. A major advantage of this decoder-based strategy is that it grants additional robustness while keeping the same transmission data rate. Simulation results show that the proposed approach manages to completely recover 30.79% of the corrupted slices. This provides frame-by-frame peak signal-to-noise ratio (PSNR) gains of up to 18.1 dB, a result which, to the knowledge of the authors, is superior to all other joint source-channel decoding methods found in the literature. Furthermore, this error resilient strategy can be combined with other error resilient tools adopted by the standard to enhance their performance.
peer-reviewed
A robust error detection mechanism for H.264/AVC coded video sequences based on support vector machines
Current trends in wireless communications provide fast and location-independent access to multimedia services. Due to its high compression efficiency, H.264/AVC is expected to become the dominant underlying technology in the delivery of future wireless video applications. The error resilient mechanisms adopted by this standard alleviate the problem of spatio-temporal propagation of visual artifacts caused by transmission errors by dropping and concealing all macroblocks (MBs) contained within corrupted segments, including uncorrupted MBs. Concealing these uncorrupted MBs generally causes a reduction in quality of the reconstructed video sequence.
peer-reviewed
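The concealment step these abstracts refer to is often as simple as temporal replacement: copy the co-located macroblock from the previous decoded frame. A minimal sketch (the frame sizes and corruption mask are synthetic; real decoders also use motion-compensated variants):

```python
import numpy as np

def conceal(curr, prev, corrupt, mb=16):
    # Replace each macroblock flagged in `corrupt` (a grid of booleans,
    # one per 16x16 block) with the co-located block of the previous frame.
    out = curr.copy()
    for r, c in zip(*np.nonzero(corrupt)):
        out[r * mb:(r + 1) * mb, c * mb:(c + 1) * mb] = \
            prev[r * mb:(r + 1) * mb, c * mb:(c + 1) * mb]
    return out

# Two 32x32 frames; mark the top-right macroblock as corrupted.
prev = np.full((32, 32), 100, dtype=np.uint8)
curr = np.full((32, 32), 50, dtype=np.uint8)
mask = np.zeros((2, 2), dtype=bool)
mask[0, 1] = True

out = conceal(curr, prev, mask)
print(out[0, 16], out[0, 0])  # 100 50
```

Because whole corrupted segments are concealed this way, every falsely flagged MB replaces good current-frame data with stale data, which is exactly the quality loss the abstract describes.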
Service Oriented Grid Computing Model as a means of Cost Sharing in the Institutions of Higher Learning in Kenya
The use of distributed systems by enterprises and academic institutions has increased exponentially in recent years, enabled by factors such as ready access to the Internet and the World Wide Web, the maturity and ubiquity of the HTTP protocol, and improvements in secure communication technology. In the early days, distributed applications communicated using proprietary protocols, and system administrators used ad hoc (improvised) methods to manage systems that might be across town, on another continent, or anywhere in between. Numerous standards have been developed over the years to ease the costs of deployment and maintenance, with varying degrees of success. Today, the key technologies in distributed systems are service-oriented architecture (SOA), Web services, and grid computing, all of which are seeing significant investment in standardisation and increasingly rapid adoption by organisations of all types and sizes. Academic institutions in Kenya have seen an increase in the number of students admitted as well as a reduction in the central government funding available to purchase more computer systems and procure management information systems. In this paper we offer a high-level description of each of these technologies and show how they can be used to develop a cost-effective, co-funded, dynamic system that can be used by the institutions.