4 research outputs found

    High-frequency Noise Removal of Vintage Songs in Audio Cassettes using Signal Processing Techniques

    From 1960 to the early 2000s, the audio cassette was one of the three most widely used storage media for prerecorded music, alongside gramophone records (LPs) and compact discs (CDs). A compact cassette consists of two miniature spools between which a magnetic tape is wound. The tape head in tape recorders and cassette players stores and plays audio tracks on this magnetic tape by converting the electric signal to a magnetic fluctuation and vice versa. The sound quality of an audio track recorded on a compact cassette is decent. However, when these cassettes are stored in the Earth’s magnetic field for long periods, the field gradually reorients the magnetic particles and distorts the recorded audio signal. As a result, high-frequency noise appears in the audio signal, causing considerable disturbance when listening to songs and other recordings on vintage cassettes. Therefore, removing such noise while minimizing the impact on the original audio signal ensures that the listener can enjoy the original quality of vintage recordings. The Fourier transform of a given distorted audio file (in WAV format) was computed in Wolfram Mathematica 12.3 to identify the high-frequency noise. A variable “m” was then introduced to the program to identify the magnitudes of high-frequency components and remove their effect on the original recording. A trial-and-error approach was taken to identify an adequate “m” value for each channel of a given audio signal. Vintage songs were used to evaluate the accuracy of the developed method. It was found that the suitable “m” value changes for each channel of each song; therefore, no constant value can be stated for “m”. The accuracy of the output also depends on the hearing range of the listener and the audio equipment being used.
As an application, this technique can be used to conserve vintage songs originally stored on compact cassettes and spools, because it allows the user to control the number of high-frequency components removed from the distorted audio signal. Keywords: Signal Processing Techniques, Fourier Transforms, Mathematica, Noise Removal, Audio Cassette
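The thresholding step described above can be sketched as follows. This is a minimal illustration, not the authors' Mathematica program: the cutoff frequency, the function name, and the reading of “m” as a per-channel magnitude threshold are all assumptions.

```python
import numpy as np

def suppress_high_freq(channel, sample_rate, cutoff_hz, m):
    """Zero FFT components above cutoff_hz whose magnitude exceeds m.
    Applied to one channel at a time; m is tuned by trial and error
    for each channel, as in the abstract."""
    spectrum = np.fft.rfft(channel)
    freqs = np.fft.rfftfreq(len(channel), d=1.0 / sample_rate)
    # Bins that are both high-frequency and large enough to count as noise
    noisy_bins = (freqs > cutoff_hz) & (np.abs(spectrum) > m)
    spectrum[noisy_bins] = 0.0
    return np.fft.irfft(spectrum, n=len(channel))
```

Because only bins exceeding the magnitude threshold are zeroed, raising or lowering m controls how many high-frequency components are removed, which matches the user control the abstract describes.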

    Noise subspaces subtraction in SVD based on the difference of variance values

    As a matrix decomposition method, Singular Value Decomposition (SVD) has been introduced to signal processing tasks such as denoising. First, a noisy signal is arranged in Hankel matrix form; through SVD, the Hankel matrix is then decomposed into two unitary matrices and a diagonal matrix in which a series of singular values are arranged in descending order. These singular values are considered to lie in a series of subspaces comprising signal subspaces and noise subspaces. The singular values in these subspaces differ because signal magnitudes dominate noise magnitudes. Therefore, if the two kinds of subspaces are well separated, an ideal denoised signal can be achieved by reconstruction. This paper improves traditional SVD denoising, which performs subspace separation well only for periodic signals. The improved SVD denoising method, based on the difference of variance values, extends SVD denoising to aperiodic signals. Denoising results from the improved SVD method, traditional SVD denoising, wavelet thresholding and EEMD denoising are compared, and the improved method achieves excellent results in numerical experiments.
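The Hankel-matrix decomposition and reconstruction pipeline described above can be sketched as follows. For simplicity this sketch keeps a fixed number of leading singular values (the traditional approach); the paper's contribution, selecting the cut automatically from the difference of variance values, is not reproduced here.

```python
import numpy as np

def hankel_svd_denoise(signal, window, rank):
    """Denoise a 1-D signal by Hankel embedding, truncated SVD, and
    diagonal averaging. `rank` sets the assumed signal-subspace size."""
    n, L = len(signal), window
    K = n - L + 1
    # Hankel (trajectory) matrix: column j holds signal[j : j + L]
    H = np.column_stack([signal[j:j + L] for j in range(K)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Keep only the leading singular values (signal subspace),
    # discarding the noise subspace
    H_denoised = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Diagonal averaging maps the rank-reduced matrix back to a 1-D signal
    out = np.zeros(n)
    counts = np.zeros(n)
    for j in range(K):
        out[j:j + L] += H_denoised[:, j]
        counts[j:j + L] += 1
    return out / counts
```

A pure sinusoid has a rank-2 Hankel embedding, so `rank=2` recovers it well; for aperiodic signals the appropriate rank is exactly what the variance-difference criterion is meant to find.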

    Methods and Models for Improving the Noise Immunity of Medical Information and Telecommunication Systems

    In this thesis, the structures of modern medical information and telecommunication systems (ITS), communication channel types, health information interchange techniques, biomedical signal types, and optimal digital filtration methods have been analyzed, and the necessity of developing new methods for optimal filtration of non-stationary noisy signals (including those with negative SNR) without a priori information about signal and noise characteristics has been justified. A method and corresponding structural-analytical model of optimal signal filtration in stationary noise environments have been developed, based on minimizing the difference between the filtered and approximated signals by the criterion of minimum mean absolute error. To improve information-signal recovery accuracy in the case of non-stationary noise, a method and corresponding structural-analytical model of adaptive sampling have been developed, based on minimizing the absolute difference between two consecutive signal-to-noise ratio estimates obtained from the optimal filtration and approximation results at two consecutive sampling rates. A structural-analytical model of advanced data communication in medical information-telecommunication systems has also been developed. Software has been developed that includes a module for generating the information and noise components of a noisy signal, a noisy-signal processing module, and a module for evaluating filtration and approximation results. A series of experiments has been carried out on optimal filtration of stationary (the sum of four harmonics) and non-stationary (electrocardiogram, electroencephalogram, modulated binary sequence) noisy digital signals in the presence of high levels of additive fluctuation (Gaussian) and impulse (Bernoulli) noise.
The developed methods of optimal signal sampling and filtration make it possible to restore the information signal without a reference signal or known noise characteristics (blind signal processing, BSP), to reduce power consumption in autonomous biomedical equipment, to reduce the bit error ratio (BER), and to relax the quality demands on medical system hardware (biomedical sensors, transmitters/receivers, storage, etc.).
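The blind filtration criterion — choosing filter settings by comparing the filtered signal against a smooth approximation of itself, rather than against a reference — can be illustrated with a crude sketch. The moving-average filter, the polynomial approximation, and the window search below are illustrative stand-ins, not the thesis's actual filtration and approximation models.

```python
import numpy as np

def blind_optimal_filter(noisy, max_window=50, poly_deg=6):
    """Pick the moving-average window whose output is closest, in mean
    absolute error, to a smooth polynomial approximation of itself.
    No clean reference signal is required (blind signal processing)."""
    # Normalized abscissa keeps the polynomial fit well conditioned
    x = np.linspace(-1.0, 1.0, len(noisy))
    best, best_mae = None, np.inf
    for w in range(2, max_window + 1):
        filtered = np.convolve(noisy, np.ones(w) / w, mode="same")
        approx = np.polyval(np.polyfit(x, filtered, poly_deg), x)
        mae = np.mean(np.abs(filtered - approx))
        if mae < best_mae:
            best_mae, best = mae, filtered
    return best
```

The key idea carried over from the abstract is that the selection criterion uses only quantities computable from the noisy signal itself, which is what makes the approach applicable when no a priori signal or noise characteristics are available.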

    Models for Efficient Automated Site Data Acquisition

    Accurate and timely data acquisition for tracking and progress reporting is essential for efficient management and successful project delivery. Considerable research work has been conducted to develop methods utilizing automated site data acquisition for tracking and progress reporting. However, these developments are challenged by the dynamic and noisy nature of construction jobsites, limited indoor localization accuracy, and the difficulty of processing data and extracting actionable information. Limited research work has attempted to study and develop customized designs of wireless sensor networks that meet these challenges and overcome the limitations of off-the-shelf technologies. The objective of this research is to study, design, configure and develop fully customized automated site data acquisition models, with a special focus on near real-time automated tracking and control of construction operations embracing cutting-edge innovations in wireless and remote sensing technologies. In this context, wireless and remote sensing technologies are integrated in two customized prototypes to monitor and collect data from construction jobsites. This data is then processed and mined to generate meaningful and actionable information. The developed prototypes are expected to have a wider scope of applications in construction management, such as improving construction safety, monitoring the condition of civil infrastructure and reducing energy consumption in buildings. Two families of prototypes were developed in this research: the Sensor-Aided GPS (SA-GPS) prototype, designed and developed for tracking outdoor construction operations such as earthmoving; and the Self-Calibrated Wireless Sensor Network (SC-WSN), designed for indoor localization and tracking of construction resources (labor, materials and equipment). These prototypes, along with their hardware and software, are encapsulated in a computational framework.
The framework houses a set of algorithms coded in C# to enable efficient data processing and fusion that support tracking and progress reporting. Both the hardware prototypes and software algorithms were progressively tested, evaluated and re-designed using a Rapid Prototyping approach. The validation process of the developed prototypes encompasses three steps: (1) simulation to validate the prototypes’ design virtually using MATLAB, (2) laboratory experiments to evaluate the prototypes’ functionality in real time, and (3) testing on scaled case studies after fine-tuning the prototype design based on the results obtained from the first two steps. The SA-GPS prototype consists of a microcontroller equipped with a GPS module as well as a number of sensors such as an accelerometer, a barometric pressure sensor, Bluetooth proximity sensors and strain gauges. The results of testing the developed SA-GPS prototype on a scaled construction jobsite indicated that it was capable of estimating project progress within 3% mean absolute percentage error and 1% standard deviation over 16 trials, compared to approximately 12% mean absolute percentage error and 2% standard deviation for standalone GPS. The SC-WSN prototype incorporates two main features. The first is the use of Kalman filtering and smoothing of the RSSI signal to provide a more stable and predictable signal for estimating the distance between a reader and a tag. The second is the use of a developed dynamic path-loss model which continually optimizes its parameters to cope with the dynamically changing construction environment using a Particle Swarm Optimization (PSO) algorithm. Laboratory testing indicated an improvement in location estimation: location estimates produced using the SC-WSN had an average error of 0.66 m, compared to 1.67 m using the raw RSSI signal. The results also indicated a 60% accuracy improvement in estimating locations using the developed dynamic model.
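The two SC-WSN features, smoothing the raw RSSI stream and mapping RSSI to distance through a path-loss model, can be sketched as follows. The filter constants and path-loss parameters below are illustrative assumptions, and the prototype's PSO-based online tuning of the path-loss parameters is omitted.

```python
def kalman_smooth_rssi(rssi_series, q=0.01, r=4.0):
    """Scalar Kalman filter over a noisy RSSI stream.
    q: process noise (how fast the true level may drift);
    r: measurement noise variance of raw RSSI readings."""
    x, p = rssi_series[0], 1.0  # state estimate and its variance
    smoothed = []
    for z in rssi_series:
        p += q                   # predict: uncertainty grows
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update toward the new reading
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed

def rssi_to_distance(rssi, rssi_at_1m=-45.0, path_loss_exp=2.5):
    """Log-distance path-loss model: distance in meters from RSSI (dBm).
    The two parameters are the ones the prototype tunes dynamically."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))
```

The smoothed RSSI varies far less than the raw readings, which is what makes the subsequent distance estimate (and any trilateration built on it) stable enough for indoor tracking.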
The developed prototypes are not only expected to reduce the risk of project cost and duration overruns through timely and early detection of deviations from the project plan, but also to enable project managers to observe and oversee their project’s status in near real time. It is expected that the accuracy of the developed hardware can also be achieved on large-scale real construction projects, because the developed prototype requires neither scalable improvements to its hardware technology nor additional computational changes to its algorithms and software.