3,740 research outputs found

    Digital watermarking and novel security devices

    EThOS - Electronic Theses Online Service, United Kingdom

    A study and some experimental work of digital image and video watermarking

    The rapid growth of digitized media and the emergence of digital networks have created a pressing need for copyright protection and anonymous communication schemes. Digital watermarking (or, more generally, data hiding) is a steganographic technique that adds information to a digital data stream. Several of the most important watermarking schemes applied to multilevel and binary still images and to digital video were studied, including schemes based on the DCT (Discrete Cosine Transform), the DWT (Discrete Wavelet Transform) and fractal transforms. The question of whether these invisible watermarking techniques can resolve the issue of rightful ownership of intellectual property was discussed. The watermarking schemes were further studied from the point of view of malicious attacks, which is considered an effective way to advance watermarking techniques; in particular, StirMark robustness tests based on geometrical distortion were carried out. A binary watermarking scheme applied in the DCT domain is presented in this research project. The binarization procedure necessarily encountered when dealing with binary document images has such a strong effect that most conventional embedding schemes fail on such images, so particular measures have to be taken. The initial simulation results indicate that the proposed technique is promising, though further work is needed.
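
    As a rough illustration of the kind of DCT-domain binary embedding discussed above (a generic textbook scheme, not the author's actual algorithm), the sketch below hides one watermark bit per 8x8 block by forcing the sign of one mid-frequency coefficient; the block size, the chosen coefficient and the strength alpha are illustrative assumptions.

```python
import numpy as np
from scipy.fftpack import dct, idct


def dct2(block):
    """2-D type-II DCT with orthonormal scaling."""
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")


def idct2(block):
    """2-D inverse DCT with orthonormal scaling."""
    return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")


def embed_bits(image, bits, alpha=8.0, coeff=(3, 2)):
    """Hide one bit per 8x8 block by forcing the sign of a mid-frequency coefficient."""
    out = image.astype(float).copy()
    h, w = out.shape
    idx = 0
    for r in range(0, h - 7, 8):
        for c in range(0, w - 7, 8):
            if idx >= len(bits):
                return out
            block = dct2(out[r:r + 8, c:c + 8])
            block[coeff] = alpha if bits[idx] else -alpha   # +alpha encodes 1, -alpha encodes 0
            out[r:r + 8, c:c + 8] = idct2(block)
            idx += 1
    return out


def extract_bits(image, n_bits, coeff=(3, 2)):
    """Blind extraction: read back the sign of the same coefficient in each block."""
    bits = []
    h, w = image.shape
    for r in range(0, h - 7, 8):
        for c in range(0, w - 7, 8):
            if len(bits) >= n_bits:
                return bits
            bits.append(int(dct2(image[r:r + 8, c:c + 8].astype(float))[coeff] > 0))
    return bits
```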

    Significance of log-periodic precursors to financial crashes

    We clarify the status of log-periodicity associated with speculative bubbles preceding financial crashes. In particular, we address Feigenbaum's [2001] criticism and show how it can be rebutted. Feigenbaum's main result is as follows: "the hypothesis that the log-periodic component is present in the data cannot be rejected at the 95% confidence level when using all the data prior to the 1987 crash; however, it can be rejected by removing the last year of data" (e.g., by removing 15% of the data closest to the critical point). We stress that it is naive to expect to analyze a critical-point phenomenon, i.e., a power-law divergence, reliably after removing the most important part of the data, namely the part closest to the critical point. We also recount the history of log-periodicity in the present context, explaining its essential features and why it may be important. We offer an extension of the rational-expectation bubble model to general and arbitrary risk aversion within the general stochastic discount factor theory. We suggest guidelines for using log-periodicity and explain how to develop and interpret statistical tests of log-periodicity. We discuss the issue of prediction based on our results and on the evidence of outliers in the distribution of drawdowns. New statistical tests demonstrate that the 1% to 10% quantile of the largest events in the population of drawdowns of the Nasdaq Composite index and of the Dow Jones Industrial Average belong to a distribution significantly different from the rest of the population. This suggests that very large drawdowns result from an amplification mechanism that may make them more predictable than smaller market moves. Comment: LaTeX document of 38 pages including 16 eps figures and 3 tables; in press in Quantitative Finance.
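
    For context, the log-periodic power-law form usually fitted in this literature (the standard Johansen-Ledoit-Sornette expression, quoted from the general literature rather than from this paper's text) models the pre-crash price as

    $$\ln p(t) \approx A + B\,(t_c - t)^{m} + C\,(t_c - t)^{m}\cos\big(\omega \ln(t_c - t) + \phi\big), \qquad t < t_c,$$

    where $t_c$ is the critical time, $m$ the power-law exponent, $\omega$ the angular log-frequency and $\phi$ a phase; the cosine term is the log-periodic component whose statistical significance is being tested.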

    Data hiding in images based on fractal modulation and diversity combining

    The current work provides a new data-embedding infrastructure based on fractal modulation. The embedding problem is tackled from a communications point of view: the data to be embedded becomes the signal to be transmitted through a watermark channel, which could be the image itself or some manipulation of the image. The image self noise and the noise due to attacks are the two sources of noise in this paradigm. At the receiver, the image self noise has to be suppressed, while noise due to attacks may sometimes be predicted and inverted. The concepts of fractal modulation and deterministic self-similar signals are extended to 2-dimensional images. These novel techniques are used to build a deterministic bi-homogeneous watermark signal that embodies the binary data to be embedded. The binary data is repeated and scaled with different amplitudes at each level and is used as the wavelet decomposition pyramid. The binary data is appended with special marking data, which is used during demodulation to identify and correct unreliable or distorted blocks of wavelet coefficients. This specially constructed pyramid is inverted using the inverse discrete wavelet transform to obtain the self-similar watermark signal. In the data-embedding stage, the well-established linear additive technique is used to add the watermark signal to the cover image to generate the watermarked (stego) image. Data extraction from a potential stego image is done using diversity combining. Neither the original image nor the original binary sequence (or watermark signal) is required during extraction. A prediction of the original image is obtained using a cross-shaped window and is used to suppress the image self noise in the potential stego image. The resulting signal is then decomposed using the discrete wavelet transform; the number of levels and the wavelet used are the same as those used in the watermark signal generation stage. A thresholding process similar to wavelet de-noising is used to identify whether a particular coefficient is reliable or not. A decision is made as to whether a block is reliable or not based on the marking data present in each block, and corrections are sometimes applied to the blocks. Finally, the selected blocks are combined following the diversity-combining strategy to extract the embedded binary data.
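
    A minimal sketch of the core mechanism (assumed shapes, wavelet and amplitudes, not the thesis's exact construction): the same +/-1 bit pattern is placed in every detail band of a wavelet pyramid with level-dependent amplitude, the pyramid is inverted to obtain a self-similar watermark that is added to the cover image, and at the receiver the per-band bit estimates are averaged (diversity combining) after subtracting a prediction of the cover.

```python
import numpy as np
import pywt  # PyWavelets


def make_watermark(bits, shape, levels=3, wavelet="haar", base_amp=4.0):
    """Self-similar watermark: the same bit pattern fills every detail band, scaled per level."""
    symbols = 2.0 * np.asarray(bits, float) - 1.0             # {0,1} -> {-1,+1}
    dummy = pywt.wavedec2(np.zeros(shape), wavelet, level=levels)
    coeffs = [np.zeros_like(dummy[0])]                         # empty approximation band
    for lvl, (cH, _, _) in enumerate(dummy[1:], start=1):
        tile = np.resize(symbols, cH.shape)                    # repeat the bits to fill the band
        amp = base_amp * 2 ** (levels - lvl)                   # coarser levels get larger amplitude
        coeffs.append((amp * tile, amp * tile, amp * tile))
    return pywt.waverec2(coeffs, wavelet)[:shape[0], :shape[1]]


def recover_bits(stego, cover_estimate, n_bits, levels=3, wavelet="haar"):
    """Diversity combining: average the bit estimates recovered from every detail band."""
    residual = stego - cover_estimate                          # suppress the image self noise
    acc = np.zeros(n_bits)
    for cH, cV, cD in pywt.wavedec2(residual, wavelet, level=levels)[1:]:
        for band in (cH, cV, cD):
            flat = band.ravel()
            usable = (flat.size // n_bits) * n_bits
            if usable:
                acc += flat[:usable].reshape(-1, n_bits).mean(axis=0)
    return (acc > 0).astype(int)
```

    Embedding is the linear additive step the abstract mentions, stego = cover + make_watermark(bits, cover.shape); at the receiver, cover_estimate would come from a local prediction of the image (the thesis uses a cross-shaped window).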

    RESEARCH ON THE FRACTAL STATISTICAL CHARACTERISTICS AS POSSIBLE PROGNOSTIC PARAMETERS FOR EARTHQUAKES, GENERATED IN THE SEISMIC ZONE OF VRANCEA, IN THE PERIOD BETWEEN 01.08.2016 AND 30.12.2016

    Fractal analysis is an excellent alternative method for decoding the structure of seismic noise, and fractal analysis of microseismic noise may also be an appropriate method for detecting earthquake precursors. The scientific goal is to detect standard signals, based on the focal mechanisms of different earthquakes, while separating out the "individual" behavior of the elements of the monitoring system. A method for describing low-frequency microseismic noise from the network of seismic stations in the seismically active Vrancea region is used. Seismic records of twenty-three broadband stations were analyzed, situated at distances of 20 to 500 km from the Vrancea earthquakes with magnitudes Mw=5.7 and Mw=5.6 on September 23 and December 27, 2016, respectively. Daily estimates of three multifractal parameters (characteristics of the multifractal singularity spectra of the waveforms) from each station were used for the description. The present paper is a continuation of previous work [Oynakov et al., 2019], where synchronization effects in the low-frequency microseismic field were found before the Vrancea earthquake with magnitude Mp=5.6 on October 28, 2019. The study shows that the noise coherence measure increased for stations closer to the epicenter. However, the question of the source of this coherence remains open.
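
    A rough sketch of how one scalar multifractal parameter, the width of the singularity spectrum, can be estimated from a noise record with multifractal DFA in the spirit of Kantelhardt et al.; the scales, q-range and first-order detrending are illustrative choices rather than the settings used in the paper, and the input is assumed to be a single station's low-frequency noise waveform of at least a few thousand samples.

```python
import numpy as np


def singularity_spectrum_width(signal, scales=(16, 32, 64, 128, 256),
                               q_values=np.arange(-4.0, 4.5, 0.5)):
    """Width (delta alpha) of the multifractal singularity spectrum via bare-bones MFDFA."""
    x = np.asarray(signal, float)
    profile = np.cumsum(x - x.mean())                     # integrated, mean-removed record
    q_values = np.asarray(q_values, float)
    Fq = np.zeros((len(q_values), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        t = np.arange(s)
        var = np.empty(n_seg)
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrending
            var[v] = np.mean((seg - trend) ** 2)
        for i, q in enumerate(q_values):
            if abs(q) < 1e-9:
                Fq[i, j] = np.exp(0.5 * np.mean(np.log(var)))
            else:
                Fq[i, j] = np.mean(var ** (q / 2.0)) ** (1.0 / q)
    # generalized Hurst exponents h(q): slopes of log F_q(s) versus log s
    hq = np.array([np.polyfit(np.log(scales), np.log(Fq[i]), 1)[0]
                   for i in range(len(q_values))])
    tau = q_values * hq - 1.0                             # mass exponents tau(q)
    alpha = np.gradient(tau, q_values)                    # singularity strengths
    return alpha.max() - alpha.min()                      # spectrum width
```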

    Process of Fingerprint Authentication using Cancelable Biohashed Template

    Template protection using cancelable biometrics prevents data loss and the hacking of stored templates by providing considerable privacy and security. Hashing and salting techniques are used to build resilient systems. The salted-password method protects passwords against several types of attack, namely brute-force, dictionary and rainbow-table attacks: random data is added to the input of the hash function to ensure a unique output, so salts act as speed bumps on an attacker's road to breaching user data. This research proposes a contemporary two-factor authenticator called biohashing. The biohashing procedure is implemented as iterated inner products between a key generated by a pseudo-random number generator and the fingerprint features, which form a network of minutiae. Cancelable template authentication used at a fingerprint-based sales counter accelerates the payment process. The fingerhash is the code produced by applying biohashing to a fingerprint: a binary string obtained by taking the sign of each inner product with respect to a preset threshold. The experiment is carried out on the benchmark FVC2002 DB1 dataset. Authentication accuracy is found to be nearly 97%. Results compared with state-of-the-art approaches are promising.
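
    A minimal sketch of the biohashing step described above: a token-seeded random projection followed by thresholding, with an illustrative bit length and threshold rather than the paper's settings, and assuming the minutiae network has already been reduced to a fixed-length feature vector of at least n_bits elements.

```python
import numpy as np


def biohash(features, user_seed, n_bits=64, threshold=0.0):
    """Cancelable fingerhash: token-seeded random projection, then thresholding."""
    rng = np.random.default_rng(user_seed)                  # the user's token seeds the PRNG
    # pseudo-random projection basis, orthonormalized for better bit statistics
    basis, _ = np.linalg.qr(rng.standard_normal((len(features), n_bits)))
    projections = np.asarray(features, float) @ basis       # iterated inner products
    return (projections > threshold).astype(np.uint8)


def verify(probe_features, stored_fingerhash, user_seed, max_hamming=10):
    """Accept if the Hamming distance between fingerhashes is small enough."""
    probe = biohash(probe_features, user_seed, n_bits=len(stored_fingerhash))
    return int(np.sum(probe != stored_fingerhash)) <= max_hamming
```

    Changing user_seed issues a completely different fingerhash from the same fingerprint, which is what makes the template cancelable and revocable.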

    On Characterization and Optimization of Engineering Surfaces

    The Swedish manufacturing industry, in collaboration with academia, is exploring innovative ways to manufacture eco-efficient and resource-efficient products. Consequently, improving manufacturing efficiency and quality has become a priority for the manufacturing sector to remain competitive in a sustainable way. Achieving this requires control and optimization of the manufacturing process and of product performance, which has led to an increase in demand for functional surfaces: engineering surfaces tailored to different applications. With new advancements in manufacturing and surface metrology, investigations are steadily progressing towards re-defining quality and meeting dynamic customer demands. In this thesis, surfaces produced by different manufacturing systems are investigated, and methods are proposed to improve specification and optimization. The definition and interpretation of surface roughness vary across the manufacturing industry and academia. It is well known that surface characterization helps to understand the manufacturing process and its influence on surface functional properties such as wear, friction, adhesion, wettability and fluid retention, and on aesthetic properties such as gloss. Manufactured surfaces consist of features that are relevant and features that are not of interest; to produce the intended function, it is important to identify and quantify the features of relevance. Surface texture parameters help to quantify these surface features with respect to type, region, spacing and distribution. Currently, the average-roughness parameters Ra and Sa are widely used in industry, but they may not provide adequate information about the surface. In this thesis, a general methodology based on the standard surface parameters and a statistical approach is proposed to improve the specification of surface roughness and to identify the combination of significant surface texture parameters that best describes the surface and extracts valuable surface information. Surface topography generated by additive, subtractive and formative processes is investigated with the developed research approach. The roughness profile parameters and areal surface parameters defined in ISO standards, along with power spectral density and scale-sensitive fractal analysis, are used for surface characterization and analysis. The thesis shows how regression statistics can identify the set of significant surface parameters that improve the specification of surface roughness; these parameters are used to discriminate between surfaces produced by multiple process variables at multiple levels. By analyzing the influence of process variables on the surface topography, the research methodology helps to understand the underlying physical phenomena and to enhance domain-specific knowledge of surface topography, which in turn helps to interpret processing conditions for process and surface-function optimization. The research methods employed in this study are valid and applicable for different manufacturing processes. This thesis can support guidelines for manufacturing industry focusing on process and functional optimization through surface analysis. With the increasing use of machine learning and artificial intelligence in automation, methodologies such as the one proposed in this thesis are vital for exploring and extracting new possibilities in functional surfaces.
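
    For concreteness, a short sketch of the average-roughness parameters mentioned above: profile Ra, and areal Sa with its RMS counterpart Sq, computed from a measured height map using their standard definitions; the crude mean-plane levelling and the closing comment on parameter selection are simplifications, not the thesis's procedure.

```python
import numpy as np


def profile_Ra(z):
    """Ra: arithmetic mean deviation of a roughness profile from its mean line."""
    z = np.asarray(z, float)
    return np.mean(np.abs(z - z.mean()))


def areal_Sa_Sq(height_map):
    """Sa and Sq: arithmetic-mean and RMS height deviations over an areal height map."""
    z = np.asarray(height_map, float)
    dev = z - z.mean()                       # crude levelling: remove the mean plane only
    return np.mean(np.abs(dev)), np.sqrt(np.mean(dev ** 2))


# In the spirit of the thesis, one would compute many ISO parameters per sample and use
# regression (e.g. stepwise selection) against process settings or a functional response
# to retain only the statistically significant ones.
```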

    ANALYSIS OF THE SIDE EFFECTS OF CELL PHONE ON THE BRAIN USING ELECTROENCEPHALOGRAM

    Cell phones emit electromagnetic radiation that could be harmful to the human brain after extensive exposure. The number of cell phone users has increased drastically all over the world: in 2012 the number of subscribers reached 6 billion, the majority of them children and young adults. The main concern is the health risk involved in cell phone usage. In this paper, the effects of cell phones on the human brain are investigated. A 16-channel EEG was used to record brain signals from 24 healthy participants under six conditions (before the experiment; during the experiment with the phone at the right ear with distance, right ear without distance, left ear with distance, and left ear without distance; and after the experiment). After the data were preprocessed, the signal irregularity for all six conditions was computed. Three parameters were used to measure the complexity of the time-series data: i) the Composite Permutation Entropy Index (CPEI), ii) Fractal Dimension (FD) and iii) Hjorth complexity. Based on these three parameters, the results showed that the complexity level during the experiment is higher with the cell phone at the right ear without distance than at the left ear without distance, for all brain regions (frontal, central, parietal, temporal and occipital). However, the complexity level decreases with distance whether the cell phone is at the right or the left ear. The increase in complexity reflects the presence of high-frequency components due to cell phone radiation; participants are more exposed to the radiation during a call when the cell phone is held without distance. For the frequency domain, absolute power and coherence were used for the analysis. The results show that delta, theta and alpha brain waves are lower at both the frontal and temporal lobes during cell phone usage, but the beta wave is higher during and after the call, which is attributed to radiation from the cell phone.
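
    Of the three complexity measures, Hjorth complexity has a simple closed form based on the variances of the signal and its first and second differences; the sketch below is a generic implementation for one uniformly sampled EEG channel (CPEI and the fractal dimension would need their own routines).

```python
import numpy as np


def hjorth_parameters(x):
    """Hjorth activity, mobility and complexity of a 1-D signal.

    activity   = var(x)
    mobility   = sqrt(var(x') / var(x))
    complexity = mobility(x') / mobility(x)
    """
    x = np.asarray(x, float)
    dx = np.diff(x)                 # first difference approximates the derivative
    ddx = np.diff(dx)
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return var_x, mobility, complexity


# Example: compare one channel across conditions,
# e.g. hjorth_parameters(eeg_right_ear_no_distance)[2] versus hjorth_parameters(eeg_baseline)[2]
```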

    Biometrics Authentication of Fingerprint with Using Fingerprint Reader and Microcontroller Arduino

    The idea of security is as old as humanity itself. Among the oldest security methods were simple mechanical locks whose authentication element was the key: at first a universal, simple type, later unique to each lock. For a long time mechanical locks were the sole option for protection against unauthorized access. Biometrics boomed in the 20th century, and especially in recent years it has expanded into many areas of our lives. In contrast to traditional security methods such as passwords, access cards and hardware keys, it offers many benefits, chief among them uniqueness and the impossibility of loss. This paper therefore focuses on the design of a low-cost biometric fingerprint system and its subsequent implementation in practice. Our main goal was to create a system capable of recognizing a user's fingerprints and then processing them. The main part of this system is the Arduino Yun microcontroller with an external fingerprint scanner, the Adafruit R305 reader. The microcontroller communicates with an external database, which handles the exchange of data between the Arduino Yun and the user application. This application was created for the (currently) most widespread mobile operating system, Android.
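
    Purely as an illustration of the data-exchange path described above (the paper gives no code), one plausible shape for the link between the Arduino Yun, the external database and the Android application is a small HTTP service in front of the database; the endpoint names, fields and in-memory storage below are assumptions, not the paper's design.

```python
# Illustrative bridge in front of the external database: the Arduino Yun POSTs
# fingerprint match events and the Android app polls them. All names are assumed.
from flask import Flask, request, jsonify

app = Flask(__name__)
events = []  # in-memory stand-in for the external database


@app.route("/match", methods=["POST"])
def record_match():
    """Called by the Arduino Yun after the R305 reader reports a match."""
    data = request.get_json(force=True)
    events.append({"finger_id": data.get("finger_id"),
                   "confidence": data.get("confidence")})
    return jsonify(status="stored"), 201


@app.route("/matches", methods=["GET"])
def list_matches():
    """Polled by the Android application to display recent authentications."""
    return jsonify(events)


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```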