3 research outputs found

    Video denoising using fuzzy-connectedness principles

    Fuzzy-connectedness principles are effective for image segmentation. In this paper, such a principle is applied to video denoising. The video signal is assumed to suffer from both additive white Gaussian noise and impulsive noise. The corrupted signal is filtered by a fuzzy system, which fuzzily connects a Wiener filter and a median filter. The simulation results show that the fuzzy-connectedness approach produces desirable outputs.
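    A minimal sketch of that combination, assuming a per-pixel fuzzy membership based on deviation from the local median; the paper does not specify the membership function, so the blend below is an illustration rather than the authors' filter:

```python
# Sketch of the fuzzy-connected denoising idea: a Wiener filter handles the
# Gaussian component, a median filter handles impulses, and a fuzzy membership
# weight blends them per pixel. The membership function and its scale are
# assumptions for illustration only.
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter

def fuzzy_denoise(frame, win=3, impulse_scale=30.0):
    """Blend Wiener- and median-filtered versions of one video frame."""
    w = wiener(frame, mysize=win)        # suited to additive Gaussian noise
    m = median_filter(frame, size=win)   # suited to impulsive noise
    # Fuzzy membership: how "impulse-like" each pixel is, judged by its
    # deviation from the local median (hypothetical choice of membership).
    deviation = np.abs(frame - m)
    mu = 1.0 - np.exp(-deviation / impulse_scale)   # 0 = clean, 1 = impulse
    return mu * m + (1.0 - mu) * w

# Usage on a noisy grayscale frame (float values in 0..255):
# clean = fuzzy_denoise(noisy_frame.astype(float))
```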

    Contributions to the segmentation of dermoscopic images

    Master's thesis. Master's degree in Biomedical Engineering. Faculdade de Engenharia, Universidade do Porto. 201

    Noise Reduction in the Gamma-Ray Log by Means of Nonlinear Filtering

    In January 1983, I returned to OSU to begin formal work on the degree of Doctor of Philosophy. My personal goal was to take a number of courses which I knew from experience would be useful in the burgeoning information age. Luckily for me, since without it I might never have collected and produced enough material to write this dissertation, I became involved with the Oklahoma State University Consortium for Enhancement of Well Log Data via Signal Processing. (We call it the Well Log Project for brevity.) This wonderful group of companies gave us much-needed contact with researchers from companies such as Amoco Production Company; Arco Oil and Gas Company; Cities Service Oil and Gas Corporation; Conoco; Dresser-Atlas Company; Exxon Production Research Company; Gearhart Industries, Inc.; Halliburton; International Business Machines; Mobil Research and Development Corporation; Phillips Petroleum Corporation; Seismograph Service Corporation; Sohio Petroleum Company; Texaco Corporation; and the Oklahoma State University Center for Energy Research. My graduate research assistantship was funded by this consortium, and the support is gratefully acknowledged.

    Ironically, some experts, alongside their very helpful and extremely detailed discussions, also advised that everything possible had been done for the gamma-ray log, and that I should pursue a more fruitful avenue of research. Instead of having the desired effect, this made me more determined to do something which, I hope, is useful in the field. After all, the history of technology is filled with ironies like Einstein working at a patent office during an era when serious suggestions were being made that patent offices be closed since everything possible had been invented. One thing for which I feel indebted to these researchers is that, in spite of any doubts as to the fruitfulness of this field of my endeavor, they did their utmost to assist me in every way possible.

    After reading a large amount of literature on gamma-ray logging, I knew that the issue of what to do about the Poisson noise inherent in radioactive decay was an important problem to investigate. The difficulty is that this noise is small in comparison with the uncertainties involved in physical logging, so by late spring 1983 I began using the synthetic logs described here and measuring the results in Monte Carlo simulations. This was the turning point, because it provided me with a relatively objective figure of merit for a filter. And, although I touch on the subject of what input parameters should be provided to the synthetic log generator, I have never changed them from that first spring day when the program ran. This I have purposely not experimented with, for fear of coming up with an optimized log instead of an improved filter. The question of how the different synthetic log parameters affect filtering may provide another interesting topic of research if couched in slightly different terms.

    The advent of the synthetic log brought with it some startling conclusions: ordinary median filters increased the noise; recursive median filters improved the noise level more than did the optimal time-invariant linear filter; but my own best filter thus far did little in comparison. That filter has long since been confined to mass storage, but it did introduce me to a useful methodology for inventing filters. One indication of the pathological features of this filter appears to be that the histograms of results from the Monte Carlo simulation are multimodal.
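    A minimal sketch of the two median filters compared above, applied to a one-dimensional log trace. The window length and the step-plus-Poisson synthetic log are illustrative assumptions, and this toy comparison will not necessarily reproduce the ranking reported in the dissertation:

```python
# Ordinary vs. recursive running median on a 1-D log. In the recursive
# variant the window already contains filtered samples on its left side,
# which is what gives it the stronger smoothing discussed above.
import numpy as np

def median_filter_1d(x, win=5):
    """Ordinary (non-recursive) running median."""
    half = win // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + win]) for i in range(len(x))])

def recursive_median_filter_1d(x, win=5):
    """Recursive running median: earlier outputs replace earlier inputs."""
    half = win // 2
    y = np.pad(x.astype(float), half, mode="edge")
    for i in range(half, half + len(x)):
        y[i] = np.median(y[i - half:i + half + 1])  # left half of the window is already filtered
    return y[half:half + len(x)]

# Toy synthetic log: a step-like "true" profile plus Poisson counting noise.
rng = np.random.default_rng(0)
true_log = np.repeat([50.0, 120.0, 80.0], 200)
noisy_log = rng.poisson(true_log).astype(float)

def mse(est):
    return np.mean((est - true_log) ** 2)

print(mse(noisy_log), mse(median_filter_1d(noisy_log)), mse(recursive_median_filter_1d(noisy_log)))
```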
    By keeping a record of the input seeds to the random number generator, any simulated log could subsequently be regenerated and examined manually for the features that cause unusually beneficial or pathological behavior. This may seem like it would produce an unwieldy quantity of output, but for any filter that improved the results, less than 10% of the histogram's data points fell outside the Gaussian-like center hump, and often the examination of only a few of these would be sufficient to conjecture what might be improved. Once the histogram began to appear Gaussian, the filters often began to produce reasonable results on actual log data as well. Perhaps the process could be repeated with the tails of the histograms, but this is left as a potentially interesting problem for future work.
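    A minimal sketch of that seed-bookkeeping methodology, assuming a mean-squared-error score and a simple cutoff for trials in the tails of the histogram; both are illustrative stand-ins for the dissertation's figure of merit:

```python
# Seeded Monte Carlo evaluation: every trial's seed is recorded so that any
# trial whose score falls outside the central hump can be regenerated exactly
# and inspected by hand. The score and the 3-sigma cutoff are assumptions.
import numpy as np

def run_trial(seed, filt, true_log):
    """Regenerate one noisy log from its recorded seed and score the filter on it."""
    rng = np.random.default_rng(seed)
    noisy = rng.poisson(true_log).astype(float)
    return np.mean((filt(noisy) - true_log) ** 2)

def monte_carlo(filt, true_log, n_trials=1000):
    """Run many trials, keep every seed, and flag trials outside the central hump."""
    seeds = np.arange(n_trials)
    scores = np.array([run_trial(s, filt, true_log) for s in seeds])
    center, spread = np.median(scores), scores.std()
    suspect_seeds = seeds[np.abs(scores - center) > 3 * spread]  # regenerate these by hand
    return scores, suspect_seeds

# Example with a plain moving average standing in for the filter under test:
true_log = np.repeat([50.0, 120.0, 80.0], 200)
smooth = lambda x: np.convolve(x, np.ones(5) / 5, mode="same")
scores, suspects = monte_carlo(smooth, true_log, n_trials=500)
print(scores.mean(), len(suspects))
```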