Comparison of the effect of cycloplegic versus NSAID eye drops on pain after photorefractive keratectomy
Purpose: To compare the effect of Homatropine and Diclofenac eye drops for reducing pain after photorefractive keratectomy (PRK). Methods: This randomized, double-masked, interventional study included 32 patients (64 eyes) who underwent bilateral PRK. After the operation, patients received Homatropine eye drops in one eye and Diclofenac eye drops in the fellow eye for 48 h. The level of pain was evaluated using the visual analogue scale (VAS), verbal rating scale (VRS), and pain rating index (PRI) at 0.5, 24, and 48 h after the operation. Results: The level of pain was statistically similar between the two eyes half an hour after the operation; however, the Diclofenac eyes had significantly less pain 24 h after the operation (1.7 ± 1.4 vs 5.8 ± 2.1 for VAS, 0.6 ± 0.6 vs 2.4 ± 1.1 for VRS, and 3.4 ± 3.4 vs 12.0 ± 6.9 for PRI; P < 0.001 for all). The pain scores were also lower in the Diclofenac eyes 48 h after surgery (1.6 ± 1.8 vs 3.4 ± 2.8 for VAS, 0.6 ± 0.6 vs 1.2 ± 0.9 for VRS, and 3.3 ± 3.7 vs 6.5 ± 6.2 for PRI; P < 0.001 for all). No case of delayed epithelial healing was observed in either group. Conclusion: Homatropine appears to be less effective than Diclofenac for reducing pain after photorefractive keratectomy. © 2015 Iranian Society of Ophthalmology
Exponential Pattern Retrieval Capacity with Non-Binary Associative Memory
We consider the problem of neural association for a network of non-binary neurons. Here, the task is to recall a previously memorized pattern from its noisy version using a network of neurons whose states assume values from a finite number of non-negative integer levels. Prior works in this area consider storing a finite number of purely random patterns, and have shown that the pattern retrieval capacities (maximum number of patterns that can be memorized) scale only linearly with the number of neurons in the network.
Molecular associative memory: An associative memory framework with exponential storage capacity for DNA computing
The associative memory problem is to find the stored vector closest (in Hamming distance) to a given query vector. An associative memory can be implemented in different ways, including neural networks and DNA strands. With neural networks, connection weights are adjusted to perform the association; the recall procedure is iterative and relies on simple neural operations, and the design criterion is to maximize the number of stored patterns C while retaining some noise tolerance. The molecular implementation is based on synthesizing C DNA strands as stored vectors; recall is usually done in one shot via chemical reactions and relies on the high parallelism of DNA computing, and the design criterion is to find proper DNA sequences that minimize the probability of error during the recall phase. Current molecular associative memories are either low in storage capacity, if implemented using molecular realizations of neural networks, or very complex to implement, if all the stored sequences have to be synthesized. We introduce an associative memory framework with exponential storage capacity based on transcriptional networks of DNA switches. The advantages of the proposed approach over current methods are: 1. Exponential storage capacities cannot be achieved with current neural network-based approaches. 2. Other methods can in principle reach exponential storage capacities, but doing so is very complex because it requires synthesizing an extraordinarily large number of DNA strands.
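As a concrete reading of the problem statement above (not of the proposed molecular scheme), a brute-force associative-memory recall simply returns the stored vector nearest to the query in Hamming distance; the patterns and query below are illustrative placeholders:

    import numpy as np

    def hamming_recall(stored, query):
        # Return the stored binary vector closest to the query in Hamming distance.
        distances = [int(np.count_nonzero(p != query)) for p in stored]
        return stored[int(np.argmin(distances))]

    # Illustrative placeholders: C = 4 stored binary patterns of length 8.
    stored = np.array([[0, 1, 1, 0, 1, 0, 0, 1],
                       [1, 1, 0, 0, 0, 1, 1, 0],
                       [0, 0, 1, 1, 1, 1, 0, 0],
                       [1, 0, 0, 1, 0, 0, 1, 1]])
    query = np.array([0, 1, 1, 0, 1, 0, 1, 1])    # noisy copy of the first pattern
    print(hamming_recall(stored, query))           # -> [0 1 1 0 1 0 0 1]

The neural and molecular implementations discussed above replace this exhaustive comparison with iterative neural updates or one-shot chemical reactions, respectively.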
Nonbinary Associative Memory With Exponential Pattern Retrieval Capacity and Iterative Learning
We consider the problem of neural association for a network of nonbinary neurons. Here, the task is to first memorize a set of patterns using a network of neurons whose states assume values from a finite number of integer levels. Later, the same network should be able to recall the previously memorized patterns from their noisy versions. Prior works in this area consider storing a finite number of purely random patterns, and have shown that the pattern retrieval capacities (maximum number of patterns that can be memorized) scale only linearly with the number of neurons in the network. In our formulation of the problem, we concentrate on exploiting redundancy and internal structure of the patterns to improve the pattern retrieval capacity. Our first result shows that if the given patterns have a suitable linear-algebraic structure, i.e., comprise a subspace of the set of all possible patterns, then the pattern retrieval capacity is exponential in terms of the number of neurons. The second result extends the previous finding to cases where the patterns have weak minor components, i.e., the smallest eigenvalues of the correlation matrix tend toward zero. We will use these minor components (or the basis vectors of the pattern null space) to increase both the pattern retrieval capacity and error-correction capabilities. An iterative algorithm is proposed for the learning phase, and two simple algorithms are presented for the recall phase. Using analytical methods and simulations, we show that the proposed methods can tolerate a fair amount of errors in the input while being able to memorize an exponentially large number of patterns.
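A minimal sketch of why the subspace assumption helps (with toy dimensions, real-valued arithmetic, and a null-space basis computed directly by SVD, none of which are the paper's learning or recall algorithms): patterns drawn from a subspace satisfy a set of linear constraints, so noise shows up as a nonzero "syndrome" that can be localized and removed.

    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 16, 4                               # n neurons, k-dimensional pattern subspace (toy sizes)
    G = rng.integers(0, 3, size=(n, k))        # assumed basis of the pattern subspace
    pattern = G @ rng.integers(0, 3, size=k)   # one stored pattern with integer neuron states

    # Constraint (null-space) vectors: every valid pattern x satisfies W @ x = 0.
    # Here W is obtained directly by SVD; the paper instead learns such vectors iteratively.
    W = np.linalg.svd(G.T)[2][k:]

    noisy = pattern.astype(float).copy()
    noisy[3] += 2.0                            # corrupt a single neuron
    syndrome = W @ noisy                       # zero for valid patterns, nonzero here

    # Single-error recall: the syndrome is proportional to the column of W at the
    # corrupted position, so locate that column and subtract the estimated error.
    cos = np.abs(W.T @ syndrome) / (np.linalg.norm(W, axis=0) * np.linalg.norm(syndrome))
    i = int(np.argmax(cos))
    a = (W[:, i] @ syndrome) / (W[:, i] @ W[:, i])
    recovered = np.rint(noisy - a * np.eye(n)[i]).astype(int)
    print(i, np.array_equal(recovered, pattern))   # -> 3 True

The set of valid patterns grows exponentially with the subspace dimension, while the network only has to represent the linear constraints, which is the intuition behind the exponential retrieval capacity claimed above.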
Neural Pre-coding Increases the Pattern Retrieval Capacity of Hopfield and Bidirectional Associative Memories
We consider the problem of neural association, which deals with the retrieval of a previously memorized pattern from its noisy version. The performance of various neural networks developed for this task may be judged in terms of their pattern retrieval capacities (the number of patterns that can be stored) and their error-correction (noise tolerance) capabilities. While significant progress has been made, most prior works in this area show poor performance with regard to pattern retrieval capacity and/or error correction. In this paper, we propose two new methods to significantly increase the pattern retrieval capacity of the Hopfield and Bidirectional Associative Memories (BAM). The main idea is to store patterns drawn from a family of low-correlation sequences, similar to those used in Code Division Multiple Access (CDMA) communications, instead of storing purely random patterns as in prior works. These low-correlation patterns can be obtained from random sequences by pre-coding the original sequences via simple operations that both real and artificial neurons are capable of accomplishing.
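A small sketch of the idea (the Hadamard rows below are one assumed example of a low-correlation family, not necessarily the pre-coding used in the paper): storing mutually orthogonal ±1 sequences with the standard Hebbian outer-product rule makes each of them a stable fixed point, and a lightly corrupted pattern is pulled back by the usual Hopfield update.

    import numpy as np

    # Build a 16 x 16 Walsh-Hadamard matrix; its rows are mutually orthogonal +/-1 sequences,
    # standing in here for CDMA-like low-correlation patterns.
    H = np.array([[1]])
    for _ in range(4):
        H = np.kron(H, np.array([[1, 1], [1, -1]]))

    patterns = H[:4]                                  # store 4 orthogonal +/-1 patterns
    W = patterns.T @ patterns                         # Hebbian (outer-product) weight matrix
    np.fill_diagonal(W, 0)                            # no self-connections

    def recall(x, steps=5):
        # Synchronous Hopfield updates: x <- sign(W x).
        for _ in range(steps):
            x = np.where(W @ x >= 0, 1, -1)
        return x

    noisy = patterns[1].copy()
    noisy[7] *= -1                                    # flip one bit
    print(np.array_equal(recall(noisy), patterns[1])) # -> True: the stored pattern is recovered

With purely random ±1 patterns, the same Hebbian rule reliably supports only on the order of 0.14 n patterns for n neurons, which is the baseline that pre-coding into low-correlation sequences is meant to improve on.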
Statistical Mechanics Analysis of LDPC Coding in MIMO Gaussian Channels
Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under LDPC network coding and joint decoding. The saddle point equations for the replica symmetric solution are found in particular realizations of this channel, including a small and large number of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver, and the symmetric and asymmetric interference channels. Both dynamical and thermodynamical transitions from the ferromagnetic solution of perfect decoding to a non-ferromagnetic solution are identified for the cases considered, marking the practical and theoretical limits of the system under the current coding scheme. Numerical results are provided, showing the typical level of improvement/deterioration achieved with respect to the single transmitter/receiver result, for the various cases. Comment: 25 pages, 7 figures.
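For orientation, the underlying setting can be written in the standard form of a Gaussian MIMO channel with binary inputs; the notation and normalization below are the usual textbook ones, assumed rather than taken from the paper:

    y_\mu \;=\; \frac{1}{\sqrt{K}} \sum_{k=1}^{K} h_{\mu k}\, s_k \;+\; n_\mu,
    \qquad s_k \in \{+1,-1\}, \quad n_\mu \sim \mathcal{N}(0,\sigma^2), \quad \mu = 1,\dots,N,

where K transmitters send LDPC-coded bits s_k and N receivers observe y_\mu. Joint decoding then amounts to inference over the posterior P(s \mid y) \propto \exp\!\big(-\tfrac{1}{2\sigma^2}\,\lVert y - Hs/\sqrt{K}\rVert^2\big) restricted to valid codewords, and the replica-symmetric saddle-point analysis described above studies the typical behaviour of a posterior of this kind.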
Peer-to-peer live video streaming with rateless codes for massively multiplayer online games