Quantum cryptography: key distribution and beyond
Uniquely among the sciences, quantum cryptography has driven both
foundational research and practical real-life applications. We review
the progress of quantum cryptography in the last decade, covering quantum key
distribution and other applications.
Comment: A review of quantum cryptography, not restricted to QKD
Beyond Stemming and Lemmatization: Ultra-stemming to Improve Automatic Text Summarization
In Automatic Text Summarization, preprocessing is an important phase to
reduce the space of textual representation. Classically, stemming and
lemmatization have been widely used for normalizing words. However, even using
normalization on large texts, the curse of dimensionality can disturb the
performance of summarizers. This paper describes a new method for normalization
of words to further reduce the space of representation. We propose to reduce
each word to its initial letters, as a form of Ultra-stemming. The results show
that Ultra-stemming not only preserves the content of summaries produced with this
representation, but can often dramatically improve system performance.
Summaries of trilingual corpora were evaluated automatically with
Fresa. The results confirm an increase in performance regardless of the
summarizer system used.
Comment: 22 pages, 12 figures, 9 tables
Does Phenomenal Consciousness Overflow Attention? An Argument from Feature-Integration
In the past two decades, a number of arguments have been given in favor of the possibility of phenomenal consciousness without attentional access, otherwise known as phenomenal overflow. This paper will show that the empirical data commonly cited in support of this thesis are, at best, ambiguous between two equally plausible interpretations, one of which does not posit phenomenology beyond attention. Next, after citing evidence for the feature-integration theory of attention, this paper will give an account of the relationship between consciousness and attention that explains both the empirical data and our phenomenological intuitions without positing phenomenal consciousness beyond attention. Having undercut the motivations for accepting phenomenal overflow, and having given reasons to think that phenomenal overflow does not occur, I end with the tentative conclusion that attention is a necessary condition for phenomenal consciousness.
Eccentricity dependent auditory enhancement of visual stimulus detection but not discrimination
Sensory perception is enhanced by the complementary information provided by our different sensory modalities, and even apparently task-irrelevant stimuli in one modality can facilitate performance in another. While perception in general comprises both the detection of sensory objects and their discrimination and recognition, most studies on audio-visual interactions have focused on only one of these aspects. However, previous evidence, neuroanatomical projections between early sensory cortices, and computational mechanisms suggest that sounds might differentially affect visual detection and discrimination, and might do so differently at central and peripheral retinal locations. We performed an experiment to directly test this by probing the enhancement of visual detection and discrimination by auxiliary sounds at different visual eccentricities within the same subjects. Specifically, we quantified the enhancement provided by sounds that reduce the overall uncertainty about the visual stimulus beyond basic multisensory co-stimulation. This revealed a general trend toward stronger enhancement at peripheral locations in both tasks, but a statistically significant effect only for detection and only at peripheral locations. Overall, this suggests that there are topographic differences in the auditory facilitation of basic visual processes and that these may differentially affect basic aspects of visual recognition.
Controlling phonons and photons at the wavelength-scale: silicon photonics meets silicon phononics
Radio-frequency communication systems have long used bulk- and
surface-acoustic-wave devices supporting ultrasonic mechanical waves to
manipulate and sense signals. These devices have greatly improved our ability
to process microwaves by interfacing them to orders-of-magnitude slower and
lower loss mechanical fields. In parallel, long-distance communications have
been dominated by low-loss infrared optical photons. As electrical signal
processing and transmission approaches physical limits imposed by energy
dissipation, optical links are now being actively considered for mobile and
cloud technologies. Thus there is a strong driver for wavelength-scale
mechanical wave or "phononic" circuitry fabricated by scalable semiconductor
processes. With the advent of these circuits, new micro- and nanostructures
that combine electrical, optical and mechanical elements have emerged. In these
devices, such as optomechanical waveguides and resonators, optical photons and
gigahertz phonons are ideally matched to one another as both have wavelengths
on the order of micrometers. The development of phononic circuits has thus
emerged as a vibrant field of research pursued for optical signal processing
and sensing applications as well as emerging quantum technologies. In this
review, we discuss the key physics and figures of merit underpinning this
field. We also summarize the state of the art in nanoscale electro- and
optomechanical systems with a focus on scalable platforms such as silicon.
Finally, we give perspectives on what these new systems may bring and what
challenges they face in the coming years. In particular, we believe hybrid
electro- and optomechanical devices incorporating highly coherent and compact
mechanical elements on a chip have significant untapped potential for
electro-optic modulation, quantum microwave-to-optical photon conversion,
sensing and microwave signal processing.
Comment: 26 pages, 5 figures
Stochastic accumulation of feature information in perception and memory
It is now well established that the time course of perceptual processing influences the first second or so of performance in a wide variety of cognitive tasks. Over the last 20 years, there has been a shift from modeling the speed at which a display is processed, to modeling the speed at which different features of the display are perceived and formalizing how this perceptual information is used in decision making. The first of these models (Lamberts, 1995) was implemented to fit the time course of performance in a speeded perceptual categorization task and assumed a simple stochastic accumulation of feature information. Subsequently, similar approaches have been used to model performance in a range of cognitive tasks including identification, absolute identification, perceptual matching, recognition, visual search, and word processing, again assuming a simple stochastic accumulation of feature information from both the stimulus and representations held in memory. These models are typically fit to data from signal-to-respond experiments, in which the effects of stimulus exposure duration on performance are examined, but response times (RTs) and RT distributions have also been modeled. In this article, we review this approach and explore the insights it has provided about the interplay between perceptual processing, memory retrieval, and decision making in a variety of tasks. In so doing, we highlight how such approaches can continue to usefully contribute to our understanding of cognition.
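The "simple stochastic accumulation of feature information" can be sketched as a toy simulation (an illustration only, not Lamberts' actual model: the per-feature perception rates, time-step discretization, and independence assumption are all simplifying choices made here):

```python
import random

def feature_accumulation(rates, t_max, seed=0):
    """Simulate independent stochastic accumulation of feature
    information: at each time step, feature i is perceived with
    probability rates[i] if it has not been perceived already.
    Returns the set of perceived features after each time step."""
    rng = random.Random(seed)
    perceived = set()
    trajectory = []
    for _ in range(t_max):
        for i, r in enumerate(rates):
            if i not in perceived and rng.random() < r:
                perceived.add(i)
        trajectory.append(frozenset(perceived))
    return trajectory

# With longer exposure, decisions can rest on more perceived features,
# which mimics the accuracy curves from signal-to-respond experiments.
traj = feature_accumulation([0.3, 0.1, 0.05], t_max=50)
print(len(traj[0]), len(traj[-1]))
```

In a full model, the perceived feature set at the response signal would feed a decision rule (e.g., similarity-based categorization) rather than being reported directly.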
How does working memory enable number-induced spatial biases?
Number-space associations are a robust observation, but their underlying mechanisms remain debated. Two major accounts have been identified. First, spatial codes may constitute an intrinsic part of number representations stored in the brain – a perspective most commonly referred to as the Mental Number Line account. Second, spatial codes may be generated at the level of working memory when number (or other) representations are coordinated in the service of a specific task. The aim of the current paper is twofold. First, whereas a pure Mental Number Line account cannot capture the complexity of observations reported in the literature, we here explore if and how a pure working memory account can suffice. Second, we make explicit (more than in our earlier work) the potential building blocks of such a working memory account, thereby providing clear and concrete foci for empirical efforts to test the feasibility of the account.
Generation and manipulation of nonclassical light using photonic crystals
Photonic crystal cavities can localize light into nanoscale volumes with high
quality factors. This permits a strong interaction between light and matter,
which is important for the construction of classical light sources with
improved properties (e.g., low threshold lasers) and of nonclassical light
sources (such as single and entangled photon sources) that are crucial hardware
components of quantum information processing systems. This article will review
some of our recent experimental and theoretical results on the interaction
between single quantum dots and photonic crystal cavity fields, and on the
integration of multiple photonic crystal devices into functional circuits for
quantum information processing.
Comment: 6 pages, 6 figures; replaced with revised version
Interchange Fee – Competitiveness of Payment Instruments
Purpose of the article: This study describes the markets for payment instruments, focusing mainly on
credit and debit card systems. The authors identify the key theoretical concepts applied to this
problem, notably the so-called Tourist Test. The credit card and direct debit market is a four-sided
business in which the interchange fee plays a very important role.
Methodology/methods: This paper applies secondary research, based on an analysis of the papers and
literature published on the interchange fee, including European Commission materials, together with
the debate about the competitiveness of the payment instruments market.
Scientific aim: The aim of this article is to identify the rules and conditions of the market for
payment instruments, namely the credit card and direct debit systems. The authors try to identify
the problems on both the demand and supply sides, focusing on defining the cost-benefit approach in
combination with the Tourist Test, in which the interchange fee plays the key role.
Findings: The scientific literature pays attention to costs and benefits and their equilibrium;
however, in discussing social utility and social welfare it misses the overall impact on end-consumer
welfare and satisfaction. The dilemma is further complicated by the fact that the end consumer does
not know that he or she is not maximizing his or her utility, unlike the merchant, who is under the
pressure of a margin squeeze.
Conclusions: It is necessary to begin measuring the effectiveness and influence of the interchange
fee and, of course, to confront that effectiveness and influence with the benefits the interchange fee brings
- …
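The Tourist Test (merchant indifference test) mentioned above can be sketched numerically: the interchange fee passes the test when a card payment costs the merchant no more than the cash payment it replaces. The sketch below is a simplified per-transaction reading of that condition; the cost figures and the flat acquirer margin are illustrative assumptions, not values from the paper.

```python
def tourist_test_cap(cost_cash: float, cost_card_ex_fee: float,
                     acquirer_margin: float) -> float:
    """Merchant-indifference ('Tourist Test') cap on the interchange
    fee: the merchant service charge should not exceed the cost the
    merchant avoids by not handling cash. All values per transaction."""
    # Merchant service charge at which the merchant is indifferent
    # between accepting the card and taking cash:
    msc_cap = cost_cash - cost_card_ex_fee
    # The interchange fee is the part of the MSC left after the
    # acquirer's own margin.
    return max(0.0, msc_cap - acquirer_margin)

# Illustrative figures: cash handling costs EUR 0.30 per transaction,
# card acceptance EUR 0.10 excluding fees, acquirer margin EUR 0.05.
print(round(tourist_test_cap(0.30, 0.10, 0.05), 2))  # → 0.15
```

Under this reading, any interchange fee above the cap pushes the merchant service charge past the merchant's avoided cost of cash, which is exactly the margin-squeeze pressure the Findings section describes.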