A study of the human ability to detect road surface type based on steering wheel vibration feedback
A study was performed to investigate the human ability to detect road surface type based on the
associated steering wheel vibration feedback. Tangential direction acceleration time histories
measured during road testing of a single mid-sized European automobile were used as the basis
for the study. Scaled and frequency filtered copies of two base stimuli were presented to test
subjects in a laboratory setting during two experiments which each involved 25 participants. Theory
of signal detection (TSD) was adopted as the analytical framework and the results were
summarised by means of the detectability index d’ and as receiver operating characteristic (ROC) points.
The results of the experiment to investigate the effect of scaling suggested monotonic relationships
between stimulus level and detection for both road surfaces. Detection of the tarmac surface
improved with reductions in acceleration level while the opposite was true of the cobblestone
surface. The ROC points for both surfaces were characterised by gradual increases in detection as
a function of acceleration level, obtaining hit rates of nearly 100% at optimum. The results of the
experiment to investigate the effect of frequency bandwidth suggested a monotonically increasing
relationship between detectability and the bandwidth of the vibration stimuli. Detection of both road
surfaces improved with increases in bandwidth. Average hit rates exceeded 80% for stimuli
covering the frequency range from 0 to 80 Hz. Human detection of road surface type appears to
depend on the long term memory model, or cognitive interpretation mechanism, associated with
each surface. The complexity of the measured response suggests the need to categorise and
classify incoming data before an optimal choice of feedback stimuli can be made in automotive
steering systems
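The detectability index d’ used above can be computed directly from hit and false-alarm rates; a minimal sketch (illustrative values, not the paper's data):

```python
from statistics import NormalDist

_z = NormalDist().inv_cdf  # inverse CDF of the standard normal

def d_prime(hit_rate, false_alarm_rate):
    """Detectability index d' = z(H) - z(FA) from the theory of
    signal detection; rates must lie strictly between 0 and 1."""
    return _z(hit_rate) - _z(false_alarm_rate)

# A hit rate of 0.95 against a false-alarm rate of 0.10 gives d' ~ 2.93,
# i.e. a clearly detectable stimulus; equal rates give d' = 0 (chance).
print(round(d_prime(0.95, 0.10), 2))
```

Each (false-alarm rate, hit rate) pair is also one ROC point, so the same two numbers summarise a condition in both forms used in the study.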
Augmenting basic colour terms in English
In an unconstrained colour naming experiment conducted over the web, 330 participants named 600 colour samples in English. The 30 most frequent monolexemic colour terms were analyzed with regard to frequency, consensus among genders, response times, consistency of use, denotative volume in the Munsell and OSA colour spaces and inter-experimental agreement. Each of these measures served for ranking colour term salience; rankings were then combined to give a composite index of basicness. The results support the extension of the English inventory from the 11 basic colour terms of Berlin and Kay to 13 terms by the addition of lilac and turquoise
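The rank-combination step can be sketched as follows; the terms and per-measure rankings here are hypothetical toy data, not the study's measurements:

```python
def composite_rank(rankings):
    """Combine several per-measure rankings (term -> rank, 1 = most
    salient) into a composite ordering by averaging ranks."""
    terms = list(rankings[0])
    mean_rank = {t: sum(r[t] for r in rankings) / len(rankings)
                 for t in terms}
    return sorted(terms, key=mean_rank.get)

# Hypothetical rankings by two salience measures (NOT the study's data):
by_frequency = {"red": 1, "lilac": 3, "turquoise": 2}
by_response  = {"red": 1, "lilac": 2, "turquoise": 3}
print(composite_rank([by_frequency, by_response]))
```

Averaging ranks is one simple way to form such a composite index; the study combined rankings over seven measures rather than two.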
Does the Blazar Gamma-Ray Spectrum Harden with Increasing Flux? Analysis of 9 Years of EGRET Data
The Energetic Gamma-Ray Experiment Telescope (EGRET) on the Compton Gamma-Ray
Observatory (CGRO) discovered gamma-ray emission from more than 67 blazars
during its 9 yr lifetime. We conducted an exhaustive search of the EGRET
archives and selected all the blazars that were observed multiple times and
were bright enough to enable a spectral analysis using standard power-law
models. The sample consists of 18 flat-spectrum radio quasars (FSRQs), 6
low-frequency peaked BL Lac objects (LBLs) and 2 high-frequency peaked BL Lac
objects (HBLs). We do not detect any clear pattern in the variation of spectral
index with flux. Some of the blazars do not show any statistical evidence for
spectral variability. The spectrum hardens with increasing flux in a few cases.
There is also evidence for a flux-hardness anticorrelation at low fluxes in
five blazars. The well-observed blazars (3C 279, 3C 273, PKS 0528+134, PKS
1622-297, PKS 0208-512) do not show any overall trend in the long-term spectral
dependence on flux, but the sample shows a mixture of hard and soft states. We
observed a previously unreported spectral hysteresis at weekly timescales in
all three FSRQs for which data from flares lasting for ~(3-4) weeks were
available. All three sources show a counterclockwise rotation, despite the
widely different flux profiles. We analyze the observed spectral behavior in
the context of various inverse Compton mechanisms believed to be responsible
for emission in the EGRET energy range. Our analysis uses the EGRET skymaps
that were regenerated to include the changes in performance during the mission
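The standard power-law spectral fits mentioned above can be illustrated with a least-squares fit in log-log space; the energies and normalization below are synthetic, not EGRET data:

```python
import numpy as np

def fit_power_law(energies, fluxes):
    """Least-squares fit of dN/dE = A * E**(-Gamma) in log-log space.
    A harder (flatter) spectrum corresponds to a smaller Gamma."""
    slope, intercept = np.polyfit(np.log(energies), np.log(fluxes), 1)
    return np.exp(intercept), -slope

# Synthetic, noise-free spectrum with Gamma = 2.2 (illustrative only);
# on exact power-law data the fit recovers A and Gamma.
E = np.array([0.1, 0.3, 1.0, 3.0, 10.0])   # energies, arbitrary units
A, Gamma = fit_power_law(E, 5.0 * E ** -2.2)
print(round(A, 3), round(Gamma, 3))
```

Tracking the fitted Gamma against the integrated flux across viewing periods is what the hardness-flux correlations above refer to.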
Results from a study of scintillation behavior at 12, 20, and 30 GHz using the results from the Virginia Tech Olympus receivers
Tropospheric scintillations are rapid fluctuations of signal caused by multiple scattering from small-scale turbulent refractive-index inhomogeneities in the troposphere. They can strongly impair satellite communications links operating at frequencies above 10 GHz. The VA Tech OLYMPUS propagation experiment, which includes 12, 20, and 30 GHz beacon receivers at an elevation angle of 14 degrees, provides valuable multifrequency scintillation data. A long-term analysis of tropospheric scintillation results from the VA Tech OLYMPUS experiment is presented. It includes statistics of both the scintillation intensity and the attenuation relative to clear air, as well as seasonal, diurnal and meteorological trends. A comparison with the Consultative Committee for International Radio (CCIR) predictive model for scintillation fading is presented
Probabilistic Models For The Simulation Of Bibliographic Retrieval Systems
A general model of a bibliographic retrieval system is presented which has five main elements: the documents, the queries, the thesaurus of indexing terms, the search algorithms and the physical storage locations. This is adapted to produce a probabilistic model which is suitable for simulation purposes, concentrating on the assignment of index terms to documents. This is accomplished by using the distribution of terms over documents and over queries, the distribution of exhaustivity over documents and over queries, the distribution of co-occurrences (occurrences of pairs of terms), and the distribution of relevant and non-relevant documents over the number of terms matching the query. Several theoretical distributions were tested against four databases to find the best-fitting distributions using the chi-square criterion. The distribution of terms over documents was split into two parts. The low-frequency terms were analyzed using the number of terms which occurred x times, called the frequency-size approach. The high-frequency terms were ranked by the number of occurrences in documents and analyzed using the rank versus the frequency of the term, called the frequency-rank approach. It was found that a generalized Zipf distribution fit the frequency-size portion and a generalized Bradford or log-rank distribution was best for the frequency-rank part. These distributions were incorporated into a simulation program using a probabilistic model of term occurrences and co-occurrences. Simulation of the four databases was carried out using both the independence assumption for the occurrence of terms and the dependence assumption. In most cases the dependence model gave an improvement over the independence model but did not fully reproduce the original distribution of co-occurrences. A small experiment with the clustering of terms to incorporate term dependence was also carried out.
A method of incorporating the clustered terms into a simulation model needs to be found. More work needs to be done in incorporating dependence of index terms, especially of order higher than two, into a model of bibliographic retrieval systems. Goodness-of-fit tests and parameter estimation methods need to be devised for the type of long-tailed distributions encountered
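The frequency-size approach can be sketched as follows, assuming a plain (not generalized) Zipf law and synthetic counts rather than the four real databases:

```python
import numpy as np
from collections import Counter

# Hypothetical sketch of the frequency-size approach: draw one
# occurrence count per term from a plain Zipf law, then tabulate how
# many terms occurred exactly x times. (The thesis fits a *generalized*
# Zipf distribution to real databases; this is only an illustration.)
rng = np.random.default_rng(42)
occurrences = rng.zipf(a=2.0, size=10_000)      # one count per term
freq_size = Counter(int(x) for x in occurrences)

# Under a Zipf law the frequency-size counts fall off roughly as x**(-a):
for x in (1, 2, 3, 4):
    print(x, freq_size[x])
```

The same table, restricted to high-frequency terms and re-expressed as rank versus frequency, is the input to the frequency-rank fits described above.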
Fast Stress Detection via ECG
Nowadays stress has become a regular part of life. Stress is difficult to measure because there is no universally accepted definition of stress. Furthermore, if we do not get a handle on our stress and it becomes long term, it can seriously interfere with our health. Therefore, finding a method for stress detection could be beneficial for taking control of stress. The electrocardiogram (ECG) is the measurement of the electrical activity of the heart and represents an established standard in determining the health condition of the heart. The PQRST complex [55] of the ECG conveys information about each cardiac cycle; the R-peak lies in the middle of the PQRST complex and represents its maximum value. Since the PQRST depicts the entire cardiac cycle, the R-peak marks half of the cycle. The distance between two adjacent R-peaks (the RR interval) determines the heart rate (HR). The variation of the HR over a specific time frame, defined as heart rate variability (HRV), can reflect the state of the autonomic nervous system (ANS). The ANS has two main divisions, the sympathetic nervous system (SNS) and the parasympathetic nervous system (PNS). SNS activity rises in response to stress, while PNS activity reflects the resting function of internal organs. The activity of the ANS can cause an acceleration (SNS) or deceleration (PNS) of the HR. SNS activity is associated with the low-frequency component of the HRV, while PNS activity is associated with the high-frequency component. Therefore, the power ratio of the low- and high-frequency components of the HRV spectrum can potentially show whether the subject is exposed to stress [48][50]. In this research, we introduced three new indices, one of which is proposed as a proxy that provides equivalent results in detecting stress or no-stress states while avoiding complex measurement devices as well as complex calculations.
The goal was to find a more time-efficient method for fast stress detection which could potentially be used in applications that run on devices such as a wearable smartwatch in tandem with a smartphone or tablet. The experiment was established to measure the index proposed in the literature for stress measurement [48][50] as well as our introduced indices. In the experiment, we induced stress in the participants by using mental arithmetic as a stressor [51][53]. The experiment contained two kinds of trials. In the first one, the participant was exposed to different amounts of cognitive load induced by doing mental arithmetic, while in the second one, the participant was placed in a relaxed environment. Each participant gave feedback on the periods of the experiment in which he/she felt stress. During the entire experiment, we recorded the participant’s ECG. The ECG was used to calculate the HRV, which in turn was used to calculate the value of the stress index proposed in the literature. The same data were used for the calculation of our introduced indices. The values of our proposed index were compared with the literature index and the participants’ feedback. Finally, the data analyses showed that our proposed index is suitable to determine whether a participant is exposed to stress
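The literature index [48][50], the LF/HF power ratio of HRV, can be sketched as follows. The band edges (0.04-0.15 Hz and 0.15-0.4 Hz) and the 4 Hz resampling rate are conventional choices, not necessarily the paper's exact procedure:

```python
import numpy as np

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF power ratio of heart rate variability from successive
    RR intervals in milliseconds. The unevenly spaced RR series is
    resampled to an even grid at fs Hz, and the power in the LF
    (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands is summed from the FFT."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr_ms) / 1000.0                # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)      # even time grid
    rr_even = np.interp(grid, t, rr_ms)
    rr_even -= rr_even.mean()                    # drop the DC component
    power = np.abs(np.fft.rfft(rr_even)) ** 2
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs)
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf

# Synthetic RR series whose variability sits in the LF band (~0.1 Hz),
# as expected under sympathetic (stress-related) dominance:
rr = 800 + 50 * np.sin(2 * np.pi * 0.08 * np.arange(300))   # ms
print(lf_hf_ratio(rr) > 1.0)
```

A ratio well above 1 points toward SNS (stress-related) dominance, a ratio below 1 toward PNS dominance, which is the decision rule the cited index rests on.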
Self-induced charge currents in electromagnetic materials, photon effective rest mass and some related topics
The contribution of self-induced charge currents of metamaterial media to
photon effective rest mass is discussed in detail in the present paper. We
concern ourselves with two kinds of photon effective rest mass, i.e., the
frequency-dependent and frequency-independent effective rest mass. Based on
these two definitions, we calculate the photon effective rest mass in the
left-handed medium and the 2TDLM media, the latter of which is described by the
so-called two time derivative Lorentz material (2TDLM) model. Additionally, we
concentrate primarily on the torque, which is caused by the interaction between
self-induced charge currents in dilute plasma (e.g., the secondary cosmic rays)
and interstellar magnetic fields (ambient cosmic magnetic vector potentials),
acting on the torsion balance of the rotating torsion balance experiment
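The two notions of effective rest mass mentioned above can be motivated from the photon dispersion relation in a medium; a standard textbook sketch (not the paper's own derivation), assuming $E=\hbar\omega$ and $p=\hbar k$:

```latex
E^{2} = p^{2}c^{2} + m_{\mathrm{eff}}^{2}c^{4},
\qquad E = \hbar\omega, \quad
p = \hbar k = \frac{n(\omega)\,\hbar\omega}{c}
\;\Longrightarrow\;
m_{\mathrm{eff}}(\omega) = \frac{\hbar\omega}{c^{2}}\,\sqrt{1 - n^{2}(\omega)}
```

This $m_{\mathrm{eff}}(\omega)$ is the frequency-dependent definition; for a dilute plasma with $n^{2}(\omega) = 1 - \omega_{p}^{2}/\omega^{2}$ it reduces to the frequency-independent value $m_{\mathrm{eff}} = \hbar\omega_{p}/c^{2}$.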
Layout-based substitution tree indexing and retrieval for mathematical expressions
We introduce a new system for layout-based indexing and retrieval of mathematical expressions using substitution trees. Substitution trees can efficiently store and find hierarchically-structured data based on similarity. Previously, Kohlhase and Sucan applied substitution trees to indexing mathematical expressions in operator tree representation (Content MathML) and query-by-expression retrieval. In this investigation, we use substitution trees to index mathematical expressions in symbol layout tree representation (LaTeX) to group expressions based on the similarity of their symbols, symbol layout, sub-expressions and size. We describe our novel substitution tree indexing and retrieval algorithms and our many significant contributions to the behavior of these algorithms, including: allowing substitution trees to index and retrieve layout-based mathematical expressions instead of predicates; introducing a bias in the insertion function that helps group expressions in the index based on similarity in baseline size; modifying the search function to find expressions that are not identical yet still structurally similar to a search query; and ranking search results based on their similarity in symbols and symbol layout to the search query. We provide an experiment testing our system against the term frequency-inverse document frequency (TF-IDF) keyword-based system of Zanibbi and Yuan and demonstrate that: in many cases, the two systems are comparable; our system excelled at finding expressions identical to the search query and expressions containing relevant sub-expressions; and our system experiences some limitations due to the insertion bias and the presence of LaTeX formatting in expressions. 
Future work includes: designing a different insertion bias that improves the quality of search results; modifying the behavior of the search and ranking functions; and extending the scope of the system so that it can index websites or non-LaTeX expressions (such as MathML or images). Overall, we present a promising first attempt at layout-based substitution tree indexing and retrieval for mathematical expressions
Limits on the detectability of the CMB B-mode polarization imposed by foregrounds
We investigate which practical constraints are imposed by foregrounds to the
detection of the B-mode polarization generated by gravitational waves in the
case of experiments of the type currently being planned. Because the B-mode
signal is probably dominated by foregrounds at all frequencies, the detection
of the cosmological component depends drastically on our ability for removing
foregrounds. We provide an analytical expression to estimate the level of the
residual polarization for Galactic foregrounds, according to the method
employed for their subtraction. We interpret this result in terms of the lower
limit of the tensor-to-scalar ratio r that allows one to disentangle the
cosmological B-mode polarization from the foregrounds contribution. Polarized
emission from extragalactic radio sources and gravitational lensing is also
taken into account. As a first approach, we consider the ideal limit of an
instrumental noise-free experiment: for a full-sky coverage and a degree
resolution, we obtain a limit of r~10^(-4). This value can be improved by
high-resolution experiments and, in principle, no clear fundamental limit on
the detectability of gravitational waves polarization is found. Our analysis is
also applied to planned or hypothetical future polarization experiments, taking
into account expected noise levels. (Accepted for publication in MNRAS)
Testing Lorentz invariance by use of vacuum and matter filled cavity resonators
We consider tests of Lorentz invariance for the photon and fermion sector
that use vacuum and matter-filled cavities. Assumptions on the wave-function of
the electrons in crystals are eliminated from the underlying theory and
accurate sensitivity coefficients (including some exceptionally large ones) are
calculated for various materials. We derive the Lorentz-violating shift in the
index of refraction n, which leads to additional sensitivity for matter-filled
cavities, and to birefringence in initially isotropic media. Using published
experimental data, we obtain improved bounds on Lorentz violation for photons
and electrons at levels of 10^-15 and below. We discuss implications for future
experiments and propose a new Michelson-Morley type experiment based on
birefringence in matter.