    Energy-temperature uncertainty relation in quantum thermodynamics

    This is the author accepted manuscript; the final version is available from Springer Nature via the DOI in this record. Much like Heisenberg’s uncertainty principle in quantum mechanics, there exists a thermodynamic uncertainty relation in classical statistical mechanics that limits the simultaneous estimation of energy and temperature for a system in equilibrium. However, for nanoscale systems, deviations from standard thermodynamics arise due to non-negligible interactions with the environment. Here we include interactions and, using quantum estimation theory, derive a generalised thermodynamic uncertainty relation valid for classical and quantum systems at all coupling strengths. We show that the non-commutativity between the system’s state and its effective energy operator gives rise to additional quantum fluctuations that increase the uncertainty in temperature and modify the heat capacity. Surprisingly, these quantum fluctuations are described by the average Wigner-Yanase-Dyson skew information, a quantity intimately connected to measures of coherence. For temperature estimation we demonstrate that the optimal signal-to-noise ratio is constrained not only by the heat capacity but also by an additional dissipative term arising from the non-negligible interactions. Practically, this will inform the design of optimal nanoscale thermometers. On the fundamental side, the results shed light on the interplay between classical and non-classical fluctuations in quantum thermodynamics. HM is supported by EPSRC through a Doctoral Training Grant. J.A. acknowledges support from EPSRC, grant EP/M009165/1, and the Royal Society. This research was supported by the COST network MP1209 “Thermodynamics in the quantum regime”.
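The classical bound that this abstract generalises can be stated compactly. The sketch below (with k_B = 1) gives the Cramér–Rao form of the weak-coupling temperature bound and the definition of the Wigner-Yanase-Dyson skew information that supplies the quantum correction; the exact effective energy operator and dissipative term of the paper are not reproduced here.

```latex
% Weak-coupling (classical) bound: estimating \beta = 1/T from energy
% fluctuations, via the Cram\'er--Rao inequality (k_B = 1):
\Delta\beta \;\ge\; \frac{1}{\Delta U},
\qquad (\Delta U)^2 = T^2\, C(T)
\;\;\Rightarrow\;\;
\left(\frac{T}{\Delta T}\right)^{2} \;\le\; C(T).

% Wigner--Yanase--Dyson skew information, the quantity the abstract
% identifies with the additional quantum fluctuations:
I_a(\rho, A) \;=\; -\tfrac{1}{2}\,
\mathrm{Tr}\!\left([\rho^{a}, A]\,[\rho^{1-a}, A]\right),
\qquad 0 < a < 1 .
```

At strong coupling the paper replaces C(T) by a modified heat capacity and subtracts a skew-information contribution, which is why the attainable signal-to-noise ratio falls below the classical bound.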

    Model selection in High-Dimensions: A Quadratic-risk based approach

    In this article we propose a general class of risk measures which can be used for data-based evaluation of parametric models. The loss function is defined as a generalized quadratic distance between the true density and the proposed model. These distances are characterized by a simple quadratic form structure that is adaptable through the choice of a nonnegative definite kernel and a bandwidth parameter. Using asymptotic results for the quadratic distances, we build a quick-to-compute approximation for the risk function. Its derivation is analogous to the Akaike Information Criterion (AIC), but unlike AIC, the quadratic risk is a global comparison tool. The method does not require resampling, a great advantage when point estimators are expensive to compute. The method is illustrated using the problem of selecting the number of components in a mixture model, where it is shown that, by using an appropriate kernel, the method is computationally straightforward in arbitrarily high data dimensions. In this same context it is shown that the method has some clear advantages over AIC and BIC. Comment: Updated with reviewer suggestion
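A kernel quadratic distance of the kind described has the form d_K(f, g) = ∫∫ K(s, t)(f−g)(s)(f−g)(t) ds dt. The sketch below is a minimal Monte-Carlo illustration (not the paper's risk approximation): it compares a data sample against draws from a candidate model using a Gaussian kernel, where the bandwidth `h` and all sample sizes are arbitrary choices.

```python
import numpy as np

def quad_distance(x, y, h=1.0):
    """Kernel quadratic distance between a data sample x and model draws y.

    Monte-Carlo sketch of d_K(f, g) with a Gaussian kernel K_h; the three
    terms estimate the f-f, f-g and g-g integrals of the quadratic form.
    """
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2 * h ** 2)).mean()
    return k(x, x) - 2 * k(x, y) + k(y, y)

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 500)

# A well-specified model (N(0,1)) should score lower than a shifted one:
good = quad_distance(data, rng.normal(0.0, 1.0, 500))
bad = quad_distance(data, rng.normal(2.0, 1.0, 500))
```

For mixture-model selection one would compare the fitted k-component model's draws against the data for each k and penalise, following the AIC-like correction the paper derives.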

    Do we measure what we get?

    Performance measures are intended to enhance the performance of companies by directing the attention of decision makers towards the achievement of organizational goals. Therefore, goal congruence is regarded in the literature as a major factor in the quality of such measures. As reality is affected by many variables, practitioners have tried to achieve a high degree of goal congruence by incorporating an increasing number of these variables into performance measures. However, a goal-congruent measure does not automatically lead to superior decisions, because decision makers’ restricted cognitive abilities can counteract the intended effects. This paper addresses the interplay between goal congruence and the complexity of performance measures, considering cognitively restricted decision makers. Two types of decision quality are derived which allow a differentiated view on the influence of this interplay on decision quality and learning. The simulation experiments based on this differentiation provide results which allow a critical reflection on the costs and benefits of goal congruence and the assumptions regarding the goal congruence of incentive systems.

    Forecasting People Trajectories and Head Poses by Jointly Reasoning on Tracklets and Vislets

    In this work, we explore the correlation between people's trajectories and their head orientations. We argue that trajectory and head-pose forecasting can be modelled as a joint problem. Recent approaches to trajectory forecasting leverage short-term trajectories (aka tracklets) of pedestrians to predict their future paths. In addition, sociological cues, such as expected destination or pedestrian interaction, are often combined with tracklets. In this paper, we propose MiXing-LSTM (MX-LSTM) to capture the interplay between positions and head orientations (vislets) thanks to a joint unconstrained optimization of full covariance matrices during the LSTM backpropagation. We additionally exploit the head orientations as a proxy for visual attention when modeling social interactions. MX-LSTM predicts pedestrians’ future locations and head poses, extending the capabilities of current approaches to long-term trajectory forecasting. Compared to the state of the art, our approach shows better performance on an extensive set of public benchmarks. MX-LSTM is particularly effective when people move slowly, i.e., the most challenging scenario for all other models. The proposed approach also allows for accurate predictions on a longer time horizon. Comment: Accepted at IEEE Transactions on Pattern Analysis and Machine Intelligence 2019. arXiv admin note: text overlap with arXiv:1805.0065
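The key technical ingredient, an unconstrained optimization of a full covariance over position and head orientation, requires a parameterization that always yields a valid (positive-definite) matrix. The sketch below shows one standard trick, a Cholesky factor with exponentiated diagonal; this is an assumed illustration, not the paper's exact implementation, and all parameter values are made up.

```python
import numpy as np

def cov_from_params(theta, d=3):
    """Map an unconstrained parameter vector to a positive-definite
    covariance over (x, y, head-angle) via a Cholesky factor whose
    diagonal is exponentiated, so any real theta is a legal covariance."""
    L = np.zeros((d, d))
    L[np.tril_indices(d)] = theta          # fill lower triangle
    L[np.diag_indices(d)] = np.exp(np.diag(L))  # force positive diagonal
    return L @ L.T

def gauss_nll(z, mu, cov):
    """Negative log-likelihood of one joint (x, y, head-angle) observation,
    the kind of loss backpropagated through the LSTM."""
    diff = z - mu
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (diff @ np.linalg.solve(cov, diff)
                  + logdet + len(z) * np.log(2 * np.pi))

theta = np.array([0.1, 0.3, -0.2, 0.0, 0.5, 0.2])   # 6 params for d = 3
cov = cov_from_params(theta)
nll = gauss_nll(np.array([0.5, -0.1, 0.2]), np.zeros(3), cov)
```

Because the map is unconstrained, gradient descent on `theta` never leaves the space of valid covariances, which is what makes the joint position/vislet likelihood trainable end to end.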

    The Visual Social Distancing Problem

    One of the main and most effective measures to contain the recent viral outbreak is the maintenance of so-called Social Distancing (SD). To comply with this constraint, workplaces, public institutions, transport systems and schools will likely adopt restrictions on the minimum inter-personal distance between people. Given this scenario, it is crucial to measure compliance with this physical constraint at scale, in order to identify the reasons for possible breaches of the distance limitation and to understand whether a breach implies a threat given the scene context, all while complying with privacy policies and keeping the measurement acceptable. To this end, we introduce the Visual Social Distancing (VSD) problem, defined as the automatic estimation of the inter-personal distance from an image, and the characterization of the related people aggregations. VSD is pivotal for a non-invasive analysis of whether people comply with the SD restriction, and for providing statistics about the level of safety of specific areas whenever this constraint is violated. We then discuss how VSD relates to previous literature in Social Signal Processing and indicate which existing Computer Vision methods can be used to address the problem. We conclude with future challenges related to the effectiveness of VSD systems, ethical implications and future application scenarios. Comment: 9 pages, 5 figures. All the authors contributed equally to this manuscript and are listed in alphabetical order. Under submission
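A common geometric core of such systems is projecting detected foot positions onto the ground plane with a calibrated homography, then thresholding pairwise distances. The sketch below assumes a toy homography (100 px per metre, no perspective distortion) and a hypothetical 2 m threshold; a real VSD system would obtain `H` from camera calibration.

```python
import numpy as np

def ground_points(feet_px, H):
    """Project image foot positions (pixels) to ground-plane metres via a
    homography H (assumed known from calibration)."""
    pts = np.c_[feet_px, np.ones(len(feet_px))] @ H.T
    return pts[:, :2] / pts[:, 2:3]          # dehomogenise

def sd_violations(feet_px, H, min_dist=2.0):
    """Return index pairs of people closer than min_dist metres."""
    g = ground_points(feet_px, H)
    pairs = []
    for i in range(len(g)):
        for j in range(i + 1, len(g)):
            if np.linalg.norm(g[i] - g[j]) < min_dist:
                pairs.append((i, j))
    return pairs

# Toy calibration: 100 px per metre, fronto-parallel ground plane.
H = np.array([[0.01, 0.0, 0.0],
              [0.0, 0.01, 0.0],
              [0.0, 0.0, 1.0]])
feet = np.array([[100, 100], [150, 100], [900, 900]])  # pixel coordinates
violations = sd_violations(feet, H)
```

The characterization of "people aggregations" mentioned in the abstract would then amount to clustering this pairwise-distance graph rather than just listing violating pairs.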

    Illusory correlation, group size and memory

    Two studies were conducted to test the predictions of a multi-component model of distinctiveness-based illusory correlation (IC) regarding the use of episodic and evaluative information in the production of the phenomenon. Extending the standard paradigm, participants were presented with 4 groups decreasing in size, but all exhibiting the same ratio of positive to negative behaviours. Study 1 (N = 75) specifically tested the role of group size and distinctiveness by including a zero-frequency cell in the design. Consistent with predictions drawn from the proposed model, with decreasing group size, the magnitude of the IC effect showed a linear increase in judgments thought to be based on evaluative information. In Study 2 (N = 43), a number of changes were introduced to a group assignment task (double presentation, inclusion of decoys) that allowed a more rigorous test of the predicted item-specific memory effects. In addition, a new multilevel, mixed logistic regression approach to signal-detection type analysis was used, providing a more flexible and reliable analysis than previously. Again, with decreasing group size, IC effects showed the predicted monotonic increase on the measures (group assignment frequencies, likability ratings) thought to be dependent on evaluative information. At the same time, measures thought to be based on episodic information (free recall and group assignment accuracy) partly revealed the predicted enhanced episodic memory for smaller groups and negative items, while also supporting a distinctiveness-based approach. Additional analysis revealed that the pattern of results for judgments thought to be based on evaluative information was independent of interpersonal variation in behavioral memory, as predicted by the multi-component model, and in contrast to predictions of the competing models. The results are discussed in terms of the implications of the findings for the proposed mechanisms of illusory correlation.

    The interplay of strategic and internal green marketing orientation on competitive advantage

    This paper seeks to clarify and refine the relationship between strategic and internal green marketing and firm competitiveness. Despite the significance of corporate environmental strategy to firms adopting a triple-bottom-line performance evaluation, there is insufficient focus on strategic green marketing and its impact on a firm’s competitiveness. This study fills the gap by providing a comprehensive view of strategic green marketing and its impact on competitive advantage. Findings also reveal the moderating role of internal green marketing actions in the development of a sustained competitive advantage. Specifically, the findings build on contemporary green marketing literature by suggesting that a significant interplay between strategy and people exists which enhances the creation of competitive advantage; this in turn increases financial performance. Finally, this research uses an updated approach to build on current literature concerning the drivers and outcomes of strategic green marketing, providing managers with nuanced insights about environmentally driven competitive advantage.

    Fiscal-monetary-financial stability interactions in a data-rich environment

    In this paper, we shed some light on the mutual interplay of economic policy and the financial stability objective. We contribute to the intense discussion regarding the influence of fiscal and monetary policy measures on the real economy and the financial sector. We apply a factor-augmented vector autoregression (FAVAR) model to Czech macroeconomic data and model the policy interactions in a data-rich environment. Our findings can be summarized in three main points. First, loose economic policies (especially monetary policy) may translate into a more stable financial sector, albeit only in the short term; in the medium term, an expansion-focused mix of monetary and fiscal policy may contribute to systemic risk accumulation by substantially increasing credit dynamics and house prices. Second, we find that fiscal and monetary policy impact the financial sector with different magnitudes and over different time horizons. Third, we confirm that systemic risk materialization might cause significant output losses and deterioration of public finances, trigger deflationary pressures, and increase the debt service ratio. Overall, our findings provide some empirical support for countercyclical fiscal and monetary policies.
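The mechanics of a factor-augmented VAR can be sketched in a few lines: compress a wide macro panel into a handful of principal-component factors, stack them with the observed policy variables, and fit a VAR by least squares. This toy version (synthetic data, VAR(1), no identification scheme) is only an illustration of the "data-rich" setup, not the paper's Czech specification.

```python
import numpy as np

def favar_sketch(X, policy, k=2):
    """Toy FAVAR: k principal-component factors from a T x N macro panel X,
    stacked with observed policy variables, then a VAR(1) fit by OLS.
    Returns the estimated transition matrix of the stacked state."""
    Xc = X - X.mean(0)
    # PCA factors via SVD of the standardised panel
    U, s, _ = np.linalg.svd(Xc / Xc.std(0), full_matrices=False)
    F = U[:, :k] * s[:k]
    Z = np.c_[F, policy]                     # state: factors + policy vars
    Y, Ylag = Z[1:], Z[:-1]                  # regress Z_t on Z_{t-1}
    A, *_ = np.linalg.lstsq(Ylag, Y, rcond=None)
    return A.T

rng = np.random.default_rng(1)
T, N = 200, 30
X = rng.normal(size=(T, N)).cumsum(0)        # persistent synthetic panel
policy = rng.normal(size=(T, 1)).cumsum(0)   # e.g. a policy rate
A = favar_sketch(X, policy)
```

Impulse responses of the kind reported in the paper would then come from iterating the estimated transition matrix after orthogonalising the shocks.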

    Perception of categories: from coding efficiency to reaction times

    Reaction times in perceptual tasks are the subject of many experimental and theoretical studies. With the neural decision-making process as the main focus, most of these works concern discrete (typically binary) choice tasks, implying the identification of the stimulus as an exemplar of a category. Here we address issues specific to the perception of categories (e.g. vowels, familiar faces, ...), making a clear distinction between identifying a category (an element of a discrete set) and estimating a continuous parameter (such as a direction). We exhibit a link between optimal Bayesian decoding and coding efficiency, the latter being measured by the mutual information between the discrete category set and the neural activity. We characterize the properties of the best estimator of the likelihood of the category when this estimator takes its inputs from a large population of stimulus-specific coding cells. Adopting the diffusion-to-bound approach to model the decisional process, this allows us to relate analytically the bias and variance of the diffusion process underlying decision making to macroscopic quantities that are behaviorally measurable. A major consequence is the existence of a quantitative link between reaction times and discrimination accuracy. The resulting analytical expression for mean reaction times during an identification task accounts for empirical facts, both qualitatively (e.g. more time is needed to identify a category from a stimulus at the boundary compared to a stimulus lying within a category) and quantitatively (working on published experimental data on phoneme identification tasks).
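The qualitative boundary effect already follows from the standard diffusion-to-bound model. The sketch below uses the textbook mean decision-time formula for symmetric bounds and an unbiased start; it is not the paper's exact expression, which ties the drift to the population likelihood estimator.

```latex
% Diffusion-to-bound: dX_t = v\,dt + \sigma\,dW_t ,
% with a decision when X_t first hits \pm a, starting from X_0 = 0.
% Mean decision time (standard result):
\langle T \rangle \;=\; \frac{a}{v}\,
\tanh\!\left(\frac{a v}{\sigma^{2}}\right),
\qquad
\lim_{v \to 0} \langle T \rangle \;=\; \frac{a^{2}}{\sigma^{2}} .
```

A stimulus at a category boundary yields near-equal likelihoods for the competing categories, hence an effective drift v ≈ 0 and the longest mean reaction time, consistent with the qualitative fact quoted above.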