1,096 research outputs found
Semantic Support for Log Analysis of Safety-Critical Embedded Systems
Testing is a key activity in the development life-cycle of safety-critical embedded systems. In particular, considerable effort is spent on the analysis and classification of test logs from SCADA subsystems, especially when failures occur. Human expertise is needed to understand the causes of failures, to trace errors back to their origin, and to determine which requirements are affected by the errors and which would be affected by possible changes in the system design. Semantic techniques and full-text search are used to support human experts in the analysis and classification of test logs, in order to speed up and improve the diagnosis phase. Moreover, retrieval of tests and requirements that may be related to the current failure is supported, allowing the discovery of available alternatives and solutions for a better and faster investigation of the problem.
Comment: EDCC-2014, BIG4CIP-2014, embedded systems, testing, semantic discovery, ontology, big data
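As an illustration of the kind of full-text support the abstract describes, the sketch below ranks log entries and requirements against a free-text failure query with TF-IDF. The log lines, requirement IDs, and the use of scikit-learn are assumptions made for the example, and the paper's ontology-based semantic layer is not reproduced.

    # Minimal sketch: rank test-log entries and requirements against a failure query
    # with TF-IDF. The corpus below is synthetic, not taken from the paper.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    log_entries = [
        "2014-03-01 10:02:11 ERROR axle counter subsystem timeout on channel B",
        "2014-03-01 10:02:12 WARN  watchdog restart requested by safety kernel",
        "2014-03-01 10:02:15 INFO  heartbeat OK from interlocking gateway",
        "2014-03-01 10:03:40 ERROR CRC mismatch in telegram from SCADA front-end",
    ]
    requirements = [
        "REQ-017: the system shall detect communication timeouts within 2 s",
        "REQ-042: corrupted telegrams shall be discarded and logged",
    ]

    corpus = log_entries + requirements
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(corpus)

    def related(query, top_k=3):
        """Return the top_k corpus items most similar to the failure description."""
        q = vectorizer.transform([query])
        scores = cosine_similarity(q, matrix).ravel()
        ranked = scores.argsort()[::-1][:top_k]
        return [(corpus[i], float(scores[i])) for i in ranked]

    if __name__ == "__main__":
        for text, score in related("timeout error on SCADA communication channel"):
            print(f"{score:0.3f}  {text}")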
Information Overload in Monopsony Markets
I consider a situation in which heterogeneous senders (applicants) compete to be selected by one receiver (employer). Productivity is private information of the senders, and the receiver processes imperfect signals (applications) to screen applicants. The information-processing technology is imperfect: the accuracy of each signal in predicting the unknown productivity decreases with the total number of signals processed. I show that, for a sufficiently large market, information overload occurs: there exist equilibria in which too many people apply and the receiver neglects some applications. For any level of information-processing technology, information-overload equilibria emerge when the cost of sending applications is low relative to that technology level. The magnitude of information overload is bounded, and it is larger if the receiver cannot neglect applications. As a result, an overloaded market in which the receiver has to process all applications is less efficient than an overloaded market in which neglecting excessive information is an option
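A toy Monte Carlo illustration of the overload mechanism, not the paper's model: the noise on each application's signal is assumed to grow linearly with the number of applications processed, while free entry pushes the number of applicants towards roughly wage/cost. The script prints the expected productivity of the hire for several pool sizes so the damage done by additional applications can be seen directly; the functional form, the wage and cost figures, and the hiring rule are all assumptions made for the example.

    # Toy illustration of information overload (illustrative model, not the paper's):
    # per-application signal noise grows with the number n of applications processed,
    # while free entry drives n up to roughly wage/cost.
    import random, statistics

    def hire_quality(n, noise_per_app=0.05, trials=20000, rng=random.Random(1)):
        """Average productivity of the hired applicant when n people apply."""
        picks = []
        for _ in range(trials):
            productivity = [rng.random() for _ in range(n)]
            sigma = noise_per_app * n          # assumed: accuracy falls as n grows
            signals = [p + rng.gauss(0.0, sigma) for p in productivity]
            picks.append(productivity[max(range(n), key=signals.__getitem__)])
        return statistics.mean(picks)

    wage, cost = 1.0, 0.05
    n_star = int(wage / cost)                  # free-entry benchmark: apply while wage/n > cost

    for n in (2, 5, 10, n_star):
        print(f"n = {n:3d}  expected hire productivity = {hire_quality(n):.3f}")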
Information Overload in Multi-Stage Selection Procedures
The paper studies information-processing imperfections in a fully rational decision-making network. It is shown that imperfect information transmission and imperfect information acquisition in a multi-stage selection game yield information overload. The paper analyses the mechanisms responsible for the seemingly boundedly rational behavior of the network and shows their similarities and distinctions. Two special cases of filtering selection procedures are investigated, in which the overload takes its most limiting forms. The model developed in the paper can be applied both to organizations and to individuals. It can serve as a rational foundation for bounded rationality
Evaluation Problem versus Selection Problem in Organizational Structures
We consider a hierarchical organization with two fully rational agents. The goal of the organization is to select the best alternative out of several available, and the agents are heterogeneous in the accuracy with which they screen the alternatives. We show that, if internal communication between the agents is not possible, the ordering of the agents affects the performance of the organization. More specifically, we find that the expected payoff of the organization improves when the more accurate agent screens first. Finally, we note that this optimal ordering makes the hierarchy formally identical to one in which the internal communication flow is perfect
Measurement of debt securities held to maturity and not subject to hedging: the case of securities accruing interest on CER-adjusted balances
Technical Resolution (RT) No. 17 of the FACPCE provides for the use of the Amortized Cost Method (or Effective Rate Method) for the primary periodic measurement of certain balance-sheet items, among them investments in debt securities to be held to maturity and not subject to hedging. Essentially, measuring assets under the Amortized Cost Method means adding to the initially recognized amount of the asset, net of collections, the accrued financial results, computed "exponentially" on that amount using the internal rate of return determined at the time of initial measurement. However, the practical application of this method raises some issues whose resolution is unclear, particularly for debt securities issued with a variable interest rate to be set for each interest payment or, as with a large share of the securities found in the market, issued with a fixed interest rate but accruing interest on balances that must likewise be adjusted for each interest payment. In these cases, determining the internal rate of return at initial measurement requires "estimating" the cash flows to be collected, a point on which neither the current standard (RT No. 17) nor the accounting literature has established guidelines. Against this background, this paper analyzes the Amortized Cost Method as a measurement criterion for certain debt securities, with particular emphasis on the aspects involved in putting it into concrete, effective practice in the specific case of securities that accrue interest on balances adjusted by the CER.
Bersia, Paola; Ficco, Cecilia; Ricci, Sergio (Universidad Nacional de Río Cuarto)
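A minimal numeric sketch of the amortized cost (effective rate) measurement described in the abstract: the internal rate of return implied by the initial amount and the estimated collections is found by bisection and then used to accrue interest on the carrying amount, net of collections. The bond figures are purely illustrative; for CER-adjusted securities the collections themselves would first have to be estimated with a projected adjustment coefficient, which is precisely the open issue the paper discusses.

    # Minimal sketch of the amortized cost (effective interest rate) measurement:
    # the IRR implied by the initially recognized amount and the estimated cash flows
    # is locked in, and each period the carrying amount accrues interest at that rate,
    # net of collections. All figures are illustrative.

    def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-10):
        """Internal rate of return by bisection (cash_flows[0] is the initial outlay)."""
        def npv(rate):
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if npv(mid) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    # Illustrative bond: purchased at 950, three annual collections (coupon + principal).
    initial_cost = 950.0
    collections = [60.0, 60.0, 1060.0]        # estimated at initial measurement
    rate = irr([-initial_cost] + collections)

    carrying = initial_cost
    print(f"effective rate: {rate:.4%}")
    for period, cash in enumerate(collections, start=1):
        interest = carrying * rate            # accrued financial result for the period
        carrying = carrying + interest - cash # amortized cost after the collection
        print(f"period {period}: interest {interest:8.2f}  collection {cash:8.2f}  "
              f"carrying amount {carrying:10.2f}")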
Live Migration in Emerging Cloud Paradigms
The elastic provisioning of resources and the ability to adapt on the fly to changing resource demand and environmental conditions are probably key success factors of cloud computing. Live migration of virtual resources is of pivotal importance in achieving these properties. However, effectively and efficiently determining which resource to migrate and where, while satisfying appropriate objectives and constraints, remains a research challenge. The existing literature generally relies on metaheuristics running on a central resolver. Such an approach falls short because it considers only the quality-of-service aspects of the decision while ignoring the regulatory challenges. This column highlights the regulatory challenges associated with the cross-border data flows implied by migration and stresses the need to adopt alternative decision approaches.
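A minimal sketch of the kind of decision step the column argues for: candidate destinations are first filtered on a data-residency (cross-border) constraint and only then ranked on a quality-of-service score. The host names, jurisdictions, and scoring weights are hypothetical, not taken from the column.

    # Minimal sketch: pick a live-migration destination by filtering on a data-residency
    # constraint before ranking on quality of service. Names and figures are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Host:
        name: str
        jurisdiction: str      # country/region where the data would physically reside
        free_cpu: float        # spare capacity, 0..1
        latency_ms: float      # estimated latency to the workload's users

    def pick_destination(candidates, allowed_jurisdictions):
        """Return the best admissible host, or None if the regulatory filter empties the set."""
        admissible = [h for h in candidates if h.jurisdiction in allowed_jurisdictions]
        if not admissible:
            return None
        # simple QoS score: prefer spare capacity, penalise latency
        return max(admissible, key=lambda h: h.free_cpu - 0.01 * h.latency_ms)

    hosts = [
        Host("dc-frankfurt-3", "DE", free_cpu=0.40, latency_ms=18),
        Host("dc-virginia-1",  "US", free_cpu=0.70, latency_ms=95),
        Host("dc-milan-2",     "IT", free_cpu=0.25, latency_ms=12),
    ]

    target = pick_destination(hosts, allowed_jurisdictions={"DE", "IT"})
    print("migrate to:", target.name if target else "no admissible destination")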
A Theory of Procedurally Rational Choice: Optimization without Evaluation
This paper analyses the behavior of an individual who wants to maximize his utility function but is unable to evaluate it. There are many ways to choose a single alternative from a given set. We show that a unique utility-maximizing procedure exists. Choices induced by this optimal procedure are always transitive but generally violate the Weak Axiom. In other words, utility-maximizing individuals who are unable to evaluate their objective functions fail to exhibit rational revealed preferences
Hybrid Simulation and Test of Vessel Traffic Systems on the Cloud
This paper presents a cloud-based hybrid simulation platform to test large-scale distributed Systems-of-Systems (SoS) for the management and control of maritime traffic, the so-called Vessel Traffic Systems (VTS). A VTS consists of multiple heterogeneous, distributed and interoperating systems, including radar, automatic identification systems, direction finders, electro-optical sensors, gateways to external VTSs, and information systems; identifying, representing and analyzing their interactions is a challenge for evaluating the real risks to the safety and security of the marine environment. The need to reproduce in the factory the system behaviors that could occur in situ demands the ability to integrate emulated and simulated environments, both to cope with the different testability requirements of the systems involved and to keep testing costs sustainable. The platform exploits hybrid simulation and virtualization technologies, and it is deployable on a private cloud, reducing the cost of setting up realistic and effective testing scenarios
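A minimal sketch of the kind of glue such a hybrid platform needs: a simulated vessel-track feed pushes position updates over TCP to a stand-in for an emulated VTS component, so simulated and emulated parts can be wired into one scenario. The message fields, port, and component behaviour are assumptions for the example, not the platform's actual interfaces.

    # Minimal sketch of mixing a simulated traffic feed with an emulated consumer over TCP.
    # The "gateway" is just a line-oriented logger standing in for an emulated component;
    # the message format is hypothetical, not the platform's actual protocol.
    import json, socket, threading, time

    HOST, PORT = "127.0.0.1", 40100

    def emulated_gateway():
        """Stand-in for an emulated VTS component: accept one connection, log each update."""
        with socket.socket() as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn, conn.makefile() as stream:
                for line in stream:
                    print("gateway received:", json.loads(line))

    def simulated_traffic(n_updates=5):
        """Simulated vessel track: push position updates to the emulated gateway."""
        time.sleep(0.2)                        # let the gateway start listening
        with socket.create_connection((HOST, PORT)) as conn:
            lat, lon = 40.84, 14.25
            for step in range(n_updates):
                msg = {"mmsi": 247123456, "lat": round(lat + 0.001 * step, 5),
                       "lon": round(lon + 0.002 * step, 5), "speed_kn": 12.5}
                conn.sendall((json.dumps(msg) + "\n").encode())
                time.sleep(0.1)

    threading.Thread(target=emulated_gateway, daemon=True).start()
    simulated_traffic()
    time.sleep(0.3)                            # give the gateway time to drain the stream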
A systematic investigation of mental representations of faces: face-space organization and its relation to individual face processing skills
The main aim of this thesis is to examine the contribution of face-space properties to individual differences in face identity processing. I consider the neural and behavioural differences between typical and distinctive faces as proxies for studying face-space properties. In the first two experiments, I combine fMRI and ALE meta-analysis to investigate the neural effects of face-space location on the BOLD signal. I also investigate the same effect at the behavioural level in my fourth experiment, by quantifying reaction times and learning rates in response to faces of different typicality levels. The data presented in this thesis suggest that there are indeed small but systematic neural and behavioural differences between typical and distinctive faces, reflecting their different locations and densities in face-space. In the third and fourth experiments, I investigate more directly the relationship between the properties of individual face-spaces and face identity processing skills. In the third experiment, I assess the difference between participants with high and low face memory performance in a well-established neural marker of typicality processing, the P200 ERP component, measured through EEG. In the fourth experiment, I use cutting-edge statistical techniques to measure the relationship between individuals’ face processing skills and their performance during a face detection task, from which I derive indicators of face-space expansion and adaptability. Additionally, this thesis aims to fill the highlighted theoretical gap between models of face representation and predictive processing theories; therefore, the first study also investigates the relationship between the effects of face typicality and predictability on the BOLD signal. Overall, this thesis represents a systematic investigation of how known faces are mentally represented and of the extent to which this relates to individual face processing performance profiles
Comparison of academic success between students who live in single parent households and students who live in two-parent households
The purpose of this study was to compare the impact of household composition on the academic achievement of elementary school students. The sample contained 83 third- and fourth-grade students from a small Southern New Jersey school district: 22 from single-parent households and 61 from two-parent households. The grade point averages for reading, language and mathematics were recorded for two consecutive marking periods, and the statistical test used was a paired-samples t-test. The results of the study show a strong relationship between household composition and academic achievement: the grade point averages of the students in two-parent households were significantly higher than those of the students in single-parent households. The study concludes that household composition does have an impact on student academic achievement
