Quantum state targeting
We introduce a new primitive for quantum communication that we term "state
targeting" wherein the goal is to pass a test for a target state even though
the system upon which the test is performed is submitted prior to learning the
target state's identity. Success in state targeting can be described as having
some control over the outcome of the test. We show that increasing one's
control above a minimum amount implies an unavoidable increase in the
probability of failing the test. This is analogous to the unavoidable
disturbance to a quantum state that results from gaining information about its
identity, and can be shown to be a purely quantum effect. We provide some
applications of the results to the security analysis of cryptographic tasks
implemented between remote antagonistic parties. Although we focus on weak coin
flipping, the results are significant for other two-party protocols, such as
strong coin flipping, partially binding and concealing bit commitment, and bit
escrow. Furthermore, the results have significance not only for the traditional
notion of security in cryptography, that of restricting a cheater's ability to
bias the outcome of the protocol, but also for a novel notion of security that
arises only in the quantum context, that of cheat-sensitivity. Finally, our
analysis of state targeting leads to some interesting secondary results, for
instance, a generalization of Uhlmann's theorem and an operational
interpretation of the fidelity between two mixed states.
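For context, in the convention used by much of the literature the fidelity of two mixed states is

```latex
F(\rho,\sigma)=\left(\operatorname{Tr}\sqrt{\sqrt{\rho}\,\sigma\,\sqrt{\rho}}\right)^{2},
```

and Uhlmann's theorem states that $F(\rho,\sigma)=\max_{|\psi\rangle,|\varphi\rangle}|\langle\psi|\varphi\rangle|^{2}$, where the maximum runs over purifications $|\psi\rangle$ of $\rho$ and $|\varphi\rangle$ of $\sigma$. (Some authors define the fidelity as the square root of this quantity.)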
MacBeth as an MCDA Tool to Benchmark the Iberian Airports
This work addresses airport benchmarking, an important issue for stakeholders. Benchmarking depends on airport performance indicators, which matter for business and operational management, regulatory bodies, airlines and passengers alike. There are several sets of indicators for evaluating airport performance, and several techniques for benchmarking airports. This work uses MacBeth, an MCDA (Multi-Criteria Decision Analysis) tool, to evaluate the attractiveness of the most important Iberian airports. The approach is new, and the preliminary results are promising when compared with traditional airport benchmarking studies.
Keywords: Airports Benchmarking, MCDA/MacBeth, Iberian Airports
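The core aggregation step of an MCDA evaluation of this kind can be sketched as a weighted additive value model. All numbers below (criteria, weights, per-criterion scores) are invented for illustration; they are not the paper's data.

```python
# Hypothetical sketch of the additive aggregation step in an MCDA evaluation:
# each airport receives a value score per criterion (invented numbers), and
# overall attractiveness is the weighted sum of those scores.
weights = {"capacity": 0.4, "punctuality": 0.35, "cost": 0.25}  # assumed weights

airports = {  # assumed 0-100 value scores per criterion (illustrative only)
    "Lisbon": {"capacity": 80, "punctuality": 70, "cost": 60},
    "Madrid": {"capacity": 90, "punctuality": 65, "cost": 55},
    "Porto":  {"capacity": 60, "punctuality": 85, "cost": 70},
}

def overall(scores, weights):
    """Weighted additive value: V(a) = sum_i w_i * v_i(a)."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank airports by overall attractiveness, most attractive first.
ranking = sorted(airports, key=lambda a: overall(airports[a], weights), reverse=True)
print(ranking)
```

In MACBETH itself the per-criterion value scores come from qualitative pairwise judgments of difference in attractiveness; the sketch above only shows the final aggregation.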
Automatic disruption classification in JET with the ITER-like wall
The new full-metal ITER-like wall at JET was found to have a deep impact on the physics of
disruptions at JET. In order to develop disruption classification, the 10D operational space of
JET with the new ITER-like wall has been explored using the generative topographic mapping
method. The 2D map has been exploited to develop an automatic disruption classification of
several manually identified disruption classes. In particular, all non-intentional
disruptions that occurred in JET from 2011 to 2013 with the new wall have been considered. A statistical
analysis of the plasma parameters describing the operational spaces of JET with carbon wall
and JET ITER-like wall has been performed and some physical considerations have been
made on the difference between these two operational spaces and the disruption classes which
can be identified. The performance of the JET ITER-like wall classifier is tested in real time
in conjunction with a disruption predictor presently operating at JET with good results.
Moreover, to validate and analyse the results, another reference classifier has been developed,
based on the k-nearest neighbour technique. Finally, in order to verify the reliability of the
performed classification, a conformal predictor based on non-conformity measures has been
developed.
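A conformal predictor built on a nearest-neighbour nonconformity measure can be illustrated in a few lines. This is a toy sketch, not the JET implementation: the 2-D points standing in for plasma parameters, the two classes, and the significance level are all invented. A class label is kept in the prediction set whenever its conformal p-value exceeds the significance level.

```python
# Toy conformal predictor with a 1-nearest-neighbour nonconformity measure.
# Calibration data: two invented clusters standing in for disruption classes.
import numpy as np

cal_X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.],    # class 0 cluster
                  [5., 5.], [5., 6.], [6., 5.], [6., 6.]])   # class 1 cluster
cal_y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

def nn_dist(x, pool):
    """Distance to the nearest point in `pool`, excluding an exact self-match."""
    d = np.linalg.norm(pool - x, axis=1)
    return d[d > 0].min()

def p_value(x, label):
    """Conformal p-value: how typical x's nonconformity is for this class."""
    same = cal_X[cal_y == label]
    a = np.linalg.norm(same - x, axis=1).min()        # test nonconformity
    scores = [nn_dist(xi, same) for xi in same]       # leave-one-out calibration
    return (sum(s >= a for s in scores) + 1) / (len(scores) + 1)

x_new = np.array([0.2, 0.1])
# 75% prediction set (with only 4 calibration points per class, the smallest
# attainable p-value is 0.2, so the significance level must exceed that).
prediction = {c for c in (0, 1) if p_value(x_new, c) > 0.25}
print(prediction)   # the new point conforms only to class 0
```

The validity guarantee of conformal prediction (the true class is excluded with probability at most the significance level, under exchangeability) holds for any nonconformity measure; the k-NN distance used here is just one common choice.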
DATA MINING AND THE PROCESS OF TAKING DECISIONS IN E-BUSINESS
Data mining software allows users to analyze large databases to solve business decision problems. Data mining is, in some ways, an extension of statistics, with a few artificial intelligence and machine learning twists thrown in. Like statistics, data mining is not a business solution in itself; it is just a technology. For example, consider a catalog retailer who needs to decide who should receive information about a new product. The information operated on by the data mining process is contained in a historical database of previous interactions with customers and the features associated with those customers, such as age, zip code, and their responses. The data mining software would use this historical information to build a model of customer behavior that could be used to predict which customers would be likely to respond to the new product. Using this information, a marketing manager can select only the customers who are most likely to respond. The operational business software can then feed the results of the decision to the appropriate touch-point systems (call centers, direct mail, web servers, email systems, etc.) so that the right customers receive the right offers.
Keywords: data mining, business decisions, data analysis, cluster analysis, decision strategy
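The catalog-retailer example can be sketched with a deliberately simple "model": score each prospect by the historical response rate of their customer segment and mail only the promising ones. All records, segment definitions, names, and the 0.5 cutoff below are invented for illustration; a real system would use a proper classifier rather than raw segment frequencies.

```python
# Hypothetical sketch of the catalog-retailer decision: learn per-segment
# response rates from history, then select prospects whose segment clears
# a threshold. Data and threshold are invented.
from collections import defaultdict

history = [  # (age_band, region, responded) -- illustrative records only
    ("18-34", "north", 1), ("18-34", "north", 0), ("18-34", "north", 1),
    ("35-54", "north", 0), ("35-54", "north", 0),
    ("35-54", "south", 1), ("35-54", "south", 1),
    ("55+",   "south", 0), ("55+",   "south", 0), ("55+",   "south", 0),
]

# "Model": historical response rate per (age_band, region) segment.
hits, totals = defaultdict(int), defaultdict(int)
for age, region, responded in history:
    totals[(age, region)] += 1
    hits[(age, region)] += responded
rate = {seg: hits[seg] / totals[seg] for seg in totals}

prospects = [("Ana", "18-34", "north"), ("Bob", "55+", "south"),
             ("Cid", "35-54", "south")]
# Mail only customers whose segment's historical response rate is high enough.
mailing_list = [name for name, age, region in prospects
                if rate.get((age, region), 0.0) >= 0.5]
print(mailing_list)
```

The output of such a scoring step is exactly what the abstract describes feeding into the touch-point systems: a ranked or filtered list of customers per channel.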
Applied business analytics approach to IT projects – Methodological framework
The design and implementation of a big data project differs from that of a typical business intelligence project that might be run concurrently within the same organization. A big data initiative typically triggers a large-scale IT project that is expected to deliver the desired outcomes. The industry has identified two major methodologies for running a data-centric project, namely SEMMA (Sample, Explore, Modify, Model and Assess) and CRISP-DM (Cross Industry Standard Process for Data Mining). More generally, the professional organizations PMI (Project Management Institute) and IIBA (International Institute of Business Analysis) have defined their methods for project management and business analysis based on current industry best practices. However, big data projects pose new challenges that are not addressed by the existing methodologies. Building an end-to-end big data analytical solution for optimization of the supply chain, pricing and promotion, product launch, shop potential and customer value faces both business and technical challenges. The most common business challenges are unclear and/or poorly defined business cases; irrelevant data; poor data quality; overlooked data granularity; improper contextualization of data; unprepared or badly prepared data; non-meaningful results; and a lack of the necessary skill set. Some of the technical challenges relate to a lack of resources and technology limitations; availability of data sources; storage difficulties; security issues; performance problems; little flexibility; and ineffective DevOps. This paper discusses an applied business analytics approach to IT projects and addresses the above-described aspects. The authors present their work on the research and development of a new methodological framework and analytical instruments applicable to both business endeavors and educational initiatives targeting big data. The proposed framework is based on a proprietary methodology and advanced analytics tools.
It is focused on the development and implementation of practical solutions for project managers, business analysts, IT practitioners and Business/Data Analytics students. Also under discussion are the necessary skills and knowledge for the successful big data business analyst, and some of the main organizational and operational aspects of big data projects, including continuous model deployment.
Performance and Congestion Analysis of the Portuguese Hospital Services
Health care services have been characterised by growing demand from citizens, leading to the need for ever more resources. Population aging, new pathologies and drugs, as well as new treatments, are some of the major factors behind this. However, in hospitals, for example, the consumption of a large number of inputs has frequently not corresponded to a proportional increase in outputs. Sometimes outputs even decline as inputs increase, owing to the influence of the congestion effect on efficiency. The heavy burden of the health sector on the state budget motivates research into its efficiency. This paper aims to assess the performance of Portuguese hospitals and, in particular, the contribution of the congestion effect. We use the non-parametric technique of data envelopment analysis (DEA) for this purpose, together with a double-bootstrap procedure to take into account the influence of the operational environment on efficiency. Afterwards, by comparing three different approaches, we determine the importance of congestion in efficiency measurement and discuss its computation methodologically. The results suggest significant levels of inefficiency in 68 major Portuguese hospitals for the year 2005, and more than half of the hospitals were found to be congested.
Keywords: Hospitals; congestion; efficiency; DEA; Portugal
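In envelopment form, the input-oriented DEA efficiency score of a hospital indexed 0 solves the textbook linear program below (a generic formulation, not necessarily the exact specification estimated in the paper), with inputs $x_{ij}$ and outputs $y_{rj}$ over hospitals $j$:

```latex
\theta^{*}=\min_{\theta,\,\lambda}\ \theta
\quad\text{s.t.}\quad
\sum_{j}\lambda_{j}x_{ij}\le\theta\,x_{i0}\ \ \forall i,
\qquad
\sum_{j}\lambda_{j}y_{rj}\ge y_{r0}\ \ \forall r,
\qquad
\lambda_{j}\ge 0.
```

Congestion is typically detected by re-solving the program with the input constraints imposed as equalities (weak input disposability) and comparing the two scores; a gap between them indicates that some inputs are congesting, i.e. reducing them would raise output.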
ANALYZING OFFICIAL AND OPERATIONAL CURRICULUM OF SOCIAL STUDIES TEACHER EDUCATION
The aim of this research is to analyze the Social Studies Teacher Education Program (SSTEP) and determine the problems encountered in practice according to the views of teacher educators and prospective teachers at Ankara University, Faculty of Educational Sciences. Two study groups of participants were included in this study. Ten prospective social studies teachers in their final year of study constituted the first study group. The second group consisted of teacher educators lecturing in the SSTEP. All participants voluntarily expressed their views on the official and operational curriculum. A focus group interview with prospective teachers and semi-structured interviews with teacher educators provided the data. Data collected from both study groups were analyzed using the descriptive analysis technique. Results showed that some changes were necessary concerning the sequence of some courses per semester and the theory-practice balance of the courses in the program. Furthermore, some courses should be removed from the program and some new courses should be added. As for teacher educators' and prospective teachers' views on the operational curriculum, problems were identified relating to physical conditions, teacher educators, learning-teaching processes, the evaluation of teacher educators, student profiles, teaching practices, and the lack of some units. The results are thought to be significant in terms of contributing to the development of teacher education programs.
KEYWORDS: social studies teaching, teacher education program, prospective teacher.
DOI: http://dx.doi.org/10.15181/atee.v1i0.66
Bank ownership and performance in the Middle East and North Africa region
Although both domestic and foreign private banks have gained ground in MENA in recent years, state banks continue to play an important role in many countries. Using a MENA bank-level panel dataset for the period 2001-08, the paper contributes to the empirical literature by documenting recent ownership trends and assessing the relationship between ownership and bank performance in MENA while accounting for key bank characteristics such as size and balance sheet composition. The paper analyzes headline performance indicators as well as their key drivers and finds that state banks exhibit significantly weaker performance, despite their larger size. This result is mainly driven by larger holdings of government securities, higher costs due to larger staffing numbers, and larger loan loss provisions reflecting weaker asset quality. The results reflect both operational inefficiencies and policy mandates. The paper also provides a detailed performance analysis of foreign and listed banks. Foreign banks are fairly new in MENA, yet perform on par with domestic banks despite their smaller size and higher investment costs. Listed banks exhibit superior performance driven by higher interest margins, even in the face of higher costs associated with listing. Taken together, the results do not reject a development role for state banks, but do show that their intervention comes at a cost. As such, there is scope to reduce the share of state banks in some countries and to clarify the mandates, improve the governance, and strengthen the operational efficiency of most state banks in MENA.
Keywords: Banks & Banking Reform; Access to Finance; Debt Markets; Corporate Law; Bankruptcy and Resolution of Financial Distress
Which causal structures might support a quantum-classical gap?
A causal scenario is a graph that describes the cause and effect
relationships between all relevant variables in an experiment. A scenario is
deemed "not interesting" if there is no device-independent way to distinguish
the predictions of classical physics from any generalised probabilistic theory
(including quantum mechanics). Conversely, an interesting scenario is one in
which there exists a gap between the predictions of different operational
probabilistic theories, as occurs for example in Bell-type experiments. Henson,
Lal and Pusey (HLP) recently proposed a sufficient condition for a causal
scenario to not be interesting. In this paper we supplement their analysis with
some new techniques and results. We first show that existing graphical
techniques due to Evans can be used to confirm by inspection that many graphs
are interesting without having to explicitly search for inequality violations.
For three exceptional cases -- the graphs numbered 15,16,20 in HLP -- we show
that there exist non-Shannon type entropic inequalities that imply these graphs
are interesting. In doing so, we find that existing methods of entropic
inequalities can be greatly enhanced by conditioning on the specific values of
certain variables.
Comment: 13 pages, 9 figures, 1 bicycle. Added an appendix showing that
e-separation is strictly more general than the skeleton method. Added journal
reference.
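As a standard illustration of such a quantum-classical gap (the familiar Bell causal structure, not one of the HLP exceptional graphs), any classical hidden-variable model of the bipartite Bell scenario imposes the factorisation

```latex
p(a,b\mid x,y)=\sum_{\lambda}p(\lambda)\,p(a\mid x,\lambda)\,p(b\mid y,\lambda),
```

which implies the CHSH inequality $\langle A_{0}B_{0}\rangle+\langle A_{0}B_{1}\rangle+\langle A_{1}B_{0}\rangle-\langle A_{1}B_{1}\rangle\le 2$, while quantum correlations compatible with the same causal graph reach $2\sqrt{2}$. This device-independent separation is exactly what makes a causal scenario "interesting" in the sense above.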