Modelling Share Price Behaviour Across Time
The Efficient Markets Hypothesis (EMH) is currently the dominant paradigm in finance. This paper reviews the theoretical development of the hypothesis and the empirical testing that has occurred to determine its validity. Empirical anomalies found by researchers in the Weak Form of the EMH are discussed and their theoretical interpretation critiqued. The paper also provides an overview of the Hamilton (1989) model and its extensions, one of many econometric models developed to capture non-linearity in time series such as stock prices.
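The core idea of the Hamilton (1989) model is that a series switches between latent regimes governed by a Markov chain, with different dynamics in each regime. As a minimal sketch (a simplified two-state switching-mean model, not the full autoregressive specification; all parameter values are illustrative):

```python
import numpy as np

def simulate_markov_switching(n=500, mu=(0.001, -0.002), sigma=(0.01, 0.03),
                              p00=0.95, p11=0.90, seed=0):
    """Simulate returns r_t = mu[s_t] + sigma[s_t] * e_t, where the regime
    s_t follows a two-state Markov chain with stay probabilities p00 and p11
    (a simplified form of Hamilton, 1989; parameters are illustrative)."""
    rng = np.random.default_rng(seed)
    states = np.empty(n, dtype=int)
    returns = np.empty(n)
    s = 0  # start in the calm, low-volatility regime
    for t in range(n):
        states[t] = s
        returns[t] = mu[s] + sigma[s] * rng.standard_normal()
        stay = p00 if s == 0 else p11
        if rng.random() > stay:
            s = 1 - s  # switch to the other regime
    return states, returns

states, returns = simulate_markov_switching()
```

Fitting such a model to real data (rather than simulating it) is typically done by maximum likelihood with the Hamilton filter, e.g. via `statsmodels`' Markov-switching classes.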
From ACT-ONE to Miranda, a Translation Experiment
It is now almost universally acknowledged that the data language ACT-ONE associated with the formal description technique LOTOS is inappropriate for the purpose of OSI formal description. In response, the LOTOS restandardisation activity plans to replace ACT-ONE with a functional language, so compatibility between ACT-ONE and the replacement data language becomes an issue. We therefore present an experimental investigation of backward compatibility between ACT-ONE and the new LOTOS data language. Specifically, we investigate translating ACT-ONE data types into the functional language Miranda. Miranda was chosen because it is a widely used functional programming language and is close in form to the anticipated new data language. This work serves as a "verification of concept" for translating ACT-ONE to the E-LOTOS data language. It identifies the bounds on embedding ACT-ONE in a functional data language; in particular, it indicates what can and cannot be translated. In addition, the paper reveals pertinent issues which can inform the E-LOTOS work: for example, which constructs are needed in E-LOTOS in order to support the class of data type specifications typically made in the LOTOS setting? We conclude with a number of specific recommendations for the E-LOTOS data language.
Macroscale and Nanoscale Photoelectrochemical Behavior of p-Type Si(111) Covered by a Single Layer of Graphene or Hexagonal Boron Nitride
Two-dimensional (2D) materials may enable a general approach to introducing a dipole at a semiconductor surface, as well as control over other properties of the double layer at a semiconductor/liquid interface. The 2D materials studied to date exhibit vastly different properties, due in part to the wide range of their density-of-states distributions. In this work, the open-circuit voltage (V_(oc)) of p-Si–H, p-Si/Gr (graphene), and p-Si/h-BN (hexagonal boron nitride) in contact with a series of one-electron outer-sphere redox couples was investigated by macroscale measurements as well as by scanning electrochemical cell microscopy (SECCM). The band gaps of Gr and h-BN (0–5.97 eV) bracket the wide range of band gaps for 2D materials, so these interfaces (p-Si/Gr and p-Si/h-BN) serve as useful references for understanding the behavior of 2D materials more generally. The value of V_(oc) shifted with respect to the effective potential of the contacting solution, with slopes (ΔV_(oc)/ΔE_(eff)) of ∼0.27 and ∼0.38 for p-Si/Gr and p-Si/h-BN, respectively, indicating that band bending at the p-Si/h-BN and p-Si/Gr interfaces responds at least partially to changes in the electrochemical potential of the contacting liquid electrolyte. Additionally, SECCM is shown to be an effective method to interrogate the nanoscale photoelectrochemical behavior of an interface, showing little spatial variance over scales exceeding the grain size of the CVD-grown 2D materials in this work. The measurements demonstrated that the polycrystalline nature of the 2D materials had little effect on the results and confirmed that the macroscale measurements reflected the junction behavior at the nanoscale.
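The reported slope ΔV_(oc)/ΔE_(eff) is just the least-squares slope of open-circuit voltage against the effective solution potential across the redox series. A minimal sketch of that extraction, with entirely hypothetical data (the values below are fabricated for illustration only and are constructed so the fitted slope lands near the partial response scale discussed above):

```python
import numpy as np

# Hypothetical effective solution potentials (V vs. a common reference)
# and measured open-circuit voltages (V); illustrative values only.
E_eff = np.array([-0.40, -0.20, 0.00, 0.20, 0.40])
V_oc  = 0.30 * E_eff + 0.25 + np.array([0.005, -0.003, 0.002, -0.004, 0.001])

# Linear fit: the slope is the junction's response to the solution potential.
slope, intercept = np.polyfit(E_eff, V_oc, 1)
print(f"dV_oc/dE_eff = {slope:.2f}")
# A slope near 1 would indicate an ideal junction that fully tracks the
# solution potential; a slope near 0 would indicate Fermi-level pinning.
# Intermediate slopes, as reported here, indicate a partial response.
```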
Termination Criteria for Computerized Classification Testing
Computerized classification testing (CCT) is an approach to designing tests with intelligent algorithms, similar to adaptive testing, but specifically designed for the purpose of classifying examinees into categories such as pass and fail. As in adaptive testing for point estimation of ability, a key component is the termination criterion, namely the algorithm that decides whether to classify the examinee and end the test or to continue and administer another item. This paper applies a newly suggested termination criterion, the generalized likelihood ratio (GLR), to CCT. It also explores the role of the indifference region in the specification of likelihood-ratio-based termination criteria, comparing the GLR to the sequential probability ratio test. Results from simulation studies suggest that the GLR is always at least as efficient as existing methods. Accessed 7,385 times on https://pareonline.net from February 24, 2011 to December 31, 2019.
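The contrast between the two criteria can be sketched concretely. The SPRT evaluates the likelihood of the response pattern at the two fixed edges of the indifference region around the cutscore, while the GLR instead maximizes the likelihood over each side of the region, so its statistic is never smaller in magnitude. A minimal sketch under a Rasch model (cutscore, indifference-region width, and error rates below are illustrative, not the paper's settings):

```python
import math

def rasch_loglik(theta, items, responses):
    """Log-likelihood of a response pattern under the Rasch model,
    P(correct) = 1 / (1 + exp(-(theta - b)))."""
    ll = 0.0
    for b, x in zip(items, responses):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        ll += math.log(p) if x == 1 else math.log(1.0 - p)
    return ll

def terminate(items, responses, cut=0.0, delta=0.3, alpha=0.05, beta=0.05,
              method="GLR", grid=None):
    """Likelihood-ratio termination decision for a CCT.

    SPRT: compare likelihoods at the fixed points cut - delta and
    cut + delta (the edges of the indifference region).
    GLR: maximize the likelihood over each side of the region instead.
    Returns 'pass', 'fail', or 'continue'.  Settings are illustrative.
    """
    if grid is None:
        grid = [cut + 0.05 * k for k in range(-80, 81)]  # theta grid
    lo, hi = cut - delta, cut + delta
    if method == "SPRT":
        top = rasch_loglik(hi, items, responses)
        bot = rasch_loglik(lo, items, responses)
    else:  # GLR
        top = max(rasch_loglik(t, items, responses) for t in grid if t >= hi)
        bot = max(rasch_loglik(t, items, responses) for t in grid if t <= lo)
    log_lr = top - bot
    upper = math.log((1.0 - beta) / alpha)   # decision bound: classify above cut
    lower = math.log(beta / (1.0 - alpha))   # decision bound: classify below cut
    if log_lr >= upper:
        return "pass"
    if log_lr <= lower:
        return "fail"
    return "continue"
```

For example, a long run of correct responses to items near the cutscore drives `log_lr` past the upper bound and terminates with "pass", while an ambiguous pattern keeps returning "continue" and triggers another item.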
A Practitioner's Guide for Variable-length Computerized Classification Testing
Variable-length computerized classification tests (CCTs; Lin & Spray, 2000; Thompson, 2006) are a powerful and efficient approach to testing for the purpose of classifying examinees into groups. CCTs are designed by specifying at least five technical components: psychometric model, calibrated item bank, starting point, item selection algorithm, and termination criterion. Several options exist for each of these components, creating a myriad of possible designs. Confusion among designs is exacerbated by the lack of a standardized nomenclature. This article outlines the components of a CCT, common options for each component, and the interaction of options for different components, so that practitioners may more efficiently design CCTs. It also offers a suggested nomenclature. Accessed 17,852 times on https://pareonline.net from January 30, 2007 to December 31, 2019.
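The five components enumerated above can be captured as a single design specification, which makes the design space (and the interactions between component choices) explicit. A hedged sketch; the component names follow the article, but the option strings and values are illustrative:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CCTDesign:
    """The five technical components of a variable-length CCT
    (component names per the article; option values are illustrative)."""
    psychometric_model: str      # e.g. "Rasch", "3PL"
    item_bank: List[float]       # calibrated item difficulties
    starting_point: float        # initial ability estimate
    item_selection: str          # e.g. "max-info-at-cut", "max-info-at-theta"
    termination_criterion: str   # e.g. "SPRT", "GLR", "confidence-interval"

design = CCTDesign(
    psychometric_model="Rasch",
    item_bank=[-1.0, -0.5, 0.0, 0.5, 1.0],
    starting_point=0.0,
    item_selection="max-info-at-cut",
    termination_criterion="GLR",
)
```

Writing the design down this way also surfaces the interactions the article discusses: for instance, selecting items for maximum information at the cutscore pairs naturally with a likelihood-ratio termination criterion, whereas selecting at the current ability estimate pairs with confidence-interval termination.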