    Firm size and the pre-holiday effect in New Zealand

    Using a sample spanning four decades, we document that the pre-holiday effect, one of the most common calendar-effect anomalies, still exists in the New Zealand market. Contrary to international evidence, the effect appears to have increased over time. Moreover, we find that the effect is inversely related to firm size: it is confined entirely to small firms, with no pre-holiday price patterns observed for medium or large firms. The existence of this pre-holiday effect appears to be driven mainly by factors specific to New Zealand. A search for possible explanations for its persistence points primarily towards the illiquidity of smaller stocks and the reluctance of small investors to buy prior to major market closures.
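    A minimal Python sketch of the kind of calendar-anomaly test the abstract describes: compare mean returns on the last trading day before each market holiday against all other days. The data layout and helper name are hypothetical; this is not the paper's exact methodology.

        # Compare mean returns on pre-holiday trading days with all other days.
        import pandas as pd

        def pre_holiday_effect(returns: pd.Series, holidays: pd.DatetimeIndex):
            """returns: daily returns indexed by trading date (hypothetical input)."""
            dates = returns.index
            flag = pd.Series(False, index=dates)
            for h in holidays:
                prior = dates[dates < h]      # trading days before the holiday
                if len(prior):
                    flag[prior[-1]] = True    # last trading day before it
            # Mean, volatility, and count for pre-holiday (True) vs other (False) days
            return returns.groupby(flag).agg(['mean', 'std', 'count'])

        # Usage with hypothetical data:
        # rets = prices['close'].pct_change().dropna()
        # print(pre_holiday_effect(rets, pd.DatetimeIndex(['2024-12-25', '2025-01-01'])))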

    GDP, share prices and share returns: Australian and New Zealand evidence

    With the aim of predicting share market returns, many empirical studies have examined how financial and macroeconomic variables can be used to forecast return variability. The aim of this paper is to examine whether the ratio of aggregate share price to GDP can capture the variation of future returns on the aggregate share market in Australia and New Zealand. Using quarterly and semi-annual data for the period 1991-2003 for New Zealand and 1982-2006 for Australia, this study finds that the ratio of share price to GDP indeed captures a significant amount of the variation of returns on both the New Zealand and Australian share markets; however, results for Australian data vary depending on the sample period. The results generally support the theory behind previous papers, specifically that of Rangvid (2006).
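    The test described is a predictive regression of next-period market returns on the lagged price-to-GDP ratio, in the spirit of Rangvid (2006). A minimal sketch with statsmodels; the column names are hypothetical:

        # Regress next-period aggregate returns on the lagged share-price/GDP ratio.
        import pandas as pd
        import statsmodels.api as sm

        def predictive_regression(df: pd.DataFrame):
            """df: quarterly data with hypothetical columns
            'share_price_index', 'gdp', 'market_return'."""
            ratio = (df['share_price_index'] / df['gdp']).rename('price_to_gdp_lag')
            X = sm.add_constant(ratio.shift(1))      # lag the predictor one period
            data = pd.concat([X, df['market_return']], axis=1).dropna()
            return sm.OLS(data['market_return'],
                          data[['const', 'price_to_gdp_lag']]).fit()

        # model = predictive_regression(quarterly_df)   # hypothetical DataFrame
        # model.rsquared measures how much return variation the ratio captures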

    All-sky search for long-duration gravitational wave transients with initial LIGO

    We present the results of a search for long-duration gravitational wave transients in two sets of data collected by the LIGO Hanford and LIGO Livingston detectors between November 5, 2005 and September 30, 2007, and July 7, 2009 and October 20, 2010, with total observational times of 283.0 days and 132.9 days, respectively. The search targets gravitational wave transients of duration 10-500 s in a frequency band of 40-1000 Hz, with minimal assumptions about the signal waveform, polarization, source direction, or time of occurrence. All candidate triggers were consistent with the expected background; as a result we set 90% confidence upper limits on the rate of long-duration gravitational wave transients for different types of gravitational wave signals. For signals from black hole accretion disk instabilities, we set upper limits on the source rate density between 3.4×10⁻⁵ and 9.4×10⁻⁴ Mpc⁻³ yr⁻¹ at 90% confidence. These are the first results from an all-sky search for unmodeled long-duration transient gravitational waves. © 2016 American Physical Society.
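    For intuition on the limit-setting: when no candidates survive above background, a Poisson counting argument gives a 90% confidence upper limit on the event rate of -ln(0.1)/T for observation time T. The sketch below is a generic illustration only, not the collaboration's actual pipeline, which folds in detection efficiency per waveform family:

        # Generic zero-event Poisson upper limit on a rate (illustration only).
        import math

        def rate_upper_limit(T_days: float, confidence: float = 0.90) -> float:
            """Upper limit (events/day) when zero events are observed in T_days."""
            return -math.log(1.0 - confidence) / T_days

        # Combined observation time of the two runs quoted above:
        print(rate_upper_limit(283.0 + 132.9))   # ~0.0055 events/day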

    Bizygomatic breadth determination in damaged skulls

    Metric and discriminant function analyses of the skull have been used successfully to determine ancestry and sex from human skeletal remains in both forensic and archaeological contexts. However, skulls are frequently discovered in damaged condition. One structure that is commonly fragmented, even when the rest of the skull is preserved, is the zygomatic arch. The bizygomatic width is an important measurement in craniometry and in forensic facial reconstruction for determining facial width; we therefore propose a simple linear regression model to predict the bizygomatic width of skulls with damaged zygomatic arches. Thirty-one adult skulls originating from the Indian sub-continent were used to measure the bizygomatic width. Then, on the same skulls, a straight steel wire was placed at the superior surface of the temporal and zygomatic origins of the zygomatic arch to simulate the reconstruction of the zygomatic arch on damaged skulls. These wire measurements were used to fit a simple linear regression model between the bizygomatic widths and the wire measurements. The estimated regression model, Bizygomatic Width (bone) = 0.61 + 1.02 × (wire measurement), has a very high R² value of 0.91. Hence, this model could effectively be used to predict bizygomatic widths from wire measurements. In addition, bizygomatic widths and wire measurements were collected from 14 New Zealand European skulls to test the model's ability to determine bizygomatic widths in other ethnic groups. The model accurately predicted the bizygomatic widths of the New Zealand skulls of European origin, suggesting that the regression model could be used for other ethnic groups. The importance of the bizygomatic width for craniometric analysis makes this regression model particularly useful for analysing archaeological samples. Furthermore, this regression line can be used in forensic facial reconstruction to reconstruct damaged zygomatic arches prior to facial reconstruction.
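    The reported equation can be applied directly. A minimal Python sketch; the function name is ours and the units are assumed to be millimetres:

        # Predict bizygomatic width from a wire measurement across a damaged
        # zygomatic arch, using the coefficients reported in the abstract.
        def predict_bizygomatic_width(wire_mm: float) -> float:
            return 0.61 + 1.02 * wire_mm

        print(predict_bizygomatic_width(125.0))   # 128.11 (mm, assumed)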

    DEA as a tool for predicting corporate failure and success: A case of bankruptcy assessment

    Using an additive super-efficiency data envelopment analysis (DEA) model, this paper develops a new assessment index based on two frontiers for predicting corporate failure and success. The proposed approach is applied to a random sample of 1001 firms, composed of 50 large US bankrupt firms randomly selected from Altman's bankruptcy database and 901 healthy matching firms. This sample represents the largest firms that went bankrupt over the period 1991-2004 and covers a full spectrum of industries. Our findings demonstrate that the DEA model is relatively weak in predicting corporate failures compared with healthy-firm predictions, and that the assessment index addresses this weakness by giving the decision maker various options to achieve different precision levels for bankrupt, non-bankrupt, and total predictions.
    Keywords: Data envelopment analysis (DEA); bankruptcy; corporate failure; corporate success
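    The index builds on the additive DEA model, which scores a firm by the total input and output slack separating it from an efficient frontier. Below is a minimal sketch of the basic additive (VRS) model as a linear program in SciPy; the paper's super-efficiency, two-frontier construction is not reproduced here:

        # Basic additive DEA model (VRS) solved with scipy.optimize.linprog.
        import numpy as np
        from scipy.optimize import linprog

        def additive_dea_slack(X: np.ndarray, Y: np.ndarray, o: int) -> float:
            """X: inputs (m x n firms); Y: outputs (s x n firms); o: firm index.
            Returns the total optimal slack; 0 means firm o lies on the frontier."""
            m, n = X.shape
            s = Y.shape[0]
            # Variables: [lambda (n), input slacks (m), output slacks (s)]
            c = np.concatenate([np.zeros(n), -np.ones(m + s)])   # maximize slacks
            A_eq = np.block([
                [X, np.eye(m), np.zeros((m, s))],        # sum(lam*x) + s_in  = x_o
                [Y, np.zeros((s, m)), -np.eye(s)],       # sum(lam*y) - s_out = y_o
                [np.ones((1, n)), np.zeros((1, m + s))]  # VRS: sum(lambda) = 1
            ])
            b_eq = np.concatenate([X[:, o], Y[:, o], [1.0]])
            res = linprog(c, A_eq=A_eq, b_eq=b_eq, method='highs')
            return -res.fun   # all variables are nonnegative by default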

    DEA as a tool for bankruptcy assessment: A comparative study with logistic regression technique

    This paper proposes data envelopment analysis (DEA) as a quick-and-easy tool for assessing corporate bankruptcy. DEA is a non-parametric method that measures weight estimates (not parameter estimates) of a classification function for separating default and non-default firms. Using a recent sample of large corporate failures in the United States, we examine the capability of DEA in assessing corporate bankruptcy by comparing it with logistic regression (LR). We find that DEA outperforms LR in evaluating bankruptcy out-of-sample. This feature of DEA is appealing and has practical relevance for investors. Another advantage of DEA over LR is that it does not carry the assumptions associated with statistical and econometric methods. Furthermore, DEA does not need the large sample size for bankruptcy evaluation usually required by such statistical and econometric approaches. The need for a large sample is a significant disadvantage for practitioners when investment decisions are made using small samples; DEA bypasses this sample-size difficulty. Thus, DEA is a practically appealing method for bankruptcy assessment.
    Keywords: Bankruptcy; data envelopment analysis; logit regression
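    The logistic regression benchmark side of the comparison is straightforward to sketch with scikit-learn; the feature layout is hypothetical, and the DEA side can follow the additive model shown above:

        # Out-of-sample logistic regression benchmark for bankruptcy prediction.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        def lr_benchmark(ratios: np.ndarray, bankrupt: np.ndarray) -> float:
            """ratios: (n_firms, n_features) financial ratios; bankrupt: 0/1 labels."""
            X_tr, X_te, y_tr, y_te = train_test_split(
                ratios, bankrupt, test_size=0.3, stratify=bankrupt, random_state=0)
            clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
            return accuracy_score(y_te, clf.predict(X_te))   # out-of-sample accuracy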