
    Linear Codes from Some 2-Designs

    A classical method of constructing a linear code over GF(q) from a t-design is to use the incidence matrix of the t-design as a generator matrix over GF(q) of the code. This approach has been extensively investigated in the literature. In this paper, a different method of constructing linear codes using specific classes of 2-designs is studied, and linear codes with a few weights are obtained from almost difference sets, difference sets, and a type of 2-design associated with semibent functions. Two families of the codes obtained in this paper are optimal. The linear codes presented in this paper have applications in secret sharing and authentication schemes, in addition to their applications in consumer electronics, communication and data storage systems. A coding-theory approach to the characterisation of highly nonlinear Boolean functions is also presented.
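
    As a hedged illustration of the classical construction mentioned above (not of the paper's new method), the sketch below takes the incidence matrix of the Fano plane, the 2-(7,3,1) design, as a generator matrix over GF(2) and enumerates the resulting code; its row space is the [7,4,3] Hamming code.

from collections import Counter
from itertools import product

# The seven lines of the Fano plane on points 1..7
lines = [{1, 2, 3}, {1, 4, 5}, {1, 6, 7}, {2, 4, 6}, {2, 5, 7}, {3, 4, 7}, {3, 5, 6}]

# Incidence-matrix rows as integer bitmasks: bit p-1 is set iff point p lies on the line
rows = [sum(1 << (p - 1) for p in line) for line in lines]

# Gaussian elimination over GF(2): keep one pivot row per leading bit
pivots = {}
for r in rows:
    while r:
        lead = r.bit_length() - 1
        if lead not in pivots:
            pivots[lead] = r
            break
        r ^= pivots[lead]

# Enumerate the row space: all GF(2) linear combinations of the pivot rows
basis = list(pivots.values())
codewords = []
for coeffs in product((0, 1), repeat=len(basis)):
    w = 0
    for c, b in zip(coeffs, basis):
        if c:
            w ^= b
    codewords.append(w)

print("length n = 7, dimension k =", len(basis))   # expect k = 4
print("weights:", sorted(Counter(bin(w).count("1") for w in codewords).items()))
# expected: [(0, 1), (3, 7), (4, 7), (7, 1)] -- the [7,4,3] Hamming code

    The minimum-weight codewords are exactly the seven lines of the design, which is the feature that links incidence-matrix codes to the combinatorics of the underlying design.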


    Statistical mechanics and dynamics of solvable models with long-range interactions

    The two-body potential of systems with long-range interactions decays at large distances as $V(r)\sim 1/r^{\alpha}$, with $\alpha\leq d$, where $d$ is the space dimension. Examples are: gravitational systems, two-dimensional hydrodynamics, two-dimensional elasticity, charged and dipolar systems. Although such systems can be made extensive, they are intrinsically non-additive. Moreover, the space of accessible macroscopic thermodynamic parameters might be non-convex. The violation of these two basic properties is at the origin of ensemble inequivalence, which implies that the specific heat can be negative in the microcanonical ensemble and that temperature jumps can appear at microcanonical first-order phase transitions. The lack of convexity implies that ergodicity may be generically broken. We present here a comprehensive review of the recent advances on the statistical mechanics and out-of-equilibrium dynamics of systems with long-range interactions. The core of the review consists of the detailed presentation of the concept of ensemble inequivalence, as exemplified by the exact solution, in the microcanonical and canonical ensembles, of mean-field type models. Relaxation towards thermodynamic equilibrium can be extremely slow, and quasi-stationary states may be present. The understanding of such unusual relaxation processes is obtained by the introduction of an appropriate kinetic theory based on the Vlasov equation. Comment: 118 pages, review paper, added references, slight change of content
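
    A standard one-line estimate (not specific to this review) makes the $\alpha\leq d$ criterion concrete: the interaction energy of a single particle with a uniform background of density $\rho$, inside a sphere of radius $R$ with a short-distance cutoff $\delta$, is

$$
\varepsilon \;\propto\; \rho \int_{\delta}^{R} \frac{r^{d-1}}{r^{\alpha}}\, dr
\;=\; \rho\, \frac{R^{\,d-\alpha} - \delta^{\,d-\alpha}}{d-\alpha} \qquad (\alpha \neq d),
$$

    which diverges as $R^{d-\alpha}$ for $\alpha<d$ (and as $\ln(R/\delta)$ for $\alpha=d$): the energy per particle grows with system size, so subsystem energies do not add up, and extensivity can only be restored by hand, e.g. by Kac rescaling of the coupling.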

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units located in Portugal is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for improving efficiency are made for each hotel studied.
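
    To make the methodology concrete, here is a minimal, hedged sketch of the workhorse normal/half-normal stochastic frontier (the Aigner-Lovell-Schmidt log-likelihood) estimated on toy data; it is not the paper's specification, and all variable names and values are illustrative.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data: log output = b0 + b1*log(input) + v - u, where u >= 0 is inefficiency
n = 200
x = rng.normal(size=n)
u = np.abs(rng.normal(scale=0.3, size=n))   # half-normal inefficiency term
v = rng.normal(scale=0.2, size=n)           # symmetric measurement noise
y = 1.0 + 0.7 * x + v - u

def negloglik(theta):
    b0, b1, log_sv, log_su = theta
    sv, su = np.exp(log_sv), np.exp(log_su)
    sigma = np.hypot(sv, su)                # sigma^2 = sv^2 + su^2
    lam = su / sv
    eps = y - b0 - b1 * x                   # composed residual v - u
    # Normal/half-normal log-likelihood, observation by observation
    ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(negloglik, x0=[0.0, 0.0, np.log(0.1), np.log(0.1)],
               method="Nelder-Mead")
print(res.x[:2])   # frontier coefficients; np.exp(res.x[2:]) recovers sigma_v, sigma_u

    The skew-correcting factor Phi(-eps*lambda/sigma) is what lets the estimation separate one-sided inefficiency from symmetric noise, which is exactly the discrimination the abstract refers to.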

    Word Knowledge and Word Usage

    Word storage and processing define a multi-factorial domain of scientific inquiry whose thorough investigation goes well beyond the boundaries of traditional disciplinary taxonomies and requires the synergistic integration of a wide range of methods, techniques and empirical and experimental findings. The present book approaches a few central issues concerning the organization, structure and functioning of the Mental Lexicon by asking domain experts to look at common, central topics from complementary standpoints and to discuss the advantages of developing converging perspectives. The book explores the connections between computational and algorithmic models of the mental lexicon, word frequency distributions and information-theoretic measures of word families, statistical correlations across psycholinguistic and cognitive evidence, principles of machine learning, and integrative brain models of word storage and processing. The main goal of the book is to map out the landscape of future research in this area, to foster the development of interdisciplinary curricula, and to help single-domain specialists understand and address issues and questions as they are raised in other disciplines.
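
    As one concrete, purely illustrative example of an information-theoretic measure of a word family, the snippet below computes the Shannon entropy of a toy inflectional paradigm from hypothetical corpus counts; the counts are made up for illustration.

import math

# Hypothetical corpus counts for the forms of the lemma "walk"
counts = {"walk": 820, "walks": 310, "walked": 540, "walking": 330}
total = sum(counts.values())
probs = [c / total for c in counts.values()]
entropy = -sum(p * math.log2(p) for p in probs)
print(f"paradigm entropy: {entropy:.3f} bits")   # higher = flatter form distribution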

    Improved micro-contact resistance model that considers material deformation, electron transport and thin film characteristics

    This paper reports on an improved analytic model for predicting the micro-contact resistance needed for designing micro-electro-mechanical systems (MEMS) switches. The original model had two primary considerations: 1) contact material deformation (i.e. elastic, plastic, or elastic-plastic) and 2) effective contact area radius. The model also assumed that individual a-spots were close together and that their interactions were dependent on each other, which led to using the single effective a-spot contact area model. This single effective area model was used to determine specific electron transport regions (i.e. ballistic, quasi-ballistic, or diffusive) by comparing the effective radius with the mean free path of an electron. Using this model required that micro-switch contact materials be deposited, during device fabrication, with processes ensuring low surface roughness values (i.e. sputtered films). Sputtered thin film electric contacts, however, do not behave like bulk materials, and the effects of thin film contacts and spreading resistance must be considered. The improved micro-contact resistance model accounts for the two primary considerations above, as well as the behaviour of thin film, sputtered, electric contacts.
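
    The regime selection described above can be sketched as follows, using the textbook Maxwell, Sharvin and Wexler expressions rather than the paper's full model; the material constants and the simplified Wexler interpolation (gamma(K) taken as 1) are assumptions for illustration only.

import math

def contact_resistance(radius_m, resistivity_ohm_m, mfp_m):
    """Return (regime, resistance in ohms) for one circular effective a-spot."""
    K = mfp_m / radius_m                               # Knudsen number
    r_maxwell = resistivity_ohm_m / (2.0 * radius_m)   # diffusive spreading resistance
    r_sharvin = (4.0 * resistivity_ohm_m * mfp_m
                 / (3.0 * math.pi * radius_m ** 2))    # ballistic (Sharvin) resistance
    if K < 0.1:
        return "diffusive (Maxwell)", r_maxwell
    if K > 10.0:
        return "ballistic (Sharvin)", r_sharvin
    # Quasi-ballistic: Wexler interpolation, approximating gamma(K) ~ 1
    # (the exact gamma varies slowly between roughly 0.7 and 1)
    return "quasi-ballistic (Wexler)", r_sharvin + r_maxwell

# Example: 100 nm gold a-spot (rho ~ 2.2e-8 ohm*m, electron mean free path ~ 38 nm)
print(contact_resistance(100e-9, 2.2e-8, 38e-9))

    Comparing the spot radius with the electron mean free path is precisely the test the abstract describes for choosing among the ballistic, quasi-ballistic and diffusive transport regions.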

    Estimating Dependences and Risk between Gold Prices and S&P500: New Evidences from ARCH, GARCH, Copula and ES-VaR models

    This thesis examines the correlations and linkages between stock and commodity markets in order to quantify, using the Value at Risk measure, the risk that financial markets pose to investors. The risk assessed is that of losses on investments in a stock index (S&P500) and a commodity (gold prices). The thesis is structured around three empirical chapters, and its focus is motivated by a central risk factor: the constant fluctuation of commodity and stock prices. The thesis starts by measuring volatility, then dependence (correlation), and finally expected shortfall and Value at Risk (VaR). The first empirical chapter assesses volatility measures such as ARCH and GARCH alongside basic VaR calculations, with correlation measured using the copula method. Since volatility methods can only measure one security at a time, the second empirical chapter measures the interdependence of the stock index and the commodity (S&P500 and Gold Price Index), investigating the risk transmission involved in investing in either of them and whether the ups and downs in the price of one affect the price of the other, using the time-varying copula method. Lastly, the third empirical chapter investigates expected shortfall and Value at Risk between the S&P500 and the Gold Price Index using the ES-VaR method proposed by Patton, Ziegel and Chen (2018). Volatility is the most popular and traditional measure of risk, and the first empirical chapter therefore uses ARCH and GARCH models. The problem with volatility, however, is that it does not take into account the direction of an investment's movement: stocks are volatile when they suddenly jump higher, yet investors are not distressed by gains. For investors, risk is about the odds of losing money, and VaR is built on that common-sense fact. Because investors care about the odds of large losses, VaR answers the questions: what is my worst-case scenario, and how much could I lose in a really bad month? The results of the thesis demonstrate that measuring volatility alone (ARCH, GARCH) is not sufficient for capturing the risk involved in an investment, and that methods based on correlation and VaR deliver better results. To measure interdependence, the time-varying copula is used, since the dynamic structure of the dependence between the data can be modelled by allowing either the copula function or the dependence parameter to be time varying. Lastly, a hybrid model estimates the average return on a risky asset using Expected Shortfall (ES) jointly with quantile dependence and VaR. The Basel III Accord, phased in through 2019, emphasises ES rather than VaR, yet there is little existing work on modelling ES. The thesis builds on the model of Patton, Ziegel and Chen (2018), which is grounded in statistical decision theory and overcomes the non-elicitability of ES by modelling ES and VaR jointly in a new dynamic risk-measure model.

    This research contributes to knowledge by showing that volatility alone is not enough to measure risk: interdependence measures capture the dependency of one variable on the other, and the estimation and inference methods of Patton, Ziegel and Chen (2018), applied through the simulations proposed for the ES-VaR model, further show that ARCH, GARCH and other rolling-window models are not sufficient for producing risk forecasts. The first empirical chapter documents the volatility of gold prices and the S&P500. The second empirical chapter finds that the conditional dependence of the two indexes is strongly time varying, with high correlation before 2008; the results also display a slightly stronger bivariate upper tail, signifying that the conditional dependence of the indexes is influenced by positive shocks. The last empirical chapter finds that forecasts from the ES-VaR model of Patton, Ziegel and Chen (2018) outperform forecasts based on univariate GARCH models. Investors want to protect themselves from large losses, and the ES-VaR model discussed in the last chapter would certainly help them manage their funds properly.
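
    For concreteness, here is a minimal sketch (not the thesis's estimated model) of the first chapter's ingredients: simulate a GARCH(1,1) return series and compute the one-day 99% VaR and ES under conditional normality; all parameter values are illustrative assumptions.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
omega, alpha, beta = 1e-6, 0.08, 0.90        # illustrative GARCH(1,1) parameters
n = 1000
sigma2 = np.empty(n)
r = np.empty(n)
sigma2[0] = omega / (1.0 - alpha - beta)     # unconditional variance as the start
for t in range(n):
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    if t + 1 < n:
        sigma2[t + 1] = omega + alpha * r[t] ** 2 + beta * sigma2[t]

# One-step-ahead conditional volatility, then 99% left-tail risk measures
sigma_next = np.sqrt(omega + alpha * r[-1] ** 2 + beta * sigma2[-1])
z = norm.ppf(0.01)                           # 1% quantile of the standard normal
var_99 = -sigma_next * z                     # Value at Risk (loss stated as positive)
es_99 = sigma_next * norm.pdf(z) / 0.01      # Expected Shortfall under normality
print(f"VaR 99%: {var_99:.4%}  ES 99%: {es_99:.4%}")

    ES answers "how bad is the loss, on average, once VaR is breached?", which is why Basel III and the joint ES-VaR modelling of Patton, Ziegel and Chen (2018) treat the two measures as a pair rather than in isolation.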