
    Probability-Based Memory Access Controller (PMAC) for Energy Reduction in High Performance Processors

    The increasing transistor density due to Moore's law scaling continues to drive the improvement in processor core performance with each process generation. The additional transistors are used to widen the pipeline and increase the size of the out-of-order instruction scheduling window, register files, queues and other pipeline data structures, in order to extract high levels of instruction-level parallelism and improve single-threaded performance. Such dynamically scheduled superscalar processor cores speculatively fetch and execute instructions far ahead in a program, along the program path predicted by their branch predictors. On a branch misprediction, the architectural state of a high performance processor core can be restored at the cost of a high latency penalty, but the speculative memory requests issued by data memory access instructions on the mispredicted path cannot be revoked. Such memory requests alter the data arrangement across the memory hierarchy and result in wasted memory transactions, bandwidth and energy consumption. Even with low branch misprediction rates, these processor cores spend significant time on mispredicted program paths. In this thesis, we propose a probability-based memory access controller to curb the data memory requests sent along mispredicted paths and achieve energy and memory bandwidth savings with minimal impact on performance. It computes the path probability of instructions and throttles memory access instructions with a low probability of execution. A static or dynamically varying probability value is used as a threshold to control speculative memory requests sent to the memory hierarchy. The proposed design with a dynamic threshold reduces wrong-path memory accesses by up to 51% and wrong-path execution by up to 31%, while achieving power savings of up to 9.5% and up to a 6.3% improvement in IPC/Watt in a single-core processor system.
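    As a rough illustration of the throttling idea described above (a minimal sketch; the confidence-product heuristic, data structures and names are assumptions for illustration, not the thesis design), a load's path probability can be approximated from the prediction confidences of the unresolved branches ahead of it, and the speculative request dropped when that probability falls below the threshold:

```python
# Hypothetical sketch of probability-based throttling of speculative loads.
# The path-probability heuristic (product of branch confidences) and the
# threshold policy are illustrative assumptions, not the thesis's design.

def path_probability(branch_confidences):
    """Approximate probability that execution actually reaches this load:
    the product of the prediction confidences of all unresolved branches
    preceding it on the speculative path."""
    p = 1.0
    for confidence in branch_confidences:
        p *= confidence
    return p

def should_send_to_memory(load, threshold):
    """Issue the speculative memory request only if the load is likely
    to lie on the correct path."""
    return path_probability(load["branch_confidences"]) >= threshold

# Example: a load behind two moderately confident branches is throttled.
load = {"addr": 0x7F00, "branch_confidences": [0.9, 0.6]}
print(should_send_to_memory(load, threshold=0.7))  # 0.54 < 0.7 -> False
```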

    Yardstick Damages in Lost Profit Cases: An Econometric Approach


    Critical Market Crashes

    This review is a partial synthesis of the book ``Why Stock Markets Crash'' (Princeton University Press, January 2003), which presents a general theory of financial crashes and of stock market instabilities that the author and his co-workers have developed over the past seven years. The study of the frequency distribution of drawdowns, or runs of successive losses, shows that large financial crashes are ``outliers'': they form a class of their own, as can be seen from their statistical signatures. If large financial crashes are ``outliers'', they are special and thus require a special explanation, a specific model, a theory of their own. In addition, their special properties may perhaps be used for their prediction. The main mechanisms leading to positive feedbacks, i.e., self-reinforcement, such as imitative behavior and herding between investors, are reviewed, with many references provided to the relevant literature outside the confines of physics. Positive feedbacks provide the fuel for the development of speculative bubbles, preparing the instability for a major crash. We present several detailed mathematical models of speculative bubbles and crashes. The most important message is the discovery of robust and universal signatures of the approach to crashes. These precursory patterns have been documented for essentially all crashes on developed as well as emergent stock markets, on currency markets, on company stocks, and so on. The concept of an ``anti-bubble'' is also summarized, with two forward predictions: one on the Japanese stock market starting in 1999 and one on the USA stock market that is still running. We conclude by presenting our view of the organization of financial markets.
    Comment: LaTeX, 89 pages and 38 figures; in press in Physics Reports
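    For context, the precursory pattern referred to here is usually expressed as a log-periodic power-law signature; a standard form from Sornette's work (not quoted from this review, with A, B, C, m, omega and phi as fitted parameters) is:

```latex
% Log-periodic power-law (LPPL) signature of the approach to a critical time t_c
% (standard form from Sornette's work; A, B, C, m, \omega, \phi are fitted parameters)
\log p(t) \approx A + B\,(t_c - t)^{m}\left[1 + C\cos\!\big(\omega\ln(t_c - t) + \phi\big)\right]
```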

    Optimization of mechanical properties of multiscale hybrid polymer nanocomposites: A combination of experimental and machine learning techniques

    Machine learning (ML) models provide fast and accurate predictions of material properties at a low computational cost. Herein, the mechanical properties of multiscale poly(3-hydroxybutyrate) (P3HB)-based nanocomposites reinforced with different concentrations of multiwalled carbon nanotubes (MWCNTs), WS2 nanosheets and sepiolite (SEP) nanoclay have been predicted. The nanocomposites were prepared via solution casting. SEM images revealed that the three nanofillers were homogeneously and randomly dispersed in the matrix. A synergistic reinforcement effect was attained, resulting in an unprecedented stiffness improvement of 132% upon addition of 1:2:2 wt% SEP:MWCNTs:WS2. Conversely, the increments in strength were only moderate (up to 13.4%). A beneficial effect on the matrix ductility was also found due to the presence of the nanofillers. Four ML approaches, Recurrent Neural Network (RNN), RNN with Levenberg's algorithm (RNN-LV), decision tree (DT) and Random Forest (RF), were applied. The coefficient of determination (R2), mean absolute error (MAE) and mean square error (MSE) were used as statistical indicators to compare their performance. The best-performing model for the Young's modulus was RNN-LV with 3 hidden layers and 50 neurons in each layer, while for the tensile strength it was the RF model using a combination of 100 estimators and a maximum depth of 100. An RNN model with 3 hidden layers was the most suitable to predict the elongation at break and the impact strength, with 90 and 50 neurons in each layer, respectively. The highest correlation (R2 of 1 and 0.9203 for the training and test sets, respectively) and the smallest errors (MSE of 0.13 and MAE of 0.31) were obtained for the prediction of the elongation at break. The developed models represent a powerful tool for the optimization of the mechanical properties of multiscale hybrid polymer nanocomposites, saving time and resources in the experimental characterization process.
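    As a rough sketch of how one of these models could be set up and scored (illustrative only: the synthetic data, feature choice and code below are assumptions, not the study's pipeline; only the RF hyperparameters and the R2/MAE/MSE indicators come from the abstract):

```python
# Illustrative sketch (not the study's code): fit a Random Forest on
# hypothetical nanofiller loadings and score it with R2, MAE and MSE,
# mirroring the indicators reported in the abstract. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical features: wt% of SEP, MWCNTs and WS2 in the P3HB matrix.
X = rng.uniform(0.0, 2.0, size=(200, 3))
# Hypothetical target: tensile strength (MPa), a synthetic trend plus noise.
y = 25 + 3 * X[:, 0] + 4 * X[:, 1] + 2 * X[:, 2] + rng.normal(0, 0.5, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hyperparameters reported as best for tensile strength in the abstract:
# 100 estimators and a maximum depth of 100.
model = RandomForestRegressor(n_estimators=100, max_depth=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R2 :", r2_score(y_test, pred))
print("MAE:", mean_absolute_error(y_test, pred))
print("MSE:", mean_squared_error(y_test, pred))
```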

    Behavioural finance perspectives on Malaysian stock market efficiency

    This paper provides historical, theoretical, and empirical syntheses for understanding the rationality of investors, stock prices, and stock market efficiency behaviour through the theoretical lens of the behavioural finance paradigm. The inquiry is guided by multidisciplinary behaviour-related theories. The analyses employed a long span of Bursa Malaysia stock market data, from 1977 to 2014, covering different phases of economic development and market states. The tests confirmed the presence of asymmetric dynamic behaviour in price predictability as well as in risk-return relationships across different market states, risk states and quantile data segments. The efficiency tests show an adaptive pattern of weak-form market efficiency across various economic phases and market states. Collectively, this evidence lends support to bounded-adaptive rationality in investors' behaviour, dynamic stock price behaviour and, accordingly, a bounded-adaptive form of market efficiency.

    A Branch-Directed Data Cache Prefetching Technique for Inorder Processors

    The increasing gap between processor and main memory speeds has become a serious bottleneck to further improvement in system performance. Data prefetching techniques have been proposed to hide the performance impact of such long memory latencies. However, most currently proposed data prefetchers predict future memory accesses based on current memory misses, which limits the opportunity that can be exploited to guide prefetching. In this thesis, we propose a branch-directed data prefetcher that uses the high prediction accuracy of current-generation branch predictors to predict a future basic-block trace that the program will execute, and issues prefetches for all the identified memory instructions contained therein. We also propose a novel technique to generate prefetch addresses by exploiting the correlation between the addresses generated by memory instructions and the values of the corresponding source registers at prior branch instances. We evaluate the impact of our prefetcher using a cycle-accurate simulation of an in-order processor on the M5 simulator. The results of the evaluation show that the branch-directed prefetcher improves performance on a set of 18 SPEC CPU2006 benchmarks by an average of 38.789% over a no-prefetching implementation and by 2.148% over a system that employs a Spatial Memory Streaming prefetcher.
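    A minimal sketch of the address-correlation idea described above (the class, table and example values are assumptions for illustration, not the thesis's actual design): learn, per static load, the offset between a source-register value observed at a prior branch instance and the address the load later produced, then reapply that offset at the next instance of the branch to form a prefetch address.

```python
# Hypothetical sketch of branch-directed prefetch address generation.
# For each static load in the predicted basic-block trace, remember the offset
# between the value its source register held at a prior branch instance and
# the address the load eventually accessed; reapply that offset to the current
# register value to generate a prefetch address at the next branch instance.

class BranchDirectedPrefetcher:
    def __init__(self):
        self.offsets = {}  # load PC -> learned (address - register value) offset

    def train(self, load_pc, reg_value_at_branch, actual_address):
        """Record the correlation observed once the load's address resolves."""
        self.offsets[load_pc] = actual_address - reg_value_at_branch

    def predict(self, load_pc, reg_value_at_branch):
        """Return a prefetch address for a load on the predicted path,
        or None if no correlation has been learned for it yet."""
        if load_pc not in self.offsets:
            return None
        return reg_value_at_branch + self.offsets[load_pc]

# Usage: after training on one branch instance, a later instance with a new
# register value yields a correspondingly shifted prefetch address.
pf = BranchDirectedPrefetcher()
pf.train(load_pc=0x400A10, reg_value_at_branch=0x1000, actual_address=0x1040)
print(hex(pf.predict(0x400A10, reg_value_at_branch=0x2000)))  # 0x2040
```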

    Predictability of biomass burning in response to climate changes

    Climate is an important control on biomass burning, but the sensitivity of fire to changes in temperature and moisture balance has not been quantified. We analyze sedimentary charcoal records to show that the changes in fire regime over the past 21,000 yrs are predictable from changes in regional climates. Analyses of paleo-fire data show that fire increases monotonically with temperature and peaks at intermediate moisture levels, and that temperature is quantitatively the most important driver of changes in biomass burning over the past 21,000 yrs. Given that a similar relationship between climate drivers and fire emerges from analyses of the interannual variability in biomass burning shown by remote-sensing observations of month-by-month burnt area between 1996 and 2008, our results signal a serious cause for concern in the face of continuing global warming.

    The development and validation of the leadership versatility index for students (LVI-S)

    Directed by Dr. James M. Benshoff and Dr. Craig S. Cashwell. 350 pp. According to Bass' (1990) summary of fifty years of research and nearly thirty dichotomy-based theories, leaders influence people through autocratic use of power (task-oriented) or through democratic use of power (people-oriented). Each style produces unique tensions and tradeoffs, but versatile leaders can incorporate strategies from both sides of the dichotomy, depending on situational needs (Kaplan, 1996). Versatile leaders avoid overusing strengths to the point of weakness, a frequently overlooked leadership flaw (Kaplan & Kaiser, 2006). The versatile leader concept shares much with synergistic supervision, a student affairs supervision model (Winston & Creamer, 1997, 1998). Synergistic supervisors blend strengths from autocratic and democratic approaches, creating synergistic relationships with those they lead (Winston & Creamer, 1997, 1998). Synergy and versatility may be considered different sides of the same coin. Until the Leadership Versatility Index--Student (LVI-S), no quantitative, multi-rater measure of leadership versatility was available for campus leaders. The LVI-S was derived from the executive-focused Leadership Versatility Index® (Kaplan & Kaiser, 2006). Participants were recruited from departments of housing and residence life across seven institutions in the Southeastern United States, including staff from small private colleges through large public universities. Resident Advisor supervisees (n = 262) rated leadership characteristics of their Hall Directors (n = 52); the study averaged 4.9 raters per leader. Convergent validity was tested using the Student Leadership Practices Inventory© (SLPI) (Kouzes & Posner, 2003); predictive validity was tested through a global effectiveness measure derived from Tsui's (1984) effectiveness research. LVI-S scale alphas exceeded .80, and the scales offered compelling evidence of convergent and predictive validity. A strong predictive relationship was found between versatility and effectiveness (R = .60, Adj. R² = .31, F = 7.72, p < .01). Results validated the LVI-S for use in residence life settings and validated behavioral aspects of synergistic supervision. Applications for the LVI-S are discussed, as well as avenues for future research.

    Decision Sciences, Economics, Finance, Business, Computing, and Big Data: Connections

    This paper provides a review of some connecting literature in Decision Sciences, Economics, Finance, Business, Computing, and Big Data. We then discuss some research related to the six cognate disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models. They could then conduct simulations to examine whether the estimators or statistics in the new theories of estimation and hypothesis testing have small size and high power. Thereafter, academics and practitioners could apply these theories to analyze interesting problems and issues in the six disciplines and other cognate areas.