
    A multiobjective model for passive portfolio management: an application on the S&P 100 index

    Index tracking seeks to minimize the unsystematic risk component by imitating the movements of a reference index. Partial index tracking considers only a subset of the stocks in the index, enabling a substantial cost reduction compared with full tracking. Nevertheless, when heterogeneous investment profiles are to be satisfied, traditional index tracking techniques may need different stocks to build the different portfolios. The aim of this paper is to propose a methodology that enables a fund manager to satisfy different clients' investment profiles while using in all cases the same subset of stocks, and considering not one particular criterion but a compromise between several criteria. For this purpose we use a mathematical programming model that considers the tracking error variance, the excess return and the variance of the portfolio, plus the curvature of the tracking frontier. The curvature is not defined for a particular portfolio, but for all the portfolios on the tracking frontier. In this way fund managers can offer their clients a wide range of risk-return combinations simply by picking the appropriate portfolio on the frontier, all of these portfolios sharing the same stocks but with different weights. The proposal is illustrated on the S&P 100 index (García García, F.; Guijarro Martínez, F.; Moya Clemente, I. (2013). Journal of Business Economics and Management, 14(4), 758-775. doi:10.3846/16111699.2012.668859)
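
    A minimal sketch of the kind of trade-off the model formalises is given below. It is not the authors' multiobjective formulation (the curvature term and the common-subset requirement are omitted); it simply balances tracking error variance against mean excess return for a fixed subset of stocks, using simulated returns and an assumed weighting parameter lam.

```python
# Sketch only: compromise between tracking error variance and excess return
# for a fixed stock subset. Data, lam and all figures are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T, k = 250, 10                                   # trading days, stocks in the subset
index_ret = rng.normal(0.0004, 0.01, T)          # simulated index returns
stock_ret = index_ret[:, None] + rng.normal(0, 0.005, (T, k))  # simulated stock returns

lam = 0.7                                        # assumed weight on tracking error vs. excess return

def objective(w):
    port_ret = stock_ret @ w
    te_var = np.var(port_ret - index_ret)        # tracking error variance
    excess = np.mean(port_ret - index_ret)       # mean excess return
    return lam * te_var - (1 - lam) * excess     # compromise between the two criteria

cons = ({'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},)   # fully invested
bounds = [(0.0, 1.0)] * k                                     # long-only weights
res = minimize(objective, np.full(k, 1.0 / k), method='SLSQP',
               bounds=bounds, constraints=cons)
print("optimal weights:", np.round(res.x, 3))
```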

    Practices for strategic capacity management in Malaysian manufacturing firms

    While manufacturing capabilities are a long-standing notion in operations management research, their actual implementation and management have hardly been researched. Five case studies in Malaysia offered the opportunity to examine the practice of manufacturing managers with regard to strategic capability management. The data collection and analysis were structured using the notion of Strategic Capacity Management. Whereas the literature has traditionally demonstrated the beneficial impact of an appropriate manufacturing strategy on business strategy and performance, this study highlights the difficulty managers have in setting the strategy, let alone implementing it. This is partly caused by the immense pressure exerted by customers in these predominantly make-to-order environments for SMEs. Current concepts of manufacturing capabilities have insufficiently accounted for this phenomenon, and an outline of a research agenda is presented.

    STV-based Video Feature Processing for Action Recognition

    In comparison to still-image-based processes, video features can provide rich and intuitive information about dynamic events occurring over a period of time, such as human actions, crowd behaviours, and other pattern changes. Although substantial progress has been made in image processing over the last decade, with successful applications in face matching and object recognition, video-based event detection remains one of the most difficult challenges in computer vision research due to its complex continuous or discrete input signals, arbitrary dynamic feature definitions, and often ambiguous analytical methods. In this paper, a Spatio-Temporal Volume (STV) and region intersection (RI) based 3D shape-matching method is proposed to facilitate the definition and recognition of human actions recorded in videos. The distinctive characteristics and the performance gain of the devised approach stem from a coefficient-factor-boosted 3D region intersection and matching mechanism developed in this research. The paper also reports an investigation into techniques for efficient STV data filtering to reduce the number of voxels (volumetric pixels) that need to be processed in each operational cycle of the implemented system. The encouraging features and operational performance improvements registered in the experiments are discussed at the end.
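
    As an illustration of the general idea (not the paper's boosted matching mechanism or its filtering step), the sketch below stacks per-frame binary silhouettes into a spatio-temporal volume and scores two volumes by the ratio of intersecting voxels to their union, scaled by an assumed boost coefficient.

```python
# Sketch only: region-intersection scoring of two binary spatio-temporal volumes.
# The volumes, the 0.6 threshold and the boost factor are illustrative assumptions.
import numpy as np

def stv_from_masks(frame_masks):
    """Stack per-frame binary silhouette masks into one (T, H, W) boolean volume."""
    return np.stack(frame_masks).astype(bool)

def region_intersection_score(template_stv, candidate_stv, boost=1.0):
    """Ratio of intersecting voxels to the union, optionally scaled by a coefficient."""
    inter = np.logical_and(template_stv, candidate_stv).sum()
    union = np.logical_or(template_stv, candidate_stv).sum()
    return boost * inter / union if union else 0.0

# Toy example: two 8-frame, 32x32 action volumes that partially overlap.
rng = np.random.default_rng(1)
a = stv_from_masks([rng.random((32, 32)) > 0.6 for _ in range(8)])
b = stv_from_masks([rng.random((32, 32)) > 0.6 for _ in range(8)])
print("match score:", round(region_intersection_score(a, b, boost=1.2), 3))
```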

    An overview of recent research results and future research avenues using simulation studies in project management

    This paper gives an overview of three simulation studies in dynamic project scheduling integrating baseline scheduling with risk analysis and project control. This integration is known in the literature as dynamic scheduling. An integrated project control method is presented using a project control simulation approach that combines the three topics into a single decision support system. The method makes use of Monte Carlo simulations and connects schedule risk analysis (SRA) with earned value management (EVM). A corrective action mechanism is added to the simulation model to measure the efficiency of two alternative project control methods. At the end of the paper, a summary of recent state-of-the-art results is given, and directions for future research based on a new research study are presented.
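
    The sketch below illustrates the Monte Carlo part of such an approach under simplifying assumptions (a purely serial activity network, triangular duration distributions, invented figures). It is not the decision support system described above, but it shows how simulated durations can feed schedule risk measures and a crude duration-based performance index.

```python
# Sketch only: Monte Carlo schedule risk analysis for a serial project network.
# All activity durations and the shift/day figures are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

# (optimistic, most likely, pessimistic) durations in days
activities = [(4, 5, 8), (9, 10, 15), (2, 3, 6), (6, 8, 12)]
baseline = sum(m for _, m, _ in activities)      # deterministic baseline duration

n_runs = 10_000
sim_durations = np.array([
    sum(rng.triangular(a, m, b) for a, m, b in activities)
    for _ in range(n_runs)
])

p_overrun = np.mean(sim_durations > baseline)    # schedule risk measure
spi_proxy = baseline / sim_durations             # >1 means ahead of baseline

print(f"baseline duration : {baseline} days")
print(f"P(overrun)        : {p_overrun:.2%}")
print(f"mean SPI proxy    : {spi_proxy.mean():.2f}")
```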

    A hybrid and integrated approach to evaluate and prevent disasters

    What is Community Operational Research?

    Community Operational Research (Community OR) has been an explicit sub-domain of OR for more than 30 years. In this paper, we tackle the controversial issue of how it can be differentiated from other forms of OR. While it has been persuasively argued that Community OR cannot be defined by its clients, practitioners or methods, we argue that the common concern of all Community OR practice is the meaningful engagement of communities, whatever form that may take – and the legitimacy of different forms of engagement may be open to debate. We then move on to discuss four other controversies that have implications for the future development of Community OR and its relationship with its parent discipline: the desire for Community OR to be more explicitly political; claims that it should be grounded in the theory, methodology and practice of systems thinking; the similarities and differences between the UK and US traditions; and the extent to which Community OR offers an enhanced understanding of practice that could be useful to OR more generally. Our positions on these controversies all follow from our identification of ‘meaningful engagement’ as a central feature of Community OR.

    Ultrasound enhancement of near-neutral photo-Fenton for effective E. coli inactivation in wastewater

    In this study, we attempt for the first time to couple sonication and photo-Fenton treatment for bacterial inactivation of secondary treated effluent. Synthetic wastewater was subjected to sequential high-frequency/low-power sonication, followed by mild photo-Fenton treatment under a solar simulator. The contribution of each component of the process (Fenton, US, hv) to the removal rate and long-term survival was then assessed; sunlight greatly improved the treatment efficiency, with the coupled process being the only one to yield total inactivation within the 4-h treatment period. The short-term beneficial disinfecting action of US and its detrimental effect on long-term bacterial survival, as well as the impact of light addition, were also revealed. Finally, the operational parameters of the process were investigated to identify possible improvements and/or limitations of the coupled treatment; three levels of each parameter involved (hydraulic, environmental, US and Fenton) were tested. Only increased H2O2 improved the process significantly, but the mode of action of the joint process indicated potentially cost-effective solutions towards the implementation of this method.

    Henry Ford vs. assembly line balancing

    Ford’s Assembly Line at Highland Park is one of the most influential conceptualizations of a production system. New data reveal that Ford’s operations were adaptable to strongly increasing and highly variable demand. These analyses show that Ford’s assembly line was used differently from modern ones and that its production system was more flexible than previously recognized. Assembly line balancing theory largely ignores this earlier practice. It will be shown that Ford used multiple lines flexibly to cope with large monthly variations in sales. Although a single line may be optimized to yield lowest-cost production, systems composed of several parallel lines may yield low-cost production along with output and product flexibility. Recent research on multiple parallel lines has focussed on cost effectiveness without appreciating the flexibility such systems may allow. Given the current strategic importance of flexibility, it should be included in such analyses as an explicit objective.
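
    A back-of-the-envelope sketch of the flexibility argument: a single balanced line must be re-paced (and rebalanced) whenever demand changes, whereas a bank of fixed-pace parallel lines can simply be switched on or off. All figures below are illustrative assumptions, not historical Ford data.

```python
# Sketch only: single re-paced line vs. a bank of fixed-pace parallel lines.
# Work content, demand, shift pattern and line capacity are assumed values.
import math

work_content = 120.0            # assumed total assembly work per car, minutes
monthly_demand = [18_000, 25_000, 31_000, 22_000]   # illustrative cars per month
minutes_per_month = 22 * 16 * 60                    # 22 days, two 8-hour shifts

line_capacity = 12_000          # assumed cars/month for one line at a fixed pace

for d in monthly_demand:
    cycle = minutes_per_month / d                   # takt a single line must hit
    min_stations = math.ceil(work_content / cycle)  # classic balancing lower bound
    parallel_lines = math.ceil(d / line_capacity)   # fixed-pace lines switched on
    print(f"demand {d:>6}: single line needs cycle {cycle:.2f} min "
          f"and >= {min_stations} stations; or run {parallel_lines} parallel lines")
```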

    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
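
    A core computation behind probabilistic model checking of properties such as "the probability of both sensors failing simultaneously is less than 0.001" is solving a linear system for reachability probabilities in a Markov chain. The sketch below does this for a small invented four-state chain; the states and transition probabilities are illustrative assumptions, and no specific verification tool is being reproduced.

```python
# Sketch only: probability of eventually reaching the "failed" state in a
# discrete-time Markov chain, via the standard reachability linear system.
import numpy as np

states = ["ok", "degraded", "failed", "shutdown"]      # invented sensor model
P = np.array([
    [0.90, 0.08, 0.01, 0.01],   # ok
    [0.00, 0.70, 0.20, 0.10],   # degraded
    [0.00, 0.00, 1.00, 0.00],   # failed   (absorbing)
    [0.00, 0.00, 0.00, 1.00],   # shutdown (absorbing)
])
transient, failed = [0, 1], 2

# For transient states s: x_s = sum_t Q[s, t] * x_t + P[s, failed].
Q = P[np.ix_(transient, transient)]
b = P[transient, failed]
x = np.linalg.solve(np.eye(len(transient)) - Q, b)

for s, prob in zip(transient, x):
    print(f"P(eventually failed | start={states[s]}) = {prob:.3f}")
```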