INVESTMENT PORTFOLIO REBALANCING DECISION MAKING
Today, financial market volatility and significant stock-price fluctuations make it possible to improve investment returns by actively managing an investment portfolio rather than following a long-term investment strategy. Active portfolio management also supports the personal investor's development and offers an opportunity to avoid losses during periods of market instability. However, active portfolio management is riskier: when rebalancing the portfolio, the investor incurs real costs in exchange for expected return, so it is crucial to use a sound rebalancing method that meets the investor's needs. To address this problem, the article analyses the scientific literature and proposes a new portfolio rebalancing method.
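The cost-versus-drift trade-off described above can be sketched as a simple threshold rule; the tolerance band, the flat proportional fee, and the 60/40 example below are illustrative assumptions, not the method proposed in the article.

```python
# Hypothetical sketch of threshold-based rebalancing: when any asset's
# weight drifts beyond a tolerance band around its target, trades
# restore the target allocation. Transaction costs are modelled as a
# flat proportional fee (an assumption for illustration).

def rebalance(values, targets, tolerance=0.05, fee_rate=0.001):
    """Return per-asset trade amounts (positive = buy) and total fees."""
    total = sum(values)
    weights = [v / total for v in values]
    drifted = any(abs(w - t) > tolerance for w, t in zip(weights, targets))
    if not drifted:
        return [0.0] * len(values), 0.0
    trades = [t * total - v for t, v in zip(targets, values)]
    fees = sum(abs(x) for x in trades) * fee_rate
    return trades, fees

# A 60/40 split has drifted beyond the 5% band around a 50/50 target,
# so the rule sells 1000 of the first asset and buys 1000 of the second.
trades, fees = rebalance([6000, 4000], [0.5, 0.5], tolerance=0.05)
```

The band keeps turnover (and hence fees) down by trading only when drift is material, which is the real-cost-for-expected-return tension the abstract describes.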
Robust portfolio management with multiple financial analysts
Portfolio selection theory, developed by Markowitz (1952), is one of the best known and widely applied methods for allocating funds among possible investment choices, where investment decision making is a trade-off between the expected return and risk of the portfolio. Many portfolio selection models have been developed on the basis of Markowitz’s theory. Most of them assume that complete investment information is available and that it can be accurately extracted from the historical data. However, this complete information never exists in reality. There are many kinds of ambiguity and vagueness which cannot be dealt with in the historical data but still need to be considered in portfolio selection. For example, to address the issue of uncertainty caused by estimation errors, the robust counterpart approach of Ben-Tal and Nemirovski (1998) has been employed frequently in recent years. Robustification, however, often leads to a more conservative solution. As a consequence, one of the most common critiques against the robust counterpart approach is the excessively pessimistic character of the robust asset allocation.
This thesis attempts to develop new approaches that improve the performance of the robust counterpart approach by incorporating additional sources of investment information, so that the optimal portfolio can be more reliable and, at the same time, achieve a greater return. [Continues.]
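The return-risk trade-off at the heart of Markowitz (1952) can be illustrated with a minimal two-asset sketch; the expected returns and covariances below are invented sample numbers, not estimates from data.

```python
import numpy as np

# Portfolio return is the weighted mean of asset returns; portfolio
# risk is the weighted variance, including the covariance term.
mu = np.array([0.08, 0.12])            # assumed expected returns
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])         # assumed covariance matrix

def portfolio_stats(w):
    """Expected return and variance of portfolio with weights w."""
    return float(w @ mu), float(w @ cov @ w)

# Closed-form minimum-variance weights for two assets:
#   w1 = (s2^2 - s12) / (s1^2 + s2^2 - 2*s12)
w1 = (cov[1, 1] - cov[0, 1]) / (cov[0, 0] + cov[1, 1] - 2 * cov[0, 1])
w = np.array([w1, 1 - w1])
ret, var = portfolio_stats(w)
```

Diversification shows up directly: the minimum-variance portfolio's variance is lower than either asset's own variance, while its return sits between the two asset returns.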
A non-invasive machine learning mechanism for early disease recognition on Twitter: The case of anemia
Social media sites, such as Twitter, provide the means for users to share their stories, feelings, and health conditions during the course of a disease. Anemia, the most common type of blood disorder, is recognized as a major public health problem all over the world, yet very few studies have explored the potential of recognizing anemia from online posts. This study proposed a novel mechanism for recognizing anemia based on the associations between disease symptoms and patients' emotions posted on the Twitter platform. We used k-means and Latent Dirichlet Allocation (LDA) algorithms to group similar tweets and to identify hidden disease topics. Disease emotions and symptoms were mapped using the Apriori algorithm. The proposed approach was evaluated using a number of classifiers; the highest prediction accuracy, 98.96%, was achieved using Sequential Minimal Optimization (SMO). The results revealed that fear and sadness are the dominant emotions among anemic patients. The proposed mechanism is the first of its kind to diagnose anemia using textual information posted on social media sites, and it can advance the development of intelligent health monitoring systems and clinical decision-support systems.
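The clustering and topic-modelling steps can be sketched with scikit-learn; the tweets below are invented examples, not study data, and the sketch omits the Apriori mapping and SMO classification stages.

```python
# Hedged sketch: vectorise tweets, group them with k-means, and model
# hidden topics with LDA, mirroring the pipeline the abstract describes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation

# Invented example tweets mixing symptom and emotion language.
tweets = [
    "feeling so tired and dizzy again today",
    "constant fatigue and pale skin lately",
    "scared about my low iron results",
    "sad and worried after the blood test",
]

# Bag-of-words representation of the tweets.
X = CountVectorizer(stop_words="english").fit_transform(tweets)

# Group similar tweets into two clusters.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Discover two hidden topics; each row is a tweet's topic distribution.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topics = lda.fit_transform(X)
```

In the study's setting, the cluster assignments and topic distributions would then feed the symptom-emotion association mining step.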
Emotional intelligence and individuals’ viewing behaviour of human faces: a predictive approach
Although several studies have looked at the relationship between emotional characteristics and viewing behaviour, how emotional intelligence (EI) contributes to individuals' viewing behaviour is not well understood. This study examined the viewing behaviour of 154 people (74 male and 80 female) with specific EI profiles while viewing five facial expressions. An eye-tracking methodology was employed to examine individuals' viewing behaviour in relation to their EI, and we compared the performance of different machine learning algorithms on the eye-movement parameters of participants to predict their EI profiles. The results revealed that individuals with EI profiles high in self-control, emotionality, and sociability responded differently to the visual stimuli. The prediction of these EI profiles achieved 94.97% accuracy. The findings are unique in that they provide a new understanding of how eye movements can be used to predict EI. They also contribute to the current understanding of the relationship between EI and emotional expressions, thereby adding to an emerging stream of research that is of interest to researchers and psychologists in human–computer interaction, individual emotion, and information processing.
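The classifier-comparison step can be sketched with scikit-learn cross-validation; the feature matrix below is random stand-in data for eye-movement parameters (e.g. fixation counts and durations), and the two classifiers are arbitrary examples rather than the algorithms evaluated in the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

# Random stand-ins: 154 participants, 6 eye-movement parameters each,
# and 3 hypothetical EI profile labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(154, 6))
y = rng.integers(0, 3, size=154)

# Score each classifier with 5-fold cross-validation and keep the
# mean accuracy, as one would when comparing algorithms.
results = {}
for name, clf in [("SVM", SVC()),
                  ("RandomForest", RandomForestClassifier(random_state=0))]:
    scores = cross_val_score(clf, X, y, cv=5)
    results[name] = scores.mean()
```

With real eye-tracking features the same loop would reveal which model best separates the EI profiles; on random data the accuracies hover near chance.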
Re-examining Tobin's separation theorem
The preparatory part of the dissertation, which leads to the main part, is based on the return-variance parameters, the two key random variables of the model devised by Markowitz. The research used historical data, which in themselves reflect all available information absorbed by the financial market and can therefore be regarded not only as homogeneous but also as absolute (because they have been realised). On such unconditioned data, representing combinations of average portfolio returns and return variances, an analytical sixth-degree polynomial approximation was performed, establishing a relation expressed explicitly as an algebraic polynomial equation of the sixth degree. A further analytical procedure then determined the conditions for the existence of both the minimum portfolio and the tangent portfolio, and the concepts of the efficient set of portfolios, risk propensity, risk aversion and the indifference line were redefined. The central topic of the dissertation, the re-examination of Tobin's separation theorem, is formulated and proved through three theorems: one main and two auxiliary.
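The preparatory fitting step can be illustrated with NumPy: a sixth-degree polynomial is fitted to synthetic (variance, return) points generated from known coefficients, so the recovery can be checked. The coefficients and data are invented for illustration, not taken from the dissertation.

```python
import numpy as np

# Known sixth-degree polynomial (coefficients from degree 6 down to
# the constant term), chosen arbitrarily for this sketch.
true_coeffs = [0.5, 0, -1.2, 0, 0.8, 0, 0.1]

# Synthetic frontier points: x plays the role of portfolio variance,
# y the corresponding mean return (noise-free here).
x = np.linspace(0.0, 1.0, 50)
y = np.polyval(true_coeffs, x)

# Fit a sixth-degree polynomial to the points; on noise-free data the
# original coefficients are recovered.
fit = np.polyfit(x, y, deg=6)
```

The explicit polynomial equation then lets minimum and tangency conditions be derived analytically, as the abstract describes.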
Hybrid fuzzy analytical hierarchy process with fuzzy inference system on ranking STEM approach towards blended learning in mathematics
In the era of Education 4.0, blended learning has been selected as one of the transformational pedagogies for teaching and learning that integrate Science, Technology, Engineering, and Mathematics (STEM), a new norm that needs to be adopted by Malaysia. Since the COVID-19 pandemic, the issue has been highlighted at most levels of study in the education field. However, teachers' limited knowledge of implementing 21st-century learning skills with Web 2.0 has left students demotivated in their mathematics classrooms. Moreover, dynamic changes in the standard curriculum have made the situation more challenging for teachers in selecting the appropriate STEM approach to ensure students are fully engaged. Inspired by this problem, this research used fuzzy multi-criteria decision-making (MCDM) concepts. A hybrid fuzzy MCDM model with a four-stage process is proposed to rank STEM approaches and find the best one to implement in the mathematics classroom. The model is constructed by integrating the Fuzzy Analytical Hierarchy Process (FAHP), to determine the weights of STEM criteria and sub-criteria, and the Fuzzy Inference System (FIS), to compute the best STEM approach in the mathematics classroom. The procedure involves exploring the issues associated with the selection problem, deriving the importance weights of the decision criteria, and ranking the alternatives, applying intuitive multiple centroids as a defuzzification method. The results showed hands-on activities to be the best STEM approach, while requisite knowledge is the most important criterion, with the greatest weight. Thus, the proposed model helps provide a clear picture for teachers for the implementation of the STEM approach in mathematics based on a comprehensive view, and also lays new foundational knowledge in the fuzzy MCDM field, particularly in STEM education.
It also helps the Ministry of Education (MoE) achieve one of the initiatives in Wave 3 of the Malaysia Education Blueprint (2021-2025): sharing best practice in the classroom to cultivate a peer-led culture of professional excellence among teachers as the basis for improving the implementation and achievement of STEM at the national level.
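The FAHP weighting stage can be sketched with triangular fuzzy numbers and Buckley's geometric-mean method, a common FAHP variant. The three criteria and the pairwise judgements below are hypothetical, and the defuzzification here is a simple centroid, not the study's intuitive multiple centroids method.

```python
import math

def geo_mean(values):
    """Geometric mean of a list of positive numbers."""
    return math.prod(values) ** (1 / len(values))

# Pairwise comparison of three hypothetical STEM criteria; each entry
# is a triangular fuzzy number (lower, modal, upper).
matrix = [
    [(1, 1, 1),         (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1), (1, 1, 1)],
]

# Fuzzy geometric mean of each row, component-wise.
r = [tuple(geo_mean([row[j][k] for j in range(3)]) for k in range(3))
     for row in matrix]

# Fuzzy weights: divide each row mean by the reversed component totals
# (lower by upper-total, upper by lower-total), then defuzzify with the
# centroid (l + m + u) / 3 and normalise.
totals = [sum(ri[k] for ri in r) for k in range(3)]
fuzzy_w = [(ri[0] / totals[2], ri[1] / totals[1], ri[2] / totals[0])
           for ri in r]
crisp = [sum(w) / 3 for w in fuzzy_w]
weights = [c / sum(crisp) for c in crisp]
```

The criterion judged most important in the pairwise matrix ends up with the greatest weight, which is the role the FAHP stage plays in the proposed model before the FIS ranks the alternatives.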
High-Performance Modelling and Simulation for Big Data Applications
This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.
Global supply chain optimization: a machine learning perspective to improve Caterpillar's logistics operations
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University London. Supply chain optimization is one of the key components of the effective management of a company with a complex manufacturing process and distribution network. Companies with a global presence in particular are motivated to optimize their distribution plans in order to keep their operating costs low and competitive. Changing conditions in the global market and volatile energy prices increase the need for an automatic decision and optimization tool. In recent years, many techniques and applications have been proposed to address the problem of supply chain optimization. However, such techniques are often too problem-specific or too knowledge-intensive to be implemented as inexpensive, easy-to-use computer systems. The effort required to implement an optimization system for a new instance of the problem appears to be quite significant: the development process necessitates the involvement of expert personnel, and the level of automation is low.
The aim of this project is to develop a set of strategies capable of increasing the level of automation when developing a new optimization system. An increased level of automation is achieved by focusing on three areas: multi-objective optimization, optimization algorithm usability, and optimization model design. A literature review highlighted the great level of interest in multi-objective optimization within the research community, but also emphasized a lack of standardization in the area and an insufficient understanding of the relationship between multi-objective strategies and problems. Experts in optimization and artificial intelligence are interested in improving the usability of the most recent optimization algorithms. They have raised the concern that the large number of variants and parameters characterizing such algorithms affects their applicability in real-world environments; these characteristics are seen as the root cause of the low uptake of the most recent optimization algorithms in industrial applications. A crucial task in the development of an optimization system is the design of the optimization model. It is one of the most complex tasks in the development process, yet it is still performed mostly manually; its importance and complexity strongly suggest the development of tools to aid the design of optimization models.
To address these challenges, the problem of multi-objective optimization is first considered and the most widely adopted techniques for solving it are identified. These techniques are analyzed and described in detail to increase the level of standardization in the area, and empirical evidence is highlighted to suggest what type of relationship exists between strategies and problem instances. Regarding the optimization algorithm, a classification method is proposed to improve its usability and computational requirements by automatically tuning one of its key parameters, the termination condition: the algorithm assesses the problem's complexity and automatically assigns the best termination condition to minimize runtime. The runtime of the optimization system has been reduced by more than 60%, and the usability of the algorithm has arguably been improved as well, as one of the key configuration tasks can now be completed automatically. Finally, a system is presented to aid the definition of the optimization model through regression analysis; its purpose is to gather as much knowledge about the problem as possible so that defining the optimization model requires less user involvement.
It is estimated that applying the proposed strategies could have saved almost 1,000 man-weeks on this project. The developed strategies have been applied to the problem of Caterpillar's global supply chain optimization. This thesis also describes the process of developing an optimization system for Caterpillar, highlights the challenges and research opportunities identified while undertaking this work, and details the optimization model designed for Caterpillar's supply chain together with the implementation of the Ant Colony System, the algorithm selected to optimize the supply chain. The system is now used to design the distribution plans of more than 7,000 products and has improved Caterpillar's marginal profit on those products by 4.6% on average.
Caterpillar Inc
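The idea of an automatically tuned termination condition can be sketched as a stagnation rule: stop once the best objective has not improved for a set number of iterations, rather than running a fixed budget. This is a generic illustration under assumed names (`run_until_stagnant`, `patience`), not the thesis's classification method.

```python
# Stop a (here, simulated) optimisation run once the best objective
# value has not improved for `patience` consecutive iterations.
def run_until_stagnant(step, patience=20, max_iter=1000):
    """step(i) returns the candidate objective at iteration i (lower is better)."""
    best, since_improved = float("inf"), 0
    for i in range(max_iter):
        value = step(i)
        if value < best:
            best, since_improved = value, 0   # improvement: reset counter
        else:
            since_improved += 1
        if since_improved >= patience:
            return best, i + 1                # early stop saves runtime
    return best, max_iter

# Toy objective that improves until iteration 30, then plateaus at 70;
# the run stops 20 iterations after the last improvement.
best, iters = run_until_stagnant(lambda i: max(100 - i, 70), patience=20)
```

Compared with a fixed 1000-iteration budget, the stagnation rule here stops after 51 iterations with the same best value, which is the kind of runtime saving the thesis attributes to automatic termination tuning.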
Electronic resources as drivers of change in the cataloging system of Library and Information System of Serbia
Information and communication technology is changing the world in all areas of science, culture, art and all other forms of human activity and expression. In librarianship as a science, new technologies are being applied, new resources are being used, the relationship with users is changing, and so are some of its basic assumptions. Electronic resources, besides becoming one of the most important forms of material in library collections, affect the system of descriptive and subject catalogues as the central part of the Library and Information System of the Republic of Serbia (LISRS).
The extremely rapid development of information and communication technology (ICT), alongside enormous and unambiguously positive changes, confronts us with limits on how far technology can be assimilated in the restructuring of library systems. Besides ICT development, this complicated endeavour is strongly influenced by library regulations which, although based on international standards, always carry national characteristics expressed through language, script and legislation. In this dynamic development two processes are of particular importance...