Forecasting the Performance of US Stock Market Indices During COVID-19: RF vs LSTM
The US stock market experienced instability following the 2007-2009 recession, and COVID-19 has posed a significant challenge to US stock traders and investors. To mitigate risk and improve profits, traders and investors need to track the market using forecasting models that account for the effects of the pandemic. Taking into account the COVID-19 pandemic that followed the recession, two machine learning models, Random Forest and LSTM, are used to forecast two major US stock market indices. Historical prices after the Great Recession are used to develop the machine learning models and forecast index returns. Cross-validation is used to evaluate model performance during training. Additionally, hyperparameter optimization, regularization such as dropout and weight decay, and preprocessing improve the performance of the machine learning techniques. Using high-accuracy machine learning techniques, traders and investors can forecast stock market behavior, stay ahead of their competition, and improve profitability.
Keywords: COVID-19, LSTM, S&P500, Random Forest, Russell 2000, Forecasting, Machine Learning, Time Series. JEL Codes: C6, C8, G4.
Comment: Pennsylvania Economic Association (PEA) - June 202
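A minimal sketch of the kind of pipeline the abstract describes, forecasting index returns with a Random Forest and an LSTM under time-series cross-validation. The input file, lookback window, and all hyperparameters (including the dropout and weight-decay values) are illustrative assumptions, not the paper's settings.

```python
# Sketch only (assumed data source and hyperparameters), not the authors' code.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import TimeSeriesSplit

def make_windows(returns, lookback=20):
    """Turn a 1-D return series into (lagged-window, next-return) pairs."""
    X = np.stack([returns[i:i + lookback] for i in range(len(returns) - lookback)])
    y = returns[lookback:]
    return X, y

returns = np.diff(np.log(np.loadtxt("sp500_close.csv")))  # assumed single-column price file
X, y = make_windows(returns)

# Random Forest evaluated with time-series cross-validation
rf = RandomForestRegressor(n_estimators=300, max_depth=8, random_state=0)
for train_idx, val_idx in TimeSeriesSplit(n_splits=5).split(X):
    rf.fit(X[train_idx], y[train_idx])
    print("fold MSE:", np.mean((rf.predict(X[val_idx]) - y[val_idx]) ** 2))

# LSTM with dropout regularization; weight decay is supplied through the optimizer
class LSTMForecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.drop = nn.Dropout(0.2)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, lookback, 1)
        out, _ = self.lstm(x)
        return self.head(self.drop(out[:, -1])).squeeze(-1)

model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
xb = torch.tensor(X, dtype=torch.float32).unsqueeze(-1)
yb = torch.tensor(y, dtype=torch.float32)
for _ in range(50):                        # short illustrative training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(xb), yb)
    loss.backward()
    opt.step()
```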
Development of a four-phase thermal-chemical reservoir simulator for heavy oil
Thermal and chemical recovery processes are important EOR methods often used by the oil and gas industry to improve recovery from heavy oil and highly viscous oil reservoirs. Knowledge of the underlying mechanisms and their modeling in numerical simulation is crucial for a comprehensive study as well as for an evaluation of field treatment. EOS-compositional, thermal, and black-oil reservoir simulators can handle gas (or steam)/oil/water equilibrium for compressible multiphase flow. Also, a few recently developed three-phase chemical flooding reservoir simulators can model the oil/water/microemulsion equilibrium state. However, accurate phase behavior and fluid flow formulations that capture four-phase equilibrium in thermal-chemical processes are absent in the literature. Moreover, numerical simulation of such a four-phase model, with complex phase behavior at equilibrium between coexisting phases (oil/water/microemulsion/gas or steam), is challenging. Interphase mass transfer between coexisting phases and adsorption of components on rock should be properly modeled at different pressures and temperatures to conserve volume (e.g. vaporization), mass (e.g. condensation), and energy (e.g. latent heat). Therefore, efforts to study and understand the performance of these EOR processes using numerical simulation are necessary and of utmost importance in the petroleum industry. This research focuses on the development of a robust four-phase reservoir simulator with coupled phase behavior and modeling of the mechanisms pertaining to thermal and chemical recovery methods. Development and implementation of a four-phase thermal-chemical reservoir simulator is important for studying and evaluating individual or hybrid EOR methods. In this dissertation, a mathematical formulation of multi-(pseudo)component, four-phase fluid flow in porous media is developed for the mass conservation equations. Subsequently, a new volume balance equation is obtained for the pressure of compressible real mixtures. The pressure equation is derived by extending a black oil model to a pseudo-compositional model for a wide range of components (water, oil, surfactant, polymer, anion, cation, alcohol, and gas). Mass balance equations are then solved for each component in order to compute volumetric concentrations. In this formulation, we consider interphase mass transfer between oil and gas (steam and water) as well as between microemulsion and gas (microemulsion and steam). These formulations are derived at reservoir conditions and form a set of coupled, nonlinear partial differential equations. The equations are approximated by finite difference methods implemented in a chemical flooding reservoir simulator (UTCHEM), previously a three-phase slightly compressible simulator, using an implicit-pressure, explicit-concentration method. In our flow model, a comprehensive phase behavior model is required to account for interphase mass transfer and phase tracking. Therefore, a four-phase behavior model is developed for gas (or steam)/oil/water/microemulsion coexisting at equilibrium. This model couples the solution-gas or steam-table methods with Hand's rule. Hand's rule is used to capture the equilibrium between the surfactant, oil, and water components as a function of salinity and concentrations for the oil/water/microemulsion phases.
As a result, interphase mass transfer between gas/oil or steam/water in the presence of the microemulsion phase, and the equilibrium between phases, are calculated accurately. In this research, the conservation of energy equation is derived from the first law of thermodynamics under a few assumptions and simplifications for a four-phase fluid flow model. This energy balance equation accounts for the latent heat effect when solving for temperature due to phase change between water and steam. Accordingly, the equation is linearized and a sequential implicit scheme is used to calculate temperature. We also implemented the electrical Joule-heating process, in which a heavy oil reservoir is heated in situ by dissipation of electrical energy to reduce the viscosity of the oil. To model electrical Joule heating in the presence of four-phase fluid flow, Maxwell's classical electromagnetism equations are used. The equations are simplified under a low-frequency electric field assumption to obtain the conservation of electrical current equation and Ohm's law. The conservation of electrical current and Ohm's law are implemented using a finite difference method in the four-phase chemical flooding reservoir simulator (UTCHEM). The Joule heating rate due to dissipation of electrical energy is calculated and added to the energy equation as a source term. Finally, we applied the developed model to different case studies. Our simulation results reveal that our models can accurately and successfully model hybrid thermal-chemical processes in comparison to existing models and simulators.
Petroleum and Geosystems Engineering
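A minimal sketch of the low-frequency Joule-heating step described above, not the UTCHEM implementation: on an assumed 2-D grid with uniform conductivity and fixed-potential electrodes, the electric potential follows from conservation of current with Ohm's law, and the dissipated power sigma*|grad(phi)|^2 becomes a source term for the energy equation.

```python
# Illustrative sketch only (assumed grid, conductivity, and electrode potentials).
import numpy as np

nx, ny, h = 50, 50, 1.0          # grid dimensions and spacing [m]
sigma = np.full((nx, ny), 0.1)   # electrical conductivity [S/m], assumed uniform
phi = np.zeros((nx, ny))         # electric potential [V]
phi[:, 0], phi[:, -1] = 100.0, 0.0   # electrodes on the left/right boundaries

# Solve div(sigma * grad(phi)) = 0 by Jacobi iteration (uniform sigma reduces this
# to Laplace's equation); a production simulator would use an implicit solver.
for _ in range(5000):
    phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                              phi[1:-1, 2:] + phi[1:-1, :-2])

# Ohm's law: J = -sigma * grad(phi); Joule heating rate q = sigma * |grad(phi)|^2
Ex, Ey = np.gradient(-phi, h)
q_joule = sigma * (Ex**2 + Ey**2)      # [W/m^3], added to the energy equation as a source
print("total dissipated power per unit thickness:", q_joule.sum() * h * h, "W/m")
```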
Reduced Deep Convolutional Activation Features (R-DeCAF) in Histopathology Images to Improve the Classification Performance for Breast Cancer Diagnosis
Breast cancer is the second most common cancer among women worldwide. Diagnosis of breast cancer by pathologists is a time-consuming and subjective procedure. Computer-aided diagnosis frameworks are utilized to relieve pathologist workload by classifying the data automatically, and deep convolutional neural networks (CNNs) are effective solutions. The features extracted from the activation layer of pre-trained CNNs are called deep convolutional activation features (DeCAF). In this paper, we show that not all DeCAF features necessarily lead to higher accuracy in the classification task and that dimension reduction plays an important role. Therefore, different dimension reduction methods are applied to achieve an effective combination of features that captures the essence of the DeCAF features. For this purpose, we propose reduced deep convolutional activation features (R-DeCAF). In this framework, pre-trained CNNs such as AlexNet, VGG-16, and VGG-19 are utilized in transfer learning mode as feature extractors. DeCAF features are extracted from the first fully connected layer of these CNNs, and a support vector machine is used for binary classification. Among linear and nonlinear dimensionality reduction algorithms, linear approaches such as principal component analysis (PCA) yield a better combination of deep features and lead to higher classification accuracy using a small number of features for a given amount of cumulative explained variance (CEV). The proposed method is validated on the BreakHis dataset. Comprehensive results show an improvement in classification accuracy of up to 4.3% with less computational time. The best achieved accuracy is 91.13% for the 400x data with a feature vector size (FVS) of 23 and CEV equal to 0.15, using pre-trained AlexNet as the feature extractor and PCA as the feature reduction algorithm.
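A minimal sketch of the R-DeCAF pipeline as described in the abstract, assuming torchvision's AlexNet, scikit-learn's PCA and SVM, and a generic image-folder layout; the folder path, preprocessing, and the 0.15 variance threshold are illustrative assumptions rather than the authors' exact setup.

```python
# Sketch of the described pipeline (assumed tooling, not the authors' code):
# DeCAF features from AlexNet's first fully connected layer -> PCA -> SVM.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from sklearn.decomposition import PCA
from sklearn.svm import SVC

# Pre-trained AlexNet; take activations right after the first fully connected layer (4096-d).
alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).eval()
feature_head = nn.Sequential(alexnet.classifier[0], alexnet.classifier[1])  # dropout + fc6

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
dataset = datasets.ImageFolder("breakhis_400x/", transform=preprocess)  # assumed layout
loader = torch.utils.data.DataLoader(dataset, batch_size=32)

feats, labels = [], []
with torch.no_grad():
    for x, y in loader:
        z = torch.flatten(alexnet.avgpool(alexnet.features(x)), 1)  # conv backbone output
        feats.append(feature_head(z).numpy())                       # DeCAF (fc6) features
        labels.append(y.numpy())
X, y = np.concatenate(feats), np.concatenate(labels)

# PCA keeping enough components to reach a chosen cumulative explained variance (e.g. 0.15),
# then a binary SVM on the reduced feature vectors.
X_red = PCA(n_components=0.15, svd_solver="full").fit_transform(X)
clf = SVC(kernel="linear").fit(X_red, y)
print("reduced feature vector size:", X_red.shape[1], "train accuracy:", clf.score(X_red, y))
```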
Research & Development of Digital Marketing and Innovation in Commercial Automotive Industry
The automotive industry, and particularly the commercial automotive industry, ranks as a key industry in economic growth. There is a clear need to investigate the research and development (R&D) activities of digital marketing and innovation in the automotive industry as a dynamic system based on three variables: empowerment of the supply network, of product innovation, and of digital marketing. The present research was carried out to identify and evaluate the cause-and-effect relations governing the variables of R&D of digital marketing and innovation in the commercial automotive industry. The research is applied in type and was conducted using the descriptive-survey method. The research community consisted of 50 experts, all with acceptable academic backgrounds and years of experience as executive managers and marketers in automotive R&D. To analyze the data, the views of selected automotive industry experts were combined with the fuzzy Delphi and DEMATEL methods. Our findings showed that the variable "Intensity of R&D of digital marketing and innovation" has the greatest effect on the other variables. The variable "Empowerment of supply network", with a score of 3.25, has the largest amount of interaction with the other variables, while the variable "Empowerment of R&D in digital marketing and innovation", with a score of 1.08, has the smallest amount of interaction with the other variables.
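A minimal sketch of the DEMATEL computation referenced above, using a small made-up direct-influence matrix (the study's expert ratings are not given in the abstract): the total-relation matrix is T = D(I - D)^-1, and row/column sums of T give each variable's influence and interaction scores.

```python
# Illustrative DEMATEL sketch (assumed example ratings, not the study's data).
import numpy as np

# Average direct-influence matrix from expert ratings (0 = none ... 4 = very high);
# rows/cols: [supply network, product innovation, digital marketing, R&D intensity]
A = np.array([[0, 3, 2, 1],
              [2, 0, 3, 2],
              [1, 2, 0, 3],
              [3, 3, 3, 0]], dtype=float)

# Normalize by the largest row sum, then compute the total-relation matrix T = D(I - D)^-1
D = A / A.sum(axis=1).max()
T = D @ np.linalg.inv(np.eye(len(D)) - D)

R = T.sum(axis=1)   # total influence exerted by each variable (row sums)
C = T.sum(axis=0)   # total influence received by each variable (column sums)
prominence = R + C  # amount of interaction with the other variables
relation = R - C    # net cause (+) or effect (-) role

for name, p, r in zip(["supply network", "product innovation",
                       "digital marketing", "R&D intensity"], prominence, relation):
    print(f"{name}: prominence={p:.2f}, relation={r:+.2f}")
```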
Early Visual Processing of Feature Saliency Tasks: A Review of Psychophysical Experiments
The visual system is constantly bombarded with information originating from the outside world, but it is unable to process all the received information at any given time. In fact, the most salient parts of the visual scene are selected for processing involuntarily, immediately after the first glance, together with endogenous signals in the brain. Vision scientists have shown that the early visual system, from the retina to the lateral geniculate nucleus (LGN) and then the primary visual cortex, selectively processes the low-level features of the visual scene. Everything we perceive from the visual scene is based on these feature properties and their subsequent combination in higher visual areas. Different experiments have been designed to investigate the impact of these features on saliency and to understand the underlying visual mechanisms. In this paper, we review the psychophysical experiments published in recent decades to show how low-level salient features are processed in the early visual cortex and how they extract the most important and basic information of the visual scene. Important open questions are also discussed in this review; these questions may be pursued to investigate the impact of higher-level features on saliency in complex scenes or natural images.
BERT-Deep CNN: State-of-the-Art for Sentiment Analysis of COVID-19 Tweets
The free flow of information has been accelerated by the rapid development of social media technology. The outbreak of Coronavirus disease (COVID-19) has had a significant social and psychological impact on the population, and the pandemic is one of the events currently being discussed on social media platforms. In order to safeguard societies from this pandemic, studying people's emotions on social media is crucial. Because of their particular characteristics, sentiment analysis of texts such as tweets remains challenging. Sentiment analysis is a powerful text analysis tool: it automatically detects and analyzes opinions and emotions in unstructured data. A sentiment analysis tool examines texts from a wide range of sources, including emails, surveys, reviews, social media posts, and web articles, and extracts meaning from them. To evaluate sentiment, natural language processing (NLP) and machine learning techniques are used, which assign weights to entities, topics, themes, and categories in sentences or phrases. Machine learning tools learn how to detect sentiment without human intervention by examining examples of emotions in text. In a pandemic situation, analyzing social media texts to uncover sentiment trends can be very helpful in gaining a better understanding of society's needs and predicting future trends. We study society's perception of the COVID-19 pandemic through social media using state-of-the-art BERT and Deep CNN models. The superiority of BERT models over other deep models in sentiment analysis is evident and can be concluded from the comparison of the various research studies mentioned in this article.
Comment: 20 pages, 5 figures
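A minimal sketch of a BERT-plus-CNN sentiment classifier of the kind described, assuming Hugging Face transformers, a frozen bert-base-uncased encoder, and a small 1-D convolutional head; the model name, filter sizes, and three-class output are illustrative assumptions, not the paper's exact architecture.

```python
# Sketch only (assumed architecture details): BERT token embeddings feeding a 1-D CNN
# head for tweet sentiment classification.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class BertCNNSentiment(nn.Module):
    def __init__(self, n_classes=3, n_filters=100, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        for p in self.bert.parameters():       # encoder kept frozen in this sketch
            p.requires_grad = False
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, n_filters, k) for k in kernel_sizes)
        self.fc = nn.Linear(n_filters * len(kernel_sizes), n_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq, hidden) -> (batch, hidden, seq) for Conv1d over the token axis
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        h = h.transpose(1, 2)
        pooled = [conv(h).relu().max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["Stay safe everyone!", "This lockdown is exhausting."],
                  padding=True, truncation=True, return_tensors="pt")
model = BertCNNSentiment()
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)   # (2, 3): negative / neutral / positive scores per tweet
```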
Estimating capital and operational costs of backhoe shovels
Material loading is one of the most critical operations in earthmoving projects, and a variety of equipment is available for loading operations. Project managers should consider various technical and economic issues at the feasibility study stage and try to select the optimum type and size of the equipment fleet with regard to production needs and project specifications. The backhoe shovel is very popular for digging, loading, and flattening tasks. Adequate cost estimation is one of the most critical tasks in feasibility studies of equipment fleet selection. This paper presents two different cost models for the preliminary and detailed feasibility study stages. These models estimate the capital and operating costs of backhoe shovels using uni-variable exponential regression (UVER) as well as multi-variable linear regression (MVLR) based on principal component analysis. The UVER cost model is suitable for quick cost estimation at the early stages of project evaluation, while the MVLR cost function, which is more detailed, is useful for the feasibility study stage. Independent variables of the MVLR model include bucket size, digging depth, dump height, weight, and power. Model evaluations show that these functions can be a credible tool for cost estimation in prefeasibility and feasibility studies of mining and construction projects.
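A minimal sketch of the two regression forms described, fitted on made-up example data (using bucket size as the single UVER variable is an assumption): UVER fits cost = a*exp(b*x) through a log-linear least-squares fit, and MVLR regresses cost on principal components of the five machine parameters.

```python
# Illustrative sketch (synthetic data, not the paper's dataset): UVER and PCA-based MVLR
# cost models for backhoe shovels.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 40
# Columns: bucket size [m^3], digging depth [m], dump height [m], weight [t], power [kW]
X = rng.uniform([1, 4, 3, 20, 100], [10, 9, 8, 120, 600], size=(n, 5))
capital_cost = 150_000 * np.exp(0.12 * X[:, 0]) * rng.lognormal(0, 0.05, n)  # synthetic target

# UVER: cost = a * exp(b * bucket_size), fitted as ln(cost) = ln(a) + b * bucket_size
b, ln_a = np.polyfit(X[:, 0], np.log(capital_cost), 1)
print(f"UVER model: cost ~ {np.exp(ln_a):,.0f} * exp({b:.3f} * bucket_size)")

# MVLR on principal components of the standardized machine parameters
mvlr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
mvlr.fit(X, capital_cost)
print("MVLR R^2 on training data:", round(mvlr.score(X, capital_cost), 3))
```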
CCTCOVID: COVID-19 detection from chest X-ray images using Compact Convolutional Transformers
COVID-19 is a novel virus that attacks the upper respiratory tract and the lungs. It spreads from person to person considerably rapidly, which has caused serious problems in nearly every facet of individuals' lives. While some infected individuals may remain completely asymptomatic, others frequently exhibit mild to severe symptoms. In addition, thousands of deaths around the globe indicate that detecting COVID-19 is an urgent need in communities. In practice, this is prominently done with the help of screening medical images such as Computed Tomography (CT) and X-ray images. However, cumbersome clinical procedures and a large number of daily cases have imposed great challenges on medical practitioners. Deep Learning-based approaches have demonstrated profound potential in a wide range of medical tasks. We therefore introduce a transformer-based method for automatically detecting COVID-19 from X-ray images using Compact Convolutional Transformers (CCT). Our extensive experiments prove the efficacy of the proposed method, with an accuracy of 99.22%, which outperforms previous works.
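A minimal sketch of a compact convolutional transformer classifier of the kind described, assuming PyTorch; the tokenizer depth, embedding width, number of encoder layers, and binary COVID/normal output are illustrative assumptions rather than the paper's exact configuration.

```python
# Sketch (assumed hyperparameters): a small CCT-style model = convolutional tokenizer
# + transformer encoder + sequence pooling, for binary chest X-ray classification.
import torch
import torch.nn as nn

class CompactConvTransformer(nn.Module):
    def __init__(self, embed_dim=128, depth=4, heads=4, n_classes=2):
        super().__init__()
        # Convolutional tokenizer: image -> sequence of patch embeddings
        self.tokenizer = nn.Sequential(
            nn.Conv2d(1, 64, 3, stride=2, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, embed_dim, 3, stride=2, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        layer = nn.TransformerEncoderLayer(embed_dim, heads, dim_feedforward=256,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.attn_pool = nn.Linear(embed_dim, 1)   # sequence pooling instead of a class token
        self.head = nn.Linear(embed_dim, n_classes)

    def forward(self, x):                          # x: (batch, 1, H, W) grayscale X-ray
        t = self.tokenizer(x)                      # (batch, embed_dim, h, w)
        t = t.flatten(2).transpose(1, 2)           # (batch, tokens, embed_dim)
        t = self.encoder(t)
        w = torch.softmax(self.attn_pool(t), dim=1)    # attention weights over tokens
        pooled = (w * t).sum(dim=1)                    # weighted average of token embeddings
        return self.head(pooled)

model = CompactConvTransformer()
logits = model(torch.randn(2, 1, 224, 224))        # two dummy 224x224 X-rays
print(logits.shape)                                # (2, 2): COVID vs. normal scores
```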
Effectiveness of "rescue saccades" on the accuracy of tracking multiple moving targets: An eye-tracking study on the effects of target occlusions
Occlusion is one of the main challenges in tracking multiple moving objects. In almost all real-world scenarios, a moving object or a stationary obstacle occludes targets partially or completely for a short or long time during their movement. A previous study (Zelinsky & Todor, 2010) reported that subjects make timely saccades toward an object in danger of being occluded. Observers make these so-called "rescue saccades" to prevent target swapping. In this study, we examined whether these saccades are helpful. To this aim, we used as stimuli recorded videos of zebrafish larvae swimming freely in a circular container. We considered two main types of occlusion: object-object occlusions, which naturally exist in the videos, and object-occluder occlusions, created by adding a stationary doughnut-shaped occluder to some videos. Four scenarios were studied: (1) no occlusions, (2) only object-object occlusions, (3) only object-occluder occlusions, and (4) both object-object and object-occluder occlusions. For each condition, two set sizes (two and four targets) were used. Participants' eye movements were recorded during tracking, and rescue saccades were extracted afterward. The results showed that rescue saccades are helpful in handling object-object occlusions but had no reliable effect on tracking through object-occluder occlusions. The presence of occlusions generally increased visual sampling of the scenes; nevertheless, tracking accuracy declined due to occlusion.
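A minimal sketch of how saccades might be extracted from recorded gaze samples, using a simple velocity-threshold (I-VT) criterion; the sampling rate and the 30 deg/s threshold are illustrative assumptions, not the study's analysis parameters.

```python
# Illustrative velocity-threshold saccade detection (assumed parameters, not the
# study's exact analysis pipeline).
import numpy as np

def detect_saccades(gaze_deg, fs=1000.0, vel_threshold=30.0):
    """Return (start, end) sample indices of saccades in an (n, 2) gaze trace in degrees."""
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * fs  # deg/s
    fast = velocity > vel_threshold
    # Find contiguous runs of above-threshold samples
    edges = np.diff(fast.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if fast[0]:
        starts = np.r_[0, starts]
    if fast[-1]:
        ends = np.r_[ends, len(fast)]
    return list(zip(starts, ends))

# Example: a synthetic trace with one abrupt 5-degree gaze shift
t = np.arange(0, 1.0, 0.001)
gaze = np.column_stack([np.where(t > 0.5, 5.0, 0.0), np.zeros_like(t)])
print(detect_saccades(gaze))   # one brief "saccade" around sample 500
```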