Block-regularized 5×2 Cross-validated McNemar's Test for Comparing Two Classification Algorithms
In the task of comparing two classification algorithms, the widely-used
McNemar's test aims to infer the presence of a significant difference between
the error rates of the two classification algorithms. However, the power of the
conventional McNemar's test is usually unpromising because the hold-out (HO)
method in the test relies on a single train-validation split, which typically
yields a highly variable estimate of the error rates. In contrast, a
cross-validation (CV) method repeats the HO method multiple times and
produces a stable estimate; a CV method therefore has great potential to
improve the power of McNemar's test. Among all types of CV methods, the
block-regularized 5×2 CV (BCV) has been shown in many previous studies
to be superior to the other CV methods for the algorithm comparison task
because the 5×2 BCV produces a high-quality estimator of the error
rate by regularizing the numbers of overlapping records between all training
sets. In this study, we compress the 10 correlated contingency tables in the
5×2 BCV into one effective contingency table. We then define a
5×2 BCV McNemar's test on the basis of the effective contingency table.
We demonstrate the reasonable type I error rate and the promising power of the
proposed 5×2 BCV McNemar's test on multiple simulated and real-world
data sets.
Comment: 12 pages, 6 figures, and 5 tables
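The effective-table idea can be sketched in a few lines. This is a minimal illustration, not the authors' exact statistic: it simply sums the discordant counts of the per-split 2×2 tables into one effective table and applies a continuity-corrected McNemar statistic. The function name and the table layout (`t[0][1]` = A wrong/B right, `t[1][0]` = A right/B wrong) are assumptions for illustration:

```python
import math

def bcv_mcnemar(tables):
    """McNemar-style test from several correlated 2x2 contingency tables.

    Illustrative sketch: compress the 10 per-split tables of a 5x2
    block-regularized CV by summing their discordant counts into one
    effective table, then apply the continuity-corrected McNemar test.
    """
    n01 = sum(t[0][1] for t in tables)  # algorithm A wrong, B right
    n10 = sum(t[1][0] for t in tables)  # algorithm A right, B wrong
    # Continuity-corrected McNemar statistic, approximately chi^2 with 1 dof
    stat = (abs(n01 - n10) - 1) ** 2 / (n01 + n10)
    # chi^2(1) tail probability via the complementary error function
    p_value = math.erfc(math.sqrt(stat) / math.sqrt(2))
    return stat, p_value
```

A statistic near or above 3.84 (the 5% chi-square critical value with one degree of freedom) would indicate a significant difference between the two algorithms' error rates.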
Research on Spillover Effect of Paid Search Advertising Channels
With the diversification of paid search advertising channels, e-commerce enterprises are paying more and more attention to how to evaluate the effectiveness of different paid search advertising channels correctly and accurately, so as to choose the optimal advertising channel or channels. We develop a multivariate time series model to investigate the spillover effect of paid search advertising channels based on the ad click-through rate and conversion rate, and calibrate the model using an e-commerce site's web log data. We determine the long-term equilibrium relationship between each channel's advertisement clicks through the co-integration test, and evaluate the short-term fluctuation effects in the interactions between each channel's advertisement clicks through the vector error correction model. Based on the empirical results, this paper puts forward suggestions on the advertising strategy of this e-commerce website.
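The cointegration-plus-error-correction logic can be sketched with a two-step Engle-Granger procedure, a simpler stand-in for the full multivariate VECM the study estimates. Everything here is illustrative: the function names are invented, and a real analysis would use a package such as statsmodels with proper unit-root and cointegration tests.

```python
def _ols1(x, y):
    """Single-regressor OLS: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return my - slope * mx, slope

def error_correction_sketch(clicks_a, clicks_b):
    """Two-step Engle-Granger sketch of the cointegration/VECM idea.

    Step 1 estimates the long-run equilibrium between the two channels'
    clicks; step 2 regresses short-run changes in channel B on the lagged
    equilibrium error. A negative adjustment coefficient means deviations
    from equilibrium are corrected over time (the channels co-move).
    """
    a0, beta = _ols1(clicks_a, clicks_b)            # long-run relation
    ect = [b - (a0 + beta * a) for a, b in zip(clicks_a, clicks_b)]
    db = [clicks_b[t] - clicks_b[t - 1] for t in range(1, len(clicks_b))]
    _, adj = _ols1(ect[:-1], db)                    # error-correction term
    return beta, adj
```

On two cointegrated click series, `beta` recovers the long-run equilibrium slope and `adj` comes out negative, reflecting the pull back toward equilibrium.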
Consumer Coupon Redemption Behavior Prediction on B2C E-commerce
Recognizing which users are inclined to redeem the coupons they receive, and then sending them coupon reminders to improve the coupon redemption rate and reduce marketing cost, has become an important issue in the coupon decision-making process. Based on the log data and transaction data in an enterprise database, this study combines demographics, past purchasing behavior, past coupon-usage behavior, and visiting behavior during the coupon validity period to construct an e-coupon redemption behavior prediction model. The model is constructed to help e-commerce enterprises identify the target users who are coupon-prone after the coupons are issued, so as to send coupon reminders in time and enhance the effectiveness of coupon marketing.
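A prediction model over those four feature groups can be sketched with plain logistic regression. This is a minimal stand-in, not the study's actual classifier, and the feature names are hypothetical:

```python
import math

# Hypothetical feature groups echoing the study: demographics, past
# purchases, past coupon usage, and visits during the validity period.
FEATURES = ["age_norm", "past_orders", "past_coupons_used", "validity_visits"]

def train_logistic(X, y, lr=0.5, epochs=300):
    """Logistic regression fitted by stochastic gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                        # gradient of the log loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict_redemption(w, b, xi, threshold=0.5):
    """1 = send a reminder (user is predicted coupon-prone), 0 = do not."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= threshold else 0
```

Users scored above the threshold would receive the timely reminder the abstract describes; the threshold itself trades off reminder cost against missed redemptions.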
Enhancing User Loyalty through Network Externality: An Empirical Study on B2B Platform
Loyal users are vital to the future of B2B platforms amid rapid development and intense competition. This study examines how network externality, in terms of direct network externality and indirect network externality, enhances B2B platform users' perceived value, and how such perception of value, in turn, influences their satisfaction and loyalty. First, we develop a conceptual model to describe the formation mechanism of user (seller) loyalty on a B2B platform. Second, based on domestic and international literature, we develop a questionnaire. From a well-known B2B platform, we obtain 1,348 valid samples. Finally, using the structural equation modeling approach, we fit the conceptual model. The empirical results show that network externality can serve as a pre-driver of perceived value, thereby affecting user loyalty, but it has no direct influence on user satisfaction.
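The hypothesized paths can be illustrated with sequential regressions, a crude path-analysis stand-in for the SEM fit the study actually performs. Variable names and path structure here are invented for illustration (direct/indirect network externality → perceived value; perceived value and satisfaction → loyalty):

```python
def _ols2(x1, x2, y):
    """Two-predictor OLS via the normal equations on centered variables."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [a - m1 for a in x1]
    c2 = [a - m2 for a in x2]
    cy = [a - my for a in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    sy1 = sum(a * b for a, b in zip(c1, cy))
    sy2 = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return ((sy1 * s22 - sy2 * s12) / det,
            (sy2 * s11 - sy1 * s12) / det)

def path_coefficients(dne, ine, pv, sat, loy):
    """Estimate hypothesized paths by sequential regressions:
    perceived value <- externalities, loyalty <- value and satisfaction."""
    pv_on_dne, pv_on_ine = _ols2(dne, ine, pv)
    loy_on_pv, loy_on_sat = _ols2(pv, sat, loy)
    return pv_on_dne, pv_on_ine, loy_on_pv, loy_on_sat
```

A real SEM additionally models latent constructs and measurement error, which this sketch ignores.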
MECHANISTIC INVESTIGATION OF ADDITIONS TO ALKENES
We have conducted correlation studies on ten alkene addition reactions in this project in order to explore the substituent effects on alkene reactivity in these reactions. In these studies, we have correlated the relative reactivities of alkenes against their measurable characteristics, such as the ionization potentials (IPs), the highest occupied molecular orbital (HOMO) energy levels, and sometimes the lowest unoccupied molecular orbital (LUMO) energy levels, in order to determine the relative magnitudes of electronic and steric effects in the rate-determining step of the alkene addition. The results from our correlation studies indicate that the majority of the alkene reactions included in this project are electrophilic additions to alkenes, either with significant steric effects, such as in acid-catalyzed hydration and complexation with solid iodine, or without significant steric effects, such as in chlorination, bromination, oxidation with chromyl chloride and with chromic acid, ISCN addition, and ICl addition. Only two reactions, oxidation with palladium chloride and homogeneous hydrogenation in the presence of Wilkinson's catalyst, were found to be nucleophilic additions with significant steric effects. These results are helpful in predicting relative alkene reactivities in these reactions based on the substituents on the C=C bonds. The patterns of the correlation plots in some studies have also provided supportive evidence that helped us differentiate between alternatively proposed mechanisms for the studied alkene additions.
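The correlation analysis itself is straightforward: for an electrophilic addition, log relative rate is expected to correlate positively with HOMO energy (and negatively with ionization potential). A minimal sketch follows; the numeric values below are invented for illustration, not measured data from the study:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / math.sqrt(sxx * syy)

# Hypothetical data: HOMO energies (eV) for a series of substituted
# alkenes, and log relative rates in an electrophilic addition.
homo_ev = [-9.2, -9.0, -8.8, -8.6, -8.4]
log_rel_rate = [0.1, 0.9, 2.1, 2.8, 4.0]
```

A strong positive `pearson_r(homo_ev, log_rel_rate)` supports an electrophilic rate-determining step dominated by electronic effects; systematic outliers for bulky substituents would instead signal significant steric effects.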
Arithmetic Average Density Fusion -- Part III: Heterogeneous Unlabeled and Labeled RFS Filter Fusion
This paper proposes a heterogeneous density fusion approach to scalable
multisensor multitarget tracking in which the inter-connected sensors run
different types of random finite set (RFS) filters according to their
respective capacity and needs. These diverse RFS filters yield heterogeneous
multitarget densities that must be fused with each other in a proper manner
for more robust and accurate detection and localization of the targets. Our
approach is based on Gaussian mixture implementations in which the local
Gaussian components (L-GCs) are revised for PHD consensus, i.e., so that the
corresponding unlabeled probability hypothesis densities (PHDs) of each filter
best fit their average, regardless of the specific type of the local
densities. To this end, a computationally efficient coordinate descent
approach is proposed which revises only the weights of the L-GCs, keeping the
other parameters unchanged. In particular, the PHD filter and the unlabeled
and labeled multi-Bernoulli (MB/LMB) filters are considered. Simulations have
demonstrated the effectiveness of the proposed approach for both homogeneous
and heterogeneous fusion of the PHD-MB-LMB filters in different
configurations.
Comment: 11 pages, 14 figures. IEEE Transactions on Aerospace and Electronic
Systems, 202
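The weight-only coordinate descent can be sketched in one dimension. This is an illustrative least-squares version of the idea, not the paper's exact algorithm: each local Gaussian component's weight is updated in turn, holding means, variances, and the other weights fixed, so the local PHD fits the arithmetic average PHD on a grid.

```python
import math

def gauss(x, m, s):
    """1-D Gaussian density N(x; m, s^2)."""
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def phd(x, comps):
    """PHD (intensity) of a Gaussian mixture: sum of w_k * N(x; m_k, s_k)."""
    return sum(w * gauss(x, m, s) for w, m, s in comps)

def fuse_weights(local, grid, avg_vals, iters=50):
    """Coordinate descent over L-GC weights only (means/variances fixed).

    Each pass exactly minimizes the squared deviation from the average
    PHD with respect to one weight, projected to stay nonnegative, so the
    objective is nonincreasing.
    """
    comps = [list(c) for c in local]
    for _ in range(iters):
        for k in range(len(comps)):
            gk = [gauss(x, comps[k][1], comps[k][2]) for x in grid]
            # residual of the average PHD after all *other* components
            resid = [a - (phd(x, comps) - comps[k][0] * g)
                     for x, g, a in zip(grid, gk, avg_vals)]
            num = sum(g * r for g, r in zip(gk, resid))
            den = sum(g * g for g in gk)
            comps[k][0] = max(num / den, 0.0)  # weights stay nonnegative
    return comps
```

In the paper's setting the average is taken over the unlabeled PHDs extracted from heterogeneous filters (PHD, MB, LMB); here a simple two-mixture average stands in for it.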