
    Challenges to describe QoS requirements for web services quality prediction to support web services interoperability in electronic commerce

    Quality of service (QoS) is essential to quality assurance for web service applications, and web services quality has contributed to the successful implementation of Electronic Commerce (EC) applications. However, QoS remains a major open issue for web services research and one of the main questions that need to be explored. We believe that QoS should not only be measured but also predicted during the development and implementation stages. However, there are challenges and constraints in determining and choosing QoS requirements for high-quality web services. This paper therefore highlights the challenges of QoS requirements prediction, as such requirements are not easy to identify; moreover, web services serve many different perspectives and purposes, and various prediction techniques exist to describe QoS requirements. Additionally, the paper introduces a metamodel as a concept of what makes a good web service.

    Modeling Within- and Across-Customer Association in Lifetime Value with Copulas

    Recent advances in linking Recency-Frequency-Monetary value (RFM) data to Customer Lifetime Value (CLV) in non-contractual settings rely on the assumption of independence between the transaction and spend processes. We propose to model jointly the inter- and intra-customer dependency between both processes using copulas, thereby accounting for the double correlation within and across customers. Applied to a unique data set of securities' transactions, we find that modeling both associations enhances the accuracy of CLV predictions, thus improving customer valuation and selection tasks.
    Keywords: association; copula; customer lifetime value; across and within customers
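    The copula construction this abstract relies on can be illustrated with a minimal sketch. This is a generic Gaussian-copula example, not the authors' specification: it only shows how two uniform margins (to which a transaction model and a spend model could be attached by inverse-transform sampling) are made dependent through a latent correlated normal.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def gaussian_copula_pairs(rho, n, rng):
    """Draw n pairs of uniforms (u, v) whose joint dependence is a Gaussian
    copula with correlation rho. Marginals for the transaction and spend
    processes could then be applied coordinate-wise by inverse transform."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = np.array([norm_cdf(a) for a in z[:, 0]])
    v = np.array([norm_cdf(b) for b in z[:, 1]])
    return u, v

rng = np.random.default_rng(7)
u, v = gaussian_copula_pairs(0.6, 20000, rng)
# u and v are each approximately Uniform(0, 1), but positively dependent
```

    The key design point is the separation the abstract exploits: the copula carries all of the dependence, so the marginal models for transactions and spend can be specified (and estimated) independently of the association structure.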

    New Approach for Market Intelligence Using Artificial and Computational Intelligence

    Small and medium-sized retailers are central to the private sector and a vital contributor to economic growth, but they often face enormous challenges in unleashing their full potential. Financial pitfalls, lack of adequate access to markets, and difficulties in exploiting technology have prevented them from achieving optimal productivity. Market Intelligence (MI) is the knowledge extracted from numerous internal and external data sources, aimed at providing a holistic view of the state of the market and influencing marketing-related decision-making processes in real time. A related, burgeoning phenomenon and crucial topic in the field of marketing is Artificial Intelligence (AI), which entails fundamental changes to the skill sets marketers require. A vast amount of knowledge is stored in retailers’ point-of-sale databases, but the format of this data often makes that knowledge hard to access and identify. As a powerful AI technique, Association Rules Mining helps to identify frequently associated patterns stored in large databases to predict customers’ shopping journeys. Consequently, the method has emerged as the key driver of cross-selling and upselling in the retail industry. At the core of this approach is Market Basket Analysis, which captures knowledge from heterogeneous customer shopping patterns and examines the effects of marketing initiatives. Apriori, which enumerates frequent itemsets purchased together (as market baskets), is the central algorithm in the analysis process. Problems occur because Apriori lacks computational speed and has weaknesses in providing intelligent decision support. As the number of simultaneous database scans grows, the computational cost increases and performance decreases dramatically. Moreover, there are gaps in decision support, especially in methods for finding rarely occurring events and for identifying a brand's trending popularity before it peaks.
    As the objective of this research is to find intelligent ways to help small and medium-sized retailers grow with an MI strategy, we demonstrate the effects of AI with algorithms for data preprocessing, market segmentation, and finding market trends. Using the sales database of a small, local retailer, we show how our Åbo algorithm increases mining performance and intelligence, and how it helps to extract valuable marketing insights for assessing demand dynamics and product popularity trends. We also show how this results in commercial advantage and tangible return on investment. Additionally, an enhanced normal distribution method assists data pre-processing and helps to explore different types of potential anomalies.
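    The Apriori procedure named above works bottom-up: count 1-itemsets, keep those meeting a minimum support, then repeatedly join surviving (k-1)-itemsets into k-itemset candidates and prune any candidate with an infrequent subset. A minimal sketch on toy basket data (not the retailer's database, and without the Åbo algorithm's enhancements):

```python
from itertools import combinations

def apriori(baskets, min_support):
    """Return all frequent itemsets (as frozensets) mapped to their support."""
    n = len(baskets)
    # level 1: count individual items
    counts = {}
    for basket in baskets:
        for item in basket:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
    result = dict(frequent)
    k = 2
    while frequent:
        # candidate generation: k-subsets of surviving items whose every
        # (k-1)-subset is itself frequent (the Apriori pruning step)
        items = sorted({i for s in frequent for i in s})
        candidates = [frozenset(c) for c in combinations(items, k)
                      if all(frozenset(sub) in frequent
                             for sub in combinations(c, k - 1))]
        counts = {c: sum(1 for b in baskets if c <= set(b)) for c in candidates}
        frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
        result.update(frequent)
        k += 1
    return result

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
freq = apriori(baskets, min_support=0.5)
```

    The repeated full pass over `baskets` at every level is exactly the cost the abstract criticizes: each level of candidates triggers another scan of the whole database, which is what makes plain Apriori slow on large point-of-sale data.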

    Data analytics 2016: proceedings of the fifth international conference on data analytics


    Are property prices non-linear? An investigation of the behaviour of US REITs and UK property company shares

    Linear models of market performance may be misspecified if the market is subdivided into distinct regimes exhibiting different behaviour. Price movements in the US Real Estate Investment Trust and UK property company markets are explored using a Threshold Autoregressive (TAR) model with regimes defined by the real rate of interest. In both the US and UK markets, distinctive behaviour emerges, with the TAR model offering better predictive power than a more conventional linear autoregressive model. The research points to the possibility of developing trading rules to exploit the systematically different behaviour across regimes.
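    A two-regime TAR of the kind described, with the regime switched by a lagged threshold variable standing in for the real rate of interest, can be sketched as follows. The coefficients, threshold, and simulated data are illustrative, not the paper's estimates:

```python
import numpy as np

def fit_tar(y, z, threshold):
    """Two-regime TAR(1): y_t = a_r + b_r * y_{t-1} + e_t, where regime r is
    chosen by whether the threshold variable z_{t-1} exceeds `threshold`.
    Each regime is estimated by ordinary least squares on its own subsample."""
    y_lag, y_cur, z_lag = y[:-1], y[1:], z[:-1]
    params = {}
    for name, mask in (("low", z_lag <= threshold), ("high", z_lag > threshold)):
        X = np.column_stack([np.ones(mask.sum()), y_lag[mask]])
        params[name], *_ = np.linalg.lstsq(X, y_cur[mask], rcond=None)
    return params

# simulate returns with known regime-dependent dynamics, then recover them
rng = np.random.default_rng(0)
T = 4000
z = rng.normal(size=T)          # stand-in for the real rate of interest
y = np.zeros(T)
for t in range(1, T):
    a, b = (0.1, 0.8) if z[t - 1] <= 0.0 else (-0.2, 0.3)
    y[t] = a + b * y[t - 1] + 0.1 * rng.normal()

params = fit_tar(y, z, threshold=0.0)
```

    In practice the threshold itself is usually unknown and is chosen by a grid search minimizing the pooled residual sum of squares; the sketch fixes it in advance to keep the regime-splitting logic visible.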

    Intraday Dynamics of Volatility and Duration: Evidence from the Chinese Stock Market

    We propose a new joint model of intraday returns and durations to study the dynamics of several Chinese stocks. We include IBM from the U.S. market for comparison purposes. Flexible innovation distributions are used for durations and returns, and the total variance of returns is decomposed into different volatility components associated with different transaction horizons. Our new model strongly dominates existing specifications in the literature. The conditional hazard functions are non-monotonic, and there is strong evidence for different volatility components. Although diurnal patterns, volatility components, and market microstructure implications are similar across the markets, there are interesting differences. Durations for lightly traded Chinese stocks tend to carry more information than those for heavily traded stocks. Chinese investors usually have longer investment horizons, which may be explained by the specific trading rules in China.
    Keywords: market microstructure; transaction horizon; high-frequency data; ACD; GARCH
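    The duration side of joint models in this literature is typically an ACD recursion, referenced in the keywords above. A minimal ACD(1,1) simulation sketch, with illustrative parameters rather than the paper's estimates: the conditional expected duration psi_t is updated from the last observed duration and its own lag, and the observed duration is psi_t scaled by a unit-mean innovation (here exponential, though the abstract stresses more flexible distributions).

```python
import numpy as np

def simulate_acd(omega, alpha, beta, n, rng):
    """Simulate an ACD(1,1) duration process:
        psi_t = omega + alpha * x_{t-1} + beta * psi_{t-1}
        x_t   = psi_t * eps_t,   eps_t ~ Exp(1)
    Requires alpha + beta < 1 for a finite unconditional mean."""
    psi = np.empty(n)
    x = np.empty(n)
    psi[0] = omega / (1.0 - alpha - beta)   # start at the unconditional mean
    x[0] = psi[0] * rng.exponential()
    for t in range(1, n):
        psi[t] = omega + alpha * x[t - 1] + beta * psi[t - 1]
        x[t] = psi[t] * rng.exponential()
    return x, psi

rng = np.random.default_rng(1)
x, psi = simulate_acd(omega=0.2, alpha=0.1, beta=0.8, n=50000, rng=rng)
# unconditional mean duration is omega / (1 - alpha - beta) = 2.0
```

    The GARCH-like structure of the recursion is what lets duration clustering (bursts of rapid trading) feed into the volatility components the abstract decomposes.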

    Firms, Courts, and Reputation Mechanisms: Towards a Positive Theory of Private Ordering

    This Essay formulates a positive model that predicts when commercial parties will employ private ordering to enforce their agreements. The typical enforcement mechanism associated with private ordering is the reputation mechanism, in which a merchant community punishes parties in breach of contract by denying them future business. The growing private ordering literature argues that these private enforcement mechanisms can be superior to the traditional, less efficient enforcement measures provided by public courts. However, previous comparisons between public and private contractual enforcement have presented a misleading dichotomy by failing to consider a third enforcement mechanism: the vertically integrated firm. This Essay develops a model that comprehensively addresses three distinct types of enforcement mechanisms: firms, courts, and reputation-based private ordering. The model rests on a synthesis of transaction cost economics, which compares the efficiencies of firms versus markets, and the private ordering literature, which compares the efficiencies of public courts versus private ordering. It hypothesizes that private ordering will arise when agreements present enforcement difficulties, high-powered market incentives are important, and the costs of entry barriers are low. The Essay then conducts an illustrative test by comparing the model's predictions to documented instances of private ordering.