3,705 research outputs found

    Procrastination in the Workplace: Evidence from the U.S. Patent Office

    Despite much theoretical attention to the concept of procrastination and much exploration of this phenomenon in laboratory settings, there remain few empirical investigations into the practice of procrastination in real-world contexts, especially the workplace. In this paper, we attempt to fill this gap by exploring procrastination among U.S. patent examiners. We find that nearly half of examiners’ first substantive reports are completed immediately before the operable deadlines. Moreover, we find a range of additional empirical markers supporting that this “end-loading” of reviews results from procrastination rather than from various time-consistent models of behavior. In one such approach, we take advantage of the natural experiment afforded by the Patent Office’s staggered implementation of its telecommuting program, a large-scale development that we theorize might exacerbate employee self-control problems because of the ensuing reduction in direct supervision. Consistent with the procrastination theory, we estimate an immediate spike in application end-loading and other indicia of procrastination at the onset of telecommuting. Finally, contributing to a growing empirical literature on the efficiency of the patent examination process, we assess the consequences of procrastination for the quality of the reviews completed by the affected examiners. This analysis suggests that the primary harm from procrastination is delay in the ultimate application process, with rushed reviews completed at deadlines resulting in the need for revisions in subsequent rounds of review. Our findings imply that nearly one-sixth of the annual growth in the Agency’s much-publicized backlog may be attributable to examiner procrastination.

    The Internet as a Service Channel in the Public Sector: A Substitute for or a Complement to Traditional Service Channels?

    The Internet has been used as a channel for public service delivery since the mid-1990s. In its first years it was believed to be the service channel of the future, one that would make all other channels obsolete. Until now, however, the telephone and face-to-face contact remain more frequently used and more highly rated. By comparing several studies recently conducted in a number of countries, this paper suggests that the channel’s characteristics make the Internet suitable for basic transactions and simple information provision, while the telephone and face-to-face contact remain prevalent for ambiguous and complex tasks. The Internet may therefore be a complement to, rather than a substitute for, traditional channels. Research findings are interpreted by means of Media Richness Theory, the Social Influence model and Channel Expansion Theory.

    Information Transparency and Market Efficiency in Blockchain-enabled Marketplaces: Role of Traders’ Analytical Ability

    Classic economic theory asserts that full information transparency entails information symmetry and, thus, market efficiency. We test whether this theory still holds in a blockchain-enabled marketplace where full information transparency is achieved. We leverage data from EnjinX, a non-fungible-token (NFT) marketplace, where the entire history of NFT transactions is symmetrically accessible to all buyers and sellers. Surprisingly, we observe substantial market inefficiencies. To explain the paradox that inefficiencies persist even in a fully information-transparent environment, we propose that traders’ limited analytical ability, rather than information asymmetry, ultimately drives market inefficiencies. We quantify analytical ability by examining whether traders’ performance can be augmented by machine-learning algorithms. We find that having ten more historical transactions increases market efficiency by 1.10%, whereas market efficiency can decrease by 69.02% when traders cannot effectively consume the available information. Our findings contribute to the literature by quantifying analytical ability and highlighting the analytical-ability divide phenomenon.
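    The abstract quantifies analytical ability by asking whether a machine-learning baseline can outperform the trader. A minimal sketch of that comparison, with purely illustrative numbers and a simple mean-absolute-error metric (an assumption, not the paper's measure):

```python
# Minimal sketch (not the paper's code): score a trader's "analytical
# ability" as the gap between a data-driven baseline's pricing error and
# the trader's own. All figures below are illustrative assumptions.

def mean_abs_error(prices, targets):
    """Mean absolute pricing error against realized values."""
    return sum(abs(p - t) for p, t in zip(prices, targets)) / len(prices)

def analytical_ability(trader_prices, model_prices, realized_prices):
    """Positive: the trader prices closer to realized value than the
    baseline model; negative: the trader fails to consume the available
    information as effectively as a simple algorithm."""
    trader_err = mean_abs_error(trader_prices, realized_prices)
    model_err = mean_abs_error(model_prices, realized_prices)
    return model_err - trader_err

realized = [100, 120, 90]
trader   = [130, 100, 60]   # trader misprices substantially
model    = [105, 115, 95]   # baseline tracks realized values closely
print(analytical_ability(trader, model, realized))  # negative: trader underperforms
```

    Under full transparency both the trader and the baseline see the same history, so a negative score isolates limited analytical ability rather than an information gap.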

    A software approach to enhancing quality of service in internet commerce


    Benchmarking the Implementation of E-Commerce: A Case Study Approach

    The purpose of this thesis was to develop a guideline to support the implementation of E-Commerce with E-Commerce benchmarking. Because of its importance as an interface with the customer, web-site benchmarking has been a widely researched topic. However, limited research has been conducted on benchmarking E-Commerce across other areas of the value chain. Consequently, this thesis aims to extend benchmarking into E-Commerce-related subjects. The literature review examined two main bodies of theory, E-Commerce and benchmarking, and it became apparent that a gap in the literature existed for E-Commerce benchmarking. To address this gap, a single-case-study exploratory methodology was applied. The case study method was considered most suitable for this research given the exploratory nature of the research aim and question, as well as the potential for new insights to be gained from the samples. Three sub-studies were conducted within this single-case-study exploratory design. In study 1, 20 semi-structured interviews were conducted to explore possible themes related to E-Commerce, benchmarking and E-Commerce benchmarking. Those themes were included in study 2, an exploratory quantitative questionnaire survey, in which 146 responses were analyzed. In study 3, six expert interviews were conducted to explore potential themes based on the findings of the first two studies. The data analysis included descriptive statistics and a mixture of grounded and content analysis. Several important findings emerged from this research. Firstly, E-Commerce benchmarking is mostly executed as web-site benchmarking, customer surveys and basic top-line indicators such as the E-Share. Secondly, exchange of best practices, target setting, customer satisfaction and competitive advantage emerged as benefits of E-Commerce benchmarking. Thirdly, there are two distinct differences between E-Commerce benchmarking and traditional benchmarking: (1) higher frequency and (2) the types of indicators used. Fourthly, external benchmarking, process benchmarking and additional indicators were identified as appropriate avenues for benchmarking E-Commerce. The contribution of this thesis lies in extending the current literature on E-Commerce benchmarking. Furthermore, a guideline for the implementation of E-Commerce benchmarking is provided. In summary, the thesis demonstrates that the implementation of E-Commerce can and needs to be benchmarked.

    Information Processing in Electronic Markets: Measuring Subjective Interpretation Using Sentiment Analysis

    Information availability plays an important role in the efficient resource allocation of electronic markets and e-commerce. Most of this information is of a qualitative nature, containing essential facts that are nonetheless difficult to decode. The information-processing capabilities of human agents facing such qualitative news are currently largely unknown. Accordingly, it is crucial to understand how different decision makers process qualitative information. In this paper we show that sentiment analysis facilitates research into qualitative information processing. We use a capital market example to demonstrate how investors and analysts perceive novel information. We find that their interpretations differ: investors rapidly translate novel information into transactions, whereas analysts take more time to respond. We further observe that analysts emphasize different parts of the information than investors do, and are less put off by complex information. The approach can be applied to other electronic markets and to the e-commerce industry, where individuals react to textual information.
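    Sentiment analysis of financial news is often operationalized with a word-list approach. A minimal sketch of that idea, where the word lists and the example sentence are illustrative assumptions rather than the authors' lexicon:

```python
# Minimal dictionary-based sentiment scorer: count positive and negative
# lexicon hits and normalize by document length. The word lists below are
# illustrative stand-ins, not the lexicon used in the paper.

POSITIVE = {"growth", "profit", "beat", "upgrade", "strong"}
NEGATIVE = {"loss", "miss", "downgrade", "weak", "risk"}

def sentiment_score(text):
    """Return (#positive - #negative) / #tokens for a news item."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

print(sentiment_score("strong profit growth despite risk"))  # 0.4
```

    Scores like this can then be correlated with trading volume (investor reaction speed) and with analyst revisions to compare how the two groups process the same text.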

    Improving Recommendation Performance with User Interest Evolution Patterns

    Effective recommendation is indispensable to customized or personalized services. Collaborative filtering is a salient technique for supporting automated recommendations; it relies on customer profiles to make recommendations to a target customer based on neighbors with similar preferences. However, traditional collaborative recommendation techniques use only static information about customers’ preferences and ignore the evolution of their purchasing behaviours, which contains valuable information for making recommendations. This study therefore proposes an approach that increases the effectiveness of personalized recommendations by mining sequence patterns from the evolving preferences of a target customer over time. The experimental results show that the proposed technique improves recommendation precision in comparison with the collaborative filtering method based on top-k recommendation.
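    To make the sequence-pattern idea concrete, here is a minimal stand-in (not the paper's algorithm): mine length-2 purchase transitions from customer histories and rank "next item" candidates by how often they follow the customer's last purchase. The item names are illustrative.

```python
# Minimal sequential-pattern sketch: count consecutive (prev -> next)
# purchase transitions across customers, then recommend the items most
# frequently bought right after the target customer's last purchase.

from collections import Counter

def mine_bigrams(histories):
    """Count consecutive (prev, next) purchase transitions."""
    counts = Counter()
    for seq in histories:
        for prev, nxt in zip(seq, seq[1:]):
            counts[(prev, nxt)] += 1
    return counts

def recommend_next(counts, last_item, k=2):
    """Top-k items most often purchased immediately after last_item."""
    cands = Counter({nxt: c for (prev, nxt), c in counts.items() if prev == last_item})
    return [item for item, _ in cands.most_common(k)]

histories = [
    ["phone", "case", "charger"],
    ["phone", "case", "headset"],
    ["phone", "charger"],
]
counts = mine_bigrams(histories)
print(recommend_next(counts, "phone"))  # ['case', 'charger']
```

    Unlike a static neighbor profile, the transition counts capture order, so recommendations track where the customer is in their purchasing trajectory.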

    Fraud Dataset Benchmark and Applications

    Standardized datasets and benchmarks have spurred innovations in computer vision, natural language processing, and multi-modal and tabular settings. Compared to other well-researched fields, fraud detection poses unique challenges: high class imbalance, diverse feature types, frequently changing fraud patterns, and the adversarial nature of the problem. Because of these, modeling approaches evaluated on datasets from other research fields may not work well for fraud detection. In this paper, we introduce the Fraud Dataset Benchmark (FDB), a compilation of publicly available datasets catered to fraud detection. FDB comprises a variety of fraud-related tasks, ranging from identifying fraudulent card-not-present transactions, detecting bot attacks, classifying malicious URLs and estimating the risk of loan default to content moderation. The Python-based library for FDB provides a consistent API for data loading with standardized training and testing splits. We demonstrate several applications of FDB that are of broad interest for fraud detection, including feature engineering, comparison of supervised learning algorithms, label noise removal, class-imbalance treatment and semi-supervised learning. We hope that FDB provides a common playground for researchers and practitioners in the fraud detection domain to develop robust and customized machine learning techniques targeting various fraud use cases.
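    One of the applications listed above is class-imbalance treatment. A minimal, library-agnostic sketch of one common treatment, random undersampling of the majority class; this is not FDB code, and it assumes fraud (label 1) is the minority class:

```python
# Minimal class-imbalance sketch: randomly sample the majority (legitimate)
# class down to the minority (fraud) class size, a common baseline treatment
# on highly imbalanced fraud datasets. Standard library only; not FDB code.

import random

def undersample(rows, labels, seed=0):
    """Balance a binary dataset by downsampling the majority class.
    Assumes label 1 (fraud) is the minority. Returns shuffled (rows, labels)."""
    pos = [i for i, y in enumerate(labels) if y == 1]   # fraud (minority)
    neg = [i for i, y in enumerate(labels) if y == 0]   # legit (majority)
    rng = random.Random(seed)
    keep = pos + rng.sample(neg, len(pos))
    rng.shuffle(keep)
    return [rows[i] for i in keep], [labels[i] for i in keep]

X = [[i] for i in range(10)]
y = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]      # 2 fraud, 8 legitimate
Xb, yb = undersample(X, y)
print(sum(yb), len(yb))                 # 2 4 -> 2 fraud, 2 legitimate
```

    In practice a fixed random seed keeps the treatment reproducible across the standardized train/test splits a benchmark provides.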

    Payment Fintechs and Debt Enforcement

    Fintech payment companies acting as lenders possess a potential solution to weak debt enforcement. Their location in the payment chain gives them a senior position in the borrowing merchant’s revenue stream, as the payment company can deduct part of the merchant’s sales it processes to amortize the loan. Our analysis of transactions processed through a fintech company offering such sales-linked loans suggests that some borrowers discontinuously reduce the sales processed through the company immediately after loan disbursal in order to strategically default. We find that competition from other lenders, and the merchant’s ability to accept cash, limit the effectiveness of this enforcement technology.
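    The repayment mechanics described above can be sketched with a few lines of arithmetic. The 10% holdback rate, loan size and sales figures are illustrative assumptions, not numbers from the paper:

```python
# Minimal sketch of sales-linked loan amortization: the payment company
# withholds a fixed share of each day's processed sales until the loan is
# repaid. Sales routed around the lender (e.g. cash) repay nothing, which
# is the strategic-default channel the paper documents.

def amortize(principal, holdback_rate, daily_sales):
    """Return (day repaid, total withheld); day is None if not yet repaid."""
    outstanding = principal
    for day, sales in enumerate(daily_sales, start=1):
        payment = min(holdback_rate * sales, outstanding)
        outstanding -= payment
        if outstanding <= 0:
            return day, principal
    return None, principal - outstanding

# Merchant keeps processing through the lender: repaid by day 3.
print(amortize(300, 0.10, [1000] * 5))          # (3, 300)
# Merchant shifts sales to cash right after disbursal: repayment stalls.
print(amortize(300, 0.10, [1000, 0, 0, 0, 0]))  # (None, 100)
```

    The contrast between the two calls shows why the lender's seniority is only as strong as its share of the merchant's payment flow.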

    Innovative Tokyo

    This paper compares and contrasts Tokyo's innovation structure with the industrial districts model and the international hub model in the literature on urban and regional development. The Tokyo model embraces and yet transcends both the industrial districts and international hub models. The paper details the key elements making up the Tokyo model: organizational knowledge creation, integral and co-location systems of corporate R&D and new product development, test markets, industrial districts and clusters, a participative consumer culture, continuous learning from abroad, local government policies, the national system of innovation, and the historical genesis of Tokyo in Japan's political economy. The paper finds that the Tokyo model of innovation will continue to evolve with the changing external environment, but will fundamentally retain its main characteristics. The lesson from the Tokyo model is that openness, a diversified industrial base, the continuing development of new industries, and an emphasis on innovation all contribute to the dynamism of a major metropolitan region.
    Keywords: Labor Policies; Environmental Economics & Policies; Public Health Promotion; ICT Policy and Strategies; Agricultural Knowledge & Information Systems; Health Monitoring & Evaluation; Innovation