10 research outputs found

    How Are Patent Cases Resolved? An Empirical Examination of the Adjudication and Settlement of Patent Disputes

    Get PDF
    From an institutional perspective, the patent system is a two-stage bargain. At the first stage, the U.S. Patent and Trademark Office (PTO) grants patent rights to inventors after examining the prior art and the patent application to determine whether the requirements for patentability are met. At the second stage, to enforce their issued patent rights, patentees must bring an action for patent infringement in federal court. This Article is organized as follows. Part II reviews the previous literature on patent litigation. Part III describes our methodology for collecting data on patent cases and classifying them according to the precise manner in which they were resolved; we then analyze the results and insights gained from a study of these case outcomes. Part IV presents our analysis of the costs of patent litigation across all cases, for cases adjudicated through final judgments, and for cases with formal rulings of infringement or invalidity. Part V presents our conclusions.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Get PDF
    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents covering a variety of research fields, against which newly developed literature search techniques could be compared, improved, and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH (RELISH) consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) articles. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields, or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency, and PubMed Related Articles) had similar overall performance. Additionally, each of these methods tends to produce a distinct collection of recommended articles, suggesting that a hybrid method may be required to capture all relevant articles. The established database server at https://relishdb.ict.griffith.edu.au is freely available for downloading annotation data and for the blind testing of new methods. We expect that this benchmark will stimulate the development of powerful new techniques for title- and title/abstract-based search engines for relevant articles in biomedical research.
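
    The baseline methods named in this abstract (Okapi BM25, TF-IDF, PubMed Related Articles) are standard lexical ranking techniques. The following Python snippet is a minimal sketch of how a TF-IDF baseline scores candidate articles against a seed article, assuming scikit-learn; the texts and variable names are invented placeholders, not RELISH data or the consortium's evaluation pipeline.

    # A minimal TF-IDF similarity baseline, assuming scikit-learn is installed.
    # Illustrative only: the texts below are placeholders, not RELISH data.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    seed = "Benchmarking document similarity detection in biomedical literature search."
    candidates = [
        "A gold-standard benchmark for literature recommendation in biomedicine.",
        "Multinomial logit analysis of municipal solid waste contracting.",
    ]

    # Fit one vocabulary over the seed plus candidates; row 0 is the seed.
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([seed] + candidates)

    # Cosine similarity of each candidate to the seed; higher = more relevant.
    scores = cosine_similarity(matrix[0], matrix[1:]).ravel()
    for text, score in sorted(zip(candidates, scores), key=lambda p: -p[1]):
        print(f"{score:.3f}  {text}")

    A real evaluation against the RELISH annotations would rank each seed's candidate pool this way and compare the ranking to the expert relevance labels.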

    Privatization of Public Services: Market Structure, Analysis of Performance and Implications for Anti-Trust

    No full text
    97 p. Thesis (Ph.D.), University of Illinois at Urbana-Champaign, 2006. Privatization through contracting has been promoted as a way for government to control costs, yet many services are still provided through apparently less efficient mechanisms. Researchers have speculated why this should be the case. Others have questioned whether privatization can meet all its objectives, since governments will not achieve cost savings if there is not sufficient competition for contracts. This study examines these questions for the privatization of municipal solid waste collection, employing two new and original data sources: a survey of Illinois municipalities and a compilation of firms competing for solid waste contracts. It finds that while the vast majority of cities contract out, a substantial number supply the service in-house (8%) or through "open" systems in which households choose their waste hauler from among a number of private firms (30%). The study also found a decline of almost 50% in the number of firms competing for contracts since 1995. City choice among delivery modes was analyzed using multinomial logit analysis; the most important factors explaining choices were whether the city hosted a waste disposal facility, the age and diversity of the population, and the degree of local competition for contracts. The propensity of firms to bid on contracts was also analyzed; the distance from the firm to the city, the size of the firm, the number of competitors located closer to the city, and whether the firm was vertically integrated through ownership of a waste disposal facility explained a firm's willingness to compete for a contract. The predicted probability of bidding across all firms was computed for every contracting city and used to simulate the number of expected bidders over time. This last analysis demonstrated the decline in competition for contracts more clearly than the techniques used in standard anti-trust analysis of spatial markets.
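
    The multinomial logit analysis this abstract describes models a city's choice among three delivery modes (contracting out, in-house supply, open system) as a function of city characteristics. The following Python sketch shows what such an estimation looks like with statsmodels; the data are synthetic and the variable names merely echo the factors named in the abstract, so this is illustrative only, not the study's dataset or code.

    # A minimal multinomial logit of delivery-mode choice, assuming statsmodels
    # and pandas are installed. Synthetic data; not the study's survey.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "hosts_disposal_site": rng.integers(0, 2, n),   # city hosts a disposal facility
        "median_age": rng.normal(38.0, 6.0, n),         # population-age proxy
        "n_local_competitors": rng.poisson(3, n),       # local competition for contracts
    })
    df["mode"] = rng.integers(0, 3, n)  # 0 = contract out, 1 = in-house, 2 = open system

    # Estimate P(mode | city characteristics) with a multinomial logit.
    X = sm.add_constant(df[["hosts_disposal_site", "median_age", "n_local_competitors"]])
    model = sm.MNLogit(df["mode"], X).fit(disp=False)
    print(model.summary())

    The fitted coefficients give, for each non-baseline mode, how each factor shifts the log-odds of choosing that mode relative to contracting out.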

    Chapter Two: Durrell as Research Leader

    No full text