Optimal Threshold Control by the Robots of Web Search Engines with Obsolescence of Documents
A typical web search engine consists of three principal parts: a crawling
engine, an indexing engine, and a searching engine. The present work aims to
optimize the performance of the crawling engine, which finds new web pages and
updates pages already stored in the database of the web search engine. The
crawling engine has several robots collecting information from the Internet.
We first calculate various performance measures of the system (e.g., the
probability of an arbitrary page being lost due to buffer overflow, the
probability of starvation of the system, and the average waiting time in the
buffer). Intuitively, we would like to avoid system starvation and at the same
time minimize information loss. We formulate the problem as a multi-criteria
optimization problem, attributing a weight to each criterion, and solve it in
the class of threshold policies. We consider a very general web page arrival
process modeled by a Batch Marked Markov Arrival Process and a very general
service time modeled by a phase-type distribution. The model has been applied
to the performance evaluation and optimization of the crawler designed by the
INRIA Maestro team in the framework of the RIAM INRIA-Canon research project.
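The trade-off between buffer overflow and starvation under a threshold admission policy can be illustrated with a small Monte Carlo sketch. This is a deliberate simplification of the model above: Poisson arrivals and exponential service in place of the paper's Batch Marked Markov Arrival Process and phase-type service, and the function name and parameters are illustrative, not from the paper.

```python
import random

def simulate_threshold_queue(lam, mu, threshold, horizon=50_000, seed=0):
    """Estimate (loss probability, starvation probability) for a crawler
    buffer under a threshold admission policy.

    Simplified sketch: Poisson page arrivals (rate lam), exponential
    service (rate mu). A page arriving when the queue length is at or
    above `threshold` is dropped; starvation is the fraction of time
    the robot sits idle with an empty buffer.
    """
    rng = random.Random(seed)
    q = 0                 # current queue length
    t = 0.0               # simulated time
    arrived = lost = 0
    idle_time = 0.0
    while t < horizon:
        rate = lam + (mu if q > 0 else 0.0)   # total event rate
        dt = rng.expovariate(rate)
        if q == 0:
            idle_time += dt                   # buffer empty: robot starves
        t += dt
        if rng.random() < lam / rate:         # next event is an arrival
            arrived += 1
            if q >= threshold:
                lost += 1                     # dropped above the threshold
            else:
                q += 1
        else:                                 # next event is a service completion
            q -= 1
    return lost / arrived, idle_time / t
```

Sweeping `threshold` and weighting the two returned criteria reproduces, in miniature, the multi-criteria optimization the abstract describes: a low threshold raises page loss, a high threshold does little to reduce starvation, and the weighted optimum lies in between.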
Whittle Index Policy for Crawling Ephemeral Content
We consider the task of scheduling a crawler to retrieve content from several
sites with ephemeral content. A user typically loses interest in ephemeral
content, such as news or posts in social network groups, after several days or
even hours. Thus, a timely crawling policy for such ephemeral information
sources is very important. We first formulate this problem as an optimal
control problem with average reward, where the reward can be measured in the
number of clicks or relevant search requests. In its initial formulation the
problem suffers from the curse of dimensionality and quickly becomes
intractable even with a moderate number of information sources. Fortunately,
the problem admits a Whittle index, which leads to a decomposition of the
problem and to a very simple and efficient crawling policy. We derive the
Whittle index and provide its theoretical justification.
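The practical appeal of an index policy is its simplicity: each source gets a scalar index computed from its own state, and at each step the crawler visits the source with the highest index. The sketch below shows that scheduling skeleton only; the index function used here (value of a source times the probability that at least one new item appeared since its last crawl, assuming Poisson content arrivals) is an illustrative stand-in, not the Whittle index derived in the paper.

```python
import math

def index_policy_schedule(sources, steps):
    """Sketch of an index-based crawling schedule.

    `sources` maps a source name to (content_arrival_rate, value_per_item).
    At each discrete step the crawler visits the single source with the
    highest current index. The index below is a hypothetical stand-in:
    value * P(at least one new item since last crawl) under Poisson
    arrivals, i.e. value * (1 - exp(-rate * age)).
    """
    last_crawl = {s: -math.inf for s in sources}   # never crawled yet
    schedule = []
    for t in range(steps):
        def index(s):
            rate, value = sources[s]
            age = t - last_crawl[s]                # time since last crawl
            return value * (1 - math.exp(-rate * age))
        best = max(sources, key=index)
        schedule.append(best)
        last_crawl[best] = t
    return schedule
```

Note how the per-source index removes the curse of dimensionality the abstract mentions: the state of the whole system never needs to be enumerated, only one number per source per step.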
THE ROLE OF ARTIFICIAL INTELLIGENCE IN PUSHING THE BOUNDARIES OF U.S. REGULATION: A SYSTEMATIC REVIEW
Artificial Intelligence’s (AI) growing catalog of applications and methods has the potential to profoundly affect public policy by generating instances where regulations are not adequate to confront the issues faced by society, also known as regulatory gaps. The objective of this article is to improve our understanding of how AI influences U.S. public policy. It does so by systematically exploring, for the first time, this technology’s role in the generation of regulatory gaps. Specifically, it addresses two research questions:
What U.S. regulatory gaps exist due to AI methods and applications?
When looking across all of the gaps identified in the first research question, what trends and insights emerge that can help stakeholders plan for the future?
These questions are answered through a systematic review of four academic literature databases in the hard and social sciences. Its implementation is guided by a protocol that identified 5,240 candidate articles. A screening process reduced this sample to 241 articles (published between 1976 and February of 2018) relevant to answering the research questions.
This article contributes to the literature by adapting the work of Bennett-Moses and Calo to effectively characterize regulatory gaps caused by AI in the U.S. In addition, it finds that most gaps do not require new regulation or the creation of governance frameworks for their resolution, are found at the federal and state levels of government, and are caused more often by AI applications than by AI methods.
Deep Reinforcement Learning for Web Crawling
A search engine uses a web crawler to crawl pages from the World Wide Web (WWW) and aims to keep its local cache as fresh as possible. Unfortunately, the rates at which different pages change on the WWW are highly nonuniform and, in many real-life scenarios, unknown. In addition, the finite available bandwidth and possible server restrictions on crawling frequency make it very difficult for the crawler to find the optimal scheduling policy that maximises the freshness of the local cache. We model this problem in a multi-armed restless bandit framework, where each arm represents a web page or an aggregate of statistically identical web pages. The objective is to find the scheduling policy that gives the exact indices of the pages to be crawled at a particular instance. We provide an online learning scheme using a deep reinforcement learning (DRL) framework, which learns the unknown page change dynamics on the fly along with the optimal crawling policy. Finally, we run numerical simulations to compare our approach with state-of-the-art algorithms such as static optimisation and Thompson sampling, and observe better performance for DRL.
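One of the comparison baselines the abstract names, Thompson sampling, is easy to sketch for this setting. The version below is a simplified stand-in, not the paper's DRL method: each page's unknown change probability gets a Beta posterior, each step the scheduler samples from the posteriors and crawls the pages with the highest samples (respecting a bandwidth budget), and the crawl outcome updates the posterior. All names and parameters are illustrative.

```python
import random

def thompson_crawl(change_probs, budget, steps, seed=0):
    """Thompson-sampling crawl scheduler (baseline sketch).

    `change_probs` are the true, unknown per-step change probabilities of
    each page; `budget` is how many pages may be crawled per step. Each
    page keeps a Beta(alpha, beta) posterior over its change probability,
    updated by whether the page was found changed when crawled. Returns
    the average cache freshness (fraction of pages up to date per step).
    """
    rng = random.Random(seed)
    n = len(change_probs)
    alpha = [1.0] * n          # pseudo-counts: page found changed
    beta = [1.0] * n           # pseudo-counts: page found unchanged
    stale = [False] * n        # does the local copy differ from the live page?
    fresh_sum = 0.0
    for _ in range(steps):
        # environment: pages change independently
        for i, p in enumerate(change_probs):
            if rng.random() < p:
                stale[i] = True
        # sample a change rate per page, crawl the highest samples
        samples = [rng.betavariate(alpha[i], beta[i]) for i in range(n)]
        for i in sorted(range(n), key=lambda j: -samples[j])[:budget]:
            if stale[i]:
                alpha[i] += 1
            else:
                beta[i] += 1
            stale[i] = False   # crawling refreshes the local copy
        fresh_sum += stale.count(False) / n
    return fresh_sum / steps
```

A fast-changing page accumulates a high posterior mean and gets crawled often, which is the behaviour the learned DRL policy must also discover, but from a much richer state representation.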
Technology and Australia's Future: New technologies and their role in Australia's security, cultural, democratic, social and economic systems
Chapter 1. Introducing technology -- Chapter 2. The shaping of technology -- Chapter 3. Prediction of future technologies -- Chapter 4. The impacts of technology -- Chapter 5. Meanings, attitudes and behaviour -- Chapter 6. Evaluation -- Chapter 7. Intervention -- Conclusion - adapt or wither. This report was commissioned by the Australian Council of Learned Academies.
B!SON: A Tool for Open Access Journal Recommendation
Finding a suitable open access journal in which to publish scientific work is a complex task: researchers have to navigate a constantly growing number of journals, institutional agreements with publishers, funders' conditions and the risk of predatory publishers. To help with these challenges, we introduce a web-based journal recommendation system called B!SON. It is developed based on a systematic requirements analysis, built on open data, gives publisher-independent recommendations and works across domains. It suggests open access journals based on the title, abstract and references provided by the user. The recommendation quality has been evaluated using a large test set of 10,000 articles. Development by two German scientific libraries ensures the longevity of the project.
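The general idea behind text-based journal recommendation can be illustrated with a toy content-based ranker: represent each candidate journal by a corpus of its published titles and abstracts, then rank journals by TF-IDF cosine similarity to the submitted text. This is a generic sketch of that idea, not B!SON's actual algorithm or data model.

```python
import math
from collections import Counter

def tfidf_rank(query, journal_corpora):
    """Rank journals by TF-IDF cosine similarity to a query text.

    `journal_corpora` maps a journal name to a string of representative
    text (e.g. concatenated titles/abstracts of its articles). Returns
    journal names sorted best-match first. Toy illustration only.
    """
    corpus = {name: Counter(text.lower().split())
              for name, text in journal_corpora.items()}
    n = len(corpus)
    df = Counter()                       # document frequency per term
    for tf in corpus.values():
        df.update(tf.keys())

    def idf(term):
        return math.log((1 + n) / (1 + df[term])) + 1

    def vec(tf):                         # term-frequency counts -> TF-IDF weights
        return {t: c * idf(t) for t, c in tf.items()}

    def cosine(u, v):
        dot = sum(w * v.get(t, 0.0) for t, w in u.items())
        nu = math.sqrt(sum(w * w for w in u.values()))
        nv = math.sqrt(sum(w * w for w in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    qv = vec(Counter(query.lower().split()))
    return sorted(corpus, key=lambda name: -cosine(qv, vec(corpus[name])))
```

A production system such as B!SON additionally handles references, scale, and open-access metadata, but the core ranking step is still "how similar is your manuscript to what this journal publishes".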
Artificial Intelligence Through the Eyes of the Public
Artificial Intelligence is becoming a popular field in computer science. In this report we explored its history, major accomplishments, and the visions of its creators. We looked at how Artificial Intelligence experts influence reporting and engineered a survey to gauge public opinion. We also examined expert predictions concerning the future of the field as well as media coverage of its recent accomplishments. These results were then used to explore the links between expert opinion, public opinion and media coverage.
A framework development to predict remaining useful life of a gas turbine mechanical component
Power-by-the-hour is a performance-based offering for delivering outstanding service to operators of civil aviation aircraft. Operators need to minimise downtime, reduce service cost and ensure value for money, which requires innovative advanced technology for predictive maintenance. Predictability, availability and reliability of the engine offer better service for operators, and the need to estimate expected component failure before it occurs requires a proactive approach to predicting the remaining useful life of components within an assembly.
This research offers a framework for component remaining-useful-life prediction using assembly-level data. The thesis presents a critical analysis of the literature, identifying the Weibull method, statistical techniques and data-driven methodologies relating to remaining-useful-life prediction, which are used in this research. The AS-IS practice captures relevant information based on an investigation conducted in the aerospace industry. The analysis of maintenance cycles examines high-level events for engine availability, and further communications with industry showcase a through-life performance timeline visualisation. The overhaul sequence and activities are presented to gain insights from the timeline visualisation.
The thesis covers the development of the framework and its application to a gas turbine single-stage assembly, to the repair and replacement of components in a single-stage assembly, and to a multiple-stage assembly. The framework is demonstrated on aerospace engines and power-generation engines, and it enables and supports domain experts to respond quickly to, and prepare for, maintenance and the on-time delivery of spare parts.
The results of the framework show the probability of failure based on a pair of error values using the corresponding Scale and Shape parameters. The probability of failure is transformed into the remaining useful life, depicting a typical Weibull distribution. The resulting Weibull curves, developed for three scenarios of the case study, show that components are renewed; therefore, the remaining useful life of the components is established. The framework is validated and verified through a case study with three scenarios and through expert judgement.
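The step from fitted Weibull parameters to remaining useful life can be made concrete. Under a Weibull lifetime with Shape k and Scale lambda, reliability is R(t) = exp(-(t/lambda)^k), and the mean residual life of a component that has survived to age t is E[T - t | T > t] = (1/R(t)) * integral from t to infinity of R(u) du. The sketch below evaluates that integral numerically; it is a generic Weibull calculation consistent with the abstract, not the thesis's specific framework, and the truncation horizon is an assumption.

```python
import math

def weibull_rul(shape, scale, age, n=100_000):
    """Mean remaining useful life of a Weibull-distributed component
    that has survived to `age`.

    Uses R(t) = exp(-(t/scale)**shape) and evaluates
    RUL(age) = (1/R(age)) * integral_age^inf R(u) du
    by the trapezoidal rule, truncating the tail at age + 10*scale
    (assumed negligible beyond that point).
    """
    R = lambda t: math.exp(-((t / scale) ** shape))
    upper = age + 10 * scale
    h = (upper - age) / n
    interior = sum(R(age + i * h) for i in range(1, n))
    integral = h * (interior + 0.5 * (R(age) + R(upper)))
    return integral / R(age)
```

Two sanity checks follow from the theory: with shape = 1 the Weibull reduces to the memoryless exponential, so RUL equals the scale parameter at any age; with shape > 1 (wear-out), RUL decreases as the component ages, which is the behaviour a maintenance planner relies on.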
Standardization Roadmap for Unmanned Aircraft Systems, Version 2.0
This Standardization Roadmap for Unmanned Aircraft Systems, Version 2.0 (“roadmap”) is an update to version 1.0 of this document published in December 2018. It identifies existing standards and standards in development, assesses gaps, and makes recommendations for priority areas where there is a perceived need for additional standardization and/or pre-standardization R&D.
The roadmap has examined 78 issue areas, identified a total of 71 open gaps and corresponding recommendations across the topical areas of airworthiness; flight operations (both general concerns and application-specific ones including critical infrastructure inspections, commercial services, and public safety operations); and personnel training, qualifications, and certification. Of that total, 47 gaps/recommendations have been identified as high priority, 21 as medium priority, and 3 as low priority. A “gap” means no published standard or specification exists that covers the particular issue in question. In 53 cases, additional R&D is needed.
As with the earlier version of this document, the hope is that the roadmap will be broadly adopted by the standards community and that it will facilitate a more coherent and coordinated approach to the future development of standards for UAS. To that end, it is envisioned that the roadmap will continue to be promoted in the coming year. It is also envisioned that a mechanism may be established to assess progress on its implementation.