
    Selective web information retrieval

    This thesis proposes selective Web information retrieval, a framework formulated in terms of statistical decision theory whose aim is to apply an appropriate retrieval approach on a per-query basis. The main component of the framework is a decision mechanism that selects the retrieval approach for each query. The selection of a particular retrieval approach is based on the outcome of an experiment, which is performed before the final ranking of the retrieved documents. The experiment is a process that extracts features from a sample of the set of retrieved documents. This thesis investigates three broad types of experiments. The first counts the occurrences of query terms in the retrieved documents, indicating the extent to which the query topic is covered in the document collection. The second considers information from the distribution of retrieved documents in larger aggregates of related Web documents, such as whole Web sites or directories within Web sites. The third estimates the usefulness of the hyperlink structure among a sample of the set of retrieved Web documents. The proposed experiments are evaluated in the context of both informational and navigational search tasks with an optimal Bayesian decision mechanism, where it is assumed that relevance information exists. The thesis further investigates the implications of applying selective Web information retrieval in an operational setting, where the tuning of the decision mechanism is based on limited existing relevance information and the retrieval system's input is a stream of queries from mixed informational and navigational search tasks. First, the experiments are evaluated using different training and testing query sets, as well as a mixture of different types of queries. Second, query sampling is introduced in order to approximate the queries that a retrieval system receives and to tune an ad hoc decision mechanism with a broad set of automatically sampled queries.
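    Read as statistical decision theory, the mechanism reduces to a Bayes classifier over experiment outcomes. Below is a minimal Python sketch of that reading; the approach names, priors, and likelihood values are invented placeholders, not figures from the thesis.

```python
# Minimal sketch of a per-query Bayesian decision mechanism.
# Hypothetical names and numbers throughout; the thesis's actual features
# (query-term counts, site-level aggregates, hyperlink structure) are
# summarised here as a single discretised feature per experiment.

# Prior probability that a query is best served by each retrieval approach.
PRIORS = {"content_only": 0.5, "content_plus_links": 0.5}

# Class-conditional likelihoods of observing a feature value, which would be
# estimated offline from training queries with relevance information.
LIKELIHOODS = {
    "content_only":       {"low": 0.7, "high": 0.3},
    "content_plus_links": {"low": 0.2, "high": 0.8},
}

def select_approach(feature_value: str) -> str:
    """Pick the approach with the highest posterior given the experiment outcome."""
    posteriors = {
        approach: PRIORS[approach] * LIKELIHOODS[approach][feature_value]
        for approach in PRIORS
    }
    return max(posteriors, key=posteriors.get)

# A query whose sampled documents show dense hyperlink structure ("high")
# is routed to the link-aware approach.
print(select_approach("high"))   # content_plus_links
print(select_approach("low"))    # content_only
```

    In the operational setting the thesis describes, the priors and likelihood tables would be tuned from training queries, or from automatically sampled ones, rather than fixed by hand.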

    International Standard ISO 9001–A Soft Computing View

    This work aims to add value to ISO 9001, the standard for Quality Management Systems that assess, measure, document, improve, and certify processes in order to increase productivity, i.e., that transform business at any level. On the one hand, this work focuses on the development of a decision support system that will allow companies to meet the needs of customers by fulfilling requirements that reflect either the effectiveness or the non-effectiveness of an organization. On the other hand, many approaches to knowledge representation and reasoning have been proposed using Logic Programming (LP), namely in the areas of Model Theory and Proof Theory. This work follows the proof-theoretical approach, in terms of an extension to the LP language for knowledge representation and reasoning. The computational framework is centered on Artificial Neural Networks to evaluate customer satisfaction and the degree of confidence one may have in such an assessment.

    A Case Study for Georgia Southwestern State University: The Discrepancies of Financial Aid Services that Impact Student Enrollment

    At many traditional universities, the federal timeline for determining financial aid eligibility is based on the release of the Free Application for Federal Student Aid each January and the subsequent financial aid processing cycle of July 1 to June 30. These federally established dates can conflict with traditional August class starts and create a backlog and delayed processing of information that, in turn, hinder students from receiving timely information to make informed decisions based on financial aid awards. The purpose of this case study of a traditional university in Georgia was to apply net price theory and rational choice theory to evaluate the impact of timeline conflicts on how students decide which institution to attend. Data consisted of internal documents, including the results of a prior survey of 425 freshmen, and 13 alumni focus group and survey participants. All data were inductively coded and analyzed using a constant comparative method to reveal key themes. Key findings indicated that decision making by prospective students largely focused on accurate and timely communication and on cost of attendance. One discrepant area was the decision maker's ability to differentiate between cost of attendance and net price, which impacted some students' decisions to enroll. The findings are consistent with both net price and rational choice theory. Recommendations to university leaders include encouraging early communication to prospective students and retraining financial aid staff in order to meet regulatory demands and timelines, increase student enrollment, and reduce the anxieties for prospective students and families associated with the financial aid process. These outcomes enhance social change by potentially opening doors to higher education for new generations of students.

    Lessons from the field: how successful community-based coalitions on obesity choose and prioritize interventions to improve health policy, health behaviors, and health outcomes

    Thesis (D.P.H.)--Boston University

    Background: Community-based coalitions could be mechanisms to foster individual and systems change in their communities in order to reduce the burden of obesity. Coalitions can increase the likelihood of reducing obesity by selecting and implementing effective interventions. Members of community-based coalitions are challenged to consider the multiple and interacting determinants of obesity and to select effective interventions from hundreds of untested recommendations. This investigation identified best practices in the collective decision-making processes used to select obesity interventions. These practices may be adopted by other community coalitions working to reduce or prevent obesity in their communities.

    Methods: Three exemplar community-based obesity coalitions were investigated using a multiple case study design. Data from twenty-six coalition member interviews were analyzed using methods based in Grounded Theory in order to identify practices in decision-making processes related to intervention selection. Documentation was reviewed to verify coalition activity during this planning stage.

    Findings: Nine shared practices related to decision-making processes were found among the three exemplar coalitions: setting a vision and objectives that target determinants of obesity and emphasize comprehensive solutions focusing on the community environment; defining an organizational structure that maximizes collaboration and shared decision-making; leaders taking a strong role in guiding and simplifying the process; obtaining financial resources that support the objectives; gathering information from the community; communicating information with the community and coalition members; aligning community-based information with objectives; making final selections by consensus; and dispersing objectives to other community organizations to build support and momentum.

    Research translation: A teaching case study was developed that documents and analyzes the practices and processes that exemplar community coalitions engage in while working to select a comprehensive intervention to prevent and reduce obesity.

    Conclusions: Successful community-based obesity coalitions formulate a clear vision with strategic objectives, develop organizational structure and processes, utilize information gathered from both subject matter experts (individuals and agencies) and community members, and consider local community needs, assets, and interests in order to prioritize and select obesity interventions for their communities.

    Key words: obesity, community-based coalitions, decision-making, best practices, best processes

    The Management of Energy Savings Performance Contracts (ESPCs)

    Energy Savings Performance Contracts (ESPCs) originated to accomplish several objectives: (1) to meet energy efficiency goals mandated by executive orders and energy policies; (2) to improve federal government facilities using funds allocated for utility bills; and (3) to recover expenditures through energy savings reflected in reduced utility bills. In ESPCs, the contractor guarantees savings to the federal government agency. 10 CFR 436 limits the time allowed for payback. However, this regulation and others were written prior to the deregulation of utility companies. The study rests on the underlying premise that contractor payback is a direct result of the energy savings. The population of study is all Air Force ESPCs. The sampling frame will be the ESPCs and their task orders (TOs) listed in the Air Force Civil Engineering Support Agency (AFCESA) database. The primary unit of analysis will be the individual task order. Data will be collected from interviews, observations, conferences, archives, and other task-order-related documents. Using case study methodology, contract financial data, energy rates, contract decision memorandums, contract clauses and statements of work, observations, open interviews, and other relevant meetings and materials will be evaluated to determine whether deregulation has an effect on contractor payback and what that effect entails.
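    The premise that contractor payback follows directly from energy savings is, at its core, a simple payback-period calculation. The sketch below illustrates it with invented figures; the cost, savings, rate factor, and regulatory ceiling shown are hypothetical, not values from 10 CFR 436 or the AFCESA database.

```python
# Simple payback-period check for an ESPC task order (illustrative numbers only).
project_cost = 1_200_000.0         # contractor's up-front investment, USD
annual_energy_savings = 150_000.0  # guaranteed reduction in utility bills, USD/yr
max_payback_years = 25             # hypothetical regulatory ceiling, not from 10 CFR 436

payback_years = project_cost / annual_energy_savings
print(f"Simple payback: {payback_years:.1f} years")       # 8.0 years

# Deregulation changes utility rates; a lower rate shrinks the dollar value
# of the same energy savings and stretches the payback period.
deregulated_rate_factor = 0.8      # hypothetical 20% drop in energy prices
adjusted_payback = project_cost / (annual_energy_savings * deregulated_rate_factor)
print(f"Payback after rate drop: {adjusted_payback:.1f} years")  # 10.0 years

print("Within limit:", adjusted_payback <= max_payback_years)
```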

    Finding Academic Experts on a MultiSensor Approach using Shannon's Entropy

    Expert finding is an information retrieval task concerned with the search for the most knowledgeable people on some topic, based on documents describing people's activities. The task takes a user query as input and returns a list of people sorted by their level of expertise with respect to the query. This paper introduces a novel approach for combining multiple estimators of expertise based on a multisensor data fusion framework together with the Dempster-Shafer theory of evidence and Shannon's entropy. More specifically, we define three sensors which detect heterogeneous information derived from the textual contents, from the graph structure of the citation patterns for the community of experts, and from profile information about the academic experts. Given the evidence collected, each sensor may nominate different candidates as experts, and consequently the sensors may not agree on a final ranking decision. To deal with these conflicts, we apply the Dempster-Shafer theory of evidence combined with Shannon's entropy formula to fuse this information and arrive at a more accurate and reliable final ranking list. Experiments over two datasets of academic publications from the Computer Science domain attest to the adequacy of the proposed approach over traditional state-of-the-art approaches. We also ran experiments against representative supervised state-of-the-art algorithms. Results revealed that the proposed method achieved similar performance to these supervised techniques, confirming the capabilities of the proposed framework.
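    As a rough illustration of the fusion step, the Python sketch below combines two sensors' basic mass assignments with Dempster's rule, restricted to singleton hypotheses so the rule reduces to a renormalised product. The candidate names and mass values are invented, and Shannon's entropy appears only as a helper that could discount uncertain sensors, not as the paper's exact weighting scheme.

```python
import math

# Invented mass assignments from two of the three sensors, over three
# hypothetical candidate experts (singleton hypotheses only).
text_sensor = {"alice": 0.6, "bob": 0.3, "carol": 0.1}
cite_sensor = {"alice": 0.2, "bob": 0.5, "carol": 0.3}

def combine(m1, m2):
    """Dempster's rule over singletons: pointwise product of agreeing masses,
    renormalised by 1 - conflict (the mass lost to incompatible pairs)."""
    joint = {h: m1[h] * m2[h] for h in m1}
    conflict = 1.0 - sum(joint.values())
    return {h: v / (1.0 - conflict) for h, v in joint.items()}

def entropy(m):
    """Shannon entropy of a mass function; a high value flags an uncertain
    sensor whose evidence could be discounted before combination."""
    return -sum(p * math.log2(p) for p in m.values() if p > 0)

fused = combine(text_sensor, cite_sensor)
ranking = sorted(fused, key=fused.get, reverse=True)
print(ranking, {h: round(fused[h], 3) for h in ranking})
print("text sensor entropy:", round(entropy(text_sensor), 3))
```

    With these toy numbers the fused ranking promotes bob, whom the citation sensor favours, ahead of alice, showing how combination resolves a disagreement between sensors.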

    Settlement of Islamic Banking Disputes from a Progressive Law Perspective

    The purpose of this study was to explain Islamic banking dispute settlement from a progressive legal perspective, relating Hans Kelsen's thinking about law to the development of the progressive law concept, in order to complement and enhance the operation of law in society. The research paradigm is constructivism, and the research was carried out through two strategies: library research and case studies. The literature study covered documents and literature on legal theory, grouped by time period. The case studies concern the operation of law in society in relation to Islamic banking dispute resolution. The study follows a socio-legal approach and uses both secondary and primary data. Secondary data were obtained through library research and legal documents, and include: 1) primary legal materials, namely Article 55 of Law No. 21 of 2008 and its elucidation, Article 39 of Law No. 30 of 1999, Law No. 4 of 1996, and Law No. 50 of 2009 (approached through statute analysis of the civil relationship, agreement theory, and procedural law, including the procedural law of the Religious Courts), Constitutional Court Decision No. 93/PUU-X/2012 (approached through the political theory of law), and rulings of Religious Court and District Court judges relating to the settlement of mortgage guarantee disputes; 2) secondary legal materials, consisting of books on legal theory, legal philosophy, paradigms, socio-legal studies, and research methods. Primary data were obtained through field research by observation and interview, covering: 1) law-sanctioning institutions: judges; 2) role occupants: judges, academics, advocates, legal staff of Islamic banks, Islamic bank customers, and successors (cadres) of Satjipto Rahardjo and Hans Kelsen. The analysis was implemented with hermeneutics and phenomenology.

    Examining Philosophy of Technology Using Grounded Theory Methods

    A qualitative study was conducted to examine the philosophy of technology of K-12 technology leaders and to explore the influence of their thinking on technology decision making. The research design aligned with CORBIN and STRAUSS grounded theory methods, and I proceeded from a research paradigm of critical realism. The subjects were school technology directors and instructional technology specialists, and data collection consisted of interviews and a written questionnaire. Data analysis involved the use of grounded theory methods including memo writing, open and axial coding, constant comparison, purposive and theoretical sampling, and theoretical saturation of categories. Three broad philosophy of technology views were widely held by participants: an instrumental view of technology, technological optimism, and a technological determinist perspective that saw technological change as inevitable. Technology leaders were guided by two main approaches to technology decision making, represented by the categories "Educational goals and curriculum should drive technology" and "Keep up with technology (or be left behind)". The core category and central phenomenon that emerged was that technology leaders approached technology leadership by placing greater emphasis on keeping up with technology, being influenced by an ideological orientation to technological change, and being concerned about preparing students for a technological future.

    Verifying the Interplay of Authorization Policies and Workflow in Service-Oriented Architectures (Full version)

    A widespread design approach in distributed applications based on the service-oriented paradigm, such as web services, consists of clearly separating the enforcement of authorization policies and the workflow of the applications, so that the interplay between the policy level and the workflow level is abstracted away. While such an approach is attractive because it is quite simple and permits one to reason about crucial properties of the policies under consideration, it does not provide the right level of abstraction to specify and reason about the way the workflow may interfere with the policies, and vice versa. For example, the creation of a certificate as a side effect of a workflow operation may enable a policy rule to fire and grant access to a certain resource; without executing the operation, the policy rule should remain inactive. Similarly, policy queries may be used as guards for workflow transitions. In this paper, we present a two-level formal verification framework to overcome these problems and formally reason about the interplay of authorization policies and workflow in service-oriented architectures. This allows us to define and investigate some verification problems for SO applications and give sufficient conditions for their decidability.

    Comment: 16 pages, 4 figures; full version of a paper at the Symposium on Secure Computing (SecureCom09)
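    The certificate example can be made concrete with a toy two-level model: a workflow operation writes a fact into the policy state, and a later transition is guarded by a policy query. The Python sketch below is only an executable illustration of that interplay, with invented names; the paper itself works in a formal verification framework, not in code.

```python
# Toy model of the policy/workflow interplay: the workflow level mutates the
# policy state as a side effect, and the policy level guards workflow transitions.

policy_facts = set()          # dynamic authorization state shared by both levels

def policy_allows_access(user: str) -> bool:
    """Policy rule: fires only once the user holds a certificate."""
    return ("certificate", user) in policy_facts

def op_issue_certificate(user: str) -> None:
    """Workflow operation whose side effect adds a fact at the policy level."""
    policy_facts.add(("certificate", user))

def transition_access_resource(user: str) -> str:
    """Workflow transition guarded by a policy query."""
    return "granted" if policy_allows_access(user) else "blocked"

print(transition_access_resource("alice"))  # blocked: the policy rule is inactive
op_issue_certificate("alice")               # side effect enables the policy rule
print(transition_access_resource("alice"))  # granted
```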

    Probabilistic learning for selective dissemination of information

    New methods and new systems are needed to filter, or to selectively distribute, the increasing volume of electronic information being produced nowadays. An effective information filtering system is one that provides exactly the information that fulfils a user's interests, with minimum effort by the user to describe those interests. Such a system also has to adapt to the user's changing interests. In this paper we describe and evaluate a learning model for information filtering which is an adaptation of the generalized probabilistic model of information retrieval. The model is based on the concept of 'uncertainty sampling', a technique that allows for relevance feedback on both relevant and nonrelevant documents. The proposed learning model is the core of a prototype information filtering system called ProFile.
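    As a rough sketch of the uncertainty sampling idea, the snippet below selects the documents whose estimated relevance probability lies closest to 0.5, i.e. those on which user feedback is most informative. The scores and selection size are invented; a system like ProFile would estimate and update these probabilities from term statistics and accumulated feedback.

```python
# Current probabilistic estimates of P(relevant | document) for incoming
# documents (invented values for illustration).
estimates = {"doc1": 0.95, "doc2": 0.52, "doc3": 0.08, "doc4": 0.47}

def most_uncertain(est, k=2):
    """Uncertainty sampling: pick the k documents closest to p = 0.5, where
    the model is least sure, so feedback on them, relevant or not, teaches
    the filter the most."""
    return sorted(est, key=lambda d: abs(est[d] - 0.5))[:k]

print(most_uncertain(estimates))   # ['doc2', 'doc4']
```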