235 research outputs found

    World-wide web information discovery via relevance feedback.

    Yue Che Wang, Kenneth. Thesis (M.Phil.)--Chinese University of Hong Kong, 1998. Includes bibliographical references (leaves 100-106). Abstract also in Chinese.
    Contents: Abstract; Abstract (Chinese); Acknowledgement
    Chapter 1 Introduction
    1.1 The World-Wide Web
    1.2 Searching Information on the WWW
    1.3 Intelligent content-based information discovery on the Web
    1.4 Organization of the Thesis
    Chapter 2 Literature Review
    2.1 Search Engines
    2.2 Information Indexing Systems
    2.3 Agent-based Systems
    2.4 Information Filtering Systems
    Chapter 3 Overview of the Proposed Approach
    3.1 System Architecture
    3.2 Topic Profile Specification
    3.3 Text Representation
    3.3.1 Profile Feature Representation
    3.3.2 Document Feature Representation
    3.4 Advantages of the Topic Profile Specifications
    Chapter 4 Relevance Score Evaluation Process and Relevance Feedback Model
    4.1 Term Weights
    4.2 Document Evaluation through Relevance Score
    4.3 Learning via Relevance Feedback
    4.3.1 Introduction to Relevance Feedback
    4.3.2 Feature Extraction from the Relevance Feedback Models
    4.3.3 Topic Feature Vectors Refinement
    Chapter 5 Intelligent Web Exploration
    5.1 Introduction to Simulated Annealing
    5.2 Intelligent Web Exploration by Simulated Annealing
    5.2.1 Mathematical Setting of the Discovery Process
    5.2.2 The Entire Exploration Algorithm
    5.3 Incorporating with the Relevance Feedback Model
    Chapter 6 Experimental Results
    6.1 The Design of the Experiments
    6.2 Experiments on the Effects of the Simulated Annealing Schedule upon the Discovery Precision
    6.2.1 Experiment Setup
    6.2.2 Results
    6.3 Experiments on the Index Page Topic Profile Specification
    6.3.1 Experiment Setup
    6.3.2 Results
    6.4 Experiments on the Relevance Feedback with Full-Text Feature Extraction Strategy
    6.4.1 Experiment Setup
    6.4.2 Results
    6.5 Comparisons of the Relevance Feedback Feature Extraction Strategies
    6.5.1 Experiment Setup
    6.5.2 Results
    6.6 Comparisons between the Example Page and the Keyword Topic Profile Specifications
    6.6.1 Experiment Setup
    6.6.2 Results
    6.7 Summary from the Experimental Results
    Chapter 7 Conclusion
    7.1 The Aim of Our Proposed System
    7.2 The Favorable Features and the Effectiveness of Our Proposed System
    7.3 Future Work
    Appendix
    A List of URLs for the Example Pages
    B List of URLs for the Arbitrarily Chosen Index Pages
    Bibliography
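    Chapter 5 of this thesis couples web exploration with simulated annealing. The general idea can be sketched as follows; the link graph, relevance scores, and geometric cooling schedule below are hypothetical illustrations, not the thesis's actual algorithm:

```python
import math
import random

def accept_link(current_score, candidate_score, temperature, rng=random.random):
    """Metropolis acceptance rule: always follow a better-scoring link;
    follow a worse one with probability exp(-(score drop) / temperature)."""
    if candidate_score >= current_score:
        return True
    return rng() < math.exp((candidate_score - current_score) / temperature)

def explore(links, scores, start, schedule=(1.0, 0.9, 20)):
    """Hypothetical exploration loop: walk the link graph, cooling the
    temperature geometrically so early steps explore and late steps exploit."""
    t0, alpha, steps = schedule
    temperature, current = t0, start
    visited = [current]
    for _ in range(steps):
        neighbours = links.get(current, [])
        if not neighbours:
            break
        candidate = random.choice(neighbours)
        if accept_link(scores[current], scores[candidate], temperature):
            current = candidate
            visited.append(current)
        temperature *= alpha  # geometric cooling schedule
    return visited
```

    Early in the schedule the high temperature lets the crawler follow low-scoring links and escape locally relevant clusters; as the temperature cools, it increasingly follows only links that improve the relevance score.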

    Performance Evaluation for the Sustainable Supply Chain Management

    Supply chain (SC) activities transform natural resources, raw materials, and components into finished products that are delivered to end customers. A highly efficient SC brings great benefits to an enterprise, such as integrated resources, reduced logistics costs, improved logistics efficiency, and a higher overall level of service. In contrast, an inefficient SC adds transaction costs, information management costs, and resource waste, reduces the production capacity of every enterprise on the chain, and leads to unsatisfactory customer relationships. The evaluation of a SC is therefore important for an enterprise to survive in a competitive, globalized business environment, and it is important to research methods, performance indicator systems, and technology for evaluating, monitoring, predicting, and optimizing SC performance. A typical performance evaluation (PE) of a SC uses an established set of performance indicators and an analytical method, following a given procedure, to carry out quantitative or qualitative comparative analysis and provide an objective, accurate evaluation of SC performance over a selected operation period. Various research works have proposed indicator systems and methods for SC performance evaluation, but no indicator system is widely accepted in practical SC performance evaluations, because the indicators in different systems have been defined without a common understanding of their meanings and interrelationships, and because they are nonlinear and very complicated
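    The weighted-indicator evaluation procedure described above can be sketched as follows; the indicator names, value ranges, and weights are invented for illustration and do not come from any established SC indicator system:

```python
def normalise(value, worst, best):
    """Map a raw indicator value onto [0, 1]. Works for both benefit-type
    indicators (best > worst) and cost-type indicators (best < worst)."""
    return max(0.0, min(1.0, (value - worst) / (best - worst)))

def sc_performance(indicators, weights, ranges):
    """Weighted-sum score over a (hypothetical) SC indicator system."""
    total_weight = sum(weights.values())
    return sum(
        weights[k] * normalise(indicators[k], *ranges[k]) for k in weights
    ) / total_weight

# Illustrative evaluation period: a delivery-rate indicator (benefit type)
# and a logistics-cost indicator (cost type, so "best" is the low end).
weights = {"on_time_delivery": 0.6, "logistics_cost": 0.4}
indicators = {"on_time_delivery": 0.95, "logistics_cost": 120}
ranges = {"on_time_delivery": (0.5, 1.0), "logistics_cost": (200, 80)}
score = sc_performance(indicators, weights, ranges)
```

    Real indicator systems are far richer (and, as the abstract notes, the indicators interact nonlinearly), but a normalised weighted sum of this shape is the usual starting point for quantitative comparison across operation periods.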

    Consideration behavior and design decision making

    Over the past decade, design engineering has developed a systematic framework for coordinating with consumer behavior models. Traditional consumer models have mainly focused on compensatory trade-off preferences in choice decisions. Recent marketing research has become interested in developing consumer models that are representative, in that they reflect realistic human decision processes. One important example is consideration: the process of quickly screening out many available alternatives using non-compensatory rules before trading off the value of different feature combinations. Is capturing consideration important for design? This research investigates the impact of modeling consideration behavior on design engineering, aiming to construct consideration models that can inform strategic decisions. The study includes several features absent in existing research: quantifying the mis-specifications of the underlying choice process, tailoring survey instruments for particular models, and exploring the models' strategic value for product profitability and design feature differences. First, numerical methods are explored to address the discontinuity that consideration models introduce into the profit-oriented optimization problem. Methods based on complementarity constraints, smoothing functions, and genetic algorithms are implemented and evaluated with a vehicle design case study. Second, a simulation experiment based on synthetic market data compares consideration models with a variety of conventional choice models in the process of model estimation and design optimization.
The simulation finds that even when estimated compensatory models provide relatively good predictive accuracy, they can lead to sub-optimal design decisions when the population uses consideration behavior; convergence of compensatory models to non-compensatory behavior is likely to require unrealistic amounts of data; and modeling heterogeneity in non-compensatory screening is more valuable than heterogeneity in compensatory trade-offs. The synthetic experiment framework then extends the comparison to include the survey design process guided by the different assumptions behind consideration and traditional models. A product line design case study reveals that even though both compensatory and consideration models are robust in profitability, using consideration models leads to optimal portfolios with higher feature diversity while reducing the risk of overestimating profits. Finally, the research explores how to use consideration models to analyze the market penetration of a newly designed product in a case study of a consideration maximization problem. We hope this research will draw designers' attention to the informative power of consideration models, expand the understanding of consumer behavior modeling from predictive power in marketing to strategic impact on design decisions, and provide technical support for future applications of consideration models in design engineering
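    The two-stage structure described above (non-compensatory screening followed by a compensatory trade-off over the surviving alternatives) can be sketched as follows; the products, screening rule, and utility function are hypothetical examples, not models from this research:

```python
def consider(product, screens):
    """Non-compensatory screening: a product survives only if it passes
    every screening rule (conjunctive cutoffs)."""
    return all(rule(product) for rule in screens)

def choose(products, screens, utility):
    """Two-stage choice: screen first, then pick the highest-utility
    product from the consideration set (None if nothing survives)."""
    consideration_set = [p for p in products if consider(p, screens)]
    if not consideration_set:
        return None
    return max(consideration_set, key=utility)

# Hypothetical vehicles: a hard price cutoff screens first, then a
# compensatory utility trades off fuel economy against price.
products = [{"name": "A", "price": 25000, "mpg": 30},
            {"name": "B", "price": 35000, "mpg": 50},
            {"name": "C", "price": 28000, "mpg": 34}]
screens = [lambda p: p["price"] <= 30000]
utility = lambda p: p["mpg"] - p["price"] / 1000
```

    Here product B has the highest compensatory utility but never enters the consideration set, so C is chosen. This is exactly the kind of outcome a purely compensatory model can mispredict, which is why the screening stage matters for design decisions.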

    RadPathFinder: An application for finding optimal paths in a radiation environment


    Automatic binary patching for flaws repairing using static rewriting and reverse dataflow analysis

    Master's thesis, Information Security, 2022, Universidade de Lisboa, Faculdade de Ciências. The C programming language is widely used in embedded systems, kernel, and hardware programming, making it one of the most commonly used programming languages. However, C lacks boundary verification of variables, making it one of the most vulnerable languages. Because of this, combined with its high usability, it is also the language with the most reported vulnerabilities of the past ten years, with memory corruption, specifically buffer overflows, being the most common type of vulnerability. When exploited, these vulnerabilities can produce critical consequences, so it is extremely important not only to correctly identify them but also to properly fix them. This work studies buffer overflow vulnerabilities in C binary programs by identifying possible malicious inputs that can trigger such vulnerabilities and finding their root cause, in order to mitigate the vulnerabilities by rewriting the binary assembly code and thus generating a new binary without the original flaw. The main focus of this thesis is the use of binary patching to automatically fix stack overflow vulnerabilities and to validate its effectiveness while ensuring that the patches do not add new vulnerabilities. Working with the binary code of applications, without access to their source code, is a challenge: any required change to the binary code (i.e., assembly) must take into account that new instructions have to be allocated, which typically means moving existing instructions to create room for new ones and recovering the control-flow information; otherwise the application would be compromised. The approach we propose was successfully implemented in a tool and evaluated with a set of test cases and real applications. The evaluation results showed that the tool was effective both in finding vulnerabilities and in patching them
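    The root-cause-finding step can be illustrated with a toy reverse dataflow analysis: starting from the variable written at the overflow sink, walk the instruction stream backwards and collect every instruction that transitively defines one of its inputs. The three-address (dest, sources) encoding below is a deliberate simplification for illustration, not the thesis's actual binary representation:

```python
def backward_slice(instructions, sink):
    """Toy reverse dataflow: return the indices of all instructions that
    (transitively) feed the value written at `sink`, scanning backwards."""
    tainted = {sink}          # values known to influence the sink
    slice_indices = []
    for idx in range(len(instructions) - 1, -1, -1):
        dest, srcs = instructions[idx]
        if dest in tainted:   # this instruction defines a tainted value...
            slice_indices.append(idx)
            tainted |= set(srcs)  # ...so its inputs become tainted too
    return sorted(slice_indices)

# Hypothetical trace: an attacker-controlled length flows into a buffer write.
prog = [("n", ["input"]),          # 0: n comes from external input
        ("x", []),                 # 1: unrelated definition
        ("len", ["n"]),            # 2: length derived from n
        ("buf_write", ["len", "buf"])]  # 3: the overflow sink
```

    The slice [0, 2, 3] skips the unrelated instruction and points straight at the input-derived length, which is where a bounds check would be patched in.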

    Modelling visual search for surface defects

    Much work has been done on developing algorithms for automated surface defect detection. However, comparisons between these models and human perception are rarely carried out. This thesis aims to investigate how well human observers can find defects in textured surfaces over a wide range of task difficulties. Stimuli for the experiments will be generated using texture synthesis methods, and human search strategies will be captured with an eye tracker. Two different modelling approaches will be explored. First, a computational LNL-based model will be developed and compared to human performance in terms of the number of fixations required to find the target. Second, a stochastic simulation, based on empirical distributions of saccades, will be compared to human search strategies
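    The second approach, a stochastic simulation driven by saccade distributions, might be sketched like this; the field size, detection radius, and amplitude distribution here are placeholder assumptions, not the thesis's empirical values:

```python
import math
import random

def simulate_search(target, field=1.0, acuity=0.1, saccade_sampler=None,
                    max_fixations=1000, rng=None):
    """Stochastic search sketch: fixate, test whether the target lies within
    a (hypothetical) detection radius, otherwise saccade to a new point with
    an amplitude drawn from an empirical-looking distribution.
    Returns the fixation count on detection, or None if search is abandoned."""
    rng = rng or random.Random(0)
    sampler = saccade_sampler or (lambda: rng.uniform(0.05, 0.4))
    x = y = field / 2.0  # start fixating the centre of the surface
    for fixation in range(1, max_fixations + 1):
        if (x - target[0]) ** 2 + (y - target[1]) ** 2 <= acuity ** 2:
            return fixation  # target detected on this fixation
        amplitude, angle = sampler(), rng.uniform(0.0, 2.0 * math.pi)
        x = min(field, max(0.0, x + amplitude * math.cos(angle)))
        y = min(field, max(0.0, y + amplitude * math.sin(angle)))
    return None
```

    Replacing `saccade_sampler` with draws from measured saccade-amplitude histograms is what turns a sketch like this into the empirically grounded simulation the thesis describes, and the fixation counts it produces can then be compared against human performance.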

    Controlled self-organisation using learning classifier systems

    As the complexity of technical systems increases, breakdowns occur quite often. The mission of organic computing is to tame these challenges by providing degrees of freedom for self-organised behaviour. To achieve these goals, new methods have to be developed. The proposed observer/controller architecture constitutes one way to achieve controlled self-organisation. To improve its design, multi-agent scenarios are investigated; in particular, learning with learning classifier systems is addressed
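    A minimal strength-based classifier-system loop, in the spirit of ZCS-style systems rather than the specific architecture proposed here, can be sketched as follows; the ternary-string conditions and the reward value are illustrative:

```python
import random

def matches(condition, state):
    """A ternary condition matches when every non-wildcard ('#') bit
    equals the corresponding state bit."""
    return all(c == "#" or c == s for c, s in zip(condition, state))

def select_action(rules, state, rng):
    """Roulette-wheel selection among matching rules, proportional to strength."""
    match_set = [r for r in rules if matches(r["cond"], state)]
    total = sum(r["strength"] for r in match_set)
    pick, acc = rng.uniform(0, total), 0.0
    for r in match_set:
        acc += r["strength"]
        if pick <= acc:
            return r
    return match_set[-1]

def reinforce(rule, reward, beta=0.2):
    """Widrow-Hoff style update: move the rule's strength toward the reward."""
    rule["strength"] += beta * (reward - rule["strength"])
```

    In an observer/controller setting, the observer would supply `state`, the controller would execute the selected rule's action, and the observed system performance would feed back in as `reward`.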

    Development and application of 3D X-ray diffraction for the study of phase transformations in metallic materials

    Many steel alloy types, both currently in use and under development, exploit a deformation-induced phase transformation to achieve a combined high strength and ductility. As deformation is applied to these alloys, a metastable retained austenite phase transforms to martensite. This process acts as a significant carrier of plasticity, increasing the work-hardening rate and therefore the ductility. The "stability", or resistance against martensitic transformation, of the austenite phase is the main parameter that governs the martensitic transformation rate and therefore the work-hardening behaviour of the steel. In the last few decades, the stability of an individual austenite grain has been shown to depend on a number of microstructural properties, such as the size of the grain, its orientation relative to the loading axis, the alloy chemistry, and the configuration of the grain's immediate crystallographic neighbourhood. A good understanding of how exactly these properties modify austenite grain stability is crucial to the development of accurate models of deformation-induced phenomena, which themselves directly contribute to the design of new and improved alloys that better exploit said phenomena. In the past, austenite grain stability has usually been evaluated for a steel sample either through phase-averaged behaviour, where the stability of the phase overall is characterised, or on an individual grain level, where typically only a few grains are considered. This is primarily due to the difficulties involved with measuring the martensitic transformation in situ at a per-grain level for a large number of grains simultaneously. The recent development of far-field Three-Dimensional X-Ray Diffraction (3DXRD) has enabled such measurements on a range of polycrystalline materials, capturing the grain-level position, orientation and strain tensor for many thousands of grains in situ. 
However, the 3DXRD technique poses a number of significant challenges related to data analysis and post-processing, both crucial steps that must be carefully implemented to enable detailed measurements of complicated polycrystal samples. In this study, 3DXRD was implemented at the I12 Joint Engineering, Environmental, and Processing (JEEP) Beamline at the Diamond Light Source X-ray synchrotron. Then, the capabilities of the technique were explored by examining how a microstructurally "simple" single-phase ferritic steel responds to in situ tensile deformation on a per-grain level. A number of micromechanical phenomena were investigated, including a small (but statistically significant) grain neighbourhood effect, where the stress state of a central grain was found to depend on the orientation of its immediate neighbourhood grains, a finding never before seen for large numbers of grains in a cubic polycrystal. During this 3DXRD implementation, a sophisticated automated data analysis and post-processing pipeline was developed that enabled rapid exploration of such micromechanical effects. With 3DXRD implemented and a data analysis pipeline developed, a novel metastable stainless steel alloy system was devised that enabled the exploration of the martensitic transformation at very low applied strains, as 3DXRD is typically limited to ~2% maximum strain. This alloy system was extensively characterised non-destructively in three dimensions with laboratory electron-based and X-ray-based techniques, and was used to evaluate both the performance of multi-phase laboratory-based Diffraction Contrast Tomography (DCT) and a novel registration algorithm that accurately located two-dimensional planes measured with Electron Back-Scatter Diffraction (EBSD) within the three-dimensional DCT dataset. 
Finally, the deformation response of the alloy was measured in situ with 3DXRD at the ID11 beamline of the European Synchrotron Radiation Facility, coupled with in situ EBSD scans using an in-chamber tensile stage. Substantial martensite transformation was found even within the ~2% maximum strain window, proving the alloy design successful and enabling extensive in situ analyses of austenite grain stability in the bulk material with 3DXRD. Austenite grain stability was found to be influenced by grain size, orientation, and local neighbourhood: larger grains, grains oriented with {100} close to the loading axis, and grains with more ferrite/martensite-dense neighbourhoods had reduced stability against deformation. The minimum strain work criterion model was also evaluated against the experimental data; it correctly predicted the orientation of the martensite that formed in the majority of grains, given the parent orientation and macroscopic applied load. Grains where the model failed tended to have reduced levels of stress just before forming martensite, which was attributed to the model's use of the global stress state rather than more granular measurements of the immediate stress field around a grain
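    The variant-selection logic behind a minimum-work-style criterion can be sketched as below: among the candidate martensite variants of a parent austenite grain, the favoured variant is the one whose transformation strain does the most work against the applied stress (equivalently, requires the least external strain work). The stress and transformation strains are written as flat Voigt-style 6-vectors, and all numbers in the usage example are invented, not data from this study:

```python
def interaction_work(stress, transformation_strain):
    """W = sigma : epsilon, evaluated as a dot product over flattened
    Voigt-style tensor components."""
    return sum(s * e for s, e in zip(stress, transformation_strain))

def predicted_variant(stress, variant_strains):
    """Select the variant whose transformation strain does the most work
    against the applied stress (a minimum-external-work-style criterion)."""
    return max(variant_strains,
               key=lambda v: interaction_work(stress, variant_strains[v]))

# Invented example: uniaxial tension along z (third normal component);
# v1 extends along z, v2 extends transversely and contracts along z.
stress = (0.0, 0.0, 100.0, 0.0, 0.0, 0.0)
variants = {"v1": (0.0, 0.0, 0.02, 0.0, 0.0, 0.0),
            "v2": (0.02, 0.0, -0.01, 0.0, 0.0, 0.0)}
```

    As the abstract notes, using the macroscopic stress in a criterion like this is exactly what breaks down for grains whose local stress field deviates from the global state.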

    Cyberspace and Real-World Behavioral Relationships: Towards the Application of Internet Search Queries to Identify Individuals At-risk for Suicide

    The Internet has become an integral and pervasive aspect of society. Not surprisingly, the growth of e-commerce has led to focused research on identifying relationships between user behavior in cyberspace and the real world: retailers track the items customers view and purchase in order to recommend additional products and better direct advertising. As the relationship between online search patterns and real-world behavior becomes better understood, the practice is likely to expand to other applications. Indeed, Google Flu Trends implemented an algorithm that accurately charts the relationship between the number of people searching for flu-related topics on the Internet and the number of people who actually have flu symptoms in a region. Because the results are real-time, studies show Google Flu Trends estimates are typically two weeks ahead of the Centers for Disease Control and Prevention. The Air Force has devoted considerable resources to suicide awareness and prevention. Despite these efforts, suicide rates have remained largely unaffected. The Air Force Suicide Prevention Program assists family, friends, and co-workers of airmen in recognizing and discussing behavioral changes with at-risk individuals. Given these successes in correlating behaviors in cyberspace and the real world, is it possible to leverage online activities to help identify individuals who exhibit suicidal or depression-related symptoms? This research explores the notion of using Internet search queries to classify individuals with common search patterns. Text mining was performed on user search histories for a one-month period from nine Air Force installations. The search histories were clustered based on search term probabilities, providing the ability to identify relationships between individuals searching for common terms. 
Analysis was then performed to identify relationships between individuals searching for key terms associated with suicide, anxiety, and post-traumatic stress
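    Grouping users by search-term probabilities, as described above, can be sketched with simple term-frequency vectors and cosine similarity; the key-term list, threshold, and example histories below are illustrative only and not the study's data or method:

```python
from collections import Counter
import math

def term_probabilities(queries):
    """Turn a user's search history into a term-probability vector."""
    counts = Counter(t for q in queries for t in q.lower().split())
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def cosine(p, q):
    """Cosine similarity between two sparse term-probability vectors."""
    num = sum(p[t] * q.get(t, 0.0) for t in p)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return num / (norm_p * norm_q) if norm_p and norm_q else 0.0

def flag_users(histories, key_terms, threshold=0.3):
    """Flag users whose term distribution is close to a key-term profile."""
    profile = {t: 1.0 / len(key_terms) for t in key_terms}
    return [user for user, queries in histories.items()
            if cosine(term_probabilities(queries), profile) >= threshold]
```

    A production pipeline would of course need stemming, stop-word removal, and a clustering step over all users rather than a single fixed profile, but the term-probability representation is the common core.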

    Operations research: from computational biology to sensor network

    In this dissertation we discuss the deployment of combinatorial optimization methods for modeling and solving real-life problems, with particular emphasis on two biological problems arising from a common scenario: the reconstruction of the three-dimensional shape of a biological molecule from Nuclear Magnetic Resonance (NMR) data. The first topic is the 3D assignment pathway problem (APP) for an RNA molecule. We prove that APP is NP-hard, and give a formulation based on edge-colored graphs. Since interactions between consecutive nuclei in the NMR spectrum differ according to the type of residue along the RNA chain, each color in the graph represents a type of interaction. We can thus represent the sequence of interactions as the problem of finding a longest (hamiltonian) path whose edges follow a given order of colors (i.e., the orderly colored longest path). We introduce three alternative IP formulations of APP, obtained with a max-flow problem on a directed graph with packing constraints over the partitions, which have been compared among themselves. Since the last two models work on cyclic graphs, for them we propose an algorithm based on the solution of their relaxation combined with the separation of cycle inequalities in a Branch & Cut scheme. The second topic is the discretizable distance geometry problem (DDGP), a formulation on a discrete search space of the well-known distance geometry problem (DGP). The DGP consists of seeking an embedding in space of an undirected graph, given a set of Euclidean distances between certain pairs of vertices. The DGP has two important applications: (i) finding the three-dimensional conformation of a molecule from a subset of interatomic distances (the Molecular Distance Geometry Problem), and (ii) the Sensor Network Localization Problem. 
We describe a Branch & Prune (BP) algorithm tailored for this problem, and two versions of it that solve the DDGP in both protein modeling and sensor network localization frameworks. BP is an exact and exhaustive combinatorial algorithm that examines all the valid embeddings of a given weighted graph G=(V,E,d), under the hypothesis that a given order on V exists. By comparing the two versions of BP to well-known algorithms, we demonstrate the efficiency of BP in both contexts, provided that the order imposed on V is maintained
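    The branching step of a BP-style algorithm can be illustrated in two dimensions, where each new vertex lies at one of at most two intersection points of two circles around already-placed vertices, and any extra known distances prune infeasible branches. This is a simplified sketch of the general scheme (real DDGP instances are three-dimensional and intersect spheres), not the dissertation's implementation:

```python
import math

def circle_intersections(p1, r1, p2, r2, tol=1e-9):
    """Return the (up to two) points at distance r1 from p1 and r2 from p2."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > r1 + r2 + tol or d < abs(r1 - r2) - tol:
        return []
    a = (r1 * r1 - r2 * r2 + d * d) / (2 * d)
    h = math.sqrt(max(0.0, r1 * r1 - a * a))
    mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
    pts = [(mx + h * dy / d, my - h * dx / d),
           (mx - h * dy / d, my + h * dx / d)]
    return pts if h > tol else pts[:1]

def branch_and_prune(distances, n, tol=1e-6):
    """Enumerate planar embeddings of vertices 0..n-1, assuming every vertex
    i >= 2 has known distances to i-1 and i-2 (the discretisation order);
    `distances` maps vertex pairs (j, i) with j < i to known lengths."""
    base = [(0.0, 0.0), (distances[(0, 1)], 0.0)]  # fix the first edge
    solutions = []

    def feasible(points, i, cand):
        # Prune: every extra known distance to vertex i must be respected.
        for j in range(i):
            d = distances.get((j, i))
            if d is not None and abs(
                    math.hypot(cand[0] - points[j][0],
                               cand[1] - points[j][1]) - d) > tol:
                return False
        return True

    def extend(points):
        i = len(points)
        if i == n:
            solutions.append(list(points))
            return
        for cand in circle_intersections(points[i - 2], distances[(i - 2, i)],
                                         points[i - 1], distances[(i - 1, i)]):
            if feasible(points, i, cand):
                extend(points + [cand])

    extend(base)
    return solutions
```

    For a unit triangle, the third vertex branches into the two mirror-image placements, which is exactly the binary tree of candidate embeddings that BP explores exhaustively; in the molecular and sensor-network settings, additional distances collapse most of these branches early.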