107 research outputs found

    Minmax subtree cover problem on cacti

    Abstract: Let G=(V,E) be a connected graph whose edges and vertices are weighted by nonnegative reals, and let p be a positive integer. The minmax subtree cover problem (MSC) asks to find a pair (X,T) of a partition X={X1,X2,…,Xp} of V and a set T of p subtrees T1,T2,…,Tp, each Ti containing Xi, so as to minimize the maximum cost of the subtrees, where the cost of Ti is defined to be the sum of the weights of edges in Ti and the weights of vertices in Xi. In this paper, we propose an O(p^2 n) time (4-4/(p+1))-approximation algorithm for the MSC when G is a cactus.
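    To make the objective concrete, here is a minimal sketch of evaluating a candidate MSC solution; the graph representation and all names are illustrative assumptions, not taken from the paper:

```python
# Hedged sketch of the MSC objective: cost of each subtree T_i is the sum of its
# edge weights plus the weights of the vertices in its class X_i; the solution
# value is the maximum such cost. Representation (dicts of weights) is assumed.

def subtree_cost(tree_edges, covered_vertices, edge_w, vertex_w):
    """Cost of one subtree T_i covering vertex class X_i."""
    return sum(edge_w[e] for e in tree_edges) + sum(vertex_w[v] for v in covered_vertices)

def msc_objective(partition, subtrees, edge_w, vertex_w):
    """Objective of a candidate solution (X, T): the maximum subtree cost."""
    return max(subtree_cost(T, X, edge_w, vertex_w)
               for X, T in zip(partition, subtrees))

# Toy instance: a path a-b-c split between p = 2 subtrees.
edge_w = {("a", "b"): 2.0, ("b", "c"): 3.0}
vertex_w = {"a": 1.0, "b": 0.0, "c": 4.0}
partition = [{"a", "b"}, {"c"}]
subtrees = [[("a", "b")], [("b", "c")]]
print(msc_objective(partition, subtrees, edge_w, vertex_w))  # max(3.0, 7.0) = 7.0
```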

    A Practical Course in Translating Computer Systems Terminology from English into Ukrainian

    The textbook is structured so that each chapter contains material and exercises through which students are meant to acquire the translation skills listed above, namely: familiarization with texts that build background knowledge (texts and mini-texts containing the relevant terminology and describing a given phenomenon, for example, how web sites work); exercises testing knowledge of terminological equivalents, in particular, translation of mini-texts from English into Ukrainian and vice versa; and listening exercises that reinforce the terminology and prepare the translator for interpreting in this field.

    The Multi-Depot Minimum Latency Problem with Inter-Depot Routes

    The Minimum Latency Problem (MLP) is a class of routing problems that seeks to minimize the wait times (latencies) of a set of customers in a system. Like its counterparts the Traveling Salesman Problem (TSP) and the Vehicle Routing Problem (VRP), the MLP is NP-hard. Unlike those problem classes, however, the MLP is customer-oriented and thus has strong potential for better serving customers in settings where they are the highest priority. While the VRP is widely researched and applied in many industry settings to reduce travel times and costs for service providers, the MLP is a more recent problem and does not have nearly the body of literature supporting it. It is gaining significant attention, however, because of its application to areas such as disaster relief logistics, a growing problem area in a global context with potential for meaningful improvements that translate into reduced suffering and saved lives. An effective combination of MLPs and route-minimizing objectives can help relief agencies provide aid efficiently and within a manageable cost.

    To further the body of literature on the MLP and its applications to such settings, a new variant is introduced here: the Multi-Depot Minimum Latency Problem with Inter-Depot Routes (MDMLPI). This problem seeks to minimize the cumulative arrival times at all customers in a system serviced by multiple vehicles and depots. Vehicles depart from one central depot and have the option of refilling their supply at a number of intermediate depots. While the equivalent problem has been studied with a VRP objective function, this is a new variant of the MLP; accordingly, a mathematical model is introduced along with several heuristics that provide the first solution approaches. Two objectives are considered: minimizing latency, the arrival time at each customer, and minimizing weighted latency, the product of customer need and arrival time at that customer. Weighted latency carries additional significance because it may correspond to a larger number of customers at one location, adding emphasis to the speed with which they are serviced. A discussion of fairness and of application to disaster relief settings is maintained throughout; to reflect this, the standard deviation among latencies is also evaluated as a measure of fairness in each solution approach.

    Two heuristic approaches, as well as a second-phase adjustment applicable to each, are introduced. The first is based on an auction policy in which customers bid to be the next stop on a vehicle's tour. The second uses an insertion technique in which customers are inserted one by one into a partial routing solution such that each addition minimizes the (weighted) latency impact of that single customer. The second-phase modification takes the initial solutions produced by the two heuristics and considers the (weighted) latency impact of repositioning nodes one at a time; this removes inefficient routing placements, whose effects would otherwise compound for all ensuing stops on the tour. Each approach is implemented on ten test instances, with a nearest-neighbor (greedy) policy and previous VRP-objective solutions to these instances used as benchmarks. Both heuristics perform well against these benchmarks. Neither heuristic clearly outperforms the other, although the auction policy achieves slightly better averages on the performance measures. The second-phase adjustment yields further reductions in latency and standard deviation for both objectives. The value of these latency reductions is demonstrated, and a call for further research on customer-oriented objectives and on evaluating fairness in routing solutions is made. Finally, several promising areas for future work and existing gaps in the literature are highlighted. As the body of literature surrounding the MLP is small yet growing, these areas constitute strong directions with important relevance to Operations Research, Humanitarian Logistics, Production Systems, and more.
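    For readers unfamiliar with the two objectives, here is a hedged sketch of latency and weighted latency on a single vehicle route; the route encoding and travel-time table are illustrative assumptions, not the thesis model:

```python
# Sketch of the MLP objectives on one route that starts at a depot. Latency of a
# customer is its arrival time; weighted latency scales each arrival by need.

def latencies(route, travel_time, start=0.0):
    """Arrival time (latency) at each customer along the route."""
    arrivals, t = [], start
    for prev, nxt in zip(route, route[1:]):
        t += travel_time[(prev, nxt)]
        arrivals.append(t)
    return arrivals

def total_latency(route, travel_time):
    return sum(latencies(route, travel_time))

def weighted_latency(route, travel_time, need):
    """Weighted variant: each arrival is scaled by the customer's need."""
    return sum(need[c] * t for c, t in zip(route[1:], latencies(route, travel_time)))

route = ["depot", "c1", "c2"]
tt = {("depot", "c1"): 4.0, ("c1", "c2"): 2.0}
need = {"c1": 3, "c2": 1}
print(total_latency(route, tt))           # 4.0 + 6.0 = 10.0
print(weighted_latency(route, tt, need))  # 3*4.0 + 1*6.0 = 18.0
```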

    The New Hampshire, Vol. 51, No. 22 (Feb. 22, 1962)

    An independent student-produced newspaper from the University of New Hampshire.

    Prometheus: a generic e-commerce crawler for the study of business markets and other e-commerce problems

    Master's dissertation in Computer Science. Continuous social and economic development has led over time to an increase in consumption, as well as greater demand from consumers for better and cheaper products. The selling price of a product therefore plays a fundamental role in the consumer's purchase decision. In this context, online stores must carefully analyse and define the best price for each product, based on factors such as production/acquisition cost, positioning of the product (e.g. as an anchor product), and the strategies of competing companies. The work done by market analysts has changed drastically over recent years: as the number of web sites grows exponentially, the number of e-commerce web sites has prospered with it, and web page classification has become more important in fields like Web mining and information retrieval. Traditional classifiers are usually hand-crafted and non-adaptive, which makes them inappropriate in a broader context. We introduce an ensemble of methods, and a posterior study of their results, to create a more generic and modular crawler and scraper for detection and information extraction on e-commerce web pages. The collected information may then be processed and used in pricing decisions. This framework goes by the name Prometheus and has the goal of extracting knowledge from e-commerce web sites. The process requires crawling an online store and gathering product pages, which implies that, given a web page, the framework must be able to determine whether it is a product page. To achieve this we classify pages into three categories: catalogue, product, and "spam". The page classification stage was addressed based on the HTML text as well as on the visual layout, featuring both traditional methods and Deep Learning approaches. Once a set of product pages has been identified, we proceed to the extraction of pricing information. This is not a trivial task due to the disparity of approaches used to create web pages. Furthermore, most product pages are dynamic in the sense that they are really pages for a family of related products: when visiting a shoe store, for example, a particular model probably comes in a number of sizes and colours, all displayed on a single dynamic page, making it necessary for our framework to explore all the relevant combinations. This process is called scraping and is the last stage of the Prometheus framework.
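    As a point of reference for the scraping stage, here is a minimal sketch of extracting a price from product-page HTML; the CSS selectors and regular expression are hypothetical, and real stores differ widely, which is exactly why the framework combines several detection methods:

```python
# Illustrative price extraction from a product page using BeautifulSoup.
# The selectors below are assumptions for the sake of the example.
import re
from bs4 import BeautifulSoup

PRICE_RE = re.compile(r"(\d+[.,]\d{2})")

def extract_price(html, selectors=("span.price", "[itemprop=price]")):
    soup = BeautifulSoup(html, "html.parser")
    for css in selectors:  # try store-specific selectors in order
        node = soup.select_one(css)
        if node and (m := PRICE_RE.search(node.get_text())):
            return float(m.group(1).replace(",", "."))
    return None  # no price found: page may be catalogue or "spam"

page = '<html><body><span class="price">EUR 49,90</span></body></html>'
print(extract_price(page))  # 49.9
```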

    Organize

    Digital media technologies re-pose the question of organization - and thus of power and domination, control and surveillance, disruption and emancipation. This book interrogates organization as effect and condition of media. How can we understand the recursive relationship between media and organization? How can we think, explore, critique - and perhaps alter - the organizational bodies and scripts that shape contemporary life?


    Impact of IIoT and Lean bundles configurations on proactive work behaviors

    This thesis investigates the impact of IIoT and Lean bundles configurations on team proactivity, which is calculated as the mean of individual proactivity values. It is divided into four chapters: the first introduces Industry 4.0 and describes its main technologies; the second reviews the literature on the integration between Industry 4.0 and Lean Production; the third describes the Qualitative Comparative Analysis approach; and the last discusses the results of the analysis.
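    The aggregation described above amounts to a simple arithmetic mean; the notation below is ours, introduced for illustration rather than taken from the thesis:

```latex
% Team proactivity as the mean of the n individual proactivity scores p_i.
P_{\mathrm{team}} = \frac{1}{n} \sum_{i=1}^{n} p_i
```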

    The Ledger and Times, March 21, 1968
