
    Cyclic schedules for r irregularly occurring events

    Consider r irregular polygons with vertices on some circle. The authors explain how the polygons should be arranged to minimize a criterion function depending on the distances between adjacent vertices. A solution of this problem is given. It is based on a decomposition of the set of all schedules into local regions in which the optimization problem is convex. For the criteria of minimizing the maximum distance and maximizing the minimum distance, the local optimization problems are related to network flow problems which can be solved efficiently. If the sum of squared distances is to be minimized, a locally optimal solution can be found by solving a system of linear equations. For fixed r, the global problem is polynomially solvable for all of the above-mentioned objective functions. In the general case, however, the global problem is NP-hard.
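
    As a rough illustration of the local subproblem for the sum-of-squared-distances criterion, the following Python sketch fixes one cyclic order of all vertices (a single "local region"), writes every gap between adjacent vertices as an affine function of the polygons' rotation offsets, and solves the resulting least-squares problem. The polygon angles, the interleaving order, and the use of NumPy are illustrative assumptions rather than the paper's algorithm, and the order constraints that bound a local region are ignored here.

        # Minimal sketch (not the paper's method): local sum-of-squared-distances
        # subproblem for a fixed cyclic order of vertices; toy data throughout.
        import numpy as np

        polygons = [np.array([0.0, 2.0, 4.5]),   # fixed vertex angles (radians) of polygon 0
                    np.array([0.5, 3.0])]        # fixed vertex angles of polygon 1
        # assumed cyclic order of all vertices as (polygon index, vertex index) pairs
        order = [(0, 0), (1, 0), (0, 1), (1, 1), (0, 2)]

        r, n = len(polygons), len(order)
        A = np.zeros((n, r))   # coefficients of the rotation offsets x_0..x_{r-1} in each gap
        b = np.zeros(n)        # constant part of each gap
        for i in range(n):
            (p_cur, v_cur), (p_nxt, v_nxt) = order[i], order[(i + 1) % n]
            A[i, p_nxt] += 1.0
            A[i, p_cur] -= 1.0
            b[i] = polygons[p_nxt][v_nxt] - polygons[p_cur][v_cur] + (2 * np.pi if i == n - 1 else 0.0)

        # Fix x_0 = 0 (a global rotation leaves all distances unchanged) and
        # minimise the convex quadratic sum(gap_i^2) by ordinary least squares.
        x_rest, *_ = np.linalg.lstsq(A[:, 1:], -b, rcond=None)
        x = np.concatenate(([0.0], x_rest))
        gaps = A @ x + b
        print("offsets:", x, "gaps:", gaps, "objective:", np.sum(gaps ** 2))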

    Routing of railway carriages: A case study

    In the context of organizing timetables for railway companies, the following railway carriage routing problem occurs. Given a timetable containing rail links with departure and destination times/stations and the composition of the trains, find a routing of railway carriages such that the required carriages are always available when a train departs. We present a local search approach to this carriage routing problem. The approach uses structural properties of an integer multi-commodity network flow formulation of the problem. Computational results for a real-world instance are given.
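
    To make the flavour of such a formulation concrete, the Python sketch below treats each carriage type as a commodity that a trip consumes at its departure station and releases at its arrival station, and counts the carriage shortage that a local search would try to drive to zero. The trips, stock levels, and greedy starting assignment are invented for illustration and are not the case-study model.

        # Hypothetical toy timetable: (trip id, from, dep time, to, arr time, carriage type, number required)
        from collections import defaultdict

        trips = [
            ("T1", "A", 8, "B", 9, "2nd", 3),
            ("T2", "B", 10, "A", 11, "2nd", 2),
            ("T3", "A", 12, "B", 13, "2nd", 3),
        ]
        stock = {("A", "2nd"): 3, ("B", "2nd"): 0}   # carriages parked at each station at time 0

        available = defaultdict(int, stock)          # (station, type) -> carriages currently there
        shortage = 0
        # toy simplification: every arrival here precedes the next departure it could feed
        for trip_id, src, dep, dst, arr, ctype, need in sorted(trips, key=lambda t: t[2]):
            used = min(available[(src, ctype)], need)
            shortage += need - used                  # carriages that would have to be repositioned or added
            available[(src, ctype)] -= used
            available[(dst, ctype)] += used          # released at the destination after arrival
        print("total shortage (objective a local search would reduce):", shortage)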

    SnowEx 2017 Community Snow Depth Measurements: A Quality-Controlled, Georeferenced Product

    Snow depth was one of the core ground measurements required to validate remotely sensed data collected during SnowEx Year 1, which occurred in Colorado. The use of a single, common protocol was fundamental to produce a community reference dataset of high quality. Most of the nearly 100 Grand Mesa and Senator Beck Basin SnowEx ground crew participants contributed to this crucial dataset during 6-25 February 2017. Snow depths were measured along ~300 m transects, whose locations were determined according to a random-stratified approach using snowfall and tree-density gradients. Two-person teams used snowmobiles, skis, or snowshoes to travel to staked transect locations and to conduct measurements. Depths were measured with a 1-cm incremented probe every 3 meters along transects. In shallow areas of Grand Mesa, depth measurements were also collected with GPS snow-depth probes (a.k.a. MagnaProbes) at ~1-m intervals. During summer 2017, all reference stake positions were surveyed with <10 cm accuracy to improve overall snow depth location accuracy. During the campaign, 193 transects were measured over three weeks at Grand Mesa and 40 were collected over two weeks in Senator Beck Basin, representing more than 27,000 depth values. Each day of the campaign, depth measurements were written in waterproof field books and photographed by National Snow and Ice Data Center (NSIDC) participants. The data were later transcribed and prepared for extensive quality assessment and control. Common issues such as protocol errors (e.g., surveying in the reverse direction), notebook image issues (e.g., a halo in the center of a digitized picture), and data-entry errors (sloppy writing and transcription errors) were identified and fixed on a point-by-point basis. In addition, we strove to produce a georeferenced product of high quality, so we calculated and interpolated coordinates for every depth measurement based on the surveyed stakes and the number of measurements made per transect. The product has been submitted to NSIDC in CSV format. To educate data users, we present the study design and processing steps that have improved the quality and usability of this product. We also address measurement and design uncertainties, which differ between open and forested areas.
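
    A minimal Python sketch of that georeferencing step, assuming only the surveyed coordinates of a transect's two end stakes and simple linear interpolation spaced by the number of measurements actually recorded (the real processing may have differed in detail; all numbers are invented):

        import numpy as np

        start_stake = np.array([744312.0, 4322105.0])   # surveyed transect start (easting, northing, metres)
        end_stake   = np.array([744608.0, 4322121.0])   # surveyed transect end
        n_meas = 101                                     # depth values recorded along this transect

        fractions = np.linspace(0.0, 1.0, n_meas)        # 0 at the start stake, 1 at the end stake
        coords = start_stake + fractions[:, None] * (end_stake - start_stake)

        spacing = np.linalg.norm(end_stake - start_stake) / (n_meas - 1)
        print(f"nominal spacing: {spacing:.2f} m")       # close to 3 m if the protocol was followed
        print(coords[:3])                                # coordinates assigned to the first three measurements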

    Weekly Gridded Aquarius L-band Radiometer-Scatterometer Observations and Salinity Retrievals over the Polar Regions - Part 2: Initial Product Analysis

    Following the development and availability of the Aquarius weekly polar-gridded products, this study presents the spatial and temporal radiometer and scatterometer observations at L-band (frequency 1.4 GHz) over the cryosphere, including the Greenland and Antarctic ice sheets, sea ice in both hemispheres, and over sub-Arctic land for monitoring the soil freeze-thaw state. We provide multiple examples of scientific applications for the L-band data over the cryosphere. For example, we show that over the Greenland Ice Sheet, the unusual 2012 melt event led to a sustained decrease of 5 K in the L-band brightness temperature (TB) at horizontal polarization. Over the Antarctic ice sheet, normalized radar cross section (NRCS) observations recorded during ascending and descending orbits are significantly different, highlighting the anisotropy of the ice cover. Over sub-Arctic land, both passive and active observations show distinct values depending on the soil physical state (freeze-thaw). Aquarius sea surface salinity (SSS) retrievals in the polar waters are also presented. SSS variations could serve as an indicator of fresh water input to the ocean from the cryosphere; however, the presence of sea ice often contaminates the SSS retrievals, hindering the analysis. The weekly gridded Aquarius L-band products are distributed by the U.S. National Snow and Ice Data Center at http://nsidc.org/data/aquarius/index.html and show potential for cryospheric studies.

    Using the Isabelle ontology framework: Linking the formal with the informal

    While Isabelle is mostly known as part of Isabelle/HOL (an interactive theorem prover), it actually provides a framework for developing a wide spectrum of applications. A particular strength of the Isabelle framework is the combination of text editing, formal verification, and code generation. Up to now, Isabelle’s document preparation system has lacked a mechanism for ensuring the structure of different document types (as, e.g., required in certification processes) in general and, in particular, a mechanism for linking informal and formal parts of a document. In this paper, we present Isabelle/DOF, a novel Document Ontology Framework on top of Isabelle. Isabelle/DOF allows for conventional typesetting as well as formal development. We show how to model document ontologies inside Isabelle/DOF, how to use the resulting meta-information for enforcing a certain document structure, and we discuss ontology-specific IDE support.

    Inverse scheduling: two-machine flow-shop problem

    We study an inverse counterpart of the two-machine flow-shop scheduling problem that arises in the context of inverse optimization. While in the forward scheduling problem all parameters are given and the objective is to find job sequence(s) for which the value of the makespan is minimum, in the inverse scheduling problem the exact values of the processing times are unknown and should be selected within given boundaries so that pre-specified job sequence(s) become optimal. We derive necessary and sufficient conditions for the optimality of a given solution in the general case of the flow-shop problem, in which the job sequences on the machines can be different. Based on these conditions, we prove that the inverse flow-shop problem is NP-hard even in the case of the same job sequence on both machines, and we produce a linear programming formulation for a special case which can be solved efficiently.
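
    For readers less familiar with the forward problem referred to above, the classical way to minimise the makespan of a two-machine flow shop for given processing times is Johnson's rule. The Python sketch below implements that rule on invented data; it is background for the forward problem only and is not part of the paper's inverse method.

        def johnson_sequence(jobs):
            """jobs: list of (name, a, b), a = time on machine 1, b = time on machine 2."""
            first = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])   # a <= b: increasing a
            last = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])    # a > b: decreasing b
            return first + last

        def makespan(seq):
            t1 = t2 = 0
            for _, a, b in seq:
                t1 += a                  # machine 1 finishes this job at t1
                t2 = max(t2, t1) + b     # machine 2 starts once both it and the job are ready
            return t2

        jobs = [("J1", 3, 6), ("J2", 5, 2), ("J3", 1, 2), ("J4", 6, 6)]
        seq = johnson_sequence(jobs)
        print([j[0] for j in seq], "makespan:", makespan(seq))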

    A polynomial-time algorithm for a flow-shop batching problem with equal-length operations

    A flow-shop batching problem with consistent batches is considered in which the processing times of all jobs on each machine are equal to p and all batch set-up times are equal to s. In such a problem, one has to partition the set of jobs into batches and to schedule the batches on each machine. The processing time of a batch B_i is the sum of the processing times of the operations in B_i, and the earliest start of B_i on a machine is the finishing time of B_i on the previous machine plus the set-up time s. Cheng et al. (Naval Research Logistics 47:128–144, 2000) provided an O(n) pseudopolynomial-time algorithm for solving the special case of the problem with two machines. Mosheiov and Oron (European Journal of Operational Research 161:285–291, 2005) developed an algorithm of the same time complexity for the general case with more than two machines. Ng and Kovalyov (Journal of Scheduling 10:353–364, 2007) improved the pseudopolynomial complexity to O(√n). In this paper, we provide a polynomial-time algorithm of time complexity O(log³ n).
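
    The Python sketch below merely evaluates the makespan of one candidate batching under a direct reading of the rule stated above (every operation takes p, a batch's processing time is the sum of its operations, and batch B_i may start on a machine no earlier than the machine becomes free and no earlier than its own finish on the previous machine plus the set-up s). It is not the O(log³ n) algorithm of the paper; the batch sizes and parameters are invented, and other set-up conventions exist.

        def batching_makespan(batch_sizes, m, p, s):
            finish_prev = [0.0] * len(batch_sizes)       # finish times on the previous machine (zeros before machine 1)
            for _ in range(m):                           # machines 1..m
                finish, machine_free = [], 0.0
                for i, size in enumerate(batch_sizes):
                    start = max(machine_free, finish_prev[i] + s)
                    machine_free = start + size * p      # batch processing time = size * p
                    finish.append(machine_free)
                finish_prev = finish
            return finish_prev[-1]                       # completion of the last batch on the last machine

        # 10 jobs split into consistent batches of sizes 4, 3 and 3 on m = 3 machines
        print(batching_makespan([4, 3, 3], m=3, p=2, s=5))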