
    Certified Computation from Unreliable Datasets

    A wide range of learning tasks require human input in labeling massive data. The collected data, though, are usually of low quality and contain inaccuracies and errors. As a result, modern science and business face the problem of learning from unreliable data sets. In this work, we provide a generic approach based on verification of only a few records of the data set to guarantee high-quality learning outcomes for various optimization objectives. Our method identifies small sets of critical records and verifies their validity. We show that many problems need only poly(1/ε) verifications to ensure that the output of the computation is at most a factor of (1 ± ε) away from the truth. For any given instance, we provide an instance-optimal solution that verifies the minimum possible number of records to approximately certify correctness. Using this instance-optimal formulation of the problem, we prove our main result: "every function that satisfies some Lipschitz continuity condition can be certified with a small number of verifications". We show that the required Lipschitz continuity condition is satisfied even by some NP-complete problems, which illustrates the generality and importance of this theorem. In case this certification step fails, an invalid record will be identified. Removing these records and repeating until success guarantees that the result will be accurate and will depend only on the verified records. Surprisingly, as we show, for several computation tasks more efficient methods are possible. These methods always guarantee that the produced result is not affected by invalid records, since any invalid record that affects the output will be detected and verified.
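
    The verify-and-remove loop described in the abstract lends itself to a compact sketch. The following Python code is not from the paper; `compute`, `find_critical`, and `verify` are hypothetical stand-ins for the optimization objective, the critical-record identification, and the human verification step.

    ```python
    def certify_computation(records, compute, find_critical, verify):
        """Hypothetical verify-and-remove loop sketched from the abstract.

        compute(records)       -> candidate output on the current data set
        find_critical(records) -> small set of records whose validity must be
                                  checked to certify the output
        verify(record)         -> True if the record is valid (human check)
        """
        records = list(records)
        while True:
            output = compute(records)
            critical = find_critical(records)
            invalid = [r for r in critical if not verify(r)]
            if not invalid:
                # All critical records verified: the output depends only on
                # verified or irrelevant records, so it can be certified.
                return output, records
            # Certification failed: drop the detected invalid records, retry.
            records = [r for r in records if r not in invalid]

    # Toy usage: certify the mean of noisy labels, verifying suspected outliers.
    data = [1.0, 1.1, 0.9, 42.0]   # 42.0 is a corrupted record
    mean = lambda xs: sum(xs) / len(xs)
    suspects = lambda xs: [x for x in xs if abs(x - mean(xs)) > 3]
    is_valid = lambda x: x < 10    # stand-in for a human verifier
    print(certify_computation(data, mean, suspects, is_valid))
    ```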

    Application of advanced technology to space automation

    Automated operations in space provide the key to optimized mission design and data acquisition at minimum cost for the future. The results of this study strongly support this statement and should provide further incentive for immediate development of the specific automation technology defined herein. Essential automation technology requirements were identified for future programs. The study was undertaken to address the future role of automation in the space program, the potential benefits to be derived, and the technology efforts that should be directed toward obtaining these benefits.

    Applying close range digital photogrammetry in soil erosion studies

    Soil erosion due to rainfall and overland flow is a significant environmental problem. Studying the phenomenon requires accurate high-resolution measurements of soil surface topography and morphology. Close range digital photogrammetry with an oblique convergent configuration is proposed in this paper as a useful technique for such measurements, in the context of a flume-scale experimental study. The precision of the technique is assessed by comparing triangulation solutions and the resulting DEMs with varying tie point distributions and control point measurements, as well as by comparing DEMs extracted from different images of the same surface. Independent measurements were acquired using a terrestrial laser scanner for comparison with a DEM derived from photogrammetry. The results point to the need for a stronger geometric configuration to improve precision. They also suggest that the camera lens models were not fully adequate for the large object depths in this study. Nevertheless, the photogrammetric output can provide useful topographical information for soil erosion studies, provided the limitations of the technique are duly considered.
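
    One way to picture the DEM-to-DEM comparison the abstract describes is the following minimal Python sketch. It is not from the paper; it assumes two co-registered elevation grids and reports the mean error, RMSE, and standard deviation conventionally used to quantify DEM precision.

    ```python
    import numpy as np

    def dem_difference_stats(dem_a: np.ndarray, dem_b: np.ndarray):
        """Compare two co-registered DEMs of the same soil surface.

        dem_a, dem_b: 2-D arrays of elevations (m) on the same grid; NaN
        marks cells with no data (e.g., outside the survey overlap).
        """
        diff = dem_a - dem_b
        valid = diff[~np.isnan(diff)]
        mean_error = valid.mean()            # systematic offset between surfaces
        rmse = np.sqrt((valid ** 2).mean())  # overall disagreement
        std = valid.std()                    # random component of the error
        return mean_error, rmse, std

    # Toy example: a photogrammetric DEM vs. a reference DEM with 2 mm noise.
    rng = np.random.default_rng(0)
    truth = rng.random((100, 100)) * 0.05    # 0-5 cm of soil microrelief
    photo_dem = truth + rng.normal(0, 0.002, truth.shape)
    print(dem_difference_stats(photo_dem, truth))
    ```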

    Lower bounds on the size of semidefinite programming relaxations

    We introduce a method for proving lower bounds on the efficacy of semidefinite programming (SDP) relaxations for combinatorial problems. In particular, we show that the cut, TSP, and stable set polytopes on n-vertex graphs are not the linear image of the feasible region of any SDP (i.e., any spectrahedron) of dimension less than 2^(n^c), for some constant c > 0. This result yields the first super-polynomial lower bounds on the semidefinite extension complexity of any explicit family of polytopes. Our results follow from a general technique for proving lower bounds on the positive semidefinite rank of a matrix. To this end, we establish a close connection between arbitrary SDPs and those arising from the sum-of-squares SDP hierarchy. For approximating maximum constraint satisfaction problems, we prove that polynomial-size SDPs are equivalent in power to those arising from degree-O(1) sum-of-squares relaxations. This result implies, for instance, that no family of polynomial-size SDP relaxations can achieve better than a 7/8-approximation for MAX-3-SAT.
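
    As background for the positive semidefinite rank technique mentioned above, the standard slack-matrix view of SDP extension complexity (well-known prior work, not spelled out in this abstract) can be sketched as follows.

    ```latex
    % Standard slack-matrix view of SDP extension complexity (background,
    % not quoted from the paper). For a polytope
    %   P = { x : a_i^T x <= b_i, i = 1..m } = conv(x_1, ..., x_v),
    % the slack matrix is S_ij = b_i - a_i^T x_j. P is the linear image of
    % a spectrahedron of dimension d iff S admits a factorization
    %   S_ij = <A_i, B_j> with d x d positive semidefinite A_i, B_j,
    % i.e. iff rank_psd(S) <= d. The abstract's lower bound then reads
    \[
      \operatorname{rank}_{\mathrm{psd}}(S_P) \;\ge\; 2^{n^c}
      \qquad \text{for some constant } c > 0,
    \]
    % for the cut, TSP, and stable set polytopes on n-vertex graphs.
    ```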

    Project LOCOST: Laser or Chemical Hybrid Orbital Space Transport

    A potential mission in the late 1990s is the servicing of spacecraft assets located in GEO. The Geosynchronous Operations Support Center (GeoShack) will be supported by a space transfer vehicle based at the Space Station (SS). The vehicle will transport cargo between the SS and the GeoShack. A proposed unmanned, laser or chemical hybrid orbital space transfer vehicle (LOCOST) can be used to efficiently transfer cargo between the two orbits. A preliminary design shows that an unmanned, laser/chemical hybrid vehicle achieves the needed fuel savings while still providing fast trip times. The LOCOST vehicle receives a 12 MW laser beam from one Earth-orbiting, solar-pumped, iodide Laser Power Station (LPS). Two Energy Relay Units (ERU) provide laser beam support during periods of line-of-sight blockage by the Earth. The baseline mission specifies a 13 day round trip transfer time. The ship's configuration consists of an optical train, one hydrogen laser engine, two chemical engines, an 18 m by 29 m box truss, a mission-flexible payload module, and propellant tanks. Overall vehicle dry mass is 8,000 kg. Outbound cargo mass is 20,000 kg, and inbound cargo mass is 6,000 kg. The baseline mission needs 93,000 kg of propellants to complete the scenario. Fully fueled, outbound mission mass is 121,000 kg. A regeneratively cooled, single-plasma laser engine design producing a maximum of 768 N of thrust is utilized along with two traditional chemical engines. The payload module is designed to hold 40,000 kg of cargo, though the baseline mission specifies less. The proposed laser/chemical hybrid vehicle provides a trip-time- and propellant-efficient means to transport cargo from the SS to a GeoShack. Its unique hybrid propulsion system provides safety through redundancy and allows baseline missions to be executed efficiently, while still allowing for the possibility of larger cargo transfers.
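
    The quoted masses form a self-consistent outbound budget; a minimal arithmetic check, using only figures taken from the abstract:

    ```python
    # Outbound mass budget for the LOCOST baseline mission, using only the
    # figures quoted in the abstract (all masses in kg).
    dry_mass = 8_000         # vehicle dry mass
    outbound_cargo = 20_000  # cargo carried from the SS to the GeoShack
    propellant = 93_000      # propellant needed for the baseline scenario

    outbound_total = dry_mass + outbound_cargo + propellant
    assert outbound_total == 121_000  # matches the quoted fully fueled mass
    print(f"fully fueled outbound mass: {outbound_total} kg")
    ```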

    A Mobile Query Service for Integrated Access to Large Numbers of Online Semantic Web Data Sources

    From the Semantic Web’s inception, a number of concurrent initiatives have given rise to multiple segments: large semantic datasets, exposed by query endpoints; online Semantic Web documents, in the form of RDF files; and semantically annotated web content (e.g., using RDFa), semantic sources in their own right. In various mobile application scenarios, online semantic data has proven to be useful. While query endpoints are most commonly exploited, they are mainly useful to expose large semantic datasets. Alternatively, mobile RDF stores are utilized to query local semantic data, but this requires the design-time identification and replication of relevant data. Instead, we present a mobile query service that supports on-the-fly and integrated querying of semantic data, originating from a largely unused portion of the Semantic Web, comprising online RDF files and semantics embedded in annotated webpages. To that end, our solution performs dynamic identification, retrieval and caching of query-relevant semantic data. We explore several data identification and caching alternatives, and investigate the utility of source metadata in optimizing these tasks. Further, we introduce a novel cache replacement strategy, fine-tuned to the described query dataset, and include explicit support for the Open World Assumption. An extensive experimental validation evaluates the query service and its alternative components.
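
    The dynamic retrieval-and-caching step described above can be sketched in Python as follows. This is not the paper's design: the paper proposes a custom replacement strategy fine-tuned to its query dataset, whereas this sketch substitutes plain LRU replacement, and `fetch` is a hypothetical downloader/parser for online RDF files or annotated pages.

    ```python
    from collections import OrderedDict

    class SourceCache:
        """Minimal sketch of a cache for dynamically retrieved RDF sources.

        fetch(url) is a hypothetical function that downloads and parses an
        online RDF file or RDFa-annotated page into a set of triples.
        """
        def __init__(self, capacity: int, fetch):
            self.capacity = capacity
            self.fetch = fetch
            self.entries = OrderedDict()   # url -> parsed triples

        def get(self, url: str):
            if url in self.entries:
                self.entries.move_to_end(url)      # mark as recently used
                return self.entries[url]
            triples = self.fetch(url)              # retrieve on the fly
            self.entries[url] = triples
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)   # evict least recently used
            return triples
    ```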