    Performance of concatenated codes using 8-bit and 10-bit Reed-Solomon codes

    The performance improvement of concatenated coding systems using 10-bit instead of 8-bit Reed-Solomon codes is measured by simulation. Three inner convolutional codes are considered: (7,1/2), (15,1/4), and (15,1/6). It is shown that approximately 0.2 dB can be gained at a bit error rate of 10⁻⁶. The loss due to nonideal interleaving is also evaluated. Performance comparisons at very low bit error rates may be relevant for systems using data compression.
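
    Why wider symbols help, in a nutshell: an RS(n, k) code over m-bit symbols needs n <= 2^m - 1 and corrects t = (n - k)/2 symbol errors, so 10-bit symbols allow codewords four times longer at the same rate, spreading protection over longer error bursts. The minimal Python sketch below compares the CCSDS-standard 8-bit (255,223) code with a hypothetical 10-bit code of similar rate; the 10-bit parameters are chosen for illustration only, not taken from the paper.

    # Illustrative sketch: compare an 8-bit and a 10-bit Reed-Solomon
    # code of similar rate. The (1023,895) parameters are hypothetical.
    def rs_summary(m, n, k):
        """Basic figures of merit for an RS(n, k) code over GF(2^m)."""
        assert n <= 2**m - 1, "codeword length exceeds field size"
        t = (n - k) // 2              # correctable symbol errors per codeword
        return {
            "symbol_bits": m,
            "rate": round(k / n, 4),  # fraction of codeword carrying data
            "t_symbols": t,
            "t_bits_max": t * m,      # worst-case correctable bit span
        }

    print(rs_summary(8, 255, 223))    # CCSDS-standard 8-bit code
    print(rs_summary(10, 1023, 895))  # hypothetical 10-bit counterpart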

    Phobos lander coding system: Software and analysis

    The software developed for the decoding system used in the telemetry link of the Phobos Lander mission is described. Encoders and decoders are provided to cover the three possible telemetry configurations. The software can be used to decode actual data or to simulate the performance of the telemetry system. The theoretical properties of the codes chosen for this mission are analyzed and discussed.

    Further results on finite-state codes

    A general construction for finite-state (FS) codes is applied to some well-known block codes. Subcodes of the (24,12) Golay code are used to generate two optimal FS codes with d_free = 12 and 16. A partition of the (16,8) Nordstrom-Robinson code yields a d_free = 10 FS code. Simulation results are shown and decoding algorithms are briefly discussed.
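
    The d_free of a finite-state code generalizes the minimum distance of a block code; for a linear block code that distance is simply the smallest Hamming weight over all nonzero codewords. The sketch below brute-forces this quantity for the small (7,4) Hamming code as a toy stand-in, purely to make the metric concrete. The Golay subcodes above are far larger, and the Nordstrom-Robinson code is nonlinear, so its distance requires a pairwise search instead.

    from itertools import product

    # Brute-force the minimum distance of a small binary linear block
    # code from its generator matrix (toy example: the (7,4) Hamming code).
    G = [
        [1, 0, 0, 0, 0, 1, 1],
        [0, 1, 0, 0, 1, 0, 1],
        [0, 0, 1, 0, 1, 1, 0],
        [0, 0, 0, 1, 1, 1, 1],
    ]

    def min_distance(G):
        n, k = len(G[0]), len(G)
        best = n
        for msg in product([0, 1], repeat=k):       # all 2^k messages
            if not any(msg):
                continue                            # skip the zero codeword
            word = [sum(m * g for m, g in zip(msg, col)) % 2
                    for col in zip(*G)]             # encode: msg * G over GF(2)
            best = min(best, sum(word))             # Hamming weight
        return best

    print(min_distance(G))  # -> 3 for the (7,4) Hamming code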

    Modeling near-field tsunami observations to improve finite-fault slip models for the 11 March 2011 Tohoku earthquake

    The massive tsunami generated by the 11 March 2011 Tohoku earthquake (M_w 9.0) was widely recorded by GPS buoys, wave gauges, and ocean bottom pressure sensors around the source. Numerous inversions for finite-fault slip time histories have been performed using seismic and/or geodetic observations, yielding generally consistent patterns of large co-seismic slip offshore near the hypocenter and/or up-dip near the trench, where estimated peak slip is ~60 m. Modeling the tsunami generation and near-field wave processes using two detailed rupture models, obtained from either teleseismic P waves or high-rate GPS recordings in Japan, allows evaluation of how well the finite-fault models account for the regional tsunami data. By determining the sensitivity of the tsunami calculations to rupture model features, we identify model modifications that improve the fit to the diverse tsunami data while retaining the fit to the seismic and geodetic observations.
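
    For orientation, near-field tsunami travel-time modeling rests on the long-wave (shallow-water) approximation, in which the phase speed is c = sqrt(g h) for water depth h, so arrival times at nearby gauges are strongly controlled by bathymetry and source location. The minimal sketch below uses illustrative depths and a made-up path length, not mission bathymetry.

    import math

    # Long-wave phase speed c = sqrt(g * h); depths are illustrative.
    g = 9.81  # gravitational acceleration, m/s^2

    def tsunami_speed(depth_m):
        """Long-wave phase speed in m/s for water of the given depth."""
        return math.sqrt(g * depth_m)

    for h in (4000.0, 1000.0, 100.0):   # open ocean -> shelf -> coast
        c = tsunami_speed(h)
        print(f"depth {h:6.0f} m: {c:6.1f} m/s ({c * 3.6:6.0f} km/h)")

    # Rough travel time over a hypothetical 100 km path at 1000 m depth:
    print(f"{100e3 / tsunami_speed(1000.0) / 60:.0f} min")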

    Process waste analysis for offsite production methods for house construction – A case study of factory wall panel production

    There is growing interest in the use of offsite manufacturing (OSM) in the construction industry, despite criticism that some offsite approaches adopted by housebuilders deliver little real improvement over their onsite counterparts. Quantitative performance measures from previous studies are based on conventional onsite methods, with little attention paid to the performance and process improvements derived from various OSM methods. In response, a case study was conducted on two OSM methods, using standardized and non-standardized processes, for the production stage of a factory-manufactured wall panel. Value stream analysis and root cause analysis using the 5 Whys method were adopted to evaluate possible improvements in terms of process waste. The study reveals that OSM production methods that replicate site arrangements and activities involving significant manual tasks do not necessarily provide a marked improvement over the conventional onsite method. There is therefore a need to re-evaluate the processes involved to eliminate embedded process wastes such as non-value-added time and cost, and to consider automating critical activities. The analysis adopted in the case study provides measurable evidence of the performance gained from a structured workflow over a non-structured one, and reveals how process wastes are generated in the offsite production of wall panels.
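
    As a concrete illustration of the waste bookkeeping described above, the lean metric behind such comparisons splits lead time into value-adding and non-value-adding work. The Python sketch below computes that split for a hypothetical wall-panel task list; the task names and durations are invented, not the study's data.

    # Classify each production task as value-adding or not, then report
    # the share of lead time that actually adds value (task data hypothetical).
    tasks = [
        # (task, minutes, value_adding?)
        ("frame assembly",         45, True),
        ("wait for crane",         20, False),
        ("sheathing fixing",       30, True),
        ("rework misaligned stud", 15, False),
        ("insulation fitting",     25, True),
        ("move panel to storage",  10, False),
    ]

    total = sum(m for _, m, _ in tasks)
    value_added = sum(m for _, m, va in tasks if va)
    waste = total - value_added

    print(f"lead time:        {total} min")
    print(f"value-added time: {value_added} min")
    print(f"process waste:    {waste} min "
          f"({100 * waste / total:.0f}% of lead time)")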

    Towards an ontology-based approach to measuring productivity for offsite manufacturing method

    The steady decline of manual and skilled trades in the construction industry has increased recognition of offsite manufacturing (OSM), an aspect of Design for Manufacture and Assembly (DFMA), as one way to boost productivity and performance. However, existing productivity estimation approaches are carried out in isolation, which limits the results such systems can provide. There is also no holistic approach that enables productivity estimation using different metrics and integrates expert knowledge to predict productivity and guide decision making at the early development stage of a project. This study aims to develop a method that generates estimates for all these metrics simultaneously by linking their relationships. An ontology-based knowledge modelling approach for estimating productivity at the production stage of OSM projects is proposed. A case study of an offsite panel system is used as a proof of concept for data collection and knowledge modelling in an ontology. Through rules and semantic reasoning, the study retrieved cost estimates and time schedules for panel system production under different design choices. This demonstrates that systemising the production process knowledge of OSM methods enables practitioners to make informed design choices that best suit productivity requirements. The developed method reduces uncertainty by providing measurable evidence and supports better decision-making on productivity.
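
    A minimal sketch of the idea, using the rdflib Python library with a hypothetical osm: vocabulary (the class and property names below are illustrative, not the paper's ontology): production knowledge about panel designs is stored as triples, and a query retrieves cost and schedule estimates under a design constraint.

    from rdflib import Graph, Literal, Namespace, RDF, XSD

    # Hypothetical vocabulary for production-stage knowledge.
    OSM = Namespace("http://example.org/osm#")
    g = Graph()
    g.bind("osm", OSM)

    # Two illustrative panel designs with unit cost and production time.
    for name, cost, hours in [("PanelA", 1200.0, 6.5), ("PanelB", 950.0, 8.0)]:
        panel = OSM[name]
        g.add((panel, RDF.type, OSM.WallPanel))
        g.add((panel, OSM.unitCost, Literal(cost, datatype=XSD.decimal)))
        g.add((panel, OSM.productionHours, Literal(hours, datatype=XSD.decimal)))

    # Retrieve designs producible within a one-shift (8 h) schedule,
    # cheapest first.
    q = """
    SELECT ?panel ?cost ?hours WHERE {
        ?panel a osm:WallPanel ;
               osm:unitCost ?cost ;
               osm:productionHours ?hours .
        FILTER (?hours <= 8.0)
    } ORDER BY ?cost
    """
    for row in g.query(q):
        print(row.panel, row.cost, row.hours)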

    Compressed/reconstructed test images for CRAF/Cassini

    A set of compressed, then reconstructed, test images submitted to the Comet Rendezvous Asteroid Flyby (CRAF)/Cassini project is presented as part of its evaluation of near-lossless, high-compression algorithms for representing image data. A total of seven test image files were provided by the project. The seven test images were compressed, then reconstructed with high quality (root mean square error of approximately one or two gray levels on an 8-bit gray scale), using discrete cosine transforms or Hadamard transforms and efficient entropy coders. The resulting compression ratios varied from about 2:1 to about 10:1, depending on the activity or randomness in the source image. This was accomplished without any special effort to optimize the quantizer or to introduce special postprocessing to filter the reconstruction errors. A more complete set of measurements, showing the relative performance of the compression algorithms over a wide range of compression ratios and reconstruction errors, shows that additional compression is possible at a small sacrifice in fidelity.
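
    A minimal sketch of the transform-quantize-reconstruct loop described above, using a 2-D DCT from SciPy. The uniform quantization step is illustrative, not the project's quantizer; the count of surviving nonzero coefficients stands in for the entropy-coded size and shows why smooth blocks compress further than "active" ones.

    import numpy as np
    from scipy.fftpack import dct, idct

    def dct2(x):
        return dct(dct(x, axis=0, norm="ortho"), axis=1, norm="ortho")

    def idct2(x):
        return idct(idct(x, axis=0, norm="ortho"), axis=1, norm="ortho")

    def roundtrip(block, step=16.0):
        """Transform, uniformly quantize, reconstruct; report RMS error."""
        q = np.round(dct2(block) / step)   # what an entropy coder would store
        recon = idct2(q * step)
        rmse = np.sqrt(np.mean((block - recon) ** 2))
        return rmse, np.count_nonzero(q)

    rng = np.random.default_rng(0)
    smooth = np.add.outer(np.arange(8.0), np.arange(8.0)) * 8.0  # gentle ramp
    noisy = rng.integers(0, 256, size=(8, 8)).astype(float)      # high activity

    for name, blk in [("smooth", smooth), ("noisy", noisy)]:
        rmse, nz = roundtrip(blk)
        print(f"{name}: RMSE {rmse:.2f} gray levels, {nz}/64 nonzero coefficients")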