
    A Convex Model for Edge-Histogram Specification with Applications to Edge-preserving Smoothing

    The goal of edge-histogram specification is to find an image whose edge image has a histogram that matches a given edge-histogram as closely as possible. Mignotte has proposed a non-convex model for the problem [M. Mignotte. An energy-based model for the image edge-histogram specification problem. IEEE Transactions on Image Processing, 21(1):379-386, 2012]. In his work, the edge magnitudes of an input image are first modified by histogram specification to match the given edge-histogram. Then, a non-convex model is minimized to find an output image whose edge-histogram matches the modified edge-histogram. The non-convexity of the model hinders both the computations and the inclusion of useful constraints such as the dynamic range constraint. In this paper, instead of considering edge magnitudes, we directly consider the image gradients and propose a convex model based on them. Furthermore, we include additional constraints in our model for different applications. The convexity of our model allows us to compute the output image efficiently using either the Alternating Direction Method of Multipliers (ADMM) or the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). We consider several applications in edge-preserving smoothing, including image abstraction, edge extraction, detail exaggeration, and document scan-through removal. Numerical results are given to illustrate that our method efficiently produces decent results.
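
    To make the histogram-specification step concrete, here is a minimal Python sketch of rank-based histogram specification applied to gradient magnitudes. It assumes a grayscale image stored as a NumPy array; the function names, the forward-difference gradient, and the toy target distribution are illustrative and not the authors' implementation (which solves a convex variational model via ADMM or FISTA on top of such a step).

        import numpy as np

        def specify_histogram(values, target_values):
            """Remap `values` so that their distribution follows `target_values`
            (classical exact histogram specification by rank matching)."""
            order = np.argsort(values, kind="stable")   # positions of values in ascending order
            out = np.empty_like(values, dtype=float)
            out[order] = np.sort(target_values)         # k-th smallest input gets k-th smallest target
            return out

        def gradient_magnitudes(img):
            """Forward-difference gradient magnitudes of a 2-D grayscale image."""
            gx = np.diff(img, axis=1, append=img[:, -1:])
            gy = np.diff(img, axis=0, append=img[-1:, :])
            return np.hypot(gx, gy)

        # Toy usage: modify the edge magnitudes of `img` to follow a prescribed edge-histogram
        rng = np.random.default_rng(0)
        img = rng.random((64, 64))
        target = rng.exponential(scale=0.05, size=img.size)   # stand-in for the given edge-histogram
        modified = specify_histogram(gradient_magnitudes(img).ravel(), target).reshape(img.shape)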

    Exact States in Waveguides With Periodically Modulated Nonlinearity

    We introduce a one-dimensional model based on the nonlinear Schrödinger/Gross-Pitaevskii equation in which the local nonlinearity is subject to spatially periodic modulation in terms of the Jacobi dn function, with three free parameters: the period, the amplitude, and the internal form-factor. An exact periodic solution is found for each set of parameters and, more importantly for physical realizations, we solve the inverse problem and predict the period and amplitude of the modulation that yield a particular exact spatially periodic state. Numerical stability analysis demonstrates that the periodic states become modulationally unstable for large periods and regain stability in the limit of an infinite period, which corresponds to a bright soliton pinned to a localized nonlinearity-modulation pattern. An exact dark-bright soliton complex in a coupled system with a localized modulation structure is also briefly considered. The system can be realized in planar optical waveguides and cigar-shaped atomic Bose-Einstein condensates. Comment: EPL, in press
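
    A plausible written-out form of such a model, in standard Gross-Pitaevskii notation (the coefficients g_0 and g_1, the width W, and the sign conventions below are assumptions for illustration, not taken from the abstract):

        % 1D NLS/GP equation with a dn-modulated local nonlinearity (schematic)
        i\,\partial_t \psi = -\tfrac{1}{2}\,\partial_x^2 \psi + g(x)\,|\psi|^2 \psi,
        \qquad g(x) = g_0 + g_1\,\mathrm{dn}\!\left(\frac{x}{W},\,k\right)

    Here dn(u, k) has period 2K(k) in u, with K the complete elliptic integral of the first kind, so the spatial period of the modulation is 2WK(k). The elliptic modulus k plays the role of the internal form-factor, interpolating between a weak cosine-like modulation (k -> 0) and a single localized peak dn(u, 1) = sech(u) in the infinite-period limit (k -> 1), consistent with the pinned bright soliton mentioned above.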

    Statistics Of The Burst Model At Super-critical Phase

    We investigate the statistics of a model of type-I X-ray bursts [Phys. Rev. E 51, 3045 (1995)] in its super-critical phase. The time evolution of the burnable clusters, places where fire can pass through, is studied using simple statistical arguments. We offer a simple picture for the time evolution of the percentage of space covered by burnable clusters. A relation between the time-average and the peak percentage of space covered by burnable clusters is also derived. Comment: 11 pages in RevTeX 3.0; two figures available on request from [email protected]
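
    As a schematic illustration of the bookkeeping involved (a forest-fire-style toy on a lattice, not the cited burst model), the following Python sketch tracks the percentage of space covered by burnable clusters over time; the lattice size and the fueling/spark rates are arbitrary choices:

        import numpy as np
        from scipy.ndimage import label

        rng = np.random.default_rng(1)
        L, steps, p_fuel, p_spark = 128, 200, 0.02, 1e-4
        burnable = np.zeros((L, L), dtype=bool)
        coverage = []                                  # fraction of space covered by burnable sites

        for _ in range(steps):
            burnable |= rng.random((L, L)) < p_fuel    # sites gradually become burnable
            sparks = rng.random((L, L)) < p_spark
            if np.any(sparks & burnable):
                clusters, _ = label(burnable)          # connected burnable clusters (fire passes through them)
                hit = np.unique(clusters[sparks & burnable])
                burnable[np.isin(clusters, hit)] = False   # a burst clears every cluster touched by a spark
            coverage.append(burnable.mean())

        print(f"time-averaged coverage: {np.mean(coverage):.3f}, peak coverage: {np.max(coverage):.3f}")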

    Transverse Entanglement Migration in Hilbert Space

    We show that, although the amount of mutual entanglement of photons propagating in free space is fixed, the type of correlations between the photons that determine the entanglement can change dramatically during propagation. We show that this amounts to a migration of entanglement in Hilbert space, rather than in real space. For the case of spontaneous parametric down conversion, the migration of entanglement in the transverse coordinates takes place from the modulus to the phase of the bi-photon state and back again. We propose an experiment to observe this migration in Hilbert space and to determine the full entanglement. Comment: 4 pages, 3 figures
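
    A schematic way to see how correlations can move from the modulus to the phase under free-space propagation (a double-Gaussian toy amplitude in LaTeX notation, not the state analyzed in the paper):

        % Source-plane bi-photon amplitude: real, so the x1, x2 correlations sit entirely in the modulus.
        \psi_0(x_1, x_2) \propto
          \exp\!\left[-\frac{(x_1 + x_2)^2}{4\sigma_+^2} - \frac{(x_1 - x_2)^2}{4\sigma_-^2}\right]

        % Paraxial propagation over a distance z applies a Fresnel kernel to each photon's coordinate:
        \psi_z(x_1, x_2) \propto \iint dx_1'\, dx_2'\;
          \psi_0(x_1', x_2')\,
          \exp\!\left[\frac{ik}{2z}\Big((x_1 - x_1')^2 + (x_2 - x_2')^2\Big)\right]

    In the rotated coordinates x_\pm = (x_1 \pm x_2)/\sqrt{2} the two Gaussians propagate independently and acquire different quadratic phase chirps whenever \sigma_+ \neq \sigma_-; since x_+^2 - x_-^2 = 2 x_1 x_2, a cross term in x_1 x_2 then appears in the phase, i.e. part of the correlation has migrated from the modulus to the phase of the bi-photon amplitude.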

    Embodied carbon and construction cost differences between Hong Kong and Melbourne buildings

    Limiting the amount of embodied carbon in buildings can help minimize the damaging impacts of global warming through lower upstream emissions of CO2. This study empirically investigates the embodied carbon footprint of new-build and refurbished buildings in both Hong Kong and Melbourne to determine the embodied carbon profile and its relationship to both embodied energy and construction cost. The Hong Kong findings suggest that mean embodied carbon for refurbished buildings is 33-39% lower than for new-build projects, and that the cost of refurbished buildings is 22-50% lower than for new-build projects (per square metre of floor area). The Melbourne findings, however, suggest that mean embodied carbon for refurbished buildings is 4% lower than for new-build projects, while the cost of refurbished buildings is 24% higher than for new-build projects (per square metre of floor area). Embodied carbon ranges from 645-1,059 kgCO2e/m2 for new-build and 294-655 kgCO2e/m2 for refurbished projects in Hong Kong, and 1,138-1,705 kgCO2e/m2 for new-build and 900-1,681 kgCO2e/m2 for refurbished projects in Melbourne. The reasons behind these locational discrepancies are explored and critiqued. Overall, a very strong linear relationship between embodied energy and construction cost was found in both cities and can be used to predict the former, given the latter.
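
    The reported linear relationship suggests a straightforward predictive use. A minimal Python sketch with made-up illustrative numbers (the coefficients and units below are not the study's figures) fits and applies such a relationship:

        import numpy as np

        # Illustrative (cost, embodied energy) pairs per m2 of floor area (NOT the study's data).
        cost = np.array([1200.0, 1800.0, 2400.0, 3100.0, 3900.0])   # construction cost per m2
        energy = np.array([4.1, 6.0, 7.8, 10.2, 12.9])              # embodied energy, GJ/m2

        slope, intercept = np.polyfit(cost, energy, 1)               # least-squares linear fit
        r = np.corrcoef(cost, energy)[0, 1]
        print(f"embodied energy ~ {slope:.4f} * cost + {intercept:.2f}  (r = {r:.3f})")
        print(f"predicted embodied energy at a cost of 2000/m2: {slope * 2000 + intercept:.2f} GJ/m2")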

    Profitable Scheduling on Multiple Speed-Scalable Processors

    We present a new online algorithm for profit-oriented scheduling on multiple speed-scalable processors, together with a tight analysis of its competitiveness. Our results generalize and improve upon earlier work on a single speed-scalable processor [Chan:2010]. Using significantly different techniques, we not only extend that model to multiprocessors but also prove an improved and tight competitive ratio for our algorithm. In our scheduling problem, jobs arrive over time and are preemptable. They have different workloads, values, and deadlines. The scheduler may decide not to finish a job and instead suffer a loss equal to the job's value. However, to process a job's workload by its deadline, the scheduler must invest a certain amount of energy. The cost of a schedule is the sum of lost values and invested energy. In order to finish a job, the scheduler has to determine which processors to use and set their speeds accordingly. A processor's energy consumption is its power P(s) integrated over time, where P(s) = s^α is the power consumed when running at speed s. Since we consider the online variant of the problem, the scheduler has no knowledge about future jobs. The problem was introduced in [Chan:2010] for the case of a single processor, together with an online algorithm that is (α^α + 2eα)-competitive. We provide an online algorithm for the case of multiple processors with an improved competitive ratio of α^α. Comment: Extended abstract submitted to STACS 201
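
    To make the energy-versus-value trade-off concrete: for a single job considered in isolation, running at the constant speed workload/time_window from arrival to deadline minimizes the energy s^α integrated over time (by convexity of s^α), so a scheduler can compare that energy to the job's value. A minimal Python sketch of this simplified single-job decision (not the paper's multiprocessor algorithm; the parameter names are illustrative):

        ALPHA = 3.0  # power = speed ** ALPHA; alpha > 1 is the usual speed-scaling assumption

        def finish_or_drop(workload, value, time_window):
            """Compare the energy needed to finish a job at constant speed
            with the loss incurred by dropping it."""
            speed = workload / time_window
            energy = speed ** ALPHA * time_window
            return ("finish", energy) if energy <= value else ("drop", value)

        # A tight deadline drives the required speed (and hence energy) up, so dropping becomes cheaper:
        print(finish_or_drop(workload=4.0, value=10.0, time_window=4.0))  # ('finish', 4.0)
        print(finish_or_drop(workload=4.0, value=10.0, time_window=1.0))  # ('drop', 10.0)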