
    Approximate Deadline-Scheduling with Precedence Constraints

    We consider the classic problem of scheduling a set of n jobs non-preemptively on a single machine. Each job j has a non-negative processing time, weight, and deadline, and a feasible schedule must be consistent with chain-like precedence constraints. The goal is to compute a feasible schedule that minimizes the sum of penalties of late jobs. Lenstra and Rinnooy Kan [Annals of Disc. Math., 1977] introduced this problem in their seminal work and showed that it is strongly NP-hard, even when all processing times and weights are 1. We study the approximability of the problem, and our main result is an O(log k)-approximation algorithm for instances with k distinct job deadlines.
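To make the objective concrete, here is a minimal sketch (invented for illustration; the `Job` fields and helper names are not from the paper) of evaluating a candidate schedule: checking that an ordering respects chain precedence, and summing the weights of jobs that finish after their deadlines.

```python
# Illustrative sketch of the scheduling objective described in the
# abstract; all names and the example data are invented.
from dataclasses import dataclass

@dataclass
class Job:
    proc: float      # processing time
    weight: float    # penalty incurred if the job finishes late
    deadline: float

def respects_chains(order, chains):
    """Check that `order` is consistent with chain-like precedence:
    within each chain, jobs must appear in the given sequence."""
    pos = {j: i for i, j in enumerate(order)}
    return all(pos[a] < pos[b]
               for chain in chains
               for a, b in zip(chain, chain[1:]))

def late_penalty(order, jobs):
    """Total weight of jobs completing after their deadline when the
    jobs run back-to-back (non-preemptively) in the given order."""
    t, penalty = 0.0, 0.0
    for j in order:
        t += jobs[j].proc
        if t > jobs[j].deadline:
            penalty += jobs[j].weight
    return penalty

jobs = [Job(2, 1, 2), Job(1, 3, 3), Job(2, 1, 7)]
chains = [[0, 2]]                     # job 0 must precede job 2
assert respects_chains([0, 1, 2], chains)
print(late_penalty([0, 1, 2], jobs))  # 0.0 -- every job meets its deadline
print(late_penalty([1, 0, 2], jobs))  # 1.0 -- job 0 finishes at t=3 > 2
```

An approximation algorithm such as the O(log k) result in the abstract would search over feasible orderings cleverly; this sketch only evaluates a given one.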

    Alternative conservation outcomes from aquatic fauna translocations: Losing and saving the Running River rainbowfish

    1. The translocation of species outside their natural range is a threat to aquatic biodiversity globally, especially freshwater fishes, as most are not only susceptible to predation and competition but readily hybridize with congeners.
    2. The Running River rainbowfish (RRR, Melanotaenia sp.) is a narrow-ranged, small-bodied freshwater fish that recently became threatened and was subsequently listed as Critically Endangered, owing to introgressive hybridization and competition following the translocation of a congeneric species, the eastern rainbowfish (Melanotaenia splendida).
    3. To conserve RRR, wild fish were taken into captivity, genetically confirmed as pure representatives, and successfully bred. As the threat of introgression with translocated eastern rainbowfish could not be mitigated, a plan was devised to translocate captive-raised RRR into unoccupied habitats within their native catchment, upstream of natural barriers. The translocation plan involved careful site selection and habitat assessment, predator training (exposure to predators prior to release), soft release (a gradual transition from captivity to nature), and post-release monitoring, and this approach was ultimately successful.
    4. Two populations of RRR were established in two previously unoccupied streams above waterfalls, with a combined stream length of 18 km. Post-release monitoring was affected by floods and low sample sizes, but suggested that predation and time of release are important factors to consider in similar conservation recovery programmes for small-bodied, short-lived fishes.

    νd → μ⁻ Δ⁺⁺ n Reaction and Axial Vector N−Δ Coupling

    The reaction νd → μ⁻ Δ⁺⁺ n is studied in the region of low q² to investigate the effect of deuteron structure and of the width of the Δ resonance on the differential cross section. The results are used to extract the axial vector N−Δ coupling C^A_5 from the experimental data on this reaction. The possibility of determining this coupling from electroweak interaction experiments with high-intensity electron accelerators is discussed.

    Projective re-normalization for improving the behavior of a homogeneous conic linear system

    In this paper we study the homogeneous conic system F: Ax = 0, x ∈ C \ {0}. We choose a point s̄ ∈ int C* that serves as a normalizer and consider computational properties of the normalized system F_s̄: Ax = 0, s̄ᵀx = 1, x ∈ C. We show that the computational complexity of solving F via an interior-point method depends only on the complexity value ϑ of the barrier for C and on the symmetry of the origin in the image set H_s̄ := {Ax : s̄ᵀx = 1, x ∈ C}, where the symmetry of 0 in H_s̄ is sym(0, H_s̄) := max{α : y ∈ H_s̄ ⇒ −αy ∈ H_s̄}. We show that a solution of F can be computed in O(√ϑ ln(ϑ/sym(0, H_s̄))) interior-point iterations. In order to improve the theoretical and practical computation of a solution of F, we next present a general theory for projective re-normalization of the feasible region F_s̄ and the image set H_s̄, and prove the existence of a normalizer s̄ such that sym(0, H_s̄) ≥ 1/m provided that F has an interior solution. We develop a methodology for constructing a normalizer s̄ such that sym(0, H_s̄) ≥ 1/m with high probability, based on sampling from a geometric random walk, with an associated probabilistic complexity analysis. While such a normalizer is not itself computable in strongly polynomial time, it yields a conic system that is solvable in O(√ϑ ln(mϑ)) iterations, which is strongly polynomial time. Finally, we implement this methodology on randomly generated homogeneous linear programming feasibility problems, constructed to be poorly behaved. Our computational results indicate that the projective re-normalization methodology holds promise to markedly reduce the overall computation time for conic feasibility problems; for instance, we observe a 46% decrease in average IPM iterations for 100 randomly generated poorly behaved problem instances of dimension 1000 × 5000.
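As a toy illustration of the symmetry measure (my own example, not the paper's code): for a one-dimensional image set H = [−a, b] with a, b > 0, sym(0, H) reduces to min(a, b)/max(a, b), and the abstract's iteration bound grows as the symmetry of the origin shrinks.

```python
# Toy illustration of sym(0, H) = max{alpha : y in H => -alpha*y in H}
# for a 1-D interval H = [-a, b]; this reduces to min(a,b)/max(a,b).
# Invented for illustration, not code from the paper.
import math

def sym_interval(a, b):
    """sym(0, [-a, b]) for a, b > 0."""
    return min(a, b) / max(a, b)

def ipm_iteration_bound(theta, sym):
    """The abstract's O(sqrt(theta) * ln(theta / sym)) iteration count,
    with the hidden constant taken as 1 for illustration."""
    return math.sqrt(theta) * math.log(theta / sym)

# A badly centered interval has small symmetry and a larger bound:
print(sym_interval(1.0, 100.0))         # 0.01
print(ipm_iteration_bound(10.0, 0.01))  # larger than for sym = 0.5
print(ipm_iteration_bound(10.0, 0.5))
```

This is the intuition behind re-normalization: a normalizer that pushes sym(0, H_s̄) up toward 1/m shrinks the log factor in the iteration bound.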

    Deuteron Electroweak Disintegration

    We study deuteron electrodisintegration with inclusion of the neutral currents, focusing on the helicity asymmetry of the exclusive cross section in coplanar geometry. We stress that a measurement of this asymmetry in the quasi-elastic region is of interest for an experimental determination of the weak form factors of the nucleon, allowing one to obtain the parity-violating electron-neutron asymmetry. Numerically, we consider the reaction at low momentum transfer and discuss the sensitivity of the helicity asymmetry to the strangeness radius and magnetic moment. The problems arising from the finite angular acceptance of the spectrometers are also considered. (Submitted to Phys. Rev. C.)

    Parity violating target asymmetry in electron - proton scattering

    We analyze the parity-violating (PV) components of the analyzing power in elastic electron-proton scattering and discuss their sensitivity to the strange quark contributions to the proton weak form factors. We point out that the component of the analyzing power along the momentum transfer is independent of the electric weak form factor and thus compares favorably with the PV beam asymmetry for a determination of the strangeness magnetic moment. We also show that the transverse component could be used for constraining the strangeness radius. Finally, we argue that a measurement of both components could give experimental information on the strangeness axial charge. (Submitted to Phys. Rev.)

    Measures of process harmonization

    Context: Many large organizations juggle an application portfolio that contains different applications that fulfill similar tasks in the organization. In an effort to reduce operating costs, they are attempting to consolidate such applications. Before consolidating applications, the work that is done with these applications must be harmonized. This is also known as process harmonization.
    Objective: The increased interest in process harmonization calls for measures to quantify the extent to which processes have been harmonized. These measures should also uncover the factors that are of interest when harmonizing processes. Currently, such measures do not exist. Therefore, this study develops and validates a measurement model to quantify the level of process harmonization in an organization.
    Method: The measurement model was developed by means of a literature study and structured interviews. Subsequently, it was validated through a survey, using factor analysis and correlations with known related constructs.
    Results: A valid and reliable measurement model was developed. The factors that are found to constitute process harmonization are: the technical design of the business process and its data, the resources that execute the process, and the information systems that are used in the process. In addition, strong correlations were found between process harmonization and process standardization and between process complexity and process harmonization.
    Conclusion: The measurement model can be used by practitioners, because it shows them the factors that must be taken into account when harmonizing processes, and because it provides them with a means to quantify the extent to which they succeeded in harmonizing their processes. At the same time, it can be used by researchers to conduct further empirical research in the area of process harmonization.
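The validation steps the abstract mentions (reliability of a multi-item measure, and correlation with a related construct such as standardization) can be sketched as follows. The survey data, item scores, and the choice of Cronbach's alpha as the reliability statistic are illustrative assumptions, not details taken from the study.

```python
# Hedged sketch of survey-scale validation: internal-consistency
# reliability (Cronbach's alpha) and a Pearson correlation between
# two construct scores. All data below are invented.
import statistics

def cronbach_alpha(items):
    """items: one list of scores per survey item (same respondents)."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]
    item_var = sum(statistics.pvariance(v) for v in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Three hypothetical harmonization items for five respondents:
items = [[4, 5, 3, 4, 2], [4, 4, 3, 5, 2], [5, 5, 2, 4, 3]]
harmonization = [sum(v) / 3 for v in zip(*items)]
standardization = [4.2, 4.8, 2.9, 4.4, 2.5]   # related construct
print(round(cronbach_alpha(items), 2))         # 0.89
print(round(pearson(harmonization, standardization), 2))  # 0.99
```

A high alpha suggests the items measure one construct; the strong positive correlation mirrors the harmonization-standardization relationship the abstract reports.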

    Origins of the Ambient Solar Wind: Implications for Space Weather

    The Sun's outer atmosphere is heated to temperatures of millions of degrees, and solar plasma flows out into interplanetary space at supersonic speeds. This paper reviews our current understanding of these interrelated problems: coronal heating and the acceleration of the ambient solar wind. We also discuss where the community stands in its ability to forecast how variations in the solar wind (i.e., fast and slow wind streams) impact the Earth. Although the last few decades have seen significant progress in observations and modeling, we still do not have a complete understanding of the relevant physical processes, nor do we have a quantitatively precise census of which coronal structures contribute to specific types of solar wind. Fast streams are known to be connected to the central regions of large coronal holes. Slow streams, however, appear to come from a wide range of sources, including streamers, pseudostreamers, coronal loops, active regions, and coronal hole boundaries. Complicating our understanding even more is the fact that processes such as turbulence, stream-stream interactions, and Coulomb collisions can make it difficult to unambiguously map a parcel measured at 1 AU back down to its coronal source. We also review recent progress -- in theoretical modeling, observational data analysis, and forecasting techniques that sit at the interface between data and theory -- that gives us hope that the above problems are indeed solvable. (Accepted for publication in Space Science Reviews, special issue connected with a 2016 ISSI workshop on "The Scientific Foundations of Space Weather.")

    Search for a W' boson decaying to a bottom quark and a top quark in pp collisions at sqrt(s) = 7 TeV

    Results are presented from a search for a W' boson using a dataset corresponding to 5.0 inverse femtobarns of integrated luminosity collected during 2011 by the CMS experiment at the LHC in pp collisions at sqrt(s) = 7 TeV. The W' boson is modeled as a heavy W boson, but different scenarios for the couplings to fermions are considered, involving both left-handed and right-handed chiral projections of the fermions, as well as an arbitrary mixture of the two. The search is performed in the decay channel W' to t b, leading to a final-state signature with a single lepton (e, mu), missing transverse energy, and jets, at least one of which is tagged as a b-jet. A W' boson that couples to fermions with the same coupling constant as the W, but to the right-handed rather than left-handed chiral projections, is excluded for masses below 1.85 TeV at the 95% confidence level. For the first time using LHC data, constraints on the W' gauge coupling for a set of left- and right-handed coupling combinations have been placed. These results represent a significant improvement over previously published limits. (Submitted to Physics Letters B.)