
    On Redundancy Elimination Tolerant Scheduling Rules

    In (Ferrucci, Pacini and Sessa, 1995) an extended form of resolution, called Reduced SLD resolution (RSLD), is introduced. In essence, an RSLD derivation is an SLD derivation in which redundancy elimination from resolvents is performed after each rewriting step. It is intuitive that redundancy elimination may have positive effects on the derivation process. However, undesirable effects are also possible. In particular, as shown in this paper, program termination as well as completeness of loop-checking mechanisms via a given selection rule may be lost. The study of such effects has led us to an analysis of the basic concepts of selection rules, and we have found it convenient to shift attention from rules of atom selection to rules of atom scheduling. A priority mechanism for atom scheduling is built, in which a priority is assigned to each atom in a resolvent, and primary importance is given to the arrival of new atoms from the body of the applied clause at rewriting time. This new computational model proves able to address the study of redundancy elimination effects, while also giving interesting insights into general properties of selection rules. Indeed, a class of scheduling rules, namely the specialisation-independent ones, is defined in the paper by using non-trivial semantic arguments. Quite surprisingly, specialisation-independent scheduling rules turn out to coincide with a class of rules that have an immediate structural characterisation (named stack-queue rules). We then prove that such scheduling rules are tolerant to redundancy elimination, in the sense that neither program termination nor completeness of the equality loop check is lost in passing from SLD to RSLD.
    Comment: 53 pages, to appear in TPL
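The stack-queue characterisation can be illustrated with a small sketch: at each rewriting step, the atoms arriving from the body of the applied clause are either all placed at the front of the resolvent (stack behaviour) or all appended at the back (queue behaviour). This is an informal illustration under those assumptions, not the paper's formal definition; function and atom names are invented.

```python
from collections import deque

def schedule(resolvent, body_atoms, mode):
    """One scheduling step on a resolvent (a list of atoms, selected atom
    first). The atoms arriving from the body of the applied clause either
    all go to the front ("stack") or all to the back ("queue").
    Illustrative sketch; atoms are plain strings here."""
    r = deque(resolvent)
    r.popleft()                             # the selected atom has been rewritten away
    if mode == "stack":
        for atom in reversed(body_atoms):   # keep body order at the front
            r.appendleft(atom)
    else:                                   # "queue"
        r.extend(body_atoms)
    return list(r)

# A stack rule keeps new work first; a queue rule defers it:
# schedule(["p", "q"], ["a", "b"], "stack") -> ["a", "b", "q"]
# schedule(["p", "q"], ["a", "b"], "queue") -> ["q", "a", "b"]
```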

    Whose absentee votes are returned and counted: The variety and use of absentee ballots in California

    Absentee voting is becoming more prevalent throughout the United States. Although there has been some research on who votes by absentee ballot, little research has considered another important question about absentee voting: which absentee ballots are counted and which are not? Research in the wake of the 2000 presidential election has studied the problem of uncounted ballots for precinct voters but not for absentee voters. Using data from Los Angeles County – the nation's largest and most diverse voting jurisdiction – for the November 2002 general election, we test a series of hypotheses that certain types of voters have a higher likelihood that their ballots will be counted. We find that uniformed service personnel, overseas civilians, voters who request non-English ballots, and permanent absentee voters have a much lower likelihood of returning their ballots and, once the ballots are returned, a lower likelihood that they will be counted, compared with the general absentee voting population. We also find little partisan effect in which voters are more likely to return their ballots or have their ballots counted. We conclude with a discussion of the implications of our research for the current debates about absentee voting.

    A Comparative Analysis of Phytovolume Estimation Methods Based on UAV-Photogrammetry and Multispectral Imagery in a Mediterranean Forest

    Management and control operations are crucial for preventing forest fires, especially in Mediterranean forest areas with dry climatic periods. One of them is prescribed fire, for which the biomass fuel present in the controlled plot must be accurately estimated. The most widely used methods for estimating biomass are time-consuming and labour-intensive. Unmanned aerial vehicles (UAVs) carrying multispectral sensors can instead provide accurate indirect measurements of terrain and vegetation morphology and their radiometric characteristics. Based on the products of a UAV-photogrammetric project, four estimators of phytovolume were compared in a Mediterranean forest area, all obtained as the difference between a digital surface model (DSM) and a digital terrain model (DTM). The DSM was derived from a UAV-photogrammetric project based on the structure-from-motion algorithm. Four different methods for obtaining a DTM were used, based on an unclassified dense point cloud produced by the UAV-photogrammetric project (FFU), an unsupervised classified dense point cloud (FFC), a multispectral vegetation index (FMI), and a cloth simulation filter (FCS). Qualitative and quantitative comparisons determined the ability of the phytovolume estimators to detect vegetation and estimate occupied volume. The results show no significant differences in surface vegetation detection between any pairwise comparison of the four estimators at a 95% confidence level, but FMI presented the best kappa value (0.678) in an error-matrix analysis against reference data obtained from photointerpretation and supervised classification. Concerning the accuracy of phytovolume estimation, only FFU and FFC differed by more than two standard deviations in a pairwise comparison, and FMI presented the best RMSE (12.3 m) when the estimators were compared to 768 observed data points grouped in four 500 m² sample plots.
    FMI was the best of the four phytovolume estimators for low vegetation height in a Mediterranean forest. The use of FMI based on UAV data provides accurate phytovolume estimates that can be applied to several environmental management activities, including wildfire prevention. Multitemporal phytovolume estimates based on FMI could help to model the evolution of forest resources in a very realistic way.
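The common core of the four estimators, a per-cell DSM minus DTM height difference accumulated into a volume, can be sketched as follows. The function name, the cell-area parameter and the minimum-height cut-off are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def phytovolume(dsm, dtm, cell_area, min_height=0.0):
    """Vegetation volume from a DSM/DTM pair (both in metres): per-cell
    height difference, clipped at a minimum height to discard ground-level
    noise, summed and scaled by the ground area of one raster cell (m^2).
    min_height and all names are illustrative assumptions."""
    heights = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
    heights = np.where(heights > min_height, heights, 0.0)
    return float(heights.sum() * cell_area)
```

For example, a 2x2 raster with height differences of 1 m and 2 m in two cells (and none elsewhere) at 0.25 m² per cell yields 0.75 m³.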

    Rapid methods of landslide hazard mapping: Fiji case study

    A landslide hazard probability map can help planners (1) prepare for, and/or mitigate, the effects of landsliding on communities and infrastructure, and (2) avoid or minimise the risks associated with new developments. The aims of the project were to establish, by means of studies in a few test areas, a generic method by which remote sensing and data analysis using a geographic information system (GIS) could provide a provisional landslide hazard zonation map. The provision of basic hazard information is an underpinning theme of the UN's International Decade for Natural Disaster Reduction (IDNDR) and an essential requirement for disaster preparedness and mitigation planning. This report forms part of BGS project 92/7 (R5554), 'Rapid assessment of landslip hazards', carried out under the ODA/BGS Technology Development and Research Programme as part of the British Government's provision of aid to developing countries. It provides a detailed technical account of work undertaken in a test area in Viti Levu in collaboration with the Fiji Mineral Resources Department. The study demonstrates a methodology that is applicable to many developing countries. The underlying principle is that relationships between past landsliding events, interpreted from remote sensing, and factors such as geology, relief and soils provide the basis for modelling where future landslides are most likely to occur. This is achieved in a GIS by 'weighting' each class of each variable (e.g. each lithology 'class' of the variable 'geology') according to the proportion of landslides occurring within it compared with the regional average. Combinations of variables, produced by summing the weights of the individual classes, provide 'models' of landslide probability.
    The approach is empirical but has the advantage of potentially providing regional-scale hazard maps over large areas quickly and cheaply; this is unlikely to be achieved using conventional ground-based geotechnical methods. In Fiji, landslides are usually triggered by intense rain storms, commonly associated with tropical cyclones. However, the regional distribution of landslides has not been mapped, nor is it known how far geology and landscape influence the location and severity of landsliding events. The report discusses the remote sensing and GIS methodology, and describes the results of the pilot study over an area of 713 km² in south-east Viti Levu. The landslide model uses geology, elevation, slope angle, slope aspect, soil type, and forest cover as inputs. The resulting provisional landslide hazard zonation map, divided into high, medium and low zones of landslide hazard probability, suggests that while rainfall is the immediate trigger, other controls exert a significant influence. It is recommended that Fiji consider implementing the techniques as part of a national strategic plan for landslide hazard zonation mapping.
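The class-weighting principle described above can be sketched as follows. The report states only that each class is weighted by the proportion of landslides within it compared with the regional average; the simple density-ratio form used here is an assumption, as are the names.

```python
def class_weights(landslide_area, class_area):
    """Weight each class of one variable (e.g. each lithology class of
    'geology') by its landslide density relative to the regional average.
    landslide_area: class -> area affected by landslides
    class_area:     class -> total area of the class
    The density-ratio form is an assumption; weights above 1 mark classes
    with more landsliding than the regional average."""
    regional = sum(landslide_area.values()) / sum(class_area.values())
    return {c: (landslide_area.get(c, 0.0) / a) / regional
            for c, a in class_area.items()}
```

Summing such per-variable weights over the classes present in each map cell then gives the combined probability model described in the report.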

    Spitzer IRAC Observations of Star Formation in N159 in the LMC

    We present observations of the giant HII region complex N159 in the LMC using IRAC on the Spitzer Space Telescope. One of the two objects previously identified as protostars in N159 has an SED consistent with classification as a Class I young stellar object (YSO), and the other is probably a Class I YSO as well, making these two the youngest stars known outside the Milky Way. We identify two other sources that may also be Class I YSOs. One component, N159AN, is completely hidden at optical wavelengths but is very prominent in the infrared. The integrated luminosity of the entire complex is L ≈ 9×10⁶ L⊙, consistent with the observed radio emission assuming a normal Galactic initial mass function (IMF). There is no evidence for a red supergiant population indicative of an older burst of star formation. The N159 complex is 50 pc in diameter, larger in physical size than typical HII regions in the Milky Way of comparable luminosity. We argue that all of the individual components are related in their star formation history. The morphology of the region is consistent with a wind-blown bubble ≈ 1–2 Myr old that has initiated the star formation now taking place at its rim. Other than its large physical size, star formation in N159 appears to be indistinguishable from star formation in the Milky Way.
    Comment: 14 figures

    European welfare state under the "make work pay" policy: Analysis with composite indicators

    The social security systems in 22 European countries are evaluated with a specially constructed indicator. It is based on a census-simulating model which combines empirical (statistical) and normative (rule-based) approaches. The individual social security benefits of unemployed persons are normatively derived from their personal situations with the OECD Tax-Benefit Models; the empirical data on personal situations are available from EuroStat. The goal is to estimate the national average of net replacement rates (NRR) for unemployed persons. Such an indicator of social security shows the average degree to which social benefits compensate for the loss of previous earnings. Thus, the paper offers:
    - (Methodology) a model of census simulation combining statistical data on the population with individual answers computed with a rule-based model;
    - (Indicator) an integral quantitative evaluation of social security in Europe, which reveals its overall decline by 2004 despite institutional improvements;
    - (Analysis) an explanation of the decline by a structural change of European labour markets, with rapidly growing 'atypical' employment groups (part-time, temporary, self-employed, etc.) that have lower eligibility for social benefits than the normally employed (permanently full-time);
    - (Policy implications) a possible resolution of European policy contradictions by the "basic income model" with "flexinsurance".
    Keywords: composite indicators, social security, European welfare state, European Union, "make work pay" policy
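At its core, the national indicator described above reduces to a population-weighted average of per-group net replacement rates. The data structure below is an illustrative assumption, not the paper's actual census-simulation model.

```python
def average_nrr(groups):
    """National average net replacement rate from (population share, NRR)
    pairs, one per simulated census group. Shares need not sum to 1; they
    are renormalised. The (share, nrr) structure is an assumption; in the
    paper the NRRs come from the OECD Tax-Benefit Models."""
    total = sum(share for share, _ in groups)
    return sum(share * nrr for share, nrr in groups) / total
```

For example, a population of 60% with NRR 0.7 and 40% with NRR 0.5 averages to 0.62; shifting weight toward low-eligibility 'atypical' groups pulls the indicator down, which is the structural effect the paper analyses.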

    Automated Detection and Tracking of Solar Magnetic Bright Points

    Magnetic Bright Points (MBPs) in the internetwork are among the smallest objects in the solar photosphere and appear bright against the ambient environment. An algorithm is presented for the automated detection of MBPs in both the spatial and temporal domains. The algorithm works by mapping the intergranular lanes through intensity thresholding. A compass search, combined with a study of the intensity gradient across the detected objects, allows MBPs to be disentangled from bright pixels within the granules. Object growing is implemented to account for any pixels that might have been removed when mapping the lanes. The images are stabilized so that long-lived objects that may have been missed due to variable light levels and seeing quality can be located. Tests of the algorithm, employing data taken with the Swedish Solar Telescope (SST), reveal that ~90% of MBPs within a 75″ × 75″ field of view are detected.
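The first stage of such a pipeline, mapping the dark intergranular lanes by intensity thresholding, might be sketched as follows. The mean-minus-k-standard-deviations threshold is an assumption; the published algorithm layers a compass search, gradient tests and object growing on top of a lane map like this.

```python
import numpy as np

def lane_mask(image, k=1.0):
    """Flag dark intergranular-lane pixels by thresholding intensity at
    mean - k * std of the frame. The threshold form and the parameter k
    are illustrative assumptions, not the paper's calibrated values."""
    img = np.asarray(image, dtype=float)
    return img < img.mean() - k * img.std()
```

On a frame where most pixels sit near the granular intensity and a few are much darker, only the dark lane pixels fall below the threshold.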