
    Interiorization and Localization: An Analysis of Immigration Enforcement in Local Contexts

    Get PDF
    Immigration and Customs Enforcement (ICE) is a federal agency that plays a large role in surveilling, apprehending, detaining, incarcerating, and deporting undocumented immigrants in the United States. Because of constraints on ICE's available personnel and resources, the agency relies on deputizing local law enforcement agencies, that is, devolving to them the authority to enforce federal immigration policies. Prior to the 1990s, the enforcement of policies directed at controlling flows of undocumented immigrants was generally under the purview of federal law enforcement agencies and administrators, not state or local ones. The attacks of September 11, 2001 represented a flashpoint, from which followed a series of significant policy changes that allowed local law enforcement agencies to be folded into domains, such as anti-terrorism and homeland security, that were previously the realm of the federal government. Counties, municipalities, and localities could now choose to enter into agreements, such as 287(g) agreements, with federal agencies like ICE to police undocumented immigrants and enforce federal immigration policy. This local-federal cooperation multiplies the scope and reach of ICE, as cooperating localities can have their law enforcement agencies function as proxies for ICE. Cooperation with ICE is voluntary and is typically approved or rejected by local elected officials, such as county sheriffs. This thesis seeks to shed light on possible reasons why a municipality, county, or locality would choose to cooperate with ICE by assessing which areas of the country have high rates of cooperation. This was done by obtaining large datasets from the Transactional Records Access Clearinghouse (TRAC), based at Syracuse University, containing nationwide ICE arrest and detainer request records between 2002 and 2015. The TRAC data provided information on hundreds of thousands of ICE arrest cases that involved an ICE detainer: an administrative hold that ICE can send to a local jail, requesting that the jail keep an individual suspected of being undocumented detained for an additional 48 hours so that ICE can intercept that individual. The TRAC data were processed in SPSS, where descriptive statistics were run to show which localities chose to honor ICE detainer requests and at what rates. From these analyses, heatmaps were generated to show the geographic distribution of localities with high honor rates. Additionally, four localities were selected for more in-depth case studies to help understand what sociological factors may have contributed to localities' decisions to honor ICE detainers at high or low rates. Findings suggest that racial and ethnic demographic changes that decrease a white majority in a location, the concentration of certain industries that tend to employ migrant labor, and the partisanship of local officials and administrators may help explain why certain localities honor ICE detainer requests at high rates. By providing descriptive statistics and qualitative case studies of local-level cooperation with ICE, this research also affirms claims from scholars of immigration enforcement that, following 9/11, forms of local-federal cooperation aimed at hindering the settlement of undocumented immigrants occur at higher rates in the interior of the country than in areas along the U.S.-Mexico border.
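    The central descriptive statistic in this design, the rate at which each locality honors the detainer requests it receives, reduces to a simple grouped aggregation. The analysis itself was run in SPSS; the sketch below illustrates the same computation in Python, with the dataframe layout and column names (`county`, `detainer_honored`) invented for illustration rather than taken from the actual TRAC schema.

```python
import pandas as pd

# Hypothetical detainer-level records; column names are illustrative,
# not the actual TRAC field names.
records = pd.DataFrame({
    "county":           ["A", "A", "A", "B", "B", "C"],
    "detainer_honored": [True, True, False, False, False, True],
})

# Honor rate per locality: the share of detainer requests the local jail honored.
honor_rates = (
    records.groupby("county")["detainer_honored"]
           .agg(requests="size", honored="sum")
)
honor_rates["honor_rate"] = honor_rates["honored"] / honor_rates["requests"]
print(honor_rates.sort_values("honor_rate", ascending=False))
```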

    Generalized commutation relations and Non linear momenta theories, a close relationship

    Full text link
    A review of generalized commutation relations is performed, alongside a description of the nonlinear momenta realizations included in some DSR (doubly special relativity) theories. It is shown that these propositions are closely related; in particular, we focus on the Magueijo-Smolin momenta and the generalized commutators of Kempf et al. and L.N. Chang. From this, a new algebra arises with its own features, which is also analyzed.
    Comment: accepted version in IJMP
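    For orientation, the two standard ingredients referred to above are usually quoted in the following textbook forms; the paper's own conventions may differ. The Kempf-Chang type generalized commutator and the Magueijo-Smolin modified dispersion relation (in units with c = 1) read:

```latex
[\hat{x}, \hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{2}\right),
\qquad
E^{2} - p^{2} = m^{2}\left(1 - \frac{E}{E_{P}}\right)^{2},
```

    where β is the deformation parameter and E_P the Planck energy.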

    Estimation in the Birnbaum-Saunders distribution based on scale-mixture of normals and the EM-algorithm

    Get PDF
    Scale mixtures of normal (SMN) distributions are used for modeling symmetric data. Members of this family have appealing properties such as robust estimates, easy random number generation, and efficient computation of the ML estimates via the EM-algorithm. The Birnbaum-Saunders (BS) distribution is a positively skewed model that is related to the normal distribution and has received considerable attention. We introduce a type of BS distribution based on SMN models, perform a lifetime analysis, develop the EM-algorithm for ML estimation of the parameters, and illustrate the obtained results with real data, showing the robustness of the estimation procedure.
    Peer Reviewed
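    The relationship to the normal distribution is what makes random number generation easy: if Z ~ N(0, 1), then T = β(αZ/2 + √((αZ/2)² + 1))² follows a BS(α, β) distribution. A minimal sketch of the classical (normal-kernel) case is below; for the SMN extensions introduced in the paper, Z would instead be drawn from a scale mixture of normals.

```python
import numpy as np

def rbirnbaum_saunders(n, alpha, beta, rng=None):
    """Draw n samples from BS(alpha, beta) via its normal representation:
    if Z ~ N(0, 1), then T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2
    follows a Birnbaum-Saunders distribution."""
    rng = np.random.default_rng(rng)
    z = rng.standard_normal(n)
    w = alpha * z / 2.0
    return beta * (w + np.sqrt(w**2 + 1.0))**2

samples = rbirnbaum_saunders(10_000, alpha=0.5, beta=2.0)
print(samples.mean())  # E[T] = beta * (1 + alpha**2 / 2) = 2.25 here
```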

    Shape and change point analyses of the Birnbaum-Saunders-t hazard rate and associated estimation

    Get PDF
    The hazard rate is a statistical indicator commonly used in lifetime analysis. The Birnbaum-Saunders (BS) model is a life distribution that originated from a problem pertaining to material fatigue and has been applied to diverse fields. The BS model relates the total time until failure to some type of cumulative damage that is normally distributed. The generalized BS (GBS) distribution is a class of positively skewed models with lighter and heavier tails than the BS distribution. Particular cases of GBS distributions are the BS and BS-Student-t (BS-t) models. In this paper, we discuss shape and change point analyses for the hazard rate of the BS-t distribution. In addition, we evaluate the performance of the maximum likelihood and moment estimators of this change point using Monte Carlo methods. We also present an application with a real-life data set useful for survival analysis, which shows the convenience of knowing such an instant of change for establishing a reduction in the dose and, as a consequence, in the cost of the treatment.
    FEDER Funds - Programa Operacional Factores de Competitividade - COMPETE; Fundação para a Ciência e a Tecnologia (FCT) - Project Est-C/MAT/UI0013/2011; FONDECYT grant 1120879, Chile
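    The change point of interest is the instant at which the hazard rate stops increasing, i.e., its unique maximum. SciPy does not implement the BS-t distribution studied in the paper, but the classical (normal-kernel) BS model is available as `scipy.stats.fatiguelife`, which the sketch below uses as a stand-in to locate the change point numerically.

```python
import numpy as np
from scipy.stats import fatiguelife

# Classical (normal-kernel) BS distribution; SciPy calls it "fatiguelife".
# The paper studies the BS-Student-t hazard, which is not in SciPy,
# so this sketch uses the classical BS as a stand-in.
alpha = 0.8                      # shape parameter
dist = fatiguelife(alpha)        # scale fixed at 1

t = np.linspace(0.01, 10, 2000)
hazard = dist.pdf(t) / dist.sf(t)

# The BS hazard increases to a single peak and then decreases;
# the change point is where it attains its maximum.
t_change = t[np.argmax(hazard)]
print(f"estimated hazard change point: t = {t_change:.3f}")
```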

    1-D Harmonic Oscillator in Snyder Space, the Classic and the Quantum

    Full text link
    The one-dimensional harmonic oscillator in Snyder space is investigated in its classical and quantum versions. The classical trajectory is obtained, and semiclassical quantization from the phase-space trajectories is discussed. In the process, an effective cutoff at high frequencies is found. The quantum version is developed, and an equivalent ordinary harmonic oscillator is obtained through an effective mass and an effective frequency introduced into the model. These modified parameters also yield a modified energy spectrum.
    Comment: 8 pages, 2 figures
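    A commonly used one-dimensional realization of Snyder-type noncommutativity deforms the canonical commutator with a parameter β; the oscillator Hamiltonian is then mapped onto an ordinary one with effective parameters, whose explicit forms are given in the paper (the schematic below follows the commonly cited convention, which the paper may refine):

```latex
[\hat{x}, \hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{2}\right),
\qquad
\hat{H} = \frac{\hat{p}^{2}}{2m} + \frac{1}{2}m\omega^{2}\hat{x}^{2}
\;\longrightarrow\;
\frac{\hat{P}^{2}}{2m_{\mathrm{eff}}} + \frac{1}{2}m_{\mathrm{eff}}\,\omega_{\mathrm{eff}}^{2}\hat{X}^{2}.
```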

    Development of Standard Criteria to Evaluate the Effectiveness of Helmets at Decreasing the Risk of Concussions

    Get PDF
    In many sports, such as American football, the accumulation of mild traumatic brain injuries has been suggested as a possible link to neurodegeneration and future mental disorders. With head impacts occurring at all levels of competition and in different sports, it is critical to develop an accurate method for quantifying the effects of head impacts and determining the efficacy of helmets. This study examines the derivation of different dimensionless numbers and ascertains the critical factors needed to predict the effects of head impacts, specifically the resulting accelerations from an impact. Given a known force of impact, parameters such as peak translational acceleration and impact duration were collected for a total of 200 impacts at 10 locations around the head. These parameters were used in conjunction with dimensionless numbers to compare various helmet designs across sports. Five input and four output criteria, or π variables, were derived using the fundamental variables of total mass, width of the neck, and the difference between muscle reaction time and impact duration. Determining the coefficients of the governing equations for each output π variable showed that the impulse of an impact had a consistent effect on helmet efficacy, while the masses and radii of helmets introduced confounding variables that made it difficult to predict how effectively head accelerations would be attenuated.
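    The exact π variables are defined in the study itself; the sketch below only illustrates the style of Buckingham-π nondimensionalization involved, with two hypothetical groups built from the kinds of variables named above (impact force, head mass, peak acceleration, muscle reaction time, impact duration).

```python
# Illustrative only: the study derives five input and four output pi
# variables; their exact definitions are in the thesis. The two groups
# below are hypothetical examples of Buckingham-pi style dimensionless numbers.

def pi_force(peak_accel, head_mass, impact_force):
    """Ratio of transmitted inertial force to applied impact force
    (dimensionless): pi_F = m * a / F."""
    return head_mass * peak_accel / impact_force

def pi_time(muscle_reaction_time, impact_duration):
    """Ratio of the reaction-time deficit to the impact duration
    (dimensionless): pi_t = (t_react - t_impact) / t_impact."""
    return (muscle_reaction_time - impact_duration) / impact_duration

# Example impact: 4 kN hit, 4.5 kg head mass, 90 g peak acceleration
print(pi_force(peak_accel=90 * 9.81, head_mass=4.5, impact_force=4000.0))
print(pi_time(muscle_reaction_time=0.120, impact_duration=0.010))
```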

    L-moments of the Birnbaum-Saunders distribution and its extreme value version: Estimation, goodness of fit and application to earthquake data

    Get PDF
    Understanding patterns in the frequency of extreme natural events, such as earthquakes, is important as it helps in the prediction of their future occurrence and hence provides better civil protection. Distributions describing these events are known to be heavy-tailed and positively skewed, making standard distributions unsuitable for such a situation. The Birnbaum-Saunders distribution and its extreme value version have been widely studied and applied due to their attractive properties. We derive L-moment equations for these distributions and propose novel methods for parameter estimation, goodness-of-fit assessment and model selection. A simulation study is conducted to evaluate the performance of the L-moment estimators, which is compared to that of the maximum likelihood estimators, demonstrating the superiority of the proposed methods. To illustrate these methods in a practical application, a data analysis of real-world earthquake magnitudes, obtained from the global centroid moment tensor catalogue during 1962-2015, is carried out. This application identifies the extreme value Birnbaum-Saunders distribution as a better model than classic extreme value distributions for describing seismic events.
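    The proposed estimators match theoretical L-moments of the (extreme value) BS distribution to their sample counterparts; the paper derives the distribution-specific equations. As a generic illustration, the first two sample L-moments can be computed from order statistics as in the sketch below (a standard construction, not code from the paper).

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments (Hosking's unbiased estimators).
    l1 is the sample mean; l2 is a dispersion measure equal to half
    Gini's mean difference."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    # b1 = n^-1 * sum over order statistics x_(i) weighted by (i-1)/(n-1)
    i = np.arange(1, n + 1)
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    l1, l2 = b0, 2.0 * b1 - b0
    return l1, l2

rng = np.random.default_rng(42)
data = rng.lognormal(mean=1.0, sigma=0.5, size=1000)  # positively skewed data
print(sample_l_moments(data))
```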

    Dynamic resource allocation in different ultrawideband optical network topologies

    Get PDF
    We study the blocking performance of dynamic resource allocation strategies in ultrawideband elastic optical networks under different topologies. State-of-the-art heuristics are evaluated on four different network topologies. Results consistently show better performance for heuristics that prioritise allocation based on the connections' bitrate.
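    The heuristics themselves are not spelled out in this abstract; as a toy illustration of the winning idea, bitrate-prioritised allocation, the sketch below sorts requests by bitrate and assigns contiguous spectrum slots first-fit on a single link. All names and numbers are illustrative.

```python
# Toy illustration of bitrate-prioritised spectrum allocation on one link.
# The paper's heuristics and topologies are more elaborate; this only shows
# the core idea: serve high-bitrate requests first, then first-fit the slots.

NUM_SLOTS = 16
link = [False] * NUM_SLOTS          # False = slot free

def first_fit(link, needed):
    """Return start index of the first run of `needed` free slots, or None."""
    run = 0
    for i, used in enumerate(link):
        run = 0 if used else run + 1
        if run == needed:
            return i - needed + 1
    return None

# (bitrate in Gb/s, required contiguous slots) -- illustrative values
requests = [(400, 6), (100, 2), (200, 3), (100, 2), (400, 6)]
blocked = 0
for bitrate, slots in sorted(requests, reverse=True):  # highest bitrate first
    start = first_fit(link, slots)
    if start is None:
        blocked += 1
        continue
    for s in range(start, start + slots):
        link[s] = True

print(f"blocked {blocked} of {len(requests)} requests")
```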

    Self-consistency in Theories with a Minimal Length

    Full text link
    The aim of this paper is to clarify the relation between three different approaches to theories with a minimal length scale: a modification of the Lorentz group in 'Deformed Special Relativity', theories with a 'Generalized Uncertainty Principle', and those with 'Modified Dispersion Relations'. It is shown that the first two are equivalent, how they can be translated into each other, and how the third can be obtained from them. An adequate theory with a minimal length scale requires all three features to be present.
    Comment: typos corrected, published with new title following referee's advice
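    The 'Generalized Uncertainty Principle' referred to here is usually quoted in the following standard form, whose minimization over Δp yields a smallest resolvable length (the paper's conventions may differ in factors):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
\quad\Longrightarrow\quad
\Delta x_{\min} = \hbar\sqrt{\beta}.
```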