Using the DNA Testing of Arrestees to Reevaluate Fourth Amendment Doctrine
With the advent of DNA testing, numerous issues have arisen with regard to obtaining and using evidence developed from such testing. As courts have come to regard DNA testing as a reliable method for linking some people to crimes and for exonerating others, these issues are especially significant. The federal government and most states have enacted statutes that permit or direct the testing of those convicted of at least certain crimes. Courts have almost universally approved such testing, rejecting arguments that obtaining and using such evidence violates the Fourth Amendment.
More recently, governments have enacted laws permitting or directing the taking of DNA samples from those arrested for, but not yet convicted of, certain serious crimes. Courts had been far more divided about the constitutionality of DNA testing of arrestees than about the comparable testing of those already convicted of crimes. Given the division in the holdings among both state and federal courts and the increasing importance of DNA evidence in criminal investigations, it was hardly surprising that the Supreme Court agreed to hear a case regarding the constitutionality of a Maryland statute allowing for such testing.
Section II of this article will provide a brief description of the science of DNA testing as it is used in the criminal justice system. Section III will discuss the Supreme Court's decision in Maryland v. King. Section IV will address the argument of the opponents of the DNA testing of arrestees: that it violates the presumption of innocence. The chief focus of the article will appear in Sections V and VI, which will respond to the arguments posed by those who claim such testing violates the Fourth Amendment. Section V will address the balancing test for such searches and seizures long employed by the Supreme Court. Section VI will describe and critique the use of the primary purpose test as an important factor in determining whether the Fourth Amendment has been violated. This test looks to whether the primary purpose of the government's search or seizure was something other than to ferret out ordinary criminal wrongdoing, and only in such situations excuses the absence of individualized suspicion.
A Three-Flavor, Lorentz-Violating Solution to the LSND Anomaly
We investigate whether postulating the existence of Lorentz-violating, CPT-conserving interactions allows three-neutrino solutions to the LSND anomaly that are also consistent with all other neutrino data. We show that Lorentz-violating interactions that couple only to one of the active neutrinos have the right properties to explain all the data. The details of the data make this solution unattractive. We find, for example, that a highly non-trivial energy dependence of the Lorentz-violating interactions is required.
Comment: 15 pages, two eps figures. V2 - Minor modification
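To see why the required energy dependence is non-trivial, here is a generic scaling argument (our gloss, not the paper's specific model): an ordinary mass-squared splitting yields an oscillation phase that falls with neutrino energy, whereas a CPT-even, dimension-four Lorentz-violating coefficient $c$ yields a phase that grows with it,

$$ \phi_{\rm mass} \sim \frac{\Delta m^2 L}{4E}\,, \qquad \phi_{\rm LV} \sim c\,L\,E\,, $$

so fitting LSND together with solar, atmospheric, and reactor data, which span very different $L/E$ regimes, forces the Lorentz-violating coefficients to vary with energy in a contrived way.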
QCD effects on "stable" micro black holes at the LHC
If Micro Black Holes (MBHs) can be produced at the LHC, they will decay very fast. We study hypothetical MBHs that do not decay; in particular, QCD effects on accretion by MBHs that are produced at rest. We explain why accretion of a nucleon by such MBHs is associated with pion emission. This pion emission results in a kick to the MBHs, such that their velocities are large enough to escape the Earth. Our study provides extra assurance that MBHs which might be produced at the LHC are not dangerous.
Comment: 10 pages
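As a rough illustration of the kick mechanism (a back-of-the-envelope sketch with assumed numbers, not a calculation from the paper), the recoil from even a single emitted pion can exceed Earth's escape velocity for a TeV-scale MBH:

import math  # not strictly needed; kept for easy extension

# Illustrative estimate with assumed values: recoil velocity of a heavy MBH
# from emitting one pion, compared with Earth's escape velocity.
C_KM_S = 2.998e5          # speed of light [km/s]
ESCAPE_V_KM_S = 11.2      # Earth's escape velocity [km/s]

def kick_velocity(pion_momentum_gev, mbh_mass_gev):
    """Non-relativistic recoil v = p / M (in km/s); valid since M >> p."""
    return (pion_momentum_gev / mbh_mass_gev) * C_KM_S

# Assumed inputs: pion momentum ~ 0.2 GeV (typical hadronic scale) and
# MBH mass ~ 1 TeV (near the LHC production threshold in low-scale-gravity
# scenarios). Both numbers are illustrative, not taken from the paper.
v = kick_velocity(pion_momentum_gev=0.2, mbh_mass_gev=1.0e3)
print(f"single-pion kick: {v:.1f} km/s vs escape: {ESCAPE_V_KM_S} km/s")
# -> roughly 60 km/s, well above 11.2 km/s; repeated accretion/emission
#    cycles would only add further kicks.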
Fast CP Violation
$B$ flavor tagging will be extensively studied at the asymmetric $B$ factories due to its importance in CP asymmetry measurements. The primary tagging modes are the semileptonic decays of the $b$ (lepton tag) or the hadronic decays (kaon tag). We suggest that looking for time-dependent CP asymmetries in events where one $B$ is tagged leptonically and the other one is tagged with a kaon could result in an early detection of CP violation. Although in the Standard Model these asymmetries are expected to be small, they could be measured with about the same amount of data as in the ``gold-plated'' decay $B \to \psi K_S$. In the presence of physics beyond the Standard Model, these asymmetries could be considerably larger, and the first CP violation signal in the $B$ system may show up in these events. We give explicit examples of new physics scenarios where this occurs.
Comment: 9 pages, revtex, no figures. Discussion of new physics effects on CP violation with two lepton tags expanded. Factors of 2 corrected
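For orientation (the standard textbook parametrization, not a result specific to this paper), a time-dependent CP asymmetry for neutral $B$ decays to a final state $f$ is conventionally written as

$$ a_{CP}(t) \;=\; \frac{\Gamma(\bar B^0(t)\to f) - \Gamma(B^0(t)\to f)}{\Gamma(\bar B^0(t)\to f) + \Gamma(B^0(t)\to f)} \;=\; S_f\,\sin(\Delta m_d t) - C_f\,\cos(\Delta m_d t)\,, $$

where $\Delta m_d$ is the $B^0$-$\bar B^0$ mass difference; for the gold-plated mode $f = \psi K_S$ the Standard Model predicts $S_f \simeq \sin 2\beta$ and $C_f \simeq 0$.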
Pseudo-Deterministic Streaming
A pseudo-deterministic algorithm is a (randomized) algorithm which, when run multiple times on the same input, with high probability outputs the same result on all executions. Classic streaming algorithms, such as those for finding heavy hitters, approximate counting, ℓ_2 approximation, and finding a nonzero entry in a vector (for turnstile algorithms), are not pseudo-deterministic. For example, in the instance of finding a nonzero entry in a vector, for any known low-space algorithm A, there exists a stream x so that running A twice on x (using different randomness) would with high probability result in two different entries as the output.
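As a toy illustration of the definition (a minimal sketch, not a construction from the paper; all function names are hypothetical and neither routine is low-space), compare a randomized finder whose answer varies with its coin flips against a canonical rule that always returns the same valid answer:

import random

def find_nonzero_randomized(stream, seed):
    """Return the index of some nonzero entry by sampling among the nonzero
    entries. Different seeds can return different valid indices, so this
    routine is NOT pseudo-deterministic."""
    rng = random.Random(seed)
    nonzero = [i for i, v in enumerate(stream) if v != 0]
    return rng.choice(nonzero) if nonzero else None

def find_nonzero_pseudo_det(stream, seed):
    """A pseudo-deterministic rule: always report the smallest nonzero index.
    The seed is deliberately unused, so every run gives the same answer."""
    return next((i for i, v in enumerate(stream) if v != 0), None)

x = [0, 3, 0, 7, 0, 2]
print(find_nonzero_randomized(x, seed=1), find_nonzero_randomized(x, seed=2))
# different seeds may report different valid indices, e.g. 3 and 1
print(find_nonzero_pseudo_det(x, seed=1), find_nonzero_pseudo_det(x, seed=2))
# always 1 and 1: the same canonical output on every execution

The point of the paper is that for problems like this one, the second kind of behavior cannot be achieved by any low-memory streaming algorithm.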
In this work, we study whether it is inherent that these algorithms output different values on different executions. That is, we ask whether these problems have low-memory pseudo-deterministic algorithms. For instance, we show that there is no low-memory pseudo-deterministic algorithm for finding a nonzero entry in a vector (given in a turnstile fashion), and also that there is no low-dimensional pseudo-deterministic sketching algorithm for ℓ_2 norm estimation. We also exhibit problems which do have low-memory pseudo-deterministic algorithms but no low-memory deterministic algorithm, such as outputting a nonzero row of a matrix, or outputting a basis for the row-span of a matrix.
We also investigate multi-pseudo-deterministic algorithms: algorithms which with high probability output one of a few options. We show the first lower bounds for such algorithms. This implies that there are streaming problems such that every low-space algorithm for the problem must have inputs on which there are many valid outputs, all with a significant probability of being output.
The importance of N2 leptogenesis
We argue that fast interactions of the lightest singlet neutrino $N_1$ would project part of a preexisting lepton asymmetry onto a direction that is protected from washout effects, thus preventing it from being erased. In particular, we consider an asymmetry generated in $N_2$ decays, assuming that $N_1$ interactions are fast enough to bring $N_1$ into full thermal equilibrium. If $N_1$ decays occur at $T \gtrsim 10^9$ GeV, that is, before the muon Yukawa interactions enter into thermal equilibrium, then generically part of the asymmetry survives. In this case some of the constraints implied by the standard leptogenesis scenario no longer automatically hold. For $T \lesssim 10^9$ GeV, the asymmetry is generally erased, unless special alignment/orthogonality conditions in flavor space are realized.
Comment: 5 pages. A few clarifications added, conclusions unchanged. Version published in Phys. Rev. Lett. (Title changed in journal)
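Schematically (our gloss on the mechanism, not an equation quoted from the paper), one can decompose the preexisting asymmetry in lepton-flavor space along the direction $\hat\ell_1$ to which $N_1$ couples:

$$ \vec Y_{\Delta L} \;=\; \big(\hat\ell_1 \cdot \vec Y_{\Delta L}\big)\,\hat\ell_1 \;+\; \vec Y^{\,\perp}_{\Delta L}\,. $$

$N_1$ washout acts only on the component along $\hat\ell_1$, so the orthogonal part $\vec Y^{\,\perp}_{\Delta L}$ survives, provided the flavor coherence underlying this decomposition is not destroyed first by the muon Yukawa interactions coming into equilibrium (hence the $T \sim 10^9$ GeV dividing line).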
Suggestive Identifications: The Supreme Court's Due Process Test Fails to Meet Its Own Criteria
There are perhaps few procedures in our system of criminal justice more inexact than eyewitness identification of criminal suspects. This is due in large measure to the many subtle psychological influences that affect any person's ability to observe, retain, and recollect events, particularly when stress is present. The author discusses the current constitutional standard for the admissibility of eyewitness identifications and examines whether this test serves the interests it purports to uphold. After discussing the impact of psychological factors and suggestive police practices, the author offers some guidelines for more consistent application of the existing test.
Whither Reasonable Suspicion: The Supreme Court's Functional Abandonment of the Reasonableness Requirement for Fourth Amendment Seizures
Although the United States Supreme Court’s approach to issues governing application of the probable cause requirement of the Fourth Amendment has mutated over the years, at least one aspect of its approach has remained constant. Before information leading to probable cause or its lesser iteration of reasonable suspicion is found to exist, the government must demonstrate in some meaningful way the reliability of the person providing the information or of the information itself. Lacking such reliability, no search or seizure based on probable cause or reasonable suspicion is permitted. In its recent decision in Navarette v. California, the Court largely abandoned the requirement that this reliability be meaningful. It did so by holding that an anonymous 911 call without any impactful corroboration could supply the reasonable suspicion necessary to effect a seizure protected by the Fourth Amendment. This abandonment significantly increases the ability of the government to deprive a person of his or her freedom in conducting a seizure. Now, such a seizure can be effected without the government demonstrating that the individual who provided the information justifying the seizure is worthy of belief in any manner that has traditionally been used by the Court to show reliability. In its effort to justify this approach to reliability, the Court in Navarette misinterpreted rather egregiously its previous holdings on reliability in similar cases, and then offered new arguments to buttress its decision. These new arguments are unpersuasive in their application to the facts of Navarette and, even more troubling, are at odds with the principles embodied in the Fourth Amendment.

Section I of this Article will examine the Supreme Court’s foundational decisions regarding the requirements for the government to show probable cause and the lower standard of reasonable suspicion for less intrusive searches and seizures. Section II will focus on the Court’s application of the reliability requirement for determining reasonable suspicion in the two cases that are directly on point with the facts and legal issues raised in Navarette. Section III will explore the Court’s holding in Navarette, examining the Court’s misapplication of the principles of previous holdings and the flawed reasoning used to justify the reliability of an anonymous, uncorroborated 911 call. One of the methods offered by the Court to show reliability involved the application of arguably related hearsay exceptions. Accordingly, Section IV will assess the propriety of using evidentiary principles in reaching determinations of constitutional law. The Article will conclude with a suggested approach for determining reliability in cases that rely on the presence of reasonable suspicion to justify a seizure protected by the Fourth Amendment.