
    Translating Unocal: The Expanding Web of Liability for Business Entities Implicated in International Crimes

    The Ninth Circuit ruled that a corporation could be held liable under the federal Alien Tort Claims Act for its complicity in a violation of international criminal law occurring outside the U.S. (Doe I v. Unocal Corp., 395 F.3d 932 (9th Cir. 2002)). Since then, litigants have filed increasing numbers of such cases. These cases raise two questions: (1) Is the United States the only country that provides judicial accountability for business entities involved in international crimes abroad? and (2) How are other countries translating the basic kinds of accountability that Unocal recognized into their own legal systems? This Article attempts to answer these questions by presenting the results of a comparative law survey of sixteen countries, which invited lawyers and legal scholars to examine questions relating to the status of international criminal law in each country. Their responses examine the incorporation into domestic penal codes of international criminal law from the Rome Statute of the International Criminal Court and other international covenants; describe applicable concepts of third-party liability; and evaluate the status of corporate liability under domestic penal codes. The responses reveal other sources of criminal liability for illicit business conduct abroad, such as bribery of foreign officials, money laundering, and dealing in stolen property. Finally, they provide analyses of the laws and legal customs relating to the rights of victims to access civil courts in the various countries in search of compensation and other remedies. The responses present compelling evidence of the existence of what has been termed an emerging transnational web of liability for business entities implicated in international crimes. Since the sixteen countries in the survey represent both civil and common law traditions, parties and nonparties to the ICC, and a wide geographic range, we believe the conclusions reached may be extrapolated more broadly.

    Differential Uptake of Gold Nanoparticles by 2 Species of Tadpole, the Wood Frog (Lithobates Sylvaticus) and the Bullfrog (Lithobates Catesbeianus)

    Engineered nanoparticles are aquatic contaminants of emerging concern that exert ecotoxicological effects on a wide variety of organisms. We exposed wood frog and bullfrog tadpoles, each species alone with conspecifics and in combination with the other species, to cetyltrimethylammonium bromide–capped spherical gold nanoparticles continuously for 21 d, then measured uptake and localization of gold. Wood frog tadpoles alone and in combination with bullfrog tadpoles took up significantly more gold than bullfrogs. Bullfrog tadpoles in combination with wood frogs took up significantly more gold than controls. The rank order of weight-normalized gold uptake was wood frogs in combination > wood frogs alone > bullfrogs in combination > bullfrogs alone > controls. In all gold-exposed groups of tadpoles, gold was concentrated in the anterior region compared with the posterior region of the body. The concentration of gold nanoparticles in the anterior region of wood frogs both alone and in combination with bullfrogs was significantly higher than in the corresponding posterior regions. We also measured depuration time of gold in wood frogs. After 21 d in a solution of gold nanoparticles, tadpoles lost >83% of internalized gold when placed in gold-free water for 5 d. After 10 d in gold-free water, tadpoles lost 94% of their gold. After 15 d, gold concentrations were below the level of detection. Our finding of differential uptake between closely related species living in similar habitats with overlapping geographical distributions argues against generalizing toxicological effects of nanoparticles for a large group of organisms based on measurements in only one species.
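    The depuration figures above (>83% lost after 5 d, 94% after 10 d, below detection by 15 d) are roughly consistent with first-order elimination. A minimal sketch, assuming a single-compartment exponential model that the abstract itself does not fit:

```python
import math

# Hypothetical first-order depuration model, C(t) = C0 * exp(-k * t).
# The study reports remaining gold fractions at fixed times; it does not
# report a fitted rate constant, so this is an illustrative back-calculation.
def elimination_rate(frac_remaining, days):
    """Rate constant k (per day) inferred from a single time point."""
    return -math.log(frac_remaining) / days

k = elimination_rate(0.06, 10)   # 94% lost after 10 d => 6% remains
half_life = math.log(2) / k      # days to lose half the body burden

print(round(k, 3), round(half_life, 2))  # ~0.281 per day, ~2.46 d
```

Under this assumed model the 5-d prediction (about 75% lost) is somewhat below the reported >83%, suggesting elimination is faster early on than a single compartment captures.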

    Should Higher Education respond to recent changes in the forensic science marketplace?

    The evolution of forensic science within the United Kingdom over the past four decades has been rapid and dynamic. This has included policy responses to highly public miscarriages of justice, the introduction of commercialisation, and pioneering scientific developments such as DNA profiling. However, even within this context, changes within forensic science over the last two years have been unprecedented: the closure of The Forensic Science Service; a Home Office review of research and development within forensic science; the challenges facing fingerprint identification as a result of The Fingerprint Inquiry (Scotland); and the embryonic development of a new professional body for the police force. Correspondingly, forensic science within Higher Education (HE) has been substantially transformed from a small number of Masters courses delivered by a small number of universities to a plethora of undergraduate courses now available throughout the United Kingdom. This rapid expansion of forensic science courses has been openly criticised and debated, and it is incumbent upon universities not only to focus on education but also to provide graduates with transferable skills, making them more employment-ready. As a consequence, HE establishments must be cognisant of and reactive to changes within any associated industry and respond accordingly. However, have the universities delivering forensic science courses fully responded to these recent and unprecedented developments in the history of forensic science within the United Kingdom? This paper will consider the most recent changes to the forensic science marketplace and their ramifications for forensic science education within the HE sector. Challenges resulting from these changes will be highlighted, and their educational impact on forensic science courses throughout the UK, and on the future of those courses, will be evaluated in chronological order.

    Behaviour of Magnetic Tubes in Neutron Star's Interior

    It is found from Maxwell's equations that magnetic field lines are good analogues of relativistic strings. It is shown that the superconducting current in the neutron star's interior causes local rotation of magnetic flux tubes carrying quantized flux.
    Comment: 6 pages, no figure
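    The quantized flux carried by each tube is the standard superconducting flux quantum, Phi_0 = h / (2e), the factor of two reflecting the charge of a Cooper pair of protons. This value is textbook physics rather than something stated in the abstract:

```python
# Flux quantum for a superconductor of paired charges: Phi_0 = h / (2e).
# Constants are the exact 2019 SI values.
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C

phi_0 = h / (2 * e)            # flux quantum in weber, ~2.0678e-15 Wb
phi_0_gauss_cm2 = phi_0 * 1e8  # 1 Wb = 1e8 G cm^2

print(phi_0, phi_0_gauss_cm2)
```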

    Detection of fast radio transients with multiple stations: a case study using the Very Long Baseline Array

    Recent investigations reveal an important new class of transient radio phenomena that occur on sub-millisecond timescales. Transient surveys' data volumes are often too large to archive exhaustively. Instead, an on-line automatic system must excise impulsive interference and detect candidate events in real time. This work presents a case study using data from multiple geographically distributed stations to perform simultaneous interference excision and transient detection. We present several algorithms that incorporate dedispersed data from multiple sites, and report experiments with a commensal real-time transient detection system on the Very Long Baseline Array (VLBA). We test the system using observations of pulsar B0329+54. The multiple-station algorithms enhanced sensitivity for detection of individual pulses. These strategies could improve detection performance for a future generation of geographically distributed arrays such as the Australian Square Kilometre Array Pathfinder and the Square Kilometre Array.
    Comment: 12 pages, 14 figures. Accepted for Ap
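    The dedispersion step mentioned above can be sketched as an incoherent channel-shift-and-sum. The dispersion constant is the standard cold-plasma value; the function names and the DM for B0329+54 are illustrative assumptions taken from the pulsar literature, not from this abstract:

```python
# Incoherent dedispersion: shift each frequency channel by the cold-plasma
# dispersion delay relative to the highest channel, then sum into one time
# series. DM for B0329+54 is ~26.8 pc cm^-3 (illustrative, from catalogues).
K_DM = 4.148808e3  # dispersion constant, MHz^2 s / (pc cm^-3)

def dispersion_delay(dm, f_mhz, f_ref_mhz):
    """Extra arrival delay (s) at f_mhz relative to a reference frequency."""
    return K_DM * dm * (f_mhz**-2 - f_ref_mhz**-2)

def dedisperse(channels, freqs_mhz, dm, dt):
    """Shift each channel (list of samples) and sum into one time series.
    dt is the sample interval in seconds."""
    f_ref = max(freqs_mhz)
    n = len(channels[0])
    out = [0.0] * n
    for chan, f in zip(channels, freqs_mhz):
        shift = round(dispersion_delay(dm, f, f_ref) / dt)
        for i in range(n - shift):
            out[i] += chan[i + shift]
    return out
```

A single-station detector would then threshold the summed series; the paper's multi-station algorithms combine such series across sites to reject local interference.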

    Analysing Astronomy Algorithms for GPUs and Beyond

    Astronomy depends on ever-increasing computing power. Processor clock rates have plateaued, and increased performance now appears in the form of additional processor cores on a single chip. This poses significant challenges to the astronomy software community. Graphics Processing Units (GPUs), now capable of general-purpose computation, exemplify both the difficult learning curve and the significant speedups exhibited by massively parallel hardware architectures. We present a generalised approach to tackling this paradigm shift, based on the analysis of algorithms. We describe a small collection of foundation algorithms relevant to astronomy and explain how they may be used to ease the transition to massively parallel computing architectures. We demonstrate the effectiveness of our approach by applying it to four well-known astronomy problems: Hogbom CLEAN, inverse ray-shooting for gravitational lensing, pulsar dedispersion and volume rendering. Algorithms with well-defined memory access patterns and high arithmetic intensity stand to receive the greatest performance boost from massively parallel architectures, while those that involve a significant amount of decision-making may struggle to take advantage of the available processing power.
    Comment: 10 pages, 3 figures, accepted for publication in MNRA
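    The arithmetic intensity invoked above can be made concrete as a flops-per-byte ratio, the usual roofline-style metric for deciding whether a kernel is memory-bound or compute-bound. The quantities below are illustrative back-of-envelope assumptions, not figures from the paper:

```python
# Roofline-style estimate: arithmetic intensity = flops per byte moved
# to/from memory. High intensity => compute-bound, likely to benefit
# from a GPU; low intensity => memory-bound.
def arithmetic_intensity(flops, bytes_moved):
    return flops / bytes_moved

# Element-wise add of two float32 arrays: 1 flop per 12 bytes
# (two 4-byte reads plus one 4-byte write) -- firmly memory-bound.
streaming = arithmetic_intensity(1, 12)

# Dense N x N float32 matrix multiply: 2*N^3 flops over 3*N^2 * 4 bytes
# (ideal caching) -- intensity grows linearly with N.
def matmul_intensity(n):
    return arithmetic_intensity(2 * n**3, 3 * n**2 * 4)

print(streaming, matmul_intensity(1024))  # ~0.083 vs ~170.7 flops/byte
```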