
    Automated identification of neurons and their locations

    Individual locations of many neuronal cell bodies (>10^4) are needed to enable statistically significant measurements of spatial organization within the brain, such as nearest-neighbor and microcolumnarity measurements. In this paper, we introduce an Automated Neuron Recognition Algorithm (ANRA) which obtains the (x,y) location of individual neurons within digitized images of Nissl-stained, 30 micron-thick, frozen sections of the cerebral cortex of the Rhesus monkey. Identification of neurons within such Nissl-stained sections is inherently difficult due to the variability in neuron staining, the overlap of neurons, the presence of partial or damaged neurons at tissue surfaces, and the presence of non-neuron objects such as glial cells, blood vessels, and random artifacts. To overcome these challenges and identify neurons, ANRA applies a combination of image segmentation and machine learning: active contour segmentation finds outlines of potential neuron cell bodies, and an artificial neural network is then trained on the segmentation properties (size, optical density, gyration, etc.) to distinguish neuron from non-neuron segmentations. ANRA positively identifies 86[5]% of neurons with 15[8]% error (mean[st.dev.]) on a wide range of Nissl-stained images, whereas semi-automatic methods obtain 80[7]%/17[12]%. A further advantage of ANRA is that it affords an essentially unlimited increase in speed over semi-automatic methods and is computationally efficient, recognizing ~100 neurons per minute on a standard personal computer. ANRA is amenable to analysis of huge photo-montages of Nissl-stained tissue, thereby opening the door to fast, efficient and quantitative analysis of the vast stores of archival material that exist in laboratories and research collections around the world.
    Comment: 38 pages. Formatted for two-sided printing. Supplemental material and software available at http://physics.bu.edu/~ainglis/ANRA
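    The two-stage pipeline the abstract describes (shape/density features from candidate segmentations, then a neural-network classifier) can be sketched as follows. This is not the authors' ANRA code: the feature values are synthetic, and the classifier and thresholds are illustrative assumptions.

```python
# Minimal sketch (not the authors' ANRA code): measure per-segmentation
# features, then train a small neural network to separate neurons from
# non-neuron objects (glia, vessels, artifacts). All values are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Each row: (area in px^2, mean optical density, radius of gyration in px),
# the kinds of per-segmentation properties the abstract lists.
neurons = rng.normal([300.0, 0.80, 9.0], [60.0, 0.08, 1.5], size=(200, 3))
clutter = rng.normal([80.0, 0.40, 4.0], [30.0, 0.08, 1.0], size=(200, 3))
X = np.vstack([neurons, clutter])
y = np.array([1] * 200 + [0] * 200)  # 1 = neuron, 0 = non-neuron

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                  random_state=0)).fit(X, y)

# A large, dense, extended segmentation should be accepted as a neuron.
print(clf.predict([[310.0, 0.85, 9.5]])[0])
```

    In the real pipeline these features would come from active contour outlines rather than a random generator; the classifier step itself is the same shape.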

    Wideband digital phase comparator for high current shunts

    A wideband phase comparator for precise measurements of the phase difference of high current shunts has been developed at INRIM. The two-input digital phase detector is realized with a precision wideband digitizer connected through a pair of symmetric active guarded transformers to the outputs of the shunts under comparison. Data are first acquired asynchronously, then transferred from on-board memory to host memory. Because of the large amount of data collected, the filtering process and the analysis algorithms are performed outside the acquisition routine. Most of the systematic errors can be compensated by a proper inversion procedure. The system is suitable for comparing shunts over a wide range of currents, from several hundred milliamperes up to 100 A, and frequencies between 500 Hz and 100 kHz. Expanded uncertainty (k=2) of less than 0.05 mrad, for frequencies up to 100 kHz, is obtained in the measurement of the phase difference of a group of 10 A shunts, provided by some European NMIs, using a digitizer with sampling frequency up to 1 MHz. An enhanced version of the phase comparator employs a new digital phase detector with higher sampling frequency and vertical resolution. This permits the phase detector's contribution to the uncertainty budget to be reduced by a factor of two from 20 kHz to 100 kHz. Theory and experiments show that the phase difference between two high precision wideband digitizers, coupled as a phase detector, depends on multiple factors derived from both the analog and digital imprint of each sampling system.
    Comment: 20 pages, 9 figures
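    As a simplified illustration of how a sampling-based phase detector recovers a phase difference, the sketch below fits a three-parameter sine model (known frequency, least squares) to each digitized channel and subtracts the fitted phases. This is a generic textbook technique, not INRIM's implementation; the frequency, amplitudes, and record length are arbitrary.

```python
# Hypothetical sketch of sampled-data phase comparison (not INRIM's code):
# fit x(t) ~ b*sin(wt) + a*cos(wt) + c at a known frequency to each channel
# and take the difference of the recovered phases atan2(a, b).
import numpy as np

def fitted_phase(x, t, f):
    w = 2 * np.pi * f
    M = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    b, a, _ = np.linalg.lstsq(M, x, rcond=None)[0]
    return np.arctan2(a, b)  # phase of A*sin(w*t + phi)

fs = 1e6                      # 1 MHz digitizer, as in the abstract
t = np.arange(4096) / fs
f = 10e3                      # 10 kHz test tone
true_dphi = 0.5e-3            # 0.5 mrad between the two shunt outputs
ch1 = 1.00 * np.sin(2 * np.pi * f * t)
ch2 = 0.98 * np.sin(2 * np.pi * f * t + true_dphi)

dphi = fitted_phase(ch2, t, f) - fitted_phase(ch1, t, f)
print(dphi)  # ~ 0.5e-3 rad
```

    The fit is insensitive to the amplitude mismatch between the channels, which is why a phase comparison of shunts with different transimpedances can use it directly.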

    The development of policing in Britain in the next five years.

    The British police service is currently going through a radical transformation phase. The present Tory-led coalition government has set out an agenda to bring about drastic changes in policing. These proposed changes are unprecedented in the history of policing since 1829. The police service is governed by a tripartite arrangement of checks and balances laid down under the Police Act 1964. By this I mean that there are three key players in relation to police governance in Britain: the Home Secretary, the local police authority and the chief constable. The future of policing in the next five years is set out clearly by the Home Secretary, Theresa May MP, under the Police Reform and Social Responsibility Bill, which is currently being reviewed in the House of Lords. The recent phone hacking scandal has made it imperative for the British public to have a closer look at the police service in relation to proper accountability. There have been references to police corruption as far back as the era of 'parish constables', dating back to 1800, when it was alleged that police officers took bribes, got drunk whilst on duty and lacked the moral credibility to protect and serve us (Critchley, 1978). In the seventies and eighties the British public was informed of another scandal involving members of Scotland Yard and criminal gangs in the East End of London. In this article, I shall argue that the issue of police corruption is not a new phenomenon. It has been an ongoing issue that has haunted the police for over a century. This article is divided into three parts.
In the first part of the article I present the following issues: the Metropolitan Police policing plan 2011-2014; the merits and demerits of the policing plan; tripartite police accountability and its shortcomings; democratic accountability and localisation of policing; the professionalisation of policing and the creation of the Police Body; the review of police pay and benefits; and the impact of this on police officers' morale. In the second part of my article I present some of the criticisms levelled against the ongoing police reforms. I will look at the criticisms from both internal and external perspectives. By internal criticism, I mean police officers' opposition to the reforms. By external criticism, I mean criticisms from criminologists and members of the British public. In the third part of my article I make my position clear on where I stand in relation to the ongoing police reforms. I shall argue that the current ongoing job cuts in the police service are a disaster waiting to happen, and that our safety has been compromised by politicians. We are now living at the mercy of criminals and law breakers due to the manpower shortage. We are all living witnesses to the ongoing public disturbances in Tottenham, Enfield, Brixton, Peckham, Walthamstow and Croydon, in London. These riots spread to other cities such as Bristol, Birmingham, Manchester and Liverpool with unimaginable speed and on an unimaginable scale. We all watched how difficult it was for the police to restore order and normality. Rioters looted and plundered goods and burnt down buildings as if no laws existed in our country. A complete breakdown of law and order put the lives of citizens at risk. My article makes a passionate appeal to the present coalition government to rethink the issue of reducing the number of police officers protecting us. I shall argue that we need more police officers in Britain, not fewer.
The level of anger and social discontent is higher than the government ever anticipated, partly because of economic hardship. My argument is that economic hardship is not an excuse to commit burglary, theft, arson, murder and criminal damage with intent to endanger life. Rioters are shameless opportunists, a bunch of hoodlums and criminals who have no place in any civilised society, and who should be made to face the due process of law.

    Circulating tumour cell RNA characterisation from colorectal cancer patient blood after inertial microfluidic enrichment

    © 2019 The Authors. The detection and molecular analysis of circulating tumour cells (CTCs) potentially provides significant insight into the characterisation of disease, stage of progression and therapeutic options for cancer patients. Following on from the protocol by Warkiani et al. 2016, which describes a method of enriching CTCs from cancer patient blood with inertial microfluidics, we describe a method to measure CTC RNA expression in the enriched fraction using droplet digital PCR, and compare transcript detection with and without RNA pre-amplification.
    • Inertial microfluidics combined with droplet digital PCR is advantageous as it allows for CTC enrichment and subsequent RNA analysis from patient blood. This allows for patient tumour analysis with increased sensitivity and precision compared to quantitative real-time PCR, and enables the direct quantification of nucleic acids without the need for a tumour biopsy.
    • This method demonstrates an efficient approach providing important insights into the analysis of colorectal cancer patients' CTCs using a specific gene subset or biomarkers, an approach that may be tailored to tumour type or expanded to larger panels.
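    Droplet digital PCR quantifies transcripts absolutely from the fraction of positive droplets via Poisson statistics. The sketch below shows that standard calculation; the droplet counts and nominal droplet volume are illustrative assumptions, not data from this protocol.

```python
# Standard ddPCR Poisson quantification (illustrative numbers, not data
# from this study): the mean copies per droplet, lam, follows from the
# fraction p of droplets scored positive: lam = -ln(1 - p).
import math

positive_droplets = 1500
total_droplets = 15000
droplet_volume_nl = 0.85          # nominal droplet volume, assumed here

p = positive_droplets / total_droplets
lam = -math.log(1.0 - p)          # mean target copies per droplet
copies_per_ul = lam / (droplet_volume_nl * 1e-3)  # copies per uL of reaction

print(round(lam, 4), round(copies_per_ul, 1))
```

    The log correction accounts for droplets that captured more than one copy, which is why ddPCR needs no standard curve, unlike quantitative real-time PCR.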

    Effects of crack tip geometry on dislocation emission and cleavage: A possible path to enhanced ductility

    We present a systematic study of the effect of crack blunting on subsequent crack propagation and dislocation emission. We show that the stress intensity factor required to propagate the crack is increased as the crack is blunted by up to thirteen atomic layers, but only by a relatively modest amount for a crack with a sharp 60° corner. The effect of the blunting is far less than would be expected from a smoothly blunted crack; the sharp corners preserve the stress concentration, reducing the effect of the blunting. However, for some material parameters blunting changes the preferred deformation mode from brittle cleavage to dislocation emission. In such materials, the absorption of preexisting dislocations by the crack tip can cause the crack tip to be locally arrested, causing a significant increase in the microscopic toughness of the crack tip. Continuum plasticity models have shown that even a moderate increase in the microscopic toughness can lead to an increase in the macroscopic fracture toughness of the material by several orders of magnitude. We thus propose an atomic-scale mechanism at the crack tip that may ultimately lead to a high fracture toughness in some materials where a sharp crack would seem able to propagate in a brittle manner. Results for blunt cracks loaded in mode II are also presented.
    Comment: 12 pages, REVTeX using epsfig.sty. 13 PostScript figures. Final version to appear in Phys. Rev. B. Main changes: discussion slightly shortened, one figure removed

    Hydrological and associated biogeochemical consequences of rapid global warming during the Paleocene-Eocene Thermal Maximum

    The Paleocene-Eocene Thermal Maximum (PETM) hyperthermal, ~ 56 million years ago (Ma), is the most dramatic example of abrupt Cenozoic global warming. During the PETM, surface temperatures increased by between 5 and 9 °C, and the onset likely took < 20 kyr. The PETM provides a case study of the impacts of rapid global warming on the Earth system, including both hydrological and associated biogeochemical feedbacks, and proxy data from the PETM can provide constraints on changes in warm climate hydrology simulated by general circulation models (GCMs). In this paper, we provide a critical review of biological and geochemical signatures interpreted as direct or indirect indicators of hydrological change at the PETM, explore the importance of adopting multi-proxy approaches, and present a preliminary model-data comparison. Hydrological records complement those of temperature and indicate that the climatic response at the PETM was complex, with significant regional and temporal variability. This is further illustrated by the biogeochemical consequences of inferred changes in hydrology; in fact, changes in precipitation and their biogeochemical consequences are often conflated in geochemical signatures. There is also strong evidence in many regions for changes in the episodic and/or intra-annual distribution of precipitation that have not been widely considered when comparing proxy data to GCM output. Crucially, GCM simulations indicate that the response of the hydrological cycle to the PETM was heterogeneous: some regions are associated with an increase in precipitation minus evaporation (P − E), whilst others are characterised by a decrease. Interestingly, the majority of proxy data come from the regions where GCMs predict an increase in PETM precipitation.
We propose that comparison of hydrological proxies to GCM output can be an important test of model skill, but this will be enhanced by further data from regions of model-simulated aridity and by the simulation of extreme precipitation events.

    Why 'scaffolding' is the wrong metaphor : the cognitive usefulness of mathematical representations.

    The metaphor of scaffolding has become current in discussions of the cognitive help we get from artefacts, environmental affordances and each other. Consideration of mathematical tools and representations indicates that in these cases at least (and plausibly for others), scaffolding is the wrong picture, because scaffolding in good order is immobile, temporary and crude. Mathematical representations can be manipulated, are not temporary structures to aid development, and are refined. Reflection on examples from elementary algebra indicates that Menary is on the right track with his 'enculturation' view of mathematical cognition. Moreover, these examples allow us to elaborate his remarks on the uniqueness of mathematical representations and their role in the emergence of new thoughts.
    Peer reviewed