Robots and tools for remodeling bone
The field of robotic surgery has progressed from small teams of researchers repurposing industrial robots to a competitive and highly innovative subsection of the medical device industry. Surgical robots allow surgeons to perform tasks with greater ease, accuracy, or safety, and fall under one of four levels of autonomy: active, semi-active, passive, and remote manipulator. The increased accuracy afforded by surgical robots has enabled cementless hip arthroplasty, improved postoperative alignment following knee arthroplasty, and reduced duration of intraoperative fluoroscopy, among other benefits. Cutting of bone has historically relied on tools such as hand saws and drills, with more elaborate cutting tools now used routinely to remodel bone. Improvements in cutting accuracy and additional options for safety and monitoring during surgery give robotic surgeries advantages over conventional techniques. This article aims to provide an overview of current robots and tools with a common target tissue of bone, proposes a new process for defining the level of autonomy of a surgical robot, and examines future directions in robotic surgery.
A novel robot with a toroidal workspace for coring applications
Removing a volume of tissue as a single mass can be a complicated operation, and in many cases it would be advantageous to be able to remove more complex geometries and preserve nearby critical regions. In this paper, we present a novel robot with two degrees of freedom, designed to cut around a parametric torus-shaped volume. The kinematics of this system make it possible to drive the tip of the linkage so that it does not collide with any point not on the surface of the specified torus. Two types of tool path can be followed, with the torus geometry defined to fit a target region of material. Three bone tumor cases were used to simulate the approach. Compared with more standard tri-planar cutting approaches, a toroidal resection could reduce the amount of healthy bone resected and reduce surgical access requirements.
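For intuition, the cutting geometry described here can be written down as the standard parametric torus. The sketch below (plain NumPy) shows one way a tool path on such a surface could be generated; the radii and the spiral path are illustrative choices, not taken from the paper.

```python
import numpy as np

def torus_point(R, r, theta, phi):
    """Point on a torus with major radius R and minor radius r.

    theta: angle around the central axis (toroidal direction)
    phi:   angle around the tube cross-section (poloidal direction)
    """
    x = (R + r * np.cos(phi)) * np.cos(theta)
    y = (R + r * np.cos(phi)) * np.sin(theta)
    z = r * np.sin(phi)
    return np.array([x, y, z])

# Sample a spiral cutting path that sweeps the tube cross-section while
# advancing around the central axis (one of many possible tool paths).
R, r = 40.0, 10.0  # millimetres, chosen arbitrarily for illustration
path = [torus_point(R, r, theta, 8 * theta)
        for theta in np.linspace(0.0, 2 * np.pi, 500)]
```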
Automated resection planning for bone tumor surgery
Planning for bone tumor resection surgery is a technically demanding and time-consuming task, reliant on manual positioning of cutting planes (CPs). This work describes an automated approach for generating bone tumor resection plans, where the volume of healthy bone collaterally resected with the tumor is minimized through optimized placement of CPs. Particle swarm optimization calculates the optimal position and orientation of the CPs by introducing a single new CP to an existing resection, then optimizing all CPs to find the global minimum. The bone bounded by all CPs is collaterally resected with the tumor. The approach was compared to manual resection plans from an experienced surgeon for 20 tumor cases. It was found that a greater number of CPs reduces the collaterally resected healthy bone, with diminishing returns on this improvement after five CPs. Algorithm-generated resection plans with an equivalent number of CPs resulted in a statistically significant improvement over manual plans (paired t-test, p < 0.001). The described approach has potential to improve patient outcomes by reducing loss of healthy bone in tumor surgery while offering a surgeon multiple resection plan options.
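As a rough illustration of the optimization ingredient, here is a generic particle swarm optimizer in Python. The plane encoding and the stand-in objective are assumptions for illustration only; the paper's actual objective is the volume of healthy bone bounded by the CPs.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-1.0, 1.0)):
    """Generic particle swarm optimizer (illustrative, not the paper's code).

    Each particle could encode the position and orientation of all cutting
    planes, e.g. [nx, ny, nz, d] per plane, flattened into one vector.
    """
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))                    # particle velocities
    pbest = x.copy()                                    # per-particle best
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()            # swarm-wide best

    for _ in range(iters):
        r1, r2 = np.random.rand(2, n_particles, 1)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Hypothetical stand-in for the collateral-bone objective: two planes,
# each encoded as [nx, ny, nz, d], i.e. 8 numbers per particle.
best_planes, best_value = pso(lambda p: float(np.sum(p ** 2)), dim=8)
```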
A Molecular Platinum Cluster Junction: A Single-Molecule Switch
We present a theoretical study of the electronic transport through single-molecule junctions incorporating a Pt6 metal cluster bound within an organic framework. We show that the insertion of this molecule between a pair of electrodes leads to a fully atomically engineered nano-metallic device with high conductance at the Fermi level and two sequential high on/off switching states. The origin of this property can be traced back to the existence of a HOMO which consists of two degenerate and asymmetric orbitals, lying close in energy to the Fermi level of the metallic leads. Their degeneracy is broken when the molecule is contacted to the leads, giving rise to two resonances which become pinned close to the Fermi level and display destructive interference. (Adapted with permission from J. Am. Chem. Soc., 2013, 135 (6), 2052. Copyright 2013 American Chemical Society.)
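As a back-of-the-envelope illustration of two near-degenerate resonances with opposite-parity lead couplings, here is a toy two-level Green's function model in the wide-band limit (plain NumPy). All parameters are invented, and this generic textbook construction merely stands in for, and is not, the first-principles transport calculation in the paper.

```python
import numpy as np

eps = np.array([-0.05, 0.05])            # split HOMO levels (eV), illustrative
H = np.diag(eps)

g = 0.02                                  # lead coupling strength (eV)
# The opposite relative sign of the right-lead coupling mimics orbitals
# of opposite parity, the ingredient behind destructive interference.
vL = np.array([1.0, 1.0]) * np.sqrt(g)
vR = np.array([1.0, -1.0]) * np.sqrt(g)
GammaL = np.outer(vL, vL)
GammaR = np.outer(vR, vR)

def transmission(E):
    Sigma = -0.5j * (GammaL + GammaR)     # wide-band lead self-energy
    G = np.linalg.inv(E * np.eye(2) - H - Sigma)
    return np.trace(GammaL @ G @ GammaR @ G.conj().T).real

E = np.linspace(-0.5, 0.5, 1000)
T = [transmission(e) for e in E]
# T(E) shows two near-unity resonances straddling E = 0 whose tails
# cancel away from the resonant window due to the opposite-sign couplings.
```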
The importance of high-throughput cell separation technologies for genomics/proteomics-based clinical diagnostics
Gene expression microarray analyses of mixtures of cells approximate a weighted average of the gene expression profiles (GEPs) of each cell type according to its relative abundance in the overall cell sample being analyzed. If the targeted subpopulation of cells is in the minority, or the expected perturbations are marginal, such changes will be masked by the GEP of the normal/unaffected cells. We show that the GEP of a minor cell subpopulation is often lost when that subpopulation makes up less than 30 percent of the sample, and is almost always masked by the other subpopulations when its frequency drops to 10 percent or less. On the basis of these results, one should assume that the GEP of a given cell subpopulation is probably seriously affected by the presence of significant numbers of other "contaminating" cell types. Several methodologies can be employed to enrich the target cells submitted for microarray analyses, including magnetic sorting and laser capture microdissection. If a cell subpopulation of interest is small, very high-throughput cell separation technologies are needed to separate enough cells for conventional microarrays. However, high-throughput flow cytometry/cell sorting overcomes many restrictions of experimental enrichment conditions. This technology can also be used to sort smaller numbers of cells of specific subpopulations and subsequently amplify their mRNAs before microarray analyses. When purification techniques are applied to unfixed samples, the potential for changes in gene levels during collection is an additional concern. Since RNA degrades rapidly, and specific mRNAs turn over in minutes or hours, the cell separation process must be very rapid. Hence, high-throughput cell separation (HTS) technologies are needed that can process the necessary number of cells expeditiously, avoiding such uncontrolled changes in the target cells' GEP. In cases where even HTS yields only a small number of cells, the mRNAs (after reverse transcription to cDNAs) must be amplified to yield enough material for conventional microarray analyses. The problem with using "microamplification" PCR methods to expand the amount of cDNA is that it is very difficult to amplify all of the mRNAs equally; unequal amplification distorts the gene expression profile on the microarray, and linear amplification is difficult to achieve. Unfortunately, present-day gene chips would need to be about 100 times more sensitive than they are now to enable many biologically and biomedically meaningful experiments and clinical tests.
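The masking argument reduces to a weighted average. The toy numbers below (invented purely for illustration) show how a 5-fold change in a minor subpopulation shrinks to a 1.4-fold change in the mixture at 10 percent abundance.

```python
import numpy as np

# Mixture expression is a weighted average of per-cell-type profiles.
gep_major = np.array([10.0, 10.0, 10.0])   # abundant, unaffected cells
gep_minor = np.array([10.0, 50.0, 2.0])    # rare subpopulation of interest

for f_minor in (0.50, 0.30, 0.10):
    mixture = f_minor * gep_minor + (1 - f_minor) * gep_major
    print(f_minor, mixture)
# At 10% abundance the 5-fold up-regulation of gene 2 (10 -> 50) appears
# as a mere 1.4-fold change in the mixture (10 -> 14): the minor GEP is masked.
```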
Biosensor-controlled gene therapy/drug delivery with nanoparticles for nanomedicine
Nanomedicine involves cell-by-cell regenerative medicine, either repairing cells one at a time or triggering apoptotic pathways in cells that are not repairable. Multilayered nanoparticle systems are being constructed for the targeted delivery of gene therapy to single cells. Cleavable shells containing targeting, biosensing, and gene therapeutic molecules are being constructed to direct nanoparticles to desired intracellular targets. Therapeutic gene sequences are controlled by biosensor-activated control switches to provide the proper amount of gene therapy on a single-cell basis. The central idea is to set up gene therapy "nanofactories" inside single living cells, with molecular biosensors linked to these genes controlling their expression. Gene delivery is started in response to a biosensor-detected problem and halted when the cell's response indicates that further gene therapy is not needed.
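In control terms, the biosensor-activated switch described here is threshold-based closed-loop delivery. The sketch below is schematic only; the function name and thresholds are hypothetical, not from the abstract.

```python
# Schematic closed-loop control of therapeutic gene expression: delivery
# starts when the biosensor reports a problem and halts once the cell's
# response is sufficient. All names and thresholds are hypothetical.
def control_step(biosensor_signal, expressing,
                 on_threshold=0.8, off_threshold=0.2):
    """Return whether the therapeutic gene should be expressed."""
    if not expressing and biosensor_signal > on_threshold:
        return True    # problem detected: switch gene expression on
    if expressing and biosensor_signal < off_threshold:
        return False   # cell response sufficient: switch it off
    return expressing  # hysteresis band: keep the current state
```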
Squamous cell carcinoma of the nasal cavity: A descriptive analysis of cases from the Head and Neck 5000 study
OBJECTIVES: This paper aims to provide contemporary epidemiological data on squamous cell carcinoma (SCC) of the nasal cavity, a rare type of head and neck cancer. DESIGN, SETTING & PARTICIPANTS: A descriptive analysis of people with nasal cavity SCC treated with curative intent from the Head and Neck 5000 study, a multicentre clinical cohort study of people from the UK with head and neck cancer. People with tumours of the nasopharynx, paranasal sinuses, and other sub-sites of the head and neck were excluded. MAIN OUTCOME MEASURES: Demographic data and treatment details are presented for all participants. The main outcomes were overall survival and survival according to categories of characteristics (e.g. smoker vs non-smoker); these were explored using Kaplan-Meier plots. RESULTS: Thirty people with nasal cavity SCC were included in the study, most of whom were male (67%) and current or ex-smokers (70%). The majority (70%) presented with early-stage (T1/2, N0) tumours. Cervical lymph node metastases at presentation were rare, occurring in only one person. Nine people (30%) died during the follow-up period. Worse survival outcomes were seen in people with moderate or severe co-morbidities. CONCLUSIONS: This paper provides epidemiological data on nasal cavity SCC in the UK. Patterns of disease and survival outcomes are described, identifying high-risk groups. Further studies should explore whether primary treatment modality alters survival.
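For readers unfamiliar with the method, the Kaplan-Meier estimation used for the survival analyses reduces to a product over event times. The sketch below implements the generic estimator on invented toy data, not the study's data.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate: S(t) = prod over event times t_i <= t
    of (1 - deaths_i / at_risk_i). Generic method; toy data below."""
    order = np.argsort(times)
    times = np.asarray(times)[order]
    events = np.asarray(events)[order]   # 1 = event observed, 0 = censored
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s *= 1 - deaths / at_risk
        surv.append((t, s))
    return surv

# Toy data: follow-up in months; event=1 means death observed.
print(kaplan_meier([5, 8, 12, 12, 20, 24], [1, 0, 1, 1, 0, 0]))
```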
The Reputational Consequences of Failed Replications and Wrongness Admission among Scientists
Scientists are dedicating more attention to replication efforts. While the scientific utility of replications is unquestionable, the impact of failed replication efforts and the discussions surrounding them deserve more attention. Specifically, debates about failed replications on social media have led some scientists to worry about their reputations. To gain data-informed insights into these issues, we collected data from 281 published scientists. First, in a scenario-based study, we assessed whether scientists overestimate the negative reputational effects of a failed replication. Second, we assessed the reputational consequences for an original scientist of admitting (versus not admitting) wrongness about an effect that has failed to replicate. Our data suggest that scientists overestimate the negative reputational impact of a hypothetical failed replication effort. We also show that admitting wrongness about a non-replicated finding is less harmful to one's reputation than not admitting it. Finally, we found tentative evidence that feelings about the replication movement can be affected by whether replication efforts are aimed at one's own work versus the work of another. Given these findings, we present potential ways forward in these discussions.
Unbiased Global Optimization of Lennard-Jones Clusters for N <= 201 by Conformational Space Annealing Method
We apply the conformational space annealing (CSA) method to Lennard-Jones clusters and find all known lowest-energy configurations up to 201 atoms, without using extra information about the problem such as the structures of the known global energy minima. In addition, the robustness of the algorithm with respect to the randomness of the initial conditions is demonstrated by ten successful independent runs up to 183 atoms. Our results indicate that the CSA method is a general and yet efficient global optimization algorithm applicable to many systems. (Physical Review Letters, in press.)
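For context, the objective CSA minimizes here is the total Lennard-Jones energy of the cluster, written below in reduced units with a sanity check against the dimer minimum. CSA itself (genetic-type moves combined with annealing in conformational space) is not reproduced in this sketch.

```python
import numpy as np

def lj_energy(coords):
    """Total Lennard-Jones energy in reduced units,
    E = sum_{i<j} 4 * (r_ij**-12 - r_ij**-6).
    This is the objective a global optimizer like CSA minimizes
    over atomic configurations."""
    coords = np.asarray(coords, dtype=float).reshape(-1, 3)
    diff = coords[:, None, :] - coords[None, :, :]
    r2 = np.sum(diff ** 2, axis=-1)
    iu = np.triu_indices(len(coords), k=1)   # each pair counted once
    inv6 = 1.0 / r2[iu] ** 3
    return np.sum(4.0 * (inv6 ** 2 - inv6))

# Sanity check: the LJ dimer minimum is E = -1 at separation 2**(1/6).
print(lj_energy([[0.0, 0.0, 0.0], [2 ** (1 / 6), 0.0, 0.0]]))
```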