A Brief Comparison Between Available Bio-printing Methods
The scarcity of organs for transplant has led to large waiting lists of very
sick patients. In drug development, the time required for human trials greatly
increases the time to market. Drug companies are searching for alternative
environments where the in-vivo conditions can be closely replicated. Both these
problems could be addressed by manufacturing artificial human tissue. Recently,
researchers in tissue engineering have developed tissue generation methods
based on 3-D printing to fabricate artificial human tissue. Broadly, these
methods can be classified as laser-assisted and laser-free. The former
offer very fine spatial resolution (tens of µm) but deposit relatively few
drops per second; the latter have coarser spatial resolution (hundreds of
µm) but can deposit drops at much higher rates. In this paper we review
state-of-the-art methods in each of these classes and provide a comparison
based on reported resolution, printing speed, cell density and cell
viability.
A Framework for Megascale Agent Based Model Simulations on Graphics Processing Units
Agent-based modeling is a technique for modeling dynamic systems from the bottom up. Individual elements of the system are represented computationally as agents. The system-level behaviors emerge from the micro-level interactions of the agents. Contemporary state-of-the-art agent-based modeling toolkits are essentially discrete-event simulators designed to execute serially on the Central Processing Unit (CPU). They simulate Agent-Based Models (ABMs) by executing agent actions one at a time. In addition to imposing an unnatural execution order, these toolkits have limited scalability. In this article, we investigate data-parallel computer architectures such as Graphics Processing Units (GPUs) to simulate large-scale ABMs. We have developed a series of efficient, data-parallel algorithms for handling environment updates, various agent interactions, agent death and replication, and gathering statistics. We present three fundamental innovations that provide unprecedented scalability. The first is a novel stochastic memory allocator which enables parallel agent replication in O(1) average time. The second is a technique for resolving precedence constraints for agent actions in parallel. The third is a method that uses specialized graphics hardware to gather and process statistical measures. These techniques have been implemented on a modern-day GPU, resulting in a substantial performance increase. We believe that our system is the first completely GPU-based agent simulation framework. Although GPUs are the focus of our current implementations, our techniques can easily be adapted to other data-parallel architectures. We have benchmarked our framework against contemporary toolkits using two popular ABMs, namely SugarScape and StupidModel.
Keywords: GPGPU, agent-based modeling, data-parallel algorithms, stochastic simulations
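As an illustration of the first innovation, the sketch below reconstructs the idea behind a stochastic memory allocator for parallel agent replication: each new agent proposes a random slot in the agent array and claims it only if the slot is free and it is that round's first claimant (on a GPU the claim would be an atomic compare-and-swap), so that when occupancy stays bounded away from full the expected number of rounds per agent is O(1). This is a hypothetical NumPy stand-in for the paper's GPU kernels; all names and sizes are assumed.

# Illustrative sketch, not the framework's actual code: stochastic slot
# allocation for agent replication, emulated serially with vectorised NumPy.
import numpy as np

def stochastic_allocate(free_mask, n_requests, rng):
    """Give each of n_requests new agents a distinct free slot index.

    free_mask : boolean array, True where an agent-array slot is unused.
    Assumes enough free slots exist and occupancy is well below 100%.
    """
    free_mask = free_mask.copy()
    n_slots = free_mask.size
    assigned = np.full(n_requests, -1, dtype=np.int64)
    pending = np.arange(n_requests)              # requests still without a slot

    while pending.size > 0:
        proposals = rng.integers(0, n_slots, size=pending.size)
        # A proposal wins if its slot is free and it is the first claimant of
        # that slot this round (np.unique reports first occurrences); on a GPU
        # this decision would be made by an atomic compare-and-swap.
        uniq, first_idx = np.unique(proposals, return_index=True)
        winners = np.zeros(pending.size, dtype=bool)
        winners[first_idx] = free_mask[uniq]

        assigned[pending[winners]] = proposals[winners]
        free_mask[proposals[winners]] = False
        pending = pending[~winners]              # losers retry next round

    return assigned

rng = np.random.default_rng(0)
free = np.ones(1024, dtype=bool)
free[:300] = False                               # 300 slots already occupied
slots = stochastic_allocate(free, 200, rng)
assert len(set(slots)) == 200                    # every new agent got its own slot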
On signalling over through-silicon via (TSV) interconnects in 3-D integrated circuits.
This paper discusses signal integrity (SI) issues and signalling techniques for Through-Silicon Via (TSV) interconnects in 3-D Integrated Circuits (ICs). Field-solver-extracted parasitics of TSVs have been employed in SPICE simulations to investigate the effect of each parasitic component on performance metrics such as delay and crosstalk, and to identify a reduced-order electrical model that captures all relevant effects. We show that in dense TSV structures voltage-mode (VM) signalling does not lend itself to achieving high data rates, and that current-mode (CM) signalling is more effective for high-throughput signalling as well as jitter reduction. Data rates, energy consumption and coupled noise for the different signalling modes are extracted.
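As a rough, first-order illustration of the delay metric above (not the paper's field-solver model), a voltage-mode TSV link can be approximated as a lumped RC section and its 50% delay estimated from the Elmore time constant; every parameter value below is an assumed placeholder rather than an extracted parasitic.

# Back-of-the-envelope sketch: Elmore delay of a voltage-mode driver -> TSV ->
# receiver chain using a lumped RC model. Values are illustrative assumptions.
R_drv = 500.0      # driver output resistance (ohm), assumed
R_tsv = 0.2        # TSV series resistance (ohm), assumed
C_tsv = 40e-15     # TSV capacitance to substrate (F), assumed
C_load = 10e-15    # receiver input capacitance (F), assumed

# The driver resistance charges all downstream capacitance; the TSV resistance
# sees roughly half its own (distributed) capacitance plus the load.
tau = R_drv * (C_tsv + C_load) + R_tsv * (C_tsv / 2 + C_load)
t50 = 0.693 * tau  # 50% crossing time of a single-pole step response (ln 2)

print(f"Elmore time constant: {tau * 1e12:.2f} ps, 50% delay: {t50 * 1e12:.2f} ps")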
Sparse and Low Rank Decomposition Based Batch Image Alignment for Speckle Reduction of Retinal OCT Images
Optical Coherence Tomography (OCT) is an emerging technique in the field of
biomedical imaging, with applications in ophthalmology, dermatology,
coronary imaging, etc. Due to the underlying physics, OCT images usually
suffer from a granular pattern, called speckle noise, which hampers
interpretation. Here, a sparse and low-rank decomposition based method is
used for speckle reduction in retinal OCT images. The input data consist of
several B-scans of the same location, which are batch-aligned using a
sparse and low-rank decomposition based technique. Finally, the denoised
image is created by median filtering of the low-rank component of the
processed data. Simultaneous decomposition and alignment of the images
results in better performance than the simple registration-based methods
used in the literature for noise reduction of OCT images.
Comment: Accepted for presentation at ISBI'1
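As a minimal sketch of the decomposition step only (the joint batch alignment of the actual method, which also recovers geometric transformations, is omitted), the snippet below runs robust PCA via the inexact augmented Lagrange multiplier method on a stack of co-located B-scans and median-filters the low-rank component; array shapes and parameters are illustrative assumptions, with NumPy/SciPy standing in for whatever toolchain the authors used.

# Sketch: sparse + low-rank decomposition (robust PCA) of a B-scan stack,
# followed by median filtering of the low-rank component.
import numpy as np
from scipy.ndimage import median_filter

def robust_pca(D, lam=None, tol=1e-6, max_iter=200):
    """Split D into a low-rank part L and a sparse part S with D ~ L + S."""
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_D = np.linalg.norm(D, 'fro')
    S = np.zeros_like(D)
    Y = np.zeros_like(D)
    mu = 1.25 / np.linalg.norm(D, 2)         # spectral norm of D
    rho, mu_max = 1.5, mu * 1e7

    for _ in range(max_iter):
        # Singular-value thresholding -> low-rank update
        U, s, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # Soft thresholding -> sparse update
        R = D - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual variable and penalty updates
        Z = D - L - S
        Y += mu * Z
        mu = min(rho * mu, mu_max)
        if np.linalg.norm(Z, 'fro') / norm_D < tol:
            break
    return L, S

# Synthetic stand-in for 8 aligned B-scans of size 64 x 64.
rng = np.random.default_rng(0)
clean = rng.random((64, 64))
scans = np.stack([clean + 0.3 * rng.random((64, 64)) for _ in range(8)])
D = scans.reshape(8, -1).T                   # one vectorised B-scan per column
L, S = robust_pca(D)
low_rank = L.T.reshape(8, 64, 64)
denoised = median_filter(low_rank.mean(axis=0), size=3)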
Minimal Walking Technicolor: Set Up for Collider Physics
Different theoretical and phenomenological aspects of the Minimal and
Nonminimal Walking Technicolor theories have recently been studied. The goal
here is to make the models ready for collider phenomenology. We do this by
constructing the low energy effective theory containing scalars, pseudoscalars,
vector mesons and other fields predicted by the minimal walking theory. We
construct their self-interactions and interactions with standard model fields.
Using the Weinberg sum rules, opportunely modified to take into account the
walking behavior of the underlying gauge theory, we find interesting relations
for the spin-one spectrum. We derive the electroweak parameters using the newly
constructed effective theory and compare the results with the underlying gauge
theory. Our analysis is sufficiently general such that the resulting model can
be used to represent a generic walking technicolor theory not at odds with
precision data.
Comment: 42 pages, 3 figures, RevTeX format
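For reference, the modified Weinberg sum rules mentioned above are commonly written as follows in the walking-technicolor literature, with F_V, F_A, M_V, M_A the vector and axial-vector decay constants and masses, F_pi the technipion decay constant, d(R) the dimension of the technifermion representation, and a an order-one coefficient parametrising the walking contribution; the precise coefficient used in the paper may differ.

% Schematic form of the first and second Weinberg sum rules; the second has a
% vanishing right-hand side in a running theory and is modified as below when
% the underlying dynamics is walking.
\begin{align}
  F_V^2 - F_A^2 &= F_\pi^2 , \\
  F_V^2 M_V^2 - F_A^2 M_A^2 &= a\,\frac{8\pi^2}{d(R)}\,F_\pi^4 .
\end{align}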
Renal Hyperfiltration and the Development of Microalbuminuria in Type 1 Diabetes
OBJECTIVE: The purpose of this study was to examine prospectively whether renal hyperfiltration is associated with the development of microalbuminuria in patients with type 1 diabetes, after taking into account known risk factors. RESEARCH DESIGN AND METHODS: The study group comprised 426 participants with normoalbuminuria from the First Joslin Kidney Study, followed for 15 years. Glomerular filtration rate was estimated by serum cystatin C, and hyperfiltration was defined as exceeding the 97.5th percentile of the sex-specific distribution of a similarly aged, nondiabetic population (134 and 149 ml/min per 1.73 m² for men and women, respectively). The outcome was time to microalbuminuria development (multiple albumin excretion rate >30 μg/min). Hazard ratios (HRs) for microalbuminuria were calculated at 5, 10, and 15 years. RESULTS: Renal hyperfiltration was present in 24% of the study group and did not increase the risk of developing microalbuminuria. The unadjusted HR for microalbuminuria comparing those with and without hyperfiltration at baseline was 0.8 (95% CI 0.4–1.7) during the first 5 years, 1.0 (0.6–1.7) during the first 10 years, and 0.8 (0.5–1.4) during 15 years of follow-up. The model adjusted for known baseline risk factors including A1C, age at diagnosis of diabetes, diabetes duration, and cigarette smoking resulted in similar HRs. In addition, incorporating changes in hyperfiltration status during follow-up had minimal impact on the HRs for microalbuminuria. CONCLUSIONS: Renal hyperfiltration does not have an impact on the development of microalbuminuria in type 1 diabetes during 5, 10, or 15 years of follow-up.
National Institutes of Health Grant (DK 041526)
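As a schematic illustration of how such adjusted hazard ratios are estimated (with synthetic data and the lifelines package, not the Joslin cohort; every variable name and value below is invented):

# Sketch: Cox proportional-hazards fit giving an adjusted hazard ratio for
# baseline hyperfiltration; data are synthetic and for illustration only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 426
df = pd.DataFrame({
    "hyperfiltration": rng.integers(0, 2, n),           # 0/1 baseline status
    "a1c": rng.normal(8.5, 1.2, n),                     # baseline A1C (%)
    "duration": rng.normal(10.0, 5.0, n).clip(1, 30),   # diabetes duration (years)
    "time": rng.exponential(12.0, n).clip(0.1, 15),     # years to event / censoring
})
df["event"] = (rng.random(n) < 0.3).astype(int)         # 1 = developed microalbuminuria

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")     # other columns are covariates
cph.print_summary()   # the exp(coef) column for 'hyperfiltration' is the adjusted HR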
