AMCTD: Adaptive Mobility of Courier nodes in Threshold-optimized DBR Protocol for Underwater Wireless Sensor Networks
In dense underwater wireless sensor networks (UWSNs), the major challenges are high error probability, continual changes in the topology of sensor nodes, and high energy consumption for data transmission. However, UWSNs have some remarkable applications, such as monitoring of seabeds and oil reservoirs, deep-sea exploration, and prevention of aquatic disasters. To realize these applications, the limitations of acoustic communication, such as high delay and low bandwidth, cannot be ignored. In this paper, we propose Adaptive Mobility of Courier nodes in Threshold-optimized Depth-based routing (AMCTD), which explores efficient adjustments of the depth threshold and implements an optimal weight function to achieve longer network lifetime. We divide our scheme into three major phases: weight updating, depth-threshold variation, and adaptive mobility of courier nodes. During data forwarding, we provide a framework for altering the threshold to cope with sparse network conditions. Finally, we perform detailed simulations to evaluate the performance of the proposed scheme and compare it with two other notable routing protocols in terms of network lifetime and other essential parameters. The simulation results verify that our scheme outperforms the other techniques and performs close to optimal for UWSNs.
Comment: 8th International Conference on Broadband and Wireless Computing, Communication and Applications (BWCCA'13), Compiegne, France
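The core idea of depth-based routing with a threshold, as described in the abstract, can be sketched in a few lines. This is a minimal illustration, not the paper's actual formulation: the forwarding rule and the weight function below are illustrative assumptions (AMCTD's real weight function involves more terms, such as residual energy of neighbors and network density).

```python
def should_forward(sender_depth: float, receiver_depth: float,
                   depth_threshold: float) -> bool:
    """In depth-based routing, a candidate node forwards a packet only
    if it is shallower than the sender by at least the depth threshold,
    which suppresses redundant transmissions in dense regions."""
    return (sender_depth - receiver_depth) >= depth_threshold

def weight(residual_energy: float, depth: float) -> float:
    """Illustrative weight function: favor forwarders with more
    residual energy and smaller depth (closer to the surface sink)."""
    return residual_energy / (1.0 + depth)

# Example: a node at 80 m considers a neighbor at 50 m with a 20 m
# threshold; the 30 m depth improvement qualifies it as a forwarder.
print(should_forward(80.0, 50.0, 20.0))  # True
```

Lowering the threshold in sparse conditions, as the paper proposes, admits more candidate forwarders at the cost of more transmissions.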
Comparison of spatial domain optimal trade-off maximum average correlation height (OT-MACH) filter with scale invariant feature transform (SIFT) using images with poor contrast and large illumination gradient
A spatial domain optimal trade-off Maximum Average Correlation Height (OT-MACH) filter has previously been developed and shown to have advantages over frequency-domain implementations in that it can be made locally adaptive to spatial variations in the input image background clutter and normalised for local intensity changes. In this paper we compare the performance of the spatial domain (SPOT-MACH) filter to the widely applied data-driven technique known as the Scale Invariant Feature Transform (SIFT). The SPOT-MACH filter is shown to provide more robust recognition performance than the SIFT technique for demanding images, such as scenes containing large illumination gradients. The SIFT method depends on reliable local edge-based feature detection over large regions of the image plane, which is compromised in some of the demanding images we examined for this work. The disadvantage of the SPOT-MACH filter is its numerically intensive nature, since it is template based and implemented in the spatial domain. © (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
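The local intensity normalisation that gives the spatial-domain filter its robustness to illumination gradients can be illustrated with a simple locally normalised correlation. This is a sketch of the general principle only, assuming a plain correlation template; the actual OT-MACH filter is derived from a trade-off optimisation over training images, which is not reproduced here.

```python
import numpy as np

def local_normalized_correlation(image: np.ndarray,
                                 template: np.ndarray) -> np.ndarray:
    """Slide the template over the image, normalising each local patch
    for mean and energy, so the score is invariant to local brightness
    offsets and gain (the property that helps under illumination
    gradients). Returns a map of correlation scores in [-1, 1]."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = float(np.linalg.norm(t)) or 1.0
    ih, iw = image.shape
    out = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            p_norm = float(np.linalg.norm(p)) or 1.0
            out[y, x] = float((p * t).sum()) / (p_norm * t_norm)
    return out
```

The nested loops also show why the abstract calls the spatial-domain approach numerically intensive: cost grows with image area times template area, whereas frequency-domain correlation amortises this via the FFT at the price of losing local adaptivity.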
Collaborative Infrastructures for Mobilizing Intellectual Resources: assessing intellectual bandwidth in a knowledge intensive organization
The use of the intellectual assets of key professionals to provide customized goods and services is a key characteristic of knowledge-intensive organizations. While knowledge management efforts have become popular in organizations that depend on the knowledge and skills of their employees, it is unclear what the benefits of such efforts are and how these intellectual resources may actually create value for the organization. At the same time, vast information and communication technology infrastructures are being implemented to tap into these diverse intellectual resources, often to little effect. This paper uses the Intellectual Bandwidth Model, originally developed by Nunamaker et al. (2001), to investigate the extent to which collaborative technologies support the mobilization of intellectual resources to create value for an organization. Following an investigation of the intellectual bandwidth of a large multinational consulting company, this paper provides insight into the role of technology in mobilizing intellectual resources and offers implications for developing infrastructure to support core business processes.
Wake Forest University long-term follow-up of type 2 myocardial infarction: The Wake-Up T2MI Registry
BACKGROUND: The Wake-Up T2MI Registry is a retrospective cohort study investigating patients with type 2 myocardial infarction (T2MI), acute myocardial injury, and chronic myocardial injury. We aim to explore risk stratification strategies and investigate clinical characteristics, management, and short- and long-term outcomes in this high-risk, understudied population.
METHODS: From 1 January 2009 to 31 December 2010, 2846 patients were identified with T2MI or myocardial injury defined as elevated cardiac troponin I with at least one value above the 99th percentile upper reference limit and coefficient of variation of 10% (>40 ng/L) and meeting our inclusion criteria. Data of at least two serial troponin values will be collected from the electronic health records to differentiate between acute and chronic myocardial injury. The Fourth Universal Definition will be used to classify patients as having (a) T2MI, (b) acute myocardial injury, or (c) chronic myocardial injury during the index hospitalization. Long-term mortality data will be collected through data linkage with the National Death Index and North Carolina State Vital Statistics.
RESULTS: We have collected data for a total of 2205 patients as of November 2018. The mean age of the population was 65.6 +/- 16.9 years, 48% were men, and 64% were white. Common comorbidities included hypertension (71%), hyperlipidemia (35%), and diabetes mellitus (30%). At presentation, 40% were on aspirin, 38% on beta-blockers, and 30% on statins.
CONCLUSION: Improved characterization and profiling of this cohort may further efforts to identify evidence-based strategies to improve cardiovascular outcomes among patients with T2MI and myocardial injury.
Energy-Aware Cloud Management through Progressive SLA Specification
Novel energy-aware cloud management methods dynamically reallocate
computation across geographically distributed data centers to leverage regional
electricity price and temperature differences. As a result, a managed VM may
suffer occasional downtimes. Current cloud providers only offer high
availability VMs, without enough flexibility to apply such energy-aware
management. In this paper we show how to analyse past traces of dynamic cloud
management actions based on electricity prices and temperatures to estimate VM
availability and price values. We propose a novel SLA specification approach
for offering VMs with different availability and price values guaranteed over
multiple SLAs to enable flexible energy-aware cloud management. We determine
the optimal number of such SLAs as well as their availability and price
guaranteed values. We evaluate our approach in a user SLA selection simulation
using Wikipedia and Grid'5000 workloads. The results show higher customer
conversion and 39% average energy savings per VM.
Comment: 14 pages, conference
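The availability estimate at the heart of the proposed SLA specification can be sketched simply: given a trace of management-induced downtimes, availability is the fraction of the period the VM was up. This is a minimal illustration under assumed inputs; the trace format and figures below are hypothetical, and the paper's actual analysis additionally ties downtimes to electricity-price and temperature signals.

```python
def availability(total_hours: float, downtime_hours: list[float]) -> float:
    """Fraction of the trace period during which the VM was available,
    given the durations of each downtime event (e.g. live migrations
    triggered by energy-aware reallocation)."""
    return 1.0 - sum(downtime_hours) / total_hours

# Hypothetical month-long trace (720 h) with three short migration
# downtimes; the resulting availability could back a discounted SLA tier.
print(round(availability(720.0, [0.5, 1.0, 0.3]), 4))  # 0.9975
```

Computing such estimates per management policy is what lets a provider offer several SLA tiers, each with a guaranteed availability and a matching price.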
Statistics and UV-IR Mixing with Twisted Poincaré Invariance
We elaborate on the role of quantum statistics in twisted Poincaré invariant
theories. It is shown that, in order to have the twisted Poincaré group as the
symmetry of a quantum theory, statistics must be twisted. It is also confirmed
that the removal of UV-IR mixing (in the absence of gauge fields) in such
theories is a natural consequence.
Comment: 13 pages, LaTeX; typos corrected