Permutation Trellis Coded Multi-level FSK Signaling to Mitigate Primary User Interference in Cognitive Radio Networks
We employ Permutation Trellis Code (PTC) based multi-level Frequency Shift
Keying signaling to mitigate the impact of Primary Users (PUs) on the
performance of Secondary Users (SUs) in Cognitive Radio Networks (CRNs). The
PUs are assumed to be dynamic in that they appear intermittently and stay
active for an unknown duration. Our approach is based on the use of PTC
combined with multi-level FSK modulation so that an SU can improve its data
rate by increasing its transmission bandwidth while operating at low power and
not creating destructive interference for PUs. We evaluate system performance
by obtaining an approximation for the actual Bit Error Rate (BER) using
properties of the Viterbi decoder and carry out a thorough performance analysis
in terms of BER and throughput. The results show that the proposed coded system
achieves i) robustness by ensuring that SUs have stable throughput in the
presence of heavy PU interference and ii) improved resiliency of SU links to
interference in the presence of multiple dynamic PUs.
Comment: 30 pages, 12 figures
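The permutation-coding idea behind this abstract can be sketched with a toy example (illustrative only; the paper's actual PTC uses a convolutional trellis whose outputs are permutation codewords, decoded with the Viterbi algorithm, and the 4-tone codebook and minimum-distance decoder below are assumptions):

```python
# Toy sketch of permutation-coded M-ary FSK. Each data symbol maps to a
# permutation of the M tone indices, sent one tone per time slot, so a
# narrowband PU jamming a single tone corrupts at most one slot per codeword.

M = 4  # number of FSK tones (assumed for illustration)
# Rows of a Latin square: pairwise Hamming distance 4 between codewords.
codebook = [(0, 1, 2, 3), (1, 0, 3, 2), (2, 3, 0, 1), (3, 2, 1, 0)]

def encode(symbol):
    """Map a data symbol (0..3) to its tone sequence."""
    return codebook[symbol]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def decode(received):
    """Minimum-Hamming-distance decoding over the codebook (a stand-in
    for Viterbi decoding on the full trellis)."""
    return min(range(len(codebook)), key=lambda s: hamming(received, codebook[s]))

# A PU corrupts one slot: the third tone of symbol 1's codeword (1,0,3,2)
# is received as tone 0, yet the codeword is still decoded correctly.
assert decode((1, 0, 0, 2)) == 1
```

Because the codewords are permutations with large pairwise distance, a single jammed slot leaves the received sequence closer to the transmitted codeword than to any other, which is the robustness property the abstract claims.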
Composing features by managing inconsistent requirements
One approach to system development is to decompose the requirements into features and specify the individual features before composing them. A major limitation of deferring feature composition is that inconsistency between the solutions to individual features may not be uncovered early in the development, leading to unwanted feature interactions. Syntactic inconsistencies arising from the way software artefacts are described can be addressed by the use of explicit, shared, domain knowledge. However, behavioural inconsistencies are more challenging: they may occur within the requirements associated with two or more features as well as at the level of individual features. Whilst approaches exist that address behavioural inconsistencies at design time, these are over-restrictive in ruling out all possible conflicts and may weaken the requirements further than is desirable. In this paper, we present a lightweight approach to dealing with behavioural inconsistencies at run-time. Requirement Composition operators are introduced that specify a run-time prioritisation to be used on occurrence of a feature interaction. This prioritisation can be static or dynamic. Dynamic prioritisation favours one requirement according to a run-time criterion, for example, the extent to which it is already generating behaviour.
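The run-time prioritisation idea can be sketched as follows (the `Feature` class, operator names and the activity criterion are hypothetical illustrations, not taken from the paper):

```python
# Hypothetical sketch: when two features' requirements conflict at run-time,
# a Requirement Composition operator selects which one takes effect.

class Feature:
    def __init__(self, name, priority, action):
        self.name = name
        self.priority = priority  # static, design-time priority
        self.active_time = 0      # run-time criterion: how long the feature
                                  # has already been generating behaviour
        self.action = action

def compose_static(conflicting):
    """Static prioritisation: the highest fixed priority wins."""
    return max(conflicting, key=lambda f: f.priority)

def compose_dynamic(conflicting):
    """Dynamic prioritisation: favour the feature that has been
    generating behaviour for the longest time."""
    return max(conflicting, key=lambda f: f.active_time)

# Classic telephony interaction: call waiting vs. voicemail redirect.
call_waiting = Feature("call_waiting", priority=2, action="hold")
voicemail = Feature("voicemail", priority=1, action="redirect")
voicemail.active_time = 30  # voicemail has been handling the call for 30 s

assert compose_static([call_waiting, voicemail]).name == "call_waiting"
assert compose_dynamic([call_waiting, voicemail]).name == "voicemail"
```

The same conflict thus resolves differently under the two operators, which is exactly the flexibility the abstract attributes to run-time (rather than design-time) composition.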
Improving Receiver Performance of Diffusive Molecular Communication with Enzymes
This paper studies the mitigation of intersymbol interference in a diffusive
molecular communication system using enzymes that freely diffuse in the
propagation environment. The enzymes form reaction intermediates with
information molecules and then degrade them so that they cannot interfere with
future transmissions. A lower bound expression on the expected number of
molecules measured at the receiver is derived. A simple binary receiver
detection scheme is proposed where the number of observed molecules is sampled
at the time when the maximum number of molecules is expected. Insight is also
provided into the selection of an appropriate bit interval. The expected bit
error probability is derived as a function of the current and all previously
transmitted bits. Simulation results show the accuracy of the bit error
probability expression and the improvement in communication performance by
having active enzymes present.
Comment: 13 pages, 8 figures, 1 table. To appear in IEEE Transactions on Nanobioscience (submitted January 22, 2013; minor revision October 16, 2013; accepted December 4, 2013).
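The peak-sampling detection scheme can be illustrated for plain free diffusion without enzymes (the parameter values are assumptions; the closed-form fact used is that the concentration from an impulsive 3-D point release of N molecules peaks at t = d^2/(6D) at distance d):

```python
import math

# Sketch of the binary receiver: sample the molecule count at the time the
# expected number of molecules peaks, then compare against a threshold.
# Expected concentration of an impulsive release in free 3-D diffusion:
#   c(d, t) = N / (4*pi*D*t)**1.5 * exp(-d**2 / (4*D*t)),
# which is maximised at t_peak = d**2 / (6*D).

def expected_concentration(N, D, d, t):
    return N / (4 * math.pi * D * t) ** 1.5 * math.exp(-d * d / (4 * D * t))

def peak_time(D, d):
    return d * d / (6 * D)

def detect(sampled_count, threshold):
    """Binary detection: decide bit 1 if the sample meets the threshold."""
    return 1 if sampled_count >= threshold else 0

# Illustrative parameters (not the paper's): 1e4 molecules released,
# D = 1e-9 m^2/s, receiver at d = 1e-6 m.
N, D, d = 1e4, 1e-9, 1e-6
tp = peak_time(D, d)
# The concentration at t_peak dominates earlier and later sampling times:
assert expected_concentration(N, D, d, tp) >= expected_concentration(N, D, d, 0.5 * tp)
assert expected_concentration(N, D, d, tp) >= expected_concentration(N, D, d, 2.0 * tp)
```

Sampling at `tp` maximises the expected signal for the current bit; the enzyme mechanism in the paper then suppresses the residual tail that would otherwise interfere with subsequent bit intervals.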
Engineering calculations for communications systems planning
The single-entry interference problem is treated for frequency sharing between the broadcasting satellite and intersatellite services near 23 GHz. It is recommended that very long (more than 120° longitude difference) intersatellite hops be relegated to the unshared portion of the band. When this is done, it is found that suitable orbit assignments can be determined easily with the aid of a set of universal curves. An attempt to develop synthesis procedures for optimally assigning frequencies and orbital slots for the broadcasting satellite service in Region 2 was initiated. Several discrete programming and continuous optimization techniques are discussed.
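The 120° sharing rule summarized above can be expressed as a minimal check (the function names are illustrative; only the threshold itself comes from the summary):

```python
# Minimal sketch of the sharing rule: an intersatellite hop may use the
# shared band near 23 GHz only if the longitude difference between its two
# satellites is at most 120 degrees; longer hops go to the unshared portion.

def longitude_difference(lon_a, lon_b):
    """Smallest angular separation between two longitudes, in degrees."""
    diff = abs(lon_a - lon_b) % 360.0
    return min(diff, 360.0 - diff)

def assign_band(lon_a, lon_b, limit_deg=120.0):
    return "shared" if longitude_difference(lon_a, lon_b) <= limit_deg else "unshared"

assert assign_band(-30.0, 60.0) == "shared"     # 90-degree hop
assert assign_band(-100.0, 60.0) == "unshared"  # 160-degree hop
```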
Data centric trust evaluation and prediction framework for IOT
© 2017 ITU. Applying trust principles in the Internet of Things (IoT) enables more trustworthy services among the corresponding stakeholders. The most common method of assessing trust in IoT applications is to estimate the trust level of the end entities (entity-centric) relative to the trustor. In these systems, the trust level of the data is assumed to be the same as the trust level of the data source. However, most IoT-based systems are data-centric and operate in dynamic environments, which require immediate action without waiting for a trust report from end entities. We address this challenge by extending our previous proposals on trust establishment for entities, based on their reputation, experience and knowledge, to trust estimation for data items [1-3]. First, we present a hybrid trust framework for evaluating both data trust and entity trust, intended as a step towards standardization for a future data-driven society. The modules of the proposed framework, including data trust metric extraction, data trust aggregation, evaluation and prediction, are elaborated. Finally, a possible design model for implementing the proposed ideas is described.
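The hybrid evaluation step might look like the following sketch (the metric names, weights and blending factor are assumptions for illustration, not the framework's actual parameters):

```python
# Illustrative sketch: data trust is computed from data-centric metrics and
# blended with the trust of the reporting entity, which is itself aggregated
# from reputation, experience and knowledge.

def entity_trust(reputation, experience, knowledge, weights=(0.4, 0.3, 0.3)):
    """Weighted aggregation of the three entity-trust attributes, all in [0, 1]."""
    w_r, w_e, w_k = weights
    return w_r * reputation + w_e * experience + w_k * knowledge

def data_trust(metrics, source_trust, alpha=0.6):
    """Blend averaged data-centric metrics (e.g. consistency with nearby
    sensors, timeliness) with the data source's entity trust."""
    m = sum(metrics) / len(metrics)
    return alpha * m + (1 - alpha) * source_trust

src = entity_trust(reputation=0.9, experience=0.7, knowledge=0.8)
score = data_trust(metrics=[0.95, 0.85], source_trust=src)
assert 0.0 <= score <= 1.0
```

The point of the blend is the one the abstract makes: the data item carries its own trust score, usable immediately, rather than inheriting the source entity's score wholesale.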
INASUD project findings on integrated assessment of climate policies
This communication summarizes the main findings of INASUD, a Europe-wide research project on integrated assessment of climate policies. The project aimed at improving the framing of climate policy analysis through the parallel use of various existing integrated assessment models. It provides a comprehensive examination of the link between uncertainty regarding damages and inertia in economic systems. Results show that the Kyoto targets and timing are consistent with the precautionary principle but offer little insurance for longer-term climate protection. Flexibility mechanisms offer potential for cooperation with developing countries, and are necessary to tap the environmental and economic benefits of joint carbon and sulfur emissions abatement.
Keywords: integrated assessment modeling; climate policy; Kyoto protocol; dynamic consistency; double dividend; cooperation
Boosting Deep Open World Recognition by Clustering
While convolutional neural networks have brought significant advances in
robot vision, their ability is often limited to closed world scenarios, where
the number of semantic concepts to be recognized is determined by the available
training set. Since it is practically impossible to capture all possible
semantic concepts present in the real world in a single training set, we need
to break the closed world assumption, equipping our robot with the capability
to act in an open world. To provide such ability, a robot vision system should
be able to (i) identify whether an instance does not belong to the set of known
categories (i.e. open set recognition), and (ii) extend its knowledge to learn
new classes over time (i.e. incremental learning). In this work, we show how we
can boost the performance of deep open world recognition algorithms by means of
a new loss formulation enforcing a global to local clustering of class-specific
features. In particular, a first loss term, i.e. global clustering, forces the
network to map samples closer to the class centroid they belong to while the
second one, local clustering, shapes the representation space in such a way
that samples of the same class get closer in the representation space while
pushing away neighbours belonging to other classes. Moreover, we propose a
strategy to learn class-specific rejection thresholds, instead of heuristically
estimating a single global threshold, as in previous works. Experiments on
RGB-D Object and CORe50 datasets show the effectiveness of our approach.
Comment: IROS/RAL 202
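The global-to-local clustering idea can be sketched in simplified form (this is not the paper's exact loss; the softmax-over-centroid-distances global term and margin-based local term below are plausible stand-ins built only from the description above):

```python
import numpy as np

# Global term: pull each feature toward its class centroid via a softmax
# over negative centroid distances (closer centroid -> larger logit).
def global_clustering_loss(feats, labels, centroids):
    d = np.linalg.norm(feats[:, None, :] - centroids[None, :, :], axis=2)
    logits = -d
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(labels)), labels].mean()

# Local term: within each sample's k-nearest neighbourhood, pull same-class
# neighbours closer and push different-class neighbours past a margin.
def local_clustering_loss(feats, labels, k=2, margin=1.0):
    dists = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=2)
    loss, n = 0.0, 0
    for i in range(len(feats)):
        nbrs = np.argsort(dists[i])[1:k + 1]  # k nearest neighbours, not self
        for j in nbrs:
            if labels[j] == labels[i]:
                loss += dists[i, j]                    # attract same class
            else:
                loss += max(0.0, margin - dists[i, j])  # repel other classes
            n += 1
    return loss / n

feats = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
labels = np.array([0, 0, 1, 1])
centroids = np.array([[0.05, 0.0], [1.05, 1.0]])
assert global_clustering_loss(feats, labels, centroids) >= 0.0
assert local_clustering_loss(feats, labels) >= 0.0
```

In training, the two terms would be summed (possibly weighted) and minimised by gradient descent on the feature extractor; the class-specific rejection thresholds the abstract mentions would then be learned on the resulting class-centroid distances.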
Going beyond two degrees? The risks and opportunities of alternative options
Since the mid-1990s, the aim of keeping climate change within 2 °C has become firmly entrenched in policy discourses. In the past few years, the likelihood of achieving it has been increasingly called into question. The debate around what to do with a target that seems less and less achievable is, however, only just beginning. As the UN commences a two-year review of the 2 °C target, this article moves beyond the somewhat binary debates about whether or not it should or will be met, in order to analyse more fully some of the alternative options that have been identified but not fully explored in the existing literature. For the first time, uncertainties, risks, and opportunities associated with four such options are identified and synthesized from the literature. The analysis finds that the significant risks and uncertainties associated with some options may encourage decision makers to recommit to the 2 °C target as the least unattractive course of action