Completion Time Reduction in Instantly Decodable Network Coding Through Decoding Delay Control
For several years, the completion time and decoding delay problems in
Instantly Decodable Network Coding (IDNC) were considered separately and were
thought to completely act against each other. Recently, some works aimed to
balance the effects of these two important IDNC metrics but none of them
studied a further optimization of one by controlling the other. In this paper,
we study the effect of controlling the decoding delay to reduce the completion
time below its currently best known solution. We first derive the
decoding-delay-dependent expressions of the users' and overall completion
times. Although using such expressions to find the optimal overall completion
time is NP-hard, we design a novel heuristic that minimizes the probability of
increasing the maximum of these decoding-delay-dependent completion time
expressions after each transmission through a layered control of their decoding
delays. Simulation results show that this new algorithm achieves both a lower
mean completion time and a lower mean decoding delay than the best known
heuristic for completion time reduction. The performance gap becomes
significant in harsh erasure scenarios.
Combinatorial optimization and metaheuristics
Today, combinatorial optimization is one of the youngest and most active areas of discrete mathematics. It is a branch of optimization in applied mathematics and computer science, related to operations research, algorithm theory and computational complexity theory, and it sits at the intersection of several fields, including artificial intelligence, mathematics and software engineering. Interest in it keeps growing because a large number of scientific and industrial problems can be formulated as abstract combinatorial optimization problems, through graphs and/or (integer) linear programs. Some of these problems admit polynomial-time (“efficient”) algorithms, while most of them are NP-hard, i.e. no polynomial-time algorithm is known for them and none is believed to exist. In practice, this means that an exact solution cannot be guaranteed within reasonable time, and one has to settle for an approximate solution with known performance guarantees. Indeed, the goal of approximate methods is to find, “quickly” (in reasonable run-times) and with “high” probability, provably “good” solutions (with low error relative to the true optimum). In the last 20 years, a new class of algorithms, commonly called metaheuristics, has emerged: they combine heuristics in high-level frameworks aimed at exploring the search space efficiently and effectively. This report briefly outlines the components, concepts, advantages and disadvantages of different metaheuristic approaches from a conceptual point of view, in order to analyze their similarities and differences. The two significant forces of intensification and diversification, which largely determine the behavior of a metaheuristic, will be pointed out. The report concludes by exploring the importance of hybridization and integration methods.
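The intensification/diversification interplay highlighted above can be illustrated with a toy example. The sketch below is a random-restart hill climber for the 0/1 knapsack problem (the instance, function names, and parameters are illustrative assumptions, not taken from the report): random restarts provide diversification, while greedy bit-flip improvement provides intensification.

```python
import random

def knapsack_local_search(values, weights, capacity,
                          restarts=20, steps=200, seed=0):
    """Random-restart hill climbing for 0/1 knapsack (illustrative sketch).

    Diversification: each restart begins from a fresh random solution.
    Intensification: within a restart, accept any bit flip that does
    not decrease the total value of a feasible solution.
    """
    rng = random.Random(seed)
    n = len(values)

    def evaluate(sol):
        # Return total value, or -1 if the weight limit is violated.
        if sum(w for w, s in zip(weights, sol) if s) > capacity:
            return -1
        return sum(v for v, s in zip(values, sol) if s)

    best_sol, best_val = None, -1
    for _ in range(restarts):                        # diversification
        sol = [rng.random() < 0.5 for _ in range(n)]
        # Repair: drop random items until the start point is feasible.
        while sum(w for w, s in zip(weights, sol) if s) > capacity:
            sol[rng.choice([i for i, s in enumerate(sol) if s])] = False
        for _ in range(steps):                       # intensification
            i = rng.randrange(n)
            neighbour = sol[:]
            neighbour[i] = not neighbour[i]
            if evaluate(neighbour) >= evaluate(sol):
                sol = neighbour
        val = evaluate(sol)
        if val > best_val:
            best_sol, best_val = sol, val
    return best_sol, best_val
```

A real metaheuristic (simulated annealing, tabu search, etc.) adds more principled rules for escaping local optima, but the two forces it balances are already visible in this loop.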
Frequency planning for clustered jointly processed cellular multiple access channel
Owing to limited resources, it is hard to guarantee minimum service levels to all users in conventional cellular systems. Although global cooperation of access points (APs) is considered promising, a practical means of enhancing the efficiency of cellular systems is to consider distributed or clustered jointly processed APs. The authors present a novel `quality of service (QoS) balancing scheme' to maximise the sum rate as well as achieve cell-based fairness for the clustered jointly processed cellular multiple access channel (referred to as CC-CMAC). A closed-form cell-level QoS balancing function is derived, and its maximisation is proved to be NP-hard. Hence, using power-frequency granularity, a modified genetic algorithm (GA) is proposed. For inter-site distances (ISD) < 500 m, results show that, with no fairness considered, the upper bound of the capacity region is achievable. Applying hard fairness constraints on users transmitting in a moderately dense AP system, a 20% reduction in sum-rate contribution increases fairness by up to 10%. The flexible QoS scheme can be applied in a GA-based centralised dynamic frequency planner architecture.
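As a rough illustration of the kind of evolutionary search such a scheme builds on, here is a generic textbook GA on bit strings (tournament selection, one-point crossover, bit-flip mutation). This is an assumed baseline sketch, not the authors' modified GA for power-frequency planning, and all names and parameters are illustrative.

```python
import random

def simple_ga(fitness, n_bits, pop_size=30, generations=60,
              p_mut=0.05, seed=1):
    """Generic GA sketch: maximise `fitness` over bit strings of
    length n_bits. Illustrative only."""
    rng = random.Random(seed)
    pop = [[rng.random() < 0.5 for _ in range(n_bits)]
           for _ in range(pop_size)]

    def tournament():
        # Binary tournament: the fitter of two random individuals wins.
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]            # one-point crossover
            child = [not g if rng.random() < p_mut else g
                     for g in child]               # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

In the paper's setting the bit string would encode a power-frequency allocation and the fitness would be the cell-level QoS balancing function; the modifications the authors make to this standard loop are not described in the abstract.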
User-Base Station Association in HetSNets: Complexity and Efficient Algorithms
This work considers the problem of user association to small-cell base
stations (SBSs) in a heterogeneous and small-cell network (HetSNet). Two
optimization problems are investigated, which are maximizing the set of
associated users to the SBSs (the unweighted problem) and maximizing the set of
weighted associated users to the SBSs (the weighted problem), under
signal-to-interference-plus-noise ratio (SINR) constraints. Both problems are
formulated as linear integer programs. The weighted problem is known to be
NP-hard and, in this paper, the unweighted problem is proved to be NP-hard as
well. Therefore, this paper develops two heuristic polynomial-time algorithms
to solve both problems. The computational complexity of the proposed
algorithms is evaluated and shown to be far lower than that of the optimal
brute-force (BF) algorithm. Moreover, the paper benchmarks the performance of
the proposed algorithms against the BF algorithm, the branch-and-bound (B&B)
algorithm and standard algorithms through numerical simulations. The results
demonstrate the close-to-optimal performance of the proposed algorithms. They
also show that the weighted problem can be solved to provide solutions that
are fair between users or to balance the load among SBSs.
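To make the unweighted association problem concrete, here is a hedged toy sketch. It assumes each user comes with a precomputed set of SBSs whose SINR constraint it satisfies and each SBS has a fixed user capacity, which is a simplification of the paper's model; the greedy rule (fewest-candidates-first, least-loaded SBS) is my own illustration, not one of the paper's two algorithms.

```python
def greedy_association(candidates, capacity):
    """Toy greedy for unweighted user-SBS association.

    candidates[u]: set of SBSs whose SINR constraint user u satisfies
                   (assumed precomputed).
    capacity[s]:   maximum number of users SBS s can serve.
    Returns a dict mapping each associated user to its SBS.
    """
    load = {s: 0 for s in capacity}
    assignment = {}
    # Handle the most constrained users first, a common matching heuristic.
    for u in sorted(candidates, key=lambda u: len(candidates[u])):
        feasible = [s for s in candidates[u] if load[s] < capacity[s]]
        if feasible:
            s = min(feasible, key=lambda s: load[s])  # least-loaded SBS
            assignment[u] = s
            load[s] += 1
    return assignment
```

A greedy like this runs in polynomial time but carries no optimality guarantee, which is exactly why the paper benchmarks its (different) heuristics against BF and B&B.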
Scheduling over Scenarios on Two Machines
We consider scheduling problems over scenarios where the goal is to find a
single assignment of the jobs to the machines which performs well over all
possible scenarios. Each scenario is a subset of jobs that must be executed in
that scenario and all scenarios are given explicitly. The two objectives that
we consider are minimizing the maximum makespan over all scenarios and
minimizing the sum of the makespans of all scenarios. For both versions, we
give several approximation algorithms and lower bounds on their
approximability. With this research into optimization problems over scenarios,
we have opened a new and rich field of interesting problems.
Comment: To appear in COCOON 2014. The final publication is available at link.springer.co
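The min-max objective can be made concrete with a small brute-force sketch. This enumerates all assignments of jobs to the two machines and is therefore exponential in the number of jobs; it is an illustration of the objective only, not one of the paper's approximation algorithms, and the instance below is invented.

```python
from itertools import product

def minmax_makespan_two_machines(jobs, scenarios):
    """Brute-force MinMax scheduling over scenarios on two machines.

    jobs:      list of processing times.
    scenarios: list of sets of job indices; only the jobs in a scenario
               run in that scenario, on the machine fixed by the single
               assignment chosen up front.
    Returns (assignment, objective) minimising the maximum makespan
    over all scenarios.
    """
    n = len(jobs)
    best_assign, best_obj = None, float("inf")
    for assign in product((0, 1), repeat=n):   # all 2^n assignments
        worst = 0
        for scen in scenarios:
            loads = [0, 0]
            for j in scen:
                loads[assign[j]] += jobs[j]
            worst = max(worst, max(loads))     # makespan of this scenario
        if worst < best_obj:
            best_assign, best_obj = assign, worst
    return best_assign, best_obj
```

Note that the single assignment must work for every scenario at once, which is what makes the problem harder than ordinary two-machine scheduling; the sum-of-makespans variant replaces the inner `max` over scenarios with a sum.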