Managing Uncertainty: A Case for Probabilistic Grid Scheduling
Grid technology is evolving into a global, service-orientated
architecture, a universal platform for delivering future high demand
computational services. Strong adoption of the Grid and the utility computing
concept is leading to an increasing number of Grid installations running a wide
range of applications of different size and complexity. In this paper we
address the problem of delivering deadline/economy-based scheduling in a
heterogeneous application environment using statistical properties of jobs'
historical executions and their associated meta-data. This approach is motivated
by a study of six-month computational load generated by Grid applications in a
multi-purpose Grid cluster serving a community of twenty e-Science projects.
The observed job statistics, resource utilisation and user behaviour are
discussed in the context of management approaches and models most suitable for
supporting a probabilistic and autonomous scheduling architecture.
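The core idea of the abstract, using the distribution of past runtimes to make deadline-aware placement decisions, can be sketched as follows. This is a toy illustration, not the paper's architecture: the empirical-CDF estimator, the function names, and the runtime data are all assumptions.

```python
# Minimal sketch: estimate the probability that a job meets its deadline
# from the empirical distribution of historical runtimes of similar jobs,
# then pick the cheapest resource that is confident enough.

def deadline_success_probability(historical_runtimes, deadline):
    """Empirical P(runtime <= deadline) from past executions."""
    if not historical_runtimes:
        raise ValueError("no history available")
    met = sum(1 for t in historical_runtimes if t <= deadline)
    return met / len(historical_runtimes)

def pick_resource(histories_by_resource, deadline, required_confidence=0.9):
    """Choose the first (assumed cheapest) resource whose empirical
    success probability meets the required confidence."""
    for resource, history in histories_by_resource:
        if deadline_success_probability(history, deadline) >= required_confidence:
            return resource
    return None  # no resource is confident enough; renegotiate deadline/price

# Hypothetical histories (runtimes in minutes) for two clusters.
histories = [
    ("cheap_cluster", [55, 70, 62, 90, 85, 66, 72, 80, 95, 60]),
    ("fast_cluster", [30, 35, 28, 40, 33, 31, 29, 36, 38, 32]),
]
print(pick_resource(histories, deadline=45))
```

In an economy-based setting the same estimate would feed a price/deadline negotiation rather than a hard accept/reject decision.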
Optimality of Treating Interference as Noise: A Combinatorial Perspective
For single-antenna Gaussian interference channels, we re-formulate the
problem of determining the Generalized Degrees of Freedom (GDoF) region
achievable by treating interference as Gaussian noise (TIN) derived in [3] from
a combinatorial perspective. We show that the TIN power control problem can be
cast into an assignment problem, such that the globally optimal power
allocation variables can be obtained by well-known polynomial time algorithms.
Furthermore, the expression of the TIN-Achievable GDoF region (TINA region) can
be substantially simplified with the aid of maximum weighted matchings. We also
provide conditions under which the TINA region is a convex polytope that relax
those in [3]. For these new conditions, together with a channel connectivity
(i.e., interference topology) condition, we show TIN optimality for a new class
of interference networks that is not included, nor includes, the class found in
[3].
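The abstract's observation is that TIN power control reduces to an assignment problem, which polynomial-time algorithms solve exactly. As a self-contained sketch, here is a brute-force assignment solver over a small weight matrix; in practice one would use a polynomial-time method such as the Hungarian algorithm. The weight matrix is illustrative, not channel gains from [3].

```python
# Brute-force maximum-weight assignment: try every permutation and keep the
# best. Exponential, but fine for illustrating the problem structure that
# polynomial-time algorithms (e.g. Hungarian) exploit.
from itertools import permutations

def max_weight_assignment(weights):
    """Return (best_value, assignment) maximizing sum of weights[i][perm[i]]."""
    n = len(weights)
    best_value, best_perm = float("-inf"), None
    for perm in permutations(range(n)):
        value = sum(weights[i][perm[i]] for i in range(n))
        if value > best_value:
            best_value, best_perm = value, perm
    return best_value, best_perm

# Hypothetical 3x3 weight matrix standing in for per-link utilities.
weights = [
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
]
value, assignment = max_weight_assignment(weights)
print(value, assignment)
```

The maximum weighted matchings mentioned in the abstract generalize this to graphs where not every pairing is allowed.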
Building on the above insights, we consider the problem of joint link
scheduling and power control in wireless networks, which has been widely
studied as a basic physical layer mechanism for device-to-device (D2D)
communications. Inspired by the relaxed TIN channel strength condition as well
as the assignment-based power allocation, we propose a low-complexity
GDoF-based distributed link scheduling and power control mechanism (ITLinQ+)
that improves upon the ITLinQ scheme proposed in [4] and further improves over
the heuristic approach known as FlashLinQ. It is demonstrated by simulation
that ITLinQ+ provides significant average network throughput gains over both
ITLinQ and FlashLinQ, and yet still maintains the same level of implementation
complexity. More notably, the energy efficiency of the newly proposed ITLinQ+
is substantially larger than that of ITLinQ and FlashLinQ, which is desirable
for D2D networks formed by battery-powered devices.
Comment: A short version has been presented at the IEEE International Symposium
on Information Theory (ISIT 2015), Hong Kong.
Feedback and time are essential for the optimal control of computing systems
The performance, reliability, cost, size and energy usage of computing systems can be improved by one or more orders of magnitude by the systematic use of modern control and optimization methods. Computing systems rely on the use of feedback algorithms to schedule tasks, data and resources, but the models that are used to design these algorithms are validated using open-loop metrics. By using closed-loop metrics instead, such as the gap metric developed in the control community, it should be possible to develop improved scheduling algorithms and computing systems that have not been over-engineered. Furthermore, scheduling problems are most naturally formulated as constraint satisfaction or mathematical optimization problems, but these are seldom implemented using state-of-the-art numerical methods, nor do they explicitly take into account the fact that the scheduling problem itself takes time to solve. This paper makes the case that recent results in real-time model predictive control, where optimization problems are solved in order to control a process that evolves in time, are likely to form the basis of scheduling algorithms of the future. We therefore outline some of the research problems and opportunities that could arise by explicitly considering feedback and time when designing optimal scheduling algorithms for computing systems.
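The receding-horizon principle the abstract points to can be illustrated with a toy example: plan over a short horizon, apply only the first action, then re-plan with fresh state. Everything here is an assumption for illustration (the scalar queue model, the discrete action set, and the cost weights), not a scheduler from the paper.

```python
# Toy receding-horizon (MPC-style) controller for a scalar backlog model
#   x_{t+1} = max(0, x_t + arrivals_t - u_t)
# At each step, enumerate short action sequences, pick the cheapest plan,
# and apply only its first action.
from itertools import product

def mpc_step(x, arrivals_forecast, actions, horizon=3, backlog_w=1.0, effort_w=0.2):
    """Return the first action of the cheapest plan over the horizon."""
    best_cost, best_first = float("inf"), actions[0]
    for seq in product(actions, repeat=horizon):
        cost, state = 0.0, x
        for t, u in enumerate(seq):
            state = max(0.0, state + arrivals_forecast[t] - u)
            cost += backlog_w * state + effort_w * u
        if cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first

# Closed loop: re-plan every step with a constant arrival forecast.
x, trajectory = 5.0, []
forecast = [2.0, 2.0, 2.0]
for _ in range(4):
    u = mpc_step(x, forecast, actions=[0.0, 1.0, 2.0, 3.0])
    x = max(0.0, x + forecast[0] - u)
    trajectory.append((u, x))
print(trajectory)
```

The abstract's point about time-to-solve would enter here as a bound on how long `mpc_step` may run before its answer is stale.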
An integrated shipment planning and storage capacity decision under uncertainty: a simulation study
Purpose
– In transportation and distribution systems, the shipment decisions, fleet capacity, and storage capacity are interrelated in a complex way, especially when the authors take into account uncertainty of the demand rate and shipment lead time. While shipment planning is tactical or operational in nature, increasing storage capacity often requires top management’s authority. The purpose of this paper is to present a new method to integrate both operational and strategic decision parameters, namely shipment planning and storage capacity decision under uncertainty. The ultimate goal is to provide a near-optimal solution that strikes a balance between the total logistics costs and product availability, which is critical in maritime logistics of bulk shipment of commodity items.
Design/methodology/approach
– The authors use simulation as the research method. The authors develop a simulation model to investigate the effects of various factors on costs and service levels of a distribution system. The model mimics the transportation and distribution problems of bulk cement in a major cement company in Indonesia consisting of a silo at the port of origin, two silos at two ports of destination, and a number of ships that transport the bulk cement. The authors develop a number of “what-if” scenarios by varying the storage capacity at the port of origin as well as at the ports of destination, the number of ships operated, the operating hours of ports, and the dispatching rules for the ships. Each scenario is evaluated in terms of costs and service level. A full factorial experiment has been conducted and analysis of variance has been used to analyze the results.
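The "what-if" scenarios described above form a full factorial design: every combination of factor levels is simulated. A sketch of generating and evaluating such a design follows; the factor levels and the placeholder cost model are illustrative assumptions, not the paper's data.

```python
# Generate a full factorial design over hypothetical factors and evaluate
# each scenario with a placeholder evaluator. A real study would run the
# discrete-event simulation here, with replications per scenario.
from itertools import product

factors = {
    "num_ships": [2, 3, 4],
    "silo_capacity_kt": [20, 30],
    "port_hours": [16, 24],
    "dispatch_rule": ["fifo", "lowest_stock_first"],
}

def evaluate(scenario):
    """Placeholder for one simulation run: returns (total_cost, service_level).
    The formulas are arbitrary stand-ins for the simulation's outputs."""
    cost = 100 * scenario["num_ships"] + 2 * scenario["silo_capacity_kt"]
    service = min(1.0, 0.2 * scenario["num_ships"] + scenario["port_hours"] / 40)
    return cost, service

names = list(factors)
scenarios = [dict(zip(names, levels)) for levels in product(*factors.values())]
results = [(s, *evaluate(s)) for s in scenarios]
print(len(scenarios))  # 3 * 2 * 2 * 2 = 24 scenarios
```

The paper's efficient frontier analysis would then screen `results` for non-dominated (cost, service) pairs.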
Findings
– The results suggest that the number of ships deployed, silo capacity, working hours of ports, and the dispatching rules of ships significantly affect both total costs and service level. Interestingly, operating fewer ships enables the company to achieve almost the same service level and gain substantial cost savings if constraints in other parts of the system are alleviated, i.e., storage capacities and working hours of ports are extended.
Practical implications
– Cost is a competitive factor for bulk items like cement, and thus the proposed scenarios could be implemented by the company to substantially reduce the transportation and distribution costs. Alleviating the storage capacity constraint is obviously an idea that needs to be considered when optimizing shipment planning alone does not yield significant improvements.
Originality/value
– Existing research has so far focussed on the optimization of shipment planning/scheduling, and considers shipment planning/scheduling as the objective function while treating the storage capacity as a constraint. The simulation model enables “what-if” analyses to be performed and has overcome the difficulties and impracticalities of analytical methods, especially when the system incorporates stochastic variables exhibited in the case example. The use of efficient frontier analysis for analyzing the simulation results is a novel idea which has been proven to be effective in screening non-dominated solutions. This has provided the authors with near-optimal solutions to trade off logistics costs and service levels (availability), with minimal experimentation times.
Multiple Timescale Dispatch and Scheduling for Stochastic Reliability in Smart Grids with Wind Generation Integration
Integrating volatile renewable energy resources into the bulk power grid is
challenging, due to the reliability requirement that at each instant the load
and generation in the system remain balanced. In this study, we tackle this
challenge for smart grids with integrated wind generation by leveraging
multi-timescale dispatch and scheduling. Specifically, we consider smart grids
with two classes of energy users - traditional energy users and opportunistic
energy users (e.g., smart meters or smart appliances), and investigate pricing
and dispatch at two timescales, via day-ahead scheduling and real-time
scheduling. In day-ahead scheduling, with the statistical information on wind
generation and energy demands, we characterize the optimal procurement of the
energy supply and the day-ahead retail price for the traditional energy users;
in real-time scheduling, with the realization of wind generation and the load of
traditional energy users, we optimize real-time prices to manage the
opportunistic energy users so as to achieve system-wide reliability. More
specifically, when the opportunistic users are non-persistent, i.e., a subset
of them leave the power market when the real-time price is not acceptable, we
obtain closed-form solutions to the two-level scheduling problem. For the
persistent case, we treat the scheduling problem as a multi-timescale Markov
decision process. We show that it can be recast, explicitly, as a classic
Markov decision process with continuous state and action spaces, the solution
to which can be found via standard techniques. We conclude that the proposed
multi-scale dispatch and scheduling with real-time pricing can effectively
address the volatility and uncertainty of wind generation and energy demand,
and has the potential to improve the penetration of renewable energy into smart
grids.
Comment: Submitted to IEEE Infocom 2011. Contains 10 pages and 4 figures.
Replaces the previous arXiv submission (dated Aug-23-2010) with the same title.
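The two-timescale structure in this abstract can be illustrated with a toy day-ahead procurement: choose how much conventional energy to buy ahead of time, given only the wind distribution, with any real-time shortfall covered at a higher price. The prices, demand, and uniform wind model are illustrative assumptions, not the paper's formulation.

```python
# Day-ahead decision under wind uncertainty via Monte Carlo: minimize the
# sampled expected cost of day-ahead procurement plus real-time balancing.
import random

def expected_cost(procured, wind_samples, demand, day_ahead_price, realtime_price):
    """Average over wind scenarios: day-ahead energy cost plus real-time
    purchases for any shortfall once wind is realized."""
    total = 0.0
    for wind in wind_samples:
        shortfall = max(0.0, demand - wind - procured)
        total += day_ahead_price * procured + realtime_price * shortfall
    return total / len(wind_samples)

random.seed(0)
demand = 100.0
wind_samples = [random.uniform(10.0, 40.0) for _ in range(1000)]

best = min(
    range(0, 101, 5),  # candidate day-ahead procurement quantities
    key=lambda q: expected_cost(q, wind_samples, demand, 1.0, 3.0),
)
print(best)
```

The real-time stage of the paper goes further, setting prices to steer opportunistic users; here the real-time price is fixed to keep the sketch to the day-ahead tradeoff.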