Advanced fibre Bragg grating structures: design and application
This thesis presents experimental and computational work on a variety of advanced fibre Bragg grating structures, covering long dispersion-compensating chirped Bragg gratings, superstructured Bragg gratings for identical multiple-channel operation, Bragg gratings for pulse-shaping applications, and Bragg gratings for add-drop applications in high-bit-rate systems. The fabrication technique developed and analysed as part of this work has led to a number of experimental 'firsts', including metre-long Bragg gratings with dispersion characteristics designed to compensate linear and higher-order dispersion simultaneously. Upon transfer of this technology to our industrial partners, a number of field-trial experiments using gratings written with this technique have been performed successfully. Requirements identified by customers highlighted the importance of high-quality reflection and time-delay profiles and deepened our understanding of them. The flexibility provided by the developed fabrication technique has also enabled demonstrations of superstructured Bragg gratings for a number of exciting applications, such as multiple-channel filters obtained through a periodic sinc modulation of the refractive-index profile of a fibre Bragg grating, and reshaping of a soliton into a square pulse with applications in high-speed demultiplexing. Additionally, it is discussed how uniform apodised Bragg grating filters for dense WDM networks, despite their near-ideal spectral performance, suffer from nonlinear phase characteristics in the stop-band that could limit their use in high-bit-rate systems (10 Gbit/s and above). Linear-phase filters for dispersion-free filtering are proposed and demonstrated as a solution to this problem for bit rates up to 40 Gbit/s and channel spacings as narrow as 25 GHz.
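The periodic-sinc superstructure mentioned above can be illustrated with a toy Fourier sketch (my own model, not taken from the thesis): in the weak-grating (Born) limit, an FBG's reflection spectrum is approximately the Fourier transform of its index-modulation envelope along the fibre, so a sinc-shaped envelope yields a single flat-top channel, and sampling that envelope periodically replicates the channel into an evenly spaced multi-channel comb. The grid size, channel width, and sampling period below are arbitrary illustration values.

```python
import cmath
import math

def sinc(x):
    """Normalised sinc, sin(pi x) / (pi x)."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def dft_magnitude(signal):
    """Magnitude of the discrete Fourier transform (naive O(n^2) form)."""
    n = len(signal)
    return [abs(sum(s * cmath.exp(-2j * math.pi * k * t / n)
                    for t, s in enumerate(signal))) for k in range(n)]

n = 256            # points along the (toy) grating
zero_spacing = 32  # sinc zeros every 32 points -> channel roughly 8 bins wide
period = 8         # superstructure sampling period -> channels every n/period = 32 bins

envelope = [sinc((t - n // 2) / zero_spacing) for t in range(n)]
sampled = [e if t % period == 0 else 0.0 for t, e in enumerate(envelope)]

single = dft_magnitude(envelope)  # one flat-top channel centred on bin 0
multi = dft_magnitude(sampled)    # copies of that channel every 32 bins
```

In `multi`, strong responses appear at bins 0, 32, 64, ..., with little energy between them, which is the multi-channel comb behaviour the abstract describes.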
The Value 1 Problem Under Finite-memory Strategies for Concurrent Mean-payoff Games
We consider concurrent mean-payoff games, a very well-studied class of
two-player (player 1 vs player 2) zero-sum games on finite-state graphs where
every transition is assigned a reward between 0 and 1, and the payoff function
is the long-run average of the rewards. The value is the maximal expected
payoff that player 1 can guarantee against all strategies of player 2. We
consider the computation of the set of states with value 1 under finite-memory
strategies for player 1, and our main results for the problem are as follows:
(1) we present a polynomial-time algorithm; (2) we show that whenever there is
a finite-memory strategy, there is a stationary strategy that does not need
memory at all; and (3) we present an optimal bound (which is double
exponential) on the patience of stationary strategies, where the patience of a
distribution is the inverse of its smallest positive probability and serves as
a complexity measure of a stationary strategy.
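The notion of patience defined in the abstract above is simple to compute directly; here is a minimal sketch (the two-state strategy is a hypothetical example, not one from the paper):

```python
def patience(strategy):
    """Patience of a stationary strategy: the inverse of the smallest positive
    probability appearing in any of its distributions.

    `strategy` maps each state to a distribution, given as a dict move -> prob.
    """
    positive = [p for dist in strategy.values() for p in dist.values() if p > 0]
    return 1.0 / min(positive)

# Hypothetical strategy: at s0 play 'a' with prob 0.999 and 'b' with prob 0.001;
# at s1 play 'a' deterministically.
sigma = {"s0": {"a": 0.999, "b": 0.001}, "s1": {"a": 1.0}}
print(patience(sigma))  # 1 / 0.001 = 1000.0
```

A double-exponential bound on patience, as in result (3), means the smallest probability a stationary strategy may need can be as tiny as 1 over a doubly exponential quantity in the game size.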
Qualitative Analysis of Concurrent Mean-payoff Games
We consider concurrent games played by two players on a finite-state graph,
where in every round the players simultaneously choose a move, and the current
state along with the joint moves determine the successor state. We study a
fundamental objective, namely, mean-payoff objective, where a reward is
associated to each transition, and the goal of player 1 is to maximize the
long-run average of the rewards, and the objective of player 2 is strictly the
opposite. The path constraint for player 1 could be qualitative, i.e., the
mean-payoff is the maximal reward, or arbitrarily close to it; or quantitative,
i.e., a given threshold between the minimal and maximal reward. We consider the
computation of the almost-sure (resp. positive) winning sets, where player 1
can ensure that the path constraint is satisfied with probability 1 (resp.
positive probability). Our main results for qualitative path constraints are as
follows: (1) we establish qualitative determinacy results that show that for
every state either player 1 has a strategy to ensure almost-sure (resp.
positive) winning against all player-2 strategies, or player 2 has a spoiling
strategy to falsify almost-sure (resp. positive) winning against all player-1
strategies; (2) we present optimal strategy complexity results that precisely
characterize the classes of strategies required for almost-sure and positive
winning for both players; and (3) we present quadratic time algorithms to
compute the almost-sure and the positive winning sets, matching the best known
bound of algorithms for much simpler problems (such as reachability
objectives). For quantitative constraints we show that a polynomial time
solution for the almost-sure or the positive winning set would imply a solution
to a long-standing open problem (the value problem for turn-based deterministic
mean-payoff games) that is not known to be solvable in polynomial time.
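The mean-payoff objective described above can be made concrete with a small sketch: the value of a play is the long-run average of its transition rewards, and for an ultimately periodic play (a finite prefix followed by an infinitely repeated cycle) the prefix is amortised away, so the mean-payoff is just the cycle's average reward. The reward sequences below are hypothetical.

```python
def mean_payoff(cycle_rewards):
    """Mean-payoff of any play that eventually repeats this non-empty cycle.

    The long-run average over prefix + cycle^omega converges to the cycle's
    average, because the finite prefix contributes nothing in the limit.
    """
    return sum(cycle_rewards) / len(cycle_rewards)

def finite_average(rewards):
    """Average reward of a finite path, for comparison."""
    return sum(rewards) / len(rewards)

# A play earning reward 0 on a long prefix, then cycling through rewards 1, 1, 0:
prefix = [0.0] * 100
cycle = [1.0, 1.0, 0.0]
print(mean_payoff(cycle))                  # 2/3: the finite prefix is irrelevant
print(finite_average(prefix + cycle * 5))  # far lower, dragged down by the prefix
```

The qualitative path constraint in the abstract asks whether this limit can be made equal (or arbitrarily close) to the maximal reward with probability 1 or with positive probability.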
Short-wavelength transmission-loss suppression in fibre Bragg gratings
Fibre Bragg Gratings (FBGs) are known to suffer from short-wavelength transmission losses due to resonant coupling into backward-propagating cladding modes. Figure 1 shows a typical transmission spectrum of a 10 cm standard FBG. The cladding-mode losses increase with grating reflectivity and could eventually impose severe limitations on the use of FBGs. The problem can be quite acute when FBG wavelength multiplexing is required. So far, several attempts have been made to eliminate the short-wavelength transmission losses and improve grating performance. In all cases, the resonant coupling of the forward-propagating core mode to the backward-propagating cladding modes is minimised by reducing the coupling strength. In this paper, we report on a novel method for reducing cladding-mode transmission losses in standard FBGs. We show that short-wavelength transmission losses can be practically eliminated by damping the resonant excitation of the cladding modes. The damping is achieved by introducing a substantial propagation loss into the cladding modes; for maximum effect, the core mode should experience no extra propagation loss. By applying a thin lossy layer on the fibre cladding surface, a reduction of cladding-mode losses of about 12 dB was achieved.
Job Creation and Destruction over the Business Cycles and the Impact on Individual Job Flows in Denmark 1980-2001
Job creation and destruction should be considered key success or failure criteria of economic policy. Both are effects of economic policy, of the degree of outsourcing and insourcing, and of the ability to create new ideas that can be transformed into jobs. Job creation and destruction result from businesses attempting to maximize their economic outcome. One of the costs of this process is that employees have to move from destroyed jobs to created jobs. How this process unfolds probably depends on labor protection laws, habits, the educational system, and the unemployment insurance (UI) system as a whole. A flexible labor market ensures that scarce labor resources are used where they are most in demand; labor turnover is thus an essential factor in a well-functioning economy. This paper uses employer-employee data from the Danish registers of persons and workplaces to show where jobs have been destroyed and where they have been created over the last couple of business cycles. Jobs are in general destroyed and created simultaneously within each industry, but at the same time a major restructuring has taken place: jobs have been lost in Textile and Clothing, Manufacturing and the other “old industries”, while jobs have been created in Trade and Service industries, with outsourcing being one of the causes. This restructuring has put tremendous pressure on workers and their ability to find employment in expanding sectors. The paper shows how this has been accomplished and, in particular, what has happened to the employees involved: have they become unemployed, found employment in the welfare sector, or moved elsewhere?
Keywords: job creation and job destruction; turnover of personnel; duration of unemployment; impact of business cycles
Robust Draws in Balanced Knockout Tournaments
Balanced knockout tournaments are ubiquitous in sports competitions and are
also used in decision-making and elections. The traditional computational
question, that asks to compute a draw (optimal draw) that maximizes the winning
probability for a distinguished player, has received a lot of attention.
Previous works consider the problem where the pairwise winning probabilities
are known precisely; in contrast, we study how robust the winning probability
is with respect to small errors in those probabilities. First, we present
several illuminating examples to establish: (a) there exist deterministic
tournaments (where the pairwise winning probabilities are 0 or 1) in which one
optimal draw is much more robust than another; and (b) in general,
there exist tournaments with slightly suboptimal draws that are more robust
than all the optimal draws. The above examples motivate the study of the
computational problem of robust draws that guarantee a specified winning
probability. Second, we present a polynomial-time algorithm for approximating
the robustness of a draw for sufficiently small errors in the pairwise winning
probabilities, and show that the stated computational problem is NP-complete.
We also show that two natural cases of deterministic tournaments where the
optimal draw can be computed in polynomial time also admit polynomial-time
algorithms to compute robust optimal draws.
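Evaluating a draw in a balanced knockout tournament, the basic subroutine behind the questions above, is a standard bottom-up recursion over the bracket; here is a minimal sketch (the player names and probabilities are hypothetical, and matches are assumed independent):

```python
def win_probabilities(draw, p):
    """Return {player: probability of winning the tournament} for this draw.

    `draw` lists the 2^k players as the leaves of a balanced bracket;
    p[i][j] is the probability that player i beats player j.
    """
    # Each leaf player wins their singleton sub-bracket with probability 1.
    dists = [{player: 1.0} for player in draw]
    while len(dists) > 1:
        merged_round = []
        for left, right in zip(dists[::2], dists[1::2]):
            merged = {}
            # a survives iff a reaches this match and beats whoever reaches it
            # from the other side (summing over the opponent's distribution).
            for a, pa in left.items():
                merged[a] = pa * sum(pb * p[a][b] for b, pb in right.items())
            for b, pb in right.items():
                merged[b] = pb * sum(pa * p[b][a] for a, pa in left.items())
            merged_round.append(merged)
        dists = merged_round
    return dists[0]

# Four players; p[i][j] + p[j][i] = 1. Player 0 beats everyone with prob 0.9.
p = {
    0: {1: 0.9, 2: 0.9, 3: 0.9},
    1: {0: 0.1, 2: 0.5, 3: 0.5},
    2: {0: 0.1, 1: 0.5, 3: 0.5},
    3: {0: 0.1, 1: 0.5, 2: 0.5},
}
probs = win_probabilities([0, 1, 2, 3], p)
print(probs[0])  # ~0.81 = 0.9 * (0.5 * 0.9 + 0.5 * 0.9)
```

Robustness, in the sense of the abstract, asks how much this winning probability can drop when each entry of `p` is perturbed by a small error.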
Job Creation by Firms in Denmark
In this paper we look at job creation and destruction in firms. We answer the question of whether it is the large companies that create jobs while the smaller companies contribute much less, or whether it is the young companies that create jobs. And who destroys the most jobs? In the crisis, Denmark lost 186,000 jobs in the private sector; the question is where and how these jobs could be recreated, and whether these issues are specific to industries or universal. The data used are register data on workplaces and firms for the period 1980-2007. The base unit of the data is the workplace; the company (firm) is the legal entity. A company can have many sites, and one of the ways companies can grow is by expanding to multiple sites. This can happen through mergers and acquisitions, but also by creating "daughter workplaces". It is therefore essential to look at workplaces and firms at the same time. A complication here is that firms switch ID over time because of changes of ownership, mergers, and divisions, so the data must be corrected to ensure that these administrative issues do not affect the measured survival of firms. The data are used in a way that covers firm birth and firm death, spin-offs, and mergers. The analysis makes it possible to differentiate between net and gross creation of jobs because we can follow each individual in and out of jobs. For Denmark we have found that size on its own does not have a big impact, but young firms are much more likely to contribute to positive growth. For the U.S. it has been found that job growth comes from small businesses; a closer analysis, though, shows that the main factor is firm age. Thus, young firms create the most jobs on net, but they are also responsible for the most job destruction.
Keywords: job creation, job destruction, firm age, firm size, education, employer-employee data
All-optical pulse reshaping and retiming systems incorporating pulse shaping fiber Bragg grating
This paper demonstrates two optical pulse retiming and reshaping systems incorporating superstructured fiber Bragg gratings (SSFBGs) as pulse shaping elements. A rectangular switching window is implemented to avoid conversion of the timing jitter on the original data pulses into pulse amplitude noise at the output of a nonlinear optical switch. In the first configuration, the rectangular pulse generator is used at the (low-power) data input to a nonlinear optical loop mirror (NOLM) to retime an incident noisy data signal using a clean local clock signal to control the switch. In the second configuration, the authors further amplify the data signal and use it to switch a (low-power) clean local clock signal. The S-shaped nonlinear characteristic of the NOLM in this instance results in a reduction of both timing and amplitude jitter on the data signal. The underlying technologies required for the implementation of this technique are such that an upgrade of the scheme for the regeneration of ultrahigh-bit-rate signals at data rates in excess of 320 Gb/s should be achievable.
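The amplitude-jitter reduction from the S-shaped NOLM characteristic can be sketched with a toy model (an assumed sin^2 switching curve, not the paper's measured response): the transfer curve is flat near the full-switching power, so control-pulse amplitude jitter around that operating point is strongly compressed at the output. The switching power and jitter values below are arbitrary.

```python
import math

def nolm_transmission(p, p_sw):
    """Toy NOLM switching curve: fraction of light switched out for control
    power p, saturating (with zero slope) at the full-switching power p_sw."""
    return math.sin(math.pi * p / (2 * p_sw)) ** 2

p_sw = 1.0
# Control pulses with up to +/-10% amplitude jitter around the switching power:
jittered = [p_sw * (1 + d) for d in (-0.10, -0.05, 0.0, 0.05, 0.10)]
outputs = [nolm_transmission(p, p_sw) for p in jittered]

in_spread = (max(jittered) - min(jittered)) / p_sw  # 20% relative input jitter
out_spread = max(outputs) - min(outputs)            # a few percent at the output
print(in_spread, round(out_spread, 4))
```

Because the curve's slope vanishes at the plateau, a 20% input power spread maps to only a few percent spread in switched output, which is the amplitude-jitter suppression the abstract describes.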