Monolithic microwave integrated circuits: Interconnections and packaging considerations
Monolithic microwave integrated circuits (MMIC's) above 18 GHz were developed because of important potential system benefits in cost, reliability, reproducibility, and control of circuit parameters. The importance of interconnection and packaging techniques that do not compromise these MMIC virtues is emphasized. Currently available microwave transmission media are evaluated to determine their suitability for MMIC interconnections. The performance of an antipodal-finline microstrip-to-waveguide transition is presented. Packaging requirements for MMIC's are discussed in terms of the thermal, mechanical, and electrical parameters needed for optimum performance.
On Structural Parameterizations of Hitting Set: Hitting Paths in Graphs Using 2-SAT
Hitting Set is a classic problem in combinatorial optimization. Its input
consists of a set system F over a finite universe U and an integer t; the
question is whether there is a set of t elements that intersects every set in
F. The Hitting Set problem parameterized by the size of the solution is a
well-known W[2]-complete problem in parameterized complexity theory. In this
paper we investigate the complexity of Hitting Set under various structural
parameterizations of the input. Our starting point is the folklore result that
Hitting Set is polynomial-time solvable if there is a tree T on vertex set U
such that the sets in F induce connected subtrees of T. We consider the case
that there is a treelike graph with vertex set U such that the sets in F induce
connected subgraphs; the parameter of the problem is a measure of how treelike
the graph is. Our main positive result is an algorithm that, given a graph G
with cyclomatic number k, a collection P of simple paths in G, and an integer
t, determines in time 2^{5k} (|G| +|P|)^O(1) whether there is a vertex set of
size t that hits all paths in P. It is based on a connection to the 2-SAT
problem in multiple valued logic. For other parameterizations we derive
W[1]-hardness and para-NP-completeness results.
Comment: Presented at the 41st International Workshop on Graph-Theoretic
Concepts in Computer Science, WG 2015. (The statement of Lemma 4 was
corrected in this update.)
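As a concrete illustration of the Hitting Set problem itself (not of the paper's 2-SAT-based path-hitting algorithm), a brute-force decision procedure can be sketched in a few lines of Python; the function name and the small instance are ours:

```python
from itertools import combinations

def has_hitting_set(universe, sets, t):
    """Decide whether some t-element subset of `universe`
    intersects every set in `sets` (brute force, C(|U|, t) candidates)."""
    for cand in combinations(universe, t):
        cs = set(cand)
        if all(cs & s for s in sets):
            return True
    return False

F = [{1, 2}, {2, 3}, {1, 3}]
print(has_hitting_set({1, 2, 3}, F, 1))  # False: no single element hits all three sets
print(has_hitting_set({1, 2, 3}, F, 2))  # True: e.g. {1, 2}
```

The exponential dependence on t is expected here, since the problem is W[2]-complete for this parameter.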
Minimum Degree up to Local Complementation: Bounds, Parameterized Complexity, and Exact Algorithms
The local minimum degree of a graph is the minimum degree that can be reached
by means of local complementation. For any n, there exist graphs of order n
whose local minimum degree is at least 0.189n, or at least 0.110n when
restricted to bipartite graphs. Regarding the upper bound, we show that for any
graph of order n, its local minimum degree is at most 3n/8+o(n) and n/4+o(n)
for bipartite graphs, improving the known n/2 upper bound. We also prove that
the local minimum degree is smaller than half of the vertex cover number (up to
a logarithmic term). The local minimum degree problem is NP-complete and hard
to approximate. We show that this problem, even when restricted to bipartite
graphs, is in W[2] and is FPT-equivalent to the EvenSet problem, whose
W[1]-hardness is a long-standing open question. Finally, we show that the local
minimum degree can be computed by an O*(1.938^n)-time algorithm, and by an
O*(1.466^n)-time algorithm for bipartite graphs.
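The definition, the minimum degree reachable by local complementations, can be made concrete with a brute-force search over the graph's orbit; this is only a sketch of the definition (exponential in general, consistent with the NP-completeness above), with names of our choosing:

```python
from itertools import combinations

def local_complement(edges, v):
    """Local complementation at v: toggle every edge between two neighbours of v."""
    nbrs = {u for e in edges if v in e for u in e if u != v}
    new = set(edges)
    for a, b in combinations(sorted(nbrs), 2):
        new ^= {frozenset((a, b))}  # symmetric difference toggles the edge
    return frozenset(new)

def local_min_degree(vertices, edge_list):
    """Minimum degree reachable by local complementations: BFS over the
    (finite) orbit of the graph, tracking the smallest minimum degree seen."""
    start = frozenset(frozenset(e) for e in edge_list)
    seen, queue = {start}, [start]
    best = len(vertices)
    while queue:
        g = queue.pop()
        best = min(best, min(sum(1 for e in g if v in e) for v in vertices))
        for v in vertices:
            h = local_complement(g, v)
            if h not in seen:
                seen.add(h)
                queue.append(h)
    return best

# A triangle is locally equivalent to a path, so its local minimum degree is 1.
print(local_min_degree([1, 2, 3], [(1, 2), (2, 3), (1, 3)]))  # 1
```

The orbit is finite (a set of edge sets on a fixed vertex set), so the BFS terminates, but its size can be exponential in n.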
The parameterized complexity of some geometric problems in unbounded dimension
We study the parameterized complexity of the following fundamental geometric
problems with respect to the dimension d: i) Given n points in R^d,
compute their minimum enclosing cylinder. ii) Given two n-point sets in
R^d, decide whether they can be separated by two hyperplanes. iii) Given a
system of n linear inequalities with d variables, find a maximum-size
feasible subsystem. We show that (the decision versions of) all these problems
are W[1]-hard when parameterized by the dimension d, and hence not solvable
in O(f(d) n^c) time, for any computable function f and constant c
(unless FPT = W[1]). Our reductions also give an n^{Omega(d)}-time lower bound
(under the Exponential Time Hypothesis).
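For intuition on why the dimension is the source of hardness, problem iii) is easy when d is fixed. A sketch for d = 1, where each inequality a·x <= b is a closed half-line and an optimal point can always be taken at some boundary b/a (our illustration, not a construction from the paper):

```python
def max_feasible_subsystem_1d(ineqs):
    """ineqs: list of pairs (a, b) encoding a*x <= b with a != 0.
    Returns the largest number of inequalities a single x can satisfy.
    Each inequality is a closed half-line, so it suffices to test the
    boundary points b/a; this gives an O(n^2) exact algorithm for d = 1."""
    candidates = [b / a for a, b in ineqs]
    return max(sum(1 for a, b in ineqs if a * x <= b + 1e-9) for x in candidates)

# x <= 1, x >= 2 (written -x <= -2), x >= 0 (written -x <= 0):
# no single x satisfies all three, but any of several x satisfy two.
print(max_feasible_subsystem_1d([(1, 1), (-1, -2), (-1, 0)]))  # 2
```

The W[1]-hardness result says that no analogous d^O(1)-exponent algorithm is expected once d is part of the input.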
Simultaneous Water Vapor and Dry Air Optical Path Length Measurements and Compensation with the Large Binocular Telescope Interferometer
The Large Binocular Telescope Interferometer uses a near-infrared camera to
measure the optical path length variations between the two AO-corrected
apertures and provide high-angular resolution observations for all its science
channels (1.5-13 μm). There is, however, a wavelength-dependent component to
the atmospheric turbulence, which can introduce optical path length errors when
observing at a wavelength different from that of the fringe sensing camera.
Water vapor in particular is highly dispersive and its effect must be taken
into account for high-precision infrared interferometric observations as
described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this
paper, we describe the new sensing approach that has been developed at the LBT
to measure and monitor the optical path length fluctuations due to dry air and
water vapor separately. After reviewing the current performance of the system
for dry air seeing compensation, we present simultaneous H-, K-, and N-band
observations that illustrate the feasibility of our feedforward approach to
stabilize the path length fluctuations seen by the LBTI nuller.
Comment: SPIE conference proceeding.
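The idea of separating the dry-air and water-vapor contributions from simultaneous measurements in two bands reduces to a 2x2 linear solve; the sketch below is our own toy model, and the normalized refractivity values are placeholders, not calibrated atmospheric data:

```python
# Toy separation of dry-air and water-vapour optical path length from
# simultaneous measurements in two bands.  The normalised refractivities
# below are placeholders for illustration, not calibrated atmospheric values.
N_AIR = {"K": 1.00, "N": 0.99}  # dry air: nearly non-dispersive (assumed)
N_WV = {"K": 1.00, "N": 1.20}   # water vapour: strongly dispersive (assumed)

def separate_opd(opd_k, opd_n):
    """Solve opd_band = N_AIR[band] * L_air + N_WV[band] * L_wv
    for the two path lengths (a plain 2x2 linear system via Cramer's rule)."""
    a11, a12 = N_AIR["K"], N_WV["K"]
    a21, a22 = N_AIR["N"], N_WV["N"]
    det = a11 * a22 - a12 * a21
    l_air = (opd_k * a22 - opd_n * a12) / det
    l_wv = (opd_n * a11 - opd_k * a21) / det
    return l_air, l_wv

# Synthetic check: build band measurements from known path lengths, recover them.
L_AIR, L_WV = 3.0, 0.5  # arbitrary OPD contributions (same length units)
opd_k = N_AIR["K"] * L_AIR + N_WV["K"] * L_WV
opd_n = N_AIR["N"] * L_AIR + N_WV["N"] * L_WV
print(separate_opd(opd_k, opd_n))  # recovers (3.0, 0.5) up to rounding
```

The separation is well conditioned only because water vapor is much more dispersive than dry air, which is exactly the property the abstract exploits.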
Structurally Parameterized d-Scattered Set
In d-Scattered Set we are given an (edge-weighted) graph and are asked to
select at least k vertices, so that the distance between any pair is at least
d, thus generalizing Independent Set. We provide upper and lower bounds on
the complexity of this problem with respect to various standard graph
parameters. In particular, we show the following:
- For any d >= 2, an O*(d^tw)-time algorithm, where tw
is the treewidth of the input graph.
- A tight SETH-based lower bound matching this algorithm's performance. These
generalize known results for Independent Set.
- d-Scattered Set is W[1]-hard parameterized by vertex cover (for
edge-weighted graphs), or feedback vertex set (for unweighted graphs), even if
d is an additional parameter.
- A single-exponential algorithm parameterized by vertex cover for unweighted
graphs, complementing the above-mentioned hardness.
- A 2^{O(td^2)}-time algorithm parameterized by tree-depth
(td), as well as a matching ETH-based lower bound, both for
unweighted graphs.
We complement these mostly negative results by providing an FPT approximation
scheme parameterized by treewidth. In particular, we give an algorithm which,
for any error parameter epsilon > 0, runs in time (tw/epsilon)^{O(tw)} n^{O(1)}
and returns a d/(1+epsilon)-scattered set of size k, if a d-scattered set of the same
size exists.
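For unweighted graphs, the problem statement can be checked directly by brute force; this is an illustration of the definition, not of any of the parameterized algorithms above:

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, src):
    """Unweighted shortest-path distances from src."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def has_d_scattered_set(adj, k, d):
    """Brute force: is there a set of k vertices with pairwise distance >= d?"""
    dist = {u: bfs_dist(adj, u) for u in adj}
    for cand in combinations(adj, k):
        if all(dist[u].get(v, float("inf")) >= d
               for u, v in combinations(cand, 2)):
            return True
    return False

# Path 0-1-2-3-4: {0, 2, 4} is 2-scattered, but no 3 vertices are pairwise >= 3 apart.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(has_d_scattered_set(path, 3, 2))  # True
print(has_d_scattered_set(path, 3, 3))  # False
```

With d = 2 this is exactly the Independent Set check, matching the generalization noted in the abstract.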
Moderate Resolution Spectroscopy For The Space Infrared Telescope Facility (SIRTF)
A conceptual design for an infrared spectrometer capable of both low resolution (λ/Δλ = 50; 2.5-200 microns) and moderate resolution (λ/Δλ = 1000; 4-200 microns) has been developed. This facility instrument will permit the spectroscopic study in the infrared of objects ranging from within the solar system to distant galaxies. The spectroscopic capability provided by this instrument for SIRTF will give astronomers orders of magnitude greater sensitivity for the study of faint objects than had previously been available. The low resolution mode will enable detailed studies of the continuum radiation. The moderate resolution mode of the instrument will permit studies of a wide range of problems, from the infrared spectral signatures of small outer solar system bodies such as Pluto and the satellites of the giant planets, to investigations of more luminous active galaxies and QSOs at substantially greater distances. A simple design concept has been developed for the spectrometer which supports the science investigation with practical cryogenic engineering. Operational flexibility is preserved with a minimum number of mechanisms. The five modules share a common aperture, and all gratings share a single scan mechanism. High reliability is achieved through use of flight-proven hardware concepts and redundancy. The design controls the heat load into the SIRTF cryogen, with all heat sources other than the detectors operating at 7 K and isolated from the 4 K cold station. Two-dimensional area detector arrays are used in the 2.5-120 μm bands to simultaneously monitor adjacent regions in extended objects and to measure the background near point sources.
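The quoted resolving powers translate directly into resolution-element widths via Δλ = λ/R; a one-line helper (ours) makes the two modes concrete at a 10 μm observing wavelength:

```python
def resolution_element(wavelength_um, R):
    """Width of a spectral resolution element in microns: Δλ = λ / R."""
    return wavelength_um / R

print(resolution_element(10.0, 50))    # 0.2 μm per element in the low-resolution mode
print(resolution_element(10.0, 1000))  # 0.01 μm per element in the moderate-resolution mode
```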
On The Power of Tree Projections: Structural Tractability of Enumerating CSP Solutions
The problem of deciding whether CSP instances admit solutions has been deeply
studied in the literature, and several structural tractability results have
been derived so far. However, constraint satisfaction comes in practice as a
computation problem where the focus is either on finding one solution, or on
enumerating all solutions, possibly projected to some given set of output
variables. The paper investigates the structural tractability of the problem of
enumerating (possibly projected) solutions, where tractability means here
computable with polynomial delay (WPD), since in general exponentially many
solutions may be computed. A general framework based on the notion of tree
projection of hypergraphs is considered, which generalizes all known
decomposition methods. Tractability results have been obtained both for classes
of structures where output variables are part of their specification, and for
classes of structures where computability WPD must be ensured for any possible
set of output variables. These results are shown to be tight, by exhibiting
dichotomies for classes of structures having bounded arity and where the tree
decomposition method is considered.
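The classical tractable case behind such results, a binary CSP whose constraint graph is a tree, can be enumerated with polynomial delay after one bottom-up filtering pass. The sketch below is our own (not the paper's tree-projection framework) and shows why no backtracking occurs between consecutive outputs:

```python
def enumerate_tree_csp(domains, children, constraints, root):
    """Enumerate all solutions of a binary CSP whose constraint graph is a
    rooted tree, with polynomial delay: after one bottom-up filtering pass
    (directional arc consistency), every partial assignment extends to a
    full solution, so the search never dead-ends between two outputs.
    constraints[(p, c)](vp, vc) -> bool for each tree edge p -> c."""
    parent = {c: p for p, cs in children.items() for c in cs}

    # Preorder: every variable appears after its parent.
    order, stack = [], [root]
    while stack:
        v = stack.pop()
        order.append(v)
        stack.extend(children.get(v, []))

    # Bottom-up: keep a value for p only if every child has a compatible value.
    live = {v: list(domains[v]) for v in domains}
    for p in reversed(order):
        for c in children.get(p, []):
            live[p] = [vp for vp in live[p]
                       if any(constraints[(p, c)](vp, vc) for vc in live[c])]

    def extend(i, assignment):
        if i == len(order):
            yield dict(assignment)
            return
        v = order[i]
        for val in live[v]:
            if v == root or constraints[(parent[v], v)](assignment[parent[v]], val):
                assignment[v] = val
                yield from extend(i + 1, assignment)
                del assignment[v]

    yield from extend(0, {})

# Path x1 - x2 - x3 over {0, 1} with "not equal" on each edge: two solutions.
ne = lambda a, b: a != b
sols = list(enumerate_tree_csp(
    {"x1": [0, 1], "x2": [0, 1], "x3": [0, 1]},
    {"x1": ["x2"], "x2": ["x3"]},
    {("x1", "x2"): ne, ("x2", "x3"): ne},
    "x1"))
print(len(sols))  # 2
```

Exponentially many solutions may exist in total, which is why "tractable" here means polynomial delay between outputs rather than polynomial total time.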
Model for floodplain management in urbanizing areas
A target land use pattern found using a dynamic programming model is shown to be a useful reference for comparing the success of floodplain management policies. At least in the test case, there is interdependence in the land use allocation for floodplain management--that is, a good solution includes some reduction of current land use in the floodplain and some provision of detention storage.
For the test case, current floodplain management policies are not sufficient; some of the existing floodplain use should be removed. Although specific land use patterns are in part sensitive to potential error in land value data and to inaccuracy in the routing model, the general conclusion that some existing use must be removed is stable within the range of likely error. Trend surface analysis is shown to be a potentially useful way of generating bid price data for use in land use allocation models. Sensitivity analysis of the dynamic programming model with respect to routing of hydrographs is conducted through simulation based on expected distributions of error.
U.S. Geological Survey; U.S. Department of the Interior.
STS in management education: connecting theory and practice
This paper explores the value of science and technology studies (STS) to management education. The work draws on an ethnographic study of second-year management undergraduates studying decision making. The nature and delivery of the decision-making module is outlined and the value of STS is demonstrated in terms of both teaching method and module content. Three particular STS contributions are identified and described: the social construction of technological systems; actor-network theory; and ontological politics. Affordances and sensibilities are identified for each contribution and a discussion is developed that illustrates how these versions of STS are put to use in management education. It is concluded that STS has a pivotal role to play in critical management (education) and in the process offers opportunities for new forms of managing.
