On -stable locally checkable problems parameterized by mim-width
In this paper we continue the study of locally checkable problems under the
framework introduced by Bonomo-Braberman and Gonzalez in 2020, by focusing on
graphs of bounded mim-width. We study which restrictions on a locally checkable
problem are necessary in order to be able to solve it efficiently on graphs of
bounded mim-width. To this end, we introduce the concept of -stability of a
check function. The related locally checkable problems contain large classes of
problems, among which we can mention, for example, LCVP problems. We give an
algorithm showing that these problems are XP when parameterized by the
mim-width of a given binary decomposition tree of the input graph, that is,
that they can be solved in polynomial time given a binary decomposition tree of
bounded mim-width. We explore the relation between -stable locally checkable
problems and the recently introduced DN logic (Bergougnoux, Dreier and Jaffke,
2022), and show that both frameworks model the same family of problems. We
include a list of concrete examples of -stable locally checkable problems
whose complexity on graphs of bounded mim-width was previously open.
Comparing Platform Core Features with Third-Party Complements: Machine-Learning Evidence from Apple iOS
Software-based platforms have become omnipresent in both private and professional contexts. Platform owners constantly invest in platform evolution by updating the technological core and enriching its feature base. The question arises how such platform core feature changes can be compared with third-party complements. We investigate this question in an exploratory machine-learning-based case study on Apple’s mobile platform iOS. By analyzing the changes to iOS over time and developing an approach using natural language processing, we are able to identify functional overlaps between platform core features and complements. Our results suggest that platform core features are indeed functionally related to those of complementors and that the strategy of releasing novel platform core features changes over time. Besides, our approach enables us to assign platform core features to app categories. The analysis of functional overlaps raises relevant implications for research and practice.
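As a loose illustration of how natural language processing can surface such functional overlaps (the paper's actual pipeline is not detailed in the abstract, and all feature and app descriptions below are invented examples), a bag-of-words cosine similarity between a platform core feature description and third-party app descriptions might look like:

```python
# Hypothetical sketch: scoring functional overlap between a platform core
# feature and third-party app descriptions with bag-of-words cosine
# similarity. Texts and app names are illustrative assumptions only.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split a description into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def cosine_similarity(a, b):
    """Cosine similarity between two texts as term-count vectors (0..1)."""
    va, vb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

core_feature = "Track sleep schedule and bedtime with the Health app"
apps = {
    "SleepTracker": "Track your sleep schedule and improve your bedtime routine",
    "PhotoFun": "Apply filters and stickers to your photos",
}
# Higher scores suggest the core feature functionally overlaps that app.
overlaps = {name: cosine_similarity(core_feature, desc)
            for name, desc in apps.items()}
```

A real pipeline would likely use weighting (e.g. TF-IDF) and a similarity threshold, but the ranking idea is the same: apps scoring high against a newly released core feature are candidates for functional overlap.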
Finite Element Simulation of the process combination Hammering Turning
In the process combination of Hammering Turning, the workpiece surface is microtextured during machining by means of a multi-axis superimposed oscillation of the cutting tool. These microtextures are formed by plastic deformation of the surface layer in a process phase without material removal. In this paper, the formation of the microtexture in the originally three-dimensional process is investigated in simplified analogue 2D FEM simulations, which therefore represent only the processes and mechanisms directly below the cutting edge. The influence of the tool orientation on the plastic material flow and the formation of the microtexture, as well as on the residual stresses in the surface layer, is investigated.
Better turbulence spectra from velocity–azimuth display scanning wind lidar
Turbulent velocity spectra derived from velocity–azimuth display (VAD)
scanning wind lidars deviate from spectra derived from one-point measurements
due to averaging effects and cross-contamination among the velocity
components. This work presents two novel methods for minimizing these effects
through advanced raw data processing. The squeezing method is based on the
assumption of frozen turbulence and introduces a time delay into the raw data
processing in order to reduce cross-contamination. The two-beam method uses
only certain laser beams in the reconstruction of wind vector components to
overcome averaging along the measurement circle. Models are developed for
conventional VAD scanning and for both new data processing methods to predict
the spectra and identify systematic differences between the methods.
Numerical modeling and comparison with measurement data were both used to
assess the performance of the methods. We found that the squeezing method
reduces cross-contamination by eliminating the resonance effect caused by the
longitudinal separation of measurement points and also considerably reduces
the averaging along the measurement circle. The two-beam method eliminates this
averaging effect completely. The combined use of the squeezing and two-beam
methods substantially improves the ability of VAD scanning wind lidars to
measure in-wind (u) and vertical (w) fluctuations.
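The core of the squeezing method can be sketched, purely illustratively, under Taylor's frozen-turbulence hypothesis: a measurement point displaced by a longitudinal distance dx samples the same advected turbulence structure a time dt = dx/U later, so its radial-speed series is time-shifted before wind-vector reconstruction. The geometry, names, and numbers below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of the "squeezing" idea: shift a beam's time series by
# the advection delay dt = dx / U so that measurement points separated
# along the mean wind line up in time (Taylor's frozen turbulence).
import numpy as np

def squeeze_time_series(radial_speeds, times, dx, mean_wind_speed):
    """Shift a series by dt = dx / U and resample it on the common
    time grid via linear interpolation."""
    dt = dx / mean_wind_speed  # advection delay in seconds
    return np.interp(times, times - dt, radial_speeds)

# Toy example: a sinusoidal "gust" advected past two points 20 m apart.
U = 10.0                          # assumed mean wind speed (m/s)
t = np.linspace(0.0, 60.0, 601)   # 0.1 s sampling over one minute
upstream = np.sin(2 * np.pi * t / 10.0)
downstream = np.sin(2 * np.pi * (t - 20.0 / U) / 10.0)  # arrives 2 s later
# After squeezing, the downstream series realigns with the upstream one.
aligned = squeeze_time_series(downstream, t, 20.0, U)
```

Removing this longitudinal lag is what suppresses the resonance-like cross-contamination the abstract describes; the two-beam method instead avoids circle averaging by discarding all but selected beam directions before reconstruction.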
Low-noise quantum frequency conversion in a monolithic bulk ppKTP cavity
Interfacing the different building blocks of a future large scale quantum
network will demand efficient and noiseless frequency conversion of quantum
light. Nitrogen-vacancy (NV) centers in diamond are a leading candidate to form
the nodes of such a network. However, the performance of a suitable converter
remains a bottleneck, with existing demonstrations severely limited by
parasitic noise arising at the target telecom wavelength. Here, we demonstrate
a new platform for efficient low-noise quantum frequency conversion based on a
monolithic bulk ppKTP cavity and show its suitability for the conversion of 637
nm single photons from NV centers in diamond to telecommunication wavelengths.
By resonantly enhancing the power of an off-the-shelf pump laser, we achieve an
internal conversion efficiency of while generating only
(110 ± 4) kHz/nm noise at the target wavelength without the need for
any active stabilization. This constitutes a 5-fold improvement in noise over
existing state-of-the-art single-step converters at this wavelength. We verify
the almost ideal preservation of non-classical correlations by converting
photons from a spontaneous parametric down-conversion source and moreover show
the preservation of time-energy entanglement via Franson interferometry.