Streaming algorithms for language recognition problems
We study the complexity of the following problems in the streaming model.
Membership testing for \DLIN: We show that every language in \DLIN\ can be
recognised by a randomized one-pass algorithm with inverse polynomial
one-sided error, and by a deterministic p-pass algorithm. We show that these
algorithms are optimal.
Membership testing for \LL: For languages generated by \LL(k) grammars
with a bound on the number of nonterminals at any stage in the left-most
derivation, we show that membership can be tested by a randomized one-pass
algorithm with inverse polynomial one-sided error.
Membership testing for \DCFL We show that randomized algorithms as efficient
as the ones described above for \DLIN\ and \LL(k) (which are subclasses of
\DCFL) cannot exist for all of \DCFL: there is a language in \VPL\ (a subclass
of \DCFL) for which any randomized p-pass algorithm with bounded error must
use a large amount of space.
Degree sequence problem: We study the problem of determining, given a
sequence of integers and a graph, whether the degree sequence of the graph
is precisely the given sequence. We give a randomized one-pass algorithm
with inverse polynomial one-sided error probability. We show that our
algorithms are optimal.
Our randomized algorithms are based on the recent work of Magniez et al.
\cite{MMN09}; our lower bounds are obtained by considering related
communication complexity problems.
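The randomized one-pass algorithms above rest on fingerprinting. As a loose illustration for the degree sequence problem, here is a minimal sketch of a one-pass check built on a random-evaluation (Schwartz-Zippel) multiset fingerprint. The names are mine, and the sketch stores all vertex degrees, so it is far less space-efficient than the paper's algorithm; it only illustrates where the inverse polynomial one-sided error comes from.

```python
import random
from collections import Counter

MERSENNE_P = (1 << 61) - 1  # a large prime modulus

def fingerprint(values, r, p=MERSENNE_P):
    # Evaluate prod_i (r - v_i) mod p.  Equal multisets give identical
    # polynomials, and two distinct degree-n polynomials agree at a
    # random point with probability at most n/p (Schwartz-Zippel).
    f = 1
    for v in values:
        f = f * (r - v) % p
    return f

def degrees_match(n, edge_stream, claimed, p=MERSENNE_P):
    # One pass over the edge stream, accumulating vertex degrees.
    # (Unlike this sketch, the paper's algorithm avoids O(n) storage.)
    deg = Counter()
    for u, v in edge_stream:
        deg[u] += 1
        deg[v] += 1
    actual = [deg[i] for i in range(n)]
    # One-sided error: a correct claimed sequence (as a multiset) is
    # always accepted; a wrong one is rejected except with
    # probability at most n/p.
    r = random.randrange(p)
    return fingerprint(actual, r, p) == fingerprint(claimed, r, p)
```

For a triangle on vertices {0, 1, 2}, `degrees_match(3, [(0, 1), (1, 2), (0, 2)], [2, 2, 2])` accepts, while a wrong sequence is rejected except with negligible probability.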
Tunneling density of states of fractional quantum Hall edges: an unconventional bosonization approach
An unconventional bosonization approach that employs a modified Fermi-Bose
correspondence is used to obtain the tunneling density of states (TDOS) of
fractional quantum Hall (FQHE) edges in the vicinity of a point contact. The
chiral Luttinger liquid model is generally used to describe FQHE edge
excitations. We introduce a bosonization procedure to study edge state
transport in Laughlin states at filling fraction $\nu = 1/m$ with $m$ odd (single edge
mode) in the presence of a point contact constriction that brings the top and
bottom edges of the sample into close proximity. The unconventional
bosonization involves modifying the Fermi-Bose correspondence to incorporate
backscattering at the point contact, leaving the action of the theory purely
quadratic even in the presence of the inhomogeneity. We have shown convincingly in
earlier works that this procedure correctly reproduces the most singular parts
of the Green functions of the system even when mutual forward scattering
between fermions is included. The most singular part of the density-density
correlation function (DDCF) relevant to TDOS calculation is computed using a
generating functional approach. The TDOS for both the electron tunneling as
well as the Laughlin quasiparticle tunneling cases is obtained and is found to
agree with previous results in the literature. For electron tunneling the
well-known universal power law for the TDOS, $\sim \omega^{m-1}$, and for
quasiparticle tunneling the power law $\sim \omega^{\frac{1}{m}-1}$, are both
correctly recovered using our unconventional bosonization scheme. This
demonstrates convincingly the utility of the present method, which, unlike
conventional approaches, does not treat the point contact as an afterthought
and yet remains solvable so long as only the most singular parts of the
correlation functions are desired.
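For reference, the two recovered exponents can be written side by side; a quick consistency check is that at $m = 1$ (the free-fermion case) both exponents vanish and the constant density of states of a Fermi liquid is recovered:

```latex
\rho_{\mathrm{el}}(\omega) \sim \omega^{\,m-1},
\qquad
\rho_{\mathrm{qp}}(\omega) \sim \omega^{\,\frac{1}{m}-1},
\qquad
\nu = \tfrac{1}{m},\ m \text{ odd}.
```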
Unconventional bosonization of chiral quantum wires coupled through a point-contact driven out of equilibrium
A non-chiral bosonization technique (NCBT), adapted to study chiral quantum
wires of non-interacting fermions coupled through a point contact with a
constant bias between the wires, is introduced and is shown to reproduce the
exact Green functions of this system, which were previously derived by the
present authors
analytically using standard methods. The tunneling I-V characteristics are
obtained using the bosonized Green functions. The proposed unconventional
bosonization scheme is also shown to be internally consistent as the four-point
functions evaluated using NCBT are shown to be related to the two-point
functions through Wick's theorem, as they should be. In equilibrium, the equal
space-time NCBT Green functions for an interacting Luttinger liquid with
impurities, obtained in a previous work, show universal scaling behaviour in
accordance with Bethe ansatz and functional renormalization group predictions.
We expect to obtain a similar scaling form of the tunneling properties out of
equilibrium in the presence of interparticle interactions.
DATA-DRIVEN CHARACTERIZATION OF TECHNICAL DEBT IN A COMPLEX INFORMATION SYSTEM
Presented herein are techniques that provide a holistic and integrated abstraction among different categories of technical debt (TD) in a complex software system, as well as among different TD-related data sources such as logs, traces, telemetry, and metrics. The techniques presented herein allow for accelerated, automated, and evolutionary TD management in a complex software development life cycle (SDLC). The techniques learn the context throughout the SDLC pipeline and turn this context into actionable insights for use in repaying the technical debt at the earliest stages of the development process. The techniques presented herein provide an automated and low-cost mechanism that may reduce debt within a company.
T-Transmuted X Family of Distributions
Using the quadratic rank transmutation map (QRTM) approach of Shaw and Buckley (2007) and the T-X family method of Alzaatreh et al. (2013b), we have developed a new family of distributions called the T-transmuted X family of distributions. Many existing families of distributions are sub-models of this family. As a special case, the exponential transmuted exponential (ETE) distribution is studied in detail. The application and flexibility of this new distribution are illustrated using two real data sets.
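As a small illustration of the QRTM ingredient of this construction (function names and parameter choices here are my own): the map sends a base CDF $F$ to $G(x) = (1+\lambda)F(x) - \lambda F(x)^2$ for $|\lambda| \le 1$, and with an exponential base CDF this yields a transmuted exponential distribution.

```python
import math

def qrtm(F, lam):
    # Quadratic rank transmutation map (Shaw & Buckley 2007):
    # G(x) = (1 + lam) * F(x) - lam * F(x)^2, valid for |lam| <= 1.
    if not -1.0 <= lam <= 1.0:
        raise ValueError("transmutation parameter must lie in [-1, 1]")
    return lambda x: (1.0 + lam) * F(x) - lam * F(x) ** 2

def exponential_cdf(rate):
    # CDF of an exponential distribution with the given rate.
    return lambda x: 1.0 - math.exp(-rate * x) if x > 0 else 0.0

# Transmuted exponential: plug the exponential CDF into the QRTM.
transmuted_exp = qrtm(exponential_cdf(1.0), lam=0.5)
```

Sanity checks: $\lambda = 0$ recovers the base distribution, and the transmuted function still satisfies $G(0) = 0$ and $G(x) \to 1$ as $x \to \infty$, as any CDF must.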
The Effect of Heat Treatment on the Dry Sliding Wear Behaviour of As-Cast and Grain-Refined and Modified, Gravity-Cast A356
ABSTRACT: In this study, Al-Si alloy A356, grain refined and modified using Al-5Ti-1B and Al-10Sr respectively, was cast in a pre-heated permanent mold via the liquid metallurgy route. The castings were further heat treated (T6) by solutionising at 540 °C, quenching in water at 70 °C, and ageing for 5 hours at 180 °C. They were tested for hardness and wear resistance as per the relevant ASTM standards. A marked rise in wear resistance was observed in grain-refined, modified and heat-treated A356 compared to as-cast A356 and grain-refined and modified A356. The improvement in wear resistance may be attributed to the refined microstructure, grain refinement and modification, and improved hardness due to heat treatment.
IDENTIFYING ENTERPRISE RISK BASED ON BUSINESS CONTEXT WITH THREAT INTELLIGENCE
Presented herein are techniques that facilitate prioritizing risk mitigation efforts for business-critical services and transactions through the incorporation of a business context into threat intelligence scoring. Under aspects of the presented techniques, traditional threat intelligence tools may be employed to evaluate the risk that is associated with an enterprise asset; the results of such an evaluation may then be augmented with an enterprise-assigned business value for the asset to derive the asset’s business risk; and such a business risk may be leveraged to prioritize risk mitigation efforts, may be combined with other business risks, etc. The above-described process may be referred to herein as Business Risk Management (BRM).
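The BRM flow described above can be sketched in a few lines. The disclosure does not commit to a specific formula for combining the threat score with the business value, so the product used below, along with all names and fields, is an assumed illustration.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    threat_score: float    # 0..1, from a traditional threat-intelligence tool
    business_value: float  # 0..1, enterprise-assigned criticality

def business_risk(asset: Asset) -> float:
    # Augment the raw threat score with the enterprise-assigned business
    # value; a simple product is one plausible combiner.
    return asset.threat_score * asset.business_value

def prioritize(assets):
    # Order mitigation work by descending business risk.
    return sorted(assets, key=business_risk, reverse=True)
```

Under this combiner, a moderately threatened but business-critical asset outranks a highly threatened but low-value one, which is exactly the reordering the business context is meant to produce.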
ASSESSING RISK INTRODUCED THROUGH A CODE CHANGE
Techniques are presented herein that shift the risk assessment focus during a software development process, away from the traditional end-of-process review (when a new feature is delivered, or an application is deployed) to earlier in the process when developers are actively at work. Such an approach allows a developer to assess the risk that a candidate software change is about to introduce prior to the developer committing that change, providing the developer with time (during the early portion of the process) to revisit the software and eliminate the identified risk. Aspects of the presented techniques leverage elements of a continuous integration (CI) and continuous deployment (CD) facility, the results that are available from existing unit and end-to-end tests, and the collection and analysis of OpenTelemetry (OTEL)-based metrics, events, logs, and traces (MELT) data to deliver security insights.
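A pre-commit risk score of the kind described above might aggregate normalized signals from the CI/CD facility, the test results, and the MELT data. The disclosure does not specify a scoring formula, so the weighted average and signal names below are assumptions for illustration only.

```python
def change_risk(signals, weights=None):
    # signals: indicators in [0, 1] gathered before a commit, e.g.
    # {"failing_tests": 0.2, "coverage_drop": 0.1, "melt_anomaly": 0.4}.
    # A weighted average is one plausible aggregation; unweighted
    # signals default to equal weight.
    if weights is None:
        weights = {name: 1.0 for name in signals}
    total = sum(weights[name] for name in signals)
    return sum(weights[name] * value for name, value in signals.items()) / total
```

A developer (or a CI gate) could then block or flag a commit whose score exceeds a chosen threshold, giving the early-process feedback the techniques aim for.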
The Diabetic Heart: Too Sweet for Its Own Good?
Diabetes mellitus is a major risk factor for ischemic heart disease (IHD). Patients with diabetes and IHD experience worse clinical outcomes, suggesting that the diabetic heart may be more susceptible to ischemia-reperfusion injury (IRI). In contrast, the animal data suggest that the diabetic heart may be either more, equally, or even less susceptible to IRI. The conflicting animal data may be due to the choice of diabetic and/or IRI animal model. Ischemic conditioning, a phenomenon in which the heart is protected against IRI by one or more brief nonlethal periods of ischemia and reperfusion, may provide a novel cardioprotective strategy for the diabetic heart. Whether the diabetic heart is amenable to ischemic conditioning remains to be determined using relevant animal models of IRI and/or diabetes. In this paper, we review the limitations of the current experimental models used to investigate IRI and cardioprotection in the diabetic heart.