Model-Independent Constraints on Lorentz Invariance Violation via the Cosmographic Approach
Since Lorentz invariance plays an important role in modern physics, it is of
interest to test the possible Lorentz invariance violation (LIV). The time-lag
(the arrival time delay between light curves in different energy bands) of
Gamma-ray bursts (GRBs) has been extensively used to this end. However, to
the best of our knowledge, one or more particular cosmological models were
assumed {\it a priori} in (almost) all of the relevant works in the
literature, which makes the results on LIV in those works model-dependent and
hence less robust. In the present work, we avoid this problem by using a
model-independent approach. We calculate the time delay induced by LIV with the
cosmic expansion history given in terms of cosmography, without assuming any
particular cosmological model. Then, we constrain the possible LIV with the
observational data, and find weak hints for LIV.
Comment: 15 pages, 4 figures, 3 tables, revtex4; v2: discussions added, Phys. Lett. B in press.
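For orientation, a minimal sketch of the relations at play, assuming the standard linear (n = 1) subluminal LIV dispersion of Jacob and Piran; E_QG, q_0 and j_0 below are generic symbols (the quantum-gravity energy scale and the usual deceleration and jerk parameters), not expressions or values taken from this paper:

```latex
% Sketch only: LIV-induced arrival-time delay between photons of
% energies E_h > E_l emitted together at redshift z (linear case):
\Delta t_{\rm LIV} = \frac{E_h - E_l}{H_0\, E_{\rm QG}}
  \int_0^{z} \frac{(1+z')\,\mathrm{d}z'}{E(z')} ,
\qquad E(z) \equiv \frac{H(z)}{H_0} .
% Cosmography replaces the model-dependent E(z) by its Taylor series
% about z = 0, so only kinematic parameters enter:
E(z) \simeq 1 + (1+q_0)\,z + \tfrac{1}{2}\bigl(j_0 - q_0^{2}\bigr)z^{2} + \cdots
```

The point of the cosmographic substitution is that no dark-energy model is assumed: any constraint on E_QG then depends only on the measured kinematic parameters (H_0, q_0, j_0).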
3,8-Dimethylquinazoline-2,4(1H,3H)-dione
In the title compound, C10H10N2O2, all non-H atoms are approximately co-planar, with an r.m.s. deviation of 0.016 Å. In the crystal, molecules are linked into inversion dimers by pairs of N—H⋯O hydrogen bonds. Chains along [010] are built up by π–π interactions [centroid–centroid distance = 3.602 (1) Å] between the benzene and pyrimidine rings of adjacent molecules.
Single machine scheduling with exponential time-dependent learning effect and past-sequence-dependent setup times
In this paper we consider the single machine scheduling problem with exponential time-dependent learning effect and past-sequence-dependent (p-s-d) setup times. By the exponential time-dependent learning effect, we mean that the processing time of a job is defined by an exponential function of the total normal processing time of the already processed jobs. The setup times are proportional to the length of the already processed jobs. We consider the following objective functions: the makespan, the total completion time, the sum of the quadratic job completion times, the total weighted completion time and the maximum lateness. We show that the makespan minimization problem, the total completion time minimization problem and the sum of the quadratic job completion times minimization problem can each be solved by the smallest (normal) processing time first (SPT) rule. We also show that the total weighted completion time minimization problem and the maximum lateness minimization problem can be solved in polynomial time under certain conditions.
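As a hedged illustration only (the abstract does not give the exact model), one common form in this literature takes the actual processing time of the job in position r as p_[r] · exp(−a · S), where S is the total normal processing time of the already processed jobs and a > 0 is a learning rate, with setup time b · S for b > 0. The function and parameter names below are illustrative assumptions, not the paper's definitions:

```python
import math

def spt_schedule(normal_times, a=0.01, b=0.1):
    """Sketch: single-machine SPT schedule under an assumed exponential
    time-dependent learning effect and p-s-d setup times.

    Assumed model (illustrative, not necessarily the paper's exact one):
      actual processing time in position r:  p_r * exp(-a * S)
      setup time before position r:          b * S
    where S = sum of normal processing times of already processed jobs.
    """
    seq = sorted(normal_times)        # SPT: smallest normal time first
    t = 0.0                           # current time (running makespan)
    processed = 0.0                   # S: normal time already processed
    completions = []
    for p in seq:
        t += b * processed            # p-s-d setup, proportional to S
        t += p * math.exp(-a * processed)  # learning shrinks actual time
        processed += p                # S grows by the *normal* time
        completions.append(t)
    return seq, completions

seq, C = spt_schedule([5.0, 2.0, 8.0, 3.0])
print("SPT order:", seq)
print("makespan:", round(C[-1], 3),
      "| total completion time:", round(sum(C), 3))
```

Under this kind of model, SPT is optimal for the makespan and total completion time because both the learning discount and the setup cost grow with the normal time already processed, so front-loading short jobs is never worse.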
Towards Verifiable Text Generation with Evolving Memory and Self-Reflection
Despite the remarkable ability of large language models (LLMs) in language
comprehension and generation, they often suffer from producing factually
incorrect information, also known as hallucination. A promising solution to
this issue is verifiable text generation, which prompts LLMs to generate
content with citations for accuracy verification. However, verifiable text
generation is non-trivial due to the focus-shifting phenomenon, the intricate
reasoning needed to align the claim with correct citations, and the dilemma
between the precision and breadth of retrieved documents. In this paper, we
present VTG, an innovative framework for Verifiable Text Generation with
evolving memory and self-reflection. VTG introduces evolving long short-term
memory to retain both valuable documents and recent documents. A two-tier
verifier equipped with an evidence finder is proposed to rethink and reflect on
the relationship between the claim and citations. Furthermore, active retrieval
and diverse query generation are utilized to enhance both the precision and
breadth of the retrieved documents. We conduct extensive experiments on five
datasets across three knowledge-intensive tasks and the results reveal that VTG
significantly outperforms baselines.
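A schematic sketch of the kind of loop the abstract describes; every helper name, the memory layout, and the verify-then-reflect step below are illustrative assumptions, not VTG's actual API:

```python
# Illustrative skeleton of a verifiable-generation loop with an
# evolving long short-term memory and a two-tier verifier.
# All helpers (generate_claim, coarse_verify, find_evidence, ...) are
# placeholder assumptions, not code from the VTG paper.

def vtg_sketch(question, llm, retriever, max_steps=8):
    long_term = []    # valuable documents retained across the answer
    short_term = []   # most recently retrieved documents
    answer = []       # list of (claim, citation) pairs
    for _ in range(max_steps):
        # Active retrieval with diverse queries for the next claim,
        # balancing precision and breadth of retrieved documents.
        queries = llm.diverse_queries(question, answer)
        short_term = [d for q in queries for d in retriever.search(q, k=3)]

        claim = llm.generate_claim(question, answer, long_term + short_term)
        if claim is None:                 # model signals completion
            break

        # Two-tier verification: a coarse citation check, then an
        # evidence finder that must locate the supporting span.
        citation = llm.coarse_verify(claim, long_term + short_term)
        evidence = llm.find_evidence(claim, citation) if citation else None
        if evidence is None:
            # Self-reflection: revise the claim rather than emit it.
            claim = llm.reflect_and_revise(claim, long_term + short_term)
            citation = llm.coarse_verify(claim, long_term + short_term)

        answer.append((claim, citation))
        # Evolve memory: promote documents that actually supported a claim.
        if citation is not None and citation not in long_term:
            long_term.append(citation)
    return answer
```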