A Simple Method to Estimate the Time-dependent ROC Curve Under Right Censoring
The time-dependent Receiver Operating Characteristic (ROC) curve is often used to study the diagnostic accuracy of a single continuous biomarker, measured at baseline, for the onset of a disease condition when the onset may occur at different times during follow-up and hence may be right censored. Due to censoring, the true disease onset status prior to the pre-specified time horizon may be unknown for some patients, which causes difficulty in calculating the time-dependent sensitivity and specificity. We study a simple method that adjusts for censoring by weighting the censored data by the conditional probability of disease onset prior to the time horizon, given the biomarker and the observed censoring time. Our numerical study shows that the proposed method produces unbiased and efficient estimators of the time-dependent sensitivity and specificity as well as the area under the ROC curve, and that it outperforms several other published methods currently implemented in R packages.
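A minimal sketch of this style of censoring adjustment, assuming the conditional probability of disease onset before the horizon is supplied by some working model; the function names, arguments, and data layout below are illustrative, not the authors' implementation:

```python
import numpy as np

def time_dependent_roc(marker, obs_time, event, horizon, cond_onset_prob, cutoffs):
    """Estimate time-dependent sensitivity/specificity at a fixed horizon.

    marker          : biomarker values measured at baseline
    obs_time, event : observed follow-up time and event indicator (1 = disease onset)
    horizon         : pre-specified time horizon t
    cond_onset_prob : callable (marker_i, censor_time_i) -> P(onset <= t | onset > censor_time, marker)
                      (an assumed, user-supplied estimator, e.g. from a working survival model)
    cutoffs         : biomarker thresholds at which to evaluate the ROC curve
    """
    marker = np.asarray(marker, float)
    obs_time = np.asarray(obs_time, float)
    event = np.asarray(event, int)

    # Weight toward "case" status: observed events before the horizon get weight 1,
    # subjects still event-free at the horizon get weight 0, and subjects censored
    # before the horizon get their conditional onset probability as a fractional weight.
    w_case = np.where((obs_time <= horizon) & (event == 1), 1.0, 0.0)
    censored_early = (obs_time <= horizon) & (event == 0)
    for i in np.where(censored_early)[0]:
        w_case[i] = cond_onset_prob(marker[i], obs_time[i])
    w_control = 1.0 - w_case  # complementary weight toward "control" status

    sens, spec = [], []
    for c in cutoffs:
        positive = marker > c
        sens.append(np.sum(w_case * positive) / np.sum(w_case))
        spec.append(np.sum(w_control * ~positive) / np.sum(w_control))
    return np.array(sens), np.array(spec)
```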
Deterministic versus probabilistic quantum information masking
We investigate quantum information masking for arbitrary dimensional quantum states. We show that mutually orthogonal quantum states can always serve for deterministic masking of quantum information. We further construct a probabilistic masking machine for linearly independent states. It is shown that a set of d-dimensional states can be probabilistically masked by a general unitary-reduction operation if they are linearly independent. The maximal success probability of probabilistic masking is analyzed and derived for the case of two initial states.
Comment: 5 pages, 1 figure
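As a small numerical illustration of the linear-independence condition quoted above (this only checks the condition on the input states, not the masking operation itself; the example vectors are arbitrary):

```python
import numpy as np

def linearly_independent(states, tol=1e-10):
    """Check the linear-independence condition for probabilistic masking.

    states : list of d-dimensional state vectors (complex arrays).
    Returns True when the matrix whose columns are the states has full
    column rank, i.e. when the set is linearly independent.
    """
    m = np.column_stack([np.asarray(s, complex) for s in states])
    return np.linalg.matrix_rank(m, tol=tol) == m.shape[1]

# Three mutually orthogonal qutrit basis states are (trivially) linearly
# independent, while adding a linear combination breaks the condition.
e0, e1, e2 = np.eye(3, dtype=complex)
print(linearly_independent([e0, e1, e2]))        # True
print(linearly_independent([e0, e1, e0 + e1]))   # False
```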
Norm Tweaking: High-performance Low-bit Quantization of Large Language Models
As the size of large language models (LLMs) continues to grow, model
compression without sacrificing accuracy has become a crucial challenge for
deployment. While some quantization methods, such as GPTQ, have made progress
in achieving acceptable 4-bit weight-only quantization, attempts at lower bit
quantization often result in severe performance degradation. In this paper, we
introduce a technique called norm tweaking, which can be used as a plugin in
current PTQ methods to achieve high precision while being cost-efficient. Our
approach is inspired by the observation that rectifying the quantized
activation distribution to match its float counterpart can readily restore
accuracy for LLMs. To achieve this, we carefully design a tweaking strategy
that includes calibration data generation and channel-wise distance constraint
to update the weights of normalization layers for better generalization. We
conduct extensive experiments on various datasets using several open-source
LLMs. Our method demonstrates significant improvements in both weight-only
quantization and joint quantization of weights and activations, surpassing
existing PTQ methods. On GLM-130B and OPT-66B, our method even achieves the
same level of accuracy at 2-bit quantization as their float counterparts. Our
simple and effective approach makes low-bit quantization more practical for
real-world applications.
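A minimal sketch of the norm-tweaking idea as described above, assuming a PyTorch model whose weights were already quantized by some PTQ method and which still runs a standard differentiable forward pass returning hidden states of shape (batch, seq, hidden); the calibration loop and the channel-wise distance below are simplified stand-ins, not the paper's exact procedure:

```python
import torch
import torch.nn as nn

def tweak_norm_layers(float_model, quant_model, calib_batches, lr=1e-4, steps=50):
    """Adjust only the normalization-layer parameters of a quantized model so that
    its activation distribution moves back toward the float model's.

    float_model   : original full-precision model (frozen, used as the reference)
    quant_model   : same architecture after PTQ weight quantization
    calib_batches : iterable of calibration input batches (assumed pre-generated)
    """
    # Collect LayerNorm parameters; everything else stays frozen.
    norm_params = []
    for module in quant_model.modules():
        if isinstance(module, nn.LayerNorm):
            norm_params.append(module.weight)
            if module.bias is not None:
                norm_params.append(module.bias)
    for p in quant_model.parameters():
        p.requires_grad_(False)
    for p in norm_params:
        p.requires_grad_(True)

    optimizer = torch.optim.Adam(norm_params, lr=lr)
    float_model.eval()
    quant_model.train()

    for step, batch in zip(range(steps), calib_batches):
        with torch.no_grad():
            ref_out = float_model(batch)  # float activations as the target
        out = quant_model(batch)
        # Channel-wise distance: match per-channel mean and variance of the
        # output hidden states (a simplified stand-in for the paper's constraint).
        loss = ((out.mean(dim=(0, 1)) - ref_out.mean(dim=(0, 1))) ** 2).sum() \
             + ((out.var(dim=(0, 1)) - ref_out.var(dim=(0, 1))) ** 2).sum()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Only the normalization-layer parameters receive gradient updates here, which keeps the cost far below full retraining and matches the cost-efficient, plugin-style use the abstract describes.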