Distinction Between Inflection and Derivation of Learning Reduplication in Mandarin
Reduplication is a word-formation process in Mandarin and one of the most difficult topics for scholars and students to grasp. Theoretically, this research offers an approach that differs from previous work. Using M.D.S. Simatupang's free-context approach, it contrasts the reduplicative forms of all word classes (AA, AABB, ABAB, ABB) with their base forms (A, AB) and shows the relationships between them; then, applying the word-category test and the lexical-decomposition test proposed by J.W.M. Verhaar, it analyzes and explains derivational and inflectional reduplication in Mandarin so that students can understand the meanings of the reduplicated vocabulary. By examining derivational and inflectional reduplication in Mandarin together, this research also demonstrates the application of morphological theory. In addition, it discusses Mandarin reduplication across the various word classes that serve as bases for the relevant reduplicative forms. The results presented here are preliminary and are intended to stimulate more complete work; it would be better still if this research were disseminated as learning and reading material for future research
Exhaustive and Efficient Constraint Propagation: A Semi-Supervised Learning Perspective and Its Applications
This paper presents a novel pairwise constraint propagation approach by
decomposing the challenging constraint propagation problem into a set of
independent semi-supervised learning subproblems which can be solved in
quadratic time using label propagation based on k-nearest neighbor graphs.
Considering that this time cost is proportional to the number of all possible
pairwise constraints, our approach actually provides an efficient solution for
exhaustively propagating pairwise constraints throughout the entire dataset.
The resulting exhaustive set of propagated pairwise constraints is further
used to adjust the similarity matrix for constrained spectral clustering. Beyond
the traditional constraint propagation on single-source data, our approach
is also extended to more challenging constraint propagation on multi-source
data where each pairwise constraint is defined over a pair of data points from
different sources. This multi-source constraint propagation has an important
application to cross-modal multimedia retrieval. Extensive results have shown
the superior performance of our approach.
Comment: The short version of this paper appears as an oral paper in ECCV 201
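The subproblem solver named above (label propagation on a k-nearest-neighbor graph) can be sketched in a few lines. This is a minimal illustration of the generic technique, not the authors' implementation; the function names, the Gaussian affinity, and all parameter values are assumptions.

```python
import numpy as np

def knn_affinity(X, k=3, sigma=1.0):
    """Symmetric k-NN affinity matrix with a Gaussian kernel."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = len(X)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]          # k nearest, skipping self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma ** 2))
    return np.maximum(W, W.T)                      # symmetrize

def label_propagation(X, y, alpha=0.9, iters=200, k=3):
    """Diffuse partial labels y (+1/-1, 0 = unlabeled) over the k-NN graph."""
    W = knn_affinity(X, k)
    d = W.sum(1)
    S = W / np.sqrt(np.outer(d, d))                # symmetrically normalized affinity
    F = y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * y        # propagate, softly re-clamp seeds
    return np.sign(F)

# Two well-separated clusters, one labeled seed in each.
X = np.array([[0., 0.], [0.1, 0.], [0., 0.1],
              [5., 5.], [5.1, 5.], [5., 5.1]])
y = np.array([1, 0, 0, -1, 0, 0])
print(label_propagation(X, y))   # cluster 1 -> +1, cluster 2 -> -1
```

Each pairwise constraint induces one such two-class subproblem, which is what makes the per-subproblem cost quadratic and the overall propagation exhaustive.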
Testing Reactive Probabilistic Processes
We define a testing equivalence in the spirit of De Nicola and Hennessy for
reactive probabilistic processes, i.e. for processes where the internal
nondeterminism is due to random behaviour. We characterize the testing
equivalence in terms of ready-traces. From the characterization it follows that
the equivalence is insensitive to the exact moment in time in which an internal
probabilistic choice occurs, a property inherited from the original testing
equivalence of De Nicola and Hennessy. We also show decidability of the testing
equivalence for finite systems for which the complete model may not be known
Barriers and Best Practices for the Circular Economy
Introduction
We’re living in an exciting era. Rather than just another societal transition, we’re going
through a fundamental societal transformation. Ecologist Joanna Macy calls this period
'The Great Turning': a period in which we change from an industrial growth society into a
life-sustaining society. Macy: "The most remarkable feature of this historical moment on Earth
is not that we are on the way to destroying the world; we've actually been on the way for
quite a while. It is that we are beginning to wake up, as from a millennia-long sleep, to a
whole new relationship to our world, to ourselves and each other.” It is with these eyes that
we have to see the rise of the Circular Economy. The Circular Economy is not just another
trend in business; it’s the start of a completely new economic reality. The Circular Economy
is the starting point for regenerative economics; for a new business-as-usual that - first and
foremost - serves life and is based upon a fundamentally new value-paradigm. The future of
success in business is about doing good for all stakeholders and creating benefit; not just
profit.
The Circular Economy demands next level thinking-and-doing in business, and there is no
one more willing and able than the next generation of young professionals. It is therefore
with great pride and pleasure that I present to you this publication of the SMO Promovendi.
It offers fresh perspectives from a group of promising young scientists, all aspiring
changemakers. It's made with love and with the best of intentions: to help the Circular
Economy forward
Skill and self-knowledge: Empirical refutation of the dual-burden account of the Dunning–Kruger effect
For many intellectual tasks, the people with the least skill overestimate themselves the most, a pattern popularly known as the Dunning–Kruger effect (DKE). The dominant account of this effect depends on the idea that assessing the quality of one's performance (metacognition) requires the same mental resources as task performance itself (cognition). Unskilled people are said to suffer a dual burden: they lack the cognitive resources to perform well, and this deprives them of metacognitive insight into their failings. In this Registered Report, we applied recently developed methods for the measurement of metacognition to a matrix reasoning task, to test the dual-burden account. Metacognitive sensitivity (information exploited by metacognition) tracked performance closely, so less information was exploited by the metacognitive judgements of poor performers; but metacognitive efficiency (quality of metacognitive processing itself) was unrelated to performance. Metacognitive bias (overall tendency towards high or low confidence) was positively associated with performance, so poor performers were appropriately less confident—not more confident—than good performers. Crucially, these metacognitive factors did not cause the DKE pattern, which was driven overwhelmingly by performance scores. These results refute the dual-burden account and suggest that the classic DKE is a statistical regression artefact that tells us nothing much about metacognition
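The regression-artefact explanation endorsed above can be illustrated with a toy simulation: when performance and self-estimate are equally noisy measures of the same underlying skill, binning by measured score alone produces the classic DKE pattern. The distributions and noise levels below are invented for illustration and are not the Registered Report's data or model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
skill = rng.normal(50, 15, n)            # latent ability
score = skill + rng.normal(0, 10, n)     # noisy measured performance
estimate = skill + rng.normal(0, 10, n)  # equally noisy self-estimate (no dual burden)

# Bin by *measured* score quartile, as classic DKE analyses do.
quartile = np.digitize(score, np.quantile(score, [0.25, 0.5, 0.75]))
for q in range(4):
    m = quartile == q
    print(f"quartile {q + 1}: mean overestimate {(estimate[m] - score[m]).mean():+.1f}")
```

Because selecting the bottom score quartile selects for unlucky measurement noise, the bottom quartile "overestimates" and the top quartile "underestimates" even though miscalibration is identical across skill levels.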
Accelerating Multi-GPU Embedding Retrieval with PGAS-Style Communication for Deep Learning Recommendation Systems
In this paper, we propose using Partitioned Global Address Space (PGAS) GPU one-sided asynchronous small messages to replace the widely used collective communication calls for sparse input multi-GPU embedding retrieval in deep learning recommendation systems. This GPU PGAS communication approach achieves (1) better communication and computation overlap, (2) smoother network usage, and (3) reduced overhead (due to the data unpack and rearrangement steps associated with collective communication calls). We implement a CUDA embedding retrieval backend for PyTorch that supports the proposed PGAS communication scheme and evaluate it on deep learning recommendation inference passes. Our backend outperforms the baseline using NCCL collective calls, achieving 1.97x speedup for the weak scaling test and 2.63x speedup for the strong scaling test in a 4 GPU NVLink-connected system
Optimal Multi-Distribution Learning
Multi-distribution learning (MDL), which seeks to learn a shared model that
minimizes the worst-case risk across distinct data distributions, has
emerged as a unified framework in response to the evolving demand for
robustness, fairness, multi-group collaboration, etc. Achieving data-efficient
MDL necessitates adaptive sampling, also called on-demand sampling, throughout
the learning process. However, there exist substantial gaps between the
state-of-the-art upper and lower bounds on the optimal sample complexity.
Focusing on a hypothesis class of Vapnik-Chervonenkis (VC) dimension d, we
propose a novel algorithm that yields an ε-optimal randomized
hypothesis with a sample complexity on the order of (d+k)/ε² (modulo
some logarithmic factor), matching the best-known lower bound. Our algorithmic
ideas and theory are further extended to accommodate Rademacher classes. The
proposed algorithms are oracle-efficient, accessing the hypothesis class
solely through an empirical risk minimization oracle.
Additionally, we establish the necessity of randomization, revealing a large
sample size barrier when only deterministic hypotheses are permitted. These
findings resolve three open problems presented in COLT 2023 (Problems 1, 3
and 4 of Awasthi et al., 2023)
Azithromycin reduces bronchial wall thickening in infants with cystic fibrosis
Background: COMBAT-CF showed that children aged 0–3 years treated with azithromycin did clinically better than those on placebo, but there was no effect on CT scores. We reanalysed the CTs using an automatic bronchus–artery (BA) analysis. Method: Inspiratory and expiratory CTs at 12 and 36 months were analysed. The BA-analysis measures BA-dimensions: bronchial outer wall diameter (Bout), bronchial inner wall diameter (Bin), artery diameter (A), and bronchial wall thickness (Bwt), and computes BA-ratios: Bout/A and Bin/A for bronchial widening, and Bwt/A and Bwa/Boa (bronchial wall area/bronchial outer area) for bronchial wall thickening. Low-attenuation regions (LAR) were analysed using an automatic method. A mixed-effects model was used to compare BA-outcomes at 36 months between treatment groups. Results: 228 CTs (59 placebo; 66 azithromycin) were analysed. The azithromycin group had lower Bwa/Boa (p = 0.0034) and higher Bin/A (p = 0.001) relative to placebo. Bout/A (p = 0.0088) was higher because of a reduction in artery diameters, which correlated with a reduction in LAR. Conclusion: Azithromycin-treated infants with CF show a reduction in bronchial wall thickness and possibly a positive effect on lung perfusion
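The BA-ratios defined in the Method can be made concrete with a worked example for a single bronchus–artery pair. The diameter values below are invented for illustration; the study's software measures them automatically from CT.

```python
import math

# Hypothetical measurements for one bronchus-artery pair (mm).
Bout, Bin, A = 4.0, 2.8, 3.5

Bwt = (Bout - Bin) / 2                       # bronchial wall thickness
wall_area = math.pi * ((Bout / 2) ** 2 - (Bin / 2) ** 2)   # Bwa
outer_area = math.pi * (Bout / 2) ** 2                     # Boa

ratios = {
    "Bout/A": Bout / A,                 # bronchial widening (outer)
    "Bin/A": Bin / A,                   # bronchial widening (inner)
    "Bwt/A": Bwt / A,                   # wall thickening vs artery
    "Bwa/Boa": wall_area / outer_area,  # wall area fraction
}
for name, value in ratios.items():
    print(f"{name} = {value:.2f}")
```

In this example the wall occupies 51% of the outer bronchial area (Bwa/Boa = 0.51); a treatment that thins the wall lowers Bwa/Boa and raises Bin/A, which is the direction of the reported azithromycin effects.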
Prediction of overall survival for patients with metastatic castration-resistant prostate cancer: development of a prognostic model through a crowdsourced challenge with open clinical trial data
Background Improvements to prognostic models in metastatic castration-resistant prostate cancer have the potential to augment clinical trial design and guide treatment strategies. In partnership with Project Data Sphere, a not-for-profit initiative allowing data from cancer clinical trials to be shared broadly with researchers, we designed an open-data, crowdsourced, DREAM (Dialogue for Reverse Engineering Assessments and Methods) challenge to not only identify a better prognostic model for prediction of survival in patients with metastatic castration-resistant prostate cancer but also engage a community of international data scientists to study this disease. Methods Data from the comparator arms of four phase 3 clinical trials in first-line metastatic castration-resistant prostate cancer were obtained from Project Data Sphere, comprising 476 patients treated with docetaxel and prednisone from the ASCENT2 trial, 526 patients treated with docetaxel, prednisone, and placebo in the MAINSAIL trial, 598 patients treated with docetaxel, prednisone or prednisolone, and placebo in the VENICE trial, and 470 patients treated with docetaxel and placebo in the ENTHUSE 33 trial. Datasets consisting of more than 150 clinical variables were curated centrally, including demographics, laboratory values, medical history, lesion sites, and previous treatments. Data from ASCENT2, MAINSAIL, and VENICE were released publicly to be used as training data to predict the outcome of interest, namely overall survival. Clinical data were also released for ENTHUSE 33, but data for outcome variables (overall survival and event status) were hidden from the challenge participants so that ENTHUSE 33 could be used for independent validation. Methods were evaluated using the integrated time-dependent area under the curve (iAUC). The reference model, based on eight clinical variables and a penalised Cox proportional-hazards model, was used to compare method performance.
Further validation was done using data from a fifth trial, ENTHUSE M1, in which 266 patients with metastatic castration-resistant prostate cancer were treated with placebo alone. Findings 50 independent methods were developed to predict overall survival and were evaluated through the DREAM challenge. The top performer was based on an ensemble of penalised Cox regression models (ePCR), which uniquely identified predictive interaction effects with immune biomarkers and markers of hepatic and renal function. Overall, ePCR outperformed all other methods (iAUC 0.791; Bayes factor >5) and surpassed the reference model (iAUC 0.743; Bayes factor >20). Both the ePCR and reference models stratified patients in the ENTHUSE 33 trial into high-risk and low-risk groups with significantly different overall survival (ePCR: hazard ratio 3.32, 95% CI 2.39-4.62, p …). Interpretation Novel prognostic factors were delineated, and the assessment of 50 methods developed by independent international teams establishes a benchmark for the development of methods in the future. The results of this effort show that data sharing, when combined with a crowdsourced challenge, is a robust and powerful framework to develop new prognostic models in advanced prostate cancer
