Focus on Prenatal Detection of Micrognathia
Fetal micrognathia involves abnormal or arrested development of the fetal mandible. Until recently, the prenatal diagnosis was subjective, based on evaluation of the fetal profile and assessment of the relationship between the maxilla and the mandible. Recently, objective sonographic methods have been introduced for diagnosing micrognathia, such as the inferior facial angle, the jaw index, the frontal nasomental angle, the mandible width/maxilla width ratio, and the mandibular length. Another useful sonographic sign, the mandibular gap in the retronasal triangle view, increases the accuracy of the diagnosis early in the first trimester. 3D sonographic views can add to the diagnosis, and prenatal MRI is a useful adjunct to ultrasound in cases of a limited acoustic window, maternal obesity, oligohydramnios, and anterior spine position. The identification of micrognathia should prompt karyotyping and sonographic investigation for other abnormalities. The outcome of fetuses with this seemingly isolated finding is more guarded than one would intuitively believe, and the parents should be counseled accordingly. Postnatal complications, including mild to severe upper airway obstruction leading to respiratory distress, feeding difficulties, and mild to severe long-term developmental delay, are common. One should be careful in labeling a fetus as having 'micrognathia', especially on subjective evaluation, as this term implies that the fetus is abnormal with significant pathology. There is no 'gold standard' for a definitive diagnosis of micrognathia on postnatal evaluation. Using a combination of objective sonographic markers as well as follow-up ultrasound assessments can significantly reduce the risk of a false diagnosis. Follow-up scans should be arranged, and the neonatal service should be alerted in cases of ongoing pregnancies.
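To make the objective markers concrete, the sketch below shows how two of them, the jaw index and the mandible width/maxilla width ratio, might be computed from standard biometric measurements. The cut-off values used here (jaw index below 23, ratio below 0.785) are assumptions drawn from published series rather than from this abstract, and any clinical use would require locally validated reference ranges.

```python
# Illustrative sketch: two objective sonographic markers for micrognathia.
# The cut-offs below are ASSUMED values from the published literature
# (jaw index < 23, mandible/maxilla width ratio < 0.785), not taken from
# this abstract; they are not a substitute for validated reference charts.

def jaw_index(mandible_ap_mm: float, bpd_mm: float) -> float:
    """Jaw index = (anteroposterior mandibular diameter / BPD) x 100."""
    return mandible_ap_mm / bpd_mm * 100.0

def md_mx_ratio(mandible_width_mm: float, maxilla_width_mm: float) -> float:
    """Mandible width / maxilla width ratio."""
    return mandible_width_mm / maxilla_width_mm

def micrognathia_suspected(mandible_ap_mm, bpd_mm,
                           mandible_width_mm, maxilla_width_mm,
                           jaw_index_cutoff=23.0, ratio_cutoff=0.785):
    """Flag a fetus for follow-up when either marker falls below its cut-off."""
    ji = jaw_index(mandible_ap_mm, bpd_mm)
    ratio = md_mx_ratio(mandible_width_mm, maxilla_width_mm)
    return ji < jaw_index_cutoff or ratio < ratio_cutoff

# Example: jaw index (14 / 70) * 100 = 20 < 23, so the fetus is flagged
# for follow-up scans rather than labeled outright as 'micrognathia'.
print(micrognathia_suspected(14.0, 70.0, 22.0, 24.0))  # True
```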
Selective Fetal Growth Restriction in Dichorionic Twin Pregnancies: Diagnosis, Natural History, and Perinatal Outcome.
This study aims to evaluate the natural history, disease progression, and outcomes in dichorionic twins with selective fetal growth restriction (sFGR) according to different diagnostic criteria and time of onset. Dichorionic twins seen from the first trimester were included. sFGR was classified according to the Delphi consensus, and outcomes were compared with those of twins classified by the International Society of Ultrasound in Obstetrics and Gynecology (ISUOG) diagnostic criteria. Early sFGR was defined as onset before 32 weeks and late sFGR as onset after 32 weeks. Disease progression and neonatal outcomes such as gestation at delivery, birthweight, neonatal unit (NNU) admission, and morbidities were compared. One hundred twenty-three of 1053 dichorionic twins had sFGR, of which 8.4% were classified as early sFGR and 3.3% as late sFGR. Disease progression was seen in 36%, with a longer progression time (5 vs. 1 week) and a higher progression rate (40% vs. 26%) in early sFGR. Perinatal death was significantly higher in the sFGR group than in the non-sFGR group (24 vs. 16 per 1000 births, p = 0.018), and those with early sFGR had more NNU admissions than those with late sFGR (p = 0.005). The ISUOG diagnostic criteria yielded a higher number of sFGR diagnoses than the Delphi criteria, but similar outcomes. Fetuses with sFGR have worse perinatal outcomes, with early onset being more prevalent. Use of the Delphi diagnostic criteria can reduce over-diagnosis of sFGR and avoid unnecessary intervention.
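For illustration, here is a minimal sketch of how the Delphi consensus definition for dichorionic twins is often operationalized. The thresholds used (EFW below the 3rd centile as a solitary criterion, or at least two of: EFW below the 10th centile, inter-twin EFW discordance of 25% or more, and umbilical artery PI above the 95th centile) are assumptions from the consensus literature, not the study's stated methods.

```python
# Hedged sketch of the Delphi consensus definition of sFGR in dichorionic
# twins, as commonly operationalized (ASSUMED thresholds; the study's
# exact implementation may differ):
#   solitary criterion: EFW of one twin < 3rd centile, OR
#   at least 2 of: EFW < 10th centile, EFW discordance >= 25%,
#   umbilical artery PI of the smaller twin > 95th centile.

def efw_discordance(efw_larger_g: float, efw_smaller_g: float) -> float:
    """Inter-twin EFW discordance as a percentage of the larger twin."""
    return (efw_larger_g - efw_smaller_g) / efw_larger_g * 100.0

def is_sfgr_dichorionic(efw_centile_smaller: float,
                        efw_larger_g: float,
                        efw_smaller_g: float,
                        ua_pi_above_95th: bool) -> bool:
    if efw_centile_smaller < 3.0:          # solitary criterion
        return True
    contributory = [
        efw_centile_smaller < 10.0,
        efw_discordance(efw_larger_g, efw_smaller_g) >= 25.0,
        ua_pi_above_95th,
    ]
    return sum(contributory) >= 2          # any two contributory criteria

# Example: an 8th-centile twin with 28% discordance meets two criteria.
print(is_sfgr_dichorionic(8.0, 2500.0, 1800.0, False))  # True
```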
Extra-Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods
This work proposes a universal and adaptive second-order method for minimizing second-order smooth, convex functions. Our algorithm achieves $\mathcal{O}(\sigma / \sqrt{T})$ convergence when the oracle feedback is stochastic with variance $\sigma^2$, and improves its convergence to $\mathcal{O}(1 / T^3)$ with deterministic oracles, where $T$ is the number of iterations. Our method also interpolates these rates without knowing the nature of the oracle a priori, which is enabled by a parameter-free adaptive step-size that is oblivious to the knowledge of smoothness modulus, variance bounds and the diameter of the constrained set. To our knowledge, this is the first universal algorithm with such global guarantees within the second-order optimization literature.
Comment: 32 pages, 4 figures, accepted at NeurIPS 2022
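The abstract does not spell out the update itself, but the flavor of a parameter-free adaptive step-size can be sketched in the simpler first-order extra-gradient template, where the step is driven entirely by observed gradient differences. This is an illustration of the general principle under assumed simplifications, not the Extra-Newton method.

```python
import numpy as np

# Hedged illustration of a parameter-free adaptive step-size in the
# extra-gradient template: the step shrinks with the accumulated squared
# gradient differences, so no smoothness modulus or variance bound is
# needed up front. This sketches the principle only; it is NOT the
# Extra-Newton update from the paper.

def adaptive_extragradient(grad, x0, iters=300):
    x = x0.copy()
    acc = np.linalg.norm(grad(x0)) ** 2   # seed so the first step is tame
    for _ in range(iters):
        eta = 1.0 / np.sqrt(acc)          # oblivious to problem constants
        g = grad(x)
        x_half = x - eta * g              # exploration (leading) step
        g_half = grad(x_half)
        x = x - eta * g_half              # update with the half-point gradient
        acc += np.linalg.norm(g_half - g) ** 2
    return x

# Toy smooth convex problem: f(x) = 0.5 * x^T A x with ill-conditioned A.
A = np.diag([1.0, 10.0])
x_star = adaptive_extragradient(lambda x: A @ x, np.array([3.0, -2.0]))
print(np.round(x_star, 4))                # ~[0. 0.] without any step tuning
```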
Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements
We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function, as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, the application of existing order-optimal adaptive methods, like UnixGrad or AcceleGrad, is not possible, especially in the presence of randomness and uncertainty. The proposed method, adaptive mirror descent (AdaMir), aims to close this gap by concurrently achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
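As a concrete instance of optimizing relative to a Bregman function, the sketch below runs mirror descent with the negative-entropy Bregman setup on the simplex, applied to an objective whose gradient is singular at the boundary and hence not Lipschitz in the Euclidean sense. The AdaGrad-style step-size is a generic stand-in, not the AdaMir rule.

```python
import numpy as np

# Hedged sketch of mirror descent with an entropic Bregman function on
# the simplex, the kind of non-Euclidean setting AdaMir targets. The
# adaptive step-size is a generic AdaGrad-style surrogate, not AdaMir's.

def entropic_mirror_descent(grad, x0, iters=500):
    x = x0.copy()
    acc = 1e-8                          # accumulated squared dual norms
    for _ in range(iters):
        g = grad(x)
        acc += np.max(np.abs(g)) ** 2   # infinity norm is dual on the simplex
        eta = 1.0 / np.sqrt(acc)
        x = x * np.exp(-eta * g)        # mirror step under negative entropy
        x /= x.sum()                    # Bregman projection back to simplex
    return x

# Toy relatively-smooth objective: f(x) = -sum(w * log(x)), whose
# gradient -w/x blows up at the boundary (no global Lipschitz constant).
w = np.array([0.2, 0.3, 0.5])
x = entropic_mirror_descent(lambda x: -w / x, np.ones(3) / 3)
print(np.round(x, 3))                   # -> [0.2 0.3 0.5], the minimizer
```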
Distributed Extra-gradient with Optimal Complexity and Communication Guarantees
We consider monotone variational inequality (VI) problems in multi-GPU settings where multiple processors/workers/clients have access to local stochastic dual vectors. This setting includes a broad range of important problems, from distributed convex minimization to min-max problems and games. Extra-gradient, which is a de facto algorithm for monotone VI problems, has not been designed to be communication-efficient. To this end, we propose a quantized generalized extra-gradient (Q-GenX), which is an unbiased and adaptive compression method tailored to solve VIs. We provide an adaptive step-size rule, which adapts to the respective noise profiles at hand and achieves a fast rate of $\mathcal{O}(1/T)$ under relative noise and an order-optimal $\mathcal{O}(1/\sqrt{T})$ under absolute noise, and we show that distributed training accelerates convergence. Finally, we validate our theoretical results by providing real-world experiments and training generative adversarial networks on multiple GPUs.
Comment: International Conference on Learning Representations (ICLR 2023)
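Here is a minimal sketch of the quantized extra-gradient template on a toy bilinear game, assuming a simple unbiased random-dithering quantizer in place of the paper's adaptive compression scheme. Note that quantization noise of this kind scales with the vector being compressed, i.e., it is relative noise, the regime in which the fast rate applies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hedged sketch of the quantized extra-gradient template: each worker
# compresses its dual vector with an UNBIASED quantizer before
# aggregation, and both extra-gradient half-steps use the compressed
# vectors. The random-dithering quantizer is illustrative only; it is
# not the adaptive compression scheme proposed in the paper.

def quantize(v, levels=8):
    """Unbiased random dithering onto a uniform grid: E[quantize(v)] = v."""
    scale = np.max(np.abs(v)) + 1e-12
    u = np.abs(v) / scale * levels              # position on the grid
    q = np.floor(u) + (rng.random(v.shape) < u - np.floor(u))
    return np.sign(v) * q * scale / levels

def quantized_extragradient(op, x0, workers=4, eta=0.1, iters=400):
    x = x0.copy()
    for _ in range(iters):
        g = np.mean([quantize(op(x)) for _ in range(workers)], axis=0)
        x_half = x - eta * g                    # extrapolation step
        g_half = np.mean([quantize(op(x_half)) for _ in range(workers)],
                         axis=0)
        x = x - eta * g_half                    # update step
    return x

# Toy monotone VI: the bilinear game min_u max_v u*v, with operator
# F(u, v) = (v, -u); the unique equilibrium is the origin.
F = lambda z: np.array([z[1], -z[0]])
z = quantized_extragradient(F, np.array([1.0, 1.0]))
print(np.round(z, 2))   # near [0, 0]: converges despite compression
```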
Advancing the lower bounds: An accelerated, stochastic, second-order method with optimal adaptation to inexactness
We present a new accelerated stochastic second-order method that is robust to
both gradient and Hessian inexactness, which occurs typically in machine
learning. We establish theoretical lower bounds and prove that our algorithm
achieves optimal convergence in both gradient and Hessian inexactness in this
key setting. We further introduce a tensor generalization for stochastic
higher-order derivatives. When the oracles are non-stochastic, the proposed tensor algorithm matches the global convergence of the Nesterov Accelerated Tensor method. Both algorithms allow for approximate solutions of their auxiliary subproblems, with verifiable conditions on the accuracy of the solution.
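One ingredient named here, approximate subproblem solutions with verifiable accuracy conditions, can be sketched with a conjugate-gradient inner solve stopped at a checkable relative-residual tolerance inside a plain (non-accelerated) Newton loop. This illustrates the idea only, not the paper's accelerated stochastic scheme.

```python
import numpy as np

# Hedged sketch: solve the auxiliary (Newton) subproblem only
# approximately, with a VERIFIABLE stopping condition on the residual.
# The CG inner solve and damped-free Newton outer loop are illustrative
# stand-ins, not the paper's accelerated stochastic method.

def cg(H, g, tol):
    """Solve H d = g by conjugate gradient until ||H d - g|| <= tol * ||g||."""
    d = np.zeros_like(g)
    r = g.copy()
    p = r.copy()
    while np.linalg.norm(r) > tol * np.linalg.norm(g):
        Hp = H @ p
        a = (r @ r) / (p @ Hp)
        d += a * p
        r_new = r - a * Hp
        p = r_new + (r_new @ r_new) / (r @ r) * p
        r = r_new
    return d        # the residual bound certifies the subproblem accuracy

def inexact_newton(grad, hess, x0, tol=0.1, steps=20):
    x = x0.copy()
    for _ in range(steps):
        x = x - cg(hess(x), grad(x), tol)   # only ~10%-accurate inner solve
    return x

# Toy strongly convex problem: f(x) = sum(x**4)/4 + 0.5*||x||^2.
grad = lambda x: x**3 + x
hess = lambda x: np.diag(3 * x**2 + 1)
x = inexact_newton(grad, hess, np.array([2.0, -3.0]))
print(np.round(x, 6))   # ~[0, 0] despite the inexact subproblem solves
```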