ReDi: Efficient Learning-Free Diffusion Inference via Trajectory Retrieval
Diffusion models show promising generation capability for a variety of data.
Despite their high generation quality, the inference for diffusion models is
still time-consuming due to the numerous sampling iterations required. To
accelerate the inference, we propose ReDi, a simple yet learning-free
Retrieval-based Diffusion sampling framework. From a precomputed knowledge
base, ReDi retrieves a trajectory similar to the partially generated trajectory
at an early stage of generation, skips a large portion of intermediate steps,
and continues sampling from a later step in the retrieved trajectory. We
theoretically prove that the generation performance of ReDi is guaranteed. Our
experiments demonstrate that ReDi accelerates model inference with a 2x
speedup. Furthermore, ReDi generalizes well to zero-shot cross-domain
image generation such as image stylization. Comment: ICML 202
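The retrieve-and-skip loop described above can be sketched as follows. The toy `denoise` step, knowledge-base arrays, and Euclidean nearest-neighbor retrieval are illustrative stand-ins for the paper's actual trajectories and retrieval metric, not its implementation:

```python
import numpy as np

def redi_sample(denoise_step, kb_early, kb_late, x_init, n_early, n_late):
    """Sketch of ReDi-style retrieval sampling (illustrative interface)."""
    x = x_init
    # Run only the first few denoising steps to get a partial trajectory.
    for _ in range(n_early):
        x = denoise_step(x)
    # Retrieve the stored trajectory whose early-stage state is nearest.
    idx = int(np.argmin(np.linalg.norm(kb_early - x, axis=1)))
    # Skip the intermediate steps: resume from that trajectory's late state.
    x = kb_late[idx].copy()
    for _ in range(n_late):
        x = denoise_step(x)
    return x

# Toy demo: a "denoiser" that just shrinks its input toward zero.
rng = np.random.default_rng(0)
denoise = lambda x: 0.9 * x
kb_early = rng.normal(size=(16, 8))  # stored states after the early steps
kb_late = 0.9 ** 40 * kb_early       # the same trajectories 40 steps later
sample = redi_sample(denoise, kb_early, kb_late, rng.normal(size=8), 5, 3)
```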
Generative Autoencoders as Watermark Attackers: Analyses of Vulnerabilities and Threats
Invisible watermarks safeguard images' copyrights by embedding hidden
messages detectable by their owners. They also deter people from misusing
images, especially those generated by AI models. Malicious adversaries can violate
these rights by removing the watermarks. In order to remove watermarks without
damaging the visual quality, the adversary needs to erase them while retaining
the essential information in the image. This is analogous to the encoding and
decoding process of generative autoencoders, especially variational
autoencoders (VAEs) and diffusion models. We propose a framework using
generative autoencoders to remove invisible watermarks and test it using VAEs
and diffusion models. Our results reveal that, even without specific training,
off-the-shelf Stable Diffusion effectively removes most watermarks, surpassing
all current attackers. These results underscore the vulnerabilities in existing
watermarking schemes and call for more robust methods for copyright
protection.
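The encode-decode attack can be illustrated with a minimal sketch. A fixed linear basis stands in for the generative autoencoder (the paper uses pretrained VAEs and Stable Diffusion); the point is only that reconstructing an image from its essential components discards a small watermark perturbation:

```python
import numpy as np

def regeneration_attack(image, basis):
    """Toy 'autoencoder' regeneration: keep only the components spanning
    the image's essential content, then reconstruct. A real attack would
    use a pretrained VAE or diffusion model as encoder/decoder; this fixed
    linear basis is only a conceptual stand-in."""
    code = basis @ image   # "encode": project onto the essential subspace
    return basis.T @ code  # "decode": reconstruct, dropping the residual

rng = np.random.default_rng(1)
q, _ = np.linalg.qr(rng.normal(size=(64, 64)))
basis = q[:8]                           # 8-dim "essential content" subspace
clean = basis.T @ rng.normal(size=8)    # image lying in that subspace
watermark = 0.01 * rng.normal(size=64)  # small invisible perturbation
attacked = regeneration_attack(clean + watermark, basis)
```

After the attack, the reconstruction is closer to the clean image than the watermarked input was, which is exactly the quality-preserving removal the abstract describes.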
ALGO: Synthesizing Algorithmic Programs with LLM-Generated Oracle Verifiers
Large language models (LLMs) excel at implementing code from functionality
descriptions but struggle with algorithmic problems that require not only
implementation but also identification of the suitable algorithm. Moreover,
LLM-generated programs lack guaranteed correctness and require human
verification. To address these challenges, we propose ALGO, a framework that
synthesizes Algorithmic programs with LLM-Generated Oracles to guide the
generation and verify their correctness. ALGO first generates a reference
oracle by prompting an LLM to exhaustively enumerate all the combinations of
relevant variables. This oracle is then utilized to guide an arbitrary search
strategy in exploring the algorithm space and to verify the synthesized
algorithms. Our study shows that the LLM-generated oracles are correct for 88%
of the cases. With the oracles as verifiers, ALGO can be integrated with any
existing code generation model in a model-agnostic manner to enhance its
performance. Experiments show that when equipped with ALGO, we achieve an 8x
better one-submission pass rate over the Codex model and a 2.6x better
one-submission pass rate over CodeT, the current state-of-the-art model on
CodeContests. ALGO also achieves a 1.3x better pass rate over the ChatGPT Code
Interpreter on unseen problems. The problem set we used for testing, the
prompts we used, the verifier and solution programs, and the test cases
generated by ALGO are available at https://github.com/zkx06111/ALGO. Comment: NeurIPS 202
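The oracle-guided verification loop can be sketched on a toy problem. Here the reference oracle is a hand-written exhaustive enumeration for a 0/1 knapsack, whereas in ALGO the oracle is produced by prompting an LLM; the verification logic is the part being illustrated:

```python
from itertools import product

def oracle(items, capacity):
    """Exhaustive-enumeration reference oracle for a 0/1 knapsack:
    slow but trustworthy. (Illustrative stand-in: in ALGO the oracle
    is generated by prompting an LLM, and the problem varies.)"""
    best = 0
    for mask in product([0, 1], repeat=len(items)):
        weight = sum(m * w for m, (w, v) in zip(mask, items))
        value = sum(m * v for m, (w, v) in zip(mask, items))
        if weight <= capacity:
            best = max(best, value)
    return best

def candidate(items, capacity):
    """Candidate efficient algorithm to be verified: classic DP knapsack."""
    dp = [0] * (capacity + 1)
    for w, v in items:
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

def verify(cand, ref, test_inputs):
    """Accept the candidate only if it matches the oracle on every input."""
    return all(cand(*args) == ref(*args) for args in test_inputs)

ok = verify(candidate, oracle, [([(2, 3), (3, 4), (4, 5)], 5), ([(1, 1)], 0)])
```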
In Situ Monitoring of Catalytic Molecular Transformations on Noble Metal Nanocatalysts Using Surface-Enhanced Raman Spectroscopy
Noble metal nanoparticles have long been of tremendous interest in the nanophotonics and nanocatalysis communities owing to their intriguing size- and shape-dependent plasmonic and catalytic properties. The combination of tunable plasmon resonances with superior catalytic activities on the same noble metal nanoparticle, however, has long been challenging because the research on nanoplasmonics and nanocatalysis deals with nanoparticles in two drastically different size regimes. While tunable plasmon resonances are a unique feature of metallic nanoparticles in the sub-wavelength size regime, heterogeneous catalysis requires the use of substrate-supported sub-5 nm nanoparticulate catalysts. In this mini-review article, we share with the readers several approaches we recently developed toward the realization of plasmonic-catalytic dual-functionalities on a single noble metal nanoparticle. Our approaches involve judicious tailoring of the atomic-level surface structures of sub-wavelength plasmonic nanoparticles through either kinetically controlled seed-mediated nanocrystal growth or regioselective surface etching. These structurally tailored, dual-functional nanoparticles serve as both substrates for surface-enhanced Raman spectroscopy (SERS) and free-standing nanoparticulate catalysts. Using SERS as a molecular fingerprinting spectroscopic tool, we have been able to track the detailed structural evolution of molecular adsorbates in real time during catalytic reactions. The quantitative insights gained from the in situ SERS measurements shed light on the detailed relationships between interfacial molecule-transforming behaviors and the atomic-level surface structures of noble metal nanocatalysts.
Hire a Linguist!: Learning Endangered Languages with In-Context Linguistic Descriptions
How can large language models (LLMs) process and translate endangered
languages? Many languages lack a large enough corpus to train a decent LLM;
therefore, existing LLMs rarely perform well on unseen, endangered languages.
However, we observe that 2000 endangered languages, though without a large
corpus, have a grammar book or a dictionary. We propose LINGOLLM, a
training-free approach to enable an LLM to process unseen languages that hardly
occur in its pre-training. Our key insight is to demonstrate linguistic
knowledge of an unseen language in an LLM's prompt, including a dictionary, a
grammar book, and morphologically analyzed input text. We implement LINGOLLM on
top of two models, GPT-4 and Mixtral, and evaluate their performance on 5 tasks
across 8 endangered or low-resource languages. Our results show that LINGOLLM
elevates translation capability from GPT-4's 0 to 10.5 BLEU for 10 language
directions. Our findings demonstrate the tremendous value of linguistic
knowledge in the age of LLMs for endangered languages. Our data, code, and
model generations can be found at https://github.com/LLiLab/llm4endangeredlang
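A minimal sketch of this prompt construction, with an invented toy language and a template that only illustrates the idea rather than reproducing the paper's actual prompt:

```python
def build_prompt(sentence, dictionary, grammar_notes, morph_analysis):
    """Assemble a LINGOLLM-style prompt from in-context linguistic
    resources: dictionary entries, grammar notes, and a morphological
    analysis of the input. Template wording is illustrative only."""
    entries = [f"- {word}: {dictionary.get(word, '(not in dictionary)')}"
               for word in sentence.split()]
    return "\n".join([
        "Translate the following sentence into English.",
        "Grammar notes: " + grammar_notes,
        "Dictionary entries:",
        *entries,
        "Morphological analysis: " + morph_analysis,
        "Sentence: " + sentence,
    ])

# Hypothetical toy language data, invented purely for illustration.
prompt = build_prompt(
    "mira kota",
    {"mira": "water", "kota": "to carry"},
    "the verb follows its object; no tense marking",
    "mira (noun) + kota (verb, infinitive)",
)
```

The resulting string would be sent to GPT-4 or Mixtral as the prompt, so the model can reason from the supplied linguistic knowledge instead of its pre-training.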
Optimize Individualized Energy Delivery for Septic Patients Using Predictive Deep Learning Models: A Real World Study
Background and Objectives: We aim to establish deep learning models to
optimize the individualized energy delivery for septic patients. Methods and
Study Design: We conducted a study of adult septic patients in Intensive Care
Unit (ICU), collecting 47 indicators for 14 days. After data cleaning and
preprocessing, we used statistical analysis to explore energy delivery in deceased and
surviving patients. We filtered out nutrition-related features and divided the
data into three metabolic phases: acute early, acute late, and rehabilitation.
Models were built using data before September 2020 and validated on the rest.
We then established optimal energy target models for each phase using deep
learning. Results: A total of 277 patients and 3115 data records were included in this
study. The models indicated that the optimal energy targets in the three phases
were 900 kcal/d, 2300 kcal/d, and 2000 kcal/d, respectively. Excessive energy
intake increased mortality rapidly in the early period of the acute phase.
Insufficient energy in the late period of the acute phase significantly raised
the mortality of septic patients. For the rehabilitation phase, both excessive
and insufficient energy delivery were associated with high mortality. Conclusion: Our
study established time-series prediction models for septic patients to optimize
energy delivery in the ICU. This approach indicated the feasibility of
developing nutritional tools for critically ill patients. We recommended
permissive underfeeding only in the early acute phase. Later, increased energy
intake may improve survival and settle the energy debt caused by underfeeding.
Metabolomics in the Development and Progression of Dementia: A Systematic Review
Dementia has become a major global public health challenge with a heavy economic burden. It is urgently necessary to understand dementia pathogenesis and to identify biomarkers predicting risk of dementia in the preclinical stage for prevention, monitoring, and treatment. Metabolomics provides a novel approach for the identification of biomarkers of dementia. This systematic review aimed to examine and summarize recent retrospective cohort human studies assessing circulating metabolite markers, detected using high-throughput metabolomics, in the context of disease progression to dementia, including incident mild cognitive impairment, all-cause dementia, and cognitive decline. We systematically searched the PubMed, Embase, and Cochrane databases for retrospective cohort human studies assessing associations between blood (plasma or serum) metabolomics profile and cognitive decline and risk of dementia from inception through October 15, 2018. We identified 16 studies reporting circulating metabolites and risk of dementia, and six regarding cognitive performance change. Concentrations of several blood metabolites, including lipids (higher phosphatidylcholines, sphingomyelins, and lysophosphatidylcholine, and lower docosahexaenoic acid and high-density lipoprotein subfractions), amino acids (lower branched-chain amino acids, creatinine, and taurine, and higher glutamate, glutamine, and anthranilic acid), and steroids were associated with cognitive decline and the incidence or progression of dementia. Circulating metabolites appear to be associated with the risk of dementia. Metabolomics could be a promising tool in dementia biomarker discovery. However, standardization and consensus guidelines for study design and analytical techniques require future development.
Design of Hydraulic Bulging Die for Automobile Torsion Beam and Optimization of Forming Process Parameters
The hydraulic bulging technology of tubes can provide hollow parts with special-shaped cross sections. Its manufacturing process can effectively improve material utilization and product accuracy and reduce the number and cost of molds. However, the hydraulic bulging process of parts is very complicated. The size of the tube blank, the design of the loading route, and the forming process parameters all affect the molding quality. A closed tubular automobile torsion beam is considered as the research object for studying hydraulic bulging die design and optimizing forming process parameters. CATIA software is used to design the torsion beam product structure and the hydraulic bulging die. AMESim software is employed to design the hydraulic synchronous control system for the cylinders on both sides of the hydraulic bulging die. A mathematical control model is established and verified in Simulink software. DYNAFORM software is applied to conduct numerical simulation of the hydraulic expansion. The supporting pressure, molding pressure, friction coefficient, and feeding quantity are taken as orthogonal experiment level factors. The maximum thinning and maximum thickening rates are taken as hydraulic bulging evaluation indexes to complete the orthogonal experiments. The main molding process parameters are analyzed via the orthogonal experiment results and optimized by employing the Taguchi method. The optimal hydraulic bulging parameters are obtained as follows: supporting pressure of 20 MPa, molding pressure of 150 MPa, feeding quantity of 25 mm, and friction coefficient of 0.075. Simulation analysis results indicate that the maximum thinning rate is equal to 9.013%, while the maximum thickening rate is equal to 16.523%. Finally, the design of the hydraulic bulging die for the torsion beam was completed, and its forming process parameters were optimized.
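The Taguchi-style main-effects analysis behind this optimization can be sketched as follows, using a toy L4 orthogonal array and hypothetical thinning rates rather than the paper's four-factor design and measured data:

```python
import numpy as np

def taguchi_best_levels(design, responses, smaller_is_better=True):
    """Taguchi-style main-effects analysis: for each factor, average the
    response over the runs at each level and pick the best-scoring level."""
    design = np.asarray(design)
    responses = np.asarray(responses, dtype=float)
    best = []
    for f in range(design.shape[1]):
        levels = np.unique(design[:, f])
        means = [responses[design[:, f] == lv].mean() for lv in levels]
        pick = np.argmin(means) if smaller_is_better else np.argmax(means)
        best.append(int(levels[pick]))
    return best

# Toy L4(2^3) orthogonal array: 3 two-level factors covered in 4 runs.
# (The paper's design has four factors - supporting pressure, molding
# pressure, friction coefficient, feeding quantity - and the responses
# below are hypothetical thinning rates, not the paper's data.)
L4 = [[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]]
thinning = [9.5, 9.0, 10.2, 9.8]
best = taguchi_best_levels(L4, thinning)
```

The orthogonal array lets each factor's level means be compared from only four runs instead of the full 2^3 factorial, which is the efficiency the Taguchi method provides.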
A Method for Determining the Mechanical Parameters of Solution Pore and Crevice Limestone Based on Porosity
Limestone strata exhibit great anisotropy, with features ranging from large karst caves, pipelines, and faults to small solution pores and crevices. In this paper, uniaxial compression tests of solution pore and crevice limestones from Mamaya I hydropower station and Ronglai hydropower station are conducted, and the porosity of these limestones is measured. The results show that there is a good power-function relationship between the compressive strength and the porosity of the solution pore and crevice limestone. Based on the Hoek–Brown criterion, a method for determining the mechanical parameters of the solution pore and crevice limestones is proposed, taking the porosity of the rock into consideration. Then, the relationships between the rock mass parameters mb, s, and a and the porosity n are deduced. Based on the proposed method, the variation laws of the mechanical parameters of the limestones, including uniaxial compressive strength (UCS), tensile strength, deformation modulus, and shear strength parameters, are analyzed. The proposed method simplifies the complexity of mechanical parameter selection by quantifying GSI, avoids subjectivity and uncertainty, and has good reliability and suitability in the pore and crevice limestone stratum, which has a certain guiding significance for the construction of similar sites.
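For reference, the generalized Hoek–Brown criterion underlying the proposed method can be sketched directly from its standard published formulas; the porosity-to-GSI mapping, which is the paper's own contribution, is left out, so GSI is passed in directly:

```python
import math

def hoek_brown_sigma1(sigma3, sigma_ci, gsi, mi, D=0.0):
    """Generalized Hoek-Brown criterion in its standard published form:
    sigma1 = sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a,
    with mb, s, a derived from GSI, the intact-rock constant mi, and the
    disturbance factor D. (The paper's porosity-based GSI estimate is not
    reproduced here.)"""
    mb = mi * math.exp((gsi - 100.0) / (28.0 - 14.0 * D))
    s = math.exp((gsi - 100.0) / (9.0 - 3.0 * D))
    a = 0.5 + (math.exp(-gsi / 15.0) - math.exp(-20.0 / 3.0)) / 6.0
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Sanity check: for intact rock (GSI = 100), mb = mi, s = 1, a = 0.5,
# so the unconfined (sigma3 = 0) strength equals sigma_ci.
ucs = hoek_brown_sigma1(sigma3=0.0, sigma_ci=80.0, gsi=100.0, mi=10.0)
```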
Effect of polymer coatings on the freezing–thawing and carbonation resistances of nano-SiO2 and polyvinyl alcohol fiber-reinforced cementitious composites
The inclusion of nano-SiO2 and polyvinyl alcohol (PVA) fibers can improve the strength and toughness of cementitious composites, and the durability can be further enhanced by protecting the surface of the composite with a polymer coating. Herein, three types of nano-SiO2 and PVA fiber-reinforced cementitious composites were prepared with water–binder ratios of 0.35, 0.40, and 0.45. Furthermore, the following three types of polymer coatings were applied to the composite surfaces: single-layer and double-layer chlorinated rubber coatings, polyurethane coatings, and permeable silane impregnation agents. The effects of the types and number of layers of polymer coatings on the water absorption, freezing–thawing resistance, and carbonation resistance of the nano-SiO2 and PVA fiber-reinforced cementitious composites, as well as the effects of the water–binder ratio of the cementitious composites on the protective efficiency of the polymer coatings, were investigated. The results revealed that all three types of polymer coatings could decrease the capillary water absorption of the cementitious composites and improve the freezing–thawing resistance and carbonation resistance of the composites. The carbonation depth and the mass loss after freezing–thawing cycles of the material coated with a single polymer coating were reduced by 12–66% and 30.2–60.4%, respectively, compared with the corresponding values for the uncoated cementitious composites. Moreover, the effect of the double-layer polymer coatings was superior to that of the single-layer coatings. As the water–binder ratio decreased, the influence of the polymer coating on the durability of the cementitious composites became increasingly apparent.