835 research outputs found

    Circadian Timing In Cancer Treatment: A Mini Review On Cancer Chronotherapy

    Organisms exhibit rhythmic fluctuations in their behavior and metabolism every 24 hours, a phenomenon controlled by the circadian clock to anticipate changes in the environment. Over the past forty years, substantial progress has been made in understanding the molecular mechanisms that link the circadian clock and cancer. In this context, researchers have explored the possibility of leveraging the circadian clock to improve cancer treatment. Several randomized controlled trials have investigated the effects of circadian-timed chemotherapy and radiotherapy on drug toxicity and efficacy, with many studies reporting clinically significant outcomes, although some findings remain inconsistent. This mini review summarizes the current state of research on chronotherapy in oncology by examining the results of randomized controlled trials of chemotherapy and radiotherapy, aiming to provide an overview of the potential of chronotherapy in cancer treatment and to highlight areas where further investigation is needed.

    An equivalent-effect phenomenon in eddy current non-destructive testing of thin structures

    The inductance/impedance due to thin metallic structures in non-destructive testing (NDT) is difficult to evaluate. In particular, Finite Element Method (FEM) eddy current simulation requires an extremely fine mesh to resolve skin effects accurately, especially at high frequencies, which can lead to an extremely large total mesh once surrounding structures and excitation sources such as coils are included, and hence to intensive computation. In this paper, an equivalent-effect phenomenon is identified: alternative structures produce the same effect on the sensor response, i.e. the mutual impedance/inductance of the coupled coils, provided a reciprocal relationship between the electrical conductivity and the thickness of the structure is satisfied. Using this relationship, the mutual inductance/impedance can be computed from equivalent structures with far fewer mesh elements, which significantly reduces computation time. In eddy current NDT, coil inductance/impedance is a critical parameter in many industrial applications, such as flaw detection, coating characterization and microstructure sensing. Theoretical derivation, measurements and simulations are presented to verify the proposed phenomenon.
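    The reciprocal relationship is not spelled out in the abstract; the sketch below assumes it means that the product of electrical conductivity and thickness (the sheet conductance) is preserved, so a thicker but less conductive plate can stand in for the original thin structure and be meshed far more coarsely. The function names and numbers are illustrative, not taken from the paper.

```python
# Minimal sketch (assumption: the reciprocal relationship keeps sigma * t constant,
# i.e. the sheet conductance of the thin plate is preserved by the equivalent plate).
import math

def equivalent_conductivity(sigma_orig, t_orig, t_equiv):
    """Conductivity a plate of thickness t_equiv needs so that sigma * t
    (the sheet conductance) matches the original thin plate."""
    return sigma_orig * t_orig / t_equiv

def skin_depth(sigma, mu_r, freq_hz):
    """Standard skin depth: delta = sqrt(2 / (omega * mu * sigma))."""
    mu0 = 4e-7 * math.pi
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 / (omega * mu_r * mu0 * sigma))

if __name__ == "__main__":
    sigma_cu, t_cu = 5.8e7, 0.2e-3   # 0.2 mm copper sheet (S/m, m)
    t_eq = 2.0e-3                    # model it as a 2 mm plate instead
    sigma_eq = equivalent_conductivity(sigma_cu, t_cu, t_eq)
    print(f"equivalent conductivity: {sigma_eq:.2e} S/m")
    # Skin depth scales as 1/sqrt(sigma), so the equivalent plate can be
    # meshed much more coarsely through its thickness at the same frequency.
    for label, s in (("original", sigma_cu), ("equivalent", sigma_eq)):
        print(f"{label:>10} skin depth @ 100 kHz: {skin_depth(s, 1.0, 1e5)*1e3:.3f} mm")
```

    Under this assumption the equivalent plate's skin depth grows with the square root of the thickness ratio, which is what relaxes the meshing requirement while leaving the coil's mutual inductance/impedance unchanged.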

    Dynosaur: A Dynamic Growth Paradigm for Instruction-Tuning Data Curation

    Instruction tuning has emerged as a way to enhance the capability of large language models (LLMs) to comprehend instructions and generate appropriate responses. Existing methods either manually annotate data or employ an LLM (e.g., the GPT series) to generate data for instruction tuning. However, they often overlook associating instructions with existing annotated datasets. In this paper, we propose Dynosaur, a dynamic growth paradigm for the automatic curation of instruction-tuning data. Based on the metadata of existing datasets, we use LLMs to automatically construct instruction-tuning data by identifying relevant data fields and generating appropriate instructions. By leveraging existing annotated datasets, Dynosaur offers several advantages: 1) it reduces the API cost of generating instructions (e.g., it costs less than $12 to generate 800K instruction-tuning samples with GPT-3.5-turbo); 2) it provides high-quality data for instruction tuning (e.g., it performs better than Alpaca and Flan on Super-NI and Longform with comparable data sizes); and 3) it supports the continuous improvement of models by generating instruction-tuning data whenever a new annotated dataset becomes available. We further investigate a continual learning scheme for learning with the ever-growing instruction-tuning dataset, and demonstrate that replaying tasks with diverse instruction embeddings not only helps mitigate forgetting but also improves generalization to unseen tasks. Code and data are available at https://github.com/WadeYin9712/Dynosaur (EMNLP 2023).
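    A minimal sketch of the metadata-driven idea described above, under stated assumptions: the prompt wording, the JSON schema, and the dataset fields below are hypothetical stand-ins rather than the authors' actual prompts or pipeline, and the LLM call is mocked with a hard-coded response.

```python
# Sketch: turn the metadata of an annotated dataset into a prompt asking an LLM to
# propose an instruction plus input/output field mapping, then materialize
# instruction-tuning examples from the existing annotations.
import json

def build_metadata_prompt(dataset_name, description, fields):
    """Compose a prompt from dataset metadata; the wording is illustrative."""
    return (
        f"Dataset: {dataset_name}\n"
        f"Description: {description}\n"
        f"Fields: {', '.join(fields)}\n"
        "Propose a natural-language instruction for this dataset and name which "
        "fields should serve as the input and the output. Answer as JSON: "
        '{"instruction": ..., "input_field": ..., "output_field": ...}'
    )

def materialize_examples(task_spec, records):
    """Apply an LLM-proposed task spec to existing annotated records."""
    for rec in records:
        yield {
            "instruction": task_spec["instruction"],
            "input": rec[task_spec["input_field"]],
            "output": rec[task_spec["output_field"]],
        }

if __name__ == "__main__":
    prompt = build_metadata_prompt(
        "ag_news", "News articles labeled with one of four topics", ["text", "label"])
    # task_spec would come from an LLM call (e.g. GPT-3.5-turbo); hard-coded here.
    task_spec = {"instruction": "Classify the topic of the news article.",
                 "input_field": "text", "output_field": "label"}
    records = [{"text": "Stocks rallied after the earnings report.", "label": "Business"}]
    print(json.dumps(list(materialize_examples(task_spec, records)), indent=2))
```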

    More comprehensive facial inversion for more effective expression recognition

    Facial expression recognition (FER) plays a significant role in many computer vision applications. We revisit this problem from a new perspective: whether useful representations that improve FER performance can be acquired during the image generation process, and propose a novel generative method based on an image inversion mechanism for the FER task, termed Inversion FER (IFER). In particular, we devise a novel Adversarial Style Inversion Transformer (ASIT) for IFER to comprehensively extract features of generated facial images. In addition, ASIT is equipped with an image inversion discriminator that measures the cosine similarity of semantic features between source and generated images, constrained by a distribution alignment loss. Finally, we introduce a feature modulation module that fuses the structural code and latent codes from ASIT for the subsequent FER task. We extensively evaluate ASIT on facial datasets such as FFHQ and CelebA-HQ, showing that our approach achieves state-of-the-art facial inversion performance. IFER also achieves competitive results on facial expression recognition datasets such as RAF-DB, SFEW and AffectNet. The code and models are available at https://github.com/Talented-Q/IFER-master.
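    Below is a minimal sketch of a cosine-similarity feature loss of the kind the abstract describes for the inversion discriminator, with a simple moment-matching term standing in for the distribution alignment loss; it is an assumption-laden illustration, not the released IFER implementation.

```python
# Sketch: reward high cosine similarity between semantic features of the source
# and inverted images, plus a crude distribution-alignment term that matches
# batch feature statistics. The feature extractor is assumed to exist upstream.
import torch
import torch.nn.functional as F

def inversion_feature_loss(feat_src, feat_gen, align_weight=0.1):
    """Cosine-similarity loss between per-image feature vectors, plus a
    mean/std alignment penalty across the batch (illustrative only)."""
    # 1 - cos_sim is 0 when the semantic features point in the same direction.
    cos_term = (1.0 - F.cosine_similarity(feat_src, feat_gen, dim=1)).mean()
    # Simple distribution alignment: match first and second moments per channel.
    align_term = (feat_src.mean(0) - feat_gen.mean(0)).pow(2).mean() + \
                 (feat_src.std(0) - feat_gen.std(0)).pow(2).mean()
    return cos_term + align_weight * align_term

if __name__ == "__main__":
    torch.manual_seed(0)
    feat_src = torch.randn(8, 512)                    # e.g. pooled backbone features
    feat_gen = feat_src + 0.1 * torch.randn(8, 512)   # features of inverted images
    print(inversion_feature_loss(feat_src, feat_gen).item())
```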