When ChatGPT is gone: Creativity reverts and homogeneity persists
ChatGPT has been shown to enhance human performance in creative tasks. Yet it
remains unclear whether this boosting effect is sustained once ChatGPT is no
longer available. In a pre-registered seven-day lab experiment and a follow-up
survey 30 days after experiment completion, we examined the impacts of ChatGPT
presence and absence on sustained creativity, using a text dataset of 3302
creative ideas and 427 creative solutions from 61 college students.
Participants in the treatment group used ChatGPT in creative tasks, while those
in the control group completed the tasks by themselves. The findings show that
although the boosting effect of ChatGPT was consistently observed over a
five-day creative journey, human creative performance reverted to baseline when
ChatGPT was withdrawn on the 7th and the 30th day. More critically, the use of
ChatGPT in creative tasks resulted in increasingly homogenized content, and
this homogenization effect persisted even when ChatGPT was absent. These
findings pose a challenge to the prevailing argument that ChatGPT can enhance
human creativity. In fact, generative AI like ChatGPT lends humans a temporary
rise in creative performance but boxes in human creative capability over the
long run, highlighting the imperative for cautious generative AI integration in
creative endeavors. Comment: 30 pages, 6 figures
TokenMix: Rethinking Image Mixing for Data Augmentation in Vision Transformers
CutMix is a popular augmentation technique commonly used for training modern
convolutional and transformer vision networks. It was originally designed to
encourage Convolutional Neural Networks (CNNs) to focus more on an image's global
context instead of local information, which greatly improves the performance of
CNNs. However, we found it to have limited benefits for transformer-based
architectures that naturally have a global receptive field. In this paper, we
propose a novel data augmentation technique TokenMix to improve the performance
of vision transformers. TokenMix mixes two images at the token level by
partitioning the mixing region into multiple separated parts. In addition, we show
that the mixed learning target in CutMix, a linear combination of a pair of the
ground truth labels, might be inaccurate and sometimes counter-intuitive. To
obtain a more suitable target, we propose to assign the target score according
to the content-based neural activation maps of the two images from a
pre-trained teacher model, which does not need to have high performance. Through
extensive experiments on various vision transformer architectures, we show that
our proposed TokenMix helps vision transformers focus on the foreground area to
infer the classes and enhances their robustness to occlusion, with consistent
performance gains. Notably, we improve DeiT-T/S/B with +1% ImageNet top-1
accuracy. Moreover, TokenMix benefits from longer training, achieving 81.2% top-1
accuracy on ImageNet with DeiT-S trained for 400 epochs. Code is available at
https://github.com/Sense-X/TokenMix. Comment: ECCV 2022
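The official implementation lives at the repository above; as a rough
illustration only, the sketch below builds a token-level mask split into
several separated blocks and mixes two token sequences with it. The grid size,
square block shape, and the use of the mask-area ratio as a label weight are
simplifying assumptions (the paper instead derives the label weight from a
teacher's content-based activation maps).

```python
import numpy as np

def tokenmix_mask(grid=14, num_parts=4, mix_ratio=0.5, rng=None):
    """Binary mask over a grid x grid token map whose mixed region is
    split into several separated square blocks (a simplified reading of
    TokenMix; see the official repository for the real partitioning)."""
    if rng is None:
        rng = np.random.default_rng(0)
    target = int(round(mix_ratio * grid * grid))            # tokens to replace
    side = max(1, int(round((target / num_parts) ** 0.5)))  # edge of one block
    mask = np.zeros((grid, grid), dtype=bool)
    while mask.sum() < target:                              # drop blocks until covered
        r = rng.integers(0, grid - side + 1)
        c = rng.integers(0, grid - side + 1)
        mask[r:r + side, c:c + side] = True
    return mask

def tokenmix(tokens_a, tokens_b, mask):
    """Replace the masked tokens of image A with those of image B."""
    flat = mask.reshape(-1)
    mixed = tokens_a.copy()
    mixed[flat] = tokens_b[flat]
    return mixed

mask = tokenmix_mask()
naive_label_weight = mask.mean()  # area-based weight; the paper replaces this
                                  # with teacher activation-map scores
```

A 14x14 grid corresponds to a 224x224 image split into 16x16 patches, as in
DeiT-style vision transformers.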
Effect of potassium simplex optimization medium (KSOM) and embryo screening on the production of human lactoferrin transgenic cloned dairy goats
In this study, we produced cloned transgenic dairy goats using dairy goat ear skin fibroblasts, modified with the human lactoferrin (hLF) gene, as donor cells for nuclear transfer (NT). The developmental competence of NT embryos was compared either between different embryo culture media, potassium simplex optimization medium (KSOM) and tissue culture medium (TCM 199), or between different classifications of NT embryos (48 h after fusion). First, we cultured NT embryos to the cleavage stage (48 h after fusion) in either TCM 199 supplemented with 1 mg/ml bovine serum albumin (BSA) or KSOM, then used TCM 199 supplemented with 10% FBS to culture them to the blastula stage. The results show that NT embryos in KSOM (19.5%) were superior to those in TCM 199 (10.6%) in blastulation. In the second experiment, we found that the growth rates of NT embryos (48 h after fusion) differed, so we divided them into four groups, 2-cell, 3- to 4-cell, 5- to 8-cell, and >8-cell, under a stereomicroscope and cultured each group in vitro. The results show that day-2 embryos at the 3- to 4-cell and 5- to 8-cell stages (31.9 and 28.2%, P < 0.05) had higher blastocyst formation rates than those at either the 2-cell (9.1%) or >8-cell (8.3%) stage, and finally three healthy cloned transgenic goats were successfully produced using 3- to 8-cell embryos at day 2 (82%). Using Hoechst 33342 staining, we also found that the >8-cell embryos at day 2 showed a higher frequency of fragmentation, which may be one cause of the low blastocyst formation rate.
This study therefore demonstrates that KSOM could be selected as the early embryo culture medium, and that 3- to 8-cell embryos at day 2 (48 h after fusion) may be the most suitable for transplantation. This could reduce nuclear fragmentation and yield good-quality blastocysts, enhance the efficiency of transgenic cloned dairy goat production, and decrease the economic loss due to embryonic mortality when embryos are transferred to synchronized recipients. Key words: nuclear transfer, KSOM, transgenic, human lactoferrin, dairy goat
Deep-Learning-Enabled Fast Optical Identification and Characterization of Two-Dimensional Materials
Advanced microscopy and/or spectroscopy tools play an indispensable role in
nanoscience and nanotechnology research, as they provide rich information about
the growth mechanism, chemical compositions, crystallography, and other
important physical and chemical properties. However, the interpretation of
imaging data heavily relies on the "intuition" of experienced researchers. As a
result, many of the deep graphical features obtained through these tools are
often unused because of difficulties in processing the data and finding the
correlations. Such challenges can be well addressed by deep learning. In this
work, we use the optical characterization of two-dimensional (2D) materials as
a case study, and demonstrate a neural-network-based algorithm for the material
and thickness identification of exfoliated 2D materials with high prediction
accuracy and real-time processing capability. Further analysis shows that the
trained network can extract deep graphical features such as contrast, color,
edges, shapes, segment sizes and their distributions, based on which we develop
an ensemble approach to predict the most relevant physical properties of 2D
materials. Finally, a transfer learning technique is applied to adapt the
pretrained network to other applications such as identifying layer numbers of a
new 2D material, or materials produced by a different synthetic approach. Our
artificial-intelligence-based material characterization approach is a powerful
tool that would speed up the preparation, initial characterization of 2D
materials and other nanomaterials and potentially accelerate new material
discoveries.
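The underlying classification idea can be made concrete with a deliberately
tiny stand-in: flakes of different thickness show different optical contrast
against the substrate, and a model maps contrast features to a layer count.
The nearest-centroid classifier and the synthetic contrast values below are
illustrative assumptions only, not the paper's trained network or measured
data.

```python
import numpy as np

# Assumed (synthetic) mean RGB optical contrast per layer count -- placeholder
# values for illustration, not measurements from any real 2D material.
CENTROIDS = {1: np.array([0.05, 0.08, 0.12]),
             2: np.array([0.11, 0.16, 0.22]),
             3: np.array([0.18, 0.25, 0.33])}

def predict_layers(rgb_contrast):
    """Return the layer count whose reference contrast is nearest (a toy
    stand-in for the paper's neural-network classifier)."""
    return min(CENTROIDS, key=lambda k: np.linalg.norm(rgb_contrast - CENTROIDS[k]))

rng = np.random.default_rng(42)
sample = CENTROIDS[2] + rng.normal(0, 0.01, 3)  # noisy "bilayer" measurement
layers = predict_layers(sample)                 # 2, since the noise is small
```

A real pipeline would of course learn such decision boundaries from labeled
microscope images rather than hand-set centroids, which is precisely what the
paper's network does at full image resolution.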
Non-compartment model to compartment model pharmacokinetics transformation meta-analysis – a multivariate nonlinear mixed model
Background
To enable model-based drug development, the very first step is usually model establishment from the published literature. The pharmacokinetic model is the central piece of model-based drug development. This paper proposes an approach to transform published non-compartment model pharmacokinetic (PK) parameters into compartment model PK parameters. The meta-analysis was performed with a multivariate nonlinear mixed model, and a conditional first-order linearization approach was developed for statistical estimation and inference.
Results
Using MDZ (midazolam) as an example, we showed that this approach successfully transformed 6 non-compartment model PK parameters from 10 publications into 5 compartment model PK parameters. In simulation studies, we showed that this multivariate nonlinear mixed model had little relative bias (<1%) in estimating compartment model PK parameters when all non-compartment PK parameters were reported in every study. When some non-compartment PK parameters were missing from the published studies, the relative bias of the compartment model PK parameters was still small (<3%). The 95% coverage probabilities of these PK parameter estimates were above 85%.
Conclusions
This meta-analysis approach for transforming non-compartment model PK parameters into compartment model parameters possesses valid statistical inference. It can be routinely used in model-based drug development.
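In the simplest case, the transformation the paper performs statistically has a
closed form. Assuming a one-compartment IV-bolus model (an illustrative
simplification; the paper handles richer models, between-study variability, and
missing parameters), the non-compartmental AUC and terminal half-life determine
clearance, elimination rate, and volume:

```python
import math

def compartment_from_nca(dose, auc_inf, t_half):
    """Recover one-compartment IV-bolus parameters (CL, V, k) from
    non-compartmental AUC(0-inf) and terminal half-life.

    A deterministic sketch of the functional relationship that the paper's
    multivariate nonlinear mixed model inverts statistically across studies."""
    cl = dose / auc_inf       # clearance: CL = Dose / AUC(0-inf)
    k = math.log(2) / t_half  # elimination rate constant: k = ln 2 / t_half
    v = cl / k                # volume of distribution: V = CL / k
    return cl, v, k

# Round-trip check with assumed illustrative values (not midazolam data):
dose, cl_true, v_true = 5.0, 25.0, 60.0        # mg, L/h, L
auc = dose / cl_true                           # forward map: NCA AUC
t_half = math.log(2) * v_true / cl_true        # forward map: NCA half-life
cl, v, k = compartment_from_nca(dose, auc, t_half)
```

The statistical difficulty the paper addresses is that each publication reports
only a noisy, possibly incomplete subset of such NCA parameters, so the
inversion must be done jointly across studies with a nonlinear mixed model
rather than study by study.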
Bayesian pharmacokinetic model-based drug-drug interaction prediction
A drug-drug interaction (DDI) occurs when a concomitantly administered drug affects the pharmacokinetic (PK) profile and/or the therapeutic or adverse effects of another drug. DDI has emerged as a crucial issue in drug development, and its multi-fold impact on the pharmaceutical industry, public health, and society is now recognized. In this dissertation, we consider three specific problems: (i) How can we predict in vivo DDI from the available in vivo data for the individual drugs and early-stage in vitro DDI data, in order to screen out compounds that pose potential DDI risk before they enter later-phase drug development and the market? (ii) How can we clinically evaluate the model built in (i) against data from an observed clinical pharmacokinetic DDI study? (iii) The non-compartmental model has been a popular PK method; however, it is less useful than the compartmental model for prediction problems such as DDI prediction. Recovering compartmental model parameters from published non-compartmental results is therefore a crucial step before compartmental modeling can be undertaken, and it is now an obstacle facing pharmacokinetic modelers. For problem (i), we develop an integrated, stochastic, mechanism-based DDI-prediction methodology, in which we build a system of PK models connected via the well-stirred, enzyme-inhibition-based liver model, adopt a Bayesian hierarchical nonlinear framework to incorporate the multi-level prior information learned for the PK parameters, and implement a Monte Carlo simulation-based approach to give the DDI prediction evaluation stochastic properties. The system of differential equations is solved numerically. Next, for problem (ii), we propose Monte Carlo-based equivalence tests to perform clinical evaluation of the built model at the mean level and at the variability level, respectively. Some asymptotic properties of the testing approaches related to sample size and power are derived analytically.
For illustration, we use the ketoconazole-midazolam pair as an example. Lastly, for problem (iii), we propose a Bayesian meta-analytic approach. We develop a three-level Bayesian hierarchical model based on summarized data across studies, in which we derive the (nonlinear) functional relationship between the observed non-compartmental model PK parameters and the unknown compartmental model parameters. A first-order Taylor approximation is used, and hybrid MCMC sampling (a Gibbs sampler with Metropolis-Hastings steps) makes posterior inference computationally feasible. We evaluate the performance of the proposed model by analyzing both simulated and real data; for illustration, we use midazolam as an example. Ph.D. dissertation, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/126988/2/3287673.pd
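A deterministic flavor of the problem-(i) machinery is the standard static
competitive-inhibition model built on the well-stirred assumption: if a
fraction fm of the victim drug's clearance goes through the inhibited enzyme,
the predicted AUC fold-change is 1 / (fm / (1 + [I]/Ki) + (1 - fm)). The
sketch below is that textbook formula only, with made-up numbers; the
dissertation's actual method is a Bayesian hierarchical ODE system evaluated
by Monte Carlo simulation, which this does not attempt.

```python
def auc_ratio(fm, inhibitor_conc, ki):
    """Predicted fold-change in victim-drug AUC under competitive inhibition
    of the metabolizing enzyme (static mechanistic model).

    fm: fraction of victim clearance via the inhibited enzyme (0..1)
    inhibitor_conc, ki: unbound inhibitor concentration and inhibition
    constant, in the same units."""
    inhibited = fm / (1.0 + inhibitor_conc / ki)  # surviving enzymatic clearance
    return 1.0 / (inhibited + (1.0 - fm))

# Assumed illustrative numbers (not fitted ketoconazole-midazolam values):
# with 90% of clearance via the enzyme and [I]/Ki = 9, AUC rises about 5.3-fold.
fold_change = auc_ratio(0.9, 9.0, 1.0)
```

Note the sanity checks built into the formula: with fm = 0 (no clearance via
the enzyme) the ratio is exactly 1, and as [I]/Ki grows the ratio saturates at
1 / (1 - fm).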