Jing Tong Yu Shu, a traditional Chinese medicine, suppresses IL-1β and IL-6 gene expressions in macrophages, and alleviates endometriosis
Purpose: To evaluate the effect of a traditional Chinese medicine, Jing Tong Yu Shu (JTYS), on endometriosis in a rat surgical model. Methods: Endometriosis was induced in 40 female rats. The rats were randomly divided into four groups: three JTYS groups given different doses of the drug, and a saline group. After four weeks of treatment with JTYS, the volume of the endometriotic explants was measured, and the levels of IL-1β and IL-6 in peritoneal fluid and serum were determined by enzyme-linked immunosorbent assay (ELISA). The production of the cytokines IL-1β and IL-6 by peritoneal macrophages was also measured for each group. Results: JTYS treatment brought about regression of the implants and inhibition of IL-1β and IL-6 production in a dose-dependent manner, with high-dose JTYS eliciting a 66.76% reduction in mean endometriotic explant volume. Serum and peritoneal fluid levels of IL-1β and IL-6 were significantly lower in the high-dose JTYS group than in the saline group (p < 0.05). JTYS treatment also significantly inhibited IL-1β and IL-6 production in peritoneal macrophages (p < 0.05). Conclusion: These results suggest that JTYS treatment leads to regression of endometriotic lesions in rats. Thus, JTYS has the potential to be developed into a new drug for the treatment of endometriosis. Keywords: Endometriosis, Interleukins, Traditional Chinese medicine, Jing Tong Yu Shu, Macrophage
Towards Foundation Models for Learning on Tabular Data
Learning on tabular data underpins numerous real-world applications. Despite
considerable efforts in developing effective learning models for tabular data,
current transferable tabular models remain in their infancy, limited by either
the lack of support for direct instruction following in new tasks or the
neglect of acquiring foundational knowledge and capabilities from diverse
tabular datasets. In this paper, we propose Tabular Foundation Models (TabFMs)
to overcome these limitations. TabFMs harness the potential of generative
tabular learning, employing a pre-trained large language model (LLM) as the
base model and fine-tuning it using purpose-designed objectives on an extensive
range of tabular datasets. This approach endows TabFMs with a profound
understanding and universal capabilities essential for learning on tabular
data. Our evaluations underscore TabFM's effectiveness: not only does it
significantly excel in instruction-following tasks like zero-shot and
in-context inference, but it also showcases performance that approaches, and in
some instances even transcends, the renowned yet mysterious closed-source LLMs like
GPT-4. Furthermore, when fine-tuned with scarce data, our model achieves
remarkable efficiency and, with abundant training data, maintains competitive
performance. Finally, while our results are promising, we also delve into
TabFM's limitations and potential opportunities, aiming to stimulate and
expedite future research on developing more potent TabFMs.
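As a reading aid only, here is a minimal sketch of the kind of generative tabular learning the abstract describes: a tabular record is serialized into an instruction-style prompt and fed to a pre-trained causal LLM. The column names, task description, and the use of "gpt2" as a stand-in base model are illustrative assumptions, not the authors' setup.

```python
# Sketch (assumptions, not the TabFM code): serialize one tabular row into an
# instruction prompt and let a pre-trained causal LLM generate the answer.
from transformers import AutoTokenizer, AutoModelForCausalLM

def row_to_prompt(row: dict, task: str) -> str:
    """Flatten a tabular record into a natural-language feature description."""
    features = "; ".join(f"{col} is {val}" for col, val in row.items())
    return f"Task: {task}\nFeatures: {features}\nAnswer:"

tokenizer = AutoTokenizer.from_pretrained("gpt2")           # placeholder base LLM
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = row_to_prompt(
    {"age": 42, "income": 58000, "owns_home": "yes"},       # hypothetical columns
    task="Predict whether the customer will churn (yes/no).",
)
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=3)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:]))
```

In the paper's framing, fine-tuning such a model on many tabular datasets with purpose-designed objectives is what turns the base LLM into a tabular foundation model capable of zero-shot and in-context inference.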
AoM: Detecting Aspect-oriented Information for Multimodal Aspect-Based Sentiment Analysis
Multimodal aspect-based sentiment analysis (MABSA) aims to extract aspects
from text-image pairs and recognize their sentiments. Existing methods make
great efforts to align the whole image to corresponding aspects. However,
different regions of the image may relate to different aspects in the same
sentence, and coarsely establishing image-aspect alignment will introduce noise
to aspect-based sentiment analysis (i.e., visual noise). Besides, the sentiment
of a specific aspect can also be interfered with by descriptions of other aspects
(i.e., textual noise). Considering these sources of noise, this paper
proposes an Aspect-oriented Method (AoM) to detect aspect-relevant semantic and
sentiment information. Specifically, an aspect-aware attention module is
designed to simultaneously select textual tokens and image blocks that are
semantically related to the aspects. To accurately aggregate sentiment
information, we explicitly introduce sentiment embedding into AoM, and use a
graph convolutional network to model the vision-text and text-text interaction.
Extensive experiments demonstrate the superiority of AoM to existing methods.
The source code is publicly released at https://github.com/SilyRab/AoM. Comment: Findings of ACL 2023
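For intuition, the following is a minimal sketch of an aspect-aware attention layer in the spirit of the abstract: an aspect query scores text tokens and image blocks jointly, so only aspect-relevant features are aggregated. The dimensions and module structure are illustrative assumptions, not the released AoM code.

```python
# Sketch (assumed design, not the AoM implementation): aspect-aware attention
# over concatenated text-token and image-block features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AspectAwareAttention(nn.Module):
    def __init__(self, dim: int = 768):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)   # projects the aspect representation
        self.k_proj = nn.Linear(dim, dim)   # projects text/image features

    def forward(self, aspect, text_feats, image_feats):
        # aspect: (B, D); text_feats: (B, Lt, D); image_feats: (B, Li, D)
        feats = torch.cat([text_feats, image_feats], dim=1)      # (B, Lt+Li, D)
        q = self.q_proj(aspect).unsqueeze(1)                     # (B, 1, D)
        k = self.k_proj(feats)                                   # (B, Lt+Li, D)
        scores = (q @ k.transpose(1, 2)) / k.size(-1) ** 0.5     # (B, 1, Lt+Li)
        weights = F.softmax(scores, dim=-1)                      # aspect-relevance weights
        return (weights @ feats).squeeze(1)                      # (B, D) aspect-focused summary

# Example with random tensors standing in for text states and image patches.
attn = AspectAwareAttention()
out = attn(torch.randn(2, 768), torch.randn(2, 20, 768), torch.randn(2, 49, 768))
print(out.shape)  # torch.Size([2, 768])
```

In the full method, such aspect-focused features would additionally be combined with sentiment embeddings and a graph convolutional network over vision-text and text-text edges, which the sketch omits.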
Pruning random resistive memory for optimizing analogue AI
The rapid advancement of artificial intelligence (AI) has been marked by
large language models exhibiting human-like intelligence. However, these models
also present unprecedented challenges to energy consumption and environmental
sustainability. One promising solution is to revisit analogue computing, a
technique that predates digital computing and exploits emerging analogue
electronic devices, such as resistive memory, which features in-memory
computing, high scalability, and nonvolatility. However, analogue computing
still faces the same challenges as before: programming nonidealities and
expensive programming due to the underlying device physics. Here, we report a
universal solution, software-hardware co-design using structural
plasticity-inspired edge pruning to optimize the topology of a randomly
weighted analogue resistive memory neural network. Software-wise, the topology
of a randomly weighted neural network is optimized by pruning connections
rather than precisely tuning resistive memory weights. Hardware-wise, we reveal
the physical origin of the programming stochasticity using transmission
electron microscopy, which is leveraged for large-scale and low-cost
implementation of an overparameterized random neural network containing
high-performance sub-networks. We implemented the co-design on a 40 nm 256K
resistive memory macro, observing 17.3% and 19.9% accuracy improvements in
image and audio classification on the FashionMNIST and Spoken Digits datasets,
respectively, as well as a 9.8% (2%) improvement in PR (ROC) in image
segmentation on the DRIVE dataset. This is accompanied by 82.1%, 51.2%, and 99.8%
improvement in energy efficiency thanks to analogue in-memory computing. By
embracing the intrinsic stochasticity and in-memory computing, this work may
overcome the biggest obstacle of analogue computing systems and thus unleash their
immense potential for next-generation AI hardware.
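To make the pruning idea concrete, below is a minimal software-side sketch of training a sub-network inside a fixed randomly weighted layer: the weights stay at their random (analogue-programmed) values and only per-edge scores are learned, with the top-scoring edges kept. This edge-popup-style formulation is an assumption for illustration, not the paper's implementation.

```python
# Sketch (assumed formulation): prune connections of a randomly weighted layer
# instead of tuning its weights.
import torch
import torch.nn as nn

class PrunedRandomLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int, keep_ratio: float = 0.5):
        super().__init__()
        # Fixed random weights stand in for stochastic resistive-memory conductances.
        self.weight = nn.Parameter(torch.randn(out_features, in_features), requires_grad=False)
        self.scores = nn.Parameter(torch.randn(out_features, in_features))  # learnable edge scores
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        k = int(self.scores.numel() * self.keep_ratio)
        # Threshold so that exactly the top-k scored edges remain active.
        threshold = self.scores.flatten().kthvalue(self.scores.numel() - k + 1).values
        mask = (self.scores >= threshold).float()
        # Straight-through estimator: gradients update the scores, never the weights.
        mask = mask + self.scores - self.scores.detach()
        return x @ (self.weight * mask).t()

layer = PrunedRandomLinear(784, 10, keep_ratio=0.3)
logits = layer(torch.randn(32, 784))   # e.g. a batch of flattened FashionMNIST images
print(logits.shape)                    # torch.Size([32, 10])
```

The design choice mirrors the abstract's point: because only a binary topology is learned, the hardware never needs precise conductance tuning, sidestepping programming stochasticity.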
Random resistive memory-based deep extreme point learning machine for unified visual processing
Visual sensors, including 3D LiDAR, neuromorphic DVS sensors, and
conventional frame cameras, are increasingly integrated into edge-side
intelligent machines. Realizing intensive multi-sensory data analysis directly
on edge intelligent machines is crucial for numerous emerging edge
applications, such as augmented and virtual reality and unmanned aerial
vehicles, which necessitates unified data representation, unprecedented
hardware energy efficiency and rapid model training. However, multi-sensory
data are intrinsically heterogeneous, causing significant complexity in the
system development for edge-side intelligent machines. In addition, the
performance of conventional digital hardware is limited by the physically
separated processing and memory units, known as the von Neumann bottleneck, and
the physical limit of transistor scaling, which contributes to the slowdown of
Moore's law. These limitations are further intensified by the tedious training
of models with ever-increasing sizes. We propose a novel hardware-software
co-design, random resistive memory-based deep extreme point learning machine
(DEPLM), that offers efficient unified point set analysis. We show the system's
versatility across various data modalities and two different learning tasks.
Compared to a conventional digital hardware-based system, our co-design
achieves substantial energy efficiency improvements and training cost
reductions. Our random resistive memory-based deep extreme point learning
machine may pave the way for energy-efficient and training-friendly edge AI
across various data modalities and tasks.
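As a rough illustration of the extreme-learning-machine idea behind DEPLM, the sketch below freezes random per-point projection layers (mimicking fixed random resistive-memory arrays) and trains only a lightweight readout, which is what keeps training cheap. The layer sizes, pooling choice, and class count are hypothetical, not the authors' architecture.

```python
# Sketch (assumptions, not the DEPLM implementation): fixed random point-wise
# feature extractor with a small trainable readout.
import torch
import torch.nn as nn

class DeepExtremePointNet(nn.Module):
    def __init__(self, in_dim: int = 3, hidden: int = 256, classes: int = 10):
        super().__init__()
        # Random per-point feature extractor, frozen like programmed conductances.
        self.random_mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        for p in self.random_mlp.parameters():
            p.requires_grad = False
        self.readout = nn.Linear(hidden, classes)   # only this layer is trained

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (B, N, in_dim), e.g. a LiDAR point cloud or DVS event cloud
        feats = self.random_mlp(points)              # per-point features, (B, N, hidden)
        pooled = feats.max(dim=1).values             # permutation-invariant pooling
        return self.readout(pooled)

model = DeepExtremePointNet()
logits = model(torch.randn(8, 1024, 3))              # batch of 8 clouds, 1024 points each
print(logits.shape)                                   # torch.Size([8, 10])
```

Treating LiDAR points, DVS events, and image pixels alike as point sets is what gives the approach its unified data representation across sensor modalities.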
The Influence of Kaolinite and Quartz on Stability of Coal Froths – A Rheology and Structure Study
- …