PROTECTION OF OWNERSHIP ACCORDING TO THE EUROPEAN CONVENTION ON HUMAN RIGHTS AND THE CONSTITUTION OF GEORGIA
As John F. Kennedy put it, "the rights of every man are diminished when the rights of one man are threatened" (Report to the American People on Civil Rights, June 11, 1963). Today, in Georgia as in the rest of the modern world, the right to property is one of the most important rights and one of the foundations of a country's development; it is therefore essential that this right be respected. The right to property is regulated by the national law of almost every country and by international treaties. The protection of private property is one of the main issues in developing a liberal economy and in ensuring sustainable democratic, political and legislative systems that serve the interests of modern civilized society and of its individual members. In such a system, the state acts as a guarantor of the effective exercise of the right to property and does not interfere in the free transfer of property among individuals. In other words, the function of the modern state is the protection of property. Such a state shall not impose pointless restrictions and limitations on the peaceful and effective use of property, except where such restrictions are strictly necessary, proportionate and grounded in legal principles; this is the precondition for the effective exercise of the right to property.
In-Context Learning Functions with Varying Number of Minima
Large Language Models (LLMs) have proven effective at In-Context Learning
(ICL), an ability that allows them to create predictors from labeled examples.
Few studies have explored the interplay between ICL and the specific properties of
the functions it attempts to approximate. In our study, we use a formal framework
to explore ICL and propose a new task of approximating functions with a varying
number of minima. We implement a method for producing functions
with given inputs as minima. We find that increasing the number of minima
degrades ICL performance. At the same time, our evaluation shows that ICL
outperforms a 2-layer Neural Network (2NN) model. Furthermore, ICL learns faster
than 2NN in all settings. We validate the findings through a set of few-shot
experiments across various hyperparameter configurations.
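One simple way to produce a function whose minima fall at prescribed inputs is to take the product of squared distances to each target point; every target then attains the global minimum value of zero. The sketch below is illustrative only and is not necessarily the construction used in the paper.

```python
# Minimal sketch: build a 1-D function with global minima (value 0)
# at every point in `minima`, via a product of squared distance terms.
# This is an assumed construction for illustration, not the paper's method.

def make_function_with_minima(minima):
    """Return f(x) that attains its minimum value 0 at each point in `minima`."""
    def f(x):
        result = 1.0
        for m in minima:
            result *= (x - m) ** 2  # zero exactly when x == m
        return result
    return f

f = make_function_with_minima([-1.0, 0.5, 2.0])
print(f(-1.0), f(0.5), f(2.0))  # each prescribed input evaluates to 0.0
print(f(1.0))                   # strictly positive away from the minima
```

Adding more targets to `minima` multiplies in more factors, directly increasing the number of minima the in-context learner must fit.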
Few-Shot Learning for Clinical Natural Language Processing Using Siamese Neural Networks
Clinical Natural Language Processing (NLP) has become an emerging technology
in healthcare that leverages a large amount of free-text data in electronic
health records (EHRs) to improve patient care, support clinical decisions, and
facilitate clinical and translational science research. Recently, deep learning
has achieved state-of-the-art performance in many clinical NLP tasks. However,
training deep learning models usually requires large annotated datasets, which
are normally not publicly available and can be time-consuming to build in
clinical domains. Working with smaller annotated datasets is typical in
clinical NLP and therefore, ensuring that deep learning models perform well is
crucial for the models to be used in real-world applications. A widely adopted
approach is fine-tuning existing Pre-trained Language Models (PLMs), but these
attempts fall short when the training dataset contains only a few annotated
samples. Few-Shot Learning (FSL) has recently been investigated to tackle this
problem. Siamese Neural Network (SNN) has been widely utilized as an FSL
approach in computer vision, but has not been studied well in NLP. Furthermore,
the literature on its applications in clinical domains is scarce. In this
paper, we propose two SNN-based FSL approaches for clinical NLP, including
Pre-Trained SNN (PT-SNN) and SNN with Second-Order Embeddings (SOE-SNN). We
evaluated the proposed approaches on two clinical tasks, namely clinical text
classification and clinical named entity recognition. We tested three few-shot
settings including 4-shot, 8-shot, and 16-shot learning. Both clinical NLP
tasks were benchmarked using three PLMs: BERT, BioBERT, and
BioClinicalBERT. The experimental results verified the effectiveness of the
proposed SNN-based FSL approaches in both NLP tasks.
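At inference time, a Siamese-style few-shot classifier compares a query's embedding against the embeddings of the few labeled support examples and assigns the label of the closest match. The sketch below illustrates that comparison with hand-made toy vectors standing in for PLM sentence embeddings; it is an assumed illustration, not the paper's PT-SNN or SOE-SNN architecture.

```python
# Minimal sketch of Siamese-style few-shot inference: label a query
# by cosine similarity to a handful of labeled support embeddings.
# The toy 3-d vectors below stand in for PLM sentence embeddings.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def predict(query, support):
    """Return the label of the support example most similar to `query`.

    `support` is a list of (embedding, label) pairs, e.g. 4 per class
    in a 4-shot setting.
    """
    best_label, best_sim = None, -2.0
    for emb, label in support:
        sim = cosine(query, emb)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label

# Toy embeddings for two hypothetical clinical-text classes.
support = [
    ([0.9, 0.1, 0.0], "positive-finding"),
    ([0.8, 0.2, 0.1], "positive-finding"),
    ([0.1, 0.9, 0.2], "negative-finding"),
    ([0.0, 0.8, 0.3], "negative-finding"),
]
print(predict([0.85, 0.15, 0.05], support))  # -> positive-finding
```

In a real PT-SNN/SOE-SNN pipeline the embeddings would come from a fine-tuned PLM encoder trained on similarity pairs, but the nearest-support decision rule is the same.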