
    Fuzzy gravity approach for determinants of exports

    Purpose: This study proposes a fuzzy gravity model approach to examine the main factors that affect Turkey's export volume and whether membership in the European Union Customs Union (EUCU) affects its exports. Design/Methodology/Approach: A fuzzy approach was developed for the gravity model using the following variables. The dependent variable was Turkey's export volume to 204 countries worldwide; the explanatory variables were the gross domestic product (GDP) of each destination country, its distance to Turkey (dij), its population, whether seaway transport was possible, and EUCU membership. The last two entered the model as dummy variables. Findings: The results showed that the destination country's GDP, its EUCU membership, and its population affected Turkey's export volume positively, whereas its distance to Turkey had a negative effect. The coefficient of the distance variable was negative fuzzy and that of the seaway variable was fully fuzzy; however, when the seaway variable was taken together with distance, it was positive fuzzy, indicating the positive effect that the option of sea transport over long distances has on export volume. Practical Implications: Policies established on the basis of these findings should help improve the country's exports. Other available fuzzy regression approaches may be tried in future studies to obtain a fuzzy model, and Turkey's import or overall foreign trade structure may be analysed with the same approach. Originality/Value: The fuzzy gravity model approach used here is a novelty relative to the methods in the literature, motivated by the criticism that the gravity model's theoretical basis is weak and that the relations between the dependent and explanatory variables are not adequately clear.
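    The classical log-linear gravity specification underlying this fuzzy extension can be sketched as follows (a generic textbook form; the coefficient symbols are illustrative and not quoted from the paper):

        \ln X_{ij} = \beta_0 + \beta_1 \ln GDP_j + \beta_2 \ln POP_j + \beta_3 \ln d_{ij} + \beta_4 SEA_j + \beta_5 EUCU_j + \varepsilon_{ij}

    where X_{ij} is Turkey's export volume to country j, d_{ij} the distance of country j to Turkey, and SEA_j and EUCU_j the seaway and customs-union dummies; in the fuzzy variant the coefficients \beta_0..\beta_5 become fuzzy numbers rather than crisp estimates.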

    Holography of Gravitational Action Functionals

    The Einstein-Hilbert (EH) action can be separated into a bulk and a surface term, with a specific ("holographic") relationship between the two, so that either can be used to extract information about the other. The surface term can also be interpreted as the entropy of the horizon in a wide class of spacetimes. Since the EH action is likely to be just the first term in the derivative expansion of an effective theory, it is interesting to ask whether these features continue to hold for more general gravitational actions. We provide a comprehensive analysis of Lagrangians of the form L=Q_a^{bcd}R^a_{bcd}, in which Q_a^{bcd} is a tensor with the symmetries of the curvature tensor, built from the metric and the curvature tensor, and satisfying the condition \nabla_cQ^{abcd}=0, and we show that they share these features. The Lanczos-Lovelock Lagrangians are the subset of these in which Q^{abcd} is a homogeneous function of the curvature tensor. They are all holographic, in a specific sense of the term, and in all these cases the surface term can be interpreted as the horizon entropy. The thermodynamic route to gravity, in which the field equations are interpreted as TdS=dE+pdV, thus appears to have a greater degree of validity than the field equations of Einstein gravity itself. The results suggest that the holographic feature of the EH action could also serve as a new symmetry principle constraining the semiclassical corrections to Einstein gravity. The implications are discussed.
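    For reference, the EH case corresponds to the simplest admissible choice of Q (a standard form in units G = c = 1; written here for orientation, not quoted verbatim from the paper):

        Q_{ab}^{\ \ cd} = \frac{1}{32\pi}\left(\delta_a^c \delta_b^d - \delta_a^d \delta_b^c\right) \quad\Rightarrow\quad L = Q_{ab}^{\ \ cd} R^{ab}_{\ \ cd} = \frac{R}{16\pi}

    Here \nabla_c Q^{abcd}=0 holds trivially because Q is built from the metric alone; the Lanczos-Lovelock cases add terms of higher order in the curvature while keeping the divergence-free condition.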

    Sample selection with SOMP for robust basis recovery in sparse coding dictionary learning

    Sparse Coding Dictionary (SCD) learning decomposes a given hyperspectral image into a linear combination of a few bases. In a natural scene the abundances of materials are imbalanced, so how well a given material is learned is directly proportional to its abundance in the training scene: with a random selection of pixels for training, the probability that the bases learn a given material is proportional to its distribution in the scene. We propose to use the SOMP residue for sample selection at each iteration for a more robust, 'more complete' learning. Experiments show that the proposed method learns both background and trace materials accurately, with Pearson correlation coefficients above 0.95. Furthermore, the proposed implementation yields considerable improvements in target detection with the Adaptive Cosine Estimator (ACE).
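    A minimal sketch of the residue-guided selection idea, assuming per-pixel OMP as the sparse coder (the function names and selection rule below are our illustration, not the authors' code):

        import numpy as np

        def omp(D, x, k):
            """Orthogonal Matching Pursuit: k-sparse code of pixel x over dictionary D."""
            r, idx = x.copy(), []
            for _ in range(k):
                idx.append(int(np.argmax(np.abs(D.T @ r))))          # best-matching atom
                coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)  # refit on support
                r = x - D[:, idx] @ coef                              # update residual
            return idx, coef, np.linalg.norm(r)

        def select_training_pixels(D, X, k, n):
            """Pick the n pixels the current dictionary reconstructs worst, so
            rare (trace) materials are not drowned out by abundant background."""
            residues = np.array([omp(D, X[:, j], k)[2] for j in range(X.shape[1])])
            return np.argsort(residues)[-n:]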

    Using in-situ microLaue diffraction to understand plasticity in MgO

    The present study investigates the micromechanical modes of deformation in MgO prior to cracking at room temperature. The combination of time-resolved white-beam Laue diffraction and in-situ nano-indentation of large single-crystal micropillars provides a unique method for studying the operating deformation mechanisms in this otherwise brittle oxide ceramic. Upon indenting a [100]-oriented MgO micropillar, rotation and streaking of the Laue spots were observed. From the streaking of the Laue spots, differential slip on orthogonal {110} slip planes was inferred to take place in adjacent areas under the indent; this was consistent with the results of the transmission electron microscopy studies. Upon cyclic loading of the pillar, the subsequent stretching and relaxation of peaks was hypothesised to arise from the pronounced mechanical hysteresis commonly observed in MgO. Time-resolved spatial maps of the deformation gradients of the area under the indent were also obtained, from which the strain and rotation components were identified.
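    The strain/rotation split mentioned above is the standard small-deformation decomposition of a measured deformation gradient (shown here generically; the paper's per-pixel Laue analysis is more involved):

        import numpy as np

        def strain_and_rotation(F):
            """Split a 3x3 deformation gradient F into small-strain and
            lattice-rotation parts via the displacement gradient H = F - I."""
            H = F - np.eye(3)
            strain = 0.5 * (H + H.T)      # symmetric part: elastic strain
            rotation = 0.5 * (H - H.T)    # antisymmetric part: lattice rotation
            return strain, rotation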

    Building a Sentiment Corpus of Tweets in Brazilian Portuguese

    The large amount of data available in social media, forums and websites motivates research in several areas of Natural Language Processing, such as sentiment analysis. The popularity of the area, owing to its subjective and semantic characteristics, motivates research on novel methods and approaches for classification. Hence, there is a high demand for datasets in different domains and languages. This paper introduces TweetSentBR, a sentiment corpus for Brazilian Portuguese with 15,000 manually annotated sentences in the TV show domain. The sentences were labelled in three classes (positive, neutral and negative) by seven annotators, following literature guidelines for ensuring annotation reliability. We also ran baseline experiments on polarity classification using three machine learning methods, reaching 80.99% F-measure and 82.06% accuracy on binary classification, and 59.85% F-measure and 64.62% accuracy on three-class classification. Accepted for publication at the 11th International Conference on Language Resources and Evaluation (LREC 2018).
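    A minimal bag-of-words baseline of the kind such experiments use (illustrative only; the paper's exact features and classifiers may differ, and the toy sentences below merely stand in for the corpus):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy stand-ins for TweetSentBR sentences (the corpus itself is obtained separately).
        texts = ["adorei o show", "que episodio ruim", "foi ok"]
        labels = ["positive", "negative", "neutral"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression(max_iter=1000))
        clf.fit(texts, labels)
        print(clf.predict(["melhor programa da tv"]))  # e.g. ['positive']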

    An Online Decision-Theoretic Pipeline for Responder Dispatch

    The problem of dispatching emergency responders to service traffic accidents, fires, distress calls and crimes plagues urban areas across the globe. While such problems have been studied extensively, most approaches are offline. These methodologies fail to capture the dynamically changing environments under which critical emergency response occurs and therefore fail to work in practice. Any holistic approach to building a pipeline for effective emergency response must also address the challenges it subsumes: predicting when and where incidents happen and understanding the changing environmental dynamics. We describe a system that deals with all of these problems in an online manner, meaning that the models are updated with streaming data sources. We highlight why such an approach is crucial to the effectiveness of emergency response, and we present an algorithmic framework that can compute promising actions for a given decision-theoretic model of responder dispatch. We argue that carefully crafted heuristic measures can balance the trade-off between computational time and solution quality, and we highlight why such an approach is more scalable and tractable than traditional approaches. We also present an online mechanism for incident prediction, as well as an approach based on recurrent neural networks for learning and predicting the environmental features that affect responder dispatch. Comparisons with the prior state of the art and with dispatch strategies used in the field show that our approach reduces response time with a drastic reduction in computational time. Appeared in ICCPS 2019.
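    For contrast with the decision-theoretic pipeline, the myopic rule commonly used in the field can be sketched in a few lines (a hypothetical minimal example, not the authors' algorithm):

        import math

        def dispatch_nearest(responders, incident):
            """Myopic baseline: send the nearest free responder. A decision-theoretic
            dispatcher instead scores actions by expected long-run response time."""
            free = [r for r in responders if r["free"]]
            return min(free, key=lambda r: math.dist(r["pos"], incident["pos"]),
                       default=None)

        responders = [{"id": 1, "pos": (0.0, 0.0), "free": True},
                      {"id": 2, "pos": (5.0, 5.0), "free": True}]
        print(dispatch_nearest(responders, {"pos": (4.0, 4.0)})["id"])  # -> 2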

    Kinematic landslide monitoring with Kalman filtering

    Landslides are serious geologic disasters that threaten human life and property in every country, and they are among the most important natural phenomena directly or indirectly affecting national economies. Turkey, too, is under threat: landslides occur frequently throughout the Black Sea region as well as in many parts of the Marmara, East Anatolia, and Mediterranean regions. Because of the destruction they cause, landslides rank as the second most important natural hazard in Turkey after earthquakes. In recent years several landslides have followed heavy rains and the resulting floods, making landslide monitoring and mitigation an important subject for the related professional disciplines in Turkey. Investigations of surface deformation are conducted to define the boundaries, size, level of activity and direction(s) of movement of a landslide, and to identify individual moving blocks of the main slide. This study focuses on a kinematic deformation analysis based on Kalman filtering in a landslide area north of the city of Istanbul. Positional data were collected using the GPS technique. As part of the study, a conventional static deformation analysis was also applied to the same data. The results and comparisons are discussed in this paper.
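    Kinematic deformation analysis of this kind typically models each monitored coordinate as a position-velocity-acceleration state driven by a constant-acceleration transition (a generic sketch; the epoch interval and noise matrices below are placeholders, not the study's values):

        import numpy as np

        dt = 1.0                                    # epoch interval (placeholder)
        F = np.array([[1, dt, dt**2 / 2],           # constant-acceleration model
                      [0, 1,  dt],
                      [0, 0,  1]])
        H = np.array([[1.0, 0.0, 0.0]])             # GPS observes position only

        def kalman_step(x, P, z, Q, R):
            """One predict/update cycle for a single coordinate."""
            x, P = F @ x, F @ P @ F.T + Q                     # predict
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain
            return x + K @ (z - H @ x), (np.eye(3) - K @ H) @ P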

    A radiative transfer model-based multi-layered regression learning to estimate shadow map in hyperspectral images

    The application of the Empirical Line Method (ELM) for hyperspectral Atmospheric Compensation (AC) presumes an underlying linear relationship between a material's reflectance and its appearance. ELM solves the Radiative Transfer (RT) equation under specialized constraints by means of in-scene white and black calibration panels. The reflectance of a material is invariant to illumination. Exploiting this property, we articulate a mathematical formulation based on the RT model to create cost functions relating variably illuminated regions within a scene. In this paper, we propose multi-layered regression learning-based recovery of the radiance components, i.e., the total ground-reflected radiance and the path radiance, from reflectance and radiance images of the scene. These decomposed components represent terms in the RT equation and enable us to relate variable illumination. We therefore assume that the Hyperspectral Image (HSI) radiance of the scene is available and that AC can be applied to it, preferably with the QUick Atmospheric Correction (QUAC) algorithm; QUAC is preferred because it does not require a surface model. The output of the proposed algorithm is an intermediate map of the scene, on which our mathematically derived binary and multi-label thresholds are applied to classify shadowed and non-shadowed regions. Results for satellite and airborne nadir imagery are shown in this paper. Ground truth (GT) is generated by ray-tracing on a LIDAR-based surface model of the scene in the form of contour data. Comparison of our results with the GT shows that our binary-classification shadow maps outperform existing shadow detection algorithms in true positives, i.e., detecting shadows that are present in the ground truth; they also have the lowest false negatives, i.e., shadowed regions missed, of the algorithms compared.
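    The at-sensor RT relation being decomposed can be written in a standard form (symbols as defined here for orientation; this is the generic equation, not the paper's exact notation):

        L_{sensor}(\lambda) = \frac{\rho(\lambda)}{\pi}\,\tau(\lambda)\,\bigl(E_{dir}(\lambda) + E_{diff}(\lambda)\bigr) + L_{path}(\lambda)

    where \rho is the surface reflectance, \tau the ground-to-sensor transmittance, and E_{dir}, E_{diff} the direct and diffuse ground irradiances. In a shadowed pixel E_{dir} is approximately zero, so comparing the radiance and reflectance images of the same scene isolates the illumination difference that the regression exploits.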

    Deformation analysis with Total Least Squares

    Deformation analysis is one of the main research fields in geodesy. The deformation analysis process comprises measurement and analysis phases. Measurements can be collected using several techniques, and their evaluation mainly yields point positions. In the analysis phase, the changes in the point coordinates are investigated. Several models or approaches can be employed for the analysis. One approach is based on a Helmert (similarity) coordinate transformation, in which the displacements and the respective covariance matrix are transformed into a unique datum. Traditionally, a Least Squares (LS) technique is used for the transformation procedure; an alternative methodology is Total Least Squares (TLS), a relatively new approach in geodetic applications. In this study, in order to determine point displacements, 3-D coordinate transformations based on the Helmert transformation model were carried out with both LS and TLS. The data used in this study were collected by the GPS technique in a landslide area near Istanbul. The results obtained from the two approaches are compared.
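    The TLS estimator that replaces LS can be sketched for a generic linear model A x ≈ b (the classical SVD solution; the paper applies it to the full 3-D Helmert parameterization):

        import numpy as np

        def tls(A, b):
            """Total Least Squares via SVD of the augmented matrix [A | b];
            unlike LS, errors are permitted in the design matrix A as well."""
            n = A.shape[1]
            _, _, Vt = np.linalg.svd(np.hstack([A, b.reshape(-1, 1)]))
            v = Vt[-1]                     # right singular vector of smallest sigma
            return -v[:n] / v[n]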