
    THE CLASSIFICATION OF THE MANUFACTURING INDUSTRY IN VENEZUELA: AN APPROACH FROM THE MULTIVARIATE PERSPECTIVE OF COSTS

    Delimiting the concept of small and medium-sized enterprises (PYME) and their consequent classifications responds to the interest of nations in improving the production levels of their companies, as well as in optimizing the policies established by public institutions and private chambers in terms of the use of economic resources and the development of productive chains. Studies carried out worldwide and in Venezuela classify companies by size, based on the number of employees and the value of sales. This is a limited approach because it does not consider other aspects of differentiation. As an alternative, this research proposes a grouping of companies from a multivariate perspective, incorporating the study of the principal cost components of the Venezuelan manufacturing industry. A further element is the comparison of this classification with the groups obtained from the methodology proposed by the Southern Common Market (MERCOSUR) for classifying companies, with the criteria proposed in the Law for the Promotion and Development of Small and Medium-sized Industry (PYMI), and with the criteria established by Fundes. The MERCOSUR classification is considered a valid grouping technique for distinguishing large industry from small or micro industry, but not within medium-sized industry, where the contribution of other variables such as intermediate consumption and labour cost is decisive for differentiation. The classifications based on multivariate cost profiles identified six groups of industries: two determined by the value of intermediate consumption and labour cost; a third formed by the effect of non-characteristic income; a fourth influenced by the effect of operating surplus and value added; and a fifth and sixth determined by intermediate consumption, value of sales and consumption of national raw materials. In terms of economic activity, they reduce to two groups: the first formed by a large number of small and medium-sized industries with varied activities, and the second made up of large industries dedicated to the production of food, beverages and tobacco, and to the chemical, petroleum, rubber and plastics industries. It is hoped that this work will be useful for other research in the area, both at the national and the Latin American level.
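
    The grouping described is, in essence, a cluster analysis over standardized cost variables. Below is a minimal sketch of that idea, assuming k-means over synthetic log-normal cost data with six clusters to match the number of groups the study reports; the paper's actual variables, data and clustering method may differ.
```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical cost-profile matrix: one row per firm, one column per
# cost component (intermediate consumption, labour cost, value of sales,
# national raw-material consumption, operating surplus, value added).
rng = np.random.default_rng(0)
X = rng.lognormal(mean=10, sigma=1.5, size=(500, 6))  # stand-in data

# Log-transform and standardize so no single cost component dominates.
X_std = StandardScaler().fit_transform(np.log(X))

# Six clusters, matching the number of groups reported in the study.
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X_std)
print(np.bincount(labels))  # firms per cost-profile group
```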

    Collaborative Research: Multiscale Analysis of Geological Structures That Influence Crustal Seismic Anisotropy

    This project is a study of crustal material anisotropy with a focus on macroscale structural geometries and how they modify the seismic response of rock fabrics. Seismic anisotropy is the cumulative interplay between propagating seismic waves and anisotropic earth material, and it manifests itself through the directional dependence of seismic wave speeds. Unraveling this effect in deformed crustal terranes is complex due to several factors, such as 3D geological geometry and heterogeneity, microscale fabric, bending of seismic raypaths due to velocity gradients, field experiments that may not offer full azimuthal coverage, and the observation of anisotropy as second-order waveform or traveltime features. While seismic anisotropy can originate from upper crustal fractures or from organized fine-scale layering of isotropic material, material anisotropy is also a cause and involves at least four factors: (1) microstructural characteristics, including spatial arrangement, modal abundances, and crystallographic and shape orientations of constituent minerals; (2) inherent azimuthal variation of properties and approximation using symmetry classes; (3) bulk representation (effective media) of material properties at different scales; and (4) the types and internal geometries of macroscale structures. The reorientation of sample-scale material anisotropy by macroscale structures imparts its own effect: a seismic wave will produce one type of signal response due to the material, and a different response due to a package of rocks reoriented by the geometry of a structure. The researchers will use the concept of seismic effective media to represent earth volumes through which seismic waves travel. They will employ a representation of earth volumes that allows a tensorial representation of effective media; via the wave equation, this permits an algebraic tensor manipulation that separates the structural geometry from the rocks composing the structure. A primary goal of the project is to define the contributions of structure to forming effective media: each structure has a geometrical impulse response which modifies a rock texture into an effective-medium representation of the structure. A second goal is to understand how microscale rock fabrics contribute to the effective media of given structures. Both combine to produce the net effective medium that a propagating wave responds to. The researchers will conduct a quantitative and systematic study of common crustal structural geometries and how they modify rock anisotropy, represent structures using analytical geometry surfaces, and create a rigorous and integrated methodology to calculate effective media at different scales and their combined effects on seismic wave propagation. They will also examine how the tensorial form of microscale rock fabrics is sensitive to the modal compositions and statistical orientations of constituent minerals. Results of this project are designed to aid the seismic interpretation of real anisotropic seismic data. The project brings together expertise in seismology, structural/microstructural geology and theoretical/computational mechanics to help develop a quantitative framework for the analysis of material anisotropy and resulting seismic anisotropy in deformed polymineralic rocks of the continental crust.
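
    The central idea, that a macroscale structure reorients a sample-scale stiffness tensor, can be illustrated with the standard fourth-rank tensor rotation C'_ijkl = R_ia R_jb R_kc R_ld C_abcd. Below is a minimal sketch, assuming an illustrative transversely isotropic stiffness and a 30-degree structural dip; these numbers are placeholders, not the project's materials or methods.
```python
import numpy as np

VOIGT = [(0, 0), (1, 1), (2, 2), (1, 2), (0, 2), (0, 1)]

def voigt_to_tensor(C6):
    """Expand a 6x6 Voigt stiffness matrix to the full 3x3x3x3 tensor."""
    C = np.zeros((3, 3, 3, 3))
    for I, (i, j) in enumerate(VOIGT):
        for J, (k, l) in enumerate(VOIGT):
            c = C6[I, J]
            # stiffness needs no Voigt scale factors, only symmetrization
            C[i, j, k, l] = C[j, i, k, l] = C[i, j, l, k] = C[j, i, l, k] = c
    return C

def rotate(C, R):
    """C'_ijkl = R_ia R_jb R_kc R_ld C_abcd: reorientation of a rock
    fabric by a macroscale structural rotation."""
    return np.einsum('ia,jb,kc,ld,abcd->ijkl', R, R, R, R, C)

# Illustrative hexagonal (transversely isotropic) stiffness in GPa;
# the values are assumptions for demonstration only.
C6 = np.array([
    [180,  60,  25,   0,   0,   0],
    [ 60, 180,  25,   0,   0,   0],
    [ 25,  25,  60,   0,   0,   0],
    [  0,   0,   0,  15,   0,   0],
    [  0,   0,   0,   0,  15,   0],
    [  0,   0,   0,   0,   0,  60],
])
C = voigt_to_tensor(C6.astype(float))

# Fold limb dipping 30 degrees about the x-axis.
t = np.radians(30.0)
R = np.array([[1, 0, 0],
              [0, np.cos(t), -np.sin(t)],
              [0, np.sin(t),  np.cos(t)]])
C_rot = rotate(C, R)
print(C_rot[2, 2, 2, 2])  # C33 changes with the structural dip
```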

    Integrated Analytical-Computational Analysis of Microstructural Influences on Seismic Anisotropy

    The magnitudes, orientations and spatial distributions of elastic anisotropy in Earth's crust and mantle carry valuable information about gradients in thermal, mechanical and kinematic parameters arising from mantle convection, mantle-crust coupling and tectonic plate interactions. Relating seismic signals to deformation regimes requires knowledge of the elastic signatures (bulk stiffnesses) of the different microstructures that characterize specific deformation environments, but the influence of microstructural heterogeneity on bulk stiffness has not been comprehensively evaluated. The objectives of this project are to: (1) scale up a preliminary method to determine the bulk stiffness of rocks using integrated analytical (electron backscatter diffraction) and computational (asymptotic expansion homogenization) approaches that fully account for the grain-scale elastic interactions among the different minerals in the sample; (2) apply this integrated framework to investigate the effect of several common crustal microstructures on elastic anisotropy; (3) integrate time-dependent microstructure modeling with bulk stiffness calculations to investigate the effects of strain- and process-dependent microstructure evolution on elastic anisotropy in mantle rocks; and (4) disseminate open-source software for the calculation of bulk stiffnesses from electron backscatter diffraction data and for the creation of synthetic (computer-generated) microstructures that can be used in sensitivity analyses, among other applications. Because commonly used methods for calculating bulk rock stiffnesses, such as the Voigt, Reuss and Voigt-Reuss-Hill averages, do not account for elastic interactions among the constituent minerals, they exhibit marked, non-systematic differences from stiffnesses obtained using asymptotic expansion homogenization. These objectives are important because the results would substantially improve understanding of the nature of seismic anisotropy in the Earth's crust, which is composed of rocks dominated by low-symmetry minerals with complex structures. Traditional methods for performing these calculations do not easily incorporate these effects; this project will develop an elegant, easily implemented alternative method for anisotropic materials. The scientific results and computational tools that result from this project will have global application across a number of solid Earth and engineering disciplines. Open-source codes developed in this project will be made available through the existing open-source ELLE platform. Classroom exercises developed for Earth Science and Mechanical Engineering courses that employ this software will be made available to the community, probably through the Science Education Resource Center website at Carleton College.
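
    For reference, the Voigt, Reuss and Voigt-Reuss-Hill averages mentioned above are straightforward to compute. Below is a minimal sketch, assuming per-phase Voigt-notation stiffness matrices already rotated into a common sample frame; the homogenization approach the project proposes goes beyond these interaction-free averages.
```python
import numpy as np

def voigt_reuss_hill(stiffnesses, fractions):
    """Voigt, Reuss and Hill averages for a polymineralic aggregate.

    stiffnesses: array (n, 6, 6) of per-phase Voigt-notation stiffness
    matrices; fractions: volume fractions summing to 1. These averages
    ignore grain-scale elastic interactions, which is the limitation
    the project's homogenization approach addresses.
    """
    C = np.asarray(stiffnesses, dtype=float)
    f = np.asarray(fractions, dtype=float)[:, None, None]
    c_voigt = (f * C).sum(axis=0)  # arithmetic mean of stiffnesses
    c_reuss = np.linalg.inv((f * np.linalg.inv(C)).sum(axis=0))  # mean of compliances
    return c_voigt, c_reuss, 0.5 * (c_voigt + c_reuss)

# Toy two-phase aggregate: 70% of one isotropic phase, 30% of another.
def iso(lam, mu):
    C6 = np.zeros((6, 6))
    C6[:3, :3] = lam
    C6[np.arange(3), np.arange(3)] = lam + 2 * mu
    C6[np.arange(3, 6), np.arange(3, 6)] = mu
    return C6

Cv, Cr, Ch = voigt_reuss_hill([iso(50, 30), iso(20, 10)], [0.7, 0.3])
print(Cv[0, 0], Cr[0, 0], Ch[0, 0])  # Voigt >= Hill >= Reuss
```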

    A key-formula to compute the gravitational potential of inhomogeneous discs in cylindrical coordinates

    We have established the exact expression for the gravitational potential of a homogeneous polar cell - an elementary pattern used in hydrodynamical simulations of gravitating discs. This formula, which is a closed form, works for any opening angle and radial extension of the cell. It is valid at any point in space, i.e. in the plane of the distribution (inside and outside) as well as off-plane, thereby generalizing the results reported by Durand (1953) for the circular disc. The three components of the gravitational acceleration are given. The mathematical demonstration proceeds from the "incomplete version of Durand's formula" for the potential (based on complete elliptic integrals). We first determine the potential due to the circular sector (i.e. a pie-slice sheet), and then deduce that of the polar cell (by convenient radial scaling and subtraction). As a by-product, we obtain an integral theorem stating that "the angular average of the potential of any circular sector along its tangent circle is 2/pi times the value at the corner". A few examples are presented. For numerical resolutions and cell shapes commonly used in disc simulations, we quantify the importance of curvature effects by performing a direct comparison between the potential of the polar cell and that of the Cartesian (i.e. rectangular) cell having the same mass. Edge values are found to deviate roughly like 2E-3 x N/256 in relative terms (N is the number of grid points in the radial direction), while the agreement is typically four orders of magnitude better for values at the cell's center. We also produce a reliable approximation for the potential, valid in the cell's plane, inside and close to the cell. Its remarkable accuracy, about 5E-4 x N/256 in relative terms, is sufficient to estimate the cell's self-acceleration. Comment: Accepted for publication in Celestial Mechanics and Dynamical Astronomy
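
    A closed-form result of this kind can always be sanity-checked by brute force, since the potential of a homogeneous polar cell is a two-dimensional Newtonian integral. Below is a minimal quadrature sketch in units with G = sigma = 1; this is a numerical check of the quantity the paper treats, not the paper's closed-form formula.
```python
import numpy as np
from scipy.integrate import dblquad

G, sigma = 1.0, 1.0  # gravitational constant and surface density

def polar_cell_potential(R, phi, z, r1, r2, p1, p2):
    """Potential of a homogeneous polar cell [r1,r2]x[p1,p2] at the
    field point (R, phi, z), by direct quadrature over the cell."""
    def integrand(rp, pp):
        # squared distance from source element (rp, pp, 0) to field point
        d2 = R * R + rp * rp - 2 * R * rp * np.cos(pp - phi) + z * z
        return rp / np.sqrt(d2)
    val, _ = dblquad(integrand, p1, p2, r1, r2)
    return -G * sigma * val

# Cell spanning r in [1, 1.1], phi in [0, 0.05] rad; field point taken
# slightly off-plane (z = 0.1) to avoid the integrable singularity.
print(polar_cell_potential(1.05, 0.025, 0.1, 1.0, 1.1, 0.0, 0.05))
```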

    A Static Optimality Transformation with Applications to Planar Point Location

    Over the last decade, several data structures have been proposed that, given a planar subdivision and a probability distribution over the plane, provide a way of answering point location queries that is fine-tuned for the distribution. All these methods suffer from the requirement that the query distribution be known in advance. We present a new data structure for point location queries in planar triangulations. Our structure is asymptotically as fast as the optimal structures, but it requires no prior information about the queries. This is a 2D analogue of the jump from Knuth's optimum binary search trees (discovered in 1971) to the splay trees of Sleator and Tarjan in 1985. While the former need to know the query distribution, the latter are statically optimal: they adapt to the query sequence and achieve the same asymptotic performance as an optimum static structure, without needing any additional information. Comment: 13 pages, 1 figure; a preliminary version appeared at SoCG 2011
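
    The 1D baseline being generalized is easy to state in code: given known access frequencies, Knuth-style dynamic programming yields the expected cost of the optimum static BST, which a splay tree matches asymptotically without knowing the frequencies. Below is a minimal sketch, simplified to successful searches only and using the plain O(n^3) recurrence rather than Knuth's O(n^2) refinement.
```python
def optimal_bst_cost(freq):
    """Expected comparisons for an optimal static BST over keys with
    the given access frequencies (successful searches only)."""
    n = len(freq)
    # prefix sums give the total weight of any key range in O(1)
    pre = [0] * (n + 1)
    for i, f in enumerate(freq):
        pre[i + 1] = pre[i] + f
    w = lambda i, j: pre[j + 1] - pre[i]
    # cost[i][j] = minimal expected comparisons to search keys i..j
    cost = [[0.0] * n for _ in range(n)]
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            best = min(
                (cost[i][r - 1] if r > i else 0.0) +
                (cost[r + 1][j] if r < j else 0.0)
                for r in range(i, j + 1)  # try every key as the root
            )
            cost[i][j] = best + w(i, j)  # each key sinks one level deeper
    return cost[0][n - 1]

print(optimal_bst_cost([0.4, 0.1, 0.3, 0.2]))  # expected comparisons: 1.8
```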

    Conditions for the formation of taste-causing organoiodinated compounds during chlorine oxidation of waters containing iodide ions

    This work specified the conditions under which a taste-causing iodinated compound, iodoform, forms during the chlorination of a raw water containing iodides, and proposes a possible reaction pathway. To this end, the natural water studied was spiked with potassium iodide (200 µg/L) to raise its low natural iodide content; free and combined chlorine, chlorinated and brominated trihalomethanes (THMs) and iodoform were analyzed. It was shown that: (i) iodoform is formed at chlorine doses below the breakpoint, in a region where the formation of the classical chlorinated and brominated THMs is usually disfavoured (Figures 1-4); (ii) in the presence of chloramines, the rate of iodoform production increases with increasing I- or I2 (Figure 5); and (iii) the direct reaction of I2 with THM precursors to produce iodoform is slow and independent of the presence of ammonia (Table 1). The oxidation of the ammonium ion produces monochloramine, whose fully available oxidizing power could be involved in the formation of iodamines or chloroiodamines; these reactions are more favourable in the presence of iodine than in the presence of iodide ions. The action of iodine alone, in the presence of ammonia but in the absence of monochloramine, cannot explain the organoiodinated compounds observed: it is the intermediate precursors formed from the chloramines that, by acting on natural organic matter, would be responsible for iodoform formation (Figure 8). To a lesser extent, organic nitrogen compounds such as amines and amino acids could also take part in the production of organoiodinated compounds during chlorination (Figure 7), although, given their content in natural waters, this class of compounds plays only a small part under treatment conditions. From a practical point of view, removing ammonia from the water by a biological process (a nitrification step) would inhibit the iodoform formation potential and allow a final chlorination step; an alternative would be to replace chlorination with oxidation by chlorine dioxide.

    Measuring Accuracy of Automated Parsing and Categorization Tools and Processes in Digital Investigations

    This work presents a method for measuring the accuracy of evidential artifact extraction and categorization tasks in digital forensic investigations. Instead of focusing on the measurement of accuracy and errors in the functions of digital forensic tools, this work proposes the application of information retrieval measurement techniques that allow the incorporation of errors introduced by tools and analysis processes. The method uses a 'gold standard': the collection of evidential objects determined by a digital investigator from suspect data with an unknown ground truth. This work proposes that the accuracy of tools and investigation processes can be evaluated against the derived gold standard using common precision and recall values. Two example case studies are presented showing the measurement of the accuracy of automated analysis tools as compared to an in-depth analysis by an expert. It is shown that such measurement can allow investigators to determine changes in the accuracy of their processes over time, and to determine whether such a change is caused by their tools or by their knowledge. Comment: 17 pages, 2 appendices, 1 figure; 5th International Conference on Digital Forensics and Cyber Crime; Digital Forensics and Cyber Crime, pp. 147-169, 201
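
    The proposed measurement reduces to standard information-retrieval scoring of a tool's output against the investigator-derived gold standard. A minimal sketch, with hypothetical artifact identifiers standing in for real evidential objects:
```python
def precision_recall(tool_artifacts, gold_standard):
    """Score tool-extracted artifacts against the expert gold standard."""
    tool, gold = set(tool_artifacts), set(gold_standard)
    true_pos = len(tool & gold)  # artifacts the tool got right
    precision = true_pos / len(tool) if tool else 0.0
    recall = true_pos / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical example: expert found four evidential objects, the tool
# recovered two of them plus one false positive.
gold = {"chat_log_001", "image_042", "browser_hist_007", "email_113"}
tool = {"chat_log_001", "image_042", "image_099"}
p, r = precision_recall(tool, gold)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.50
```
    Tracking these two values across cases, as the abstract suggests, lets an investigator see whether accuracy drifts over time and whether the tool or the analyst is the likelier cause.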

    Continuous selections of multivalued mappings

    This survey covers, in our opinion, the most important results in the theory of continuous selections of multivalued mappings from (approximately) 2002 through 2012. It extends and continues our previous survey, which appeared in Recent Progress in General Topology, II, published in 2002. In comparison, the present survey considers more restricted and specific areas of mathematics. Note that we do not consider the theory of selectors (i.e. continuous choices of elements from subsets of topological spaces), since this topic is covered by another survey in this volume.

    Authorship Analysis Approaches

    This chapter presents an overview of authorship analysis from multiple standpoints. It includes a historical perspective, a description of stylometric features, and a discussion of authorship analysis techniques and their limitations.

    Feature extraction and selection for Arabic tweets authorship authentication

    In tweet authentication, we are concerned with correctly attributing a tweet to its true author based on its textual content. The more general problem of authenticating long documents has been studied before, and the most common approach relies on the intuitive idea that each author has a unique style that can be captured using stylometric features (SF). Inspired by the success of modern automatic document classification, some researchers have followed the Bag-Of-Words (BOW) approach for authenticating long documents. In this work, we consider both approaches and their application to authenticating tweets, which present additional challenges due to their limited size. We focus on the Arabic language because of its importance and the scarcity of related work on it. We create different sets of features from both approaches and compare the performance of different classifiers using them. We experiment with various feature selection techniques in order to extract the most discriminating features. To the best of our knowledge, this is the first study of its kind to combine these different sets of features for authorship analysis of Arabic tweets. The results show that combining all the feature sets we compute yields the best results.
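
    A pipeline of the kind described, BOW-style features followed by feature selection and classification, can be sketched with scikit-learn. The corpus, labels and parameter choices below are placeholders, not the paper's setup; character n-grams are one reasonable stand-in for the stylometric and BOW feature sets the abstract combines.
```python
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.svm import LinearSVC

tweets = ["...", "..."]             # Arabic tweet texts (placeholder)
authors = ["author_a", "author_b"]  # matching author labels (placeholder)

pipeline = Pipeline([
    # char n-grams capture sub-word style cues, useful for short,
    # morphologically rich Arabic text
    ("features", TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))),
    # keep only the most class-discriminative features (chi-squared test)
    ("select", SelectKBest(chi2, k=5000)),
    ("clf", LinearSVC()),
])
# pipeline.fit(tweets, authors)     # train on labelled tweets
# pipeline.predict(["..."])         # attribute an unseen tweet
```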