
    A Triangular Assessment Method

    A Triangular Assessment Method (research in progress; abbreviated MTC, from the initials of its Spanish name, Método del Triángulo de Calificaciones) is described. The proposed method is an improvement on the well-known Analytic Hierarchy Process (AHP), a pairwise comparison technique developed by Saaty and often applied in complex decision making. The MTC consists of comparing criteria, and the alternative levels of each criterion, in trios instead of in pairs. This overcomes some of the drawbacks of the AHP, such as the large number of pairs that must be analyzed when there are numerous criteria and alternatives.
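    The abstract does not give the MTC's trio formulas, but the AHP baseline it improves on can be sketched. The following is a minimal illustration of deriving an AHP priority vector from a pairwise comparison matrix via the row geometric-mean approximation; the 3-criterion Saaty-scale matrix is hypothetical.

```python
import math

def ahp_priorities(M):
    """Approximate the AHP priority vector of a reciprocal pairwise
    comparison matrix M using the row geometric-mean method."""
    n = len(M)
    # For n criteria, AHP requires n*(n-1)/2 pairwise judgments --
    # the burden the MTC's trio comparisons aim to reduce.
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical Saaty-scale matrix for three criteria:
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights = ahp_priorities(M)  # normalized priorities, summing to 1
```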

    Design of allocation of new technological equipment within the frame of production process in company Getrag Ford Transmissions

    Assessment of the optimal solution, i.e. the layout alternative, is closely tied to material flow. For designing alternatives it is possible to use a combination of an optimization method for workstation layout, the triangular method, and computer simulation in the EXTEND program, whereby the simulation serves to verify the accuracy of the solution.
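    The abstract names the triangular method only in passing; one common reading (the Schmigalla triangular-grid procedure) first orders workstations by material-flow intensity before placing them on a triangular grid. A sketch of that ordering step, under that assumption and with an illustrative flow table:

```python
def placement_order(flows):
    """Order workstations for a triangular-grid layout: start with the
    most intense flow pair, then repeatedly add the unplaced station
    with the largest total flow to already placed stations.
    `flows` maps frozenset({a, b}) -> transport intensity."""
    a, b = max(flows, key=flows.get)
    placed = [a, b]
    stations = {s for pair in flows for s in pair}
    while len(placed) < len(stations):
        best = max(
            (s for s in stations if s not in placed),
            key=lambda s: sum(v for pair, v in flows.items()
                              if s in pair and pair & set(placed)),
        )
        placed.append(best)
    return placed
```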

    Social impact assessment on a hydrocarbon project using triangular whitenization weight functions

    [EN] Social impact assessment (SIA) has become an important factor in the prevention of social conflicts. In this study, we conducted an SIA using the center-point triangular whitenization weight functions (CTWF) method, which is based on grey systems theory. A case study was conducted on a hydrocarbon exploration project located in the Gulf of Valencia, Spain. Two stakeholder groups and four evaluation criteria were identified. The results revealed that for the directly linked population the project would have a very negative social impact, and for the indirectly linked citizens a negative social impact. These results could help central and community governments make the best decision on the project. The method showed interesting results and could be applied to the SIA of other projects or programs.
    Delgado-Villanueva, K. A.; Romero Gil, I. (2016). Social impact assessment on a hydrocarbon project using triangular whitenization weight functions. IEEE. 118-123. https://doi.org/10.1109/CACIDI.2016.7785998
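    The CTWF formulas themselves are not reproduced in the abstract. A minimal sketch of a center-point triangular whitenization weight function is given below, assuming evenly mirrored endpoint classes; the class centers in the test are hypothetical, not the paper's data.

```python
def ctwf(x, lo, center, hi):
    """Center-point triangular whitenization weight function:
    weight of observation x in the grey class centered at `center`,
    with neighbouring class centers `lo` and `hi`."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= center:
        return (x - lo) / (center - lo)
    return (hi - x) / (hi - center)

def classify(x, centers):
    """Assign x to the grey class with the largest weight; endpoint
    classes are extended by mirroring the center spacing outward."""
    ext = [2 * centers[0] - centers[1]] + list(centers) \
        + [2 * centers[-1] - centers[-2]]
    weights = [ctwf(x, ext[k], ext[k + 1], ext[k + 2])
               for k in range(len(centers))]
    return max(range(len(centers)), key=weights.__getitem__), weights
```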

    The Worker Perception Toward Foreman Leadership In Construction Project

    Perception is a subject's view in assessing an object. It can serve as the basis for a person's judgment of that object, yielding an assessment of the available exposure or data. A perception may be good or bad depending on how the subject judges the object. The methods used to assess and analyze perceptions in this research are the triangular method and a field survey conducted through interviews. These methods yield an analysis of how the subjects perceive the object. A foreman's leadership style plays a large role in the continuity and success of a job, and the workers' perception becomes a benchmark of success for a foreman leading the entire labor force.

    Evaluating decision-making units under uncertainty using fuzzy multi-objective nonlinear programming

    This paper proposes a new method to evaluate Decision Making Units (DMUs) under uncertainty using fuzzy Data Envelopment Analysis (DEA). In the proposed multi-objective nonlinear programming methodology, both the objective functions and the constraints are considered fuzzy. The coefficients of the decision variables in the objective functions and in the constraints, as well as the DMUs under assessment, are assumed to be fuzzy numbers with triangular membership functions. A comparison between current fuzzy DEA models and the proposed method is illustrated by a numerical example.
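    The triangular membership functions the abstract refers to can be illustrated with a minimal triangular fuzzy number sketch (membership, α-cut, and addition). The class name and interface are illustrative, not the paper's notation, and l < m < u is assumed.

```python
from dataclasses import dataclass

@dataclass
class TFN:
    """Triangular fuzzy number (l, m, u): membership rises linearly
    from l to a peak of 1 at m, then falls linearly to u."""
    l: float
    m: float
    u: float

    def membership(self, x):
        if self.l < x <= self.m:
            return (x - self.l) / (self.m - self.l)
        if self.m < x < self.u:
            return (self.u - x) / (self.u - self.m)
        return 0.0

    def alpha_cut(self, a):
        """Interval of values with membership >= a, for 0 < a <= 1."""
        return (self.l + a * (self.m - self.l),
                self.u - a * (self.u - self.m))

    def __add__(self, other):
        # Standard triangular fuzzy addition: componentwise.
        return TFN(self.l + other.l, self.m + other.m, self.u + other.u)
```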

    Laplace deconvolution on the basis of time domain data and its application to Dynamic Contrast Enhanced imaging

    In the present paper we consider the problem of Laplace deconvolution with noisy, discrete, non-equally spaced observations on a finite time interval. We propose a new method for Laplace deconvolution which is based on expansions of the convolution kernel, the unknown function, and the observed signal over a Laguerre function basis (which acts as a surrogate eigenfunction basis of the Laplace convolution operator) in a regression setting. The expansion results in a small system of linear equations whose matrix is triangular and Toeplitz. Due to this triangular structure, there is a common number m of terms in the function expansions to control, which is realized via a complexity penalty. The advantage of this methodology is that it leads to very fast computations, produces no boundary effects due to extension at zero and cut-off at T, and provides an estimator with risk within a logarithmic factor of the oracle risk. We emphasize that, in the present paper, we consider the true observational model with possibly non-equispaced observations available on a finite interval of length T, which appears in many different contexts, and account for the bias associated with this model (which is not present when T → ∞). The study is motivated by perfusion imaging using a short injection of contrast agent, a procedure applied for medical assessment of micro-circulation within tissues such as cancerous tumors. The presence of a tuning parameter a allows one to choose the most advantageous time units, so that both the kernel and the unknown right-hand side of the equation are well represented for the deconvolution. The methodology is illustrated by an extensive simulation study and a real data example, which confirm that the proposed technique is fast, efficient, accurate, usable from a practical point of view, and very competitive. Comment: 36 pages, 9 figures. arXiv admin note: substantial text overlap with arXiv:1207.223
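    The triangular-and-Toeplitz structure is what makes the resulting linear system cheap to solve: the whole matrix is determined by its first column, and forward substitution runs in O(m²). A minimal sketch of that solve, independent of the paper's Laguerre-expansion details:

```python
def solve_lower_toeplitz(t, b):
    """Solve T x = b where T is lower-triangular Toeplitz with first
    column t, i.e. T[i][j] = t[i - j] for i >= j and 0 otherwise.
    Forward substitution over the m expansion coefficients."""
    m = len(b)
    x = [0.0] * m
    for i in range(m):
        # Accumulate the already-solved contributions on row i.
        s = sum(t[i - j] * x[j] for j in range(i))
        x[i] = (b[i] - s) / t[0]
    return x
```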

    Evaluation of analytical methodologies to derive vulnerability functions

    The recognition of fragility functions as a fundamental tool in seismic risk assessment has led to the development of increasingly complex and elaborate procedures for their computation. Although vulnerability functions have traditionally been produced using observed damage and loss data, more recent studies propose the employment of analytical methodologies as a way to overcome the frequent lack of post-earthquake data. The effect of the structural modelling approach on the estimation of building capacity has been the target of many studies in the past; however, its influence on the resulting vulnerability model, its impact on loss estimates, and the propagation of the uncertainty to seismic risk calculations have so far received limited scrutiny. Hence, in this paper, an extensive study of static and dynamic procedures for estimating the nonlinear response of buildings has been carried out in order to evaluate the impact of the chosen methodology on the resulting vulnerability and risk outputs. Moreover, the computational effort and numerical stability of each approach were evaluated, and conclusions were drawn regarding which one offers the optimal balance between accuracy and complexity.
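    The abstract does not fix a parametric form, but analytically derived fragility functions are commonly expressed as lognormal CDFs of the intensity measure. A minimal sketch under that common assumption; the median capacity and dispersion values in the example are hypothetical.

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility function: probability of reaching or
    exceeding a damage state given intensity measure `im`, with
    median capacity `theta` and logarithmic dispersion `beta`.
    Uses Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical curve: median capacity 0.3 g, dispersion 0.6.
p_at_median = fragility(0.3, 0.3, 0.6)  # exactly 0.5 at the median
```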