
    A MODIFIED MCDM ALGORITHM WITH CUMULATIVE ENTROPY WEIGHTS FOR SELECTING THE WINNER OF THE TENDER

    The aim of this research is to evaluate proposed bids in a multi-criteria decision-making (MCDM) model using impartial, entropy-based weights. We use a hypothetical bidding matrix involving nine criteria and six contractors (four domestic and two foreign). Using the cumulative entropy function, we estimate entropy weights and apply them in the MCDM model. The input variables of the decision model are: experience and knowledge in the field; good history and satisfaction in previous projects; financial and support capabilities; localization of the contractor; experience at the project site; availability and readiness of equipment and machinery; adequacy of technical staff; the work-quality system; efficient management and an appropriate management system; and creativity and innovation in similar tasks. After these are analyzed, the proposals are prioritized through the MCDM model. The research findings include Shannon-entropy and cumulative-entropy weights for the evaluation criteria; after applying each criterion weight to the proposed quotations, the utility rate of each contractor is calculated. The results showed that the modified MCDM method is more advantageous than traditional bid-evaluation methods in selecting the winner of a tender, and that cumulative entropy weights, compared with Shannon entropy weights, lead to a more realistic choice of contractors.
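The Shannon-entropy weighting step that the abstract contrasts against can be sketched as follows. This is a minimal illustration with a made-up 6x9 bid matrix (six contractors scored on nine criteria); the paper's cumulative-entropy variant replaces the entropy function itself and is not reproduced here:

```python
import numpy as np

def shannon_entropy_weights(X):
    """Objective criterion weights from a decision matrix X (alternatives x criteria)."""
    P = X / X.sum(axis=0)               # column-normalize scores to proportions
    k = 1.0 / np.log(X.shape[0])        # scaling constant so entropy lies in [0, 1]
    e = -k * np.sum(P * np.log(P), axis=0)   # Shannon entropy per criterion
    d = 1.0 - e                         # degree of divergence: low entropy = informative
    return d / d.sum()                  # normalized weights

# Hypothetical 6x9 bid matrix: positive scores on a 1-10 scale
rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(6, 9))
w = shannon_entropy_weights(X)
```

Criteria on which the contractors differ strongly receive larger weights; near-uniform criteria carry almost no weight.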

    Asian Port Performance Dimensions and Analyses: a Systematic Literature Review

    Purpose: The aim of the study was to analyze the performance dimensions and performance-analysis techniques applied in studies of Asian port performance.   Theoretical framework: The lack of literature on port performance dimensions in Asia was the theoretical basis of the study. Related studies on Asian port performance were analyzed and listed thematically.   Design/methodology/approach: The PRISMA Statement (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guided this research, drawing on the two most prominent article databases, Web of Science and Scopus.   Findings: This review identified four main categories of port-performance dimensions: operational dimensions, dimensions surrounding customers' perspectives, logistical dimensions, and organizational dimensions. Furthermore, three performance-analysis approaches were identified as preferred by researchers in the Asian region: efficiency, productivity, and competitiveness.   Research, Practical & Social implications: The findings show that most studies on Asian ports focus on efficiency and competitiveness rather than productivity. A productivity survey might give a better overview of port performance, as it concerns the ports' actual output.   Originality/value: The systematic literature review (SLR) approach has rarely been applied to port performance; applying the PRISMA method to synthesize previous studies on Asian port performance is a further niche of this study.

    A Study on the Comprehensive Evaluation of Urbanization in China and Its Environmental Impacts

    In Chapter One, the research background and significance are presented, and previous studies and the current state of the field are reviewed and discussed. In Chapter Two, an in-depth review of prior studies on the research topic was conducted from three aspects: urbanization and eco-environment evaluation and coordination, urban sprawl assessment, and urban heat island investigation. In Chapter Three, the maximum entropy method was applied to generate the evaluation system for eco-environment level and urbanization level at the provincial scale. Comparative and coordination analyses were carried out to assess the development of urbanization and the eco-environment, as well as the balance and health of city development. In Chapter Four, the DMSP/OLS stable nighttime light dataset was used to measure and assess urban dynamics through the extraction of built-up area. Urban sprawl was evaluated by analyzing landscape metrics, which provided a general understanding of urban sprawl and its distribution-pattern characteristics. In Chapter Five, surface urban heat island effects in Beijing were investigated using land surface temperature retrieved from Landsat TM remote sensing data. In addition, the spatial correlations among urbanization level, vegetation coverage, and the surface urban heat island were examined. In Chapter Six, all the work is summarized and the conclusion of the whole thesis is drawn. (The University of Kitakyushu)

    Efficiency analysis of alternative production systems in Kosovo - an ecosystem services approach

    The estimation of efficiency and the interpretation of its behavior are of great interest to primary producers in agriculture as well as to policy makers. In Kosovo, one of the main objectives of the Agriculture and Rural Development Plan 2007-2013 and 2014-2020 is to improve the competitiveness and efficiency of primary agricultural producers and to attain sustainable land use. Despite this, there was a lack of studies on farm-efficiency estimation and on productivity changes in Kosovo's agriculture sector. This thesis therefore focuses on the estimation and analysis of efficiency at the farm level. More specifically, the study aimed to estimate the technical, economic, and environmental efficiency of farms oriented toward tomato, grape and apple production, and to identify the factors that best explain the variation in efficiency scores among farms. The study was based entirely on primary data, collected in three stages. In the first stage, a survey using a structured questionnaire was conducted with 120 farms, distributed equally across the three selected production systems. Farm efficiency scores were obtained using Data Envelopment Analysis (DEA), a linear-programming optimization technique that measures the relative efficiency of a set of comparable units. In general, the efficiency scores for the three production systems were high, showing that there was little room for improvement. On average, tomato farms tended to be most technically efficient, followed by scale, revenue, and cost-allocative efficiency. Farms oriented toward grape production were highly scale efficient, followed by technical, revenue and cost-allocative efficiency. Apple farms on average performed relatively well in terms of technical efficiency, which was the highest on average, followed by revenue efficiency and scale efficiency.
Factors proved statistically important in explaining the variation in efficiency scores among farms were household size, farm size, number of cultivated crops, number of land plots, and the farmer's education and experience in farming. Comparing rank positions between the technical and environmental efficiency estimations, three groups of farms emerged: farms whose ranking rose under environmental efficiency relative to technical efficiency, farms with no difference in ranking, and farms whose ranking fell under environmental efficiency. Farms that rose in ranking were mostly those that improved or maintained good soil quality on farmland and a good level of agro-biodiversity provision. The second group showed no difference in ranking because they were fully efficient in both the technical and environmental estimations. The third group, whose ranking fell, comprised farms performing weakly in both technical and environmental efficiency; these farms also had lower soil quality and lower agro-biodiversity than the averages of the total sample.
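The Data Envelopment Analysis described above amounts to solving one linear program per farm. The following is an illustrative input-oriented CCR sketch, not the thesis's actual model specification: each unit's score is the smallest factor by which its inputs could be scaled while a convex-cone combination of peers still matches its outputs.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.
    X: (n, m) array of inputs, Y: (n, s) array of outputs, one row per unit (DMU)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                      # minimize theta
        # Input constraints: sum_j lambda_j * X[j,i] - theta * X[o,i] <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Output constraints: -sum_j lambda_j * Y[j,r] <= -Y[o,r]
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)                           # optimal theta in (0, 1]
    return np.array(scores)
```

A score of 1 means the unit lies on the efficient frontier; a score of 0.5 means the same outputs could, in principle, be produced with half the inputs.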

    Analysis of Customer's Expectations and Satisfaction in the Zanjan Municipality Using Fuzzy Multi-Criteria Decision Making (FMCDM) Approach

    Identifying customer expectations is the most important step in achieving customer satisfaction, and it is impossible without consulting customers and obtaining their views. To identify customer expectations, service suppliers therefore survey their customers using statistical techniques. According to the literature, there is no appropriate expectation-model framework for prioritizing an organization's regions and making a favorable selection according to the organization's policies and strategies. In this research, a combination of fuzzy multi-criteria decision-making methods is used for the optimal selection. The research method is descriptive and applied, and a field method is used to collect data. To identify customer expectations, data were collected from 303 customers of the Zanjan municipality through random sampling. To rank the dimensions of customer expectations and make optimal selections among municipality zones, data were collected from 30 engineering contractors of the municipality. The data-collection instruments were a validated questionnaire and interviews. Expert Choice, web-based TOPSIS, SPSS and Excel software were used for the calculations. Interestingly, the choice of the best municipality zone depends solely on the criterion with the maximum priority value. Based on the calculations in the stages of the proposed model, Municipality Zone 2 was selected as the optimal region and had the highest rating in responding to customer expectations. The results show that the proposed model fits systematically with the defined procedures and known inputs.
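The ranking stage of such a model relies on TOPSIS. A crisp (non-fuzzy) TOPSIS sketch illustrates the mechanics; the fuzzy variant used in studies like this one extends the same steps with fuzzy numbers for ratings and weights:

```python
import numpy as np

def topsis(X, w, benefit):
    """Closeness coefficients for TOPSIS ranking.
    X: (n_alt, n_crit) decision matrix, w: criterion weights,
    benefit: boolean mask, True where larger values are better."""
    R = X / np.linalg.norm(X, axis=0)        # vector-normalize each criterion column
    V = R * w                                # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))   # positive ideal solution
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))   # negative ideal solution
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)           # closeness in [0, 1], higher = better
```

The alternative with the largest closeness coefficient (here, a municipality zone) is ranked first.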

    An Investigation into Factors Affecting the Chilled Food Industry

    With the advent of Industry 4.0, many new approaches to process monitoring, benchmarking and traceability are becoming available, and these techniques have the potential to radically transform the agri-food sector. In particular, the chilled food supply chain (CFSC) presents a number of unique challenges by virtue of being a temperature-controlled supply chain. Once the key issues affecting the CFSC have been identified, algorithms can be proposed that allow realistic thresholds to be established for managing these problems on the micro, meso and macro scales. Hence, a study is required into factors affecting the CFSC within the scope of Industry 4.0. The study is broken down into four main topics: identifying the key issues within the CFSC; implementing a philosophy of continuous improvement within the CFSC; identifying uncertainty within the CFSC; and improving and measuring the performance of the supply chain. As a consequence of this study, two further topics were added: a discussion of issues surrounding information sharing between retailers and suppliers, and the wider issues affecting food losses and wastage (FLW) on the micro, meso and macro scales. A hybrid algorithm is developed that incorporates the analytic hierarchy process (AHP) for qualitative issues and data envelopment analysis (DEA) for quantitative issues. The hybrid algorithm is a development of the internal auditing algorithm proposed by Sueyoshi et al. (2009), which in turn was developed following corporate scandals such as Tyco, Enron, and WorldCom, which led to a decline in public trust. The advantage of the proposed solution is that all of the key issues identified within the CFSC can be managed from a single computer terminal, while the risk of food contamination, such as the 2013 horsemeat scandal, can be avoided via improved traceability.
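The AHP half of such a hybrid reduces qualitative judgments to a priority vector derived from a pairwise comparison matrix. A minimal sketch using the geometric-mean approximation, with Saaty's standard consistency check (illustrative only; the thesis's hybrid algorithm is not reproduced here):

```python
import numpy as np

def ahp_priorities(A):
    """Priority vector from a reciprocal pairwise comparison matrix (geometric-mean method)."""
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
    return g / g.sum()

def consistency_ratio(A):
    """CR = CI / RI; values below 0.1 are conventionally considered acceptable."""
    n = A.shape[0]
    w = ahp_priorities(A)
    lam = (A @ w / w).mean()                       # approximate principal eigenvalue
    ci = (lam - n) / (n - 1)                       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random indices (subset)
    return ci / ri
```

A perfectly consistent matrix (every judgment the product of the others) yields CR = 0; inconsistent expert judgments push CR upward and signal that the comparisons should be revisited.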

    Essays on efficiency and international tourism

    ABSTRACT: This doctoral thesis consists of three essays on the economics of tourism. The first and third essays analyze the tourism efficiency of the Spanish regions and the factors that may be determining its evolution. The second essay presents an in-depth bibliographical review of the literature that analyzes tourism efficiency worldwide. The period of analysis ranges from the last half of the twentieth century (from 1978, in the case of the study of the tourism bibliography, the year in which Charnes, Cooper and Rhodes introduced Data Envelopment Analysis) to the most recent period of our economy (2018 in the third essay). In the two essays that analyze tourism efficiency in Spain, the period ranges from the beginning of the 21st century (2008 in the case of the efficiency of Spanish tourist destinations) to the most current data available.

    Software defect prediction using maximal information coefficient and fast correlation-based filter feature selection

    Software quality assurance aims to ensure that the applications developed are failure free, yet modern systems are intricate due to the complexity of their information processes. Software fault prediction is an important quality-assurance activity: a mechanism that correctly predicts the defect proneness of modules and classifies them saves resources, time and developers' effort. In this study, a model that selects relevant features for defect prediction was proposed. The literature review revealed that process metrics are better predictors of defects in versioned systems, being based on historical source code over time. These metrics are extracted from the source-code module and include, for example, the number of additions and deletions, the number of distinct committers and the number of modified lines. In this research, defect prediction was conducted on open source software (OSS) software product lines (SPL), hence process metrics were chosen. Data sets used in defect prediction may contain non-significant and redundant attributes that affect the accuracy of machine-learning algorithms. To improve the prediction accuracy of classification models, only features that are significant to the defect-prediction process should be used. In machine learning, feature selection techniques are applied to identify the relevant data: feature selection is a pre-processing step that reduces the dimensionality of the data, and its techniques include information-theoretic methods based on the entropy concept. This study experimentally evaluated the efficiency of these feature selection techniques, and it was found that software defect prediction using significant attributes improves prediction accuracy.
A novel MICFastCR model was developed, which uses the Maximal Information Coefficient (MIC) to select significant attributes and the Fast Correlation-Based Filter (FCBF) to eliminate redundant attributes. Machine-learning algorithms were then run to predict software defects. MICFastCR achieved the highest prediction accuracy as reported by various performance measures. (School of Computing, Ph. D. (Computer Science))
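MIC itself requires a dedicated estimator, but the FCBF stage of such a pipeline ranks features by symmetrical uncertainty, which can be sketched directly from Shannon entropies. This is an illustrative sketch of that measure, not the MICFastCR implementation:

```python
import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(x)
    return -sum((c / n) * np.log2(c / n) for c in Counter(x).values())

def symmetrical_uncertainty(x, y):
    """SU(X, Y) = 2 * IG(X; Y) / (H(X) + H(Y)), normalized to [0, 1].
    FCBF keeps features with high SU to the class label and drops a feature
    when some already-kept feature has higher SU with it than with the label."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))           # joint entropy of the pair
    ig = hx + hy - hxy                       # information gain (mutual information)
    return 2.0 * ig / (hx + hy) if hx + hy else 0.0
```

SU is 1 when one variable determines the other and 0 when they are independent, which is what makes it usable both for relevance (feature vs. label) and redundancy (feature vs. feature) tests.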