
    Recruitment and selection processes through an effective GDSS

    This study proposes a group decision support system (GDSS) with multiple criteria to assist in the recruitment and selection (R&S) processes of human resources. A two-phase decision-making procedure is first suggested; various techniques involving multiple criteria and group participation are then defined for each step in the procedure. A wide scope of personnel characteristics is evaluated, and the concept of consensus is enhanced. The recommended procedure is expected to be more effective than traditional approaches. In addition, the procedure is implemented on a network-based PC system with web interfaces to support R&S activities. In the final stage, key personnel in the human resources department of a chemical company in southern Taiwan confirmed the feasibility of the approach through an illustrated example.
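    The abstract above does not spell out its aggregation rules, but the basic idea of pooling several evaluators' multi-criteria ratings into a group ranking with a simple consensus indicator can be sketched as follows. The criteria, weights, rating scale and consensus measure are illustrative assumptions, not the procedure proposed in the study.

```python
import numpy as np

# Illustrative sketch of group multi-criteria candidate scoring.
# Evaluators, criteria, weights and the consensus measure are assumed,
# not taken from the paper.

criteria_weights = np.array([0.4, 0.3, 0.2, 0.1])  # e.g. skills, experience, education, fit

# ratings[e, c, k]: evaluator e's score for candidate k on criterion c (1-10 scale)
ratings = np.array([
    [[8, 6, 7], [7, 8, 6], [9, 5, 7], [6, 7, 8]],   # evaluator 1
    [[7, 7, 6], [8, 7, 7], [8, 6, 6], [7, 6, 9]],   # evaluator 2
    [[9, 5, 8], [6, 8, 5], [9, 6, 8], [5, 8, 7]],   # evaluator 3
])

# Each evaluator's weighted score per candidate.
individual_scores = np.einsum('c,eck->ek', criteria_weights, ratings)

# Group score = mean across evaluators; consensus shrinks as evaluators disagree.
group_scores = individual_scores.mean(axis=0)
consensus = 1.0 / (1.0 + individual_scores.std(axis=0))

for k, (score, cons) in enumerate(zip(group_scores, consensus), start=1):
    print(f"Candidate {k}: group score {score:.2f}, consensus {cons:.2f}")
```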

    A hybrid and integrated approach to evaluate and prevent disasters


    An Investigation into Factors Affecting the Chilled Food Industry

    With the advent of Industry 4.0, many new approaches to process monitoring, benchmarking and traceability are becoming available, and these techniques have the potential to radically transform the agri-food sector. In particular, the chilled food supply chain (CFSC) presents a number of unique challenges by virtue of being a temperature-controlled supply chain. Therefore, once the key issues affecting the CFSC have been identified, algorithms can be proposed that allow realistic thresholds to be established for managing these problems on the micro, meso and macro scales. Hence, a study is required into factors affecting the CFSC within the scope of Industry 4.0. The study itself has been broken down into four main topics: identifying the key issues within the CFSC; implementing a philosophy of continuous improvement within the CFSC; identifying uncertainty within the CFSC; and improving and measuring the performance of the supply chain. As a consequence of this study, two further topics were added: a discussion of some of the issues surrounding information sharing between retailers and suppliers, and some of the wider issues affecting food losses and wastage (FLW) on the micro, meso and macro scales. A hybrid algorithm is developed that incorporates the analytic hierarchy process (AHP) for qualitative issues and data envelopment analysis (DEA) for quantitative issues. The hybrid algorithm is a development of the internal auditing algorithm proposed by Sueyoshi et al. (2009), which was itself developed following corporate scandals such as Tyco, Enron and WorldCom that led to a decline in public trust. The advantage of the proposed solution is that all of the key issues identified within the CFSC can be managed from a single computer terminal, while the risk of food contamination, such as the 2013 horsemeat scandal, can be avoided via improved traceability.
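    The thesis above couples AHP (for qualitative issues) with DEA (for quantitative ones). As a rough illustration of the AHP half only, the sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio; the comparison values and the named issues are invented, and this is not the thesis's hybrid AHP-DEA algorithm.

```python
import numpy as np

# Illustrative AHP weight derivation (not the thesis's hybrid AHP-DEA algorithm).
# Pairwise comparison matrix for three hypothetical CFSC issues, e.g.
# temperature control vs. traceability vs. demand uncertainty (values assumed).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of the comparison matrix gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Saaty's consistency ratio (random index RI = 0.58 for a 3x3 matrix).
lam_max = eigvals.real[k]
n = A.shape[0]
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))  # should be < 0.1 to accept the judgements
```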

    Risk Quadruplet: Integrating Assessments of Threat, Vulnerability, Consequence, and Perception for Homeland Security and Homeland Defense

    Risk for homeland security and homeland defense is often considered to be a function of threat, vulnerability, and consequence. But what is that function? And are we defining and measuring these terms consistently? Threat, vulnerability, and consequence assessments are conducted, often separately, and data from one assessment can differ drastically from that of another due to inconsistent definitions of terms and measurements, differing data collection methods, or varying data sources. It has also long been a challenge to integrate these three disparate assessments to establish an overall picture of risk to a given asset. Further, many agencies conduct these assessments, and there is little to no sharing of data, methodologies, or results vertically (between federal, state, and local decision-makers) or horizontally (across the many different sectors), which results in duplication of effort and conflicting risk assessment results. Risk is also a function of our perceptions, and those perceptions can influence our understanding of threat, vulnerability, and consequence. Some assessments rely on perceptions (elicited from subject matter experts) to qualify or quantify threat, vulnerability, and consequence. Others exclude perception altogether, relying on objective data, if available. Rather than fault the subjectivity of our perceptions, or muddle objective assessments with personal opinions, it makes sense to embrace our perceptions but segregate them as a unique component of risk. A risk quadruplet is proposed to systematically collect and integrate assessments of threat, vulnerability, consequence, and perception, such that each dimension can be explored uniquely and all four components can be aggregated into an overall risk assessment in a consistent, transparent, traceable, and reproducible manner. The risk quadruplet draws from the fields of homeland security, homeland defense, systems engineering, and even psychology to develop a model of risk that integrates all four assessments using multicriteria decision analysis. The model has undergone preliminary validation and has proven to be a viable solution for ranking assets based on the four proposed components of risk.
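    The abstract above does not specify how the four components are aggregated. A minimal weighted-sum sketch, with invented weights and asset scores, shows one way threat, vulnerability, consequence and perception could be combined to rank assets; the quadruplet model itself relies on a more formal multicriteria decision analysis.

```python
# Illustrative weighted-sum aggregation of the four risk components
# (threat, vulnerability, consequence, perception). Weights and asset
# scores are invented and are not taken from the dissertation.

WEIGHTS = {"threat": 0.3, "vulnerability": 0.25, "consequence": 0.3, "perception": 0.15}

assets = {
    "Asset A": {"threat": 0.8, "vulnerability": 0.4, "consequence": 0.9, "perception": 0.6},
    "Asset B": {"threat": 0.5, "vulnerability": 0.7, "consequence": 0.6, "perception": 0.8},
    "Asset C": {"threat": 0.3, "vulnerability": 0.9, "consequence": 0.4, "perception": 0.5},
}

def risk_score(components: dict) -> float:
    """Aggregate normalised component scores (0-1) into a single risk value."""
    return sum(WEIGHTS[name] * value for name, value in components.items())

# Rank assets from highest to lowest aggregated risk.
ranked = sorted(assets.items(), key=lambda kv: risk_score(kv[1]), reverse=True)
for name, components in ranked:
    print(f"{name}: risk = {risk_score(components):.2f}")
```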

    Flood risk assessment using multi-sensor remote sensing, geographic information system, 2D hydraulic and machine learning based models

    University of Technology Sydney, Faculty of Engineering and Information Technology.
    Flooding events threaten populations, economies and the environment worldwide. In recent years, several spatial methods have been developed to map flood susceptibility, hazard and risk for predicting and modelling flooding events. This research proposes multiple state-of-the-art approaches to assess, simulate and forecast flooding from recent satellite imagery. Firstly, a model was proposed to monitor changes in surface runoff and forecast future surface runoff on the basis of land use/land cover (LULC) and precipitation factors, because precipitation and LULC dynamics directly affect surface runoff and flooding events. A land transformation model (LTM) was used to detect LULC changes. Moreover, an autoregressive integrated moving average (ARIMA) model was applied to analyse and forecast rainfall trends. The parameters of the ARIMA time series model were calibrated and fitted statistically to minimise prediction uncertainty through the Taguchi method. Then, a GIS-based soil conservation service curve number (SCS-CN) model was developed to simulate the maximum probable surface runoff. Results showed that deforestation and urbanisation occurred over the study period and were predicted to increase. Furthermore, given the negative changes in LULC, surface runoff increased and was forecast to rise gradually up to 2020. According to the implemented model calibration and accuracy assessment, the GIS-based SCS-CN model combined with the LTM and ARIMA models is an efficient and accurate approach to detecting, monitoring and forecasting surface runoff. Secondly, a physical vulnerability assessment of flood was conducted by extracting detailed urban features from WorldView-3 imagery. Panchromatic sharpening, in conjunction with atmospheric and topographic corrections, was initially implemented to increase spatial resolution and reduce atmospheric distortion in the satellite images. A Dempster–Shafer (DS) fusion classifier was proposed as a feature-based image analysis (FBIA) approach to extract complex urban objects. The DS-FBIA was investigated on two sites to examine the transferability of the proposed method. In addition, the DS-FBIA was compared with other common image analysis approaches (pixel- and object-based image analyses) to assess its accuracy and computational time. k-nearest neighbour, Bayes and support vector machine (SVM) classifiers were tested as pixel-based image analysis approaches, while a decision tree classifier was examined as an object-based image analysis approach. The results showed improvements in detailed urban extraction obtained using the proposed FBIA, with 92.2% overall accuracy and high transferability from one site to another. Thirdly, an integrated model was developed for probability analysis of different types of flood using fully distributed GIS-based algorithms. These methods are applicable particularly where annual monsoon rains trigger fluvial flood (FF) and pluvial flash flood (PFF) events simultaneously. A 2D high-resolution sub-grid hydraulic model of the Hydrologic Engineering Center's River Analysis System (HEC-RAS) was used to simulate FF probability and hazard. Moreover, a machine learning random forest (RF) method was used to model PFF probability and hazard. The RF was optimised by a particle swarm optimisation (PSO) algorithm. Both models were verified and calibrated by cross-validation and sensitivity analysis to create a coupled PFF–FF probability mapping. The results showed high accuracy in generating a coupled PFF–FF probability model that can reveal the impact and contribution of each flood type to urban flood hazard. Furthermore, the results provided detailed flood information for urban managers to proactively equip infrastructure such as highways, roads and sewage networks. Fourthly, the risk of a flood can be assessed through different stages of flood probability, hazard and vulnerability. A total of 13 flood conditioning parameters were created to construct a geospatial database for flood probability estimation in two study areas. To estimate flood probability, five approaches, namely logistic regression, frequency ratio (FR), SVM, analytic hierarchy process and combined FR–SVM, were adopted. Then, a flood risk map was generated by integrating flood hazard and vulnerability. The accuracy of the flood probability indices indicated that the combined FR–SVM method achieved the highest accuracy among the approaches tested. The reliability of the results obtained from this research was also verified in the field. The most effective parameters triggering flood occurrence were rainfall and flood inundation depth. In this research, transferability from one study area to another was verified for all the implemented methods. Therefore, the proposed approaches could be effectively and easily replicated in other regions with similar climate conditions, that is, regions having a sufficient inventory of flooding events. Moreover, the results of the proposed approaches provide detailed information that can be used to make sound decisions to reduce and control future flood risks.
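    Among the methods above, the SCS curve-number relation is a standard published formula. The sketch below computes direct runoff depth from rainfall depth and curve number in SI units under the usual initial-abstraction assumption (Ia = 0.2S); the storm depth and curve numbers are arbitrary example values, not the thesis's calibrated GIS-based model.

```python
def scs_cn_runoff(p_mm: float, cn: float) -> float:
    """Direct runoff depth (mm) from rainfall depth and curve number (SCS-CN, SI units)."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction (standard assumption)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical 80 mm storm over two land-cover classes (curve numbers assumed).
for label, cn in [("forest", 60), ("urban", 90)]:
    print(f"{label}: {scs_cn_runoff(80.0, cn):.1f} mm runoff")
```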
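    The exact FR–SVM coupling is not detailed in the abstract. A common pattern is to replace each conditioning factor's raw classes with their frequency ratios and train an SVM on those values to produce a flood probability index; the sketch below follows that pattern on synthetic data, so the factors, class counts and flood inventory are all invented.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic example: two conditioning factors (e.g. elevation class, rainfall class)
# for 200 locations, plus a binary flood inventory. All data are invented.
factors = rng.integers(0, 5, size=(200, 2))   # class indices 0-4 per factor
flooded = (factors.sum(axis=1) + rng.normal(0, 1, 200) > 4).astype(int)

def frequency_ratios(classes, flooded, n_classes=5):
    """FR per class: (% of flooded cells in class) / (% of all cells in class)."""
    fr = np.zeros(n_classes)
    for c in range(n_classes):
        in_class = classes == c
        pct_flood = flooded[in_class].sum() / max(flooded.sum(), 1)
        pct_area = in_class.sum() / len(classes)
        fr[c] = pct_flood / pct_area if pct_area > 0 else 0.0
    return fr

# Replace raw classes with their FR values, then train an SVM on the FR features.
fr_features = np.column_stack([
    frequency_ratios(factors[:, j], flooded)[factors[:, j]] for j in range(factors.shape[1])
])
svm = SVC(kernel="rbf", probability=True).fit(fr_features, flooded)
flood_probability = svm.predict_proba(fr_features)[:, 1]   # flood susceptibility index
print("mean predicted flood probability:", round(float(flood_probability.mean()), 2))
```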