2 research outputs found

    Using stochastic model for lower financial risk management in refinery operation planning

    Historically, most refinery planning models have been deterministic: they use nominal parameter values without accounting for uncertainty in process conditions, demands, refinery parameters, and so on, and as a consequence they cannot perform risk management. A variety of methodologies for financial risk management in engineering decision-making have already been developed; in this paper we follow the approach presented by Barbaro and Bagajewicz (2004), who used a two-stage stochastic programming model, while the other approaches are also analyzed and discussed. The problem addressed here is to determine which crude oil to purchase and to decide on the production levels of the different products, given demand forecasts. Profit is maximized taking into account revenues, crude oil costs, inventory costs, and lost-demand costs. The model was tested using data from the refinery owned by the State Oil Marketing Organization (SOMO), Iraq. The results show that the stochastic model forecasts a higher expected profit and lower risk than the deterministic model.
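    The core of this approach can be illustrated with a small two-stage stochastic linear program. The sketch below (Python, using the PuLP modelling library) is not the paper's actual SOMO formulation; the yield, prices, costs, and demand scenarios are illustrative placeholders. The first-stage decision is the crude purchase, the recourse decisions (sales, inventory, lost demand) are made per demand scenario, and the expected profit is maximized.

    import pulp

    # Illustrative demand scenarios with probabilities (not the SOMO data).
    scenarios = {"low": 0.3, "base": 0.5, "high": 0.2}
    demand    = {"low": 80.0, "base": 100.0, "high": 120.0}   # product demand, kbbl

    YIELD, PRICE = 0.9, 70.0        # product yield per bbl crude, product price ($/bbl)
    CRUDE_COST, INV_COST, LOST_COST = 50.0, 2.0, 15.0

    m = pulp.LpProblem("refinery_two_stage", pulp.LpMaximize)

    # First-stage ("here and now") decision: how much crude to purchase.
    crude = pulp.LpVariable("crude", lowBound=0)

    # Second-stage (recourse) decisions, one set per demand scenario.
    sales = {s: pulp.LpVariable(f"sales_{s}", lowBound=0) for s in scenarios}
    inv   = {s: pulp.LpVariable(f"inv_{s}",   lowBound=0) for s in scenarios}
    lost  = {s: pulp.LpVariable(f"lost_{s}",  lowBound=0) for s in scenarios}

    # Expected profit: revenue minus inventory and lost-demand costs, less crude cost.
    m += pulp.lpSum(scenarios[s] * (PRICE * sales[s]
                                    - INV_COST * inv[s]
                                    - LOST_COST * lost[s])
                    for s in scenarios) - CRUDE_COST * crude

    for s in scenarios:
        m += sales[s] + inv[s] == YIELD * crude    # production is either sold or stored
        m += sales[s] + lost[s] == demand[s]       # unmet demand is counted as lost

    m.solve(pulp.PULP_CBC_CMD(msg=False))
    print("crude to purchase:", crude.value())
    print("expected profit  :", pulp.value(m.objective))

    Because the crude purchase is fixed before the demand is revealed, the solver balances the cost of over-purchasing (inventory) against the penalty for under-purchasing (lost demand) across the scenarios, which is what distinguishes the stochastic model from its deterministic counterpart.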

    A hybrid database encryption algorithm based on reverse string and dynamic key

    Personal computers have become an important part of human life, especially when connected to the Internet or other public networks, and information may still be deliberately or inadvertently leaked by insiders or customers. In this paper, we propose a new algorithm to protect any type of data based on substitution techniques, using a dynamic key (DK) and a reversed string for each character of the text (or field of the database file) and each string. A special equation was developed to generate the DK; it depends on the length of the string, the position of the character in the string, the ASCII code of the character, and the string sequence. The proposed algorithm reverses the string before encryption, and two further actions follow, for the purpose of increasing the complexity of the algorithm. The results show that the developed algorithm successfully removes any duplicate encryption for each character and each record, even when the text is duplicated.
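    A rough sketch of the reverse-string-plus-dynamic-key idea is given below in Python. The abstract does not state the paper's actual DK equation, so the key formula here is a hypothetical stand-in that combines the same four ingredients the abstract names (string length, character position, ASCII code, and string sequence); the point of the demo is that duplicate plaintexts, whether repeated characters or repeated records, encrypt to different values.

    # Hypothetical DK equation combining the four inputs named in the abstract.
    def dynamic_key(length: int, pos: int, code: int, seq: int) -> int:
        return (31 * length + 17 * pos + 2 * code + 7 * seq) % 256

    def encrypt_field(plain: str, seq: int) -> list:
        reversed_text = plain[::-1]                 # reverse the string before substitution
        cipher = []
        for pos, ch in enumerate(reversed_text):
            dk = dynamic_key(len(plain), pos, ord(ch), seq)
            cipher.append((ord(ch) + dk) % 256)     # substitution with the dynamic key
        return cipher

    def decrypt_field(cipher: list, seq: int) -> str:
        chars = []
        for pos, c in enumerate(cipher):
            # The DK depends on the plaintext code, so search the byte range for the
            # (unique, for this particular formula) code that reproduces the value.
            for code in range(256):
                if (code + dynamic_key(len(cipher), pos, code, seq)) % 256 == c:
                    chars.append(chr(code))
                    break
        return "".join(chars)[::-1]                 # undo the reversal

    # Duplicate plaintexts in different records (different seq) encrypt differently,
    # and repeated characters inside one string get different ciphertext values.
    print(encrypt_field("secret", seq=1))
    print(encrypt_field("secret", seq=2))
    print(decrypt_field(encrypt_field("secret", seq=1), seq=1))

    In this stand-in formula the per-character multiplier works out to 3, which is coprime with 256, so each ciphertext byte maps back to exactly one plaintext byte; the two extra post-processing actions mentioned in the abstract are not reproduced here.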