499 research outputs found

    Trials of inmates of the Potulice Labour Camp before the Special Criminal Court in Toruń (1944–1946)

    This article examines the trials that took place in 1944–1945 before the Special Criminal Court in Toruń. It briefly discusses the creation of both the Special Criminal Courts and the Labour Camp in Potulice. The author presents how these trials were conducted and what verdicts were handed down against people who were held in the Labour Camp in Potulice at the time. The work is based on an analysis of approximately 1,200 case files held at the Institute of National Remembrance in Bydgoszcz.

    Assessment of the company’s financial condition using a synthetic measure based on the example of a confectionery company

    Motivation: Measuring and forecasting a company's financial standing raises many methodological problems. Various methods have been developed to assess the current and future financial standing of enterprises, and their usefulness needs to be verified on the example of a specific enterprise. The evaluation of one such method is the subject of this article. Aim: The research objective is to propose a method for assessing the financial situation of a company and to forecast the financial condition of an economic entity using this method. To analyse the condition, indicator analysis and one of the methods of multivariate comparative analysis (MCA) were used: a synthetic measure was constructed, and then a time series analysis method was applied (a model of the internal structure of the examined process was estimated). The results are presented on the example of a company from the food industry, Wawel SA. Results: In this study, the financial condition of the company was described using a synthetic measure, a tool of multivariate comparative analysis (MCA). A model of the internal structure of the examined process was also estimated and used to forecast the financial condition of the selected company. The analysis showed that the presented methods are very useful for assessing and forecasting a company's financial condition.
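The abstract above does not give the exact formula of its synthetic measure, so the following is only a minimal sketch of how such an MCA measure is commonly built, assuming zero-unitarization normalization and an unweighted mean; the indicator names are hypothetical.

```python
# Hypothetical sketch of a synthetic measure (zero-unitarization variant).
# "Stimulants" are indicators where higher is better; "destimulants" are
# inverted so that every normalized indicator points the same way.

def synthetic_measure(indicators, stimulants):
    """indicators: dict name -> list of yearly values;
    stimulants: set of indicator names where higher is better.
    Returns one synthetic value per year in [0, 1]."""
    years = len(next(iter(indicators.values())))
    normalized = {}
    for name, values in indicators.items():
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0  # avoid division by zero for constant series
        if name in stimulants:
            normalized[name] = [(v - lo) / span for v in values]
        else:  # destimulant: lower is better, so invert
            normalized[name] = [(hi - v) / span for v in values]
    # Synthetic measure = unweighted mean of normalized indicators per year.
    return [sum(normalized[n][t] for n in indicators) / len(indicators)
            for t in range(years)]
```

A rising sequence of synthetic values then reads as an improving financial condition, and the resulting one-dimensional series is what a time series model can be fitted to for forecasting.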

    USE OF HPLC, Py-GCMS, FTIR METHODS IN THE STUDIES OF THE COMPOSITION OF SOIL DISSOLVED ORGANIC MATTER

    The study determined the composition of dissolved organic matter (DOM) in Luvisols, Fluvisols and Histosols using spectroscopic (FTIR) and chromatographic (HPLC and Py-GCMS) methods. Linear aliphatic hydrocarbons containing from 4 to 12 carbon atoms were found to constitute the dominant group of compounds in the DOM. The preparations isolated from Histosols and Luvisols showed a higher proportion of hydrophobic fractions with longer retention times, probably containing more compounds with long-chain aliphatic and simple aromatic structures, than the DOM of Fluvisols. The differences in the infrared spectra are evident particularly at wave numbers between 1650–1030 cm-1. The DOM of Histosols is richer in aromatic compounds (band at 1620 cm-1), whereas the DOM of Luvisols and Fluvisols is richer in alkene chains and in hydroxyl (OH) and methoxy (OCH3) groups. The results showed differences in the composition of the DOM across the soils, resulting from their genesis.

    Online Self-Healing Control Loop to Prevent and Mitigate Faults in Scientific Workflows

    Scientific workflows have become mainstream for conducting large-scale scientific research. As a result, many workflow applications and Workflow Management Systems (WMSs) have been developed as part of the cyberinfrastructure to allow scientists to execute their applications seamlessly on a range of distributed platforms. In spite of many success stories, a key challenge for running workflows in distributed systems is failure prediction, detection, and recovery. In this paper, we present a novel online self-healing framework, where failures are predicted before they happen and are mitigated when possible. The proposed approach is to use control theory developed as part of autonomic computing, and in particular the proportional-integral-derivative (PID) controller loop mechanism, which is widely used in industrial control systems, to mitigate faults by adjusting the inputs of the controlled mechanism. The PID controller aims to detect the possibility of a fault far enough in advance that an action can be performed to prevent it from happening. To demonstrate the feasibility of the approach, we tackle two common execution faults of the Big Data era: data footprint and memory usage. We define, implement, and evaluate PID controllers to autonomously manage the data and memory usage of a bioinformatics workflow that consumes/produces over 4.4 TB of data and requires over 24 TB of memory to run all tasks concurrently. Experimental results indicate that workflow executions may significantly benefit from PID controllers, in particular under online and unknown conditions. Simulation results show that nearly-optimal executions (slowdown of 1.01) can be attained when using our proposed control loop, with faults detected and mitigated far in advance.
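The PID control-loop mechanism named above can be sketched as follows; this is a textbook PID form, not the authors' implementation, and the interpretation of the output (how much data to clean up, or how many tasks to throttle) is an assumption for illustration.

```python
# Minimal textbook PID controller: error = measured - setpoint, so a
# positive output means the process variable (e.g. disk usage %) is over
# the safe threshold and a corrective action should be triggered.

class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0       # accumulated error (I term)
        self.prev_error = None    # last error, for the D term

    def update(self, measured, dt=1.0):
        error = measured - self.setpoint
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

The derivative term is what gives the early-warning behaviour described in the abstract: a fast-rising disk or memory curve produces a large output before the threshold is actually crossed, leaving time to act.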

    Ratios of credit and loan debt and of taxation to revenues from sales in small and medium-sized enterprises in Poland

    This article reviews differences and similarities between micro and small firms and medium-sized enterprises in the ratios of credit and loan debt, and of taxation, to revenues from sales. The diversified course of these ratios supports the general conclusion that micro and small firms may react in a different way, and with different intensity, than medium-sized enterprises to changes in the availability and price of credits and loans, as well as to changes in taxation regulations.

    Solar Photocatalytic Degradation of Sulfamethoxazole by TiO2 Modified with Noble Metals

    Application of solar photocatalysis to water treatment is being intensively studied. In this work, we investigated TiO2 modified with platinum (Pt/TiO2) and palladium (Pd/TiO2), using sulfamethoxazole (SMX) as the model contaminant. We considered the following parameters: (i) the level of TiO2 modification with Pt/Pd, (ii) the initial concentration of the photocatalysts, (iii) the geographic location where the processes were conducted, and (iv) the natural water matrix. The catalysts, characterized by SEM, EDX, DRS, and XRD techniques, showed successful deposition of Pd and Pt atoms on the TiO2 surface, which enabled light absorption in the visible (Vis) range and therefore caused efficient SMX removal under all tested conditions. A comparison of the rate constants of SMX degradation under various conditions revealed that modification with Pd gave better results than modification with Pt, which was explained by the better optical properties of Pd/TiO2. The removal of SMX was higher with Pd/TiO2 than with Pt/TiO2, independent of the modification level. In experiments with the same modification level, similar rate constants were achieved when a four times lower concentration of Pd/TiO2 was used compared with Pt/TiO2. The formation of four SMX transformation products was confirmed, indicating that both amine groups are involved in photocatalytic oxidation. No toxic effect of the post-reaction solutions towards Lepidium sativum was observed.
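The abstract compares degradation rate constants but does not state the kinetic model; photocatalytic contaminant removal is commonly fitted to pseudo-first-order kinetics, ln(C0/C) = k·t, which is the assumption behind this illustrative sketch.

```python
import math

# Assumed pseudo-first-order model: ln(C0/C) = k * t.
# k is estimated as the least-squares slope through the origin of
# ln(C0/C) versus time, using (t, C) measurements.

def first_order_rate_constant(times, concentrations):
    """times and concentrations are parallel lists; returns k (1/time)."""
    c0 = concentrations[0]
    y = [math.log(c0 / c) for c in concentrations]   # linearized data
    num = sum(t * yi for t, yi in zip(times, y))
    den = sum(t * t for t in times)
    return num / den
```

Comparing two catalysts then reduces to comparing the fitted k values from runs under matched conditions, which is the kind of comparison the abstract reports between Pd/TiO2 and Pt/TiO2.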

    Using simple PID-inspired controllers for online resilient resource management of distributed scientific workflows

    Scientific workflows have become mainstream for conducting large-scale scientific research. As a result, many workflow applications and Workflow Management Systems (WMSs) have been developed as part of the cyberinfrastructure to allow scientists to execute their applications seamlessly on a range of distributed platforms. Although the scientific community has addressed the challenge of resilient workflow execution from both theoretical and practical angles, failure prediction, detection, and recovery still raise many research questions. In this paper, we propose an approach inspired by the control theory developed as part of autonomic computing to predict failures before they happen and mitigate them when possible. The proposed approach is inspired by the proportional–integral–derivative (PID) controller loop mechanism, which is widely used in industrial control systems, where the controller reacts by adjusting its output to mitigate faults. PID controllers aim to detect the possibility of a non-steady state far enough in advance that an action can be performed to prevent it from happening. To demonstrate the feasibility of the approach, we tackle two common execution faults of large-scale data-intensive workflows: data storage overload and memory overflow. We developed a simulator that implements and evaluates simple standalone PID-inspired controllers to autonomously manage the data and memory usage of a data-intensive bioinformatics workflow that consumes/produces over 4.4 TB of data and requires over 24 TB of memory to run all tasks concurrently. Experimental results obtained via simulation indicate that workflow executions may significantly benefit from the controller-inspired approach, in particular under online and unknown conditions. Simulation results show that nearly-optimal executions (slowdown of 1.01) can be attained when using our proposed method, with faults detected and mitigated far in advance of their occurrence.

    Nitzschia anatoliensis sp. nov., a cryptic diatom species from the highly alkaline Van Lake (Turkey)

    In this article we describe Nitzschia anatoliensis Górecka, Gastineau & Solak sp. nov., an example of a diatom species inhabiting extreme habitats. The new species has been isolated and successfully grown from the highly alkaline Van Lake in eastern Turkey. The description is based on morphology (light and scanning electron microscopy), the sequencing of its organellar genomes, and several molecular phylogenies. This species could easily be overlooked because of its extreme similarity to Nitzschia aurariae, but molecular phylogenies indicate that the two are only distantly related. Furthermore, molecular data suggest that N. anatoliensis may occur in several alkaline lakes of Asia Minor and Siberia, where it was previously misidentified as Nitzschia communis. The data also revealed the very close genetic proximity between N. anatoliensis and the endosymbiont of the dinotom Kryptoperidinium foliaceum, providing additional clues about which diatom species originally entered that symbiosis.

    Asterism: an integrated, complete, and open-source approach for running seismologists' continuous data-intensive analyses on heterogeneous systems

    We present Asterism, an open-source data-intensive framework that combines the Pegasus and dispel4py workflow systems. Asterism aims to reduce the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key idea is to leverage the strengths of each workflow system: dispel4py allows scientific applications to be developed locally and then automatically parallelized and scaled on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the levels of abstraction provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the input/output data at a logical level, the data dependencies between tasks, and the e-Infrastructures and execution engine used to run each dispel4py workflow.
We have instantiated the workflow using data from 1000 stations from the IRIS services, and ran it across two heterogeneous resources described as Docker containers: MPI (Container2) and Storm (Container3) clusters (Figure 1). Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus.
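The two-phase hybrid structure described above (Phase1 feeding Phase2, each bound to a different engine) can be sketched abstractly as follows; this is a toy dependency-driven scheduler for illustration, not the actual Pegasus or dispel4py API, and the task and engine names mirror the abstract's example.

```python
# Toy model of the Phase1 -> Phase2 hybrid workflow: each task names the
# engine it should run on, and the scheduler executes tasks only once all
# of their dependencies have finished.

workflow = {
    "Phase1": {"engine": "mpi",   "deps": []},          # preprocessing
    "Phase2": {"engine": "storm", "deps": ["Phase1"]},  # cross-correlation
}

def run(workflow, execute):
    """execute(task, engine) is a callback that launches one task."""
    done, order = set(), []
    while len(done) < len(workflow):
        ready = [t for t, spec in workflow.items()
                 if t not in done and all(d in done for d in spec["deps"])]
        if not ready:
            raise RuntimeError("cycle or unsatisfiable dependency")
        for task in ready:
            execute(task, workflow[task]["engine"])
            done.add(task)
            order.append(task)
    return order
```

In the real system the callback would hand each phase to its dispel4py mapping (MPI or Storm) and Pegasus would move the data between containers; the sketch only shows the logical-level dependency description the abstract refers to.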

    DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, and portable approach to running data-intensive workflows on distributed platforms. The DIaaS model comprises three containers: a Pegasus node, an MPI cluster, and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for continuous integration (automated image builds) and for image storing and sharing. In this model, all the software (workflow systems and execution engines) required to run scientific applications is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The application is submitted via Pegasus (Container1), and Phase1 and Phase2 are executed in the MPI (Container2) and Storm (Container3) clusters respectively.
Although both phases could be executed within the same environment, this setup demonstrates the flexibility of DIaaS to run applications across e-Infrastructures. In summary, DIaaS delivers specialized software to execute data-intensive applications in a scalable, efficient, and robust manner, reducing both engineering time and computational cost.