Leaching characteristics of the fine fraction from an excavated landfill: physico-chemical characterization
Leaching tests were performed on the fine fraction (<10 mm) obtained from landfill mining activities at an Estonian landfill. On-site excavation was carried out in four test pits (TP1, TP2, TP3, TP4) that were further divided into four layers (L1, L2, L3, L4). Total chemical oxygen demand (CODt), dissolved chemical oxygen demand (CODd), total organic carbon (TOC), dissolved organic carbon (DOC) and metals (Zn, Cu, Pb and Cd) were analyzed. The results showed that approximately 70 % of CODt was in particulate/colloidal state. The TOC released ranged between 2326 and 3530 mg/kg dry matter across test pits, suggesting spatial differences in the studied landfill. DOC ranged between 365–874 and 317–940 mg/kg for different test pits and sampling layers, respectively. Low average leaching rates of metals were observed (0.2–1.5 %). Pb had a significantly higher average leaching rate (1.0 %) compared to Zn (0.70 %) and Cu (0.35 %). High correlation coefficients suggest that CODt could serve as a surrogate indicator of TOC, DOC and Zn. To conclude, the implementation of adequate strategies to manage fine-grained fractions obtained from excavated waste relies on physico-chemical characterization of both the fine fraction itself and the leachate generated during storage and use. © 2015 Springer Japan
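The leaching rates quoted above are percentages of total metal content released into the leachate. A minimal sketch of that calculation, using hypothetical figures (the study's raw concentrations are not given in the abstract):

```python
# Sketch: a metal's leaching rate as the percentage of its total
# content in the waste that is released into the leachate.
# All numbers below are hypothetical illustrations, not study data.

def leaching_rate(leached_mg_per_kg: float, total_mg_per_kg: float) -> float:
    """Leached amount expressed as a percentage of total content."""
    return 100.0 * leached_mg_per_kg / total_mg_per_kg

# Hypothetical Zn figures for one test pit: 7 mg/kg leached
# out of a total content of 1000 mg/kg dry matter.
zn_rate = leaching_rate(leached_mg_per_kg=7.0, total_mg_per_kg=1000.0)
print(f"Zn leaching rate: {zn_rate:.2f} %")  # 0.70 %
```

Averaging such per-pit rates over all pits and layers would give the kind of average leaching rate (0.2–1.5 %) reported in the abstract.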
On Optimal Binary One-Error-Correcting Codes of Lengths 2^m-4 and 2^m-3
Best and Brouwer [Discrete Math. 17 (1977), 235-245] proved that
triply-shortened and doubly-shortened binary Hamming codes (which have length
2^m-4 and 2^m-3, respectively) are optimal. Properties of such codes are
here studied, determining among other things parameters of certain subcodes. A
utilization of these properties makes a computer-aided classification of the
optimal binary one-error-correcting codes of lengths 12 and 13 possible; there
are 237610 and 117823 such codes, respectively (with 27375 and 17513
inequivalent extensions). This completes the classification of optimal binary
one-error-correcting codes for all lengths up to 15. Some properties of the
classified codes are further investigated. Finally, it is proved that for any
m >= 4, there are optimal binary one-error-correcting codes of length
2^m-4 and 2^m-3 that cannot be lengthened to perfect codes of length
2^m-1.
Comment: Accepted for publication in IEEE Transactions on Information Theory.
Data available at http://www.iki.fi/opottone/code
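A binary code corrects one error exactly when its minimum Hamming distance is at least 3, which is the property the classification above enumerates. A brute-force check of that condition on a small toy code (the codewords below are illustrative, not from the paper's data):

```python
# Sketch: a binary code is one-error-correcting iff every pair of
# codewords differs in at least 3 positions (minimum distance >= 3).
# The toy code below is an illustrative example, not from the paper.
from itertools import combinations

def hamming_distance(u: str, v: str) -> int:
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(u, v))

def is_one_error_correcting(code: list) -> bool:
    """Check minimum distance >= 3 by comparing all codeword pairs."""
    return all(hamming_distance(u, v) >= 3
               for u, v in combinations(code, 2))

toy_code = ["00000", "11100", "00111", "11011"]
print(is_one_error_correcting(toy_code))  # True
```

The classification effort in the paper is far beyond such brute force, of course: it exploits structural properties of shortened Hamming codes to make the search computationally feasible.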
User Defined Types and Nested Tables in Object Relational Databases
Bernadette Byrne, Mary Garvey, "User Defined Types and Nested Tables in Object Relational Databases", paper presented at the United Kingdom Academy for Information Systems 2006: Putting Theory into Practice, Cheltenham, UK, 5-7 June, 2006. There has been much research into incorporating objects into databases, with a number of object databases being developed in the 1980s and 1990s. During the 1990s the concept of object relational databases became popular, with object extensions to the relational model. As a result, several relational databases have added such extensions. There has been little in the way of formal evaluation of object relational extensions to commercial database systems. In this work an airline flight logging system, a real-world database application, was taken and a database was developed first using a regular relational database and again using object relational extensions, allowing an evaluation of the object relational extensions. Peer reviewed
Computer Administering of the Psychological Investigations: Set-relational Representation
Computer administering of a psychological investigation is the computer
representation of the entire procedure of psychological assessments - test
construction, test implementation, results evaluation, storage and maintenance
of the developed database, its statistical processing, analysis and
interpretation. A mathematical description of psychological assessment with the
aid of personality tests is discussed in this article. The set theory and the
relational algebra are used in this description. A relational model of data,
needed to design a computer system for automation of certain psychological
assessments is given. Some finite sets and relations on them, which are
necessary for creating a personality psychological test, are described. The
described model could be used to develop real software for computer
administering of any psychological test, with full automation of the
whole process: test construction, test implementation, result evaluation,
storage of the developed database, statistical processing, analysis and
interpretation. A software project for computer administering of personality
psychological tests is suggested.
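The set-relational idea described above (finite sets of test items and scales, with relations between them) can be sketched directly in code. All names and scores below are hypothetical illustrations, not taken from the article:

```python
# Sketch of the set-relational model: a personality test as finite
# sets (items, scales) plus relations between them, represented here
# as Python sets of tuples. All names and scores are hypothetical.

items = {"Q1", "Q2", "Q3", "Q4"}
scales = {"extraversion", "neuroticism"}

# Relation R is a subset of items x scales: which item contributes
# to which scale of the test.
belongs_to = {("Q1", "extraversion"), ("Q2", "extraversion"),
              ("Q3", "neuroticism"), ("Q4", "neuroticism")}

# A respondent's answers, as a mapping from items to raw scores.
answers = {"Q1": 4, "Q2": 5, "Q3": 2, "Q4": 1}

def scale_score(scale: str) -> int:
    """Result evaluation: sum answers over all items related to a scale."""
    return sum(answers[item] for (item, s) in belongs_to if s == scale)

print(scale_score("extraversion"))  # 9
print(scale_score("neuroticism"))   # 3
```

Storing the same sets and relations as database tables is exactly the relational model of data the article proposes for automating test construction, implementation and evaluation.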
Just below the surface: developing knowledge management systems using the paradigm of the noetic prism
In this paper we examine how the principles embodied in the paradigm of the noetic prism can illuminate the construction of knowledge management systems. We draw on the formalism of the prism to examine three successful tools: frames, spreadsheets and databases, and show how their power and also their shortcomings arise from their domain representation, and how any organisational system based on integration of these tools and conversion between them is inevitably lossy. We suggest how a late-binding, hybrid knowledge based management system (KBMS) could be designed that draws on the lessons learnt from these tools, by maintaining noetica at an atomic level and storing the combinatory processes necessary to create higher level structure as the need arises. We outline the "just-below-the-surface" systems design, and describe its implementation in an enterprise-wide knowledge-based system that has all of the conventional office automation features.