Efficient coding in dolphin surface behavioral patterns
We show that the law of brevity, i.e. the tendency of words to shorten as their frequency increases, is also found in dolphin surface behavioral patterns. As far as we know, this is the first evidence of the law in another species, suggesting that coding efficiency is not unique to humans.
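The law of brevity is a directly checkable statistical claim: across types, frequency and length should be negatively correlated. A minimal sketch of such a check (illustrative only; the behavioural-pattern data and the authors' actual statistics are not reproduced here):

```python
from collections import Counter

def brevity_correlation(tokens):
    """Pearson correlation between a type's frequency and its length.
    The law of brevity predicts a negative value."""
    counts = Counter(tokens)
    freqs = [counts[w] for w in counts]
    lens = [len(w) for w in counts]
    n = len(freqs)
    mf, ml = sum(freqs) / n, sum(lens) / n
    cov = sum((f - mf) * (l - ml) for f, l in zip(freqs, lens))
    sf = sum((f - mf) ** 2 for f in freqs) ** 0.5
    sl = sum((l - ml) ** 2 for l in lens) ** 0.5
    return cov / (sf * sl)
```

On a toy corpus where short types are frequent and long types are rare (e.g. five `"a"`, three `"the"`, one `"whale"`), the correlation is exactly -1.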
Optimal prefix codes for pairs of geometrically-distributed random variables
Optimal prefix codes are studied for pairs of independent, integer-valued
symbols emitted by a source with a geometric probability distribution of
parameter $q$, $0<q<1$. By encoding pairs of symbols, it is possible to
reduce the redundancy penalty of symbol-by-symbol encoding, while preserving
the simplicity of the encoding and decoding procedures typical of Golomb codes
and their variants. It is shown that optimal codes for these so-called
two-dimensional geometric distributions are \emph{singular}, in the sense that
a prefix code that is optimal for one value of the parameter cannot be
optimal for any other value. This is in sharp contrast to the
one-dimensional case, where codes are optimal for positive-length intervals of
the parameter. Thus, in the two-dimensional case, it is infeasible to give
a compact characterization of optimal codes for all values of the parameter,
as was done in the one-dimensional case. Instead, optimal codes are
characterized for a discrete sequence of parameter values that provides good
coverage of the unit interval. The described codes produce the expected
reduction in redundancy with respect to the one-dimensional case, while
maintaining low-complexity coding operations.

Comment: To appear in IEEE Transactions on Information Theory
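The one-dimensional baseline mentioned above is the Golomb code: quotient in unary, remainder in truncated binary. A minimal sketch of the standard construction with parameter m (not the paper's two-dimensional codes):

```python
def golomb_encode(n: int, m: int) -> str:
    """Golomb codeword for the nonnegative integer n with parameter m:
    unary quotient, then truncated-binary remainder."""
    q, r = divmod(n, m)
    out = "1" * q + "0"          # unary part: q ones, terminating zero
    if m == 1:
        return out               # remainder carries no information
    b = m.bit_length()           # remainder codewords use b-1 or b bits
    k = (1 << b) - m             # number of short (b-1 bit) remainders
    if r < k:
        out += format(r, f"0{b - 1}b")
    else:
        out += format(r + k, f"0{b}b")
    return out

print(golomb_encode(5, 3))   # quotient "10", remainder "11" -> "1011"
```

For a geometric source with parameter q, Gallager and van Voorhis showed that choosing m = ceil(log(1+q)/log(1/q)) makes this symbol-by-symbol code optimal, which is the one-dimensional result the abstract contrasts against.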
Maps of random walks on complex networks reveal community structure
To comprehend the multipartite organization of large-scale biological and
social systems, we introduce a new information theoretic approach that reveals
community structure in weighted and directed networks. The method decomposes a
network into modules by optimally compressing a description of information
flows on the network. The result is a map that both simplifies and highlights
the regularities in the structure and their relationships. We illustrate the
method by making a map of scientific communication as captured in the citation
patterns of more than 6000 journals. We discover a multicentric organization
with fields that vary dramatically in size and degree of integration into the
network of science. Along the backbone of the network -- including physics,
chemistry, molecular biology, and medicine -- information flows
bidirectionally, but the map reveals a directional pattern of citation from the
applied fields to the basic sciences.

Comment: 7 pages and 4 figures plus supporting material. For associated source code, see http://www.tp.umu.se/~rosvall
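The compression objective can be illustrated at its simplest level: with a single codebook for the whole network, the per-step description length of the walk is the entropy of the node visit rates, and a modular partition pays off when per-module codebooks beat this baseline. A minimal sketch of the one-level baseline (illustrative; not the authors' full map equation or its implementation):

```python
from math import log2

def one_level_codelength(visit_rates):
    """Shannon entropy of stationary visit rates: bits per step needed
    to describe a random walk with one global codebook (no modules)."""
    return -sum(p * log2(p) for p in visit_rates if p > 0)

# Uniform visit rates over 4 nodes cost 2 bits per step.
print(one_level_codelength([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

A good community partition assigns short codewords within modules and pays only occasionally for switching modules, driving the total description length below this one-level figure.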
JPEG: the quadruple object
The thesis, together with its practice-research works, presents an object-oriented
perspective on the JPEG standard. Using the object-oriented
philosophy of Graham Harman as a theoretical and also practical starting
point, the thesis looks to provide an account of the JPEG digital object and
its enfolding within the governmental scopic regime. The thesis looks to
move beyond accounts of digital objects and protocols within software
studies that position the object in terms of issues of relationality,
processuality and potentiality. From an object-oriented point of view, the
digital object must be seen as exceeding its relations, as actual, present and
holding nothing in reserve. The thesis presents an account of JPEG starting
from that position as well as an object-oriented account of JPEG’s position
within the distributed, governmental scopic regime via an analysis of
Facebook’s Timeline, tagging and Haystack systems.
As part of a practice-research project, the author looked to use that
perspective within photographic and broader imaging practices as a spur to
new work and also as a “laboratory” to explore Harman’s framework. The
thesis presents the findings of those “experiments” in the form of a report
alongside practice-research eBooks. These works were not designed to be
illustrations of the theory, nor works to be “analysed”. Rather, following the
lead of Ian Bogost and Mark Amerika, they were designed to be
“philosophical works” in the sense of works that “did” philosophy.
Professional Learning and Distributed Leadership: A Symbiotic Relationship
Pedagogical improvement in early childhood education (ECE) is critically impacted by leadership and professional learning. Despite this importance, government funding for ECE professional learning has been significantly reduced over the past decade. Meanwhile, a growing body of research suggests that teacher professional learning is most effective when contextualised and sustained over time. In ECE, positional leaders have responsibility for ensuring ongoing teacher professional learning and the development of the programme while developing a culture of distributed leadership. This interpretive mixed-methods study examined the practices and perceptions of ECE teachers and leaders about leadership and professional learning. Surveys and interviews were designed to reveal the relationship between distributed leadership and professional learning in ECE settings and sought to discover practices of effective positional leaders in facilitating both. From the results of this study, it emerged that distributed leadership and professional learning are symbiotic and that ECE positional leaders need to develop certain leadership practices within their services in order to successfully foster both.
Evaluating LEADER: canonical, endogenous and systemic learning
In this paper, we touch on a key theme in rural governance – the reconciliation of centralised procedures and the embedded institutions of rural society – through the lens of the evaluation procedures embedded in the European LEADER programme. LEADER is in many ways a highly
devolved European initiative, true to its origins as a progressive rural laboratory in terms of innovation, stakeholder engagement, social learning and systemic methodology for addressing rural needs. The design and operationalisation of national and local LEADER programmes and projects is tailored to local circumstances, and aims to direct LEADER funding to local needs while building the institutional and social capital that underpins successful rural development. Yet while the delivery of LEADER embraces heterogeneity, programmatic evaluation is centralised and learning at the national and local level is subservient to the need to defend the LEADER approach in
Brussels. This requires evaluation to be held at arm's length from delivery organisations, even though there is evidence that where local evaluative capacity is robust, centralised evaluation is enhanced.
This paper reviews progress to date on improving the canonical forms of evaluation employed in LEADER, based on cumulative feedback from previous iterations of the programme. We then consider alternative evaluation traditions that engage with endogenous capacity for sense-making, and the extent to which they might be taken up within LEADER. We conclude by proposing that more attention needs to be paid to the institutionalisation of systemic evaluation within LEADER, which could engage with a much wider range of perspectives in rural development, across different scales of governance and national and regional contexts. This would require the reconciliation of canonical and endogenous forms of evaluation, but would align LEADER evaluation with the values and methods embodied in the rest of the programme.
Population-based JPEG Image Compression: Problem Re-Formulation
The JPEG standard is widely used in different image processing applications.
One of the main components of the JPEG standard is the quantisation table (QT)
since it plays a vital role in the image properties such as image quality and
file size. In recent years, several efforts based on population-based
metaheuristic (PBMH) algorithms have been performed to find the proper QT(s)
for a specific image, although they do not take the user's opinion into
account. Consider, for example, an Android developer who prefers a small image
file, while the optimisation process yields a high-quality image with a huge
file size. Another pitfall of the current works is a lack of
comprehensive coverage, meaning that the QT(s) cannot provide all possible
combinations of file size and quality. Therefore, this paper makes
three distinct contributions. First, to include the user's opinion in the
compression process, the file size of the output image can be controlled by a
user in advance. Second, to tackle the lack of comprehensive coverage, we
suggest a novel representation. Our proposed representation can not only
provide more comprehensive coverage but also find the proper value for the
quality factor for a specific image without any background knowledge. Both
changes in representation and objective function are independent of the search
strategies and can be used with any type of population-based metaheuristic
(PBMH) algorithm. Therefore, as the third contribution, we also provide a
comprehensive benchmark on 22 state-of-the-art and recently-introduced PBMH
algorithms on our new formulation of JPEG image compression. Our extensive
experiments on different benchmark images and in terms of different criteria
show that our novel formulation for JPEG image compression can work
effectively.

Comment: 39 pages; this paper has been submitted to the related journal
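The central role of the QT in the size-quality tradeoff can be seen in how baseline JPEG encoders derive a table from a quality factor. A hedged sketch using the example luminance table from the JPEG specification and the common IJG-style scaling (illustrative only; this is not the paper's PBMH search or its novel representation):

```python
# Example JPEG luminance quantisation table (ISO/IEC 10918-1, Annex K),
# in row-major order.
BASE_QT = [
    16, 11, 10, 16,  24,  40,  51,  61,
    12, 12, 14, 19,  26,  58,  60,  55,
    14, 13, 16, 24,  40,  57,  69,  56,
    14, 17, 22, 29,  51,  87,  80,  62,
    18, 22, 37, 56,  68, 109, 103,  77,
    24, 35, 55, 64,  81, 104, 113,  92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103,  99,
]

def scaled_qt(quality: int) -> list:
    """Scale the base table for a quality factor in [1, 100] using the
    IJG-style mapping: larger entries mean coarser quantisation, hence
    smaller files and lower fidelity."""
    quality = max(1, min(100, quality))
    s = 5000 // quality if quality < 50 else 200 - 2 * quality
    return [max(1, min(255, (q * s + 50) // 100)) for q in BASE_QT]

print(scaled_qt(50) == BASE_QT)  # → True (quality 50 is the base table)
```

At quality 100 every entry collapses to 1 (near-lossless, large files); at low quality the entries grow several-fold, zeroing most DCT coefficients. Population-based approaches search this table space directly instead of relying on one fixed scaling curve.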
On the Algorithmic Nature of the World
We propose a test based on the theory of algorithmic complexity and an
experimental evaluation of Levin's universal distribution to identify evidence
in support of or in contravention of the claim that the world is algorithmic in
nature. To this end we have undertaken a statistical comparison of the
frequency distributions of data from physical sources on the one
hand--repositories of information such as images, data stored in a hard drive,
computer programs and DNA sequences--and the frequency distributions generated
by purely algorithmic means on the other--by running abstract computing devices
such as Turing machines, cellular automata and Post Tag systems. Statistical
correlations were found and their significance measured.

Comment: Book chapter in Gordana Dodig-Crnkovic and Mark Burgin (eds.)
Information and Computation by World Scientific, 2010.
(http://www.idt.mdh.se/ECAP-2005/INFOCOMPBOOK/). Paper website:
http://www.mathrix.org/experimentalAIT
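The statistical comparison above rests on empirical frequency distributions of short strings drawn from each source. A minimal sketch of that building block (the sliding-window choice is an assumption for illustration; the authors' machinery of Turing machines, cellular automata and Levin's universal distribution is not reproduced here):

```python
from collections import Counter

def block_distribution(bits: str, k: int) -> dict:
    """Relative frequencies of length-k blocks in a bit string, via a
    sliding window. Distributions computed from physical data (files,
    DNA, images) can then be ranked and compared against those produced
    by abstract computing devices."""
    blocks = [bits[i:i + k] for i in range(len(bits) - k + 1)]
    n = len(blocks)
    return {b: c / n for b, c in Counter(blocks).items()}

print(block_distribution("0101", 2))  # → {'01': 2/3, '10': 1/3}
```

Comparing the rank orderings of two such distributions (e.g. with a rank-correlation statistic) is one simple way to quantify how closely an empirical source tracks an algorithmically generated one.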