Search for time modulations in the decay rate of 40K and 232Th
Time modulations at the per-mil level have been reported in the decay constants of about 15 nuclei, with periods of one year (in most cases) but also of about one month or one day. In this paper we report the results of activity measurements of a 40K source and a 232Th source. The two experiments were carried out at the Gran Sasso Laboratory over a period of about 500 days, above ground (40K) and underground (232Th), with a target sensitivity of a few parts in 10^5. We also report the activity measured at the time of the X-class solar flares of May 2013. In brief, our measurements show no evidence of unexpected time dependence in the decay rates of 40K and 232Th.
Comment: version accepted for publication (Astroparticle Physics).
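As a purely illustrative sketch of what such a search involves (hypothetical data and names, not the analysis actually performed in the paper), one could fit a decay-corrected count rate with a constant level plus an annual sinusoidal modulation and read off the fitted amplitude:

```python
# Purely illustrative sketch (hypothetical data; not the paper's analysis):
# fit a decay-corrected rate R(t) with a constant times an annual sinusoid.
import numpy as np
from scipy.optimize import curve_fit

def model(t_days, r0, amp, phase_days):
    """Normalized rate: constant level times (1 + annual modulation)."""
    return r0 * (1.0 + amp * np.cos(2.0 * np.pi * (t_days - phase_days) / 365.25))

rng = np.random.default_rng(0)
t = np.arange(500.0)                                  # ~500 days of daily points
truth = model(t, 1.0, 5e-5, 100.0)                    # injected 5e-5 modulation
rate = truth + 2e-5 * rng.standard_normal(t.size)     # statistical scatter

popt, pcov = curve_fit(model, t, rate, p0=[1.0, 1e-4, 0.0])
amp, amp_err = popt[1], np.sqrt(pcov[1, 1])
print(f"fitted amplitude: {amp:.2e} +/- {amp_err:.2e}")
```

A sensitivity of a few parts in 10^5 then corresponds to requiring that the fitted amplitude be consistent with zero within an uncertainty of that order.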
Virtual Museums for Landscape Valorization and Communication
Research on virtual landscape reconstruction has mainly focused on digitization and recording within GIS systems, or on real-time visualization, paying little attention to the development of a methodological approach to landscape narration, one that combines different registers and conceptual and emotional stimuli and is thus able to arouse in the public a feeling of emotional “sensing” and self-identification. The landscape also reflects human activities in the territory and the communities’ cultural patterns, their sense of “belonging”. In a virtual museum of landscapes, a multidisciplinary approach, the multiplication of perspectives and voices, and storytelling acquire primary importance. A virtual museum of landscapes should integrate both holistic and delimited visions. The holistic vision requires a diachronic approach, including both present and past phases of life. Delimited, or “monographic”, representations, on the other hand, are useful for going deeper into specific, exemplary stories concerning particular groups of people.
Besides, the emergence of new social media enhancing cultural interaction among people encourages the creation of dedicated social platforms for Cultural Heritage, open to the active participation of a large number of stakeholders. Co-creation scenarios and tools can be particularly promising. Aton is an example of a web-based front-end VR social platform for the efficient streaming of medium/large landscapes and for their exploration and characterization.
The Tiber Valley Virtual Museum is an example of a sensorial cultural landscape: starting from the acquisition of topographic data through integrated technologies, several multi-sensory scenarios have been created, inside which visitors can feel embodied and involved.
Einstein and Jordan frames reconciled: a frame-invariant approach to scalar-tensor cosmology
Scalar-Tensor theories of gravity can be formulated in different frames, most notably the Einstein and the Jordan frame. While some debate still persists in the literature on the physical status of the different frames, a frame transformation in Scalar-Tensor theories amounts to a local redefinition of the metric, and therefore should not affect physical results. We analyze the issue in a
cosmological context. In particular, we define all the relevant observables
(redshift, distances, cross-sections, ...) in terms of frame-independent
quantities. Then, we give a frame-independent formulation of the Boltzmann
equation, and outline its use in relevant examples such as particle freeze-out
and the evolution of the CMB photon distribution function. Finally, we derive
the gravitational equations for the frame-independent quantities at first order
in perturbation theory. From a practical point of view, the present approach
allows the simultaneous implementation of the good aspects of the two frames in
a clear and straightforward way.
Comment: 15 pages; matches the version to be published in Phys. Rev.
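As a minimal illustration of the kind of frame-invariant quantity involved (written in common scalar-tensor conventions, which may differ from those of the paper):

```latex
% Minimal sketch in standard conventions (possibly differing from the paper's):
% the two frames are related by a local Weyl rescaling of the metric,
\tilde g_{\mu\nu} = \Omega^2(\varphi)\, g_{\mu\nu} ,
% under which an FRW scale factor and a particle mass transform as
\tilde a = \Omega\, a , \qquad \tilde m = \Omega^{-1}\, m ,
% so that the product a\,m is frame invariant. A redshift measured against an
% atomic standard can then be written as
1 + z = \frac{(a\, m)_{\rm obs}}{(a\, m)_{\rm em}} ,
% which takes the same value whether it is evaluated in the Jordan or in the
% Einstein frame.
```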
HexBox: Interactive Box Modeling of Hexahedral Meshes
We introduce HexBox, an intuitive modeling method and interactive tool for creating and editing hexahedral meshes. HexBox brings the major and widely validated surface modeling paradigm of box modeling into the world of hex meshing. The main idea is to allow the user to box-model a volumetric mesh by primarily modifying its surface through a set of topological and geometric operations. We support, in particular, local and global subdivision, various instantiations of extrusion, removal, and cloning of elements, the creation of non-conformal or conformal grids, as well as shape modifications through vertex positioning, including manual editing, automatic smoothing, or, optionally, projection onto an externally provided target surface. At the core of the efficient implementation of the method is the coherent maintenance, at all steps, of two parallel data structures: a hexahedral mesh representing the topology and geometry of the currently modeled shape, and a directed acyclic graph that connects operation nodes to the affected mesh hexahedra. Operations are realized by exploiting recent advancements in grid-based meshing, such as the mixing of 3-refinement, 2-refinement, and face-refinement, and by using templated topological bridges to enforce on-the-fly mesh conformity across pairs of adjacent elements. A direct manipulation user interface lets users control all operations. The effectiveness of our tool, released as open source to the community, is demonstrated by modeling several complex shapes that are hard to realize with competing tools and techniques.
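A schematic sketch of the two parallel structures described above, with hypothetical names and fields (not the actual HexBox implementation), might look like this:

```python
# Schematic sketch (hypothetical names; not the actual HexBox code): a hex mesh
# plus a DAG whose nodes record editing operations and the hexahedra they touch.
from dataclasses import dataclass, field

@dataclass
class HexMesh:
    verts: list   # (x, y, z) vertex positions
    hexa: list    # 8-tuples of vertex indices, one per hexahedron

    def num_hexa(self) -> int:
        return len(self.hexa)

@dataclass
class OpNode:
    name: str                                     # e.g. "extrude", "subdivide", "clone"
    params: dict                                  # operation-specific parameters
    affected: list = field(default_factory=list)  # indices of hexahedra touched
    children: list = field(default_factory=list)  # downstream operations (DAG edges)

class EditSession:
    """Keeps the mesh and the operation DAG coherent after every edit."""

    def __init__(self, mesh: HexMesh):
        self.mesh = mesh
        self.root = OpNode("load", {}, affected=list(range(mesh.num_hexa())))

    def apply(self, parent: OpNode, op: OpNode, edit_fn):
        # edit_fn mutates the mesh and returns the indices of new/changed
        # hexahedra, which are recorded on the node before it joins the DAG.
        op.affected = edit_fn(self.mesh)
        parent.children.append(op)
        return op
```

The point of the sketch is only the bookkeeping: every operation node knows which hexahedra it affected, mirroring the coupling between the mesh and the operation graph described in the abstract.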
Practical quad mesh simplification
In this paper we present an innovative approach to incremental quad mesh simplification, i.e. the task of producing a low-complexity quad mesh starting from a high-complexity one. The process is based on a novel set of strictly local operations which preserve quad structure. We show how good tessellation quality (e.g. in terms of vertex valencies) can be achieved by pursuing uniform lengths and canonical proportions of edges and diagonals. The decimation process is interleaved with smoothing in tangent space; the latter strongly contributes to identifying a suitable sequence of local modification operations. The method is naturally extended to manage the preservation of feature lines (e.g. creases) and varying (e.g. adaptive) tessellation densities. We also present an original triangle-to-quad conversion algorithm, used to obtain the initial quad mesh from a given triangle mesh, which behaves well in terms of geometric complexity and tessellation quality.
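A schematic sketch of the kind of decimation loop described above (hypothetical helper names; not the authors' implementation) could be:

```python
# Schematic sketch (hypothetical helpers; not the authors' code): greedy
# application of quad-preserving local operations, interleaved with
# tangent-space smoothing passes.
import heapq

def simplify(mesh, target_quads, candidate_ops, apply_op, smooth_tangent):
    """Reduce the quad count of `mesh` until `target_quads` is reached.

    candidate_ops(mesh)  -> iterable of (cost, op); the cost favours operations
        that keep edge/diagonal lengths uniform and proportions canonical.
    apply_op(mesh, op)   -> True if the operation was still legal and applied.
    smooth_tangent(mesh) -> smooths vertices while keeping them on the surface.
    """
    heap = [(cost, i, op) for i, (cost, op) in enumerate(candidate_ops(mesh))]
    heapq.heapify(heap)
    since_smoothing = 0
    while heap and mesh.num_quads() > target_quads:
        _, _, op = heapq.heappop(heap)
        if not apply_op(mesh, op):   # op may have been invalidated by earlier edits
            continue
        since_smoothing += 1
        if since_smoothing >= 50:    # interleave smoothing with decimation
            smooth_tangent(mesh)
            heap = [(c, i, o) for i, (c, o) in enumerate(candidate_ops(mesh))]
            heapq.heapify(heap)
            since_smoothing = 0
    return mesh
```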
Loopy Cuts: Surface-Field Aware Block Decomposition for Hex-Meshing
We present a new fully automatic block-decomposition hexahedral meshing
algorithm capable of producing high quality meshes that strictly preserve
feature curve networks on the input surface and align with an input surface
cross-field. We produce all-hex meshes on the vast majority of inputs, and
introduce localized non-hex elements only when the surface feature network
necessitates them. The input to our framework is a closed surface with a
collection of geometric or user-demarcated feature curves and a feature-aligned
surface cross-field. Its output is a compact set of blocks whose edges
interpolate these features and are loosely aligned with this cross-field. We
obtain this block decomposition by cutting the input model using a collection
of simple cutting surfaces bounded by closed surface loops. The set of cutting
loops spans the input feature curves, ensuring feature preservation, and is
obtained using a field-space sampling process. The computed loops are uniformly
distributed across the surface, cross each other orthogonally, and are loosely aligned
with the cross-field directions, inducing the desired block decomposition. We
validate our method by applying it to a large range of complex inputs and
comparing our results to those produced by state-of-the-art alternatives.
In contrast to prior approaches, our framework consistently produces high-quality field-aligned meshes while strictly preserving geometric or user-specified surface features.
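A high-level sketch of the pipeline described above (hypothetical function names; not the authors' code) is roughly:

```python
# High-level sketch (hypothetical functions; not the authors' code): sample
# cutting loops in field space, keep enough loops to span all feature curves,
# then cut the model into blocks.
def block_decompose(surface, feature_curves, cross_field,
                    sample_loops, spans_all_features, cut_model):
    """Return a coarse block decomposition of `surface`.

    sample_loops(surface, cross_field)        -> candidate closed loops,
        uniformly distributed and loosely aligned with the cross-field.
    spans_all_features(loops, feature_curves) -> True once every feature curve
        is interpolated by some selected loop.
    cut_model(surface, loops)                 -> the resulting blocks.
    """
    selected = []
    for loop in sample_loops(surface, cross_field):
        selected.append(loop)
        if spans_all_features(selected, feature_curves):
            break
    return cut_model(surface, selected)
```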
Boltzmann Suppression of Interacting Heavy Particles
Matsumoto and Yoshimura have recently argued that the number density of heavy
particles in a thermal bath is not necessarily Boltzmann-suppressed for T << M,
as power law corrections may emerge at higher orders in perturbation theory.
This fact might have important implications for the determination of WIMP relic densities. On the other hand, the definition of number densities in an interacting theory is not a straightforward procedure. It usually requires
renormalization of composite operators and operator mixing, which obscure the
physical interpretation of the computed thermal average. We propose a new
definition for the thermal average of a composite operator, which does not
require any new renormalization counterterm and is thus free from such
ambiguities. Applying this definition to the model of Matsumoto and Yoshimura
we find that it gives number densities which are Boltzmann-suppressed at any
order in perturbation theory. We also discuss heavy particles that are already unstable at T=0, showing that power law corrections do in general emerge in this case.
Comment: 7 pages, 5 figures. New section added with a discussion of the case of an unstable heavy particle. Version to appear in Phys. Rev.
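For orientation, the standard leading-order result around which the discussion revolves (a textbook formula, not a result of the paper) is:

```latex
% Free-theory equilibrium number density of a particle of mass M and g internal
% degrees of freedom, for T << M (standard leading-order result):
n_{\rm eq} = g \left(\frac{M T}{2\pi}\right)^{3/2} e^{-M/T}
\left[\, 1 + \mathcal{O}\!\left(\tfrac{T}{M}\right) \right] ,
% i.e. the density is exponentially (Boltzmann) suppressed; the question
% addressed above is whether interactions can replace this suppression by
% power-law corrections at higher orders in perturbation theory.
```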
Quad Meshing
Triangle meshes have been nearly ubiquitous in computer graphics, and a large body of data structures and geometry processing algorithms based on them has been developed in the literature. At the same time, quadrilateral meshes, especially semi-regular ones, have advantages for many applications, and significant progress has been made in quadrilateral mesh generation and processing during the last several years. In this State of the Art Report, we discuss the advantages and problems of techniques operating on quadrilateral meshes, including surface analysis and mesh quality, simplification, adaptive refinement, alignment with features, parametrization, and remeshing.