Architecture and Neurophenomenology: Rethinking the Pre-reflective Dimension of Architectural Experience
The essential benefit of neurophenomenological investigations in architecture lies
in their capacity to provide us with the closest currently available
approximation of the human being in its biological and cultural complexity, one
that can be used in architectural design and thinking. The prevailing praxis of creating
abstract, conceptual designs, which favor a reflexive and intellectualized over an
existential, perceptually based experience of architecture, is increasingly
recognized in architectural circles as a contradictory and inadequate interpretation
of our involvement with architectural spaces. This renewed interest in rethinking
the experiential dimension of architecture coincides with the
neurophenomenological understanding of recent interdisciplinary findings,
which unequivocally confirm that the experiencing – architectural – subject is a
profoundly embodied, enactive and situated human being.
Neurophenomenological analysis of architectural experience is aimed at
understanding the conditions of our embodiment, how we relate to the
architectural environment and, essentially, what it is about architecture that has
the capacity to sustain and nourish a meaningful human existence. By
emphasizing the pre-reflective dimension of experience, the intention is to raise
architects’ awareness that our engagement with and understanding of architectural
spaces are to a large extent determined by profoundly embodied and preconscious
processes. Importantly, neurophenomenology has the potential to
articulate architects’ implicit knowledge: there is neurophenomenologically
valid evidence that the workings of dynamically intertwined brain and bodily
mechanisms have been intuitively used by architects throughout architectural
history as a pre-reflective architecture-body communication, in order to shape
the overall embodied experience and atmosphere of an architectural setting.
Architectural theories such as the late-nineteenth-century idea of empathy (Einfühlung),
Le Corbusier’s promenade architecturale, Steven Holl’s “enmeshed experience”,
Juhani Pallasmaa’s “architecture as a verb”, Jan Gehl’s “life between buildings”,
have neurophenomenological correlates in the sensorimotor theory of perception,
mirror neurons, hard-wired emotional responses, brain plasticity and the
concept of enriched environments, to name but a few.
In this sense, a crucial advantage of a dialogue between architecture and
neurophenomenology lies in the compatibility of ideas already present in
architectural discourse and the theoretical background of
the neurophenomenological approach. Establishing a common ground facilitates
a more accurate definition and alignment of investigative goals, while
phenomenologically enriched scientific hypotheses allow for the exploration and
protection of the intrinsic poetic nature of architecture.
Health, survival and proximity to death: exploring health and mortality risks among Italian older adults
The banking supervision function from the perspective of the European Banking Union
The research analysed, from a historical and regulatory perspective, the evolution of the European and national rules governing the exercise of the banking supervision function by the Banca d'Italia within the Italian credit system, and the consequences produced on that system by the recent approval of the European Banking Union project.
Industry-Academia collaboration in the framework of Sustainability Science
This work starts from an analysis of the current sustainability crisis, which has led to unsustainable systems at the global, social and human level, in order to reaffirm the growing importance of the industrial sector, not in terms of a central role in pursuing an increasingly unsustainable path, but above all in terms of the role that the industrial sector still has to play in the transition towards sustainability. The work focuses mainly on the concept of sustainability and argues for the need for collaboration between industry and academia within the framework of Sustainability Science. Such collaboration aims to revolutionise the concept of scientific production, with the vision of a new scheme in which the complex sustainability-related problems faced by the whole business community, and the problems relevant to human, global and social systems, must find shared and cooperative solutions. The core elements of such a collaboration therefore involve intra-disciplinary and trans-disciplinary research, co-production of knowledge, co-evolution of complex systems and their environment, a learning-by-doing and doing-by-learning approach and, finally, system innovation rather than system optimisation. More simply, this new approach can be described as co-evolution, co-production and co-learning. Starting from the co-production of knowledge and from the learning-by-doing and doing-by-learning approach, the Industry-Academia Collaboration (IAC) process within this innovative scientific paradigm was initiated and shared with industry representatives through ad hoc meetings and conferences. In conclusion, my research work succeeded in establishing a method of collaboration between Industry and Academia within the broader context of Sustainability Science. The method is the "Closed Cycle Collaboration Process", based on a participatory and collaborative approach between industrial and academic partners, leading to a co-production of knowledge.
PhD thesis
Study of the underdoped region of the cuprate superconductors. An attempt to find a broken-symmetry state close to superconductivity, in analogy with the other unconventional superconductors. We propose electronic phases in analogy with soft-matter phases; they explain the neutron scattering experiments very well.
DYNAMIC LIGHT AND MEDIA SURFACES. ARCHITECTURE IN A NEW LIGHT
This research aims to analyse the issues involved in the complex theme of media architecture and how it has radically modified the concept of architectural space, redefining the hierarchical scale of the elements that constitute it and redesigning its design process, in an articulated interrelation with the new digital media. The communicative function of architecture surges dramatically the moment walls become blank sheets at the service of the media, transforming first the architectural envelope and then the interior surfaces into a communicative and propagandistic interface.
LOW FREQUENCY NOISE SUPPRESSION FOR THE DEVELOPMENT OF GRAVITATIONAL ASTRONOMY
The existence of gravitational radiation, predicted by the theory of General Relativity, was indirectly demonstrated by the observation of the orbital decay of the binary pulsar 1913+16, for which R.A. Hulse and J.H. Taylor were awarded the Nobel Prize in 1993. From then on, the direct detection of gravitational waves became a central goal of experimental physics, not only for the verification of the theory itself but, more importantly, because it can open a new "observation window" on the universe. In fact, many astronomical objects, such as neutron stars and black holes, can be directly studied only through their gravitational emission. Moreover, since their interaction with matter is intrinsically weak, the degradation of the information carried by gravitational waves is negligible, and their detection will allow us to understand the internal structure of the massive objects which emit them, while also providing a point of view complementary to traditional astronomy and cosmology.
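For reference, the orbital decay mentioned above is the standard quadrupole-formula prediction (a textbook result, not quoted in this abstract): a binary of masses m_1 and m_2 on a circular orbit of separation a radiates gravitational waves at a rate

\[ L_{\mathrm{GW}} = \frac{32}{5}\,\frac{G^4}{c^5}\,\frac{(m_1 m_2)^2\,(m_1+m_2)}{a^5}, \qquad \frac{da}{dt} = -\frac{64}{5}\,\frac{G^3}{c^5}\,\frac{m_1 m_2\,(m_1+m_2)}{a^3}, \]

so the loss of orbital energy makes the orbit shrink, which is the effect measured in the 1913+16 system.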
The direct detection must face the extreme weakness of gravitational radiation, hence highly sensitive detectors are required in order to reveal the quadrupolar effect produced by the passage of gravitational waves. The first attempts in this field were based on massive resonant bars, relying on the pioneering technique developed by J. Weber. In recent decades a more promising strategy based on interferometry was developed, combining the advantage of a wide detection band (from a few Hz to a few kHz) with an extreme sensitivity (the detectable differential displacement is smaller than the size of a proton). The global network of first-generation interferometric detectors, composed of Virgo, LIGO, GEO600 and TAMA300, demonstrated the feasibility of such a technique; in particular, the kilometre-scale detectors Virgo and LIGO achieved a sensitivity high enough to determine the first upper limits on the gravitational emission of some known neutron stars, such as the Crab and Vela pulsars. In the next few years the upgraded versions of these detectors, namely the second generation (such as Advanced Virgo and Advanced LIGO), will become operational and are expected to achieve the first direct detections of gravitational waves.
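To give a sense of the scale involved (with illustrative round numbers, not taken from this abstract): a typical strain amplitude h ≈ 10^{-21} acting on an arm of length L = 3 km, as in Virgo, corresponds to a differential length change

\[ \Delta L = h\,L \approx 10^{-21} \times 3\,\mathrm{km} \approx 3\times10^{-18}\,\mathrm{m}, \]

about a thousandth of the proton radius (≈ 0.8 × 10^{-15} m).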
However, the signal-to-noise ratio (SNR) of these first detections will be too low for precise astronomical studies of the gravitational wave sources and for complementing optical, radio and X-ray observations in the study of fundamental systems and processes in the Universe.
For this reason the investigation of the design of a new, third generation of detectors has already started, leading to the proposal of the European Einstein Telescope (ET). With a considerably improved sensitivity, these new machines will open the era of routine gravitational-wave astronomy, leading to the birth of a complete multimessenger astronomy. In particular, to enlarge the detector bandwidth down to around 1 Hz, where interesting gravitational signals, such as those emitted by rotating neutron stars, can be detected, a further reduction of the so-called low-frequency noise, with respect to the second-generation detectors, is required.
In this low-frequency band the main limitation to the sensitivity of an interferometric detector arises from thermal noise and, at lower frequencies, from seismic and Newtonian noise. The suppression of thermal noise will require the implementation of a cryogenic apparatus to cool the test masses down to about 10 K, so that the development of position-control devices capable of cryogenic operation will also be necessary for the suspension and payload control. Seismic attenuation was already achieved in first-generation detectors by means of long suspension chains of vertical and horizontal oscillators (e.g. the superattenuator of Virgo), so a further reduction requires a smaller seismic noise at the input of the suspension system; moreover, mass-density fluctuations produced by the seismic motion also induce a stochastic gravitational field (the so-called Newtonian or gravity-gradient noise) which shunts the suspension and couples directly to the mirrors of the interferometer. In order to suppress these two seismically generated noises, third-generation interferometers will be constructed in underground sites, where Rayleigh surface waves are attenuated and the surrounding rock layers are more homogeneous and stable, reducing the density fluctuations. The feasibility of a cryogenic and underground interferometer was already tested by the Japanese prototype detector CLIO, in the same site where KAGRA (formerly known as LCGT), the first full-scale interferometric detector based on these approaches, is currently under construction. In these respects, this second-generation detector will be the forerunner of third-generation interferometers such as ET; a collaboration between the two projects has therefore been established.
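As a rough illustration of the thermal-noise point above (the ideal case of a viscously damped harmonic oscillator, which the thesis itself generalizes to real dissipative systems), the fluctuation-dissipation theorem gives a displacement noise power spectral density

\[ S_x(\omega) = \frac{4\,k_B T\,\gamma}{m\left[(\omega_0^2-\omega^2)^2 + \gamma^2\omega^2\right]} \]

for an oscillator of mass m, resonance frequency \omega_0 and damping rate \gamma at temperature T: lowering T (and the mechanical losses) directly lowers the noise floor, which is the motivation for cooling the test masses to about 10 K.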
My experimental work is focused on the suppression of these low-frequency noise sources, so this thesis is structured around two parallel fields of research: the seismic characterization of a potential site for the construction of the Einstein Telescope, and the development, calibration and testing of a cryogenic vertical accelerometer, which can be used as a position-control device, analogous to those used in the current room-temperature superattenuator of Virgo, but also to check the vibrations introduced by the cryogenic apparatus, as I did with the measurements I performed on the cryostats of KAGRA, presented at the end of this thesis.
This thesis is divided into three main parts: in the first part I introduce the foundations of gravitational astronomy, from the theory and the astrophysical sources to the experiments which will lead to gravitational observations; in the second part I discuss the theory of low-frequency noise sources and their suppression; in the third part I present the experimental work I performed in this context.
Every part is composed of two chapters, structured as follows.
In the first chapter I describe the derivation of gravitational waves from Einstein's field equations, discussing their properties and the astrophysical and cosmological sources, especially those whose emission is expected at low frequencies.
In the second chapter I describe the direct interferometric detection of gravitational waves and the main noise sources which limit the sensitivity, concluding with an overview of present and future detectors.
In the third chapter I discuss the main features of the seismic and Newtonian noises, and the strategies necessary to suppress them, especially in third generation detectors.
In the fourth chapter I discuss the theory of thermal noise, from the ideal case of the damped harmonic oscillator to the real dissipative mechanical systems and optical components of the interferometer.
In the fifth chapter I present my experimental work on the long-period characterization of the Sos Enattos site in Sardinia (proposed for hosting the Einstein Telescope), from the construction and instrumentation of an underground array of sensors to the analysis of seismic and meteorological data collected in one year of observations.
Finally, in the sixth chapter I present my experimental work on the development of a cryogenic vertical accelerometer, from the design to the cryogenic calibration and tests at T = 20 K. In this chapter I also present the results of the implementation of this device in the cryostats dedicated to the test masses of KAGRA, where I verified the operation of the accelerometer at T = 8 K and measured the vibrations of the inner radiation shield of the cryostats. These measurements led to a first experimental estimate of the additional vibrational noise which will be injected by the cryogenic refrigerators into the detector test masses.
High Precision, High Performance Simulations of Astrophysical Stellar Systems
The main target of this work is the discussion of the modern techniques (software and hardware) suited to solving the N-body problem numerically, in order to develop a numerical code with the highest possible speed and accuracy. In particular, we will introduce a new high-precision, high-performance code (called \code) which solves the N-body problem by exploiting both a high-order time-integration algorithm (the 6th-order Hermite integrator) and modern hardware in the form of Graphics Processing Units (GPUs), which work as powerful computing accelerators. I will describe \code in detail, showing how GPUs can be efficiently exploited for gravitational N-body simulations up to a large number of particles, with a degree of precision and speed impossible to reach until five years ago. Being quite new technologies, GPUs have not been fully exploited so far; this is why, in this thesis, I will discuss modern numerical techniques associated with the N-body problem, from the set-up of initial conditions to the computation of the dynamical evolution of dense and populous stellar systems using GPUs and the two main languages (OpenCL and CUDA) used to program them.
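As a minimal illustration (a sketch, not the author's \code, whose algorithms are not reproduced here), the core of any direct-summation N-body code is the O(N^2) evaluation of the pairwise gravitational accelerations that GPU implementations parallelize; in the Python sketch below the softening length eps and the choice of units G = 1 are illustrative assumptions.

    import numpy as np

    def accelerations(pos, mass, eps=1.0e-3, G=1.0):
        # Direct-summation gravitational accelerations, O(N^2).
        # pos  : (N, 3) array of positions
        # mass : (N,) array of masses
        # eps  : Plummer softening length (illustrative value)
        acc = np.zeros_like(pos)
        for i in range(len(mass)):
            dr = pos - pos[i]                            # vectors from particle i to every particle j
            r2 = np.einsum('ij,ij->i', dr, dr) + eps**2  # softened squared distances
            inv_r3 = r2**-1.5
            inv_r3[i] = 0.0                              # exclude the self-interaction
            acc[i] = G * np.einsum('i,ij->j', mass * inv_r3, dr)
        return acc

    # Tiny usage example: two equal-mass bodies one length unit apart
    pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
    mass = np.array([1.0, 1.0])
    print(accelerations(pos, mass))   # nearly (+1, 0, 0) and (-1, 0, 0) in G = 1 units

On a GPU each particle i would typically be handled by one thread, which is what makes this brute-force evaluation scale to large N.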
I will also present results of the application of \code to the study of the emerging state, and rapid mass segregation, of intermediate-N, young stellar systems after their violent relaxation process. These objects have been investigated by simulating systems composed of stars of different masses, including a central stellar-mass black hole as well as a model of the residual gas of the parent cloud, starting from "cold" to "warm" initial conditions. Moreover, thanks to the high adaptability of the developed software, our group is investigating the formation and evolution of the innermost regions of galaxies (Nuclear Star Clusters). This is certainly a modern topic, which has not yet received an adequate self-consistent explanation from either a theoretical or a numerical point of view.