643 research outputs found

    Husserl and the Problem of Animal

    Get PDF
    The main aim of this paper is to give an overview of Husserl’s attempts to unfold the phenomenon of animal consciousness, and particularly the lowest level of subjectivity. I want to show that in this context Husserl’s notion of life has a peculiar importance. For Husserl, the phenomenon of life was essentially the inner, mental activity of a subject or consciousness. He understood life as a perpetual process of self-normalization: life normalizes itself on different levels of complexity. The reconstruction of the lowest level of subjectivity is therefore at the same time the reconstruction of the lowest and most rudimentary form of the self-normalization of life. Husserl had essentially three ways to approach the problem of the animal mind: empathy, eidetic variation, and dismantling-deconstructive reflection (“Abbau”) on the phenomenologist’s own subjectivity. The first concerns the problem of empathizing with anomalous subjects (such as animals), and the question of how wide the range of empathy is: towards which living beings can we be empathic in a phenomenologically legitimate way? The second is to grasp the eidetic (essential) structures of consciousness in general, remove eidetic moments and structures from it, and see which structures of subjectivity are so fundamental that no consciousness could be conceived without them. The third way is the dismantling-deconstructive approach to one’s own consciousness: subjectivity in general appears as having several main layers, and the phenomenologist abstracts from the higher layers in order to reach the deepest one. Once the lowest level is disclosed in this way, a subject can be reconstructed who possesses only the most fundamental, simplest structures and elements of subjectivity.

    Engaged Eco-phenomenology. An Eco-socialist stance based upon a phenomenological account of narrative identity

    Get PDF
    In my presentation, I will attempt to show how a phenomenologically consistent interpretation of narrative identity leads to eco-ethical and eco-political consequences. In particular, I will try to sketch the outlines of an eco-socialist theory which implies an egalitarian approach to all living beings and which is motivated by a phenomenological understanding of narrative identity. My presentation consists of two main parts. In the first part, I treat the relationship between freedom, responsibility, and narratively conceived personal identity from a phenomenological point of view. The main authors of this part are Husserl, Heidegger, Ricoeur, Lévinas, and László Tengelyi. For Husserl, the narrative aspect of personal identity was already an important topic. For Heidegger, our own decisions constitute our identity. But in my opinion there is a decisive factor with regard to our identity and freedom that remained marginal for Heidegger: the Other. The problem of the Other became central for Lévinas, and also for Ricoeur. László Tengelyi modified Ricoeur’s account of narrative identity on a decisive point: he drew attention to the role of “events of fate”, events that change the course of our lives fundamentally. In the second part I show the ethical and political implications of the first part. The way we actually treat the Other best shows who we really are. But the Other need not be a human being; it can be any living being whatsoever. Here I emphasize the eco-phenomenological motifs in Husserl (on this, see also Erazim Kohák), and I try to show how such motifs lead to an egalitarian, eco-socialist view of everything that lives.

    Point cloud filtering and segmentation methods

    Get PDF
    Point-cloud-generating devices, which are spreading widely these days, such as terrestrial and airborne laser scanners, various mobile mapping systems, and unmanned aerial vehicles, are routinely used in surveying and in many related engineering fields. Processing the resulting point clouds, which consist of several hundred million points, is however not always a simple task. Besides manual processing, which often proves time- and hardware-intensive, we can now also apply point cloud segmentation and classification procedures based on modern mathematical methods (e.g. iterative robust estimation) and on machine learning (e.g. density-based clustering, neural networks). These solutions use information that can be derived indirectly from the point cloud, such as point density, the direction of the normal vectors, or various eigenvalue-based features. In this article I present the different segmentation methods (attribute-based, edge-based, model-based, region-growing, and machine-learning-based), and I also cover their practical application. Through several examples I show how features derived from the raw point cloud can be exploited in various tasks, such as separating roof and wall points, filtering ground points, or the automated separation of point sets. Depending on the task at hand, the presented methods can in many cases offer a solution for the efficient processing of point clouds.
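    As a minimal illustration of the model-based segmentation mentioned in the abstract, the sketch below fits a plane with RANSAC to separate ground points from off-ground points. The synthetic scene, distance threshold, and iteration count are illustrative assumptions, not taken from the article.

```python
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.05, rng=None):
    """Fit a plane to a point cloud with RANSAC and return the inlier
    mask; a model-based segmentation usable e.g. for ground filtering."""
    rng = np.random.default_rng(rng)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        # plane hypothesis from three random points
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        normal /= norm
        # point-to-plane distances; keep the hypothesis with most inliers
        dist = np.abs((points - sample[0]) @ normal)
        mask = dist < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask

# synthetic scene: noisy ground plane z ~ 0 plus a "wall" of off-plane points
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(0, 10, 500), rng.uniform(0, 10, 500),
                          rng.normal(0.0, 0.01, 500)])
wall = np.column_stack([np.full(100, 5.0), rng.uniform(0, 10, 100),
                        rng.uniform(0.5, 3.0, 100)])
cloud = np.vstack([ground, wall])
inliers = ransac_plane(cloud, threshold=0.05, rng=1)  # True for ground points
```

    The same inlier/outlier split generalizes to roof or wall extraction by fitting further planes to the remaining points.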

    The effects of ballot access requirements and campaign subsidies on candidate entry

    Get PDF
    Like market competition, political competition takes place in a heavily regulated environment. In most European democracies, candidates can only get on the ballot after meeting certain requirements. Entry is restricted not only by the rules on passive suffrage but also by further conditions, such as collecting a specified number of voter endorsements or paying a deposit of a given amount. Rules on budgetary campaign subsidies work in exactly the opposite direction to these restrictions. Their purpose is to support financially the entry of parties and candidates into elections, to ensure the inclusiveness of the system, and to stimulate competition. It is remarkable that most countries apply some combination of these two opposing solutions. It is also questionable to what extent these solutions achieve the desired effect, and whether they can meaningfully limit the entry of irrelevant candidates or support the entry of new political actors. After reviewing and analysing the regulations of 27 EU member states, the study concludes that both candidacy criteria and campaign subsidies have only a minimal effect on electoral competition. Based on the results, it is worth considering that, except for cases where running brings significant financial benefits, candidacy rules could be relaxed, or even abandoned, without particular consequences.

    Parameter estimation for inspiraling eccentric compact binaries including pericenter precession

    Full text link
    Inspiraling supermassive black hole binary systems with high orbital eccentricity are important sources for space-based gravitational wave (GW) observatories like the Laser Interferometer Space Antenna (LISA). Eccentricity adds orbital harmonics to the Fourier transform of the GW signal, and relativistic pericenter precession leads to a three-way splitting of each harmonic peak. We study the parameter estimation accuracy for such waveforms with different initial eccentricity using the Fisher matrix method and a Monte Carlo sampling of the initial binary orientation. The eccentricity improves the parameter estimation by breaking degeneracies between different parameters. In particular, we find that the source localization precision improves significantly for higher-mass binaries due to eccentricity. The typical sky position errors are $\sim 1$ deg for a nonspinning, $10^7\,M_{\odot}$ equal-mass binary at redshift $z=1$, if the initial eccentricity 1 yr before merger is $e_0 \sim 0.6$. Pericenter precession does not affect the source localization accuracy significantly, but it does further improve the mass and eccentricity estimation accuracy systematically by a factor of 3--10 for masses between $10^6$ and $10^7\,M_{\odot}$ for $e_0 \sim 0.3$.
    Comment: 14 two-column pages, 12 figures, expanded version; contains the proof correction
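    The Fisher matrix method used in the abstract can be sketched on a toy signal model. The waveform, parameters, and noise level below are illustrative assumptions, not the waveform of the paper: the Fisher matrix F_ij = sum_t (dh/dtheta_i)(dh/dtheta_j)/sigma^2 is built from finite-difference derivatives, and its inverse bounds the parameter covariance (Cramér-Rao).

```python
import numpy as np

def waveform(theta, t):
    """Toy signal with two parameters: amplitude A and frequency f."""
    A, f = theta
    return A * np.sin(2.0 * np.pi * f * t)

def fisher_matrix(theta, t, sigma, eps=1e-6):
    """Fisher information for white noise of std `sigma`, with the
    partial derivatives taken by central finite differences."""
    theta = np.asarray(theta, dtype=float)
    n = len(theta)
    derivs = []
    for i in range(n):
        dp = theta.copy(); dp[i] += eps * max(abs(theta[i]), 1.0)
        dm = theta.copy(); dm[i] -= eps * max(abs(theta[i]), 1.0)
        derivs.append((waveform(dp, t) - waveform(dm, t)) / (dp[i] - dm[i]))
    F = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            F[i, j] = np.dot(derivs[i], derivs[j]) / sigma**2
    return F

t = np.linspace(0.0, 1.0, 4096)
theta0 = (1.0, 10.0)              # A = 1, f = 10 Hz (assumed values)
F = fisher_matrix(theta0, t, sigma=0.1)
cov = np.linalg.inv(F)            # Cramer-Rao bound on the covariance
errors = np.sqrt(np.diag(cov))    # 1-sigma errors on (A, f)
```

    Degeneracy breaking shows up in this formalism as a reduction of the off-diagonal correlations in `cov`, which shrinks the marginalized errors.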

    The hard life of air bubbles crossing a fluid/fluid interface

    Get PDF
    We investigate the dynamics of isolated air bubbles crossing the horizontal interface separating two Newtonian immiscible liquids initially at rest, by means of experiments and direct numerical simulation (DNS). High-speed video imaging is used to obtain a detailed evolution of the various interfaces involved in the system. The size of the bubbles and the viscosity contrast between the two liquids are varied by more than one and four orders of magnitude, respectively, making it possible to obtain bubble shapes ranging from spherical to toroidal. A variety of flow regimes is observed, including that of small bubbles remaining trapped at the fluid–fluid interface in a film-drainage configuration. In most cases, the bubble succeeds in crossing the interface without being stopped near its undisturbed position and, for a certain period of time, tows a significant column of the lower fluid, which sometimes exhibits complex dynamics as it lengthens in the upper fluid. Direct numerical simulations of several selected experimental situations are performed with a code employing a volume-of-fluid formulation of the incompressible Navier–Stokes equations. Comparisons between experimental and numerical results confirm the reliability of the computational approach in most situations, but also point out the need for improvements to capture some subtle but important physical processes, most notably those related to film drainage. The influence of the physical parameters highlighted by the experiments and computations, especially the density and viscosity contrasts between the two fluids and the various interfacial tensions, is discussed and analysed in the light of simple models.

    VB-MK-LMF: Fusion of drugs, targets and interactions using Variational Bayesian Multiple Kernel Logistic Matrix Factorization

    Get PDF
    Background: Computational fusion approaches to drug-target interaction (DTI) prediction, capable of utilizing multiple sources of background knowledge, have been reported to achieve superior predictive performance in multiple studies. Other studies showed that specificities of the DTI task, such as weighting the observations and focusing the side information, are also vital for reaching top performance. Method: We present Variational Bayesian Multiple Kernel Logistic Matrix Factorization (VB-MK-LMF), which unifies the advantages of (1) multiple kernel learning, (2) weighted observations, (3) graph Laplacian regularization, and (4) explicit modeling of the probabilities of binary drug-target interactions. Results: VB-MK-LMF achieves significantly better predictive performance on standard benchmarks than state-of-the-art methods, which can be traced back to multiple factors. The systematic evaluation of the effect of multiple kernels confirms their benefits, but also highlights the limitations of linear kernel combinations, already recognized in other fields. The analysis of the effect of prior kernels using varying sample sizes sheds light on the balance of data and knowledge in DTI tasks and on the rate at which the effect of the priors vanishes. This also shows the existence of "small sample size" regions where using side information offers significant gains. Alongside favorable predictive performance, a notable property of MF methods is that they provide a unified space for drugs and targets using latent representations. Compared to earlier studies, the dimensionality of this space proved to be surprisingly low, which makes the latent representations constructed by VB-MK-LMF especially well-suited for visual analytics. The probabilistic nature of the predictions allows the calculation of the expected values of hits in functionally relevant sets, which we demonstrate by predicting drug promiscuity. The variational Bayesian approximation is also implemented for general-purpose graphics processing units, yielding significantly improved computational times. Conclusion: In standard benchmarks, VB-MK-LMF shows significantly improved predictive performance in a wide range of settings. Beyond these benchmarks, another contribution of our work is highlighting and providing estimates for further pharmaceutically relevant quantities, such as promiscuity, druggability, and the total number of interactions. Availability: Data and code are available at http://bioinformatics.mit.bme.hu
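    The core of logistic matrix factorization can be illustrated on a toy interaction matrix. The sketch below is a plain maximum-likelihood variant fitted by gradient ascent, not the variational Bayesian, multiple-kernel method of the paper; the toy data and all hyperparameters are assumptions for illustration.

```python
import numpy as np

def logistic_mf(Y, k=2, lr=0.1, reg=0.01, n_iter=1000, seed=0):
    """Minimal logistic matrix factorization: model P(y_ij = 1) =
    sigmoid(u_i . v_j) and fit the latent factors U, V by gradient
    ascent on the Bernoulli log-likelihood with L2 regularization."""
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    U = 0.1 * rng.standard_normal((n, k))
    V = 0.1 * rng.standard_normal((m, k))
    for _ in range(n_iter):
        P = 1.0 / (1.0 + np.exp(-(U @ V.T)))  # predicted probabilities
        E = Y - P                              # residual = gradient factor
        # simultaneous update (right-hand sides use the old U and V)
        U, V = U + lr * (E @ V - reg * U), V + lr * (E.T @ U - reg * V)
    return U, V

# toy drug-target interaction matrix (rows: drugs, columns: targets)
Y = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]], dtype=float)
U, V = logistic_mf(Y, k=2)
P = 1.0 / (1.0 + np.exp(-(U @ V.T)))  # reconstructed interaction probabilities
```

    The rows of `U` and `V` are the shared latent space for drugs and targets mentioned in the abstract; with k = 2 they can be plotted directly for visual analytics.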

    Pipeline mode in C-based direct hardware implementation

    Get PDF
    In this paper a methodology is presented that enables pipelined operation of hardware blocks created by C-based direct hardware design. The method is embedded into the C-based design methodology worked out by the authors earlier. This pipeline-enabling method is rather flexible and requires no special effort. With the help of a simple state-machine-based entity, blocks of different execution times can build up the pipeline, even with data-dependent duration. A data-spreading technique ensures data consistency. Pipeline sectioning - choosing the right, balanced granularity versus the pipelining overhead - is an optimisation matter. Simulation results prove the correctness of the method.
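    The idea of a pipeline built from state-machine-based blocks with different, even data-dependent execution times can be illustrated with a small software simulation. The sketch below is a hypothetical Python model of the handshaking behaviour, not the authors' C-based methodology: each stage is a tiny state machine that accepts a token, computes for a (possibly data-dependent) number of cycles, and hands the result on only when the next stage can accept it.

```python
class Stage:
    """One pipeline block: idle -> busy (latency cycles) -> output waiting."""
    def __init__(self, func, latency):
        self.func = func          # the block's computation
        self.latency = latency    # cycles needed, may depend on the data
        self.data = None          # token currently being processed
        self.out = None           # finished token waiting to move on

    def can_accept(self):
        return self.data is None and self.out is None

    def tick(self, nxt):
        # hand a finished token to the next stage (or emit it at the end)
        if self.out is not None:
            if nxt is None:
                r, self.out = self.out, None
                return r
            if nxt.can_accept():
                nxt.data, nxt.busy = self.out, nxt.latency(self.out)
                self.out = None
        # advance this stage's own computation by one cycle
        if self.data is not None:
            self.busy -= 1
            if self.busy <= 0:
                self.out, self.data = self.func(self.data), None
        return None

def run(stages, inputs, max_cycles=200):
    inputs, results = list(inputs), []
    for _ in range(max_cycles):
        # tick back-to-front so a token advances one stage per cycle
        for i in reversed(range(len(stages))):
            nxt = stages[i + 1] if i + 1 < len(stages) else None
            r = stages[i].tick(nxt)
            if r is not None:
                results.append(r)
        if inputs and stages[0].can_accept():
            stages[0].data = inputs.pop(0)
            stages[0].busy = stages[0].latency(stages[0].data)
    return results

# stage 1 doubles, with a data-dependent latency; stage 2 adds one
s1 = Stage(lambda x: 2 * x, latency=lambda x: 1 + x % 3)
s2 = Stage(lambda x: x + 1, latency=lambda x: 1)
out = run([s1, s2], [1, 2, 3, 4])   # tokens emerge in order as 2*x + 1
```

    Because a stage holds its output until the successor is free, throughput adapts automatically to the slowest, data-dependent block, which is the behaviour the state-machine-based entity provides in hardware.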