
    On Paul Kirchhof's Draft Reform for the Renewal of Tax Law

    How should the draft reform of German tax law presented this year by Paul Kirchhof be judged? Dieter Dziadkowski, member of the Origin Principle Commission and of the (Bareis) Income Tax Commission, emphasises that the reform draft is "a holistic work" aimed at simplifying tax law and creating an even tax burden for citizens. In particular, tax bases riddled with holes, which arose through numerous exemption provisions and led to excessive tax rates, should be designed more appropriately, and the multitude of tax types should be abolished and reduced to four. In his view, implementing Paul Kirchhof's proposal could lead to a truly "Great Tax Reform", "provided the current decision-makers could bring themselves, despite the euro crisis, to adopt an improved tax policy". Franz W. Wagner, University of Tübingen, is more sceptical. In his view, the short statute would replace the construction principle of existing tax law, based on code law, with a system of case law whose contribution to tax simplification is doubtful. Because the methods of income determination would continue to differ with regard to the breadth and periodisation of tax bases, undesirable allocative effects that have not arisen so far would emerge. Finally, the planned low income tax rate of 25% would produce favourable results above all for high incomes. Rolf Peffekoven, University of Mainz, sees considerable problems in a possible implementation of the Kirchhof model, for example for the value added tax, arising from restrictions under EU law, and likewise points out that the intended income tax reform would bring relief above all for higher earners. This would be offset by markedly higher value added tax burdens, which would primarily affect recipients of lower incomes.
    Keywords: tax reform, tax law, tax policy, tax assessment, income tax, Germany

    How simple can a tax system be?

    How should a future tax system be structured? How simple can it be? For Prof. Dr. Hans-Wolfgang Arndt, University of Mannheim, the Karlsruhe draft income tax code is exemplary. Prof. Dr. Dr. h.c. Franz W. Wagner, University of Tübingen, sees the greatest deficit of the debate on tax simplification in the fact "that citizens are not told that heavily simplified tax systems cannot take over the existing redistributive and incentive functions of the tax system... If citizens were instead informed that deep interventions in incentive and distribution mechanisms are the real problem of any tax reform, the slogan of tax simplification would probably quickly lose its consensus-mobilising function." Friedrich Merz, CDU/CSU parliamentary group, sees the goal of simplifying tax law as "restoring comprehensibility. Tax law that is understandable and traceable for taxpayers and the tax administration creates acceptance and legal certainty."
    Keywords: tax reform, tax system, tax simplification, Germany

    The future of Earth observation in hydrology

    In just the past 5 years, the field of Earth observation has progressed beyond the offerings of conventional space-agency-based platforms to include a plethora of sensing opportunities afforded by CubeSats, unmanned aerial vehicles (UAVs), and smartphone technologies that are being embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically of the order of 1 billion dollars per satellite and with concept-to-launch timelines of the order of 2 decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturize sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing ultra-high (3-5 m) resolution sensing of the Earth on a daily basis. Start-up companies that did not exist a decade ago now operate more satellites in orbit than any space agency, and at costs that are a mere fraction of traditional satellite missions. With these advances come new space-borne measurements, such as real-time high-definition video for tracking air pollution, storm-cell development, flood propagation, precipitation monitoring, or even for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths, floods, and estimated evaporation at sub-metre resolutions, pushing back on spatio-temporal constraints and delivering new process insights.
At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen scientists to catalogue photos of environmental conditions, estimate daily average temperatures from battery state, and sense other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high-altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the "internet of things" as an entirely new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is unclear. Nor is it clear whether this deluge of data will be usefully exploited, either because the measurements are superfluous, inconsistent, or not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink on how we utilize and exploit these new observing systems.
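The cell-tower technique mentioned above rests on a simple physical relation: rain-induced attenuation along a microwave link grows as a power law of the rain rate, k = a·R^b, with k the specific attenuation in dB/km. A minimal sketch of the inversion follows, using hypothetical round-number coefficients; operational values depend on link frequency and polarisation (cf. ITU-R P.838):

```python
# Illustrative sketch (not from the abstract): retrieve a path-averaged rain
# rate from the rain-induced attenuation measured on a microwave link, by
# inverting the k-R power law k = a * R**b.
# The coefficients a and b below are made-up round numbers for illustration.

def rain_rate_from_attenuation(path_attenuation_db, path_length_km,
                               a=0.1, b=1.1):
    """Invert k = a * R**b, where k is the specific attenuation in dB/km."""
    k = path_attenuation_db / path_length_km   # specific attenuation (dB/km)
    return (k / a) ** (1.0 / b)                # rain rate R in mm/h

# Example: 6 dB of rain-induced attenuation over a 3 km link
print(round(rain_rate_from_attenuation(6.0, 3.0), 1))  # -> 15.2
```

In practice the rain-induced part of the attenuation must first be separated from the dry baseline and wet-antenna effects; the sketch assumes that has already been done.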

    Changes in union membership over time : a panel analysis for West Germany

    Despite the apparent stability of the wage bargaining institutions in West Germany, aggregate union membership has been declining dramatically since the early 1990s. However, aggregate gross membership numbers do not distinguish by employment status, and it is impossible to disaggregate them sufficiently. This paper uses four waves of the German Socioeconomic Panel in 1985, 1989, 1993, and 1998 to perform a panel analysis of net union membership among employees. We estimate a correlated random effects probit model suggested in Chamberlain (1984) to take proper account of individual specific effects. Our results suggest that at the individual level the propensity to be a union member has not changed considerably over time. Thus, the aggregate decline in membership is due to composition effects. We also use the estimates to predict net union density at the industry level based on the IAB employment subsample for the period 1985 to 1997. JEL classification: J
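The correlated random effects probit of Chamberlain (1984) is, in its commonly used Mundlak form, a pooled probit in which each observation's covariates are augmented with that individual's time means, so that the individual effect can be correlated with the regressors. A minimal, pure-Python sketch of that data-construction step (the variable names and toy data are hypothetical, not from the paper):

```python
# Sketch of the Mundlak/Chamberlain device behind a correlated random
# effects probit: augment each panel observation's covariate with the
# individual-specific time mean, then feed the result to a pooled probit.
# Only the data step is shown; the estimation itself is omitted.

from collections import defaultdict

def add_individual_means(panel):
    """panel: list of (person_id, x) rows with x a time-varying covariate.
    Returns (person_id, x, x_bar_i) rows, where x_bar_i is the person mean."""
    sums, counts = defaultdict(float), defaultdict(int)
    for pid, x in panel:
        sums[pid] += x
        counts[pid] += 1
    means = {pid: sums[pid] / counts[pid] for pid in sums}
    return [(pid, x, means[pid]) for pid, x in panel]

rows = add_individual_means([(1, 2.0), (1, 4.0), (2, 1.0)])
print(rows)  # each row now carries the person mean of x alongside x itself
```

Including the person means as extra regressors is what distinguishes this from an ordinary pooled probit: the coefficients on the within-variation are then purged of the individual-specific effect.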

    Suppression of soft nuclear bremsstrahlung in proton-nucleus collisions

    Photon energy spectra up to the kinematic limit have been measured in 190 MeV proton reactions with light and heavy nuclei to investigate the influence of the multiple-scattering process on the photon production. Relative to the predictions of models based on a quasi-free production mechanism, a strong suppression of bremsstrahlung is observed in the low-energy region of the photon spectrum. We attribute this effect to the interference of photon amplitudes due to multiple scattering of nucleons in the nuclear medium.
    Comment: 12 pages, 3 figures, submitted to Phys. Rev. Lett.

    First steps towards a fast-neutron therapy planning program

    Background: The Monte Carlo code GEANT4 was used to implement first steps towards a treatment planning program for fast-neutron therapy at the FRM II research reactor in Garching, Germany. Depth dose curves were calculated inside a water phantom using measured primary neutron and simulated primary photon spectra and compared with depth dose curves measured earlier. The calculations were performed with GEANT4 in two different ways: simulating a simple box geometry, and splitting this box into millions of small voxels (this was done to validate the voxelisation procedure that was also used to voxelise the human body).
    Results: In both cases, the dose distributions were very similar to those measured in the water phantom, up to a depth of 30 cm. In order to model the situation of patients treated at the FRM II MEDAPP therapy beamline for salivary gland tumors, a human voxel phantom was implemented in GEANT4 and irradiated with the implemented MEDAPP neutron and photon spectra. The 3D dose distribution calculated inside the head of the phantom was similar to the depth dose curves in the water phantom, with some differences that are explained by differences in elementary composition. The lateral dose distribution was studied at various depths. The calculated cumulative dose volume histograms for the voxel phantom show the exposure of organs at risk surrounding the tumor.
    Conclusions: In order to minimize the dose to healthy tissue, a conformal treatment is necessary. This can only be accomplished with the help of an advanced treatment planning system like the one developed here. Although all calculations were done for absorbed dose only, any biological dose weighting can be implemented easily, to take into account the increased radiobiological effectiveness of neutrons compared to photons.
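The box-versus-voxel validation described in the Background can be illustrated with a toy one-dimensional model: deposit an exponentially attenuated depth dose either as one analytic integral over the 30 cm phantom or as a sum over thin voxels, and check that the two agree. The attenuation coefficient below is a made-up illustrative value, not a GEANT4 or FRM II result:

```python
# Toy consistency check of voxelising a dose calculation: the integral of an
# exponential depth-dose profile over one solid slab must equal the sum of
# the contributions of the thin voxels the slab is split into.

import math

MU = 0.1          # hypothetical effective attenuation coefficient (1/cm)
DEPTH = 30.0      # phantom depth in cm

def analytic_integral():
    # integral of exp(-MU * z) dz from 0 to DEPTH, evaluated in closed form
    return (1.0 - math.exp(-MU * DEPTH)) / MU

def voxelised_integral(n_voxels):
    dz = DEPTH / n_voxels
    # midpoint rule over the voxel centres, one term per voxel
    return sum(math.exp(-MU * (i + 0.5) * dz) for i in range(n_voxels)) * dz

# With millions of voxels the two agree to high precision; even a few
# thousand suffice for this smooth profile.
print(abs(voxelised_integral(3000) - analytic_integral()) < 1e-4)  # -> True
```

The real validation in the paper compares full Monte Carlo transport in the two geometries; this sketch only shows why agreement is the expected outcome when the voxelisation itself introduces no error.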

    Molecular matched targeted therapies for primary brain tumors—a single center retrospective analysis

    PURPOSE: Molecular diagnostics, including next-generation gene sequencing, are increasingly used to determine options for individualized therapies in brain tumor patients. We aimed to evaluate the decision-making process of molecular targeted therapies and analyze data on tolerability as well as signals for efficacy. METHODS: Via retrospective analysis, we identified primary brain tumor patients who were treated off-label with a targeted therapy at the University Hospital Frankfurt, Goethe University. We analyzed which types of molecular alterations were utilized to guide molecular off-label therapies and the diagnostic procedures for their assessment during the period from 2008 to 2021. Data on tolerability and outcomes were collected. RESULTS: 413 off-label therapies were identified, with an increasing annual number for the interval after 2016. 37 interventions (9%) were targeted therapies based on molecular markers. Glioma and meningioma were the most frequent entities treated with molecular matched targeted therapies. Rarer entities included, for example, medulloblastoma and papillary craniopharyngioma. Molecular targeted approaches included checkpoint inhibitors and inhibitors of mTOR, FGFR, ALK, MET, ROS1, PIK3CA, CDK4/6, BRAF/MEK and PARP. Responses in the first follow-up MRI were partial response (13.5%), stable disease (29.7%) and progressive disease (46.0%). There were no new safety signals. Adverse events with fatal outcome (CTCAE grade 5) were not observed. Only two patients discontinued treatment due to side effects. Median progression-free and overall survival were 9.1/18 months in patients with at least stable disease, and 1.8/3.6 months in those with progressive disease at the first follow-up MRI. CONCLUSION: A broad range of actionable alterations was targeted with available molecular therapeutics. However, efficacy was largely observed in entities with paradigmatic oncogenic drivers, in particular with BRAF mutations. Further research on biomarker-informed molecular matched therapies is urgently necessary. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s11060-022-04049-w