Where to improve pedestrian streetscapes: Prioritizing and mapping street-level walkability interventions in Cape Town’s city centre
Pedestrian interventions for healthier and more inclusive
streetscapes can be powerful mechanisms to increase the
safety and comfort of walking in African cities. This article proposes a multiscale walkability analysis approach
to identify both suitable streets for pedestrian travel and
problematic areas requiring small-scale improvements
(e.g., pavement repairs, building maintenance, streetlights, and public seating). We applied a GIS-based
framework to the central urban area of Cape Town, South
Africa, which presents complex social and environmental
challenges. For each street-and-crossing segment, a virtual
pedestrian streetscape audit tool was used to collect micro- and mesoscale environmental indicators and assess
the quality of public space. This composite street-level
assessment tool was weighted with a space syntax analysis
indicator (i.e., spatial integration) to detect the network’s
most interconnected and high-priority pathways. The
Jenks natural breaks classification algorithm was used to
classify scores for each segment, which ultimately found
that the highest-priority streets for redevelopment are
clustered in Bo-Kaap, a relatively disadvantaged, multicultural, and hilly district on Cape Town’s west side.
Policy recommendations are presented to increase the
quality of the urban environment and the city's overall
attractiveness to pedestrians. The proposed methodology
facilitates more effective place management and prioritizes
the city's improvement needs, minimizing both time
and budget costs.
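The scoring pipeline described in the abstract (a composite audit score per segment, weighted by normalized space-syntax integration, then grouped with Jenks natural breaks) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the segment data are invented, and the Jenks implementation is a minimal dynamic-programming version.

```python
# Hypothetical sketch of the scoring pipeline: each street segment's composite
# audit score is weighted by its normalized space-syntax integration value,
# then the resulting priority scores are grouped with Jenks natural breaks.
# All segment data below are invented for illustration.

def normalize(values):
    """Min-max normalize a list of values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def jenks_breaks(data, k):
    """Minimal Jenks natural breaks: choose class boundaries that minimize
    the within-class sum of squared deviations (dynamic programming)."""
    data = sorted(data)
    n = len(data)

    def ssd(i, j):
        # Cost of placing data[i:j] in one class
        seg = data[i:j]
        mean = sum(seg) / len(seg)
        return sum((x - mean) ** 2 for x in seg)

    INF = float("inf")
    # best[c][j]: minimal cost of splitting data[:j] into c classes
    best = [[INF] * (n + 1) for _ in range(k + 1)]
    split = [[0] * (n + 1) for _ in range(k + 1)]
    best[0][0] = 0.0
    for c in range(1, k + 1):
        for j in range(c, n + 1):
            for i in range(c - 1, j):
                cost = best[c - 1][i] + ssd(i, j)
                if cost < best[c][j]:
                    best[c][j] = cost
                    split[c][j] = i
    # Recover the upper bound of each class
    breaks, j = [], n
    for c in range(k, 0, -1):
        breaks.append(data[j - 1])
        j = split[c][j]
    return breaks[::-1]

# Invented example data for five street segments
audit = [0.4, 0.9, 0.7, 0.2, 0.6]       # composite streetscape audit score
integration = [120, 300, 250, 80, 210]  # space-syntax integration

priority = [a * w for a, w in zip(audit, normalize(integration))]
classes = jenks_breaks(priority, 3)     # three priority classes
```

The design choice worth noting is the multiplication: a poorly rated segment only becomes high priority if it is also highly integrated into the pedestrian network, which is how the method concentrates interventions where they matter most.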
Machine learning and case-based reasoning for real-time onboard prediction of the survivability of ships
The subject of damaged stability has greatly profited from the development of new tools and techniques in recent history. Specifically, increased computational power and the probabilistic approach have transformed the subject, increasing accuracy and fidelity and thus allowing for universal application and the inclusion of the most probable scenarios. Currently, all ships are evaluated for their stability and are expected to survive the dangers they will most likely face. However, further advancements in simulations have made it possible to further increase the fidelity and accuracy of simulated casualties. Multiple time-domain and, to a lesser extent, computational fluid dynamics (CFD) solutions have been suggested as the next “evolutionary” step for damage stability. However, while those techniques are demonstrably more accurate, the computational power required to utilize them for probabilistic evaluation is not yet available. In this paper, the authors present a novel approach that serves as a stopgap measure for introducing time-domain simulations into the existing framework. Specifically, the methodology presented serves as a fast decision support tool able to provide information regarding an ongoing casualty by utilizing prior knowledge gained from simulations. This work was developed for the purposes of the EU-funded project SafePASS.
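The core of a case-based reasoning approach like the one described is the retrieval step: the observed casualty is matched against a database of precomputed time-domain simulations, and the outcomes of the most similar cases support the onboard prediction. The sketch below is a hedged illustration, not the authors' implementation; the feature set, scaling factors, and case data are all invented.

```python
# Illustrative case-based reasoning lookup (not the SafePASS implementation):
# query a database of precomputed damage-simulation outcomes with the observed
# casualty parameters and take a majority vote over the k nearest cases.
# Feature names, scales, and stored cases are assumptions for the sketch.
import math

# Each stored case: (damage_length_m, damage_location_frac, draught_m,
# sea_state) mapped to the simulated survival outcome.
case_base = [
    ((4.0, 0.30, 8.2, 3), True),
    ((12.0, 0.55, 8.5, 5), False),
    ((6.5, 0.40, 8.3, 4), True),
    ((15.0, 0.60, 8.6, 6), False),
]

def distance(a, b, scales=(15.0, 1.0, 10.0, 8.0)):
    """Euclidean distance with per-feature scaling so units are comparable."""
    return math.sqrt(sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, scales)))

def predict_survival(query, k=3):
    """Majority vote over the k most similar precomputed casualty cases."""
    ranked = sorted(case_base, key=lambda case: distance(case[0], query))
    votes = [survived for _, survived in ranked[:k]]
    return sum(votes) > k / 2

print(predict_survival((5.0, 0.35, 8.2, 3)))  # prints True
```

Because the lookup is only a nearest-neighbour search over precomputed results, it runs in real time onboard, which is exactly the stopgap role the abstract describes: the expensive time-domain simulations are performed offline, in advance.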
SafePASS Project: A Risk Modelling Tool for Passenger Ship Evacuation and Emergency Response Decision Support
One of the biggest challenges in the field of maritime safety is the integration of all the systems related to evacuation and emergency response under one decision support tool that can broadly cover all emergency cases and assist in the coordination of the evacuation process. Moreover, for a decision support tool to be useful, it must be able to calculate the available time to evacuate based on real-time data, such as the passenger distribution on board, as well as data from the various sensors that monitor the damage and its propagation. To this end, the risk modelling tool developed in the SafePASS H2020 project is able to estimate potential fatalities both in the design phase and in real time, assessing the evacuation and abandonment risk dynamically based on real-time data related to passenger distribution, route, semantics, LSA availability, procedural changes, and damage case (fire or flooding) propagation.
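The central comparison such a risk model makes is between the time required to evacuate each space and the time available before conditions become untenable. A minimal sketch of that logic follows; the zone figures and the uniform-egress assumption are placeholders, not the SafePASS model.

```python
# Minimal sketch of an evacuation risk estimate (not the SafePASS tool):
# fatalities can only arise in zones where the required time to evacuate
# exceeds the available time. The uniform egress-rate assumption and all
# zone numbers below are illustrative.

def expected_fatalities(zones):
    """Sum expected fatalities over zones; within a zone, the fraction of
    occupants still present when the available time runs out is at risk."""
    total = 0.0
    for occupants, required_min, available_min in zones:
        if required_min <= available_min:
            continue  # everyone clears the zone in time
        # Assume occupants leave at a uniform rate over required_min
        fraction_remaining = 1.0 - available_min / required_min
        total += occupants * fraction_remaining
    return total

# (occupants, required time to evacuate, available time) per zone, in minutes
zones = [(200, 12.0, 20.0), (150, 25.0, 20.0), (80, 30.0, 15.0)]
print(expected_fatalities(zones))
```

In a real-time setting, the inputs above would be refreshed continuously from sensor and passenger-distribution data, which is what makes the risk assessment dynamic rather than a fixed design-phase figure.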
Predictors of paroxysmal atrial fibrillation: Analysis of 24-hour ECG Holter monitoring
Aim. To identify the main predictors of the development of this arrhythmia.
Material and methods. A single-center case-control study was conducted. Of the 6630 protocols of 24-hour ECG monitoring analyzed, paroxysmal AF was detected in 97 people as an incidental finding. These patients were included in the main study group. The control group consisted of 99 patients from the same cohort without paroxysmal AF, with anthropometric and comorbidity parameters similar to those of the main group.
Results. In the absolute majority (97.9%) of patients in the main group in whom paroxysmal AF was detected, a special variant of extrasystole was revealed: an early atrial extrasystole of the “P on T” type (versus 4.0% of patients in the control group) [OR 8461.648 (382.1983; 187336)]. The number of supraventricular single, paired, and group extrasystoles was significantly higher in the main group, whereas the number of ventricular extrasystoles did not differ significantly.
Conclusion. One of the main ECG predictors of the development of paroxysmal AF in asymptomatic patients is the appearance of a supraventricular extrasystole of the “P on T” type. In the mechanism of formation of an AF paroxysm during a supraventricular extrasystole of the “P on T” type, not only electrophysiological mechanisms play a role but also the biomechanics of the heart.
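The “P on T” pattern is a timing relationship: a premature atrial beat so early that its P wave falls on the preceding beat's T wave. A rough illustration of how such beats might be flagged automatically from RR-interval annotations follows. This is not the study's method; the assumed P-wave lead time and the textbook Bazett QT estimate are both stated assumptions.

```python
# Hedged illustration of flagging a "P on T" supraventricular extrasystole
# from Holter RR intervals: a premature beat whose coupling interval is short
# enough that its P wave lands inside the preceding beat's T wave, here
# approximated by the QT interval. Thresholds are textbook approximations,
# not the study's algorithm.
import math

def bazett_qt(rr_s, qtc_s=0.40):
    """Estimated QT interval from the preceding RR via Bazett's formula."""
    return qtc_s * math.sqrt(rr_s)

def flag_p_on_t(rr_intervals_s, p_lead_s=0.16):
    """Return indices of beats whose estimated P-wave onset (coupling
    interval minus an assumed lead time) falls within the prior beat's QT."""
    flags = []
    for i in range(1, len(rr_intervals_s)):
        coupling = rr_intervals_s[i]      # interval to the premature beat
        prior_rr = rr_intervals_s[i - 1]
        p_onset = coupling - p_lead_s     # rough P-wave onset time
        if p_onset < bazett_qt(prior_rr):
            flags.append(i)
    return flags

# Sinus rhythm at ~1.0 s RR with one very early atrial beat (0.42 s coupling)
rr = [1.0, 1.0, 0.42, 1.0, 1.0]
print(flag_p_on_t(rr))  # prints [2]
```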
Appropriateness criteria for cardiovascular imaging use in clinical practice: a position statement of the ESC/EACVI taskforce
There is a growing interest from the scientific community in the appropriate use of cardiovascular imaging techniques for diagnosis and decision making in Europe. To develop appropriateness criteria for cardiovascular imaging use in clinical practice in Europe, a dedicated taskforce has been appointed by the European Society of Cardiology (ESC) and the European Association of Cardiovascular Imaging (EACVI). The present paper describes the appropriateness criteria development process.
On the electrification of CO2-based methanol synthesis via a reverse water–gas shift: a comparative techno-economic assessment of thermo-catalytic and plasma-assisted routes
A thorough cost analysis based on the conceptual process design of a two-step CO2-to-methanol synthesis route is performed, comprising CO2 hydrogenation in an electrified reverse water-gas shift (RWGS) reactor followed by a conventional methanol synthesis reactor. For the former step, both thermal and nonthermal plasma reactors are considered, i.e., direct current (DC) arc and microwave (MW) plasma, respectively, and benchmarked against the conventional thermo-catalytic counterpart. It is found that employing any type of plasma promotes higher CO2 conversions in the RWGS step than the conventional thermo-catalytic reactors (82-90% vs 61%) and thereby higher single-pass methanol yields (24-27% vs 17%). This comes at the expense of a higher electricity demand, which has only a minor effect on the process economics, since the green H2 utilized in RWGS and methanol synthesis is the cost driver. The economic analysis shows that current green H2 prices (2022 scenario) render the two-step CO2-to-methanol process economically unviable regardless of the reactor technology used, attaining an approximately 4-fold higher levelized cost of methanol (LCOM), 1875-1900 per ton, compared to the state-of-the-art route, i.e., syngas production through steam methane reforming (SMR) and coal gasification, followed by WGS and methanol synthesis reactors. However, the two-step CO2-to-methanol route could become viable in the long term (2050 scenario), driven by lower costs of electricity (10 per MW h) and green H2 (1.0 per kg) along with avoided-emission credits. This originates from the lower greenhouse gas (GHG) emissions that the two-step CO2-to-methanol route attains compared with the state-of-the-art.
In the 2050 frame, plasma technologies are anticipated to be at least 45% more profitable than thermo-catalytic reactors, while the profitability of nonthermal plasmas would improve significantly if vacuum operation were avoided, mitigating the excessive compression energy demand and thereby decreasing the operating cost.
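The structure behind a levelized-cost figure like the LCOM quoted above is simple: annualized capital cost plus yearly operating cost, divided by yearly product output. The sketch below shows that structure with placeholder inputs; none of the numbers are the paper's actual cost data.

```python
# Minimal levelized-cost-of-methanol (LCOM) sketch under assumed inputs,
# illustrating the structure behind figures like the cited 1875-1900 per ton.
# All numbers below are placeholders, not the paper's cost data.

def annualize(capex, rate, years):
    """Capital recovery factor: spread CAPEX over the plant lifetime."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return capex * crf

def lcom(capex, opex_per_year, rate, years, tons_per_year):
    """Levelized cost per ton of methanol."""
    return (annualize(capex, rate, years) + opex_per_year) / tons_per_year

# Placeholder inputs: 100 M CAPEX, 8% discount rate, 20-year lifetime,
# 60 M/year OPEX (dominated by green H2 purchase), 100 kt/year methanol
cost = lcom(100e6, 60e6, 0.08, 20, 100e3)
print(round(cost, 1))  # cost per ton of methanol
```

Note how the OPEX term dominates here, which mirrors the abstract's finding that green H2 price, not electricity demand, is the cost driver.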
An assessment of electrified methanol production from an environmental perspective
How green is an electrified methanol production process? Up to 43% greenhouse gas emission curbing is possible when renewable electricity is utilized to drive a novel plasma-assisted dry methane reforming-based process.

The sustainability of novel electrified processes for methanol production, including plasma-assisted and electrically heated thermocatalytic dry methane reforming-based processes, is assessed. Conceptual process design is applied to obtain the life cycle inventory data for the ex-ante life cycle assessment, with a focus on climate change impacts expressed in kg CO2-eq. per kg MeOH. The plasma-assisted technology results in lower greenhouse gas emissions than the conventional thermocatalytic counterpart when the plasma reactor itself is powered by renewable (solar or wind) electricity. This also holds for most of the environmental indicators; only a few trade-offs on (eco)toxicity, particulate matter, and mineral resource indicators were found, owing to the impact of wind turbine construction. For a fully electrified modus operandi, i.e., when all unit operations are electrified by renewable sources, both the plasma-assisted and thermocatalytic technologies result in low climate change impacts, in the range of 0.6-0.7 kg CO2-eq. per kg MeOH. This is comparable to the climate change impact of CO2-based methanol production utilizing electrolytic H2. Finally, it is estimated that up to 43% CO2 abatement may be possible by replacing the state-of-the-art (natural gas steam reforming-based) methanol production process with electrified alternatives running on renewable electricity.
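The headline abatement percentage is a straightforward relative comparison of per-kilogram impacts. A back-of-the-envelope sketch follows; the conventional benchmark value is an assumption, while the electrified value uses the 0.6-0.7 kg CO2-eq. per kg MeOH range reported in the abstract.

```python
# Back-of-the-envelope abatement comparison: climate-change impact per kg of
# methanol for the conventional route versus an electrified route. The
# conventional benchmark (1.1 kg CO2-eq./kg) is an assumed placeholder; the
# electrified value is the midpoint of the abstract's reported 0.6-0.7 range.

def abatement_pct(conventional, electrified):
    """Percentage GHG reduction relative to the conventional route."""
    return 100.0 * (conventional - electrified) / conventional

conventional_kg_co2eq = 1.1   # assumed SMR-based methanol benchmark
electrified_kg_co2eq = 0.65   # midpoint of the reported 0.6-0.7 range

print(round(abatement_pct(conventional_kg_co2eq, electrified_kg_co2eq), 1))
```

With these placeholder inputs the reduction comes out near 41%, in the same range as the up-to-43% figure the assessment reports; the exact percentage depends entirely on the benchmark chosen for the conventional route.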