Poster: A Real-World Distributed Infrastructure for Processing Financial Data at Scale
Financial markets are event- and data-driven to an extremely high degree. To
make decisions and trigger actions, stakeholders require notifications about
significant events and reliable background information that meet their
individual requirements in terms of timeliness, accuracy, and completeness. As
one of Europe's leading providers of financial data and regulatory solutions,
vwd processes an average of 18 billion event notifications from 500+ data
sources for 30 million symbols per day. Our large-scale distributed event-based
systems handle daily peak rates of 1+ million event notifications per second
and additional load generated by singular pivotal events with global impact. In
this poster we give practical insights into our IT systems. We outline the
infrastructure we operate and the event-driven architecture we apply at vwd. In
particular we showcase the (geo)distributed publish/subscribe broker network we
operate across locations and countries to provide market data to our customers
with varying quality of information (QoI) properties.
Comment: Authors' version of the accepted submission; final version published
by ACM as part of the proceedings of DEBS '19: The 13th ACM International
Conference on Distributed and Event-based Systems (DEBS '19); 2 pages, 1
figure; vwd Vereinigte Wirtschaftsdienste GmbH is now known as Infront
Financial Technology GmbH (part of the Infront group).
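The topic-based publish/subscribe pattern at the core of such a broker network can be sketched minimally as follows. This is an illustrative toy, not vwd's actual system; the class and topic names are hypothetical.

```python
# Minimal topic-based publish/subscribe sketch. Subscribers register a
# callback for a topic; the broker delivers each published event notification
# to every subscriber of that topic. All names here are hypothetical.
from collections import defaultdict


class Broker:
    def __init__(self):
        # topic -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Fan the event out to all subscribers of the topic.
        for callback in self._subscribers[topic]:
            callback(event)


broker = Broker()
received = []
broker.subscribe("quotes/ACME", received.append)
broker.publish("quotes/ACME", {"symbol": "ACME", "bid": 101.5, "ask": 101.7})
broker.publish("quotes/OTHER", {"symbol": "OTHER", "bid": 9.9, "ask": 10.0})
# Only the ACME notification reaches the subscriber above.
```

A geo-distributed deployment would chain brokers like this across sites, forwarding subscriptions and events between them.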
Poster: Benchmarking Financial Data Feed Systems
Data-driven solutions for the investment industry require event-based backend
systems to process high-volume financial data feeds with low latency, high
throughput, and guaranteed delivery modes.
At vwd we process an average of 18 billion incoming event notifications from
500+ data sources for 30 million symbols per day and handle peak rates of 1+
million notifications per second using custom-built platforms that keep audit
logs of
every event.
We are currently assessing modern open source event-processing platforms such
as Kafka, NATS, Redis, Flink, or Storm for use in our ticker plant to reduce
the maintenance effort for cross-cutting concerns and leverage hybrid
deployment models. For comparability and repeatability we benchmark candidates
with a standardized workload we derived from our real data feeds.
We have enhanced the processing, logging, and reporting capabilities of an
existing lightweight open source benchmarking tool to cope with our workloads.
The resulting tool, wrench, can simulate workloads or replay snapshots whose
volume and dynamics match those we process in our ticker plant. We provide the
tool as open source.
As part of ongoing work we contribute details on (a) our workload and
requirements for benchmarking candidate platforms for financial feed
processing; (b) the current state of the tool wrench.
Comment: Authors' version of the accepted submission; final version published
by ACM as part of the proceedings of DEBS '19: The 13th ACM International
Conference on Distributed and Event-based Systems (DEBS '19); 2 pages, 2
figures.
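Replaying recorded events at a controlled rate is the core mechanism of such a feed benchmark. The sketch below shows a rate-controlled replay loop similar in spirit to what a tool like wrench does; the function names and structure are assumptions for illustration, not wrench's actual API.

```python
# Hedged sketch of a rate-controlled replay loop: emit pre-recorded events to
# a sink at roughly `rate_per_sec`, pacing with sleep against a wall-clock
# schedule. Names are illustrative, not taken from the wrench tool.
import time


def replay(events, rate_per_sec, sink):
    """Emit `events` to `sink` at approximately `rate_per_sec` events/sec."""
    interval = 1.0 / rate_per_sec
    start = time.perf_counter()
    for i, event in enumerate(events):
        # Each event i is scheduled at start + i * interval.
        delay = (start + i * interval) - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
        sink(event)
    return time.perf_counter() - start


out = []
elapsed = replay([{"seq": i} for i in range(100)], rate_per_sec=1000,
                 sink=out.append)
print(f"replayed {len(out)} events in {elapsed:.3f}s")
```

A real benchmark would additionally log per-event latencies and report percentiles rather than only the total elapsed time.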
Managing the Complexity of Processing Financial Data at Scale -- an Experience Report
Financial markets are extremely data-driven and regulated. Participants rely
on notifications about significant events and background information that meet
their requirements regarding timeliness, accuracy, and completeness. As one of
Europe's leading providers of financial data and regulatory solutions, vwd
processes a daily average of 18 billion notifications from 500+ data sources
for 30 million symbols. Our large-scale geo-distributed systems handle daily
peak rates of 1+ million notifications/sec. In this paper we give practical
insights into the different types of complexity we face regarding the data we
process, the systems we operate, and the regulatory constraints we must comply
with. We describe the volume, variety, velocity, and veracity of the data we
process, the infrastructure we operate, and the architecture we apply. We
illustrate the load patterns created by trading and how the markets' attention
to the Brexit vote and similar events stressed our systems.
Comment: 12 pages, 2 figures, to be published in the proceedings of the 10th
Complex Systems Design & Management conference (CSD&M'19) by Springer.
Deriving a realistic workload model to simulate high-volume financial data feeds for performance benchmarking
Processing financial market data at scale and in real time poses a set of unique challenges to event-driven architectures due to the volume, variety, velocity, and veracity of the enclosed information, on top of other constraints. Reproducible stress tests at scale using configurable benchmarks are key to building and tuning suitable processing systems. Available benchmarks, however, lack realistic and configurable workload models for market data scenarios. In previous work we already addressed this gap by describing the specific challenges of processing financial data at scale and by introducing a modular open-source benchmarking framework. This paper makes two contributions to the ongoing challenge of building realistic benchmarks for the financial data processing domain by outlining: (a) a detailed statistical analysis of real-world financial market data feeds processed on a global scale by Infront Financial Technology GmbH; and (b) a simple workload model built on this analysis to simulate high-volume market data feeds with their distinctive characteristics, to be used in benchmarks. We evaluate our model using the DEBS 2022 Grand Challenge data set "Trading Data".
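One characteristic such a workload model must capture is that update volume is heavily concentrated on a few popular symbols. The toy sketch below draws per-event symbols from a Zipf-like popularity distribution; the skew parameter and symbol names are illustrative assumptions, not the paper's fitted values.

```python
# Toy workload-model sketch: symbol updates drawn from a skewed (Zipf-like)
# popularity distribution, so a few top-ranked symbols receive most updates,
# as in real market data feeds. Parameters are illustrative only.
import random
from collections import Counter


def simulate_feed(n_symbols, n_events, skew=1.2, seed=42):
    """Draw `n_events` symbol updates with Zipf-like symbol popularity."""
    rng = random.Random(seed)
    # Symbol at popularity rank r gets weight 1 / (r + 1) ** skew.
    weights = [1.0 / (rank + 1) ** skew for rank in range(n_symbols)]
    symbols = [f"SYM{rank}" for rank in range(n_symbols)]
    return rng.choices(symbols, weights=weights, k=n_events)


events = simulate_feed(n_symbols=1000, n_events=100_000)
counts = Counter(events)
# A handful of top-ranked symbols accounts for a large share of all updates.
```

A fuller model would also shape the arrival process over time (opening/closing bursts, event-driven spikes), not just the per-symbol mix.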
Physics book: CRYRING@ESR
The exploration of the unique properties of stored and cooled beams of highly charged ions, as provided by heavy-ion storage rings, has opened novel and fascinating research opportunities in the realm of atomic and nuclear physics research. Since the late 1980s, pioneering work has been performed at the CRYRING at Stockholm (Abrahamsson et al. 1993) and at the Test Storage Ring (TSR) at Heidelberg (Baumann et al. 1988). For the heaviest ions in the highest charge states, a real quantum jump was achieved in the early 1990s with the commissioning of the Experimental Storage Ring (ESR) at the GSI Helmholtzzentrum für Schwerionenforschung (GSI) in Darmstadt (Franzke 1987), where challenging experiments on electron dynamics in the strong-field regime as well as nuclear physics studies on exotic nuclei and at the borderline to atomic physics were performed. Meanwhile, a heavy-ion storage ring has also been taken into operation at Lanzhou, exploiting the unique research opportunities in particular for medium-heavy ions and exotic nuclei (Xia et al. 2002).