The data acquisition (DAQ) architecture in High Energy Physics
(HEP) experiments consists of data transport from the front-end electronics
(FEE) of the online detectors to the readout units (RU), which perform online
processing of the data, and then to data storage for offline analysis. With
major upgrades of the Large Hadron Collider (LHC) experiments at CERN, the data
transmission rates in the DAQ systems are expected to reach a few TB/sec within
the next few years. Such high rates are normally accompanied by increased
high-frequency losses, which lead to distortion of the received signal
and degradation of signal integrity. To address this, we have developed an
optimization technique for the multi-gigabit transceiver (MGT) and implemented
it on the state-of-the-art 20 nm Arria-10 FPGA manufactured by Intel. The
setup has been validated for three available high-speed data transmission
protocols, namely GBT, TTC-PON, and 10 Gbps Ethernet. The improvement in
signal integrity is gauged by two metrics: the Bit Error Rate (BER) and the Eye
Diagram. It is observed that the technique improves signal integrity and
reduces the BER. The test results and the improvements in the signal-integrity
metrics for different link speeds are presented and discussed.
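
As background on the BER metric, a standard estimate (assumed here as general practice, not a result specific to this work) is that when $N$ bits are transmitted and no errors are observed, the true BER can be bounded at a confidence level $CL$ by
\[
\mathrm{BER} < \frac{-\ln(1 - CL)}{N},
\]
so that, for example, error-free transmission of $N = 3\times10^{12}$ bits bounds the BER below $10^{-12}$ at 95% confidence.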