Implementing frequency response using grid-connected inverters is a popular
proposed alternative for mitigating the dynamic degradation experienced in low
inertia power systems. However, such a solution faces several challenges,
as inverters do not intrinsically possess the natural response to power
fluctuations that synchronous generators have. Thus, to synthetically generate
this response, inverters need to take frequency measurements, which are
usually noisy, and subsequently adjust their output power, which makes their
response inherently delayed. This paper explores the system-wide performance
tradeoffs that arise
when measurement noise, power disturbances, and delayed actions are considered
in the design of dynamic controllers for grid-connected inverters. Using a
recently proposed dynamic droop (iDroop) control for grid-connected inverters,
which is inspired by classical first-order lead-lag compensation, we show that
the parameter sets that achieve the highest noise attenuation, power
disturbance mitigation, and delay robustness do not necessarily have a common
intersection. In particular, lead compensation is desirable in systems where
power disturbances are the predominant source of degradation, whereas lag
compensation is a better alternative when the system is dominated by delays or
frequency noise. Our analysis further shows that iDroop can outperform the
standard droop alternative in both joint noise and disturbance mitigation and
in delay robustness.
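For concreteness, the classical first-order lead-lag structure referenced
above admits the following minimal sketch; the notation ($R$ for the droop
coefficient, $T_z$ and $T_p$ for the zero and pole time constants) is
illustrative and not necessarily the paper's own:
\begin{equation*}
  % Minimal first-order lead-lag sketch with DC gain 1/R matching droop.
  % T_z and T_p are illustrative placeholders, not the paper's notation.
  c(s) \;=\; \frac{1}{R}\cdot\frac{1 + T_z s}{1 + T_p s},
  \qquad T_z,\, T_p > 0,
\end{equation*}
where $T_z > T_p$ yields phase lead, $T_z < T_p$ yields phase lag, and
$T_z = T_p$ recovers the constant-gain droop law $c(s) = 1/R$, so the standard
droop controller appears as a special case of this family.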