109 research outputs found
Efficient estimation of heterogeneous coefficients in panel data models with common shock
This paper investigates efficient estimation of heterogeneous coefficients in panel data models with common shocks,
which have been a particular focus of recent theoretical and empirical literature. We propose a new two-step method
to estimate the heterogeneous coefficients. In the first step, the maximum likelihood (ML) method is used
to estimate the loadings and idiosyncratic variances. The second step estimates the heterogeneous coefficients
by using the structural relations implied by the model and replacing the unknown parameters with their ML estimates.
We establish the asymptotic theory of our estimator, including consistency, asymptotic representation, and limiting
distribution. The two-step estimator is asymptotically efficient in the sense that it has the same limiting distribution
as the infeasible generalized least squares (GLS) estimator. Extensive Monte Carlo simulations show that the proposed
estimator performs robustly in a variety of data setups
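A stylized specification consistent with the abstract (the paper's exact model may differ; notation here is illustrative) is a panel with unit-specific slopes and latent common shocks:

```latex
% Heterogeneous-coefficient panel with common shocks (illustrative)
y_{it} = x_{it}'\,\beta_i + \lambda_i' f_t + e_{it},
\qquad i = 1,\dots,N, \quad t = 1,\dots,T,
```

where \(\beta_i\) are the heterogeneous coefficients of interest, \(f_t\) the common shocks with loadings \(\lambda_i\), and \(e_{it}\) idiosyncratic errors with variances \(\sigma_i^2\). In this reading, step one delivers ML estimates of \((\lambda_i, \sigma_i^2)\), and step two plugs them into a GLS-type formula for each \(\beta_i\).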
Quasi Maximum Likelihood Analysis of High Dimensional Constrained Factor Models
Factor models have been widely used in practice. However, an undesirable feature of a high dimensional factor model is that the model has too many parameters. An effective way to address this issue, proposed in a seminal work by Tsai and Tsay (2010),
is to decompose the loading matrix as the product of a high-dimensional known matrix and a low-dimensional unknown matrix, a structure Tsai and Tsay (2010) name constrained factor models. This paper investigates the estimation and inferential theory of constrained factor models under the large-N and large-T setup, where N denotes the number of cross-sectional units and T the number of time periods. We propose using the quasi maximum likelihood method to estimate the model and investigate the asymptotic
properties of the quasi maximum likelihood estimators, including consistency, rates of convergence and limiting distributions. A new statistic is proposed for testing the null
hypothesis of constrained factor models against the alternative of standard factor models. Partially constrained factor models are also investigated. Monte Carlo simulations confirm our theoretical results and show that the quasi maximum likelihood estimators and the proposed new statistic perform well in finite samples. We also consider the extension to an approximate constrained factor model in which the idiosyncratic errors are allowed to be weakly dependent processes
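The parameter reduction described in the abstract can be written compactly (our notation, following the abstract's description):

```latex
% Constrained factor model: loadings factored through a known matrix H
x_t = \Lambda f_t + e_t, \qquad \Lambda = H\,\Gamma,
```

where \(\Lambda\) is the \(N \times r\) loading matrix, \(H\) a known \(N \times k\) matrix, and \(\Gamma\) an unknown \(k \times r\) matrix with \(k \ll N\), so the number of free loading parameters falls from \(Nr\) to \(kr\). The test mentioned above amounts to testing this factorization against an unrestricted \(\Lambda\).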
Estimation and inference of FAVAR models
The factor-augmented vector autoregressive (FAVAR) model, first proposed by Bernanke, Boivin, and Eliasz (2005, QJE), is now widely used in macroeconomics and finance. In this model, observable and unobservable factors jointly follow a vector autoregressive process, which further drives the comovement of a large number of observable variables. We study the identification restrictions in the presence of observable factors. We propose a likelihood-based two-step method to estimate the FAVAR model that explicitly accounts for factors being partially observed. We then provide an inferential theory for the estimated factors, the factor loadings, and the dynamic parameters in the VAR process. We show how and why the limiting distributions differ from existing results
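In the standard formulation of Bernanke, Boivin, and Eliasz (2005), the model couples a large observation equation with a small joint VAR (sketched here in generic notation, not necessarily the paper's exact symbols):

```latex
% Observation equation: N informational series loaded on latent factors F_t
% and observed factors Y_t
X_t = \Lambda^f F_t + \Lambda^y Y_t + e_t,
% Transition equation: latent and observed factors jointly follow a VAR
\begin{bmatrix} F_t \\ Y_t \end{bmatrix}
  = \Phi(L) \begin{bmatrix} F_{t-1} \\ Y_{t-1} \end{bmatrix} + v_t,
```

where \(\Phi(L)\) is a lag polynomial. The "partially observed" aspect is that \(Y_t\) is observed while \(F_t\) must be estimated from \(X_t\), which is what drives the identification restrictions and the nonstandard limiting distributions discussed above.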
High-resolution planimetry of the Menga dolmen (Antequera, Málaga) via terrestrial laser scanning, 3D surveying and photogrammetry
Beam Manipulation Mechanisms of Dielectric Metasurfaces
Dielectric metasurfaces can achieve flexible beam manipulations. Herein, we study dielectric metasurfaces with different refractive indices, periods, incident angles, and cross-sectional shapes to determine the metasurface working mechanisms. Perfect transmission mainly depends on multipolar interference that can be used to control the transmission modes through the hybrid periods, hybrid cross sections, and multilayers. Perfect reflection is strongly influenced by the period of the metasurface and occurs only when the period is shorter than incident wavelength, which can be attributed to the lattice coupling. Furthermore, lattice coupling can be classified into two types with distinct properties: vertical mode and horizontal mode coupling. The vertical mode appears when the effective wavelength matches the feature size, whereas the horizontal mode only appears when the incident wavelength is close to the period. The horizontal mode is sensitive to the incident angle. The revealed functioning mechanisms enable further practical applications of metasurfaces
Hybrid Dual and Meet-LWE Attack
The Learning with Errors (LWE) problem is one of the most prominent problems in lattice-based cryptography. Many practical LWE-based schemes, including fully homomorphic encryption (FHE), use a sparse ternary secret for the sake of efficiency. Several (hybrid) attacks have been proposed that benefit from such sparseness, so the security of schemes with sparse ternary secrets is not yet well understood. Recently, May [Crypto 2021] proposed an efficient meet-in-the-middle attack named Meet-LWE for LWE with ternary secret, which significantly improves Odlyzko's algorithm. In this work, we generalize May's Meet-LWE and then introduce a new hybrid attack that combines Meet-LWE with the lattice dual attack. We apply our algorithm to FHE-type parameters of the LWE problem and compare it with previous hybrid dual attacks. The results show that our attack outperforms other attacks over a large range of parameters. We note that our attack has no impact on the LWE-based schemes in the PQC standardization held by NIST, as their secrets are not sparse and/or ternary
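To make the setting concrete, here is a toy sketch of an LWE instance with a sparse ternary secret, the object these attacks target. All parameters (dimension, modulus, Hamming weight, error width) are illustrative stand-ins, far below cryptographic size, and the code is not part of the paper's attack:

```python
import random

def lwe_sample(n=32, q=3329, hw=8, sigma=2, m=48, seed=1):
    """Generate a toy LWE instance b = A s + e (mod q) whose secret s is
    sparse ternary: exactly `hw` nonzero entries drawn from {-1, +1}.
    Parameter names and sizes are illustrative only."""
    rng = random.Random(seed)
    # Sparse ternary secret: hw nonzero positions, each set to -1 or +1.
    s = [0] * n
    for pos in rng.sample(range(n), hw):
        s[pos] = rng.choice((-1, 1))
    A, b = [], []
    for _ in range(m):
        row = [rng.randrange(q) for _ in range(n)]
        e = round(rng.gauss(0, sigma))  # small, roughly Gaussian error
        A.append(row)
        b.append((sum(a * si for a, si in zip(row, s)) + e) % q)
    return A, b, s

A, b, s = lwe_sample()
# Sanity check: b - A s (mod q), lifted to a centered representative,
# recovers only the small error terms.
residues = [(bi - sum(a * si for a, si in zip(row, s))) % 3329
            for row, bi in zip(A, b)]
centered = [r if r <= 3329 // 2 else r - 3329 for r in residues]
assert all(abs(r) <= 10 for r in centered)
```

Sparseness is what the hybrid attacks exploit: with only `hw` nonzero coordinates, the secret can be split and enumerated combinatorially (the Meet-LWE side) while the rest is handled by lattice techniques (the dual side).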
- …