
    Numerical simulation of microwave heating of a target with temperature dependent electrical properties in a single-mode cavity

    This dissertation extends the work done by Hile and Kriegsmann in 1998 on microwave heating of a ceramic sample in a single-mode waveguide cavity. In that work, they devised a method combining asymptotic and numerical techniques to speed up the computation of electromagnetic fields inside a high-Q cavity in the presence of a low-loss target. In our problem, the dependence of the electrical conductivity on temperature increases the complexity of the problem: because the electrical conductivity depends on temperature, the electromagnetic fields must be recomputed as the temperature varies. We then solve the coupled heat equation and Maxwell's equations to determine the history and distribution of the temperature in the ceramic sample. This complication increases the overall computational effort required by several orders of magnitude. In their work, Hile and Kriegsmann used the established technique of solving the time-dependent Maxwell's equations with the finite-difference time-domain (FDTD) method until a time-harmonic steady state is obtained. Here we replace this technique with a more direct solution of a finite-difference approximation of the Helmholtz equation. The system of equations produced by this finite-difference approximation has a matrix that is large and non-Hermitian. However, we find that it may be split into the sum of a real symmetric matrix and a relatively low-rank matrix. The symmetric system represents the discretization of the Helmholtz equation inside an empty and truncated waveguide; this system can be solved efficiently with the conjugate gradient method or the fast Fourier transform. The low-rank matrix carries the information at the truncated boundaries of the waveguide and the properties of the sample. The rank of this matrix is approximately the sum of twice the number of grid spacings across the waveguide and the number of grid points in the target. As a result of the splitting, we can handle this part of the problem by solving a system having as many unknowns as the rank of this matrix. With the above algorithmic innovations, substantial computational efficiencies have been obtained. We demonstrate the heating of a target having a temperature-dependent electrical conductivity. Comparison with computations for constant electrical conductivity demonstrates significant differences in the heating histories. The computational complexity of our approach in comparison with that of the FDTD solver favors the FDTD method when ultra-fine grids are used. However, in cases where grids are refined simply to reduce asymptotic truncation error, our method can retain its advantages by reducing truncation error through higher-order discretization of the Helmholtz operator.
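    The splitting described in this abstract is essentially a low-rank update, which can be handled with the Sherman-Morrison-Woodbury identity: the large system reduces to cheap solves with the symmetric (empty-waveguide) operator plus one small dense system whose size equals the rank of the correction. The sketch below is not the dissertation's code; the diagonal stand-in for the symmetric operator, the function names, and the toy dimensions are assumptions made purely for illustration.

```python
import numpy as np

def solve_split_system(S_solve, U, V, b):
    """Solve (S + U @ V.T) x = b via the Woodbury identity.

    S_solve(y) applies S^{-1} to a vector (in the application this would be a
    conjugate gradient or FFT-based solver for the empty-waveguide operator);
    U @ V.T is the low-rank correction carrying the boundary and target data.
    Only a k x k dense system is solved, where k is the rank of the correction.
    """
    k = U.shape[1]
    Sinv_b = S_solve(b)                                               # one cheap solve
    Sinv_U = np.column_stack([S_solve(U[:, j]) for j in range(k)])    # k cheap solves
    capacitance = np.eye(k) + V.T @ Sinv_U                            # k x k system
    y = np.linalg.solve(capacitance, V.T @ Sinv_b)
    return Sinv_b - Sinv_U @ y

# Toy check: S is diagonal, so applying S^{-1} is trivial; k = 6 plays the role
# of "rank ~ boundary grid spacings + target grid points" from the abstract.
rng = np.random.default_rng(0)
n, k = 400, 6
d = 2.0 + rng.random(n)
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))
b = rng.standard_normal(n)

x = solve_split_system(lambda y: y / d, U, V, b)
print(np.allclose(np.diag(d) @ x + U @ (V.T @ x), b))  # True: residual at machine precision
```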

    Mixtures, Moments And Information: Three Essays In Econometrics

    This thesis is a collection of three independent essays in econometrics. The first essay uses the empirical characteristic function (ECF) procedure to estimate the parameters of mixtures of normal distributions and switching regression models. The ECF procedure was formally proposed by Feuerverger and Mureika (1977) and Heathcote (1977). Since the characteristic function is uniformly bounded, the procedure gives estimates that are numerically stable. Furthermore, it is shown that the finite-sample properties of the ECF estimator are very good, even in cases where the maximum likelihood estimator fails to exist. The second essay applies White's (1982) information matrix (IM) test to a stationary and invertible autoregressive moving average (ARMA) process. Our result indicates that, for the ARMA specification, the derived covariance matrix of the indicator vector is not block diagonal, implying that the algebraic structure of the IM test is more complicated than in other cases previously analyzed in the literature (see, for example, Hall (1987) and Bera and Lee (1993)). Our derived IM test turns out to be a joint specification test of parameter heterogeneity (i.e., a test for random coefficients or conditional heteroskedasticity) of the specified model and of normality. The final essay compares, using Monte Carlo simulation, the generalized method of moments (GMM) and quasi-maximum likelihood (QML) estimators of the parameters of a simple linear regression model with autoregressive conditional heteroskedastic (ARCH) disturbances. The results reveal that GMM estimates are often biased (apparently due to poor instruments), statistically insignificant, and dynamically unstable (especially the parameters of the ARCH process). On the other hand, QML estimates are generally unbiased, statistically significant, and dynamically stable. Asymptotic standard errors for QML are 2 to 6 times smaller than for GMM, depending on the choice of instruments.
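    As a rough illustration of the ECF idea (this is not the essay's code; the grid of evaluation points, the log-scale parameterisation, and the optimiser are assumptions for the sketch), the following fits a two-component normal mixture by minimising the squared distance between the empirical and theoretical characteristic functions:

```python
import numpy as np
from scipy.optimize import minimize

def ecf(t, x):
    """Empirical characteristic function evaluated at the points t."""
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

def mixture_cf(t, p, mu1, s1, mu2, s2):
    """Characteristic function of a two-component normal mixture."""
    c1 = np.exp(1j * t * mu1 - 0.5 * (s1 * t) ** 2)
    c2 = np.exp(1j * t * mu2 - 0.5 * (s2 * t) ** 2)
    return p * c1 + (1 - p) * c2

def ecf_objective(theta, t, x):
    # The characteristic function is bounded, so this criterion stays finite
    # even where the mixture likelihood is unbounded.
    p, mu1, ls1, mu2, ls2 = theta
    diff = ecf(t, x) - mixture_cf(t, p, mu1, np.exp(ls1), mu2, np.exp(ls2))
    return np.sum(np.abs(diff) ** 2)

# Simulate a two-component mixture and fit it on a small grid of t values.
rng = np.random.default_rng(1)
n = 2000
z = rng.random(n) < 0.3
x = np.where(z, rng.normal(-2.0, 0.5, n), rng.normal(1.0, 1.0, n))

t_grid = np.linspace(0.1, 2.0, 20)
theta0 = np.array([0.5, -1.0, 0.0, 0.5, 0.0])   # p, mu1, log s1, mu2, log s2
res = minimize(ecf_objective, theta0, args=(t_grid, x), method="Nelder-Mead",
               options={"maxiter": 5000})
p_hat, mu1_hat, ls1_hat, mu2_hat, ls2_hat = res.x
print(p_hat, mu1_hat, np.exp(ls1_hat), mu2_hat, np.exp(ls2_hat))
```

    Bounds on the mixing weight are omitted for brevity; a constrained optimiser or a logistic reparameterisation would keep p inside the unit interval.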

    Quantifying and explaining parameter heterogeneity in the capital regulation-bank risk nexus

    By examining the impact of capital regulation on bank risk-taking using a local estimation technique, we are able to quantify the heterogeneous response of banks to this type of regulation in the banking sectors of western-type economies. Subsequently, using this information on bank-level responses to capital regulation, we examine the sources of heterogeneity. The findings suggest that the impact of capital regulation on bank risk is very heterogeneous across banks and that the sources of this heterogeneity can be traced to both bank and industry characteristics, as well as to macroeconomic conditions. Therefore, the present analysis has important implications for the way bank regulation is conducted, as it suggests that common capital regulatory umbrellas may not be sufficient to promote financial stability. On the basis of our findings, we contend that Basel guidelines may have to be reoriented towards more flexible, country-specific policy proposals that focus on restraining excess risk-taking by banks. Keywords: capital regulation; risk-taking of banks; local generalized method of moments
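    To convey the flavour of a local estimator that delivers bank-level responses, the sketch below uses simple kernel-weighted least squares rather than the paper's local GMM; the data, variable names, and bandwidth are all invented for illustration.

```python
import numpy as np

def local_slopes(y, x, z, bandwidth):
    """Kernel-weighted least squares: for each observation i, regress y on x
    with Gaussian weights centred at z[i], producing an observation-specific
    slope. A simplified stand-in for a local GMM estimator."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    slopes = np.empty(n)
    for i in range(n):
        w = np.exp(-0.5 * ((z - z[i]) / bandwidth) ** 2)
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        slopes[i] = beta[1]
    return slopes

# Toy data: the effect of capital stringency x on risk y varies with a bank
# characteristic z (say, size), so each bank gets its own estimated response.
rng = np.random.default_rng(2)
n = 300
z = rng.uniform(0, 1, n)
x = rng.normal(size=n)
true_beta = -1.0 + 2.0 * z
y = 0.5 + true_beta * x + rng.normal(scale=0.3, size=n)

beta_hat = local_slopes(y, x, z, bandwidth=0.1)
print(np.corrcoef(beta_hat, true_beta)[0, 1])  # near 1 if the heterogeneity is recovered
```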

    Estimation and Inference for Threshold Effects in Panel Data Stochastic Frontier Models

    One of the most enduring problems in cross-section or panel data models is heterogeneity among individual observations. Different approaches have been proposed to deal with this issue, but threshold regression models offer intuitively appealing econometric methods to account for heterogeneity. We propose three different estimators that can accommodate multiple thresholds. The first two, allowing respectively for fixed and random effects, assume that the firm-specific inefficiency scores are time-invariant, while the third allows for time-varying inefficiency scores. We rely on a likelihood ratio test with m − 1 regimes under the null against m regimes. Testing for threshold effects is problematic because of the presence of a nuisance parameter that is not identified under the null hypothesis; this is known as Davies' problem. We apply procedures pioneered by Hansen (1999) to test for the presence of threshold effects and to obtain a confidence set for the threshold parameter; these procedures specifically account for Davies' problem and are based on non-standard asymptotic theory. Finally, we perform an empirical application of the fixed effects model on a panel of Quebec dairy farms. The specifications involving a trend and the Cobb-Douglas and Translog functional forms support three thresholds, or four regimes, based on farm size. The efficiency scores vary between 0.95 and 1 in models with and without thresholds. Therefore, productivity differences across farm sizes are most likely due to technological heterogeneity. Keywords: stochastic frontier models, threshold regression, technical efficiency, bootstrap, dairy production, C12, C13, C23, C52, Research Methods/Statistical Methods
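    The Hansen (1999)-style testing logic can be sketched as a grid search over candidate thresholds combined with a residual bootstrap under the no-threshold null. The code below is a stripped-down cross-sectional illustration of that idea, not the paper's panel stochastic frontier estimator; the threshold variable, grid, and data are hypothetical.

```python
import numpy as np

def fit_threshold(y, x, q, gammas):
    """Single-threshold least squares: the slope on x changes at q <= gamma.
    Returns the gamma minimising the sum of squared residuals and that SSR."""
    n = len(y)
    best_gamma, best_ssr = gammas[0], np.inf
    for g in gammas:
        d = (q <= g).astype(float)
        X = np.column_stack([np.ones(n), x * d, x * (1 - d)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        ssr = resid @ resid
        if ssr < best_ssr:
            best_gamma, best_ssr = g, ssr
    return best_gamma, best_ssr

def bootstrap_lr_pvalue(y, x, q, gammas, n_boot=199, seed=0):
    """Because the threshold is not identified under the null (Davies' problem),
    the LR-type statistic's distribution is simulated by resampling residuals
    from the linear one-regime fit, in the spirit of Hansen (1999)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    X0 = np.column_stack([np.ones(n), x])
    beta0, *_ = np.linalg.lstsq(X0, y, rcond=None)
    resid0 = y - X0 @ beta0
    ssr0 = resid0 @ resid0
    _, ssr1 = fit_threshold(y, x, q, gammas)
    lr = n * (ssr0 - ssr1) / ssr1
    count = 0
    for _ in range(n_boot):
        yb = X0 @ beta0 + rng.choice(resid0, size=n, replace=True)
        b0, *_ = np.linalg.lstsq(X0, yb, rcond=None)
        rb = yb - X0 @ b0
        s0 = rb @ rb
        _, s1 = fit_threshold(yb, x, q, gammas)
        count += (n * (s0 - s1) / s1) >= lr
    return lr, (count + 1) / (n_boot + 1)

# Toy use: q plays the role of farm size; the slope on x changes at q = 0.5.
rng = np.random.default_rng(3)
n = 200
q = rng.uniform(0, 1, n)
x = rng.normal(size=n)
y = 1.0 + np.where(q <= 0.5, 0.5, 2.0) * x + rng.normal(scale=0.5, size=n)
gammas = np.quantile(q, np.linspace(0.15, 0.85, 50))
print(bootstrap_lr_pvalue(y, x, q, gammas))   # large LR, small bootstrap p-value
```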

    Pecuniary, Non-Pecuniary, and Downstream Research Spillovers: The Case of Canola

    This paper develops an empirical framework for estimating a number of inter-firm and downstream research spillovers in the canola crop research industry. The spillovers include basic research, human capital/knowledge (as measured through other-firm expenditures), and genetics (as measured through the yields of other firms). The model used to examine spillover effects on research productivity provides evidence that there are many positive inter-firm non-pecuniary research spillovers, which is consistent with a research clustering effect. The second model, which examines spillovers at the level of firm revenue, shows that, while private firms tend to crowd one another out, public firm expenditure on basic and applied research creates a crowding-in effect for private firms. This model also shows that enhanced intellectual property rights have increased the revenues of private firms. The third model, which examines the social value of each firm's output, provides evidence that downstream research spillovers remain important in this modern crop research industry. Keywords: basic research, applied research, public research expenditures, private research expenditures, biotechnology, Research and Development/Tech Change/Emerging Technologies, O3
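    A minimal, purely hypothetical illustration of the "other-firm expenditure" spillover regressor described above (the variable names, functional form, and simulated data are not taken from the paper): each firm's research productivity is regressed on its own R&D spending and on the industry total net of its own spending.

```python
import numpy as np

# Simulate a small firm-by-year panel of R&D expenditures.
rng = np.random.default_rng(4)
n_firms, n_years = 12, 20
own_rd = rng.gamma(2.0, 1.0, size=(n_firms, n_years))
other_rd = own_rd.sum(axis=0, keepdims=True) - own_rd    # spillover pool: all other firms

# Productivity depends on own R&D and on the other-firm (non-pecuniary) spillover term.
productivity = (0.3 * np.log(own_rd) + 0.1 * np.log(other_rd)
                + rng.normal(scale=0.05, size=(n_firms, n_years)))

# Pooled OLS with an intercept, own-R&D elasticity, and spillover elasticity.
X = np.column_stack([np.ones(n_firms * n_years),
                     np.log(own_rd).ravel(), np.log(other_rd).ravel()])
beta, *_ = np.linalg.lstsq(X, productivity.ravel(), rcond=None)
print(beta)   # roughly [0, 0.3, 0.1] on this simulated data
```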