Kernel deconvolution estimation
While method of moments and quasi-likelihood approaches require additional distributional assumptions for the inefficiency and noise terms, a fully nonparametric estimate of the expected inefficiency \(\mu\) is also available by applying the nonparametric kernel deconvolution proposed by Hall and Simar (2002). Note that the residuals \(\hat{\varepsilon}_i^{CNLS}\) are consistent estimators of \(e^o_i = \varepsilon_i + \mu\) (for the production model). Following Kuosmanen and Johnson (2017), the density function of \(e^o\) can be estimated by the kernel density estimator
\[\hat{f}_{e^o}(z) = (nh)^{-1} \sum_{i=1}^{n} K\left(\frac{z-\hat{e}_i^o}{h}\right)\]
where \(K(\cdot)\) is a compactly supported kernel and \(h\) is a bandwidth. Hall and Simar (2002) show that the first derivative of the density function of the composite error term (\(f_\varepsilon^{'}\)) is proportional to that of the inefficiency term (\(f_u^{'}\)) in the neighborhood of \(\mu\). Therefore, a robust nonparametric estimator of the expected inefficiency \(\mu\) is obtained as
\[\hat{\mu} = \arg \max_{z \in C} \left(\hat{f}_{e^o}^{'}(z)\right)\]
where \(C\) is a closed interval in the right tail of \(f_{e^o}\).
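To make the two estimators above concrete, here is a minimal pure-NumPy sketch that builds a kernel density estimate of synthetic composite residuals, evaluates its first derivative analytically, and takes the argmax of the derivative over a right-tail interval \(C\). This is not pyStoNED's implementation: the half-normal inefficiency distribution, Silverman's rule-of-thumb bandwidth, the choice of \(C\) as the interval from the sample median to the maximum, and the use of a Gaussian kernel (which, unlike the kernel in the text, is not compactly supported) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# synthetic production-model composite residuals: e^o = v - u + mu,
# with noise v ~ N(0, 0.3^2) and half-normal inefficiency u = |N(0, 0.5^2)|
v = rng.normal(0.0, 0.3, n)
u = np.abs(rng.normal(0.0, 0.5, n))
mu_true = 0.5 * np.sqrt(2.0 / np.pi)  # E(u) for the half-normal
e_o = v - u + mu_true

# Gaussian kernel with Silverman's rule-of-thumb bandwidth (illustrative choice)
h = 1.06 * e_o.std() * n ** (-1.0 / 5.0)
z = np.linspace(e_o.min(), e_o.max(), 400)

# kernel density estimate f_hat(z) = (nh)^{-1} sum_i K((z - e_i)/h)
t = (z[:, None] - e_o[None, :]) / h
f = np.exp(-0.5 * t**2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))
# analytic first derivative of the Gaussian-kernel density estimate
f_prime = (-t * np.exp(-0.5 * t**2)).sum(axis=1) / (n * h**2 * np.sqrt(2.0 * np.pi))

# C: an assumed closed interval in the right tail (sample median to maximum)
C = z >= np.median(e_o)
mu_hat = z[C][np.argmax(f_prime[C])]
print("estimated expected inefficiency:", mu_hat)
```

In practice the bandwidth and the interval \(C\) drive the result; pyStoNED handles these choices internally when `RED_KDE` is passed, as in the example below.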
Example: StoNED with CNLS [.ipynb]
In the following code, we use the kernel density approach to decompose the CNLS residuals and display the unconditional expected inefficiency.
# import packages
from pystoned import CNLS, StoNED
from pystoned.dataset import load_Finnish_electricity_firm
from pystoned.constant import CET_MULT, FUN_COST, RTS_VRS, RED_KDE
# import Finnish electricity distribution firms data
data = load_Finnish_electricity_firm(x_select=['Energy', 'Length', 'Customers'],
y_select=['TOTEX'])
# build and optimize the CNLS model
model = CNLS.CNLS(data.y, data.x, z=None, cet=CET_MULT, fun=FUN_COST, rts=RTS_VRS)
# solve remotely via the NEOS server (replace with a valid email address)
model.optimize('email@address')
# calculate and print unconditional expected inefficiency (mu)
rd = StoNED.StoNED(model)
print(rd.get_unconditional_expected_inefficiency(RED_KDE))