GMMInference.jl¶
GMMModel¶
The abstract type GMMModel is used to define methods for generic method of moments problems. To use this interface, define a concrete subtype of GMMModel and at least a specialized get_gi method. See src/models for a few examples.
GMMInference.GMMModel — Type.
GMMModel
Abstract type for GMM models.
GMMInference.cue_objective — Method.
cue_objective(gi::Function)
Returns the CUE objective function for moment functions gi.
$Q(\theta) = n \left(\tfrac{1}{n}\sum_i g_i(\theta)\right) \widehat{\mathrm{cov}}(g_i(\theta))^{-1} \left(\tfrac{1}{n}\sum_i g_i(\theta)\right)'$
Calculates cov(gi(θ)) assuming observations are independent.
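As a concrete reading of this formula (a sketch, not the package's implementation; the function name is illustrative), with gi returning an n × k matrix:

```julia
using Statistics, LinearAlgebra

# Sketch of the CUE objective: Q(θ) = n ḡ cov(g)⁻¹ ḡ',
# treating the rows of gi(θ) as independent observations.
function cue_objective_sketch(gi)
    function Q(θ)
        g = gi(θ)              # n × k matrix of moment contributions
        ḡ = mean(g, dims=1)    # 1 × k row vector of averaged moments
        size(g, 1) * (ḡ * inv(cov(g)) * ḡ')[1]
    end
    return Q
end
```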
GMMInference.cue_objective — Method.
cue_objective(model::GMMModel)
Returns the CUE objective function for model. The weighting matrix is calculated assuming observations are independent.
GMMInference.gel_jump_problem — Function.
gel_jump_problem(model::GMMModel, h::Function=log)
Returns a JuMP problem for generalized empirical likelihood estimation of model. h is a generalized likelihood function.
GMMInference.gel_nlp_problem — Function.
gel_nlp_problem(model::GMMModel, h::Function=log)
Returns an NLPModel problem for generalized empirical likelihood estimation of model. h is a generalized likelihood function.
GMMInference.gel_optim_args — Function.
gel_optim_args(model::GMMModel, h::Function=log)
Returns a tuple out for calling optimize(out..., IPNewton()) afterward.
It seems that IPNewton() works better if the constraint on p is 0 ≤ sum(p) ≤ 1 instead of sum(p)=1, and you begin the optimizer from a point with sum(p) < 1.
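A possible calling pattern, assuming Optim.jl is loaded and `model` is some concrete GMMModel instance (a sketch, not verified against the package):

```julia
using Optim  # provides optimize and IPNewton

out = gel_optim_args(model)        # h defaults to log (empirical likelihood)
res = optimize(out..., IPNewton())
```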
GMMInference.get_gi — Method.
get_gi(model::GMMModel)
Returns a function gi(θ), where the moment condition for a GMM model is
$E[g_i(\theta)] = 0$
gi(θ) should return a number of observations by number of moment conditions matrix.
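To illustrate the interface, a hypothetical concrete subtype (the type name `MyIVModel` and its fields are invented for this sketch) might be defined as:

```julia
# Linear IV moments E[(y - xθ)z] = 0; MyIVModel is illustrative only.
struct MyIVModel <: GMMModel
    y::Vector{Float64}
    x::Matrix{Float64}
    z::Matrix{Float64}
end

# get_gi returns θ ↦ (observations × moments) matrix.
function GMMInference.get_gi(model::MyIVModel)
    θ -> (model.y .- model.x*θ) .* model.z
end

GMMInference.number_parameters(model::MyIVModel) = size(model.x, 2)
GMMInference.number_moments(model::MyIVModel) = size(model.z, 2)
GMMInference.number_observations(model::MyIVModel) = length(model.y)
```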
GMMInference.gmm_constraints — Method.
gmm_constraints(model::GMMModel)
Returns the constraints as a function of the parameters θ, where c(θ) = 0.
GMMInference.gmm_jump_problem — Function.
gmm_jump_problem(model::GMMModel, obj=gmm_objective)
Constructs a JuMP problem for a GMMModel.
GMMInference.gmm_nlp_problem — Function.
gmm_nlp_problem(model::GMMModel, obj=gmm_objective)
Constructs an NLPModel for a GMMModel.
GMMInference.gmm_objective — Function.
gmm_objective(gi::Function, W=I)
Returns
$Q(\theta) = n \left(\tfrac{1}{n}\sum_i g_i(\theta)\right) W \left(\tfrac{1}{n}\sum_i g_i(\theta)\right)'$
GMMInference.gmm_objective — Function.
gmm_objective(model::GMMModel, W=I)
Returns the GMM objective function with weighting matrix W for model.
GMMInference.number_moments — Method.
number_moments(model::GMMModel)
Number of moments (columns of gi(θ)) for a GMMModel.
GMMInference.number_observations — Method.
number_observations(model::GMMModel)
Number of observations (rows of gi(θ)) for a GMMModel.
GMMInference.number_parameters — Method.
number_parameters(model::GMMModel)
Number of parameters (dimension of θ) for a GMMModel.
IV Logit¶
GMMInference.IVLogitShare — Type.
IVLogitShare <: GMMModel
An IVLogitShare model consists of outcomes y ∈ (0,1), regressors x, and instruments z. The moment condition is
$E[(\log(y/(1-y)) - x\beta)z] = 0$
The dimensions of x, y, and z must be such that length(y) == size(x,1) == size(z,1) and size(x,2) ≤ size(z,2).
GMMInference.IVLogitShare — Method.
IVLogitShare(n::Integer, β::AbstractVector, π::AbstractMatrix, ρ)
Simulate an IVLogitShare model.
Arguments
- n: number of observations
- β: coefficients on x
- π: first stage coefficients, x = z*π + v
- ρ: correlation between x[:,1] and the structural error
Returns an IVLogitShare GMMModel.
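A simulation-and-estimation sketch (the parameter values, and the shape of π with rows indexing instruments, are assumptions for illustration):

```julia
using Optim

β = [0.5, -1.0]            # two regressors
π = ones(3, 2)             # three instruments: x = z*π + v
model = IVLogitShare(1000, β, π, 0.5)

Q = gmm_objective(model)   # identity weighting by default
res = optimize(Q, zeros(number_parameters(model)), BFGS())
θ̂ = Optim.minimizer(res)
```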
Panel Mixture¶
GMMInference.MixturePanel — Type.
MixturePanel(n::Integer, t::Integer, k::Integer, type_prob::AbstractVector, β::AbstractMatrix, σ = 1.0)
Simulate a MixturePanel model.
Arguments
- n: number of individuals
- t: number of time periods
- k: number of regressors
- type_prob: probability of each type
- β: matrix (k × length(type_prob)) of coefficients
- σ: standard deviation of ϵ
Returns a MixturePanel GMMModel.
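For example, a small simulated panel (argument values are arbitrary illustrations):

```julia
type_prob = [0.3, 0.7]             # two types
β = [1.0 -1.0; 0.5 0.5]            # k × length(type_prob) coefficients
model = MixturePanel(200, 5, 2, type_prob, β)   # σ defaults to 1.0
```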
GMMInference.MixturePanel — Type.
MixturePanel <: GMMModel
A MixturePanel model consists of outcomes y, regressors x, and a number of types, ntypes. Each observation i is one of ntypes types, with type j occurring with probability p[j]. Conditional on type, y is given by
y[i,t] = x[i,t,:]*β[:,type[i]] + ϵ[i,t]
It is assumed that ϵ is uncorrelated across i and t and that E[ϵ²] = σ². The moment conditions used to estimate p, β, and σ are
$E\left[\sum_{j} x(y - x\beta[:,j])\, p[j]\right] = 0$
and
$E\left[y[i,t]\, y[i,s] - \sum_j p[j]\,(x[i,t,:]'\beta[:,j])(x[i,s,:]'\beta[:,j]) - 1(t=s)\sigma^2\right] = 0$
Random Coefficients Logit¶
GMMInference.RCLogit — Type.
RCLogit(n::Integer, β::AbstractVector, π::AbstractMatrix, Σ::AbstractMatrix, ρ, nsim=100)
Simulates an RCLogit model.
Arguments
- n: number of observations
- β: mean coefficients on x
- π: first stage coefficients, x = z*π + e
- Σ: variance of random coefficients
- ρ: correlation between x[:,1] and the structural error
- nsim: number of draws of ν for Monte Carlo integration
GMMInference.RCLogit — Type.
RCLogit <: GMMModel
A random coefficients logit model with endogeneity. An RCLogit model consists of outcomes y ∈ (0,1), regressors x, instruments z, and random draws ν ∼ N(0,I). The moment condition is
$E[\xi z] = 0$
where
$y = \int \frac{\exp(x(\beta + \nu) + \xi)}{1 + \exp(x(\beta + \nu) + \xi)}\, d\Phi(\nu;\Sigma)$
and Φ(ν;Σ) is the normal distribution with variance Σ. The dimensions of x, y, z, and ν must be such that length(y) == size(x,1) == size(z,1) == size(ν,2) and size(ν,3) == size(x,2) ≤ size(z,2).
GMMInference.gmm_constraints — Method.
gmm_constraints(model::RCLogit)
Returns
$c(θ) = ∫ exp(x(β + ν) + ξ)/(1 + exp(x(β + ν) + ξ)) dΦ(ν;Σ) - y$
where θ = [β, uvec, ξ] with uvec = vector(cholesky(Σ).U). The integral is computed by Monte Carlo integration.
Inference¶
Inference methods for GMM problems. Currently this only contains GEL-based tests. See empirical likelihood for usage, background, and references.
GMMInference.gel_pλ — Function.
gel_pλ(model::GMMModel, h::Function=log)
Returns a function that, given parameters θ, solves
$\max_{p} \sum_i h(p_i) \text{ s.t. } \sum_i p_i = 1,\ \sum_i p_i g_i(\theta) = 0$
The returned function gives (p(θ), λ(θ)) and is not thread-safe.
GMMInference.gel_tests — Function.
gel_tests(θ, model::GMMModel, h::Function=log)
Computes GEL test statistics for H₀ : θ = θ₀. Returns a tuple containing statistics and p-values.
We plan to also add AR, KLM, and CLR methods for GMM. See identification robust inference.
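A usage sketch, assuming `model` is some concrete GMMModel instance and θ₀ the null value being tested (not verified against the package):

```julia
θ₀ = zeros(number_parameters(model))
stats, pvals = gel_tests(θ₀, model)   # h defaults to log (empirical likelihood)
```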