GMMInference.jl

GMMModel

The abstract type GMMModel is used to define methods for generic method of moments problems. To use this interface, define a concrete subtype of GMMModel and, at a minimum, a specialized get_gi method. See src/models for a few examples.
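
For instance, a minimal concrete model might look like the following sketch. The linear IV moment condition, the MyIVModel name, and its field layout are illustrative and not part of the package; whether the dimension accessors need to be specialized may depend on the model.

```julia
using GMMInference

# Hypothetical linear IV model with moment condition E[(y - x*θ)z] = 0
struct MyIVModel <: GMMModel
    x::Matrix{Float64}
    y::Vector{Float64}
    z::Matrix{Float64}
end

# get_gi returns a function of θ whose value is a
# (number of observations) × (number of moment conditions) matrix
function GMMInference.get_gi(model::MyIVModel)
    θ -> (model.y .- model.x*θ) .* model.z
end

# Dimension accessors (assumed to be specializable in the same way)
GMMInference.number_parameters(model::MyIVModel) = size(model.x, 2)
GMMInference.number_moments(model::MyIVModel) = size(model.z, 2)
GMMInference.number_observations(model::MyIVModel) = length(model.y)
```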

# GMMInference.GMMModelType.

GMMModel

Abstract type for GMM models.

source

# GMMInference.cue_objectiveMethod.

 cue_objective(gi::Function)

Returns the CUE objective function for moment functions gi.

$Q(\theta) = n \left(\tfrac{1}{n} \sum_i g_i(\theta)\right) \widehat{\mathrm{cov}}(g_i(\theta))^{-1} \left(\tfrac{1}{n} \sum_i g_i(\theta)\right)'$

Calculates cov(gi(θ)) assuming observations are independent.

source

# GMMInference.cue_objectiveMethod.

cue_objective(model::GMMModel)

Returns the CUE objective function for model.

The weighting matrix is calculated assuming observations are independent.

source
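
As a rough usage sketch (assuming model is any concrete GMMModel and θ0 a reasonable starting vector; Nelder-Mead is Optim.jl's derivative-free default):

```julia
using GMMInference, Optim

Q = cue_objective(model)    # Q(θ), the CUE objective for `model`
res = optimize(Q, θ0)       # Nelder-Mead by default
θ̂ = Optim.minimizer(res)
```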

# GMMInference.gel_jump_problemFunction.

gel_jump_problem(model::GMMModel, h::Function=log)

Returns JuMP problem for generalized empirical likelihood estimation of model. h is a generalized likelihood function.

source

# GMMInference.gel_nlp_problemFunction.

gel_nlp_problem(model::GMMModel, h::Function=log)

Returns NLPModel problem for generalized empirical likelihood estimation of model. h is a generalized likelihood function.

source

# GMMInference.gel_optim_argsFunction.

gel_optim_args(model::GMMModel, h::Function=log)

Returns a tuple, out, intended for a subsequent call to optimize(out..., IPNewton()).

IPNewton() appears to work better when the constraint on p is 0 ≤ sum(p) ≤ 1 instead of sum(p) = 1 and the optimizer is started from a point with sum(p) < 1.

source
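
For example (a sketch assuming model is a concrete GMMModel; IPNewton is provided by Optim.jl):

```julia
using GMMInference, Optim

out = gel_optim_args(model)
res = optimize(out..., IPNewton())
```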

# GMMInference.get_giMethod.

get_gi(model::GMMModel)

Returns a function gi(θ) where the moment condition for a GMM model is

$E[g_i(\theta)] = 0$

gi(θ) should return a (number of observations) × (number of moment conditions) matrix.

source

# GMMInference.gmm_constraintsMethod.

gmm_constraints(model::GMMModel)

Returns constraints as a function of the parameters, θ, where c(θ) = 0.

source

# GMMInference.gmm_jump_problemFunction.

gmm_jump_problem(model::GMMModel, obj=gmm_objective)

Constructs JuMP problem for GMMModel.

source
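
A rough usage sketch, under the assumption that the returned object is a JuMP model without an optimizer attached (Ipopt is just one possible solver):

```julia
using GMMInference, JuMP, Ipopt

m = gmm_jump_problem(model)         # `model` is a concrete GMMModel
set_optimizer(m, Ipopt.Optimizer)
optimize!(m)
```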

# GMMInference.gmm_nlp_problemFunction.

gmm_nlp_problem(model::GMMModel, obj=gmm_objective)

Constructs NLPModel for GMMModel.

source

# GMMInference.gmm_objectiveFunction.

 gmm_objective(gi::Function, W=I)

Returns $Q(\theta) = n \left(\tfrac{1}{n} \sum_i g_i(\theta)\right) W \left(\tfrac{1}{n} \sum_i g_i(\theta)\right)'$

source

# GMMInference.gmm_objectiveFunction.

gmm_objective(model::GMMModel, W=I)

Returns the GMM objective function with weighting matrix W for model.

source
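
For instance, a two-step GMM sketch built from these pieces (assuming model is a concrete GMMModel with independent observations and θ0 a starting vector):

```julia
using GMMInference, Optim, LinearAlgebra, Statistics

Q1 = gmm_objective(model)              # first step with W = I
θ1 = Optim.minimizer(optimize(Q1, θ0))

g = get_gi(model)(θ1)                  # observations × moments matrix
W = inv(cov(g))                        # efficient weight under independence
Q2 = gmm_objective(model, W)
θ̂ = Optim.minimizer(optimize(Q2, θ1))
```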

# GMMInference.number_momentsMethod.

number_moments(model::GMMModel)

Number of moments (columns of gi(θ)) for a GMMModel

source

# GMMInference.number_observationsMethod.

 number_observations(model::GMMModel)

Number of observations (rows of gi(θ)) for a GMMModel

source

# GMMInference.number_parametersMethod.

 number_parameters(model::GMMModel)

Number of parameters (dimension of θ) for a GMMModel.

source

IV Logit

# GMMInference.IVLogitShareType.

IVLogitShare <: GMMModel

An IVLogitShare model consists of outcomes, y ∈ (0,1), regressors x and instruments z. The moment condition is

$E[(\log(y/(1-y)) - x\beta)z] = 0$

The dimensions of x, y, and z must be such that length(y) == size(x,1) == size(z,1) and size(x,2) ≤ size(z,2).

source

# GMMInference.IVLogitShareMethod.

IVLogitShare(n::Integer, β::AbstractVector,
                  π::AbstractMatrix, ρ)

Simulate an IVLogitShare model.

Arguments

  • n number of observations
  • β coefficients on x
  • π first stage coefficients x = z*π + v
  • ρ correlation between x[:,1] and structural error.

Returns an IVLogitShare GMMModel.

source
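
For example, one might simulate and estimate a model as follows. The particular values are arbitrary, and the orientation of π as (instruments × regressors) is inferred from x = z*π + v:

```julia
using GMMInference, Optim, Random

Random.seed!(1234)
β = [1.0, -0.5]                      # two regressors
π = [1.0 0.0; 0.0 1.0; 0.5 0.5]      # three instruments, so size(x,2) ≤ size(z,2)
data = IVLogitShare(500, β, π, 0.5)  # ρ = 0.5

Q = cue_objective(data)
res = optimize(Q, zeros(number_parameters(data)))
β̂ = Optim.minimizer(res)
```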

Panel Mixture

# GMMInference.MixturePanelType.

MixturePanel(n::Integer, t::Integer,
                  k::Integer, type_prob::AbstractVector,
                  β::AbstractMatrix, σ = 1.0)

Simulate a MixturePanel model.

Arguments

  • n individuals
  • t time periods
  • k regressors
  • type_prob probability of each type
  • β coefficient matrix (k × length(type_prob))
  • σ standard deviation of ϵ

Returns a MixturePanel GMMModel.

source
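
For example (arbitrary values; β has one column of coefficients per type):

```julia
using GMMInference

type_prob = [0.6, 0.4]          # two types
β = [1.0 -1.0;                  # k × length(type_prob) coefficients
     0.5  2.0]
data = MixturePanel(200, 4, 2, type_prob, β, 1.0)   # n = 200, t = 4, k = 2, σ = 1
```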

# GMMInference.MixturePanelType.

MixturePanel <: GMMModel

A MixturePanel model consists of outcomes, y, regressors, x, and a number of types, ntypes. Each observation i is one of ntypes types, with type j occurring with probability p[j]. Conditional on type, y is given by

y[i,t] = x[i,t,:]*β[:,type[i]] + ϵ[i,t]

It is assumed that ϵ is uncorrelated across i and t and E[ϵ²] = σ².

The moment conditions used to estimate p, β and σ are

$E\left[ \sum_{j} x(y - x\beta_{\cdot j}) p_j \right] = 0$

and

$E\left[ y_{it} y_{is} - \sum_j p_j (x_{it}\beta_{\cdot j})(x_{is}\beta_{\cdot j}) - 1(t=s)\sigma^2 \right] = 0$

source

Random Coefficients Logit

# GMMInference.RCLogitType.

RCLogit(n::Integer, β::AbstractVector,
        π::AbstractMatrix, Σ::AbstractMatrix,
        ρ, nsim=100)

Simulates an RCLogit model.

Arguments

  • n number of observations
  • β mean coefficients on x
  • π first stage coefficients x = z*π + e
  • Σ variance of random coefficients
  • ρ correlation between x[:,1] and structural error.
  • nsim number of draws of ν for Monte Carlo integration

source
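
For example (arbitrary values; the orientation of π as (instruments × regressors) is inferred from x = z*π + e):

```julia
using GMMInference, Random

Random.seed!(0)
β = [1.0, -0.5]                      # mean coefficients on two regressors
π = [1.0 0.0; 0.0 1.0; 0.5 0.5]      # three instruments
Σ = [0.5 0.0; 0.0 0.5]               # variance of the random coefficients
data = RCLogit(300, β, π, Σ, 0.3, 100)   # ρ = 0.3, nsim = 100
```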

# GMMInference.RCLogitType.

RCLogit <: GMMModel

A random coefficients logit model with endogeneity. An RCLogit model consists of outcomes, y ∈ (0,1), regressors x, instruments z, and random draws ν ∼ N(0,I). The moment condition is

$E[ξz] = 0$

where

$y = \int \frac{\exp(x(\beta + \nu) + \xi)}{1 + \exp(x(\beta + \nu) + \xi)} d\Phi(\nu;\Sigma)$

where Φ(ν;Σ) is the normal distribution with variance Σ.

The dimensions of x, y, z, and ν must be such that length(y) == size(x,1) == size(z,1) == size(ν,2) and size(ν,3) == size(x,2) ≤ size(z,2).

source

# GMMInference.gmm_constraintsMethod.

gmm_constraints(model::RCLogit)

Returns

$c(\theta) = \int \frac{\exp(x(\beta + \nu) + \xi)}{1 + \exp(x(\beta + \nu) + \xi)} d\Phi(\nu;\Sigma) - y$

where θ = [β, uvec, ξ] with uvec = vector(cholesky(Σ).U).

The integral is computed by Monte Carlo integration.

source

Inference

Inference methods for GMM problems. Currently, only GEL-based tests are included. See empirical likelihood for usage, background, and references.

# GMMInference.gel_pλFunction.

gel_pλ(model::GMMModel, h::Function=log)

Return a function that given parameters θ solves

$\max_p \sum_i h(p_i) \quad \text{s.t.} \quad \sum_i p_i = 1, \; \sum_i p_i g_i(\theta) = 0$

The returned function gives (p(θ), λ(θ)).

The returned function is not thread-safe.

source

# GMMInference.gel_testsFunction.

gel_tests(θ, model::GMMModel, h::Function=log)

Computes GEL test statistics for H₀ : θ = θ₀. Returns a tuple containing statistics and p-values.

source
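
As a usage sketch (assuming data is a concrete GMMModel and θ₀ the hypothesized parameter value; the exact layout of the returned tuple is described in the docstring above):

```julia
using GMMInference

θ₀ = [1.0, -0.5]
result = gel_tests(θ₀, data)   # tuple of test statistics and p-values
```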

We plan to also add AR, KLM, and CLR methods for GMM. See identification robust inference.