
PENALIZED JACKKNIFE EMPIRICAL LIKELIHOOD IN HIGH DIMENSIONS

Abstract

The jackknife empirical likelihood (JEL) is an attractive approach for statistical inference with nonlinear statistics, such as U-statistics. However, most contemporary problems involve high-dimensional model selection, and the feasibility of this approach, in both theory and practice, remains largely unexplored in situations where the number of parameters diverges to infinity. In this paper, we propose a penalized JEL method that preserves the main advantages of the JEL and leads to reliable variable selection based on estimating equations with a U-statistic structure in high-dimensional settings. Under certain regularity conditions, we establish the asymptotic theory and oracle property for the JEL and its penalized version when the numbers of estimating equations and parameters increase with the sample size. Simulation studies and a real-data analysis examine the performance of the proposed methods and illustrate their practical utility.
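The core JEL construction the abstract refers to can be illustrated in a few lines. The sketch below (a minimal, low-dimensional illustration; it is not the authors' penalized method) computes a degree-2 U-statistic, forms its jackknife pseudo-values, and evaluates the JEL log-likelihood ratio for a hypothesized mean of those pseudo-values by solving for the Lagrange multiplier with bisection. The kernel h(a, b) = (a - b)²/2, which makes the U-statistic equal the unbiased sample variance, is chosen purely for illustration.

```python
import numpy as np

def u_stat_variance(x):
    """Degree-2 U-statistic with kernel h(a, b) = (a - b)^2 / 2;
    for this kernel U_n equals the unbiased sample variance."""
    n = len(x)
    i, j = np.triu_indices(n, k=1)
    return np.sum((x[i] - x[j]) ** 2) / (n * (n - 1))

def jackknife_pseudo_values(x, stat):
    """Pseudo-values V_i = n*U_n - (n-1)*U_{n-1}^{(-i)}; the JEL treats
    these approximately i.i.d. quantities as the data."""
    n = len(x)
    full = stat(x)
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    return n * full - (n - 1) * loo

def jel_log_ratio(v, theta, iters=200):
    """-2 log JEL ratio for H0: E[V] = theta.

    Solves sum_i z_i / (1 + lam * z_i) = 0 (z_i = V_i - theta) by
    bisection; the left side is monotone decreasing in lam on the
    feasible interval where all weights 1 + lam * z_i stay positive.
    """
    z = v - theta
    if not (z.min() < 0 < z.max()):
        raise ValueError("theta must lie inside the convex hull of V")
    lo = -1.0 / z.max() + 1e-8   # feasibility: 1 + lam*z_i > 0
    hi = -1.0 / z.min() - 1e-8
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        if np.sum(z / (1.0 + lam * z)) > 0:
            lo = lam
        else:
            hi = lam
    lam = 0.5 * (lo + hi)
    # With p_i = 1 / (n * (1 + lam * z_i)), -2 log R = 2 * sum log(1 + lam*z_i)
    return 2.0 * np.sum(np.log1p(lam * z))
```

At theta equal to the pseudo-value mean the ratio is zero, and it grows as theta moves away, which is what makes Wilks-type calibration possible; the paper's contribution is showing that this machinery, with a penalty added, still works when the number of parameters grows with n.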

Article
Year: 2023
Language: English
Featured Keywords

Estimating equations
high-dimensional data analysis
jackknife empirical likelihood
penalized likelihood
U-statistics
variable selection