Related references
Note: Only a subset of the related references is listed.

On divergence tests for composite hypotheses under composite likelihood
N. Martin et al.
STATISTICAL PAPERS (2019)
A Robust Wald-Type Test for Testing the Equality of Two Means from Log-Normal Samples
Ayanendranath Basu et al.
METHODOLOGY AND COMPUTING IN APPLIED PROBABILITY (2019)
Robust Inference after Random Projections via Hellinger Distance for Location-Scale Family
Lei Li et al.
ENTROPY (2019)
Composite Tests under Corrupted Data
Michel Broniatowski et al.
ENTROPY (2019)
Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator
Elena Castilla et al.
ENTROPY (2018)
New robust statistical procedures for the polytomous logistic regression models
Elena Castilla et al.
BIOMETRICS (2018)
φ-Divergence in Contingency Table Analysis
Maria Kateri
ENTROPY (2018)
Minimum Penalized φ-Divergence Estimation under Model Misspecification
M. Virtudes Alba-Fernandez et al.
ENTROPY (2018)
Robustness Property of Robust-BD Wald-Type Test for Varying-Dimensional General Linear Models
Xiao Guo et al.
ENTROPY (2018)
Robust Estimation for the Single Index Model Using Pseudodistances
Aida Toma et al.
ENTROPY (2018)
A Generalized Relative (α, β)-Entropy: Geometric Properties and Applications to Robust Statistical Inference
Abhik Ghosh et al.
ENTROPY (2018)
Non-Quadratic Distances in Model Assessment
Marianthi Markatou et al.
ENTROPY (2018)
Robust Wald-type tests for non-homogeneous observations based on the minimum density power divergence estimator
Ayanendranath Basu et al.
METRIKA (2018)
Robust Relative Error Estimation
Kei Hirose et al.
ENTROPY (2018)
Convex Optimization via Symmetrical Hölder Divergence for a WLAN Indoor Positioning System
Osamah Abdullah
ENTROPY (2018)
Likelihood Ratio Testing under Measurement Errors
Michel Broniatowski et al.
ENTROPY (2018)
Asymptotic Properties for Methods Combining the Minimum Hellinger Distance Estimate and the Bayesian Nonparametric Density Estimate
Yuefeng Wu et al.
ENTROPY (2018)
Robust and Sparse Regression via γ-Divergence
Takayuki Kawashima et al.
ENTROPY (2017)
Robust-BD Estimation and Inference for General Partially Linear Models
Chunming Zhang et al.
ENTROPY (2017)
A Wald-type test statistic for testing linear hypothesis in logistic regression models based on minimum density power divergence estimator
Ayanendranath Basu et al.
ELECTRONIC JOURNAL OF STATISTICS (2017)
The logarithmic super divergence and asymptotic inference properties
Avijit Maji et al.
ASTA-ADVANCES IN STATISTICAL ANALYSIS (2016)
Robust estimation under heavy contamination using unnormalized models
Takafumi Kanamori et al.
BIOMETRIKA (2015)
Robust tests for the equality of two normal means based on the density power divergence
A. Basu et al.
METRIKA (2015)
Robust-BD Estimation and Inference for Varying-Dimensional General Linear Models
Chunming Zhang et al.
STATISTICA SINICA (2014)
Nearly Unbiased Variable Selection under Minimax Concave Penalty
Cun-Hui Zhang
ANNALS OF STATISTICS (2010)
Robust parameter estimation with a small bias against heavy contamination
Hironori Fujisawa et al.
JOURNAL OF MULTIVARIATE ANALYSIS (2008)
Profile likelihood inferences on semiparametric varying-coefficient partially linear models
J. Q. Fan et al.
BERNOULLI (2005)
Variable selection via nonconcave penalized likelihood and its oracle properties
J. Q. Fan et al.
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION (2001)
Robust inference with GMM estimators
E. Ronchetti et al.
JOURNAL OF ECONOMETRICS (2001)