TY - JOUR
A1 - Hartung, Niklas
A1 - Wahl, Martin
A1 - Rastogi, Abhishake
A1 - Huisinga, Wilhelm
T1 - Nonparametric goodness-of-fit testing for parametric covariate models in pharmacometric analyses
JF - CPT: Pharmacometrics & Systems Pharmacology
N2 - The characterization of covariate effects on model parameters is a crucial step during pharmacokinetic/pharmacodynamic analyses. Although covariate selection criteria have been studied extensively, the choice of the functional relationship between covariates and parameters has received much less attention. Often, a particular class of covariate-to-parameter relationships (linear, exponential, etc.) is chosen ad hoc or based on domain knowledge, and statistical evaluation is limited to the comparison of a small number of such classes. Goodness-of-fit testing against a nonparametric alternative provides a more rigorous approach to covariate model evaluation, but no such test has been proposed so far. In this manuscript, we derive and evaluate nonparametric goodness-of-fit tests for parametric covariate models (the null hypothesis) against a kernelized Tikhonov-regularized alternative, transferring concepts from statistical learning to the pharmacological setting. The approach is evaluated in a simulation study on the estimation of the age-dependent maturation effect on the clearance of a monoclonal antibody. Scenarios of varying data sparsity and residual error are considered. The goodness-of-fit test correctly identified misspecified parametric models with high power in relevant scenarios. The case study provides proof of concept of the feasibility of the proposed approach, which is envisioned to be beneficial for applications that lack well-founded covariate models.
Y1 - 2021
U6 - https://doi.org/10.1002/psp4.12614
SN - 2163-8306
VL - 10
IS - 6
SP - 564
EP - 576
PB - Nature Publishing Group
CY - London
ER -

TY - JOUR
A1 - Rastogi, Abhishake
T1 - Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems
JF - Communications on Pure and Applied Analysis
N2 - In this paper, we consider the nonlinear ill-posed inverse problem with noisy data in the statistical learning setting. The Tikhonov regularization scheme in Hilbert scales is considered to reconstruct the estimator from the random noisy data. In this statistical learning setting, we derive the rates of convergence for the regularized solution under certain assumptions on the nonlinear forward operator and the prior assumptions. We discuss estimates of the reconstruction error using the approach of reproducing kernel Hilbert spaces.
KW - Statistical inverse problem
KW - Tikhonov regularization
KW - Hilbert scales
KW - Reproducing kernel Hilbert space
KW - Minimax convergence rates
Y1 - 2020
U6 - https://doi.org/10.3934/cpaa.2020183
SN - 1534-0392
SN - 1553-5258
VL - 19
IS - 8
SP - 4111
EP - 4126
PB - American Institute of Mathematical Sciences
CY - Springfield
ER -

TY - GEN
A1 - Rastogi, Abhishake
T1 - Tikhonov regularization with oversmoothing penalty for linear statistical inverse learning problems
T2 - AIP Conference Proceedings : Third International Conference of Mathematical Sciences (ICMS 2019)
N2 - In this paper, we consider the linear ill-posed inverse problem with noisy data in the statistical learning setting. The Tikhonov regularization scheme in Hilbert scales is considered in the reproducing kernel Hilbert space framework to reconstruct the estimator from the random noisy data. We discuss the rates of convergence for the regularized solution under the prior assumptions and link condition. For regression functions with smoothness given in terms of source conditions, the error bound can be established explicitly.
KW - Statistical inverse problem
KW - Tikhonov regularization
KW - Hilbert scales
KW - Reproducing kernel Hilbert space
KW - Minimax convergence rates
Y1 - 2019
SN - 978-0-7354-1930-8
U6 - https://doi.org/10.1063/1.5136221
SN - 0094-243X
VL - 2183
PB - American Institute of Physics
CY - Melville
ER -