TY - JOUR
A1 - Bussas, Matthias
A1 - Sawade, Christoph
A1 - Kuhn, Nicolas
A1 - Scheffer, Tobias
A1 - Landwehr, Niels
T1 - Varying-coefficient models for geospatial transfer learning
JF - Machine Learning
N2 - We study prediction problems in which the conditional distribution of the output given the input varies as a function of task variables which, in our applications, represent space and time. In varying-coefficient models, the coefficients of this conditional are allowed to change smoothly in space and time; the strength of the correlations between neighboring points is determined by the data. This is achieved by placing a Gaussian process (GP) prior on the coefficients. Bayesian inference in varying-coefficient models is generally intractable. We show that with an isotropic GP prior, inference in varying-coefficient models resolves to standard inference for a GP that can be solved efficiently. MAP inference in this model resolves to multitask learning using task and instance kernels. We clarify the relationship between varying-coefficient models and the hierarchical Bayesian multitask model and show that inference for hierarchical Bayesian multitask models can be carried out efficiently using graph-Laplacian kernels. We explore the model empirically for the problems of predicting rent and real-estate prices, and predicting the ground motion during seismic events. We find that varying-coefficient models with GP priors excel at predicting rents and real-estate prices. The ground-motion model predicts seismic hazards in the State of California more accurately than the previous state of the art.
KW - Transfer learning
KW - Varying-coefficient models
KW - Housing-price prediction
KW - Seismic-hazard models
Y1 - 2017
U6 - https://doi.org/10.1007/s10994-017-5639-3
SN - 0885-6125
SN - 1573-0565
VL - 106
SP - 1419
EP - 1440
PB - Springer
CY - Dordrecht
ER -