Chapter contents
9 Generalized linear models and the exponential family
9.1 Introduction
9.2 The exponential family
9.2.1 Definition
9.2.2 Examples
9.2.3 Log partition function
9.2.4 MLE for the exponential family
9.2.5 Bayes for the exponential family *
9.2.6 Maximum entropy derivation of the exponential family *
9.3 Generalized linear models (GLMs)
9.3.1 Basics
9.3.2 ML and MAP estimation
9.3.3 Bayesian inference
9.4 Probit regression
9.4.1 ML/MAP estimation using gradient-based optimization
9.4.2 Latent variable interpretation
9.4.3 Ordinal probit regression *
9.4.4 Multinomial probit models *
9.5 Multi-task learning
9.5.1 Hierarchical Bayes for multi-task learning
9.5.2 Application to personalized email spam filtering
9.5.3 Application to domain adaptation
9.5.4 Other kinds of prior
9.6 Generalized linear mixed models *
9.6.1 Example: semi-parametric GLMMs for medical data
9.6.2 Computational issues
9.7 Learning to rank *
9.7.1 The pointwise approach
9.7.2 The pairwise approach
9.7.3 The listwise approach
9.7.4 Loss functions for ranking
GitHub download link: https://github.com/916718212/Machine-Learning-A-Probabilistic-Perspective-.git