Peer reviewed · Open access
  • Efficient parameter learning of Bayesian network classifiers
    Zaidi, Nayyar A.; Webb, Geoffrey I.; Carman, Mark J.; Petitjean, François; Buntine, Wray; Hynes, Mike; De Sterck, Hans

    Machine Learning, 10/2017, Volume: 106, Issue: 9-10
    Journal Article

    Recent advances have demonstrated substantial benefits from learning with both generative and discriminative parameters. On the one hand, generative approaches estimate the parameters of the joint distribution P(y, x), which for most network types is very computationally efficient (a notable exception is Markov networks); on the other hand, discriminative approaches estimate the parameters of the posterior distribution P(y | x) and are more effective for classification, since they fit P(y | x) directly. However, discriminative approaches are less computationally efficient, as the normalization factor in the conditional log-likelihood precludes a closed-form estimation of the parameters. This paper introduces a new discriminative parameter learning method for Bayesian network classifiers that elegantly combines parameters learned using both generative and discriminative methods. The proposed method is discriminative in nature, but uses estimates of generative probabilities to speed up the optimization process. A second contribution is a simple framework for characterizing the parameter learning task for Bayesian network classifiers. We conduct an extensive set of experiments on 72 standard datasets and demonstrate that our proposed discriminative parameterization provides an efficient alternative to other state-of-the-art parameterizations.
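
    To make the generative/discriminative contrast in the abstract concrete, here is a minimal sketch (illustrative only, not the paper's algorithm) for a naive Bayes classifier over binary features: the generative parameters P(y) and P(x_i | y) have closed-form, frequency-count estimates, and those estimates are then refined by plain gradient ascent on the conditional log-likelihood, whose log-sum-exp normalizer is exactly the term that rules out a closed-form solution. All function and variable names below are assumptions made for the example.

```python
# Minimal sketch (illustrative only, not the paper's algorithm): contrast
# closed-form generative estimation with gradient-based discriminative
# refinement for a naive Bayes classifier over binary features, using the
# generative estimates to initialize the discriminative optimization.
import numpy as np

def generative_fit(X, y, n_classes, alpha=1.0):
    """Closed-form, Laplace-smoothed estimates of log P(y) and log P(x_i | y)."""
    n, d = X.shape
    class_counts = np.array([(y == c).sum() for c in range(n_classes)])
    prior = (class_counts + alpha) / (n + alpha * n_classes)
    # cond[c, i] = P(x_i = 1 | y = c)
    cond = np.vstack([
        (X[y == c].sum(axis=0) + alpha) / (class_counts[c] + 2 * alpha)
        for c in range(n_classes)
    ])
    return np.log(prior), np.log(cond), np.log(1.0 - cond)

def cll_and_grad(log_prior, log_p1, log_p0, X, y):
    """Conditional log-likelihood sum_n log P(y_n | x_n) and its gradient with
    respect to the (unconstrained) log-parameters; the normalizer over classes
    is what prevents a closed-form maximizer."""
    joint = log_prior + X @ log_p1.T + (1.0 - X) @ log_p0.T   # log P(y=c, x_n)
    log_norm = np.logaddexp.reduce(joint, axis=1, keepdims=True)
    post = np.exp(joint - log_norm)                           # P(y=c | x_n)
    resid = np.eye(log_prior.size)[y] - post                  # one-hot(y) - posterior
    cll = (joint - log_norm)[np.arange(len(y)), y].sum()
    return cll, resid.sum(axis=0), resid.T @ X, resid.T @ (1.0 - X)

# Toy data: 200 samples, 5 binary features, 2 classes.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 5)).astype(float)
y = (X[:, 0] + rng.random(200) > 1.2).astype(int)

# Generative step: closed-form counts give the starting point.
log_prior, log_p1, log_p0 = generative_fit(X, y, n_classes=2)

# Discriminative step: plain gradient ascent on the conditional log-likelihood.
for _ in range(50):
    cll, g_prior, g_p1, g_p0 = cll_and_grad(log_prior, log_p1, log_p0, X, y)
    log_prior += 0.01 * g_prior
    log_p1 += 0.01 * g_p1
    log_p0 += 0.01 * g_p0

print("conditional log-likelihood after refinement:", cll)
```

    The generative initialization costs a single counting pass over the data and already places the optimizer near a sensible region of the conditional log-likelihood surface, which is the general intuition behind the abstract's claim that generative estimates can be used to speed up discriminative optimization; the paper's actual parameterization and optimizer differ from this toy setup.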