Dynamic signature verification using discriminative training
Gregory F. Russell, Jianying Hu, et al.
ICDAR 2005
This paper addresses the problem of estimating the optimal Hidden Markov Model (HMM) topology, defined as the topology that gives the smallest error rate with the minimal number of parameters. The paper introduces a Bayesian model selection criterion suited to continuous-HMM topology optimization. The criterion is derived from the Laplace approximation of the posterior probability of a model structure and shares the algorithmic simplicity of conventional Bayesian selection criteria such as Schwarz's Bayesian Information Criterion (BIC). Unlike BIC, which assumes a single multivariate Normal prior over all model parameters, the proposed HMM-oriented Bayesian Information Criterion (HBIC) models each parameter with a distribution better suited to that parameter. Results on a handwriting recognition task show that HBIC yields a much smaller and more efficient system than one generated with BIC.
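For context, Schwarz's BIC scores a candidate model by penalizing its maximized log-likelihood with a single term proportional to the number of free parameters; as a rough sketch (the exact HBIC penalty is derived in the paper), the selected topology maximizes

    \mathrm{BIC}(M) = \log p(X \mid \hat{\theta}_M, M) - \frac{k_M}{2}\,\log N

where \hat{\theta}_M is the maximum-likelihood parameter estimate, k_M the number of free parameters of model M, and N the number of observations. Per the abstract, HBIC replaces the uniform (k_M/2) \log N penalty with parameter-specific terms obtained from the Laplace approximation of the model posterior.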