Machine Learning


  • Carlos Fernandez-Lozano, Jose Seoane, Marcos Gestal, Tom Gaunt, Julian Dorado and Colin Campbell. Texture classification using feature selection and kernel-based techniques. Soft Computing, doi: 10.1007/s00500-014-1573-5 (2015).

  • Learning the Coordinate Gradients. Y. Ying, Q. Wu and C. Campbell. Advances in Computational Mathematics, vol. 37, 2012, pp. 355-378.

    Download the paper.

  • A random forest proximity matrix as a new measure for gene annotation. Jose A. Seoane, Ian N.M. Day, Juan P. Casas, Colin Campbell and Tom R. Gaunt. Proceedings ESANN 2014.

  • Texture Classification using Kernel-Based Techniques. Carlos Fernandez-Lozano, Jose A. Seoane, Marcos Gestal, Tom R. Gaunt and Colin Campbell. Proceedings of IWANN 2013, June 2013.

  • Learning with Support Vector Machines. Colin Campbell and Yiming Ying. Morgan and Claypool, 2011.

    Purchase the book.

  • Generalized sparse metric learning with relative comparisons. Kaizhu Huang, Yiming Ying and Colin Campbell. Knowledge and Information Systems (KAIS), volume 28, issue 1, 2011, pages 25-45.

    Download the pdf

  • Rademacher chaos complexity for learning the kernel, Yiming Ying and Colin Campbell. Neural Computation, 22, 2010, p. 2858-2886.

    Download the pdf.

    An earlier paper on this approach (Bounds for Learning the Kernel: Rademacher Chaos Complexity) was released in 2008 and you can download the original pdf. This latest version is an extension of our COLT conference paper: Generalization bounds for learning the kernel. We now provide a self-contained proof for bounding the Rademacher chaos complexity by metric entropy integrals and also correct the claim about generalization bounds derived from the covering number approach (which appeared at the end of Section 3 of the COLT conference version).

  • Sparse Metric Learning via Smooth Optimization. Yiming Ying, Kaizhu Huang and Colin Campbell. Advances in Neural Information Processing Systems 22, 2009, p. 2214-2222.

    Download the pdf

  • Analysis of SVM with Indefinite Kernels. Yiming Ying, Colin Campbell and Mark Girolami. Advances in Neural Information Processing Systems 22, 2009, p. 2205-2213.

    Download the pdf

  • GSML: A Unified Framework for Sparse Metric Learning. Kaizhu Huang, Yiming Ying and Colin Campbell. Proceedings IEEE International Conference on Data Mining, ICDM 2009.

    Download the pdf

  • Generalization Bounds for Learning the Kernel. Yiming Ying and Colin Campbell. Proceedings: Computational Learning Theory (COLT), 2009.

    Download the pdf

  • Learning coordinate gradients with multi-task kernels. Yiming Ying and Colin Campbell. Proceedings: Computational Learning Theory (COLT), 2008, p. 217-228.

    Download the pdf

  • Kaizhu Huang, Zenglin Xu, Irwin King, Michael R. Lyu and Colin Campbell. Supervised Self-taught Learning: Actively Transferring Knowledge from Unlabeled Data, Proceedings of IJCNN09, 2009, p. 1272-1277.

  • Special Issue: Support Vector Machines and Kernel Methods. Editors Nello Cristianini, Colin Campbell and Chris Burges. Machine Learning, vol. 46, 2002.

  • Special Issue: Support Vector Machines. Editors C. Campbell, C.-J. Lin, S.S. Keerthi and V.D. Sanchez A. Neurocomputing, vol. 55/1-2, 2003.

  • Query Learning with Gaussian Processes. Simon Rogers and Colin Campbell. Proceedings of the 2003 UK Workshop on Computational Intelligence, 2003, pages 45-52.

  • A Linear Programming Approach to Novelty Detection. Colin Campbell and Kristin P. Bennett. Advances in Neural Information Processing Systems, vol. 13, MIT Press, Cambridge, MA, 2001, p. 395-401.

    Download the pdf.

  • An Introduction to Kernel Methods. C. Campbell. Radial Basis Function Networks: Design and Applications. R.J. Howlett and L.C. Jain (eds). Physica Verlag, Berlin, 2000, Ch. 7 p. 155-192.

    Download the pdf.

  • Machine Learning Strategies for Complex Tasks. Colin Campbell, Theodoros Evgeniou, Bernd Heisele and Massimiliano Pontil. Proceedings of the First IEEE-RAS International Conference on Humanoid Robots, MIT, 2000, Springer Verlag, 13 pages.

    Download the pdf

  • Query Learning with Large Margin Classifiers. C. Campbell, N. Cristianini and A. Smola. Proceedings of the 17th International Conference on Machine Learning (ICML2000, Stanford, CA, 2000), Morgan Kaufmann, p. 111-118.

    Download the pdf.

  • Support Vector Machine Classification and Validation of Cancer Tissue Samples using Microarray Expression Data. T. Furey, N. Cristianini, N. Duffy, D. Bednarski, Michel Schummer and D. Haussler. Bioinformatics, 2000, 16:906-914.

  • Algorithmic Approaches to Training Support Vector Machines: A Survey. C. Campbell. Proceedings of ESANN2000 (D-Facto Publications, Belgium, 2000) p. 27-36.

    Download the pdf.

  • Kernel Methods: A Survey of Current Techniques. C. Campbell. Neurocomputing 2002, 48: 63-84.

    Download the pdf

  • Bayes Point Machines. Ralf Herbrich, Thore Graepel and Colin Campbell. Journal of Machine Learning Research, 1 (2001), p. 245-279.

    Download the pdf.

  • Robust Bayes Point Machines. R. Herbrich, Th. Graepel and C. Campbell. Proceedings of ESANN2000 (D-Facto Publications, Belgium, 2000) p. 49-54.

    Download the pdf.

  • Large Margin DAGs for Multiclass Classification. J. Platt, N. Cristianini and J. Shawe-Taylor. In Advances in Neural Information Processing Systems, Vol. 12, MIT Press, 2000, p. 547-553.

    Download the pdf.

  • Margin Distribution and Soft Margin. J. Shawe-Taylor and N. Cristianini. In: A.J. Smola, P. Bartlett, B. Schoelkopf, and C.Schuurmans (editors), Advances in Large Margin Classifiers, Chapter 2. MIT Press, 1999.

    Download the pdf.

  • Knowledge-based Analysis of Microarray Gene Expression Data using Support Vector Machines. M. Brown, W. Grundy, D. Lin, N. Cristianini, C. Sugnet, T. Furey, M. Ares Jr. and D. Haussler. Proceedings of the National Academy of Sciences, 97(1), 2000, p. 262-267.

    View details

  • Controlling the Sensitivity of Support Vector Machines. K. Veropoulos, C. Campbell and N. Cristianini. In Proceedings of the International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 1999 (IJCAI99), Workshop ML3, p. 55-60.

    Download the pdf.

  • Bayesian Voting Schemes and Large Margin Classifiers. N. Cristianini and J. Shawe-Taylor. In: B. Schoelkopf, C. Burges and A. Smola (editors), Advances in Kernel Methods - Support Vector Learning, Chapter 5 (p. 55-68). MIT Press, 1999.

    Download the pdf.

  • Margin Distribution Bounds on Generalisation. J. Shawe-Taylor and N. Cristianini. In: Lecture Notes in Artificial Intelligence, 1572, Computational Learning Theory (Proceedings of Eurocolt 1999). p. 263-273.

    Download the pdf.

  • Robust Bounds on Generalisation from the Margin Distribution. J. Shawe-Taylor and N. Cristianini. Submitted to IEEE Transactions on Information Theory.

    Download the pdf.

  • Dynamically Adapting Kernels in Support Vector Machines. N. Cristianini, C. Campbell and J. Shawe-Taylor. In M. Kearns, S. Solla and D. Cohn (editors), Advances in Neural Information Processing Systems, Vol. 11. MIT Press, 1999, p. 204-210.

    Download the pdf.

  • Multiplicative Updatings for Support Vector Machines. N. Cristianini, C. Campbell and J. Shawe-Taylor. In: Proceedings of ESANN 99 (D-Facto Publications, Belgium, 1999) p. 189-194.

    Download the pdf.

  • Enlarging the Margin in Perceptron Decision Trees. K. Bennett, N. Cristianini, J. Shawe-Taylor and D. Wu. Machine Learning 41 (3), 2000, pp. 295-313.

    Download the pdf.

  • Large Margin Decision Trees for Induction and Transduction. D. Wu, K. Bennett, N. Cristianini and J. Shawe-Taylor. In: Proceedings of the Sixteenth International Conference on Machine Learning (ICML99).

    Download the pdf.

  • Bayes Point Machines: Estimating the Bayes Point in Kernel Space. R. Herbrich, Th. Graepel and C. Campbell. In: Proceedings of the International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 1999 (IJCAI99), Workshop ML3, p. 23-27.

    Download the pdf.

  • Bayesian Learning in Reproducing Kernel Hilbert Spaces: The Usefulness of the Bayes Point. R. Herbrich, Th. Graepel and C. Campbell. Technical Report (1999). Longer version of the above paper: comments on this paper are welcome.

    Download the pdf.

  • Further Results on the Margin Distribution. J. Shawe-Taylor and N. Cristianini. Proceedings of COLT'99, Morgan Kaufmann, p. 278-285.

    Download the pdf.

  • Data Dependent Structural Risk Minimization for Perceptron Decision Trees. J. Shawe-Taylor and N. Cristianini. In: M. Jordan, M. Kearns and S. Solla, Advances in Neural Information Processing Systems, Vol. 10 (p. 336-342). MIT Press, 1998.

    Download the pdf.

  • Bayesian Classifiers are Large Margin Hyperplanes in a Hilbert Space. N. Cristianini, J. Shawe-Taylor and P. Sykacek. In: J. Shavlik (editor), Proceedings of the Fifteenth International Conference on Machine Learning, (1998) p. 109-117.

    Download the pdf.

  • Bayesian Classifiers are Large Margin Hyperplanes in a Hilbert Space. N. Cristianini, J. Shawe-Taylor and P. Sykacek. Journal submission (1998).

    Download the pdf.

  • Simple Training Algorithms for Support Vector Machines. C. Campbell and N. Cristianini. Technical Report: describes KA algorithm, including how to implement a bias (same solution as SVM via QP).

    Download the pdf.

  • The Kernel-Adatron: a Fast and Simple Learning Procedure for Support Vector Machines. T. Friess, N. Cristianini and C. Campbell. (First paper on KA algorithm but contains a number of typographical errors). In: J. Shavlik (editor), Proceedings of the Fifteenth International Conference on Machine Learning (1998) p. 188-196.

  • Large Margin Classification Using the Kernel Adatron Algorithm. C. Campbell, Th. Friess and N. Cristianini. In: L. Xu, L.W. Chan and A. Fu (editors), IDEAL'98: Intelligent Data Engineering and Learning, Springer, 1998 p. 355-362.
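    The Kernel-Adatron (KA) papers above describe a simple gradient-ascent procedure on the SVM dual variables. As a rough illustration only (not the published pseudocode; the RBF kernel, function names, learning rate and epoch count are all choices made for this sketch), a minimal bias-free hard-margin variant in Python:

    ```python
    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        """Gram matrix of the Gaussian (RBF) kernel over the rows of X."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * d2)

    def kernel_adatron(K, y, eta=0.1, epochs=500):
        """Kernel-Adatron sketch: coordinate gradient ascent on the
        dual objective, clipping each alpha_i at zero to stay feasible."""
        n = len(y)
        alpha = np.zeros(n)
        for _ in range(epochs):
            for i in range(n):
                # margin of example i under the current dual solution
                z = y[i] * np.sum(alpha * y * K[i])
                alpha[i] = max(0.0, alpha[i] + eta * (1.0 - z))
        return alpha

    def predict(alpha, y, K_test_train):
        """Classify test points given their kernel values against training points."""
        return np.sign(K_test_train @ (alpha * y))

    # Toy usage: the XOR problem, which needs a nonlinear kernel
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    y = np.array([-1.0, 1.0, 1.0, -1.0])
    K = rbf_kernel(X)
    alpha = kernel_adatron(K, y)
    ```

    Because each update is just clipped gradient ascent on a concave objective, the procedure needs no quadratic-programming solver, which is the appeal noted in the papers above; the technical report entry also explains how to incorporate a bias term, which this sketch omits.
    
    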