Dr Sean Holden
Dr Sean Holden is pleased to consider applications from prospective PhD students.
http://www.cl.cam.ac.uk/~sbh11 (personal home page)
Our research covers a range of topics in both theoretical and applied machine learning. At present we are interested in:
- Computational learning theory. How can we better understand the properties of machine learning algorithms in terms of, for example, the relationship between the number of training examples used and their performance?
- Bayesian inference. This approach to machine learning continues to offer state-of-the-art performance in many applications. However, the analytical intractability of many of the fundamental calculations continues to motivate research into improved approximation techniques.
- Quantum computation for machine learning. It is known that, should a practical quantum computer become viable, quantum computation will provide definite benefits in certain areas. However, little is known about the extent to which it might benefit machine learning.
- Machine learning techniques for automated theorem proving.
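As a toy illustration of the intractability mentioned under Bayesian inference (a sketch of a standard textbook technique, not code from the publications below): the posterior over the weights of a Bayesian logistic regression model has no closed form, and the Laplace approximation replaces it with a Gaussian centred at the posterior mode. All function and variable names here are illustrative.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def laplace_logistic(X, y, prior_var=1.0, iters=50):
    """Gaussian (Laplace) approximation to the intractable posterior
    p(w | X, y) of Bayesian logistic regression, labels y in {0, 1}.
    Returns the mode w_map and the approximating covariance."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        # Newton's method for the MAP weights.
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) + w / prior_var          # grad of neg. log posterior
        R = p * (1 - p)
        H = X.T @ (X * R[:, None]) + np.eye(d) / prior_var  # Hessian at w
        w = w - np.linalg.solve(H, grad)
    cov = np.linalg.inv(H)  # Laplace covariance: inverse Hessian at the mode
    return w, cov

# Synthetic data: two informative features with noisy labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X @ np.array([1.5, -2.0]) + rng.normal(scale=0.5, size=100) > 0).astype(float)
w_map, cov = laplace_logistic(X, y)
print("MAP weights:", w_map)
print("posterior variances:", np.diag(cov))
```

The Gaussian prior keeps the mode finite even for nearly separable data; richer approximation families (the subject of the research interest above) aim to capture what this single Gaussian misses.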
R. Russell and S. Holden (2010), “Handling Goal Utility Dependencies in a Satisfiability Framework” Proceedings of the International Conference on Automated Planning and Scheduling (ICAPS)
A. Naish-Guzman and S. Holden (2007), “The Generalized FITC Approximation” Proceedings of Neural Information Processing Systems (NIPS)
R. Burbidge, M. Trotter, S. Holden and B. Buxton (2001), “Drug Design by Machine Learning: Support Vector Machines for Pharmaceutical Data” Special issue of Computers and Chemistry 26(1):4-15
S. B. Holden (1996), “PAC-like Upper Bounds for the Sample Complexity of Leave-One-Out Cross-Validation” Proceedings of the Ninth Annual Conference on Computational Learning Theory (COLT)
S. Fothergill, R. Harle and S. Holden (2008), “Modelling the Model Athlete: Automatic Coaching of Rowing Technique” 12th International Workshop on Structural and Syntactic Pattern Recognition
A. Naish-Guzman and S. Holden (2007), “Robust Regression with Twinned Gaussian Processes” Proceedings of Neural Information Processing Systems (NIPS)
U. Paquet, S. Holden and A. Naish-Guzman (2005), “Bayesian Hierarchical Ordinal Regression” Proceedings of the International Conference on Artificial Neural Networks (ICANN)
U. Paquet, S. Holden and A. Naish-Guzman (2005), “On The Explicit Use Of Example Weights In The Construction Of Classifiers” Proceedings of the International Conference on Artificial Neural Networks (ICANN)
P. Hammond, T. J. Hutton, J. E. Allanson, L. E. Campbell, R. C. M. Hennekam, S. Holden, K. C. Murphy, M. A. Patton, A. Shaw, I. K. Temple, M. Trotter, R. M. Winter (2004), “3D Analysis of Facial Morphology” American Journal of Medical Genetics 126A(4):339-348
M. Trotter and S. Holden (2003), “Support Vector Machines for ADME Property Classification” QSAR & Combinatorial Science 22(5):533-548
J. Wickramaratna, S. B. Holden and B. Buxton (2001), “Performance Degradation in Boosting” Proceedings of the 2nd International Workshop on Multiple Classifier Systems
M. Trotter, B. Buxton and S. B. Holden (2001), “Support Vector Machines in Combinatorial Chemistry” Measurement and Control 34(8):235-239
R. Burbidge, M. Trotter, B. Buxton and S. B. Holden (2001), “STAR - Sparsity Through Automated Rejection” Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence: 6th International Work-Conference on Artificial and Natural Neural Networks (IWANN)
M. Anthony and S. B. Holden (1998), “Cross-Validation for Binary Classification by Real-Valued Functions: Theoretical Analysis” Proceedings of the Eleventh Annual Conference on Computational Learning Theory (COLT)
S. B. Holden (1998), “Generalization” Statistics and Computing 8(1):3-4
S. B. Holden and M. Niranjan (1997), “Average-Case Learning Curves for Radial Basis Function Networks” Neural Computation 9(2):441-460
M. Price, S. B. Holden and M. Sandler (1996), “Accurate Parallel Form Filter Synthesis” Electronics Letters 32(22):2066-2067
S. B. Holden and M. Niranjan (1995), “On the Practical Applicability of VC Dimension Bounds” Neural Computation 7(6):1265-1288
S. B. Holden and M. Niranjan (1995), “On the Statistical Physics of Radial Basis Function Networks” Neural Processing Letters 2(4):16-19
S. B. Holden and P. J. W. Rayner (1995), “Generalization and PAC Learning: Some New Results for the Class of Generalized Single Layer Networks” IEEE Transactions on Neural Networks 6(2):368-380
M. Anthony and S. B. Holden (1994), “Quantifying Generalization in Linearly Weighted Neural Networks” Complex Systems 8:91-114
S. B. Holden (1994), “Neural Networks and the VC Dimension” Proceedings of the Third IMA International Conference on Mathematics and Signal Processing
S. B. Holden (1994), “How Practical are VC Dimension Bounds?” Proceedings of the IEEE International Conference on Neural Networks
M. Anthony and S. B. Holden (1993), “On the Power of Linearly Weighted Neural Networks” Proceedings of the International Conference on Artificial Neural Networks (ICANN)
M. Anthony and S. B. Holden (1993), “On the Power of Polynomial Discriminators and Radial Basis Function Networks” Proceedings of the Sixth Annual ACM Conference on Computational Learning Theory (COLT)
S. B. Holden (1993), “Valid Generalization in Radial Basis Function Networks and Modified Kanerva Models” Proceedings of the IEE Third International Conference on Artificial Neural Networks
S. B. Holden and P. J. W. Rayner (1992), “Generalization and Learning in Volterra and Radial Basis Function Networks: A Theoretical Analysis” Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing
M. R. Lynch, P. J. W. Rayner and S. B. Holden (1991), “Removal of Degeneracy in Adaptive Volterra Networks by Dynamic Structuring” Proceedings of the IEEE International Conference on Acoustics Speech and Signal Processing
M. R. Lynch, S. B. Holden and P. J. W. Rayner (1991), “Complexity Reduction in Volterra Connectionist Networks using a Self-Structuring LMS Algorithm” Proceedings of the IEE Second International Conference on Artificial Neural Networks