Professor Fred Roosta

Phone: +61 7 336 53259

Overview

Availability

Professor Fred Roosta is available for supervision.

Qualifications

  • Doctor of Philosophy, The University of British Columbia

Research interests

  • Artificial Intelligence

  • Machine Learning

  • Numerical Optimization

  • Numerical Analysis

  • Computational Statistics

  • Scientific Computing

Works

Search Professor Fred Roosta’s works on UQ eSpace

45 works between 2014 and 2024

1 - 20 of 45 works

2024

Journal Article

Non-uniform smoothness for gradient descent

Berahas, Albert S., Roberts, Lindon and Roosta, Fred (2024). Non-uniform smoothness for gradient descent. Transactions on Machine Learning Research.

2024

Conference Publication

Inexact Newton-type methods for optimisation with nonnegativity constraints

Smee, Oscar and Roosta, Fred (2024). Inexact Newton-type methods for optimisation with nonnegativity constraints. International Conference on Machine Learning, Vienna, Austria, 21-27 July 2024. Proceedings of Machine Learning Research.

2024

Conference Publication

Manifold integrated gradients: Riemannian geometry for feature attribution

Zaher, Eslam, Trzaskowski, Maciej, Nguyen, Quan and Roosta, Fred (2024). Manifold integrated gradients: Riemannian geometry for feature attribution. International Conference on Machine Learning, Vienna, Austria, 21-27 July 2024. Proceedings of Machine Learning Research.

2023

Conference Publication

Monotonicity and double descent in uncertainty estimation with Gaussian processes

Hodgkinson, Liam, Van Der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2023). Monotonicity and double descent in uncertainty estimation with Gaussian processes. International Conference on Machine Learning, Honolulu, HI United States, 23-29 July 2023. San Diego, CA United States: International Conference on Machine Learning.

2023

Journal Article

Generalising uncertainty improves accuracy and safety of deep learning analytics applied to oncology

MacDonald, Samual, Foley, Helena, Yap, Melvyn, Johnston, Rebecca L., Steven, Kaiah, Koufariotis, Lambros T., Sharma, Sowmya, Wood, Scott, Addala, Venkateswar, Pearson, John V., Roosta, Fred, Waddell, Nicola, Kondrashova, Olga and Trzaskowski, Maciej (2023). Generalising uncertainty improves accuracy and safety of deep learning analytics applied to oncology. Scientific Reports, 13 (1) 7395, 1-14. doi: 10.1038/s41598-023-31126-5

2023

Journal Article

Inexact Newton-CG algorithms with complexity guarantees

Yao, Zhewei, Xu, Peng, Roosta, Fred, Wright, Stephen J. and Mahoney, Michael W. (2023). Inexact Newton-CG algorithms with complexity guarantees. IMA Journal of Numerical Analysis, 43 (3), 1855-1897. doi: 10.1093/imanum/drac043

2022

Journal Article

MINRES: From negative curvature detection to monotonicity properties

Liu, Yang and Roosta, Fred (2022). MINRES: From negative curvature detection to monotonicity properties. SIAM Journal on Optimization, 32 (4), 2636-2661. doi: 10.1137/21m143666x

2022

Journal Article

Confirming the Lassonde Curve through life cycle analysis and its effect on share price: A case study of three ASX listed gold companies

Rijsdijk, Timothy, Nehring, Micah, Kizil, Mehmet and Roosta, Fred (2022). Confirming the Lassonde Curve through life cycle analysis and its effect on share price: A case study of three ASX listed gold companies. Resources Policy, 77 102704, 1-12. doi: 10.1016/j.resourpol.2022.102704

2022

Journal Article

Newton-MR: inexact Newton Method with minimum residual sub-problem solver

Roosta, Fred, Liu, Yang, Xu, Peng and Mahoney, Michael W. (2022). Newton-MR: inexact Newton Method with minimum residual sub-problem solver. EURO Journal on Computational Optimization, 10 100035, 1-44. doi: 10.1016/j.ejco.2022.100035

2022

Journal Article

LSAR: efficient leverage score sampling algorithm for the analysis of big time series data

Eshragh, Ali, Roosta, Fred, Nazari, Asef and Mahoney, Michael W. (2022). LSAR: efficient leverage score sampling algorithm for the analysis of big time series data. Journal of Machine Learning Research, 23, 1-36.

2022

Conference Publication

Crop type prediction utilising a long short-term memory with a self-attention for winter crops in Australia

Nguyen, Dung, Zhao, Yan, Zhang, Yifan, Huynh, Anh Ngoc-Lan, Roosta, Fred, Hammer, Graeme, Chapman, Scott and Potgieter, Andries (2022). Crop type prediction utilising a long short-term memory with a self-attention for winter crops in Australia. IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17-22 July 2022. Piscataway, NJ, United States: Institute of Electrical and Electronics Engineers. doi: 10.1109/IGARSS46834.2022.9883737

2021

Journal Article

Implicit Langevin algorithms for sampling from log-concave densities

Hodgkinson, Liam, Salomone, Robert and Roosta, Fred (2021). Implicit Langevin algorithms for sampling from log-concave densities. Journal of Machine Learning Research, 22 136, 1-30.

2021

Conference Publication

Shadow Manifold Hamiltonian Monte Carlo

van der Heide, Chris, Hodgkinson, Liam, Roosta, Fred and Kroese, Dirk (2021). Shadow Manifold Hamiltonian Monte Carlo. International Conference on Artificial Intelligence and Statistics, Online, 27-30 July 2021. Tempe, AZ, United States: ML Research Press.

2021

Journal Article

Evolution and application of digital technologies to predict crop type and crop phenology in agriculture

Potgieter, A. B., Zhao, Yan, Zarco-Tejada, Pablo J, Chenu, Karine, Zhang, Yifan, Porker, Kenton, Biddulph, Ben, Dang, Yash P., Neale, Tim, Roosta, Fred and Chapman, Scott (2021). Evolution and application of digital technologies to predict crop type and crop phenology in agriculture. In Silico Plants, 3 (1) diab017, 1-23. doi: 10.1093/insilicoplants/diab017

2021

Journal Article

Inexact nonconvex Newton-type methods

Yao, Zhewei, Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2021). Inexact nonconvex Newton-type methods. INFORMS Journal on Optimization, 3 (2), 154-182. doi: 10.1287/ijoo.2019.0043

2021

Journal Article

Convergence of Newton-MR under inexact Hessian information

Liu, Yang and Roosta, Fred (2021). Convergence of Newton-MR under inexact Hessian information. SIAM Journal on Optimization, 31 (1), 59-90. doi: 10.1137/19M1302211

2021

Conference Publication

Avoiding kernel fixed points: Computing with ELU and GELU infinite networks

Tsuchida, Russell, Pearce, Tim, van der Heide, Chris, Roosta, Fred and Gallagher, Marcus (2021). Avoiding kernel fixed points: Computing with ELU and GELU infinite networks. 35th AAAI Conference on Artificial Intelligence, AAAI 2021, Online, 2 - 9 February 2021. Menlo Park, CA United States: Association for the Advancement of Artificial Intelligence.


2021

Journal Article

Limit theorems for out-of-sample extensions of the adjacency and Laplacian spectral embeddings

Levin, Keith D., Roosta, Fred, Tang, Minh, Mahoney, Michael W. and Priebe, Carey E. (2021). Limit theorems for out-of-sample extensions of the adjacency and Laplacian spectral embeddings. Journal of Machine Learning Research, 22 194, 1-59.

2021

Conference Publication

Stochastic continuous normalizing flows: training SDEs as ODEs

Hodgkinson, Liam, van der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2021). Stochastic continuous normalizing flows: training SDEs as ODEs. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).

Funding

Current funding

  • 2021 - 2026
    ARC Training Centre for Information Resilience
    ARC Industrial Transformation Training Centres
    Open grant
  • 2021 - 2025
    CropVision: A next-generation system for predicting crop production
    ARC Linkage Projects
    Open grant

Past funding

  • 2021
    Big time series data and randomised numerical linear algebra
    University of Melbourne
    Open grant
  • 2019
    Approximate solutions to large Markov decision processes
    University of Melbourne
    Open grant
  • 2018 - 2024
    Efficient Second-Order Optimisation Algorithms for Learning from Big Data
    ARC Discovery Early Career Researcher Award
    Open grant

Supervision

Availability

Professor Fred Roosta is available for supervision.

Before you email them, read our advice on how to contact a supervisor.

Available projects

  • Non-convex Optimization for Machine Learning

    Design, analysis, and implementation of novel optimization algorithms for modern non-convex machine learning problems.

  • Interpretable AI - Theory and Practice

    This project will extend and innovate, both theoretically and practically, interpretable AI methods that are transparent and explainable, improving trust and usability. It will also explore novel approaches to uncertainty quantification and to understanding causality.

  • Exploring the Predictivity-Parsimony Trade-off in Scientific Machine Learning

    This project will investigate, both theoretically and empirically, novel statistical techniques to explore the trade-off between high generalization performance and low model complexity in scientific machine learning.

  • Novel Machine Learning Models for Scientific Discovery

    To extend the application range of machine learning to scientific domains, this project will design, analyze, and implement novel machine learning techniques that learn from data while conforming to known properties of the underlying scientific models.

  • Automated Discovery of Optimization and Linear Algebra Algorithms

    Using reinforcement learning to automate algorithmic discovery, this project aims to develop novel variants of first- and second-order optimization methods, randomized numerical linear algebra techniques, and mixed-integer programming approaches.

  • Second-order Optimization Algorithms for Machine Learning

    This project aims to develop the next generation of second-order optimization methods for training complex machine learning models, with particular focus on constrained problems arising in scientific machine learning applications.

  • Distributed Optimization Algorithms for Large-scale Machine Learning

    This project aims to design, analyze and implement efficient optimization algorithms suitable for distributed computing environments, with focus on large-scale machine learning.
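As an illustrative aside, several of the projects above, and many of the listed works (e.g. the Newton-CG and Newton-MR papers), revolve around inexact Newton-type methods: instead of solving the Newton system H p = -g exactly, an iterative solver is applied until the residual is small enough. A minimal, matrix-free sketch in Python, using conjugate gradient and only Hessian-vector products; the function names and the toy quadratic are illustrative, not code from any of the listed papers:

```python
import numpy as np

def newton_cg_step(grad, hess_vec, x, cg_tol=1e-8, cg_maxiter=50):
    """Take one inexact Newton step: approximately solve H p = -g by CG.

    grad(x)        -> gradient vector g at x
    hess_vec(x, v) -> Hessian-vector product H(x) @ v (no explicit Hessian)
    """
    g = grad(x)
    p = np.zeros_like(x)
    r = -g - hess_vec(x, p)   # residual of the Newton system H p = -g
    d = r.copy()
    rs_old = r @ r
    if rs_old == 0.0:         # already at a stationary point
        return x
    for _ in range(cg_maxiter):
        Hd = hess_vec(x, d)
        alpha = rs_old / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < cg_tol:   # inexactness tolerance
            break
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return x + p

# Toy strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
hess_vec = lambda x, v: A @ v

x = newton_cg_step(grad, hess_vec, np.zeros(2))
# For a quadratic with positive-definite Hessian, one exact Newton
# step lands on the minimizer A^{-1} b.
```

The methods studied in the works above extend this template in the directions the titles suggest: adaptive inexactness tolerances, negative-curvature handling (MINRES, Newton-MR), nonnegativity constraints, and worst-case complexity guarantees.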

Supervision history

Current supervision

  • Doctor of Philosophy

    Stochastic Simulation and Optimization Methods for Machine Learning

    Principal Advisor

  • Doctor of Philosophy

    AI/ML Framework for Mixed-integer Nonlinear Optimisation

    Principal Advisor

    Other advisors: Dr Nan Ye

  • Doctor of Philosophy

    Interpretable AI-Theory and Practice

    Principal Advisor

    Other advisors: Dr Quan Nguyen

  • Doctor of Philosophy

    Novel Machine Learning Models for Scientific Discovery

    Principal Advisor

  • Doctor of Philosophy

    Characterizing Influence and Sensitivity in the Interpolating Regime

    Principal Advisor

    Other advisors: Associate Professor Marcus Gallagher

  • Doctor of Philosophy

    Newton-type methods for constrained optimization

    Principal Advisor

  • Doctor of Philosophy

    Forecasting the Market Capitalisation of ASX Listed Junior Resource Companies through an Artificial Neural Network

    Associate Advisor

    Other advisors: Associate Professor Mehmet Kizil, Dr Micah Nehring

  • Doctor of Philosophy

    Efficient graph representation learning with neural networks and self-supervised learning

    Associate Advisor

    Other advisors: Dr Nan Ye

Completed supervision

Media

Enquiries

For media enquiries about Professor Fred Roosta's areas of expertise, story ideas and help finding experts, contact our Media team:

communications@uq.edu.au