Professor Fred Roosta

Email:
Phone: +61 7 336 53259

Overview

Availability

Professor Fred Roosta is available for supervision.

Qualifications

  • Doctor of Philosophy, The University of British Columbia

Research interests

  • Artificial Intelligence

  • Machine Learning

  • Numerical Optimization

  • Numerical Analysis

  • Computational Statistics

  • Scientific Computing

Works

Professor Fred Roosta’s works can be searched on UQ eSpace.

51 works between 2014 and 2025

Showing works 21-40 of 51.

2021

Journal Article

Convergence of Newton-MR under inexact Hessian information

Liu, Yang and Roosta, Fred (2021). Convergence of Newton-MR under inexact Hessian information. SIAM Journal on Optimization, 31 (1), 59-90. doi: 10.1137/19M1302211

2021

Conference Publication

Avoiding kernel fixed points: Computing with ELU and GELU infinite networks

Tsuchida, Russell, Pearce, Tim, van der Heide, Chris, Roosta, Fred and Gallagher, Marcus (2021). Avoiding kernel fixed points: Computing with ELU and GELU infinite networks. 35th AAAI Conference on Artificial Intelligence, AAAI 2021, Online, 2-9 February 2021. Menlo Park, CA, United States: Association for the Advancement of Artificial Intelligence. doi: 10.1609/aaai.v35i11.17197

2021

Journal Article

Limit theorems for out-of-sample extensions of the adjacency and Laplacian spectral embeddings

Levin, Keith D., Roosta, Fred, Tang, Minh, Mahoney, Michael W. and Priebe, Carey E. (2021). Limit theorems for out-of-sample extensions of the adjacency and Laplacian spectral embeddings. Journal of Machine Learning Research, 22 (194), 1-59.

2021

Conference Publication

Stochastic continuous normalizing flows: training SDEs as ODEs

Hodgkinson, Liam, van der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2021). Stochastic continuous normalizing flows: training SDEs as ODEs. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).


2021

Conference Publication

Non-PSD matrix sketching with applications to regression and optimization

Feng, Zhili, Roosta, Fred and Woodruff, David P. (2021). Non-PSD matrix sketching with applications to regression and optimization. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).


2020

Journal Article

Newton-type methods for non-convex optimization under inexact Hessian information

Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2020). Newton-type methods for non-convex optimization under inexact Hessian information. Mathematical Programming, 184 (1-2), 35-70. doi: 10.1007/s10107-019-01405-z


2020

Conference Publication

Newton-ADMM: a distributed GPU-accelerated optimizer for multiclass classification problems

Fang, Chih-Hao, Kylasa, Sudhir B., Roosta, Fred, Mahoney, Michael W. and Grama, Ananth (2020). Newton-ADMM: a distributed GPU-accelerated optimizer for multiclass classification problems. International Conference on High Performance Computing, Networking, Storage and Analysis (SC), Atlanta, GA, United States, 9-19 November 2020. Piscataway, NJ, United States: IEEE Computer Society. doi: 10.1109/SC41405.2020.00061

2020

Conference Publication

DINO: Distributed Newton-type optimization method

Crane, Rixon and Roosta, Fred (2020). DINO: Distributed Newton-type optimization method. 37th International Conference on Machine Learning, ICML 2020, Online, 12-18 July 2020. International Machine Learning Society.


2020

Book Chapter

Parallel optimization techniques for machine learning

Kylasa, Sudhir, Fang, Chih-Hao, Roosta, Fred and Grama, Ananth (2020). Parallel optimization techniques for machine learning. Parallel algorithms in computational science and engineering. (pp. 381-417) edited by Ananth Grama and Ahmed H. Sameh. Cham, Switzerland: Birkhäuser. doi: 10.1007/978-3-030-43736-7_13

2020

Conference Publication

Second-order optimization for non-convex machine learning: an empirical study

Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2020). Second-order optimization for non-convex machine learning: an empirical study. SIAM International Conference on Data Mining, Cincinnati, OH, United States, 7-9 May 2020. Philadelphia, PA, United States: Society for Industrial and Applied Mathematics. doi: 10.1137/1.9781611976236.23



2019

Book Chapter

Optimization methods for inverse problems

Ye, Nan, Roosta-Khorasani, Farbod and Cui, Tiangang (2019). Optimization methods for inverse problems. 2017 MATRIX annals. (pp. 121-140) edited by David R. Wood, Jan de Gier, Cheryl E. Praeger and Terence Tao. Cham, Switzerland: Springer. doi: 10.1007/978-3-030-04161-8_9


2019

Conference Publication

DINGO: Distributed Newton-type method for gradient-norm optimization

Crane, Rixon and Roosta, Fred (2019). DINGO: Distributed Newton-type method for gradient-norm optimization. Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8-14 December 2019. Maryland Heights, MO, United States: Morgan Kaufmann Publishers.

2019

Conference Publication

Exchangeability and kernel invariance in trained MLPs

Tsuchida, Russell, Roosta, Fred and Gallagher, Marcus (2019). Exchangeability and kernel invariance in trained MLPs. Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19), Macao, China, 10-16 August 2019. Marina del Rey, CA, USA: International Joint Conferences on Artificial Intelligence. doi: 10.24963/ijcai.2019/498

2019

Conference Publication

GPU accelerated sub-sampled Newton's method for convex classification problems

Kylasa, Sudhir, Roosta, Fred (Farbod), Mahoney, Michael W. and Grama, Ananth (2019). GPU accelerated sub-sampled Newton's method for convex classification problems. SIAM International Conference on Data Mining, Calgary, Canada, 2-4 May 2019. Philadelphia, PA, United States: Society for Industrial and Applied Mathematics. doi: 10.1137/1.9781611975673.79


2018

Journal Article

Sub-sampled Newton methods

Roosta-Khorasani, Farbod and Mahoney, Michael W. (2018). Sub-sampled Newton methods. Mathematical Programming, 174 (1-2), 293-326. doi: 10.1007/s10107-018-1346-5


2018

Conference Publication

GIANT: Globally improved approximate Newton method for distributed optimization

Wang, Shusen, Roosta-Khorasani, Farbod, Xu, Peng and Mahoney, Michael W. (2018). GIANT: Globally improved approximate Newton method for distributed optimization. 32nd Conference on Neural Information Processing Systems, NeurIPS 2018, Montreal, QC, Canada, 2-8 December 2018. Maryland Heights, MO, United States: Neural Information Processing Systems Foundation.

2018

Conference Publication

FLAG n’ FLARE: fast linearly-coupled adaptive gradient methods

Cheng, Xiang, Roosta-Khorasani, Farbod, Palombo, Stefan, Bartlett, Peter L. and Mahoney, Michael W. (2018). FLAG n’ FLARE: fast linearly-coupled adaptive gradient methods. Twenty-First International Conference on Artificial Intelligence and Statistics, Lanzarote, Canary Islands, 9-11 April 2018. Cambridge, MA, United States: MIT Press.

2018

Conference Publication

Out-of-sample extension of graph adjacency spectral embedding

Levin, Keith, Roosta-Khorasani, Farbod, Mahoney, Michael W. and Priebe, Carey E. (2018). Out-of-sample extension of graph adjacency spectral embedding. 35th International Conference on Machine Learning, Stockholm, Sweden, 10-15 July 2018. Cambridge, MA, United States: MIT Press.
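
Several of the works listed above (for example "Sub-sampled Newton methods", "Newton-type methods for non-convex optimization under inexact Hessian information", GIANT, DINGO, and DINO) develop Newton-type methods that tolerate inexact, sub-sampled, or distributed Hessian information. As a rough orientation only, and not a formula drawn from any single paper above, the generic template for a finite-sum objective f(w) = (1/n) \sum_{i=1}^{n} f_i(w) is an inexactly solved Newton system over a random subsample S of the n data points:

\[
  H_S(w_k)\, p_k \approx -\nabla f(w_k), \qquad
  w_{k+1} = w_k + \alpha_k\, p_k, \qquad
  H_S(w_k) = \frac{1}{|S|} \sum_{i \in S} \nabla^2 f_i(w_k),
\]

where the step size \alpha_k typically comes from a line search and the linear system is solved only approximately by an iterative method such as conjugate gradient or a minimum-residual solver.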

Funding

Current funding

  • 2025 - 2028
    Next Generation Newton-type Methods with Minimum Residual Solver
    ARC Discovery Projects
    Open grant
  • 2021 - 2026
    ARC Training Centre for Information Resilience
    ARC Industrial Transformation Training Centres
    Open grant
  • 2021 - 2025
    CropVision: A next-generation system for predicting crop production
    ARC Linkage Projects
    Open grant

Past funding

  • 2021
    Big time series data and randomised numerical linear algebra
    University of Melbourne
    Open grant
  • 2019
    Approximate solutions to large Markov decision processes
    University of Melbourne
    Open grant
  • 2018 - 2024
    Efficient Second-Order Optimisation Algorithms for Learning from Big Data
    ARC Discovery Early Career Researcher Award
    Open grant

Supervision


Available projects

  • Non-convex Optimization for Machine Learning

    Design, analysis, and implementation of novel optimization algorithms for modern non-convex machine learning problems.

  • Interpretable AI - Theory and Practice

    This project will extend and innovate, both theoretically and practically, interpretable AI methods that are transparent and explainable, with the goal of improving trust and usability. It will also explore novel approaches to uncertainty quantification and to understanding causality.

  • Exploring Predictivity--Parsimony Trade-off In Scientific Machine Learning

    This project will investigate, both theoretically and empirically, novel statistical techniques for exploring the trade-off between high generalization performance and low model complexity in scientific machine learning.

  • Novel Machine Learning Models for Scientific Discovery

    To extend the application range of machine learning to scientific domains, this project will design, analyze, and implement novel machine learning techniques that learn from data while conforming to known properties of the underlying scientific models.

  • Automated Discovery of Optimization and Linear Algebra Algorithms

    Using reinforcement learning to automate algorithmic discovery, this project aims to develop novel variants of first- and second-order optimization methods, randomized numerical linear algebra techniques, and mixed-integer programming approaches.

  • Second-order Optimization Algorithms for Machine Learning

    This project aims to develop the next generation of second-order optimization methods for training complex machine learning models, with a particular focus on constrained problems arising in scientific machine learning applications (an illustrative sketch of one such Newton-type step appears after this list).

  • Distributed Optimization Algorithms for Large-scale Machine Learning

    This project aims to design, analyze, and implement efficient optimization algorithms suitable for distributed computing environments, with a focus on large-scale machine learning.
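
To make the flavour of the second-order and distributed optimization projects above concrete, below is a minimal, illustrative sketch (in Python/NumPy) of a sub-sampled Newton step for regularized logistic regression, with the Newton system solved inexactly by conjugate gradient. This is a generic construction matching the template shown at the end of the Works section, not code from any listed work or project; all function names, the subsample size, and the fixed unit step are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def full_gradient(w, X, y, lam):
    # Gradient of f(w) = mean_i log(1 + exp(-y_i * x_i^T w)) + (lam/2) ||w||^2.
    z = X @ w
    return -(X.T @ (y * sigmoid(-y * z))) / len(y) + lam * w

def subsampled_hessian_vec(w, X_s, lam, v):
    # Hessian-vector product H_S(w) v computed from the subsample rows X_s only.
    p = sigmoid(X_s @ w)
    d = p * (1.0 - p)
    return X_s.T @ (d * (X_s @ v)) / len(X_s) + lam * v

def conjugate_gradient(hvp, b, tol=1e-8, max_iter=50):
    # Inexactly solve H x = b, where H is available only through products hvp(v).
    x = np.zeros_like(b)
    r, p = b.copy(), b.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hvp(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Synthetic binary classification data with labels in {-1, +1}.
rng = np.random.default_rng(0)
n, d, lam = 2000, 20, 1e-3
X = rng.standard_normal((n, d))
y = np.sign(X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n))

w = np.zeros(d)
for k in range(10):
    g = full_gradient(w, X, y, lam)
    S = rng.choice(n, size=n // 10, replace=False)  # random Hessian subsample
    step = conjugate_gradient(lambda v: subsampled_hessian_vec(w, X[S], lam, v), -g)
    w += step  # unit step for brevity; practical variants use a line search

Practical methods in this area add line searches or trust regions, handle indefinite Hessians (for example via minimum-residual solvers rather than conjugate gradient), and distribute the per-sample computation across workers; the sketch shows only the basic subsample-then-solve structure.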

Supervision history

Current supervision

  • Doctor of Philosophy

    Newton-type methods for constrained optimization

    Principal Advisor

  • Doctor of Philosophy

    Novel Machine Learning Models for Scientific Discovery

    Principal Advisor

  • Doctor of Philosophy

    AI/ML Framework for Mixed-integer Nonlinear Optimisation

    Principal Advisor

  • Doctor of Philosophy

    Interpretable AI - Theory and Practice

    Principal Advisor

    Other advisors: Dr Quan Nguyen, Dr Maciej Trzaskowski

  • Doctor of Philosophy

    Faithful-Newton Framework: Bridging between Inner and Outer Solvers

    Principal Advisor

    Other advisors: Associate Professor Marcus Gallagher

  • Doctor of Philosophy

    Stochastic Simulation and Optimization Methods for Machine Learning

    Principal Advisor

  • Doctor of Philosophy

    Forecasting the Market Capitalisation of ASX Listed Junior Resource Companies through an Artificial Neural Network

    Associate Advisor

    Other advisors: Associate Professor Mehmet Kizil, Dr Micah Nehring

  • Doctor of Philosophy

    Offline Reinforcement Learning Theory and Algorithms

    Associate Advisor

    Other advisors: Dr Nan Ye

Completed supervision

Media

Enquiries

For media enquiries about Professor Fred Roosta's areas of expertise, story ideas and help finding experts, contact our Media team:

communications@uq.edu.au