Posts

Welcome to the Foundation Models and Federated Learning research group.

This group is based at the University of Technology Sydney, Australia. We are dedicated to exploring new methodologies and applications of foundation models and federated learning. We are passionate about sharing our research outcomes with the research community by publishing papers, giving talks, hosting workshops and seminars, and writing blog posts. We also enjoy discussions with industry partners, solving their practical challenges with our knowledge and methods and working towards co-designing innovative applications. We have a special interest in applying cutting-edge artificial intelligence technology to challenges in science, cybersecurity, digital health, climate change, and environmental sustainability. Image 1: UTS Central, Building 2 (the image is sourced from the Internet) Prompt engineering is a major way of using foundation models such as ChatGPT. However, privacy in prompt learning poses a new risk when using foundation models. For examp...

Will Federated Learning Be the Next Key Breakthrough for Time-Series Foundation Models?

Author: Guodong Long, Australian Artificial Intelligence Institute, University of Technology Sydney, Australia Date: Feb 12, 2026 Federated learning has long been viewed primarily as a privacy-preserving technique. However, its true essence lies in something deeper: the intrinsic capability for collaboration among distributed participants. When we shift our perspective from privacy to collaboration, a more ambitious vision of federated learning emerges, and with it, new possibilities for innovation. Let us momentarily set aside privacy preservation and explore a broader question: could federated learning unlock the next generation of time-series foundation models? If this sparks ideas for papers, grants, or applications, I would be delighted to discuss further. A Brief History of Time-Series Foundation Models Foundation models are pre-trained machine learning models designed to capture general knowle...

Federated Recommendation: A Privacy-Preserving Future for Recommendation Systems

Author: Guodong Long, Australian Artificial Intelligence Institute, University of Technology Sydney, Australia Date: Feb 12, 2026 What Is Federated Recommendation? Federated recommendation is an emerging approach to building recommendation systems using a federated learning (FL) framework, with privacy preservation as a core principle. Traditional recommendation systems are typically operated by centralized servers controlled by service providers. These systems require users' interaction data, such as viewing history, browsing behavior, and purchase records, to be stored and processed on the provider's servers. In contrast, federated recommendation shifts data storage and model training to users' local devices (e.g., smartphones or personal computers). Rather than uploading raw behavioral data to centralized servers, the model is trained locally, and only privacy-preserving updates are shared with the service provider. This reduces the exposure of sensitive personal information while m...
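The local-training and update-sharing split described above can be sketched in a few lines. The following is a minimal, hypothetical matrix-factorisation example (all class names, hyperparameters, and ratings are illustrative, not taken from any specific system): each client keeps its ratings and its user embedding on-device, and sends the server only item-embedding updates, which the server aggregates FedAvg-style.

```python
import numpy as np

DIM, N_ITEMS, LR = 8, 5, 0.05  # illustrative sizes and learning rate
rng = np.random.default_rng(0)

class Client:
    """Holds private ratings and a private user embedding; shares only item updates."""
    def __init__(self, ratings):
        self.ratings = ratings                        # {item_id: rating}, never uploaded
        self.user_vec = rng.normal(0, 0.1, DIM)       # private user embedding

    def local_update(self, item_embs):
        """Run local SGD; return only the item-embedding delta, not raw data."""
        delta = np.zeros_like(item_embs)
        for item, r in self.ratings.items():
            err = r - self.user_vec @ item_embs[item]
            delta[item] += LR * err * self.user_vec           # shared with the server
            self.user_vec += LR * err * item_embs[item]       # stays on-device
        return delta

# Server side: global item embeddings, refined from aggregated client updates.
item_embs = rng.normal(0, 0.1, (N_ITEMS, DIM))
clients = [Client({0: 5.0, 2: 1.0}), Client({1: 4.0, 2: 2.0})]

for _ in range(200):                                  # communication rounds
    updates = [c.local_update(item_embs) for c in clients]
    item_embs += np.mean(updates, axis=0)             # FedAvg-style aggregation

# Each client can now score items locally; predictions approach its own ratings.
pred = float(clients[0].user_vec @ item_embs[0])
```

The key property of the sketch is that the server only ever sees `delta`, an aggregate of gradients on the shared item model; the user embedding and the rating dictionary never leave the client. Real systems add secure aggregation or noise on top of this basic split.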

A Brief Investigation into the Current Status of Federated Learning for Health in Australia

Author: Guodong Long, Australian Artificial Intelligence Institute, University of Technology Sydney, Australia Date: Feb 12, 2026 Federated learning (FL) is a privacy-preserving machine learning paradigm designed to enable collaborative model training across distributed data sources without centralising sensitive data. It has been widely adopted in domains involving personal devices (e.g., smartphones) and data-sensitive institutions, particularly healthcare and finance. National Infrastructure and Major Investments In 2023, an A$13.7 million initiative (link), titled NINA – National Infrastructure for Federated Learning in Digital Health, was established and is hosted by the Faculty of Health at The University of Queensland (UQ). Led by Professor Clair Sullivan (UQ), the project received A$6 million from the Medical Research Future Fund (MRFF) under the National Critical Research Infrastructure scheme. This investment reflects growing national commitment to federated learning as a...

Post-project thinking: Enterprise Intelligent Assistants for SMEs

The rise of LLMs has demonstrated the promising future of developing commercial EIAs in SME environments. Below are several items for consideration. 1. Transformer architecture is the core. The Transformer has become the dominant neural architecture for implementing modern EIAs, and LLMs themselves are built on the Transformer at large scale. It is therefore straightforward to migrate current SME-based applications to an LLM-based framework, because they share essentially the same architecture. Open-source LLMs are also available, which makes this transition feasible. 2. Trustworthy collaboration is the future. Most SMEs are unwilling to expose their data to LLMs for reasons of data security and business protection. However, they need to plug into LLMs to utilise these powerful generative AI tools. This dilemma could be resolved by a federated fine-tuning mechanism built on top of LLMs. Therefore, trustworthy collaboration will be a promising direction for future SME use scenarios. ...
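As a rough illustration of the federated fine-tuning idea, here is a hypothetical numpy sketch (not any particular framework's API, and a linear layer stands in for a full LLM): the pre-trained weight `W` is frozen and identical everywhere, each SME client trains only a small low-rank adapter on its private data, and the server averages the adapters. Raw client data never leaves the client.

```python
import numpy as np

rng = np.random.default_rng(1)
D, RANK, N, LR = 16, 2, 64, 0.2              # illustrative sizes and learning rate

W = rng.normal(0, 0.1, (D, D))               # frozen "pre-trained" weight, shared by all
# A common low-rank shift all clients need to learn (stands in for a shared task).
delta = rng.normal(0, 0.3, (D, RANK)) @ rng.normal(0, 0.3, (RANK, D))

def make_client():
    X = rng.normal(size=(D, N))              # private local data, never uploaded
    Y = (W + delta) @ X                      # local targets
    return X, Y

clients = [make_client() for _ in range(3)]

def loss(A, B):
    return float(np.mean([np.mean(((W + B @ A) @ X - Y) ** 2) for X, Y in clients]))

# LoRA-style init: A random, B zero, so the adapter B @ A starts as a no-op.
A = rng.normal(0, 0.1, (RANK, D))
B = np.zeros((D, RANK))
initial = loss(A, B)

def local_steps(A, B, X, Y, steps=10):
    """Client-side: gradient descent on the adapter (A, B) only; W stays frozen."""
    A, B = A.copy(), B.copy()
    for _ in range(steps):
        G = ((W + B @ A) @ X - Y) @ X.T / N  # gradient w.r.t. the product B @ A
        gA, gB = B.T @ G, G @ A.T
        A -= LR * gA
        B -= LR * gB
    return A, B

for _ in range(50):                          # communication rounds
    results = [local_steps(A, B, X, Y) for X, Y in clients]
    A = np.mean([r[0] for r in results], axis=0)   # FedAvg on adapters only
    B = np.mean([r[1] for r in results], axis=0)

final = loss(A, B)                           # far below the initial loss
```

Only the small adapter matrices (RANK x D each) cross the network, which is exactly why this pattern is attractive for SMEs: the communication and trust footprint is a tiny fraction of the full model. Averaging `A` and `B` separately is the usual FedAvg approximation; the mean of products is not the product of means, but with a shared task it works well in practice.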

Meet you in person at the IJCAI-FL workshop in Macau, 21 August 2023.

I am serving as one of the program co-chairs of the Federated Learning workshop at IJCAI 2023. Please find the website below. https://federated-learning.org/fl-ijcai-2023/ Workshop date: August 21, 2023. Venue: Sheraton Grand Macao Hotel, Macau.

Scholarships available for 2023 and 2024.

We have several scholarships available for both domestic and international candidates. Below are the requirements for recruitment. Please send your CV to the contact email: guodong.long (at) uts.edu.au. The University of Technology Sydney is a public research university located in Sydney, New South Wales, Australia. The UTS Computer Science subject is highly ranked worldwide: #17 by ShanghaiRanking 2022, #15 by USNews 2022-2023, #73 by QS Ranking 2023, and #69 by TIMES 2023. 1. Federated learning research. The student is expected to publish papers at top machine learning conferences, including ICML, ICLR, and NeurIPS; other related conferences and journals are also acceptable. Requirements: 1) the candidate has research experience or is ranked in the top 5% of their bachelor's cohort; 2) the candidate has a special interest and strong background in one of the following disciplines: Optimization, Statistics, Math...