PhD Proposal: Improving Scalability of Transformer Methods for Tabular and Time Series Problems
Mayuka Jayawardhana
IRB-5105 https://umd.zoom.us/j/2301256760?pwd=yEnbPucXOgRadvtKIGCybs1LVKj471.1 (Passcode: 529800)
Thursday, December 11, 2025, 10:00-11:30 am
Abstract

Gradient-boosted decision tree (GBDT) algorithms such as XGBoost, CatBoost, and LightGBM have been the de facto standard for tabular and time series problems for the past decade. Meanwhile, prior-data fitted networks (PFNs), a recent class of transformer-based foundation models pretrained for in-context learning, have demonstrated superior performance on small and medium-sized tabular datasets. Large language models (LLMs) have also shown strong zero- and few-shot performance on tabular tasks by leveraging column headers and descriptive metadata. However, transformer models struggle to scale to large datasets due to inherent context-length limitations: an in-context learner must fit the entire training set within its context window at prediction time.
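
To make the contrast concrete, the following is a minimal sketch, assuming the public xgboost and tabpfn packages with their scikit-learn-style fit/predict interfaces; the dataset and parameter choices are illustrative and are not part of the proposal.

# Minimal sketch: GBDT baseline vs. PFN-style in-context learner.
# Assumes the xgboost and tabpfn packages; all settings are illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# GBDT: trees are fit to this specific dataset at training time.
gbdt = XGBClassifier(n_estimators=200)
gbdt.fit(X_train, y_train)

# PFN: a pretrained transformer; fit() essentially stores the training set,
# which is then consumed as in-context examples at prediction time. The whole
# training set must fit inside the model's context window, which is the
# scalability bottleneck this proposal targets.
pfn = TabPFNClassifier()
pfn.fit(X_train, y_train)

print("GBDT accuracy:", gbdt.score(X_test, y_test))
print("PFN accuracy:", pfn.score(X_test, y_test))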

This dissertation presents a set of contributions aimed at improving the scalability of transformer-based models for tabular and time series prediction, enabling them to retain their pretraining benefits, natural-language understanding, and strong small-data reasoning capabilities while extending reliably to large-scale datasets and joint multivariate prediction settings.

Bio

Mayuka Jayawardhana is a fifth-year Ph.D. student in Computer Science at the University of Maryland, College Park, advised by Prof. Tom Goldstein. His research focuses on scalable foundation models for tabular and time series problems.


This talk is organized by Migo Gui