Graph Neural Networks for Time Series Forecasting: Beyond Traditional Approaches

2025-06-17

This blog post presents a novel approach to time series forecasting using graph neural networks. Unlike traditional methods that treat each time series in isolation, this approach leverages the interconnectedness of data within a graph structure (e.g., one derived from a relational database). By representing time series as nodes in a graph and employing techniques such as graph transformers, the model captures relationships between different series, leading to more accurate predictions. The post also compares regression-based and generative forecasting methods, showing that the generative approach better captures high-frequency detail and handles rare events.
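The idea of forecasting related series jointly can be sketched in a few lines. The example below is purely illustrative and not the post's actual model (which uses a graph transformer): it mixes each series with its graph neighbors via one message-passing step, then fits a shared linear autoregression on the mixed lag features. The adjacency matrix and all shapes are assumptions for the sketch.

```python
import numpy as np

# Toy setup: 3 related time series, one per graph node, with a hypothetical
# adjacency matrix saying which series influence which.
rng = np.random.default_rng(0)
T, n_nodes, lags = 100, 3, 3
series = np.cumsum(rng.normal(size=(T, n_nodes)), axis=0)  # (time, node)

# Row-normalized adjacency: each node averages over itself and its neighbors.
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
A /= A.sum(axis=1, keepdims=True)

def graph_smoothed_lags(x):
    """One message-passing step (neighbor averaging), then stack lag features."""
    mixed = x @ A.T  # (T, n_nodes): each series blended with its neighbors
    feats = np.stack([mixed[lags - k - 1 : -(k + 1)] for k in range(lags)], axis=-1)
    return feats, x[lags:]  # features (T-lags, n_nodes, lags), targets (T-lags, n_nodes)

X, y = graph_smoothed_lags(series)
# Fit one linear forecaster shared across all nodes via least squares.
Xf, yf = X.reshape(-1, lags), y.reshape(-1)
w, *_ = np.linalg.lstsq(Xf, yf, rcond=None)
pred = Xf @ w
```

The point of the graph step is that node 0's forecast now depends on node 1's recent history, which a per-series model cannot exploit.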


KumoRFM: A Relational Foundation Model for Revolutionizing Relational Database Predictions

2025-05-23

KumoRFM is a groundbreaking Relational Foundation Model (RFM) capable of making accurate predictions on relational databases across a wide range of predictive tasks without any dataset- or task-specific training. It achieves this by transforming databases into temporal, heterogeneous graphs, employing a table-invariant encoding scheme, and using a Relational Graph Transformer to reason over multimodal data across tables. On the RelBench benchmark, KumoRFM outperforms both traditional feature engineering and end-to-end supervised deep learning by 2% to 8% on average, with further gains of 10% to 30% after fine-tuning. Most importantly, KumoRFM is orders of magnitude faster than conventional supervised training, offering a zero-code solution for real-time predictions.
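The first step the summary describes, turning a relational database into a temporal, heterogeneous graph, can be illustrated with plain Python. The tables, column names, and dict-based graph below are hypothetical and chosen for brevity; they are not KumoRFM's internal representation. Rows become typed nodes, foreign keys become edges, and timestamps make the graph temporal.

```python
# Two toy tables linked by a foreign key (user_id). All data is made up.
users = [{"user_id": 1, "signup": "2024-01-05"},
         {"user_id": 2, "signup": "2024-02-10"}]
orders = [{"order_id": 10, "user_id": 1, "ts": "2024-03-01"},
          {"order_id": 11, "user_id": 1, "ts": "2024-03-09"},
          {"order_id": 12, "user_id": 2, "ts": "2024-03-15"}]

# Typed nodes: one node type per table, keyed by (type, primary key).
nodes = {("user", u["user_id"]): u for u in users}
nodes.update({("order", o["order_id"]): o for o in orders})

# One edge type per foreign-key relation. Carrying the row timestamp on each
# edge is what makes the graph temporal: a model can exclude neighbors that
# lie in the future of the prediction time.
edges = [(("order", o["order_id"]), ("user", o["user_id"]), o["ts"])
         for o in orders]
```

A table-invariant encoder would then map each node's raw column values (dates, numbers, text) into a shared embedding space, so the same model can handle any schema.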


Relational Graph Transformers: Unleashing AI's Potential in Relational Databases

2025-04-28

Traditional machine learning struggles to capture the valuable insights hidden in the complex relationships between tables in enterprise data. Relational Graph Transformers (RGTs) represent a breakthrough: by treating relational databases as interconnected graphs, they eliminate the need for extensive feature engineering and complex data pipelines. RGTs significantly improve the efficiency and accuracy of AI in extracting intelligence from business data, showing immense potential in applications like customer analytics, recommendation systems, fraud detection, and demand forecasting. They offer a powerful new tool for both data scientists and business leaders.


Graph Transformers: The Next Generation of Graph Models

2025-04-22

Graphs are ubiquitous, but leveraging their complex, long-range relationships has been a challenge for machine learning. Graph Neural Networks (GNNs) excel at capturing local patterns but struggle with global relationships. Enter Graph Transformers, which leverage powerful self-attention mechanisms, enabling each node to directly attend to information from anywhere in the graph, thus capturing richer relationships and subtle patterns. Compared to GNNs, Graph Transformers offer advantages in handling long-range dependencies, mitigating over-smoothing and over-squashing, and more effectively processing heterogeneous data. While Graph Transformers have higher computational complexity, techniques like sparse attention mechanisms and subgraph sampling enable efficient processing of large graph datasets.
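The contrast the summary draws, full self-attention over all nodes versus attention restricted to graph edges, can be sketched with a minimal single-head attention layer. This is an illustrative NumPy sketch, not any library's implementation; the weight matrices and the random edge mask are assumptions.

```python
import numpy as np

# Node features for a tiny 5-node graph (all values random/illustrative).
rng = np.random.default_rng(0)
n_nodes, d = 5, 8
X = rng.normal(size=(n_nodes, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Full self-attention: every node attends to every other node, giving the
# global receptive field that plain GNN message passing lacks.
Q, K, V = X @ Wq, X @ Wk, X @ Wv
scores = Q @ K.T / np.sqrt(d)          # (n_nodes, n_nodes) all-pairs scores
attn = softmax(scores)
out = attn @ V                         # each node mixes information globally

# Sparse variant: mask attention to a (here random, hypothetical) edge set
# plus self-loops. Restricting the score matrix like this is the basic idea
# behind sparse attention's lower cost on large graphs.
mask = np.eye(n_nodes, dtype=bool) | (rng.random((n_nodes, n_nodes)) < 0.4)
sparse_out = softmax(np.where(mask, scores, -np.inf)) @ V
```

The full version costs O(n²) in the number of nodes, which is exactly why the sparse-attention and subgraph-sampling techniques mentioned above matter at scale.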
