Large-Scale Transformer-based Causal Discovery Model (LCDM)

We are developing a novel architecture, the Large Causal Discovery Model (LCDM), that turns causal learning into an inference task: like today's pre-trained large models, it removes the need to retrain or hand-craft a model for each new dataset.

Moreover, rather than training from scratch, we fine-tune existing LLMs. Because LLMs are trained via next-token prediction, they implicitly learn the Markov blankets of variables; we leverage this property, applying post hoc analysis of attention matrices to reconstruct causal graphs.

This approach enables scalable, zero-shot causal reasoning and transforms causal learning into a reusable capability embedded in the foundation model itself.
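To make the attention-based readout concrete, here is a minimal sketch under stated assumptions: a generic Hugging Face causal LM (`gpt2` as a stand-in), one token position per variable, and a simple average-and-threshold rule. The model name, threshold, and aggregation scheme are illustrative placeholders, not LCDM's actual procedure.

```python
# Minimal sketch: reading a candidate causal graph out of attention matrices.
# All choices here (model, tokenization, layer/head averaging, threshold)
# are illustrative assumptions, not the actual LCDM pipeline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder stand-in for a fine-tuned LLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def attention_adjacency(text: str, threshold: float = 0.1) -> torch.Tensor:
    """Average attention over layers and heads, then threshold the result
    into a directed adjacency matrix over token positions."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs, output_attentions=True)
    # outputs.attentions is a tuple of (batch, heads, seq, seq), one per layer
    stacked = torch.stack(outputs.attentions)   # (layers, batch, heads, seq, seq)
    avg = stacked.mean(dim=(0, 2)).squeeze(0)   # average layers + heads -> (seq, seq)
    return (avg > threshold).int()

# Toy usage: in practice each variable's values would be serialized into the
# prompt; here the tokens simply stand in for variables.
print(attention_adjacency("X1 X2 X3 X4"))
```

Note that attention in a causal LM is masked to be lower-triangular, so this naive readout can only propose edges from earlier to later positions; a real procedure would also need to map subword tokens back to variables and correct for that masking.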

Built on Transformers

Plug-and-Play Causal Discovery

Delivering Causal Structures
