Solving ARC-AGI Puzzles Without Pretraining: A Compression-Based Approach
Isaac Liao and Albert Gu introduce CompressARC, a novel method that tackles the ARC-AGI benchmark using lossless information compression. Without pretraining or large external datasets, the method achieves 34.75% accuracy on the ARC-AGI training set and 20% on the evaluation set, relying solely on compression performed at inference time. The core idea is that more efficient compression correlates with more accurate solutions. For each puzzle, CompressARC uses a neural network decoder and gradient descent to search for a compact representation of the puzzle, and the decoded representation yields the answer within a reasonable timeframe. This work challenges the conventional reliance on extensive pretraining and data, suggesting a future where tailored compressive objectives and efficient inference-time computation unlock deep intelligence from minimal input.
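The compression-as-inference idea described above can be sketched in miniature: use gradient descent to fit a compact latent code that reconstructs the *given* cells of a puzzle through a decoder, with a description-length penalty on the code, then read the decoder's output for the held-out cells. This is a hypothetical toy illustration, not CompressARC's actual architecture; the linear decoder `W`, the L2 penalty as a code-length proxy, and all sizes and hyperparameters here are assumptions for demonstration.

```python
import numpy as np

# Toy sketch of compression-based inference (NOT the CompressARC model):
# jointly minimize reconstruction error on observed cells plus an L2
# "description length" penalty on a latent code z, via gradient descent.

rng = np.random.default_rng(0)
n_latent, n_cells = 4, 6
W = rng.normal(size=(n_cells, n_latent))      # fixed stand-in "decoder"
target = rng.normal(size=n_cells)             # full puzzle grid (flattened)
mask = np.array([1, 1, 1, 1, 0, 0], float)    # 1 = given cell, 0 = to predict

z = np.zeros(n_latent)                        # compact representation to fit
lam, lr = 0.01, 0.01                          # code-length weight, step size
for _ in range(5000):
    resid = mask * (W @ z - target)           # error on given cells only
    grad = W.T @ resid + lam * z              # gradient of loss + L2 penalty
    z -= lr * grad

prediction = W @ z                            # decoder output fills all cells
```

The key property mirrored here is that nothing is learned in advance: all optimization happens at inference time, on the single puzzle at hand, and the answer falls out of the compressed representation.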