GGML Training Advancement: An MNIST VAE Training Example
2024-12-22
GitHub user bssrdf shared an example of training an MNIST VAE (variational autoencoder) with the GGML library. The example aims to use only GGML's own pipeline and its ADAM optimizer implementation, filling a gap in the available GGML training examples. To make this work, the ADAM and LBFGS optimizers were modified for GPU backend compatibility, and several missing operators and optimizer hooks were added for testing and sampling. The author reported satisfactory results after 10 epochs of training.
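For readers unfamiliar with GGML's built-in training path, the sketch below shows the general shape of such a pipeline: declare trainable tensors, build a loss graph, and hand it to the ADAM optimizer. This is a minimal, hypothetical illustration assuming the older ggml_opt API from ggml.h (ggml_opt_default_params, ggml_set_param, ggml_opt); it is not the VAE example's actual code, and names such as GGML_OPT_TYPE_ADAM may differ between GGML versions.

```c
// Minimal sketch: minimize f(x) = x^2 with GGML's built-in ADAM optimizer.
// Assumes the older ggml_opt API in ggml.h; not the MNIST VAE example itself.
#include <stdio.h>
#include "ggml.h"

int main(void) {
    // small CPU context that owns all tensors and graph nodes
    struct ggml_init_params ip = {
        .mem_size   = 16 * 1024 * 1024,
        .mem_buffer = NULL,
        .no_alloc   = false,
    };
    struct ggml_context * ctx = ggml_init(ip);

    // one trainable parameter x, initialized to 3.0
    struct ggml_tensor * x = ggml_new_tensor_1d(ctx, GGML_TYPE_F32, 1);
    ggml_set_param(ctx, x);
    ggml_set_f32(x, 3.0f);

    // scalar loss f = x * x, minimized at x = 0
    struct ggml_tensor * f = ggml_mul(ctx, x, x);

    // run ADAM with default hyperparameters
    // (enum may be spelled GGML_OPT_ADAM in older headers)
    struct ggml_opt_params opt_params = ggml_opt_default_params(GGML_OPT_TYPE_ADAM);
    enum ggml_opt_result res = ggml_opt(ctx, opt_params, f);

    printf("opt result = %d, x = %f\n", (int) res, ggml_get_f32_1d(x, 0));

    ggml_free(ctx);
    return 0;
}
```

The VAE example follows the same pattern at a larger scale: the encoder/decoder weights are the trainable tensors, the reconstruction-plus-KL loss is the scalar graph output, and the (modified) optimizer drives the updates on the chosen backend.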