
Nicolas Korjahn developed and integrated the Shampoo Optimizer for neural network training in the apache/systemds repository, covering both full-matrix and diagonal preconditioning. His work includes momentum updates and a heuristic variant that delays preconditioner updates and reduces the frequency of inverse-root recomputation, with the goal of improving convergence and training efficiency. Using DML and Python, Nicolas extended the existing neural network training scripts and added tests to validate correctness and convergence. The feature was committed as an initial prototype alongside staging experiments, reflecting a thorough approach to optimization algorithms and machine learning workflows in SystemDS.
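To make the full-matrix preconditioning concrete, here is a minimal sketch of one Shampoo update step in Python/NumPy. This is an illustration of the general algorithm (Gupta et al.'s Shampoo), not the SystemDS DML implementation; the function and variable names are hypothetical. For a 2-D parameter `W` with gradient `G`, Shampoo accumulates left and right statistics `L` and `R` and preconditions `G` with their inverse 4th roots:

```python
import numpy as np

def shampoo_step(W, G, L, R, lr=0.1, eps=1e-6):
    """One full-matrix Shampoo update for a 2-D parameter W.

    Accumulates left/right gradient statistics
        L += G @ G.T,   R += G.T @ G
    then preconditions G with their inverse 4th roots:
        W -= lr * L^{-1/4} @ G @ R^{-1/4}

    Illustrative sketch only; not the SystemDS code.
    """
    L += G @ G.T
    R += G.T @ G

    def inv_root(M, p=4):
        # Inverse p-th root of a symmetric PSD matrix via eigendecomposition,
        # with eigenvalues clamped at eps for numerical stability.
        vals, vecs = np.linalg.eigh(M)
        vals = np.maximum(vals, eps)
        return (vecs * vals ** (-1.0 / p)) @ vecs.T

    W -= lr * inv_root(L) @ G @ inv_root(R)
    return W, L, R
```

The diagonal variant mentioned above replaces `L` and `R` with their diagonals, trading preconditioning quality for much cheaper updates on large layers.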
February 2026 monthly summary for apache/systemds: Delivered the Shampoo Optimizer for Neural Network Training, implementing both full-matrix and diagonal preconditioning, momentum updates, and a heuristic variant with delayed preconditioner updates and infrequent root recomputation. The work supports neural network training within SystemDS and lays groundwork for improved convergence and training efficiency. Key tests and experiments were added and wired into the existing NN training workflow.
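The heuristic variant with delayed preconditioner updates can be sketched as follows: statistics are accumulated every step, but the inverse roots are only recomputed every `update_freq` steps, since the root computation dominates the cost. This sketch uses the cheaper diagonal preconditioner; the class name, interval, and hyperparameters are illustrative assumptions, not taken from the SystemDS implementation.

```python
import numpy as np

class DelayedShampoo:
    """Diagonal Shampoo with infrequent inverse-root recomputation.

    Gradient statistics are accumulated on every step, but the
    preconditioners are refreshed only every `update_freq` steps.
    Illustrative sketch; names and defaults are assumptions.
    """
    def __init__(self, shape, lr=0.1, update_freq=20, eps=1e-6):
        self.lr, self.update_freq, self.eps = lr, update_freq, eps
        self.l_diag = np.zeros(shape[0])   # diag of G @ G.T
        self.r_diag = np.zeros(shape[1])   # diag of G.T @ G
        self.l_pre = np.ones(shape[0])     # cached inverse 4th roots
        self.r_pre = np.ones(shape[1])
        self.t = 0

    def step(self, W, G):
        self.t += 1
        # Accumulate diagonal statistics every step (cheap).
        self.l_diag += np.sum(G * G, axis=1)
        self.r_diag += np.sum(G * G, axis=0)
        # Recompute the roots only once per update_freq steps (expensive part).
        if self.t % self.update_freq == 1:
            self.l_pre = (self.l_diag + self.eps) ** -0.25
            self.r_pre = (self.r_diag + self.eps) ** -0.25
        return W - self.lr * (self.l_pre[:, None] * G * self.r_pre[None, :])
```

Between refreshes the update falls back to the stale preconditioner, which is the trade-off the summary describes: slightly less accurate curvature information in exchange for far fewer root computations.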
