We have developed a new family of multiplicative algorithms for non-negative matrix factorization (NMF). These algorithms are based on a new family of generalized divergences called the alpha-beta divergences (AB-divergences). These divergences are parameterized by two tuning parameters, alpha and beta, and allow us to smoothly transition between the fundamental alpha-, beta-, and gamma-divergences. This allows us to incorporate a wide variety of existing algorithms as special cases, such as Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), alpha-NMF, and beta-NMF. Additionally, we have shown that the proposed family of AB multiplicative NMF algorithms improves robustness with respect to noise and outliers compared to existing algorithms. This article was authored by Andrzej Cichocki, Sergio Cruces, and Shun-ichi Amari.
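To make the idea concrete, here is a minimal NumPy sketch (not the authors' reference code) of an AB-divergence and one pair of multiplicative updates for a factorization Y ≈ A X. The function names `ab_divergence` and `ab_nmf_step` are our own, and the update rule shown is one common generalized form in which alpha = beta = 1 reduces to the Lee-Seung Euclidean update and alpha = 1, beta = 0 reduces to the EMML/KL update; the singular cases alpha = 0, beta = 0, or alpha + beta = 0 (defined by continuity) are omitted for brevity.

```python
import numpy as np

def ab_divergence(P, Q, alpha, beta, eps=1e-12):
    """AB-divergence D_AB(P || Q) for alpha, beta, alpha + beta all nonzero.

    At alpha = beta = 1 this equals half the squared Frobenius distance.
    The limit cases are defined by continuity and omitted in this sketch.
    """
    P = np.maximum(P, eps)
    Q = np.maximum(Q, eps)
    s = alpha + beta
    return -np.sum(P**alpha * Q**beta
                   - (alpha / s) * P**s
                   - (beta / s) * Q**s) / (alpha * beta)

def ab_nmf_step(Y, A, X, alpha, beta, eps=1e-12):
    """One pair of multiplicative AB-NMF updates for Y ~ A @ X (alpha != 0).

    Each factor is scaled by a ratio of nonnegative terms raised to 1/alpha,
    so nonnegativity is preserved automatically.
    """
    Yp = np.maximum(Y, eps)
    # Update X with A fixed.
    Yh = np.maximum(A @ X, eps)
    num = A.T @ (Yp**alpha * Yh**(beta - 1))
    den = np.maximum(A.T @ Yh**(alpha + beta - 1), eps)
    X = X * (num / den)**(1.0 / alpha)
    # Update A with the new X fixed.
    Yh = np.maximum(A @ X, eps)
    num = (Yp**alpha * Yh**(beta - 1)) @ X.T
    den = np.maximum(Yh**(alpha + beta - 1) @ X.T, eps)
    A = A * (num / den)**(1.0 / alpha)
    return A, X

rng = np.random.default_rng(0)
Y = rng.random((8, 6))
A = rng.random((8, 3))
X = rng.random((3, 6))
d0 = ab_divergence(Y, A @ X, 0.5, 0.5)
for _ in range(50):
    A, X = ab_nmf_step(Y, A, X, alpha=0.5, beta=0.5)
d1 = ab_divergence(Y, A @ X, 0.5, 0.5)
```

Tuning alpha and beta is what buys the robustness mentioned above: different settings down-weight the influence of large residuals (outliers) or small counts (noise) in the reconstruction error.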