We have developed a new family of multiplicative algorithms for non-negative matrix factorization (NMF). These algorithms are based on a new family of generalized divergences called the alpha-beta divergences (AB divergences). These divergences are parameterized by two tuning parameters, alpha and beta, and they allow us to smoothly interpolate among the fundamental alpha-, beta-, and gamma-divergences. This allows us to incorporate a wide variety of existing algorithms as special cases, such as the Lee-Seung algorithms, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF. Additionally, we have shown that the proposed family of AB multiplicative NMF algorithms improves robustness with respect to noise and outliers compared to existing algorithms. This article was authored by Andrzej Cichocki, Sergio Cruces, and Shun-ichi Amari.
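To make the idea of multiplicative NMF updates under a divergence family concrete, here is a minimal NumPy sketch of the classic beta-divergence multiplicative updates (beta = 2 recovers the Euclidean Lee-Seung rule, beta = 1 the KL/EMML rule, beta = 0 Itakura-Saito). This is not the authors' two-parameter AB update rule, only the familiar one-parameter beta special case; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def nmf_beta(V, rank, beta=1.0, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF minimizing the beta-divergence D_beta(V || WH).

    beta=2 -> Euclidean distance, beta=1 -> Kullback-Leibler, beta=0 -> Itakura-Saito.
    The AB-divergence algorithms discussed in the article generalize these
    rules with a second tuning parameter; this sketch covers only the beta family.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Random non-negative initialization of the factor matrices.
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Update H with W fixed; the ratio form keeps H non-negative.
        WH = W @ H + eps
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)
        # Update W with the new H fixed.
        WH = W @ H + eps
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H
```

Because each update multiplies the current factor by a ratio of non-negative quantities, non-negativity is preserved automatically at every iteration, which is the key appeal of this family of algorithms.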