We have developed a new family of multiplicative algorithms for non-negative matrix factorization (NMF). These algorithms are based on a new family of generalized divergences called the alpha-beta divergences (AB-divergences). These divergences are parameterized by two tuning parameters, alpha and beta, and they smoothly connect the fundamental alpha-, beta-, and gamma-divergences. This allows us to incorporate a wide variety of existing divergences and algorithms into our framework, such as the Lee-Seung algorithms, ISRA (image space reconstruction algorithm), EMML (expectation maximization maximum likelihood), alpha-NMF, and beta-NMF. Additionally, we have shown that the proposed family of AB multiplicative NMF algorithms improves robustness with respect to noise and outliers compared to existing algorithms. This article was authored by Andrzej Cichocki, Sergio Cruces, and Shun-ichi Amari.
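To make the idea of multiplicative NMF updates concrete, here is a minimal sketch of the classical Lee-Seung multiplicative updates for Frobenius-norm NMF, which is just one special case recovered inside the AB-divergence family. The function name, iteration count, and the small epsilon used to avoid division by zero are illustrative choices, not part of the original article.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - W H||_F^2.

    A sketch of one special case within the AB-divergence family;
    the full AB algorithms generalize these update rules via the
    tuning parameters alpha and beta.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    # Non-negative random initialization of the two factors.
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # Multiplicative updates: factors stay non-negative because
        # every term in the ratio is non-negative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because each update multiplies the current factor by a non-negative ratio, non-negativity is preserved automatically, with no projection step needed; the AB multiplicative algorithms share this structure while replacing the Frobenius cost with the two-parameter AB-divergence.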