In this paper, we explore several families of asymmetric divergences, including the alpha-, beta-, and gamma-divergences. These families are related through nonlinear transformations, which allows new divergences to be generated from existing ones. We demonstrate how these divergences measure the similarity between two probability distributions, and we show that they admit information-theoretic interpretations, providing insight into their applications. This article was authored by Andrzej Cichocki and Shun-ichi Amari.
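For concreteness, the sketch below evaluates one common parameterization of each family on discrete distributions. Parameter conventions for these divergences vary across the literature, so these forms should be read as illustrative choices rather than the paper's own definitions; the function names `alpha_div`, `beta_div`, and `gamma_div` are ours.

```python
import numpy as np

def alpha_div(p, q, a):
    """Alpha-divergence (one common form).

    Valid for a not in {0, 1}; the generalized KL divergences
    KL(p||q) and KL(q||p) arise as the limits a -> 1 and a -> 0.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p**a * q**(1 - a) - a * p + (a - 1) * q) / (a * (a - 1))

def beta_div(p, q, b):
    """Beta-divergence (one common form), valid for b not in {0, 1}.

    b = 2 gives half the squared Euclidean distance; the limits
    b -> 1 and b -> 0 recover generalized KL and Itakura-Saito.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p**b + (b - 1) * q**b - b * p * q**(b - 1)) / (b * (b - 1))

def gamma_div(p, q, g):
    """Gamma-divergence (Fujisawa-Eguchi style), valid for g != 0.

    The log-of-sums structure makes it invariant to rescaling of
    either argument, hence robust to unnormalized inputs.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    return (np.log(np.sum(p**(1 + g))) / (g * (1 + g))
            + np.log(np.sum(q**(1 + g))) / (1 + g)
            - np.log(np.sum(p * q**g)) / g)

# Example: two discrete distributions on the same support.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])
print(alpha_div(p, q, 0.5), beta_div(p, q, 1.5), gamma_div(p, q, 0.5))
```

Each function returns zero when `p == q` and a positive value otherwise, so all three behave as (asymmetric) similarity measures between distributions, differing in how they weight small versus large discrepancies.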