In this paper, we explore three families of divergences, the alpha-, beta-, and gamma-divergences, and examine their similarities and differences. We show how each family can be derived from the others, and how they relate to other entropy measures such as the Tsallis and Rényi entropies. These divergences have applications in many fields, including machine learning, data mining, and information theory. This article was authored by Andrzej Cichocki and Shun-ichi Amari.
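To make the three families concrete, here is a minimal sketch of discrete-distribution versions of the divergences, assuming NumPy is available. The formulas follow common textbook parametrizations (Amari's alpha-divergence, the beta-divergence used in NMF, and the Fujisawa-Eguchi form of the gamma-divergence); the paper's own parametrization may differ by a shift or scaling of the parameter.

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Alpha-divergence D_alpha(p || q) for probability vectors, alpha not in {0, 1}.

    Limits alpha -> 1 and alpha -> 0 recover KL(p || q) and KL(q || p)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return (1.0 - np.sum(p**alpha * q**(1.0 - alpha))) / (alpha * (1.0 - alpha))

def beta_divergence(p, q, beta):
    """Beta-divergence D_beta(p || q) for nonnegative vectors, beta not in {0, 1}.

    beta = 2 gives half the squared Euclidean distance."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p**beta + (beta - 1.0) * q**beta
                  - beta * p * q**(beta - 1.0)) / (beta * (beta - 1.0))

def gamma_divergence(p, q, gamma):
    """Gamma-divergence (Fujisawa-Eguchi form) for gamma > 0.

    Scale-invariant in q, which underlies its robustness to outliers."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return (np.log(np.sum(p**(1.0 + gamma))) / (gamma * (1.0 + gamma))
            + np.log(np.sum(q**(1.0 + gamma))) / (1.0 + gamma)
            - np.log(np.sum(p * q**gamma)) / gamma)
```

All three vanish when p equals q and are nonnegative (for the gamma-divergence this follows from Hölder's inequality), which is the defining behavior of a divergence.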