Artificial intelligence comes with the promise of helping coders code faster, helping drivers drive safer, and making daily activities less time consuming. But if AI is adopted widely, it could have a large energy footprint in the future, one that may exceed the power demands of entire countries.

Since 2022, generative AI, which can generate text, images, and other data, has undergone rapid growth, including OpenAI's ChatGPT. Training these models requires feeding them enormous amounts of data, which is itself very energy intensive. Hugging Face, an AI company based out of New York, reported that training its multilingual text-generating model (BLOOM) consumed around 433 megawatt-hours of energy, enough to power 40 American homes for a year.

AI's energy footprint doesn't end with training. When the tool is put to work, generating output based on prompts, every image or piece of text it produces consumes a significant amount of processing power, and hence energy. For example, ChatGPT may consume around 546 megawatt-hours of energy on a daily basis.

By 2027, AI-related electricity consumption could grow to somewhere between 85 and 134 terawatt-hours annually, a projection based on AI server production. You won't believe this, but that amount is comparable to the annual electricity consumption of countries like the Netherlands, Argentina, and Sweden.

We need to be very mindful about what we use AI for. It's very energy intensive, so we shouldn't put it to work on things we don't actually need.
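As a rough sanity check on the figures above, here is a quick back-of-the-envelope calculation. The average-home electricity figure is an assumption not taken from this piece (roughly the US EIA's estimate of annual household consumption):

```python
# Back-of-the-envelope check of the energy figures discussed above.
# Assumption (not from the source text): an average US home uses
# roughly 10,600 kWh of electricity per year (approx. EIA estimate).

TRAINING_MWH = 433            # reported training energy for the model
KWH_PER_US_HOME_YEAR = 10_600 # assumed average household consumption

# Convert MWh to kWh, then divide by per-home annual usage.
homes_powered = (TRAINING_MWH * 1_000) / KWH_PER_US_HOME_YEAR
print(f"433 MWh is roughly {homes_powered:.0f} US homes for a year")

# Scale the estimated daily inference cost to a year, in terawatt-hours,
# to compare against the 85-134 TWh sector-wide projection.
DAILY_MWH = 546
annual_twh = DAILY_MWH * 365 / 1_000_000
print(f"546 MWh/day is roughly {annual_twh:.2f} TWh per year")
```

The first result lands near the "40 American homes" figure, and the second shows that a single service's inference load, while large, is still small next to the 85 to 134 terawatt-hour projection for the sector as a whole.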