I'm not sure how alarmed to be about artificial intelligence. Personally, I think it's really hard to predict when we'll create a machine that essentially has consciousness. That's because we don't know what consciousness is, how it works, what's required to create it, etc. So it might be technologically around the corner or a hundred years away.
What I do think is more predictable is the development of autonomous weapons that use AI to become the most effective killing machines of all time. That is scary. As outlined by people like Musk and Hawking, this threat is clear and present, so we should address it. I would like to see us agree as a species not to develop these sorts of weapons, because if any one state does develop them, they would be very hard to stop.