The idea that death, or mortality, is what gives life meaning is not particularly new; it has appeared in all sorts of variations in movies, literature, philosophy, you name it. The claim is that no matter how much we might fear it, death is necessary to give life meaning, and even to make it enjoyable. The trope is so old and well established that it has become common wisdom. It sounds deep and wise. Fine, but is it true? Does death give life meaning?