Our work proposes a novel Siamese-based spatial-temporal attention neural network for remote sensing image change detection (CD). The proposed approach integrates a CD self-attention module into the feature extraction procedure, which calculates attention weights between any two pixels at different times and positions and uses them to generate more discriminative features. This enables us to capture spatial-temporal dependencies at various scales, thereby producing better representations that accommodate objects of various sizes. Furthermore, we introduce a large-scale remote sensing image CD dataset, LEVIR-CD, consisting of 637 image pairs of 1024 × 1024 pixels and over 31K independently labeled change instances. Experimental results demonstrate that our method outperforms several other state-of-the-art methods. This article was authored by Hao Chen and Zhenwei Shi.
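The core idea of the CD self-attention module — computing attention weights between every pair of pixels across both acquisition times and using them to reweight features — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the projection matrices `wq`, `wk`, `wv` and the function name are illustrative assumptions, and the bitemporal feature maps are stacked along the pixel axis so each pixel attends to all pixels at both times and positions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bitemporal_self_attention(feat_t1, feat_t2, wq, wk, wv):
    """Sketch of spatial-temporal self-attention for change detection.

    feat_t1, feat_t2: (H, W, C) feature maps from the two Siamese branches.
    wq, wk, wv: (C, D) query/key/value projections (illustrative, not the
    paper's exact parameterization).
    Returns a (2*H*W, D) array: updated features where every pixel attends
    to every pixel at both times and positions.
    """
    h, w, c = feat_t1.shape
    # Stack both epochs into one token sequence of length 2*H*W.
    x = np.concatenate([feat_t1.reshape(-1, c), feat_t2.reshape(-1, c)], axis=0)
    q, k, v = x @ wq, x @ wk, x @ wv
    # (2HW, 2HW) attention weights between all pixel pairs, scaled dot-product.
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)
    return attn @ v

rng = np.random.default_rng(0)
h, w, c, d = 4, 4, 8, 8
out = bitemporal_self_attention(
    rng.normal(size=(h, w, c)), rng.normal(size=(h, w, c)),
    rng.normal(size=(c, d)), rng.normal(size=(c, d)), rng.normal(size=(c, d)),
)
print(out.shape)  # (32, 8): one updated feature vector per pixel per epoch
```

Because the attention matrix spans both epochs, a pixel's updated feature can draw on its counterpart at the other time, which is what lets the module emphasize genuinely changed regions; applying it at several feature scales is what accommodates objects of different sizes.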