Temporal gates play a significant role in modern recurrent neural encoders, enabling fine-grained control over recursive compositional operations over time. In recurrent models such as the long short-term memory (LSTM), temporal gates control the amount of information retained or discarded over time, not only influencing the learned representations but also protecting against vanishing gradients. This paper explores the idea of learning temporal gates for sequence pairs (question and answer) that jointly influence the learned representations in a pairwise manner. In our approach, temporal gates are learned via 1D convolutional layers and then cross-applied between question and answer for joint learning. Empirically, we show that this conceptually simple sharing of temporal gates leads to competitive performance across multiple benchmarks. Intuitively, our network can be interpreted as learning representations of question and answer pairs that are aware of what the other is remembering or forgetting, i.e., pairwise temporal gating. Via extensive experiments, we show that our proposed model achieves state-of-the-art performance on two community-based QA datasets and competitive performance on one factoid-based QA dataset.
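To make the cross-applied gating idea concrete, the following is a minimal PyTorch sketch of pairwise temporal gating: each sequence's gate is produced by a 1D convolution plus a sigmoid, and each sequence is then modulated by the *other* sequence's gate. All names here (CrossTemporalGate, hidden_dim, kernel_size) and the equal-padded-length assumption are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of pairwise (cross-applied) temporal gating.
# Assumes question and answer are padded/truncated to the same length
# so that elementwise cross-gating is well defined.
import torch
import torch.nn as nn

class CrossTemporalGate(nn.Module):
    """Learns a temporal gate for each sequence with a 1D convolution,
    then applies each sequence's gate to the other sequence."""
    def __init__(self, hidden_dim: int, kernel_size: int = 3):
        super().__init__()
        # One gating convolution per side; padding preserves sequence length.
        self.gate_q = nn.Conv1d(hidden_dim, hidden_dim, kernel_size,
                                padding=kernel_size // 2)
        self.gate_a = nn.Conv1d(hidden_dim, hidden_dim, kernel_size,
                                padding=kernel_size // 2)

    def forward(self, q: torch.Tensor, a: torch.Tensor):
        # q: (batch, seq_len, hidden_dim); a: (batch, seq_len, hidden_dim)
        # Conv1d expects (batch, channels, length), hence the transposes.
        gq = torch.sigmoid(self.gate_q(q.transpose(1, 2))).transpose(1, 2)
        ga = torch.sigmoid(self.gate_a(a.transpose(1, 2))).transpose(1, 2)
        # Cross application: each side is gated by the other's gate, so the
        # question representation reflects what the answer retains/discards,
        # and vice versa.
        return q * ga, a * gq

# Usage: gate a padded question/answer pair of equal length.
gate = CrossTemporalGate(hidden_dim=64)
q = torch.randn(8, 20, 64)  # batch of 8 questions, 20 tokens each
a = torch.randn(8, 20, 64)  # batch of 8 answers, same padded length
q_out, a_out = gate(q, a)
```

In this sketch, swapping the gates (rather than applying each gate to its own sequence) is what makes the gating pairwise: each representation becomes conditioned on the other sequence's retention pattern.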