We present Cyclic-Permutation Invariant Neural Networks, a novel class of neural networks (NNs) designed to be invariant to phase shifts of period-folded periodic sequences by means of ‘symmetry padding’. In the context of periodic variable star light curves, initial phases are exogenous to the physical origin of the variability and should therefore be immaterial to downstream inference. Although previous work utilizing NNs has commonly operated on period-folded light curves, no approach to date has taken advantage of this symmetry. Across three different data sets of variable star light curves, we show that two implementations of Cyclic-Permutation Invariant Networks (iTCN and iResNet) consistently outperform state-of-the-art non-invariant baselines and reduce overall error rates by 4 to 22 per cent. On a 10-class OGLE-III sample, the iTCN/iResNet achieves an average per-class accuracy of 93.4 per cent/93.3 per cent, compared to recurrent NN/random forest accuracies of 70.5 per cent/89.5 per cent in a recent study using the same data. Having also found improvement on a non-astronomy benchmark, we suggest that the methodology introduced here should be applicable to a wide range of science domains in which periodic data abound.
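The abstract does not spell out how ‘symmetry padding’ is implemented; as a minimal illustrative sketch only (not the authors' code), one way to obtain the stated invariance is to pad a 1-D convolution circularly, so that a cyclic shift of the period-folded input produces the same cyclic shift of the feature map, and then remove the shift with global pooling. The class name `CyclicConv1d` and the use of PyTorch are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CyclicConv1d(nn.Module):
    """1-D convolution with circular padding (a sketch of 'symmetry padding').

    Circular padding makes the convolution equivariant to cyclic shifts of the
    input; a global pooling layer downstream then yields shift invariance.
    """

    def __init__(self, in_channels, out_channels, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # total padding to keep length fixed
        self.conv = nn.Conv1d(in_channels, out_channels, kernel_size,
                              dilation=dilation)

    def forward(self, x):  # x: (batch, channels, length)
        # Wrap the folded light curve around instead of zero-padding the edges.
        left = self.pad // 2
        right = self.pad - left
        x = F.pad(x, (left, right), mode="circular")
        return self.conv(x)


# Illustration of the invariance: a phase shift of the folded input leaves the
# globally pooled features unchanged (up to floating-point error).
x = torch.randn(1, 1, 200)                      # toy period-folded sequence
layer = CyclicConv1d(1, 8, kernel_size=5)
y1 = layer(x).mean(dim=-1)                      # global average pool
y2 = layer(torch.roll(x, 37, dims=-1)).mean(dim=-1)
assert torch.allclose(y1, y2, atol=1e-5)
```

Stacking such layers (e.g. with increasing dilation, as in a TCN, or with residual connections, as in a ResNet) preserves the equivariance layer by layer, which is the structural idea behind the iTCN and iResNet variants named above; the specific architectures are described in the body of the paper.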