Normalizing Flows are deep generative models that enable tractable exact inference via an invertible mapping between a simple prior and a complex target distribution. Coupling Flows inject the expressive power of neural networks into this framework: they condition the transformation of one subset of features on the remaining features. To date, the choice of conditioning structure has been limited to simple heuristics, such as splitting features into indices of odd and even parity. This works well on image datasets, which exhibit strong feature locality, but on data with non-local features such parity structures lead to worse training results. We propose a novel Sequential Monte Carlo structure search that finds conditioning structures which improve convergence and yield smaller Coupling Flow models on general datasets. We show that our method outperforms parity structures and purely random structures on non-image datasets and produces shallower models with fewer parameters.
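To make the coupling mechanism concrete, the following is a minimal sketch of an affine coupling layer with an even/odd parity split, the heuristic the abstract refers to. The linear conditioner maps (`W_s`, `W_t`) are hypothetical stand-ins for the neural networks that would predict scale and shift in a real Coupling Flow; everything here is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy conditioner: fixed random linear maps standing in
# for the neural networks that predict scale (s) and shift (t).
D = 6
W_s = rng.normal(scale=0.1, size=(D // 2, D // 2))
W_t = rng.normal(scale=0.1, size=(D // 2, D // 2))

even = np.arange(0, D, 2)  # conditioning features (passed through unchanged)
odd = np.arange(1, D, 2)   # transformed features

def coupling_forward(x):
    """Affine coupling: transform odd-index features conditioned on even ones."""
    x_a, x_b = x[even], x[odd]
    s, t = W_s @ x_a, W_t @ x_a
    y = x.copy()
    y[odd] = x_b * np.exp(s) + t   # elementwise invertible affine transform
    return y

def coupling_inverse(y):
    """Exact inverse: the conditioning half is unchanged, so s and t recompute."""
    y_a, y_b = y[even], y[odd]
    s, t = W_s @ y_a, W_t @ y_a
    x = y.copy()
    x[odd] = (y_b - t) * np.exp(-s)
    return x

x = rng.normal(size=D)
assert np.allclose(coupling_inverse(coupling_forward(x)), x)
```

Because only half of the features are transformed per layer, the inverse is exact and cheap; which features land in the conditioning half is precisely the structural choice the proposed Sequential Monte Carlo search optimizes instead of fixing it by parity.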