This paper studies how to determine the number of factors k by cross-validation in high-dimensional predictive models. We consider a nonparametric setting in which the relationship between the target variable and the high-dimensional predictors is unspecified, and treat the number of factors as a tuning parameter for prediction: factors are estimated fold-wise from the predictors only, k is selected to minimize the validation loss in predicting the target, and factors are re-estimated on the full sample once k is chosen. We show that fold-wise cross-validation achieves near-oracle out-of-sample predictive performance under both strong and weak factor regimes. Extensions to weakly dependent data are derived using blocked cross-validation, providing valid performance guarantees for factor-augmented predictions with time series. Our empirical application shows that cross-validated factor selection yields smaller out-of-sample prediction errors than information-criterion-based choices, particularly when the factors are weak or their directions are misaligned with the target.
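The selection procedure described above, in which factors are estimated fold-wise from the predictors only, k is chosen to minimize validation prediction loss, and factors are re-estimated on the full sample at the selected k, can be sketched as follows. This is a minimal illustration, not the paper's exact specification: the PCA factor estimator, the linear predictor of the target, the squared-error loss, and all function and variable names are assumptions introduced here.

```python
# Hypothetical sketch of fold-wise cross-validation for the number of factors.
# Assumptions (not from the paper): factors are estimated by PCA, the target
# is predicted from the estimated factors by least squares, and validation
# loss is mean squared error.
import numpy as np

def pca_factors(X, k):
    """Estimate k factors as the top-k principal components of X."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]  # factors (n x k), loadings (k x p)

def cv_select_k(X, y, k_grid, n_folds=5, seed=0):
    """Choose k by fold-wise CV, then re-estimate factors on the full sample."""
    n = len(y)
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), n_folds)
    losses = {k: 0.0 for k in k_grid}
    for val_idx in folds:
        tr_idx = np.setdiff1d(np.arange(n), val_idx)
        for k in k_grid:
            # Factors are estimated fold-wise from the predictors only.
            F_tr, V = pca_factors(X[tr_idx], k)
            # Linear predictor of the target from the estimated factors.
            design = np.column_stack([np.ones(len(tr_idx)), F_tr])
            beta, *_ = np.linalg.lstsq(design, y[tr_idx], rcond=None)
            # Project validation predictors onto the training-fold loadings.
            F_val = (X[val_idx] - X[tr_idx].mean(axis=0)) @ V.T
            pred = beta[0] + F_val @ beta[1:]
            losses[k] += np.mean((y[val_idx] - pred) ** 2)
    k_hat = min(k_grid, key=lambda k: losses[k])
    # Once k is chosen, factors are re-estimated on the full sample.
    F_full, _ = pca_factors(X, k_hat)
    return k_hat, F_full
```

For time-series data, the random fold assignment above would be replaced by contiguous blocks, in the spirit of the blocked cross-validation extension mentioned in the abstract.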