The random projections typically adopted in compressive sensing do not exploit a priori knowledge of the sensing task or of the expected signal structure beyond the fundamental assumption of sparsity. In this paper, we take a task-specific, information-based approach to optimizing the compressive sensing kernels for time delay estimation of radar targets. The measurements are modeled by a Gaussian mixture obtained by approximately discretizing the a priori distribution of the time delay. The sensing kernel that maximizes the Shannon mutual information between the measurements and the time delay is then approximated via a gradient-based method. In addition, we derive the Bayesian Cramér-Rao bound (CRB) on the time delay estimate as a function of the compressive sensing measurement kernels. Simulation results demonstrate that the proposed sensing kernel outperforms random projections and that its performance is consistent with the Bayesian CRB across signal-to-noise ratios. We conclude that compressive sensing has potential utility in providing measurements with improved resolution for radar target parameter estimation problems.
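The approach described above can be illustrated with a minimal numerical sketch: delays on a discrete grid induce a Gaussian mixture over the compressive measurements, and the sensing kernel is adjusted to increase a Monte Carlo estimate of the mutual information between measurements and delay. All sizes, the Gaussian pulse, the uniform delay prior, and the finite-difference ascent below are illustrative assumptions standing in for the paper's actual signal model and gradient computation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 32, 6, 8            # signal length, # compressive measurements, delay grid size
sigma2 = 0.1                  # measurement noise variance (assumed value)

pulse = np.exp(-0.5 * ((np.arange(N) - 4.0) / 1.5) ** 2)   # hypothetical transmitted pulse
S = np.stack([np.roll(pulse, 2 * k) for k in range(K)])    # K delayed replicas, shape (K, N)

# Fixed Monte Carlo draws (common random numbers make the objective deterministic,
# so finite differences are well behaved)
n_mc = 300
labels = rng.integers(0, K, size=n_mc)        # delays drawn from the uniform prior
noise = rng.standard_normal((n_mc, M))

def mutual_information(A):
    """MC estimate of I(y; tau) for y = A s_tau + n with tau uniform on the grid.

    H(y|tau) is the Gaussian noise entropy; H(y) is estimated as the average
    negative log-likelihood under the K-component Gaussian mixture."""
    A = A / np.linalg.norm(A)                 # unit Frobenius norm (power constraint)
    means = S @ A.T                           # (K, M) mixture component means
    y = means[labels] + np.sqrt(sigma2) * noise
    d2 = ((y[:, None, :] - means[None, :, :]) ** 2).sum(-1)      # (n_mc, K)
    logp = -0.5 * d2 / sigma2 - 0.5 * M * np.log(2 * np.pi * sigma2)
    m = logp.max(axis=1, keepdims=True)       # stable log-sum-exp over components
    log_py = (m + np.log(np.exp(logp - m).mean(axis=1, keepdims=True))).ravel()
    H_y = -log_py.mean()
    H_y_given_tau = 0.5 * M * np.log(2 * np.pi * np.e * sigma2)
    return H_y - H_y_given_tau

# Greedy finite-difference ascent on the MI estimate -- a crude stand-in for
# the paper's gradient-based optimization of the sensing kernel.
A = rng.standard_normal((M, N))
mi_init = mutual_information(A)
eps, step = 1e-5, 0.1
for _ in range(15):
    base = mutual_information(A)
    g = np.zeros_like(A)
    for i in range(M):
        for j in range(N):
            Ap = A.copy()
            Ap[i, j] += eps
            g[i, j] = (mutual_information(Ap) - base) / eps
    cand = A + step * g / (np.linalg.norm(g) + 1e-12)
    if mutual_information(cand) > base:       # accept only improving steps
        A = cand
mi_opt = mutual_information(A)
print(f"MI random: {mi_init:.3f}, MI optimized: {mi_opt:.3f}, log K = {np.log(K):.3f}")
```

Mutual information here is upper-bounded by the delay-grid entropy log K, so the printed values give a direct sense of how close the optimized kernel comes to resolving the discretized delay; a real implementation would use an analytic or automatic gradient rather than finite differences.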