We consider the problem of recovering two unknown vectors, w and x, of length L from their circular convolution. We make the structural assumption that the two vectors are members of known subspaces, one with dimension N and the other with dimension K. Although the observed convolution is nonlinear in both w and x, it is linear in the rank-1 matrix formed by their outer product wx*. This observation allows us to recast the deconvolution problem as a low-rank matrix recovery problem from linear measurements, whose natural convex relaxation is a nuclear norm minimization program. We prove the effectiveness of this relaxation by showing that, for "generic" signals, the program can deconvolve w and x exactly when the maximum of N and K is almost on the order of L. That is, we show that if x is drawn from a random subspace of dimension N, and w is a vector in a subspace of dimension K whose basis vectors are spread out in the frequency domain, then nuclear norm minimization recovers wx* without error. We discuss this result in the context of blind channel estimation in communications. If we have a message of length N, which we code using a random coding matrix, and the encoded message travels through an unknown linear time-invariant channel of maximum length K, then the receiver can recover both the channel response and the message when L ≳ K + N, to within constant and log factors.
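The key observation above — that the circular convolution is a linear function of the lifted rank-1 matrix — can be checked numerically. The sketch below assumes the subspace model w = Bh, x = Cm with known bases B and C (the names B, C, h, m and all dimensions are illustrative, not from the text), and verifies that each Fourier coefficient of the convolution is a linear measurement of the outer product X = h m^T:

```python
import numpy as np

rng = np.random.default_rng(0)
L, K, N = 64, 8, 12  # illustrative sizes with L > K + N

# Known subspace bases: w = B @ h lies in a K-dimensional subspace,
# x = C @ m in an N-dimensional one.
B = rng.standard_normal((L, K))
C = rng.standard_normal((L, N))
h = rng.standard_normal(K)
m = rng.standard_normal(N)

# Observed circular convolution y = w ⊛ x, computed via the DFT:
# the DFT diagonalizes circular convolution, so fft(w ⊛ x) = fft(w) * fft(x).
y = np.fft.ifft(np.fft.fft(B @ h) * np.fft.fft(C @ m))

# The same observation as a linear map of the lifted rank-1 matrix
# X = h m^T: in the Fourier domain, y_hat[l] = b_l^T X c_l, where
# b_l and c_l are the l-th rows of the DFT-transformed bases.
Bf = np.fft.fft(B, axis=0)
Cf = np.fft.fft(C, axis=0)
X = np.outer(h, m)
y_lifted = np.fft.ifft(np.einsum('lk,kn,ln->l', Bf, X, Cf))

print(np.allclose(y, y_lifted))  # → True: measurements are linear in X
```

Although y is nonlinear (bilinear) in the pair (h, m), it is linear in X, which is what allows the relaxation to a nuclear norm minimization over matrices.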