Is Generator Conditioning Causally Related to GAN Performance?
abs:
Recent work (Pennington et al., 2017) suggests that controlling the entire distribution of Jacobian singular values is an important design consideration in deep learning. Motivated by this, we study the distribution of singular values of the Jacobian of the generator in Generative Adversarial Networks (GANs). We find that this Jacobian generally becomes ill-conditioned at the beginning of training. Moreover, we find that the average (with z ∼ p(z)) conditioning of the generator is highly predictive of two other ad-hoc metrics for measuring the "quality" of trained GANs: the Inception Score and the Fréchet Inception Distance (FID). We test the hypothesis that this relationship is causal by proposing a "regularization" technique (called Jacobian Clamping) that softly penalizes the condition number of the generator Jacobian. Jacobian Clamping improves the mean Inception Score and the mean FID for GANs trained on several datasets. It also greatly reduces inter-run variance of the aforementioned scores, addressing (at least partially) one of the main criticisms of GANs.
https://www.arxiv-vanity.com/papers/1802.08768/
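The abstract does not spell out how the Jacobian Clamping penalty is computed. Below is a minimal sketch of one plausible implementation in PyTorch, assuming the penalty is approximated by a finite-difference probe on perturbed latent samples: the ratio Q = ||G(z') - G(z)|| / ||z' - z|| is softly clamped into a band [lambda_min, lambda_max]. The function name, the band endpoints, the perturbation size eps, and the assumed latent shape are all illustrative assumptions, not the paper's exact algorithm.

```python
import torch

def jacobian_clamping_penalty(G, z, eps=1.0, lambda_min=1.0, lambda_max=20.0):
    """Soft penalty on the generator's input-output sensitivity (sketch).

    Assumption: z has shape (batch, latent_dim). Q approximates
    ||G(z') - G(z)|| / ||z' - z|| via a random finite-difference probe;
    bounding Q for random directions softly bounds the spread of the
    Jacobian's singular values, and hence its condition number.
    """
    delta = torch.randn_like(z)
    delta = eps * delta / delta.norm(dim=1, keepdim=True)  # perturbation of norm eps
    diff = (G(z + delta) - G(z)).reshape(z.size(0), -1).norm(dim=1)
    q = diff / eps  # per-example sensitivity estimate
    # Quadratic cost only when Q leaves [lambda_min, lambda_max]; zero inside,
    # which is what makes the clamp "soft".
    above = (torch.clamp(q, min=lambda_max) - lambda_max) ** 2
    below = (torch.clamp(q, max=lambda_min) - lambda_min) ** 2
    return (above + below).mean()
```

In a training loop, the penalty would simply be added to the usual generator objective, e.g. loss_G = gan_loss + penalty_weight * jacobian_clamping_penalty(G, z), with penalty_weight an assumed hyperparameter.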