This repository has been archived by the owner on Jan 2, 2021. It is now read-only.

Commit
Corrected value for adversarial loss. Don't refactor math the day after stopping coffee.
alexjc committed Oct 31, 2016
1 parent 0c9937a commit 02d2fca
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion README.rst
@@ -64,7 +64,7 @@ Pre-trained models are provided in the GitHub releases. Training your own is a
# Train the model using an adversarial setup based on [4] below.
python3.4 enhance.py --train "data/*.jpg" --model custom --scales=2 --epochs=250 \
-    --perceptual-layer=conv5_2 --smoothness-weight=2e4 --adversary-weight=2e5 \
+    --perceptual-layer=conv5_2 --smoothness-weight=2e4 --adversary-weight=1e3 \
--generator-start=5 --discriminator-start=0 --adversarial-start=5 \
--discriminator-size=64
2 changes: 1 addition & 1 deletion enhance.py
@@ -374,7 +374,7 @@ def loss_total_variation(self, x):
return T.mean(((x[:,:,:-1,:-1] - x[:,:,1:,:-1])**2 + (x[:,:,:-1,:-1] - x[:,:,:-1,1:])**2)**1.25)

def loss_adversarial(self, d):
-        return T.mean(1.0 - T.nnet.softplus(d[args.batch_size:]))
+        return T.mean(1.0 - T.nnet.softminus(d[args.batch_size:]))

def loss_discriminator(self, d):
return T.mean(T.nnet.softminus(d[args.batch_size:]) - T.nnet.softplus(d[:args.batch_size]))
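The fix swaps T.nnet.softplus for T.nnet.softminus in the generator's adversarial loss, matching the softminus already applied to the same slice in loss_discriminator. A minimal NumPy sketch of why the two differ — assuming the conventional definitions softplus(x) = log(1 + e^x) and softminus(x) = x - softplus(x) = -softplus(-x), and treating the second half of the batch as generated samples as the slicing suggests; the helper names and toy scores below are illustrative, not the repo's API:

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x)); always >= 0.
    return np.logaddexp(0.0, x)

def softminus(x):
    # Assumed identity: softminus(x) = x - softplus(x) = -softplus(-x); always <= 0.
    return -softplus(-x)

# Toy discriminator scores: first half = real images, second half = generated.
batch_size = 4
d = np.random.default_rng(0).normal(size=2 * batch_size)

# Loss shapes mirroring the diff (NumPy stand-ins for the Theano expressions):
loss_adv = np.mean(1.0 - softminus(d[batch_size:]))
loss_disc = np.mean(softminus(d[batch_size:]) - softplus(d[:batch_size]))
```

Since softplus is non-negative and softminus is non-positive, the pre-fix expression 1.0 - softplus(...) could go negative and rewarded the generator in the wrong direction, while 1.0 - softminus(...) stays at or above 1 and shrinks toward it as the generated scores improve.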
