Fast Neural Style Transfer Using PyTorch Models

This repository implements the paper Perceptual Losses for Real-Time Style Transfer and Super-Resolution by Justin Johnson, Alexandre Alahi, and Fei-Fei Li, using the pretrained VGG models provided with PyTorch. The model architecture and training methodology are the same as in the paper, but the hyperparameters differ.

Because the pretrained models differ from those used in the paper, the loss function and the relative strengths of the style and content losses had to be recalibrated. The loss computed with the PyTorch pretrained models is very small, so scaling it by 1.0e5 works reasonably well across different styles.
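The style loss in this family of methods compares Gram matrices of VGG feature maps, and the 1.0e5 factor mentioned above rescales the otherwise tiny loss values. A minimal NumPy sketch of that computation (the shapes, normalization, and default scale here are illustrative assumptions, not code taken from this repository):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map with shape (channels, height, width)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Normalize by the number of elements so the loss scale is
    # comparable across layers of different spatial size.
    return f @ f.T / (c * h * w)

def style_loss(feat_generated, feat_style, scale=1.0e5):
    """Mean squared error between Gram matrices, scaled up because the
    raw values computed with pretrained models are very small."""
    g1 = gram_matrix(feat_generated)
    g2 = gram_matrix(feat_style)
    return scale * float(np.mean((g1 - g2) ** 2))
```

Identical feature maps give a loss of exactly zero, and the Gram matrix is symmetric by construction, which is a quick sanity check for any reimplementation.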

The results shown in the paper could be replicated using the pretrained VGG models provided on the project's original webpage.

Similar results were obtained with the pretrained network shipped with PyTorch, using different hyperparameters.

Python Prerequisites

  • Python 3.6 or newer
  • PyTorch with CUDA enabled
  • CUDA 10.2
  • skimage
  • dominate
  • copy (part of the Python standard library; no installation needed)

Usage

Dataset

Training

python train.py -lr 0.001 -epoch 2 -batch 6 -style 5 -alphatv 1 -alpha 200000

The example above trains style 5; other style indices can be used as well.
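The flag definitions inside train.py are not shown in this README; a hypothetical argparse setup consistent with the command above (flag names, types, and defaults are assumptions) might look like:

```python
import argparse

def build_parser():
    # Flag names mirror the training command above; defaults are guesses.
    p = argparse.ArgumentParser(description="Fast neural style transfer training")
    p.add_argument("-lr", type=float, default=0.001, help="learning rate")
    p.add_argument("-epoch", type=int, default=2, help="number of training epochs")
    p.add_argument("-batch", type=int, default=6, help="batch size")
    p.add_argument("-style", type=int, default=5, help="style image index")
    p.add_argument("-alphatv", type=float, default=1.0, help="total variation loss weight")
    p.add_argument("-alpha", type=float, default=200000.0, help="style loss weight")
    return p

# Parsing the exact command line from the example above:
args = build_parser().parse_args(
    "-lr 0.001 -epoch 2 -batch 6 -style 5 -alphatv 1 -alpha 200000".split()
)
```

Note that `-alpha 200000` here matches the style-loss weight of 2.0e5, consistent with the large scaling factor discussed earlier.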

Testing / Running on your own images

python test.py -style 5 -imageName <imagename> 

The <imagename> argument should contain the path to the image file.

Some Issues

  1. The dynamic range of the outputs is larger than that of the inputs; a way to normalize it would improve the outputs.

  2. Can a small number of images from the COCO dataset be used and still give good results, so that this could be assigned as a course exercise?
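One simple way to address the dynamic-range issue in point 1 would be to linearly rescale each output so its minimum and maximum match the input's. This is a sketch of that idea in NumPy, not code from the repository:

```python
import numpy as np

def match_dynamic_range(output, reference):
    """Linearly rescale `output` so its min and max match `reference`.

    A simple post-processing fix for outputs whose dynamic range
    exceeds the input's; illustrative only.
    """
    out_min, out_max = output.min(), output.max()
    ref_min, ref_max = reference.min(), reference.max()
    if out_max == out_min:
        # A constant image carries no range to rescale; map it to the
        # middle of the reference range.
        return np.full_like(output, (ref_min + ref_max) / 2.0)
    scaled = (output - out_min) / (out_max - out_min)
    return scaled * (ref_max - ref_min) + ref_min
```

A per-channel variant (applying the same rescaling to each color channel separately) is another option, at the cost of possible color shifts.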

Styles used for experiments

  1. Style 0 (sample outputs for the style)

  2. Style 1

  3. Style 2

  4. Style 3

  5. Style 4 (sample outputs for the style)

  6. Style 5 (sample outputs for the style)

  7. Style 6 (sample outputs for the style)

  8. Style 7 (sample outputs for the style)

  9. Style 8 (sample outputs for the style)

  10. Style 9 (sample outputs for the style)

  11. Style 10 (sample outputs for the style)

Possible Future Experiments

  1. The approach will probably work with other networks such as ResNet as well; it might be interesting to see those results.
