
deep-painterly-harmonization

Code and data for the paper "Deep Painterly Harmonization".

Disclaimer

This software is published for academic and non-commercial use only.

Setup

This code is based on Torch. It has been tested on Ubuntu 16.04 LTS.

Dependencies:

  • Torch (with matio-ffi and loadcaffe)

CUDA backend:

  • CUDA
  • cudnn

Download VGG-19:

sh models/download_models.sh
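If the download succeeds, models/ should contain the VGG-19 Caffe weights and the matching deploy prototxt. A quick sanity check (the file names below are assumed from the standard VGG-19 Caffe release, not verified against this repository):

ls models/
# expected (assumed): VGG_ILSVRC_19_layers.caffemodel, VGG_ILSVRC_19_layers_deploy.prototxt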

Compile cuda_utils.cu (adjust PREFIX and NVCC_PREFIX in the makefile for your machine):

make clean && make
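With GNU make, variables given on the command line override assignments in the makefile, so you can also pass your paths directly instead of editing the file. A minimal sketch, assuming PREFIX points at your Torch install and NVCC_PREFIX at your CUDA bin directory (both paths below are examples, not the repository's defaults):

make clean && make PREFIX=$HOME/torch/install NVCC_PREFIX=/usr/local/cuda/bin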

Usage

To generate all results (in data/) using the provided scripts, simply run

python gen_all.py

from a shell, and then

run('filt_cnn_artifact.m')

in MATLAB or Octave. The final output will be in results/.
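Put together, a minimal end-to-end run might look like this (assuming octave is on your PATH; MATLAB users can run the script from the MATLAB prompt instead):

python gen_all.py                            # process every example in data/
octave --eval "run('filt_cnn_artifact.m')"   # filter CNN artifacts; final images land in results/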

Note that in the paper we trained a CNN on a dataset of 80,000 paintings collected from wikiart.org; it estimates the stylization level of a given painting, and the weights are adjusted accordingly. We will release the pre-trained model in the next update. Until then, users running on their own paintings will need to set those weights manually.

A few images have been removed due to copyright issues. The full set is available here for testing use only.

Examples

Here are some results from our algorithm (from left to right: the original painting, the naive composite, and our output):

Acknowledgement

  • Our Torch implementation is based on Justin Johnson's neural-style code.
  • The histogram loss is inspired by Risser et al.

Citation

If you find this work useful for your research, please cite:

@article{luan2018deep,
  title={Deep Painterly Harmonization},
  author={Luan, Fujun and Paris, Sylvain and Shechtman, Eli and Bala, Kavita},
  journal={arXiv preprint arXiv:1804.03189},
  year={2018}
}

Contact

Feel free to contact me if you have any questions (Fujun Luan, [email protected]).

