GitHub | Automatic colorization of line art

lc013 2021-09-15 08:44:02

 

Today we are going to introduce a GitHub project, whose address is given below. It implements automatic colorization of line art, and the results are quite good, so let's take a look.

Brief introduction

This project implements automatic conversion of line art into color images. Of course, we could train a neural network that takes only the line art as input, but in practical applications we often need to color the line art with colors specified in advance. There are several ways to provide this guidance, distinguished by the kind of hint that is given:

  • No hint

    • Colorization without any hint

    • Input: line art only

  • Atari

    • Colorization guided by a hint, usually strokes of the desired color drawn in specific areas (e.g. PaintsChainer)

    • Input: line art and atari

  • Label

    • Colorization where the hint is a label describing how to color

    • Input: line art and label

  • Reference image

    • Colorization using a reference image as the hint (e.g. style2paints V1)

    • Input: line art and reference image


Line extraction method

There are many line extraction methods and improved variants, such as **XDoG** or SketchKeras. But if the model is trained on only one type of line art, it will overfit to that type and fail to automatically colorize other kinds of line art. Therefore, as in Tag2Pix, several different kinds of line art are used here as training data.

The following three types of line art are used (a code sketch of the first follows the list):

  • XDoG: line extraction using the difference of two Gaussian filters with different standard deviations;

  • SketchKeras: line extraction using a UNet; the extracted lines resemble a pencil sketch;

  • Sketch Simplification: line extraction using a fully convolutional network; the result resembles a digital sketch.
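
To make the XDoG item above concrete, here is a minimal sketch of difference-of-Gaussians line extraction. It is not the repository's implementation, and the parameter values (sigma, k, p, epsilon, phi) are illustrative assumptions:

```python
import cv2
import numpy as np

def xdog(gray, sigma=0.5, k=1.6, p=20.0, epsilon=0.1, phi=10.0):
    """XDoG-style line extraction: a sharpened difference of two Gaussians
    followed by a soft threshold. Parameter values are illustrative defaults."""
    g1 = cv2.GaussianBlur(gray, (0, 0), sigma)        # fine-scale blur
    g2 = cv2.GaussianBlur(gray, (0, 0), sigma * k)    # coarse-scale blur
    d = ((1 + p) * g1 - p * g2) / 255.0               # sharpened DoG response
    # White where the response stays above epsilon, smooth dark ramp at edges
    out = np.where(d >= epsilon, 1.0, 1.0 + np.tanh(phi * (d - epsilon)))
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

if __name__ == "__main__":
    img = cv2.imread("illustration.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
    cv2.imwrite("lines_xdog.png", xdog(img))
```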

The extraction results of the three methods are shown below:

[Figure: line extraction results of the three methods]

In addition, three data augmentation methods are applied to the line art to prevent overfitting (a rough sketch follows the list):

  • varying the intensity;

  • random morphological transformations, to handle lines of different widths;

  • randomly chosen RGB values, to handle lines of different darkness.
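
A rough sketch of what these augmentations might look like on a grayscale line-art image; the exact operators and value ranges used in the project are not stated, so everything below is an assumption:

```python
import cv2
import numpy as np

def augment_line_art(line, rng=np.random):
    """Illustrative augmentations for a uint8 line-art image (white background,
    dark lines): intensity jitter, random line width, random line darkness."""
    out = line.astype(np.float32)

    # 1) Intensity jitter: scale how strongly the lines stand out
    gain = rng.uniform(0.7, 1.3)
    out = 255.0 - np.clip((255.0 - out) * gain, 0, 255)

    # 2) Random morphological transform: erode thickens dark lines,
    #    dilate thins them (on a white-background image)
    kernel = np.ones((rng.randint(1, 4),) * 2, np.uint8)
    out = cv2.erode(out, kernel) if rng.rand() < 0.5 else cv2.dilate(out, kernel)

    # 3) Random line value: remap pure black toward a random dark gray
    low = rng.uniform(0, 96)
    out = low + (255.0 - low) * (out / 255.0)

    return np.clip(out, 0, 255).astype(np.uint8)
```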


Experiments without hints

Motivation

First, I needed to confirm whether a neural-network-based method can color accurately and with variety when no hint is given. The main difficulty is the mapping from line art to color images, because many colorings are valid. Without a hint, I expected the network to learn only to paint a single color in each region. To avoid falling into such a local minimum, in addition to the content loss I also added an adversarial loss, since adversarial training lets the network match the data distribution more closely (a rough sketch of the combined objective follows this paragraph).
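
For reference, a content-plus-adversarial objective of this kind typically looks like the following pix2pix-style sketch in PyTorch; the weighting factor and exact loss form are assumptions, not values taken from the repository:

```python
import torch
import torch.nn.functional as F

def generator_loss(discriminator, line, fake_color, real_color, lambda_l1=100.0):
    """pix2pix-style generator objective: L1 content loss plus an adversarial
    loss from a conditional discriminator that sees (line, colorization) pairs."""
    content = F.l1_loss(fake_color, real_color)
    pred_fake = discriminator(torch.cat([line, fake_color], dim=1))
    adversarial = F.binary_cross_entropy_with_logits(
        pred_fake, torch.ones_like(pred_fake))
    return adversarial + lambda_l1 * content
```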

Method

  • pix2pix

  • pix2pix-gp (pix2pix plus a zero-centered gradient penalty; one possible form is sketched after this list)

  • pix2pixHD
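
The exact gradient penalty used in pix2pix-gp is not spelled out here; if it is the zero-centered (R1) variant applied to real samples, a minimal PyTorch sketch would be (the coefficient gamma is an assumption):

```python
import torch

def r1_gradient_penalty(discriminator, real_pair, gamma=10.0):
    """Zero-centered gradient penalty on real inputs (R1 regularization),
    added to the discriminator loss. 'real_pair' is the discriminator input,
    e.g. concatenated (line, real color) channels."""
    real_pair = real_pair.detach().requires_grad_(True)
    scores = discriminator(real_pair)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=real_pair,
                                 create_graph=True)
    penalty = grads.pow(2).reshape(grads.size(0), -1).sum(dim=1).mean()
    return 0.5 * gamma * penalty
```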

Results

  • pix2pix

[Figure: pix2pix results]

  • pix2pix-gp

[Figure: pix2pix-gp results]

  • pix2pixHD

[Figure: pix2pixHD results]


Experiments using atari

Motivation

Looking at the results above, even with the adversarial loss the network still seems to fall into a local minimum. Although the colors vary to some degree, the network essentially learns a single color per region on a single character. Learning to map line art to color images without any hint is difficult, so I decided to add a hint, namely an atari, as an extra input to the network (as shown in the figure below: strokes of the specified colors are drawn on particular areas of the original line art to tell the network what color those parts should be). A toy sketch of this kind of conditioning follows this paragraph.
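
Conceptually, the atari just becomes extra input channels carrying the user's color strokes. The channel layout and network below are placeholder assumptions, not the project's actual architecture:

```python
import torch
import torch.nn as nn

class HintColorizer(nn.Module):
    """Toy hint-conditioned colorizer: the line art (1 channel), an RGB hint
    image with the colored strokes (3 channels), and a mask marking where
    strokes were drawn (1 channel) are concatenated and fed to the network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + 3 + 1, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, line, hint_rgb, hint_mask):
        return self.net(torch.cat([line, hint_rgb, hint_mask], dim=1))
```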

Method

A hint (atari) is added as an extra input to the network.

Results

[Figure: colorization results with atari hints]


Experiments using reference images

Motivation

I also considered feeding a reference image into the neural network as the hint. First, I tried to reproduce style2paints V1, but training kept collapsing and it was hard to reproduce the original results. I therefore decided to look for an alternative to style2paints V1.

Method

  • style2paints
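
One common pattern for reference-guided colorization is to compress the reference image into a global style vector and inject it into the line-art branch. The sketch below illustrates that pattern only; it is not necessarily what this project or style2paints V1 actually does:

```python
import torch
import torch.nn as nn

class ReferenceConditioner(nn.Module):
    """Toy reference-conditioned colorizer: the reference image is reduced to a
    global style vector, broadcast spatially, and concatenated onto the line art."""
    def __init__(self, style_dim=64):
        super().__init__()
        self.ref_encoder = nn.Sequential(
            nn.Conv2d(3, style_dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                  # (B, style_dim, 1, 1)
        )
        self.head = nn.Conv2d(1 + style_dim, 3, 3, padding=1)

    def forward(self, line, reference):
        style = self.ref_encoder(reference)
        style = style.expand(-1, -1, line.size(2), line.size(3))
        return torch.tanh(self.head(torch.cat([line, style], dim=1)))
```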

Results

[Figure: colorization results with a reference image]


Video colorization experiments

Results

[Figures: video colorization results]


 


 
