How to swap the VAE at runtime in Stable Diffusion

What is a VAE?

A VAE is a file that is often included alongside checkpoints (full models) and, to put it simply, affects the color and the final noise cleanup. You may have seen models advertised as “VAE baked,” which literally means this file is packed into the SafeTensors format. Your humble team at PirateDiffusion reads these guides and installs the VAE that the creator recommends, so 99.9% of the time you are already using a VAE and it is silently working as intended.

More specifically, a VAE is a special type of model that changes the contrast, quality, and color saturation. If an image looks overly foggy and your guidance is set above 10, the VAE might be the culprit. VAE stands for “variational autoencoder,” a technique that compresses and reconstructs images, similar to how a zip file can compress and restore a file. If you’re a math buff, the technical write-ups are really interesting reading.

The VAE “rehydrates” the image based on the data it was trained on, rather than from discrete values. If all of your rendered images appear desaturated, blurry, or have purple spots, changing the VAE is the best solution. (See the troubleshooting section below for more details about this bug.) 16-bit VAEs run the fastest.
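To make the compress-and-rehydrate idea concrete, here is a minimal sketch using the open-source diffusers library. It is an illustration only, not PirateDiffusion’s internal code, and the input filename is hypothetical. It encodes a picture into a small latent and decodes it back, which is the step a VAE performs at the end of every render.

# Conceptual sketch of what a VAE does, using the open-source diffusers library.
# This is an illustration only, not PirateDiffusion's internal code.
import torch
from PIL import Image
from torchvision import transforms
from diffusers import AutoencoderKL

device = "cuda" if torch.cuda.is_available() else "cpu"

# A public standalone SDXL VAE (the same repo referenced later in this guide)
vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix").to(device)

# Load any RGB image (filename is hypothetical) and scale it to the [-1, 1] range the VAE expects
img = Image.open("example.png").convert("RGB").resize((1024, 1024))
pixels = transforms.ToTensor()(img).unsqueeze(0).to(device) * 2.0 - 1.0

with torch.no_grad():
    latents = vae.encode(pixels).latent_dist.sample()  # compress: 8x smaller per side
    restored = vae.decode(latents).sample              # "rehydrate" back to pixels

print(pixels.shape, latents.shape)  # (1, 3, 1024, 1024) -> (1, 4, 128, 128)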

 

Most people won’t need to learn this feature, but we offer it for enthusiast users who want the most control over their images.

Which VAE is best?

It depends on how you feel about saturation and colorful particles in your images. We recommend trying different ones to find your groove.

Troubleshooting

Purple spots, unwanted bright green dots, and completely black images when doing /remix are the three most common VAE glitches. If you’re getting these, please let a moderator know and we’ll change the default VAE for the model. You can also correct it yourself with a runtime VAE swap, as shown below.

Shown below: This model uses a very bright VAE, but is leaking green dots on the shirt, fingers, and hair.

The Fix: Performing a VAE swap

Our support team can change the VAE at the model level, so you don’t have to do this every time. But maybe you’d like to try a few different looks? Here’s how to swap it at runtime:

/render #sdxlreal a hamster singing in the subway /vae:GraydientPlatformAPI__bright-vae-xl
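For readers who also run Stable Diffusion locally, the same swap looks roughly like this in the diffusers library. This is a sketch under assumptions: the base model repo is a stand-in for #sdxlreal, and the /vae: slug is assumed to map to the Hugging Face repo GraydientPlatformAPI/bright-vae-xl, following the URL conversion rule described further below.

# Rough local equivalent of a runtime VAE swap (a sketch, not PirateDiffusion's backend).
import torch
from diffusers import AutoencoderKL, StableDiffusionXLPipeline

# Stand-in base model; on PirateDiffusion the #sdxlreal tag picks the checkpoint for you
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Swap the VAE at runtime by replacing the pipeline's default decoder
pipe.vae = AutoencoderKL.from_pretrained(
    "GraydientPlatformAPI/bright-vae-xl", torch_dtype=torch.float16  # assumed repo id
).to("cuda")

image = pipe("a hamster singing in the subway").images[0]
image.save("hamster.png")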

Recommended for SDXL

/vae:GraydientPlatformAPI__bright-vae-xl

Recommended for SD15 photos (vae-ft-mse-840000-ema-pruned)

/vae:GraydientPlatformAPI__sd-vae-ft-ema

Recommended for SD15 illustration or anime

/vae:GraydientPlatformAPI__vae-klf8anime2

 

Compatible VAEs

Using a different third-party VAE

Upload one, or find one on the Huggingface website, with this folder setup:

https://huggingface.co/madebyollin/sdxl-vae-fp16-fix

Then remove the front part of the URL and replace the slash between the username and the repo name with two underscores, like this:

/render whatever /vae:madebyollin__sdxl-vae-fp16-fix
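The conversion is just a string transform. Here is a tiny, hypothetical helper (not an official tool) that produces the slug from a Huggingface URL:

# Hypothetical helper: turn a Huggingface VAE repo URL into a /vae: slug
def vae_slug(url: str) -> str:
    path = url.split("huggingface.co/", 1)[-1].strip("/")  # drop the domain
    return path.replace("/", "__")                          # slashes become double underscores

print(vae_slug("https://huggingface.co/madebyollin/sdxl-vae-fp16-fix"))
# -> madebyollin__sdxl-vae-fp16-fix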

If you open that repository, you’ll see that it doesn’t contain a bunch of folders or a full checkpoint; it contains only the VAE files. Pointing at a VAE that is bundled inside a whole model will not load. In that case, just ask us to load that model for you.

The VAE folder must have the following characteristics (see the loading sketch after this list):

  • A single VAE per folder, in a top-level folder of a Huggingface profile as shown above
  • The folder must contain a config.json
  • The file must be in .bin format
  • The bin file must be named “diffusion_pytorch_model.bin”
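These requirements match the standard diffusers folder layout for a standalone VAE. Below is a sketch of what such a folder looks like and how a diffusers-style loader reads it; the repo id is hypothetical, and this illustrates the format rather than the exact loader we run.

# Layout of a compatible VAE repo (illustration):
#   your-username/your-vae/
#   ├── config.json                    # AutoencoderKL architecture settings
#   └── diffusion_pytorch_model.bin    # the weights, with exactly this filename
from diffusers import AutoencoderKL

# Loads because config.json and diffusion_pytorch_model.bin sit at the repo root;
# a repo that only contains a full checkpoint file will not load this way.
vae = AutoencoderKL.from_pretrained("your-username/your-vae")  # hypothetical repo id
print(vae.config.latent_channels)  # typically 4 for SD15/SDXL-family VAEs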

Where to find more: Huggingface and Civitai may have others, but they must be converted to the format described above.

Other known working VAEs:

  • /vae:GraydientPlatformAPI__vae-blessed2 — less saturated than kofi2
  • /vae:GraydientPlatformAPI__vae-anything45 — less saturated than blessed2
  • /vae:GraydientPlatformAPI__vae-orange — medium saturation
  • /vae:GraydientPlatformAPI__vae-pastel — vivid colors like old Dutch masters
  • /vae:LittleApple-fp16__vae-ft-mse-840000-ema-pruned – great for realism