Schedulers¶
🤗 Diffusers provides many scheduler functions for the diffusion process. A scheduler takes a model's output (the sample the diffusion process is iterating on) and a timestep, and returns a denoised sample. The timestep is important because it dictates where in the diffusion process the step is; data is generated by iterating forward over n timesteps, and inference occurs by propagating backward through the timesteps. Based on the timestep, a scheduler may be discrete, in which case the timestep is an `int`, or continuous, in which case the timestep is a `float`.
Depending on the context, a scheduler defines how to iteratively add noise to an image or how to update a sample based on a model's output:
- during training, a scheduler adds noise (there are different algorithms for how to add noise) to a sample to train a diffusion model
- during inference, a scheduler defines how to update a sample based on a pretrained model's output
Many schedulers are implemented from the k-diffusion library by Katherine Crowson, and they're also widely used in A1111. To help you map the schedulers from k-diffusion and A1111 to the schedulers in 🤗 Diffusers, take a look at the table below:
A1111/k-diffusion | 🤗 Diffusers | Usage |
---|---|---|
DPM++ 2M | `DPMSolverMultistepScheduler` | |
DPM++ 2M Karras | `DPMSolverMultistepScheduler` | init with `use_karras_sigmas=True` |
DPM++ 2M SDE | `DPMSolverMultistepScheduler` | init with `algorithm_type="sde-dpmsolver++"` |
DPM++ 2M SDE Karras | `DPMSolverMultistepScheduler` | init with `use_karras_sigmas=True` and `algorithm_type="sde-dpmsolver++"` |
DPM++ 2S a | N/A | very similar to `DPMSolverSinglestepScheduler` |
DPM++ 2S a Karras | N/A | very similar to `DPMSolverSinglestepScheduler(use_karras_sigmas=True, ...)` |
DPM++ SDE | `DPMSolverSinglestepScheduler` | |
DPM++ SDE Karras | `DPMSolverSinglestepScheduler` | init with `use_karras_sigmas=True` |
DPM2 | `KDPM2DiscreteScheduler` | |
DPM2 Karras | `KDPM2DiscreteScheduler` | init with `use_karras_sigmas=True` |
DPM2 a | `KDPM2AncestralDiscreteScheduler` | |
DPM2 a Karras | `KDPM2AncestralDiscreteScheduler` | init with `use_karras_sigmas=True` |
DPM adaptive | N/A | |
DPM fast | N/A | |
Euler | `EulerDiscreteScheduler` | |
Euler a | `EulerAncestralDiscreteScheduler` | |
Heun | `HeunDiscreteScheduler` | |
LMS | `LMSDiscreteScheduler` | |
LMS Karras | `LMSDiscreteScheduler` | init with `use_karras_sigmas=True` |
N/A | `DEISMultistepScheduler` | |
N/A | `UniPCMultistepScheduler` | |
All schedulers are built from the base `SchedulerMixin` class which implements low-level utilities shared by all schedulers.
mindone.diffusers.SchedulerMixin ¶

Bases: `PushToHubMixin`

Base class for all schedulers.

[`SchedulerMixin`] contains common functions shared by all schedulers such as general loading and saving functionalities.

[`ConfigMixin`] takes care of storing the configuration attributes (like `num_train_timesteps`) that are passed to the scheduler's `__init__` function, and the attributes can be accessed by `scheduler.config.num_train_timesteps`.
Class attributes

- `_compatibles` (`List[str]`) -- A list of scheduler classes that are compatible with the parent scheduler class. Use [`~ConfigMixin.from_config`] to load a different compatible scheduler class (should be overridden by parent class).
Source code in mindone/diffusers/schedulers/scheduling_utils.py
mindone.diffusers.SchedulerMixin.from_pretrained(pretrained_model_name_or_path=None, subfolder=None, return_unused_kwargs=False, **kwargs) `classmethod` ¶
Instantiate a scheduler from a pre-defined JSON configuration file in a local directory or Hub repository.
PARAMETER | DESCRIPTION |
---|---|
`pretrained_model_name_or_path` | Can be either a string (the repo id of a pretrained scheduler hosted on the Hub) or a path to a directory containing the scheduler configuration saved with [`~SchedulerMixin.save_pretrained`]. TYPE: `str` or `os.PathLike`, *optional* |
`subfolder` | The subfolder location of a model file within a larger model repository on the Hub or locally. TYPE: `str`, *optional* |
`return_unused_kwargs` | Whether kwargs that are not consumed by the Python class should be returned or not. TYPE: `bool`, defaults to `False` |
`cache_dir` | Path to a directory where a downloaded pretrained model configuration is cached if the standard cache is not used. TYPE: `Union[str, os.PathLike]`, *optional* |
`force_download` | Whether or not to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist. TYPE: `bool`, defaults to `False` |
`proxies` | A dictionary of proxy servers to use by protocol or endpoint, for example, `{'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}`; the proxies are used on each request. TYPE: `Dict[str, str]`, *optional* |
`output_loading_info` | Whether or not to also return a dictionary containing missing keys, unexpected keys and error messages. TYPE: `bool`, defaults to `False` |
`local_files_only` | Whether to only load local model weights and configuration files or not. If set to `True`, the model won't be downloaded from the Hub. TYPE: `bool`, defaults to `False` |
`token` | The token to use as HTTP bearer authorization for remote files. If `True`, the token generated from `huggingface-cli login` is used. TYPE: `str` or `bool`, *optional* |
`revision` | The specific model version to use. It can be a branch name, a tag name, a commit id, or any identifier allowed by Git. TYPE: `str`, defaults to `"main"` |
To use private or gated models, log-in with `huggingface-cli login`. You can also activate the special "offline-mode" to use this method in a firewalled environment.
Source code in mindone/diffusers/schedulers/scheduling_utils.py
mindone.diffusers.SchedulerMixin.save_pretrained(save_directory, push_to_hub=False, **kwargs) ¶

Save a scheduler configuration object to a directory so that it can be reloaded using the [`~SchedulerMixin.from_pretrained`] class method.
PARAMETER | DESCRIPTION |
---|---|
`save_directory` | Directory where the configuration JSON file will be saved (will be created if it does not exist). TYPE: `str` or `os.PathLike` |
`push_to_hub` | Whether or not to push your model to the Hugging Face Hub after saving it. You can specify the repository you want to push to with `repo_id`. TYPE: `bool`, defaults to `False` |
`kwargs` | Additional keyword arguments passed along to the [`~utils.PushToHubMixin.push_to_hub`] method. TYPE: `Dict[str, Any]`, *optional* |
Source code in mindone/diffusers/schedulers/scheduling_utils.py
mindone.diffusers.schedulers.scheduling_utils.SchedulerOutput `dataclass` ¶

Bases: `BaseOutput`

Base class for the output of a scheduler's `step` function.

PARAMETER | DESCRIPTION |
---|---|
`prev_sample` | Computed sample `(x_{t-1})` of the previous timestep; `prev_sample` should be used as the next model input in the denoising loop. TYPE: `ms.Tensor` of shape `(batch_size, num_channels, height, width)` |
Source code in mindone/diffusers/schedulers/scheduling_utils.py
KarrasDiffusionSchedulers¶
[`KarrasDiffusionSchedulers`] are a broad generalization of schedulers in 🤗 Diffusers. The schedulers in this class are distinguished at a high level by their noise sampling strategy, the type of network and scaling, the training strategy, and how the loss is weighted.
The different schedulers in this class, depending on the ordinary differential equation (ODE) solver type, fall into the above taxonomy and provide a good abstraction for the design of the main schedulers implemented in 🤗 Diffusers. The schedulers in this class are given here.
mindone.diffusers.utils.PushToHubMixin ¶
A Mixin to push a model, scheduler, or pipeline to the Hugging Face Hub.
Source code in mindone/diffusers/utils/hub_utils.py
mindone.diffusers.utils.PushToHubMixin.push_to_hub(repo_id, commit_message=None, private=None, token=None, create_pr=False, safe_serialization=True, variant=None) ¶
Upload model, scheduler, or pipeline files to the 🤗 Hugging Face Hub.
PARAMETER | DESCRIPTION |
---|---|
`repo_id` | The name of the repository you want to push your model, scheduler, or pipeline files to. It should contain your organization name when pushing to an organization. TYPE: `str` |
`commit_message` | Message to commit while pushing. Defaults to `"Upload {object}"`. TYPE: `str`, *optional* |
`private` | Whether or not the repository created should be private. TYPE: `bool`, *optional* |
`token` | The token to use as HTTP bearer authorization for remote files. The token generated when running `huggingface-cli login` is used by default. TYPE: `str`, *optional* |
`create_pr` | Whether or not to create a PR with the uploaded files or directly commit. TYPE: `bool`, defaults to `False` |
`safe_serialization` | Whether or not to convert the model weights to the `safetensors` format. TYPE: `bool`, defaults to `True` |
`variant` | If specified, weights are saved with the variant appended to the weights filename. TYPE: `str`, *optional* |
```python
from mindone.diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained("stabilityai/stable-diffusion-2", subfolder="unet")

# Push the `unet` to your namespace with the name "my-finetuned-unet".
unet.push_to_hub("my-finetuned-unet")

# Push the `unet` to an organization with the name "my-finetuned-unet".
unet.push_to_hub("your-org/my-finetuned-unet")
```
Source code in mindone/diffusers/utils/hub_utils.py