Commit Graph

172 Commits

Author SHA1 Message Date
AUTOMATIC1111
ac02216e54 alternate implementation for unet forward replacement that does not depend on hijack being applied 2023-12-02 19:35:47 +03:00
AUTOMATIC1111
a5f61aa8c5 potential fix for #14172 2023-12-02 18:03:34 +03:00
MrCheeze
293f44e6c1 Fix bug where is_using_v_parameterization_for_sd2 fails because the sd_hijack is only partially undone 2023-12-01 22:56:08 -05:00
AUTOMATIC1111
9c1c0da026 fix exception related to the pix2pix 2023-11-06 11:17:36 +03:00
AUTOMATIC1111
6ad666e479 more changes for #13865: fix formatting, rename the function, add comment and add a readme entry 2023-11-05 19:46:20 +03:00
AUTOMATIC1111
80d639a440 linter 2023-11-05 19:32:21 +03:00
AUTOMATIC1111
ff805d8d0e
Merge branch 'dev' into master 2023-11-05 19:30:57 +03:00
Ritesh Gangnani
44c5097375 Use devices.torch_gc() instead of empty_cache() 2023-11-05 20:31:57 +05:30
Ritesh Gangnani
44db35fb1a Added memory clearance after deletion 2023-11-05 19:15:38 +05:30
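For context on the two commits above (44c5097375, 44db35fb1a): devices.torch_gc() is the webui helper used to free GPU memory once large objects such as an unloaded model have been deleted. The sketch below shows the general shape of such a helper, combining Python garbage collection with CUDA cache cleanup; it is an assumption-based illustration, not the repository's actual function.

```python
import gc

import torch


def torch_gc():
    """Free memory after large objects (e.g. an unloaded model) are deleted.

    Minimal sketch only: assumed here to combine Python GC with CUDA cache
    cleanup, rather than calling torch.cuda.empty_cache() alone.
    """
    gc.collect()  # drop unreachable Python objects still holding tensor references
    if torch.cuda.is_available():
        torch.cuda.empty_cache()  # release cached CUDA blocks back to the driver
        torch.cuda.ipc_collect()  # also reclaim memory tied up in CUDA IPC handles
```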
Ritesh Gangnani
ff1609f91e Add SSD-1B as a supported model 2023-11-05 19:13:49 +05:30
AUTOMATIC1111
902afa6b4c
Merge pull request #13364 from superhero-7/master
Add altdiffusion-m18 support
2023-10-14 07:29:01 +03:00
superhero-7
2d947175b9 fix linter issues 2023-10-01 12:25:19 +08:00
superhero-7
f8f4ff2bb8 support altdiffusion-m18 2023-09-23 17:55:19 +08:00
superhero-7
702a1e1cc7 support m18 2023-09-23 17:51:41 +08:00
AUTOMATIC1111
59544321aa initial work on sd_unet for SDXL 2023-09-11 21:17:40 +03:00
AUTOMATIC1111
9d2299ed0b implement undo hijack for SDXL 2023-08-19 10:16:27 +03:00
AUTOMATIC1111
a8a256f9b5 REMOVE 2023-08-08 21:08:50 +03:00
AUTOMATIC1111
22ecb78b51 Merge branch 'dev' into multiple_loaded_models 2023-08-05 07:52:29 +03:00
AUTOMATIC1111
f0c1063a70 resolve some of circular import issues for kohaku 2023-08-04 09:13:46 +03:00
AUTOMATIC1111
151b8ed3a6 repair PLMS 2023-08-01 00:38:34 +03:00
AUTOMATIC1111
b235022c61 option to keep multiple models in memory 2023-08-01 00:24:48 +03:00
AUTOMATIC1111
6f0abbb71a textual inversion support for SDXL 2023-07-29 15:15:06 +03:00
AUTOMATIC1111
5677296d1b
Merge pull request #11878 from Bourne-M/patch-1
[bug] reload altclip model error
2023-07-19 16:26:12 +03:00
yfzhou
cb75734896
[bug] reload altclip model error
When using BertSeriesModelWithTransformation as the cond_stage_model, the undo_hijack should be performed using the FrozenXLMREmbedderWithCustomWords type; otherwise, it will result in a failed model reload.
2023-07-19 17:53:28 +08:00
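The commit body above explains the fix: after hijacking, cond_stage_model is no longer a BertSeriesModelWithTransformation but the wrapper class, so undo_hijack has to test for the wrapper type. Below is a minimal sketch of that wrap/unwrap pattern; the class stubs and the `.wrapped` attribute are assumptions for illustration, not the repository's exact code.

```python
class BertSeriesModelWithTransformation:
    """Stub for the AltCLIP/AltDiffusion text encoder (illustrative only)."""


class FrozenXLMREmbedderWithCustomWords:
    """Stub for the wrapper applied during hijack; assumes the original
    module is kept on `.wrapped` so it can be restored later."""

    def __init__(self, wrapped):
        self.wrapped = wrapped


def hijack(model):
    # wrap the text encoder so prompt-weighting logic can be injected
    if isinstance(model.cond_stage_model, BertSeriesModelWithTransformation):
        model.cond_stage_model = FrozenXLMREmbedderWithCustomWords(model.cond_stage_model)


def undo_hijack(model):
    # the fix: check for the wrapper type, not the original class, because
    # after hijack() the cond_stage_model is the wrapper; otherwise the model
    # is never unwrapped and reloading it fails
    if isinstance(model.cond_stage_model, FrozenXLMREmbedderWithCustomWords):
        model.cond_stage_model = model.cond_stage_model.wrapped
```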
AUTOMATIC1111
0198eaec45
Merge pull request #11757 from AUTOMATIC1111/sdxl
SD XL support
2023-07-16 12:04:53 +03:00
AUTOMATIC1111
2b1bae0d75 add textual inversion hashes to infotext 2023-07-15 08:41:22 +03:00
AUTOMATIC1111
6d8dcdefa0 initial SDXL refiner support 2023-07-14 09:16:01 +03:00
AUTOMATIC1111
594c8e7b26 fix CLIP doing the unneeded normalization
revert SD2.1 back to use the original repo
add SDXL's force_zero_embeddings to negative prompt
2023-07-13 11:35:52 +03:00
AUTOMATIC1111
da464a3fb3 SDXL support 2023-07-12 23:52:43 +03:00
AUTOMATIC1111
af081211ee getting SD2.1 to run on SDXL repo 2023-07-11 21:16:43 +03:00
AUTOMATIC
36888092af revert default cross attention optimization to Doggettx
make --disable-opt-split-attention command line option work again
2023-06-01 08:12:06 +03:00
AUTOMATIC
339b531570 custom unet support 2023-05-27 15:47:33 +03:00
AUTOMATIC
a6e653be26 possible fix for empty list of optimizations #10605 2023-05-23 18:49:15 +03:00
AUTOMATIC
2140bd1c10 make it actually work after suggestions 2023-05-19 10:05:07 +03:00
AUTOMATIC
8a3d232839 fix linter issues 2023-05-19 00:03:27 +03:00
AUTOMATIC
2582a0fd3b make it possible for scripts to add cross attention optimizations
add UI selection for cross attention optimization
2023-05-18 22:48:28 +03:00
AUTOMATIC
1a43524018 fix model loading twice in some situations 2023-05-14 13:27:50 +03:00
Aarni Koskela
49a55b410b Autofix Ruff W (not W605) (mostly whitespace) 2023-05-11 20:29:11 +03:00
AUTOMATIC
028d3f6425 ruff auto fixes 2023-05-10 11:05:02 +03:00
AUTOMATIC
f741a98bac imports cleanup for ruff 2023-05-10 08:43:42 +03:00
AUTOMATIC
762265eab5 autofixes from ruff 2023-05-10 07:52:45 +03:00
Pam
8d7fa2f67c sdp_attnblock_forward hijack 2023-03-10 22:48:41 +05:00
Pam
0981dea948 sdp refactoring 2023-03-10 12:58:10 +05:00
Pam
37acba2633 argument to disable memory efficient for sdp 2023-03-10 12:19:36 +05:00
Pam
fec0a89511 scaled dot product attention 2023-03-07 00:33:13 +05:00
AUTOMATIC1111
dfb3b8f398
Merge branch 'master' into weighted-learning 2023-02-19 12:41:29 +03:00
Shondoit
c4bfd20f31 Hijack to add weighted_forward to model: return loss * weight map 2023-02-15 10:03:59 +01:00
brkirch
2016733814 Apply hijacks in ddpm_edit for upcast sampling
To avoid import errors, ddpm_edit hijacks are done after an instruct pix2pix model is loaded.
2023-02-07 22:53:45 -05:00
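The commit body above describes deferring the ddpm_edit hijacks until an instruct-pix2pix model is actually loaded, so that importing ddpm_edit at startup cannot raise an ImportError. A rough sketch of that deferred-patch pattern follows; the module path and the marker attribute are invented for illustration.

```python
import importlib

_ddpm_edit_hijacked = False


def apply_ddpm_edit_hijacks():
    """Apply the upcast-sampling hijacks to ddpm_edit, but only if it imports."""
    global _ddpm_edit_hijacked
    if _ddpm_edit_hijacked:
        return
    try:
        # importing this eagerly at startup is what the commit avoids;
        # the module path here is an assumption for the sketch
        ddpm_edit = importlib.import_module("modules.models.diffusion.ddpm_edit")
    except ImportError:
        return  # instruct-pix2pix code not present, nothing to patch
    # the real change swaps in upcast-aware forward functions; this sketch
    # only records that the patch ran
    ddpm_edit.upcast_sampling_hijacked = True
    _ddpm_edit_hijacked = True
```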
AUTOMATIC1111
fecb990deb
Merge pull request #7309 from brkirch/fix-embeddings
Fix embeddings, upscalers, and refactor `--upcast-sampling`
2023-01-28 18:44:36 +03:00
AUTOMATIC
d04e3e921e automatically detect v-parameterization for SD2 checkpoints 2023-01-28 15:24:41 +03:00