Commit Graph

89 Commits

Author SHA1 Message Date
Muhammad Rizqi Nur
bb832d7725 Simplify grad clip 2022-11-05 11:48:38 +07:00
Muhammad Rizqi Nur
237e79c77d Merge branch 'master' into gradient-clipping 2022-11-02 20:48:58 +07:00
Fampai
890e68aaf7 Fixed minor bug
when unloading vae during TI training, generating images after training will error out
2022-10-31 10:07:12 -04:00
Fampai
3b0127e698 Merge branch 'master' of https://github.com/AUTOMATIC1111/stable-diffusion-webui into TI_optimizations 2022-10-31 09:54:51 -04:00
Fampai
006756f9cd Added TI training optimizations
option to use xattention optimizations when training
option to unload vae when training
2022-10-31 07:26:08 -04:00
Muhammad Rizqi Nur
cd4d59c0de Merge master 2022-10-30 18:57:51 +07:00
Muhammad Rizqi Nur
3d58510f21 Fix dataset still being loaded even when training will be skipped 2022-10-30 00:54:59 +07:00
Muhammad Rizqi Nur
a07f054c86 Add missing info on hypernetwork/embedding model log
Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513

Also group the saving into one
2022-10-30 00:49:29 +07:00
Muhammad Rizqi Nur
ab05a74ead Revert "Add cleanup after training"
This reverts commit 3ce2bfdf95.
2022-10-30 00:32:02 +07:00
Muhammad Rizqi Nur
3ce2bfdf95 Add cleanup after training 2022-10-29 19:43:21 +07:00
Muhammad Rizqi Nur
ab27c111d0 Add input validations before loading dataset for training 2022-10-29 18:09:17 +07:00
Muhammad Rizqi Nur
05e2e40537 Merge branch 'master' into gradient-clipping 2022-10-29 15:04:21 +07:00
Muhammad Rizqi Nur
9ceef81f77 Fix log off by 1 2022-10-28 20:48:08 +07:00
Muhammad Rizqi Nur
16451ca573 Learning rate sched syntax support for grad clipping 2022-10-28 17:16:23 +07:00
Muhammad Rizqi Nur
1618df41ba Gradient clipping for textual embedding 2022-10-28 10:31:27 +07:00
DepFA
737eb28fac typo: cmd_opts.embedding_dir to cmd_opts.embeddings_dir 2022-10-26 17:38:08 +03:00
timntorres
f4e1464217 Implement PR #3625 but for embeddings. 2022-10-26 10:14:35 +03:00
timntorres
4875a6c217 Implement PR #3309 but for embeddings. 2022-10-26 10:14:35 +03:00
timntorres
c2dc9bfa89 Implement PR #3189 but for embeddings. 2022-10-26 10:14:35 +03:00
AUTOMATIC
cbb857b675 enable creating embedding with --medvram 2022-10-26 09:44:02 +03:00
AUTOMATIC
7d6b388d71 Merge branch 'ae' 2022-10-21 13:35:01 +03:00
DepFA
0087079c2d allow overwrite old embedding 2022-10-20 00:10:59 +01:00
MalumaDev
1997ccff13 Merge branch 'master' into test_resolve_conflicts 2022-10-18 08:55:08 +02:00
DepFA
62edfae257 print list of embeddings on reload 2022-10-17 08:42:17 +03:00
MalumaDev
ae0fdad64a Merge branch 'master' into test_resolve_conflicts 2022-10-16 17:55:58 +02:00
AUTOMATIC
0c5fa9a681 do not reload embeddings from disk when doing textual inversion 2022-10-16 09:09:04 +03:00
MalumaDev
97ceaa23d0 Merge branch 'master' into test_resolve_conflicts 2022-10-16 00:06:36 +02:00
DepFA
b6e3b96dab Change vector size footer label 2022-10-15 17:23:39 +03:00
DepFA
ddf6899df0 generalise to popular lossless formats 2022-10-15 17:23:39 +03:00
DepFA
9a1dcd78ed add webp for embed load 2022-10-15 17:23:39 +03:00
DepFA
939f16529a only save 1 image per embedding 2022-10-15 17:23:39 +03:00
DepFA
9e846083b7 add vector size to embed text 2022-10-15 17:23:39 +03:00
MalumaDev
7b7561f6e4 Merge branch 'master' into test_resolve_conflicts 2022-10-15 16:20:17 +02:00
AUTOMATIC
c7a86f7fe9 add option to use batch size for training 2022-10-15 09:24:59 +03:00
AUTOMATIC
03d62538ae remove duplicate code for log loss, add step, make it read from options rather than gradio input 2022-10-14 22:43:55 +03:00
AUTOMATIC
326fe7d44b Merge remote-tracking branch 'Melanpan/master' 2022-10-14 22:14:50 +03:00
AUTOMATIC
c344ba3b32 add option to read generation params for learning previews from txt2img 2022-10-14 20:31:49 +03:00
MalumaDev
bb57f30c2d init 2022-10-14 10:56:41 +02:00
Melan
8636b50aea Add learn_rate to csv and removed a left-over debug statement 2022-10-13 12:37:58 +02:00
Melan
1cfc2a1898 Save a csv containing the loss while training 2022-10-12 23:36:29 +02:00
AUTOMATIC
c3c8eef9fd train: change filename processing to be more simple and configurable
train: make it possible to make text files with prompts
train: rework scheduler so that there's less repeating code in textual inversion and hypernets
train: move epochs setting to options
2022-10-12 20:49:47 +03:00
AUTOMATIC1111
cc5803603b Merge pull request #2037 from AUTOMATIC1111/embed-embeddings-in-images
Add option to store TI embeddings in png chunks, and load from same.
2022-10-12 15:59:24 +03:00
DepFA
10a2de644f formatting 2022-10-12 13:15:35 +01:00
DepFA
5f3317376b spacing 2022-10-11 20:09:49 +01:00
DepFA
91d7ee0d09 update imports 2022-10-11 20:09:10 +01:00
DepFA
aa75d5cfe8 correct conflict resolution typo 2022-10-11 20:06:13 +01:00
AUTOMATIC
d6fcc6b87b apply lr schedule to hypernets 2022-10-11 22:03:05 +03:00
DepFA
61788c0538 shift embedding logic out of textual_inversion 2022-10-11 19:50:50 +01:00
AUTOMATIC1111
419e539fe3 Merge branch 'learning_rate-scheduling' into learnschedule 2022-10-11 21:50:19 +03:00
AUTOMATIC
d4ea5f4d86 add an option to unload models during hypernetwork training to save VRAM 2022-10-11 19:03:08 +03:00