
error #1060

Open
2 tasks done
alphaleadership opened this issue Jan 13, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@alphaleadership

Describe the bug

The model files are not generated and the training is blocked.

To Reproduce

svc train

Additional context

[07:37:50] INFO    Using strategy: auto                          train.py:98
           INFO    GPU available: False, used: False        rank_zero.py:64
[07:37:51] INFO    TPU available: False, using: 0 TPU cores rank_zero.py:64
           INFO    IPU available: False, using: 0 IPUs      rank_zero.py:64
           INFO    HPU available: False, using: 0 HPUs      rank_zero.py:64
           WARNING                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\so_vits_svc_fork\modules\synthesizers.py:81: UserWarning: Unused arguments: {'n_layers_q': 3, 'use_spectral_norm': False, 'pretrained': {'D_0.pth': 'https://huggingface.co/datasets/ms903/sovits4.0-768vec-layer12/resolve/main/sovits_768l12_pre_large_320k/clean_D_320000.pth', 'G_0.pth': 'https://huggingface.co/datasets/ms903/sovits4.0-768vec-layer12/resolve/main/sovits_768l12_pre_large_320k/clean_G_320000.pth'}}
  warnings.warn(f"Unused arguments: {kwargs}")

           INFO    Decoder type: hifi-gan                 synthesizers.py:100
           WARNING                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\torch\nn\utils\weight_norm.py:30: UserWarning: torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.
  warnings.warn("torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.")

[07:37:54] WARNING [07:37:54] warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\
so_vits_svc_fork\utils.py:246: UserWarning:
Keys not found in checkpoint state
dict:['emb_g.weight']
warnings.warn(f"Keys not found in
checkpoint state dict:" f"{not_in_from}")

           WARNING                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\so_vits_svc_fork\utils.py:264: UserWarning: Shape mismatch: ['dec.cond.weight: torch.Size([512, 256, 1]) -> torch.Size([512, 768, 1])', 'enc_q.enc.cond_layer.weight_v: torch.Size([6144, 256, 1]) -> torch.Size([6144, 768, 1])', 'flow.flows.0.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'flow.flows.2.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'flow.flows.4.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'flow.flows.6.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'f0_decoder.cond.weight: torch.Size([192, 256, 1]) -> torch.Size([192, 768, 1])']
  warnings.warn(
           INFO    Loaded checkpoint 'logs\44k\G_0.pth' (epoch 0) utils.py:307
[07:37:55] INFO    Loaded checkpoint 'logs\44k\D_0.pth' (epoch 0) utils.py:307
  | Name  | Type                     | Params
---------------------------------------------------
0 | net_g | SynthesizerTrn           | 45.6 M
1 | net_d | MultiPeriodDiscriminator | 46.7 M
---------------------------------------------------
Trainable params: 92.4 M
Non-trainable params: 0
Total params: 92.4 M
Total estimated model params size (MB): 369
WARNING [07:37:55]                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\lightning\pytorch\trainer\connectors\data_connector.py:441: The 'val_dataloader' does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument to `num_workers=3` in the `DataLoader` to improve performance.
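The `val_dataloader` warning above is Lightning nudging toward more loader workers. A minimal sketch of how one might pick a count matching that hint; the helper name and the cap of 3 are made up for illustration (the warning suggests `num_workers=3` on the reporter's machine), not so_vits_svc_fork's actual code:

```python
import os

def suggested_num_workers(cap: int = 3) -> int:
    """Heuristic DataLoader worker count (hypothetical helper).

    A common rule of thumb is CPU count minus one, capped so small
    machines are not oversubscribed; Lightning's warning suggested
    num_workers=3 here.
    """
    cpus = os.cpu_count() or 1
    return max(0, min(cpus - 1, cap))
```

The resulting value would then be passed as `DataLoader(..., num_workers=suggested_num_workers())`.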

[07:37:56] WARNING                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\lightning\pytorch\loops\fit_loop.py:293: The number of training batches (40) is smaller than the logging interval Trainer(log_every_n_steps=50). Set a lower value for log_every_n_steps if you want to see logs for the training epoch.

       INFO     [07:37:56] Setting current epoch to 0          train.py:311
       INFO     [07:37:56] Setting total batch idx to 0        train.py:327
       INFO     [07:37:56] Setting global step to 0            train.py:317

[07:38:03] WARNING                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\torch\_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return self.fget.__get__(instance, owner)()

[07:38:03] WARNING                                            warnings.py:109
d:\so-vits-svc-fork\venv\lib\site-packages\torch\_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return self.fget.__get__(instance, owner)()
 Volume in drive D is dev
 Volume Serial Number is 103E-6C52

 Directory of D:\drive2\so-vits-svc-fork\logs\44k

12/01/2024  21:44    <DIR>          .
12/01/2024  21:44    <DIR>          ..
13/01/2024  07:37             2,193 config.json
12/01/2024  21:44       187,027,770 D_0.pth
12/01/2024  21:44       209,268,661 G_0.pth
12/01/2024  19:38    <DIR>          lightning_logs
               3 File(s)    396,298,624 bytes
               3 Dir(s)  262,772,918,784 bytes free
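The "Shape mismatch" warning in the log means the pretrained vec256 checkpoint (conditioning layers sized for 256-dim content vectors) is being loaded into a model configured for 768-dim vectors. A pure-Python sketch of the idea behind that warning, i.e. a loader splitting checkpoint entries into shape-compatible and mismatched ones; the helper and the one matching entry are hypothetical, not so_vits_svc_fork's actual loader, while the mismatched names and shapes are taken from the log:

```python
def filter_by_shape(checkpoint_shapes, model_shapes):
    """Split checkpoint entries into shape-compatible and mismatched ones."""
    kept, mismatched = {}, []
    for name, shape in checkpoint_shapes.items():
        target = model_shapes.get(name)
        if target == shape:
            kept[name] = shape
        else:
            mismatched.append(f"{name}: {shape} -> {target}")
    return kept, mismatched

# Shapes lifted from the warning: the pretrained conditioning layers carry
# 256-dim content vectors, the current config expects 768-dim ones.
ckpt = {
    "dec.cond.weight": (512, 256, 1),
    "f0_decoder.cond.weight": (192, 256, 1),
    "dec.f0_upsamp.weight": (1, 1, 1),   # hypothetical matching entry
}
model = {
    "dec.cond.weight": (512, 768, 1),
    "f0_decoder.cond.weight": (192, 768, 1),
    "dec.f0_upsamp.weight": (1, 1, 1),
}
kept, mismatched = filter_by_shape(ckpt, model)
```

Mismatched layers cannot be copied from the checkpoint, which is why a 768-dim config should be paired with 768-dim pretrained weights.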

Version

latest pip

Platform

Windows 10

Code of Conduct

  • I agree to follow this project's Code of Conduct.

No Duplicate

  • I have checked existing issues to avoid duplicates.
@alphaleadership alphaleadership added the bug Something isn't working label Jan 13, 2024