PaddlePaddle / PaddleSpeech
I set up an online finetune service, but only the first training request works; any subsequent training request fails with AssertionError: Optimizer set error, embedding_2.w_0_moment1_0 should in state dict
Status: To do
#I7LCVM · myaier · Created 2023-07-15 11:31
Exception in main training loop: Optimizer set error, embedding_2.w_0_moment1_0 should in state dict
Traceback (most recent call last):
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\training\trainer.py", line 149, in run
    update()
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\training\updaters\standard_updater.py", line 110, in update
    self.update_core(batch)
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\models\fastspeech2\fastspeech2_updater.py", line 118, in update_core
    optimizer.step()
  File "<decorator-gen-313>", line 2, in step
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\fluid\dygraph\base.py", line 319, in __impl__
    return func(*args, **kwargs)
  File "<decorator-gen-311>", line 2, in step
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\fluid\wrapped_decorator.py", line 26, in __impl__
    return wrapped_func(*args, **kwargs)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\fluid\framework.py", line 534, in __impl__
    return func(*args, **kwargs)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\adam.py", line 550, in step
    param_group_idx=0,
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\optimizer.py", line 1167, in _apply_optimize
    params_grads, param_group_idx=param_group_idx
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\optimizer.py", line 928, in _create_optimization_pass
    for p in parameters_and_grads
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\adam.py", line 337, in _create_accumulators
    self._add_moments_pows(p)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\adam.py", line 293, in _add_moments_pows
    self._add_accumulator(self._moment1_acc_str, p, dtype=acc_dtype)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\optimizer.py", line 756, in _add_accumulator
    var_name
Trainer extensions will try to handle the extension.
Then all extensions will finalize.
[2023-07-14 20:35:10] [ERROR] [app.py:1742] Exception on /train_canton_clone [POST]
Traceback (most recent call last):
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\flask\app.py", line 2525, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\flask\app.py", line 1822, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\flask\app.py", line 1820, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\flask\app.py", line 1796, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "C:/jisufenxiang/PaddleSpeech/examples/other/tts_finetune/tts3/main.py", line 54, in train
    local.finetune.finetune_train(pretrained_model_dir,dump_dir,output_dir)
  File "C:\jisufenxiang\PaddleSpeech\examples\other\tts_finetune\tts3\local\finetune.py", line 276, in finetune_train
    train_sp(train_args, config)
  File "C:\jisufenxiang\PaddleSpeech\examples\other\tts_finetune\tts3\local\finetune.py", line 204, in train_sp
    trainer.run()
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\training\trainer.py", line 198, in run
    six.reraise(*exc_info)
  File "C:\Users\leib.l\AppData\Roaming\Python\Python37\site-packages\six.py", line 703, in reraise
    raise value
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\training\trainer.py", line 149, in run
    update()
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\training\updaters\standard_updater.py", line 110, in update
    self.update_core(batch)
  File "C:\jisufenxiang\PaddleSpeech\paddlespeech\t2s\models\fastspeech2\fastspeech2_updater.py", line 118, in update_core
    optimizer.step()
  File "<decorator-gen-313>", line 2, in step
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\fluid\dygraph\base.py", line 319, in __impl__
    return func(*args, **kwargs)
  File "<decorator-gen-311>", line 2, in step
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\fluid\wrapped_decorator.py", line 26, in __impl__
    return wrapped_func(*args, **kwargs)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\fluid\framework.py", line 534, in __impl__
    return func(*args, **kwargs)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\adam.py", line 550, in step
    param_group_idx=0,
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\optimizer.py", line 1167, in _apply_optimize
    params_grads, param_group_idx=param_group_idx
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\optimizer.py", line 928, in _create_optimization_pass
    for p in parameters_and_grads
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\adam.py", line 337, in _create_accumulators
    self._add_moments_pows(p)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\adam.py", line 293, in _add_moments_pows
    self._add_accumulator(self._moment1_acc_str, p, dtype=acc_dtype)
  File "C:\Users\leib.l\Miniconda3\lib\site-packages\paddle\optimizer\optimizer.py", line 756, in _add_accumulator
    var_name
AssertionError: Optimizer set error, embedding_2.w_0_moment1_0 should in state dict
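The issue has no replies, but the parameter name in the assertion (embedding_2.w_0 rather than embedding_0.w_0) hints at a plausible cause: Paddle's dygraph mode gives every layer a process-wide, auto-incrementing name, so when the Flask service rebuilds the FastSpeech2 model for a second request inside the same process, the new parameters get new names that no longer match the optimizer state restored from the pretrained snapshot. The snippet below is only an illustrative sketch of that naming behaviour; it is not code from the issue.

import paddle

# Each layer instance gets a unique, auto-incrementing name for the lifetime
# of the process; the counter is never reset between "training requests".
first = paddle.nn.Embedding(10, 4)
print(first.weight.name)   # e.g. "embedding_0.w_0"

second = paddle.nn.Embedding(10, 4)
print(second.weight.name)  # e.g. "embedding_1.w_0"; a later rebuild would yield embedding_2, ...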
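A common way to sidestep per-process state like this is to run each finetune job in a fresh Python process, so every request starts with a clean layer-name counter and a clean optimizer. The sketch below is minimal and assumes that approach: the route name /train_canton_clone comes from the log above, but run_finetune.py is a hypothetical wrapper script (it would call the finetune_train(pretrained_model_dir, dump_dir, output_dir) step seen in the traceback and exit), and its command-line interface is not part of the original service.

import subprocess
import sys

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/train_canton_clone", methods=["POST"])  # endpoint name taken from the log above
def train_canton_clone():
    # Launch the finetune job in its own interpreter so Paddle's layer-name
    # counter and optimizer accumulators do not leak into the next request.
    # "run_finetune.py" and its (omitted) arguments are assumptions, not the
    # service's actual entry point.
    cmd = [sys.executable, "run_finetune.py"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        return jsonify({"status": "error", "log": result.stderr}), 500
    return jsonify({"status": "ok", "log": result.stdout})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)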
Comments (0)