dan/tacotron2
Fixing concatenation error for fp16 distributed training
Branch: master
Author: gkarch (5 years ago)
Parent: 825ffa47d1
Commit: df4a466af2
1 changed file with 1 addition and 1 deletion
distributed.py  +1 -1
@@ -140,7 +140,7 @@ def apply_gradient_allreduce(module):
             buckets = {}
             for param in module.parameters():
                 if param.requires_grad and param.grad is not None:
-                    tp = type(param.data)
+                    tp = param.data.dtype
                     if tp not in buckets:
                         buckets[tp] = []
                     buckets[tp].append(param)
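Note (a hedged reading of the change, not stated in the commit itself): on recent PyTorch versions, type(param.data) is simply torch.Tensor for every parameter, so under fp16/mixed-precision training half- and full-precision gradients land in the same bucket, and the later flattening step, which concatenates each bucket, can hit the dtype mismatch that the commit message calls a "concatenation error". Keying the buckets on param.data.dtype keeps the precisions apart. A minimal sketch of the idea, assuming the torch._utils flattening helpers typically used with this bucketing pattern; the gradient list and the all-reduce placeholder are hypothetical:

    import torch
    from torch._utils import _flatten_dense_tensors, _unflatten_dense_tensors

    # Hypothetical gradient tensors as seen under fp16 training: some half, some float.
    grads = [torch.randn(4).half(), torch.randn(3)]

    # type(g) is torch.Tensor for both, so bucketing by type() would merge them,
    # and concatenating a mixed-dtype bucket is what can fail (depending on the
    # PyTorch version) with the error the commit message refers to.
    buckets = {}
    for g in grads:
        buckets.setdefault(g.dtype, []).append(g)  # bucket by dtype, as in the fix

    for dtype, bucket in buckets.items():
        flat = _flatten_dense_tensors(bucket)      # same-dtype bucket: concat succeeds
        # dist.all_reduce(flat) would go here in real distributed training
        for g, synced in zip(bucket, _unflatten_dense_tensors(flat, bucket)):
            g.copy_(synced)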