
paddle.grad cannot compute higher-order derivatives #64386

Open
ChevalierOhm opened this issue May 16, 2024 · 2 comments

@ChevalierOhm

My code is as follows:
import paddle
import numpy as np

Construct a function:

def transform(x):
    return 4 * x ** 3 + 1

x = paddle.rand(shape=[5], dtype=np.float32)
x.stop_gradient = False

print(x)

outputs = transform(x)

print(outputs)

Compute the second derivative with paddle.grad:

dudx = paddle.grad(outputs, x, grad_outputs=paddle.ones_like(x), retain_graph=True, create_graph=True)[0]
print(dudx)
d2udx2 = paddle.grad(dudx, x, grad_outputs=paddle.ones_like(x), retain_graph=True, create_graph=True)[0]
print(d2udx2)

The code above raises an error. Is something wrong with it?
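For reference, the values the two paddle.grad calls should produce are easy to derive by hand: for u = 4x^3 + 1, du/dx = 12x^2 and d2u/dx2 = 24x. A minimal NumPy sketch (independent of Paddle; the seed and float64 dtype are illustrative choices, not from the report) confirms the analytic derivatives against central finite differences:

```python
import numpy as np

def transform(x):
    return 4 * x ** 3 + 1

rng = np.random.default_rng(0)
x = rng.random(5)  # analogous to the paddle.rand input above

# Analytic derivatives of u = 4*x**3 + 1
dudx = 12 * x ** 2   # first derivative
d2udx2 = 24 * x      # second derivative

# Central finite differences as an independent cross-check
h = 1e-4
fd1 = (transform(x + h) - transform(x - h)) / (2 * h)
fd2 = (transform(x + h) - 2 * transform(x) + transform(x - h)) / h ** 2

print(np.allclose(dudx, fd1))               # True
print(np.allclose(d2udx2, fd2, atol=1e-4))  # True
```

If paddle.grad is working correctly, dudx and d2udx2 from the snippet in the report should match these closed-form values elementwise.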

@ChevalierOhm
Author

Error:
ValueError Traceback (most recent call last)
/tmp/ipykernel_95/2675772064.py in <module>
12 dudx = paddle.grad(outputs, x, grad_outputs=None, create_graph=True)[0]
13 print(dudx)
---> 14 d2udx2 = paddle.grad(dudx, x, grad_outputs=None, create_graph=True)[0]
15 print(d2udx2)
in grad(outputs, inputs, grad_outputs, retain_graph, create_graph, only_inputs, allow_unused, no_grad_vars)
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/wrapped_decorator.py in impl(func, *args, **kwargs)
24 def impl(func, *args, **kwargs):
25 wrapped_func = decorator_func(func)
---> 26 return wrapped_func(*args, **kwargs)
27
28 return impl
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/framework.py in impl(*args, **kwargs)
545 % func.name
546 )
--> 547 return func(*args, **kwargs)
548
549 return impl
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/dygraph/base.py in grad(outputs, inputs, grad_outputs, retain_graph, create_graph, only_inputs, allow_unused, no_grad_vars)
689 retain_graph, create_graph,
690 only_inputs, allow_unused,
--> 691 no_grad_vars)
692 else:
693 place = core.Place()
ValueError: (InvalidArgument) The 0-th input does not appear in the backward graph. Please check the input tensor or set allow_unused=True to get None result.
[Hint: Expected allow_unused == true, but received allow_unused:0 != true:1.] (at /paddle/paddle/fluid/eager/general_grad.h:471)

@zoooo0820
Contributor

Hi, this issue could not be reproduced on the current develop branch. Please update Paddle and try again.
