I am trying to understand the limitations of the autodiff library. For a vec → vec function, do I need Eigen vectors, or can I also use conventional C++ arrays? For example, how can I make the following work with minimal modification?
```cpp
#include <iostream>

#include <autodiff/forward/real.hpp>

typedef autodiff::real double_ad;

// Elementwise cubic: y[i] = x[i]^3 + 2 x[i]^2 + 3 x[i] + 4
void f(double_ad* x, double_ad* y, int size)
{
    for (int i = 0; i < size; i++)
    {
        y[i] = x[i] * x[i] * x[i] + 2 * x[i] * x[i] + 3 * x[i] + 4;
    }
}

int main()
{
    double_ad x_ad[3] = {1, 2, 3};
    double_ad y_ad[3];
    double_ad y_in[3] = {1, 1, 1};  // intended direction vector

    f(x_ad, y_ad, 3);

    for (int i = 0; i < 3; i++) {
        std::cout << "y_ad[" << i << "] = " << y_ad[i] << std::endl;
    }

    // This is the part I am unsure about: can derivatives() be called on raw
    // arrays like this, or does it require a callable and Eigen types?
    auto y_diff = autodiff::derivatives(y_ad, autodiff::along(y_in), autodiff::at(x_ad));

    return 0;
}
```
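For reference, every vec → vec example I have found in the autodiff docs passes the function itself (so the library can re-evaluate it with seeded inputs) together with Eigen types, e.g. the jacobian() driver. Below is a minimal sketch of that documented pattern, rewritten for my cubic above; I am not sure whether raw arrays can be substituted anywhere here:

```cpp
#include <iostream>

#include <autodiff/forward/real.hpp>
#include <autodiff/forward/real/eigen.hpp>

using namespace autodiff;

// Same elementwise cubic as above, written against Eigen vectors so that
// jacobian() can re-evaluate it with seeded inputs.
VectorXreal f_eigen(const VectorXreal& x)
{
    VectorXreal y(x.size());
    for (int i = 0; i < x.size(); i++)
        y[i] = x[i] * x[i] * x[i] + 2 * x[i] * x[i] + 3 * x[i] + 4;
    return y;
}

int main()
{
    VectorXreal x(3);
    x << 1, 2, 3;

    VectorXreal y;                                  // filled with f(x) by jacobian()
    Eigen::MatrixXd J = jacobian(f_eigen, wrt(x), at(x), y);

    std::cout << "J =\n" << J << std::endl;
}
```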
Aside: I am trying to port my library from Enzyme to autodiff (for ease of packaging and porting). What would be good practices to keep in mind?
My current structure is to call a "gradient" function which takes a functor-like class (with a compute method) and a variable array pointer as input, along with a d_variable array pointer, and saves the derivatives in the d_variable array. I would like to keep this structure intact for backward compatibility.
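For context, the interface I want to preserve looks roughly like this (the names here are illustrative, not my exact code):

```cpp
// Sketch of the existing Enzyme-era interface (hypothetical names):
// compute() evaluates the model, gradient() fills d_variable with derivatives.
struct Model {
    void compute(const double* variable, double* result, int n) const;
};

template <typename Functor>
void gradient(const Functor& f, double* variable, double* d_variable, int n);
```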
I was planning to make the compute method accept and return Eigen vectors, with my gradient function copying the incoming data into an Eigen vector and computing the derivative via a simple lambda, as shown in the previous discussion here. Is there a more efficient way to achieve this?
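Concretely, the wrapper I had in mind looks something like the sketch below, using autodiff's jacobian()/wrt()/at() helpers from <autodiff/forward/real/eigen.hpp>. The compute method is assumed to have been ported to take and return VectorXreal; the names and the d_variable layout are placeholders, not my actual code. The copy in and copy out on every call is the part that feels wasteful, hence my question:

```cpp
#include <autodiff/forward/real.hpp>
#include <autodiff/forward/real/eigen.hpp>

// Hypothetical wrapper keeping the raw-pointer interface: copy the inputs into
// an Eigen vector, differentiate compute() through a lambda, copy results out.
template <typename Functor>
void gradient(const Functor& functor, double* variable, double* d_variable, int n)
{
    autodiff::VectorXreal x(n);
    for (int i = 0; i < n; i++)
        x[i] = variable[i];                         // copy in

    auto wrapped = [&](const autodiff::VectorXreal& v) -> autodiff::VectorXreal {
        return functor.compute(v);                  // compute() ported to Eigen types
    };

    autodiff::VectorXreal y;                        // compute(x), filled by jacobian()
    Eigen::MatrixXd J = autodiff::jacobian(wrapped, autodiff::wrt(x), autodiff::at(x), y);

    // Copy the Jacobian back into the caller's flat array (row-major here;
    // the real layout would follow my existing d_variable convention).
    for (int i = 0; i < J.rows(); i++)
        for (int j = 0; j < J.cols(); j++)
            d_variable[i * n + j] = J(i, j);
}
```

Since autodiff::real is a distinct type from double, I suspect some conversion on input is unavoidable; what I would like to know is whether the library offers a cheaper path than this full round trip.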