
Attribute omissions #1210

Open
lintao185 opened this issue Jan 26, 2024 · 7 comments

Comments

@lintao185

Lately, I have been trying to migrate features from Thop to TorchSharp, and in the process I have discovered several attribute omissions in specific model components:

Properties of convolutional layers, such as groups, are currently not exposed.

 public Tensor count_convNd(nn.Module<Tensor, Tensor> m, Tensor x, Tensor y)
 {
     var weight = m.get_parameter("weight");
     var bias = m.get_parameter("bias");
     var kernel_ops = torch.zeros(weight.size()[2..]).numel();
     var bias_ops = bias is not null;
     var conv = m as Convolution;
     // To be updated: Convolution does not currently expose 'groups'.
     TotalOps += calculate_conv2d_flops(
         input_size: x.shape,
         output_size: y.shape,
         kernel_size: weight.shape,
         groups: conv.groups,
         bias: bias_ops
     );
     return null!;
 }
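For context, here is a Python sketch of the count that calculate_conv2d_flops is expected to produce (Thop, the library being ported, is Python). The body below is my assumption based on the standard multiply-accumulate count for grouped convolution, not TorchSharp code:

```python
from math import prod

def calculate_conv2d_flops(input_size, output_size, kernel_size, groups, bias=False):
    # Each output element is a dot product over (in_channels / groups)
    # input channels and the spatial kernel window.
    in_channels = input_size[1]
    macs_per_output = (in_channels // groups) * prod(kernel_size[2:])
    total = prod(output_size) * macs_per_output
    if bias:
        total += prod(output_size)  # one extra add per output element
    return total

# A 3x3 conv, 3 -> 16 channels, on a 1x3x32x32 input (padding keeps 32x32):
flops = calculate_conv2d_flops(
    input_size=(1, 3, 32, 32),
    output_size=(1, 16, 32, 32),
    kernel_size=(16, 3, 3, 3),
    groups=1,
    bias=True,
)
```

Note that groups divides the per-output work, which is exactly why the hook above needs conv.groups exposed.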

Attributes of Softmax layers, notably dim, are also missing.

 public Tensor count_softmax(nn.Module<Tensor, Tensor> m, Tensor x, Tensor y)
 {
     var softmax = m as Softmax;
     // To be updated: Softmax does not currently expose 'dim'.
     var nfeatures = x.size()[softmax.dim];
     var batch_size = x.numel() / nfeatures;
     TotalOps += calculate_softmax(batch_size, nfeatures);
     return null!;
 }
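The softmax count being delegated to calculate_softmax can likewise be sketched in Python; the body is my assumption of the usual convention (one exponential per element, a sum over each row, one division per element):

```python
def calculate_softmax(batch_size, nfeatures):
    # Per row of nfeatures values:
    #   nfeatures exponentials, (nfeatures - 1) additions for the
    #   normalizing sum, and nfeatures divisions.
    exp_ops = nfeatures
    add_ops = nfeatures - 1
    div_ops = nfeatures
    return batch_size * (exp_ops + add_ops + div_ops)

# Softmax over the last dim of a (2, 10) input: 2 rows of 10 features.
ops = calculate_softmax(batch_size=2, nfeatures=10)
```

The dim attribute matters because nfeatures is the size of the reduced dimension, and batch_size is everything else, as in the hook above.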

Properties of Linear layers, including in_features, are also not exposed at this time.

 public Tensor count_linear(nn.Module<Tensor, Tensor> m, Tensor x, Tensor y)
 {
     var linear = m as Linear;
     var total_mul = linear.in_features;
     var num_elements = y.numel();

     TotalOps += calculate_linear(total_mul, num_elements);
     return null!;
 }
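The linear-layer count is the simplest of the three; a Python sketch of what calculate_linear presumably computes (one multiply-accumulate per input feature for every output element; again an assumption mirroring the Thop convention):

```python
def calculate_linear(in_features, num_elements):
    # Every output element is a dot product of length in_features,
    # i.e. in_features MACs per element of the output tensor.
    return in_features * num_elements

# A 128 -> 64 linear layer on a batch of 32: output has 32 * 64 elements.
ops = calculate_linear(in_features=128, num_elements=32 * 64)
```

This is why the hook needs in_features from the module itself: it cannot be recovered from the output shape alone.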

etc.

@NiklasGustafsson
Contributor

Thank you! @shaltielshmid and I are going to systematically go over these modules and make the attributes available.

@NiklasGustafsson
Contributor

NiklasGustafsson commented Jan 26, 2024

@shaltielshmid -- as we talk about the attribute (property) exposure, it reminds me that I started on an effort to move more of the module logic into managed code.

Could you take a look at my (draft) PR for that? It's not meant to be merged, just looked at. I think it may be a better starting point for this work (and avoids a lot of future merge conflicts).

@shaltielshmid
Contributor

Will do!

@lintao185
Author

Will all attributes be made available in the future?

@shaltielshmid
Contributor

Yes, we are working on an update which should make all the attributes available.

@NiklasGustafsson
Contributor

NiklasGustafsson commented Feb 26, 2024

Yes, it is the intent to do that. Right now, my two priorities are:

  1. Upgrade to libtorch 2.2.1.
  2. Add support for Apple Silicon in the OSX builds.

The refactoring that will allow attributes to be exposed more consistently comes after.

@lintao185
Author

Thank you both very much for your efforts, I believe that TorchSharp will become even more powerful.
