Allow decoders to decode Python types derived from primitives #2230
Comments
Related: #1887
There are a few workarounds for this:
As for the suggestion itself, my main concern about adding codecs for primitive types is that they would be invoked on every attempt to convert to a .NET primitive, which in many cases would have a noticeable performance impact.
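The overhead the maintainer is worried about can be illustrated with a small sketch (this is a simplified simulation in Python, not pythonnet's actual dispatch code): routing every primitive conversion through a registry of codecs adds a lookup loop on top of the raw conversion.

```python
import timeit

# Simulated decoder registry: every conversion consults a list of codecs
# in order, mimicking the per-call dispatch cost of codec-based decoding.
codecs = [lambda v: None, lambda v: None, lambda v: float(v)]

def convert_with_codecs(v):
    for codec in codecs:
        result = codec(v)
        if result is not None:
            return result
    raise TypeError(f"no codec for {type(v).__name__}")

direct = timeit.timeit("float(3)", number=100_000)
dispatched = timeit.timeit(
    "convert_with_codecs(3)", globals=globals(), number=100_000
)
print(f"direct: {direct:.4f}s, via codec registry: {dispatched:.4f}s")
```

The registry version does the same work plus a loop and two failed probes per call, which is the kind of cost that becomes noticeable when it is paid on every primitive conversion.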
How about adding a flag:
We will not be adding flags, as every flag adds a whole new dimension to the testing matrix.
@Bluubb, can you tell us more about your use case? Is there a reason you can not make a function to replace …?
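The maintainer's suggestion of a plain conversion function could look roughly like this on the Python side (a hedged sketch; `to_primitive` is a hypothetical helper, not a pythonnet API). NumPy scalars all expose `.item()`, which returns the equivalent built-in Python type, so values can be normalized before they cross into .NET:

```python
import numpy as np

def to_primitive(value):
    """Hypothetical helper: turn NumPy scalars into built-in primitives.

    np.generic is the common base class of all NumPy scalar types, and
    .item() converts a scalar to the matching Python built-in type.
    """
    if isinstance(value, np.generic):
        return value.item()
    return value

print(type(to_primitive(np.int32(42))))    # <class 'int'>
print(type(to_primitive(np.float64(1.5))))  # <class 'float'>
```

Calling such a function at the Python/.NET boundary sidesteps the codec machinery entirely, at the cost of having to remember to call it.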
Environment
Details
We have used the preview version in the past and wrapped it, so that we are able to update the package and code against interfaces for future releases. One part of the wrapper converts NumPy types. By default, such types are no longer supported in the release version, and the recommended approach seems to be codecs instead. I registered a codec (same as pandanet), but
PyObject.As<double>()
does not convert PyType = <class 'numpy.int32'>
with the help of the codec. By debugging, I found it never hits PyObjectConversions.TryDecode, because double is not equal to the "object type" and is not one of the "DecodableByUserTypes".
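The underlying type mismatch can be seen from Python alone: in Python 3, `numpy.int32` is not a subclass of the built-in `int`, so any strict type check on the .NET side will not treat it as a primitive. (A small demonstration, independent of pythonnet:)

```python
import numpy as np

x = np.int32(7)

# numpy.int32 does not derive from Python's built-in int in Python 3,
# so a strict "is this a primitive?" check never matches it.
print(issubclass(np.int32, int))  # False
print(isinstance(x, int))         # False

# .item() yields the exact built-in type a primitive decoder expects.
print(type(x.item()) is int)      # True
```

This is consistent with the debugging observation above: the value arrives as a NumPy scalar type, not as something the primitive-conversion path recognizes.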
TODO
To reproduce:
Wish: a
ToPrimitive(...)
function which fails.