We are creating a 'build tools'-like package which will depend on a cross-compiler (e.g. gcc) as well as on include headers and libraries to be used for the cross-compile targets.
You could compare it to simply depending on a cross-compiling gcc toolchain that bundles pre-built host libraries.
We are wondering what the recommended approach would be.
Should we create two packages and have the conanfile.pys declare both a build and a runtime/host dependency, since we need the build profile to select the build tools and the host profile to choose the correct include headers, libraries, etc.?
Or is there a way to package everything in a single package (we'd prefer that)? In that case, however, we wouldn't know how to deal with the build vs. host profiles.
Currently we are not re-packaging everything. But maybe we should?
Maybe our approach is wrong in the first place?
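As a hedged sketch of the two-package split described above (all package names and versions here are hypothetical), a consumer recipe would declare the tooling as a tool-requires, resolved with the build profile, and the target headers/libraries as a regular requires, resolved with the host profile:

```python
from conan import ConanFile

class MyAppConan(ConanFile):
    # Hypothetical consumer of the two split packages
    name = "myapp"
    version = "1.0"
    settings = "os", "arch", "compiler", "build_type"

    def requirements(self):
        # Host context: headers and libraries for the cross-compile
        # target, selected by the host profile (-pr:h)
        self.requires("target-sdk/1.0")

    def build_requirements(self):
        # Build context: the cross-compiler tooling itself, runnable on
        # the build machine, selected by the build profile (-pr:b)
        self.tool_requires("cross-gcc/13.2")
```

This is the standard two-context model: `conan install . -pr:h host_profile -pr:b build_profile` resolves each dependency against its own profile.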
For tool-requires including libraries, there are some special considerations:
The libraries included in the tool-requires are often considered part of the toolchain, and are deployed on the production target system together with the toolchain runtime, i.e. basically as part of the system rather than as part of the application. What exactly do you need? Do you want to handle those libraries like any standard library from any other package?
Libraries included in tool-requires very often do not need a find_package() or equivalent to link against: they are part of the toolchain and more or less hardcoded into it. Would your libraries require such a find_package() or equivalent?
For those cases, the common approach is simply to add the libraries inside the package, built with the settings_target inputs (not with settings), and let the toolchain handle them; Conan wouldn't be aware of them at consumption time.
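A minimal sketch of that settings_target idea (recipe name hypothetical): when a package is used as a tool-requires, its recipe can read settings_target and fold the target platform into its package id, so a different binary is produced per cross-compile target:

```python
from conan import ConanFile

class CrossToolsConan(ConanFile):
    # Hypothetical tool-requires package bundling a cross-compiler plus
    # libraries built for the *target* platform.
    name = "my-cross-tools"
    version = "1.0"
    settings = "os", "arch", "compiler", "build_type"

    def package_id(self):
        # Include the target settings in the package id, so each
        # cross-compile target gets its own binary package.
        self.info.settings_target = self.settings_target
        # Assuming only the target os/arch matter for the bundled
        # libraries, drop the rest from the id.
        self.info.settings_target.rm_safe("compiler")
        self.info.settings_target.rm_safe("build_type")
```

The libraries inside would then be built for `settings_target.os`/`settings_target.arch`, while `settings` still describe the machine the tools run on.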
If that is not the case, then two different approaches are possible:
About your questions:
In fact, yes: we are developing some add-on libraries (standard libraries for the platform) which shall be included. However, we will base everything on a standard toolchain which will not be aware of those add-on libraries, so we need to use the find_package() approach.
I'll look into settings_target; so far it never crossed my mind (or eyes, for that matter ;) ).
Currently we are creating two different packages, one containing the libraries (for the host context), and the other one containing the build tooling (build context).
I think we'd be fine with specifying the same dependency twice, once in the host and once in the build context. However, I believe that might cause some issues where the host OS != build OS.
We'll look into it and follow up here once we've decided how to continue.
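For reference, declaring the same reference in both contexts is possible; a minimal sketch, assuming a single hypothetical `cross-sdk` recipe that packages both the tooling and the target libraries:

```python
from conan import ConanFile

class ConsumerConan(ConanFile):
    settings = "os", "arch", "compiler", "build_type"

    def requirements(self):
        # Resolved against the host profile: picks the binary carrying
        # the target headers/libraries.
        self.requires("cross-sdk/1.0")

    def build_requirements(self):
        # Resolved against the build profile: picks the binary carrying
        # tooling runnable on the build machine. When host OS != build OS,
        # these are two different binary packages of the same recipe.
        self.tool_requires("cross-sdk/1.0")
```

Because each context resolves to its own binary package, the host != build case is handled by the profiles rather than by the recipe itself.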