Conan Artifacts Pipeline Handling
Hello! We have been using Conan in our GitLab CI/CD pipeline for quite a while now. We currently have a template pipeline that every Conan project consumes, and it works really well, but we are facing one issue at the moment.
Problem:
In stage "build" the template runs "conan build ..args . " which does whatever the conanfile says. Since every project has a different conanfile the output of files varies. Some projects generate files at folder path x, others at folder path y etc. The issue is that we need all the resources that "build" generates in another stage "deploy" (every stage possible runs on a different node, so we need to tell gitlab-ci what it has to transfer). Currently every project supplies artifact paths to the pipeline template with:
artifacts:
  paths:
    - mycv.pdf
    - myFolder/
But that means we have to maintain both the conanfile and the artifacts section: if a path changes in the conanfile, we also need to change the artifacts paths.
Possible solution:
We thought about moving "conan export-pkg ...args ." from the deploy stage to the build stage. Transferring the whole package would mean that we would have everything we might need in the other stages, but we are not sure what path we would then need to provide to GitLab's artifacts section, since GitLab can only upload files relative to the project root and we have no idea where exactly the package was placed (do we?).
In other ecosystems like Maven or npm it is a bit simpler. In Maven, for example, I can just bring target/* to the next stage, and in npm the consumer of the pipeline only needs to supply the path to the dist folder.
Are there any best practices, or can you suggest a workflow?
Thanks!
But that means we have to maintain both the conanfile and the artifacts section: if a path changes in the conanfile, we also need to change the artifacts paths.
From what I understand, this looks like a "packaging" thing.
So far you are using those conanfiles just as pure "consumers", only with the build() method.
The packaging process, and the package() method, is exactly that: encode somewhere what the "useful" artifacts of the build are and put them in a well-known folder, so they can more easily and automatically be used by other packages.
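A minimal sketch of what that could look like, assuming a Conan 2.x recipe whose build drops its outputs (for example mycv.pdf and myFolder/) somewhere under the build folder; the copy patterns and destination subfolders are just illustrative:

# conanfile.py (sketch)
import os
from conan import ConanFile
from conan.tools.files import copy

class MyProjectConan(ConanFile):
    name = "myproject"
    version = "1.0"

    def build(self):
        # whatever the project already does to produce its outputs
        ...

    def package(self):
        # collect the "useful" build outputs into the package folder
        copy(self, "*.pdf", src=self.build_folder,
             dst=os.path.join(self.package_folder, "docs"))
        copy(self, "*", src=os.path.join(self.build_folder, "myFolder"),
             dst=os.path.join(self.package_folder, "res"))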
We thought about moving "conan export-pkg ...args ." from the deploy stage to the build stage. Transferring the whole package would mean that we would have everything we might need in the other stages, but we are not sure what path we would then need to provide to GitLab's artifacts section, since GitLab can only upload files relative to the project root and we have no idea where exactly the package was placed (do we?).
Yes, conan export-pkg also calls the package() method, and the full conan build + conan export-pkg flow is mostly equivalent to the regular conan create packaging flow.
Once things have been packaged into the Conan cache, the process of extracting them out of Conan for the final release/putting them into production is called deployment, and it consists of copying artifacts from the Conan cache (whose folder layout is internal, and mostly unknown) to a known folder in user space. It can be done with:
Built-in deployers, like conan install --requires=mypkg1/version --requires=mypkg2/version --deployer=full_deploy. You can also use a single conanfile and call conan install . --deployer=...
Custom deployers: you can create your own deployer code, and distribute and update it with conan config install, together with all the rest of the Conan configuration.
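For reference, a minimal sketch of such a custom deployer, assuming Conan 2.x; the file name ci_deploy.py and the copy pattern are hypothetical:

# extensions/deployers/ci_deploy.py, distributed with "conan config install"
from conan.tools.files import copy

def deploy(graph, output_folder, **kwargs):
    # copy the packaged files of every dependency into a known folder in user space
    for name, dep in graph.root.conanfile.dependencies.items():
        copy(dep, "*", src=dep.package_folder, dst=output_folder)

It could then be invoked from the pipeline with something like conan install . --deployer=ci_deploy --deployer-folder=deploy_output, so the GitLab artifacts section can always point at the same project-relative folder, whatever the conanfile does.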
For very special cases, it is possible to implement the deploy() method in a recipe, to specify the deployment of that specific package (and its dependencies), and call it with conan install ... --deployer-package=<pattern>.
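A sketch of that deploy() method, again assuming Conan 2.x (where the target folder is exposed as self.deploy_folder); the copy pattern is illustrative:

# conanfile.py (sketch)
from conan import ConanFile
from conan.tools.files import copy

class MyProjectConan(ConanFile):
    name = "myproject"
    version = "1.0"

    def deploy(self):
        # runs for packages matching "conan install ... --deployer-package=<pattern>":
        # copies the packaged files from the Conan cache into the deploy folder
        copy(self, "*", src=self.package_folder, dst=self.deploy_folder)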