The images: remote has standard images for several Linux distributions.
incus image list images:
The next step in customizing those images is to use cloud-init. You still use the standard images, but when an instance is first launched you pass it cloud-init instructions and it gets customized automatically. See the Incus documentation for more, or my recent tutorial at How to customize Incus containers with cloud-init – Mi blog lah!
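For example, a minimal cloud-config for this could look like the following (the package list and SSH key here are placeholders, not anything from your setup):

```yaml
#cloud-config
package_update: true
packages:
  - nginx
ssh_authorized_keys:
  - ssh-ed25519 AAAA... me@example
```

You would put this under the cloud-init.user-data key of a profile, or pass it with --config when launching, and the instance applies it on first boot.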
The ultimate way to customize images is to use distrobuilder to create the images on your own. The standard Incus images from images: have been created with distrobuilder, and you can start off from the actual configuration files. (note: are these the actual config files?)
You may consider having a separate Incus server that hosts the images and shares them as a separate remote. You can view your current remotes with incus remote list.
I read your article, thanks! I was able to follow and use cloud-init with profiles.
The problem is, I would like to have the custom image or profile as a file in a repo,
so that my colleagues can run a single command and get the customized instance.
How can I achieve that without setting up a private remote?
Thank you, that doesn’t feel right though.
I will try explaining my scenario better.
We have a project, P0, that contains the Ansible playbooks for our servers. P0 is under version control.
Currently, when I change something in a playbook, I test it on a local virtual machine.
So, I wrote a script in P0 to automate the testing phase a bit.
The script does the following:
1. Creates and starts some VM instances: VM1, VM2, VM3…
2. Builds an Ansible test inventory with the VM1, VM2, VM3… IP addresses
3. Runs the playbooks on those VMs
4. Makes some assertions to verify that the playbooks worked correctly
5. Stops and deletes VM1, VM2, VM3…
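The steps above could be sketched roughly like this (the instance names, image, playbook path, and inventory format are my assumptions, not your actual script):

```shell
#!/bin/sh
# Sketch of the five steps above; adjust names and paths to your setup.
set -eu

VMS="vm1 vm2 vm3"
INVENTORY="test-inventory.ini"

launch_vms() {                  # step 1: create and start the VMs
  for vm in $VMS; do
    incus launch images:debian/12 "$vm" --vm
  done
}

build_inventory() {             # step 2: collect the IPs into an inventory
  : > "$INVENTORY"
  for vm in $VMS; do
    # -c 4 -f csv prints the IPv4 column, e.g. "10.226.149.7 (eth0)"
    ip=$(incus list "$vm" -c 4 -f csv | cut -d' ' -f1)
    printf '%s ansible_host=%s ansible_user=root\n' "$vm" "$ip" >> "$INVENTORY"
  done
}

run_playbooks() {               # step 3: run the playbooks on the VMs
  ansible-playbook -i "$INVENTORY" site.yml
}

check_results() {               # step 4: assert the playbooks worked
  ansible all -i "$INVENTORY" -m ping
}

cleanup() {                     # step 5: stop and delete the VMs
  for vm in $VMS; do
    incus delete --force "$vm"
  done
}
```

You may need a short wait loop in build_inventory until the agent reports an address, since a freshly launched VM has no IP for a few seconds.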
When I create the VM instances at point 1, I would like to be able to
set up SSH and a root key.
Right now I am doing it with a series of incus exec ... commands from the script,
but if I could start the instances from a custom image, that would be simpler.
The profile solution sounds OK; I can set up a test profile from the script.
The missing piece in my head is how to glue together Incus and cloud-init to somehow put the custom image under version control, so my test script can use it.
If you need version control for the devices section in addition to the cloud-init.user-data section, you can simply update the whole profile from a file, like this:
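A sketch of what that file could look like, assuming a profile named test and a repo file test-profile.yaml (the SSH key, storage pool, and network names are placeholders):

```yaml
# test-profile.yaml -- the whole profile, config and devices, under version control
config:
  cloud-init.user-data: |
    #cloud-config
    ssh_authorized_keys:
      - ssh-ed25519 AAAA... test@example
devices:
  root:
    path: /
    pool: default
    type: disk
  eth0:
    name: eth0
    network: incusbr0
    type: nic
```

Your script can then run incus profile create test once, and incus profile edit test < test-profile.yaml on every run to sync the profile with the repo.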
Big question: when I add the cloud-init configuration that way, why do I get those weird characters 2- at the end of the line? It still works, though.
Images are binary and typically a few hundred MB in size, so I don’t version them the same way as code. I version each image by putting it in a directory with a timestamp.
You can distribute things like cloud-config files and image names with git, and distribute the binary images with something like rsync or HTTPS + basic auth.
Here’s a “du -sh *” listing of a few of my images:
Each of these is a directory containing a single file called “image.tar.gz”, which is created by running “incus image export” on a snapshot of a configured container or VM. The “a-” prefix stands for Alpine.
When I export/import the image, I use the timestamped directory name as the image alias, so it can co-exist with other versions. I have a mapping somewhere that maps plain names to aliases, so when a script needs to use the “nginx” image, it really uses the image a-nginx-20240203-0834.
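The export side of that workflow could be sketched roughly like this (the instance name, snapshot name, and date format are my assumptions based on the listing above):

```shell
#!/bin/sh
# Publish a configured instance as a timestamped image and export it
# into a matching directory, e.g. a-nginx-20240203-0834/image.tar.gz.
set -eu

publish_versioned_image() {
  instance=$1                                   # e.g. "a-nginx"
  version="$instance-$(date +%Y%m%d-%H%M)"
  incus snapshot create "$instance" export-snap
  incus publish "$instance/export-snap" --alias "$version"
  mkdir -p "$version"
  # For single-tarball images this writes <version>/image.tar.gz.
  incus image export "$version" "$version/image"
  echo "$version"
}
```

On the import side, incus image import with --alias set to the directory name restores the same versioned alias on another machine.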
I typically create custom images by just installing packages to another image. I apply any other configuration when launching the image, with cloud-config files.