Welcome everybody to the Rust Packaging Tutorial. I'm Fabio Valentini; my nickname is decathorpe practically everywhere, so you can reach me on all those social media channels if you have questions. I'm a member of the Rust Special Interest Group in Fedora, a member of the Packaging Committee, and a member of the Fedora Engineering and Steering Committee, so you've probably seen me around.

My plan for today is to give a short introduction into how Rust crates work and how those properties translate into how we do Rust packaging in Fedora: how we determine features and optional dependencies, build requires, requires, and provides; which of those things are done automatically and which you have control over; and a short section about how we map semantic versioning to RPM version requirements, because there's often some amount of confusion about that. The second part is then looking at real-world examples. My plan is to look at what creating a Rust package from scratch looks like, how to create an update for an existing package that can be updated independently, how to handle multi-crate or multi-package updates for crates that need to be updated together, how to correctly do bindings for system libraries, and how to create compatibility packages for when you need two versions of a crate available in parallel. How many of those examples we will cover will depend on how much time we have left at the end, but I have prepared examples for all those cases, so we have enough to do, I think.

If you want to follow along with the tutorial later, you will need a working packaging toolchain, and you will need rust2rpm, preferably version 22, because it has lots of improvements over version 21. You'll probably also need rpmautospec, because we use that by default for all new Rust packages, and we are also converting existing ones to use it. So those are the package names.
fedpkg, rust2rpm, and rpmautospec. If you install all three packages, you should be set for the tutorial, and you've got time to do that until the introduction is over. Okay, any questions so far? I've got an eye on the chat, so if there's anything, just feel free to ask at any point. Okay, perfect, then let's start with the introduction.

First of all: what happens when you build Rust crates, and what do the different build phases of RPM packages map to when you're talking about Rust code or Rust crates? In the prep phase, we unpack the .crate files, which are basically gzipped tarballs, and then we set up a local build environment for cargo with the %cargo_prep macro. We use %generate_buildrequires to generate BuildRequires from crate metadata automatically, so you don't have to take care of that yourself. (Okay, multiple people don't hear anything... okay, at least some people have audio, so I'll continue. I've also put the slides online; they should be linked from the schedule, but I can also add a link in the chat. Give me a second; I wanted to do that at the beginning but I forgot. That should be a working link to the slides if you want to look at them yourself.)

Okay, right. In the build phase, we compile the crates with the specified feature flags, and that ensures that we only ship working code: if a crate doesn't even compile, the build phase fails. We run this on all architectures even though the produced packages are noarch, so architecture-independent, because we want to ensure that they build on all architectures we support. Then, in the install step, we copy the crate source code into the path for the local registry, where we install all crates that are packaged as RPMs, and any binaries that are produced by the crate are installed into the directory for binaries in the buildroot. And, when enabled, there's a check section where we compile the unit tests and run all the tests.
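The phase mapping just described corresponds to a handful of macros in the spec files that rust2rpm generates. As a sketch (exact template details vary between rust2rpm versions), the body of a generated spec looks roughly like this:

```spec
%prep
%autosetup -n %{crate}-%{version_no_tilde} -p1
%cargo_prep

%generate_buildrequires
%cargo_generate_buildrequires

%build
%cargo_build

%install
%cargo_install

%if %{with check}
%check
%cargo_test
%endif
```

Each macro implements one of the phases from above: unpacking and setting up the cargo environment, generating BuildRequires from crate metadata, compiling, copying sources and binaries into the buildroot, and running the test suite.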
So there are usually unit tests, integration tests, and tests that are embedded in the documentation, and they are all built and run during the check section by default.

Okay, what do .crate files look like? Cargo.toml is mandatory; that's project metadata and dependencies, and that's basically what we use for generating the Rust packages. Then there's of course the source code itself, and sometimes crates need some dynamic behavior at build time, for example to generate some code; that is controlled via build.rs, basically build scripts that are run at build time. Then there's optional stuff: license files, README files, source code for tests, examples, benchmarks, and often some data files that are used as inputs for unit tests. And yes, you heard that correctly: license files are technically optional, but we require them for packaging for Fedora. Most upstream projects are good about including license files, and if they don't, then if you poke them they do the right thing, because we need the license files for the source code to be redistributable.

Okay, this is what the Cargo.toml file looks like; I hope it's not too small so you can see at least a little bit. The [package] table defines the project metadata, and we have some stuff in there that we use. For example, the name of the crate maps to the package name; the version is used to populate the Version field of the RPM, though we do some translation of characters that are valid in semantic versioning but not valid in RPM versions; we use the description string to populate the Summary and description tags of the RPM; and we use the license string, which is an SPDX expression by default, to populate the License tag of the RPM package. And one nice simplification that we managed to do recently: because Fedora now allows SPDX license expressions in License tags, we just use the SPDX expression from upstream directly instead of translating it to the equivalent Fedora expression.
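To make that concrete, here is a sketch of the [package] fields that rust2rpm reads, for a hypothetical crate named foo (the values are invented for illustration):

```toml
[package]
name = "foo"                   # crate name, maps to the RPM package name rust-foo
version = "1.0.1"              # maps to Version: (after semver-to-RPM translation)
description = "Does things"    # maps to Summary: and the description text
license = "MIT OR Apache-2.0"  # SPDX expression, now used verbatim as License:
```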
Basically, it's just the hyphen that's problematic in semantic versioning, and we translate that to the tilde character. I'm not sure if there are other cases as well, but that's the main one: if you have pre-releases, those are separated with a hyphen in semantic versioning syntax, and in RPM syntax we use the tilde character. That's the most important thing we do for version string compatibility, and we have macros and automation to do that, so you don't have to do it by hand.

Okay, then just to explain the default layout of Rust source code that's published to crates.io: you usually have a src/ directory with other directories, Rust source files, and unit tests; the unit tests can use private APIs, and they're compiled into a single test runner when running the tests. Then you have src/lib.rs, which is the main entry point for libraries, and src/main.rs or src/bin/*.rs for applications; that's optional. Then we have one directory for tests, one directory for example code, and one directory for benchmarks. The files under tests/ are integration tests; they don't have access to internal APIs of the crate, they can only use public APIs, so those are basically integration tests for testing the public API. The example code in the examples/ directory is only compiled when running the tests but not run, so basically the tests just ensure that the examples are up to date and still compile. There are also sometimes benchmarks, but for RPM packaging purposes those files are unused, because we neither compile nor run the benchmarks. Those are the defaults, and you can customize everything about this with explicit settings in Cargo.toml, so if a crate you want to package has some weird directory layout or puts files in weird places, you might need to look at the Cargo.toml metadata for how the source code is structured, but that's usually the default. Then we need to map the semantic versioning strings
that are used by Rust crates onto compatible RPM version strings, and that usually involves things like pre-releases. We don't often package pre-releases, but as I said, we need to replace the hyphen character, which denotes a pre-release in semantic versioning, with the tilde character, which denotes a pre-release in RPM versions. That works both ways, so the conversion is lossless, and we also need to do this conversion in both directions when running automation for packaging. There's also sometimes additional information in version strings: for example, when you're packaging bindings for C libraries, they sometimes put additional metadata into the version. In this case, the version of the bindings for libcurl uses it to denote which version of the curl library is bundled if you want to build with vendored dependencies; but since we don't do that in Fedora packages, we drop this extra metadata suffix. We also don't need it for dependency resolution, so we usually patch it out, because it's not information that's useful for us or anything that we need.

Okay, now something that's a bit less obvious: Rust crates can define a set of features and optional dependencies, which can be on by default or off by default. So there's a set of default features, which can depend on some optional dependencies, and then there are non-default features that can also depend on optional dependencies, and we need to map that to RPM concepts somehow to keep the metadata and the inter-crate dependencies correct. There's also a mode where you can compile Rust crates with no default features, which basically turns off all default and all non-default features. You have to keep that in mind when you build crates, especially when you build applications, because sometimes people will yell at you: "this application in Fedora doesn't support X and feature Y", and then you look at it and see, okay, those would be optional features that we don't have enabled in our packaging, but if you
enable the optional features, then those features are supported in the binaries that we build, so you sometimes have to look out for that. That's also important: which features and optional dependencies are enabled affects the functionality of the crate, and sometimes also its behavior, because it can affect conditional compilation, so it can affect which code is compiled and which code is included.

Features and optional dependencies are then mapped to RPM sub-packages by rust2rpm, and we have automatic generation of virtual Provides and Requires between those sub-packages, so you don't have to do that yourself. That last point is important: the set of features and optional dependencies can change with every version of a crate. It's usually not good to remove a feature in a patch release, but theoretically any version of a crate can change the set of features and dependencies, so it's important to re-run rust2rpm for every new version, because only that ensures that there are RPM sub-packages for all the features and optional dependencies. If you don't re-run rust2rpm, that can result in broken packages: if the crate adds a new optional dependency but you don't generate a sub-package for it, then the feature sub-package that references it will have a broken dependency. And rpmautospec really comes in handy here, because you don't have to restore the changelog for the package when you regenerate the spec with rust2rpm for a new version; that has really cut down the work we need to do for updating things to new versions.

Right, I already mentioned that we generate RPM spec files for crates which do a test build during the build phase, so we ensure that we don't ship any broken code: anything that we ship should compile, and if it builds in this crate, it should build when you build anything against it. And we need dependencies and build
dependencies that are specified in Cargo.toml available during the build, otherwise you don't have the dependencies that are necessary for the build; these are automatically generated by the %cargo_generate_buildrequires macro. And when you run package builds that use your crate, those dependencies are generated against the rust-<name>-devel package, so that handles transitive dependencies, basically. By default, we also run the test suite for packaged crates when possible, just to make sure that we actually ship working code, and we've caught lots of bugs this way: sometimes code claims to be compatible with certain architectures, but when you run the test suite, you see that it crashes on big-endian architectures because they hardcoded byte order in some integer handling, stuff like that, and then you get to file a bug report for it. Yeah, sometimes that's not fun, but in the end it makes sure that we actually ship code that should work, and you don't get that benefit when you build with vendored dependencies, because then you don't run the test suites of all your vendored dependencies. Right, and the dependencies that are needed for running the tests are not included when the dependency generator runs for the -devel package, because the tests are only run when you build the crate itself, not when you include the crate in the build of some other package.

Okay, then we have automatic generation of virtual Provides that are standardized. For example, the sub-package with just the -devel suffix gets a virtual Provides of crate(<name>) = <version>-<release>; for the +default feature you get a Provides for the crate with the default features, crate(<name>/default); and for some feature "bar" you get a Provides for the crate with the bar feature, crate(<name>/bar). And those are what's referenced by the automatically generated Requires from the dependency generators and by the BuildRequires generator. Okay, then we need to map semantic version requirement syntax to RPM. You can specify
that your crate needs version 0.1 of another crate; the syntax is based on what's used for dependencies in cargo itself, so it's not really something that we invented, it's based on what upstream can do; I can show an example later. Right, so if your crate specifies a dependency on foo 0.1, that's equivalent to using the caret requirement ^0.1 and equivalent to using the tilde requirement ~0.1; those all map to "I want a version greater than or equal to 0.1.0 and smaller than 0.2.0", i.e. any version that's compatible with 0.1 of this crate. The RPM dependency that's generated is exactly that: anything between 0.1.0 inclusive and 0.2.0 exclusive, and we need the tilde character in the upper bound ("< 0.2.0~") because otherwise pre-releases of 0.2.0 would be included here.

For post-1.0 releases, the tilde style is not equivalent, and that sometimes causes problems, because upstream projects may not really realize that it doesn't do the same thing, or they do something weird in their dependencies. foo 1.1 and caret ^1.1 still resolve, as you would expect, to foo >= 1.1.0 and foo < 2.0.0, but the tilde requirement ~1.1 resolves to >= 1.1.0 and < 1.2.0, so that is a stricter requirement than the bare one. That sometimes causes problems for us, because we don't really expect packages to do this: we update packages to versions that are compatible according to semantic versioning, but if crates use the tilde requirement for post-1.0 releases, that breaks. Let's say we update foo from 1.1 to version 1.3; those are still compatible, but a project that restricts foo to between 1.1 and 1.2 then has a broken dependency. So if you see a crate that uses a tilde version requirement for any post-1.0 release, you should remove that, because it breaks the assumptions we make for packaging, and especially for updating versions that should be
compatible according to semantic versioning.

Okay, now let's look at an example. We have some metadata here: we have a crate with the name foo and the version 1.0.1; we have a dependency bar in version 1.0.2 that's optional; we have dev-dependencies for running the tests, in this example foo-test-data in version 0.1; and we have two features: the default feature, which has no dependencies, and a foobar feature, which depends on the bar dependency. You can see you can specify which features depend on which other features or which optional dependencies; in this case the foobar feature depends on the bar optional dependency, and you'll see in a moment that the RPM Provides and Requires generated by the Rust tooling reflect that one-to-one.

Right, okay. Since this crate has no dependencies by default, it will just get BuildRequires on rust-packaging, which pulls in all the RPM packaging tools for Rust, plus a dependency on cargo and on the Rust compiler, just to run the test build and for everything in the packaging machinery to work. And if the tests are enabled, we also generate BuildRequires for the foo-test-data dev-dependency that's specified here, and that will look something like: BuildRequires: (crate(foo-test-data/default) >= 0.1.0 with crate(foo-test-data/default) < 0.2.0~). And now you know why this stuff is generated automatically; we would really be error-prone doing that by hand.

Right, and the Provides and Requires for the different sub-packages will look like this: the -devel package, which contains the crate source code, will get Provides for the crate name and the version. Then there's a sub-package for the default feature, which requires the crate with no features and provides the crate with the default features, and so on. The sub-package for the bar optional dependency will require the main crate source code, and it will require the bar crate, in this case because we specified that
here, and it will provide the crate with the bar feature. The sub-package for the foobar feature will require the crate source code and the bar feature, and it will provide the foobar feature. So you can see we map the concepts that are present in cargo metadata to RPM sub-packages and dependencies between those sub-packages, and we can actually preserve all the information that's there in the upstream project. You also have some ways in which you can affect the generation of those sub-packages: if, for example, there's an optional dependency that you don't want to package because it has missing dependencies and it's not used, you can disable some of those features.

Are there any questions so far? Was I talking too fast? I assumed it would take me longer to go through the introduction, but that's perfect, because we now have lots of time for the examples and for questions if you have any. I will show my last slide, because then I can close the presentation. This is just: if you have any questions with regards to Rust packaging, there are lots of ways you can reach the Rust SIG or the Rust packagers in general. There's the Fedora Rust channel on Matrix, which is bridged to the Fedora Rust channel on Libera.Chat IRC; there's a mailing list where you can send questions if nobody's online in the Matrix or IRC channels; and if you want to look at rust2rpm and the packaging macros we use for Rust packaging, those are hosted on Pagure, however you want to pronounce it. Those are actual clickable links if you want to take a look, or to file bugs if you think something's broken. Right, now let me see if I can share my screen instead of the presentation, and I hope that things will be big enough to see them on the stream.

Why is rust2rpm written in Python and not in Rust? Two reasons: the first version was written in Python, and it basically stayed that way. Rewriting it in Rust would be possible, and it would allow us to use some
algorithms and libraries that are also used by upstream cargo, but it would also introduce a bootstrap problem: you need the Rust tooling to build rust2rpm, but you don't have the Rust tooling, because you don't have rust2rpm yet. So having it in Python makes it possible to start from zero. Rewriting it in Rust wouldn't provide much advantage other than using the same libraries for some things; right now we have our own parser for semantic versioning syntax, and if we wrote it in Rust, we could use the same library that cargo uses for doing that, but ours is good enough. All the known bugs are fixed, so it should work in all cases.

Okay, let me see if I can share my screen somehow; let me move there. Okay, the first example I would like to show you is updating serde_json. Let's see: I've looked at bugs for new versions that we have for Rust crates, and that's one of the nice ones we could look at. Of course, on camera I make typos... clone... let's look at that. It would be too small, but I can do this. We have a changelog file, because this package predates rpmautospec, so that contains historical changelog data, and we have the spec file; a standard package. Let's see if that works: 300%... 100% is really big, but I hope you can read it. Is that big enough?
We can go through the spec file to see how the stuff I've been talking about looks in a real spec file. We have the conditional for running the checks or not; we have the name of the crate defined at the top; we have the version; we have the package name. We are using rpmautospec. The Summary is generated from cargo metadata; the License is generated from cargo metadata. We haven't updated this package since SPDX expressions were allowed, so previously we put this disclaimer here, "upstream license is MIT OR Apache-2.0", and then we had the Fedora-equivalent license expression down here. Then we have a macro for the URL of the crate sources; we have the description in a separate macro, also generated from the same description in the Cargo.toml file. Then we have the -devel sub-package, which contains all the source code of the crate; it's noarch because it's architecture-independent; we have license files and docs, and this line here is the path to the source code of the crate in the buildroot. We have a sub-package for the default feature set; we have a sub-package for the alloc feature, which means you need a global allocator for this to work; and we have additional features, for example one using order-preserving hash maps, which is not on by default, and so on. We have the prep stage, where we unpack the crate and set up the build environment; we have automatic generation of BuildRequires; we run the build; we install the source code. So the actual meat of the spec file is really simple: building Rust crates is relatively straightforward and quite standardized, because when you upload crates to the crates.io registry, they need to fulfill some minimum requirements for standardization, so the things we need to do to build the package are quite simple. And I think there's work ongoing in RPM so that we can also automatically generate those sub-packages at some point in the future, because most of the content of the spec file is just expanded templates, and those are not really interesting. Let's see.
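For reference, a generated feature sub-package stanza looks roughly like this (a sketch based on rust2rpm's output; the exact install-path macros are assumptions and vary between template versions):

```spec
%package     -n %{name}+default-devel
Summary:     %{summary}
BuildArch:   noarch

%description -n %{name}+default-devel
%{summary}

%files       -n %{name}+default-devel
%ghost %{cargo_registry}/%{crate}-%{version_no_tilde}/Cargo.toml
```

The only file it contains is a ghosted copy of Cargo.toml, which exists so the RPM dependency generators have something to process for each feature.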
there was version 1.0.83 released a few days ago, and we can try to update to that; I have it already open here.

(Question from the chat: the Go macros already generate sub-packages; I thought RPM doesn't have the features to do that yet?) I mean, you can use macros to generate the sub-package definitions, but I think there are some ideas about how to generate sub-packages automatically without having to use macros, which would make our job a bit simpler. In the meantime we could use macros to generate those sub-packages, but it's not really important; at least this way you can see the contents of the sub-packages.

Okay, about the %ghost-ed metadata file: the dependency generators read that file, so you need it in every sub-package, because the dependency generators run on that file to generate the necessary Requires and inter-package dependencies, and that only works if you have that file in the package as well. And it's %ghost because you don't actually need the file installed; sorry for the inception here: the actual Cargo.toml file is included only in the -devel sub-package, because it's under that directory, and it's %ghost-ed in all other sub-packages for the dependency generators. Does that make sense? Yeah, right, they only process the crate if they can find the Cargo.toml file; you get really weird error messages if you accidentally remove it.

Okay, now let's look at serde_json. That's probably one of the most used crate packages we have in Fedora, because it's the de facto standard serialization and deserialization library for JSON. Now let's see if we can update that package. Let's run rust2rpm, and we want the -s flag, because that actually downloads and stores the .crate file, so we don't have to download it manually. The -a flag would enable rpmautospec, but that's automatically detected if you're already using it. We can include the
-p flag if we want to patch the crate metadata before building the package; we can do that, although I don't think we need to make any modifications here. And then "serde_json 1.0.83": you give it the crate name and the crate version you want to generate for, but it defaults to the latest version, so we can omit that, and since version 22 it automatically determines the crate name as well, so we can omit all of that; it's just "rust2rpm -sp". Then you get an editor where you can see the crate metadata, and in this case we don't need to edit any of it, because it's all fine and compatible with what we're doing, so we'll just close that. And we can see it found a valid spec file for the serde_json crate, used the latest stable version, and regenerated the spec file for us. Looking at the diff: we generated the spec file with rust2rpm 22, and last time we ran it we used rust2rpm 21, but that's not really important for the diff. One of the things that changed is that we now expand the %{crate} macro in the name, so it's easier to copy-paste stuff, and we no longer translate the SPDX expression in the License tag but use the upstream specification directly. And that's all the changes there are; the remaining spec file is identical between version 1.0.82 and 1.0.83.

Now we build the source RPM and do a mock build against the oldest supported Fedora release, and run it with --postinstall to make sure all the dependencies for our non-default optional features are also present. If you don't do that, you might push a package that builds but doesn't install, so I tend to do mock builds with --postinstall locally, just to make sure that packages not only build but also install when you build them in Koji. Okay, that should be pretty quick: refreshing repository data, installing the default Rust dependencies and the dependency generators; now it's installing the dependencies for the crate
itself, as auto-generated; now it's running the build step. (Yeah, Fedora 35 doesn't have rust2rpm version 22 yet; it's a critical-path package, so it needs 14 days in testing. The local repo is sometimes your friend, but in this case it doesn't help you, because there's no buildroot override that I submitted for it, so it wouldn't actually get rust2rpm version 22 even if you used the local repo.) Now it's building the unit tests. Of course this takes longer when you're watching it than when it's just a wall of text of compiler output. Integration tests are running right now, so it shouldn't be long. This always reminds me of that one comic where you see the two programmers standing on their chairs and fighting with swords, and the boss comes by and asks them what they're doing: "stuff's compiling!", "okay, carry on".

Okay, now it's finished. You can see here at the end it's running all the tests from the test suite, and then you see the output of the dependency generators; for example, you can see it processing one of the feature sub-packages here, with the Requires and Provides for the corresponding crate features. It's a wall of text, not easy to find things in here, but there are no error messages, and the post-install step also worked: it installed all built sub-packages with no errors. Everything installed, everything built, complete finished run; perfect.

Now we upload the new sources; of course you shouldn't run that yourselves. Then we look at the diff to make sure everything is okay: we have a new .gitignore entry, we have the very small changes to the actual spec file, where only the version increased from 1.0.82 to 1.0.83, and the new entry in the sources file. We commit that: "Update to version 1.0.83, fixes Red Hat Bugzilla" number... let me just look that up. There's only one thing that's slower than Rust compile times, and it's Bugzilla. Let's copy that bug number, paste it in here, commit. The command was "fedpkg new-sources serde_json-1.0.83.crate", but that uploads stuff to the Fedora server, so you shouldn't run that step; I'm just doing it to show you how I'm updating the crate.
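In command form, the local part of the update workflow shown here is roughly the following (a sketch: the mock chroot name and file names are assumptions from this example, and the new-sources upload should only be run by the package maintainer):

```shell
rust2rpm -sp                  # regenerate the spec, download the new .crate file
fedpkg srpm                   # build the source RPM from the updated spec
mock -r fedora-35-x86_64 --postinstall rust-serde_json-*.src.rpm
                              # build against the oldest supported release and
                              # install the results, to catch broken dependencies
fedpkg new-sources serde_json-1.0.83.crate   # uploads to the lookaside cache!
```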
Now that should give us a nice new commit, "Update to version 1.0.83, fixes Red Hat Bugzilla something-something". You can see I also pushed the update to version 1.0.82 two or three weeks ago. I'm doing "git push" and "fedpkg build" to actually submit the update; that's not really important now, I just wanted to use a real-world package update to show you during the tutorial. To run this only locally, you can use "fedpkg new-sources --offline", I think; if you don't have access to the package, then you need to do it that way. Okay, that's building for Rawhide, and now I do "fedpkg switch-branch f36", merge the change from Rawhide we just did, push, and "fedpkg build"; and because I don't want to type all that every time, I've got a bash alias that does the same thing. Now it's also building for Fedora 36. Then I open another tab: "fedpkg switch-branch f35", "git merge f36", push, "fedpkg build"; merge the same change into the Fedora 35 branch, push it to git, and launch the build. It all takes a few seconds each, but now it's also building for Fedora 35.

Okay, we can look at this later. In this case I've just updated from 1.0.82 to 1.0.83, so I know that nothing requires exactly that version; for cases like that, you don't really need to check whether something has an odd dependency like that, and I'm also 99.99% sure that nothing does stupid stuff like that, because I update the serde_json package like every two weeks and nothing breaks. But yes, for other packages you sometimes need to check, and I can move on to the next example, where I can show you how to check for reverse dependencies.

Okay, next example. In this case I'll show you another crate for the same serialization and deserialization framework, but not the JSON support package; that crate recently got a few new upstream releases. Now let's check what the package
looks like: a standard package as well, just the spec file, sources, readme, changelog; no patches, nothing. Okay, why is this small again... 300%. Just like the previous package, nothing special in here; all standard, with no need for manual modifications. Let's run rust2rpm to update it to the new version; all done, reload, looks good. Now let's look at the diff between the old and the new version, and now you see a problem: the old version was 0.8.25 and the new version is 0.9.4, so that's an incompatible upgrade, because it changes the leading version number. In that case there are a few things you can do, but let me see: I use this bash script to check reverse dependencies, and now you see there's a long list of packages that depend on serde_yaml version 0.8. So if you updated serde_yaml from 0.8 to 0.9, you'd break all of those, because they require a version smaller than 0.9. I can show the expanded version of that bash script; no, it doesn't really need to do much: the script just checks which packages require the current version, so if you want to do an incompatible version bump, you check which packages require the current version and then you know which packages would be broken if you updated it right now. It's basically just two repoquery commands for DNF that check Rawhide and Rawhide source for packages that require the crate without features, and with any feature, which is that plus a star, and then it sorts the results. I've pushed that script somewhere; I have it on GitHub, let me paste that into the chat.

Okay, so now we know that we can't update this package to version 0.9 if we don't want to break a whole lot of packages, so we need to think about how to handle that. In this case, because the list of packages is quite long, and I've already looked at the changelog for the 0.9 version and there are quite a lot of incompatible API changes, porting all those packages to the new version is not really feasible for downstream packagers. Yeah, an update like that would sometimes be blocking
Right, in this case we can't really update this to version 0.9 yet unless we also make some other changes, so I can show you the next case: creating a compatibility package for an older crate version. In this case we want to package both version 0.8 and version 0.9 so they're available in parallel, and crates can depend on whichever they need. That's probably going to be necessary anyway, since some applications have already started using the new version while lots of things are still using the old one. So unless we want to block updates for the packages that use the new version, we need to package both, and that's one of the things I wanted to show anyway, so let's see. Right: once all the dependent packages have moved to the new version, we can remove the old one from Fedora again. So it's basically just helping us move forward while retaining backward compatibility for crates that don't support the new version yet. Now we can look at the rust2rpm help, and we see there's one option that's interesting for generating compatibility packages for older versions, which is the suffix option. We also need to know which version from the 0.8 branch to package, because we can't just auto-detect the latest version; we don't want the latest version. Which version from the 0.8 branch do we want? We go to crates.io, to the page for the serde_yaml crate, and to the versions tab, and we see that the latest version from the 0.8 branch is 0.8.26. So we run rust2rpm for serde_yaml version 0.8.26 with a package suffix of "0.8". Let's look at the generated spec file. Small again, interesting. It basically generates the same spec file that we had for the serde_yaml package before we ran the 0.9 update; the only thing it changes is that it puts the suffix after the package name, both in the name of the spec file and in the package names. And since all the subpackages are namespaced based on that name, both packages are parallel-installable.
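As a sketch, the command being run here looks roughly like this (the flag spelling may differ between rust2rpm releases, so check `rust2rpm --help`):

```shell
# Generate a spec for the last release of the 0.8 branch,
# with "0.8" appended to the package name:
rust2rpm --suffix 0.8 serde_yaml 0.8.26
# -> rust-serde_yaml0.8.spec, parallel-installable with rust-serde_yaml
```

The crate sources then also land in version-suffixed registry directories, for example /usr/share/cargo/registry/serde_yaml-0.8.26/ next to /usr/share/cargo/registry/serde_yaml-0.9.4/, so the two packages don't conflict on the file system either.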
The locations on the file system where those files are installed are also namespaced by crate name and version, so they don't conflict at the file system level either. So you can have situations where some parts of the dependency tree require the new version and some parts require the old version, but because everything is namespaced correctly, that just works. Which is also how cargo works upstream, so we don't break anything that works upstream. Okay, that was pretty simple, not much more to talk about here. Are there other questions? No questions, nothing in the Q&A tab. Okay, let's check on our serde_json builds. Looks like they're all going nicely: zero failed tasks, five tasks done, one still pending, and similar here; this one is just still taking a bit of time. Okay, right. And once you submit those updates, in this case for serde_yaml 0.9.4 and the compatibility package for the 0.8 branch, you should put them into the same update so they're applied atomically and there's no point in time at which packages are broken. (I'm also seeing some weird flashes on my screen; maybe that's affecting the video in some way, but I think the stream is still working.) Right: when you're submitting that update from 0.8 to 0.9, including the parallel-installable compatibility package for the 0.8 branch, you should submit both builds in the same update, atomically. That way dependent packages either see the old 0.8 version or the 0.8 compatibility package, but there's no point in time at which they only see 0.9, so there's no transition period where things are broken. Alright, next example. I have prepared three more examples: creating multi-crate updates, where you need to update two or more crates together for them to work correctly; packaging bindings for system libraries correctly; and packaging a new crate that hasn't been packaged before. Are there any preferences? I think we have enough time to cover all of them.
So the order in which we cover them doesn't really matter. Yeah, I mean, I know that the serde_yaml 0.9... oops. At 300% again: I know that the new version of serde_yaml requires a new dependency, because it switched its internal YAML parser from one library to another. It previously used a pure-Rust implementation, and now it uses an implementation that's based on LibYAML, and that's not packaged for Fedora yet, so we need to package it anyway, and we can look at that now. Basically, that means: make a new directory for the new package (I'll explain the name in a minute), then run rust2rpm with the flag to store the crate file, for unsafe-libyaml, for the latest version. Now let's look at this file. The package description already tells you why we don't use system bindings in this case: it's the LibYAML source code transpiled to Rust, so it's not really C code anymore; it's C code that was transpiled to Rust by an automated tool. And yeah, you can see there's some stuff in here that looks a bit weird, because it ships two binaries that sound like they're only used for testing or something, but the rest looks normal. Now let's check whether this package actually builds: rpmbuild, build the source package from rust-unsafe-libyaml.spec, then build that in mock, and let's see whether it works or not. I already know the answer, because I went through all these examples yesterday to make sure the demos work. Okay, it failed. Let's scroll up to see why. Looks like the build itself finished successfully... let's see... the install step and extracting debug information all finished successfully, but it failed somewhere in the check phase, so while building or running the test suite. We'll see why in a moment, because of this error message, which you might encounter when maintaining Rust crates, so it's good to know what it means: "failed to resolve: use of undeclared crate" for unsafe-libyaml-test-suite.
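(Stepping back for a moment: the commands used to create and test-build this new package can be sketched roughly like this; the exact rpmbuild defines depend on your local setup:)

```shell
# Create a directory for the new package and generate the spec file;
# -s additionally stores the downloaded .crate file locally:
mkdir rust-unsafe-libyaml && cd rust-unsafe-libyaml
rust2rpm -s unsafe-libyaml

# Build the source package and do a local test build in mock:
rpmbuild -bs --define "_sourcedir $PWD" --define "_srcrpmdir $PWD" \
    rust-unsafe-libyaml.spec
mock -r fedora-35-x86_64 rust-unsafe-libyaml-*.src.rpm
```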
That means the test code tried to import another crate which wasn't there and also wasn't specified in Cargo.toml, which basically means one of two things. Either upstream is broken, which I know is not the case here; but let's look at the source code for this crate on GitHub, I'll make that bigger in a second. Okay, we have this project on GitHub, and... oh, this is a workspace, which means there are multiple crates in here. And when you look at the tests directory, there's another crate in here: unsafe-libyaml-test-suite. Alright, that's the one it failed to find when running the tests. But you can see this crate is not published, so it's not available to us for packaging. So basically we're out of luck for running the test suite here, because the dependencies of the tests are not published. In this case we go into the spec file and flip the flag so that we don't run the tests, and add a comment. What should the comment be? "Test dependencies are not published", or something like that. It's really important to add a comment explaining why you're disabling tests, because the next person who looks at the package won't know why you disabled them. Okay, now let's try again: build the source package and run the build again. That should be a bit faster now, because it doesn't install test dependencies. That should work now... compiling the library, extracting debug info, installing the library... yeah, it's all done. Okay, we still have one small problem, because we don't really want those test binaries to be built if we're not even running the tests, and there's a trick you can use for that. Let's rerun rust2rpm with patch generation, so with the -p flag, and add a setting so that binaries are not picked up automatically. Save and exit, and now you see that we have an automatically included patch here, and we add a comment: "do not build unused test binaries", something like that. Now we have those two small modifications to the automatically generated spec file, and that should result in a package that doesn't build any binaries.
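Concretely, in rust2rpm-generated spec files the test suite is guarded by a "check" bcond, so disabling the tests (together with the mandatory comment) is a one-line change from the default %bcond_without:

```
# * Tests are disabled because the unsafe-libyaml-test-suite crate
#   they depend on is not published to crates.io
%bcond_with check
```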
And indeed, it builds successfully. (Answering a question:) Yeah, I know, but there's not really a reason to do it that way, and I have bash aliases for this long command; I usually just type mf35 and it does the same thing, so I save a lot of typing. But yes, I know that you can build from the spec file directly. Right, it's also not really necessary to build the source RPM in mock, because the Rust packaging toolchain is compatible between Fedora 35, Fedora 36, and Rawhide, so a source package built on Fedora 35 is compatible with building on Rawhide. If that weren't the case, then building the source package in mock would be the only way to do it, but in this case the Rust packaging tools don't affect the generated SRPM anyway. Okay, that basically covers creating a new package where we actually need to make some small modifications. Okay, what were the other examples I wanted to show you? Those are a bit more complicated. Let's look at this one now: what does "pq-sys" sound like? Any guesses? Yeah, that's bindings for PostgreSQL. We need to make some modifications to this package, because the patch we had for the last version no longer applies, so we need to redo some of those changes manually. Let's look at the package we have. Okay, there's some stuff here which means we don't run the tests on 32-bit architectures; that looks a bit weird, but there's actually an explanation for it, which we'll get to. There are some patches here where we drop Windows-specific dependencies and things like that. Oh, and here's also a changelog, so this package hasn't been converted to rpmautospec yet. Let's run rpmautospec convert... done. That automatically converted the package to rpmautospec, which is nice, because now we don't have to take care of the changelog anymore. Now let's update it: pass the new version to rust2rpm, store the crate file, and we need to re-apply some patches. Okay, let's look at the spec file.
We have an optional dependency on pkg-config, and we don't want it to actually be optional, because we always use it, so we remove the "optional" part. And that should do it, except that any packages which explicitly enable the pkg-config feature flag would break, so we need to preserve that feature flag, otherwise dependent packages might break. Now we have a conflict, because we have an optional dependency and a feature... no, I think this should work; I'm not sure how I solved it yesterday, but either way, this way we have a new feature with no dependencies that has the same name as the previous optional dependency, just to make sure we don't break any packages that enable it. Okay, now you can see... whoops, what's that? We have more output from rust2rpm this time. There's a message that it dropped a target-specific dependency on vcpkg, which is for Visual Studio support, so we don't need that on Linux; it renamed a deprecated config file to its new name; it generated the spec file; it regenerated the automatically created patch; and it regenerated the patch that we wrote. Let's look at those files. We have the automatically generated patch, which is a new feature in rust2rpm version 22: it automatically drops dependencies on crates that are not needed for building on Linux. In this case it removes the dependency on vcpkg, which is only used when the target environment is Microsoft Visual C, and we don't need that. And then we have our own patch, where we preserve the pkg-config feature flag but make the build dependency on pkg-config non-optional, because we always build against the system PostgreSQL library. Okay, now let's see what's different about the spec file. The config file specifies that we depend on libpq, that's not really interesting... fix the metadata, that's the patch we wrote... okay, that part is removed. We still need to check whether the test suite passes.
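The manual patch described here boils down to a small Cargo.toml change, roughly like this (the version number is a placeholder, and whether pkg-config lives under [dependencies] or [build-dependencies] depends on the crate):

```toml
# Before (roughly): pkg-config was optional, which implicitly
# provides a "pkg-config" feature:
#
#   [build-dependencies]
#   pkg-config = { version = "0.3", optional = true }

# After: the dependency is mandatory, and an empty feature with the
# same name is kept so that crates enabling "pkg-config" don't break:
[build-dependencies]
pkg-config = "0.3"

[features]
pkg-config = []
```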
It was previously disabled for 32-bit architectures. Then there's the usual stuff that changes with rust2rpm 22: we have the expanded macro in the name, the update from 0.4.6 to 0.4.7, the SPDX expression in the license field, the new patches, and some other minor changes that happened between the rust2rpm version that was last used to build this, which was rust2rpm 15 (a pretty old package), and now. We updated the spec file template between versions 15 and 22, so the diff is a bit bigger, but it's basically only fixes for typos and grammar in the package descriptions. And that's also interesting: looks like this package didn't ship license files when it was last built, but now we have the MIT and Apache license files, and docs as well. That's good; the package now actually ships its license files. Okay, what do we still need to look at? There was another patch in this package that we haven't looked at yet: the patch that makes it build against the system library using pkg-config by default. That one actually patches one of the source files, so we still need to redo it. In this case it patches build.rs, the build script, and you can see it removes conditionally compiled code for the case where we're not using pkg-config. But we'll have to look at the source code again, because I know this patch no longer applies due to upstream changes. So we remove it and extract the source code. Okay, let's look at the source code. Clear. Let's use git for generating our patches, because that's always nice: initialize a repository, make sure we don't generate changes based on line endings (because some people still use Windows for development), and add everything. Now we have everything prepared for writing the patch. We know we need to edit the build script, and what we need to do is make sure that the code that was executed when the pkg-config feature was enabled is now always executed, because we made pkg-config non-optional.
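The git-based patch workflow just described can be sketched like this (file names and the commit message are illustrative):

```shell
# Unpack the crate and put it under version control:
tar -xf pq-sys-0.4.7.crate && cd pq-sys-0.4.7
git init
git config core.autocrlf false   # avoid line-ending-only diffs
git add . && git commit -m "import pq-sys 0.4.7"

# ... edit build.rs ...

git commit -am "Unconditionally use pkg-config to link against system libpq"
git format-patch -1 -o ..        # write the patch next to the spec file
```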
Okay, let's look for the main function, because the main function is what actually runs when the build script is executed. There's a whole bunch of stuff about Homebrew and macOS; I think we'll remove that, because we don't need it. "Configured by vcpkg"? No, no, that's all not what we need. Where's the main function... here it is. Re-run if the static-linking environment variable changed? No, we don't link statically. We don't do cross-compiling, so we don't need to look at the target variable either. No, that's not interesting for us either; that's all not interesting. We don't build against a different library path, we don't use that either. "Configured by pkg-config", now that sounds interesting. What does it do? The configured-by-pkg-config function probes for the pq library via pkg-config. Great, and that's actually the only thing we need in the main function. Let's just copy that here, which means we fail the main function if the probe fails, and I think we can drop everything else, because we're using pkg-config in all cases now. Okay, I think that should do it. Let's check whether it actually matches what I did a few days ago, because I know that version works. What does the file look like? Yeah, the file looks the same, okay, that's also what I did last time, so this should work. And if you're wondering: no, you don't need to know how to do this when you build your first Rust package; it's just something I wanted to show. Sometimes, especially when you package bindings for system libraries, you need to patch the build scripts to do the right thing. But in this case the end result is pretty simple: it just runs pkg-config to find the libpq library, and if it doesn't find it, it fails the build. Okay, let's see. Yeah, the diff is pretty big, because we removed all the stuff for static linking, for Homebrew, for Microsoft Visual C environments, for macOS, and all that.
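The end result described here, a build script reduced to just the pkg-config probe, would look roughly like this (a sketch under the assumption that the crate uses the pkg_config crate's probe_library API; it is not the verbatim upstream file):

```rust
// build.rs, reduced to the pkg-config case: probe for libpq and let
// the pkg_config crate emit the matching cargo linker directives;
// if the library is not found, the build fails.
fn main() {
    pkg_config::probe_library("pq").expect("libpq not found via pkg-config");
}
```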
We only keep the support for building with pkg-config. Let's write that commit: "conditionally build"... no, not really; I forgot what I called that patch last time, let me just check... "unconditionally use pkg-config, link against system libpq". That should be descriptive enough. Okay, now let's go back to the workshop folder... use this, okay: git format-patch for the last commit, with the output directory where we want the patch. You can generate patches any way you want; this is the way I like to do it, because it makes things easier when there are multiple patches. Okay, now we need to look at the spec file again to make it actually include that patch. There it is; make it big again. No idea why the terminal remembers the 300% zoom but the text editor doesn't. Okay: for the automatically generated patch we don't need a comment, because it already explains what it does. For the manually created patch we need to say "make dependency on pkg-config non-optional". And then we have the other patch where we patch the build script; let's just copy the patch subject: "unconditionally use pkg-config to link against system libpq". That's all good. The dependency on the libpq development files is still there, and the build dependency on the libpq development files is there as well. Okay, looks good; now let's build the package and see whether our patches do what we want them to do. Fedora 35, all installed, okay, no missing dependencies... build finished successfully, packages installed successfully. Good. Now we can take a look here: we see it's running the test suite and some smoke tests, and it looks like everything passed, so that looks good. Now, what did we forget? There was a comment... well, if you click on that issue link, you'll see that the test suite fails on 32-bit architectures. I've actually looked at the failure; it was reported a while ago, but it's basically harmless, because the tests hard-code some numbers that are only correct on 64-bit architectures, while the code itself should be fine. So it's just the
tests that are broken, but the code itself is good. So we'll keep that guard: if we're on a 32-bit architecture, we don't run the tests, but on 64-bit architectures (like x86_64 or aarch64) we do. And let's keep the link to the upstream issue and actually add a comment about what the issue is: "generated bindgen tests only work on 64-bit architectures". Basically, those are tests that are automatically generated for the Rust bindings, and they check things like struct layouts and struct sizes, and of course struct sizes are different on 64-bit architectures, especially when pointers or integers are involved. So it's no surprise that tests generated on 64-bit architectures don't work on 32-bit architectures. But I've looked at the generated code, and the generated code itself should be fine. Let's see... that's all good, we kept the comment about the failing tests, that's good. There's some stray whitespace that I don't really want... where is it?
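The resulting guard in the spec file looks something like this (the exact architecture list is an assumption; the point is that the check section only runs on 64-bit targets):

```
%if %{with check}
# Generated bindgen tests only work on 64-bit architectures,
# see the upstream issue linked above
%ifarch x86_64 aarch64 ppc64le s390x
%check
%cargo_test
%endif
%endif
```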
Let's look at that again. Okay, this looks good now; the rest is just stuff that's generated differently by the new rust2rpm version. Because this is a package that's effectively architecture-dependent (it's bindings for a system library), in cases like this I sometimes launch scratch builds on Koji to make sure it actually builds on all architectures before I submit it. Let's build it for Fedora 35, because that also still has 32-bit ARM; we can just launch that, and it should work, at least it worked two days ago. Let's move that here... and I think the serde_json builds... (Answering a question:) a scratch build from the SRPM, that might work, but in this case I launched the scratch build from an SRPM because there are still uncommitted changes in the repository, and I don't know whether that command works with uncommitted changes. Okay, well, then it should work too, yeah. Okay, let's see: the serde_json builds all finished successfully. Yeah, sure, that's just the way I keep doing it; there are always three different ways to do the same thing. Okay, let's look at Bodhi. I think I need to log in again for some reason. Okay, you can see there are quite a lot of pending updates in my account, most of them Rust packages. Let's see here. Okay, the Rawhide update for the serde_json build we launched is already here, and now let's submit the Fedora 36 and 35 builds as well: rust-serde_json for Fedora 36 and 35, "update to version 1.0.83". There shouldn't be any open bugs, because the Rawhide build already closed them. We only need one stable karma, because nobody gives karma for buildroot-only packages anyway. Right, okay, so now the serde_json update for the latest version is submitted to all Fedora branches. Yeah, nobody gives karma for my updates, because nobody installs the development packages. Right, okay, we can remove this serde_json git clone, and that should do it for now; that one build is still running. Okay, the last example I wanted to talk about is updating two interdependent crates. Is that still interesting? Yes? Okay.
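The Koji scratch-build step mentioned above can be sketched like this (the release and SRPM file name are illustrative):

```shell
# Scratch build the SRPM for a release that still has 32-bit ARM,
# to verify the package builds on all architectures:
fedpkg --release f35 scratch-build --srpm rust-pq-sys-0.4.7-1.fc35.src.rpm
```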
I know that there have been new releases for two crates called thiserror and thiserror-impl. When we look at the rust-thiserror package... actually, let's look at thiserror-impl first, because I know we need to build the impl crate first, and then the other one can just update to the new version. All good, all green; that's also new with the latest rust2rpm version, I actually made the program output a lot nicer. Where is it... there it is. A standard Rust package; there's nothing special about it. The only special thing is that it's an implementation of procedural macros, so this crate basically contains code that generates code, which is always interesting when you want to do metaprogramming. Now, what are the differences? Okay, I see I had removed some markdown markup from the summary; let's do that again, because RPM doesn't render markdown. The other changes are trivial: rust2rpm now uses SPDX for the license string, and the version is bumped from 1.0.31 to 1.0.32, no other changes. Let's build that; it should work without problems. While that runs, we can look at the other package: new tab, new version, no special messages from rust2rpm, and the generated spec file looks completely normal. The diff is really short, too: just a different rust2rpm version, the bump from 1.0.31 to 1.0.32, and the SPDX license expression, so basically the same diff as in the other package. Okay, the build for the first package was successful, that's always nice. Now the same for this one: a mock build for Fedora 35, and we should see an error message during dependency resolution: no matching package to install, crate(thiserror-impl) version 1.0.32, and that's the one we just built. The thiserror crate always requires the exactly matching version of the thiserror-impl crate, so those two always need to be updated together. There are a few projects that are split into multiple crates like this for various reasons, in this case because procedural macros always need to live in a separate crate, since they are built differently. So those two crates will always need to be
updated together. In this case we can just test whether that actually works: chain-build thiserror-impl and thiserror with mock's chain command, plus the post-install option to make sure the packages we built are actually installable. Now it's building the first package, and then it will build the second package with the build results of the first one available as dependencies. That shouldn't take too long, and it should work; it did two days ago when I tested the demo. I think technically my session ends right now, but we have time until the hour is over, so we still have a few minutes. I just want to show how to launch those builds in an on-demand side tag in Koji, so that they are submitted together without breaking dependent packages, and to see whether the build actually succeeds. Let's look up the bug numbers for those new versions of thiserror-impl and thiserror; I marked the bugs as assigned, so nobody would fix them between me looking for examples and today's workshop. Sources for thiserror-impl in this tab... both builds were successful... then new sources in this tab as well: update to version 1.0.32, fixes rhbz#..., and so on, with the Bugzilla numbers for both bugs. Now I run fedpkg request-side-tag, and we get a new build target. Git push, fedpkg build, but this time with the custom target. (Answering a question:) Yeah, I usually run six builds in parallel, and fedpkg chain-build is a bit clunky for that, because I usually start the builds for Rawhide, Fedora 36, and Fedora 35 for the same package, wait until they're all done, and then launch the dependent builds; chain-build would take a lot longer because it serializes the builds, so I run those three in parallel. But yeah, I know that chain-build exists and can do this. Okay, interesting, I'll need to look at that; but with the shell aliases I've set up, it's really not much work to do it this way either.
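Both the local chain build and the side-tag workflow can be sketched like this (the side-tag name, the NVR, and the mock option spellings are illustrative; check the respective --help output):

```shell
# Local test: chain-build both source packages in mock, installing
# the results of each build into the buildroot of the next one:
mock -r fedora-35-x86_64 --chain --postinstall \
    rust-thiserror-impl-*.src.rpm rust-thiserror-*.src.rpm

# In rust-thiserror-impl: request a side tag and build into it.
fedpkg request-side-tag            # prints e.g. f37-build-side-12345
git push
fedpkg build --target f37-build-side-12345

# In rust-thiserror: wait until the impl build has reached the side
# tag's buildroot, then build against it with the same target.
koji wait-repo f37-build-side-12345 --build rust-thiserror-impl-1.0.32-1.fc37
git push
fedpkg build --target f37-build-side-12345
```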
Okay, in the dependent package we run koji wait-repo with the name of the side tag we got... where is it, everything's so big, I can't find anything... to wait until the first build is available in that side tag, and then git push and fedpkg build with that side tag as the target. I could launch that now, but it would take a while, because the first build is still running in Koji, so I won't do it right now; I'll do it after the workshop is over and finish the update, because I think our time is basically up. But I've actually covered everything I wanted to talk about. So, does anybody have any questions, or can I finish the presentation? Yeah, I hope it was interesting, because there are a lot of technical details that are a bit unusual when you talk about Rust packaging: all the generated Requires, Provides, and BuildRequires, the generated subpackages, rust2rpm with its configuration file, and sometimes you need to do weird things to make bindings for system libraries work the way you need them to. And yeah, having more people would be great; I already showed the slide where you can find the Rust SIG members, and on the first slide there were all the other ways you can reach me if you need some Rust package or a fix or something. You know where to find me. Thanks for showing up, everyone; I hope it was interesting for the beginners, and I hope it wasn't too complicated.