Thursday, 2023-09-07

00:17 *** bauzas_ is now known as bauzas
00:49 *** bauzas_ is now known as bauzas
03:01 *** bauzas_ is now known as bauzas
03:50 *** bauzas_ is now known as bauzas
04:26 *** bauzas_ is now known as bauzas
05:32 *** bauzas_ is now known as bauzas
05:45 *** bauzas_ is now known as bauzas
06:30 *** bauzas_ is now known as bauzas
06:38 *** bauzas_ is now known as bauzas
07:09 *** ralonsoh_away is now known as ralonsoh
07:33 *** bauzas_ is now known as bauzas
08:04 *** bauzas_ is now known as bauzas
10:07 *** bauzas_ is now known as bauzas
10:15 *** bauzas_ is now known as bauzas
11:31 *** bauzas_ is now known as bauzas
14:52 <elodilles> hi, today a lot of stein-eol patches have merged, so I'm planning to run the eol cleanup script ~2-3 hours later today if that is OK ( https://review.opendev.org/q/topic:stein-eol+is:merged )
14:53 <frickler> elodilles: should be fine afaict
14:53 <elodilles> thx o/
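The stein-eol link above is a Gerrit change search. As a minimal sketch of consuming that query programmatically: Gerrit's REST API prefixes every JSON response with an anti-XSSI marker (`)]}'`) that must be stripped before decoding. The sample payload below is invented for illustration, not real review.opendev.org data.

```python
import json

def parse_gerrit_json(raw: str):
    """Strip Gerrit's anti-XSSI prefix ")]}'" and decode the JSON body."""
    prefix = ")]}'"
    if raw.startswith(prefix):
        raw = raw[len(prefix):]
    return json.loads(raw)

# Hypothetical response body for /changes/?q=topic:stein-eol+is:merged
sample = ")]}'\n" + '[{"_number": 891000, "subject": "stein-eol for a hypothetical repo", "status": "MERGED"}]'

changes = parse_gerrit_json(sample)
print([c["_number"] for c in changes if c["status"] == "MERGED"])
```

The same parsing applies to any Gerrit REST endpoint; the prefix is there precisely so the body is not valid JavaScript when loaded directly.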
14:56 <frickler> elodilles: regarding https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/891628 I think that this may become a very long list of fixes needed, so I'm thinking about following corvus' idea of force-merging that change
14:57 <frickler> that would lead to config errors in all affected projects, with the benefit of being able to see them all at once instead of one by one only
14:58 <frickler> fungi: clarkb: ^^ what do you think about that?
14:59 <elodilles> frickler: it is possible that there are many projects where the job is still used, so maybe the force-merge approach is the easiest
15:01 <clarkb> would reverting undo the errors? If so that seems like a good out we can use
15:02 <elodilles> i hoped that publish-openstack-sphinx-docs was only used on already EOL'd pike, queens, rocky branches, and those are mostly deleted and only some of them should still be open (those which are not listed under the releases repo's deliverables)
15:03 <elodilles> but i fear that it is still lingering on some 'master' branches as well...
15:03 <frickler> clarkb: so force-merge, make a list of the errors, revert, fix the errors, revert the revert?
15:04 <clarkb> frickler: or maybe just revert if the errors become problematic for some reason. We don't need to revert unless necessary. But it is an out we can use
15:04 <frickler> clarkb: yes, that's what I wanted to suggest, then we're on the same page
15:04 <clarkb> sounds like a plan
15:06 <frickler> so if I get a second +2 on that patch, I'll go ahead with it
15:06 <clarkb> looking now
15:07 <frickler> note the job deletion is stacked on top, but needs a rebase after the regex amendments
15:07 <frickler> I'll do that once this patch is merged, avoids needing to rebase both
15:07 <clarkb> done
15:08 <clarkb> oh I did only the first. Reviewing the second now
15:09 <clarkb> gerrit renders the diff on the second one oddly. But lgtm
15:11 <opendevreview> Merged openstack/openstack-zuul-jobs master: Remove rocky branch from periodic-stable templates  https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/891628
15:20 <opendevreview> Dr. Jens Harbott proposed openstack/openstack-zuul-jobs master: Clean up rocky branch filters  https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/891629
15:23 <frickler> I count 22 errors https://zuul.opendev.org/t/openstack/config-errors?name=Template+Not+Found
15:24 <frickler> noonedeadpunk: ^^ maybe you can take a look at cleaning up the openstack-ansible projects in that list?
15:24 <frickler> or find someone on your team to do it?
15:25 <frickler> so it was certainly a good thing we didn't try to go through all of those one by one
15:25 <frickler> but also only queens or older are affected, which is a good thing I guess
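The config-errors page above is backed by a JSON endpoint, so the per-project breakdown can be tallied rather than counted by hand. The sketch below assumes a simplified error-entry shape (the exact schema and the sample entries are assumptions for illustration, not real Zuul output):

```python
from collections import Counter

def count_template_errors(errors):
    """Tally 'Template Not Found' config errors, keyed by project."""
    tally = Counter()
    for err in errors:
        if err.get("name") == "Template Not Found":
            tally[err["source_context"]["project"]] += 1
    return tally

# Hypothetical entries, loosely shaped like Zuul's config-errors output
sample_errors = [
    {"name": "Template Not Found",
     "source_context": {"project": "openstack/openstack-ansible", "branch": "stable/queens"}},
    {"name": "Template Not Found",
     "source_context": {"project": "openstack/horizon", "branch": "stable/stein"}},
]

print(count_template_errors(sample_errors))
```

Grouping by project like this makes it easy to hand each team its own cleanup list, which is what the ping to noonedeadpunk amounts to.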
15:25 * noonedeadpunk wonders why pike/queens/ocata repos are still there
15:26 <noonedeadpunk> they all should have been EOL'd
15:26 <frickler> noonedeadpunk: likely because of missing release automation
15:26 <noonedeadpunk> I will reach out to the releases team about these leftovers along with the EOL of Stein
15:27 <noonedeadpunk> I think I wrote about that on IRC once but either missed the answer or the message got lost
15:27 <noonedeadpunk> Will check on that
15:28 <frickler> noonedeadpunk: those branches were not created by release automation, so they cannot be EOL'd automatically. elodilles may provide more context maybe
15:28 <noonedeadpunk> thanks for the ping
16:54 <frickler> meh, the job removal triggers even more errors in https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/891629
17:17 <fungi> as the sweater unravels
18:00 <elodilles> the script cleaned up the following repos in today's run: https://paste.opendev.org/show/bagmzLMWsWAw5n5UYsW5/
18:00 *** bauzas_ is now known as bauzas
18:05 <fungi> thanks!
18:06 <fungi> that's a bunch
18:06 <fungi> awesome
18:10 <elodilles> yepp. 11 fewer periodic-stable fail mails, hopefully
18:11 <fungi> currently it's down around 39 failure e-mails, so yes i'll be curious to see how much it drops
18:12 <fungi> reducing it by a quarter would be great
18:19 <elodilles> hmmm, i've found a weird thing... at least i can't figure out how it is possible... horizon's stable/stein is deleted, but horizon's periodic stable jobs are still running against stein... i don't even see in the zuul inventory file where it comes from. this is an example: https://zuul.opendev.org/t/openstack/build/5d3168053f224f1ebe9953d1055a437f
18:19 <fungi> when was the branch deleted?
18:20 <elodilles> let me check it.... (but not today)
18:21 <elodilles> (here it should have a line with a zuul yaml file from horizon's repo: https://zuul.opendev.org/t/openstack/build/5d3168053f224f1ebe9953d1055a437f/log/zuul-info/inventory.yaml#38-43 )
18:22 <elodilles> fungi: it was deleted on August 16th
18:23 <fungi> looks like it ran with master of horizon: https://zuul.opendev.org/t/openstack/build/5d3168053f224f1ebe9953d1055a437f/log/zuul-info/inventory.yaml#122
18:23 <fungi> presumably because it couldn't find the stable/stein branch
18:25 <elodilles> :S
18:25 <elodilles> but why does it work differently than other repos?
18:26 <fungi> a very good question. maybe some cache entry didn't get correctly invalidated? clarkb ^ i don't suppose you have any ideas?
18:27 <elodilles> the branch was deleted successfully & i don't see stable/stein on opendev.org/openstack/horizon either, so it is how it should be
18:28 <fungi> https://opendev.org/openstack/openstack-zuul-jobs/src/branch/master/zuul.d/project-templates.yaml#L2622-L2670 does have branch matchers for stein, but i don't think that should cause zuul to trigger the jobs for those branches when they don't exist
18:30 <elodilles> for all the other repos this works fine ^^^ it only triggers a job if the given repo has a stable/stein branch with the 'periodic-stable-jobs' template in it
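The branch matchers being discussed are regular expressions tested against branch names; they say nothing about whether a branch still exists, which is why fungi's point holds. A rough sketch of that name-only matching (the regex below is illustrative, not copied from project-templates.yaml):

```python
import re

# Hypothetical branch matcher in the style of openstack-zuul-jobs templates
STEIN_MATCHER = re.compile(r"^stable/(queens|rocky|stein)$")

def branch_matches(branch: str) -> bool:
    """True if the template's branch matcher would apply to this branch name."""
    return STEIN_MATCHER.match(branch) is not None

print(branch_matches("stable/stein"))  # True: the matcher applies to the name
print(branch_matches("master"))        # False: master is not in the pattern
```

A matching name is only one precondition; the scheduler also has to believe the branch exists, so a stale branch-existence record (the cache theory raised above) is a separate failure mode from the matcher itself.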
18:30 <frickler> https://zuul.opendev.org/t/openstack/project/opendev.org/openstack/horizon?branch=stable%2Fstein&pipeline=periodic-stable looks like they may be defined in project-config?
18:31 <fungi> oh, good catch... https://zuul.opendev.org/t/openstack/freeze-job?pipeline=periodic-stable&project=openstack%2Fhorizon&job=openstack-tox-docs&branch=stable%2Fstein
18:34 <elodilles> so is this defined somewhere in opendev.org/openstack/project-config ? (i don't find it :-o)
18:35 <frickler> I was suspecting official-openstack-repo-jobs but that's not it
18:36 <fungi> well, the pipeline is defined in project-config and the horizon project is as well, but yes i'm still not clear what in there is telling zuul to trigger jobs on a nonexistent branch
18:37 <fungi> zuul doesn't have that same job freeze for other projects stable/stein was deleted from, based on spot checks
18:38 <fungi> it definitely seems like a leftover/stale cached entry for that branch, like zuul never found out about the deletion
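The spot checks above repeat the same freeze-job lookup with different parameters. A small sketch that just builds such a query URL, so other project/branch combinations can be checked systematically (the endpoint path and parameter names are taken from the link above; whether they generalize to other Zuul deployments is an assumption):

```python
from urllib.parse import urlencode

def freeze_job_url(tenant, pipeline, project, job, branch):
    """Build a Zuul freeze-job URL like the one linked above."""
    base = f"https://zuul.opendev.org/t/{tenant}/freeze-job"
    query = urlencode({"pipeline": pipeline, "project": project,
                       "job": job, "branch": branch})
    return f"{base}?{query}"

url = freeze_job_url("openstack", "periodic-stable",
                     "openstack/horizon", "openstack-tox-docs", "stable/stein")
print(url)
```

`urlencode` percent-encodes the slashes in the project and branch names, reproducing the `openstack%2Fhorizon` / `stable%2Fstein` form seen in fungi's link.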
18:38 <frickler> maybe that's related to the "refs/heads" issue. could it also affect branch deletions?
18:38 <fungi> maybe. what introduced the regression?
18:39 <frickler> I don't know, need to check with corvus
18:40 <fungi> if it ended up in a zuul restart between previous branch deletions and the horizon stein deletion, then that seems suspect yeah
18:44 <elodilles> yepp, the first mail with horizon's failing stable/stein periodic is from August 17th, so right after the branch was deleted
18:49 <frickler> other branches deleted on August 16 do not show this issue, though
18:49 <elodilles> only horizon
18:49 <frickler> yes. moving to #opendev to get corvus involved
19:00 <frickler> I cannot find openstack/compute-hyperv in governance, is that some kind of oversight?
19:00 <fungi> it should have been retired when the winstackers team ceased to exist, but maybe that got missed
19:01 <frickler> oh, yes, I just found https://review.opendev.org/c/openstack/governance/+/886880
19:01 <frickler> that was not as long ago as I expected
19:01 <elodilles> yepp, just wanted to say it should be in legacy.yaml
19:02 <frickler> but https://opendev.org/openstack/compute-hyperv seems untouched
19:02 <frickler> even retirement would usually only empty the default branch, not the stable branches, right?
19:03 <fungi> openstack/os-win, openstack/oswin-tempest-plugin and openstack/networking-hyperv also needed retiring
19:03 <fungi> frickler: we'd also remove them from zuul's config though
19:04 <fungi> so jobs would stop being run
19:04 <frickler> ah, that would solve my current issue, right
19:04 <frickler> so moving this topic to #-tc
19:04 <elodilles> it should look like this for example: https://opendev.org/openstack/panko
19:05 <frickler> ack
19:05 <elodilles> i guess this was missed as the team disappeared from the community :/
19:06 <gmann> frickler: fungi: elodilles: yes, the complete winstackers project is retired and I am working on cleaning up the deps (like the nova hyper-v driver) etc. and the repo content
19:06 <elodilles> gmann: ack, thanks!
22:11 *** bauzas_ is now known as bauzas

Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!