clarkb | Just about meeting time | 18:59 |
---|---|---|
ianw | o/ | 19:01 |
clarkb | #startmeeting infra | 19:01 |
opendevmeet | Meeting started Tue Aug 9 19:01:30 2022 UTC and is due to finish in 60 minutes. The chair is clarkb. Information about MeetBot at http://wiki.debian.org/MeetBot. | 19:01 |
opendevmeet | Useful Commands: #action #agreed #help #info #idea #link #topic #startvote. | 19:01 |
opendevmeet | The meeting name has been set to 'infra' | 19:01 |
clarkb | #link https://lists.opendev.org/pipermail/service-discuss/2022-August/000351.html | 19:01 |
fungi | ahoy | 19:01 |
frickler | \o | 19:02 |
clarkb | #topic Announcements | 19:02 |
clarkb | First up we are currently in the middle of the opendev service coordinator nomination period. It will run through August 16, 2022 | 19:02 |
clarkb | #link https://lists.opendev.org/pipermail/service-discuss/2022-July/000347.html for details. | 19:02 |
clarkb | If you are interested please go for it :) | 19:02 |
clarkb | The other thing to announce (which should've gone on the meeting agenda) is that the PTG is a virtual event now | 19:03 |
clarkb | That doesn't change a whole lot for us | 19:04 |
clarkb | But we should probably expect more people trying to use meetpad during that week (and ptgbot was going to be used either way) | 19:04 |
clarkb | More details on that change will be coming out as things get sorted out | 19:04 |
clarkb | #topic Updating Grafana Management Tooling | 19:06 |
clarkb | ianw: I see some but not all of the changes have merged? | 19:06 |
clarkb | #link https://review.opendev.org/q/topic:grafana-json | 19:06 |
clarkb | I think there were problems with newer grafana image updates (they release betas under the :latest tag?) | 19:06 |
clarkb | Any chance you and/or frickler can catch us up on this item? | 19:07 |
ianw | yes, unclear if that is a bug or a feature | 19:07 |
ianw | the release of betas | 19:07 |
ianw | anyway, i dug into that and it got fixed | 19:08 |
ianw | #link https://github.com/grafana/grafana/issues/53275 | 19:08 |
ianw | the last thing is cleaning up the jobs | 19:08 |
clarkb | And that appears to be two changes that just need reviews | 19:09 |
clarkb | #link https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/851951 | 19:09 |
clarkb | #link https://review.opendev.org/c/openstack/project-config/+/851954 | 19:09 |
ianw | i just went and switched their topics to grafana-json, sorry, to keep them together | 19:10 |
ianw | but yeah, a couple of eyes on that and i think this is done | 19:10 |
clarkb | https://grafana.opendev.org/d/f3089338b3/nodepool-dib-status?orgId=1 that does show different colors for build status now. Which is sort of what kicked this all off again | 19:10 |
clarkb | Thank you for getting this done | 19:10 |
ianw | heh, yeah, i had it before that but trying to fix that was what pushed me to get it in :) | 19:10 |
frickler | did we switch to running grafana:latest again? do we want that? | 19:11 |
ianw | frickler: i think we probably do want that, otherwise we fall behind and it's just more painful | 19:11 |
frickler | I'd actually prefer to avoid beta versions if possible | 19:11 |
frickler | but it's a thin line, I admit | 19:12 |
fungi | seems like they need not only a grafana:latest but also a grafana:greatest | 19:12 |
ianw | i feel like it gives us generally one problem to sort out at a time, instead of updating every X months/years and having many problems all together | 19:12 |
clarkb | Looks like they don't have an 8 series tag or similar that we can just hang out on? But ya I think chances are we'll just end up super out of date if we aren't on latest | 19:12 |
ianw | there was an issue opened about them releasing the beta ... | 19:12 |
ianw | #link https://github.com/grafana/grafana/discussions/47177 | 19:13 |
clarkb | cool sounds like there are some people asking for a stable tag which would probably also work for us | 19:14 |
ianw | i guess ideal for us would be production on a stable, and we can run a -devel job | 19:14 |
ianw | i mean, it might be said we have better CI than upstream, since we found and bisected it before any of its testing noticed :) | 19:15 |
clarkb | heh, but ya seems like that discussion is headed the right direction for our needs. Hopefully they enact the requested changes | 19:15 |
ianw | (let they who have not released something buggy cast the first stone ... but we can certainly help :) | 19:15 |
fungi | given their business model, making it easier to run up-to-date stable systems may not be aligned with their agenda anyway | 19:16 |
clarkb | Anything else on this subject? | 19:16 |
ianw | not from me, thanks | 19:16 |
clarkb | #topic Bastion Host Updates | 19:16 |
clarkb | I haven't seen any movement on this (which is fine), but I've also been focused on mailman3 stuff so wanted to double check I didn't miss anything important | 19:17 |
clarkb | anything new to add to this? | 19:17 |
ianw | yeah i've started looking at getting ansible into a venv, to isolate things better | 19:17 |
ianw | #link https://review.opendev.org/q/topic:bridge-ansible-venv | 19:18 |
ianw | so far it's cleanups | 19:18 |
clarkb | But they are ready for review? | 19:18 |
ianw | those bits can be reviewed, they're independent | 19:19 |
clarkb | great. I'll put them on my list | 19:19 |
ianw | it is tangential, but related | 19:20 |
ianw | #link https://review.opendev.org/q/topic:gra1-bump-timeouts | 19:20 |
ianw | (because these changes are all running jobs that sometimes fail) | 19:20 |
clarkb | oh I completely missed there was a parent change to review there | 19:20 |
clarkb | That's what I get for doing code review first thing in the morning | 19:20 |
ianw | that became a bit of a yak shaving exercise because updating the borg job ran it, which made me see it was failing | 19:20 |
ianw | i also now have borg 1.2 updates on my todo list. however the first item in the upstream upgrade check-list is "do you really want to run borg 1.2, the 1.1 branch is still getting fixes" ... so that didn't exactly scream "do this now" :) | 19:22 |
clarkb | I guess that is good of them to call out :) | 19:22 |
fungi | i missed the parent change as well | 19:23 |
ianw | i think we've seen that before, that old pip doesn't know to not try and install later packages? | 19:23 |
fungi | yeah, it's the abi3 stuff | 19:24 |
fungi | newer wheels can declare they support abi3 rather than specific interpreter versions | 19:24 |
fungi | but older pip knows nothing about abi3 | 19:24 |
clarkb | there is also the thing where old pip doesn't know to check the required python version metadata on pypi | 19:24 |
fungi | yeah, if it's too old, there's that as well | 19:24 |
ianw | yeah i think its metadata here | 19:24 |
fungi | oof | 19:25 |
fungi | though usually not checking requires_python metadata means you end up trying to install too-new packages which lack support for the interpreter version you have | 19:25 |
clarkb | ya then that fails | 19:25 |
clarkb | er the software itself fails to run | 19:25 |
clarkb | in any case I'll need to review the changes to page in the context here | 19:26 |
ianw | it did make me think that ensure-pip in zuul-jobs exports the command to get a virtualenv | 19:26 |
clarkb | I'll try to do that soon | 19:26 |
ianw | which we could make "bash -c 'python -m venv <path>; <path>/bin/pip install --upgrade pip setuptools'" ... maybe | 19:27 |
ianw | i don't know if we could template in the path | 19:27 |
ianw | anyway, no need to discuss here, but something to keep an eye on i guess | 19:27 |
ianw | it's probably actually only bionic that has this issue with old versions, so impact is limited | 19:28 |
clarkb | ok | 19:28 |
clarkb | #topic Upgrading Bionic Servers to Focal/Jammy | 19:28 |
ianw | (just hitting us because our bridge is that ... so back to the topic :) | 19:29 |
clarkb | #link https://etherpad.opendev.org/p/opendev-bionic-server-upgrades Notes on the work that needs to be done. | 19:29 |
clarkb | The changes I had made to support mailman3 on jammy which would generally support jammy updates have landed | 19:29 |
clarkb | ianw: that borg work is also related ? | 19:29 |
ianw | yeah, there's a follow-on that adds borg testing on a jammy node. no reason it wouldn't work, but good to cover it | 19:29 |
clarkb | got it and ++ to having test coverage there | 19:30 |
clarkb | I think if we can avoid updating to focal from bionic and jump to jammy we'll save ourselves future work so getting the jammy bootstrapping done is worthwhile | 19:30 |
clarkb | As mentioned I've been somewhat focused on mailman3 stuff but I think that is starting to solidify so I'm hoping to have time for an actual upgrade or two in the near future | 19:31 |
clarkb | But I didn't have anything else on this topic | 19:32 |
clarkb | #topic Mailman 3 | 19:32 |
clarkb | #link https://review.opendev.org/c/opendev/system-config/+/851248 WIP change to deploy a mailman 3 instance | 19:32 |
clarkb | I think the deployment aspects of this change are largely done at this point. (though I may have discovered a new bug I'll push a silly workaround for shortly). | 19:32 |
clarkb | There is still a fair bit of work to do around figuring out how we want to configure mailman3 and what our list settings should look like. But I've got a skeleton framework for addressing that in ansible in the change as well | 19:33 |
clarkb | There is a held node 104.130.26.212 which we'll use to try and answer some of those questions. I'm enlisting fungi's help because it involves email and things that I just don't understand as well as others | 19:33 |
clarkb | If you are interested I think reviewing the change at this point and/or checking out the server would be helpful | 19:34 |
fungi | yeah, i have a basic test scenario from my earlier mm3 poc i want to run through, creating test lists on multiple domains | 19:35 |
fungi | manual testing, that is | 19:35 |
clarkb | also don't worry about breaking things. Holding a new node isn't difficult | 19:35 |
clarkb | definitely poke at it and see what we can improve. Ideally when we get around to doing the migration people will for the most part not notice other than that the UI and user database has changed | 19:36 |
clarkb | I am slightly concerned that the mailman service is going from simple and straightforward to giant django monolith. But the new thing should support our use cases better and it has a future so I'm rolling with it | 19:37 |
clarkb | django in particular is not the easiest thing to automate around which the ansible illustrates | 19:38 |
ianw | yeah they love a big old turing complete configuration file | 19:39 |
fungi | you could say the same of exim | 19:40 |
* fungi won't speak ill of exim though | 19:40 |
clarkb | ha | 19:41 |
clarkb | I think that is all on mailman3, mostly it is just ready for your feedback and help :) | 19:41 |
clarkb | #topic Gitea 1.17 | 19:41 |
clarkb | Gitea 1.17 has been out for about a week now | 19:41 |
clarkb | #link https://review.opendev.org/c/opendev/system-config/+/847204 | 19:42 |
clarkb | We have a change to upgrade to it if we like. | 19:42 |
clarkb | However it looks like 1.17.1 is in the works and will include a number of bugfixes | 19:42 |
clarkb | #link https://github.com/go-gitea/gitea/milestone/122 1.17.1 Milestone is in progress | 19:42 |
clarkb | We're probably better off just waiting for that since there isn't a pressing need to upgrade right now. The change for 1.17.0 shouldn't be much different than the one for 1.17.1 though. The only difference I expect is the tag version in the docker file | 19:43 |
clarkb | that means if you want to review that now it won't be wasted effort | 19:43 |
clarkb | #topic Open Discussion | 19:43 |
clarkb | That was everything I had. Anything else before we find $meal? | 19:44 |
fungi | i didn't have anything | 19:46 |
ianw | nope. i still need to finish up the ansible-lint upgrades, but thanks for reviews on that one last week | 19:46 |
ianw | #link https://github.com/ansible/ansible/issues/78423 | 19:46 |
ianw | if you're interested in the python/ansible versions interaction | 19:47 |
clarkb | wow ansible 5 will still talk to python2.6 on the target nodes | 19:48 |
clarkb | Sounds like that is it. Thank you everyone! | 19:50 |
clarkb | We'll be back here next week same time and location | 19:50 |
fungi | thanks clarkb! | 19:50 |
corvus | i don't think we ever had py26 support in zuul-jobs? | 19:50 |
clarkb | corvus: we didn't. Mostly just surprised that upstream does it | 19:50 |
clarkb | #endmeeting | 19:50 |
opendevmeet | Meeting ended Tue Aug 9 19:50:46 2022 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4) | 19:50 |
opendevmeet | Minutes: https://meetings.opendev.org/meetings/infra/2022/infra.2022-08-09-19.01.html | 19:50 |
opendevmeet | Minutes (text): https://meetings.opendev.org/meetings/infra/2022/infra.2022-08-09-19.01.txt | 19:50 |
opendevmeet | Log: https://meetings.opendev.org/meetings/infra/2022/infra.2022-08-09-19.01.log.html | 19:50 |
corvus | so that means that zuul-jobs might be holding onto py27 for a while | 19:50 |
clarkb | ya py27 definitely still valid for ansible 5 on the target node according to that table | 19:51 |
corvus | which might (i haven't checked) mean we might end up asking to keep older distros around for that? | 19:51 |
clarkb | corvus: jammy still has python2.7 https://packages.ubuntu.com/source/jammy/python2.7 | 19:51 |
corvus | oh interesting | 19:52 |
clarkb | I think we may be able to get away with jammy test nodes in zuul | 19:52 |
corvus | ianw: thats good info, thanks for digging :) | 19:52 |
ianw | yep; sorting out the py27 testing env to something more sustainable was on my todo to finalise that series now :) | 19:53 |
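A few of the technical threads above lend themselves to short illustrative sketches. First, the Grafana image-pinning trade-off discussed at 19:11-19:14: the image name below is Grafana's real Docker Hub repository, while the exact version tag shown is an illustrative assumption, not taken from the log.

```shell
# Track :latest in production: stays current, but occasionally pulls a
# beta (the behaviour reported in grafana/grafana#53275 above).
docker pull docker.io/grafana/grafana:latest

# Pin an exact release tag instead: no surprise betas, but the tag must
# be bumped by hand or the deployment falls behind.
# (Illustrative tag only; check Docker Hub for current releases.)
docker pull docker.io/grafana/grafana:9.0.7
```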
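Second, the old-pip failure modes fungi and clarkb describe at 19:23-19:25, sketched with standard pip commands (`pip debug` is a real subcommand, though officially marked unstable); the comments restate the behaviour from the log plus general pip history, hedged accordingly.

```shell
# List the wheel tags the local pip accepts; a modern pip includes abi3
# entries (e.g. cp36-abi3-manylinux1_x86_64). An old pip that predates
# abi3 support omits them, so it skips abi3-only wheels entirely.
pip debug --verbose | grep -i abi3

# Requires-Python filtering was added around pip 9; an older pip ignores
# that metadata and can resolve to a release too new for the running
# interpreter, which then fails at runtime rather than at install time.
pip --version
pip install --upgrade pip setuptools
```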
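Finally, the venv bootstrap ianw floats at 19:26-19:27 for ensure-pip, written out as plain shell. The variable name is a hypothetical stand-in for whatever zuul-jobs would template in (the open question ianw raises at 19:27).

```shell
# Create an isolated venv and immediately upgrade pip/setuptools inside
# it, so the (possibly ancient) platform pip never installs anything.
VENV_PATH=/path/to/venv   # assumed name; zuul-jobs would template this in
python3 -m venv "$VENV_PATH"
"$VENV_PATH/bin/pip" install --upgrade pip setuptools
```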