Thursday, 2022-10-13

06:40 <opendevreview> Ghanshyam proposed openstack/project-team-guide master: Clarify Extended Maintenance branch testing and support policy  https://review.opendev.org/c/openstack/project-team-guide/+/861141
07:49 *** pojadhav is now known as pojadhav|afk
08:11 <frickler> fyi I added the current list of config-error affected projects to https://etherpad.opendev.org/p/zuul-config-error-openstack in order to show the scope of the issue
08:16 *** pojadhav|afk is now known as pojadhav
09:02 <noonedeadpunk> gmann: sure, will do this today
13:00 *** dasm|off is now known as dasm
13:26 *** knikolla[m] is now known as knikolla
13:32 *** blarnath is now known as d34dh0r53
14:42 <gmann> frickler: thanks
14:51 <gmann> tc-members: weekly meeting here in ~9 minutes from now
14:54 <jungleboyj> \o/
14:55 <gmann> noonedeadpunk: thanks
15:00 <gmann> #startmeeting tc
15:00 <opendevmeet> Meeting started Thu Oct 13 15:00:14 2022 UTC and is due to finish in 60 minutes.  The chair is gmann. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00 <opendevmeet> The meeting name has been set to 'tc'
15:00 <JayF> o/
15:00 <gmann> #topic Roll call
15:00 <gmann> o/
15:00 <noonedeadpunk> o/
15:00 <knikolla> o/
15:00 <slaweq> o/
15:00 <dansmith> o/
15:01 <gmann> in the Absence section today:
15:01 <gmann> arne_wiebalck will miss the meeting on Oct 13
15:01 <gmann> rosmaita will miss Oct 13
15:02 <gmann> let's start
15:02 <gmann> #link https://wiki.openstack.org/wiki/Meetings/TechnicalCommittee#Next_Meeting
15:02 <gmann> today's agenda ^^
15:02 <spotz> o/
15:02 <gmann> #topic Follow up on past action items
15:02 <gmann> gmann to update the wording on the EM branch status and the reality of the maintenance policy/expectation
15:03 <gmann> I proposed a patch yesterday #link https://review.opendev.org/c/openstack/project-team-guide/+/861141
15:03 <gmann> please review ^^
15:03 <gmann> #topic Gate health check
15:03 <gmann> any news on the gate?
15:03 <dansmith> so, I have seen a bunch of POST_FAILUREs lately
15:03 <gmann> ohk
15:04 <dansmith> I don't have specific pointers, I might be able to find some, but they also seem undebuggable since there are no logs
15:04 <dansmith> I dunno if anyone else has been seeing that, or what might be causing it
15:04 <slaweq> nope, I didn't, at least not in Neutron-related projects
15:05 <dansmith> here's an example: https://zuul.opendev.org/t/openstack/build/3c40559f664543359c0109b28dc07656
15:05 <dansmith> anyway, I'll start collecting links for next week if I keep seeing them
15:06 <gmann> seems nothing is showing in the console either
15:06 <dansmith> there was one run where there were like six jobs that all POST_FAILUREd, so it seemed systemic
15:06 <noonedeadpunk> it's actually interesting, as logs are present
15:07 <noonedeadpunk> and there's not much stuff to be executed afterwards
15:07 <dansmith> yeah, but I didn't see any reason for the failure.. some of the other ones were just no logs, which are hard to debug
15:07 <slaweq> maybe it's timing out on sending logs to swift?
15:07 <slaweq> I think we had that kind of issue a few weeks ago in the Neutron functional tests job
15:07 <noonedeadpunk> well, when there are no logs it's highly likely related to swift
15:07 <dansmith> in the no-logs case, perhaps
15:08 <dansmith> anyway, maybe just something to keep an eye out for
15:08 <noonedeadpunk> we had issues when uploading logs to swift took more than 30 mins
15:08 <noonedeadpunk> and it was some specific provider just being slow
15:08 <gmann> yeah
15:08 <fungi> failures during log upload can be challenging to expose, since it's a chicken-and-egg problem (zuul relies on being able to serve the logs to provide results as it doesn't store that data elsewhere)
15:08 <dansmith> yup
15:08 <noonedeadpunk> but even with logs present it can be the same - the timeout for post jobs is 30 mins
15:09 <fungi> but if someone has a recent list of ones that are suspect, i can try to find causes (likely tracebacks from zuul) in the executor service logs
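
For reference, the suspect POST_FAILURE builds fungi offers to dig into can be
pulled from Zuul's public builds API. A minimal sketch, assuming the `requests`
library and the openstack tenant on zuul.opendev.org; the result/limit filters
are illustrative:

    # List recent POST_FAILURE builds from Zuul's public builds API.
    import requests

    ZUUL_BUILDS_API = "https://zuul.opendev.org/api/tenant/openstack/builds"

    resp = requests.get(
        ZUUL_BUILDS_API,
        params={"result": "POST_FAILURE", "limit": 50},
        timeout=30,
    )
    resp.raise_for_status()

    for build in resp.json():
        # log_url can be empty when the log upload itself is what failed
        print(build["uuid"], build["project"], build["job_name"],
              build.get("log_url") or "(no logs)")
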
15:09 <gmann> one thing I can see is the 'process-stackviz: pip' role taking 10 min
15:09 <gmann> but not sure if that is causing the timeout
15:10 <fungi> there was a recent complaint about aodhclient hitting timeouts from pip's dep solver taking too long because they don't use constraints. maybe the problem is more widespread (i don't know if the stackviz installation is constrained)
15:10 <noonedeadpunk> I actually don't think it's a timeout for the mentioned example - it ended at 00:16:19 and the post jobs started 3 mins earlier
15:11 <gmann> usually it takes 20-30 sec
15:11 <gmann> fungi: that is not constrained I think, it is the latest published one we use?
15:12 <fungi> gmann: it has dependencies though, right?
15:12 <gmann> yeah
15:12 <fungi> those are what i'm talking about possibly taking time to dep-solve if they aren't constrained
15:13 <gmann> I checked some other results and they were ok
15:13 <fungi> anyway, expect that there may be multiple causes for the observed failures, but we can try to classify them and divide/conquer
15:13 <gmann> yeah, let's check and debug later if it occurs more often
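
On the constraints point, the usual OpenStack mitigation is to install with the
upper-constraints file so pip never has to backtrack through old releases while
resolving dependencies. A rough sketch, assuming the published constraints
redirect URL and using stackviz purely as an example package:

    # Install a package with OpenStack upper-constraints applied; the constraints
    # pin every transitive dependency, avoiding long pip resolver backtracking.
    import subprocess
    import sys

    UPPER_CONSTRAINTS = "https://releases.openstack.org/constraints/upper/master"

    subprocess.run(
        [sys.executable, "-m", "pip", "install",
         "-c", UPPER_CONSTRAINTS, "stackviz"],
        check=True,
    )
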
15:13 <gmann> any other failures observed in the gate?
15:14 <gmann> Bare 'recheck' state
15:14 <gmann> #link https://etherpad.opendev.org/p/recheck-weekly-summary
15:14 <gmann> slaweq: please go ahead
15:14 <slaweq> things are good
15:15 <slaweq> what is IMO worth mentioning is the fact that we have fewer and fewer teams with 100% bare rechecks in the last 7 days
15:15 <gmann> nice
15:15 <slaweq> so it is improving slowly :)
15:15 <gmann> great
15:16 <gmann> Zuul config errors
15:16 <gmann> #link https://etherpad.opendev.org/p/zuul-config-error-openstack
15:16 <JayF> Do we have any documentation on not doing bare rechecks that can be sent to contributors who do them?
15:16 <gmann> frickler added the projects affected by zuul config errors to the etherpad
15:16 <JayF> I continue to work, with other Ironic contributors, to get these config errors wrapped up in Ironic.
15:17 <gmann> JayF: yes, this one #link https://docs.openstack.org/project-team-guide/testing.html#how-to-handle-test-failures
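
The convention in that guide, roughly: a recheck comment should state why the
failure is believed to be unrelated to the change, rather than being a bare
"recheck". An illustrative Gerrit comment (the wording and failure description
are made up for the example):

    recheck tempest job hit an unrelated POST_FAILURE, logs never uploaded

as opposed to a comment containing only the word "recheck".
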
15:17 <gmann> knikolla: please go ahead if there are any updates
15:17 <knikolla> i'm updating the etherpad as i go. pushed patches for zaqar and senlin today
15:17 <fungi> be aware the configuration may be on any branch; that list in the pad isn't differentiating between branches, but the details in the zuul webui will tell you the project+branch+file
15:18 <knikolla> starting with projects that have errors in the master branch
15:18 <gmann> +1, nice
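
Those per-branch details can also be pulled in bulk from Zuul's config-errors
API. A sketch assuming the openstack tenant endpoint and the response layout the
web UI consumes (a list of entries with a source_context and an error message):

    # List Zuul configuration errors for the openstack tenant, keeping only the
    # ones whose broken configuration lives on a project's master branch.
    import requests

    ERRORS_API = "https://zuul.opendev.org/api/tenant/openstack/config-errors"

    resp = requests.get(ERRORS_API, timeout=30)
    resp.raise_for_status()

    for entry in resp.json():
        ctx = entry.get("source_context") or {}
        if ctx.get("branch") == "master":
            message = (entry.get("error") or "").splitlines()
            print(ctx.get("project"), ctx.get("path"),
                  "-", message[0] if message else "")
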
15:18 <knikolla> adjutant's gate had been broken for a year, so I pushed a fix for that as well.
15:18 <fungi> broken for... a year?!?
15:18 <knikolla> yes. the requirements.txt specified django <2.3
15:18 <fungi> doesn't that mean the project is just de facto retired then?
15:18 <gmann> knikolla: thanks
15:19 <knikolla> the upper constraint specified == 3.2
15:19 <gmann> there is only one maintainer (the PTL) there, you can add him to the review to get it merged
15:19 <noonedeadpunk> django version?
15:19 <noonedeadpunk> ah yes, sorry, missed that
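
The adjutant breakage is the classic pattern of a requirements.txt cap silently
excluding the version that upper-constraints actually installs. That kind of
conflict is easy to check mechanically; a sketch using the `packaging` library,
where the exact specifier and the 3.2.x pin are illustrative stand-ins for the
"<2.3" cap and "== 3.2" constraint mentioned above:

    # Flag a requirement whose specifier excludes the upper-constraints pin.
    from packaging.requirements import Requirement
    from packaging.version import Version

    requirement = Requirement("Django>=2.2,<2.3")   # old cap in requirements.txt
    constrained = Version("3.2.16")                 # illustrative u-c pin

    if not requirement.specifier.contains(constrained, prereleases=False):
        print(f"{requirement.name}: specifier '{requirement.specifier}' "
              f"conflicts with upper-constraint {constrained}")
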
15:20 <knikolla> will do. i'm using this as an exercise in getting to know the PTLs of the smaller teams
15:20 <gmann> perfect
15:20 <fungi> not-so-gentle reminder: projects with development blocked by broken job configuration which goes unaddressed for a very long time - that should be a strong signal that they can just be retired
15:20 <gmann> let's continue that, and everyone can pick up a few projects to fix in their available time
15:21 <gmann> we have a new volunteer who took over this project, let's see how it goes now.
15:21 <knikolla> fungi: totally agree
15:21 <fungi> keeping track of those situations would be a good idea, even if someone steps in to fix the testing for them
15:21 <gmann> we had a long time with no maintainer there, so a broken gate is very much possible
15:21 <gmann> sure
15:21 <gmann> anything else on zuul config errors or gate health?
15:22 <gmann> thanks knikolla for helping here
15:22 <noonedeadpunk> yeah, but adjutant was working nicely with the u-c for django - we were testing it in osa
15:22 <gmann> ack
15:22 <fungi> it just had no development for a year straight, and no maintainers to address any bugs in it
15:23 <noonedeadpunk> we will go back to zun then now
15:23 <gmann> true, but as we have a new maintainer let's wait this cycle and see how it goes
15:23 <noonedeadpunk> +1
15:23 <gmann> #topic 2023.1 cycle PTG Planning
15:24 <gmann> #link https://etherpad.opendev.org/p/tc-leaders-interaction-2023-1
15:24 <gmann> #link https://etherpad.opendev.org/p/tc-2023-1-ptg
15:24 <gmann> please add topics to the etherpad; by Friday I might try to schedule the topics that are present
15:25 <gmann> one piece of news: I talked to the kubernetes steering committee members about attending the TC PTG session, like we have had in the past couple of PTGs
15:26 <gmann> and two members, Tim and Christoph, agreed to attend the session on Friday 21 Oct 16:00 - 17:00 UTC
15:26 <knikolla> awesome!
15:26 <gmann> feel free to add any topic you would like to discuss with them - #link https://etherpad.opendev.org/p/tc-2023-1-ptg#L72
15:26 <JayF> I might add a topic w/r/t fungi's point from earlier
15:27 <JayF> about more proactively identifying projects that are unmaintained or insufficiently maintained
15:27 <slaweq> ++
15:27 <gmann> sure, we can discuss that
15:27 <spotz> I suspect some of the others might already be headed to Detroit that Friday
15:28 <gmann> as discussed in the last PTG, one thing we did for that is the emerging technology and inactive projects policy
15:28 <gmann> JayF: this one #link https://governance.openstack.org/tc/reference/emerging-technology-and-inactive-projects.html
15:28 <gmann> but it will be good to discuss it further
15:29 <gmann> we have enough slots for the TC, at least as per the current topics present in the etherpad, so feel free to add more
15:29 <gmann> but try to add them before Friday 4-5 PM central time
15:30 <gmann> Schedule 'operator hours'
15:30 <spotz> I can speak a little on this
15:30 <gmann> we have ~12 projects signed up for operator hours, which is good
15:30 <gmann> I sent an ML reminder also
15:30 <gmann> spotz: please go ahead
15:31 <gmann> #link https://lists.openstack.org/pipermail/openstack-discuss/2022-October/030790.html
15:31 <spotz> So we've put a link to all the operator hours in the main ops etherpad. Kendall put together a blog which has been mailed out to Large Scale and Public Cloud SIG members as well as attendees from past Ops Meetups. We've also tweeted and retweeted
15:32 <gmann> yeah, +1
15:32 <gmann> many are tweeting it. I did this week. spreading the information will be very helpful
15:32 <spotz> They've been invited to attend the enviro-sus meeting Monday morning for an intro-to-the-PTG type session, and on Thursday we hope to get feedback for the future, though there are more sessions on Friday
15:33 <gmann> #link https://etherpad.opendev.org/p/oct2022-ptg-openstack-ops
15:33 <gmann> central etherpad ^^
15:33 <gmann> spotz: thanks, please also ask them to join the IRC channel in case they have any difficulties joining the PTG or switching to projects' operator-hours
15:34 <fungi> i've also been reaching out where it makes sense. brought all those sessions up during the scs-tech meeting earlier today, for example
15:34 <gmann> +1
15:34 <gmann> fungi: thanks
15:34 <fungi> there seemed to be interest in the large-scale and public cloud sig meetings too
15:35 <gmann> especially if they join the #openinfra-events channel, there we can help them with joining issues and such
15:35 <gmann> great
15:35 <spotz> gmann: I'll mention that to Kendall for the Monday morning session
15:35 <spotz> I'll see her tonight
15:35 <gmann> spotz: cool, thanks
15:36 <gmann> anything else on the PTG topic?
15:36 <gmann> #topic 2023.1 cycle Technical Election & Leaderless projects
15:37 <gmann> one thing left in this: the appointment of the Zun project PTL.
15:37 <gmann> #link https://review.opendev.org/c/openstack/governance/+/860759
15:37 <gmann> hongbin volunteered to serve as PTL and the patch has a good amount of the required votes. it needs to wait until 15 Oct and I will merge it if there is no negative feedback
15:38 <gmann> #topic Open Reviews
15:38 <gmann> #link https://review.opendev.org/q/projects:openstack/governance+is:open
15:38 <gmann> we need reviews on this one #link https://review.opendev.org/c/openstack/governance/+/860599
15:39 <noonedeadpunk> actually, ^ that is a good topic
15:39 <noonedeadpunk> or well, there's quite a valid comment on it
15:40 <gmann> discuss it at the PTG?
15:40 <noonedeadpunk> on top of that we should actually decide what OS the grenade job for N+2 should run on
15:40 <gmann> sure, we can discuss it there next week. I will add it. thanks
15:40 <noonedeadpunk> as it's either leaving py3.8 on 2023.1 or backporting py3.10 to Y
15:41 <noonedeadpunk> and focal vs yammy
15:41 <noonedeadpunk> *jammy
15:41 <dansmith> mmm, yammy
15:41 <fungi> yummy yams
15:41 <gmann> :)
15:41 <dansmith> I think we have to run it on focal, right?
15:41 <dansmith> much easier to do that than anything else I think
15:41 <noonedeadpunk> for me - yes
15:41 <gmann> yammy could be a good name though
15:41 <dansmith> but we should discuss next week
15:42 <gmann> yeah, let's discuss and clarify things accordingly in the doc
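
The focal-vs-jammy question comes down to finding a platform both ends of the
skip-level (N-2) grenade job claim to support. A toy sketch of that check; the
platform sets below are illustrative placeholders, not the official
tested-runtimes lists:

    # Pick a node platform shared by the old and new release of an N-2 upgrade
    # job. The sets are placeholders for illustration only.
    tested_platforms = {
        "yoga": {"ubuntu-focal"},
        "2023.1": {"ubuntu-focal", "ubuntu-jammy"},
    }

    common = tested_platforms["yoga"] & tested_platforms["2023.1"]
    if common:
        print("run the yoga to 2023.1 grenade job on:", sorted(common)[0])
    else:
        print("no shared platform: backport newer python support to the old "
              "branch, or keep the older python tested on the new one")
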
15:42 <gmann> all other open reviews are in good shape
15:42 <noonedeadpunk> yeah, that's why I always mix up the first letter, as yammy is really a way better name :D
15:42 <slaweq> actually I have a question about https://review.opendev.org/c/openstack/governance/+/836888
15:42 <gmann> :)
15:42 <slaweq> should we maybe find a new volunteer who would work on this?
15:42 <spotz> I can't read Jammy without thinking of the dog I had with that name :(
15:42 <slaweq> or abandon it maybe?
15:42 <gmann> next week we will be at the PTG, so I will send the meeting cancellation to the ML
15:43 <gmann> slaweq: yeah, good point
15:43 *** tkajinam is now known as Guest2971
15:43 <gmann> jungleboyj: not sure if you will be able to update it or work on it? if not, then either of us can pick it up
15:43 <jungleboyj> Sorry, got pulled into another meeting.
15:45 <jungleboyj> Assume this is related to the review of the User Survey stuff?
15:45 <gmann> jungleboyj: yes
15:46 <jungleboyj> Ok.  I was planning to look at that this week before the PTG.  Other fires started.
15:46 <gmann> jungleboyj: thanks. ping us if you need help
15:46 <jungleboyj> I am going to try to look at it tomorrow or during the PTG next week so I can wrap it up.
15:46 <jungleboyj> gmann: I will.  Apologies.
15:46 <gmann> cool
15:46 <slaweq> thx jungleboyj++
15:46 <gmann> np!
15:46 <gmann> ok, with next week's meeting cancelled, our next weekly meeting will be on Oct 27.
15:46 <gmann> with that, that is all for today's meeting
15:47 <opendevreview> Merged openstack/governance master: Add project for managing zuul jobs for charms  https://review.opendev.org/c/openstack/governance/+/861044
15:47 <jungleboyj> ++
15:47 <slaweq> o/
15:47 <gmann> if there is nothing else to discuss then let's close it a little early (13 min before the hour)
15:47 <spotz> woohoo
15:48 <gmann> thanks everyone for joining.
15:48 <gmann> #endmeeting
15:48 <opendevmeet> Meeting ended Thu Oct 13 15:48:08 2022 UTC.  Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
15:48 <opendevmeet> Minutes:        https://meetings.opendev.org/meetings/tc/2022/tc.2022-10-13-15.00.html
15:48 <opendevmeet> Minutes (text): https://meetings.opendev.org/meetings/tc/2022/tc.2022-10-13-15.00.txt
15:48 <opendevmeet> Log:            https://meetings.opendev.org/meetings/tc/2022/tc.2022-10-13-15.00.log.html
15:48 <JayF> thanks, have a good one o/
15:48 <spotz> Thanks gmann and everyone!
15:48 <knikolla> see you all next week virtually!
15:48 <jungleboyj> Thanks!
15:58 <opendevreview> Ghanshyam proposed openstack/project-team-guide master: Clarify Extended Maintenance branch testing and support policy  https://review.opendev.org/c/openstack/project-team-guide/+/861141
17:59 *** Guest2868 is now known as diablo_rojo
23:52 *** dasm is now known as dasm|off
