Friday, 2021-07-23

00:07 *** rlandy|ruck|biab is now known as rlandy|ruck
04:41 *** bhagyashris__ is now known as bhagyashris
05:20 *** marios is now known as marios|ruck
05:29 *** pojadhav|afk is now known as pojadhav
05:51 <soniya29|rover> marios|ruck, I am facing certain issues while connecting with vpn
05:52 <marios|ruck> soniya29|rover: k there was a global dns issue yesterday so maybe related
05:52 <marios|ruck> soniya29|rover: which server you trying to conne
05:52 <marios|ruck> ...
05:52 <marios|ruck> :D
05:54 <soniya29|rover> marios|ruck, I tried connecting to Pune, Amsterdam, Beijing and even Red Hat Global VPN
05:54 <soniya29|rover> marios|ruck, but no luck
06:08 <marios|ruck> soniya|rover: same? anyway don't worry about vpn for now unless you need something d/stream or internal?
06:08 <marios|ruck> anyone else in india facing the same issue? pojadhav bhagyashris chandankumar o/ soniya|rover has problems with vpn today
06:08 <soniya|rover> marios|ruck, yeah, same. I have raised a ticket for it
06:08 * marios|ruck coffee brb
06:08 <marios|ruck> soniya|rover: fyi new gate blocker https://bugs.launchpad.net/tripleo/+bug/1937333 but resolution in progress
06:10 <pojadhav> marios|ruck, mine is connected from yesterday :D, hope so till now not disconnected..
06:10 <soniya|rover> marios|ruck, okay
06:12 <ykarel> soniya|rover, Pune vpn seems to have issues
06:12 <ykarel> try a different one, /me connected with singapore currently
06:14 <soniya|rover> ykarel, okay, give me a moment, I will just try it
06:18 <soniya|rover> ykarel, no luck with singapore as well
06:18 <soniya|rover> ykarel, I tried 3 times
06:18 <ykarel> soniya|rover, then possibly some issue with your local network
06:19 <ykarel> try changing to some other
06:29 <soniya29|rover> ykarel, tried with mobile hotspot but same issues
06:29 <bhagyashris> marios|ruck, yeah pune VPN has a problem, i can connect to Amsterdam
06:30 <marios|ruck> bhagyashris: thanks i think soniya29|rover already tried ams
06:31 <soniya29|rover> marios|ruck, bhagyashris, yes, recently tried again..but no success
06:48 <marios|ruck> zbr: ready to merge? https://review.opendev.org/c/openstack/tripleo-repos/+/801621/10#message-9de745c8952981b3b16b7606ef24044a903221c1
06:55 <marios|ruck> arxcruz: https://review.opendev.org/c/openstack/validations-libs/+/801623/4#message-f8aa96f56bb12defdd866189ad0a495b72ba31b4 :)
06:56 *** amoralej|off is now known as amoralej
07:01 *** marios|ruck is now known as marios
07:05 *** marios is now known as marios|ruck
08:06 *** marios_ is now known as marios
08:07 <zbr> marios|ruck: morning! yep, just added +W.
08:08 *** marios is now known as marios|ruck
08:08 <zbr> btw, are you aware of some problems with image building, i see multiple failures on a recheck on https://review.opendev.org/c/openstack/tripleo-ansible/+/787767
08:09 <zbr> asking just to know if i should just "recheck" again or wait
08:09 <zbr> i spotted a few places where old centos8 was still used instead of stream and that is not good, as the clock is ticking
08:10 <marios|ruck> zbr: doesn't look familiar 2021-07-22 16:18:20.888 85939 ERROR tripleoclient.v2.tripleo_container_image.Build Stderr: 'level=debug msg="Pull Policy for pull [PullIfNewer]"\nerror creating build container: Error initializing source docker://registry.access.redhat.com/ubi8:latest: error pinging docker registry registry.access.redhat.com: Get "http://registry.access.redhat.com/v2/": dial tcp: lookup
08:10 <marios|ruck> registry.access.redhat.com on 127.0.0.1:53: server misbehaving\nFailed to write to log, write /home/zuul/container-builds/61bbfae4-b1da-45d5-8409-ee5aa8bb2cd0/base/base-build.log: file already closed\nFailed to write to log, write /home/zuul/container-builds/61bbfae4-b1da-45d5-8409-ee5aa8bb2cd0/base/base-build.log: file already closed\n'
08:10 <marios|ruck> https://1b8c54b5d412a3af0fef-40bd60678638a1db566d5d37b438f20d.ssl.cf1.rackcdn.com/787767/2/check/tripleo-ci-centos-8-content-provider/8e1b645/logs/undercloud/home/zuul/container_image_build.log
08:11 <zbr> marios|ruck: do you suspect a problem with that line https://review.opendev.org/c/openstack/tripleo-ansible/+/787767/2/tripleo_ansible/roles/tripleo_container_image_build/defaults/main.yml ?
08:11 <zbr> like some jobs not being able to pull from quay?
08:11 <zbr> i could try to use a short name instead.
08:12 <zbr> but considering that previous runs failed in other jobs, i suspect that the problem may be of a different nature
08:14 <zbr> the run from 18h ago passed all jobs but one which failed in POST, so my guess is that this change should report green unless we encounter some flakiness
08:15 <marios|ruck> zbr: not sure if that's it but in general i haven't seen it elsewhere yet. though the gate is a bit of a mess from yesterday's outage and the new blocker today so not a lot running yet
08:38 *** chem is now known as Guest1877
09:07 *** ykarel is now known as ykarel|lunch
09:33 <marios|ruck> chandankumar: do you remember if that was seen just once? I only see one log in the description https://bugs.launchpad.net/tripleo/+bug/1934994
09:43 *** jpodivin_ is now known as jpodivin
10:12 <marios|ruck> soniya29|rover: o/ hey did you get a chance to check the trello cards yesterday? I didn't see any updates. i got up to the same place as yesterday, but going to go do something else now. would be great if you could have a look
10:12 <marios|ruck> soniya29|rover: thank you
10:12 <soniya29|rover> marios|ruck, i am over it
10:12 <marios|ruck> soniya29|rover: if you find some time please look at the 3rd card in degraded, i just commented on the 2nd one (https://trello.com/c/V6SrUOIe/2018-cixlp1934994tripleociproa-octavia-healthmonitorscenariotest-are-failing-with-load-balanceris-immutable-and-cannot-be-updated#comment-60fa92d374b26a6708c5b2fb )
10:13 <marios|ruck> soniya29|rover: so basically 3 cards
10:13 <marios|ruck> and one more in external
10:13 <marios|ruck> soniya29|rover: just try to catch up with what the status is, check the bug for any info that isn't on the card etc
10:13 <soniya29|rover> marios|ruck, ack
10:33 *** ykarel|lunch is now known as ykarel
11:35 <marios|ruck> bhagyashris: akahat: aware of something wrong with the promoter? logs look strange @ train http://38.102.83.109/promoter_logs/centos8_train.log no candidates? how can that be
11:37 <marios|ruck> chandankumar: do you know if something was done recently to the train promoter? ^^
11:38 <marios|ruck> :/
11:42 <bhagyashris> marios|ruck, looking
11:43 <marios|ruck> bhagyashris: thanks
11:53 <rlandy> marios|ruck: soniya29|rover: hey - reading through tripleo-ci chat
11:58 <marios|ruck> rlandy: yeah just added a couple points there, easier to summarize like that
11:58 <marios|ruck> brb coffee refresh
11:58 <rlandy> marios|ruck: yep - got it - thanks for doc'ing that
11:58 <rlandy> had a lot of failures last night
11:58 <rlandy> DNS outage
12:05 <marios|ruck> rlandy: yeah i saw ... this morning looked clear of that but it was hitting the new gate blocker
12:05 <rlandy> always a reason for joy
12:07 <marios|ruck> rlandy: fix is easy but the problem is getting it through the puppet-neutron gate https://review.opendev.org/c/openstack/puppet-neutron/+/801933/1#message-5a0eef40cd55034a153012c35c10fbfd49ab7912 and the current run is failing again https://zuul.openstack.org/status#801933
12:08 <chandankumar> marios|ruck: that https://bugs.launchpad.net/tripleo/+bug/1934994 was seen once, we proactively added it to the skip list so we had not seen further failures, Sean investigated it but there was not much in the logs, and the revert with testproject also passed and we have not seen more instances of the same. Bug is already marked as invalid.
12:09 <marios|ruck> chandankumar: k thanks for confirming. but let's not file bugs for things we see once. and especially let's not start skipping tempest tests for such things
12:09 <marios|ruck> chandankumar: otherwise we will be chasing our tails even more than we already do ;)
12:09 <chandankumar> marios|ruck: sure, :-)
12:19 <marios|ruck> bhagyashris: akahat: Looks like this started immediately after the last promotion...
12:19 <marios|ruck> bhagyashris: akahat: promotion at http://38.102.83.109/promoter_logs/centos8_train_2021-07-20T11:51.log  then the next logfile has the 'no candidates' http://38.102.83.109/promoter_logs/centos8_train_2021-07-20T12:03.log and it continues that way until today
12:20 <bhagyashris> marios|ruck, looks weird to me
12:21 <bhagyashris> because it is only happening for train
12:21 <marios|ruck> bhagyashris: yeah we had promotions for other branches after that
12:22 <marios|ruck> bhagyashris: but who knows what the status is there, i mean why is it different from master, did someone do something on that promoter recently?
12:22 <marios|ruck> akahat: ? ^
12:22 <akahat> marios|ruck, looking into it.
12:22 <bhagyashris> i need to check because i haven't touched the promoter
12:22 <bhagyashris> still looking into code
12:23 <akahat> marios|ruck, yes. that is expected.
12:23 <chandankumar> akahat: bhagyashris last week we merged this https://review.rdoproject.org/r/c/rdo-infra/ci-config/+/33815
12:23 <marios|ruck> bhagyashris: akahat: thanks folks it isn't a fire now but we totally need that resolved before monday/tuesday latest if we are going to promote before the program call
12:23 <marios|ruck> 15:23 < akahat> marios|ruck, yes. that is expected.
12:23 <akahat> we have turned off debug logs to reduce logs
12:23 <marios|ruck> akahat: the no candidates?
12:24 <akahat> marios|ruck, yes. no candidates to promote.
12:24 <akahat> are there any candidates ?
12:24 <marios|ruck> akahat: how is it expected
12:24 <bhagyashris> akahat, why only train
12:24 <marios|ruck> akahat: i'm sure there would be some candidates since the last promotion, i mean we have a new tripleo-ci-testing every day
12:24 <akahat> i mean if there are any candidates then the promoter is not detecting them
12:25 <akahat> for c8 train??
12:25 <marios|ruck> akahat: checking
12:25 <marios|ruck> tripleo-ci-testing/2021-07-23 06:13
12:25 <marios|ruck> https://trunk.rdoproject.org/centos8-train/
12:25 <marios|ruck> akahat: ^
12:26 <marios|ruck> current-tripleo/2021-07-20 07:12
12:27 <akahat> yeah.. there is a new candidate but it is not picking it up.
12:28 <akahat> need to check.
12:28 <marios|ruck> akahat: yeah that is the problem and it means that it will promote nothing until we fix it
12:28 <marios|ruck> akahat: please thanks
12:28 <rlandy> marios|ruck: akahat: it looks to be the same hash: https://trunk.rdoproject.org/centos8-train/current-tripleo/delorean.repo.md5 2a68d3b9e14d86b57db95487cadce0e2 and https://trunk.rdoproject.org/centos8-train/tripleo-ci-testing/delorean.repo.md5 2a68d3b9e14d86b57db95487cadce0e2
12:28 <marios|ruck> rlandy: wtf... how can that be?
12:29 <rlandy> https://trunk.rdoproject.org/centos8-train/current-tripleo-rdo/delorean.repo.md5
12:29 <rlandy> also the same
12:29 <marios|ruck> rlandy: does it just mean there is no newer content? but then why is it dated 23rd July?
12:29 <marios|ruck> 15:25 < marios|ruck> tripleo-ci-testing/2021-07-23 06:13
12:29 <marios|ruck> 15:26 < marios|ruck> current-tripleo/2021-07-20 07:12
12:29 <akahat> rlandy, yes. both are the same hash.. how is it possible... is dlrn drunk?
12:29 <akahat> i mean those are different dates.
12:29 <rlandy> marios|ruck: akahat: we'd have to check if any components rev'ed
12:30 <rlandy> jpena|off is PTO
12:30 <rlandy> but it would be right if nothing rev'ed
12:30 <akahat> oh.. this is promotion to current-tripleo-rdo: http://38.102.83.109/promoter_logs/centos8_train_2021-07-20T11:51.log
12:30 <marios|ruck> k i think we need a bug for that wdyt rlandy?
12:30 <rlandy> idk yet
12:30 <marios|ruck> rlandy: cos we may need something to point to for the program call if not resolved by then ...
12:30 <rlandy> let's confirm that nothing rev'ed
12:31 <marios|ruck> rlandy: right OK... soniya29|rover can you please follow ^^^ and if needed (ask before you do to confirm) can you please file a bug for that?
12:31 <soniya29|rover> marios|ruck, sure
12:31 <rlandy> (testenv) [rlandy@localhost ci-config]$ python3 roles/rrcockpit/files/telegraf_py3/ruck_rover.py --release train
12:31 <marios|ruck> soniya29|rover: more pointers for the issue in gchat (see PROMOTIONS - chasing master, train promoter broken?
12:31 <rlandy> marios|ruck: ^^ ran that
12:33 <marios|ruck> rlandy: yeah but that also queries dlrn right so same result
12:33 <marios|ruck> Hash under test: https://trunk.rdoproject.org/api-centos8-train/api/civotes_agg_detail.html?ref_hash=2a68d3b9e14d86b57db95487cadce0e2
12:34 <rlandy> we'd have to do a component by component check
12:34 <rlandy> only way
12:34 <rlandy> the date could rev as we create tripleo-ci-testing on every integration jobs run
12:34 <marios|ruck> rlandy: so the promoter might be telling the truth then? and the date is changed if one of the components promoted. but if it did, then it should be a new candidate hash in tripleo-ci-testing no?
12:35 <rlandy> marios|ruck: yep - but we'd need to confirm that
12:35 <marios|ruck> rlandy: ah so the directory is recreated you mean
12:35 <rlandy> akahat: ^^ I know I am late for our scenario010 meeting
12:35 <rlandy> will ping when done here
12:35 <akahat> marios|ruck, yes possibly
12:35 <akahat> rlandy, ah.. sorry.. joining
12:36 <rlandy> akahat: ^^ not yet
12:36 <rlandy> going to confirm train for marios|ruck first
12:36 <marios|ruck> rlandy: k go ahead i will look at components to check
12:36 <rlandy> will ping you
12:36 <akahat> rlandy, okay. np
12:36 <rlandy> marios|ruck: the ruck_rover tool should help
12:36 <rlandy> checking
12:37 <rlandy> ah - better way ...
12:37 <marios|ruck> rlandy: i don't see anything to signal the stale content though in the ruck_rover tool?
12:37 <marios|ruck> rlandy: or am i missing it
12:37 <marios|ruck> rlandy: i am trying --component all --release train now
12:37 <akahat> tripleo-ci-testing, current-tripleo and current-tripleo-rdo are pointing at the same hash.. :/
12:38 <marios|ruck> akahat: k so something went wrong with the last promotion, sounds like?
12:38 <marios|ruck> akahat: i noticed the same in the logs, see http://38.102.83.109/promoter_logs/centos8_train_2021-07-20T12:03.log
12:38 <marios|ruck> 2021-07-20 12:03:12,387 2230179 INFO     promoter Target label 'current-tripleo': current hash is aggregate: 2a68d3b9e14d86b57db95487cadce0e2, commit: dc72a3b84e498f5110bb739c30e6afecb93a1296, distro: 871628e5c31aafc96db1390050dc1a6d7c62f2a9, extended: None, component: validation, timestamp: 1626761721
12:38 <rlandy> http://dashboard-ci.tripleo.org/d/mOvYIiOMk/component-pipeline-train?orgId=1 says things changed
12:38 <marios|ruck> 2021-07-20 12:03:16,727 2230179 INFO     promoter Target label 'current-tripleo-rdo': current hash is aggregate: 2a68d3b9e14d86b57db95487cadce0e2, commit: 36705150b1082763a9597da4e89be03fbae788a2, distro: 6edf151d9389630feec1eef65f60e15b8a793947, extended: None, component: tripleo, timestamp: 1626765121
12:39 <marios|ruck> akahat: i.e. current-tripleo and current-tripleo-rdo are both aggregate: 2a68d3b9e14d86b57db95487cadce0e2,
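[Editor's note: the comparison marios does by eye on the two pasted promoter log lines can be mechanized with a small regex. This sketch is not from the log; the line format is taken verbatim from the paste above, and anything beyond that is an assumption.]

```python
import re

# Matches lines like:
#   ... Target label 'current-tripleo': current hash is aggregate: 2a68d3..., commit: ...
LINE_RE = re.compile(
    r"Target label '(?P<label>[^']+)': current hash is aggregate: (?P<aggregate>[0-9a-f]+)"
)

def label_aggregates(log_text):
    """Map each promotion target label found in promoter log text to the
    aggregate hash it currently points at."""
    return {m.group("label"): m.group("aggregate")
            for m in LINE_RE.finditer(log_text)}
```

Feeding it the two lines above yields the same aggregate for both labels, which is exactly the "no newer content" situation being discussed.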
12:39 <marios|ruck> no actually that might be ok akahat
12:39 <marios|ruck> i mean we promote current-tripleo to current-tripleo-rdo if the jobs pass so...
12:39 <rlandy> honestly the only way to be sure is to do this manually
12:40 <marios|ruck> maybe it just means no new content if current is also saying the same
12:40 <marios|ruck> rlandy: so did you find some way with the ruck_rover tool or should i go back to clicking ;)
12:40 <rlandy> marios|ruck: let's split the stack of components
12:40 <rlandy> I'll start from baremetal
12:40 <rlandy> do you want to start from validations
12:40 <rlandy> we will meet in the middle
12:40 <marios|ruck> rlandy: err hold on
12:40 <marios|ruck> rlandy: the list i am looking at has glance at the top and tripleo at the bottom
12:40 <akahat> yes.. that is also a possibility.. if nothing gets into the c8 train branch then the hash would be the same.
12:40 <marios|ruck> rlandy: http://dashboard-ci.tripleo.org/d/HkOLImOMk/upstream-and-rdo-promotions?orgId=1
12:41 <marios|ruck> rlandy: which list are you looking at ;)
12:41 <rlandy> https://trunk.rdoproject.org/centos8-train/component/
12:41 <marios|ruck> probably from the dlrn directory
12:41 <marios|ruck> rlandy: k
12:41 <rlandy> alphabetical
12:41 <marios|ruck> rlandy: ok looking at validation
12:42 <marios|ruck> rlandy: already found one i think... validations  https://trunk.rdoproject.org/centos8-train/component/validation/
12:42 <marios|ruck> rlandy: look at tripleo-ci-testing, still on the 13th but consistent is from the 22nd, that is wrong no?
12:42 <rlandy> baremetal matches
12:42 <rlandy> looking at validations
12:44 <rlandy> marios|ruck: https://paste.opendev.org/show/807678/
12:44 <marios|ruck> rlandy: ah so dates mean nothing :/
12:45 <rlandy> yep
12:45 <marios|ruck> rlandy: but that makes debugging much harder, wasn't always like that no?
12:45 <rlandy> need to check commit/distro hash
12:45 <rlandy> always
12:45 <marios|ruck> rlandy: are we doing that with the promoter maybe?
12:45 <rlandy> the ruck rover tool was meant to make it easier
12:45 <rlandy> BUT
12:45 <rlandy> when the chips are down, I go back to manual
12:45 <rlandy> so we are sure we have the right info
12:45 <marios|ruck> rlandy: k i guess we can only rely on the date for current-tripleo ... anyway k thanks, continuing then
12:45 <rlandy> no - the promoter is a commit/distro game
12:46 <zbr> frenzy_friday: apparently today we are doomed to face problems where local runs give different results than CI ones :(
12:46 <rlandy> yep - because that is a promotion-only hash
12:46 <rlandy> the rest rev per run
12:46 <frenzy_friday> zbr, true that
12:46 <rlandy> we always promote promoted-components to tripleo-ci-testing per run
12:46 * rlandy carries on
12:46 <marios|ruck> rlandy: thx
12:46 <rlandy> check the commit.yaml
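[Editor's note: the manual per-component check being split up here, comparing commit.yaml between the tripleo-ci-testing and current-tripleo labels for each component, can be sketched as a loop. This is not from the log; the component URL layout mirrors the trunk.rdoproject.org links above, while the component list and the injectable fetcher are assumptions so the logic is testable without hitting the network.]

```python
from urllib.request import urlopen

BASE = "https://trunk.rdoproject.org/centos8-train/component"

def changed_components(components, fetch=None):
    """Return the components whose tripleo-ci-testing commit.yaml differs
    from the one already promoted to current-tripleo."""
    fetch = fetch or (lambda url: urlopen(url).read().decode())
    changed = []
    for comp in components:
        testing = fetch(f"{BASE}/{comp}/tripleo-ci-testing/commit.yaml")
        promoted = fetch(f"{BASE}/{comp}/current-tripleo/commit.yaml")
        if testing != promoted:
            changed.append(comp)
    return changed
```

An empty result matches the conclusion reached a few lines below: nothing rev'ed, so there genuinely was no new content to promote.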
12:47 <marios|ruck> rlandy: yah
12:48 <rlandy> baremetal, cinder - good so far
12:52 <rlandy> marios|ruck: the cockpit is tracking the wrong component list
12:52 <rlandy> http://dashboard-ci.tripleo.org/d/mOvYIiOMk/component-pipeline-train?orgId=1 is tracking master components
12:52 <marios|ruck> rlandy: am at glance, should i stop?
12:52 <marios|ruck> rlandy: ack ok
12:52 <rlandy> I am on common
12:53 <marios|ruck> rlandy: k stop then
12:53 <marios|ruck> rlandy: am doing compute
12:53 <marios|ruck> rlandy: gimme a sec will post but don't see any diff on these so far
12:53 <rlandy> stopping
12:54 <akahat> https://paste.centos.org/view/4a78c68a <-- this tells something different.. the validation component which is under the same hash is promoted to tripleo-ci-testing but the commit hash is the same.
12:54 <rlandy> https://paste.opendev.org/show/807679/
12:54 <rlandy> ^^ mine - nothing is diff
12:54 <akahat> this is the result of some of the failed jobs.
12:55 <akahat> which got passed. and got promoted to tripleo-ci-testing
12:55 <akahat> hope so i got it right ^^
12:55 <marios|ruck> rlandy: https://paste.opendev.org/raw/807680/ same, can't see any diff
12:55 <marios|ruck> rlandy: akahat: so then there is no newer content, looks like
12:55 <rlandy> marios|ruck: akahat: yep - looks right
12:55 <marios|ruck> rlandy: akahat: sorry for the false alarm. but how is that possible, nothing into train in 5 days is weird no?
12:55 <akahat> yup.
12:56 <rlandy> marios|ruck: no worries, the cockpit is wrong
12:56 <rlandy> ^^ you are right there is a bug somewhere
12:56 * rlandy hates tools - goes back to first principles
12:56 <rlandy> electrical engineering training :)
12:57 <marios|ruck> bhagyashris: ^^^ fyi false alarm there is actually no new content soniya29|rover no need for a bug
12:57 <marios|ruck> let's all go to the pub, mojitos are on me
12:57 <soniya29|rover> marios|ruck, okay
12:57 <rlandy> akahat: give me five mins - will ping re: scenario010
12:57 <akahat> rlandy, ok
12:58 <bhagyashris> marios|ruck, yup, it seemed weird to me from the start
13:07 <rlandy> marios|ruck: soniya29|rover: I pinged the security group that multinode-ipa is voting again .. feel free to lean on their caesar if there are failures
13:08 <rlandy> or dwilde in ade's absence
13:08 <rlandy> akahat: k - ready - joining scenario010 meeting
13:09 *** amoralej is now known as amoralej|lunch
13:09 <soniya29|rover> rlandy, sure
13:09 <rlandy> https://ci.centos.org/view/rdo/view/promotion-pipeline/
13:09 <rlandy> no bad there - rerunning ussuri and victoria
13:09 <rlandy> not
13:09 <akahat> i'm in.
13:16 <marios|ruck> rlandy: ack
13:17 <rlandy> arxcruz: hey - are you core on https://opendev.org/osf/python-tempestconf?
13:17 <arxcruz> rlandy: yes
13:18 <rlandy> can you join us on https://meet.google.com/gdx-chgo-amf?authuser=0
13:18 <rlandy> https://opendev.org/osf/python-tempestconf/src/branch/master/.zuul.yaml#L45
13:22 <arxcruz> tosky: can we get rid of the scenario010 job on tempestconf ?
13:23 <arxcruz> tosky: we are deprecating scen010 on master, just on train
13:27 <tosky> arxcruz: well, if it's disappearing... what was it meant to test (without digging into the various tables)?
13:27 <tosky> arxcruz: which is the closest replacement?
13:27 <tosky> kopecmartin: ^^
13:29 <arxcruz> https://review.opendev.org/c/osf/python-tempestconf/+/802024
13:29 <arxcruz> tosky: kopecmartin ^
13:30 <arxcruz> tosky: i think we were testing some octavia stuff
13:32 <tosky> arxcruz: so is there another scenario with octavia?
13:32 <arxcruz> tosky: only 10 runs octavia
13:34 <tosky> arxcruz: so there is no testing for octavia at all?
13:34 <tosky> that seems strange
13:34 <arxcruz> tosky: not on the tripleo side
13:34 <arxcruz> rlandy: akahat ^ can you confirm it ?
13:34 <arxcruz> tosky: we can run it as stable/train though
13:36 <tosky> arxcruz: which doesn't mean it can't break
13:37 <arxcruz> tosky: https://meet.google.com/gdx-chgo-amf?authuser=0
13:37 <arxcruz> tosky: oh, nevermind
13:37 <rlandy> arxcruz: tosky: we can keep the scenario010 job - we will just need to add the train and ussuri versions
13:39 <tosky> rlandy: I get it, I still don't get why there is no octavia testing anymore
13:39 <tosky> is tripleo dropping octavia?
13:39 <eagles> 2c no
13:39 <marios|ruck> rlandy: (when you have time, no action needed fyi):  I said I would look into centos7 periodics today and just found some time to do that. Situation summarized there https://review.opendev.org/c/openstack/tripleo-ci/+/800646/2#message-6c147dba41d78a64ee7f4ba78faa7365d50af74d - we need that patch to unblock the C7 container push job
13:40 <eagles> (I hope not, it's hugely important to a lot of people)
13:40 <rlandy> tosky: nobody is dropping octavia
13:40 <eagles> :)
13:40 <rlandy> to the contrary, we are aiming to make it voting
13:41 <rlandy> in that process we are splitting out a train branch and a ussuri branch
13:41 <tosky> rlandy: so does it mean that scenario010 will be replaced by another job which also includes octavia?
13:41 <tosky> my point is: that job is used on python-tempestconf to test octavia; if that job is going to be removed, and octavia is still supported, I guess there will be another job which replaces the removed one
13:41 <rlandy> the reason we pinged you guys is that tempestconf https://opendev.org/osf/python-tempestconf/src/branch/master/.zuul.yaml#L45 will need to be edited to run the train and ussuri branched versions as well
13:42 <rlandy> tosky: no job is being removed
13:42 <rlandy> we will add to it
13:42 <tosky> rlandy: I'm still confused
13:42 <rlandy> I will put up a patch for your review
13:42 <tosky> rlandy: can you please clarify this discussion first, please? This discussion started as "we are deprecating scen010 on master, just on train"
13:42 <tosky> that's the first step before removal
13:43 <rlandy> eagles: https://opendev.org/openstack/puppet-octavia/src/branch/master/.zuul.yaml#L12 is in the same boat ... once https://review.opendev.org/c/openstack/tripleo-ci/+/800540 merges
13:43 <tosky> I'd like to have a clear explanation of what's going on before deciding
13:43 <rlandy> we will need to add the train and ussuri tests
13:44 <rlandy> tosky: to be clear, nothing is being removed ... scenario010 standalone was non-voting
13:44 <rlandy> it was failing on train and ussuri
13:44 <rlandy> in the process of making the job voting
13:44 <rlandy> we are creating new definitions for train and ussuri
13:44 <tosky> that's a completely different scenario than was discussed at the beginning
13:44 <rlandy> so those branches can be made voting when ready
13:45 <rlandy> beginning?
13:45 <rlandy> <arxcruz> tosky: we are deprecating scen010 on master, just on train ... that was the confusing line
13:45 <tosky> rlandy: https://meetings.opendev.org/irclogs/%23oooq/%23oooq.2021-07-23.log.html#t2021-07-23T13:22:43
13:46 <rlandy> ^^ I think we confused arxcruz
13:46 <rlandy> it is staying on master
13:46 <arxcruz> rlandy: tosky sorry i expressed myself wrong :)
13:47 <tosky> ok, so you would like to add more jobs for older branches
13:47 <rlandy> tosky: ^^ clear now? if not we can meet and I'll explain more
13:47 <tosky> please note we haven't done that so far for any tripleo jobs
13:48 <tosky> as the gates for tempestconf already run a good amount of jobs, I wonder if we are not going to block the queues
13:48 <rlandy> tosky: "done so" ^^ what does that refer to?
13:48 <rlandy> it won't run extra jobs
13:48 <rlandy> it will just run jobs by a slightly diff name on train and ussuri
13:48 <tosky> that's running extra jobs
13:48 <tosky> tempestconf is not branched
13:49 <tosky> right now we have python-tempestconf-tempest-devstack-admin, python-tempestconf-tempest-devstack-admin-{wallaby,victoria,ussuri,train}
13:49 <tosky> are you suggesting to have the same for tripleo-ci-centos-8-scenario002-standalone?
13:50 <rlandy> no
13:50 <rlandy> that one is voting everywhere
13:50 <rlandy> you're fine then
13:51 <tosky> happy to know I'm fine, I'm still not sure what's just happened
13:51 <tosky> back to internal stuff
13:52 <arxcruz> rlandy: tosky i already abandoned the patch, sorry, i misunderstood
13:54 <rlandy> arxcruz: np
14:01 *** amoralej|lunch is now known as amoralej
14:16 *** pojadhav- is now known as pojadhav
14:56 *** ykarel is now known as ykarel|away
14:56 <rlandy> marios|ruck: looking at https://review.rdoproject.org/r/c/testproject/+/34654
14:56 <rlandy> do we need to rerun 20 and 39 to get master to promote?
14:56 <marios|ruck> rlandy: yeah i already tried a recheck
14:56 <marios|ruck> rlandy: that hash is from yesterday but good enough i think
14:57 <rlandy> k - see that comment
14:57 <marios|ruck> rlandy: which comment
14:57 <rlandy> recheck
14:57 <rlandy> they failed again
14:57 <rlandy> : Get image expected checksum
14:57 <marios|ruck> rlandy: https://review.rdoproject.org/zuul/status#34654
14:58 <marios|ruck> rlandy: they didn't report yet
14:58 <marios|ruck> rlandy: but yeah failure
14:59 <rlandy> install erros
14:59 <rlandy> errors
14:59 <rlandy> marios|ruck: the ipa job is going to report now
15:00 <rlandy> maybe we try one more recheck?
15:00 <marios|ruck> rlandy: sure go ahead am looking at other stuff right now (doing some ptl things that need addressing)
15:00 <marios|ruck> rlandy: i think i'm done looking at jobs for today :) my brain hurts and it's only day #2 :/
15:00 <rlandy> k
15:01 <rlandy> marios|ruck: yeah - I get that
15:01 * rlandy nicks
15:01 *** rlandy is now known as rlandy|ruck
15:01 <rlandy|ruck> marios|ruck: other than master promotion, anything else?
15:02 <rlandy|ruck> https://zuul.openstack.org/status#801933
15:02 <marios|ruck> rlandy|ruck: yeah that one is a gate blocker so def need that
15:02 <rlandy|ruck> watching
15:02 <marios|ruck> rlandy|ruck: gate and promotion blocker in fact
15:02 <rlandy|ruck> amazing
15:03 <marios|ruck> rlandy|ruck: is why i have it on depends-on of that master testproject
15:03 <rlandy|ruck> yep - I see that
15:03 <rlandy|ruck> may be causing the build failure
15:03 <rlandy|ruck> I'll recheck when it merges if it fails again
15:04 <marios|ruck> rlandy|ruck: thanks
15:16 <rlandy|ruck> rekicked
15:22 <marios|ruck> rlandy|ruck: did you see that one btw https://review.opendev.org/c/openstack/tripleo-ci/+/800646/2#message-6c147dba41d78a64ee7f4ba78faa7365d50af74d
15:24 *** holser is now known as holser_
15:26 <rlandy|ruck> lucky we checked :)
15:26 <rlandy|ruck> we = you
15:28 <marios|ruck> rlandy|ruck: well no guarantees we'll get a promotion (user experience may vary, images for illustrative purposes only) but it's the first step, will revisit next week
15:34 <marios|ruck> ah crap totally forgot about the happy friday call :/
15:34 <marios|ruck> no happy friday for me
15:34 <marios|ruck> :(
15:35 <rlandy|ruck> we missed you
15:35 <marios|ruck> i'm sure arxcruz missed me most
15:36 <rlandy|ruck> sigh ... unrequited love
15:37 <marios|ruck> :D
15:51 *** holser_ is now known as holser
15:56 <marios|ruck> rlandy|ruck: starting shutdown sequence
15:57 <rlandy|ruck> marios|ruck: k - have a good weekend
15:57 <rlandy|ruck> https://review.rdoproject.org/r/c/testproject/+/34654 failed again
15:57 * rlandy|ruck waits on merge
15:59 <marios|ruck> come on scen2 ... waiting https://zuul.openstack.org/status#801933 so i can change the topic in tripleo and reply to http://lists.openstack.org/pipermail/openstack-discuss/2021-July/023805.html
16:00 <marios|ruck> thanks rlandy|ruck re 34654 would be good to get master
16:03 <marios|ruck> sweet merged
16:03 *** marios|ruck is now known as marios
16:06 <rlandy|ruck> nice
16:08 *** marios is now known as marios|out
16:13 <soniya29|rover> marios|out, rlandy|ruck, leaving for the day
16:29 *** amoralej is now known as amoralej|off
17:15 <rlandy|ruck> hmmm ... we have an issue with docker push
17:20 <rlandy|ruck> akahat: I think we have a problem with master promotions
17:21 <rlandy|ruck> no containers in docker for current-tripleo
17:21 <rlandy|ruck> https://hub.docker.com/r/tripleomaster/openstack-keystone/tags?page=1&ordering=last_updated
17:23 <rlandy|ruck> https://images.rdoproject.org/centos8/master/rdo_trunk/07715e073191d94abc2f35e6b90563f9/
17:23 <rlandy|ruck> we have images but no containers
17:34 <rlandy|ruck> akahat: chandankumar: you guys around?
17:50 <rlandy|ruck> arxcruz: ^^ around?
17:50 <rlandy|ruck> anyone?
17:50 <rlandy|ruck> need someone to check the container repush
18:04 <arxcruz> rlandy|ruck: something wrong ?
18:04 <rlandy|ruck> arxcruz: yep ...
18:04 <rlandy|ruck> https://hub.docker.com/r/tripleomaster/openstack-base/tags?page=1&ordering=last_updated
18:04 <rlandy|ruck> docker and quay missing containers for current-tripleo
18:04 <rlandy|ruck> https://trunk.rdoproject.org/centos8-master/current-tripleo/delorean.repo.md5
18:04 <rlandy|ruck> ^^ no containers with that tag
18:05 <rlandy|ruck> killing validation libs
18:05 <rlandy|ruck> trying to check the rdo registry to see if the containers are there
18:05 <rlandy|ruck> https://trunk.registry.rdoproject.org:8443/oapi/v1/namespaces/tripleomaster/imagestreamtags/ failing for me
18:06 <arxcruz> rlandy|ruck: i'm seeing 07715e073191d94abc2f35e6b90563f9 for tripleo-base on quay
18:06 <rlandy|ruck> the keystone one was missing from docker
18:06 <rlandy|ruck> do you see that one?
18:06 <arxcruz> let me check keystone
18:07 <arxcruz> rlandy|ruck: yup
18:08 <arxcruz> keystone is there
18:08 <rlandy|ruck> on docker?
18:08 <arxcruz> https://quay.io/repository/tripleomaster/openstack-keystone?tab=tags
18:08 <rlandy|ruck> or just on quay
18:08 <arxcruz> on quay
18:08 <arxcruz> rlandy|ruck: http://38.102.83.131/quay/tag/master-report.html
18:08 <arxcruz> all containers were tagged successfully
18:08 <rlandy|ruck> I don't see it on docker
18:08 <rlandy|ruck> https://hub.docker.com/r/tripleomaster/openstack-keystone/tags?page=1&ordering=last_updated
18:09 <rlandy|ruck> arxcruz: do you see that?
18:09 <rlandy|ruck> https://zuul.opendev.org/t/openstack/builds?job_name=tripleo-ci-centos-8-standalone-validation-libs
18:10 <rlandy|ruck> maybe just a hitch
18:10 <arxcruz> rlandy|ruck: i'm not seeing it, but we can copy this particular tag to docker no ?
18:10 <arxcruz> from quay to docker ?
18:10 <rlandy|ruck> arxcruz: yeah - want to do that from the promoter?
18:10 <rlandy|ruck> or from the toolbox?
18:10 <arxcruz> never did it
18:10 <arxcruz> but we can try
18:11 <rlandy|ruck> k - should we tmux?
18:14 <rlandy|ruck> arxcruz: the alternative is to wait and promote the next hash
18:15 <rlandy|ruck> arxcruz: we should copy and run the promoter again
18:16 <arxcruz> rlandy|ruck: just pushed the hash
18:16 <arxcruz> https://hub.docker.com/layers/tripleomaster/openstack-keystone/07715e073191d94abc2f35e6b90563f9/images/sha256-747c0417c34606cd8bb7bf0bf3121089df4acfce58a745a7395c68365f4a224c?context=explore
18:16 <arxcruz> rlandy|ruck: is there some other container missing
18:16 <rlandy|ruck> arxcruz: thank you!!!
18:16 <arxcruz> rlandy|ruck: it is not tagged as current-tripleo though
18:16 <arxcruz> should i do that on docker?
18:17 <arxcruz> or just need to have the container there ?
18:18 <rlandy|ruck> I don't think it matters but they should match
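[Editor's note: one common way to mirror a single image tag between registries, as arxcruz ends up doing by hand above, is `skopeo copy`. The log does not say which tool was actually used, so this is only a sketch; the repository and tag come from the links in the conversation, and running the real command needs skopeo installed plus push credentials for docker hub.]

```python
def skopeo_copy_cmd(image, tag):
    """Build the `skopeo copy` command line that would copy one tag
    from quay.io to docker.io for the same repository name."""
    return [
        "skopeo", "copy",
        f"docker://quay.io/{image}:{tag}",
        f"docker://docker.io/{image}:{tag}",
    ]

# The command could then be run with subprocess.run(skopeo_copy_cmd(...)).
```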
21:07 *** holser is now known as holser_
21:54 *** holser_ is now known as holser

Generated by irclog2html.py 2.17.2 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!