Friday, 2022-07-08

*** dviroel|out is now known as dviroel00:20
dviroelrcastillo: master promoted00:21
dviroelmerging revert00:22
dviroelcurrent-tripleo/2022-07-07 23:3800:22
dviroelrevert merged00:57
dviroelo/00:57
*** dviroel is now known as dviroel|out00:57
*** rlandy|bbl is now known as rlandy01:01
*** rlandy is now known as rlandy|out01:01
*** ysandeep|out is now known as ysandeep01:45
*** ysandeep is now known as ysandeep|afk04:23
*** ysandeep|afk is now known as ysandeep06:02
*** ysandeep is now known as ysandeep|afk06:44
*** amoralej|off is now known as amoralej06:46
marioso/ happy friday 06:49
*** ysandeep|afk is now known as ysandeep07:11
ysandeephappy friday o/ marios and team 07:12
ysandeepbhagyashris, pojadhav do you know if soniya is supposed to be back today?07:12
bhagyashrisysandeep, not sure but she said she will be out this week07:15
ysandeepokay, I don't see her PTO on the PTO calendar so I thought she would be back today07:16
bhagyashrisI see the rhos calendar and it shows her PTO planned till yesterday07:16
bhagyashrisyeah07:16
ysandeepI am not sure if she is aware of her RR shift, I don't remember if she asked anyone to cover for her on Friday.07:21
*** chandankumar is now known as chkumar|rover07:25
chkumar|roverysandeep: she asked me to cover for today07:25
ysandeepah great, thanks chkumar|rover 07:26
pojadhavysandeep, she will be back today but may be lil late07:44
ysandeeppojadhav, ack thanks.. chkumar|rover is already covering her.07:45
pojadhavysandeep, ack07:47
jm1happy friday :)08:43
*** ysandeep is now known as ysandeep|lunch09:09
* jm1 reconnecting09:28
*** rlandy__ is now known as rlandy10:33
rlandychkumar|rover: thanks for covering for soniya10:35
rlandychkumar|rover: is there  new rr hackmd?10:35
rlandychkumar|rover: let's sync when dviroel_ is in10:36
rlandyysandeep|lunch: any word on dlrn for rhel-8?10:36
ysandeep|lunchrlandy: no 10:38
*** ysandeep|lunch is now known as ysandeep10:38
rlandyysandeep|lunch: ok - thanks10:38
* ysandeep why are you still at lunch, that was 2 hours ago.. ah, I forgot to update my irc nick. :)10:39
rlandyysandeep: at least we promoted  rhos-17 on rhel-9 last night 10:39
ysandeepI saw that this morning, that's good news :D10:42
ysandeepthanks for tracking that10:42
chkumar|roverrlandy: https://hackmd.io/hD7Q6AsZQQyki_v4BoMfxg10:43
rlandychkumar|rover: thanks - will sync when dviroel_ is here10:43
rlandychkumar|rover: just trying to get c9 nodes back in downstream10:44
chkumar|roverrlandy: yes just saw that, thanks!10:44
chkumar|roverysandeep: have you seen this issue: 'tripleo_cephadm_alertmanager_container_image' is undefined10:50
chkumar|roverhttps://sf.hosted.upshift.rdu2.redhat.com/logs/openstack-periodic-integration-rhos-17/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-rhel-8-scenario001-standalone-rhos-17/5d7beb3/logs/undercloud/home/zuul/tripleo-deploy/standalone-ansible-3rt_k_ja/cephadm/cephadm_command.log ?10:50
ysandeepchkumar|rover, that's known.. needs a tripleo component promotion once dlrn is re-enabled for rhel810:51
ysandeepchkumar|rover, fix is already in, just that dlrn is stopped atm10:51
chkumar|roverysandeep: ++ thanks!10:51
ysandeepchkumar|rover, fyi.. https://bugzilla.redhat.com/show_bug.cgi?id=210437210:52
rlandychkumar|rover: when jon comes online we can check with him when dlrn will be restarted there10:54
chkumar|roverrlandy: ok thanks!10:55
rlandychkumar|rover: also, I think we can rekick 16.2 ...10:56
chkumar|roverrlandy: yes, full-tempest-api job was failing at tempest10:56
rlandychecking with ysandeep if we have the fix in place, 10:56
rlandysee #rhos-dev10:56
rlandythe patch is pushed, I think it is in trunk deps10:56
rlandyso we need to restart 16.2 line10:57
rlandyI'll rekick10:57
chkumar|roverok10:58
rlandy16.2 restarted10:59
* ysandeep reading back11:00
ysandeepyes11:02
ysandeephttps://osp-trunk.hosted.upshift.rdu2.redhat.com/rhel8-osp16-2/report.html11:02
ysandeepovsdbapp rpm is available11:02
ysandeepwe can rekick network component11:02
chkumar|roveryes correct, first we need to promote network for rhos-16.2 then integration line11:03
chkumar|roverrlandy: ^^11:03
*** ysandeep is now known as ysandeep|afk11:05
rlandychkumar|rover: hmmm ... I see that rpm in http://download.eng.bos.redhat.com/brewroot/repos/rhos-16.2-rhel-8-trunk-build/latest/x86_64/pkglist11:06
rlandywhich should be picked up by integration line11:06
rlandywe can testproject 16.2 network11:07
chkumar|roverrlandy: we need this one https://osp-trunk.hosted.upshift.rdu2.redhat.com/rhel8-osp16-2/component/network/promoted-components/python3-ovsdbapp-0.17.6-2.20220514145356.4d9ea84.el8osttrunk.noarch.rpm11:07
rlandychkumar|rover: ok11:08
chkumar|roverrlandy: sorry this one https://osp-trunk.hosted.upshift.rdu2.redhat.com/rhel8-osp16-2/component/network/consistent/python3-ovsdbapp-0.17.6-2.20220707155303.4d9ea84.el8osttrunk.noarch.rpm11:08
rlandylet me dequeue11:08
*** tosky_ is now known as tosky11:11
rlandychkumar|rover: network 16.2 in testproject11:11
chkumar|roverlet me send that11:12
chkumar|roverrlandy: thanks, you sent it already11:13
chkumar|roverhttps://code.engineering.redhat.com/gerrit/c/testproject/+/39875811:13
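(Context for the testproject rerun above: re-kicking a periodic job is usually done by pushing a throwaway change to the testproject repo whose .zuul.yaml lists the wanted jobs in the check pipeline. A minimal sketch of that pattern follows — the job name is taken from this log, and force_periodic: true is the commonly used variable, but the exact vars a given job expects may differ.)

```yaml
# Sketch only, not the actual change 398758: a testproject .zuul.yaml that
# re-runs one periodic job in check, forcing it to behave like a periodic run.
- project:
    check:
      jobs:
        - periodic-tripleo-ci-rhel-8-scenario010-standalone-network-rhos-16.2:
            vars:
              force_periodic: true
```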
rlandymarios: chkumar|rover: pojadhav: frenzy_friday: ysandeep|afk: have a conflict - will miss review time11:14
rlandypls carry on w/o me11:14
chkumar|roverok11:14
mariosack rlandy 11:16
pojadhavrlandy, ack11:16
*** dviroel_ is now known as dviroel|ruck11:18
* dviroel|ruck on duty11:19
pojadhavmarios, https://review.rdoproject.org/r/c/config/+/4358511:19
dviroel|ruckhappy friday o/11:19
*** marios is now known as marios|call11:22
pojadhavdviroel|ruck, marios|call ysandeep|afk : https://review.opendev.org/c/openstack/tripleo-quickstart/+/84403611:26
*** ysandeep|afk is now known as ysandeep11:28
dviroel|ruckchkumar|rover: i was about to ask you to stay, for rr sync11:29
chkumar|roverdviroel|ruck: let me join back11:29
*** marios|call is now known as marios11:29
chkumar|roverdviroel|ruck: https://meet.google.com/vzm-nrah-qqf?authuser=011:30
chkumar|roverdviroel|ruck: https://hackmd.io/hD7Q6AsZQQyki_v4BoMfxg11:31
* pojadhav stepping out for an hour...11:33
chkumar|roverrlandy: I synced with dviroel|ruck on rr11:34
rlandychkumar|rover: dviroel|ruck: ok - great11:36
ykarelchkumar|rover, dviroel|ruck is the issue with the multinode ipa c9 master job known?12:05
ykarelseen in one of my patches https://ef1bd6032c64c1864435-d6fd3b06d50c034d0364bbf684ea1b1c.ssl.cf5.rackcdn.com/849077/1/check/tripleo-ci-centos-9-standalone-on-multinode-ipa/813d771/logs/undercloud/var/log/tempest/stestr_results.html12:06
ykarelsimilar in other patch https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_ac5/848558/2/check/tripleo-ci-centos-9-standalone-on-multinode-ipa/ac5cf96/logs/undercloud/var/log/tempest/stestr_results.html12:06
ykarelseeing memcache related errors in keystone12:06
chkumar|roverykarel: nope12:07
dviroel|ruckykarel: no, there was an inconsistent failure in the keystone deployment, but this tempest one is new12:09
ykareltempest failures seem to be caused by https://zuul.opendev.org/t/openstack/build/813d771af5f04ad3b8baeae565a653c9/log/logs/undercloud/var/log/containers/keystone/keystone.log#1377312:09
chkumar|roverhttps://zuul.opendev.org/t/openstack/builds?job_name=tripleo-ci-centos-9-standalone-on-multinode-ipa&skip=0 seems to be good, with a few fails12:09
ykarelack, please check, it might impact the gate12:10
ykarelas the failures are unrelated to the patches12:10
chkumar|roverbrb12:12
rlandydviroel|ruck: chkumar|rover: c9 jobs of internal have started working12:36
dviroel|ruckrlandy: great, thanks12:42
rlandyugh network component on 16.2 is not doing well12:47
rlandychkumar|rover: can you investigate the network 16.2 component failures - need to meet with frenzy_friday12:55
rlandyalso wallaby c9 is failing all over12:55
*** ysandeep is now known as ysandeep|afk12:57
dviroel|ruckrlandy: chkumar|rover: wallaby c9 failures are related to mirror issues13:02
dviroel|ruckonly cs9 fs020 failed on tempest, but with connection errors, which may be related to infra connectivity issues too13:03
rlandyfrenzy_friday: https://meet.google.com/dha-wgza-afm?pli=1&authuser=013:04
rlandyfrenzy_friday: https://github.com/openstack/tripleo-ci/blob/master/toci-quickstart/config/testenv/ovb-vexxhost.yml13:07
rlandyfrenzy_friday: cloudenv: "vexxhost"13:12
chkumar|roverrlandy:  on it13:20
*** amoralej is now known as amoralej|lunch13:27
chkumar|roverdviroel|ruck: rlandy https://sf.hosted.upshift.rdu2.redhat.com/logs/58/398758/59/check/periodic-tripleo-ci-rhel-8-scenario010-standalone-network-rhos-16.2/69c302e/logs/undercloud/var/log/tempest/stestr_results.html13:28
chkumar|roverare the sc10 octavia failures known?13:28
rlandychkumar|rover: from the history of the job?13:29
chkumar|roverhttps://sf.hosted.upshift.rdu2.redhat.com/zuul/t/tripleo-ci-internal/builds?job_name=periodic-tripleo-ci-rhel-8-scenario010-standalone-network-rhos-16.213:30
chkumar|roverin component run it passed but failed on testproject13:30
rlandychkumar|rover: running now - can you check that13:31
chkumar|roverrlandy: I am checking the logs from testproject13:31
*** ysandeep|afk is now known as ysandeep13:31
chkumar|roverhttps://sf.hosted.upshift.rdu2.redhat.com/logs/58/398758/59/check/periodic-tripleo-ci-rhel-8-containers-multinode-network-rhos-16.2/23cd168/logs/undercloud/home/zuul/overcloud_deploy.log13:33
chkumar|rovererror={"msg": "The conditional check '(tripleo_bootstrap_packages_bootstrap_result.rc | int) == 1' failed. The error was: error while evaluating conditional ((tripleo_bootstrap_packages_bootstrap_result.rc | int) == 1): 'dict object' has no attribute 'rc'"}13:33
chkumar|roverysandeep: ^^ is this one known?13:33
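(Side note on the error quoted above: in Ansible, "'dict object' has no attribute 'rc'" on a registered result usually means the task that registers it was skipped or never ran its command, so the result dict has no rc key; guarding the conditional with "rc is defined" avoids the crash. The snippet below is a minimal sketch of that pattern, not the actual tripleo-ansible role — the task body and package variable are assumed for illustration.)

```yaml
# Hypothetical reconstruction of the failing pattern, with a guard added.
- name: Install bootstrap packages
  ansible.builtin.command: "dnf -y install {{ tripleo_bootstrap_packages_bootstrap | join(' ') }}"
  register: tripleo_bootstrap_packages_bootstrap_result
  ignore_errors: true

# Checking 'rc is defined' first keeps the conditional from blowing up when the
# install task was skipped and the registered result has no 'rc' at all.
- name: Handle a failed package install
  ansible.builtin.debug:
    msg: "bootstrap package install failed"
  when:
    - tripleo_bootstrap_packages_bootstrap_result.rc is defined
    - (tripleo_bootstrap_packages_bootstrap_result.rc | int) == 1
```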
chkumar|roverhttps://sf.hosted.upshift.rdu2.redhat.com/zuul/t/tripleo-ci-internal/status#openstack-component-network has also started13:54
chkumar|roverwill wait for their result also13:54
*** dasm|off is now known as dasm13:56
dasmo/13:56
ysandeephappy hours o/ if anyone wants to join14:04
rlandychkumar|rover: dviroel|ruck: rhos-17 on rhel-8 rev'ed tripleo component consistent14:05
rlandygoing to rerun that14:05
rlandywith testproject14:06
dviroel|ruckrlandy: ack, there are no pending commits in dlrn anymore14:07
chkumar|roverdviroel|ruck: https://sf.hosted.upshift.rdu2.redhat.com/logs/openstack-component-network/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-rhel-8-scenario010-standalone-network-rhos-17/0985742/job-output.txt14:08
chkumar|roverdviroel|ruck:  tripleo.operator.tripleo_ceph_spec : Generate Ceph Spec is broken now14:08
chkumar|roverit might have happened due to the python3-pyyaml update14:09
chkumar|roverworking job: python3-pyyaml-3.12-12.el8.x86_6414:09
chkumar|roverfailed one: python3-pyyaml-5.4.1-2.el8ost.x86_6414:09
dviroel|ruckchkumar|rover: ""stdout": "openstack: 'overcloud ceph spec --standalone --mon-ip 192.168.42.1 --yes --output /home/zuul/ceph_spec.yaml --stack standalone --osd-spec /home/zuul/osd_spec.yaml' is not an openstack command. See 'openstack --help"14:10
*** amoralej|lunch is now known as amoralej14:11
chkumar|roveroh i missed that error14:11
chkumar|roverthen something else is causing it14:12
dviroel|ruckmight be missing bits14:12
dviroel|ruckrlandy: chkumar|rover: pls review/merge this https://review.opendev.org/c/openstack/openstack-tempest-skiplist/+/84911314:12
dviroel|ruck^ affects ovb-fs020 too14:12
chkumar|roverdone14:13
rlandydviroel|ruck: same as the ipa bug we logged?14:14
chkumar|roveripa bug?14:14
dviroel|ruckrlandy: yeah, "Failed to find floating IP" error, same test14:14
dviroel|ruckrlandy: added logs in comments14:15
ykareldviroel|ruck, chkumar|rover is it something known that periodic-tripleo-ci-centos-9-standalone-full-tempest-api-master and periodic-tripleo-ci-centos-9-standalone-full-tempest-scenario-master are running the same set of tests?14:15
ykarellooked suspicious, just noticed while looking at something else, haven't checked other releases14:15
ykarelso please check if it's not already a known thing14:15
ysandeepmarios, chkumar|rover rlandy dviroel|ruck podified mtg time14:16
dviroel|ruckykarel: thanks, I will check, we have been refactoring tempest allowed/skipped lists14:16
chkumar|roverhttps://opendev.org/openstack/tripleo-ci/commit/25a426031774d2093f487c3d33da80625c2b9f7114:16
dviroel|ruckmight be related14:16
chkumar|rovermight have caused it14:16
dviroel|ruckyeah14:17
dviroel|ruckbhagyashris: can you take a look at this please? ^ 14:18
* jm1 out for today, have a nice weekend :)15:10
dasmjm1[m]: o/15:10
* pojadhav out.. Have a great weekend to all !15:11
dasmpojadhav: o/15:12
chkumar|roversee ya people!15:22
dviroel|rucko/15:23
dviroel|ruckcontent-provider-wallaby on gate failure: Error: Failed to download metadata for repo 'centos9-rabbitmq': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried15:23
* dviroel|ruck going to lunch now15:23
*** dviroel|ruck is now known as dviroel|ruck|lunch15:23
ysandeepchkumar|rover, looks like you had similar thoughts to mine about the workflow, with some changes; I will reach out to you on Monday to better understand your workflow.15:25
*** marios is now known as marios|out15:42
*** ysandeep is now known as ysandeep|out15:43
rlandychkumar|rover: hmm is periodic-tripleo-ci-rhel-8-scenario010-standalone-network-rhos-16.2 a legit fail?15:44
*** amoralej is now known as amoralej|off16:14
dasmdviroel|ruck|lunch: rlandy can any of you give +W? https://review.rdoproject.org/r/c/rdo-jobs/+/4377816:30
*** dviroel|ruck|lunch is now known as dviroel|ruck16:36
* dviroel|ruck back16:37
dviroel|ruckdasm: yes, and we can see results over the weekend16:39
dasmthat would be preferable16:39
dasm\( ゚ヮ゚)/16:40
rlandylunch - brb16:41
rlandydviroel|ruck: so network  tempest 16.2 has a legit failure on scenario010 17:14
rlandyrunning multinode ipa again for 1717:14
dviroel|ruckoh, let me see17:18
dviroel|rucki was looking at rhel-8-standalone-on-multinode-ipa-tripleo-rhos-17 - timeout downloading file17:19
rlandyyep - rekicked that one17:21
rlandydviroel|ruck: but ipa in check also not looking great17:21
rlandyhttps://zuul.openstack.org/builds?job_name=tripleo-ci-centos-9-standalone-on-multinode-ipa&skip=017:22
dviroel|ruckyeah, ykarel mentioned earlier17:24
dviroel|ruckseems to be a real issue now17:24
dviroel|ruckrlandy: ok, let me create tempest 16.2 bz first17:25
dviroel|ruckdo we already have one?17:25
rlandydviroel|ruck: no17:47
rlandywe have one for the original error:17:48
rlandyhttps://trello.com/c/t32EwcFL/2614-cixbz2104931osp162rhel8neutron-qos-related-tempest-test-failure-and-traceback-observed-in-neutron-server-log-error-neutronplugin17:48
dviroel|ruckack - finishing BZ, was looking octavia logs17:56
* dviroel|ruck biab18:26
* dviroel|ruck back18:44
dviroel|rucktripleo-ci-centos-9-standalone-on-multinode-ipa seems inconsistent18:45
dviroel|ruckonly periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset035-train missing for train promotion, as usual18:56
dviroel|ruckrlandy: great, tripleo component can now promote19:12
rlandydviroel|ruck: awesome19:39
rlandydviroel|ruck: thanks for running periodic-tripleo-rhel-8-rhos-17-component-tripleo-promote-to-promoted-components19:39
rlandydviroel|ruck: did you re-enqueue 17 on 8 after tripleo promo?19:40
dviroel|ruckrlandy: np, should we dequeue the current running line?19:40
rlandylol19:40
rlandyI guess you didn't19:40
dviroel|ruckrlandy: no19:40
rlandyif not, yeah - I am going to requeue it19:40
rlandyok - dviroel|ruck: restarted19:41
dviroel|rucknice19:42
rlandydviroel|ruck: need help with anything else?19:42
dviroel|rucki don't think so19:44
dviroel|rucktrying to find patterns in the failing jobs19:44
rlandythanks for looking into that19:44
dviroel|ruckrlandy: something is not right, all standalones are testing the same set of tempest tests20:10
dviroel|ruckhttps://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_055/849048/2/gate/tripleo-ci-centos-9-scenario004-standalone/0554957/logs/undercloud/var/log/tempest/stestr_results.html20:10
dviroel|ruck^ should be running manila tests20:10
rlandydviroel|ruck: ugh - probably the skiplist/allow list20:24
rlandythey are all using fs05220:25
rlandydviroel|ruck: can you tell when that started?20:25
rlandyI would guess when we merged the fs052 change on tq20:26
dviroel|ruckrlandy: https://review.opendev.org/c/openstack/tripleo-ci/+/84809720:26
dviroel|ruckmaybe 20:26
rlandydviroel|ruck: revert?20:26
rlandyor leave until monday and discuss with arx?20:26
dviroel|ruckrlandy: i would prefer the revert, jobs are merging without the proper testing20:27
dviroel|ruckright?20:27
rlandydviroel|ruck: go4it20:27
rlandyleave arx, pojadhav and soniya an email20:28
rlandywe will be able to see on the revert if the testing is back20:28
* dviroel|ruck checking if it is the right patch to revert20:29
rlandydviroel|ruck: there are two20:30
rlandyone that adds the definition in the skiplist 20:30
rlandyand one that removes the tests from the fs20:30
rlandyor job definition20:30
dviroel|ruckThu Jul 7 08:05:02 2022 - last one on scn004 that ran manila tests20:31
dviroel|ruckyeah, matches https://review.opendev.org/c/openstack/tripleo-ci/+/84809720:31
rlandydviroel|ruck: go ahead and revert20:33
rlandylet's see 20:33
dviroel|ruckrevert created - merge conflict :(20:33
rlandyyou'll be able to tell from the patch run or a depends-on whether it fixes the problem20:33
dviroel|ruckneed to fix merge conflict first20:33
rlandyk20:34
dviroel|ruckreverted the wrong one, one sec20:35
dviroel|ruckhttps://review.opendev.org/c/openstack/tripleo-ci/+/849006 - this one doesn't have merge conflicts20:37
dviroel|ruckthe change itself should test it, since it is in tripleo-ci20:39
rlandyok - let me know20:41
rlandythanks for catching that20:41
rlandydviroel|ruck: is it just check or also periodic?20:42
rlandyrunning the wrong tests20:42
dviroel|ruckhum, let me check20:43
dviroel|ruckhum, weird20:44
dviroel|rucklol20:44
dviroel|ruckhttps://logserver.rdoproject.org/openstack-periodic-integration-main/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-9-scenario004-standalone-master/64c84c4/logs/undercloud/var/log/tempest/stestr_results.html.gz20:45
dviroel|ruckrunning everything20:45
dviroel|ruck"everything"20:45
dviroel|ruckthe outputs are different, but it is not correct20:47
dviroel|ruckrlandy: the revert will fix periodics too20:47
rlandydviroel|ruck: ok - good20:48
rlandydviroel|ruck: let's go with it and contact arx, pojadhav and soniya on monday to correct20:49
dviroel|ruckack20:49
rlandydviroel|ruck: which one ended up being the final revert?21:00
rlandygot a few emails with patches21:00
rlandyugh - 5 pm always gives us node failures on psi21:04
dviroel|ruckit is already weekend for PSI21:05
dviroel|ruckrlandy: https://review.opendev.org/c/openstack/tripleo-ci/+/849006 - tripleo-ci one21:06
dviroel|ruckperiodic-scenario004 is running 2 manila tests, but not the expected one :P21:09
rlandyugh21:28
rlandydviroel|ruck: work it through with arx on monday21:28
rlandyleave him an email 21:28
rlandyhe can get started before you come online21:28
rlandysoniya too21:28
rlandyshe is ruck rovering21:28
rlandytell them to start fixing this21:28
dviroel|ruckrlandy: email is ready, will wait for ci results21:28
dviroel|ruckrlandy: I will go afk, and be back to check job results21:30
rlandydviroel|ruck: yeah I am going to rerun failed rhel-8 jobs21:30
rlandyand log off21:30
rlandybeen a loooooong week 21:30
dviroel|ruckrlandy: ack - i can merge the revert if we get good results21:30
rlandydviroel|ruck: if it's the right tests running - sure, go ahead21:31
dviroel|ruckrlandy: have a great weekend o/21:32
rlandydviroel|ruck: thanks - you too - see you monday21:33
dviroel|rucktks o/21:33
*** dviroel|ruck is now known as dviroel|ruck|afk21:33
* dasm disappears too21:36
dasmo/21:36
*** dasm is now known as dasm|off21:36
*** dviroel|ruck|afk is now known as dviroel|ruck23:16
*** dviroel|ruck is now known as dviroel|out23:26
