Wednesday, 2024-03-27

00:58 <tkajinam> hberaud, yes but you said 2024.2 but 2023.1
00:58 <tkajinam> hberaud, 2024.2, not 2023.2, I mean
03:34 <opendevreview> OpenStack Proposal Bot proposed openstack/oslo.log master: Imported Translations from Zanata  https://review.opendev.org/c/openstack/oslo.log/+/912705
06:58 <hberaud> sean-k-mooney: concerning the multiple readers support, we are aware of this https://github.com/eventlet/eventlet/issues/874
09:05 <hberaud> sean-k-mooney: IMO it should be possible to adapt swift to not rely on this unsupported feature https://opendev.org/openstack/swift/src/branch/master/swift/common/utils/__init__.py#L6102-L6116
09:08 <hberaud> Swift relies heavily on eventlet, so I think Swift will be transitioned last; that leaves time to figure out how to adapt Swift to not rely on multiple readers.
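[Editor's note] For context on why the asyncio hub cannot support multiple readers: an asyncio event loop keeps at most one reader callback per file descriptor, and registering a second one silently replaces the first. A stdlib-only sketch (no eventlet required) demonstrating that behaviour:

```python
import asyncio
import socket

async def main():
    loop = asyncio.get_running_loop()
    rsock, wsock = socket.socketpair()
    rsock.setblocking(False)
    fired = []

    def first_reader():
        fired.append("first")

    def second_reader():
        rsock.recv(1)                       # drain the byte
        loop.remove_reader(rsock.fileno())  # stop watching the fd
        fired.append("second")

    # asyncio allows a single reader per fd: the second add_reader()
    # call replaces the first callback instead of adding another one.
    loop.add_reader(rsock.fileno(), first_reader)
    loop.add_reader(rsock.fileno(), second_reader)

    wsock.send(b"x")
    await asyncio.sleep(0.05)

    wsock.close()
    rsock.close()
    return fired

print(asyncio.run(main()))  # only the replacement callback fires
```

Two greenthreads blocking on reads of the same socket (the pattern Swift uses) therefore cannot be mapped directly onto asyncio's reader registration.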
09:09 <opendevreview> Merged openstack/oslo.log stable/2024.1: Fix eventlet detection  https://review.opendev.org/c/openstack/oslo.log/+/914262
10:02 <opendevreview> Daniel Bengtsson proposed openstack/oslo.messaging stable/zed: Fix typo in quorum-related variables for RabbitMQ  https://review.opendev.org/c/openstack/oslo.messaging/+/914435
10:49 <opendevreview> Merged openstack/oslo.log stable/2023.2: Fix eventlet detection  https://review.opendev.org/c/openstack/oslo.log/+/914266
10:52 <hberaud> tkajinam++ damani++ thanks for reacting so quickly on the previous eventlet topic!
11:32 <damani> hberaud, thanks a lot for the code review and the merge :)
11:32 <damani> about it
11:43 <sean-k-mooney> hberaud: yeah, so I could retrigger the job and just turn off swift and see how far it gets
11:43 <sean-k-mooney> but I was expecting swift to be problematic
12:02 <opendevreview> Merged openstack/oslo.messaging master: kafka: Fix invalid hostaddr format for IPv6 address  https://review.opendev.org/c/openstack/oslo.messaging/+/909517
12:18 <opendevreview> Merged openstack/oslo.log stable/2023.1: Fix eventlet detection  https://review.opendev.org/c/openstack/oslo.log/+/914267
14:09 <hberaud> damani: FYI https://bugs.launchpad.net/octavia/+bug/2039346/
14:10 <hberaud> (related to the previous backports)
14:18 <hberaud> See the discussion here: https://github.com/eventlet/eventlet/issues/432 ; IMO octavia is suffering from several incomplete backports on the oslo side.
15:08 <opendevreview> Takashi Kajinami proposed openstack/oslo.messaging stable/2023.2: kafka: Fix invalid hostaddr format for IPv6 address  https://review.opendev.org/c/openstack/oslo.messaging/+/914448
15:08 <opendevreview> Takashi Kajinami proposed openstack/oslo.messaging stable/2024.1: kafka: Fix invalid hostaddr format for IPv6 address  https://review.opendev.org/c/openstack/oslo.messaging/+/914449
15:09 <opendevreview> Takashi Kajinami proposed openstack/oslo.messaging stable/2023.2: kafka: Fix invalid hostaddr format for IPv6 address  https://review.opendev.org/c/openstack/oslo.messaging/+/914448
15:10 <opendevreview> Takashi Kajinami proposed openstack/oslo.messaging stable/2023.1: kafka: Fix invalid hostaddr format for IPv6 address  https://review.opendev.org/c/openstack/oslo.messaging/+/914450
15:56 <crohmann> damani: tkajinam: I'd like to discuss "the" greenthreads issue with eventlet - https://bugs.launchpad.net/octavia/+bug/2039346.
15:57 <crohmann> hberaud and I discussed this in https://github.com/eventlet/eventlet/issues/432, and he suggested there might be sensible bugfix backports required for Zed and Yoga.
15:58 <hberaud> crohmann: just replied to your latest comment https://github.com/eventlet/eventlet/issues/432#issuecomment-2023130882
15:59 <crohmann> Ah, thanks for your time and patience figuring this out and naming the right steps required.
16:00 <hberaud> np, you're welcome
16:02 <crohmann> I might need a moment to understand this fully. So you are saying https://opendev.org/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17 should go into Zed and Yoga, but that it then also needs the more recent bugfix on top?
16:03 <hberaud> crohmann: see my advice here => https://github.com/eventlet/eventlet/issues/432#issuecomment-2023145393
16:04 <hberaud> the short answer to your question is: yes!
16:05 <hberaud> The 3 patches I referred to in my last comment are all mandatory
16:05 <hberaud> They are the different pieces of the puzzle
16:06 <crohmann> Alright. Does it make sense for me to push backport changes then? Or would you push them? Maybe also clustered via topic or Change-Id to make clear they belong together?
16:08 <crohmann> I am more than willing to do the work, but don't want to create more chaos and confusion by pushing backports.
16:09 <hberaud> I won't push these backports; I don't have enough bandwidth to manage them. So, if you want, feel free to cherry-pick them, or else discuss with damani whether he can handle this story.
16:11 <hberaud> I already pinged damani earlier today about this story, but for now I've had no update from him, so I think you are safe proposing them.
16:11 <johnsom> Do we have a similar problem in oslo.messaging?
16:11 <johnsom> https://opendev.org/openstack/oslo.messaging/src/branch/master/oslo_messaging/_utils.py#L25
16:12 <johnsom> I see that global "try_import" in two places in oslo.messaging
16:12 <hberaud> I don't think so, because there we have this condition: `eventletutils.is_monkey_patched("thread")`
16:13 <hberaud> the problem in oslo.log was that the condition relied only on the import of eventlet
16:13 <johnsom> Ok, cool. I wasn't sure if the simple fact of importing it led to issues as well or not.
16:14 <hberaud> so if eventlet was imported elsewhere, it would have been present in sys.modules, and so it would have been considered "enabled" by that poorly designed condition (in oslo.log)
16:14 <crohmann> hberaud: thanks again for your time. If I may ask, what's the best approach to bundle the three backports into one? Use the same Change-Id?
16:15 <hberaud> the condition was equivalent to "if eventlet is in sys.modules, then we can consider that the env is monkey patched"
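[Editor's note] A minimal sketch of the flawed check described above (the function name is illustrative, not the actual oslo.log code):

```python
import sys

def flawed_is_eventlet_enabled():
    # Old oslo.log logic, paraphrased: treat the mere presence of
    # eventlet in sys.modules as proof that the env is monkey patched.
    return "eventlet" in sys.modules

# False in a clean process (assuming nothing has imported eventlet yet).
print(flawed_is_eventlet_enabled())

# Simulate an unrelated library importing eventlet without ever
# calling eventlet.monkey_patch():
sys.modules["eventlet"] = object()  # stand-in for an incidental import
print(flawed_is_eventlet_enabled())  # True: a false positive
```

The fix is to ask eventlet itself via `eventletutils.is_monkey_patched("thread")`, which only reports patching that actually happened.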
16:16 <hberaud> simply cherry-picking each patch onto the same branch and then submitting your review using git review would do the job
16:17 <hberaud> 3 cherry-picks on a local development branch, followed by a git review
16:17 <hberaud> git review will manage all the Change-Id stuff for you
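[Editor's note] That workflow, sketched as shell commands (the branch name and commit hashes below are placeholders, not values from this discussion):

```shell
# Start a local branch from the target stable branch.
git fetch origin stable/2023.1
git checkout -b eventlet-detection-backports origin/stable/2023.1

# Cherry-pick the three fixes in order; -x appends a
# "(cherry picked from commit ...)" line to each commit message.
git cherry-pick -x <commit-1>
git cherry-pick -x <commit-2>
git cherry-pick -x <commit-3>

# git review pushes all three changes to Gerrit at once; each commit
# keeps its original Change-Id footer, so the backports stay linked
# to the master changes automatically.
git review stable/2023.1
```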
16:19 <crohmann> hberaud: johnsom: We've seen issues with RabbitMQ timeouts / connections lost (oslo.messaging), but I suppose the new pthread feature (https://review.opendev.org/q/topic:%22disable-green-threads%22) helps with that.
16:20 <crohmann> hberaud: I know how git review and cherry-picking work, but was just wondering if there was some special way to bundle multiple commits properly :-)
16:20 <hberaud> no, we don't have a specific process
16:22 <crohmann> Am I not seeing clearly, or is there no stable/zed branch for oslo.log anymore?
16:23 <hberaud> ah sorry... I missed that point
16:23 <crohmann> https://opendev.org/openstack/oslo.log/branches
16:24 <hberaud> indeed, years ago we transitioned oslo.log to an independent deliverable, and then after 2 series we moved oslo.log (and some other oslo deliverables) back to the coordinated series
16:25 <hberaud> So we have no option for these stable branches
16:25 <crohmann> So nowhere to apply the patches to, you mean?
16:25 <hberaud> only downstream packages can be patched
16:25 <hberaud> AFAIK yes
16:25 <JayF> hberaud: I know in Ironic in the past, we've suggested that deployers install a library outside of constraints to resolve some bugs when we've been in that situation
16:25 <crohmann> As in Ubuntu Cloud Archive, you mean?
16:26 <JayF> hberaud: e.g. "If you install the newer version of sushy, it should resolve your bug", even if we can't test it or package it to install that way for other reasons
16:26 <JayF> I know that's not an ideal fix, but at least giving a workaround is better than no option whatsoever
16:26 <hberaud> JayF: ack, thanks for the examples
16:27 <hberaud> crohmann: then maybe you could follow JayF's suggestion
16:28 <crohmann> For our own sake (OpenStack Cloud) we are also happy with all of these issues fixed for 2023.1 (Antelope) onwards. We will iterate through multiple updates, since Zed will approach the unmaintained state soon anyway.
16:28 <hberaud> That's why I moved the oslo libs back into the coordinated series: it was more annoying than anything else to have them as independent deliverables
16:32 <crohmann> After this discussion I now have a much better understanding of the issues. Well, we will then skip quickly past Zed. I don't know if you still want to at least somehow get the word out to deployers about how to fix this on Zed?
16:33 <hberaud> it would be awesome to socialize this point with operators/deployers
16:34 <hberaud> though I've no idea how to spread the word in an official manner
16:35 <hberaud> using the ML won't really help, and the message would be lost in a couple of weeks
16:36 <hberaud> release notes are strongly linked to a specific version/series
16:36 <JayF> hberaud: in the past, we've been able to ask operator SIGs to promote things on socials, but we usually use that for meetups and such
16:37 <JayF> hberaud: perhaps if you were able to write up something about it on a wiki page for oslo or in the docs somewhere, you could have it made into something for socials ... maybe with a spin on it being "another good reason to upgrade"?
16:38 <hberaud> yeah, AFAICS that would be the more reasonable approach
16:40 <tkajinam> hberaud, I'm wondering if we really need https://opendev.org/openstack/oslo.log/commit/de615d9370681a2834cebe88acfa81b919da340c , since I suspect https://review.opendev.org/c/openstack/oslo.log/+/914190 would fix the original issue
16:41 <tkajinam> we can attempt to backport it, but I'm hesitant to rush it at this point while we haven't yet confirmed that the solution has no side effects... Zed is moving to unmaintained soon (after the 2024.1 GA), so if we break it now it is not likely to be fixed.
16:41 <tkajinam> unless someone steps up to maintain oslo's unmaintained branches
16:44 <tkajinam> ( and I've now caught up with the discussion...
16:47 <hberaud> tkajinam: you are far more active than me on oslo, so if you think we should hold this topic for a bit, I trust you 100%
16:48 <hberaud> I think this is a wise decision
16:49 <tkajinam> :-)
16:49 <hberaud> however, that won't prevent us from updating the doc as Jay suggested
16:49 <hberaud> (IMO)
16:49 <tkajinam> I wonder how we can update the doc without a stable/zed branch
16:49 <tkajinam> probably the first step is to record these discussions in the bug as a placeholder?
16:50 <hberaud> we could just put a comment on the latest version
16:50 <hberaud> tkajinam: indeed, that's a good starting point
16:50 <hberaud> even the best starting point
16:51 <tkajinam> probably, but I find it a bit strange to document a problem of an old version in the latest version.
16:51 <tkajinam> I'll give it another thought during my day
16:51 <hberaud> sure, np
16:51 <hberaud> no rush
17:06 <hberaud> I tried to summarize the situation in the bug tracker https://bugs.launchpad.net/octavia/+bug/2039346/comments/15
18:06 <JayF> hberaud: part of me wonders if an upgrade of the oslo.log package (and only that package) would fix the issue for them too, but that is risky to suggest without testing :D
18:06 <hberaud> yep
18:23 <opendevreview> Merged openstack/oslo.messaging stable/2024.1: kafka: Fix invalid hostaddr format for IPv6 address  https://review.opendev.org/c/openstack/oslo.messaging/+/914449
18:48 <crohmann> sorry hberaud, I had to run earlier. I did read through the rest of the conversation; thank you for the summary in the ticket.
18:51 <sean-k-mooney> hberaud: just an FYI, nova is also hitting `raise RuntimeError("Multiple readers are not yet supported by asyncio hub")`
18:52 <sean-k-mooney> for the pipe_mutex stuff in oslo_log
18:52 <sean-k-mooney> https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_130/914108/5/check/tempest-full-py3/130576a/controller/logs/screen-n-api.txt
18:53 <sean-k-mooney> we can possibly work around that by setting heartbeat_in_pthread to false in the api
18:54 <sean-k-mooney> that is the only native thread we have in the wsgi applications
18:54 <sean-k-mooney> but we have others that we can't remove in other processes
18:55 <sean-k-mooney> ideally we should remove oslo_messaging_rabbit.heartbeat_in_pthread
18:56 <sean-k-mooney> it's caused a bunch of issues when it's set to true, so I would personally never recommend doing that
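[Editor's note] The workaround sean-k-mooney mentions would look roughly like this in the service's configuration file (a sketch of the named option only):

```ini
[oslo_messaging_rabbit]
# Run the RabbitMQ heartbeat in a greenthread instead of a native
# pthread; the native-thread variant has caused issues under eventlet.
heartbeat_in_pthread = false
```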
18:56 <sean-k-mooney> that wasn't why the pipe mutex stuff was added to oslo.log, however
18:58 <sean-k-mooney> hberaud: we will need to modify https://review.opendev.org/c/openstack/oslo.log/+/852443 or add support for multiple readers to proceed with the asyncio hub work
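[Editor's note] For readers who haven't seen it, the pipe_mutex technique replaces a native lock with blocking reads on an os.pipe(), so waiting cooperates with eventlet's hub as ordinary I/O. A minimal stdlib-only sketch (illustrative, not the oslo.log/Swift implementation); note that several waiters blocked in acquire() are all readers of the same fd, which is exactly the pattern the asyncio hub rejects:

```python
import os

class PipeMutex:
    """Pipe-backed mutex sketch: one token byte circulates in a pipe.
    acquire() blocks reading the token; release() writes it back."""

    def __init__(self):
        self._rfd, self._wfd = os.pipe()
        os.write(self._wfd, b"-")   # one token == initially unlocked

    def acquire(self):
        # Blocks until the token byte is available; under eventlet this
        # is hub-friendly I/O rather than a native lock operation.
        return os.read(self._rfd, 1)

    def release(self):
        os.write(self._wfd, b"-")   # put the token back

    def close(self):
        os.close(self._rfd)
        os.close(self._wfd)
```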
20:08 <opendevreview> Merged openstack/oslo.messaging master: Fix incorrect desc of rabbit_stream_fanout option  https://review.opendev.org/c/openstack/oslo.messaging/+/913870

Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!