Wednesday, 2020-12-02

00:07 *** sboyron has quit IRC
01:37 *** rcernin has quit IRC
01:39 *** rcernin has joined #openstack-oslo
02:36 *** rcernin has quit IRC
03:01 *** iurygregory|pto has quit IRC
03:06 *** rcernin has joined #openstack-oslo
05:19 <openstackgerrit> likui proposed openstack/oslo.limit master: add py38 matedata  https://review.opendev.org/c/openstack/oslo.limit/+/765044
05:43 *** rcernin has quit IRC
05:55 *** rcernin has joined #openstack-oslo
06:17 *** hamalq has quit IRC
06:18 *** hamalq has joined #openstack-oslo
06:35 <openstackgerrit> likui proposed openstack/oslo.context master: Replace deprecated UPPER_CONSTRAINTS_FILE variable  https://review.opendev.org/c/openstack/oslo.context/+/765062
06:42 *** sboyron has joined #openstack-oslo
06:46 *** brinzhang0 has joined #openstack-oslo
06:49 *** brinzhang_ has quit IRC
06:55 *** sboyron has quit IRC
07:13 *** zzzeek has quit IRC
07:14 *** zzzeek has joined #openstack-oslo
07:32 *** brinzhang_ has joined #openstack-oslo
07:35 *** brinzhang0 has quit IRC
07:38 *** rcernin has quit IRC
07:44 *** rcernin has joined #openstack-oslo
07:48 *** rcernin has quit IRC
08:01 *** rcernin has joined #openstack-oslo
08:05 *** sboyron has joined #openstack-oslo
08:16 *** rcernin has quit IRC
08:28 *** rpittau|afk is now known as rpittau
08:32 *** rcernin has joined #openstack-oslo
08:46 *** rcernin has quit IRC
08:49 *** tkajinam has quit IRC
09:05 *** tosky has joined #openstack-oslo
09:42 *** rcernin has joined #openstack-oslo
09:42 *** rcernin has quit IRC
11:19 *** dtantsur|afk is now known as dtantsur
11:50 *** raildo has joined #openstack-oslo
11:52 *** brinzhang_ has quit IRC
11:52 *** brinzhang_ has joined #openstack-oslo
12:03 *** iurygregory has joined #openstack-oslo
12:11 *** geguileo has joined #openstack-oslo
12:31 *** dtantsur is now known as dtantsur|brb
13:01 *** hamalq has quit IRC
13:03 *** Luzi has joined #openstack-oslo
13:05 *** hamalq has joined #openstack-oslo
13:34 <hberaud> damani: FYI https://review.opendev.org/c/openstack/releases/+/765133
13:34 <damani> hberaud, I will check
13:34 <hberaud> thanks
13:42 <hberaud> bnemec, moguimar, stephenfin: Please can you take a look => https://review.opendev.org/c/openstack/oslo.messaging/+/764776
13:46 *** dtantsur|brb is now known as dtantsur
13:48 <stephenfin> sure
13:48 <hberaud> thx
13:50 *** lbragstad has quit IRC
13:51 *** Luzi has quit IRC
13:55 *** lbragstad has joined #openstack-oslo
14:19 *** raildo has quit IRC
14:27 *** raildo has joined #openstack-oslo
16:11 *** redrobot has quit IRC
16:14 *** brinzhang0 has joined #openstack-oslo
16:16 *** brinzhang_ has quit IRC
16:58 *** Guest18921 has joined #openstack-oslo
17:01 *** Guest18921 is now known as redrobot
17:04 <mnaser> hi oslo team
17:04 *** rpittau is now known as rpittau|afk
17:05 <mnaser> has anyone ever run into an issue with oslo.messaging having _a lot_ of stale threads?
17:05 <mnaser> running magnum-api as 'magnum-api', where it forks into one process per request, seems to have no issues
17:05 <mnaser> but running it inside uwsgi leads to a lot of threads slowly leaking
17:05 <mnaser> and eventually hitting an error where you can't start any more threads
17:06 <mnaser> what sucks is I can't troubleshoot this with GMR
17:06 <mnaser> because that can't be run under uwsgi
17:06 <mnaser> and the issue doesn't exist inside magnum-api directly (though I do see some zombie processes at times)
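
"GMR" above is the Guru Meditation Report from oslo.reports, normally dumped when the service process receives SIGUSR2; under uwsgi the master process usually owns signal handling, which is why it is not usable in this setup. A minimal sketch of the usual wiring, assuming a plain Python entry point (the '1.0.0' version string is a placeholder; real services normally pass their project's version module or pbr VersionInfo):

    # Sketch only: enable Guru Meditation Reports via oslo.reports.
    from oslo_config import cfg
    from oslo_reports import guru_meditation_report as gmr
    from oslo_reports import opts as gmr_opts

    CONF = cfg.CONF

    def setup_gmr():
        # Register the [oslo_reports] options (report directory, file trigger, ...).
        gmr_opts.set_defaults(CONF)
        # '1.0.0' is a placeholder version; projects usually pass their version object.
        gmr.TextGuruMeditation.setup_autorun('1.0.0', conf=CONF)

For environments where signals are not practical (such as uwsgi), oslo.reports also offers a file-based trigger ([oslo_reports]/file_event_handler); check the installed release's documentation for the exact option names.
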
17:16 <hberaud> mnaser: o/ it looks like something we have already experienced downstream
17:16 <hberaud> lemme grab some links
17:22 <hberaud> mnaser: I think this one is more or less relevant for your needs => https://bugzilla.redhat.com/show_bug.cgi?id=1711794
17:22 <openstack> bugzilla.redhat.com bug 1711794 in openstack-tripleo-heat-templates "[OSP15][deployment] AMQP heartbeat thread missing heartbeats when running under nova_api" [High,Closed: notabug] - Assigned to mschuppe
17:26 <hberaud> also note that this combo can introduce issues in a monkey-patched environment => httpd + the MPM prefork engine + uwsgi
17:26 <hberaud> #eventlet
17:28 <hberaud> mnaser: and for another more or less related example you can also take a look at => https://github.com/openstack/oslo.messaging/commit/22f240b82fffbd62be8568a7d0d3369134596ace
17:28 <hberaud> hopefully that will help you
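
The combination hberaud points at matters because eventlet monkey patching changes what "thread" means for the AMQP heartbeat: with the thread module patched, the heartbeat runs as a greenthread that a busy or prefork-managed web server can starve. A small check for that situation, assuming eventlet is installed (the heartbeat_in_pthread option mentioned in the comment exists in recent oslo.messaging releases; verify it against the deployed version):

    # Sketch only: check whether eventlet has monkey-patched threading.
    import eventlet.patcher

    if eventlet.patcher.is_monkey_patched('thread'):
        # In this state the rabbit driver's heartbeat runs as a greenthread
        # unless [oslo_messaging_rabbit]/heartbeat_in_pthread is enabled.
        print("threading is monkey patched; heartbeat runs as a greenthread")
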
17:32 <mnaser> hberaud: in our case magnum is actually pretty busy, it never goes 'idle' :(
17:32 <mnaser> because HTTP health checks are constantly pinging it
17:33 <mnaser> 3 haproxies that check it non-stop, so there are almost always requests coming in
17:33 <mnaser> I'm now wondering if this is somewhat related -- https://github.com/openstack/magnum/blob/master/magnum/common/context.py#L130-L150
17:34 <mnaser> I see this https://github.com/openstack/magnum/blob/f0dec728e78bcb3851b1a484b73bfe567b3c1fc9/magnum/common/rpc_service.py#L54-L57 so I also wonder if that is related
17:35 <hberaud> mnaser: also note that we currently have an issue with our RPC server replies => https://review.opendev.org/c/openstack/oslo.messaging/+/764776
17:39 <mnaser> https://github.com/openstack/magnum/blob/master/magnum/service/periodic.py#L205-L210 maybe it's unrelated to oslo.messaging and it's just magnum doing stuff
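
One pattern that commonly explains thread growth under long-lived uwsgi workers is building a new oslo.messaging transport (and with it connection and heartbeat threads) per request or per callback instead of reusing a single one for the process. A minimal sketch of the reuse pattern; the topic and method names are placeholders, not Magnum's actual ones:

    # Sketch only: share one transport/RPC client per process.
    from oslo_config import cfg
    import oslo_messaging

    CONF = cfg.CONF
    _TRANSPORT = None
    _CLIENT = None

    def get_rpc_client():
        """Lazily create, then reuse, a single RPCClient."""
        global _TRANSPORT, _CLIENT
        if _CLIENT is None:
            # Each transport owns its own connections and heartbeat thread,
            # released only by transport.cleanup(); creating one per request
            # leaks threads in long-lived WSGI workers.
            _TRANSPORT = oslo_messaging.get_rpc_transport(CONF)
            target = oslo_messaging.Target(topic='example-topic')  # placeholder topic
            _CLIENT = oslo_messaging.RPCClient(_TRANSPORT, target)
        return _CLIENT

    def ping(ctxt):
        # 'example_ping' is a placeholder RPC method name.
        return get_rpc_client().call(ctxt, 'example_ping')
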
17:48 *** camelCaser has joined #openstack-oslo
18:47 *** dtantsur is now known as dtantsur|afk
19:29 *** iurygregory has quit IRC
20:58 *** brinzhang_ has joined #openstack-oslo
21:01 *** brinzhang0 has quit IRC
21:28 *** iurygregory has joined #openstack-oslo
21:40 *** raildo has quit IRC
21:43 *** sboyron has quit IRC
22:34 *** rcernin has joined #openstack-oslo
23:00 *** tkajinam has joined #openstack-oslo

Generated by irclog2html.py 2.17.2 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!