Monday, 2023-09-11

opendevreviewGhanshyam proposed openstack/nova master: Remove the Hyper-V driver  https://review.opendev.org/c/openstack/nova/+/89446602:53
fricklerthe new oslo.log release seems to need changes in nova unit tests, please check https://review.opendev.org/c/openstack/requirements/+/89292807:00
bauzasfrickler: looking08:22
sean-k-mooneyfrickler: we are past the non-client lib freeze09:12
sean-k-mooneywhy is there a 5.3.0 happening now?09:12
bauzasmy bad, I thought it was related to https://bugs.launchpad.net/ironic/+bug/2030976 but actually no, so indeed this isn't urgent, right?09:24
bauzasanyway, this is u-c, not a nova change09:24
sean-k-mooneyit would break master when it becomes caracal09:24
sean-k-mooneythe reason i'm asking is: do they intend 5.3.0 to be part of bobcat or not09:24
sean-k-mooneywe have not cut RC1 yet09:25
sean-k-mooneyso we should not really be merging any changes for this until after that is done09:26
bauzashttps://docs.openstack.org/releasenotes/oslo.log/unreleased.html09:26
bauzasif this is just for deprecating os-win, meh09:26
sean-k-mooneydeprecations need to go out in a slurp anyway so i think this can wait09:27
bauzassean-k-mooney: but maybe frickler wants to merge this u-c change after RC1 ?09:27
sean-k-mooneyit must contain something else too 09:27
sean-k-mooneyya perhaps and that is fine09:27
sean-k-mooneyi would just prefer to hold this until RC1 is cut but once it is we can correct whatever the unit test fallout is09:28
sean-k-mooneyhttps://github.com/openstack/oslo.log/compare/5.2.0...5.3.009:29
bauzashttps://github.com/openstack/oslo.log/compare/5.3.0...5.2.009:29
bauzasheh09:29
sean-k-mooneyhttps://github.com/openstack/oslo.log/commit/6abf69e194c9dac13d26bca3e7ac1f710f9e26a009:29
sean-k-mooneyso they are dropping 3.8 support09:29
sean-k-mooneythat is why it's breaking things09:30
sean-k-mooneyso this can't merge in bobcat09:30
bauzasright09:30
bauzasand the relnotes are actually silent, which is sad09:31
sean-k-mooneyi -1'd the u-c patch09:32
sean-k-mooneyand linked to the relevant commits and upstream runtimes09:32
sean-k-mooneyfunnily enough this is failing in the py310 job09:34
sean-k-mooneyhttps://6bd5f66fe830a1e3af93-5011cbf696878119f18f8b3f0098ae5d.ssl.cf1.rackcdn.com/892928/1/check/cross-nova-py310/9f15891/testr_results.html09:34
bauzassorry for lagging but this morning, I'm switching from 4G to fiber (which is back, woohooo) and testing a few things 09:36
bauzasI have a UniFi USG as my home gateway and I'm tempted to stick with a passive HA mode with 4G :)09:36
elodillesbauzas sean-k-mooney : the release is part of bobcat and it was released in time. when the release is out, a requirements patch (upper-constraint bump) is generated, which tests compatibility. this seems to be failing now09:37
bauzaselodilles: I don't disagree09:37
sean-k-mooneyelodilles: well as i noted that oslo tag increases the min python to 3.909:37
bauzaselodilles: but we're just saying that we can't import it in our bobcat release09:37
sean-k-mooneyelodilles: so it cannot be included in bobcat without reverting that09:37
bauzasanyway, kids taxi for a couple of mins09:37
sean-k-mooneyoh wait...09:38
sean-k-mooney2023.2 is bobcat09:39
sean-k-mooneyok actually it can be included but i would still consider this late09:39
elodillessean-k-mooney: it didn't drop py38 support. py38 is still the min python09:39
sean-k-mooneyno it did https://github.com/openstack/oslo.log/commit/6abf69e194c9dac13d26bca3e7ac1f710f9e26a009:40
sean-k-mooneypython_requires = >=3.909:40
sean-k-mooneybut 2023.2 is bobcat and bobcat technically did drop 3.8 support09:40
sean-k-mooneyhowever we didn't put in a min python_requires in nova09:40
sean-k-mooneywe did attempt to but there were objections to that as it would break other projects that did not have that change09:41
elodillessean-k-mooney: all the py38 patches were reverted. here i see: https://opendev.org/openstack/oslo.log/src/commit/b5b8c30b0d925aa3d31b58932c94586631827b62/setup.cfg#L909:41
elodilles* py38 dropping patches09:42
sean-k-mooneyoh just reread the runtimes patch too: "All Python-based projects must additionally target and test against the following Python versions:09:42
sean-k-mooney    Python 3.8 (available as default in Ubuntu 20.04)"09:42
sean-k-mooneyok so they reverted that09:42
elodillesyepp09:42
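The whole exchange above comes down to one marker in oslo.log's setup.cfg: the minimum-Python declaration. A minimal sketch of that stanza, assuming the post-revert state on the linked commit (the surrounding section layout is illustrative; only the key/value comes from the discussion):

    [metadata]
    name = oslo.log
    # The abandoned bump would have made this ">=3.9"; after the reverts it is
    # back to ">=3.8", which is why the release can still ship in Bobcat.
    python_requires = >=3.8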
sean-k-mooneyso i'm still trying to figure out why it's failing as none of the failing test cases test/use oslo.log functionality09:43
sean-k-mooneyi think this is actually breaking due to privsep somehow09:43
elodillesanyway, this needs to be sorted out together with oslo team, as Bobcat final release is in 3 weeks09:44
elodillesI'd say if there is an easy fix in nova, then we should add that fix in nova ASAP, because it's easier than invalidating a release09:44
elodillessean-k-mooney: thanks for looking into the issue09:45
sean-k-mooneyoh https://github.com/openstack/oslo.log/compare/5.2.0...5.3.0 is not in reverse chronological order09:46
sean-k-mooneyso ya i see the revert, i was confused by that09:46
elodilles++09:47
sean-k-mooneyi'm not seeing any change there that could cause this so i think it must be from a change in privsep09:47
sean-k-mooneywell09:47
sean-k-mooneywe do actually log here09:47
sean-k-mooneyhttps://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L43-L9009:47
sean-k-mooneyin the failing test we are making this raise an error https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L5709:48
sean-k-mooneyspecifically ValueError09:49
sean-k-mooneyand we are expecting to take this path https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L76-L7809:49
sean-k-mooneywe mock os.write and assert it's not called here https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L6109:50
sean-k-mooneysorry off by one09:50
sean-k-mooneyhttps://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L6209:50
sean-k-mooneyi'm wondering if the LOG.error in the exception block is somehow counting as a call to os.write09:51
sean-k-mooneyin the unit test we should be using a StringIO buffer for logging in the test fixture09:51
sean-k-mooneyso there should not be any write calls to stdout or a file09:52
sean-k-mooneyi'll see if i can reproduce it locally but it does not really make sense why this would pass on 5.2.0 and not 5.3.009:52
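For readers following along, this is the idea just described: the unit test base class routes log output into an in-memory buffer, so LOG calls should never become writes to stdout or a real file descriptor. A rough sketch of such a fixture using the fixtures library (the class name is illustrative, this is not nova's actual StandardLogging fixture):

    import io
    import logging

    import fixtures


    class CapturedLogging(fixtures.Fixture):
        """Route all log records into a StringIO buffer for the test's lifetime."""

        def _setUp(self):
            self.buffer = io.StringIO()
            handler = logging.StreamHandler(self.buffer)
            root = logging.getLogger()
            root.addHandler(handler)
            self.addCleanup(root.removeHandler, handler)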
fricklersorry, was away for a bit. in general, new openstack library releases are not affected by requirements freeze I think. in particular if merging them is delayed by other projects09:53
fricklerso oslo.log 5.3.0 is part of the bobcat release and thus should make it into u-c as well if possible09:54
fricklerah, elodilles said that in between already09:55
sean-k-mooneyfrickler: the final release is meant to happen before FF10:04
sean-k-mooneythat's why we have the non-client freeze before it10:04
sean-k-mooneyso we do not expect to have any new non-client release in u-c between FF and RC-1 unless it's an RC-1 fix10:04
sean-k-mooneyit looks like the release was actually done 13 days ago10:05
sean-k-mooneyso that would have been before FF10:05
frickleryes, it is just the u-c bump that is lagging10:05
sean-k-mooneyright which is not good because that kind of defeats the reason for the earlier release10:06
sean-k-mooneybut i guess in this case it just would have been better to highlight this earlier10:06
sean-k-mooneyi.e. that it was not passing10:07
sean-k-mooneyi'm running the tests with the older release now and then i'll manually unpin and see if this reproduces for me locally10:08
frickleryes, handling by the requirements team could be improved if they had more participants I guess. it also isn't helped by the CI being unstable in general. I'm just trying to fill in some of the gaps where I can10:08
sean-k-mooneyso locally it passes with 5.2.0 and fails with 5.3.010:10
sean-k-mooneyso thats a good start10:10
sean-k-mooneylooks like the same error too10:11
sean-k-mooneyalthough for some reason my tox execution seems to be hanging...10:11
sean-k-mooneyoh right the other test is a timeout exception10:16
sean-k-mooneyso it was just waiting on that10:16
opendevreviewMerged openstack/python-novaclient stable/2023.2: Update .gitreview for stable/2023.2  https://review.opendev.org/c/openstack/python-novaclient/+/89407410:45
sean-k-mooneythis is kind of insane11:07
sean-k-mooneyapparently this LOG.error is causing the test to fail11:07
sean-k-mooneyhttps://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L77C13-L78C6211:07
sean-k-mooneythis passes https://paste.opendev.org/show/bUyh4M9k01rrYaP58223/11:08
sean-k-mooneybut https://paste.opendev.org/show/b4wfwcpSB6XdjPZzC5UN/ fails11:08
sean-k-mooneybut the error is 11:09
sean-k-mooney    AssertionError: Expected 'write' to not have been called. Called 1 times.11:09
sean-k-mooneyCalls: [call(7, b'X')].11:09
sean-k-mooneythe commented out code11:10
sean-k-mooney    #     m.write(b"x" * align_size)11:10
sean-k-mooney    #     os.write(fd, m)11:10
sean-k-mooneywould write X to fd 711:10
sean-k-mooneybut it's commented out...11:10
sean-k-mooneyand the log message is not writing that11:10
sean-k-mooneyso there is something funky happening: https://github.com/openstack/oslo.log/compare/5.3.0...5.2.0 and https://github.com/openstack/oslo.log/compare/5.2.0...5.3.011:30
sean-k-mooneyshould not both show patches but they do11:30
elodillessean-k-mooney: now that i'm looking at the git tree it shows that we had some issue with oslo.log at the antelope release, thus we reverted the two patches, but we did it on the stable/2023.1 branch and not on master11:37
sean-k-mooneyack11:39
sean-k-mooneyperhaps i should be diffing against a different base11:39
sean-k-mooneythe failure is definitely coming from the LOG.error call11:40
sean-k-mooneybut i can't see any patch that would cause this in those diffs11:40
elodillesyes, 5.1.0..5.3.0, but it still does not show the reverts that we only merged on stable/2023.1 :/11:40
elodillesthere are 3 patches reverted in 5.2.011:41
sean-k-mooneyjust so we are on the same page https://paste.opendev.org/show/bHorJ0sp6UTAKfu2EbJ2/ passes11:41
sean-k-mooneyit's the LOG.error on line 3411:42
sean-k-mooneythat is causing the issue but the message it is logging and the assert that is failing do not make sense11:42
sean-k-mooneyno matter what i log it causes Calls: [call(7, b'X')] on os.write11:43
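To make the failing pattern concrete, here is a self-contained sketch of the shape of the check (the toy probe and its names are invented; this is not the real nova test): os.write is patched for the whole call and is expected to stay untouched, because the error path returns before any direct write happens.

    import os
    from unittest import mock


    def probe_direct_io():
        """Toy stand-in for the probe under test: the error path skips the write."""
        try:
            raise ValueError('pretend the alignment setup failed')
            os.write(7, b'x')  # the direct write the real test guards against
        except ValueError:
            # In nova this branch also calls LOG.error(...), which is where the
            # unexpected extra os.write() shows up with oslo.log 5.3.0.
            return False


    with mock.patch('os.write') as mock_write:
        probe_direct_io()
        # Passes for this toy; in nova the same assertion reported
        # "Called 1 times. Calls: [call(7, b'X')]".
        mock_write.assert_not_called()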
elodillessean-k-mooney: these were reverted on 5.2.0, but not in master (5.3.0): https://paste.opendev.org/show/bRU8ommJhreoVpdiWMRV/11:45
elodillesone of them has to be related to the issue11:45
sean-k-mooneyso the eventlet fix was added and removed, right11:46
sean-k-mooneyi'm going to clone oslo.log and pip install -e it into the env11:47
sean-k-mooneyand i guess do a bisect effectively11:47
fricklerso this whole issue is 6 months old and got successfully ignored? cool11:48
sean-k-mooneyno idea really. it's only causing two tests to fail11:49
sean-k-mooneyso i don't think it's actually breaking real code11:49
sean-k-mooneyhowever i'm not particularly happy with just modifying the test without understanding why this is breaking11:50
sean-k-mooneyas it's really not obvious why this is happening11:50
sean-k-mooneyand looking at the patch delta i would not expect this type of change between 5.2.0 and 5.3.011:50
elodillesstephenfin: do you have any idea what could be a solution for this oslo.log issue? ^^^11:53
sean-k-mooneyso it's broken by 94b9dc311:56
fricklersean-k-mooney: 5.2.0 is a bugfix release only done on stable/2023.1, 5.1.0 is the initial release for 2023.111:56
sean-k-mooneyhttps://github.com/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c1711:56
elodillesyepp11:56
fricklerso 5.1.0..5.3.0 gives a better idea of what happened in master. though iiuc the issue is in 5.0.0..5.1.011:57
frickler94b9dc3 is 5.0.1, ack11:58
sean-k-mooneyso with that, calls to LOG.error are internally using os.write11:59
sean-k-mooneypresumably it's related to the lock it's taking11:59
sean-k-mooneyhere https://github.com/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17#diff-2c76d41c287653560e6e84f39ce877b4f9ba33c7b36db17194e7435cd54adfb0R4111:59
sean-k-mooneyit's this call on release12:00
sean-k-mooneyhttps://github.com/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17#diff-2c76d41c287653560e6e84f39ce877b4f9ba33c7b36db17194e7435cd54adfb0R10812:01
sean-k-mooneyline 10812:01
sean-k-mooneythat is causing the test failure specifically12:01
sean-k-mooneythe next commit makes this conditional12:01
sean-k-mooneyhttps://github.com/openstack/oslo.log/commit/de615d9370681a2834cebe88acfa81b919da340c12:01
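A minimal illustration of why a lock taken around log emission can trip a global os.write mock: if the mutex is backed by a pipe, releasing it writes one byte to a file descriptor. Everything below is invented for illustration and is not oslo.log's code; it only mimics the release-path behaviour pinned down above.

    import os
    from unittest import mock


    class PipeBackedMutex:
        """Toy mutex that signals release by writing one byte to a pipe."""

        def __init__(self):
            self._rfd, self._wfd = os.pipe()

        def acquire(self):
            pass  # not interesting for this illustration

        def release(self):
            os.write(self._wfd, b'X')  # the call a mocked os.write intercepts


    def emit_log_record(mutex):
        """Stand-in for LOG.error(): record handling happens elsewhere, but the
        lock held around emission is still released via os.write."""
        mutex.acquire()
        try:
            pass  # format and hand off the record
        finally:
            mutex.release()


    with mock.patch('os.write') as mock_write:
        emit_log_record(PipeBackedMutex())
        # Prints something like [call(4, b'X')] -- the same shape as the
        # "Calls: [call(7, b'X')]" failure seen in the nova unit tests.
        print(mock_write.call_args_list)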
sean-k-mooneyso i guess there are two ways to fix this12:02
sean-k-mooneyi could probably disable this fix in this test12:02
sean-k-mooneyor i might be able to mock the lock12:02
sean-k-mooneythe issue is we have a StandardLogging fixture in our base test class which handles all or most of the logging config12:03
sean-k-mooneyi can also mock the LOG var on the module i guess12:03
sean-k-mooneywe occasionally do that so i would just replace LOG with a mock explicitly12:04
elodillesso this issue only comes up in nova's tests, but could not cause any problem outside the test code, you mean?12:05
sean-k-mooneyit's not breaking any real code12:07
sean-k-mooneyit's just breaking the test case as we are mocking os.write and testing a function that uses it directly12:07
sean-k-mooneyand oslo.log is not meant to directly call os.write12:07
sean-k-mooneyor at least it did not previously12:07
sean-k-mooneythe "eventlet logging fix" patch adds a lock that internally calls os.write12:08
elodilleshmmmm, i see12:10
sean-k-mooneyself.useFixture(fixtures.MonkeyPatch("nova.privsep.utils.LOG", mock.Mock()))12:11
sean-k-mooneythat should be the fix12:11
sean-k-mooneyya that works12:11
sean-k-mooneyi'll push that up as a patch12:12
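Roughly how that one-liner slots into a test, assuming a fixtures/testtools-style base class such as nova's (the test class name here is illustrative):

    from unittest import mock

    import fixtures
    import testtools


    class DirectIOTestCase(testtools.TestCase):

        def setUp(self):
            super().setUp()
            # Replace the module-level LOG in nova.privsep.utils with a Mock so
            # that logging inside the code under test can never reach os.write.
            self.useFixture(fixtures.MonkeyPatch(
                'nova.privsep.utils.LOG', mock.Mock()))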
elodillessean-k-mooney: thanks! \o/12:27
opendevreviewsean mooney proposed openstack/nova master: adapt to oslo.log changes  https://review.opendev.org/c/openstack/nova/+/89453813:24
sean-k-mooneyfrickler: elodilles: bauzas: ^ that should fix that specific issue13:24
bauzasjust saw the patch13:24
sean-k-mooneythere may be more generic ways to address this but i don't really want to do anything more invasive or global right now13:25
bauzassean-k-mooney: I wish oslo.log had had an upgrade relnote13:26
sean-k-mooneywell they were not aware i guess that we were sensitive to things like calls to os.write13:26
sean-k-mooneyit does not break runtime code13:26
sean-k-mooneywith that said i need to test this with our functional tests as well13:27
sean-k-mooneyso im going to do that now13:27
bauzascool13:28
opendevreviewMerged openstack/python-novaclient stable/2023.2: Update TOX_CONSTRAINTS_FILE for stable/2023.2  https://review.opendev.org/c/openstack/python-novaclient/+/89407613:53
sean-k-mooneybauzas: so the functional tests also work fine with 5.3.014:22
sean-k-mooneyso it really was just those two specific unit tests that were impacted14:22
bauzasI voted +214:50
bauzasfolks (esp. cores), I have a lot of changes that need to be merged before RC1 : https://etherpad.opendev.org/p/nova-bobcat-rc-potential14:51
bauzasat least : https://review.opendev.org/c/openstack/nova/+/893742 https://review.opendev.org/c/openstack/nova/+/893744 https://review.opendev.org/c/openstack/nova/+/893749 14:52
bauzasdansmith: sean-k-mooney: gmann: ^14:53
dansmithbauzas: yep I saw, I'll work through the etherpad when I'm done with my current thing14:56
bauzas++14:56
sean-k-mooneystephenfin: can you review https://review.opendev.org/c/openstack/nova/+/89453815:10
stephenfinsure15:10
sean-k-mooneybauzas: i'll start looking at those now too15:11
stephenfinsean-k-mooney: is the print intentional? Do we need it?15:11
sean-k-mooneydid we really have no API microversion this cycle15:11
sean-k-mooneyok i thought we did15:11
bauzassean-k-mooney: no, we haven't, nor any RPC change15:12
bauzasit was a very smooth release in terms of upgrade15:12
sean-k-mooneythis is both good and bad for the first slurp release15:12
sean-k-mooneystephenfin: no it was not, good catch15:13
stephenfincool, can +2/+W once it's respun15:14
bauzassean-k-mooney: I thought your print was needed15:14
sean-k-mooneyi can add a log for it15:14
sean-k-mooneythat was me just debugging things15:14
sean-k-mooneythe way we use pbr does not work with the debugger in vscode15:15
bauzascool, just respin then15:15
bauzashah15:15
bauzasI personally prefer pdb with an entrypoint I add15:15
bauzasbut I understand you want to walk through it15:15
opendevreviewsean mooney proposed openstack/nova master: adapt to oslo.log changes  https://review.opendev.org/c/openstack/nova/+/89453815:16
stephenfinthanks15:17
sean-k-mooneybauzas: i mentioned this to you before but can we also try to land https://review.opendev.org/c/openstack/nova/+/860829 today15:26
bauzassean-k-mooney: this is just changing the way we query, right?15:33
bauzasmy only concern is how much it would impact the SQL query plan15:33
sean-k-mooneywe need to do it anyway15:33
bauzasand what would be the impact in terms of performance15:33
bauzassean-k-mooney: because of SQLA 3, right?15:34
bauzasS/3/215:34
sean-k-mooneyfor 2 yes15:34
sean-k-mooneywe are changing the way we describe the join15:34
sean-k-mooneybut it should not materially change the query15:35
bauzasthat, I understood15:35
bauzasideally, if the execution plan isn't changing, that's a no-brainer15:35
bauzasbut I don't know this15:35
sean-k-mooneyit's been a while since i read the docs but i'm not expecting it to modify the executed SQL at all15:35
sean-k-mooneyi read https://docs.sqlalchemy.org/en/14/orm/backref.html and stephen's blog before reviewing https://that.guru/blog/sqlalchemy-relationships-without-foreign-keys/15:37
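For anyone who has not read that post, the gist is spelling out the join condition explicitly when there is no ForeignKey constraint. A hedged sketch in SQLAlchemy 1.4/2.0 style (these are not nova's actual model definitions, just the pattern):

    from sqlalchemy import Column, Integer, String
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()


    class Instance(Base):
        __tablename__ = 'instances'
        id = Column(Integer, primary_key=True)
        uuid = Column(String(36), nullable=False)

        # No ForeignKey constraint exists, so the ORM is told how to join:
        # foreign() marks which side of the condition plays the foreign-key role.
        info_cache = relationship(
            'InstanceInfoCache',
            primaryjoin='Instance.uuid == foreign(InstanceInfoCache.instance_uuid)',
            uselist=False,
            viewonly=True,
        )


    class InstanceInfoCache(Base):
        __tablename__ = 'instance_info_caches'
        id = Column(Integer, primary_key=True)
        instance_uuid = Column(String(36), nullable=False)

Only the ORM-side description of the relationship changes here; in principle the emitted JOIN stays the same, which is the point made above about the executed SQL.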
bauzasI hope you understand my concern : if we merge anything this far into the cycle that would impact the query time on some large and frequently called records like instance or instance info cache, operators would jump on our throats15:37
bauzasstephenfin: around ?15:38
sean-k-mooneybauzas: i do but i don't think that is a reason not to merge this15:38
sean-k-mooneyno, he just left to head into the city15:38
sean-k-mooneyhe will be around tomorrow15:38
bauzaskk15:38
sean-k-mooneywe can chat about it in the team meeting tomorrow if you like15:38
bauzassean-k-mooney: and I assume you want to address this in this cycle so that the SQLA 2 bump is considered 100% complete for Bobcat ?15:39
sean-k-mooneyyes15:39
sean-k-mooneyi want that to be one of our cycle highlights for what its worth15:39
bauzasin the prelude if you want15:39
bauzascycle highlights tend to be marketing-readable15:40
sean-k-mooneywell i would be happy even if it's in the normal release notes since all of the above means we finished it :)15:40
bauzasI mean, don't get me wrong15:40
bauzasthis looks like a quick win15:40
sean-k-mooneywell support for sqlalchemy 2.0 is good but we need all the other projects to also support it before we can fully claim victory15:41
bauzasand I wouldn't disagree with merging such things15:41
sean-k-mooneyfor what it's worth this was first proposed 11 months ago and we punted it once already15:42
bauzasa long time ago, in a very far galaxy, we had a DB job15:42
bauzasthat was ensuring our performance was stable15:42
bauzassean-k-mooney: I get your frustration and again, I'm not against merging it15:42
bauzasI'm just asking for clarification15:42
bauzasand visibility15:43
sean-k-mooneyyep i understand15:43
opendevreviewMerged openstack/nova master: doc: mark the maximum microversion for 2023.2 Bobcat  https://review.opendev.org/c/openstack/nova/+/89374215:48
opendevreviewsean mooney proposed openstack/nova-specs master: replace() argument 1 must be str, not _StrPath  https://review.opendev.org/c/openstack/nova-specs/+/89455316:11
sean-k-mooneybauzas: ^ that will fix the nova-specs docs job16:12
opendevreviewMerged openstack/nova master: adapt to oslo.log changes  https://review.opendev.org/c/openstack/nova/+/89453816:42
opendevreviewMerged openstack/nova-specs master: replace() argument 1 must be str, not _StrPath  https://review.opendev.org/c/openstack/nova-specs/+/89455316:59
atmarkhello, is it possible to migrate a cell back to non-cell on an existing deployment?17:04
dansmithatmark: there's no "non-cell" mode in nova17:07
dansmithmaybe explain more about what you're trying to do17:08
greatgatsby_Hello.  Our host aggregates seem to get out of sync with our provider aggregates.  There seems to be a `nova-manage placement sync_aggregates` command, but we're confused how they're getting out of sync in the first place.  Any suggestions of what could be the cause?  This is deployed via kolla-ansible yoga18:41
*** bauzas_ is now known as bauzas19:13
*** bauzas_ is now known as bauzas19:27
*** bauzas_ is now known as bauzas21:47
*** bauzas_ is now known as bauzas22:04
*** bauzas_ is now known as bauzas22:28
*** bauzas_ is now known as bauzas23:05
*** bauzas_ is now known as bauzas23:21
*** bauzas_ is now known as bauzas23:35
