Thursday, 2021-02-11

*** zzzeek has quit IRC00:01
*** zzzeek has joined #openstack-lbaas00:02
*** yamamoto has joined #openstack-lbaas01:46
*** yamamoto has quit IRC01:50
*** sapd1 has joined #openstack-lbaas02:00
*** ianychoi has joined #openstack-lbaas02:09
*** zzzeek has quit IRC02:20
*** zzzeek has joined #openstack-lbaas02:21
*** yamamoto has joined #openstack-lbaas02:27
*** ramishra has joined #openstack-lbaas02:37
*** yamamoto has quit IRC02:44
*** sapd1 has quit IRC02:56
*** zzzeek has quit IRC03:02
*** zzzeek has joined #openstack-lbaas03:06
*** yamamoto has joined #openstack-lbaas03:12
*** yamamoto has quit IRC03:16
*** psachin has joined #openstack-lbaas03:44
*** yamamoto has joined #openstack-lbaas04:22
*** armax has joined #openstack-lbaas04:25
*** armax has quit IRC04:26
*** yamamoto has quit IRC04:52
*** yamamoto has joined #openstack-lbaas05:05
*** yamamoto has quit IRC05:06
*** zzzeek has quit IRC05:08
*** zzzeek has joined #openstack-lbaas05:09
*** yamamoto has joined #openstack-lbaas05:28
*** yamamoto has quit IRC05:28
*** yamamoto has joined #openstack-lbaas05:28
*** yamamoto has quit IRC05:29
*** yamamoto has joined #openstack-lbaas05:29
*** xgerman has quit IRC05:30
*** yamamoto has quit IRC06:07
*** yamamoto has joined #openstack-lbaas06:08
*** yamamoto has quit IRC06:08
*** yamamoto has joined #openstack-lbaas06:14
*** yamamoto has quit IRC06:14
*** yamamoto has joined #openstack-lbaas06:15
*** yamamoto has quit IRC06:20
*** sapd1 has joined #openstack-lbaas06:20
*** vishalmanchanda has joined #openstack-lbaas06:22
*** gcheresh has joined #openstack-lbaas06:23
*** yamamoto has joined #openstack-lbaas06:26
*** ccamposr has joined #openstack-lbaas07:08
*** ataraday has joined #openstack-lbaas07:30
*** rpittau|afk is now known as rpittau07:53
*** ianychoi has quit IRC08:33
*** ianychoi has joined #openstack-lbaas08:48
*** yamamoto has quit IRC09:03
*** yamamoto has joined #openstack-lbaas09:22
*** yamamoto has quit IRC09:27
*** yamamoto has joined #openstack-lbaas09:44
*** sapd1 has quit IRC09:44
*** sapd1 has joined #openstack-lbaas09:45
*** sapd1 has quit IRC10:19
<openstackgerrit> Gregory Thiemonge proposed openstack/octavia master: Fix pools going into ERROR when updating the pool  https://review.opendev.org/c/openstack/octavia/+/760461  11:01
*** zzzeek has quit IRC11:28
*** zzzeek has joined #openstack-lbaas11:30
*** yamamoto has quit IRC12:50
*** luksky has joined #openstack-lbaas12:56
<luksky> hi, I'm trying to upgrade octavia from queens to rocky (2.1.2 -> 3.2.2); after running octavia-db-manage upgrade head I get:  12:58
<luksky> ERROR [alembic.util.messaging] Can't locate revision identified by 'e1554c03234f'  12:58
<luksky> can you point me to where to start looking for this problem :)? I don't have any idea where to start.  12:58
*** yamamoto has joined #openstack-lbaas13:00
*** yamamoto has quit IRC13:00
*** yamamoto has joined #openstack-lbaas13:00
*** yamamoto has quit IRC13:04
<rm_work> gthiemonge: commented on https://review.opendev.org/c/openstack/octavia/+/773798  13:04
*** yamamoto has joined #openstack-lbaas13:05
*** yamamoto has quit IRC13:05
<rm_work> luksky: hmm, let me look at what revisions we ever had  13:05
*** yamamoto has joined #openstack-lbaas13:05
rm_workyeah `e1554c03234f` was never a revision we had. did you have any downstream patches that touched the schema?13:05
rm_workluksky: if you ever made any schema patches downstream, that would cause your DB to be out of sync (the main reason I am very careful to NEVER patch the schema downstream)13:06
*** yamamoto has quit IRC13:06
<luksky> rm_work: I haven't touched this setup in a long time  13:06
rm_workyeah, but back when you did touch it...13:06
rm_workdo you run downstream patches at all?13:07
<luksky> only an upgrade from Ubuntu 16 to 18, but the octavia version wasn't touched (still 2.1.2)  13:07
*** yamamoto has joined #openstack-lbaas13:07
lukskyno... :/13:07
*** yamamoto has quit IRC13:07
rm_workhmmm13:07
*** zzzeek has quit IRC13:07
*** yamamoto has joined #openstack-lbaas13:07
*** yamamoto has quit IRC13:07
rm_workdo you have any nodes where the old version of the octavia package you were using is still installed?13:07
<luksky> and octavia is installed from pip, so it wasn't installed from Ubuntu's debs  13:08
*** zzzeek has joined #openstack-lbaas13:08
rm_workoh, ok13:08
lukskyyes13:08
rm_workcool, can you go on there and pastebin me an `ls -l` of...13:08
*** yamamoto has joined #openstack-lbaas13:08
*** yamamoto has quit IRC13:08
<luksky> they all use the same DB (but that is rather obvious)  13:08
*** yamamoto has joined #openstack-lbaas13:09
rm_work`lib/python3.7/site-packages/octavia/db/migration/alembic_migrations/versions`13:09
rm_workwherever that lib is (venv? wherever)13:09
rm_workneed to see the state of your alembic migrations13:10
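A rough sketch of that check, assuming octavia was pip-installed into the system python and the database is a MySQL schema named "octavia" (both assumptions about this deployment):

    # locate the migration scripts shipped with the installed octavia package
    find /usr/local/lib -path '*octavia/db/migration/alembic_migrations/versions*' -name '*.py'
    # check whether any of them define the revision the error complains about
    grep -rl 'e1554c03234f' /usr/local/lib/python*/dist-packages/octavia/db/migration/alembic_migrations/versions/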
luksky:/13:11
<luksky> [DEV_WAW1][root@octavia-03 ~]# find /var -name alembic_migrations -type d  13:11
<luksky> [DEV_WAW1][root@octavia-03 ~]#  13:11
lukskyok, sorry, not this dir13:12
*** yamamoto has quit IRC13:13
lukskyhttp://paste.openstack.org/show/802566/ - not touched octavia13:14
lukskyhttp://paste.openstack.org/show/802567/ - touched octavia13:14
<luksky> this revision hash is taken directly from the DB?  13:19
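It is - alembic records the currently applied revision in a one-row alembic_version table in the octavia database. Reading it looks roughly like this (the mysql client and the "octavia" schema name are assumptions):

    mysql octavia -e "SELECT version_num FROM alembic_version;"   # schema name "octavia" is an assumption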
*** zzzeek has quit IRC13:21
<gthiemonge> rm_work: thanks, I replied in https://review.opendev.org/c/openstack/octavia/+/773798  13:24
*** psachin has quit IRC13:26
*** zzzeek has joined #openstack-lbaas13:26
<luksky> can I safely delete version_num from the alembic_version table in the DB?  13:37
<luksky> rm_work: ^  13:37
rm_workno13:38
rm_workthen it will have no idea where you are13:38
rm_workneed to figure out where e1554c03234f came from13:38
lukskyok13:39
rm_worksomehow a patch was applied to your DB that put it on that schema id13:39
<luksky> hm... can I somehow list the patches which were applied?  13:41
rm_workit should have been these files you listed13:42
rm_workbut i don't see that hash in here13:42
lukskyI think the value should be 0aee2b45051213:42
rm_workthen you can TRY changing the value in the db to that13:43
rm_workand see if the upgrade applies13:43
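A minimal sketch of that attempt - the mysql client and the "octavia" schema name are assumptions, and taking a dump first mirrors the backup luksky already has:

    mysqldump octavia > octavia-backup.sql                                      # keep a backup first
    mysql octavia -e "UPDATE alembic_version SET version_num='0aee2b450512';"   # the revision prod (queens) is on
    octavia-db-manage upgrade head                                              # retry the upgrade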
<luksky> well, this is a devel env - I can update to this version in the DB, and I have a dump. This is one step before production :)  13:43
rm_workok13:43
rm_workin prod is the version id the same?13:43
<luksky> on production I checked -  13:43
*** sapd1 has joined #openstack-lbaas13:43
lukskyin prod there is 0aee2b450512 (so OK)13:43
rm_workor did someone maybe just apply a patch they were developing that never merged, and forget to roll it back?13:43
rm_workok yeah13:43
rm_worksounds like that's what probably happened13:43
lukskyit may be...13:43
lukskychanging in DB13:44
rm_worksomeone was working on a migration and applied it (or applied a migration from a patch that hasn't yet merged)13:44
rm_workyour DB may actually still have this applied tho13:44
rm_workso it may be ... broken13:44
<luksky> yes, it could be the case :/  13:44
rm_workwas hoping to find the patch file13:44
rm_work*migration file13:44
lukskyok, understand the logic now :)13:44
rm_workwish i could search all open patches for a file with that text... not sure how13:45
rm_workhmmm `file:e1554c03234f` might be it but it comes up with nothing13:46
rm_worknope, doesn't work, tried it on a file i know exists. whelp, dunno13:47
rm_worksomeone could have applied a patch from their work laptop/desktop? or another server you didn't see?13:48
rm_workanyway, if you understand the issue now, i'll just let you try to figure it out :D good luck13:48
lukskythank You :)13:48
lukskyrm_work: http://paste.openstack.org/show/802569/ - it didn't fail :P, checking the services :)13:49
rm_workhmm wtf13:51
rm_worknoticed 'd52f64b86d3a' had no message, started trying to see what that was for just so i could know who to be annoyed with, and i can't find that one in the official repo either13:53
*** stand has quit IRC13:54
luksky:/13:58
rm_workany idea where that came from?13:58
rm_workwhat's in it?13:58
lukskysearching the file13:59
rm_workcan you pastebin it?13:59
<luksky> (the services seem to work ok)  13:59
lukskyyes, one moment14:00
lukskyrm_work: http://paste.openstack.org/show/802572/ - strange content14:01
rm_workthat's an empty new revision14:02
rm_worklooks like alembic was run and created a new revision (standard for how we'd start a patch)14:02
rm_workperhaps you or someone accidentally ran `alembic revision`?14:03
rm_work(I think that's the command)14:03
<luksky> I ran octavia-db-manage history revision  14:03
<luksky> sorry, octavia-db-manage upgrade revision  14:03
rm_workright14:04
rm_workbut at some point, someone may have run `alembic revision` for octavia14:04
rm_workwhich would create a blank revision at the end of the chain14:04
rm_work(exactly what that file is)14:04
rm_workit could have been an accident (like, trying to figure out what revision is used)14:04
rm_workthe side effect of which created that file, which is now applied14:05
rm_workso you're going to run into the same issue next time you try to upgrade14:05
rm_workoh wait14:05
<rm_work> did you run that command LITERALLY?  14:05
rm_work`octavia-db-manage upgrade revision`14:06
rm_workor do you mean like14:06
rm_work`octavia-db-manage upgrade $revision`14:06
rm_workthe correct command is `octavia-db-manage upgrade head`14:07
rm_workif you actually had "revision" there, I wonder if it could have passed that through to alembic, as we're essentially using a thin alembic wrapper there14:07
<luksky> I ran this command exactly: octavia-db-manage upgrade revision  14:08
rm_workok yeah, that's not correct14:08
rm_worki bet that caused this14:08
rm_worki'm just not totally sure how yet14:08
lukskyOK14:08
rm_workagain, the correct command is `octavia-db-manage upgrade head`14:09
rm_workyou'll want to update your db to `76aacf2e176c`14:09
*** lemko has quit IRC14:09
rm_workand delete that file14:09
rm_workotherwise you'll have the same issue again14:09
*** lemko has joined #openstack-lbaas14:09
<luksky> ok, I ran the command once more  14:09
<luksky> and I think this is the reason:  14:10
<luksky> [DEV_WAW1][root@octavia-01 ~]# octavia-db-manage revision  14:10
<luksky>   Generating /usr/local/lib/python2.7/dist-packages/octavia/db/migration/alembic_migrations/versions/2802489e7306_.py ...  done  14:10
<luksky> [DEV_WAW1][root@octavia-01 ~]#  14:10
rm_workyep14:10
<luksky> the content is the same as in the revision with 'empty message'  14:11
rm_workmake sure to delete the files you created (that one, and d52f64b86d3a_.py)14:11
lukskyok14:11
rm_workand set your DB back to version 76aacf2e176c manually14:11
rm_workand then avoid ever typing "revision" again :D14:11
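Putting that cleanup together - the file paths follow the dist-packages path shown in luksky's paste, and the "octavia" schema name is an assumption:

    cd /usr/local/lib/python2.7/dist-packages/octavia/db/migration/alembic_migrations/versions
    rm d52f64b86d3a_.py 2802489e7306_.py      # the two empty revisions generated by accident
    mysql octavia -e "UPDATE alembic_version SET version_num='76aacf2e176c';"   # back to the newest real revision, per rm_work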
luksky:D14:11
lukskythank You14:11
rm_worklikely that is the same thing that happened last time / earlier, so probably your DB is in ok shape14:11
<luksky> yes, indeed  14:12
<luksky> but I have one LB in PENDING_UPDATE state  14:12
<luksky> can I fix it in the DB now?  14:12
<luksky> this LB was created before the upgrade (to check if it survives the upgrade :))  14:12
rm_workah14:13
rm_workI would set it to ERROR, then failover14:13
rm_workthough that doesn't necessarily tell you what you were hoping, I think14:13
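Roughly what that looks like - the load_balancer table and provisioning_status column names are assumptions about the Octavia schema, and <lb_id> is a placeholder:

    mysql octavia -e "UPDATE load_balancer SET provisioning_status='ERROR' WHERE id='<lb_id>';"   # table/column names assumed
    openstack loadbalancer failover <lb_id>   # rebuilds the amphorae for that LB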
rm_workhow did it end up in PENDING?14:13
rm_workit was ACTIVE pre-upgrade?14:13
*** vishalmanchanda has quit IRC14:14
<luksky> yes, and now it is working (I mean - the site behind it is working)  14:14
<luksky> I think the amphora needs to be updated as well...  14:15
<luksky> to the rocky version  14:15
<luksky> and the second one went to ERROR state :/ (I don't like octavia upgrades :D)  14:17
rm_workupgrading should do nothing on the dataplane14:17
rm_workexisting amphora should just continue to function, LBs should stay ACTIVE14:18
rm_workthere will be a new image available, but it will only be used by new amps14:18
<luksky> yes, they are working, but they are unmanageable now  14:18
rm_workhmm, should not be14:18
rm_workwhen you did your upgrade, did you change your certificates?14:18
lukskyno14:18
rm_workdo you use kolla-ansible or some other deployment framework, or do manual upgrades?14:18
<luksky> I only changed the version of octavia  14:19
<luksky> no kolla, my own puppet manifests  14:19
<luksky> the upgrade of octavia is basically one command in this case: pip install octavia==3.2.2  14:19
<luksky> then octavia-db-manage upgrade head  14:20
<luksky> and then start the services  14:20
rm_workhmm14:39
rm_workit should work perfectly fine with all existing LBs14:39
<luksky> I will search - but I have another issue:  14:44
<luksky> during building the amphora image for rocky (-g stable/rocky):  14:44
<luksky> 2021-02-11 14:36:57.684 | ERROR: No matching distribution found for oslo.serialization>=2.28.1  14:44
*** mchlumsky has quit IRC14:59
*** yamamoto has joined #openstack-lbaas15:10
rm_workinteresting15:22
rm_workhave not seen that one15:22
rm_workrocky is still a pretty old release...15:23
rm_workluksky: are you running octavia in its own virtualenv?15:23
rm_workor sharing one virtualenv for multiple services15:23
<rm_work> if you've got a standalone virtualenv, I'd recommend upgrading just octavia directly to victoria; you'll probably have fewer issues  15:24
<luksky> are you talking about amphora building?  15:24
rm_workabout everything, but yes15:24
rm_workalso amp building15:24
rm_worki don't actually know if rocky amps are buildable anymore15:25
<luksky> ok, so the latest amphorae won't cause any trouble with rocky?  15:25
*** bcafarel has quit IRC15:25
rm_workrecommend making a python3 virtualenv and installing octavia victoria release, then build a victoria amphora :)15:26
rm_workshould still be able to communicate with old amps15:26
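A sketch of building such an image with the script shipped in the octavia repo - the stable/victoria branch and a working disk-image-builder install are assumptions; the -g flag is the same one luksky used for his rocky build:

    git clone https://opendev.org/openstack/octavia -b stable/victoria
    cd octavia/diskimage-create
    ./diskimage-create.sh -g stable/victoria   # produces amphora-x64-haproxy.qcow2 by default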
<luksky> I need to have a consistent env (with regard to upgrading the controllers)  15:26
*** bcafarel has joined #openstack-lbaas15:26
<luksky> and I can't upgrade all the other openstack components to the latest release  15:27
rm_workdon't need to15:27
rm_workoctavia Victoria could run on Nova/Neutron from like... Icehouse15:27
lukskywell...15:28
lukskyinteresting15:28
<luksky> ok, I'll give it a try on devel.  15:28
rm_workI'm contemplating a talk for the next summit titled something like "Multi-Release Cloud and You: How to Mix & Match Service Versions"15:29
rm_workour cloud is a hilarious mix of versions15:29
rm_workKeystone Rocky, Nova Stein, Neutron Train, Glance/Octavia/Designate on git-master15:29
rm_workah, Senlin on git-master too15:30
haleybFrankenCloud15:30
rm_workbasically the only thing i've found that matters is that Neutron should be >=Nova-Version15:30
*** yamamoto has quit IRC15:31
lukskyheehh :D15:35
*** luksky has quit IRC15:48
-openstackstatus- NOTICE: Recent POST_FAILURE results from Zuul for builds started prior to 15:47 UTC were due to network connectivity issues reaching one of our log storage providers, and can be safely rechecked15:51
*** luksky has joined #openstack-lbaas16:05
lukskyrm_work: can I upgrade directly from queens to victoria :D ?16:18
rm_workyes16:18
lukskysimple octavia-db-manage upgrade head and pray :) ?16:19
rm_workyep16:19
rm_worki recommend you STOP the health-manager services before an upgrade16:19
rm_workthen restart all services16:19
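For a pip/venv deployment like this one, the sequence would look roughly like the following; the venv path, the systemd unit names, and octavia 7.x being the victoria release are assumptions about this environment:

    systemctl stop octavia-health-manager                                          # stop health-managers first (on all controllers), per rm_work
    python3 -m venv /opt/octavia && /opt/octavia/bin/pip install 'octavia==7.*'    # assumed victoria release
    /opt/octavia/bin/octavia-db-manage upgrade head
    systemctl restart octavia-api octavia-worker octavia-health-manager octavia-housekeeping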
<luksky> ok, I will do it in a venv  16:20
<luksky> I'll let you know how it goes :P  16:20
rm_workalso, i'd have a new amp image uploaded and ready to go by the time you restart the services on the new version16:25
rm_workwe GUARANTEE image compatibility for N+/-116:25
rm_workthis is like +4 :P16:25
rm_workit should still "work" but it'd be best to get amps failed over somewhat soon anyway if they're still running such an old image16:26
rm_workjust for OS security reasons16:26
rm_workwe cycle all our amps just about every 6 months minimum16:26
*** xgerman has joined #openstack-lbaas17:03
*** yamamoto has joined #openstack-lbaas17:29
johnsomYeah, it should work. That is a big jump though. If, and I mean if, there was an issue with the old amp image, the controller would just mark it PENDING_UPDATE and fail it over to the new image anyway, automatically. Assuming you followed the upgrade guide and upgraded the components in the prescribed order.17:32
*** rpittau is now known as rpittau|afk17:43
*** yamamoto has quit IRC17:45
*** gcheresh has quit IRC17:49
*** sapd1 has quit IRC17:51
<openstackgerrit> Michael Johnson proposed openstack/python-octaviaclient master: Remove install unnecessary packages  https://review.opendev.org/c/openstack/python-octaviaclient/+/751629  17:55
johnsomrebase to re-render the docs17:55
rm_workyeah that's why I was saying i recommend having the new image uploaded before you restart services on the new version18:00
rm_workthat way if failover IS necessary, it should just work18:01
*** numans has joined #openstack-lbaas18:15
*** luksky has quit IRC18:35
<openstackgerrit> Ghanshyam proposed openstack/octavia master: [goal] Deprecate the JSON formatted policy file  https://review.opendev.org/c/openstack/octavia/+/764578  18:36
*** luksky has joined #openstack-lbaas18:52
*** njohnston has quit IRC19:04
<luksky> ok, I will let you know tomorrow; I'm finishing work for the day now :)  19:15
*** gcheresh has joined #openstack-lbaas19:41
*** yamamoto has joined #openstack-lbaas19:42
*** gcheresh has quit IRC19:52
*** yamamoto has quit IRC19:59
*** rcernin has joined #openstack-lbaas21:20
*** yamamoto has joined #openstack-lbaas21:57
*** gmann is now known as gmann_afk22:10
*** luksky has quit IRC22:25
*** yamamoto has quit IRC22:25
*** lemko7 has joined #openstack-lbaas22:29
*** lemko has quit IRC22:30
*** lemko7 is now known as lemko22:30
*** xgerman has quit IRC22:32
*** emccormick has quit IRC22:40
*** emccormick has joined #openstack-lbaas22:40
*** yamamoto has joined #openstack-lbaas22:41
*** gmann_afk is now known as gmann23:11
*** ccamposr has quit IRC23:48
