*** zzzeek has quit IRC | 00:01 | |
*** zzzeek has joined #openstack-lbaas | 00:02 | |
*** yamamoto has joined #openstack-lbaas | 01:46 | |
*** yamamoto has quit IRC | 01:50 | |
*** sapd1 has joined #openstack-lbaas | 02:00 | |
*** ianychoi has joined #openstack-lbaas | 02:09 | |
*** zzzeek has quit IRC | 02:20 | |
*** zzzeek has joined #openstack-lbaas | 02:21 | |
*** yamamoto has joined #openstack-lbaas | 02:27 | |
*** ramishra has joined #openstack-lbaas | 02:37 | |
*** yamamoto has quit IRC | 02:44 | |
*** sapd1 has quit IRC | 02:56 | |
*** zzzeek has quit IRC | 03:02 | |
*** zzzeek has joined #openstack-lbaas | 03:06 | |
*** yamamoto has joined #openstack-lbaas | 03:12 | |
*** yamamoto has quit IRC | 03:16 | |
*** psachin has joined #openstack-lbaas | 03:44 | |
*** yamamoto has joined #openstack-lbaas | 04:22 | |
*** armax has joined #openstack-lbaas | 04:25 | |
*** armax has quit IRC | 04:26 | |
*** yamamoto has quit IRC | 04:52 | |
*** yamamoto has joined #openstack-lbaas | 05:05 | |
*** yamamoto has quit IRC | 05:06 | |
*** zzzeek has quit IRC | 05:08 | |
*** zzzeek has joined #openstack-lbaas | 05:09 | |
*** yamamoto has joined #openstack-lbaas | 05:28 | |
*** yamamoto has quit IRC | 05:28 | |
*** yamamoto has joined #openstack-lbaas | 05:28 | |
*** yamamoto has quit IRC | 05:29 | |
*** yamamoto has joined #openstack-lbaas | 05:29 | |
*** xgerman has quit IRC | 05:30 | |
*** yamamoto has quit IRC | 06:07 | |
*** yamamoto has joined #openstack-lbaas | 06:08 | |
*** yamamoto has quit IRC | 06:08 | |
*** yamamoto has joined #openstack-lbaas | 06:14 | |
*** yamamoto has quit IRC | 06:14 | |
*** yamamoto has joined #openstack-lbaas | 06:15 | |
*** yamamoto has quit IRC | 06:20 | |
*** sapd1 has joined #openstack-lbaas | 06:20 | |
*** vishalmanchanda has joined #openstack-lbaas | 06:22 | |
*** gcheresh has joined #openstack-lbaas | 06:23 | |
*** yamamoto has joined #openstack-lbaas | 06:26 | |
*** ccamposr has joined #openstack-lbaas | 07:08 | |
*** ataraday has joined #openstack-lbaas | 07:30 | |
*** rpittau|afk is now known as rpittau | 07:53 | |
*** ianychoi has quit IRC | 08:33 | |
*** ianychoi has joined #openstack-lbaas | 08:48 | |
*** yamamoto has quit IRC | 09:03 | |
*** yamamoto has joined #openstack-lbaas | 09:22 | |
*** yamamoto has quit IRC | 09:27 | |
*** yamamoto has joined #openstack-lbaas | 09:44 | |
*** sapd1 has quit IRC | 09:44 | |
*** sapd1 has joined #openstack-lbaas | 09:45 | |
*** sapd1 has quit IRC | 10:19 | |
openstackgerrit | Gregory Thiemonge proposed openstack/octavia master: Fix pools going into ERROR when updating the pool https://review.opendev.org/c/openstack/octavia/+/760461 | 11:01 |
*** zzzeek has quit IRC | 11:28 | |
*** zzzeek has joined #openstack-lbaas | 11:30 | |
*** yamamoto has quit IRC | 12:50 | |
*** luksky has joined #openstack-lbaas | 12:56 | |
luksky | hi, I'm trying to update octavia from queens to rocky (2.1.2 -> 3.2.2), after running octavia-db-manage upgrade head I get: | 12:58 |
luksky | ERROR [alembic.util.messaging] Can't locate revision identified by 'e1554c03234f' | 12:58 |
luksky | can You point me to where I should search for this problem :) ? I don't have any idea where to start. | 12:58 |
*** yamamoto has joined #openstack-lbaas | 13:00 | |
*** yamamoto has quit IRC | 13:00 | |
*** yamamoto has joined #openstack-lbaas | 13:00 | |
*** yamamoto has quit IRC | 13:04 | |
rm_work | gthiemonge: commented on https://review.opendev.org/c/openstack/octavia/+/773798 | 13:04 |
*** yamamoto has joined #openstack-lbaas | 13:05 | |
*** yamamoto has quit IRC | 13:05 | |
rm_work | luksky: hmm let me look at what revisions we ever had | 13:05 |
*** yamamoto has joined #openstack-lbaas | 13:05 | |
rm_work | yeah `e1554c03234f` was never a revision we had. did you have any downstream patches that touched the schema? | 13:05 |
rm_work | luksky: if you ever made any schema patches downstream, that would cause your DB to be out of sync (the main reason I am very careful to NEVER patch the schema downstream) | 13:06 |
*** yamamoto has quit IRC | 13:06 | |
luksky | rm_work: didn't touch this setup for a long time | 13:06 |
rm_work | yeah, but back when you did touch it... | 13:06 |
rm_work | do you run downstream patches at all? | 13:07 |
luksky | only upgrade from ubuntu 16 to 18, but version of octavia wasn't touched (still 2.1.2) | 13:07 |
*** yamamoto has joined #openstack-lbaas | 13:07 | |
luksky | no... :/ | 13:07 |
*** yamamoto has quit IRC | 13:07 | |
rm_work | hmmm | 13:07 |
*** zzzeek has quit IRC | 13:07 | |
*** yamamoto has joined #openstack-lbaas | 13:07 | |
*** yamamoto has quit IRC | 13:07 | |
rm_work | do you have any nodes where the old version of the octavia package you were using is still installed? | 13:07 |
luksky | and octavia is installed from pip, so wasn't installed from ubuntu's debs | 13:08 |
*** zzzeek has joined #openstack-lbaas | 13:08 | |
rm_work | oh, ok | 13:08 |
luksky | yes | 13:08 |
rm_work | cool, can you go on there and pastebin me an `ls -l` of... | 13:08 |
*** yamamoto has joined #openstack-lbaas | 13:08 | |
*** yamamoto has quit IRC | 13:08 | |
luksky | they all use the same DB (but it is rather obvious) | 13:08 |
*** yamamoto has joined #openstack-lbaas | 13:09 | |
rm_work | `lib/python3.7/site-packages/octavia/db/migration/alembic_migrations/versions` | 13:09 |
rm_work | wherever that lib is (venv? wherever) | 13:09 |
rm_work | need to see the state of your alembic migrations | 13:10 |
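A rough sketch of the check rm_work is asking for here - comparing the alembic revision scripts shipped with the installed octavia package against the revision recorded in the database. The `octavia` database name and a local mysql client with access to it are assumptions; the versions path comes out of the installed package itself:

    # run with the same python (venv or system) that octavia is installed into
    OCTAVIA_DIR=$(python -c "import octavia, os; print(os.path.dirname(octavia.__file__))")
    # the alembic revision scripts shipped with the installed package
    ls -l "$OCTAVIA_DIR/db/migration/alembic_migrations/versions/"

    # the revision alembic last recorded in the database (standard alembic bookkeeping table)
    mysql octavia -e "SELECT version_num FROM alembic_version;"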
luksky | :/ | 13:11 |
luksky | [DEV_WAW1][root@octavia-03 ~]# find /var -name alembic_migrations -type d | 13:11 |
luksky | [DEV_WAW1][root@octavia-03 ~]# | 13:11 |
luksky | ok, sorry, not this dir | 13:12 |
*** yamamoto has quit IRC | 13:13 | |
luksky | http://paste.openstack.org/show/802566/ - not touched octavia | 13:14 |
luksky | http://paste.openstack.org/show/802567/ - touched octavia | 13:14 |
luksky | this revision hash is taken directly from the DB ? | 13:19 |
*** zzzeek has quit IRC | 13:21 | |
gthiemonge | rm_work: thanks, I replied in https://review.opendev.org/c/openstack/octavia/+/773798 | 13:24 |
*** psachin has quit IRC | 13:26 | |
*** zzzeek has joined #openstack-lbaas | 13:26 | |
luksky | can I safely delete version_num from the alembic_version table in the DB ? | 13:37 |
luksky | rm_work: ^ | 13:37 |
rm_work | no | 13:38 |
rm_work | then it will have no idea where you are | 13:38 |
rm_work | need to figure out where e1554c03234f came from | 13:38 |
luksky | ok | 13:39 |
rm_work | somehow a patch was applied to your DB that put it on that schema id | 13:39 |
luksky | hm.. can I somehow list the patches which were applied ? | 13:41 |
rm_work | it should have been these files you listed | 13:42 |
rm_work | but i don't see that hash in here | 13:42 |
luksky | I think the value should be 0aee2b450512 | 13:42 |
rm_work | then you can TRY changing the value in the db to that | 13:43 |
rm_work | and see if the upgrade applies | 13:43 |
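A minimal sketch of that attempt, assuming mysql access to an `octavia` database; taking a dump first is cheap insurance since this edits alembic's bookkeeping row directly:

    # back up before touching alembic's version table by hand
    mysqldump octavia > octavia-before-fix.sql

    # point alembic at a revision that actually exists in the installed migration tree
    mysql octavia -e "UPDATE alembic_version SET version_num = '0aee2b450512';"

    # then retry the upgrade
    octavia-db-manage upgrade head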
luksky | well, this is a devel env - I can update to this version in the DB, I have a dump. This is one step before production :) | 13:43 |
rm_work | ok | 13:43 |
rm_work | in prod is the version id the same? | 13:43 |
luksky | on production I checked - | 13:43 |
*** sapd1 has joined #openstack-lbaas | 13:43 | |
luksky | in prod there is 0aee2b450512 (so OK) | 13:43 |
rm_work | or did someone maybe just apply a patch they were developing that never merged, and forget to roll it back? | 13:43 |
rm_work | ok yeah | 13:43 |
rm_work | sounds like that's what probably happened | 13:43 |
luksky | it may be... | 13:43 |
luksky | changing in DB | 13:44 |
rm_work | someone was working on a migration and applied it (or applied a migration from a patch that hasn't yet merged) | 13:44 |
rm_work | your DB may actually still have this applied tho | 13:44 |
rm_work | so it may be ... broken | 13:44 |
luksky | yes, it could be the case :/ | 13:44 |
rm_work | was hoping to find the patch file | 13:44 |
rm_work | *migration file | 13:44 |
luksky | ok, understand the logic now :) | 13:44 |
rm_work | wish i could search all open patches for a file with that text... not sure how | 13:45 |
rm_work | hmmm `file:e1554c03234f` might be it but it comes up with nothing | 13:46 |
rm_work | nope, doesn't work, tried it on a file i know exists. whelp, dunno | 13:47 |
rm_work | someone could have applied a patch from their work laptop/desktop? or another server you didn't see? | 13:48 |
rm_work | anyway, if you understand the issue now, i'll just let you try to figure it out :D good luck | 13:48 |
luksky | thank You :) | 13:48 |
luksky | rm_work: http://paste.openstack.org/show/802569/ - it didn't fail :P, checking the services :) | 13:49 |
rm_work | hmm wtf | 13:51 |
rm_work | noticed 'd52f64b86d3a' had no message, started trying to see what that was for just so i could know who to be annoyed with, and i can't find that one in the official repo either | 13:53 |
*** stand has quit IRC | 13:54 | |
luksky | :/ | 13:58 |
rm_work | any idea where that came from? | 13:58 |
rm_work | what's in it? | 13:58 |
luksky | searching the file | 13:59 |
rm_work | can you pastebin it? | 13:59 |
luksky | (services seem to work ok) | 13:59 |
luksky | yes, one moment | 14:00 |
luksky | rm_work: http://paste.openstack.org/show/802572/ - strange content | 14:01 |
rm_work | that's an empty new revision | 14:02 |
rm_work | looks like alembic was run and created a new revision (standard for how we'd start a patch) | 14:02 |
rm_work | perhaps you or someone accidentally ran `alembic revision`? | 14:03 |
rm_work | (I think that's the command) | 14:03 |
luksky | I ran octavia-db-manage history revision | 14:03 |
luksky | sorry, octavia-db-manage upgrade revision | 14:03 |
rm_work | right | 14:04 |
rm_work | but at some point, someone may have run `alembic revision` for octavia | 14:04 |
rm_work | which would create a blank revision at the end of the chain | 14:04 |
rm_work | (exactly what that file is) | 14:04 |
rm_work | it could have been an accident (like, trying to figure out what revision is used) | 14:04 |
rm_work | the side effect of which created that file, which is now applied | 14:05 |
rm_work | so you're going to run into the same issue next time you try to upgrade | 14:05 |
rm_work | oh wait | 14:05 |
rm_work | did you run that command LITERALLY? | 14:05 |
rm_work | `octavia-db-manage upgrade revision` | 14:06 |
rm_work | or do you mean like | 14:06 |
rm_work | `octavia-db-manage upgrade $revision` | 14:06 |
rm_work | the correct command is `octavia-db-manage upgrade head` | 14:07 |
rm_work | if you actually had "revision" there, I wonder if it could have passed that through to alembic, as we're essentially using a thin alembic wrapper there | 14:07 |
luksky | I ran this command exactly: octavia-db-manage upgrade revision | 14:08 |
rm_work | ok yeah, that's not correct | 14:08 |
rm_work | i bet that caused this | 14:08 |
rm_work | i'm just not totally sure how yet | 14:08 |
luksky | OK | 14:08 |
rm_work | again, the correct command is `octavia-db-manage upgrade head` | 14:09 |
rm_work | you'll want to update your db to `76aacf2e176c` | 14:09 |
*** lemko has quit IRC | 14:09 | |
rm_work | and delete that file | 14:09 |
rm_work | otherwise you'll have the same issue again | 14:09 |
*** lemko has joined #openstack-lbaas | 14:09 | |
luksky | ok, I ran the command once more | 14:09 |
luksky | and I think this is the reason: | 14:10 |
luksky | [DEV_WAW1][root@octavia-01 ~]# octavia-db-manage revision | 14:10 |
luksky | Generating /usr/local/lib/python2.7/dist-packages/octavia/db/migration/alembic_migrations/versions/2802489e7306_.py ... done | 14:10 |
luksky | [DEV_WAW1][root@octavia-01 ~]# | 14:10 |
rm_work | yep | 14:10 |
luksky | the content is the same as in the revision with 'empty message' | 14:11 |
rm_work | make sure to delete the files you created (that one, and d52f64b86d3a_.py) | 14:11 |
luksky | ok | 14:11 |
rm_work | and set your DB back to version 76aacf2e176c manually | 14:11 |
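Putting rm_work's cleanup together as a sketch - the dist-packages path is the one from luksky's paste above, and the `octavia` database name plus local mysql access are assumptions:

    # remove the accidentally generated (empty) revision scripts, plus any compiled copies
    cd /usr/local/lib/python2.7/dist-packages/octavia/db/migration/alembic_migrations/versions/
    rm -f d52f64b86d3a_.py 2802489e7306_.py d52f64b86d3a_.pyc 2802489e7306_.pyc

    # stamp the database back onto the real head of the rocky migration chain
    mysql octavia -e "UPDATE alembic_version SET version_num = '76aacf2e176c';"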
rm_work | and then avoid ever typing "revision" again :D | 14:11 |
luksky | :D | 14:11 |
luksky | thank You | 14:11 |
rm_work | likely that is the same thing that happened last time / earlier, so probably your DB is in ok shape | 14:11 |
luksky | yes, indeed | 14:12 |
luksky | but I have one LB in PENDING_UPDATE state | 14:12 |
luksky | can I fix it in the DB now ? | 14:12 |
luksky | this LB was created before the upgrade (to check if it survives the upgrade :) ) | 14:12 |
rm_work | ah | 14:13 |
rm_work | I would set it to ERROR, then failover | 14:13 |
rm_work | though that doesn't necessarily tell you what you were hoping, I think | 14:13 |
rm_work | how did it end up in PENDING? | 14:13 |
rm_work | it was ACTIVE pre-upgrade? | 14:13 |
*** vishalmanchanda has quit IRC | 14:14 | |
luksky | yes, and now it is working (I mean - the site behind it is working) | 14:14 |
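A sketch of the ERROR-then-failover approach rm_work suggests, assuming mysql access and a rocky-era schema (the `load_balancer` table and `provisioning_status` column are worth double-checking against your own DB); the UUID is a placeholder:

    LB_ID=<your-loadbalancer-uuid>   # placeholder

    # the API refuses to modify objects stuck in PENDING_*, so flip the record in the DB
    mysql octavia -e "UPDATE load_balancer SET provisioning_status = 'ERROR' WHERE id = '$LB_ID';"

    # then rebuild the amphorae behind it
    openstack loadbalancer failover $LB_ID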
luksky | I think the amphorae need to be updated as well... | 14:15 |
luksky | to the rocky version | 14:15 |
luksky | and the second one goes to ERROR state :/ (I don't like upgrades of octavia :D ) | 14:17 |
rm_work | upgrading should do nothing on the dataplane | 14:17 |
rm_work | existing amphora should just continue to function, LBs should stay ACTIVE | 14:18 |
rm_work | there will be a new image available, but it will only be used by new amps | 14:18 |
luksky | Yes, they are working, but are unmanageable now | 14:18 |
rm_work | hmm, should not be | 14:18 |
rm_work | when you did your upgrade, did you change your certificates? | 14:18 |
luksky | no | 14:18 |
rm_work | do you use kolla-ansible or some other deployment framework, or do manual upgrades? | 14:18 |
luksky | only changed version of octavia | 14:19 |
luksky | no kolla, my own puppet manifests | 14:19 |
luksky | upgrade of octavia is only one command in this case: pip install octavia==3.2.2 | 14:19 |
luksky | octavia-db-manage upgrade head | 14:20 |
luksky | and then start the services | 14:20 |
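Roughly the same manual procedure, with the release's upper-constraints file added so pip resolves dependencies the way the stable/rocky gate did; the systemd unit names are assumptions and vary by deployment:

    pip install -c https://releases.openstack.org/constraints/upper/rocky octavia==3.2.2
    octavia-db-manage upgrade head
    systemctl restart octavia-api octavia-worker octavia-health-manager octavia-housekeeping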
rm_work | hmm | 14:39 |
rm_work | it should work perfectly fine with all existing LBs | 14:39 |
luksky | will search - but I have another issue: | 14:44 |
luksky | while building the image for rocky (-g stable/rocky): | 14:44 |
luksky | 2021-02-11 14:36:57.684 | ERROR: No matching distribution found for oslo.serialization>=2.28.1 | 14:44 |
luksky | building of amphora image * | 14:53 |
*** mchlumsky has quit IRC | 14:59 | |
*** yamamoto has joined #openstack-lbaas | 15:10 | |
rm_work | interesting | 15:22 |
rm_work | have not seen that one | 15:22 |
rm_work | rocky is still a pretty old release... | 15:23 |
rm_work | luksky: are you running octavia in its own virtualenv? | 15:23 |
rm_work | or sharing one virtualenv for multiple services | 15:23 |
rm_work | if you've got a standalone virtualenv, i'd recommend upgrading just octavia directly to victoria, you'll probably have less issues | 15:24 |
luksky | You're writing about amphora building ? | 15:24 |
rm_work | about everything, but yes | 15:24 |
rm_work | also amp building | 15:24 |
rm_work | i don't actually know if rocky amps are buildable anymore | 15:25 |
luksky | ok, so the latest amphoras won't cause any trouble with rocky ? | 15:25 |
*** bcafarel has quit IRC | 15:25 | |
rm_work | recommend making a python3 virtualenv and installing octavia victoria release, then build a victoria amphora :) | 15:26 |
rm_work | should still be able to communicate with old amps | 15:26 |
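A sketch of that recommendation - a standalone python3 virtualenv running a victoria control plane plus a matching amphora image. The venv path is arbitrary, and diskimage-builder and qemu-img need to be available for the image build:

    # control plane in its own venv, pinned to victoria constraints
    python3 -m venv /opt/octavia-victoria
    /opt/octavia-victoria/bin/pip install \
        -c https://releases.openstack.org/constraints/upper/victoria \
        'git+https://opendev.org/openstack/octavia@stable/victoria'

    # matching amphora image, built from the victoria branch of the agent
    git clone -b stable/victoria https://opendev.org/openstack/octavia
    ./octavia/diskimage-create/diskimage-create.sh -g stable/victoria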
luksky | I need to keep the env consistent (regarding upgrading the controllers) | 15:26 |
*** bcafarel has joined #openstack-lbaas | 15:26 | |
luksky | and I can't upgrade all the other OpenStack components to the latest release | 15:27 |
rm_work | don't need to | 15:27 |
rm_work | octavia Victoria could run on Nova/Neutron from like... Icehouse | 15:27 |
luksky | well... | 15:28 |
luksky | interesting | 15:28 |
luksky | ok, I'll give it a try on devel. | 15:28 |
rm_work | I'm contemplating a talk for the next summit titled something like "Multi-Release Cloud and You: How to Mix & Match Service Versions" | 15:29 |
rm_work | our cloud is a hilarious mix of versions | 15:29 |
rm_work | Keystone Rocky, Nova Stein, Neutron Train, Glance/Octavia/Designate on git-master | 15:29 |
rm_work | ah, Senlin on git-master too | 15:30 |
haleyb | FrankenCloud | 15:30 |
rm_work | basically the only thing i've found that matters is that Neutron should be >=Nova-Version | 15:30 |
*** yamamoto has quit IRC | 15:31 | |
luksky | heehh :D | 15:35 |
*** luksky has quit IRC | 15:48 | |
-openstackstatus- NOTICE: Recent POST_FAILURE results from Zuul for builds started prior to 15:47 UTC were due to network connectivity issues reaching one of our log storage providers, and can be safely rechecked | 15:51 | |
*** luksky has joined #openstack-lbaas | 16:05 | |
luksky | rm_work: can I upgrade directly from queens to victoria :D ? | 16:18 |
rm_work | yes | 16:18 |
luksky | simply octavia-db-manage upgrade head and pray :) ? | 16:19 |
rm_work | yep | 16:19 |
rm_work | i recommend you STOP the health-manager services before an upgrade | 16:19 |
rm_work | then restart all services | 16:19 |
luksky | ok, will do it in venv | 16:20 |
luksky | I'll let You know how it goes :P | 16:20 |
rm_work | also, i'd have a new amp image uploaded and ready to go by the time you restart the services on the new version | 16:25 |
rm_work | we GUARANTEE image compatibility for N+/-1 | 16:25 |
rm_work | this is like +4 :P | 16:25 |
rm_work | it should still "work" but it'd be best to get amps failed over somewhat soon anyway if they're still running such an old image | 16:26 |
rm_work | just for OS security reasons | 16:26 |
rm_work | we cycle all our amps just about every 6 months minimum | 16:26 |
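Pulling that advice together as a rough upgrade-day sequence. The image file name is diskimage-create's default output, the `amphora` tag is assumed to match `amp_image_tag` in octavia.conf, and the unit names and image-ownership details will differ per deployment:

    # stop health-managers first so nothing gets failed over mid-upgrade
    systemctl stop octavia-health-manager

    # upgrade the control plane (see the venv sketch above), then run the migrations
    octavia-db-manage upgrade head

    # have the new amphora image registered before the services come back
    openstack image create --disk-format qcow2 --container-format bare \
        --tag amphora --private --file amphora-x64-haproxy.qcow2 amphora-victoria

    systemctl start octavia-health-manager octavia-api octavia-worker octavia-housekeeping

    # then cycle old amphorae onto the new image at a convenient pace
    openstack loadbalancer failover <loadbalancer-uuid>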
*** xgerman has joined #openstack-lbaas | 17:03 | |
*** yamamoto has joined #openstack-lbaas | 17:29 | |
johnsom | Yeah, it should work. That is a big jump though. If, and I mean if, there was an issue with the old amp image, the controller would just mark it PENDING_UPDATE and fail it over to the new image anyway, automatically. Assuming you followed the upgrade guide and upgraded the components in the prescribed order. | 17:32 |
*** rpittau is now known as rpittau|afk | 17:43 | |
*** yamamoto has quit IRC | 17:45 | |
*** gcheresh has quit IRC | 17:49 | |
*** sapd1 has quit IRC | 17:51 | |
openstackgerrit | Michael Johnson proposed openstack/python-octaviaclient master: Remove install unnecessary packages https://review.opendev.org/c/openstack/python-octaviaclient/+/751629 | 17:55 |
johnsom | rebase to re-render the docs | 17:55 |
rm_work | yeah that's why I was saying i recommend having the new image uploaded before you restart services on the new version | 18:00 |
rm_work | that way if failover IS necessary, it should just work | 18:01 |
*** numans has joined #openstack-lbaas | 18:15 | |
*** luksky has quit IRC | 18:35 | |
openstackgerrit | Ghanshyam proposed openstack/octavia master: [goal] Deprecate the JSON formatted policy file https://review.opendev.org/c/openstack/octavia/+/764578 | 18:36 |
*** luksky has joined #openstack-lbaas | 18:52 | |
*** njohnston has quit IRC | 19:04 | |
luksky | ok, will let You know tomorrow, now I'm finishing work :) | 19:15 |
*** gcheresh has joined #openstack-lbaas | 19:41 | |
*** yamamoto has joined #openstack-lbaas | 19:42 | |
*** gcheresh has quit IRC | 19:52 | |
*** yamamoto has quit IRC | 19:59 | |
*** rcernin has joined #openstack-lbaas | 21:20 | |
*** yamamoto has joined #openstack-lbaas | 21:57 | |
*** gmann is now known as gmann_afk | 22:10 | |
*** luksky has quit IRC | 22:25 | |
*** yamamoto has quit IRC | 22:25 | |
*** lemko7 has joined #openstack-lbaas | 22:29 | |
*** lemko has quit IRC | 22:30 | |
*** lemko7 is now known as lemko | 22:30 | |
*** xgerman has quit IRC | 22:32 | |
*** emccormick has quit IRC | 22:40 | |
*** emccormick has joined #openstack-lbaas | 22:40 | |
*** yamamoto has joined #openstack-lbaas | 22:41 | |
*** gmann_afk is now known as gmann | 23:11 | |
*** ccamposr has quit IRC | 23:48 |