Monday, 2023-08-07

03:45 *** kiennt2609 is now known as kiennt
12:28 *** elvira2 is now known as elvira
13:02 <tkajinam> Hi. It seems the release job of puppet-nova failed, likely because of a temporary issue with Puppet Forge. Can we attempt to rerun the job? https://zuul.opendev.org/t/openstack/build/6057ca178a8e46079589ac4a46aa0b3c
13:07 <noonedeadpunk> Hey folks! It looks to me like some mirrors for jammy are desynced. Maybe even this one - `deb http://mirror.iad3.inmotion.opendev.org:8080/MariaDB/mariadb-10.11.2/repo/ubuntu jammy main`
13:07 <fungi> tkajinam: sorry, i was responding to you in the #openstack-puppet channel. we're now talking about this problem in four channels (also #rdo and #openstack-release)
13:07 <noonedeadpunk> as jobs fail with ` E: Version '12.3.0-1ubuntu1~22.04' for 'libgcc1' was not found` https://zuul.opendev.org/t/openstack/build/a39cba90d870457ea01975131f576821
13:08 <tkajinam> fungi, oh, sorry. I didn't notice it. let me move to #puppet-openstack
13:08 <fungi> noonedeadpunk: all our "mirrors" should have identical content, since they're just web front-ends to a distributed network filesystem
13:08 <noonedeadpunk> while the ones that use `deb http://mirror-int.ord.rax.opendev.org:8080/MariaDB/mariadb-10.11.2/repo/ubuntu jammy main` do pass https://zuul.opendev.org/t/openstack/build/90a3284262e54665bad9b7a339931fec
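One quick way to check the desync theory above would be to fetch the same repository metadata file from both mirror hosts and compare checksums. A rough sketch, with two caveats: the dists/jammy/Release path is an assumption based on standard apt repo layout, and the mirror-int.ord.rax.opendev.org name quoted above looks internal and may only resolve from inside that cloud.

```python
# Rough sketch: compare one repo metadata file across the two mirrors quoted above.
# The dists/jammy/Release path is an assumption based on standard apt repo layout,
# and mirror-int.ord.rax.opendev.org may only resolve from inside that provider.
import hashlib
import urllib.request

MIRRORS = [
    "http://mirror.iad3.inmotion.opendev.org:8080",
    "http://mirror-int.ord.rax.opendev.org:8080",
]
PATH = "/MariaDB/mariadb-10.11.2/repo/ubuntu/dists/jammy/Release"

for base in MIRRORS:
    try:
        with urllib.request.urlopen(base + PATH, timeout=10) as resp:
            body = resp.read()
        print(base, len(body), hashlib.sha256(body).hexdigest()[:16])
    except OSError as exc:
        print(base, "error:", exc)
```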
13:10 <noonedeadpunk> fungi: some caching on front-end?
13:11 <fungi> noonedeadpunk: only at the filesystem driver layer
13:11 <fungi> noonedeadpunk: oh, though this isn't our normal ubuntu mirrors?
13:11 <fungi> or maybe the :8080 is throwing me off
13:12 <fungi> i'll need to look into what vhost is served from that port, maybe it's not a normal reprepro mirror in our afs
13:14 <noonedeadpunk> I've just checked 3 failed jobs for today - all of them were from inmotion. Also checked 3-4 that passed today - not a single one was inmotion, but rax or ovh
13:14 <noonedeadpunk> It could just be a coincidence, but I'm not sure how I can debug that thingy.
13:17 <fungi> noonedeadpunk: okay, i found it. 8080 is a caching reverse web proxy to https://downloads.mariadb.com/MariaDB/ according to https://downloads.mariadb.com/MariaDB/
13:17 <fungi> er, according to https://opendev.org/opendev/system-config/src/branch/master/playbooks/roles/mirror/templates/mirror.vhost.j2#L304-L307
13:18 <fungi> so i suppose it's possible the mirror in inmotion cached a bad response from downloads.mariadb.com with a long ttl
13:18 <fungi> or there could be something else going on with the server's disk. i'll see if anything's out of sorts there
13:20 <noonedeadpunk> well, ttl doesn't look that long according to the config. And I think we started seeing issues last week. At least I see the first failure dated 31st of July
13:20 <fungi> server seems happy enough
13:21 <noonedeadpunk> (that could be "proper" failure due to original mirror misbehaving)
13:22 <fungi> noonedeadpunk: looks like downloads.mariadb.com is a round-robin to four cloudflare cdn endpoints, so it's possible one of those is actually serving stale content to requests for its region (which might then only be impacting our inmotion environment)
13:23 <noonedeadpunk> yeah, that's also a possibility ofc
13:23 <fungi> i'm seeing if i can find the same missing package version
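One way to follow up on the round-robin theory is to resolve the individual A records behind downloads.mariadb.com and request the same object from each endpoint, comparing the response headers for signs of a stale node. A hedged sketch: the dists/jammy/Release path is an assumption, and the CDN may answer plain-HTTP requests with a redirect.

```python
# Hedged diagnostic sketch: query each A record behind downloads.mariadb.com
# separately and compare status/Last-Modified/Age headers to spot a stale node.
import socket
import urllib.request

HOST = "downloads.mariadb.com"
PATH = "/MariaDB/mariadb-10.11.2/repo/ubuntu/dists/jammy/Release"  # assumed layout

addrs = sorted({info[4][0] for info in socket.getaddrinfo(HOST, 80, socket.AF_INET)})
for ip in addrs:
    # Fetch by IP but send the real Host header so the CDN routes the request normally.
    # Plain HTTP keeps the sketch simple; the CDN may respond with a redirect.
    req = urllib.request.Request(f"http://{ip}{PATH}", headers={"Host": HOST})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(ip, resp.status, resp.headers.get("Last-Modified"), resp.headers.get("Age"))
    except OSError as exc:
        print(ip, "error:", exc)
```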
13:27 <noonedeadpunk> fungi: well, actually, I think this package doesn't come from the mariadb mirror
13:27 <noonedeadpunk> https://paste.openstack.org/show/buBAKn2nSXBk1a40Ms5U/
13:28 <noonedeadpunk> but in the output it's missing exactly the same package I have installed in my sandbox
13:31 <fungi> https://packages.ubuntu.com/libgcc1
13:33 <noonedeadpunk> https://packages.ubuntu.com/jammy-updates/libgcc-s1
13:33 <fungi> oh, virtual package for libgcc-s1 on jammy
13:33 <fungi> yep
13:34 <noonedeadpunk> yeah, I guess it's time to update this in the roles, but it should work....
13:34 <fungi> and changelog says that backport was from may
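The libgcc1 → libgcc-s1 relationship can also be confirmed from apt's own metadata rather than packages.ubuntu.com; a minimal sketch, assuming python-apt is available on a jammy host:

```python
# Minimal sketch, assuming python-apt on an Ubuntu jammy host: ask apt which real
# packages provide the (now virtual) name "libgcc1" and what their candidates are.
import apt

cache = apt.Cache()
for pkg in cache.get_providing_packages("libgcc1"):
    cand = pkg.candidate
    print(pkg.name, cand.version if cand else "no candidate")
# Expected on jammy (assumption): a single libgcc-s1 entry, e.g. 12.3.0-1ubuntu1~22.04.
```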
13:35 <fungi> doesn't explain why it's breaking in only one provider though
13:36 <fungi> out of curiosity, why does that role pin exact package versions? ubuntu doesn't keep them in their respective suites indefinitely, so those versions are going to get rotated out for newer updates over time
13:37 <fungi> seems like you'll wind up with constant churn on those version pins
13:39 <noonedeadpunk> I think we're pinning only packages that come from third-party repos, like mariadb or rabbitmq, that stay there for quite a long time
13:40 <noonedeadpunk> Or well, we never had issues with mariadb being pinned, we had them with rabbit, but it's worth pinning them, as you can get an incompatible erlang/rabbit combination quite easily
fungi"apt-get ... install install debconf-utils=1.5.79ubuntu1 libgcc1=12.3.0-1ubuntu1~22.04 libstdc++6=12.3.0-1ubuntu1~22.04 python3-pymysql=1.0.2-1ubuntu1 libmariadb-dev=1:10.11.2+maria~ubu2204 mariadb-client=1:10.11.2+maria~ubu2204 mariadb-backup=1:10.11.2+maria~ubu2204 mariadb-server=1:10.11.2+maria~ubu2204 socat=1.7.4.1-3ubuntu4"13:41
13:41 <noonedeadpunk> or anytime you decide to extend a cluster - it can just fall apart
13:41 <fungi> like half of those are straight from the ubuntu main repository
13:41 <noonedeadpunk> fungi: ah, you're asking about that. It's Ansible. Have no idea why, but they've re-invented the system resolver for package versions
13:42 <fungi> oh, yeesh
13:42 <noonedeadpunk> And they just handle dependencies and provide exact versions to apt
13:42 <fungi> if their dep solver has subtle differences with apt's, i can imagine that would lead to some crazy failure conditions
13:43 <noonedeadpunk> I think they did that to support things like version comparison and ability to supply `>=` and stuff to apt module
13:43 <fungi> maybe they just use libapt to do it
13:43 <noonedeadpunk> oh, they had quite wild failures when you were crazy enough to use apt preferences.d instead of their resolver...
13:43 <fungi> but yeah, i would have expected they'd only pass through version pins specified in the task data
13:44 <fungi> anyway, still doesn't explain why it would be different in the inmotion environment
13:47 <noonedeadpunk> yeah, they actually do https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/apt.py#L493-L514
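For reference, the `>=`-style comparisons mentioned above can be made against apt's own version ordering via python-apt. This is a small sketch of that idea only, not the actual Ansible code, and it assumes python-apt is installed:

```python
# Small sketch (not the actual Ansible logic): compare Debian version strings
# using apt's own ordering rules via python-apt.
import apt_pkg

apt_pkg.init()

def satisfies(candidate: str, op: str, wanted: str) -> bool:
    cmp = apt_pkg.version_compare(candidate, wanted)
    return {">=": cmp >= 0, "<=": cmp <= 0, ">": cmp > 0, "<": cmp < 0, "=": cmp == 0}[op]

# MariaDB example drawn from the pinned list above: 10.11.2 satisfies ">= 10.11.0".
print(satisfies("1:10.11.2+maria~ubu2204", ">=", "1:10.11.0"))  # True
```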
13:47 <fungi> so circling back around to the beginning, libgcc1 (or libgcc-s1) 12.3.0-1ubuntu1~22.04 is being served from our global afs file tree so all the mirrors should have it, or none should, since they're all just serving content from the same read-only replica from the afs fileservers
13:47 <fungi> but i'll double-check that
13:50 <fungi> firstly, /afs/openstack.org/mirror/ubuntu/pool/main/g/gcc-12/libgcc-s1_12.3.0-1ubuntu1~22.04_amd64.deb exists
13:52 <fungi> https://mirror.iad3.inmotion.opendev.org/ubuntu/pool/main/g/gcc-12/libgcc-s1_12.3.0-1ubuntu1~22.04_amd64.deb is present
13:55 <fungi> also the package index for jammy-updates at https://mirror.iad3.inmotion.opendev.org/ubuntu/dists/jammy-updates/main/binary-amd64/Packages.gz has a libgcc-s1 12.3.0-1ubuntu1~22.04 entry with the desired "Provides: libgcc1 (= 1:12.3.0-1ubuntu1~22.04)"
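For anyone wanting to repeat that index check, a minimal sketch that pulls the same Packages.gz (URL taken from the line above) and prints the libgcc-s1 stanza:

```python
# Minimal sketch: fetch the jammy-updates Packages index from the inmotion mirror
# (URL from the discussion above) and print the stanza for libgcc-s1, which should
# include the "Provides: libgcc1 (= 1:12.3.0-1ubuntu1~22.04)" line.
import gzip
import urllib.request

URL = ("https://mirror.iad3.inmotion.opendev.org/ubuntu/dists/"
       "jammy-updates/main/binary-amd64/Packages.gz")

with urllib.request.urlopen(URL, timeout=30) as resp:
    text = gzip.decompress(resp.read()).decode("utf-8")

for stanza in text.split("\n\n"):
    if stanza.startswith("Package: libgcc-s1\n"):
        print(stanza)
        break
```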
13:59 <noonedeadpunk> huh....
13:59 * noonedeadpunk confused now
14:00 <noonedeadpunk> let me re-check again then...
14:03 <noonedeadpunk> thanks for checking on this
14:06 <fungi> noonedeadpunk: it may be something odd with the images for the nodes themselves. i see that the last time we successfully uploaded any ubuntu-jammy images to our providers was wednesday, and the last successful ubuntu-jammy upload to inmotion-iad3 was a week ago
14:06 <fungi> i'm going to switch to #opendev and start looking into why we're having problems uploading (or building) images
22:42 <opendevreview> Michael Johnson proposed openstack/project-config master: Allow designate-core as osc/sdk service-core  https://review.opendev.org/c/openstack/project-config/+/890365
22:48 <opendevreview> Michael Johnson proposed openstack/project-config master: Allow designate-core as osc/sdk service-core  https://review.opendev.org/c/openstack/project-config/+/890365
23:46 <opendevreview> Merged openstack/project-config master: Fix app-intel-ethernet-operator reviewers group  https://review.opendev.org/c/openstack/project-config/+/890569

Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!