*** rfolco|ruck has quit IRC | 02:18 | |
*** rfolco|ruck has joined #zuul | 02:19 | |
*** bhavikdbavishi has joined #zuul | 04:38 | |
*** bhavikdbavishi has quit IRC | 04:58 | |
*** bhavikdbavishi has joined #zuul | 05:36 | |
*** snapiri has joined #zuul | 05:55 | |
*** ChrisShort has quit IRC | 06:19 | |
*** ChrisShort has joined #zuul | 06:19 | |
*** bhavikdbavishi has quit IRC | 06:25 | |
tobiash | pabelanger: responded on 652764 | 06:29 |
*** pcaruana has joined #zuul | 06:56 | |
*** jamesmcarthur has joined #zuul | 11:06 | |
*** jamesmcarthur has quit IRC | 11:10 | |
*** bhavikdbavishi has joined #zuul | 11:49 | |
*** bhavikdbavishi has quit IRC | 13:54 | |
*** jamesmcarthur has joined #zuul | 14:35 | |
*** sshnaidm|off has quit IRC | 14:37 | |
*** jamesmcarthur has quit IRC | 14:39 | |
*** altlogbot_2 has quit IRC | 15:00 | |
*** jamesmcarthur has joined #zuul | 15:01 | |
*** altlogbot_2 has joined #zuul | 15:02 | |
pabelanger | tristanC: http://softwarefactory-project.io/zuul/ wasn't loading properly. Could have been an outage I guess. | 15:08 |
openstackgerrit | Clark Boylan proposed zuul/zuul master: Tiny cleanup in change panel js https://review.opendev.org/655589 | 15:14 |
openstackgerrit | Paul Belanger proposed zuul/zuul master: Increase timeout value for test_playbook timeout job (again) https://review.opendev.org/656174 | 15:15 |
clarkb | pabelanger: thank you for the follow-up on the parent of ^, the one I just rebased | 15:15 |
pabelanger | clarkb: np | 15:15 |
pabelanger | zuul-maint: ^ bumps up the timeout again for test_playbook; it is still a little racy. However, I don't feel that is the best long-term solution, so I'm open to suggestions. | 15:16 |
clarkb | is this the test that actually runs ansible? | 15:16 |
pabelanger | the good news is that out of 30 test runs, we only had 1 failure (^) for testing: https://review.opendev.org/655805/ | 15:16 |
pabelanger | clarkb: yah | 15:16 |
pabelanger | clarkb: I should say, it tests that we can abort an ansible run properly | 15:17 |
pabelanger | once we hit the timeout | 15:17 |
pabelanger | but in some cases, we time out in pre-run, which causes zuul to run the job again, messing up our build results | 15:18 |
pabelanger | so, my first thought was just to implement job.pre_timeout, so that doesn't happen | 15:19 |
pabelanger | but that is a change in functionality | 15:19 |
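For illustration, the retry behaviour pabelanger is describing: a failure (including a timeout) during pre-run is treated as a possible infrastructure problem and retried, while a failure in the run phase counts as a real result. A minimal sketch of that rule in Python; note that `pre_timeout` is his proposal here, not an existing Zuul option, and this is not actual Zuul code:

```python
# Hypothetical sketch of the behaviour under discussion (not Zuul code).
PRE_RUN, RUN = "pre-run", "run"

def on_timeout(phase, retries_left):
    """Pre-run failures are presumed infrastructure problems and retried;
    run-phase timeouts count as a real build result."""
    if phase == PRE_RUN and retries_left > 0:
        return "RETRY"        # this retry is what skews the test's build counts
    return "TIMED_OUT"

# A separate job.pre_timeout knob would bound pre-run on its own, so a slow
# pre-run could fail outright instead of triggering a retry.
print(on_timeout(PRE_RUN, retries_left=2))  # RETRY
print(on_timeout(RUN, retries_left=2))      # TIMED_OUT
```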
pabelanger | clarkb: also, if you are able to review it, https://review.opendev.org/656072/ was the other race found in testing 655805 | 15:20 |
clarkb | pabelanger: done | 15:21 |
pabelanger | ty | 15:21 |
openstackgerrit | Paul Belanger proposed zuul/zuul master: DNM: exercise halving concurrency https://review.opendev.org/655805 | 15:22 |
clarkb | oh good my tox.ini update to fix the coverage target merged too | 15:23 |
clarkb | that was useful when testing the security fix (so good to have in our back pocket) | 15:24 |
pabelanger | yah, we've stabilized things pretty well over the last few days | 15:24 |
pabelanger | I am hoping to get 5 rechecks out of 655805 without any failures | 15:25 |
clarkb | catching up on the fix for the docker image builds. Was it just the firewall update? | 15:31 |
pabelanger | clarkb: yah, that seems to have been the last fix | 15:32 |
pabelanger | but only for ipv6 I think | 15:32 |
clarkb | ya I think docker modifies ipv4 rules itself | 15:32 |
clarkb | it might do ipv6 rules too if we tell it to manage ipv6 (I seem to recall a config flag for that) | 15:32 |
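For reference, the flag clarkb is half-remembering is most likely Docker's documented `ipv6` daemon option (with `fixed-cidr-v6` supplying the container subnet). A minimal sketch of toggling it from Python, assuming the standard /etc/docker/daemon.json location; whether the daemon then also manages ip6tables rules depends on the Docker version:

```python
import json
from pathlib import Path

DAEMON_JSON = Path("/etc/docker/daemon.json")  # standard location on Linux

# Load the existing daemon config (or start fresh) and enable IPv6 support.
config = json.loads(DAEMON_JSON.read_text()) if DAEMON_JSON.exists() else {}
config["ipv6"] = True
config["fixed-cidr-v6"] = "fd00:dead:beef::/48"  # example ULA subnet
DAEMON_JSON.write_text(json.dumps(config, indent=2) + "\n")
# Restart dockerd afterwards for the change to take effect.
```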
pabelanger | clarkb: as we were saying that (registry), this failure just happened: http://logs.openstack.org/89/655589/2/check/zuul-build-image/0c05cc8/job-output.txt.gz#_2019-04-28_15_36_43_818975 | 15:38 |
clarkb | that is a different error than before, right? before was an EOF (which makes sense given the firewall) | 15:39 |
clarkb | what is odd is it pushed other images and layers prior to that | 15:41 |
clarkb | I wonder if there are errors in the logs we can get at | 15:41 |
clarkb | (not in a great spot to look as trying to pay attention to the board meeting) | 15:41 |
pabelanger | yah, same | 15:42 |
clarkb | *errors in the intermediate registry logs | 15:43 |
openstackgerrit | Merged zuul/zuul master: Fix test race in test_job_pause_retry (#2) https://review.opendev.org/656072 | 15:56 |
pabelanger | I'll be relocating to a coffee shop, but suspect I'll miss the confirmation vote for zuul. Looking forward to the results shortly | 15:56 |
*** jamesmcarthur has quit IRC | 16:06 | |
corvus | clarkb, pabelanger: the other thing with the registry fix was adding retries to skopeo, so we will try each skopeo command 3 times | 16:10 |
openstackgerrit | Tobias Henkel proposed zuul/zuul master: Make test_playbook more robust https://review.opendev.org/656177 | 16:10 |
tobiash | pabelanger: check out this idea regarding test_playbook ^ | 16:10 |
corvus | the error that pabelanger linked happened 3 times, so... maybe there is something wrong with that blob | 16:10 |
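The retry corvus mentions landed in the zuul-jobs Ansible roles; purely to illustrate the shape of it, a Python sketch of running a skopeo command with three attempts (the registry URLs are placeholders):

```python
import subprocess
import time

def skopeo_copy(src, dst, attempts=3, delay=5):
    """Run `skopeo copy`, retrying to paper over transient registry errors."""
    for attempt in range(1, attempts + 1):
        result = subprocess.run(["skopeo", "copy", src, dst])
        if result.returncode == 0:
            return
        if attempt < attempts:
            time.sleep(delay)  # brief pause before the next attempt
    raise RuntimeError(f"skopeo copy {src} -> {dst} failed after {attempts} attempts")

# e.g. skopeo_copy("docker://registry.example/src:tag",
#                  "docker://registry.example/dst:tag")
```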
*** jamesmcarthur has joined #zuul | 16:11 | |
openstackgerrit | Tobias Henkel proposed zuul/zuul master: DNM: exercise halving concurrency https://review.opendev.org/655805 | 16:13 |
*** jamesmcarthur has quit IRC | 16:14 | |
openstackgerrit | Tobias Henkel proposed zuul/zuul master: Fix race in test_job_node_failure_resume https://review.opendev.org/656178 | 16:14 |
openstackgerrit | Tobias Henkel proposed zuul/zuul master: DNM: exercise halving concurrency https://review.opendev.org/655805 | 16:15 |
*** jamesmcarthur has joined #zuul | 16:26 | |
mnaser | corvus: https://opendev.org/vexxhost/ansible-role-wireguard interesting simple multi-node vpn testing :) | 16:55 |
corvus | the openstack foundation board of directors confirmed zuul as an open infrastructure project | 17:10 |
corvus | we're in! :) | 17:10 |
clarkb | hip hip hurray! (is this an appropriate use of that?) | 17:10 |
mugsie | congrats! | 17:12 |
tobiash | :) | 17:14 |
mordred | corvus: I was going to type that sentence, then I realized that I think technically I'm not supposed to - so thanks! | 17:17 |
corvus | mordred: yeah, us peanut gallery folks are under no such restrictions, so i figured i'd do it :) | 17:18 |
mordred | \o/ | 17:18 |
AJaeger | \o/ | 17:24 |
*** jamesmcarthur has quit IRC | 17:30 | |
pabelanger | \o/ | 17:47 |
pabelanger | http://logs.openstack.org/05/655805/9/check/zuul-tox-py35-1/f94c0f2/job-output.txt.gz#_2019-04-28_16_19_09_551133 | 17:49 |
pabelanger | that is weird | 17:49 |
pabelanger | filesystem issue? | 17:49 |
clarkb | pabelanger: might not be executable? | 17:52 |
clarkb | (that would be weird though) | 17:52 |
pabelanger | yah, going to recheck and see if there is an easy way to set up an autohold on that review for all the tox jobs | 17:54 |
pabelanger | in case we get it again | 17:54 |
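For the record, an autohold like pabelanger describes can be set per job with the zuul admin client; a sketch driving it from Python, where the tenant, project, and job names are assumptions inferred from the logs above:

```python
import subprocess

# Hypothetical values inferred from the failing jobs in this log.
TENANT, PROJECT = "openstack", "zuul/zuul"
TOX_JOBS = ["zuul-tox-py35-1", "zuul-tox-py35-2"]

for job in TOX_JOBS:
    subprocess.run([
        "zuul", "autohold",
        "--tenant", TENANT,
        "--project", PROJECT,
        "--job", job,
        "--reason", "debugging tox job failure on 655805",
        "--count", "1",          # hold nodes from one failed build
    ], check=True)
```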
pabelanger | tobiash: +2 | 17:58 |
openstackgerrit | Monty Taylor proposed zuul/zuul-website master: Add slides for Spring 2019 Board Update https://review.opendev.org/656182 | 17:59 |
mordred | clarkb, corvus: ^^ draft slides for this afternoon | 17:59 |
clarkb | one small thing noted inline | 18:02 |
clarkb | two small things now :) | 18:03 |
pabelanger | Was zuul 3.3.0 for Berlin? | 18:04 |
pabelanger | I _think_ it was | 18:05 |
pabelanger | mordred: also left comments, but looks great | 18:11 |
pabelanger | http://logs.openstack.org/33/631933/16/check/zuul-build-image/d6f4993/job-output.txt.gz#_2019-04-28_18_15_15_226860 | 18:22 |
pabelanger | another yarn failure | 18:22 |
pabelanger | wonder if there is an upstream issue | 18:22 |
*** jamesmcarthur has joined #zuul | 18:35 | |
openstackgerrit | Paul Belanger proposed zuul/zuul master: WIP: stream logs for ansible loops https://review.opendev.org/656185 | 18:57 |
*** jamesmcarthur has quit IRC | 19:06 | |
*** jamesmcarthur has joined #zuul | 19:14 | |
openstackgerrit | Monty Taylor proposed zuul/zuul-website master: Add slides for Spring 2019 Board Update https://review.opendev.org/656182 | 19:15 |
openstackgerrit | James E. Blair proposed zuul/zuul-jobs master: Add missing conditionals to build-docker-image https://review.opendev.org/656187 | 19:39 |
*** pcaruana has quit IRC | 19:40 | |
tobiash | pabelanger: commented on 656185 | 19:51 |
openstackgerrit | Merged zuul/zuul-jobs master: Add missing conditionals to build-docker-image https://review.opendev.org/656187 | 19:56 |
*** jamesmcarthur has quit IRC | 20:05 | |
pabelanger | tobiash: cool, that is what I figured but couldn't find a reference | 20:10 |
*** jamesmcarthur has joined #zuul | 20:13 | |
clarkb | that yarn issue seems to be persistent? | 20:19 |
pabelanger | unsure; now that docker is updated on my laptop, I'm going to play around with it | 20:22 |
pabelanger | so for react wizards out there, sometimes my network is terrible, and when I have zuul.o.o open (or any zuul web) it doesn't seem to properly reconnect once the network is better. I just have the spinning circle on the top right | 20:54 |
pabelanger | a browser refresh is required, but with the old web UI, I didn't have this issue while on terrible wifi | 20:54 |
pabelanger | what I see in web console is: Failed to load resource: net::ERR_NETWORK_CHANGED | 20:55 |
*** jamesmcarthur has quit IRC | 20:56 | |
clarkb | fwiw sometimes I have to restart all of firefox for it to work again (so not always a zuul specific issue) | 20:56 |
pabelanger | yah, from what I see, it seems the GET status loop just gets stuck | 20:57 |
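As an aside, the usual fix for a stuck poll loop like this is retry-with-backoff around the status fetch; a general sketch of the pattern (in Python rather than the actual React code, purely to show the idea):

```python
import time
import requests  # third-party: pip install requests

def poll_status(url, interval=5, max_backoff=300):
    """Poll a status endpoint forever, backing off on network errors
    instead of getting stuck, and recovering once the network returns."""
    backoff = interval
    while True:
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            backoff = interval                       # healthy again: reset
            yield resp.json()
        except requests.RequestException:
            backoff = min(backoff * 2, max_backoff)  # transient error: back off
        time.sleep(backoff)
```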
*** jamesmcarthur has joined #zuul | 21:04 | |
Shrews | Hooray for Zuul-nation! | 21:37 |
Shrews | I shall celebrate with drinking of a yeast fermented beverage | 21:38 |
SpamapS | FYI, I believe zuul-web has some issues with reconnecting that result in CORS violations. | 21:47 |
SpamapS | We see it too.. it was more prevalent when the service worker was enabled. | 21:48 |
nickx-intel | CORS? | 21:48 |
nickx-intel | @SpamapS ^? | 21:48 |
SpamapS | But I think what happens is sometimes you need to load javascript and the headers don't authorize it, so browser policies block the load, and you get Loading... | 21:48 |
SpamapS | https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS | 21:48 |
nickx-intel | thanks SpamapS d ^__^ | 21:49 |
SpamapS | I haven't confirmed yet, but it's the working theory. One of our devs uses "Brave" and it is quite reliable for him to get a blank or Loading... screen, shift-refresh fixes. | 21:50 |
clarkb | the only cors headers I see are set to * | 21:58 |
openstackgerrit | Monty Taylor proposed zuul/zuul-website master: Add slides for Spring 2019 Board Update https://review.opendev.org/656182 | 21:58 |
clarkb | and those are set by the api/ subpath | 21:58 |
clarkb | the other paths don't seem to set them at all | 21:58 |
SpamapS | clarkb: the static bits need them too. | 21:59 |
clarkb | (this is looking at opendev's deployment) | 21:59 |
clarkb | I thought the default if unset was basically * | 21:59 |
clarkb | maybe some browsers (like brave) change that? | 21:59 |
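For context, the header in question is `Access-Control-Allow-Origin`; a minimal sketch of emitting the wildcard value clarkb sees on the api/ subpath, assuming a CherryPy-style handler (zuul-web is CherryPy-based, but this is illustrative, not the actual zuul-web code):

```python
import cherrypy

class Api:
    @cherrypy.expose
    @cherrypy.tools.json_out()
    def status(self):
        # "*" lets any origin read the response; without the header, the
        # browser's same-origin policy blocks cross-origin fetches entirely.
        cherrypy.response.headers["Access-Control-Allow-Origin"] = "*"
        return {"ok": True}

if __name__ == "__main__":
    cherrypy.quickstart(Api(), "/api")
```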
*** jamesmcarthur has quit IRC | 22:19 | |
*** jamesmcarthur has joined #zuul | 22:30 | |
*** sshnaidm has joined #zuul | 22:43 | |
*** jamesmcarthur has quit IRC | 22:54 | |
*** jamesmcarthur has joined #zuul | 23:02 | |
*** jamesmcarthur has quit IRC | 23:12 | |
SpamapS | clarkb: yes, and firefox has a more stringent option that many folks turn on. | 23:17 |
*** jamesmcarthur has joined #zuul | 23:20 | |
pabelanger | http://paste.openstack.org/show/750044/ | 23:20 |
pabelanger | first time I've seen that failure in our unit tests | 23:20 |
pabelanger | we should be able to make assertNodepoolState() retry on connection loss | 23:21 |
pabelanger | https://opendev.org/zuul/zuul/src/branch/master/tests/base.py#L2870 | 23:22 |
pabelanger | Shrews: ^maybe you have some thoughts | 23:22 |
pabelanger | http://logs.openstack.org/05/655805/9/check/zuul-tox-py35-2/a8df9ae/testr_results.html.gz | 23:23 |
pabelanger | is where the failure was | 23:23 |
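The retry pabelanger proposes would wrap the ZooKeeper reads; a sketch of the shape of it, assuming kazoo (which the Zuul test suite uses for ZooKeeper access), not the actual patch in 656213:

```python
import time
from kazoo.exceptions import ConnectionLoss, SessionExpiredError

def with_zk_retry(fn, attempts=5, delay=0.5):
    """Call fn(), retrying when ZooKeeper drops the connection mid-request."""
    for attempt in range(attempts):
        try:
            return fn()
        except (ConnectionLoss, SessionExpiredError):
            if attempt == attempts - 1:
                raise               # give up after the final attempt
            time.sleep(delay)       # give the client a moment to reconnect

# e.g. nodes = with_zk_retry(lambda: zk_client.get_children("/nodepool/nodes"))
```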
*** ianw_pto is now known as ianw | 23:37 | |
openstackgerrit | Paul Belanger proposed zuul/zuul master: Update assertNodepoolState() to retry zk requests https://review.opendev.org/656213 | 23:41 |
pabelanger | first attempt ^ | 23:41 |
openstackgerrit | Paul Belanger proposed zuul/zuul master: DNM: exercise halving concurrency https://review.opendev.org/655805 | 23:42 |
*** jamesmcarthur has quit IRC | 23:43 | |
*** irclogbot_3 has quit IRC | 23:54 | |
*** edmondsw_ has quit IRC | 23:58 | |
*** irclogbot_0 has joined #zuul | 23:58 |