Monday, 2020-12-28

*** zzzeek has quit IRC  00:31
*** zzzeek has joined #openstack-ironic  00:39
*** tosky has quit IRC  00:57
*** jawad_axd has joined #openstack-ironic  01:01
*** zzzeek has quit IRC  01:02
*** zzzeek has joined #openstack-ironic  01:03
*** jawad_axd has quit IRC  01:06
*** zzzeek has quit IRC  01:13
*** zzzeek has joined #openstack-ironic  01:14
*** zzzeek has quit IRC  01:38
*** zzzeek has joined #openstack-ironic  01:41
*** zzzeek has quit IRC  01:50
*** zzzeek has joined #openstack-ironic  01:52
*** ociuhandu has joined #openstack-ironic  02:08
*** zzzeek has quit IRC  02:09
*** ociuhandu has quit IRC  02:13
*** zzzeek has joined #openstack-ironic  02:13
*** rcernin has quit IRC  02:45
*** uzumaki has quit IRC  03:02
*** xinliang has joined #openstack-ironic  03:03
*** rcernin has joined #openstack-ironic  03:08
*** zzzeek has quit IRC  03:20
*** zzzeek has joined #openstack-ironic  03:21
*** rcernin has quit IRC  03:27
*** rcernin has joined #openstack-ironic  03:27
*** ociuhandu has joined #openstack-ironic  03:56
*** ociuhandu has quit IRC  04:01
*** ociuhandu has joined #openstack-ironic  04:22
*** ociuhandu has quit IRC  04:26
*** xinliang has quit IRC  04:26
*** zzzeek has quit IRC  04:34
*** zzzeek has joined #openstack-ironic  04:35
*** zzzeek has quit IRC  04:44
*** zzzeek has joined #openstack-ironic  04:45
*** rcernin has quit IRC  04:50
*** rcernin has joined #openstack-ironic  04:54
*** zzzeek has quit IRC  04:59
*** zzzeek has joined #openstack-ironic  05:00
*** ociuhandu has joined #openstack-ironic  05:05
*** mkrai has joined #openstack-ironic  05:06
*** zzzeek has quit IRC  05:06
*** zzzeek has joined #openstack-ironic  05:08
*** ociuhandu has quit IRC  05:10
*** bhagyashris|ruck is now known as bhagyashris  05:33
*** zzzeek has quit IRC  05:50
*** zzzeek has joined #openstack-ironic  05:50
*** uzumaki has joined #openstack-ironic  05:57
*** zzzeek has quit IRC  06:05
*** jawad_axd has joined #openstack-ironic  06:06
*** zzzeek has joined #openstack-ironic  06:07
*** jawad_axd has quit IRC  06:07
*** jawad_axd has joined #openstack-ironic  06:07
*** mkrai has quit IRC  06:13
*** zzzeek has quit IRC  06:27
*** zzzeek has joined #openstack-ironic  06:29
*** zzzeek has quit IRC  06:50
*** zzzeek has joined #openstack-ironic  06:50
*** ociuhandu has joined #openstack-ironic  06:53
*** ociuhandu has quit IRC  06:58
*** ociuhandu has joined #openstack-ironic  07:00
*** zzzeek has quit IRC  07:18
*** zzzeek has joined #openstack-ironic  07:19
*** zzzeek has quit IRC  07:24
*** zzzeek has joined #openstack-ironic  07:25
*** ociuhandu has quit IRC  07:27
*** ildikov has quit IRC  07:45
*** pas-ha has quit IRC  07:45
*** PrinzElvis has quit IRC  07:46
*** pas-ha has joined #openstack-ironic  07:47
*** ildikov has joined #openstack-ironic  07:47
*** PrinzElvis has joined #openstack-ironic  07:47
*** sshnaidm_ has joined #openstack-ironic  08:02
*** sshnaidm has quit IRC  08:05
*** rcernin has quit IRC  08:07
*** ociuhandu has joined #openstack-ironic  08:26
*** rcernin has joined #openstack-ironic  08:26
*** rcernin has quit IRC  08:31
*** uzumaki has quit IRC  08:37
*** rcernin has joined #openstack-ironic  08:43
*** rcernin has quit IRC  08:47
*** ociuhandu has quit IRC  08:55
*** zzzeek has quit IRC  08:58
*** zzzeek has joined #openstack-ironic  09:01
*** ociuhandu has joined #openstack-ironic  09:08
*** ociuhandu has quit IRC  09:23
*** ociuhandu has joined #openstack-ironic  09:27
*** uzumaki has joined #openstack-ironic  09:28
*** ociuhandu has quit IRC  09:32
*** ociuhandu has joined #openstack-ironic  09:54
*** ociuhandu has quit IRC  10:02
*** ociuhandu has joined #openstack-ironic  10:15
*** ociuhandu has quit IRC  10:33
*** ociuhandu has joined #openstack-ironic  10:42
*** sshnaidm_ is now known as sshnaidm|rover  10:50
*** ociuhandu has quit IRC  10:52
*** tosky has joined #openstack-ironic  10:55
*** ociuhandu has joined #openstack-ironic  11:18
openstackgerrit: Kaifeng Wang proposed openstack/ironic-specs master: Snapshot support  https://review.opendev.org/c/openstack/ironic-specs/+/746935  11:22
*** ociuhandu has quit IRC  11:39
*** zzzeek has quit IRC  11:43
*** zzzeek has joined #openstack-ironic  11:44
openstackgerrit: Kaifeng Wang proposed openstack/ironic-specs master: Snapshot support  https://review.opendev.org/c/openstack/ironic-specs/+/746935  11:49
*** zzzeek has quit IRC  11:51
*** zzzeek has joined #openstack-ironic  11:55
*** uzumaki has quit IRC  12:20
*** ociuhandu has joined #openstack-ironic  12:26
*** ociuhandu has quit IRC  12:45
*** iurygregory has joined #openstack-ironic  13:18
iurygregory: o/  13:22
*** mkrai has joined #openstack-ironic  14:04
*** ociuhandu has joined #openstack-ironic  14:18
*** ociuhandu has quit IRC  14:23
*** ociuhandu has joined #openstack-ironic  14:23
*** ociuhandu has quit IRC  14:29
*** ociuhandu has joined #openstack-ironic  14:30
*** sshnaidm|rover has quit IRC  14:49
*** mkrai has quit IRC  15:23
*** anuradha1904 has joined #openstack-ironic  15:31
*** ociuhandu has quit IRC  15:35
*** ociuhandu has joined #openstack-ironic  15:36
*** ociuhandu has quit IRC  15:40
*** ociuhandu has joined #openstack-ironic  15:43
*** JasonF has quit IRC  15:55
*** uzumaki has joined #openstack-ironic  16:48
*** jawad_axd has quit IRC  17:21
*** jawad_axd has joined #openstack-ironic  17:22
*** ociuhandu has quit IRC  18:07
*** ociuhandu has joined #openstack-ironic  18:08
*** ociuhandu has quit IRC  18:13
*** ociuhandu has joined #openstack-ironic  18:18
*** mgoddard has quit IRC  18:31
*** ociuhandu has quit IRC  18:32
*** ociuhandu has joined #openstack-ironic  18:33
*** ociuhandu has quit IRC  18:37
*** ociuhandu has joined #openstack-ironic  18:43
NobodyCam: happy holidays to all the Ironic family  18:46
JayF: \o/ same  19:02
*** ociuhandu has quit IRC  19:11
*** sshnaidm has joined #openstack-ironic  19:11
*** ociuhandu has joined #openstack-ironic  19:11
*** sshnaidm is now known as sshnaidm|rover  19:11
*** ociuhandu has quit IRC  19:17
*** ociuhandu has joined #openstack-ironic  19:54
*** ociuhandu has quit IRC  19:58
*** ociuhandu has joined #openstack-ironic  20:14
*** ociuhandu has quit IRC  20:15
*** ociuhandu has joined #openstack-ironic  20:16
*** ociuhandu has quit IRC  20:16
*** ociuhandu has joined #openstack-ironic  20:16
*** ociuhandu has quit IRC  20:28
*** ociuhandu has joined #openstack-ironic  20:42
*** zzzeek has quit IRC  20:53
*** zzzeek has joined #openstack-ironic  20:55
*** ociuhandu has quit IRC  21:00
*** uzumaki has quit IRC  21:05
*** ociuhandu has joined #openstack-ironic  21:18
*** ociuhandu has quit IRC  21:23
*** ociuhandu has joined #openstack-ironic  21:36
*** ociuhandu has quit IRC  21:43
*** ociuhandu has joined #openstack-ironic  21:57
*** markguz_ has joined #openstack-ironic  23:03
markguz_: Hi, I've been scratching my head on a problem for a couple of weeks now where if I spawn a baremetal instance, it can take up to 10 minutes before the node powers on. VMs deploy instantly, but baremetal takes just so long.  23:06
markguz_: I am running Rocky. I was going through upgrading and hit this problem and have been debugging it for what seems like forever. I don't want to continue upgrading with Ironic broken...  23:07
markguz_: I've been checking RabbitMQ and it seems OK. This is a relatively small stack with maybe 25 compute nodes and 50 baremetal nodes, and less than 100 users.  23:08
markguz_: I really don't know how to debug this any further. I'm trying to see what happens in Nova when I make the request to spin up a baremetal server, but I'm kinda lost.  23:09
markguz_: It seems to me that the request just goes into a void for a while, then suddenly re-appears...  23:11
*** ociuhandu has quit IRC  23:17
*** ociuhandu has joined #openstack-ironic  23:17
*** ociuhandu has quit IRC  23:21
ayoung: IPMI?  23:33
ayoung: btw markguz_, it's break time and I doubt many people are around... I happen to be playing a game online and checked IRC...  23:34
markguz_: yeah, it's late in the day here, I'm in the US... just frustrated with this. So... IPMI is working fine. Manually controlling the nodes with IPMI is instant.  23:35
ayoung: And the nodes are all in the ready state, I take it  23:35
markguz_: yup...  23:35
ayoung: available...  23:35
markguz_: yeah  23:36
ayoung: it's strange that it eventually works  23:36
JayF: sounds suspiciously like an overworked conductor  23:36
JayF: or a long-held lock on the node  23:36
ayoung: that says to me that maybe something is timing out... like it is having a scheduler problem  23:36
ayoung: yeah  23:37
markguz_: the conductor is doing nothing  23:37
markguz_: this is not a busy environment  23:37
JayF: I'd poll `openstack baremetal node show $hostname` during the wait time, see when it flips to provision_state: wait call-back and power_state: on  23:37
JayF: that'll also tell you when nova schedules to it  23:37
JayF: that should help you isolate where in the path most of the time is being taken  23:37
JayF: or at least help bisect it :)  23:38
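[Editor's note: JayF's polling suggestion can be sketched as a small shell loop. This is a sketch only; the `summarize_states` helper is hypothetical, and the node name plus an `openstack` CLI with a configured cloud are assumed.]

```shell
# Sketch of the suggested polling loop. `openstack baremetal node show
# -f value -c provision_state -c power_state` prints one value per line;
# summarize_states (hypothetical helper) stamps them onto a single line
# so state flips can later be correlated with service logs.
summarize_states() {
  read -r prov     # first line of input: provision_state
  read -r power    # second line of input: power_state
  printf '%s provision_state=%s power_state=%s\n' \
    "$(date -u +%H:%M:%S)" "$prov" "$power"
}

# Usage against a live cloud (not runnable without one):
#   while true; do
#     openstack baremetal node show "$NODE" -f value \
#       -c provision_state -c power_state | summarize_states
#     sleep 10
#   done
```

Timestamping each sample is the point: the flip from scheduling to `wait call-back` to powered-on tells you which leg of the deploy eats the ten minutes.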
ayoung: is it possible that you have a non-existent something?  23:38
ayoung: Like, something was created and deleted, but the system still thinks it is there, and a request goes to it and dies in committee...  23:38
ayoung: I'm the worst person to try and help, BTW. I really only know Keystone.  23:38
markguz_: it's possible. Not sure what that would be... this was working with Pike... then I went Pike -> Queens -> Rocky and it no longer works  23:39
ayoung: But I make an OK rubber duck  23:39
ayoung: Time to upgrade  23:39
JayF: Heh  23:39
JayF: I mean, I can't ID any specific thing broken based on your description  23:39
JayF: most OpenStack cluster problems present that way :|  23:39
ayoung: Is this an upstream install, or from a distro?  23:39
JayF: the key is to figure out where it's taking the longest, or if it's just slow all the way through  23:39
markguz_: well... eh... no. I stopped the process when this happened. I didn't want to keep going with a broken component  23:40
ayoung: Do all baremetal nodes behave this way, or just one?  23:40
JayF: I suspect ayoung may be right, and you'll discover the slowdown is "upstream" of Ironic, and once Ironic gets involved it goes pretty quick  23:40
ayoung: I am? Even a broken clock is right twice a day, I guess...  23:40
JayF: I mean, 'openstack cluster is slow' -> blame the Nova scheduler  23:41
JayF: that's a tried and true troubleshooting flowchart :D  23:41
markguz_: heh  23:41
JayF: lol  23:41
JayF: again I'll emphasize that we can't know anything without more info, but isolating what's taking a long time will help  23:42
ayoung: What process did you stop? Upgrading?  23:42
ayoung: I think he's looking for help gathering that info  23:42
ayoung: where to look  23:42
JayF: if everything is slow, you're likely looking at RabbitMQ issues, and I know diddly squat about troubleshooting that  23:42
markguz_: ayoung: yeah. In my experience, trying to upgrade something in OpenStack that's already not working just gets more not working.  23:42
JayF: > poll `openstack baremetal node show $hostname` during the wait time, see when it flips to provision_state: wait call-back and power_state: on  23:42
JayF: + reading conductor logs  23:42
ayoung: If it were Rabbit... I would think the message would just get dropped. But we do RPC, which means send and wait...  23:43
ayoung: It could be that the message is getting dropped, but... that does not feel right  23:43
markguz_: hmm... in the logs, just after nova-conductor selects the node it starts a "block device mapping" for the instance, and that's the last reference for the instance uuid  23:44
ayoung: What are you using for storage?  23:44
markguz_: noop  23:44
ayoung: noop?  23:45
markguz_: it just uses the disk on the server. Haven't changed that in forever.  23:45
ayoung: Is it just this one server that is slow? Have you tried different baremetal servers with the same problem?  23:46
markguz_: I have tried many  23:46
markguz_: all behave the same way  23:46
ayoung: What about disk images...  23:46
ayoung: Could it be downloading the image each time?  23:47
ayoung: That sounds like the kind of mistake I would make.  23:47
markguz_: storage_interface | noop  23:47
markguz_: it streams direct from Swift  23:48
ayoung: What is your overall setup?  23:48
JayF: I don't think the storage interface matters at the nova-conductor mapping step  23:48
JayF: but IMBW  23:48
JayF: hmm  23:48
JayF: Are Nova and Ironic running the same version?  23:48
ayoung: oooh  23:49
markguz_: JayF: yep  23:49
markguz_: hmm, it just flipped and started the node  23:50
markguz_: sat from 17:32 until 17:48 CST  23:50
ayoung: Well, if it was my son starting homework, I would say that it finally got to a save point in his game.  23:51
ayoung: I know that is not helpful, but sometimes real life bleeds over.  23:51
JayF: I'd be checking logs for every service between nova-conductor and ironic  23:51
markguz_: seems like a timeout at least  23:52
JayF: like nova-compute and ironic-conductor, as well as any services the conductor log says it talked to  23:52
JayF: who knows? I don't wanna speculate, the logs would have more info  23:52
JayF: if they don't seem to, enable debug and reproduce again  23:52
ayoung: What is the sequence after the message hits the conductor?  23:52
JayF: BTW, I'm going to be here for only ~8 more minutes  23:52
ayoung: I bet there is an error generated somewhere in there  23:52
JayF: Although these troubleshooting steps should get you more info to ask the channel again, regardless of whether I'm here or not :)  23:52
ayoung: Could it be placement?  23:53
ayoung: I would think Placement would just fail outright  23:54
JayF: I don't wanna speculate too hard without real info  23:54
JayF: but yeah, that's what I suspect  23:54
JayF: I think as part of this upgrade, nova splits from placement  23:54
markguz_: it could be anything. But regular VMs are more or less instant  23:54
JayF: and if placement isn't set up, I wouldn't be surprised to see this behavior  23:54
JayF: hmmm  23:54
markguz_: actually that's Stein  23:54
JayF: Ah  23:54
markguz_: Rocky is the last combined one.  23:54
JayF: Yeah, you gotta find something in a log, or figure out where it's slowing down, to get further help  23:54
JayF: we have lots of folks here who work with Queens clouds, so if you get more info it might be easy to nail down  23:55
markguz_: yeah... I'll dig around some more and see. The logs seem to be nova-conductor triggering block-device mapping... then nothing for 15 mins, then nova-compute -> neutron -> ironic  23:56
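[Editor's note: one way to chase a gap like that 15-minute one is to merge every service's log lines for the single instance UUID and read the timestamps around the block-device-mapping step. A sketch follows; `trace_uuid` is a hypothetical helper, and the log paths in the usage comment are assumptions, not from the log.]

```shell
# Sketch: collect every log line mentioning one instance UUID across
# several service logs and sort them, so the wall-clock gap between the
# last nova-conductor line and the first nova-compute/ironic line stands
# out. Relies on each line starting with a sortable timestamp.
trace_uuid() {
  uuid="$1"; shift
  grep -h -- "$uuid" "$@" 2>/dev/null | sort
}

# Usage (hypothetical paths):
#   trace_uuid "$INSTANCE_UUID" \
#     /var/log/nova/nova-conductor.log \
#     /var/log/nova/nova-compute.log \
#     /var/log/ironic/ironic-conductor.log
```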
JayF: I'd reproduce with debug  23:57
markguz_: handy; just talking it out with folks has helped me clear the fog a bit. I'll do some more digging, now I have some info about what's happening.  23:57
markguz_: thanks for your help!  23:57
JayF: rubber ducking is what IRC troubleshooting is best for  23:57
JayF: good luck! quack quack  23:57
markguz_: hehe  23:57
markguz_: quack quack  23:57

Generated by irclog2html.py 2.17.2 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!