00:55:50  * evanlucas quit (Remote host closed the connection)
00:56:06  * evanlucas joined
03:46:09  <Trott>Wondering if there are stray processes on test-joyent-smartos14-x86-1 that need to be terminated. I'm seeing a bunch of the issue described at https://github.com/nodejs/node/issues/8209 which makes me wonder if some processes are hanging around and taking ports. This only came up today AFAIK.
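For context: the failure mode in nodejs/node#8209 is that a child process left over from an earlier run keeps a fixed test port open, so the next test that tries to bind it dies with EADDRINUSE. A minimal sketch of that symptom, not taken from the CI scripts, with a made-up port number:

    // Minimal sketch (not from the CI scripts): a stray process that still holds a
    // fixed test port makes the next run fail with EADDRINUSE.
    'use strict';
    const net = require('net');
    const port = 12346; // hypothetical; the real tests bind a shared common.PORT

    const server = net.createServer();
    server.once('error', (err) => {
      if (err.code === 'EADDRINUSE') {
        // A lingering process from an earlier run is still bound to the port.
        console.error(`port ${port} already in use; a stale process is probably holding it`);
        process.exit(1);
      }
      throw err;
    });
    server.listen(port, () => {
      console.log(`port ${port} is free`);
      server.close();
    });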
04:09:49  * murac quit (Quit: Leaving)
05:55:44  <Trott>node-nodesource-raspbian-wheezy-pi1p-8-mininodes is not building:
05:55:44  <Trott>https://ci.nodejs.org/job/node-test-binary-arm/3440/RUN_SUBSET=4,label=pi1-raspbian-wheezy/console
05:55:44  <Trott>https://ci.nodejs.org/job/node-test-binary-arm/3440/RUN_SUBSET=3,label=pi1-raspbian-wheezy/console
05:55:44  <Trott>https://ci.nodejs.org/job/node-test-binary-arm/3440/RUN_SUBSET=1,label=pi1-raspbian-wheezy/
05:55:44  <Trott>/cc jbergstroem rvagg
05:57:21  <Trott>test-mininodes-ubuntu1604-arm64_odroid_c2-2 too. https://ci.nodejs.org/job/node-test-commit-arm/4625/nodes=ubuntu1604-arm64/console
07:02:05  <ljharb>if anyone wants to help me test my rewrite of nvm's installation code paths, https://github.com/creationix/nvm/pull/1204 would love your help :-)
08:20:17  <joaocgreis>Trott, rvagg: I can't investigate right now, I've just marked it as offline in jenkins
08:39:36  <rvagg>Something's up with mininodes-8; it had a full disk during the week, which a reboot cleaned up. It might need a rebuild, so best it stays offline for now.
08:40:06  <rvagg>Whenever a reboot fixes a full disk, something's wrong
10:38:46  * thealphanerd quit (Quit: farewell for now)
10:39:17  * thealphanerd joined
11:35:06  <jbergstroem>Trott: did you log in to the smartos machine
11:35:20  <jbergstroem>rvagg: third time now :( probs have to put it to rest
11:36:08  <jbergstroem>Trott: a bunch of node-test-commit-smartos/nodes/smartos14-32/out/Release/node /home/iojs/build/workspace/node-test-commit-smartos/nodes/smartos14-32/test/sequential/test-child-process-pass-fd.js child 12414
11:37:56  <jbergstroem>Trott: no more lingering stuff on any smartos (was one on x64-1)
11:41:43  * node-gh joined
11:41:44  * node-gh part
11:51:02  <rvagg>jbergstroem: yeah, I have backup SD cards for these situations, not the first one to have a card go dodgy... hopefully not more than that!
12:06:57  <jbergstroem>speaking of breaking drives, has anyone experienced ebs volumes dying on you? I've had it happen three times in the last few months
12:08:37  <jbergstroem>ljharb: i'll have a look
12:18:30  <jbergstroem>phillipj: awesome! i'll check your pr soon
12:27:27  <phillipj>yeay
12:57:55  * lance|afk changed nick to lanceball
13:00:22  * node-gh joined
13:00:22  * node-gh part
13:03:20  * node-gh joined
13:03:20  * node-gh part
13:04:40  * node-gh joined
13:04:40  * node-gh part
13:05:37  * node-gh joined
13:05:37  * node-gh part
13:13:17  * node-gh joined
13:13:18  * node-gh part
13:14:06  * node-gh joined
13:14:07  * node-gh part
13:14:27  * node-gh joined
13:14:28  * node-gh part
13:14:37  * node-gh joined
13:14:38  * node-gh part
13:14:57  * Fishrock123 joined
13:15:13  * node-gh joined
13:15:13  * node-gh part
13:15:48  * node-gh joined
13:15:48  * node-gh part
13:16:27  * node-gh joined
13:16:27  * node-gh part
13:32:45  <mhdawson>@joaocgreis or @jbergstroem could one of you check the firewall config to see if the 2 aix machines are still listed? They don't seem to be able to connect to the ci
13:32:49  <mhdawson>They should be
13:33:47  <mhdawson>test-osuosl-aix61-ppc64_be-1/140.211.9.101 and test-osuosl-aix61-ppc64_be-2/140.211.9.100
14:08:55  * node-gh joined
14:08:55  * node-gh part
14:40:04  * node-gh joined
14:40:04  * node-gh part
15:16:44  <jbergstroem>mhdawson: will check now
15:17:28  * ofrobots_ooo changed nick to ofrobots
15:18:56  <jbergstroem>mhdawson: fixed
15:22:12  <mhdawson>thanks
16:08:32  <ofrobots>Folks, I am getting 504 Timeouts with the CI
16:13:22  <jbergstroem>ofrobots: 2sec
16:13:52  <jbergstroem>ofrobots: java was tired and up to no good -- looks fine now
16:14:19  <ofrobots>jbergstroem: thanks!
16:14:49  <jbergstroem>👌🏻
16:17:24  <ofrobots>Trott: is parallel/test-regress-GH-746 supposed to be flaky? It intermittently fails on the V8 waterfall
16:17:29  * node-gh joined
16:17:30  * node-gh part
16:17:49  <ofrobots>Trott: https://build.chromium.org/p/client.v8.fyi/builders/V8%20-%20node.js%20integration%20-%20lkgr?numbuilds=100
16:18:27  <ofrobots>s/fails/times out
16:26:51  * lanceball changed nick to lance|afk
16:38:44  * rmg joined
16:41:53  * rmg quit (Client Quit)
17:34:14  <Trott>ofrobots: I'm unaware of that test having any history of flakiness...
17:34:53  <ofrobots>interesting.. must be flaky only in the V8 test environment
17:35:05  <ofrobots>I'll take a look at the test
17:37:16  <Trott>That test goes way back so it's possible it had some flakiness at some point before I started paying attention (about a year ago).
18:25:02  * lance|afk changed nick to lanceball
19:43:46  * node-gh joined
19:43:46  * node-gh part
19:49:08  * node-gh joined
19:49:08  * node-gh part
19:49:33  * node-gh joined
19:49:33  * node-gh part
19:51:25  * node-gh joined
19:51:25  * node-gh part
19:52:06  * node-gh joined
19:52:06  * node-gh part
20:09:46  <jbergstroem>phillipj: did you delete that comment wrt raw/gather_facts?
20:10:10  * node-gh joined
20:10:10  * node-gh part
20:14:34  <phillipj>jbergstroem: nope, this one? https://github.com/nodejs/build/pull/469#discussion_r75745411
20:15:28  <jbergstroem>phillipj: you say "install aptitude" with an apt: command. i also think you might have to bootstrap debian8
20:15:36  <jbergstroem>sorry, bootstrap python on debian8
20:15:55  <phillipj>Saw your answer tho, I'll try it out tomorrow
20:16:13  <jbergstroem>phillipj: note the two hosts in the same file! https://github.com/nodejs/build/pull/446/files#diff-54a20c9aff06229996d3d614d66b5425R4
20:22:52  * node-gh joined
20:22:52  * node-gh part
20:26:22  <phillipj>Haven't had the need to install python on any of the deb 8 boxes so far
20:30:58  * node-gh joined
20:30:58  * node-gh part
20:33:01  * node-gh joined
20:33:01  * node-gh part
21:05:09  <jbergstroem>hmhm
21:05:13  <jbergstroem>i swear i did
21:09:30  <jbergstroem>it might be that it's shipped differently on different providers
21:09:38  <jbergstroem>i know joyent beefs up their base images substantially
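For context on the exchange above: Ansible modules like apt need Python on the managed host, so a Debian 8 image that ships without it has to be bootstrapped with the raw module, with fact gathering disabled since that needs Python too, before any apt: tasks can run. A minimal sketch of that pattern; the host name and exact commands are illustrative, not copied from the PRs linked above:

    # Hypothetical bootstrap play; host name is illustrative, not from nodejs/build.
    - hosts: test-debian8-x64-1
      gather_facts: no               # fact gathering itself requires Python on the target
      tasks:
        - name: bootstrap Python so regular modules (apt, etc.) can run
          raw: test -x /usr/bin/python || (apt-get update && apt-get -y install python-minimal)

        - name: install aptitude, which some versions of the apt module expect
          raw: apt-get -y install aptitude

Once Python is present, the rest of the playbook can switch back to normal modules and fact gathering.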
21:35:44  * ak52 joined
21:45:49  * lanceball changed nick to lance|afk
21:51:21  * Fishrock123 quit (Quit: Leaving...)
22:01:55  * node-gh joined
22:01:55  * node-gh part
22:03:08  * node-gh joined
22:03:08  * node-gh part
22:19:27  * node-gh joined
22:19:27  * node-gh part
22:22:43  * node-gh joined
22:22:43  * node-gh part
22:25:44  * node-gh joined
22:25:45  * node-gh part
22:31:47  * ak52 quit (Remote host closed the connection)