01:11:56  <jbergstroem>not sure how but the python install on one of the mininodes just looks broken
01:12:00  <jbergstroem>trying to force reinstall
01:14:50  <jbergstroem>didn't help
01:14:57  <jbergstroem>the package must've been broken while downloading
01:14:57  <jbergstroem>strange
01:15:01  <jbergstroem>i'll manually patch
01:16:26  <jbergstroem>fixored
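The exact commands aren't recorded in the log, but on an apt-based mini node (the arm64 Odroid machines mentioned later run Ubuntu 16.04) a forced reinstall plus integrity check could look roughly like the sketch below; the package name and flags are assumptions, not quoted from the log:

    # sketch only -- package name is an assumption
    sudo apt-get update
    sudo apt-get install --reinstall python2.7   # re-fetch and unpack the package
    sudo debsums -s python2.7                    # report files that differ from package checksums (requires debsums)
    python2.7 -V                                 # confirm the interpreter runs again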
01:24:34  <jbergstroem>this is concerning. both FreeBSD VMs are running into compiler hiccups as of late: https://ci.nodejs.org/job/node-test-commit-freebsd/4274/nodes=freebsd10-64/console
01:25:10  <jbergstroem>i couldn't reproduce when i had a look yesterday but i'll try and check again tomorrow
01:31:21  <jbergstroem>emptying ccache
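Clearing the compiler cache is done with ccache's own flags; a minimal sketch of the standard invocations (not copied from the log):

    ccache -s   # show hit/miss statistics before wiping
    ccache -C   # clear the entire cache
    ccache -z   # zero the statistics counters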
02:12:35  <Trott>On Raspberry Pi devices in CI, is the source directory NFS mounted? I just saw a test timeout when all it really was going to do was be skipped. I might be reaching here, but the only thing I can figure is that it has a `require('../common')` and maybe some NFS issue caused that to time out? Because it exits pretty much immediately after that module is required.
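A quick way to answer the NFS question from the Pi itself; the path is illustrative, not confirmed from the log:

    df -TP /home/iojs/build   # the "Type" column shows nfs/nfs4 for NFS-backed paths
    mount | grep nfs          # list any NFS mounts currently active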
02:16:33  <Trott>Also: shout out to whoever did the magic on Windows such that it has been reliable for, like, weeks now or something. It used to be a top source of "oh well, build failure" but it's been (as far as I can tell, anyway) rock solid for weeks. \o/
02:17:13  <Trott>(Trying hard not to be one of those people that only notices when things go wrong. You know, basically everyone on the planet.)
03:32:42  <Trott>joaocgreis rvagg jbergstroem node-stress-single-test-pi1-binary seems to be failing. https://ci.nodejs.org/job/node-stress-single-test-pi1-binary/12/label=pi1-raspbian-wheezy/console `grep: /home/iojs/.ssh/known_hosts: No such file or directory`
03:38:28  <rvagg>Trott: k, I've put a mkdir & touch in there to ensure that file always exists
03:38:45  <Trott>\m/ Thanks!
03:49:47  <rvagg>Trott: I think I've fixed the flaky test propagation too, wasn't being sent in to `make` in node-test-binary-arm but should be now
03:51:35  <Trott>Awesome! We may get back to yellow, at least, soon.
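Node's Makefile forwards a FLAKY_TESTS variable to the test runner, so the fix described above amounts to making sure the binary-arm job passes it through to `make`; a rough sketch of the relevant invocation (the values shown are the usual ones, not quoted from the log):

    make test-ci FLAKY_TESTS=dontcare   # run flaky tests but don't let their failures turn the job red
    # other accepted values are "run" (the default) and "skip"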
04:16:16  <Trott>test-mininodes-ubuntu1604-arm64_odroid_c2-3 is reliably failing to build.
04:16:31  <Trott>https://ci.nodejs.org/job/node-test-commit-arm/5037/nodes=ubuntu1604-arm64/console
04:16:42  <Trott>And click "Previous Build" from there for other examples...
04:18:20  <Trott>test-osuosl-aix61-ppc64_be-2 seems to be reliably failing to build too. mhdawson
04:19:49  <Trott>https://ci.nodejs.org/job/node-test-commit-aix/910/nodes=aix61-ppc64/console
04:20:26  <Trott>https://ci.nodejs.org/job/node-test-commit-aix/900/nodes=aix61-ppc64/console
04:20:28  <Trott>etc
05:03:08  <rvagg>ok then ...
05:03:14  <rvagg>that's kind of .. strange
05:14:19  <Trott>I know the proposal for "what platforms are supported" is coming, but I wanted to mention: It would be great if it included some kind of proposed mechanism for determining when platforms *stop* being supported. Like, I'm trying to figure out how much time I want to invest in really puzzling armv6 issues on the Raspberry Pi 1 devices. Are those going to be with us for years? Or probably not so much? Some way to even make an educated guess at that would be useful (to me).
05:17:29  <rvagg>Trott: specifically re armv6 it'll likely come down to v8 which is considering dropping support soon, which is reasonable, so Node v6 will probably be the last LTS line to support it
05:18:54  <rvagg>Trott: I've updated, cleaned out and poked at test-mininodes-ubuntu1604-arm64_odroid_c2-3 and now rebooted it; without finding anything specific, I can just cross my fingers and hope it was a temporary problem that's fixed with the update & reboot
05:23:19  <rvagg>and since it's not coming back up ... perhaps there's more wrong than that
05:25:08  <Trott>Heh. Sorry.
05:25:51  <Trott>On the support thing: Yeah, I guess I should have figured V8 would be the limiting factor. I guess if ChakraCore ever comes to Node.js, then that gets even more complicated. But I suppose we can cross that bridge if and when we get to it.
05:26:42  <Trott>Thanks. That's helpful.
05:27:52  <rvagg>hmm.. my updates might have killed 2 of them now
05:27:59  <rvagg>that's not fun
06:05:32  <Trott>I may have jinxed the Windows builds with my positivity earlier. https://ci.nodejs.org/job/node-compile-windows/4248/
09:36:13  * not-an-aardvark quit (Quit: Connection closed for inactivity)
10:36:14  * thealphanerd quit (Quit: farewell for now)
10:36:44  * thealphanerd joined
10:54:16  <jbergstroem>rvagg: i fixed issues with them earlier
10:55:10  <jbergstroem>rvagg: but there's definitely something going on with them. i reckon we should reinstall
11:05:39  <joaocgreis>rvagg: good catch re flaky tests, I've added it to osk as well, should be all fixed now
11:33:07  * node-gh joined
11:33:07  * node-gh part
12:41:03  * lance|afk changed nick to lanceball
13:05:47  * chorrell joined
13:31:20  * chorrell quit (Quit: Textual IRC Client: www.textualapp.com)
13:36:36  * evanlucas quit (Remote host closed the connection)
13:47:55  * Fishrock123 joined
13:59:33  * chorrell joined
14:42:28  * chorrell quit (Quit: Textual IRC Client: www.textualapp.com)
15:08:49  * chorrell joined
15:15:12  * jenkins-monitor quit (Remote host closed the connection)
15:15:20  * jenkins-monitor joined
15:18:08  * Fishrock123 quit (Remote host closed the connection)
15:19:14  * Fishrock123 joined
15:20:40  * jenkins-monitor quit (Remote host closed the connection)
15:20:50  * jenkins-monitor joined
15:30:31  * jenkins-monitor quit (Remote host closed the connection)
15:30:49  * jenkins-monitor joined
15:35:30  * jenkins-monitor quit (Remote host closed the connection)
15:35:38  * jenkins-monitor joined
15:36:19  * lanceball changed nick to lance|afk
15:40:19  * jenkins-monitor quit (Remote host closed the connection)
15:40:29  * jenkins-monitor joined
15:45:20  * jenkins-monitor quit (Remote host closed the connection)
15:45:27  * jenkins-monitor joined
15:51:25  * jenkins-monitor quit (Remote host closed the connection)
15:51:33  * jenkins-monitor joined
15:56:59  * not-an-aardvark joined
15:59:24  * chorrell quit (Quit: Textual IRC Client: www.textualapp.com)
16:00:38  * jenkins-monitor quit (Remote host closed the connection)
16:00:45  * jenkins-monitor joined
16:20:12  * chorrell joined
16:20:46  * Fishrock123 quit (Read error: Connection reset by peer)
16:21:16  * Fishrock123 joined
16:55:08  * chorrell quit (Quit: Textual IRC Client: www.textualapp.com)
17:13:40  * node-gh joined
17:13:40  * node-gh part
17:14:04  * lance|afk changed nick to lanceball
17:41:26  * node-gh joined
17:41:27  * node-gh part
19:09:51  * chorrell joined
19:17:48  * lanceball changed nick to lance|afk
19:41:55  * imyller joined
20:01:05  * lance|afk changed nick to lanceball
20:01:57  * chorrell quit (Quit: Textual IRC Client: www.textualapp.com)
20:47:05  * evanlucas joined
21:10:44  * Fishrock123 quit (Remote host closed the connection)
21:11:41  * Fishrock123 joined
21:31:19  * lanceball changed nick to lance|afk
21:51:10  * ljharb quit (*.net *.split)
21:54:53  * ljharb joined
22:00:25  * Fishrock123 quit (Quit: Leaving...)
22:50:04  * jenkins-monitor quit (Excess Flood)
22:50:15  * jenkins-monitor joined
23:21:54  <Trott>test-osuosl-aix61-ppc64_be-2 still seems to be reliably failing. (Started more than 24 hours ago.) Help? mhdawson? Someone else if he's unavailable?
23:22:26  <Trott>Example failure: https://ci.nodejs.org/job/node-test-commit-aix/936/nodes=aix61-ppc64/console
23:22:38  <Trott>Build failures, not test failures, just in case that wasn't clear.
23:38:06  <Trott>On a different note, I don't suppose anyone knows why this is just hanging out and not moving to the next task? https://ci.nodejs.org/job/node-test-commit-arm-fanned/4438/
23:41:27  <jbergstroem>can check
23:42:10  <jbergstroem>Trott: a lot of hanging test failures:
23:42:20  <jbergstroem>this: /home/iojs/build/workspace/node-test-commit-aix/nodes/aix61-ppc64/out/Release/node /home/iojs/build/workspace/node-test-commit-aix/nodes/aix61-ppc64/test/parallel/test-child-process-fork-dgram.js child
23:42:29  <jbergstroem>..and this: /home/iojs/build/workspace/node-test-commit-aix/nodes/aix61-ppc64/out/Release/node /home/iojs/build/workspace/node-test-commit-aix/nodes/aix61-ppc64/test/sequential/test-child-process-pass-fd.js child
23:44:47  <jbergstroem>there seem to be a few 'wait' processes. I wonder if that syntax behaves differently on AIX
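One way to spot and clean up the hung children on the AIX host (standard ps/grep, not taken from the log; <pid> is a placeholder):

    ps -ef | grep 'out/Release/node' | grep -v grep   # list node processes left over from test runs
    kill <pid>                                        # terminate a hung child; kill -9 <pid> if it ignores SIGTERM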
23:46:05  <jbergstroem>Trott: been active for four days :( i'm on the phone for a couple of hours still -- could you perhaps file an issue and ping me/mhdawson?
23:47:58  <jbergstroem>(we should ping gibfahn too -- he had a few tests that didn't exit either)
23:50:46  <Trott>OK, done. https://github.com/nodejs/build/issues/494
23:50:51  * node-gh joined
23:50:51  * node-gh part
23:54:36  <Trott>Looks like all the Raspberry Pi fanned jobs are stalling out at "Starting build job git-nodesource-update-reference."
23:54:42  <Trott>rvagg: ^^^^^
23:55:09  <Trott>Example: https://ci.nodejs.org/job/node-test-commit-arm-fanned/4438/
23:55:18  <Trott>(Go to the console to see the message I pasted above.)
23:56:07  <rvagg>My ISP decided to cancel my account so I got disconnected ... They are trying to undo their mistake, but I'm on a backup link now so the Pis should be back online soon.
23:57:24  <Trott>👍