00:19:10  <vtali>yeah how is it? :)
01:09:43  * erobit joined
01:14:59  * vtali quit (Remote host closed the connection)
01:38:49  * dennisma_ joined
01:38:49  * dennismartensson quit (Read error: Connection reset by peer)
01:43:06  * dennismartensson joined
01:43:06  * dennisma_ quit (Read error: Connection reset by peer)
01:48:28  * vtali joined
01:49:13  * k1i joined
01:50:23  <k1i>owenb: you on?
01:56:58  * vtali quit (Ping timeout: 256 seconds)
02:05:40  * k1i_ joined
02:09:02  * k1i quit (Ping timeout: 255 seconds)
02:09:02  * k1i_ changed nick to k1i
02:23:25  * vtali joined
02:27:38  * vtali quit (Ping timeout: 245 seconds)
03:15:29  * k1i quit (Read error: Connection reset by peer)
03:15:53  * k1i joined
03:50:22  * liorix joined
03:54:30  * vtali joined
03:58:53  * vtali quit (Ping timeout: 245 seconds)
04:00:12  * majek quit (Ping timeout: 276 seconds)
04:00:30  * oal quit (Ping timeout: 256 seconds)
04:00:48  * owenb quit (Ping timeout: 264 seconds)
04:20:53  * ArxPoetica quit (Quit: Leaving.)
04:25:25  * liorix quit (Remote host closed the connection)
04:45:25  * ArxPoetica joined
04:45:35  * ArxPoetica quit (Client Quit)
04:53:20  * ArxPoetica joined
04:55:00  * vtali joined
04:58:30  * ArxPoetica quit (Quit: Leaving.)
05:00:06  * vtali quit (Ping timeout: 264 seconds)
05:10:13  * liorix joined
05:11:50  * lluad quit (Quit: lluad)
05:15:05  * mtsr joined
05:25:58  * vtali joined
05:26:00  * mtsr quit (Ping timeout: 276 seconds)
05:30:33  * vtali quit (Ping timeout: 245 seconds)
05:59:24  * majek joined
06:00:14  * owenb joined
06:00:38  * oal joined
06:26:28  * vtali joined
06:30:33  * vtali quit (Ping timeout: 245 seconds)
06:45:12  * mtsr joined
06:57:03  * vtali joined
07:01:23  * vtali quit (Ping timeout: 245 seconds)
07:11:33  * liorix quit (Remote host closed the connection)
07:12:07  * liorix joined
07:16:21  * liorix quit (Ping timeout: 248 seconds)
07:21:35  * erobit quit (Quit: Leaving.)
07:43:27  * liorix joined
07:51:56  * liorix quit (Ping timeout: 255 seconds)
07:57:28  * vtali joined
08:01:48  * vtali quit (Ping timeout: 245 seconds)
08:43:42  * k1i quit (Quit: k1i)
08:47:59  * dennisma_ joined
08:48:00  * dennismartensson quit (Read error: Connection reset by peer)
08:49:59  * dennismartensson joined
08:49:59  * dennisma_ quit (Read error: Connection reset by peer)
09:29:49  * vtali joined
09:33:53  * vtali quit (Ping timeout: 245 seconds)
10:31:14  * vtali joined
10:35:33  * vtali quit (Ping timeout: 245 seconds)
10:42:15  * dennismartensson quit (Read error: Connection reset by peer)
10:42:48  * dennismartensson joined
10:43:28  * erobit joined
11:15:12  * vtali joined
11:15:57  * liorix joined
11:16:58  * liorix quit (Read error: Connection reset by peer)
11:17:30  * liorix joined
11:19:44  * vtali quit (Ping timeout: 245 seconds)
11:58:54  * liorix quit (Ping timeout: 264 seconds)
12:06:38  * liorix joined
12:21:07  * vtali joined
12:23:03  * vtali quit (Remote host closed the connection)
12:24:18  * vtali joined
12:42:25  * liorix quit (Remote host closed the connection)
12:42:57  * liorix joined
12:47:55  * liorix quit (Ping timeout: 264 seconds)
12:49:04  * ArxPoetica joined
12:51:28  * ArxPoetica1 joined
12:51:28  * ArxPoetica quit (Read error: Connection reset by peer)
13:23:11  * erobit quit (Quit: Leaving.)
13:25:35  <owenb>hey vtali
13:25:41  <owenb>and Arx
13:28:17  <vtali>hi owenb, how was the talk
13:40:01  <owenb>it's on Monday :)
13:45:43  <ArxPoetica1>what talk you giving?
13:46:04  * ArxPoetica1 changed nick to ArxPoetica
13:57:35  <owenb>oh the usual ;) but lots of new stuff to talk about wrt socketstream 0.4
14:10:42  <ArxPoetica>that's awesome
14:10:46  <ArxPoetica>can't wait for beta ha
14:11:02  <ArxPoetica>did you do an official alpha?
14:14:39  <owenb>not yet
14:14:45  <owenb>but very soon now
14:15:23  <owenb>just to give you a heads up, i'll be finishing off the design of the 'realtime server' part of socketstream 0.4 this weekend
14:15:36  <owenb>it will be released as a separate module in time for monday
14:15:49  <owenb>so people who just want that part of SS can use it without any of the other ideas
14:16:12  <owenb>i'll then begin working on finishing off all the client asset building etc
14:16:37  <owenb>when all the component parts are ready, that will be the first alpha of ss 0.4
14:17:21  <owenb>the first realtime framework that will let you get started quickly, but also expand into your own custom tech stack in the future. it's going to be good :)
14:41:01  <vtali>saweet!
14:41:22  <vtali>looking forward to integrating it with angularjs :)
14:59:07  <owenb>me too :) it's coming soon
15:43:21  * ArxPoetica1 joined
15:45:33  * ArxPoetica quit (Ping timeout: 245 seconds)
15:48:01  <ArxPoetica1>awesome
15:48:02  * mtsr quit (Read error: Operation timed out)
15:52:55  * erobit joined
16:06:31  * ArxPoetica1 changed nick to ArxPoetica
16:16:50  * lluad joined
16:57:56  * erobit quit (Ping timeout: 245 seconds)
17:04:00  * k1i joined
17:04:13  <k1i>owenb: you here?
17:32:18  * sonofjack quit (Read error: Connection reset by peer)
17:32:31  * sonofjack joined
17:49:38  * sonofjack quit (Read error: Connection reset by peer)
17:50:59  * sonofjack joined
18:02:18  * zenocon joined
18:04:15  * zenocon quit (Read error: Connection reset by peer)
18:20:59  * k1i quit (Quit: k1i)
18:22:22  * k1i joined
20:51:14  * ArxPoetica quit (Quit: Leaving.)
21:22:28  <owenb>hey k1i
21:22:34  <k1i>dude
21:22:39  <k1i>ok
21:22:40  <k1i>so
21:22:40  <owenb>sorry, i'm on irccloud so i guess it often shows me as here when i'm not
21:22:59  <k1i>time to talk about this whole situation
21:23:05  <k1i>my model proxy library works really really well
21:23:10  <k1i>there are a few problems I need help solving
21:23:18  <owenb>ok
21:23:22  <k1i>but instance methods and class methods properly proxy
21:23:25  <k1i>let me upload this to a repo
21:23:32  <owenb>cool ok
21:23:34  <k1i>I had to modify something in realtime-service to serialize properly
21:23:39  <owenb>sure
21:23:44  <k1i>but, if you download this, I will show you the main idea and what's left
21:23:51  <owenb>cool ok
21:24:17  <k1i>forking now
21:26:42  <k1i>https://github.com/korbin/socketstream-0.4
21:28:11  <k1i>based on your initial recommendation, you can instantiate models like this
21:28:16  <k1i>ss.model.get("book", function(book) {
21:28:17  <k1i> book.find(1, function(err, model) {
21:28:18  <k1i> model.fancyColor();
21:28:19  <k1i> model.changeColor("green", function(err, model) {
21:28:20  <k1i> console.log(model);
21:28:21  <k1i>https://github.com/korbin/socketstream-0.4/blob/master/example_app/client/app/app.js
21:28:57  <k1i>oh shit
21:28:59  <k1i>I didnt git add, sec
21:29:23  <owenb>np
21:29:31  <k1i>https://github.com/korbin/socketstream-0.4/tree/master/mods/rts-model now contains
21:31:20  <k1i>so, it does dynamic code generation to proxy off class/instance methods
21:31:48  <k1i>it proxies so well that it actually creates the argument list on the clientside as well (for console purposes, etc.)
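The dynamic proxy generation k1i is describing might look roughly like this minimal sketch; `buildClientModel`, `fakeSend`, and the method names here are illustrative stand-ins, not the actual rts-model code:

```javascript
// Sketch: generate client-side stubs that forward method calls to the
// server over an RPC-style channel. All names are illustrative.
function buildClientModel(name, remoteMethods, send) {
  function Model(attrs) { Object.assign(this, attrs); }

  remoteMethods.forEach(function (method) {
    // Each remote method becomes a stub that serializes the call and
    // ships it (plus the instance's current data) to the server.
    Model.prototype[method] = function () {
      var args = Array.prototype.slice.call(arguments);
      return send({ model: name, method: method, context: this, args: args });
    };
  });
  return Model;
}

// A fake transport standing in for the realtime service connection.
var sent = [];
function fakeSend(msg) { sent.push(msg); return msg; }

var Book = buildClientModel('book', ['save', 'changeColor'], fakeSend);
var b = new Book({ id: 1, color: 'red' });
b.changeColor('green');
```

The point is that the stub carries the instance's state along with the call, which is what lets the server-side copy of the method run against up-to-date data.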
21:32:42  <k1i>I dont think this DSL is too heavy: https://github.com/korbin/socketstream-0.4/blob/master/example_app/services/model/book.coffee
21:33:42  <k1i>i wanted the actual model proxying framework to be really really clever
21:34:12  <owenb>in your example, what does 'local' mean? server-side?
21:34:34  <k1i>local means executable in a browser context
21:34:39  <k1i>it's relative to the client
21:34:41  <owenb>ah i see
21:34:43  <k1i>client executable functions = local
21:34:46  <k1i>remote = serverside
21:34:53  <k1i>a serverside instance of the model could execute either
21:34:57  <owenb>could be worth changing that to 'client' and 'server' to make it totally clear
21:34:57  <k1i>seamlessly
21:34:59  <k1i>yep
21:35:01  <k1i>will do
21:35:14  <k1i>the reason I didn't, though
21:35:27  <k1i>is because "local" just means it is executable within the browser (very limited scope)
21:35:31  <k1i>the model itself with limited resources
21:35:36  <k1i>not necessarily clientside
21:35:50  <k1i>when we have backend services also consuming models, they will be able to execute local or remote functions, simply because it is the same environment
21:35:53  <k1i>no proxying necessary
21:36:00  <k1i>but yeah, for all intents and purposes, I can change that
21:36:07  <owenb>ok i see
21:36:55  <owenb>i'm amazed at what you've done without any documentation and an api which is changing a lot :)
21:37:15  <k1i>thanks I am surprised the code works myself
21:37:17  <k1i>it is so clever, IMO
21:37:23  <k1i>the model object stays in sync with the serverside
21:37:30  <k1i>and allows you to interact with it like it is a normal JS object
21:38:04  <owenb>this is cool
21:38:11  <k1i>needs to be DRY'ed out
21:38:16  <owenb>i'm going to run it and try it now. give me a few mins
21:38:18  <k1i>and there is a bit of a scoping challenge I am working on
21:38:22  <k1i>basically
21:38:34  <k1i>the problem is
21:38:43  <k1i>if a client is executing a remote (serverside) instance method
21:38:57  <k1i>and you reference "this" - it passes the context of the clientside model to "this"
21:39:07  <k1i>so you cant chain an additional serverside method on
21:39:25  <k1i>so in an instance method, I can't execute this.save() and have it refer to an instance method in scope
21:39:44  <k1i>I am thinking about passing an additional scope somehow (for the serverside model) such as @@ or _this to allow that to happen
21:44:35  <owenb>ah right. i see the problem, but not a solution as yet
21:44:55  <k1i>yeah, you can't call @save within an instance method
21:45:00  <k1i>the solution would be either - pass another scope
21:45:02  <k1i>or my other idea
21:45:14  <k1i>transparently re-filter and re-proxy serverside/clientside methods
21:45:43  <k1i>or provide an alias: "ss.model.send" that can be executed in the server scope
21:45:53  <k1i>which may be a cleaner, easier solution
21:46:21  <owenb>aim for simple and clear over 'magical'
21:46:29  <owenb>i've learned that lesson the hard way :)
21:46:31  <k1i>heh
21:46:32  <k1i>ss.model.__send__
21:46:35  <k1i>needs to exist on the serverside
21:46:46  <k1i>then
21:47:11  <k1i>that would be probably the simplest way, but, I couldn't figure out how to randomly inject that variable into the function calling context
21:47:24  <k1i>models.server[msg.model]::[msg.method].apply msg.context, args
21:47:33  <k1i>since .apply only sets "this"
21:47:43  <k1i>and "this" really needs to be the clientside representation of the model
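The `.apply` behaviour under discussion is easy to demonstrate in isolation; `demo` is just an illustrative function:

```javascript
// .apply(context, args) sets `this` inside the call, and nothing else.
function demo() { return this.color; }
var result = demo.apply({ color: 'green' }, []);
// `result` is now taken from the applied context -- but any free
// identifier demo() references still resolves lexically, so an
// ss.model.__send__ that only exists in some other scope stays out of
// reach no matter what context object is passed to .apply.
```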
21:48:18  <owenb>that's why now.js had the magical this.now object which would (supposedly) always be kept in sync between server and client
21:48:25  <owenb>unfortunately it never worked properly
21:48:30  <k1i>well
21:48:31  <k1i>this works really well
21:48:33  <k1i>with that one caveat.
21:48:43  <k1i>now.js always had problems proxying instance methods
21:48:46  <k1i>and I've cracked that
21:48:52  <owenb>:)
21:49:03  <k1i>if this issue can be solved, this will be perfect
21:49:16  <k1i>either that, or ship it with the warning "can't call instance methods from other instance methods"
21:49:18  <k1i>which kind of sucks
21:49:26  <owenb>thing is though, often the client will have a tonne of code in the instance model that the server shouldn't have to know anything about
21:49:29  <owenb>e.g. dom stuff
21:49:34  <k1i>but, if we were able to alias ss.model.__send__ to the real method, it'd be game over
21:49:43  * vtaliquit (Ping timeout: 245 seconds)
21:49:57  <k1i>example?
21:50:54  <owenb>well i was thinking display logic
21:51:35  <owenb>or are you wanting to keep this 'purely data' ?
21:52:01  <k1i>I was thinking it was supposed to be purely data
21:52:05  <k1i>and then display logic would get added onto another module
21:52:09  <k1i>such as rts-angular
21:52:15  <k1i>rts-angular would consume rts-model's models
21:52:17  <owenb>ah i see
21:52:20  <owenb>i'm all for that approach
21:52:34  <k1i>this is simply a JS object syncing system
21:52:45  <k1i>which will have pubsub shortly to handle live-model-updates (easy)
21:52:54  <k1i>this is, if I must say, a better approach IMO than minimongo
21:53:03  <k1i>because it will be trivially easy to add mongoose, squel, or whatever datastore you want onto it
21:53:09  <owenb>exactly!
21:53:11  <owenb>that's so important
21:53:22  <k1i>the "rtm-mongoose" module will simply append its own methods into the model
21:53:33  <k1i>and the best part
21:53:49  <k1i>the rtm-mongoose module can have its own set of requirements for exports within book(the model).js
21:53:59  <k1i>exports.schema, could be one of those, if you are using rtm-mongoose
21:54:37  <k1i>I think it feels very node-like.
21:54:49  <k1i>callbacks are sent and received/proxied cleanly
21:55:23  <k1i>ss.model.__send__ just needs to be accessible on the serverside and this thing will be a great POC
21:55:42  <k1i>the DSL in model.js is fairly loose, I don't think it is too restrictive - it is more there to determine where a function is capable of being executed, to save network overhead.
21:56:15  <k1i>you certainly could execute everything on the server, but, I figure, if it just adds a tiny bit of DSL complexity and a ton of potential performance, that's an acceptable trade-off in this case
21:56:58  <owenb>i agree
21:57:27  <k1i>I really thought the .find function would be so cool when implemented with something like mongoose
21:57:51  <owenb>my main concern is that a lot of client-side functionality is already implemented in backbone models or in angular directly. i'm not sure how people using these frameworks would benefit from having yet another place to put their client-side model logic
21:58:06  <k1i>backbone models are specifically designed to use REST
21:58:13  <k1i>angular doesn't have any reliance on $resource
21:58:17  <k1i>this will simply be $model
21:58:29  <k1i>$model will need to be written (rts-angular)
21:58:32  <owenb>have you seen firebase-angular ?
21:58:36  <owenb>totally
21:58:38  <k1i>yes
21:58:51  <owenb>i did rts-angular without seeing firebase
21:58:59  <owenb>it came out 2 days after hehe
21:59:09  <k1i>rts-model is essentially firebase
21:59:13  <owenb>now i've seen that i def want to re-write the whole thing
21:59:19  <owenb>that's good
21:59:31  <k1i>rts-angular is going to be like
21:59:33  <k1i>the angularfire module
22:00:02  <k1i>I like Backbone, and have used it extensively (more than angular), but it really isn't easy to use outside of a RESTful environment, which is exactly what we are doing
22:00:13  <k1i>angular is pretty model-agnostic
22:00:19  <k1i>and likes javascript objects served as factories
22:01:16  <k1i>I think angular <-> (rts-model) <-> socketstream will blow the socks off of meteor's minimongo implementation
22:01:18  <k1i>and handlebars
22:01:49  <owenb>btw as an aside, i've checked out your repo and trying to run the example app. i'm having loads of npm errors saying the 'colors' module is missing... which it is as i've not added it as a dep to rts-pubsub etc. thinking i should really take this out and say that you shouldn't log colors to server.log() at all but instead you'll be able to pass params which
22:01:49  <owenb>the realtime service logger will use to render colors in the terminal
22:02:14  <k1i>ok
22:02:14  <owenb>if it all works reliably and scales amazingly well, it certainly will :)
22:02:18  <k1i>well
22:02:25  <k1i>scaling with connect-redis-realtime should be pretty simple
22:02:28  <k1i>I see no potential scaling issues with this
22:02:56  <k1i>at least at first glance
22:03:24  <owenb>well for session storage you'll be fine - we just tell people to use connect redis session store, which we're going to anyway
22:03:42  <owenb>for pubsub, the realtime-models module will need to take a pub-sub transport
22:03:46  <k1i>yep
22:04:00  <owenb>this should be anything - activemq, rabbitmq, redis
22:04:14  <k1i>i was thinking you specify in the app config file
22:04:16  <k1i>rts-pubsub
22:04:22  <k1i>the already-instanced rts-pubsub, that is
22:04:27  <k1i>that you could also use for the rest of your app
22:04:32  <k1i>so rts-model consumes rts-pubsub
22:04:49  <owenb>no
22:04:52  <owenb>i really don't want this
22:05:00  <owenb>rts pubsub is crappy
22:05:01  <k1i>the same pubsub bus?
22:05:07  <owenb>and will go away soon
22:05:11  <owenb>it's only there to be compatible with 0.3
22:05:12  <k1i> var pubsub = require('rts-pubsub')();
22:05:13  <k1i> app.server.service('pubsub', pubsub);
22:05:14  <k1i> app.server.service('model', require('rts-model')({pubsub: pubsub}));
22:05:14  <owenb>i can do much better
22:05:22  <k1i>im not specifically stating that module in particular
22:05:22  <owenb>we need this instead: https://github.com/socketstream/socketstream/blob/master/src/publish/transports/redis.coffee
22:05:23  <k1i>i am saying
22:05:34  <k1i>a proxied version of the same pubsub transport that is being used by the rest of the app
22:05:38  <owenb>this is just a thin wrapper of send and listen
22:05:42  <owenb>you can use anything
22:05:45  <owenb>redis, activemq
22:05:48  <owenb>whatever
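The "thin wrapper of send and listen" owenb links to could be sketched like this; the in-memory transport below is purely illustrative of the interface shape, behind which Redis, ActiveMQ, ZeroMQ and so on would slot in:

```javascript
// Sketch of a minimal pub/sub transport interface: just send and listen,
// so any broker can implement it. In-memory version for illustration.
function memoryTransport() {
  var listeners = [];
  return {
    // Deliver a message to every registered listener.
    send: function (msg) { listeners.forEach(function (cb) { cb(msg); }); },
    // Register a callback for incoming messages.
    listen: function (cb) { listeners.push(cb); }
  };
}

var bus = memoryTransport();
var received = [];
bus.listen(function (msg) { received.push(msg); });
bus.send({ channel: 'models', event: 'update' });
```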
22:05:56  <k1i>yeah
22:05:57  <owenb>hmmmm
22:05:59  <owenb>so hang on
22:06:09  <owenb>ignore rpc-pubsub
22:06:22  <owenb>we could have one way of doing pubsub for all realtime services
22:06:26  <k1i>yea
22:06:29  <k1i>i think that is highly necessary.
22:06:32  <owenb>you pass it in when you create the server
22:06:35  <k1i>so you arent doubly-configuring the same service
22:06:37  <k1i>yep
22:06:40  <owenb>just like you do with the session store
22:06:41  <owenb>BUT
22:06:47  <k1i>and that is something that needs to be done at a low level like that
22:06:50  <k1i>also
22:06:54  <k1i>if you do it when you configure the session store
22:07:08  <k1i>you can send the same redis engine/config to the pubsub variable or whatever
22:07:11  <owenb>it would mean there is no ability to use ActiveMQ for RT Models and Redis for something else
22:07:21  <k1i>sure it would
22:07:27  <k1i>you have two variables
22:07:37  <k1i>sessionStore = require('connect-redis-realtime')
22:07:50  <k1i>pubsub = require('zmq');
22:07:51  <k1i>or
22:07:58  <k1i>pubsub = redisStore.redis;
22:08:04  <k1i>if you wanted to use the same one
22:08:10  <owenb>hmmm
22:08:20  <owenb>actually i can think of a way
22:08:29  <k1i>but this is something that is so low level that it needs to be provided to realtime services before server start.
22:08:42  <k1i>ZMQ is not something that you could just configure (or want to) at runtime
22:08:50  <k1i>things like multicast would make that tricky
22:09:11  <owenb>when you do app.service() the first param is the name, the second is the realtime service object, but the third is for overriding the behaviour - e.g. do not send me any client-side assets. you could pass your own pub/sub transport here JUST for that service. otherwise it would use the default
22:09:20  <k1i>yes
22:09:29  <k1i>and the default should be set-up before the realtime server starts
22:09:30  <owenb>ok
22:09:36  <owenb>agreed
22:09:40  <k1i>same place as the session store
22:09:45  <k1i>also
22:09:47  <owenb>one session store, one pubsub transport
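The configuration shape being agreed on here - one default pub/sub transport set before the server starts, with an optional per-service override in the third argument to app.service() - could be sketched as follows; `createServer`, `transportFor`, and the transport objects are illustrative names, not the real API:

```javascript
// Sketch: a server-wide default transport, overridable per service via
// the options argument to service(). All names illustrative.
function createServer(options) {
  var services = {};
  return {
    service: function (name, svc, opts) {
      opts = opts || {};
      services[name] = {
        svc: svc,
        // a per-service transport wins over the server-wide default
        transport: opts.pubsub || options.pubsub
      };
    },
    transportFor: function (name) { return services[name].transport; }
  };
}

var defaultBus = { kind: 'redis' };
var customBus = { kind: 'activemq' };
var app = createServer({ pubsub: defaultBus });
app.service('pubsub', {});                        // uses the default
app.service('model', {}, { pubsub: customBus });  // per-service override
```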
22:09:51  <k1i>we need to implement bson
22:09:57  <k1i>services.use.bson
22:10:13  <k1i>not exactly a priority, but, would be nice
22:10:19  <owenb>that means a massive C dependency. i can barely compile that for mongo
22:10:25  <owenb>normally it craps out
22:10:33  <owenb>i'd rather not do that.
22:10:33  <k1i>they have a browser-capable version
22:10:36  <owenb>but we do need to support binary
22:10:39  <k1i>works really well man
22:10:41  <k1i>https://github.com/mongodb/js-bson/tree/master/browser_build
22:10:56  <owenb>(binary websocket data i was thinking)
22:11:06  <owenb>for transports that support it
22:11:10  <owenb>but that's in the future
22:11:21  <k1i>yea
22:11:30  <k1i>did you see my serialization I had to add to the realtime-service?
22:11:37  <k1i>to pull off the realtime model integration
22:11:46  <owenb>no can you show me the line number?
22:11:50  <k1i>in order to serialize whole objects, some decorator data had to be added to the reply/send funcs
22:11:51  <k1i>sec
22:12:00  <k1i>it's at the very top of both of the files
22:12:03  <k1i>i crudely threw it in there
22:12:16  <owenb>i see
22:12:20  <k1i>https://github.com/korbin/socketstream-0.4/blob/master/mods/realtime-service/server.js
22:12:27  <k1i>deflateModel / inflateModel
22:12:47  <k1i>adds decorator data to allow JSON to serialize complicated objects/responses
22:12:59  <k1i>namely, function objects (models)
22:13:13  <owenb>i see
22:13:54  <owenb>i'm in no rush to put BSON it into realtime services. i'm really happy with how light and simple everything is at the moment, but i'm totally cool with you using it in realtime-models. it makes sense there
22:13:59  <k1i>well
22:14:02  <k1i>I don't need BSON
22:14:04  <k1i>JSON works just fine
22:14:09  <k1i>but it would save a bit of serialization time
22:14:15  <k1i>so when performance matters (1.0), we should have it
22:14:21  <owenb>you mentioned you had to modify something in realtime services. what was that?
22:14:25  <k1i>just linked it
22:14:27  <k1i>https://github.com/korbin/socketstream-0.4/blob/master/mods/realtime-service/server.js
22:14:31  <k1i>the JSON decorator
22:14:37  <k1i>k1i: it's at the very top of both of the files
22:14:37  <k1i>[5:12pm] k1i: i crudely threw it in there
22:14:39  <k1i>[5:12pm] owenb: i see
22:14:40  <k1i>[5:12pm] k1i: https://github.com/korbin/socketstream-0.4/blob/master/mods/realtime-service/server.js
22:14:41  <k1i>[5:12pm] k1i: deflateModel / inflateModel
22:14:42  <k1i>[5:12pm] k1i: adds decorator data to allow JSON to serialize complicated objects/responses
22:15:14  <k1i>it's just a bit of code that allows a javascript function object to be JSON.stringified
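The deflate/inflate idea - tagging which properties were functions so the receiving side can rebuild stubs after JSON.parse - might look roughly like this; `deflateModel`, `inflateModel`, and `makeStub` here are an illustrative reconstruction of the approach, not the actual code in the repo:

```javascript
// Sketch: strip functions before JSON.stringify, recording their names
// as decorator data, then rebuild them as stubs on the other side.
function deflateModel(obj) {
  var data = {}, methods = [];
  Object.keys(obj).forEach(function (k) {
    if (typeof obj[k] === 'function') methods.push(k);
    else data[k] = obj[k];
  });
  return JSON.stringify({ data: data, methods: methods });
}

function inflateModel(json, makeStub) {
  var parsed = JSON.parse(json);
  var obj = parsed.data;
  // Re-attach each recorded method as a proxy stub.
  parsed.methods.forEach(function (m) { obj[m] = makeStub(m); });
  return obj;
}

var wire = deflateModel({ id: 1, save: function () {} });
var model = inflateModel(wire, function (m) {
  return function () { return 'proxied:' + m; };
});
```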
22:15:52  <owenb>best do that in your own module
22:15:54  <k1i>well
22:15:55  <k1i>the problem is
22:16:02  <owenb>just don't use the json feature of realtime services
22:16:13  <k1i>it has to happen before and after every client.send and reply()
22:16:25  <owenb>it's only there for convenience.
22:16:42  <owenb>you can still do that in your lib
22:16:54  <k1i> server.onmessage = (msg, meta, reply) ->
22:16:55  <k1i> when "RPC_CLASS"
22:16:56  <k1i> msg.args.cb = ->
22:16:56  <k1i> reply arguments
22:16:58  <owenb>just wrap it
22:17:00  <k1i>ok
22:17:04  <k1i>I forgot why I had to do it there
22:17:08  <k1i>I feel like there was some fringe case
22:17:37  <owenb>i need to be really ruthless about not including things in services. need to keep everything as fast and 'close to the wire' as possible
22:17:42  <owenb>let me know if there is
22:17:42  <k1i>yeah
22:17:50  <k1i>i forgot what it was, but there was a reason
22:17:52  <k1i>I tried to stay out of there
22:17:57  <owenb>sure
22:18:06  <owenb>when you remember what it is let me know
22:18:10  <owenb>we'll find a fix
22:18:32  <owenb>so let me tell you some news
22:18:56  <owenb>i've decided to call 'socketstream-server' 'prism'
22:19:08  <owenb>i.e. the pure realtime server will be called Prism
22:19:19  * k1i_ joined
22:19:20  <owenb>think of white light being split into components
22:19:25  <owenb>(services)
22:19:33  <k1i_>whoops
22:19:34  <k1i_>internet died
22:19:36  <k1i_>owenb: sure
22:19:36  <owenb>np
22:19:37  <k1i_>[5:18pm] k1i: oh yeah.
22:19:39  <k1i_>[5:18pm] k1i: because on the callbacks, there is no way to re-inflate the object
22:19:40  <k1i_>[5:18pm] k1i: I can obviously throw reply deflateModel(arguments) there
22:19:41  <k1i_>[5:18pm] k1i: but then on the clientside
22:19:58  <owenb>ah i see
22:20:16  <k1i_>but yeah, i would have rather not modified realtime-service
22:20:22  <owenb>can you just not use: json
22:20:24  <owenb>and stick to strings
22:20:29  <owenb>then you can do whatever you like
22:20:34  <owenb>or am i missing the point?
22:20:44  <k1i_>I think I can
22:20:56  <k1i_>the inflatemodel/deflatemodel was added before I refactored some of this code
22:21:12  <k1i_>originally I was generating/compiling/regenerating model objects on both the serverside and clientside
22:21:14  <k1i_>and then it got to be too much
22:21:20  <owenb>(when you dropped out I was saying that the socketstream-server module is going to be renamed prism and launched as its own project/module on Monday when i give my talk)
22:21:25  <k1i_>ok
22:21:28  <k1i_>give a talk where
22:21:33  <owenb>Realtime Conf EU
22:22:01  <k1i_>where is this at
22:22:15  <owenb>lyon
22:22:17  <owenb>where are you based?
22:22:21  <k1i_>USA
22:22:32  * k1i quit (Ping timeout: 252 seconds)
22:22:33  * k1i_ changed nick to k1i
22:22:43  <owenb>cool ok. whereabouts? i'm in VA in two weeks
22:22:47  <k1i>south dakota, heh.
22:22:53  <k1i>wild west
22:22:57  <owenb>http://lanyrd.com/2013/realtime-conf-europe/schedule/
22:22:59  <k1i>I am normally in NYC
22:23:00  <owenb>cool ok
22:23:06  <owenb>there may be live streaming
22:23:06  <k1i>for a good portion of the year
22:23:11  <k1i>i will look out for it
22:23:13  <owenb>i'm speaking 4th or 5th i think
22:23:17  <owenb>cool
22:23:18  <k1i>why is ss-server being separated?
22:23:34  <owenb>many people in Node know of socketstream
22:23:46  <owenb>but they think it's a framework, which it is
22:24:00  <owenb>Prism is going to be a standalone realtime server
22:24:01  <owenb>nothing else
22:24:09  <k1i>with the ability to add realtime models?
22:24:11  <k1i>err
22:24:12  <k1i>services rather
22:24:15  <k1i>such as rts-model et al
22:24:15  <owenb>yup
22:24:17  <owenb>everything
22:24:20  <owenb>and transports
22:24:26  <owenb>socketstream 0.4 will use prism
22:25:06  <owenb>socketstream 0.4 will become an even more opinionated high-productivity framework
22:25:10  <k1i>good
22:25:13  <k1i>how do you suggest I make ss.model.__send__ available in the serverside scope when executing .apply?
22:25:38  <owenb>BUT unlike meteor, if you want to change your build system in the future, or move from mongo to something else, you'll totally be able to do that
22:25:41  <k1i>ss.model.__send__ just needs to be an alias for every method
22:25:45  <k1i>meteor pisses me off
22:25:56  <owenb>and realtime transports and services can be used in any framework etc in the future
22:26:03  <owenb>so there is zero vendor lock in
22:26:10  <k1i>meteor is the definition of vendor lock-in
22:26:15  <k1i>the packages system, holy shit
22:26:15  <owenb>exactly
22:26:20  <k1i>makes my blood boil
22:26:40  <owenb>realtime services are just objects with no knowledge of the realtime server (prism) or framework (socketstream)
22:26:49  <owenb>it's a much better design i think
22:27:09  <k1i>yeah
22:27:35  <k1i>the thing that i think turns a lot of people off about SS
22:27:44  <k1i>is the fact that it isnt turnkey
22:28:15  <owenb>exactly
22:28:32  <owenb>so by breaking things into tiny modules, we're going to get the best of both
22:28:34  <k1i>SS, without models and database opinions, cant replace meteor
22:28:36  <k1i>yeah
22:28:39  <owenb>you can start off with a highly opinionated framework
22:28:44  <owenb>get up and running instantly
22:28:51  <k1i>heh, try and remove minimongo from meteor.
22:28:53  <owenb>i'll pick the best of everything etc
22:29:08  <owenb>and show tutorials examples screencasts
22:29:17  <owenb>BUT at no stage are you locked in
22:29:27  <k1i>yep
22:29:29  <k1i>for instance
22:29:40  <k1i>I believe angular to be the best frontend framework for realtime apps
22:29:46  <k1i>JS and ember are too locked into REST, IMO
22:29:51  <owenb>e.g. you will easily be able to switch out the socketstream build process with Grunt
22:29:52  <k1i>that should be a modular choice though
22:30:02  <owenb>yup
22:30:42  <owenb>so i'm going to go in a sec and work on this some more. so much to do before monday
22:30:44  <k1i>I was honestly really relieved when this proxying thing started working
22:30:53  <owenb>but i'm going to explore the idea of a global pubsub
22:30:55  <owenb>like https://github.com/socketstream/socketstream/blob/master/src/publish/transports/redis.coffee
22:30:59  <k1i>im going to push a refactored version of this inflatemodel/deflatemodel to my repo
22:31:03  <owenb>cool
22:31:04  <k1i>so it isnt in realtime-service
22:31:08  <owenb>look out for changes over the weekend
22:31:14  <k1i>do you have any thoughts on passing ss.model.__send__ to a serverside function
22:31:16  <owenb>everything should be good to go by monday
22:31:16  <k1i>called with .apply?
22:31:35  <k1i>i cant think of a good way to declare that variable in-scope, since, .apply doesn't execute stuff in-scope
22:31:39  <owenb>not off hand, sorry. the problem you'll have is syncing the instance data
22:31:51  <k1i>it's already synced
22:31:54  <owenb>yeah?
22:31:55  <k1i>already syncs, rather
22:32:07  <owenb>then can't you just .bind() to the function?
22:32:14  <k1i>if you look
22:32:22  <k1i> models.server[msg.model]::[msg.method].apply msg.context, args is how an instance method is executed
22:32:41  <k1i>msg.context, though, is the clientside representation of the model object
22:32:49  <owenb>ok
22:32:52  <k1i>in the clientside representation
22:32:59  <k1i>all instance method calls are proxied to ss.model.__send__
22:33:03  <k1i>which doesnt exist on the serverside
22:33:17  <k1i>I want to make it exist, and then apply the function
22:33:34  <k1i>.apply doesn't execute in scope
22:34:26  <k1i>if I can make ss.model.__send__ exist when .applying the function like that, this will work flawlessly
22:34:28  <owenb>sorry i'm dense but i thought the first argument to .apply() was the instance of this
22:34:31  <owenb>e.g. the scope
22:35:11  <k1i>yes, but I want the scope to remain the clientside copy of the model
22:35:22  <k1i>the clientside copy of the model thinks of instance methods as ss.model.send calls which doesnt exist on the server
22:35:27  <k1i>I just want to alias it, so there is no code-regeneration
22:35:37  <owenb>but you said you have local methods
22:35:50  <owenb>these could change the value of properties of the instance on the client
22:35:55  <k1i>yep
22:35:57  <owenb>there is no way for the server to know what these are
22:36:00  <k1i>yes
22:36:07  <k1i>you would then execute a .save()
22:36:14  <k1i>which is a remote instance method
22:36:15  <k1i>or whatever
22:36:22  <owenb>ok
22:36:33  <owenb>so it just saves all the properties to the server?
22:36:34  <k1i>the local instance methods aren't designed to do persistent data manipulation, they only execute the object in memory, anything persistent needs to execute on the server naturally
22:36:41  <k1i>it sends the whole model object over the wire.
22:36:43  <owenb>ok
22:36:44  <k1i>and would save it
22:36:51  <k1i>the code works just fine entirely
22:36:55  <owenb>and i guess in the future you can just send patches
22:36:59  <k1i>yep
22:37:00  <owenb>ok
22:37:04  <k1i>OT will be implementable
22:37:07  <k1i>the problem I am having
22:37:07  <owenb>yes
22:37:17  <k1i>is i want to be able to this.save from another remote instance method
22:37:29  <k1i>or this.anyremoteinstancemethod
22:37:30  <owenb>on the server?
22:37:32  <k1i>yes
22:37:41  <owenb>ok
22:37:44  <k1i>however, since you are working with a clientside copy of the model
22:37:54  <k1i>ss.model.send doesnt exist
22:38:19  <k1i>I want to make that exist seamlessly on the server so you can access "remote" (which arent really remote anymore) instance/class methods
22:38:34  <owenb>yes i see now
22:38:44  <owenb>so you can do this inside RPC code for example
22:38:50  <owenb>or your own node scripts
22:39:15  <owenb>so i guess you need an object prototype for the server
22:39:22  <owenb>which has .save() and stuff
22:39:23  <k1i>yeah, it makes one
22:39:31  <k1i>its just that I want it to call it seamlessly
22:39:35  <k1i>models.server[msg.model]::[msg.method].apply msg.context, args
22:39:45  <k1i>it executes the serverside copy of the model's initial function
22:39:56  <k1i>however, if you add a this.save() into that function, since, the context is the clientside version
22:40:07  <k1i>it wont be able to execute that "over the wire"
22:40:23  <owenb>hmm
22:40:24  <k1i>since its already in the serverside
22:40:42  <owenb>i get it
22:40:48  <k1i>the way I want to achieve that, is to create ss.models.send on the serverside, and have it really just execute the serverside function directly
22:40:56  <owenb>yes
22:41:13  <k1i>but .apply doesn't let you call functions in scope
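One workaround for the scope problem k1i describes - `.apply` can set `this` but can't introduce new identifiers - is to graft a server-side `__send__` onto the context object for the duration of the call, so chained methods resolve locally instead of going back over the wire. This is a hypothetical sketch, not the solution the project ended up with; `runServerSide`, `serverMethods`, and the model methods are illustrative:

```javascript
// Sketch: temporarily attach __send__ to the client-shaped context so
// `this.__send__('save')` dispatches to server-side methods directly.
function runServerSide(context, fn, args, serverMethods) {
  context.__send__ = function (method) {
    var rest = Array.prototype.slice.call(arguments, 1);
    return serverMethods[method].apply(context, rest);
  };
  try {
    return fn.apply(context, args);
  } finally {
    delete context.__send__; // don't leak it back onto the client copy
  }
}

var calls = [];
var serverMethods = {
  save: function () { calls.push('save:' + this.id); return true; }
};
// A remote instance method that chains another server-side method.
function changeColor(color) {
  this.color = color;
  return this.__send__('save');
}
var ctx = { id: 7, color: 'red' };
var ok = runServerSide(ctx, changeColor, ['green'], serverMethods);
```

The try/finally matters: the helper is removed again even if the method throws, so the context stays a plain data object when it is serialized back to the client.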
22:41:41  <k1i>heh, maybe I just do more code generation on the fly ;)
22:41:49  <k1i>I try to avoid it unless I can make it really clean
22:41:54  <owenb>yeah
22:41:58  <owenb>think about the performance
22:42:07  <k1i>well all of the code is generated before-hand
22:42:11  <k1i>at server start
22:42:23  <k1i>there is negligible performance hit on that
22:42:36  <k1i>I will work on this
22:42:40  <k1i>let me know what changes you are making
22:43:43  <owenb>there will be plenty over the next few days as i want to get everything ready for monday (just the realtime server). i have made some changes to the asset serving but i'm nowhere near happy with those yet
22:43:48  <owenb>i'll push what i can soon
22:44:06  <k1i>also
22:44:08  <k1i>one thing I saw in rts-angular
22:44:12  <owenb>main thing i need to do is make prism its own npm module/repo
22:44:15  <k1i>how would you add angular.js to the asset pipeline
22:44:31  <owenb>so right now you do it with browserAssets
22:44:38  <owenb>but it's the weakest part of realtime services
22:44:42  <owenb>today i looked at Bower
22:44:46  <owenb>i'm thinking about using that
22:44:48  <owenb>but not decided yet
22:44:55  <owenb>on my list to explore this weekend
22:44:55  <k1i>it is hard to design this cleanly
22:44:58  <k1i>i mean
22:44:58  <owenb>yes
22:45:07  <k1i>do you like the idea of rts-angular packaging/vendoring a version of angular.js?
22:45:10  <k1i>a la rails gems
22:45:28  <owenb>i like the idea of saying my module is designed to work with a specific version
22:45:30  <owenb>and it's been tested
22:45:41  <owenb>but the app dev should be able to override that
22:45:44  <owenb>i'm really not sure yet
22:45:51  <owenb>lots to think about here
22:45:58  <k1i>also
22:45:59  <owenb>this area is likely to change a lot over the next few days
22:46:04  <k1i>you said browserify doesnt allow you to send stuff over the wire?
22:46:08  <owenb>no
22:46:28  <k1i>er
22:46:33  <k1i>what was the limitation of browserify?
22:47:04  <owenb>it's a design decision. results in less code
22:47:05  <owenb>which is good
22:47:09  <owenb>and less to go wrong
22:47:22  <owenb>client-side code for realtime services will defintily use common js
22:47:24  <owenb>that's for sure
22:47:52  <owenb>but it will be left up to the realtime server and framework to use whatever module system they want
22:48:01  <owenb>which should be fine
22:48:03  <k1i>what will ss use?
22:48:04  <owenb>right must go
22:48:08  <owenb>browserify i think
22:48:14  <owenb>but yet to decide totally
22:48:24  <k1i>gotcha
22:48:26  <owenb>common js for 100% sure
22:48:28  <k1i>let me know next time you are on
22:48:29  <owenb>not require js
22:48:35  <owenb>when i know you'll know :)
22:48:39  <owenb>lots still to figure out
22:48:50  <k1i>yep
22:48:59  <k1i>I hope this model system can make it in and work with the new api
22:49:13  <owenb>bye for now
22:49:30  <k1i>later
22:50:02  <owenb>don't worry - the stuff you've done will work fine. that area of things won't change much. it's the browser asset serving stuff that's not sorted yet
22:50:04  <owenb>right. going!
22:50:05  <owenb>:)