But we noticed MASSIVE traffic.
My model was to pump out the data and have intelligent pages decide what to show.
It turns out this is very inefficient.
Not ready for primetime
We have 4 'channels' of information. The software had originally been built on a 'TV station' idea: in TV, everything gets pumped down the line (or over the air), and the TV 'tunes in' to a channel and shows the appropriate content.

In our case, we had a 'screen structure' message that was sent out once a minute, and in between, each element on the screen would send out its own update whenever it was altered.
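To make that concrete, the two kinds of message might look roughly like this. These shapes are illustrative only; the field names are not the app's actual schema, apart from the source name that comes up later in the post.

// Illustrative shapes only, not the real payloads.
var screenStructure = {          // broadcast to every page once a minute
  screen: { elements: ['weather', 'clock', 'news'] }
};
var elementUpdate = {            // sent only when that one element changes
  source: { name: 'weather', state: { temperature: 21 } }
};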
It was the screen structure message that was the problem.
The old way
I was doing something like this on the page:

client.subscribe('/source', function(message) {
  window.setTimeout(function() { process_source_message(message); }, 1);
});
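The client object there is a Faye client created earlier on the page; a minimal sketch, with the server URL and mount point made up for illustration:

// Assumed page-side setup (URL is illustrative): connect to the Faye
// endpoint served by the Node adapter.
var client = new Faye.Client('http://example.com:3001/faye');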
And in Node app.js I was doing this:
app.post('/source_state', function(req, res) {
  bayeux.getClient().publish('/source', req.body);
  res.send(200);
});
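For completeness, here is roughly what the surrounding app.js looks like. This is a sketch assuming Express 4+ and the faye npm package; the port and mount point are illustrative, not from the post.

var http = require('http');
var express = require('express');
var faye = require('faye');

var app = express();
app.use(express.json());   // parse the JSON body that Rails posts

// Attach a Bayeux (Faye) adapter to the same HTTP server as the REST endpoint.
var server = http.createServer(app);
var bayeux = new faye.NodeAdapter({ mount: '/faye', timeout: 45 });
bayeux.attach(server);

// The /source_state handler above publishes through bayeux.getClient().

server.listen(3001);   // port is illustrative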
And in Rails, something like this:
RestClient.post [NODE_CONFIG[Rails.env]['node_url'], NODE_CONFIG[Rails.env]['source_update_path']].join('/'), JSON(json)
So, every page would get every 'source' message.
The new way
I realised that I should have the pages listen to their specific channel. I had got hung up in the past on the idea that I didn't know the name of the source in advance, since it is user defined. But the source JSON already contained this:
json = { "source" => {"name" => source.slug} }
In app.js, I just needed this:

app.post('/source_state', function(req, res) {
  bayeux.getClient().publish('/' + req.body.source.name, req.body);
  res.send(200);
});
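The page then subscribes to its own channel instead of the shared '/source' one. A sketch of that change, where the slug variable is illustrative and would be whatever source the page was rendered for:

// Subscribe only to this page's own source channel.
var sourceSlug = 'my-source';   // illustrative; in practice the page's own slug
client.subscribe('/' + sourceSlug, function(message) {
  window.setTimeout(function() { process_source_message(message); }, 1);
});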