Code is *hard* - Grin with cat attached — LiveJournal
Sep. 17th, 2003 09:45 pm
Just realised the caching strategy for the lj-userinfo XMLRPC interface is going to be a bugger; essentially "known data" will be a (probably) sparsely-populated matrix, with elements expiring (fairly) independently.
Since I can't just fetch exactly the user/value sets I'm missing (if I want to wrap multiple data fetches up in one call) I need to construct a minimal square superset of data... probably by building a 'need' matrix, matching against a 'gotcached' matrix, and marking whole rows/columns as 'dirty' - up to a maximum fetch size.
Problem is, this is susceptible to a simple pathological case where one user is missing all data, and each user is missing one piece of data: whole row dirty = all cols dirty; whole col dirty = all rows dirty.
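A minimal sketch of the row/column scheme and its blow-up, assuming a simple "fetch the smallest rectangle covering every missing cell" rule (field names are made up, not the real lj-userinfo keys, and the maximum-fetch-size cap isn't modelled):

```python
def rectangular_superset(need):
    """Smallest user-set x field-set whose cross product covers `need`.

    `need` is a set of missing (user, field) cells.  Batching the fetch
    as one rectangle means every missing cell drags in its whole row and
    its whole column -- a sketch, not real lj-userinfo code.
    """
    users = {u for u, _ in need}
    fields = {f for _, f in need}
    return users, fields  # fetch = every (u, f) pair in users x fields

# Pathological case: 'a' is missing every field, and each other user is
# missing one field each -> the rectangle is the entire matrix.
all_fields = ["name", "birthday", "journaltype", "community"]
need = {("a", f) for f in all_fields}
need |= set(zip(["b", "c", "d"], all_fields[:3]))
users, fields = rectangular_superset(need)
# 4 users x 4 fields = 16 cells fetched, for only 7 actually needed
```

So 7 genuinely missing cells balloon into a fetch of the full 16-cell matrix, which is exactly the row-dirty/column-dirty cascade described above.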
I think this can be tackled by ensuring that data is only ever aggregated on a like basis; if you want one person's life story, and each of their friends' journal types, do 2 fetches!
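The like-basis aggregation could look something like this: group missing cells by each user's exact field-set, and issue one fetch per distinct set. A sketch only, with invented field names:

```python
from collections import defaultdict

def group_by_fieldset(need):
    """Split `need` into fetches where every user wants the same fields.

    One fetch per distinct field-set: one user's full life story plus
    everyone else's journal type becomes 2 fetches instead of one huge
    rectangle.  Illustrative; not the real lj-userinfo interface.
    """
    per_user = defaultdict(set)
    for user, field in need:
        per_user[user].add(field)
    fetches = defaultdict(set)  # frozenset(fields) -> users wanting them
    for user, fields in per_user.items():
        fetches[frozenset(fields)].add(user)
    return dict(fetches)
```

With the "life story + friends' journal types" example above, this yields exactly two fetches, and neither one drags extra rows or columns into the other.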
Nuts - and I need to store refusals to serve, too - or I'll end up trying to refetch refused data every time...
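Storing refusals is just negative caching: keep a sentinel entry with its own expiry, so a "refused" cell is treated as known (not missing) until its TTL lapses. A sketch, assuming per-cell TTLs; the refusal policy and TTL values are my assumptions, not LiveJournal's:

```python
import time

_REFUSED = object()  # sentinel: the server declined to serve this cell

class NegCache:
    """Sparse (user, field) cache that also remembers refusals,
    so refused data isn't refetched on every call."""

    def __init__(self):
        self._cells = {}  # (user, field) -> (value or _REFUSED, expires_at)

    def put(self, user, field, value, ttl):
        self._cells[(user, field)] = (value, time.time() + ttl)

    def put_refusal(self, user, field, ttl):
        # Cache the refusal itself, with its own (assumed) expiry.
        self._cells[(user, field)] = (_REFUSED, time.time() + ttl)

    def lookup(self, user, field):
        """Return ('hit', value), ('refused', None), or ('miss', None)."""
        entry = self._cells.get((user, field))
        if entry is None or time.time() >= entry[1]:
            return ("miss", None)
        value = entry[0]
        if value is _REFUSED:
            return ("refused", None)  # don't refetch until the TTL lapses
        return ("hit", value)
```

Only "miss" cells would go into the need matrix; "refused" cells stay out of it until their entries expire.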
Hrm, better make a comment on that last one to ciphergoth's latest lj_dev post tomorrow...