generalized search caching

Idea

generalized search caching+status

generalized search caching+priority

generalized search caching+tag

 

generalized search caching+issues

WikiRate is definitely going to need some sophistication in cache clearing.  We'll need to store calculated metric values, intricate counts, etc.  It won't be workable to recount, redo the math, etc. every time we load a page, especially since some queries need to use these counts and values for sorting.

 

generalized search caching+solution

For now I'm picking some simple cases and trying to figure out how we could deal with them.  Here I'm specifically going after cases where we want to store values (especially integers) in cards, but there may be separate cases for memory caching.


generalized search caching+example

Say Coke+claim count is configured by:

claim_count+*right+*structure (type = CachedSearch?):

type:Claim
right_plus:[Company, refer_to: _self]
return: count

(somewhere between ruby and js :) )
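
As a plain Ruby hash that might look roughly like the sketch below (I believe these are the standard WQL keys, but worth double-checking):

  {
    type: "Claim",
    right_plus: ["Company", { refer_to: "_self" }],
    return: "count"
  }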

When we save that card, it updates an entry in a global "cache_clear" hash.  The key is the rule name (claim_count+*right+*structure); the value is some sort of CacheClear object, which has three key parts:

1. the test(s) of whether cached items need repopulating

Based on the WQL, Wagn automatically constructs a test of whether the current change is one that should cause an update.  It is run on every card change.

To frame this a bit, there are really two main kinds of changes we'd be looking at: changes to the Claim card itself (create, delete, or type update) or changes to the +Company card (basically any change matters, though a type update is admittedly debatable).  Technically renaming the Company or Claim card could break things, too, but I'm not going to deal with that here.

So we have two main test cases:

  • type==Claim && type_changed?   ## and optionally a test to see if it's a create, delete, or type update
  • junction? && right_name=='Company' && left.type_id == ClaimID

The first one is a fairly obvious translation of the WQL.  The second starts from the perspective of the inner query and works its way out.

If either test passes, it should return the +Company card.  Otherwise it should return false.
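
Here's a rough Ruby sketch of that test (ClaimID, type_changed?, junction?, right_name, and left are my guesses at the card API here, not verified names):

  # Return the +Company card whose referees need recaching, or false.
  test = proc do |card|
    if card.type_id == ClaimID && card.type_changed?
      # the Claim card itself was created, deleted, or changed type;
      # its +Company card is where the company references live
      Card.fetch "#{card.name}+Company"
    elsif card.junction? && card.right_name == "Company" &&
          card.left && card.left.type_id == ClaimID
      # the +Company card itself changed
      card
    else
      false
    end
  end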

2. the list of cached items to repopulate

In this case, what we need to do is figure out everything referred to by the +Company card (both the old and new versions if its content is being updated).

At this point, we take the rule's set (claim count+*right), use the referees (companies) as anchors, and reconstruct, for example, Coke+claim count.  This list is kept somewhere in the main action of the act until the "extend" phase, at which point...

3. repopulate cached items

We go through each item on the list and (re)populate it by running the structured search.  Unlike most structured cards, the content of the card itself matters here, because we use, e.g., Coke+claim_count to store the value of the count.  I suspect we want to skip all the history handling (act/action/change) on these cards because the history of caches is not particularly worth the performance hit.
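
Sketched out, #2 and #3 might look something like this (referee_names_in and run_cached_search are hypothetical helpers, content_was assumes the pre-change content is still available, and the card API calls are from memory):

  # 2. build the list of cache-card names to repopulate from a +Company card
  items_to_repopulate = proc do |company_card|
    contents = [company_card.content, company_card.content_was].compact.uniq
    companies = contents.flat_map { |content| referee_names_in content }.uniq
    companies.map { |company| "#{company}+claim_count" }
  end

  # 3. repopulate each cache card during the extend phase
  repopulate = proc do |item_names|
    item_names.each do |name|
      card = Card.fetch name, new: {}                # fetch, or initialize if missing
      value = run_cached_search card                 # e.g. the claim count for Coke
      card.update_attributes! content: value.to_s    # ideally without act/action/change records
    end
  end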

 

~~~~~

 

I'm not actually entirely sure #2 and #3 will require any special configuration.  We might be able to do the whole thing with a Proc rather than this CacheClear object.
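
For example, reusing the procs sketched above (CACHE_CLEAR and the queue argument are invented names):

  CACHE_CLEAR = {}   # global registry, keyed by rule name

  # one callable per rule: run the test and, if it passes, queue the items
  # that need repopulating in the extend phase
  CACHE_CLEAR["claim_count+*right+*structure"] = proc do |changed_card, queue|
    if (company_card = test.call(changed_card))
      queue.concat items_to_repopulate.call(company_card)
    end
  end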

 

a few observations:

  • Hard for me to tell so far how generalizable the "Test" part is; would be good to work out some examples.
  • Worth noting that no db queries are involved in the tests (provided the cache is populated).
  • Even if we don't generalize the capacity to auto-generate a test from WQL, this general structure could be pretty useful in moving the rationale of cache clearing away from the clearers and toward the clearees (which would seem to be a more reusable pattern).
  • For example, CalculatedValues would not be a CachedSearch card, but it could certainly add code to the "cache_clear" hash and function largely as outlined above.
  • Would be cool to explore how this might work in caching "contributions" on WikiRate as one of the next steps.
--Ethan McCutchen.....2015-05-21 05:46:24 +0000

generalized search caching+relevant user stories