Use caching when you are ready to build on top of Keen Compute query results. Caching unlocks two benefits: sub-second response times for customer-facing dashboards, and pre-computed results that do not count against your per-project query limit.
Cached Queries and Cached Datasets perform the computation in advance on a regular schedule, so results can always be retrieved with sub-second latency. A common use case for caching is preparing query results to embed in live customer-facing dashboards, giving your customers a consistently fast experience: the live dashboard view reads sub-second results from the cached query. As a bonus, caching queries and datasets helps save on compute query costs.
If you aren't using Cached Queries, every user who pulls up your dashboard triggers a live scan of your events. With a Cached Query, the result is refreshed once per interval (every hour, for example), so any number of users can read the pre-computed result of a single query run. Without caching, each dashboard load would instead re-run a series of 2-6 queries that scan many events, every time.
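The refresh-on-an-interval pattern above can be sketched in a few lines of Python. This is a hypothetical illustration, not the Keen client library: `CachedQuery`, `run_query`, and `refresh_seconds` are names invented for this sketch. It shows how one query run per interval can serve any number of dashboard loads.

```python
import time

class CachedQuery:
    """Hypothetical sketch of the caching pattern described above:
    the expensive query runs at most once per refresh interval, and
    every dashboard load in between reads the pre-computed result."""

    def __init__(self, run_query, refresh_seconds, clock=time.monotonic):
        self.run_query = run_query            # the expensive event scan
        self.refresh_seconds = refresh_seconds
        self.clock = clock
        self._result = None
        self._last_run = None
        self.query_runs = 0                   # times we actually computed

    def result(self):
        now = self.clock()
        if self._last_run is None or now - self._last_run >= self.refresh_seconds:
            self._result = self.run_query()   # one scan serves many readers
            self._last_run = now
            self.query_runs += 1
        return self._result

# Fifty dashboard loads within the hour share one pre-computed result.
cached = CachedQuery(lambda: sum(range(1000)), refresh_seconds=3600)
for _ in range(50):
    cached.result()
print(cached.query_runs)   # 1: a single query run served all fifty loads
```

Each cached read here is just an attribute lookup, which is what makes the dashboard path sub-second regardless of how expensive the underlying scan is.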
In summary, the pre-computations behind Cached Queries and Cached Datasets do not count against your per-project query rate limit. Combined with sub-second response times, this makes them ideal for building customer-facing embedded analytics.
Read more on "How Caching Saves on Query Costs": https://keen.io/docs/compute/compute-pricing-guide/#how-caching-saves-on-query-costs