If you want to use Qlik Web Connectors to pull down data on a regular basis, you'll want to build up the historical data and add only the latest to it, so that you're not fetching everything, every time.
This page gives an example of how this could be done.
Although it's possible to cache data with Qlik Web Connectors, caching is most useful where you are potentially repeating identical requests and want to check whether you already have the data before making a call to the API. For example, once you have a sentiment score for a Tweet there's no point requesting it again.
In other scenarios, such as building up Google Analytics data over time, you want to fetch just the latest data, add it to what you already have, and store the result to a file.
Qlik Web Connectors doesn't have a function to store data to a file, but you can use QlikView script to save tables to QVDs, which is QlikView's file storage format.
Saving data to QVDs has a number of advantages:
- they are very quick to reload into your apps
- they are compressed
- they can easily be backed up as part of your normal data management processes
- they can be reused in other QlikView applications
Of course, you are not restricted to QVDs; QlikView can save data in other formats as well.
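For instance, the same STORE statement can write a table either as a QVD or as a delimited text file readable by other tools, just by changing the format specifier. (The table and file names below are placeholders; substitute your own.)

```
// Store a table as a QVD (QlikView's binary file format)
STORE MyTable INTO [MyTable.qvd] (qvd);

// Or store the same table as a delimited text file
STORE MyTable INTO [MyTable.csv] (txt);
```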
Below is one example of how this can be achieved using QlikView script; it is adapted from our Twitter demo application.
```
// Create a variable for the QVD name as it will be reused.
// (The file name here is a placeholder - use whatever path suits your deployment.)
let allTweets = 'AllTweets.qvd';

// We check the size of the QVD
let size = filesize('$(allTweets)');

// If filesize() returns a value (rather than null) the QVD already exists and
// holds (historical) data, so we load it into the TwitterConnector_Search table.
// After the first time this script is executed, this should always run.
if not isnull(size) then

TwitterConnector_Search:
LOAD * FROM [$(allTweets)] (qvd);

end if

// We now make the call to QVSource and get the latest data.
// Notice at the end that we're only adding Search_id's to the table that aren't
// already in it by using the 'where not exists' clause.
// (The request URL below is illustrative - use the URL generated for your own
// QVSource Twitter Connector Search table.)
TwitterConnector_Search:
LOAD
    from_user_id_str as Search_from_user_id_str,
    id as Search_id,
    // ... other fields as required ...
    entity_source as Search_entity_source
FROM [http://localhost:5555/QVSource/TwitterConnector/?table=Search]
(txt, utf8, embedded labels, delimiter is '\t', msq)
where not exists (Search_id, id);

// At this point the table includes both historical and the latest data.
// We can now store it to the QVD, ready to be reloaded next time the script is run.
STORE TwitterConnector_Search INTO [$(allTweets)] (qvd);
```