While Qlik offers abundant system-wide logging, application-specific logging remains somewhat inaccessible: it is difficult to configure and requires parsing to extract what you need. In this article, we'll build a custom function that leverages Qlik's existing script logging, making message customization and retrieval simple, effective, and reusable.
Logging is critical to application development
Developers know that feedback while constructing an application is critical. No application is built without a few (or many!) errors and hurdles that were unknown at the outset. The quicker the time between issue identification and resolution, the more effective development can be. This is where logging can be a critical piece of the development process.
There are many reasons to enable your applications to deliver feedback. But these two are arguably the most important:
- Identify errant data in the development cycle
- Note any current or potential issues during the maintenance phase
Keeping needed messages in an accessible place, whether a Qlik app or simple text logs, can save your sanity. Depending on the messages created, you can also make logs available to application consumers; they provide confidence that underlying processes are operating as expected. For example, consumers could verify whether a given file loaded with the expected number of rows or whether certain field values were present.
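As a concrete illustration of that last point, a message could record a table's row count right after a load. This is only a sketch; the table name, file path, and message wording are assumptions, not part of the article's function:

```qlik
// Hypothetical example: record the row count of a freshly loaded table
// so consumers can verify the load. "Sales" and the path are assumed.
Sales:
LOAD * FROM [lib://Data/sales.qvd] (qvd);

LET vRowCount = NoOfRows('Sales');
TRACE Sales loaded with $(vRowCount) rows;
```

The TRACE line then appears in the script log alongside Qlik's own entries, where it can later be filtered out.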
Leveraging Qlik script logs
Qlik provides many types of logging and no shortage of information. However, the messages, log locations, and sheer volume make the logs less than friendly to sift through. By default, Qlik stores its logs at %ProgramData%\Qlik\Sense\Log, as configured by the local log configuration file. By loading the targeted information contained in these files and programmatically sifting through it to identify relevant messages, we can output more relevant and accessible log files to a custom location.
Because our purpose is to provide application feedback, we will be looking exclusively at the script logs. In the default installation, a data source connection called 'ServerLogFolder' is made available for log inquiry purposes. Perusing the 'Script' folder via this data source connection, we find a multitude of cryptically labeled files. The names of these files correspond to the underlying application identifier known as the DocumentName() – not to be confused with the DocumentTitle(), which states the human-readable application name.
Qlik log files
Understanding the location, naming convention, and content gives us a conceptual grasp of how to acquire feedback messages for developers or consumers. However, several hurdles now present themselves, not least the volume of log files available (especially in a larger or aging installation) and the amount of noise within each log file.
A quick peek at one of these many files reveals a lot of information that we just won’t need on a regular basis.
We can quite effectively reduce the number of files searched by only considering those matching the application GUID identifier – note the wildcard load clause for vLogID defined in the attached file.
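A sketch of how that wildcard load might look is below. The 'ServerLogFolder' connection and DocumentName() are from the article; the field handling and format options are assumptions, and the real attached file may differ:

```qlik
// Sketch: restrict the wildcard load to this app's own script logs.
// DocumentName() returns the app GUID that prefixes each log file name.
LET vLogID = DocumentName();

RawLog:
LOAD
    @1 AS LogLine
FROM [lib://ServerLogFolder/Script/$(vLogID)*.log]
(txt, no labels, delimiter is '\x7');
// Using an unlikely delimiter character loads each line as a single field,
// which keeps the raw log text intact for later filtering.
```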
Supposing that we can load these logs into a Qlik application and filter them, we will quickly find the need to add specific information to this file. Fortunately, Qlik provides the TRACE statement, which writes not only to the output window during reload but also to the script log. Loading and filtering the log for custom statements are the foundational components upon which we'll build our log-tapping function.
1: Adding Log Messages via Trace
By simply adding the following line to any Qlik application, the app's current log file, identified by the DocumentName (or app GUID), will contain the designated message.
TRACE My Log message;
If we extend our TRACE message with a consistent and unique prefix, then we can filter the application log for any of our messages.
SET vPrefix = mymsg;
TRACE $(vPrefix) - My Log message;
...
TRACE $(vPrefix) - End Application build;
Now we can completely customize our message with variables and send it to the TRACE command.
2: Leverage dynamic variables to specify custom message
A dynamic variable is assigned programmatically at the time it is run. We can define the vLog input as below to provide a run-time message within the TRACE statement. We’ll customize our log message by adding variables for User, Client, Application, a delimiter, and message Text. Notice that we use the SET instead of LET command to enable the dynamic variable expansion at run time.
SET vLog = '$(vDelim)$(vMsgPrefix)$(vDelim)$(vMsgUser)$(vDelim)$(vMsgClient)$(vDelim)$(vMsgApp)$(vDelim)$(vMsgText)';
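The SET-versus-LET distinction is worth seeing side by side. This small sketch (variable names are illustrative) shows why SET is required here:

```qlik
// SET stores the text literally; the $(…) expansions happen later,
// each time vLog itself is expanded (e.g. inside a TRACE statement),
// so the message reflects whatever the variables hold at that moment.
SET vLog = '$(vMsgPrefix)$(vDelim)$(vMsgText)';

// LET would expand the right-hand side immediately, freezing whatever
// the variables held at definition time into a static string.
LET vLogFrozen = '$(vMsgPrefix)$(vDelim)$(vMsgText)';
```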
Assigning these variables and structuring their storage and eventual output is code we would otherwise repeat in every application where we deploy our logging function. Therefore, we'll specify our message from within the TRACE statement with as little added effort as possible using this dynamic variable. The only input we need at logging time is the message itself.
By structuring the variable in this way, we can now program the variable assignment with a sub-routine to construct our message in whatever way we want by calling the dynamic variable.
3: Reuse Code with SubRoutines and MustInclude
We’re now ready to begin writing a specific function to construct our log. The method here is to write a callable subroutine using Qlik’s SUB statement. Subroutines must be placed before the script commands in which they’re called so they are available. You can then invoke them with CALL when needed.
To demonstrate, we’ll start with a simple variable assignment for the App GUID and a log message.
SUB GenerateAppLog
    LET vLogID = DocumentName() & ' - $1';
END SUB;

CALL GenerateAppLog;
TRACE $(vLogID(My Log message));
Using this structure, the logging and log file generation are coded once and made callable. The LOG subroutine is below:
SUB LOG (vMsgText)
    SET vLog = '$(vDelim)$(vMsgPrefix)$(vDelim)$(vMsgUser)$(vDelim)$(vMsgClient)$(vDelim)$(vMsgApp)$(vDelim)$(vMsgText)';
    TRACE $(vLog);
END SUB;

CALL LOG('Insert Message Here');
Abstracting from a Qlik application
We’ll make a short departure now to discuss abstracting this from any individual Qlik application into a function you can leverage across applications with just a few commands. Copying the code into a text file and saving it to a Qlik-accessible location allows us to include our new function within any application on our installation. Use this code:
$(Must_Include=lib://<<FILEPATH>>/fxnGenerateAppLog.txt);
//NOTE: <<FILEPATH>> represents the location determined by file placement.
//HINT: If done within an existing data source connection, no additional setup is required.
Combining the include and subroutine functions ensures, with a single line of code, that our variables are initialized and our subroutines are defined and available for use. This is where we can set default values for Prefix, Delim, Client, and a vPurgeDays variable that defines message retention. You can access the complete code in the file below; its usage is straightforward.
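The defaults section of such an include file might look like the sketch below. Variable names follow the article, but every value here is an assumption; the actual fxnGenerateAppLog.txt may differ:

```qlik
// Sketch of the defaults a fxnGenerateAppLog.txt include file could set.
SET vMsgPrefix = 'mymsg';           // unique tag used to filter TRACE lines
SET vDelim     = '|';               // field delimiter within each message
SET vMsgClient = 'DefaultClient';   // logical grouping for log files
SET vPurgeDays = 30;                // retention: drop messages older than this
LET vMsgUser   = OSUser();          // evaluated once at include time
LET vMsgApp    = DocumentTitle();   // human-readable app name
```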
$(Must_Include=lib://<<FILEPATH>>/fxnGenerateAppLog.txt);

//Optionally customize vMsgXXXX variables
//LET vMsgFile = $(vMsgClient);
//LET vMsgPrefix = 'ABC';

CALL LOG('Insert Message Here');

//Placed at end of application to consolidate and write log file.
CALL GenerateAppLog;
In the fxnGenerateAppLog.txt file you’ll find variables and definitions, the most important of which to set up is the output file location, vLogOutput. Again, by using an already defined data source location, you can do this easily without additional configuration. However, you can customize as the environment demands. You can customize variables AFTER the include statement; this will have varying effects on log overwriting depending on how it is done. For example, if log files are more usefully kept by client, the vMsgFile variable can simply be reset to vMsgClient. Consistent use of vMsgFile and vMsgPrefix will allow log consolidation.
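To make the consolidation step concrete, here is a hedged sketch of what a GenerateAppLog subroutine could do with vLogOutput, vMsgFile, and vPurgeDays. It assumes the current reload's filtered messages are already in a table named AppLog with a LogDate field; all names and the storage format are assumptions, not the article's actual code:

```qlik
// Sketch: append the prior stored log, purge old entries, and store.
SUB GenerateAppLog
    // Append the previously stored log file, if one exists.
    IF NOT IsNull(FileTime('$(vLogOutput)/$(vMsgFile).qvd')) THEN
        CONCATENATE (AppLog)
        LOAD * FROM [$(vLogOutput)/$(vMsgFile).qvd] (qvd);
    END IF

    // Keep only messages newer than the vPurgeDays retention window.
    NewLog:
    NOCONCATENATE LOAD * RESIDENT AppLog
    WHERE LogDate >= Today() - $(vPurgeDays);
    DROP TABLE AppLog;

    STORE NewLog INTO [$(vLogOutput)/$(vMsgFile).qvd] (qvd);
    DROP TABLE NewLog;
END SUB;
```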
By tapping Qlik’s default logging scheme and applying TRACE, dynamic variables, subroutines, and include statements, developers can build customizable logging solutions to support their development and consumer needs. Happy logging!