Featured Content
-
How to contact Qlik Support
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
- Support and Professional Services; who to contact when.
- Qlik Support: How to access the support you need
- 1. Qlik Community, Forums & Knowledge Base
- The Knowledge Base
- Blogs
- Our Support programs:
- The Qlik Forums
- Ideation
- How to create a Qlik ID
- 2. Chat
- 3. Qlik Support Case Portal
- Escalate a Support Case
- Phone Numbers
- Resources
Support and Professional Services; who to contact when.
We're happy to help! Here's a breakdown of resources for each type of need.
Support: Reactively fixes technical issues and answers narrowly defined, specific questions. Handles administrative issues to keep the product up to date and functioning.
- Error messages
- Task crashes
- Latency issues (due to errors or 1-1 mode)
- Performance degradation without config changes
- Specific questions
- Licensing requests
- Bug Report / Hotfixes
- Not functioning as designed or documented
- Software regression
Professional Services (*): Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
- Deployment Implementation
- Setting up new endpoints
- Performance Tuning
- Architecture design or optimization
- Automation
- Customization
- Environment Migration
- Health Check
- New functionality walkthrough
- Realtime upgrade assistance
(*) reach out to your Account Manager or Customer Success Manager
Qlik Support: How to access the support you need
1. Qlik Community, Forums & Knowledge Base
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
The Knowledge Base
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
- Go to the Official Support Articles Knowledge base
- Type your question into our Search Engine
- Need more filters?
- Filter by Product
- Or switch tabs to browse content in the global community, on our Help Site, or even on our Youtube channel
Blogs
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about product and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Our Support programs:
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar to facilitate knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
The Qlik Forums
- Quick, convenient, 24/7 availability
- Monitored by Qlik Experts
- New releases publicly announced within Qlik Community forums
- Local language groups available
Ideation
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
How to create a Qlik ID
Get the full value of the community.
Register a Qlik ID:
- Go to register.myqlik.qlik.com. If you already have an account, see How To Reset The Password of a Qlik Account for help using your existing account.
- Enter your company name exactly as it appears on your license; otherwise there will be significant delays in getting access.
- You will receive a system-generated email with an activation link for your new account. Note: this link expires after 24 hours.
If you need additional details, see: Additional guidance on registering for a Qlik account
If you encounter problems with your Qlik ID, contact us through Live Chat!
2. Chat
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
- Answer common questions instantly through our chatbot
- Have a live agent troubleshoot in real time
- For issues that need further investigation, we will create a case on your behalf with step-by-step intake questions.
3. Qlik Support Case Portal
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
Your advantages:
- Self-service access to all incidents so that you can track progress
- Option to upload documentation and troubleshooting files
- Option to include additional stakeholders and watchers to view active cases
- Follow-up conversations
When creating a case, you will be prompted to enter a problem type and an issue level. Definitions are shared below:
Problem Type
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
Priority
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible but there are functional limitations that are not critical to daily operations.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
Severity
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
Escalate a Support Case
If you require a support case escalation, you have two options:
- Request an escalation within the case, mentioning the business reasons. Stating your intention to escalate in the open support case begins the escalation process.
- Contact your Regional Support Manager. If more attention is required, reach out to your regional support manager; you can find a full list of regional support managers in the How to escalate a support case article.
Phone Numbers
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
- Qlik Data Analytics: 1-877-754-5843
- Qlik Data Integration: 1-781-730-4060
- Talend AMER Region: 1-800-810-3065
- Talend UK Region: 44-800-098-8473
- Talend APAC Region: 65-800-492-2269
Resources
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Recent Documents
-
How to send a bursted report using Qlik Application Automation
This article explains how the Qlik Reporting connector in Qlik Application Automation can be used to generate a bursted report that delivers recipient-specific data.
For more information on the Qlik Reporting connector, see this Reporting tutorial.
This article offers two examples where the recipient list and field for reduction are captured in an XLS file or a straight table in an app. Qlik Application Automation allows you to connect to a variety of data sources, including databases, cloud storage locations, and more. This allows you to store your recipient lists in the appropriate location and apply the concepts found in the examples below to create your reporting automation. By configuring the Start block's run mode, the reporting automations can be scheduled or driven from other business processes.
Contents
Using a straight table as a distribution list
In this example, the email addresses of the recipients are stored in a straight table. Add a private sheet to your app and add a straight table to it. This table should contain the recipients' email address, name, and a value to reduce the app on. We won't go over the step-by-step creation of this automation since it's available as a template in the template picker under the name "Send a burst report to email recipients from a straight table".
Instead, a few key blocks of this template are discussed below.
- List Values Of Field. This block will return a list of all values of the specified field. The selected field should directly map to the recipient scope of the data to be delivered. For each value, a report will be generated and distributed to the recipients who have that value assigned in the distribution list. We'll call this value the scope of the report; an example field could be Region, to build a report for each region and deliver it to regional employees.
- The Create Report block and attached blocks will then be executed for each unique value of the field. Tip: you can dynamically name your reports by mapping the current field's value to the Report name parameter of the Create Report block:
- Add any sheets to the report by using the Add Sheet to Report block. Use the block Add Selection To Sheet to add selections. Add at least a selection for the current scope of the report. Also, apply a selection to the app using the Select Field Value block. This will reduce the straight table that contains the distribution list to the current scope.
- Now, use the block Get Straight Table Data and configure it to retrieve the distribution list. The important step here is to append each email address from the distribution list to a string variable, adding a semicolon (';') between addresses to separate them.
- The scope from the distribution list should be used to add a sheet to the report and apply a selection for the current recipient. For example, if the recipients are sales reps, this could be used to provide an overview of each recipient's individual sales.
- Use the Generate Report block to finalize the report. Map this block's output to the attachment parameter of the Send Mail block. Map the string of email addresses to the To input field of the Send Mail block.
- Then make sure to execute the block Unlock Selections to allow the selections to be overwritten for the next report scope. Also, empty the email addresses from the string variable to make sure they aren't persisted for the next scope.
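Outside the automation editor, the per-scope logic of this template boils down to the following sketch. This is plain Python with made-up data, not Qlik Application Automation block code; the field and email values are hypothetical:

```python
# Hypothetical stand-in for the straight-table distribution list:
# each row holds a recipient's name, email, and the reduction value (scope).
distribution_list = [
    {"name": "Ana",  "email": "ana@example.com",  "region": "EMEA"},
    {"name": "Bo",   "email": "bo@example.com",   "region": "APAC"},
    {"name": "Cleo", "email": "cleo@example.com", "region": "EMEA"},
]

def recipients_by_scope(rows, scope_field):
    """Mimic 'List Values Of Field' plus the per-scope recipient string:
    for each unique scope value, join that scope's emails with ';'."""
    scopes = {}
    for row in rows:
        scopes.setdefault(row[scope_field], []).append(row["email"])
    # One report per scope; each joined string feeds the Send Mail 'To' field.
    return {scope: ";".join(emails) for scope, emails in scopes.items()}

print(recipients_by_scope(distribution_list, "region"))
```

In the real automation, the dictionary above corresponds to one Create Report / Send Mail iteration per scope, with the string variable emptied between iterations.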
Helpful tips
- The Output block can be used to keep track of the automation's progress by showing the current scope.
- When building the report automation, you can test the email output by sending the report to yourself before using the distribution variable for operational execution.
- It's also possible to use the Copy File block from the Cloud Storage connector to store a copy of the report on any file storage system like Dropbox or SFTP.
Using an Excel File as a distribution list
In this example, the email addresses of the recipients are stored in an Excel file. This can be a simple file that contains one worksheet with headers on the first row (name, email & a value for reduction) and one record on each subsequent row.
- The first step is to gather the Excel file's item id. This can be retrieved from the URL when the file is open in a browser window or by executing the List Items In Root Folder Of Drive block. Execute this block in a test run to fetch these records.
- The results from the test run can now be used in other blocks: click the Item Id input field of the List Rows With Headers block and select the id key from the right item in the output returned by the List Items In Root Folder Of Drive block.
- Further configure the List Rows With Headers block by specifying the worksheet name and the dimensions of the data (header row included). Below is an example of the Excel file shown at the beginning of this section.
- The other steps of the automation are similar to the first part of this article so they won't be repeated here. Optionally, values from the Excel file can be used for reducing the report with the block Add Selection To Report.
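The "headers on the first row, one record per row" layout the automation expects can be illustrated in plain Python. This sketch uses CSV as a stand-in for the Excel worksheet (the column names are hypothetical); the List Rows With Headers block similarly yields one header-to-value mapping per row:

```python
import csv
import io

# Stand-in for the worksheet: header row first, then one recipient per row.
worksheet = io.StringIO(
    "name,email,region\n"
    "Ana,ana@example.com,EMEA\n"
    "Bo,bo@example.com,APAC\n"
)

# Each row comes back as a header->value mapping, like the automation block.
rows = list(csv.DictReader(worksheet))
print(rows[0]["email"])
```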
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and other factors, ongoing support on the solution below may not be provided by Qlik Support.
-
Introducing Automation Sharing and Collaboration
Watch this space for when the feature has been successfully rolled out in your region.
This capability is being rolled out across regions over time:
- May 5th: India, Japan, Middle East, Sweden (completed)
- June 4th: Asia Pacific, Germany, United Kingdom, Singapore (completed)
- TBD: United States
- TBD: Europe
- TBD: Qlik Cloud Government
The previously scheduled rollouts of Automation Sharing and Collaboration for some regions have been temporarily postponed. We are working on an updated release plan, and updated release dates are soon to be determined (TBD). Thank you for your understanding.
With the introduction of shared automations, it will be possible to create, run, and manage automations in shared spaces.
Content
- Allow other users to run an automation
- Collaborate on existing automations
- Collaborate through duplication
- Extended context menus
- Context menu for owners:
- Context menu for non-owners:
- Monitoring
- Administration Center
- Activity Center
- Run history details
- Metrics
Allow other users to run an automation
Limit the execution of an automation to specific users.
Every automation has an owner. When an automation runs, it will always run using the automation connections configured by the owner. Any Qlik connectors that are used will use the owner's Qlik account. This guarantees that the execution happens as the owner intended it to happen.
The user who created the run, along with the automation's owner at run time, are both logged in the automation run history.
There are five ways to run an automation:
- Run an automation from the Hub and Catalog
- Run an automation from the Automations activity center
- Run an automation through a button in an app
You can now allow other users to run an automation through the Button object in an app without needing the automation to be configured in Triggered run mode. This allows you to limit the users who can execute the automation to members of the automation's space.
More information about using the Button object in an app to trigger an automation can be found in How to run an automation with custom parameters through the Qlik Sense button.
- Programmatic executions of an automation
- Automations API: Members of a shared space will be able to run the automations over the /runs endpoint if they have sufficient permissions.
- Run Automation and Call Automation blocks
- Note for triggered automations: the user who creates the run is not logged, as no user-specific information is used to start the run. The authentication to run a triggered automation depends only on the Execution Token.
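For the API option above, a run request can be sketched as follows. The endpoint path, tenant URL, and identifiers here are assumptions for illustration (only "/runs" is named in this article); verify them against the Qlik Cloud API reference for your tenant before use. The sketch builds the request without sending it:

```python
import json
import urllib.request

TENANT = "https://your-tenant.eu.qlikcloud.com"  # hypothetical tenant URL
AUTOMATION_ID = "<automation-id>"                # placeholder
API_KEY = "<api-key>"                            # placeholder

def build_run_request(tenant, automation_id, api_key):
    """Prepare (but do not send) the POST that would start an automation run."""
    url = f"{tenant}/api/v1/automations/{automation_id}/runs"
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_run_request(TENANT, AUTOMATION_ID, API_KEY)
print(req.full_url)
```

Sending it with `urllib.request.urlopen(req)` would require a valid API key and space permissions, per the membership requirement described above.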
Collaborate on existing automations
Collaborate on an automation through duplication.
Automations are used to orchestrate various tasks: from Qlik use cases like reload task chaining, app versioning, or tenant management, to action-oriented use cases like updating opportunities in your CRM, managing supply chain operations, or managing warehouse inventories.
Collaborate through duplication
To prevent users from editing these live automations, we're putting forward a collaborate-through-duplication approach. This makes it impossible for non-owners to make changes to an automation that could negatively impact operations.
When a user duplicates an existing automation, they will become the owner of the duplicate. This means the new owner's Qlik account will be used for any Qlik connectors, so they must have sufficient permissions to access the resources used by the automation. They will also need permissions to use the automation connections required in any third-party blocks.
Automations can be duplicated through the context menu:
As it is not possible to display a preview of the automation blocks before duplication, please use the automation's description to provide a clear summary of the purpose of the automation:
Extended context menus
With this new delivery, we have also added new options in the automation context menu:
- Start a run from the context menu in the hub
- Duplicate automation
- Move automation to shared space
- Edit details (owners only)
- Open in new tab (owners only)
Context menu for owners:
Context menu for non-owners:
Monitoring
The Automations Activity Center has been expanded with information about the space in which an automation lives. The Run page now also tracks which user created a run.
Note: Triggered automation runs will be displayed as if the owner created them.
Administration Center
The Automations view in Administration Center now includes the Space field and filter.
The Runs view in Administration Center now includes the Executed by and Space at runtime fields and filters.
Activity Center
The Automations view in the Automations Activity Center now includes the Space field and filter.
Note: Users can configure which columns are displayed here.
The Runs view in the Automations Activity Center now includes the Space at runtime, Executed by, and Owner fields and filters.
In this view, you can see all runs from automations you own, as well as runs executed by other users. You can also see runs of other users' automations where you are the executor.
Run history details
To see the full details of an automation run, go to Run History through the automation's context menu. This is also accessible to non-owners with sufficient permissions in the space.
The run history view will show the automation's runs across users, and the user who created the run is indicated by the Executed by field.
Metrics
The metrics tab in the Automations Activity Center has been deprecated in favor of the automations usage app, which gives a more detailed view of automation consumption.
-
Qlik Stitch: MySQL Integration Extracting Error "Binlog has expired for tables"
You may be experiencing a pattern of Binlog errors like the one below when extracting data from MySQL integrations in Stitch.
Fatal Error Occurred - Binlog has expired for tables
The problem stems from how certain parameters are configured in your AWS RDS instance, often due to a lack of explicit specification.
When a MySQL integration is configured against an AWS RDS instance, the requirements for the Binary Log Retention Period differ from those of a standalone MySQL instance, and an incorrectly or insufficiently configured retention period can cause this issue.
Resolution
The Binary Log Retention Period should be configured to 168 hours (7 days). Note that the 'binlog retention hours' parameter is expressed in hours, so use 168, not 7. Running
call mysql.rds_set_configuration('binlog retention hours', 168);
will fix the issue.
Cause
The Binary Log Retention Period is controlled by AWS RDS which functions differently and independently of a standalone MySQL instance.
If the Binary Log Retention Period is set to 0, this will cause issues.
Related Content
There is an RDS-specific stored procedure that governs binlog retention beyond the traditional MySQL parameters:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/mysql-stored-proc-configuring.html
Environment
- Talend Cloud
- Qlik Stitch
-
Qlik Talend Studio environment flagged with spring-core-5.3.33.jar and how to us...
This article provides step-by-step instructions to ensure your Talend Studio environment no longer uses the flagged spring-core-5.3.33.jar file and instead uses the updated spring-core-6.1.14.jar. It also explains how to verify your job builds are free of the old JAR.
Steps to remove the flagged JAR and confirm the update
These steps are only necessary in cases where Qlik Talend Studio did not remove the old JAR when applying the relevant patch (2025-02 or newer).
- Verify the Updated JAR in Qlik Talend Studio:
- Open Qlik Talend Studio
- Go to the Modules view
- Search for spring-core
- Confirm that spring-core-6.1.14.jar is installed and marked as active
- Remove the Old JAR from Your Studio Environment
Once you’ve confirmed the new version is active, you can remove the old JAR:
- Navigate to the following example path (adjust accordingly if your installation directory differs):
C:\TalendStudio8_install\studio\configuration\.m2\repository\org\springframework\spring-core\5.3.33.jar
- Delete spring-core-5.3.33.jar from this location. Create a backup copy of the JAR before deletion, in case you need to restore it later.
- Check your Job Builds
When you build a job in Qlik Talend Studio, all required JAR files are packaged into a ZIP archive.
- Locate the lib folder inside the archive
- Check if spring-core-5.3.33.jar is present
- If not present: No further action is needed
- If present: Rebuild the job using your patched (2025-02) version or newer of Qlik Talend Studio to ensure only the updated JAR is included
- Restart Qlik Talend Studio to ensure all updates take effect
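The job-build check above can be scripted. A minimal sketch (the archive path in the usage comment is hypothetical) that lists any flagged spring-core JAR inside a built job's ZIP:

```python
import zipfile

FLAGGED = "spring-core-5.3.33.jar"

def find_flagged_jars(zip_path, flagged=FLAGGED):
    """Return archive entries whose file name is exactly the flagged JAR."""
    with zipfile.ZipFile(zip_path) as zf:
        return [n for n in zf.namelist() if n.rsplit("/", 1)[-1] == flagged]

# Usage (path is an example):
# hits = find_flagged_jars(r"C:\builds\my_job_0.1.zip")
# if hits:
#     print("Rebuild needed, flagged JAR found:", hits)
```

If the function returns an empty list for a build, no further action is needed for that job.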
Additional Notes
- Qlik Talend Studio may re-download JARs from a shared repository (such as Nexus) if configured. If you use a shared repository, ensure the old JAR is also removed or blocked there to prevent re-import.
- For projects using version control or shared resources, coordinate with your team to avoid accidental reintroduction of the old JAR.
Troubleshooting
- If the old JAR reappears after deletion, check if your project is configured to pull dependencies from a shared repository. Remove or block the flagged JAR in that repository as well.
- If you encounter errors after removing the JAR, confirm that no jobs or custom code still reference the old version.
Environment
- Qlik Talend Studio
-
Qlik Talend Administration Center login fails with error 500: MySQL server is ru...
Cannot log in to Qlik Talend Administration Center. Login fails with:
Error: 500, The call failed on the server; see server log for details
The log file reports:
ERROR SqlExceptionHelper - The MySQL server is running with the LOCK_WRITE_GROWTH option so it cannot execute this statement
Resolution
Expand cloud storage for MySQL or perform a data cleanup to free up the space required.
Cause
Cloud MySQL has reached the cloud storage quota during data writes.
Environment
- Qlik Talend Administration Center
-
Long term offline use for Qlik Sense and QlikView Signed Licenses
Using a Signed License Key with its Signed License Definition in a long term offline environment past the 90 days provided by Delayed Sync requires (besides license modification) additional configuration steps!
- No license has long term offline capability enabled by default.
- Long term offline capability needs to be specifically approved by Qlik and the conditions of offline use agreed to by the customer. See Request license off-line approval - April 2020 and onwards. Once approval has been obtained through the CSO, the license will be modified with an additional attribute OFFLINE;YES;;
- Additional configuration changes are necessary for long term offline mode to function
- (Delayed Sync requires a Signed License Definition, but does not require additional configuration steps)
Configuration changes needed:
These changes need to be made on all nodes running the Service Dispatcher, not only the Central node.
- Stop the ServiceDispatcher service
- As Administrator, open the Service Dispatcher services.conf file.
Default location for Qlik Sense on-prem: C:\Program Files\Qlik\Sense\ServiceDispatcher
Default location for QlikView: C:\Program Files\QlikView\ServiceDispatcher
- Locate [licenses.parameters] and add the parameter -offline
Example:
[licenses.parameters]
-qsefw-mode
-offline
-app-settings="..\Licenses\appsettings.json"
The displayed order is required in the February 2024 IR release. Previous and later versions (Patch 2 and later) do not have a fixed order requirement.
- Save the file
- Start the ServiceDispatcher service
Once the changes have been made, retrieve the updated SLD key from https://license.qlikcloud.com/sld and apply it to complete offline activation.
Note on upgrading: If using a version of Qlik Sense prior to November 2022, this file may be overwritten during an upgrade. Please be sure to re-apply this parameter and restart the Service Dispatcher on all nodes after an upgrade. With Qlik Sense November 2022 or later, custom service settings are by default kept during the upgrade. See Considerations about custom configurations.
Internal Investigation ID(s):
QB-25231
-
Qlik Replicate and Google Pub/Sub target endpoint: Managing Google Pub/Sub Subsc...
The permissions requirements for Google Pub/Sub as a target state:
pubsub.subscriptions.create on the containing Cloud project (only required if you want Replicate to create a default subscription per topic when no subscription exists).
See Prerequisites.
However, the Pub/Sub endpoint does not offer a parameter allowing the subscription name to be specified.
Resolution
The subscription must follow the naming convention Qlik Replicate expects: append the "-sub" suffix to the topic name. For example, if the topic name is "mytopic," the subscription name must be "mytopic-sub."
Cause
Qlik Replicate checks for subscriptions that follow a specific naming convention. For example, if the topic name is "mytopic," the subscription name must be "mytopic-sub." Qlik Replicate only recognizes subscriptions that adhere to this naming format.
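The naming rule can be captured in a one-line helper. This is an illustrative sketch, not Replicate code:

```python
def default_subscription(topic_name):
    """Replicate only recognizes subscriptions named '<topic>-sub'."""
    return topic_name + "-sub"

print(default_subscription("mytopic"))  # mytopic-sub
```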
Environment
- Qlik Replicate
-
Qlik Replicate task fails with error 0000-00-00 00:00:00 when source has type ti...
A Qlik Replicate task fails with error 0000-00-00 00:00:00 when the source column is set to type timestamp and the target is timestamp.
Example:
The Source may be Oracle, which has a column type of Date, while the target column type in Snowflake is timestamp (or the reverse).
Detailed error message:
]E: Failed (retcode -1) to execute statement: 'COPY INTO "********"."********" FROM @"SFAAP"."********"."ATTREP_IS_SFAAP_0bb05b1d_456c_44e3_8e29_c5b00018399a"/11/ files=('LOAD00000001.csv.gz')' [1022502] (ar_odbc_stmt.c:4888)
00023950: 2021-08-31T18:04:23 [TARGET_LOAD ]E: RetCode: SQL_ERROR SqlState: 22007 NativeError: 100035 Message: Timestamp '0000-00-00 00:00:00' is not recognized File '11/LOAD00000001.csv.gz', line 1, character 181 Row 1, column "********"["MAIL_DATE_DROP_DATE":15]
If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client. [1022502] (ar_odbc_stmt.c:4895)
Resolution
The bindDateAsBinary attribute must be removed or set on both endpoints.
bindDateAsBinary means the date column is captured in binary format instead of date format.
Cause
If the bindDateAsBinary attribute is not configured on both tasks, the logstream staging task and the replication task may not parse the record in the same way.
Environment
- Qlik Replicate
-
CreatorDoc is missing from QVD Data Lineage in QlikView
After upgrading to QlikView May 2024 SR1, the Data Lineage information of <CreatorDoc></CreatorDoc> is empty for all QVDs created using the Store script function.
Example:
STORE * FROM Transactions INTO C:\QlikDocuments\Dev\Transactions.qvd ;
Resolution
Workaround:
Specify the format (qvd) in the STORE command.
Example:
STORE * FROM Transactions INTO C:\QlikDocuments\Dev\Transactions.qvd (qvd);
Read more about the syntax in Store.
Fix Version:
Qlik aims to release a fix in the next possible service release.
Information provided on this defect is given as is at the time of documenting. For up-to-date information, please review the most recent Release Notes or contact support with the ID SUPPORT-4149 for reference.
Cause
Product Defect ID: SUPPORT-4149
Environment
- QlikView May 2024 SR1
-
Qlik Talend Studio: How To Create A New Branch from A Commit Before the Faulty M...
When a faulty migration update has occurred, you may want a new branch based on the last good commit id to start again.
This article briefly introduces how to create a new branch from a commit made before the faulty migration update.
How to
- Install Git Bash (if not already installed): https://git-scm.com/downloads
- Identify the correct commit to revert to
- Open the commit history of the affected file, for example:
http://abc.com/project/talend.project
- Look for the commit where the productVersion was updated, and find the SHA-1 hash of the last good commit immediately before the faulty one.
- Create a new branch from that commit and push it to the Git server.
Run the following commands in your terminal:
git clone http://abc.com/project.git
cd project
git checkout -b newBranchName <sha1-of-good-commit>
git push -u origin newBranchName
The steps above create and push a new branch starting from the commit just before the incorrect migration was introduced.
Environment
Qlik Data Gateway: Table preview fails with The driver used does not comply with...
The problem occurs when working with a generic ODBC connection (DSN) for Qlik Data Gateway Direct Access. While the connection to the database can be established, a table preview fails with:
Command fields returned non-success: The driver used does not comply with the general database column specification. Returned columns metadata does not contain COLUMN_SIZE, DECIMAL_DIGITS.
Resolution
The driver in use does not fulfill the requirements and must be updated.
Qlik support suggests contacting the driver provider and asking to update it in order to make it compatible with the Qlik Data Gateway requirements.
If possible, try a different, supported driver to connect to the data source.
Cause
The error message indicates the driver in use is based on ODBC 2.x. This is not compatible with Qlik Data Gateway, which only supports ODBC 3.x. See ODBC (via Direct Access gateway).
For additional information on the difference in metadata column names between ODBC 2.x and ODBC 3.x, see SQLProcedureColumns Function, specifically the table documenting the SQLProcedureColumns changes.
Internal Investigation ID(s)
SUPPORT-3489
Environment
- Qlik Cloud Analytics
- Qlik Data Gateway Direct Access
Qlik Replicate: How to rename the attrep_apply_exceptions Topic in a Kafka-Targe...
This article explains how to rename the default attrep_apply_exceptions Topic in a Kafka-targeted task within Qlik Replicate. By default, this Topic name includes underscores, which may not align with your naming conventions or Kafka topic standards. Renaming it can help maintain consistency and improve readability across your Kafka Topics.
Prerequisites
- Skills Required:
- Basic understanding of Qlik Replicate tasks and JSON configuration files.
- Familiarity with editing and importing JSON files.
- Tools/Software Needed:
- Qlik Replicate Console access.
- A text editor (e.g., Notepad++, VS Code).
- Access to the task JSON export file.
Step-by-Step Instructions
- Export the Task JSON
- Open Qlik Replicate.
- Navigate to the task you want to modify.
- Click on the More Options (⋮) menu and select Export JSON.
- Save the file to your local machine.
- Edit the JSON File
- Open the exported JSON file in a text editor.
- Locate the following section:
"exception_table_settings": {
"table_name": "dev.tm.attrep_apply_exceptions"
}
- Modify the table_name value to your desired topic name.
For example, to replace underscores with hyphens:
"exception_table_settings": {
"table_name": "dev.tm.attrep-apply-exceptions"
}
Tip: Ensure the new Topic name follows your Kafka naming conventions and does not conflict with existing Topics.
- Import the Modified Task
- Return to Qlik Replicate.
- Click Import Task and select the modified JSON file.
- Review the task settings to confirm the changes.
- Save and run the task to verify that the new topic name is used.
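The edit step can also be scripted and checked before import. The sketch below is illustrative only: it operates on the exception_table_settings fragment alone (a real exported task JSON contains many more settings), and the hyphen rename plus the Kafka name check reflect common Kafka conventions rather than documented Replicate behavior:

```python
import json
import re

# Kafka topic names may only use alphanumerics, '.', '_' and '-' (max 249 chars).
TOPIC_RE = re.compile(r"^[A-Za-z0-9._-]{1,249}$")

def rename_exception_topic(task, new_name):
    """Rewrite exception_table_settings.table_name in an exported task dict."""
    if not TOPIC_RE.match(new_name):
        raise ValueError(f"invalid Kafka topic name: {new_name!r}")
    task["exception_table_settings"]["table_name"] = new_name
    return task

# Only the relevant fragment of the exported task JSON is shown here.
task = json.loads('{"exception_table_settings": '
                  '{"table_name": "dev.tm.attrep_apply_exceptions"}}')
old_name = task["exception_table_settings"]["table_name"]
renamed = rename_exception_topic(task, old_name.replace("_", "-"))
print(json.dumps(renamed))
```

Round-tripping the file through json.loads/json.dumps also doubles as the JSON syntax validation suggested under Common Issues.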
Common Issues and Solutions
Issue: Task fails to import after editing the JSON.
Solution: Ensure the JSON syntax is valid. Use a JSON validator to check for formatting errors.
Issue: Kafka topic not created with the new name.
Solution: Confirm that the task was saved and executed after importing. Also, verify that the Kafka broker allows topic creation.
Issue: Topic name still shows underscores.
Solution: Double-check that the table_name value was correctly updated and saved in the JSON file before import.
Environment
- Qlik Replicate
Quick Guide: Installing and Configuring Teradata TTU on Linux for Qlik Replicate
This guide provides a step-by-step process for installing and configuring the TTU (Teradata Tools and Utilities) for use with the Qlik Replicate Teradata Target Endpoint. Note that Qlik Replicate should already be installed before proceeding.
In this article, TTU 20.00 is installed on RHEL 10. The steps are similar for TTU 17.20 on CentOS 8.x, RHEL 8.x/9.x, and other supported distributions.
For more information, see Setting up Qlik Replicate on Linux; for a comprehensive list of supported Linux platforms and prerequisites, refer to the Supported Linux Platforms section in the User Guide.
Installation Steps
- Access https://downloads.teradata.com/download/tools/teradata-tools-and-utilities-linux-installation-package-0
To access this download, you must log in.
- Download the Linux 64-bit TTU 20.00 (or 17.20) and upload it to your Linux Server (in folder "/kit")
TeradataToolsAndUtilitiesBase__linux_x8664.20.00.23.00-1.tar.gz
Optional: TTU 20.00 Linux - Installation Guide
- Uncompress the gzip file and extract the installation files:
gzip -d TeradataToolsAndUtilitiesBase__linux_x8664.20.00.23.00-1.tar.gz
tar -xf TeradataToolsAndUtilitiesBase__linux_x8664.20.00.23.00-1.tar
- Launch the installation:
cd /kit/TeradataToolsAndUtilitiesBase
./setup.sh
..................................................................
1. bteq - 20.00.00.07 - Teradata BTEQ Application 923 KB
2. fastexp - 20.00.00.02 - Teradata FastExport Utility 711 KB
3. fastld - 20.00.00.03 - Teradata FastLoad Utility 520 KB
4. jmsaxsmod - 20.00.00.02 - Teradata JMS Access Module 107 KB
5. mload - 20.00.00.02 - Teradata MultiLoad Utility 774 KB
6. mqaxsmod - 20.00.00.01 - WebSphere(r) Access Module for Teradata 263 KB
7. npaxsmod - 20.00.00.02 - Teradata Named Pipes Access Module 93 KB
8. s3axsmod - 20.00.00.08 - Teradata Access Module for S3 20056 KB
9. gcsaxsmod - 20.00.00.07 - Teradata Access Module for Google Cloud Storage 16996 KB
10. azureaxsmod - 20.00.00.05 - Teradata Access Module for Azure 20341 KB
11. kafkaaxsmod - 20.00.00.04 - Teradata Kafka Access Module 16031 KB
12. sqlpp - 20.00.00.01 - Teradata C Preprocessor 657 KB
13. tdodbc - 20.00.00.22 - Teradata ODBC Driver 96228 KB
14. tdwallet - 20.00.00.25 - Teradata Wallet Utility 18375 KB
15. tptbase - 20.00.00.15 - Teradata Parallel Transporter Base 75502 KB
16. tpump - 20.00.00.03 - Teradata TPump Utility 932 KB
Enter one or more selections (separated by space): 1 3 5 13 15 16
While Qlik Replicate does not require "1. bteq", it is strongly recommended for troubleshooting connectivity issues.
- Merge the lines from /opt/teradata/client/ODBC_64/odbcinst.ini into /etc/odbcinst.ini:
[Teradata Database ODBC Driver 20.00]
Description=Teradata Database ODBC Driver 20.00
Driver=/opt/teradata/client/20.00/odbc_64/lib/tdataodbc_sb64.so
# Note: Currently, Data Direct Driver Manager does not support Connection Pooling feature.
CPTimeout=60
The default .so file folder is /opt/teradata/client/20.00/odbc_64/lib
- (Optional) Add the internal parameter "provider" and set its value to "Teradata Database ODBC Driver 20.00".
Verify the DSN name in step five.
- Edit the file /opt/attunity/replicate/bin/site_arep_login.sh and add /opt/teradata/client/20.00/odbc_64/lib to the LD_LIBRARY_PATH variable.
For example:
export LD_LIBRARY_PATH=/usr/lib64:/opt/attunity/replicate/lib:/opt/teradata/client/20.00/odbc_64/lib:/usr/lib/oracle/23/client64/lib
- (Optional) Edit /etc/hosts to add COP aliases. Make sure that the database name added to the hosts file is the same as the database specified in the Default database field in the Teradata Database target database settings.
For example:
192.168.33.200 targetdbcop1
Where 192.168.33.200 is the IP address of the Teradata 20.00 server and targetdb is the database name specified in the Teradata target endpoint.
- Restart the Qlik Replicate service:
systemctl restart areplicate
- Access the Qlik Replicate console using the following URL:
https://192.168.33.100:3552/attunityreplicate
Where 192.168.33.100 is the Qlik Replicate Linux Server IP Address.
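The odbcinst.ini merge in step four can be sketched with the standard library. The snippet below is a sketch that works on in-memory strings standing in for /opt/teradata/client/ODBC_64/odbcinst.ini and /etc/odbcinst.ini; the system_ini content is a made-up example, not a real file:

```python
import configparser

def merge_driver_section(src_text, dst_text, section):
    """Copy one driver section from the Teradata odbcinst.ini into the system file."""
    src = configparser.ConfigParser()
    src.read_string(src_text)
    dst = configparser.ConfigParser()
    dst.read_string(dst_text)
    if not dst.has_section(section):
        dst.add_section(section)
    for key, value in src.items(section):  # copy every key of the driver entry
        dst.set(section, key, value)
    return dst

teradata_ini = """\
[Teradata Database ODBC Driver 20.00]
Description=Teradata Database ODBC Driver 20.00
Driver=/opt/teradata/client/20.00/odbc_64/lib/tdataodbc_sb64.so
CPTimeout=60
"""

# Stand-in for an existing /etc/odbcinst.ini with unrelated entries.
system_ini = "[ODBC Drivers]\nPostgreSQL=Installed\n"

merged = merge_driver_section(teradata_ini, system_ini,
                              "Teradata Database ODBC Driver 20.00")
print(merged.get("Teradata Database ODBC Driver 20.00", "Driver"))
```

Be aware that configparser lowercases option names when writing a file back, so a plain copy-and-paste of the section (as the step describes) keeps the exact casing and remains the safer manual route.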
Teradata client and server connectivity test
To verify the connectivity between the Teradata TTU client and server, you can use bteq and isql by running the following commands:
sudo su - attunity
cd /opt/attunity/replicate/bin
source arep_login.sh
bteq
logon 192.168.33.200/dbc;
Password: dbc
select * from targetdb.kit;
Important Notes
- No need to create symbolic links for the ODBC driver libraries.
- The AREP_ODBC_DRIVER_MANAGER environment variable is not required when using TTU 17.20 or later.
- There is no need to set the ODBCINI or ODBCINSTINI environment variables — Qlik Replicate uses the default configuration files located under /etc.
Related Content
Quick Guide: Installing and Configuring Oracle 21c Instant Client on Linux for Qlik Replicate
Environment
- Qlik Replicate All versions
- TTU (Teradata Tools and Utilities) versions 17.x, 20.00
Qlik Stitch: How to add single user to multiple accounts
Sometimes, you may encounter situations where utilizing a single user for multiple accounts becomes necessary. For instance, suppose you possess two Stitch accounts—one designated for staging and another for production purposes. If you wish to employ the same user for both accounts, you can leverage the ‘+’ functionality if your email provider supports it.
If the email address 'stitch@stitchdata.com' is used for creating the first Stitch account designated for staging, you may use ‘+’ to add this team member in the second Stitch account intended for production, as follows: 'stitch+prod@stitchdata.com'.
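The aliasing rule can be expressed as a small helper. This is a minimal sketch; plus-addressing support varies by email provider, so treat the output as a candidate address rather than a guarantee:

```python
def alias(email, tag):
    """Build a '+tag' sub-address for providers that support plus-addressing."""
    local, sep, domain = email.partition("@")
    if not sep or not local or not domain:
        raise ValueError(f"not an email address: {email!r}")
    return f"{local}+{tag}@{domain}"

print(alias("stitch@stitchdata.com", "prod"))  # stitch+prod@stitchdata.com
```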
You are advised to invite the users to the new account and subsequently deactivate the old one. To accomplish this, you may follow the procedure detailed below. Given that an email address can only be associated with a single Stitch account, try this workaround to use the same email address for multiple accounts.
If you prefer utilizing an un-aliased email address for a specific account, and said email is already associated to a Stitch account, follow the procedure detailed below to modify the account with the un-aliased email. This will subsequently enable the sending of an invitation to the un-aliased email address.
For illustrative purposes, we will employ the email address stitch@stitchdata.com in this example.
- Sign into the Stitch account using the un-aliased email. In this example, we’d sign into the account associated with the stitch@stitchdata.com email address.
- Click User menu (your icon) > Manage Account Settings.
- Click the Your Profile tab.
- Update the email address to something along the lines of name+deactivated@domain.com. In this example, we’ll update the email address to stitch+deactivated@stitchdata.com.
- Click Update Profile.
Environment
- Qlik Stitch
Qlik Stitch: Shopify "ServerError"
When running extractions in Shopify that request a large quantity of data, you may encounter a ServerError.
This is notably seen during initial/historical data extractions.
Cause
Too much data is being requested from Shopify, and Shopify is unable to process the large request via its API, which results in a ServerError.
Resolution
When this error arises, Stitch Subject Matter Experts have the capability to adjust a tuning parameter labeled "Results Per Page". If you encounter this error, it is advisable to consult with a Stitch Subject Matter Expert to establish an appropriate value for this parameter and thereby resolve the issue.
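The effect of a smaller "Results Per Page" value can be illustrated generically. The sketch below is not Stitch code (the real parameter can only be adjusted by Stitch Subject Matter Experts); it just shows why reducing the page size lets an extraction finish when oversized requests error out. fake_fetch is a made-up stand-in for the Shopify API:

```python
def fetch_all(fetch_page, page_size, min_size=25):
    """Pull every record, halving the page size whenever the server errors out.

    fetch_page(offset, limit) -> list of records, or raises RuntimeError
    (standing in for Shopify's ServerError on oversized requests).
    """
    records, offset = [], 0
    while True:
        try:
            page = fetch_page(offset, page_size)
        except RuntimeError:
            if page_size <= min_size:
                raise
            page_size //= 2  # request less data per call and retry
            continue
        if not page:
            return records
        records.extend(page)
        offset += len(page)

# Fake API that errors whenever more than 50 rows are requested at once.
DATA = list(range(200))
def fake_fetch(offset, limit):
    if limit > 50:
        raise RuntimeError("ServerError")
    return DATA[offset:offset + limit]

print(len(fetch_all(fake_fetch, page_size=250)))  # 200
```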
Environment
- Qlik Stitch
Qlik Replicate Error: [SQL Server]Invalid parameter passed to OpenRowset(DBLog, ...
When running a Replicate CDC task replicating from a MS-SQL source endpoint, you may get an error like the following:
[SOURCE_CAPTURE ]E: SqlStat: 42000 NativeError:9005 [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Invalid parameter passed to OpenRowset(DBLog, ...). (PcbMsg: 105) [1020417] (sqlserver_log_processor.c:4310)
Environment
- Qlik Replicate with CDC task running on MS-SQL source endpoint
Resolution
There are two possible root causes and two respective solutions.
When working with a MS-SQL source endpoint, COMMIT events usually show up with a LCX_NULL context. Here the event comes with a context that is not yet recognized, so Replicate produces an error. An easy and immediate option to resolve the error is to ignore the validity of the CONTEXT value in case of a COMMIT, assuming that the MS-SQL storage engine knows its job better than Replicate.
Solution:
Edit the MS-SQL source endpoint setting under Replicate GUI:
- Go to Manage Endpoint Connections
- Open your MS-SQL source endpoint definition
- Open the Advanced tab
- Go to Internal Parameters
- Add the internal parameter: ignoreTxnCtxValidityCheck and set it to True
When Qlik Replicate reads from the online log, it uses the fn_dblog function; the first parameter passed to the query is the last LSN that Qlik Replicate processed. This error will occur if there is no access to the backups and the log has been truncated.
Solution:
Make sure that the TLOG backups are available to Qlik Replicate.
Qlik Talend Product: How to Manually Clean Up Jobs Deployed in Talend Runtime
This article briefly introduces how to manually clean up jobs (data services, routes, etc.) that were deployed on a Talend Runtime Server, and how to clean up the files in the Talend Remote Engine if required.
Talend Runtime Server
- From the Talend Runtime Server, run the following karaf commands (where 'serviceName' is the name of the job):
feature:repo-list |grep -i serviceName
feature:uninstall serviceName-feature
feature:repo-remove file:/xxx/serviceName-feature-x.x.x.xml
feature:refresh
- Stop Talend Runtime Server
- Delete the left-over artifacts in "<Talend Runtime>\data\kar\"
- Start Talend Runtime Server
Talend Remote Engine
- Stop Talend Remote Engine
- Delete the left-over artifacts in "<Talend Remote Engine>\data\dsrunner\exec\osgi\"
- Start Talend Remote Engine
Related Content
If you are looking for Talend Remote Engine clean-up schedule information, please refer to
understanding-the-talend-remote-engine-clean-up-cycle
Environment
Qlik Sense WebSocket Connectivity Tester
Qlik Sense uses HTTP, HTTPS, and WebSockets to transfer information to and from Qlik Sense.
The attached WebSocket Connectivity tester can be used to verify protocol compliance, indicating if a network policy, firewall, or other perimeter device is blocking any of the required connections.
If the tests return as unsuccessful, please engage your network team.
The QlikSenseWebsocketConnectivityTester is not an officially supported application and is provided as is. It is intended to assist in troubleshooting, but further investigation of an unsuccessful test will require your network team's involvement. To run this tool, the Qlik Sense server must have a working internet connection.
Environment:
Qlik Sense Enterprise on Windows
For Qlik Sense Enterprise on Windows November 2024 and later:
Since the introduction of extended WebSocket CSRF protection, using the WebSocket Connectivity tester on any version later than November 2024 requires a temporary configuration change.
- Open the Proxy.exe.config stored in C:\Program Files\Qlik\Sense\Proxy\
- Locate
<add key="WebSocketCSWSHCheckEnabled" value="true"/>
- Change it to
<add key="WebSocketCSWSHCheckEnabled" value="false"/>
- Restart the proxy
- Run the WebSocket Connectivity tester
- Revert the change
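The temporary flip of WebSocketCSWSHCheckEnabled can be scripted with the standard library. The sketch below works on a minimal stand-in for the appSettings fragment, not the full Proxy.exe.config, and assumes the key sits in an <add key=... value=.../> entry as shown in the steps above:

```python
import xml.etree.ElementTree as ET

def set_app_setting(config_xml, key, value):
    """Flip an <add key=... value=.../> entry in a .NET-style .config document."""
    root = ET.fromstring(config_xml)
    for add in root.iter("add"):
        if add.get("key") == key:
            add.set("value", value)
            return ET.tostring(root, encoding="unicode")
    raise KeyError(f"no <add key={key!r}> entry found")

# Minimal stand-in for the relevant fragment of Proxy.exe.config.
sample = """<configuration>
  <appSettings>
    <add key="WebSocketCSWSHCheckEnabled" value="true"/>
  </appSettings>
</configuration>"""

patched = set_app_setting(sample, "WebSocketCSWSHCheckEnabled", "false")
print(patched)
```

Remember to restart the proxy after the change and to revert it once testing is done.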
How to use the Websocket Connectivity Tester
- Download the attached package or download the package from GitHub: https://github.com/flautrup/QlikSenseWebsocketConnectivityTester
- Unzip the file
- Login to the Qlik Sense Management Console
- Create a Content Library
- Go to Content libraries
- Click Create New
- Name the Content Library: WebSocketTester
- Verify and modify the Security Rule
Note that in our example, any user is allowed to access this WebSocketTester.
- Click Apply
- Click Contents in the Associated items menu
- Click Upload
- Upload the QlikSenseWebsocketTest.html file.
- Copy the correct URL from the URL path
- Open a web browser (from any machine from which you wish to test the WebSocket connection).
- Paste the URL path, prefixing it with https and the fully qualified domain name.
Example: https://qlikserver3.domain.local/content/WebSocketTester/QlikSenseWebsocketTest.html
What to do if any of them fail?
Verify that WebSocket is enabled in the network infrastructure, such as firewalls, browsers, reverse proxies, etc.
See the article below under Related Content for additional steps.
Related Content
Qlik Talend Product: Error Occurs on Talend Runtime Server When deploying A ESB ...
Errors like the following occur when creating a job in Talend Studio (version R2025-01 or earlier) and running it on a Talend Runtime Server that uses Camel 3.
"org.talend.daikon.exception.TalendRuntimeException:UNEXPECTED_EXCEPTION"
Resolution
Since Camel 3 is out of support, it is best to first move up to Camel 4 and see if the issue is resolved.
In order to do so, please upgrade to the following versions:
- Talend Studio - R2025-02 or later
- Talend Runtime - R2025-02 or later
- Jobserver - R2025-01 or later
If you have built ESB route jobs using Talend Studio R2025-01 or previous versions which use Camel 3, please rebuild them in Talend Studio R2025-02 or later versions which use Camel 4.
Cause
Camel 3 reached end of life (EOL) in December 2024, and missing dependencies in Camel 3 may have caused these issues.
Related Content
For further information regarding Camel 4, please check the following URLs:
- Qlik Talend and your Move to Camel 4: What you need to know
- Upgrading Talend Studio Jobs to Java 17 and Camel 4: Your Comprehensive Guide
- https://help.qlik.com/talend/en-US/release-notes/8.0/r2025-02-studio-new-features
Environment
Qlik Talend CIDI Job Builder is Failing with error " Failed to execute goal org....
When building a job using Jenkins, it fails with the following error.
[ERROR] Failed to execute goal org.talend.ci:builder-maven-plugin:8.0.13:generateAllPoms (default-cli) on project standalone-pom: Transfer https://update.talend.com/Studio/8/updates/R2025-04/plugins/com.ibm.icu_67.1.0.v20200706-1749.jar failed -> [Help 1]
Resolution
- Check your configurations in Jenkins
If there is a p2.base option, please remove it from the TALEND_CI_RUN_CONFIG section and restart the build.
- Host the update repository and add the missing dependency manually
- Please host the update repository, by following the instructions here:
Setting up an update repository by hosting it - Install the missing dependency like below.
https://update.talend.com/Studio/8/base/plugins/com.ibm.icu_67.1.0.v20200706-1749.jar
In this example, it is the plugin "com.ibm.icu_67.1.0.v20200706-1749.jar" that is found in base. - Manually add the missing dependency to the Talend Studio update repository, under the following directory:
<Talend Studio update repository>/studio/plugins/ - Set the update repository that you have hosted as a Talend CommandLine parameter:
For example: -Dtalend.studio.p2.update=D:/Talend/v80/StudioR202504
- Please host the update repository, by following the instructions here:
Cause
The error message indicates that the com.ibm.icu_67.1.0.v20200706-1749.jar file is missing from the following URL, which causes this issue:
https://update.talend.com/Studio/8/updates/R2025-04/plugins/com.ibm.icu_67.1.0.v20200706-1749.jar
In Talend Studio R2024-04 or lower, the following CI Builder parameter was still supported:
-Dtalend.studio.p2.base (Only supported for users with Talend Studio 8.0.1 R2024-04 or lower)
However, in Talend Studio R2024-05 or later, the base option has been merged to the update repository.
Related Content
For further details on configurations, please check here:
CI builder-related Maven parameters
Environment