Featured Content
-
How to contact Qlik Support
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
- Support and Professional Services; who to contact when.
- Qlik Support: How to access the support you need
- 1. Qlik Community, Forums & Knowledge Base
- The Knowledge Base
- Blogs
- Our Support programs:
- The Qlik Forums
- Ideation
- How to create a Qlik ID
- 2. Chat
- 3. Qlik Support Case Portal
- Escalate a Support Case
- Phone Numbers
- Resources
Support and Professional Services; who to contact when.
We're happy to help! Here's a breakdown of resources for each type of need.
Support
Reactively fixes technical issues as well as answers narrowly defined specific questions. Handles administrative issues to keep the product up-to-date and functioning.
- Error messages
- Task crashes
- Latency issues (due to errors or 1-1 mode)
- Performance degradation without config changes
- Specific questions
- Licensing requests
- Bug Report / Hotfixes
- Not functioning as designed or documented
- Software regression
Professional Services (*)
Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
- Deployment Implementation
- Setting up new endpoints
- Performance Tuning
- Architecture design or optimization
- Automation
- Customization
- Environment Migration
- Health Check
- New functionality walkthrough
- Realtime upgrade assistance
(*) reach out to your Account Manager or Customer Success Manager
Qlik Support: How to access the support you need
1. Qlik Community, Forums & Knowledge Base
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
The Knowledge Base
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
- Go to the Official Support Articles Knowledge base
- Type your question into our Search Engine
- Need more filters?
- Filter by Product
- Or switch tabs to browse content in the global community, on our Help Site, or even on our YouTube channel
Blogs
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about product and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Our Support programs:
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free webinar to facilitate knowledge sharing, held on a monthly basis.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
The Qlik Forums
- Quick, convenient, 24/7 availability
- Monitored by Qlik Experts
- New releases publicly announced within Qlik Community forums
- Local language groups available
Ideation
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
How to create a Qlik ID
Get the full value of the community.
Register a Qlik ID:
- Go to register.myqlik.qlik.com
If you already have an account, please see How To Reset The Password of a Qlik Account for help using your existing account.
- You must enter your company name exactly as it appears on your license, or there will be significant delays in getting access.
- You will receive a system-generated email with an activation link for your new account. NOTE, this link will expire after 24 hours.
If you need additional details, see: Additional guidance on registering for a Qlik account
If you encounter problems with your Qlik ID, contact us through Live Chat!
2. Chat
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
- Answer common questions instantly through our chatbot
- Have a live agent troubleshoot in real time
- For items that require further investigation, we will create a case on your behalf with step-by-step intake questions.
3. Qlik Support Case Portal
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
Your advantages:
- Self-service access to all incidents so that you can track progress
- Option to upload documentation and troubleshooting files
- Option to include additional stakeholders and watchers to view active cases
- Follow-up conversations
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Problem Type
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
Priority
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical to daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
Severity
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
Escalate a Support Case
If you require a support case escalation, you have two options:
- Request to escalate within the case, mentioning the business reasons.
To escalate a support incident successfully, mention your intention to escalate in the open support case. This will begin the escalation process.
- Contact your Regional Support Manager
If more attention is required, contact your regional support manager. You can find a full list of regional support managers in the How to escalate a support case article.
Phone Numbers
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
- Qlik Data Analytics: 1-877-754-5843
- Qlik Data Integration: 1-781-730-4060
- Talend AMER Region: 1-800-810-3065
- Talend UK Region: 44-800-098-8473
- Talend APAC Region: 65-800-492-2269
Resources
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Recent Documents
-
How to install Qlik Replicate on a docker image
This article provides a guide on installing Qlik Replicate on a docker image.
Overview
- (Optional) Remove older versions of Docker
- Install Docker on the host Linux Server
- FTP upload the rpm file to the Linux Server folder
- Follow the Installing Qlik Replicate on docker steps
- Prepare the docker build files
- Modify the Dockerfile
- Build the docker image
- Startup and run the docker image
Steps
- (Optional) Remove older versions of Docker
sudo rpm -qa|grep docker
sudo yum remove docker docker-client docker-client-latest docker-common docker-latest docker-latest-logrotate docker-logrotate docker-engine
- Install Docker on host Linux Server
sudo yum install -y yum-utils device-mapper-persistent-data lvm2
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
sudo yum install --allowerasing docker-ce docker-ce-cli containerd.io
sudo systemctl start docker
sudo systemctl enable docker
- FTP upload the Qlik Replicate rpm installation file to the Linux Server folder "/kit". The /kit directory is a temporary folder used to store installation files. In the example below, the RPM file is areplicate-2024.5.0-357.x86_64.rpm. Once the installation is complete, the entire /kit folder can be safely deleted.
- Follow the Installing Qlik Replicate on Docker steps
mkdir -p /kit/ar_docker
cd /kit
rpm2cpio areplicate-2024.5.0-357.x86_64.rpm | cpio -iv --make-directories --no-absolute-filenames -D ar_docker/ ./opt/attunity/replicate/addons/samples/docker/*
mv ./ar_docker/opt/attunity/replicate/addons/samples/docker/* ./ar_docker
rm -rf ./ar_docker/opt
- Prepare the docker build files
cd /kit/ar_docker
cp ../areplicate-2024.5.0-357.x86_64.rpm .
./create-dockerfile.sh
- Change the following line in the file /kit/ar_docker/Dockerfile from:
RUN yum -y install /tmp/areplicate-*.rpm
to:
RUN systemd=no yum -y install /tmp/areplicate-*.rpm
NOTE!
The parameter systemd=no is used to solve the following error you may hit during the docker build stage: This rpm is not supported on your system (no systemctl), exiting. error: %prein(areplicate-2024.5.0-357.x86_64) scriptlet failed, exit status 43
NOTE!
The password should not be empty and must be strong enough.
WARNING!
If you want to install ODBC drivers, please make sure the corresponding ODBC driver rpm files are ready in this folder. If you want to skip the ODBC driver installation for now, rename or delete the file "drivers" in this folder.
- Build the docker image
docker build --no-cache -t johnw/replicate:2024.5 .
where johnw/replicate:2024.5 is the image tag.
Do not forget the last period "."
- Startup and Run the docker image
docker run -d --name ar --hostname cdc2 -e ReplicateRestPort=3552 -p 3552:3552 -v /dockermount/data/replicate/data:/replicate/data johnw/replicate:2024.5
Now Qlik Replicate is running in the Docker container and can be accessed from the Qlik Replicate Console GUI.
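To confirm the container came up cleanly, a quick check can help; this is a minimal sketch, where the container name ar and port 3552 come from the run command above, and the final https probe assumes the console answers on the mapped port:
# Confirm the "ar" container is up
docker ps --filter name=ar
# Review the Replicate startup log for errors
docker logs --tail 50 ar
# Optionally confirm the console port responds (self-signed certificate, hence -k)
curl -kI https://localhost:3552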
Environment
- Qlik Replicate 2024.5 SP03 and above builds
- Qlik Replicate 2023.11 SP05 and above builds
- Linux 8.x and 9.x
-
Qlik Cloud Analytics: Can you unpublish community sheets with a Qlik Automate bl...
Can Qlik Cloud Analytics Community Sheets be unpublished using a Qlik Automate block or the Qlik REST API?
There is no unpublish sheet block in Qlik Automate, and a simple unpublish operation is not available in the Qlik REST API.
To note:
- Duplicating a published sheet will create a private copy for the user who has unpublished it. It will only be visible to them.
- The sheet will need to be unpublished first, and ownership then needs to be transferred back to the original user. (Example: Set owner on an app object. The user must be the owner of the object)
- The published sheet behaviour in Qlik Sense is such that when published, it cannot be modified. You first need to unpublish the sheet before deleting.
- If the status or ownership of the sheet has not changed as expected: a new session will be created when the existing session is closed, either through inactivity or by the browser tab being explicitly closed (after around 120 seconds).
Environment
- Qlik Cloud Analytics
-
Qlik Replicate: Errors Due to Unsupported SQL Server Versions
Starting from Qlik Replicate versions 2024.5 and 2024.11, Microsoft SQL Server 2012 and 2014 are no longer supported. Supported SQL Server versions include 2016, 2017, 2019, and 2022. For up-to-date information, see Support Source Endpoints for your respective version.
Attempting to connect to unsupported versions, both on-premise and cloud, can result in various errors.
Examples of reported Errors:
- Qlik Replicate 2024.5 accessing Microsoft Azure SQL Server 2014
[SOURCE_CAPTURE ]W: Table 'dbo'.'tableName' has encrypted column(s), but the 'Capture data from Always Encrypted database' option is disabled. The table will be suspended (sqlserver_endpoint_capture.c:157)
- Qlik Replicate 2024.11 accessing Microsoft SQL Server 2014
[SOURCE_UNLOAD ]E: RetCode: SQL_ERROR SqlState: 42S02 NativeError: 208 Message: [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Invalid object name 'sys.column_encryption_keys'. Line: 1 Column: -1 [1022502] (ar_odbc_stmt.c:4067)
Cause
The system view sys.column_encryption_keys is only available starting from SQL Server 2016. Attempting to query this view on earlier versions results in errors.
Reference: sys.column_encryption_keys (Microsoft Docs)
Resolution
Upgrade your SQL Server instances to a supported version (2016 or later) to ensure compatibility with Qlik Replicate 2024.5 and above.
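Before planning the upgrade, you can confirm which version a source endpoint is actually running; a minimal sketch using the SQL Server command-line tools, with the server name and credentials as placeholders:
# Print the SQL Server version string; anything older than SQL Server 2016 is unsupported
sqlcmd -S <server_name> -U <user> -P <password> -Q "SELECT @@VERSION;"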
Internal Investigation ID(s)
00375940, 00376089
Environment
- Qlik Replicate versions 2024.5, 2024.11, and higher
- Microsoft SQL Server 2014, 2012, and lower (unsupported)
-
Qlik Replicate: Transforming Primary Key UPDATEs into DELETE + INSERT Operations
By default, Qlik Replicate translates an UPDATE operation on the source into an UPDATE on the target. However, in some scenarios, especially when a primary key column is updated, you may want to capture this change as a DELETE followed by an INSERT.
This behavior can be enabled in Qlik Replicate through a task setting called "DELETE and INSERT when updating a primary key column." For more details, refer to the Qlik Replicate User Guide: Miscellaneous tuning.
Explanation of the "DELETE and INSERT when updating a primary key column" Option
Consider the following Oracle source table example, where ID is a primary key and name is a non-primary key column:
- Non-PK columns update: update kit set name='test-upd' where id=44;
This will be replicated as a standard UPDATE on the target.
- PK columns update: update kit set id=4 where id=44;
With the above option enabled, this will be replicated as a DELETE followed by an INSERT.
Applicable Target Types
This behavior is supported for the following types of targets:
- File targets with Store Changes mode enabled (Apply Changes mode is not supported):
Includes targets such as local files, Amazon S3, Azure Data Lake Storage (ADLS), Google Cloud Storage (GCS), etc.
- RDBMS targets with Store Changes mode enabled:
Supported databases include Oracle, MySQL, SQL Server, IBM Db2, and others.
- Streaming targets with Apply Changes mode enabled (Store Changes mode is not supported):
Includes Apache Kafka, Amazon MSK, Amazon Kinesis Data Streams, Azure Event Hubs, Confluent Cloud, Google Cloud Pub/Sub, etc.
Important Notes
- When the "DELETE and INSERT when updating a primary key column" option is enabled, Qlik Replicate will automatically add all columns to Oracle supplemental logging to capture the complete row data.
- Even though the operation is internally transformed into a DELETE and an INSERT, the Qlik Replicate UI will still display it as an UPDATE under the Apply Changes view.
Internal Investigation ID(s)
00350953
Environment
- Qlik Replicate, all versions
-
Qlik Talend Data Integration: Cannot invoke "java.util.jar.Manifest.getMainAttri...
When attempting to execute a Data Integration (DI) Job, the following error message is encountered:
Execution failed :Cannot invoke "java.util.jar.Manifest.getMainAttributes()" because "manifest" is null
The detailed log can be found from workspace\.metadata\.log:
!MESSAGE 2025-05-06 19:18:06,261 ERROR org.talend.commons.exception.CommonExceptionHandler - Cannot invoke "java.util.jar.Manifest.getMainAttributes()" because "manifest" is null
!STACK 0
java.lang.NullPointerException: Cannot invoke "java.util.jar.Manifest.getMainAttributes()" because "manifest" is null
at org.talend.designer.runprocess.java.JavaProcessor.compareSapjco3Version(JavaProcessor.java:1803)
at org.talend.designer.runprocess.java.JavaProcessor.appendLibPath(JavaProcessor.java:1653)
at org.talend.designer.runprocess.java.JavaProcessor.getNeededModulesJarStr(JavaProcessor.java:1599)
at org.talend.designer.runprocess.java.JavaProcessor.getLibsClasspath(JavaProcessor.java:1343)
at org.talend.designer.runprocess.java.JavaProcessor.getCommandLine(JavaProcessor.java:1243)
at org.talend.designer.core.runprocess.Processor.getCommandLine(Processor.java:304)
at org.talend.designer.core.runprocess.Processor.getCommandLine(Processor.java:294)
Cause
The error signifies that the JAR file used in the Job lacks the necessary manifest meta-information; specifically, the MANIFEST.MF file is absent from the JAR.
Resolution
Navigate to the <Studio_installation_Home>\configuration\.m2 folder, locate the JAR file (specifically 'sapjco3.jar' in this instance), and unzip it to confirm the presence of the MANIFEST.MF file. If the MANIFEST.MF file is absent, it indicates an issue with the JAR file. Proceed to delete the directory containing this JAR file and reinstall the correct JAR file within Talend Studio. For further details, refer to Installing external modules to Talend Studio.
Environment
-
Available Space options are missing when uploading dataset in Qlik Cloud Analyti...
Uploading a file to a dataset using the Qlik Cloud Analytics Catalog only lists the Personal Space. No other Spaces are available.
This persists even if the user has all the correct Space permissions or owns the missing Shared Space.
Example:
- Log in to Qlik Cloud Analytics
- Switch to the Catalog
- Click the Create new dropdown
- Click Dataset
- Click Upload data file
- Only the Personal (B) space is available. Clicking the dropdown (A) shows No options
Resolution
- Remove the Personal Space from the list by clicking the X.
- The error "A space is required. Create a space." will be displayed. Ignore the error.
- Click the drop-down arrow. All available spaces will be listed.
Environment
- Qlik Cloud Analytics
-
Qlik Sense Enterprise on Windows: Cannot change Date Format in Data Manager
The feature allowing the Date format to be changed from the Data Manager is inaccessible from the Qlik Sense Enterprise on Windows November 2024 IR release to the SR 10 release.
Resolution
Upgrade to the latest version of Qlik Sense Enterprise on Windows.
Workaround:
If an upgrade is not possible, change the Date Format from the Data Load Editor.
Example:
Date(Date#(YearMonth, 'YYYYMM'), 'YYYY/MM') AS YearMonthFormatChanged
Fix Version:
Qlik Sense Enterprise on Windows November 2024 SR 11 and higher.
Cause
Product Defect ID: SUPPORT-1846
Environment
- Qlik Sense Enterprise on Windows November 2024 IR to SR 10
-
Release Notes Qlik Sense PostgreSQL installer version 1.2.0 to 2.0.0
The following release notes cover the Qlik PostgreSQL installer (QPI) version 1.2.0 to 2.0.0.
Content
- What's New
- 2.0.0 May 2025 Release Notes
- Known Limitations (2.0.0)
- 1.4.0 December 2023 Release Notes
- Known Limitations (1.4.0)
- 1.3.0 May 2023 Release Notes
- Known Limitations (1.3.0)
- 1.2.0 Release Notes
What's New
- The PostgreSQL version used by QPI has been updated to 14.17
- An upgrade of an already upgraded external instance is now possible
- Support for an upgrade of the embedded 14.8 database was added
- Silent installs and upgrades are supported beginning with Qlik PostgreSQL Installer 1.4.0. QPI can now be used with silent commands to install or upgrade the PostgreSQL database. (SHEND-973)
2.0.0 May 2025 Release Notes
Improvement / Defect Details
SHEND-2273: Upgrade of Qlik Sense embedded PostgreSQL v 9.6, v 12.5, and v 14.8 databases to v 14.17. The PostgreSQL database is decoupled from Qlik Sense to become a standalone PostgreSQL database with its own installer and can be upgraded independently of the installed Qlik Sense version.
- Upgrade of already decoupled standalone PostgreSQL databases versions 9.6, 12.5, and 14.8 to version 14.17.
QCB-28706: Upgraded PostgreSQL version to 14.17 to address the pg_dump vulnerability (CVE-2024-7348).
SUPPORT-335: Upgraded PostgreSQL version to 14.17 to address the libcurl vulnerability (CVE-2024-7264).
QB-24990: Fixed an issue with upgrades of PostgreSQL if Qlik Sense was installed in a custom directory, such as D:\Sense.
Known Limitations (2.0.0)
- Rollback is not supported.
- The database size is not checked against free disk space before a backup is taken.
- Windows Server 2012 R2 does not support PostgreSQL 14.8. QPI cannot be used on Windows Server 2012 R2.
- QPI will only upgrade a PostgreSQL instance that contains only Qlik Sense databases (QSR, SenseServices, QSMQ, Licenses, QLogs, etc.)
1.4.0 December 2023 Release Notes
Improvement / Defect Details
SHEND-1359, QB-15164: Add support for encoding special characters for the Postgres password in QPI. If the super user password contained certain special characters, QPI did not allow upgrading PostgreSQL using this password. The workaround was to set a different password, use QPI to upgrade the PostgreSQL database, and then reset the password after the upgrade. This workaround is no longer required with QPI 1.4.0, as 1.4.0 supports encoded passwords.
SHEND-1408: Qlik Sense services were not started again by QPI after the upgrade. QPI failed to restart Qlik services after upgrading the PostgreSQL database. This has now been fixed.
SHEND-1511: Upgrade not working from a 9.6 database. In QPI 1.3.0, the upgrade from PostgreSQL 9.6 to 14.8 was failing. This issue is fixed in QPI 1.4.0.
QB-21082: Upgrade from May 23 Patch 3 to August 23 RC3 fails when QPI is used before attempting the upgrade.
QB-20581: May 2023 installer breaks QRS if QPI was used with a patch before. Using QPI on a patched Qlik Sense version caused issues in the earlier version. This is now supported.
Known Limitations (1.4.0)
- QB-24990: Cannot upgrade PostgreSQL if Qlik Sense was installed in a custom directory such as D:\Sense. See Qlik PostgreSQL Installer (QPI): No supported existing Qlik Sense PostgreSQL database found.
- Rollback is not supported.
- Database size is not checked against free disk space before a backup is taken.
- QPI can only upgrade bundled PostgreSQL database listening on the default port 4432. Using QPI to upgrade a standalone database or a database previously unbundled with QPI is not supported.
- Cannot migrate a 14.8 embedded DB to standalone
- Windows Server 2012 R2 does not support PostgreSQL 14.8. QPI cannot be used on Windows Server 2012 R2.
1.3.0 May 2023 Release Notes
-
Qlik Sense Service Account requirements and how to change the account
This article outlines how to successfully change the service account running Qlik Sense.
Contents:
- Account Requirements: What the account needs access to.
- Prep work
- Changing Qlik Sense dependencies
- Change the service account
- External Dependencies
- Video Demonstration
- Related Content
Account Requirements: What the account needs access to.
- Certificates
- Access to the certificate(s) for the site
- Files and file shares
- Access to the installation path for Qlik Sense
- Access to %ProgramData%
- Access to C:\Program Files\Qlik
- Access to the Service Cluster share
- Access to external systems as data sources, e.g.
- Databases
- UNC shares to QVDs, CSVs, etc
Note: Many of the file-level permissions would ordinarily be inherited from membership in the Local Administrators group. For information on non-Administrative accounts running Qlik Sense Services, see Changing the user account type to run the Qlik Sense services on an existing site.
Prep work
Record the Share Path. Navigate in the Qlik Management Console (QMC) to Service Cluster and record the Root Folder.
Changing Qlik Sense dependencies
- Stop all Qlik Sense services
- Ensure permissions on the Program Files path (this should be provided by Local Administrator rights):
- Navigate to the installation path (default: C:\Program Files\Qlik)
- Select the Sense folder > Right Click > Properties > Security > Edit > Add
- Lookup the new service account
- Ensure that the account has Full control over this folder
- Ensure permissions on the %ProgramData% path (this should be provided by Local Administrator rights):
- Navigate to the installation path (default: C:\ProgramData\Qlik)
- Select the Sense folder > Right Click > Properties > Security > Edit > Add
- Lookup the new service account
- Ensure that the account has Full control over this folder
- Ensure access to the certificates used by Qlik Sense
- Start > MMC > File > Add/Remove Snap-In > Certificates > Computer Account > Finish
- Go into Certificates (Local Computer) > Personal > Certificates
- For the Qlik CA server certificate (under Certificates (Local Computer) > Personal > Certificates)
- Right Click on the Server Certificate > All Tasks > Manage Private Keys > Ensure that the new service account has control
- If using a third party certificate, do the same
- Start > MMC > File > Add/Remove Snap-In > Certificates > Computer Account > Finish
- Ensure access to the Service Cluster path used by Qlik Sense
- Start > Computer Management > Shared Folders > Shares > Select the Share path
- Right click on the Share Path > Properties > Share Permissions > Add the new service account to have full control
- Open Windows File Explorer and navigate to the folder (e.g. C:\Share) > Right click on the folder > Security > Edit > Add the new service account to have full control
- Ensure membership in the Local Groups that Qlik Sense requires (a command-line sketch follows this list):
- Start > Computer Management
- Navigate to Local Users and Groups > Local Groups
- Add the new service account as a member of:
- Administrators (if using this configuration option)
- Performance Monitor Users
- Qlik Sense Service Users
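If you prefer the command line, the same group memberships can be granted from an elevated PowerShell prompt; a minimal sketch, assuming the new service account is DOMAIN\svc_qlik (replace with your own account):
# Add the new service account to the local groups Qlik Sense requires
net localgroup "Qlik Sense Service Users" DOMAIN\svc_qlik /add
net localgroup "Performance Monitor Users" DOMAIN\svc_qlik /add
# Only if running the services with local administrator rights
net localgroup Administrators DOMAIN\svc_qlik /add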
Change the service account
- Swap the account for all Qlik Services except the Qlik Sense Repository Database Service.
- Open the Windows Services Console
- Locate the services
- One by one, open the Properties of each service and change the account over using the Windows services control panel
- Start all Qlik Sense Services
- Access the QMC to validate functionality, preferably as a previously configured RootAdmin
- Access the Data Connections section of the QMC
- Toggle the User ID field and change the data connections used by the License and Operations Monitor apps to use the new user ID and password:
- Add the RootAdmin role to the new service account*
- QMC > Users
- Filter on the new UserID > Edit
- Add RootAdmin role
*If this account does not yet exist in Qlik Sense, you need to connect to the Hub/QMC with the new account first in order to see it under QMC > Users.
- Execute the License Monitor reload task
- Inspect the configured User Directory Connectors and change the User ID and password combination if previously configured.
External Dependencies
- Go into the QMC > Data Connections section and inspect all Folder data connections to determine all network shares that the service account needs access to. Either change them yourself or alert the necessary teams to provide both Share and NT level access to these shares.
- Inspect all Data Connections and ensure that none use the old Service account and password. Follow up with necessary teams to provide access to data sources that used the old credentials.
Video Demonstration
Related Content
How to change the share path in Qlik Sense (Service Cluster)
-
Qlik Talend ESB cFile Component gives an error " java.nio.charset.MalformedInput...
A Talend ESB Route fails with the error below when executing in Talend Studio:
org.apache.camel.processor.errorhandler.DefaultErrorHandler- Failed delivery for (MessageId: xx on ExchangeId: xx).
Exhausted after delivery attempt: 1 caught: org.apache.camel.TypeConversionException: Error during type conversion from type: org.apache.camel.component.file.GenericFile to the required type: java.lang.String with value GenericFile[myfile] due to java.nio.charset.MalformedInputException: Input length = 1
The Route design looks like:
cFile → cProcessor (with the code exchange.getIn().getBody(String.class) in it)
Resolution
- Check the source file first.
- Adapt the file encoding accordingly.
- In the cFile component, go to Basic Settings and focus on Encoding.
- Select the encoding type (UTF-8/ISO-8859-15/Custom)
Cause
As the error message shows, the "java.nio.charset.MalformedInputException: Input length = 1" error in Java arises when the program attempts to decode a character sequence with a character encoding that is incompatible with the actual encoding of the input data. Specifically, "Input length = 1" indicates that a single byte or character is causing the decoding issue.
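A quick way to confirm the actual encoding of the source file before adjusting the cFile Encoding setting; a minimal sketch for a Linux or macOS shell, with the file name as a placeholder:
# Report the detected character encoding of the input file
file --mime-encoding myfile.csv
# If needed, convert the file to UTF-8 before the Route reads it
iconv -f ISO-8859-15 -t UTF-8 myfile.csv > myfile-utf8.csv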
Environment
-
Talend Cloud and AWS System Manager Parameter Store for Talend context variables
AWS Systems Manager (SM), an AWS service, can be used to view and control infrastructure on AWS. It offers automation documents to simplify common maintenance and deployment tasks of AWS resources.
AWS SM consists of a collection of capabilities related to automation, such as infrastructure maintenance and deployment tasks of AWS resources as well as some related to Application Management and Configuration. Among them, is a capability called Parameter Store.
AWS System Manager Parameter Store
AWS Systems Manager (SM) Parameter Store provides secure, hierarchical storage for configuration data management and secrets management.
It allows you to store data such as passwords, database strings, and license codes as parameter values.
AWS SM Parameter Store benefits
Parameter Store offers the following benefits and features for Talend Jobs.
- Secured, highly scalable, hosted service with NO SERVERS to manage: compared to the setup of a dedicated database to store Job context variables.
- Control access at granular levels: specify who can access a specific parameter or set of parameters (for example, a DB connection) at the user or group level. Using IAM roles, you can restrict access to parameters, which can have nested paths that can be used to define ACL-like access constraints. This is important for controlling access to Production environment parameters.
- Audit access: track the last user who created or updated a specific parameter value.
- Encryption of data at rest and in transit: parameter values can be stored as plaintext (unencrypted data) or ciphertext (encrypted data). For encrypted values, KMS (AWS Key Management Service) is used behind the scenes. Hence, Talend context variables with a Password type can be stored and retrieved securely without the implementation of a dedicated encryption/decryption process.
Another benefit of the AWS SM Parameter Store is its usage cost.
AWS SM Parameter Store pricing
AWS SM Parameter Store consists of standard and advanced parameters.
Standard parameters are available at no additional charge. The values are limited to 4 KB size, which should cover the majority of Talend Job use cases.
With advanced parameters (8 KB size), you are charged based on the number of advanced parameters stored each month and per API interaction.
Pricing example
Assume you have 5,000 parameters, of which 500 are advanced. Assume that you have enabled higher throughput limits and interact with each parameter 24 times per day, equating to 3,600,000 interactions per 30-day month. Because you have enabled higher throughput, your API interactions are charged for both standard and advanced parameters. Your monthly bill is the sum of the cost of the advanced parameters and the API interactions, as follows:
Cost of 500 advanced parameters = 500 * $0.05 per advanced parameter = $25
Cost of 3.6M API interactions = 3.6M * $0.05 per 10,000 interactions = $18
Total monthly cost = $25 + $18 = $43
For more information on pricing, see the AWS Systems Manager pricing web site.
About parameters
A Parameter Store parameter is any piece of configuration data, such as a password or connection string, that is saved in the Store. You can centrally and securely reference this data in a Talend Job.
The Parameter Store provides support for three types of parameters:
- String
- String List
- Secure String
Organizing parameters into hierarchies
In Talend, context variables are stored as a list of key-value pairs independent of the physical storage (Job, file, or database). Managing numerous parameters as a flat list is time-consuming and prone to errors. It can also be difficult to identify the correct parameter for a Talend Project or Job. This means you might accidentally use the wrong parameter, or you might create multiple parameters that use the same configuration data.
Parameter Store allows you to use parameter hierarchies to help organize and manage parameters. A hierarchy is a parameter name that includes a path that you define by using forward slashes (/).
The following example uses three hierarchy levels in the name:
/Dev/PROJECT1/max_rows
AWS SM Parameter Store with Talend Job
The Parameter Store can be accessed from the AWS Console, the AWS CLI, or the AWS SDKs, including Java. Talend Studio leverages the AWS Java SDK to connect to numerous Amazon services but, as yet, not to AWS Systems Manager.
Implementation of AWS SM Parameter Store connector
This initial implementation solely uses the current capabilities of Studio, such as Routines and Joblets.
A future version will leverage the Talend Component Development Kit (CDK) to build a dedicated connector for AWS System Manager.
Routine
The connector was developed in Java using the AWS SDK and exported as an UberJar (a single JAR with all its dependencies embedded in it).
The AWSSSMParameterStore-1.0.0.jar file (attached to this article) is imported into the Studio local Maven Repository and then used as a dependency in the AwsSSMParameterStore Talend routine.
The routine provides a set of high-level APIs/functions of the Parameter Store for Talend Jobs.
package routines;

import java.util.Map;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import com.talend.ps.engineering.AWSSMParameterStore;

public class AwsSSMParameterStore {

    private static final Log LOG = LogFactory.getLog(AwsSSMParameterStore.class);

    private static AWSSMParameterStore paramsStore;

    /*
     * init
     *
     * Create an AWSSMParameterStore client based on the credentials parameters.
     * Follows the "Default Credential Provider Chain".
     * See https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html
     *
     * Parameters:
     *   accessKey : (Optional) AWS Access Key
     *   secretKey : (Optional) AWS Secret Key
     *   region    : (Optional) AWS Region
     *
     * Return:
     *   Boolean : False if invalid combination of parameters
     */
    public static boolean init(String accessKey, String secretKey, String region) {
        ...
    }

    /*
     * loadParameters
     *
     * Retrieve all the parameters recursively with the path as a prefix in their name.
     *
     * Parameters:
     *   path : Parameter path prefix for the parameters
     *
     * Return:
     *   Map of name, value pairs of parameters
     */
    public static Map<String, String> loadParameters(String path){
        ...
    }

    /*
     * saveParameter
     *
     * Save a single parameter name and value in the Parameter Store.
     *
     * Parameters:
     *   name    : Name of the parameter
     *   value   : Value of the parameter
     *   encrypt : Encrypt the value in the Parameter Store
     *
     * Return:
     *   Boolean : False if the save failed
     */
    public static boolean saveParameter(String name, Object value, boolean encrypt) {
        ...
    }
}
The init function creates the connector to AWS SSM using the AWS Default Credential Provider Chain.
The loadParameters function connects to the Parameter Store and retrieves a set/hierarchy of parameters prefixed with a specific path (see the naming convention for the parameters below).
The result is returned as a Map key-value pair.
Important: In the returned Map, the key represents only the last part of the parameter name path. If the parameter name is: /Dev/PROJECT1/max_rows, the returned Map key for this parameter is max_rows.
The saveParameter function allows you to save a context parameter name and value (derived from a context variable) to the Parameter Store.
Joblets
Two Joblets were developed to connect to the AWS Parameter Store through the routine. One is designed to initialize the context variables of a Job using the parameters from the AWS Parameter Store. The other, as a utility for a Job to store its context variables into the Parameter Store.
Joblet: SaveContextVariableToAwsSSMParameterStore
The Joblet uses a tContextDump component to generate the context variables dataset with the standard key-value pair schema.
The tJavaFlex component is used to connect to the Parameter Store and save the context variables as parameters with a specific naming convention.
Parameter hierarchies naming convention for Talend context variables
For context variables, the choice is to use an optional root prefix /talend/ to avoid any potential collision with existing parameter names.
The prefix is appended with a string representing a runtime environment, for example, dev, qa, and prod. This is to mimic the concept of the context environment found in the Job Contexts:
The parameter name is then appended with the name of the Talend Project (which is extracted from the Job definition) and, finally, the name of the variable.
Parameter naming convention:
/talend/<environment name>/<talend project name>/<context variable name>
Example Job: job1 with a context variable ctx_var1 in a Talend Project PROJECT1.
The name of the parameter for the ctx_var1 variable in a development environment (identified by dev), is:
/talend/dev/PROJECT1/ctx_var1
For a production environment, prod, the name is:
/talend/prod/PROJECT1/ctx_var1
One option is to use the Job name as well in the hierarchy of the parameter name:
/talend/prod/PROJECT1/job1/ctx_var1
However, because Talend Metadata connections, Context Groups, and other items are shared across multiple Jobs, using the Job name would result in multiple references to the same context variable in the Parameter Store.
Moreover, if a value in the Context Group changes, the value needs to be updated in all the parameters for this context variable, which defeats the purpose of the context group.
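As an illustration of this naming convention, the same parameters can also be created and read with the AWS CLI; a minimal sketch, where the names and values are placeholders and the CLI is assumed to be configured with credentials allowed to write under /talend/dev:
# Store a context variable for PROJECT1 in the dev environment, encrypted at rest
aws ssm put-parameter --name "/talend/dev/PROJECT1/ctx_var1" --value "some-value" --type SecureString
# Retrieve every parameter under the same prefix, decrypting SecureString values
aws ssm get-parameters-by-path --path "/talend/dev/PROJECT1" --recursive --with-decryption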
Joblet context variables
The Joblet uses a dedicated context group specific to the interaction with the Parameter Store.
- AWS Access & Secret keys to connect to AWS. As mentioned earlier, the routine leverages the AWS Default Credential Provider Chain. If these variables are not initialized, the SDK looks for environment variables, the ~/.aws/credentials file (user directory on Windows), or EC2 roles to infer the right credentials.
- AWS region of the AWS SM Parameter Store.
- Parameter Store prefix and environment used in the parameter path, as described above in the naming convention.
Joblet: LoadContextVariablesFromAwsSSMParmeterStore
The second Joblet is used to read parameters from The Parameter Store and update the Job context variables.
The Joblet uses a tJavaFlex component to connect to SSM Parameter Store, leveraging the AwsSSMParameterStore.loadParameters routine function described above. It retrieves all the parameters based on the prefix path (see the defined naming convention above).
The tContextLoad component uses the tJavaFlex output key-value pair dataset to overwrite the default values of the context variables.
Joblet context variables
The load Joblet uses the same context group as the save counterpart.
Sample Talend Job
The sample Talend Job generates a simple people dataset using the tRowGenerator component (first name, last name, and age), applies some transformations, and segregates the rows by age to create two distinct datasets: one for Adults (age > 18) and one for Teenagers.
The two datasets are then inserted into a MySQL database in their respective tables.
The Job contains a mix of context variables, some are coming from a group defined for the MySQL Metadata Connection and some are specific to the Job: max_rows, table_adults, and table_teenagers.
Create Parameter Store entries for the context variables
The first step is to create all the parameters in the Parameter Store for the Job context variables. This can be done using the AWS console or through the AWS CLI, but those methods can be time-consuming and error-prone.
Instead, use the dedicated SaveContextVariableToAwsSSMParameterStore Joblet.
You need to drag-and-drop the Joblet into the Job canvas. There is no need to connect it to the rest of the Job components. It lists all the context variables, connects to AWS SM Parameter Store, creates the associated parameters, and stops the Job.
When the Job is executed, the System Manager Parameter Store web console should list the newly created parameters.
On the AWS console, the first column is not resizable; to see the full name of a parameter, you'll need to hide some of the columns.
You can also click a specific parameter to see the details.
For context variables defined with a Password type, the associated parameter is created as SecureString, which allows the value to be encrypted at rest in the store.
Regarding security, IAM access control can be leveraged to restrict access to a specific operations team or to a specific set of parameters, such as production parameters (/talend/prod/*); developers will have access solely to the dev environment-related parameters, for example:
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ // Allows to decrypt secret parameters "kms:Decrypt", "ssm:DescribeParameters" ], "Resource": "*" }, { "Sid": "VisualEditor1", "Effect": "Allow", "Action": [ "ssm:PutParameter", "ssm:LabelParameterVersion", "ssm:DeleteParameter", "ssm:GetParameterHistory", "ssm:GetParametersByPath", "ssm:GetParameters", "ssm:GetParameter", "ssm:DeleteParameters" ], // Grant access only to dev parameters "Resource": "arn:aws:ssm:AWS-Region:AWS-AccountId:parameter/talend/dev/*" } ] }
Talend Cloud Job
In the context of a Talend Cloud Job/Task, the context variables don't need to be exported as connections or resources for Talend Cloud as they are initialized from the AWS Parameter Store.
You only need to create a connection for the AWS SM Parameter Store credentials and configuration parameters.
Custom connection for AWS SM Parameter Store
The context group for the AWS SM Parameter Store is externalized as a Talend Cloud Custom Connection because, as yet, Talend Cloud doesn't have a native connector for AWS Systems Manager.
Talend Cloud Task
In Studio, you create a new Talend Cloud task by publishing the Job artifact to the cloud.
You'll then add the custom connection for AWS SM.
The additional context variables are exposed as advanced parameters, including the database connection parameters that are initialized from the Parameter Store.
A successful task execution on a Cloud or Remote Engine means that the Job can connect to AWS SM, retrieve the parameters based on the naming convention set above, and initialize the corresponding context variables, allowing the Job to connect to the MySQL database and create the requested tables.
-
How to Install PostgreSQL ODBC client on Linux for PostgreSQL Target Endpoint
This article provides a comprehensive guide to efficiently install the PostgreSQL ODBC client on Linux for a PostgreSQL target endpoint.
If the PostgreSQL serves as Replicate source endpoint, please check: How to Install PostgreSQL ODBC client on Linux for PostgreSQL Source Endpoint
Overview
- Download the PostgreSQL ODBC client software.
- Upload the downloaded files to your Replicate Linux Server.
- Install the RPM files in the specified order.
- Take note of considerations or setup instructions during the installation process.
Steps
- Download the PostgreSQL ODBC client software
Please choose the appropriate version of the PostgreSQL client software and the corresponding folder for your Linux operating system. In this article, we are installing PostgreSQL ODBC Client version 13.2 on Linux 8.5.
- Upload the downloaded files to a temporary folder in your Qlik Replicate Linux Server
- Install the RPM files in the specified order
rpm -ivh postgresql13-libs-13.2-1PGDG.rhel8.x86_64.rpm
rpm -ivh postgresql13-odbc-13.02.0000-1PGDG.rhel8.x86_64.rpm
rpm -ivh postgresql13-13.2-1PGDG.rhel8.x86_64.rpm
- Take note of considerations or setup instructions during the installation process.
- Note the installation folder (default: "/usr/pgsql-13/lib")
- Open site_arep_login.sh in /opt/attunity/replicate/bin/ and add the installation folder as a LD_LIBRARY_PATH:
export LD_LIBRARY_PATH=/usr/pgsql-13/lib:$LD_LIBRARY_PATH
- Save the site_arep_login.sh file and restart Replicate Services.
- unixODBC is a prerequisite. If it is not already present on your Linux Server, make sure to install it before installing the PostgreSQL ODBC client software:
rpm -ivh unixODBC-2.3.7-1.el8.x86_64.rpm
- "/etc/odbcinst.ini" is required:
[PostgreSQL]
Description = ODBC for PostgreSQL
Driver = /usr/lib/psqlodbcw.so
Setup = /usr/lib/libodbcpsqlS.so
Driver64 = /usr/pgsql-13/lib/psqlodbcw.so
Setup64 = /usr/lib64/libodbcpsqlS.so
FileUsage = 1
- "psql" is required if the task is set to Batch optimized apply mode. Add "/usr/pgsql-13/bin" to the PATH system environment and restart the service if the error "psql" cannot be found is encountered (see the sketch after the sample below).
- "/etc/odbc.ini" is optional and typically not required, unless it becomes necessary for troubleshooting connectivity issues by "isql".A sample:
[pg15]
Driver = /usr/pgsql-13/lib/psqlodbcw.so
Database = targetdb
Servername = <targetDBHostName or IP Address>
Port = 5432
UserName = <PG User Name>
Password = <PG user's Password>
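If the task uses Batch optimized apply or the connection needs to be verified manually, a minimal sketch tying the two optional items above together; the paths, DSN name, and credentials are the sample values and should be adjusted to your environment:
# Make psql available to Replicate, e.g. by adding this line to /opt/attunity/replicate/bin/site_arep_login.sh
export PATH=/usr/pgsql-13/bin:$PATH
# Test the pg15 DSN defined in /etc/odbc.ini with unixODBC's isql
isql -v pg15 <PG User Name> <PG user's Password>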
Environment
- Qlik Replicate all versions
- PostgreSQL Server all versions
- PostgreSQL Client version 13.2
-
Qlik Sense Enterprise on Windows: Task errors out in QMC with App already updati...
A task being run from the Qlik Sense Management Console fails with the error:
App already updating
The issue is typically only seen for specific apps and cannot be observed with new apps.
Resolution
Restart all Qlik Sense services on the Central node. See Manual Start and Stop order of Qlik Sense services for details.
Cause
The task or app is queued within the manager schedule but has not completed successfully.
Environment
- Qlik Sense Enterprise on Windows
-
Qlik Reporting Create Report task button is greyed out
A Managed Space member is not able to create new report tasks even with the correct space permissions set. The Create button is greyed out and cannot be clicked. The minimal managed space permissions are set:
- Can View
- Can Manage
- Can Operate
Resolution
Ensure you do not have any 3rd party extensions running in the browser, such as AdBlock or similar. Disable or remove them and verify the Create Report button is once again available.
Cause
Adblock has been found to negatively interact with the Qlik Reporting create task button and other Qlik Reporting browser elements by aggressively blocking them.
Environment
- Qlik Cloud Analytics
- Qlik Reporting
-
Qlik Office 365 SharePoint Connector User: Error getting user info!
Configuring a SharePoint connection fails when attempting to save the token. The error displayed is:
User: Error getting user info!
Resolution
From the Qlik Web Connectors stand-alone site:
- Go to the Connector
- Switch to the About tab
- Select Permissions
This section will list the API endpoints that need to be reached. Verify they are reachable and configure firewall exceptions where necessary.
Note: For this connector, the endpoints are https port 443.
Cause
“Getting user info” is a request to https://graph.microsoft.com/v1.0/me. The endpoint was not reachable.
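A quick way to confirm whether that endpoint can be reached from the machine running Qlik Web Connectors; an HTTP 401/403 response still proves connectivity, while a timeout or DNS failure points to a blocked route:
# Request the user-info endpoint over port 443; only the ability to connect matters here
curl -sI https://graph.microsoft.com/v1.0/me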
Environment
- Qlik Web Connectors
-
QlikView Multi-Box not showing all results
The QlikView Multi-Box object does not show all expected results when opening from the QlikView AccessPoint.
Example:
Searching for *abc returns four results, while there are five matches in the data.
This may occur after an upgrade (such as upgrading QlikView from an earlier release to 12.90).
Resolution
Review your browser's extensions. A known root cause for this issue is a custom browser extension that modifies a page's CSS (Cascading Style Sheet), which results in matches being hidden behind the search box.
To resolve this, disable or otherwise modify the custom extension.
Internal Investigation ID(s)
SUPPORT-722
Environment
- QlikView
-
Qlik Sense and the JIRA Connector and JIRA Server: Error when displaying "Projec...
The following error (C) is shown after successfully creating a Jira Connection string and selecting a Project/key (B) from Select data to load (A):
Failed on attempt 1 to GET. (The remote server returned an error; (404).)
The error occurs when connecting to JIRA server, but not to JIRA Cloud.
Resolution
Qlik Cloud Analytics
Tick the Use legacy search API checkbox. This is switched off by default.
Qlik Sense Enterprise on Windows
A Use legacy search API option is not present in Qlik Sense On-Premise. To resolve the issue, manually add useLegacySearchAPI='true' in the generated script. This is required when using both Issues and CustomFieldsForIssues tables.
Example:
[Issues]:
LOAD key as [Issues.key],
     fields_summary as [Issues.fields_summary];
SELECT key, fields_summary
FROM Issues
WITH PROPERTIES (
     projectIdOrKey='CL',
     createdAfter='',
     createdBefore='',
     updatedAfter='',
     updatedBefore='',
     customFieldIds='',
     jqlQuery='',
     maxResults='4',
     useLegacySearchApi='true'
);
Cause
Connections to JIRA Server use the legacy API.
Internal Investigation ID(s)
SUPPORT-3600
Environment
- Qlik Cloud Analytics
- Qlik Sense Enterprise on Windows
-
Qlik Sense Enterprise on Windows and Security Vulnerability CVE-2025-29927
CVE-2025-29927 is a critical authorization bypass vulnerability in Next.js, a popular React framework. Specifically, it allows attackers to circumvent security checks within a Next.js application if those checks are performed using middleware.
Is Qlik Sense Enterprise on Windows affected by this Security Vulnerability CVE-2025-29927?
Resolution
Qlik Sense Enterprise on Windows is not affected by this Security Vulnerability CVE-2025-29927.
Next.js is not used in any of the on-premise Qlik Sense core services, such as the QMC or Hub.
Environment
- Qlik Sense Enterprise on Windows
-
Qlik Cloud Analytics: Outer set expression with empty selection
An outer set expression returns an empty selection.
Examples:
Table 1 has source data that contains a row for Beta. The inner set and outer set behave identically.
Table 2 has source data that does not include a row for Beta. The inner set and outer set behave differently.
Resolution
To make this work as expected, add “&” to the beginning of the set expression.
Example:
{&<group1={'Beta'}>} sum( {&<Company1={'A'}>} salary1)
Cause
This behavior is caused by how sets are combined when using multiple set expressions.
If the outer set expression produces an empty set, it is ignored when the inner set expression is evaluated.
The result is that only the inner set expression is used.
Internal Investigation ID(s)
SUPPORT-3523
Environment
- Qlik Cloud Analytics
-
Qlik Talend Product: Error 400 - Invalid SNI error after Installed Talend Runtim...
You may encounter the error 400 - Invalid SNI when calling the Talend Runtime API (Job as a service) after installing the R2025-02 patch or later. Before the R2025-02 patch of Talend Runtime Server, the same certificate worked for SSL connections to Talend Runtime Server without any issue.
SNI validation is active from the R2025-02 patch onward.
Resolution
There are three options to solve this issue:
- Obtain and install a proper certificate that references the correct host name and then access it with the hostname rather than by IP.
- Disable SNI host check.
- Tell the Talend component to resolve the IP as a hostname
Disable SNI Host Check
This carries the same security risk as Jetty before it was updated (lower security).
In the <RuntimeInstallationFolder>/etc/org.ops4j.pax.web.cfg file, add
jetty.ssl.sniRequired=false
and
jetty.ssl.sniHostCheck=false
Alternatively, configure these Jetty parameters in the <RuntimeInstallationFolder>/etc/jetty.xml or jetty-ssl.xml file:
- Find the <New class="org.eclipse.jetty.server.SecureRequestCustomizer"> block in your jetty.xml or jetty-ssl.xml
- Edit the <Arg name="sniRequired" ...> and <Arg name="sniHostCheck" ...> lines so that the properties' defaults are set to false as shown below:
<New id="sslHttpConfig" class="org.eclipse.jetty.server.HttpConfiguration">
<Arg><Ref refid="httpConfig"/></Arg>
<Call name="addCustomizer">
<Arg>
<New class="org.eclipse.jetty.server.SecureRequestCustomizer">
<Arg name="sniRequired" type="boolean">
<Property name="jetty.ssl.sniRequired" default="false"/>
</Arg>
<Arg name="sniHostCheck" type="boolean">
<Property name="jetty.ssl.sniHostCheck" default="false"/>
</Arg>
<Arg name="stsMaxAgeSeconds" type="int">
<Property name="jetty.ssl.stsMaxAgeSeconds" default="-1"/>
</Arg>
<Arg name="stsIncludeSubdomains" type="boolean">
<Property name="jetty.ssl.stsIncludeSubdomains" default="false"/>
</Arg>
</New>
</Arg>
</Call>
</New>
Resolve IP to Hostname
If the certificate includes the domain name, you should use that domain name instead of the IP with the Jetty security updates in Talend Runtime Server.
But if your DNS server does not resolve the hostname, you must call it by the IP address, so check first whether this workaround is feasible for your situation.
In the examples the hostname is unresolvedhost.net and the IP is 10.20.30.40.
Try this API call at the command line:
curl -k -X GET --resolve unresolvedhost.net:9001:10.20.30.40 https://unresolvedhost.net:9001/services/
or
curl -k -X GET -H "Host: unresolvedhost.net" https://10.20.30.40:9001/services/
If this works, then in the Talend component that makes the API call, go to the "Advanced settings" or "Headers" table and add a row with Key: Host and Value: the hostname that matches your SSL certificate (e.g. unresolvedhost.net).
This will instruct Talend to send the correct Host header, which most HTTP clients (including Java's HttpClient) will also use as the SNI value during the TLS handshake.
Cause
The SNI enforcement is there for a security reason. With the R2025-02 patch, the Jetty components on Talend Runtime Server resolved a CVE security issue where a client could connect to a server using a hostname that doesn't match the hostname in the server's TLS certificate.
Certificates require the URI not to be localhost or an IP address, and to have at least one dot, so a fully qualified domain name is best.
Related Content
Environment