During profile creation, if I choose an ODBC connection and search for a source object, the search results do not show the source object even when it exists. How can I resolve this issue?
Searches are case-sensitive for ODBC connections. To find the source object, enter the source object name with the correct case.
A profile run fails and the following error message appears:
Error occurred when initialising tenant - Failed to create a tenant even after 5 attempts.
To resolve this issue, restart the profiling service nodes and run the profile again.
A profile run fails and the following error message appears in the session log: "The executor with id xx exited with exit code 137(SIGKILL, possible container OOM).". How do I resolve this issue?
To resolve this issue, perform the following steps:
1. Open the custom.properties file available in the following location on the machine where the Secure Agent runs:
/root/infaagent/apps/At_Scale_Server/<version>/spark/
2. Add the following property:
spark.executor.memoryOverhead = 2048MB
3. Save the custom.properties file.
4. In Data Profiling, run the profile.
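For reference, a minimal sketch of the relevant custom.properties entry is shown below; the comment is illustrative, and only the documented value of 2048MB comes from this procedure:

    # Reserve additional off-heap memory for each Spark executor so that the
    # container is not killed with exit code 137 (OOM)
    spark.executor.memoryOverhead = 2048MB

If the executor is still killed after the change, you can try a larger value, at the cost of leaving less memory for other workloads on the node.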
A profile run fails and the following error message appears in the session log: "The node was low on resource: ephemeral-storage. Container spark-kubernetes-driver was using xxx, which exceeds its request of xx.". How do I resolve this issue?
To resolve this issue, increase the minimum and maximum EBS volume sizes to attach to a worker node for temporary storage during data processing. Perform the following steps:
1. In Administrator, open the Advanced Clusters page.
2. Select the advanced configuration for which you want to change the EBS volume size.
3. In the EBS Volume Size field of the Platform Configuration area, increase the values in the Min GB and Max GB fields to 200. By default, the minimum and maximum volume sizes are 100 GB.
4. Restart the Secure Agent.
5. In Data Profiling, run the profile.
A profile run fails with an internal error when the source object contains a column name that is longer than 73 characters.
To resolve this issue, reduce the length of the column name.
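If you can modify the source schema, one way to shorten the column name is to rename it in the source database. The following statement is a generic SQL sketch; the table and column names are placeholders, and the exact syntax varies by database:

    -- Rename a column whose name is longer than 73 characters
    -- (placeholder names; check the RENAME COLUMN syntax for your database)
    ALTER TABLE my_source_table
        RENAME COLUMN my_column_name_longer_than_73_characters TO short_name;

If you cannot change the source, you can instead create a view that aliases the long column name to a shorter one and run the profile on the view.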
I am unable to save a profile that uses a Databricks ODBC connection when tables with the same name exist in two different databases. How can I resolve this issue?
This issue occurs when you do not specify the schema name in the connection. To resolve this issue, specify the schema name in the connection so that it points to the correct database.
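To illustrate how the ambiguity arises, the following Databricks SQL creates two tables with the same name in different databases; the database, table, and column names are hypothetical:

    -- The same table name exists in two databases (schemas)
    CREATE TABLE sales_db.orders (order_id INT, amount DOUBLE);
    CREATE TABLE test_db.orders (order_id INT, amount DOUBLE);
    -- An unqualified reference to "orders" cannot be resolved unless the
    -- connection or session specifies which schema to use

When the connection specifies the schema name, the unqualified table name resolves against that schema only.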
If the source columns contain a large number of rows, the profile job fails for a Microsoft Azure Synapse SQL connection and the following error message appears: "[FATAL] Exception: com.microsoft.sqlserver.jdbc.SQLServerException: Error 0x27 - Could not allocate tempdb space while transferring data from one distribution to another." How can I resolve this issue?
To resolve this issue, increase the Data Warehouse Units (DWU) of the Microsoft Azure Synapse SQL instance.
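One way to increase the DWUs of a dedicated SQL pool is with T-SQL run against the master database. The pool name and target service objective below are examples; choose values that fit your workload and cost constraints:

    -- Scale the dedicated SQL pool to a higher service objective (more DWUs)
    ALTER DATABASE my_sql_pool MODIFY (SERVICE_OBJECTIVE = 'DW400c');

You can also change the DWU setting from the Azure portal. A higher service objective gives tempdb more room for the data movement between distributions that caused the error.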
A profile run fails with the error "Profile job failed with error java.lang.RuntimeException: Output Port Primary does not exist in specified rule". How do I resolve this issue?
This error appears when the following conditions are true:
In Data Profiling, you create a profile, add a rule R1, save, and run the profile.
In Data Quality, you modify the rule input or output name for rule specification R1 and save it.
In Data Profiling, you run the profile.
To resolve this issue, remove rule R1 from the profile and save the profile. Then add rule R1 to the profile again, save the profile, and run it.
A profile run fails with the error "***ERROR: nsort_release_recs() returns -10 ". How do I resolve this issue?
To resolve this issue, increase the available disk space on the hard drive where the Secure Agent is installed.
When you run a profile on an Amazon S3 source object, the profile run fails with the error "Cloud DQ Profiling failure ERROR: Unexpected condition at file: [..\..\..\common\reposit\trepcnx.cpp] line: [293]". How do I resolve this issue?
To resolve this issue, ensure that you have a valid license for the Amazon S3 connection in Administrator.