I am trying to perform data profiling in TDM Portal. When I try to create a new connection/environment, it shows up in the connection profiles under the "Configuration" tab, but I can't see the connections under the "Set-up" tab in "Data Profiling".
Also, the classifier (Zip) upload is throwing an error. I followed the procedure to create the connection and classifiers as shown in
Manage Data Classifiers - CA Test Data Manager - 4.4 - CA Technologies Documentation
I am also getting the below error when I try to create new environments.
Is this related to the TDoD services, or is it some other issue?
This is not a TDoD issue, but I think the portal has not started correctly. Can you confirm which version of TDM you are using, and review the startup.log in C:\ProgramData\CA\CA Test Data Manager Portal\logs to confirm that all of the portal components have started correctly?
Thanks for the reply. In the startup.log I can see only "Encrypt One Block" when I try to perform the portal actions.
That message is normal, so it shouldn't be causing the issue. If you email me the startup.log (email@example.com), I will take a look.
I sent the log to your email. Please verify and let me know.
There are several errors in the log that indicate a problem with the connection to the repository, which is preventing some of the portal core services from starting. Have you made any changes to the database or database server?
To debug the issue, start Datamaker, ensure that you can log in, and then run the repository maintenance in Datamaker to confirm that there are no issues with the repository.
When I ran the maintenance, I got "Repository Schema - Inaccuracies found. Updates required", with a series of SQL scripts and 'Write to File' and 'Execute Script' options at the bottom.
Do I need to perform the update?
Yes, please execute the script. This should fix the issues.
Thanks for the reply. I tried to execute the script and got an error message like the one below.
Do I need to drop the index? If so, is it for the source or target tables? I do not have access to the repository.
Here's a workaround for this problem:
1. Open a remote connection to the TDM Datamaker server.
2. Log in with an account that has Admin privileges.
3. Open File Explorer.
4. Navigate to the Program Files (x86)\Grid-Tools\CTDatamaker directory.
5. Locate the centgtrep.ndd file, and open it for editing in any text editor.
6. Locate the "ss_name" setting under the "CREATE TABLE gtrep_system_settings" section.
7. Change the "ss_name" size from 2000 to 254. For example, change:
ss_name nvarchar (2000) NOT NULL,
to read:
ss_name nvarchar (254) NOT NULL,
8. Save the changes.
9. Open GT Datamaker and log in as Administrator.
10. Before running the upgrade on the repository, make a backup of your database.
11. Run the repository upgrade. See the steps for upgrading the repository (from https://docops.ca.com/ca-test-data-manager/4-5/en/installing/upgrade-product-components).
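The manual edit in steps 5-7 could be sketched as a small script. This is only an illustration, not an official tool: the install path and the exact column-definition text are assumptions based on the steps above, so verify them against your own centgtrep.ndd, and keep the backup it writes.

```python
import re
import shutil
from pathlib import Path

# Assumed install path from the steps above; adjust for your environment.
NDD_PATH = Path(r"C:\Program Files (x86)\Grid-Tools\CTDatamaker\centgtrep.ndd")

def shrink_ss_name(ndd_path: Path, new_size: int = 254) -> bool:
    """Change the ss_name column size from 2000 to `new_size` in centgtrep.ndd.

    Writes a .ndd.bak backup first. Returns True if a change was made.
    """
    text = ndd_path.read_text(encoding="utf-8")
    # Match e.g. "ss_name nvarchar (2000)" with flexible spacing.
    pattern = re.compile(r"(ss_name\s+nvarchar\s*\()\s*2000\s*(\))", re.IGNORECASE)
    new_text, count = pattern.subn(rf"\g<1>{new_size}\g<2>", text)
    if count == 0:
        return False  # nothing matched; inspect the file by hand instead
    shutil.copy2(ndd_path, ndd_path.with_suffix(".ndd.bak"))  # backup first
    ndd_path.write_text(new_text, encoding="utf-8")
    return True
```

Note that writing under Program Files (x86) requires the elevated/Admin session from step 2.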
However, the cause of this could be a small block size on your DB; check the output of:
show parameter db_block_
Some good info about the error here:
ORA-01450: maximum key length (6398) exceeded - Mohammad Nazmul Huda
You may want to check with your DBA about this error. I've seen other customers who confirmed a reasonable block size (8K) but still received this error; it might be due to a small tablespace that was allocated for GTREP before the repo was initially installed, even though the default in just about every environment I've seen is 8K for modern Oracle versions.
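For intuition on why shrinking the column helps: under the common AL16UTF16 national character set, each NVARCHAR2 character occupies 2 bytes, so a 2000-character column alone can contribute up to 4000 bytes to an index key. Combined with other indexed columns and per-entry overhead, or on a block size smaller than 8K, that can exceed the key-length cap; the 6398-byte figure below is taken from the ORA-01450 message in this thread, and the 2-bytes-per-character width is the standard AL16UTF16 encoding. The sketch just does that byte arithmetic.

```python
# Rough index-key byte arithmetic behind ORA-01450 (illustrative only).
BYTES_PER_NCHAR = 2       # AL16UTF16 national character set: 2 bytes per character
MAX_KEY_8K_BLOCK = 6398   # cap reported by ORA-01450 for an 8K block (from the error)

def nvarchar_index_bytes(declared_chars: int) -> int:
    """Worst-case bytes an NVARCHAR2(declared_chars) column adds to an index key."""
    return declared_chars * BYTES_PER_NCHAR

print(nvarchar_index_bytes(2000))  # original ss_name size: 4000 bytes
print(nvarchar_index_bytes(254))   # after the workaround: 508 bytes
print(nvarchar_index_bytes(254) < MAX_KEY_8K_BLOCK)  # comfortably under the cap
```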
Thanks for the detailed reply. But are those actions (steps 1-9) mandatory to perform before the upgrade?
Assuming that your DBA cannot determine the cause of the ORA-01450 error and correct it, performing the steps to modify the centgtrep.ndd file AFTER upgrading TDM, but BEFORE launching Datamaker for the first time post-upgrade, would be necessary to get the repo updated.
If, as in your case currently, you've already attempted to open Datamaker and the repo upgrade script failed, you can modify centgtrep.ndd accordingly and re-process the upgrade by going into repo maintenance (open Datamaker -> Ctrl + Alt + M) -> Check Repository Tables -> Process:
This will then prompt you to run the repo upgrade script again, with the new value for ss_name, which shouldn't run afoul of the index key-length limits of your Oracle server configuration.
Thank you, Sean!
This is really helpful, and I will let you know once I am done with the changes and the upgrade.