Clarity Client Automation


Directory COLLECT \ 00001 accumulating many files

  • 1.  Directory COLLECT \ 00001 accumulating many files

    Posted 03-24-2018 12:59 AM

    Hello everyone, I have 4 scalability servers collecting inventory; however, the DSM\ServerDB\SECTOR\COLLECT\00000001 directory has accumulated 50,000 files.
    I have also noticed that the engine is processing very slowly. Has anyone run into something similar?
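    To quantify the backlog per scalability server, a small script like the one below can count the pending files in each collect subdirectory. This is only a sketch: the install path shown is an assumption (a common default), so adjust it for your environment.

    ```python
    import os

    def pending_collect_files(collect_root):
        """Count pending inventory files in each collect subdirectory (e.g. 00000001)."""
        counts = {}
        for sub in sorted(os.listdir(collect_root)):
            path = os.path.join(collect_root, sub)
            if os.path.isdir(path):
                counts[sub] = len(os.listdir(path))
        return counts

    # Assumed default install path; change to match your DSM installation.
    root = r"C:\Program Files (x86)\CA\DSM\ServerDB\SECTOR\COLLECT"
    if os.path.isdir(root):
        for sub, n in pending_collect_files(root).items():
            print(f"{sub}: {n} files pending")
    ```

    Running this periodically shows whether the queue is growing or draining, which helps tell a stopped collect task apart from one that is merely slow.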



  • 2.  Re:  Directory COLLECT \ 00001 accumulating many files

     
    Posted 03-24-2018 08:28 AM

    It means either your collect engine is not running or the collect job is failing. Check your domain manager, find the collect job for that specific scalability server, and see what it is doing.

     

    Richard Lechner

    Principal Engineering Services Architect

     

    CA Technologies

    Mobile: +1 703 655 7161 | Richard.Lechner@ca.com

     




  • 3.  Re:  Directory COLLECT \ 00001 accumulating many files

    Posted 03-24-2018 09:59 AM

    These are inventory files sent by the agents to the scalability server. Is the same thing happening on all of the scalability servers, or only one of them?

     

    These files get collected by an Engine on the DM. You mention you have (4) scalability servers, so you should have (5) Collect Tasks running on the DM: one for the DM's own built-in scalability server, and one for each of the four remote scalability servers.

     

    So what is the status of these collect tasks on the engine?

    Are all (5) collect tasks linked to an engine?

    What are their statuses and last run dates?

     

    In the activity window of the engine, you should see files getting processed.



  • 4.  Re:  Directory COLLECT \ 00001 accumulating many files

    Posted 03-24-2018 11:10 AM

    Hello Brian, I have 4 scalability servers and one engine for each. It is happening on 3 of the 4 scalability servers. On the engine screen I can see the files being processed, but it is processing only about 1 file every 15 seconds.
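    At that rate the engine cannot keep up. A back-of-the-envelope calculation with the numbers from this thread shows the existing queue would take over a week to drain even if no new files arrived:

    ```python
    # Rough estimate: 50,000 queued files processed at ~1 file per 15 seconds.
    backlog_files = 50_000
    seconds_per_file = 15

    drain_hours = backlog_files * seconds_per_file / 3600
    print(f"~{drain_hours:.0f} hours to drain the queue")  # ~208 hours, over 8 days
    ```

    Since agents keep sending new inventory during that time, the backlog will keep growing unless the per-file processing time is reduced.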



  • 5.  Re:  Directory COLLECT \ 00001 accumulating many files

    Posted 03-24-2018 11:35 AM

    kb000046211: Scaling Client Automation: How to improve Collect Task and Replication Task Performance by Limiting the amount of Hardware and Software scans sent by Agents.
    https://comm.support.ca.com/kb/scaling-client-automation-itcm-how-to-improve-collect-task-and-replication-task-performance-by-limiting-the-amount-of-hardware-and-software-scans-sent-by-agents/kb000046211

     

    Follow these two recommendations from the linked document:

    - Recommendation: Reduce the frequency of hardware inventory scans

    - Recommendation: Advanced engine settings



  • 6.  Re:  Directory COLLECT \ 00001 accumulating many files

    Posted 03-27-2018 01:49 PM
    Hello Brian, thanks for the tip.
    I performed the tuning of the environment as the document indicates; however, it remains extremely slow.
    I have opened a ticket with support to help with the problem.


  • 7.  Re:  Directory COLLECT \ 00001 accumulating many files
    Best Answer

    Posted 03-24-2018 01:29 PM

    There could be MANY reasons for the engines not keeping pace with the collections, from architectural problems to database issues, network issues and many others. Please open a support case so some investigation can be performed.

     

    Steve McCormick, ITIL

    CA Technologies

    Principal Services Consultant

    Stephen.McCormick@ca.com




  • 8.  Re:  Directory COLLECT \ 00001 accumulating many files

    Posted 03-27-2018 01:50 PM

    Hello Stephen, I have already opened a ticket with support. Thank you.