DX Unified Infrastructure Management

  • 1.  Duplicate entries of Network devices in SLM

    Posted Jan 21, 2020 02:21 AM
    Hi Team 

    When we go to SLM, there are many entries (around 10) for a device name to select from. For example, when I select the QoS QOS_INTERFACE_UTILIZATIONIN for a device named TEST, I see multiple entries of TEST.

    USM has only one entry.
    Performance Reports has two entries: one with the short name and the other with the FQDN.

    Environment - 9.0.2. This was upgraded from 8.47 to 9.0.2.

    Please advise on this.

    thanks

    Santhosh


  • 2.  RE: Duplicate entries of Network devices in SLM
    Best Answer

    Broadcom Employee
    Posted Jan 21, 2020 04:29 PM
    It seems that there are "duplicate" entries in S_QOS_DATA: a QoS metric was collected for some time under either the short name or the FQDN, the name then changed, and the same QoS is now being collected under the other form.
    You will need to merge the entries as per the KB below.

    How can I merge a large amount of QOS data?
    https://ca-broadcom.wolkenservicedesk.com/external/article?articleId=72808
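
    Before merging, it can help to confirm what the duplicates look like. A quick check along these lines lists the S_QOS_DATA rows involved (the QoS and device name are just the examples from your post; substitute your own):

     -- List the S_QOS_DATA rows behind the duplicate SLM entries for one device.
     -- 'QOS_INTERFACE_UTILIZATIONIN' and 'TEST' are the examples from the original
     -- post; substitute the QoS and device name from your environment.
     SELECT table_id, qos, source, [target], r_table, h_table, created
     FROM   S_QOS_DATA
     WHERE  qos = 'QOS_INTERFACE_UTILIZATIONIN'
       AND  (source LIKE 'TEST%' OR [target] LIKE 'TEST%')
     ORDER BY [target], created;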


  • 3.  RE: Duplicate entries of Network devices in SLM

    Posted Jan 22, 2020 08:12 AM
    Thanks David,
    I hope there is no loss of data when we merge the QoS data.

    Thanks

    Santhosh


  • 4.  RE: Duplicate entries of Network devices in SLM

    Posted Feb 07, 2020 11:57 AM
    Is there an updated version of this article? I know that I have seen a newer version of the SQL for the "merge oldest into newest" script that also cleans up the s_qos_data table. The version in the article leaves records behind, such that the "oldest" record always remains.


  • 5.  RE: Duplicate entries of Network devices in SLM

    Posted Feb 07, 2020 02:47 PM
    Answered my own question - here's the updated script

     -- For any duplicate QoS, this merges the oldest QoS data into the newest QoS data.
     
     -- By default it runs in "safe mode" and only prints the statements it would run.
     -- Remove the comments in front of the EXEC lines to make it "live".
     
     -- This assumes there are only two duplicates per QoS; in theory it can be re-run any number of times to reduce all dupes.
     
    SET NOCOUNT ON;
    
    DECLARE @s_qos_data_duplicates TABLE
    (
           id INT IDENTITY(1,1),
           qos VARCHAR(255),
           source VARCHAR(255),
           [target] VARCHAR(255)
    );
    
    DECLARE @s_qos_data_temp TABLE
    (
           table_id INT,
           created SMALLDATETIME,
           r_table VARCHAR(255),
           h_table VARCHAR(255)
    );
    
    DECLARE @qos VARCHAR(255),
           @source VARCHAR(255),
           @target VARCHAR(255),
           @r_table VARCHAR(255),
           @h_table VARCHAR(255),
           @created_old SMALLDATETIME,
           @created_new SMALLDATETIME,
           @table_id_old INT,
           @table_id_new INT,
           @table_id_previous INT,
           @i INT,
           @row_count INT,
           @sql VARCHAR(MAX);
    
    -- Prepopulate our dupes table
    INSERT INTO @s_qos_data_duplicates
    SELECT qos,source, [target]
    FROM S_QOS_DATA
    WHERE qos != 'QOS_HEALTH_INDEX'
    GROUP BY qos, source, [target]
    HAVING COUNT(*) > 1;
    
    SET @i = 1;
    SET @row_count = (SELECT COUNT(*) FROM @s_qos_data_duplicates);
    
     
    WHILE (@i <= @row_count)
    BEGIN
           SELECT @qos = qos, @source = source, @target = target
           FROM @s_qos_data_duplicates
           WHERE id = @i;
    
           INSERT INTO @s_qos_data_temp
           SELECT table_id, created, r_table, h_table
           FROM S_QOS_DATA
           WHERE qos = @qos AND source = @source AND [target] = @target
           ORDER BY created;
    
           -- Get the oldest record (and its RN/HN table names)
           SELECT TOP 1 @table_id_old = table_id, @created_old = created, @r_table = r_table, @h_table = h_table
           FROM @s_qos_data_temp
           ORDER BY created ASC;
           
           -- Get the newest record by resorting the table
           SELECT TOP 1 @table_id_new = table_id, @created_new = created
           FROM @s_qos_data_temp
           ORDER BY created DESC;
    
           -- Repoint the RN (raw) and HN (historical) data rows from the old table_id to the new one, then remove the old S_QOS_DATA row
           print 'Found ' + @qos + ': Old entry dated '+ CAST(@created_old AS VARCHAR(255))+ ', replacing with new entry '+ CAST(@created_new AS VARCHAR(255));
           SET @sql = 'UPDATE ' + @r_table + ' SET table_id = ' + CAST(@table_id_new AS VARCHAR(255))+ ' WHERE table_id = '+ CAST(@table_id_old AS VARCHAR(255))
           print @sql
    --     EXEC (@sql);
           SET @sql = 'UPDATE ' + @h_table + ' SET table_id = ' + CAST(@table_id_new AS VARCHAR(255))+ ' WHERE table_id = '+ CAST(@table_id_old AS VARCHAR(255))
           print @sql
    --     EXEC (@SQL);
           SET @sql = 'DELETE FROM S_QOS_DATA WHERE table_id = ' + CAST(@table_id_old AS VARCHAR(255))
           print @sql
     --     EXEC (SQL);
    
           DELETE FROM @s_qos_data_temp;
    
           SET @i = @i + 1;
     END
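
    For reference, in safe mode the script only prints the statements it would run. For one duplicate pair the output looks roughly like this (the QoS name, dates, RN/HN table names, and table_ids below are just placeholders):

     Found QOS_INTERFACE_UTILIZATIONIN: Old entry dated Mar 1 2019 10:15AM, replacing with new entry Jan 15 2020 8:00AM
     UPDATE RN_QOS_DATA_0021 SET table_id = 67890 WHERE table_id = 12345
     UPDATE HN_QOS_DATA_0021 SET table_id = 67890 WHERE table_id = 12345
     DELETE FROM S_QOS_DATA WHERE table_id = 12345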



  • 6.  RE: Duplicate entries of Network devices in SLM

    Broadcom Employee
    Posted Feb 07, 2020 03:58 PM
    Hi Garin,

    You need a small edit on the last EXEC line: it is missing the '@'.

    Thanks for sharing.

    Steve

    ------------------------------
    Support Engineer
    Broadcom
    US
    ------------------------------



  • 7.  RE: Duplicate entries of Network devices in SLM

    Posted Feb 13, 2020 08:39 AM
    Hi Garin

    We have a different scenario.
    For example, we would like to merge the data of table_id 1129898 into 964992, since all of our data is being updated under 964992.

    When we run the script it does it the other way around, merging 964992 into 1129898, even though the new polls are updating 964992.

    Is there a way to change this script to achieve the above scenario?

    All of our QoS metrics have this scenario.
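
    For a single pair, would it be safe to just run the generated statements by hand in the other direction? Something along these lines (RN_QOS_DATA_0021 and HN_QOS_DATA_0021 are placeholders; the real r_table and h_table values for the QoS would come from S_QOS_DATA):

     -- Keep table_id 964992 (the one still receiving new polls) and fold 1129898 into it.
     -- RN_QOS_DATA_0021 / HN_QOS_DATA_0021 are placeholders; read the actual r_table
     -- and h_table values for this QoS from S_QOS_DATA first.
     UPDATE RN_QOS_DATA_0021 SET table_id = 964992 WHERE table_id = 1129898;
     UPDATE HN_QOS_DATA_0021 SET table_id = 964992 WHERE table_id = 1129898;
     DELETE FROM S_QOS_DATA WHERE table_id = 1129898;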

    Attached is a screenshot.

    thanks

    Santhosh

    Attachment(s)

    table_id.docx (docx, 166 KB)