DX Infrastructure Manager


Maintenance Mode probe issue

  • 1.  Maintenance Mode probe issue

    Posted 02-20-2018 04:29 AM

    Hi ,

We are facing an issue with the maintenance mode functionality.

The agents are kept in maintenance mode, but alerts are still getting triggered.

It was functioning properly, but suddenly it stopped working.

     

In the NAS probe it shows the error "Feb 19 16:56:26:955 [6392] nas: maint:  Registration error rc: '2' to 'maintenance_mode'".

     

Can you please suggest why this is happening and how to resolve it?

     

     

Thanks,

    Ayush



  • 2.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 05:19 AM

Did you try restarting the maintenance_mode probe? What do the logs of that probe say?



  • 3.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 05:34 AM

Yes, I tried restarting the probe as well.

     

When I try to put the servers into maintenance mode on the UMP page,
I also get an error:

     

    communication error, I/O error on nim session (C) com.nimsoft.nimbus.NimNamedClientSession(Socket[addr=/172.18.12.27,port=48026,localport=65139]): Read timed out

     

     

This port number is associated with the maintenance_mode probe.



  • 4.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 05:58 AM

    Hi,

     

I am getting this error on the UMP page while deleting a schedule or putting a server into maintenance mode.

     

    An unknown error has occurred.
    Refreshing your browser may resolve the issue.

    Details:
    com.firehunter.ump.exceptions.DataFactoryException : null

    Stack Trace:
    (2) communication error, I/O error on nim session (C) com.nimsoft.nimbus.NimNamedClientSession(Socket[addr=/172.18.12.27,port=48026,localport=61178]): Read timed out
     at com.nimsoft.nimbus.NimSessionBase.recv(NimSessionBase.java:909)
     at com.nimsoft.nimbus.NimSessionBase.sendRcv(NimSessionBase.java:579)
     at com.nimsoft.nimbus.NimSessionBase.sendRcv(NimSessionBase.java:562)
     at com.nimsoft.nimbus.NimClientSession.send(NimClientSession.java:170)
     at com.nimsoft.nimbus.NimRequest.sendImpersonate(NimRequest.java:263)
     at com.nimsoft.nimbus.pool.NimRequestPool.sendImpersonate(NimRequestPool.java:81)
     at com.nimsoft.nimbus.pool.NimRequestPool.send(NimRequestPool.java:66)
     at com.nimsoft.nimbus.pool.NimRequestPoolInstance.send(NimRequestPoolInstance.java:170)
     at com.firehunter.umpportlet.PDSUtils.send(PDSUtils.java:65)
     at com.firehunter.usm.Maintenance.deleteMaintenanceSchedule(Maintenance.java:53)
     at com.firehunter.usm.DataFactory.deleteMaintenanceSchedule(DataFactory.java:8084)
     at com.firehunter.usm.DataFactory.deleteMaintenanceSchedule(DataFactory.java:8075)
     at sun.reflect.GeneratedMethodAccessor1302.invoke(Unknown Source)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
     at java.lang.reflect.Method.invoke(Unknown Source)
     at flex.messaging.services.remoting.adapters.JavaAdapter.invoke(JavaAdapter.java:421)
     at flex.messaging.services.RemotingService.serviceMessage(RemotingService.java:183)
     at flex.messaging.MessageBroker.routeMessageToService(MessageBroker.java:1503)
     at flex.messaging.endpoints.AbstractEndpoint.serviceMessage(AbstractEndpoint.java:884)
     at flex.messaging.endpoints.amf.MessageBrokerFilter.invoke(MessageBrokerFilter.java:121)
     at flex.messaging.endpoints.amf.LegacyFilter.invoke(LegacyFilter.java:158)
     at flex.messaging.endpoints.amf.SessionFilter.invoke(SessionFilter.java:44)
     at flex.messaging.endpoints.amf.BatchProcessFilter.invoke(BatchProcessFilter.java:67)
     at flex.messaging.endpoints.amf.SerializationFilter.invoke(SerializationFilter.java:146)
     at flex.messaging.endpoints.BaseHTTPEndpoint.service(BaseHTTPEndpoint.java:278)
     at flex.messaging.MessageBrokerServlet.service(MessageBrokerServlet.java:322)
     at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
     at com.firehunter.ump.auth.InvalidHttpSessionFilter.doFilter(InvalidHttpSessionFilter.java:29)
     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.doFilter(InvokerFilterChain.java:73)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.doFilter(InvokerFilterChain.java:117)
     at sun.reflect.GeneratedMethodAccessor318.invoke(Unknown Source)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
     at java.lang.reflect.Method.invoke(Unknown Source)
     at com.liferay.portal.kernel.bean.ClassLoaderBeanHandler.invoke(ClassLoaderBeanHandler.java:67)
     at com.sun.proxy.$Proxy957.doFilter(Unknown Source)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.doFilter(InvokerFilterChain.java:73)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.processDirectCallFilter(InvokerFilterChain.java:168)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.doFilter(InvokerFilterChain.java:96)
     at com.liferay.portal.kernel.servlet.PortalClassLoaderFilter.doFilter(PortalClassLoaderFilter.java:72)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.processDoFilter(InvokerFilterChain.java:207)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilterChain.doFilter(InvokerFilterChain.java:109)
     at com.liferay.portal.kernel.servlet.filters.invoker.InvokerFilter.doFilter(InvokerFilter.java:84)
     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
     at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:436)
     at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1078)
     at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:625)
     at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:318)
     at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
     at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
     at java.lang.Thread.run(Unknown Source)
    Caused by: java.net.SocketTimeoutException: Read timed out
     at java.net.SocketInputStream.socketRead0(Native Method)
     at java.net.SocketInputStream.read(Unknown Source)
     at java.net.SocketInputStream.read(Unknown Source)
     at java.net.SocketInputStream.read(Unknown Source)
     at com.nimsoft.nimbus.NimSessionBase.readNimbusHeader(NimSessionBase.java:1063)
     at com.nimsoft.nimbus.NimSessionBase.recv(NimSessionBase.java:848)
     ... 61 more



  • 5.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 05:21 AM

    Hello Ayush,

     

please increase the Java heap size for the maintenance_mode probe. In Raw Configuration mode, under the startup section, there is a key called options; please increase the Xms and Xmx values there. The size of these values depends on how much memory

you have on this server.

     

    Kind regards,

    Britta Hoffner

    CA Support



  • 6.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 05:36 AM

    Hi,

     

In the raw configuration we only get the startup option, under which we can change the memory size.

 

-Xms64 and -Xmx128 are the values present.

 

I believe this is only the Java heap size.

To which values can I change them?

     

    Thanks



  • 7.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 05:42 AM

Hello Ayush, depending on how much memory you have on this server, you can, for example, change them to -Xms64m -Xmx1024m.

     

    Kind regards,

    Britta Hoffner

    CA Support
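For illustration, the relevant key in the probe's raw configuration (the maintenance_mode.cfg file) would look roughly like the sketch below; the exact surrounding keys vary by probe version, so treat this as an example rather than a complete file:

```
<startup>
   options = -Xms64m -Xmx1024m
</startup>
```

After saving the change, restart the maintenance_mode probe so the new heap settings take effect.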



  • 8.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 06:44 AM

    Ayush,

     

If you search for the server/device by IP and name in the UMP, do you see just one entry, or duplicates?

     

    Kind Regards,

     

    Alex Yasuda
    CA Technologies
    Sr Support Engineer



  • 9.  Re: Maintenance Mode probe issue

    Posted 02-20-2018 07:03 AM

    Hi,

     

Are you asking about the server search item, or the server which I have kept in the maintenance schedule?

Generally only one server is found.
There are also many past schedules in the maintenance tab.

    Thanks



  • 10.  Re: Maintenance Mode probe issue

    Posted 10-29-2018 02:16 PM

    Hi Ayush,

     

Are you still facing the issue? I am facing the exact same issue on my end. If you found a solution, kindly post it here so I can apply it in my environment.

     

    Regards,

    Saju Mathew



  • 11.  Re: Maintenance Mode probe issue

    Posted 01-02-2019 04:04 PM

It looks like the maintenance_mode probe has a lot of bugs. Removing devices takes a lot of time, and even after increasing the heap size the I/O error pops up regularly. Was there any solution other than increasing the heap size to fix the maintenance_mode probe issues?



  • 12.  Re: Maintenance Mode probe issue

    Posted 01-02-2019 04:12 PM

This thread was closed, so for those of us who have a view set up showing open questions, it will remain hidden. In general it is best to start your own question thread, since what may seem like the same problem can have very different details.



  • 13.  Re: Maintenance Mode probe issue

    Posted 01-02-2019 04:16 PM

What version of the maintenance_mode probe are you running?



  • 14.  Re: Maintenance Mode probe issue

    Posted 01-02-2019 04:17 PM

    8.43



  • 15.  Re: Maintenance Mode probe issue

    Posted 01-02-2019 04:20 PM

Sorry, 8.53, the latest version.



  • 16.  Re: Maintenance Mode probe issue

    Posted 01-04-2019 06:02 AM

Have you done this?

     

    ****** Important New Feature ******

     

     

The maintenance_mode probe 8.53 HF3 introduces a new feature.

 

It adds a new task to maintenance mode which deletes expired maintenance windows, thereby improving UMP performance.

It is disabled by default.

 

To enable the task, set purge_maintenance_window_interval (a new configuration key) to a whole integer

in the maintenance_mode probe configuration, under the <setup> section

(say 1, meaning the task would run every hour).

 

For example:

 

<setup>/purge_maintenance_window_interval = 1

 

When the task runs, all expired maintenance window entries are deleted.

 

To disable the task, reset the value to -1.

 

Apart from the above task, whenever a schedule is deleted, the corresponding maintenance windows are also deleted.
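Putting that key in context, the <setup> section of the probe configuration would then contain something like the following sketch (other keys omitted; exact contents vary by environment and probe version):

```
<setup>
   purge_maintenance_window_interval = 1
</setup>
```

Setting the value back to -1 disables the purge task again.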