I've read the following guide for checking a SAN environment. It states that Tx/Rx power should be within the recommended values according to the SFP datasheets:
"- and Tx/Rx Power is inside recommended values according to SFP datasheets."
But the SFP datasheets list several values, for example the SRS (stressed receiver sensitivity?) and the unstressed sensitivity. Which values should I keep an eye on? Or are there other, more important values?
At the moment I monitor the Rx values, and 29 µW (the unstressed sensitivity for 8 Gbit LW) seems pretty low as a boundary.
So, which boundary should I keep an eye on?
Independently of what the specs may say, when it comes to Rx signal level (for 1/2/4G SFPs), I consider anything below 100 µW (-10 dBm) susceptible to becoming problematic. For 8G I would increase that threshold by 50%.
              Value                  Alarm                Warn
                                     low       high       low       high
Temperature:  35 Centigrade          -25       95         -20       90
Current:      8.336 mAmps            1.000     17.000     2.000     14.000
Voltage:      3268.6 mVolts          2700.0    3900.0     2900.0    3700.0
RX Power:     -10.1 dBm (97.7 uW)    10.0 uW   1259.0 uW  15.8 uW   794.0 uW
TX Power:     -4.7 dBm (341.9 uW)    67.0 uW   631.0 uW   79.0 uW   631.0 uW
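As a minimal sketch of how those readings relate, here is the µW/dBm conversion and a check of an Rx reading against the switch's low thresholds. The function and variable names are my own; the numbers (97.7 µW Rx, 15.8 µW warn-low, 10.0 µW alarm-low) are taken from the output above.

```python
import math

def uw_to_dbm(uw: float) -> float:
    """Convert optical power from microwatts to dBm (0 dBm = 1 mW)."""
    return 10 * math.log10(uw / 1000.0)

def dbm_to_uw(dbm: float) -> float:
    """Convert optical power from dBm back to microwatts."""
    return 1000.0 * 10 ** (dbm / 10.0)

def classify_rx(rx_uw: float, warn_low_uw: float, alarm_low_uw: float) -> str:
    """Classify an Rx reading against the low warn/alarm thresholds."""
    if rx_uw <= alarm_low_uw:
        return "alarm"
    if rx_uw <= warn_low_uw:
        return "warn"
    return "ok"

print(round(uw_to_dbm(97.7), 1))                               # -10.1
print(classify_rx(97.7, warn_low_uw=15.8, alarm_low_uw=10.0))  # ok
```

This also shows why -10 dBm and 100 µW are the same rule of thumb: `dbm_to_uw(-10.0)` gives exactly 100 µW.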
As I also monitor the CRCs, I think it is enough to keep an eye on the specs when it comes to attenuation, because the CRCs are the critical values. Attenuation is more of a warning to me.
And I don't want to set the values too high, as we also use TAPs with 50% (6 dB) for analysis. That can cause more failures than necessary. Also, each port in the failure range has to be cleaned or otherwise processed, which means more effort and cost, and I would like to avoid that when it isn't necessary.
So I would like to take the values from the manufacturer, but so far I haven't been able to figure them out or to verify whether my assumption is correct.
Taking 150 µW as the boundary for 8G connections, I am more than 400% above the spec for unstressed sensitivity (LW), or 200% above the SRS.
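The margin arithmetic above can be sketched as follows. The 29 µW unstressed sensitivity is quoted earlier in the thread; the 50 µW SRS figure is an assumption back-computed from the stated "200% above SRS".

```python
def margin_above(boundary_uw: float, sensitivity_uw: float) -> float:
    """Percentage by which a monitoring boundary exceeds a datasheet sensitivity."""
    return 100.0 * (boundary_uw - sensitivity_uw) / sensitivity_uw

# 150 uW boundary vs the 29 uW unstressed sensitivity quoted above
print(round(margin_above(150.0, 29.0)))  # 417, i.e. more than 400% above
# 50 uW SRS is an assumed value, chosen to match the ~200% figure
print(round(margin_above(150.0, 50.0)))  # 200
```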