Over the last few years, compressor manufacturers have invested significant research funding in attempts to decrease bearing and lube oil temperatures in their machinery by a few degrees. Sometimes hundreds of thousands of dollars have been spent on single-digit temperature reductions. These expenditures are driven by the need to meet recent generic industry standards for rotating machinery bearings, which several operators are strictly enforcing. Let’s review the origin of these standards and discuss when their strict application makes sense and when it is over-simplistic or overly conservative.
The function of bearings in a turbomachine is to support and center the rotating shaft against static and dynamic radial and axial forces. In most industrial compressors, steam turbines, and gas turbines, these bearings are of the tilting-pad fluid film type, using either a mineral or synthetic oil to cool and lubricate them. There are fundamental physical limits on the lowest achievable lube oil temperature in a bearing, set by the energy transferred from the rotating shaft to the stationary fluid film and by the heat load the oil can absorb, transfer, and transport for a given heat capacity and flow rate. Beyond these basic physical limits, many design choices affect the lube oil temperature: type of bearing, pad size, aspect ratios, thrust equalizing, oil injection points, angle of load variation, and a host of other complex geometric and operational parameters.
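As a rough illustration of that energy balance, the sketch below estimates the bulk oil temperature rise across a bearing from an assumed friction power loss, oil flow rate, and oil properties. The helper name and all numbers are illustrative assumptions, not values from any particular machine:

```python
# Bulk oil heat balance across a bearing: delta_T = Q / (m_dot * cp).
# All values here are illustrative assumptions, not data from the article.

def oil_temperature_rise(power_loss_kw, flow_lpm, density=860.0, cp=2000.0):
    """Bulk oil temperature rise (K or degrees C) across a bearing.

    power_loss_kw -- frictional power dissipated into the oil film, kW
    flow_lpm      -- oil supply flow, liters per minute
    density       -- oil density, kg/m^3 (typical mineral oil ~860)
    cp            -- oil specific heat, J/(kg*K) (typical ~2000)
    """
    m_dot = density * (flow_lpm / 1000.0) / 60.0   # mass flow, kg/s
    return (power_loss_kw * 1000.0) / (m_dot * cp)

# Example: 30 kW of friction loss carried away by 60 L/min of oil
dt = oil_temperature_rise(30.0, 60.0)   # about a 17 C (31 F) bulk rise
```

The sketch makes the paragraph's point concrete: for a given heat load, only more flow or a higher heat capacity oil lowers the achievable temperature rise, and doubling the flow halves the rise.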
American Petroleum Institute (API) standards had traditionally required bearing temperatures below 200°F for slow-running machinery such as pumps and motors. Until 2014, API Standard 617, 7th edition, for centrifugal compressors allowed bearing and lube oil temperatures up to 212°F, but the 8th edition lowered this limit to 200°F to align with other API machinery standards. Most machinery built and operated prior to 2014 seemed to operate just fine at the slightly higher lube oil temperatures. Nonetheless, the change sent most machinery manufacturers rushing to find ways to lower bearing metal temperatures, regardless of the actual benefit to bearing load-carrying capability or to machinery unbalance response and stability.
Rather than blindly following specifications, one should always ask what the true physical limitations of a process are, to make sure a specification makes sense for a given application. In this case, the limiting factors and critical parameters are: (i) the bearing pad hot spot metal temperatures, (ii) the temperature at which the lube oil starts degrading rapidly, and (iii) the temperature at which the viscosity of the lube oil is too low to provide a consistent pressure gradient film on the pad to support the required shaft loads.
With respect to hot spot metal temperatures, the material of concern is the tin-based babbitt bearing surface. In principle, the failure mode is a function of temperature, shear rate, and hydrodynamic pressure, all of which are heavily application dependent. Tin-based babbitts can operate at temperatures exceeding 300°F, but since material strength degrades rapidly with temperature, practical limits are about 265°F for a tilting pad bearing, as specified by most bearing and turbomachinery OEMs, and 240°F as specified by most gear OEMs and AGMA specifications.
“The machinery operator should always evaluate previous manufacturer experience, specific design/application considerations, and operating/test data prior to insisting on a fixed temperature limit from a generic industry specification.”
And yet, the most recent API standard specifies a 200°F limit for all hydrodynamic bearings, both fixed geometry and tilting pad, across a wide range of applications, from slow-moving motors and bull gears to high-speed pinions, compressors, and turbines. Meeting this requirement can erode other margins: reducing the journal diameter to decrease surface speed, or lengthening the bearing and thereby increasing the rotor bearing span, may cut into critical speed separation margin or rotordynamic stability. A design can still meet the standard while giving up margin in areas where it would be more valuable.
For lube oil degradation, a 230°F metal temperature is discussed by the American Society for Testing and Materials (ASTM D4304-17) as a criterion to switch from standard Type 1 and Type 2 mineral oils to a higher-performing ASTM Type 3 oil, which is generally formulated for heavy-duty gas turbine or combined cycle applications.
The other commonly mentioned requirement, albeit not apparently anchored in any industry standard, is that the bulk temperature, whether from a common drain or the reservoir, should stay below 180°F. This requirement is based mainly on operational experience and the rule of thumb that the oxidation rate of turbine oils roughly doubles for every 18°F above a 140°F bulk temperature. Finally, while lube oil viscosity varies non-linearly (roughly logarithmically) with temperature across machinery operating ranges, a temperature increase from 200°F to 212°F will not drop the viscosity below what is required to maintain a stable pressure film in tilting pad and fixed geometry bearings.
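Both rules of thumb above can be put into numbers. The sketch below is illustrative only: it encodes the doubling-every-18°F oxidation rule, and fits the ASTM D341 (Walther) viscosity-temperature relation to two assumed data points for a generic ISO VG 32 turbine oil to show how small the viscosity change between 200°F and 212°F really is. The function names and oil data are assumptions, not from the article:

```python
import math

def relative_oxidation_rate(bulk_temp_f, baseline_f=140.0, doubling_f=18.0):
    """Oxidation rate relative to the baseline, per the rule of thumb
    that the rate doubles for every 18 F above 140 F bulk temperature."""
    if bulk_temp_f <= baseline_f:
        return 1.0  # treat at-or-below baseline as the reference rate
    return 2.0 ** ((bulk_temp_f - baseline_f) / doubling_f)

def walther_fit(t1_c, v1_cst, t2_c, v2_cst):
    """Fit ASTM D341 (Walther) constants A, B from two (deg C, cSt) points."""
    def w(v):  # Walther double-log transform of viscosity
        return math.log10(math.log10(v + 0.7))
    x1, x2 = math.log10(t1_c + 273.15), math.log10(t2_c + 273.15)
    b = (w(v1_cst) - w(v2_cst)) / (x2 - x1)
    return w(v1_cst) + b * x1, b

def viscosity_cst(temp_c, a, b):
    """Kinematic viscosity (cSt) at temp_c from Walther constants A, B."""
    return 10.0 ** (10.0 ** (a - b * math.log10(temp_c + 273.15))) - 0.7

# At the commonly cited 180 F bulk limit, oxidation runs about
# 2^(40/18), i.e. roughly 4.7 times faster than at the 140 F baseline.
rate_180 = relative_oxidation_rate(180.0)

# Assumed ISO VG 32 turbine oil: 32 cSt at 40 C, 5.4 cSt at 100 C (212 F)
a, b = walther_fit(40.0, 32.0, 100.0, 5.4)
v_200f = viscosity_cst((200.0 - 32.0) / 1.8, a, b)   # about 6.2 cSt
v_212f = viscosity_cst(100.0, a, b)                  # 5.4 cSt by construction
```

With these assumed data points, going from 200°F to 212°F drops viscosity by only about 13 percent, which supports the claim that the change does not, by itself, threaten the pressure film.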
The other question bearing on the validity and value of lube oil temperature criteria is the measurement location and the sensor accuracy with which the true metal and lube oil temperatures can be captured. In most turbomachines, the bearing lube oil film temperature is measured with RTDs or thermocouples embedded just below the babbitt surface at the assumed hot spot. This is an indirect indication of the bearing surface temperature, since the sensors are recessed from the pad surface and the angular position of the hot spot varies, especially in the fixed geometry bearings typically used in motors. Furthermore, the hot spot lies along the axial center of the bearing, but measurements are allowed to be taken off the centerline if the bearing is long.
So, measuring temperature in fixed geometry bearings, or offsetting the temperature transducer from the centerline in long bearings, generally means the measurement is not at the hot spot, placing the bearing at risk of failing while the indicated temperature is still low. These are the kinds of events that lead to the lowering of temperature standards. On the other hand, when properly implemented, this measurement is an excellent and fast indicator of whether the bearing is operating at excessive temperatures with possible hot spots. Thus, bearing pad metal temperatures should still be used for alarm and shutdown controls of the machine.
The bulk lube oil temperature is usually measured in the bearing drains (individual and common) and in the reservoir. These measurements provide a good indication of bulk temperatures and are useful for judging whether the lube oil degradation temperature has been exceeded. But they cannot reveal the actual temperature condition inside the bearing and, because of the measurement delay, they are unsuitable for alarm or shutdown.
While the 200°F bearing and lube oil temperature limit has become the accepted reality in the turbomachinery industry, it does not always make sense. It is not achievable for some applications. Simply enforcing this limit for all machines is an over-simplification that leads to expensive design conservatism and not necessarily a better bearing or better machine. The machinery operator should always evaluate previous manufacturer experience, specific design/application considerations, and operating/test data prior to insisting on a fixed temperature limit from a generic industry specification. (Brian Pettinato provided technical input for this article.) ■