I was wondering whether this is the most suitable topic to express our problem, but I'll give it a try.

The unit (300 MW) has been running for more than a year and a half. The ACC consists of 30 cells in 6 streets, and we have 3 vacuum pumps (1 in operation). During the last outage we made several modifications on the boiler side, which now result in less steam flow to the HP turbine. Consequently, more steam goes to the IP and LP turbines and ends up in the ACC. It is possible that this slightly increased steam flow is affecting ACC performance, which is slightly worse than last summer. In fact, this summer, for the first time, we were forced to decrease unit load in order to keep the unit in operation, as backpressure increased up to 400 mbar.

As I understand it, ACC operation can be jeopardized either by reduced heat transfer or, in some cases, by extremely high ambient temperature, which cannot be avoided.

On a regular basis we conduct a vacuum leakage test: we stop the vacuum pump while keeping the fans running at the same speed and the ambient air temperature as constant as possible. The test lasts 15 minutes and the results are always very good (backpressure rise of less than 2 mbar/min).

So my question is: is it possible that a dirty heat exchange surface is the biggest reason for the ACC performance deterioration (30-40 mbar higher backpressure than last summer under the same conditions)?
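For reference, the pass/fail arithmetic of the decay test described above can be sketched as follows; the pressure readings, function name, and acceptance threshold are my own illustrative assumptions, not actual plant data:

```python
def leak_rate_mbar_per_min(p_start_mbar: float, p_end_mbar: float,
                           duration_min: float) -> float:
    """Average backpressure rise during a vacuum decay (leak) test."""
    return (p_end_mbar - p_start_mbar) / duration_min

# Hypothetical 15-minute test: backpressure drifts from 70 to 95 mbar
# after the vacuum pump is stopped (illustrative numbers only).
rate = leak_rate_mbar_per_min(70.0, 95.0, 15.0)
verdict = "PASS" if rate < 2.0 else "FAIL"
print(f"rise rate: {rate:.2f} mbar/min -> {verdict} against 2 mbar/min limit")
```

Note that a passing decay test only rules out air in-leakage as the dominant cause; it says nothing about external fouling of the finned surface, which this check cannot detect.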