Honeywell HPM Redundancy Failed. The "Alternate Path" doesn't work either?

Hi. I am really stumped with this one ....

After many years of operation with no recorded "event", HPM redundancy failed. The backup HPM cards were replaced, but the backup HPM always fails during the LOADING and SYNC steps. Based on the primary HPM diagnostics, it has been determined that the problem is not with the backup HPM; it resides in the primary HPM. Replacing the primary (now non-redundant) HPM cards requires a plant shutdown.

My past troubleshooting/repair efforts are as follows:

1. As per Honeywell's recommendation, I have tried loading the failed backup HPM via the "Alternate Path" multiple times, both with and without a new Redundancy Cable, without success.

2. Suspecting a corrupt system file, I copied fresh personality image files and tried loading the HPM again, without success.

3. During a plant shutdown, both HPMs were replaced, but there is still no redundancy. As before, when the backup HPM is being loaded, its status goes from LOADING to SYNC to FAIL. The databases still will not synchronize.

4. Currently, plans are being made to replace everything (card files, I/O Link cables, etc.) at the next plant shutdown in a couple of years.

I would like to know if anyone else has had a similar experience and what they did to fix the issue. I would also be happy to entertain any and all suggestions.

Thank you in advance.

Did you get this one fixed? I was doing an LCN upgrade and had a similar issue restarting the backup HPM. It failed every time I tried to load it. Here is what it took to get it loaded:
1. SHUTDOWN the primary NIM on that UCN. (The backup NIM takes over as primary.)
2. Reload that NIM as backup.
3. SHUTDOWN the (now) primary NIM.
4. Reload that NIM.

Then I was able to reload the backup HPM. As best as I can figure, the NIMs held a stale chunk of memory indicating that my backup HPM was still running, and they refused to let it load.
Hi Marcus.

No, this issue has not been fixed, and I have since been demobilized from my customer site(s). Even though I am slipping into retirement, I stay in contact with management and the engineers, both business-wise and personally. :)

I did go down a similar line of thought: I refreshed the *.PI files before loading the HPMs, all to no avail.

Your plan is interesting, easy, and definitely worth trying. I will forward your thoughts to the site within the hour. I am not sure when it will be implemented, but I will send you any and all feedback.

Thank you and cheers!
UPDATE: I have been asked to return to my customer's site(s) to attempt this "fix", plus much other TPS work. I swapped the NIMs this morning without success. The backup HPM still fails while attempting to synchronize.

NEXT STEPS: During the next major plant shutdown (2022?), I will "pull" both HPMs (all four cards) from their racks at the same time. This will purge any and all node configuration and tag databases from both HPMs; basically, ground zero. Then I will load both HPMs from a recent checkpoint file.

Marcus, thanks again for your suggestion. It was a very good idea.

The short story:
We powered OFF both HPMs and the local IOPs by turning OFF both HPM power supplies.
We powered OFF the remote chassis of IOPs by turning OFF both of their cabinet power supplies.
We unseated the "Long-Distance" Remote I/O Extender cards at both ends.
We replaced both HPMs with FAT (Factory Acceptance Tested, customer-witnessed) HPMs.
We replaced the UCN Interface Modules.
We replaced the A and B UCN Cable Taps.
We replaced the I/O Link Redundancy Cable.
We powered ON the HPM power supplies.
We seated the HPMs and confirmed their status as ALIVE.
We LOADED one HPM and confirmed its status as OK.
We LOADED the backup HPM and confirmed its status as BACKUP. (It worked!)
We swapped PRIMARY/SECONDARY successfully.
We seated the Long-Distance Remote I/O Extenders.
We turned ON the power to the remote IOPs.
We swapped the PRIMARY/SECONDARY HPMs one last time.
Success. The redundant HPMs have been operating normally for a week.

NOTE: We required a total plant outage to turn OFF this HPM and its IOPs.

I hope this information will be helpful to others.