Ensure a successful migration to a high-availability environment by reviewing these prerequisites.
High-availability environments vary in size. A basic distributed deployment might improve vRealize Automation simply by hosting IaaS components on separate Windows servers. Many high-availability environments go further, adding redundant appliances, redundant servers, and load balancing for greater capacity. Large, distributed deployments provide better scale, high availability, and disaster recovery.
Verify that you have a new target installation of vRealize Automation with a master and replica virtual appliance configured for high availability. See vRealize Automation High Availability Configuration Considerations.
Verify that all vRealize Automation virtual appliances use the same password for root user.
Install relevant proxy agents on the target environment according to these requirements.
Target proxy agent name must match the source proxy agent name for vSphere, Hyper-V, Citrix XenServer, and Test proxy agents.
Note: Complete these steps to obtain an agent name.
On the IaaS host, log in to Windows as a local user with administrator privileges.
Use Windows Explorer to go to the agent installation directory, and open the agent configuration file in a text editor.
Under the serviceConfiguration tag, look for the value of the agentName attribute.
Target proxy agent endpoint name must match the source proxy agent endpoint name for vSphere, Hyper-V, Citrix XenServer, and Test proxy agents.
Do not create an endpoint for vSphere, Hyper-V, Citrix XenServer, or Test proxy agents on the target environment.
Check the version numbers of vRealize Automation components on the target vRealize Automation appliance.
In your target vRealize Automation environment, start a browser and go to the vRealize Automation appliance management console at https://vrealize-automation-appliance-FQDN:5480.
Log in with the user name root and the password you entered when you deployed the appliance.
To expand the Host / Node Name records so you can see the components, click the expand button.
Verify that the version numbers of vRealize Automation components match across all virtual appliance nodes.
Verify that the version numbers of vRealize Automation IaaS components match across all IaaS nodes.
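The two version checks above amount to confirming that every node of a given type reports the same version string. A small Python sketch of that comparison, with illustrative node names and version numbers (not taken from a real deployment):

```python
def versions_match(node_versions):
    """Return True when every node reports the same component version.

    node_versions: dict mapping node name -> version string, as read
    from the appliance management console's Host / Node Name records.
    """
    return len(set(node_versions.values())) <= 1

# Illustrative example: a node on an older build fails the check.
appliance_nodes = {"vra-va-01": "7.3.0.536", "vra-va-02": "7.3.0.536"}
print(versions_match(appliance_nodes))   # all appliance nodes agree

appliance_nodes["vra-va-03"] = "7.2.0.446"
print(versions_match(appliance_nodes))   # mismatch detected
```

Run the same comparison separately for the virtual appliance nodes and for the IaaS nodes, since the two component families carry different version numbers.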
Review Knowledge Base article 51531.
Perform these steps to direct traffic to only the master node.
Disable all the redundant nodes.
Remove the health monitors for these items according to your load balancer documentation:
vRealize Automation virtual appliance
IaaS Manager Service
Verify that the target Microsoft SQL Server version for the vRealize Automation target IaaS database is 2012, 2014, or 2016.
Verify that port 22 is open between the source and target vRealize Automation environments. Port 22 is required to establish Secure Shell (SSH) connections between source and target virtual appliances.
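A quick way to verify the port 22 requirement is a TCP connection probe from one environment to the other. This is a hedged sketch; the host name is a placeholder, and a successful TCP connect confirms reachability but not SSH authentication.

```python
import socket

def ssh_port_open(host, port=22, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder host name -- substitute your source or target appliance FQDN.
# print(ssh_port_open("vra-source-appliance.example.com"))
```

Run the probe in both directions (source to target and target to source) before starting migration.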
Verify that the endpoint vCenter has sufficient resources to complete migration.
Verify that you have changed the load balancer timeout settings from default to at least 10 minutes.
Verify that the target vRealize Automation environment system time is synchronized between the vRealize Automation appliance (Cafe) components and the IaaS components.
Verify that the IaaS Web Service and Model Manager nodes in the target environment have the right Java Runtime Environment. You must have Java SE Runtime Environment (JRE) 8, 64 bit, update 161 or later installed. Make sure the JAVA_HOME system variable points to the Java version you installed on each IaaS node. Revise the path if necessary.
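The JRE requirement above can be checked by parsing the Java version string (for example, the `1.8.0_161` form reported by `java -version`). A hedged Python sketch of that comparison; it covers only the Java 8 version-string format the requirement names:

```python
import re

def jre_meets_requirement(version_string, min_update=161):
    """Return True when a Java 8 version string such as '1.8.0_161'
    is at or above the required update level."""
    m = re.match(r"1\.8\.0_(\d+)", version_string)
    if not m:
        return False  # not a Java 8 version string
    return int(m.group(1)) >= min_update
```

Remember that the check must pass on every IaaS Web Service and Model Manager node, and that JAVA_HOME on each node must point at the qualifying installation.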
Verify that each IaaS node has at least PowerShell 3.0 or later installed.
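Similarly, the PowerShell requirement reduces to checking the major version reported by `$PSVersionTable.PSVersion` on each node. A minimal sketch, with an illustrative version string:

```python
def powershell_version_ok(version_string, minimum_major=3):
    """Return True when a PowerShell version string (e.g. '5.1.19041.1',
    as reported by $PSVersionTable.PSVersion) is at least 3.0."""
    major = int(version_string.split(".")[0])
    return major >= minimum_major
```

Windows Server 2012 and later ship with PowerShell 3.0 or newer, so this check typically only fails on older or heavily customized IaaS hosts.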
Verify that the source and target vRealize Automation environments are running.
Verify that no user and provisioning activities are happening on the source vRealize Automation environment.
On IaaS nodes in the target vRealize Automation environment, verify that any antivirus or security software that might interact with the operating system and its components is correctly configured or disabled.