Manual Data Migration from Old to New
So, you’ve evaluated the compatibility of your applications and workloads for your cloud platform of choice. You’ve vetted your service provider to be sure it can deliver the connectivity, security, availability, and support you require. You know the tools you need to test, remediate and convert applications for the new target system.
It’s time to migrate to the cloud.
In this second post of a six-part series, we look at manually taking data and configurations to your new operating system and application environment.
The process begins with getting access to that new environment, the ultimate destination for the transfer, from your CSP, and copying over your data. Test thoroughly to ensure that everything that was supposed to migrate actually did. Did any unknown dependencies surface? Is the application attempting to communicate with something that is no longer there?
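One concrete way to confirm that every file made the trip intact is a checksum manifest, generated on the source and re-checked at the destination. A minimal sketch, using local temporary directories to stand in for the two environments (all paths and file contents are hypothetical):

```shell
#!/bin/sh
set -e
# Local stand-ins for the source system and the new CSP environment.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
MANIFEST=$(mktemp)

printf 'listen_port=8080\n' > "$SRC/app.conf"
cp "$SRC/app.conf" "$DEST/"

# On the source: record a SHA-256 checksum for every file copied.
(cd "$SRC" && find . -type f -exec sha256sum {} + > "$MANIFEST")

# On the destination: verify each file against the manifest.
(cd "$DEST" && sha256sum -c "$MANIFEST") && echo "all files verified"
```

The manifest travels with the data, so the check can run on the destination long after the source copy is gone.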
Since we are migrating workloads over the internet, distance is not really a factor (although it could become one later if data is stored far away and you need it fast). The larger the workloads, the longer they will take to send across to the CSP. If your provider is close enough to your physical location, carrying the copied files over on portable media may be the faster option.
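The trade-off can be put to numbers. A back-of-envelope sketch (the workload size and uplink speed are made-up assumptions; plug in your own):

```shell
#!/bin/sh
# Hypothetical figures: adjust for your own workload and uplink.
WORKLOAD_GB=500
UPLINK_MBPS=100     # usable upstream bandwidth, megabits per second

# 1 GB is 8000 megabits, so hours = GB * 8000 / Mbps / 3600.
HOURS=$(( WORKLOAD_GB * 8000 / UPLINK_MBPS / 3600 ))
echo "online transfer: roughly $HOURS hours"
```

If driving the copied disks to a nearby provider takes an afternoon, the courier wins.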
The tools required to effect a migration are easy to come by; typically they are already built into the operating system or application. For example, the Windows Server operating system includes the file replication command Robocopy (Robust File Copy), and SQL Server provides native backup for creating a backup copy of a database. For Linux there are the mysqldump and rsync utilities. These tools let you take small bites: copy, test, copy again, retest, then fail over. That is better than doing it all at once, only to discover you have to do it again because there were data issues, or because the internet connection at the source turns out to be a bottleneck.
There are no economies of scale with regard to labor costs. Migration can be a lengthy process and a hefty project line item.
Manual migration is best for one or two simple applications without a lot of dependencies and custom configurations. If you have to install nine additional software components just to get the application running, this method may not be optimal; at worst it will take a long time to complete. Other possible complications: you may not have the installation media for the apps, or you may be unaware of all the 'hooks' tying the application together. Perhaps you inherited the application without the knowledge transfer or documentation.
The most significant risk during migration is that data is in flight, leaving it vulnerable to prying eyes or more sinister characters. Always encrypt. The Secure Shell (SSH) network protocol provides confidentiality and integrity for data crossing unsecured networks such as the internet, and using a VPN tunnel to transfer sensitive data is best practice. Furthermore, do a dry run of the migration before committing to placing workloads into production.
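Beyond the transport tunnel, you can also encrypt the payload itself before it leaves the source. A sketch using OpenSSL symmetric encryption (the passphrase and filenames are placeholders; in practice, keep the key in a secrets store rather than on the command line):

```shell
#!/bin/sh
set -e
WORK=$(mktemp -d)
echo "customer records" > "$WORK/db.bak"

# Encrypt the backup before it travels (AES-256, key derived via PBKDF2).
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:example-passphrase \
    -in "$WORK/db.bak" -out "$WORK/db.bak.enc"

# At the destination, decrypt and confirm the round trip is lossless.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -pass pass:example-passphrase \
    -in "$WORK/db.bak.enc" -out "$WORK/db.bak.out"

cmp "$WORK/db.bak" "$WORK/db.bak.out" && echo "round trip ok"
```

Encrypting at rest this way means an intercepted copy is useless even if the tunnel itself is ever compromised.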
Manual migration is not a good fit for applications with short Recovery Point Objectives (RPOs). A situation to consider: if data is still coming into the source system while it is being copied, might some of it be lost? Again, the larger the amount of data to be copied, or the smaller the internet connection, the slower the migration will be. To avoid data loss, you should be able to complete the final cutover within a normal maintenance window.
In Post #3, we’ll be looking at offline media transfer by shipping portable media.