Although support for Windows Server 2003 ended on July 14, 2015, many companies are still running this server operating system. Networks that need a server in the first place tend to be large, so changing the server's operating system is difficult, and many of these companies end up stuck on the outdated version.
Part of the problem is that migrating away from Windows Server 2003 (or any server operating system, for that matter) can be a very big task indeed. First, just planning such a migration can be very time-consuming. Then there is the cost of upgrading the software: the direct cost of the replacement operating system, plus the cost of any other software that has to be replaced as a result. Finally, there is the actual logistics of migrating all of the data and services from the old operating system and/or server to the new one.
Anyone who works in IT will already understand what a task this is, and especially how many things could go wrong along the way. Any IT provider also knows that the two things you most need to avoid on your networks are downtime and loss of data, and both are a big concern when migrating to a new server operating system. If anything goes wrong during the migration, downtime is the likely result, and whether the network is your own or a client's, that downtime can be very expensive and so must be avoided at all costs.
The loss of data can be just as expensive. The problem with server-based networks, and specifically the servers that run them, is that data comes in many forms, it can be very large in both size and item count, and it is often a requirement that the data be kept for a certain number of years. Losing any of that data during the migration can therefore be catastrophic, resulting in downtime, lost work and even legal action.
Even more important is the fact that data sets are often so large that even confirming no data has been lost can be difficult. Imagine a company where client data is stored for five years even after the client is no longer active: these archived records might never be checked unless there is an issue or that client comes back on board. After a migration, staff may never look at those records again and so would not know if they were missing.
For this reason, it is often the IT provider's job to have a foolproof way of checking the data before and after the migration, to confirm that everything has been transferred and that nothing is corrupted post-transfer.
With these issues in mind, here are some of the key steps you can take as an IT provider to ensure that your system migration goes as smoothly as possible, and that even if a major issue does occur it can be mitigated without downtime or loss of data.
Keep a roll-back procedure in place
This is very important and often involves installing brand-new hardware to house the upgraded operating system. Once this hardware is installed, it can be joined to your existing network and the services migrated over one at a time. The advantage of this method is that the existing server stays in place until everything has been fully tested and you are ready to remove it, so if anything goes wrong along the way there is always the option to roll back to the existing setup.
Test at every step
Another very important step is to test everything along the way in a non-production environment. For example, when you are ready to migrate the first client to the new operating system, this can be done in a test environment: take just one client, migrate them over as if in production, then test everything to confirm it all went as planned.
Use third-party software for data transfer
As explained earlier in this article, data transfer is one of the most crucial aspects of a migration. If it is not well planned and carried out correctly, it can go wrong repeatedly, which can massively increase the length of the migration. Worse, a botched data migration can result in lost data, broken file permissions or problems with long data paths afterwards. All of these are a nightmare for the IT staff responsible for the migration, and each is a point where data could be lost and the company could incur costs.
The best way to avoid these problems is to use a third-party program to do the transfer. The copy-and-paste options built into Windows are simply not sufficient here, as they lack adequate error handling and logging facilities.
One of the best programs on the market is GS RichCopy 360 Enterprise, offered by GuruSquad. It is commercial software, yet it is very affordable compared with most other data-copy programs out there. It makes copying data during your migration a hassle-free task and takes care of the things that would normally be potential issues during a data migration.
For example, it can copy open files, it has long-path-name support, and it can copy NTFS permissions from the source to the destination.
Along with these tips, plan your migration down to the last detail. Don't leave anything to chance or to the day of the migration itself: make sure you know everything about the system, exactly what needs to be migrated, what needs to be tested at the other end, and so on.
With all of these tips in mind, your migration should go smoothly.