Two weeks ago, I had the pleasure of attending and presenting at the 2016 Nuix User Exchange in Huntington Beach, California. I always love the User Exchange because, unlike some other conferences in our industry that are incredibly high-level or more focused on sales and marketing for vendors, Nuix has done an excellent job of making this a tactical conference – one where attendees and presenters alike come forward to share knowledge and teach one another how best to use Nuix technology. Every year I come away with some newfound knowledge about Nuix, and this year, my goal was to return the favor and share some of my own specialized knowledge.
My presentation focused on Intelligent Migrations using Nuix. No, this kind of migration is not the ediscovery case migration like we might traditionally think about, but rather the movement of data from one enterprise archiving solution to a new destination. The intelligence component refers to the strategy and steps that we can take to help add value to migrations between platforms. All too often, the migration business approach has been focused on moving data from point A to point B, without realizing that we can do so much more with the right technology and skill set!
I've always been a big proponent of using Nuix for these types of migrations for three primary reasons:
- Nuix works from the file system backwards, which means it does not rely on notoriously unreliable archive APIs that can be corrupt, can have missing links to the data, and can be slow to export.
- Nuix allows filtering and other data reduction steps to be applied quickly at the time of indexing and extraction, without sacrificing speed.
- Nuix offers item-level auditability that might not be important to IT, but is extraordinarily important to legal and compliance to ensure there has been no loss of data fidelity.
During the migration process, Nuix offers the ability to take objective data reduction steps that, in conjunction with a solid migration plan and buyoff from key business and legal stakeholders, allow you to do the following to old legacy archive data:
- Date filtering
- Employee and mailbox culling
- File type culling
- Size-based culling
- Domain analysis
- Custom filtering from third-party inputs
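To make the culling categories above concrete, they can be thought of as a single pass/fail predicate applied to each archived item. The sketch below is an illustrative Python example only, not the actual Nuix API; the `ArchiveItem` fields, thresholds, and exclusion lists are all hypothetical assumptions that would come from your migration plan and stakeholder buyoff.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record standing in for the per-item metadata an indexing
# engine exposes; field names are illustrative, not the Nuix object model.
@dataclass
class ArchiveItem:
    path: str
    mailbox: str
    item_date: date
    size_bytes: int
    extension: str
    sender_domain: str

def should_migrate(item,
                   cutoff=date(2010, 1, 1),
                   excluded_mailboxes=frozenset(),
                   excluded_extensions=frozenset({"tmp", "log"}),
                   max_size_bytes=250 * 1024 * 1024,
                   excluded_domains=frozenset()):
    """Apply objective culling rules; return True if the item survives."""
    if item.item_date < cutoff:                        # date filtering
        return False
    if item.mailbox in excluded_mailboxes:             # employee/mailbox culling
        return False
    if item.extension.lower() in excluded_extensions:  # file type culling
        return False
    if item.size_bytes > max_size_bytes:               # size-based culling
        return False
    if item.sender_domain in excluded_domains:         # domain analysis
        return False
    return True
```

In practice, each rule (and its parameters) should map back to a documented, agreed-upon decision so that the reduction remains defensible; custom filtering from third-party inputs would simply add further predicates of the same shape.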
To some extent, the options for reducing archive data on its way to a new destination are limited only by the creativity and defensibility of the actions being taken. The important takeaway here is not that you should do all of these things for every migration, but that you can. The key is to understand the business objectives associated with the migration, and to ask questions that generate conversation about potential options. For example:
- How are you going to retain and track holds between your platforms?
- Why are you moving all 200 TB of data?
- Have you talked to legal about the risk of moving this data?
Having this much capability available at your fingertips can be dangerous if not properly planned and executed. Some of what we discussed in my session at the conference included how to use Nuix to execute a migration and the required skillsets, creative intelligent migration strategies, and case studies of successful migrations. We also had some excellent discussion amongst attendees – war stories, successes, and questions – that really took the session to the next level.
The takeaway? Migrations between legacy platforms and new destinations are always going to be painful and each one is going to be a snowflake, no matter how many times you see a particular source. However, with the right strategy, technology, and skillset applied to the migration from the outset, you'll not only have a successful migration outcome, but potentially add a considerable amount of value to your business.
Are you considering a legacy data migration or have you recently executed one? What has the experience been like for you? Feel free to reach out to me at firstname.lastname@example.org.