Posted by: ashish2k1 | July 27, 2008

Migration – Time to Put your Thinking Hats on!!!


It was a busy weekend and I had a lot of tasks to wind up.
I was busy preparing to move to the new home I bought recently.
After being accustomed to the old place, I started listing the tasks needed to get the new home ready to live in, and only then did I realize it's not just dumping my luggage from one place to another; there are many more things to take care of. Time to get to know the new place now…

Coming back to ECM systems, it's always a tough task to complete migration projects with 100% success.
The probable reasons: either we are not sure about the migration strategy's side effects, or we do not know the ECM system completely.
Many enterprises are still struggling to get the new system working like the old one, even after their migration projects are complete.

Here I would like to share one of my experiences from a Documentum migration project.
Hold on, it was not from one ECM to another ECM. It was a database migration (read: Database A to Database B).
Here is the requirement in brief.

The business had already decided to move to Database B for various reasons (read: vendor recommendation), and I was involved in planning the strategy for the move. Various third-party migration utilities like Buldoser and Dixi were analyzed and were among the options we had for an OOTB migration. Documentum "Dump and Load" was also an option, and the problem with this approach was the size of the data to migrate: it was too large (read: several hundred gigs). The question was, why not go the EMC-recommended way and use dump and load, migrating the data in small chunks? I tried a small POC for this and, ohhh, Dump and Load dumps and loads not only what it is asked to but also the related Documentum objects, to maintain integrity. This was good news as well as bad news for us.

When we go for multiple loads, it will load the same data twice, without any complaint, if it qualifies under the "related objects" criteria. Grrr…. Due to this duplication of data, we could not go with the approach as-is…

No worries, Mate!!! Time to put your thinking hats on, understand the migration approach first, and then go for the implementation.

I had a discussion with my colleagues (read: supercomputer minds) who were working on this assignment with me, and we decided to go for a custom solution built on Dump and Load: do another POC with the same mechanism, but this time play with how the dump and load are driven.

By this time I was fully enjoying this assignment… We did the POC keeping the following points in mind –

  • Remove all inconsistencies and run all housekeeping jobs before the dumps
  • Dump and load user and ACL objects before dumping the documents
  • No dump file should exceed the magic "2 GB limit" (to be very frank, I do not want to use the word "limit" here)
  • Use the "dump without content" approach, so no extra disk space is needed
  • For the documents, use batch dumps so that each dump file stays under 2 GB
  • Identify and delete any folders in the target docbase that were already loaded by a prior batch dump
  • Reconcile the docbases after the load
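The batching point above, keeping every dump file under the 2 GB limit, can be sketched as a greedy packing of folders into batches. This is only an illustration, not the tool we used: folder names and size estimates (which in practice would come from a DQL query against the docbase) are hypothetical.

```python
# Sketch: greedily pack folders into dump batches so that each batch's
# estimated dump size stays under the 2 GB limit. The folder paths and
# byte sizes are illustrative; in a real run they would come from a DQL
# query against the source docbase.
TWO_GB = 2 * 1024 ** 3

def plan_batches(folder_sizes, limit=TWO_GB):
    """folder_sizes: list of (folder_path, estimated_bytes) tuples.
    Returns a list of batches, each a list of folder paths."""
    batches, current, current_size = [], [], 0
    for path, size in folder_sizes:
        if size > limit:
            raise ValueError(f"{path} alone exceeds the limit; split it further")
        if current and current_size + size > limit:
            batches.append(current)       # close the batch before it overflows
            current, current_size = [], 0
        current.append(path)
        current_size += size
    if current:
        batches.append(current)
    return batches
```

Each resulting batch then becomes one dump/load cycle; the order of folders matters only in that related objects already loaded by an earlier batch must be reconciled later.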

The POC was successful, and now we were clear about how dump and load works and, of course, about our migration strategy.
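The reconciliation step can be as simple as comparing per-type object counts between the source and target docbases. A minimal sketch, assuming the counts have already been gathered (in practice, via a DQL query such as `select count(*) from dm_document`); the type names and numbers below are made up:

```python
# Sketch: post-load reconciliation by comparing per-type object counts
# between the source and target docbases. In practice the counts would
# come from DQL (e.g. "select count(*) from dm_document"); here they are
# plain dicts for illustration.
def reconcile(source_counts, target_counts):
    """Return {type_name: (source_count, target_count)} for every
    object type whose counts differ between the two docbases."""
    mismatches = {}
    for doc_type in set(source_counts) | set(target_counts):
        s = source_counts.get(doc_type, 0)
        t = target_counts.get(doc_type, 0)
        if s != t:
            mismatches[doc_type] = (s, t)
    return mismatches
```

An empty result means the counts line up; anything else points you at the object types to investigate before signing off the batch.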

Now the only task left was to automate the entire approach (as we were pretty clear about what we were doing) using Windows Scheduled Tasks (a basic utility, but it helps a lot, I should admit).
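The automation boils down to generating one dump script per batch and letting the scheduler run them in sequence. A rough sketch of the script generation, assuming the `dm_dump_record` IAPI object with its `file_name`, `type`, and `predicate` attributes as described in the Content Server guides; verify the exact syntax against your server version, and note the file paths here are hypothetical:

```python
# Sketch: render an IAPI script for one batch dump. The dm_dump_record
# object and its file_name / type / predicate attributes are documented
# in the Content Server Administration Guide; exact syntax should be
# verified against your server version. Paths are illustrative.
def render_dump_script(dump_file, folder_paths, doc_type="dm_sysobject"):
    lines = ["create,c,dm_dump_record",
             "set,c,l,file_name",
             dump_file]
    for path in folder_paths:
        # one type/predicate pair per folder to be dumped
        lines += ["append,c,l,type", doc_type,
                  "append,c,l,predicate", f"folder('{path}', descend)"]
    lines += ["save,c,l",       # saving the record object starts the dump
              "getmessage,c"]
    return "\n".join(lines)
```

Each rendered script can then be fed to `iapi` from a batch file that a scheduled task fires off per batch.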

We executed the migration exercise without any issues and celebrated the execution with "Champagne" (apologies to my dear colleagues, if they are reading, as I just had a sip to accompany them).

Now when I correlate the "migration experience" with "moving to my new home", I come to a conclusion: be it an ECM/database migration project or moving from one place to another, you have to put your Thinking Hats ON.

See you next weekend…

N.B. Your suggestions and feedback are always welcome.



  1. Nice way of describing a migration activity. Did you build any custom tools for the migration effort? I'd like to see a post describing the pre & post steps that go with such major migrations.

    On a second note – congrats on moving to the new home. When are you opening the champagne? I promise – even I'll just have a sip 😉

  2. It was very kind of you to share your experience with the community – especially your technical points. Some companies feel that sharing information is giving company secrets – I feel the opposite in that Documentum is already very complex and if other people are willing to share their experience, you should do the same. Kudos to you.

  3. Hey Jatin,

    We just used the dump and load API. You can get the details in Content Server Fundamentals / the Administration Guide.

    We did not build any utility, but we understood the mechanism and customized the solution.

    Of course, I shall write something in detail about pre and post steps for big migration projects.

    Stay Tuned and Thanks for the congratulations.

    Kind Regards,
    Ashish Srivastava

  4. Thanks Johnny
    I am totally with you as far as sharing the experiences is concerned.

    Thanks a lot for the appreciation

    Kind Regards,
    Ashish Srivastava

  5. I had one similar question.

    Is it possible to recover several files, say 500+ (in one particular folder), using some DCTM APIs?

    Obviously, it looks like we would also need to write a program (say, in Java) to execute the many API calls required.

    I was unsure about this: what we did was dump and load that deleted folder from the development server to the production one.

    Thanks for this article.

    One other thing I want to ask, related to this article: is it quite simple to dump and load between heterogeneous databases, say dump from Oracle and load into SQL Server, without any problems in the metadata?

  6. Hi Ashish21k1,
    Your statement –
    “Hold on- It was not from one ECM to another ECM – It was a Database migration( read it: Database A to Database B)”.

    Can you elaborate a bit more on your above statement? What about migrating a Documentum (Web Publisher content) AIX repository to a Documentum Intel repository? What type of content is your repository created from – Web Publisher, Webtop, etc.?

  7. Hi,
    That was great, you shared your experience, but I would like to know: did you migrate dm_audittrail objects too? If you can help me with the same, as I am looking forward to carrying out this operation using dump and load… is there any other way to do it?

    • No Dhiren, we did not migrate audit trail objects. Not sure if you still have that requirement; apologies for getting back so late, but if you completed the project, do share what you did to migrate the objects.

  8. Anshul,

    Thanks for reading the article and sharing your comments. It is not "quite simple" to dump and load between heterogeneous repositories. From a metadata perspective, yes, it will handle it, but you need to weigh other aspects as well.

    Apologies for late response

    Kind Regards,
    Ashish Srivastava

  9. Thanks, Anthony, for your question. Can you elaborate on the requirement, especially the Documentum server version and operating system version for source and target?

    Kind Regards,
    Ashish Srivastava

    • Hi Ashish
      My original post was made in 2010. It was a long time ago. We never proceeded with the migration from an AIX Content Server to an Intel Content Server. If it helps, it was from AIX v6.5 SP1 to Intel v6.5 SP1. Currently, we have both AIX and Intel servers running v6.5 SP3 Patch 24.


      • Thanks Anthony !
