The Client

British Telecommunications PLC is a major global telco providing voice, network and data solutions to both domestic and business customers. It processes in the order of 320 million transactions a day with sub-second response times. 18 of the UK's top 20 financial institutions trust BT with their data networks, one third of financial trades rely on services from BT, and 80% of FTSE 100 companies rely on networked services from BT.

The requirement was to migrate complex, high-value circuit data, breaking up existing systems and processes, in an exercise that was open to public scrutiny by competitors and the press.


On previous migrations data was always seen as a problem. But curiously, nobody's problem. The technologists did not see data quality as their problem. The business Operations team didn't see data as a problem at all. Everything ran smoothly enough right now, didn't it? Any data issues had to be technical, not operational. Didn't they?


There was a recognition that "How To" knowledge existed out there, and BT didn't want to relearn from their own mistakes what they could borrow from someone else. They wanted to create pressure on Operations and Systems to get the data right so it would load, but were unclear how to manage that without straining already delicate relationships with their business colleagues. After all, this was a regulatory-inspired change. It impacted the way they worked but gave them no benefit.

They wanted to "Defuse the ticking time bomb" that was the data quality within the source systems.


For scale and complexity there are few organisations with data to compare with a national telco.

OSS for BT CS.png

There were multiple Operational Support Systems - the Landscape Analysis phase had identified over three thousand systems that underpinned BT.

There were multiple product types and families. Products in telcos are especially tricky. Products sold to customers are nearly always combinations of lower-level products: some physical, like cabling; some logical, like network topography; some software, like VoIP. Landscape Analysis identified one hundred thousand products.

Processes for BT CS.png

There were multiple Operational Activities/Processes - from logically designing solutions to physically implementing those designs in network connections in exchanges.

There were multiple Workflows that passed through the operational processes - Order Fulfilment, Fault Fix, etc.

Master Date for BT CS.png

There were multiple master data items, like Customer and Location (a large number of which were not addressable, and some of which were beneath the ground).


Put all this together...

Our Migration team's scope was simple. We were not responsible for the solution design, but we were responsible for ensuring that the design was met, and that the process of meeting it did not compromise essential business drivers or lead to inconsistent data sets.

In a way this project was like a large de-merger. Although all the systems would carry on running after the separation of Domestic from Commercial, from the perspective of the Commercial team the systems they had inhabited pre-migration would no longer be available to them post-migration.

Add in the time aspect, because the migration was being phased by Product Group and Family. Allow for the development of new products along the way (a telco business never stands still) and we have an extremely complex programme to deliver.

To manage this complexity required tight Change Management supported by rigorous Configuration and Master Data Management.

And this had to happen both within the project and across the business-as-usual development of new products and the implementation of new technological infrastructures.

Master Data Management - The master data management aspects of PDMv2 were called into play to create pan-programme lists of agreed values for several entities. First the master data item list was agreed:

  • Customers
  • Products
  • Locations

Of these three, changes to products and the introduction of new products were the most destabilising. Change Management needed to be aware of the product-to-product relationships as well as the product-to-customer-to-location impacts to produce coherent new platform releases with supporting data.
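The impact tracing described above can be sketched as a walk over the product hierarchy. This is a minimal illustration, not BT's actual MDM schema - the product names, relationship structure and customer links are all invented for the example:

```python
from collections import defaultdict

# Hypothetical master data: a sold product bundles lower-level products,
# and some products are linked to customers. All names are illustrative.
product_deps = {                 # parent -> lower-level products it bundles
    "BroadbandPlus": ["VOIP", "Copper"],
    "VOIP": ["NetworkTopology"],
    "Copper": [],
    "NetworkTopology": [],
}
product_customers = {"BroadbandPlus": ["AcmeCo"], "VOIP": ["BetaLtd"]}

def impacted_products(changed, deps):
    """All products that directly or transitively bundle the changed one."""
    parents = defaultdict(set)           # invert: child -> parents
    for parent, children in deps.items():
        for child in children:
            parents[child].add(parent)
    seen, stack = set(), [changed]
    while stack:
        for up in parents[stack.pop()]:
            if up not in seen:
                seen.add(up)
                stack.append(up)
    return seen

def impacted_customers(changed, deps, customers):
    """Customers touched by a change anywhere below their product."""
    hit = {changed} | impacted_products(changed, deps)
    return sorted({c for p in hit for c in customers.get(p, [])})
```

So a change to a low-level logical product ripples up: `impacted_customers("NetworkTopology", product_deps, product_customers)` flags both the VoIP customer and the customer of the bundle that contains VoIP.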

Change Management - The MDM lists were tightly controlled both in the production environment and on the project. Because this was a phased migration, changes had to be consistent in the Legacy as well as the Target for the migration to work. This was achieved using rigorous change control to ensure that all project configurable items touched by changes to Masters were changed in step.
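The "changed in step" rule lends itself to a mechanical check: every configurable item registered against a master, on both the Legacy and Target sides, must appear in the same release as the master change. A minimal sketch, with an invented registry format standing in for the real change-control records:

```python
# Hypothetical registry: master data item -> configurable items that must be
# updated whenever that master changes. Item names are illustrative only.
ci_registry = {
    "Product:VOIP": ["legacy:LDS-042-mapping", "target:load-program-7",
                     "target:product-catalogue"],
}

def out_of_step(master_change, release_items, registry):
    """Configurable items the release fails to carry for this master change."""
    required = set(registry.get(master_change, []))
    return sorted(required - set(release_items))

# A release that remembered the Target but forgot the Legacy side:
gaps = out_of_step("Product:VOIP",
                   ["target:load-program-7", "target:product-catalogue"],
                   ci_registry)
```

Any non-empty result is a release that would leave Legacy and Target inconsistent and so should be held back by change control.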

Configuration and Release Management - Defining the configurable items - load programs, target and legacy data items, reports, even marketing materials for new products - ensured that every aspect of the total picture impacted by a change was updated in line with that change. Releases were planned and documented at the configurable item level so that all the clients of the impacted items were aware of what was being changed in each release. By clients here we mean work streams like Testing or Training within the programme, as well as work areas in the ongoing business outside the programme.

Data Quality Rules, Fault Fixing and Data Structural Analysis - Of course there were data challenges in the 300-plus Legacy Data Stores that controlled the legacy environment. Given the close proximity of the data stores to the engineering set-up of live circuits, the "Reality Check" quality of the data was closer than is often the case in other ERP environments, but as each LDS was built with only its local use in mind, there were considerable structural differences between data stores. The use of a PDM migration model and structural analysis helped identify these differences well in advance of the target being available.
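Structural analysis of this kind amounts to comparing each Legacy Data Store's field set against the common migration model. The sketch below uses invented schemas and field names to illustrate the shape of such a comparison, not the actual BT data structures:

```python
# Hypothetical migration model: the common set of fields the migration
# expects every circuit record to supply. Field names are illustrative.
migration_model = {"circuit_id", "a_end", "b_end", "bandwidth", "status"}

# Two invented Legacy Data Stores, one of which uses its own local naming.
legacy_stores = {
    "LDS-A": {"circuit_id", "a_end", "b_end", "bandwidth", "status"},
    "LDS-B": {"circuit_ref", "a_end", "b_end", "status"},
}

def structural_gaps(model, stores):
    """Per-store report of fields missing from, or extra to, the model."""
    return {
        name: {"missing": sorted(model - fields),
               "extra": sorted(fields - model)}
        for name, fields in stores.items()
    }

report = structural_gaps(migration_model, legacy_stores)
```

Run across hundreds of stores, a report like this surfaces the stores needing mapping or enrichment work long before the target schema is available to load against.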

These data structural issues, along with those from data profiling and test failures, were fed into the DQR process, which in turn provided data readiness projections that were fed into the release planning process - as well as, of course, data fixes, mostly in terms of transformations to be applied to data in transit from source to target rather than amendments to data in the network-supporting software.
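One simple way a readiness projection of this kind can be derived is by extrapolating the rate at which open Data Quality Rule issues are being closed. This is a sketch under that assumption - the burn-down approach and the figures are illustrative, not the programme's actual method:

```python
import math

def weeks_to_ready(open_counts):
    """Project weeks until zero open DQR issues from the average weekly
    burn-down rate over the observed history."""
    if open_counts[-1] == 0:
        return 0
    burns = [a - b for a, b in zip(open_counts, open_counts[1:])]
    rate = sum(burns) / len(burns)       # average issues closed per week
    if rate <= 0:
        return None                      # not converging: escalate
    return math.ceil(open_counts[-1] / rate)

# Invented weekly open-issue counts for one Legacy Data Store:
projection = weeks_to_ready([120, 95, 75, 60])
```

A projection like this, produced per data store or per Product Group, is what lets release planning decide which phase a body of data can safely join.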

Note: This was the migration where the technique, incorporated into PDMv2 as the Migration Model, was fully developed to cater for the large number of Legacy Data Stores that needed to be analysed and to get us "ahead of the game". It was also the first place where a Fall Forward (as opposed to a Fall Back) plan was created so that, if all else failed, we would be compliant come what may. The Fall Forward plan did not have to be enacted.

Testing - By aligning test packs to known planned releases, the testing could be optimised and maximum reuse made of existing tests and test data.
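Given releases documented at the configurable item level, the alignment reduces to selecting existing packs that exercise any item changed in a release. A minimal sketch, with invented pack and item names:

```python
# Hypothetical mapping of existing test packs to the configurable items
# they exercise. Pack and item names are illustrative only.
test_packs = {
    "pack-orders":   {"load-program-7", "order-workflow"},
    "pack-faults":   {"fault-workflow"},
    "pack-products": {"product-catalogue", "load-program-7"},
}

def reusable_packs(release_items, packs):
    """Existing packs exercising at least one item changed in this release."""
    changed = set(release_items)
    return sorted(name for name, covered in packs.items() if covered & changed)

# Which packs can be reused for a release that changes one load program?
packs = reusable_packs(["load-program-7"], test_packs)
```

Anything not selected needs no regression run for that release, which is where the optimisation comes from.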


This white paper was produced on the back of a presentation given at the Informatica World conference, Orlando, 2007. At that point the first three phases had been undertaken and the remaining phases were planned out over the next four years, the whole programme being successfully completed in 2011.

Lessons Learned

  • Data Migration is a speciality in its own right.  Treat it as such and employ trained, experienced, specialist resources
  • Analyse your situation and create your own strategy using tools and techniques available to you.
  • Don’t be afraid to innovate to cover the unique aspects of your migration
  • The higher the operational involvement the higher the chance of success
