As we plan to move large datasets at an enterprise level, we request data preparation tools.
Practical use of functionality?
As we move hundreds of gigabytes of data, we want to make sure that we achieve the best data quality on the first run.
What is the impact of not doing this?
When moving large datasets (on the order of hundreds of gigabytes or more), we need to be able to validate the integrity of the data to ensure the best possible data quality. Without this, there will be a significant impact on production and on the legacy data being integrated into BCDE.
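One way the requested integrity validation could work is a post-transfer check that compares file sizes and streamed checksums between source and target, so corruption is caught before the data enters production. This is only a hedged sketch of the idea, not the actual tooling being requested; the function names are illustrative.

```python
import hashlib
from pathlib import Path


def file_checksum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so hundreds of GB never load into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


def validate_transfer(source: Path, target: Path) -> bool:
    """Cheap size check first, then a full checksum comparison."""
    if source.stat().st_size != target.stat().st_size:
        return False
    return file_checksum(source) == file_checksum(target)
```

A validation pass like this would run once per moved file; the size comparison rejects obvious truncation quickly, and the checksum catches silent corruption.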