Data Verification

After data ingestion is complete, always verify and validate the imported data.

It is best practice to:

  • Verify that the data is correct and uses the expected units, especially for usage percentage, RAM size, storage size, CPU count, and whether a server is physical or virtual.

    • For a virtual server, map the vCPU count to the number of cores per processor.
  • Check for redundancy in data:

    • Do not import a host machine; always import its virtual machines' data instead.
    • Storage IOPS can significantly affect the cost-savings analysis. Whenever a storage volume's input/output operations per second (IOPS) exceed a few hundred, confirm the figure with the customer: higher IOPS increase storage cost, and customers can often achieve higher IOPS with a disk array, avoiding that extra cost.
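The checks above can be expressed as a small validation pass over the imported records. This is an illustrative sketch only: the field names (`is_host`, `is_virtual`, `vcpu_count`, `ram_gb`, `iops`) and the 500-IOPS threshold are assumptions, not the tool's actual import schema.

```python
# Hypothetical validation sketch for imported server records.
# Field names and thresholds are assumptions, not the tool's real schema.
IOPS_REVIEW_THRESHOLD = 500  # "a few hundred" -- confirm higher values with the customer

def validate_asset(asset: dict) -> list[str]:
    """Return human-readable warnings for one imported server record."""
    warnings = []
    if asset.get("is_host"):
        warnings.append("host machine imported -- import its virtual machines' data instead")
    if asset.get("is_virtual") and not asset.get("vcpu_count"):
        warnings.append("virtual server missing vCPU count (needed to map to cores per processor)")
    ram = asset.get("ram_gb", 0)
    if ram > 4096:  # suspiciously large -- RAM may have been imported in MB, not GB
        warnings.append(f"RAM {ram} GB looks like a unit error (MB imported as GB?)")
    if asset.get("iops", 0) > IOPS_REVIEW_THRESHOLD:
        warnings.append(f"IOPS {asset['iops']} exceeds {IOPS_REVIEW_THRESHOLD}; confirm with customer")
    return warnings
```

Running it over each imported row and reviewing the warnings with the customer covers the unit, redundancy, and IOPS checks in one place.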

  1. Verify data is imported as expected.

    1. Go to Portfolio data → Assets and expand the help panel (click the ❕ icon on the right) to learn what's important on the page.

Servers, Shared storage, and Network assets are primarily used for cost calculations. Applications and Databases are currently used for migration planning; Databases are additionally used for RDS recommendations in the TCO estimate on AWS.

Each section of the Assets page has a data validation rules section (a preview feature), which we will explore in the next section.

  2. Go to Portfolio data → Dependencies and click the ❕ icon in the top-right corner to learn what's important on the page.

All four of these dependency types (Server to application, Server communication, Database to application, Application dependency) are used for application grouping. They are not currently used elsewhere in the tool.
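Application grouping from dependency edges can be pictured as finding connected components over those four edge types. The sketch below is a hypothetical illustration of that idea using union-find; the tool's actual grouping logic is not documented here, and the node names are made up.

```python
# Illustrative only: group assets into candidate applications by treating
# all dependency edges (server-to-application, server communication, etc.)
# as undirected links and taking connected components via union-find.
from collections import defaultdict

def group_applications(edges: list[tuple[str, str]]) -> list[set[str]]:
    """Each connected component of the dependency graph is one candidate group."""
    parent: dict[str, str] = {}

    def find(x: str) -> str:
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in edges:
        parent[find(a)] = find(b)  # union the two components

    groups = defaultdict(set)
    for node in parent:
        groups[find(node)].add(node)
    return list(groups.values())
```

For example, edges `[("web1", "app-crm"), ("db1", "app-crm"), ("web2", "app-hr")]` yield two groups, one per application.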