Zebra Technologies, a leader in asset-tracking technology, faced challenges with SLA management, high operational costs, and scalability. Migrating to a different environment required new technology and substantial re-coding. Moreover, it was essential to ensure that data and reports matched between the source and target platforms.
Data migration from the AWS environment to GCP
- Datametica began code migration and refactoring using Raven, our automated code conversion tool.
- Pelican, our automated data validation tool, automated the reconciliation process, providing greater efficiency, granular comparison, zero data copy, and a no-coding approach, which gave Zebra the confidence to decommission its legacy warehouses.
- With Datametica’s expertise, SLA times in the GCP environment were reduced considerably compared to the AWS environment.
- Batch turnaround time was successfully reduced by two hours.
- The data migration was complicated by new data continuing to arrive in the source environment. To resolve this, DistCp and Google’s Storage Transfer Service (STS) were used to reconcile the incremental data.
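Pelican’s internals are proprietary, but the zero-copy reconciliation idea described above — comparing platforms without moving the data — can be sketched in outline. The helper names below are hypothetical; only row counts and compact fingerprints cross the wire, never the rows themselves:

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint of a table: hash each row, XOR the digests.

    Each platform computes this locally, so reconciliation needs no data copy --
    only the (count, fingerprint) pairs are compared.
    """
    count = 0
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
        count += 1
    return count, acc

def reconcile(source_rows, target_rows):
    """True when row counts and fingerprints match on both platforms."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

# The same rows in a different order still reconcile; a missing row does not.
src = [(1, "zebra"), (2, "asset")]
tgt = [(2, "asset"), (1, "zebra")]
print(reconcile(src, tgt))             # True
print(reconcile(src, [(1, "zebra")]))  # False
```

A production tool would additionally compare per-column aggregates to localize a mismatch; this sketch only shows why the comparison can stay granular yet copy-free.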
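The incremental reconciliation in the last bullet can also be illustrated in outline. This is not DistCp or STS themselves (both are invoked as services), just a minimal sketch of the delta-detection idea they implement — copy only what is missing or changed on the target:

```python
def delta_to_copy(source_listing, target_listing):
    """Mimic DistCp's -update semantics: a file needs copying when it is
    missing on the target or its size differs (a real run would also
    compare checksums and timestamps)."""
    return sorted(
        path for path, size in source_listing.items()
        if target_listing.get(path) != size
    )

# Files that land mid-migration show up as deltas on the next pass.
source = {"/data/a.parquet": 1024, "/data/b.parquet": 2048, "/data/new.parquet": 512}
target = {"/data/a.parquet": 1024, "/data/b.parquet": 1000}
print(delta_to_copy(source, target))  # ['/data/b.parquet', '/data/new.parquet']
```

Running such a pass repeatedly until the delta is empty is what lets a migration proceed while new data keeps arriving.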
Datametica ensured a seamless migration of Zebra Technologies’ Asset Visibility IQ from the AWS environment to GCP, which helped improve SLA timings. Cloud Composer (managed Apache Airflow) orchestrated the dependencies between multiple workflows, improving data quality. This simplified the complex migration process and helped Zebra Technologies overcome challenges related to time, cost, and scalability.
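Cloud Composer expresses these cross-workflow dependencies as a DAG, so each workflow runs only after its upstreams complete. The ordering guarantee can be sketched with the Python standard library; the workflow names below are hypothetical, and this is the scheduling idea rather than Composer’s actual API:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical workflows: each maps to the set of upstreams it depends on.
deps = {
    "ingest": set(),
    "transform": {"ingest"},
    "validate": {"transform"},
    "report": {"validate"},
    "export": {"transform"},
}

# A valid execution order never runs a workflow before its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)  # 'ingest' comes first; 'report' follows 'validate'
```

Making dependencies explicit like this is what improves data quality: a downstream report can never read from a table whose upstream load has not finished.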
- Datametica delivered advanced services including infrastructure setup and resource provisioning, code migration and refactoring, Cloud Composer, optimal cluster configurations, and BigQuery implementation.
- Datametica provisioned separate, configurable Dataproc clusters for each workflow.
- Migration from Oozie to Composer was implemented successfully
- Jenkins setup automated the code deployment process
- Upgrading from Spark 1.x to Spark 2.x was successful
- The BigQuery implementation enabled quicker access to the underlying data.
- During code refactoring, we identified existing bugs in production code and assisted in debugging and rewriting the affected applications.
Zebra engaged Datametica to assist in migrating from AWS to GCP to solve the performance issues we were noticing when running our applications on AWS. We were also looking to optimize our cost structure and revisit the overall infrastructure architecture. Datametica provided their exceptional expertise on GCP to bring in some critical architectural optimizations involving Dataproc, Composer, and other technologies. They not only helped Zebra migrate but also upgraded our Spark code from 1.x to 2.x. The code migration involved extensive collaboration between teams to understand the functionality of the application and to agree on the technical approach to the upgrade. Some of the complex workflows were updated and migrated seamlessly. I was especially impressed with Datametica’s willingness to go the extra mile to do the right thing rather than take shortcuts. The overall commitment and ownership shown by Datametica’s leadership was outstanding and was critical to the on-time delivery of the initiative.