Technology News

latest updates from easySERVICE™

Practices for Managing Data in Application Transformation

Big Data

Introduction

Data flows freely among the applications that need it, and those applications are being transformed at rates never seen before. Seventy-one percent of enterprises said that IT is deploying more, and more complex, applications than it was a year ago. Businesses are integrating on-premises applications with cloud applications such as Salesforce (28 percent of enterprises did so in 2013) or moving them to the cloud entirely (40 percent of enterprises).

Applications are being consolidated due to a once-again robust M&A market, where aggregate M&A deal value for 2013 was $958 billion, the second-highest amount spent since the 2008 financial crisis.

Whatever your reasons for application transformation, a critical part of any initiative is migrating data from an old environment (or environments) into a new one. When the migration is part of a larger project, as it typically is, the average cost is $2.8 million. The average cost of an overrun is $268,000, roughly one-third of the cost of a typical data migration project, and the unfortunate reality is that such projects frequently experience significant cost overruns.

Integration is always a part of application transformation

Integration means making sure new systems talk to existing systems, and vice versa. Gartner predicts that organizations will spend one-third more on application integration in 2016 than they did in 2013, and that by 2018 more than half the cost of implementing new large systems will be spent on integration.

To avoid such overruns and other potential potholes on the road to a successful application transformation, here are 10 best practices you’ll be glad you knew about before embarking on your journey.

Practice No. 1: Understand process and inter-application dependencies

Data isn’t static. It flows through your organization, through multiple applications and many, many business processes. If you don’t know your data flows and dependencies, you’re heading for failure during an application transformation. A data flow is typically documented in an asset spreadsheet and then rendered graphically. This ensures data integrity within a business context.

Even when individual application owners are masters of their own domains, if they don’t understand data flows to and from other applications, they won’t be able to untangle the downstream impact of any changes they make.

Say an enterprise has established important workflows between its on-premises marketing automation and CRM systems and is decommissioning them to move to a new cloud application. Retiring them separately or at the wrong time may lead to all-important sales leads falling between the cracks.
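
One lightweight way to make these dependencies explicit is to turn the asset spreadsheet into a small dependency graph and query it for downstream impact. The sketch below is illustrative only; the application names, the FLOWS inventory, and the helper functions are hypothetical stand-ins for whatever your own inventory contains.

```python
from collections import defaultdict, deque

# Hypothetical data-flow inventory: (source app, target app, data set) tuples,
# as might be captured in an asset spreadsheet.
FLOWS = [
    ("MarketingAutomation", "CRM", "leads"),
    ("CRM", "ERP", "orders"),
    ("ERP", "DataWarehouse", "financials"),
    ("CRM", "DataWarehouse", "customers"),
]

def build_graph(flows):
    """Build a directed adjacency map: app -> set of apps that consume its data."""
    graph = defaultdict(set)
    for source, target, _dataset in flows:
        graph[source].add(target)
    return graph

def downstream_impact(graph, app):
    """Return every application reachable from `app`, i.e. every system
    affected if `app` is changed or decommissioned."""
    seen, queue = set(), deque([app])
    while queue:
        current = queue.popleft()
        for consumer in graph.get(current, ()):
            if consumer not in seen:
                seen.add(consumer)
                queue.append(consumer)
    return seen

if __name__ == "__main__":
    graph = build_graph(FLOWS)
    # For the sample inventory above, this prints CRM, ERP, and DataWarehouse
    # (set ordering may vary).
    print(downstream_impact(graph, "MarketingAutomation"))
```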

Practice No. 2: Clean and standardize data

Data quality issues are exceedingly common. It’s therefore essential to clean and standardize data before you attempt to transform your application. One way to achieve this is to employ data profiling and data quality tools. You can also take advantage of data migration tools that provide “before” and “after” comparisons of data, allowing you to verify that the migrated data is clean and accurate.

Additionally, business glossaries can help by providing definitions and descriptions of data. Not only will this help with data migrations between applications—you can then create data quality rules that automation will enforce during transformation—but it will also help business and IT work better together, because they will be using a common terminology. For example, agreeing on what constitutes a “customer” from both business and IT perspectives is extraordinarily important.
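
As a rough illustration of what profiling and rule-driven standardization can look like, here is a minimal sketch using pandas. The sample customer table, the specific rules (trimming names, normalizing country codes), and the profile fields are all hypothetical; a real project would drive these rules from the business glossary.

```python
import pandas as pd

# Hypothetical customer extract with typical quality problems.
before = pd.DataFrame({
    "customer_id": [101, 102, 102, 103],
    "name": ["  acme corp ", "Globex", "Globex", None],
    "country": ["us", "US", "US", "USA"],
})

def profile(df):
    """Minimal data profile: null counts per column and duplicate-row count."""
    return {
        "nulls": df.isna().sum().to_dict(),
        "duplicate_rows": int(df.duplicated().sum()),
    }

def standardize(df):
    """Apply simple, glossary-driven quality rules (illustrative only)."""
    out = df.copy()
    out["name"] = out["name"].str.strip().str.title()
    out["country"] = out["country"].str.upper().replace({"USA": "US"})
    return out.drop_duplicates()

# A "before" and "after" comparison helps verify the migrated data is clean.
print("before:", profile(before))
after = standardize(before)
print("after:", profile(after))
```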

Practice No. 3: During design of your new application, build in transparent data access and integration points to avoid data lockdown

Data lockdown occurs when data is isolated in an application silo and can’t be shared easily with other applications. Given the increased emphasis on improving operational efficiencies, analytics are moving into operational applications.

For front-line managers to get the kind of visibility that enables them to react in a more timely fashion to new trends, they need real-time access to data. In legacy applications or platforms where data integration was not considered upfront, providing access to data required customizations that involved lengthy development cycles.

Today, such development cycles are considered unacceptable; real-time access is the new norm. Still, integrating all these data points manually is challenging. Happily, tools are available that make such data integrations straightforward without the need to write customizations. As you replace legacy applications, or integrate legacy applications with modern ones, look for ways to leverage standards-based integration points so that custom code can be avoided.
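
To make the idea of a transparent, standards-based integration point concrete, here is a minimal sketch of a read-only JSON endpoint using Flask. The route, the in-memory CUSTOMERS store, and the field names are hypothetical; the point is simply that other applications can consume the data over plain HTTP rather than through bespoke point-to-point code.

```python
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Hypothetical in-memory store standing in for the application's database.
CUSTOMERS = {
    1: {"id": 1, "name": "Acme Corp", "segment": "enterprise"},
    2: {"id": 2, "name": "Globex", "segment": "mid-market"},
}

@app.route("/api/v1/customers/<int:customer_id>", methods=["GET"])
def get_customer(customer_id):
    """Expose customer data over HTTP/JSON so other applications
    (and analytics tools) can consume it without custom integration code."""
    customer = CUSTOMERS.get(customer_id)
    if customer is None:
        abort(404)
    return jsonify(customer)

if __name__ == "__main__":
    app.run(port=5000)
```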

Practice No. 4: Sync with existing apps and data warehouses using the most current data

When it’s time to populate a new application with data, you want to make sure you are migrating the most current data from the correct “systems of record” within your organization. In many cases, such systems are dependent on others. Even transactional systems, which are theoretically up to date at all times, can depend on data from outside applications. For that reason, you need to synchronize all relevant data from existing applications and data warehouses before migrating it to your new, transformed application.

For example, if one company has acquired another, it can end up with hundreds of older, redundant systems that it wants to consolidate into a single new one. Step 1 during data migration is to ensure that it has correctly identified the system of record for each data element. Step 2 is to ensure that the data in those systems of record is current by identifying the chain of dependence on other applications.

Then, when you develop test beds for the new application, make sure you have an automated way to checkpoint and refresh test data. When you go live, you don’t want to be surprised by gaps in the data, which could cause a business process to be skipped and important transactions to be lost. Define a checkpoint in the plan so this step doesn’t get overlooked; doing so helps you avoid duplicate or missing data sets.
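
A common building block for this synchronization is a “latest record wins” merge across source extracts. The following pandas sketch assumes hypothetical CRM and billing extracts with a customer_id key and a last_updated timestamp; your own systems of record and column names will differ.

```python
import pandas as pd

# Hypothetical extracts from two systems holding overlapping customer records.
crm = pd.DataFrame({
    "customer_id": [1, 2],
    "email": ["old@acme.com", "ops@globex.com"],
    "last_updated": pd.to_datetime(["2014-01-05", "2014-03-01"]),
})
billing = pd.DataFrame({
    "customer_id": [1, 3],
    "email": ["new@acme.com", "hello@initech.com"],
    "last_updated": pd.to_datetime(["2014-02-20", "2014-02-28"]),
})

def latest_per_key(frames, key="customer_id", ts="last_updated"):
    """Combine extracts and keep only the most recently updated row per key,
    so the new application is seeded with current data."""
    combined = pd.concat(frames, ignore_index=True).sort_values(ts)
    return combined.drop_duplicates(subset=key, keep="last")

# Customer 1 keeps the newer billing-system email rather than the stale CRM one.
print(latest_per_key([crm, billing]))
```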

Practice No. 5: Classify data and define data retention and privacy requirements

In the past, enterprises typically used data’s age as the way to determine whether to retain it. Older data was either deleted or transferred to tapes or some cheaper—and less accessible—means of storage without regard for the value it might still offer the business. Today, this is not an option, as enterprises are under scrutiny to both increase data security and retain data to meet compliance regulations. Planning ahead for a cost-effective and efficient data classification strategy is therefore an imperative during an application transformation.

In many application transformation projects you will be decommissioning now-redundant applications that support a common business process. That creates challenges, because business and regulatory compliance reasons may require that the data currently stored in those applications be archived and retained for a longer period. Some industries, particularly financial services and healthcare, have very stringent data retention requirements. Indeed, a recent survey found that 60 percent of all application migration projects require some degree of archival.

Creating a classification strategy typically involves collecting retention and business-user access requirements to ensure that critical data is still accessible to users once archived. Enterprises that are primarily concerned about compliance should create a data classification schema that maps out which fields in a database are most sensitive. You should also identify any dependencies the data may have with other stored information.
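
In code, a classification schema can be as simple as a mapping from fields to sensitivity levels and retention periods, consulted whenever data is archived. The schema below, its field names, and its retention periods are purely illustrative assumptions, not recommendations for any specific regulation.

```python
from datetime import date, timedelta

# Hypothetical classification schema: per-field sensitivity and retention period.
SCHEMA = {
    "ssn":        {"sensitivity": "restricted",   "retention_days": 7 * 365},
    "email":      {"sensitivity": "confidential", "retention_days": 3 * 365},
    "page_views": {"sensitivity": "internal",     "retention_days": 90},
}

def disposition(field, record_date, today=None):
    """Return 'retain' or 'archive' for a value created on `record_date`,
    based on the classification schema rather than age alone."""
    today = today or date.today()
    policy = SCHEMA[field]
    if today - record_date > timedelta(days=policy["retention_days"]):
        return "archive"
    return "retain"

print(disposition("page_views", date(2014, 1, 1), today=date(2014, 7, 1)))  # archive
print(disposition("ssn", date(2014, 1, 1), today=date(2014, 7, 1)))         # retain
```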

Practice No. 6: Leverage an automated test data provisioning process

Testing is an essential part of the application transformation process. It’s also the most time-consuming, and therefore costly, part. Gartner estimates that development and testing take the most time when building a new application, consuming on average 24 percent of the entire application development lifecycle. And a recent survey found that the tasks involved in managing test data, such as defining which data to use in a test case, creating or copying test data, and securing sensitive data, take more than half of developers’ and QA teams’ testing time.

Manually creating appropriate test data usually adds up to significant dollars, representing a great deal of inefficiency and waste in application transformation budgets. If your organization has multiple application development projects underway, this could mean tens of millions of dollars of testing resources that could more effectively be used elsewhere. Moreover, manually writing scripts to create test data is prone to errors.

By using an automated solution, you eliminate manual processes in favor of self-service ones. You get significantly improved test data quality, with fewer errors or defects, at dramatically reduced cost. And you lay the groundwork for future application transformations.
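
As a toy illustration of self-service test data provisioning, the sketch below generates reproducible synthetic customer rows on demand instead of hand-copying production data. The field names, value lists, and seeding approach are assumptions for the example; a real solution would also subset and mask production-shaped data.

```python
import random

random.seed(42)  # reproducible test data across refreshes

FIRST = ["Ada", "Grace", "Alan", "Edsger"]
LAST = ["Lovelace", "Hopper", "Turing", "Dijkstra"]
SEGMENTS = ["enterprise", "mid-market", "smb"]

def provision_test_customers(n):
    """Generate n synthetic customer rows on demand (self-service),
    rather than copying and scrubbing production data by hand."""
    rows = []
    for i in range(1, n + 1):
        first, last = random.choice(FIRST), random.choice(LAST)
        rows.append({
            "customer_id": i,
            "name": f"{first} {last}",
            "email": f"{first.lower()}.{last.lower()}@example.test",
            "segment": random.choice(SEGMENTS),
        })
    return rows

for row in provision_test_customers(3):
    print(row)
```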

Practice No. 7: Leverage MDM to provide a consistent single source of enterprise master data for current and future transformation initiatives

Migrating data is never a one-time event. Yet all too often during application transformations, data migration is treated as a one-off task. This can lead to higher costs, delayed deployments, longer time-to-value, and the risk of not meeting business objectives.

By using master data management (MDM), enterprises can minimize risk and speed data migration. You should choose an MDM solution that cleanses, standardizes, and enriches dirty data; removes duplicates and creates one version of the truth; and centrally manages data cleansing rules—all during the pre-migration stage. During migration, the ideal MDM solution will simplify the migration architecture by avoiding rigid models and structures that don’t allow you to adapt as your business changes. It also automates management of very large volumes of data. Finally, during post-migration, a good MDM solution will maintain consistent data across systems.
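
To illustrate one small piece of what an MDM-style consolidation does, here is a sketch of a naive survivorship rule that builds a single “golden record” from duplicates. The record layout, the update counter, and the rule itself (most recent non-null value wins) are simplifying assumptions; real MDM platforms support far richer matching and survivorship logic.

```python
from collections import defaultdict

# Hypothetical duplicate records for the same customer from different systems.
records = [
    {"key": "acme", "source": "CRM",    "phone": None,       "address": "1 Main St", "updated": 2},
    {"key": "acme", "source": "ERP",    "phone": "555-0100", "address": None,        "updated": 3},
    {"key": "acme", "source": "Legacy", "phone": "555-9999", "address": "Old Rd",    "updated": 1},
]

def golden_record(group, fields=("phone", "address")):
    """Simple survivorship rule: for each field, take the non-null value
    from the most recently updated record that has one."""
    merged = {"key": group[0]["key"]}
    ordered = sorted(group, key=lambda r: r["updated"], reverse=True)
    for field in fields:
        merged[field] = next((r[field] for r in ordered if r[field] is not None), None)
    return merged

groups = defaultdict(list)
for record in records:
    groups[record["key"]].append(record)

for group in groups.values():
    print(golden_record(group))
# {'key': 'acme', 'phone': '555-0100', 'address': '1 Main St'}
```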

The most critical aspect when choosing an MDM solution is that it is application-agnostic, repeatable, and prepares you for your next migration project. You will consistently be able to deliver quality data results, on time, and on budget. You’ll also reduce the cost of every incremental migration. And you’ll be able to scale as your business grows.

Practice No. 8: Know your data integration team and what tools they use

This is an important but frequently overlooked practice. Managers in charge of application transformation projects must get to know their data integration teams. They need to know the skill sets of the members of those teams and the tools those individuals plan to use for the integration. After all, just because someone happens to be master of a particular application doesn’t mean he or she will be able to transform the data into a new application in a way that ensures it remains clean, safe, and connected.

Some businesses search for data scientists to do this part of the application transformation job. Unfortunately, true data scientists are in short supply—a situation that is only expected to worsen in coming years as Big Data comes into its own.

Note that your data integration team most likely possesses quite a bit of data-flow knowledge and skill sets that can be leveraged during a migration—especially since the tools of data migration are typically the same as for data integration.

Practice No. 9: Build legacy application shutdown into the plan for when you go live

We are all creatures of habit. Studies show that human behavior is 93 percent predictable, and that we won’t change our behavior unless forced to. So one way to ensure the new application gets adopted quickly is by ripping off the band-aid. Don’t allow redundant applications to run in parallel. Once you have transformed your application—whether through consolidation, upgrade, or retirement—turn off the old one.

Users simply won’t adopt the new application until the old one is no longer available. And the cost of keeping legacy systems running can be exorbitant. By turning off old systems, you free up IT dollars for innovations that can more directly drive business success. Redundant systems cost you—a lot. Approximately 75 percent of organizations spend at least $100,000 annually supporting legacy applications. Larger enterprises spend even more on people, software licensing, and maintenance expenses.

Practice No. 10: Smart partitioning for application performance

It’s a fact that data volumes are growing. Data production will be 44 times greater in 2020 than it was in 2009. And enterprises are challenged to keep application performance acceptable to users given the velocity of growth. Given that IT budgets are already strained, purchasing more, and more powerful, hardware to maintain performance isn’t typically an option.

Some organizations archive data based on a classification scheme to minimize the weight of data on systems. But this isn’t always helpful, as the volumes of active data that remain are still too large. The result is that performance of mission-critical databases slows, and business decision-making is correspondingly delayed, often to the detriment of the enterprise.

Smart partitioning involves physically organizing data in the database to optimize performance. When using an automated solution, partitions can be created based on any number of parameters or complex rules, whatever makes sense based on user requests. Smart partitioning also streamlines archiving of data as it becomes less relevant to the business.
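
The sketch below illustrates the kind of rule a partitioning scheme might apply, routing open or recent orders to a frequently queried “hot” partition and older, closed orders to yearly partitions that can later be archived. The partition names, the cutoff date, and the rule itself are hypothetical examples, not a prescription for any particular database.

```python
from datetime import date

def partition_for(order):
    """Illustrative partitioning rule: keep open or recent orders in the
    'hot' partition that queries hit most often; route everything else to
    yearly 'cold' partitions that can later be archived."""
    if order["status"] == "open" or order["order_date"] >= date(2014, 1, 1):
        return "orders_hot"
    return f"orders_{order['order_date'].year}"

orders = [
    {"id": 1, "status": "open",   "order_date": date(2012, 5, 2)},
    {"id": 2, "status": "closed", "order_date": date(2011, 8, 9)},
    {"id": 3, "status": "closed", "order_date": date(2014, 3, 1)},
]

for order in orders:
    print(order["id"], "->", partition_for(order))
# 1 -> orders_hot, 2 -> orders_2011, 3 -> orders_hot
```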

Conclusion

Underpinning many of these practices is master data management: a technology-based discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency, and accountability of an enterprise’s official shared data assets.

Most application transformation projects need to mask data for security and compliance reasons. Doing this manually via hand-coding is rarely an option, for reasons of time and cost and because of the potential for errors. And because most data quality and integration solutions offer only limited masking capabilities, you should choose an automated data migration platform that incorporates best-of-breed data masking. Make sure that whatever solution you choose offers a full audit trail so you can prove that you have masked the data you need to have masked.
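
For a sense of what field-level masking with an audit trail involves, here is a minimal Python sketch. The salt value, field names, and audit format are placeholders; a production masking tool would manage keys securely and support many more masking techniques than simple hashing.

```python
import hashlib
import json

def mask_value(value, salt="project-salt"):  # placeholder salt, not a real secret
    """Deterministically mask a value so joins still work across tables,
    while the original is no longer readable in the test copy."""
    digest = hashlib.sha256(f"{salt}:{value}".encode("utf-8")).hexdigest()
    return digest[:12]

def mask_record(record, sensitive_fields, audit):
    """Mask the listed fields and append an entry to the audit trail for each."""
    masked = dict(record)
    for field in sensitive_fields:
        if masked.get(field) is not None:
            masked[field] = mask_value(masked[field])
            audit.append({"record_id": record["id"], "field": field, "action": "masked"})
    return masked

audit_trail = []
row = {"id": 42, "name": "Jane Doe", "ssn": "123-45-6789"}
print(mask_record(row, ["name", "ssn"], audit_trail))
print(json.dumps(audit_trail, indent=2))
```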

Many enterprises make the mistake of simply copying production data for testing purposes. But if you’ve outsourced your testing overseas or your test teams reside outside the firewall or in another country, you cannot use sensitive data in nonproduction, unprotected environments. You need to ensure you are in compliance with all local data privacy and data residency laws.

Organizations face a significant set of data residency (also referred to as data sovereignty) challenges when they are contemplating a move to the cloud. Cloud data residency is defined as maintaining control over the location where regulated data and documents physically reside.

Privacy and data residency requirements vary by country, and users of cloud services need to consider the rules covering each jurisdiction they operate in, as well as the rules governing the treatment of data at the locations where the cloud service provider(s) provision their services (e.g., their data centers).
