Blog Category: Strategic IT & Operations, Data migration
Many organisations believe that moving to the cloud will make working processes more efficient and cost-effective, make silos disappear and let systems integrate seamlessly. It won’t. Cloud is not a quick fix.
The good news: 96% of companies have migrated to the cloud¹.
The bad news: 62% of those projects were harder than expected or failed¹.
It’s not that surprising, then, that nearly three-quarters (74%) of companies have moved a cloud-based app back on-premises after failing to achieve the anticipated benefits².
Cloud shouldn’t be treated as a virtual data centre where the organisation simply ‘lifts and shifts’ its IT infrastructure. 45% of companies pursuing that strategy over-provision by as much as 55% during the first 18 months and overspend by 70%³.
According to the Cloud Success Barometer study, when cloud migration is planned as a ‘core’ part of the broader business transformation strategy, over three-quarters (77%) report ‘great’ or ‘moderate’ improvements⁴.
What we tell our customers is that a key feature of success is knowing what data they’ve got before attempting to move it. Approached in the right way, cloud migration is successful because all the relevant data is identified, prepared and remediated before the move happens:

Step 1: identify large legacy data sets and gain the opportunity to understand what’s inside them and delete with confidence.
Step 2: identify large amounts of unstructured data to stop unknown threats derailing your migration.
Step 3: improve the quality of data being migrated to place the business in a far more competitive position.
Every business has at least one data repository they fear – a shared folder, SharePoint site or email archive that’s full of unstructured data, like the spreadsheets and reports created as part of business-as-usual.
It’s such a huge beast that people don’t know where to start. But you can’t ignore it, because the problem just grows in size and complexity with each day that passes. And you definitely can’t migrate it as it is, because failure is all but guaranteed.
Identify those large legacy data sets, however, and you have the opportunity to understand what’s inside them.
With this insight, you can identify the redundant data that is no longer required and delete it. Then you can reveal the ‘risky’ unstructured data, such as PII and key client information, to ensure you don’t lose sight of it as the data is migrated, and that once in the cloud it has appropriate controls around it.
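To make that concrete, here is a rough sketch in Python of what a first-pass scan for risky content can look like. It is purely illustrative: the folder path, file types and patterns (email addresses and UK National Insurance numbers) are assumptions for the example, not how Exonar Reveal works under the hood.

```python
# Illustrative only: a crude first-pass scan of a shared folder for
# potentially risky content before migration. Paths, file types and
# patterns are assumptions for the example.
import re
from pathlib import Path

# Very rough indicators of PII: email addresses and UK National Insurance numbers.
PATTERNS = {
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ni_number": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b", re.IGNORECASE),
}

def scan_share(root: str, suffixes=(".txt", ".csv", ".log")):
    """Yield (file, pattern name, match count) for files containing risky-looking data."""
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix.lower() not in suffixes:
            continue
        text = path.read_text(errors="ignore")
        for name, pattern in PATTERNS.items():
            hits = pattern.findall(text)
            if hits:
                yield path, name, len(hits)

if __name__ == "__main__":
    for path, name, count in scan_share("/mnt/legacy-share"):  # hypothetical path
        print(f"{path}: {count} possible {name} value(s)")
```

Even something this crude tends to surface surprises; dedicated discovery tooling goes much further, reading inside documents, archives and email stores at scale.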
Every organisation feels the pressure of GDPR compliance, but for organisations in banking and financial services the weight is heavier still: on top of GDPR they must meet stringent requirements such as PCI DSS and the rules enforced by the FCA and PRA. Then there’s the reputational damage to account for, and consumer litigation, which is becoming increasingly common.
Understand what’s lying inside legacy data sets, and you can avoid all that unnecessary stress. With central visibility of the organisation’s data estate, you can assess large data sets at scale, and drill into specific information to know with confidence what information you have and where it is.
People are pretty resourceful, so when systems and processes don’t work as they should, they find a workaround. Spreadsheets and reports are created every day to collect and enhance the information held in structured data sets. The problem is that this unstructured data isn’t documented, and it tends not to be controlled by any governance framework, because it isn’t stored in a centralised system and no-one knows what’s in it.
Now you potentially have a major security risk, particularly if you’re going to migrate that data to the cloud. Either someone internally stumbles across something they shouldn’t, which could cause huge harm and upset, or an attacker gets into your infrastructure, takes the data, and you have to inform the regulator that you have no idea what was compromised.
Our research suggests that 85% of an organisation’s data is unstructured and unknown. This is the data that’s high risk, and yet it may contain information that’s critical for business success.
In over two-fifths of migrations (41%), data issues caused the project to run over schedule and over budget⁵.
So knowing what’s in your data is critical: it eliminates the unknown threats that could derail your cloud migration.
If you can identify your unstructured data at scale and classify it before it’s moved, you can also address the challenge of how to keep on top of data policies, processes and regulations once the migration is complete.
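As a simple illustration of policies that follow the data, the sketch below maps classification labels to the controls you might want applied once files land in cloud storage. The labels, control names and retention periods are invented for the example; they are not a specific product’s or cloud provider’s API.

```python
# Illustrative sketch: map classification labels to the controls that should
# apply once files land in cloud storage. Labels, groups and retention
# periods are assumptions for the example.
from dataclasses import dataclass

@dataclass
class Controls:
    access_group: str
    encrypt_at_rest: bool
    retention_years: int

POLICY = {
    "public":       Controls("all-staff", False, 1),
    "internal":     Controls("all-staff", True, 3),
    "confidential": Controls("need-to-know", True, 7),
    "restricted":   Controls("data-owners-only", True, 10),
}

def controls_for(label: str) -> Controls:
    """Fail closed: anything unclassified gets the strictest treatment."""
    return POLICY.get(label, POLICY["restricted"])

print(controls_for("confidential"))
print(controls_for("unknown"))  # falls back to 'restricted'
```

The design choice worth noting is the fallback: data you haven’t classified is treated as the most sensitive, not the least.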
Did you know that only 19% of organisational data is business critical⁶?
And according to a study published in the Harvard Business Review, only 3% of companies’ data is rated as being of ‘acceptable’ quality, with 47% of newly created data records having at least one critical, ‘work-impacting’ error⁷. Consider that most data sets are connected – whether because systems are integrated, or because people are creating reports from structured data sets – and a single error can have a wide-ranging impact.
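That 47% figure is easy to sanity-check against your own data. Here is a minimal, illustrative sketch that runs each record in a hypothetical customer extract through a few basic rules and reports the share failing at least one; the file name, field names and rules are assumptions for the example.

```python
# Illustrative sketch: measure the share of records with at least one
# critical error before migration. File, field names and rules are assumed.
import csv
import re

EMAIL = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def record_errors(row: dict) -> list[str]:
    """Return the list of basic quality rules this record breaks."""
    errors = []
    if not row.get("customer_id", "").strip():
        errors.append("missing customer_id")
    if not EMAIL.match(row.get("email", "")):
        errors.append("invalid email")
    if not row.get("postcode", "").strip():
        errors.append("missing postcode")
    return errors

with open("customer_extract.csv", newline="") as handle:  # hypothetical extract
    rows = list(csv.DictReader(handle))

total = len(rows)
failed = sum(1 for row in rows if record_errors(row))
if total:
    print(f"{failed}/{total} records ({failed / total:.0%}) have at least one critical error")
```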
Gartner believes that poor data quality is responsible for 40% of all business initiatives failing to achieve their targeted benefits⁸. Cleansing data before it’s migrated may at first glance appear to prolong the process, but in reality it prevents problems further down the line.
It will also give the board confidence. According to a report from KPMG, 84% of CEOs are concerned about the quality of the data they’re basing decisions on, fearing it could lead to missed opportunities, lost revenue and reputational damage⁹.
Improve the quality of your data to get rid of anything that’s duplicated, redundant, over-retained and trivial, and you place the business in a far more competitive position.
The added bonus: removing the 46% of duplicated information sitting in your unstructured data will significantly reduce your cloud costs.
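As a rough illustration of where that saving comes from, the sketch below hashes file contents to find exact duplicates and puts an indicative monthly price on the storage you would avoid migrating. The share path and the per-gigabyte price are assumptions, and real estates also hold near-duplicates that simple hashing won’t catch.

```python
# Illustrative sketch: find exact duplicate files by content hash and estimate
# the storage (and rough monthly cost) you would avoid migrating.
# The path and price per GB are assumptions for the example.
import hashlib
from collections import defaultdict
from pathlib import Path

PRICE_PER_GB_MONTH = 0.02  # hypothetical cloud storage price, per GB per month

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def duplicate_bytes(root: str) -> int:
    """Bytes taken up by second and subsequent copies of identical files."""
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            by_hash[sha256_of(path)].append(path.stat().st_size)
    return sum(sum(sizes[1:]) for sizes in by_hash.values())

wasted = duplicate_bytes("/mnt/legacy-share")  # hypothetical share
gb = wasted / 1e9
print(f"Duplicate copies: {gb:.1f} GB, roughly £{gb * PRICE_PER_GB_MONTH:.2f}/month not worth migrating")
```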
Transformational data migration starts with understanding your data at scale and in detail ...
Take the next step

We are offering you the opportunity to quickly index real data from your organisation to validate how Exonar Reveal can find the data you really shouldn’t be migrating, and the data you absolutely should, in a controlled test environment. Talk to us about claiming your FREE ‘Test Drive’ (Ts&Cs apply).
Email: tellmemore@exonar.com
1 https://www.techrepublic.com/article/73-of-cloud-migrations-take-a-year-or-longer-report-says/
2 https://www.fortinet.com/content/dam/fortinet/assets/analyst-reports/ar-2019-ihsm-fortinet-wp-q2.pdf
3 https://searchcloudcomputing.techtarget.com/definition/cloud-migration
4 https://www.idevnews.com/stories/7316/Unisys-Report-Reveals-Why-1-in-3-Cloud-Migrations-Fail
5 https://www.panorama-consulting.com/erp-data-migration-and-cleansing-tips/
6 https://www.veritas.com/en/uk/form/whitepaper/the-uk-2020-databerg-report
7 https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards