The rate of change in the cloud and data space is phenomenal. In the wake of the pandemic and ever-advancing technologies, manager and customer expectations are forcing business units to adopt new tools and ways of working. The topic has never been more pressing.

I see many organizations being pushed to adopt the cloud, both by political pressure and by the assumption of better capabilities. Some go all in on cost savings they expect to bank immediately, only to find they lack the necessary processes, tools, and governance. Without proper preparation, the business case for cloud adoption rapidly erodes.

Whatever the reason for the change, organizations need to understand that they shouldn’t be pressured into making hasty decisions.

Understanding your digital portfolio is key to choosing the “right” cloud platform to move to, supported by a robust decision-making framework. Cloud adoption without a transformational agenda might deliver a quick win, but ultimately a “lift and shift” doesn’t address legacy architecture, nor does it allow organizations to maximize the value of the features built into cloud platforms.

Interestingly, data is often overlooked when embarking on a journey to a new platform, yet you need to be clear about its expected lifecycle. When planning a migration to the cloud, there is a lot of focus on the workload (the application or virtual machine), but beyond data sovereignty and data classification, I rarely see data given the priority it deserves.

Most importantly, take a data portfolio approach. It’s crucial to understand your data assets, whether that’s an important spreadsheet, a customer database, or a logging system you need to maintain.

What’s really crucial is knowing where your data resides, what it does, and who can access it. You also need to understand how other systems (perhaps secondary data systems) feed into or draw from your data, so that you can apply the proper data protection and retention settings in whatever cloud-based system it ends up coexisting with.
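To make that concrete, the sketch below shows what a single entry in such a data asset register might look like. It is purely illustrative: the field names (owner, classification, retention and so on) are my assumptions, not a prescribed schema.

    from dataclasses import dataclass, field
    from datetime import timedelta

    @dataclass
    class DataAsset:
        """One entry in a hypothetical data asset register."""
        name: str                    # e.g. "Customer CRM database"
        location: str                # where the data physically resides
        owner: str                   # accountable business owner
        classification: str          # e.g. "official", "sensitive", "personal"
        accessed_by: list[str] = field(default_factory=list)  # people and systems with access
        feeds: list[str] = field(default_factory=list)        # downstream systems it feeds
        fed_by: list[str] = field(default_factory=list)       # upstream systems that feed it
        retention: timedelta = timedelta(days=365 * 7)        # how long it must be kept

    # Illustrative entry only; every name and value here is made up.
    crm = DataAsset(
        name="Customer CRM database",
        location="Sydney region, managed PaaS",
        owner="Head of Customer Services",
        classification="personal",
        accessed_by=["CRM application", "analytics team"],
        feeds=["reporting warehouse"],
        fed_by=["web signup forms"],
        retention=timedelta(days=365 * 7),
    )

Even a register this simple forces the right questions: who owns the asset, which systems feed it or depend on it, and how long it must be kept once it moves to a new platform.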

Let’s go back five years, or even just before the pandemic, when many organizations started testing the water with the cloud. Once upon a time everything was contained in a data center or two, perhaps a few buildings. Now everything is distributed across multiple service providers and multiple data centers.

It’s also not just about understanding the value of your data in isolation, but collectively as a department or organization. When considering how to transform isolated resources into actionable data in the cloud, the main challenge is balancing the need to move quickly with a sound strategy.

The first thing you need to do is determine whether you want a departmental or a corporate approach. This will have a significant impact on your overall data strategy: where will your data reside, and what kind of services can you provide based on the data at your disposal? This is especially true when some parts of the organization need to prioritize cloud adoption and transformation ahead of others, creating a multi-speed approach.

You also need to be aware of the capabilities or services these data assets contribute to and the lifecycle surrounding them.

We see many organizations transition to cloud-based Platform-as-a-Service (PaaS) or Software-as-a-Service (SaaS) models, only to find that their backup or archiving mechanisms are no longer relevant. This is a big deal in government if you have a record retention policy that mandates records be kept for 50, 60, or maybe 70 years. What would you do if the only way to access 60-year-old government information was through a legacy system running Exchange 2000?
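In cloud platforms, that kind of retention tends to be a native setting rather than a feature of the backup tool. As a hypothetical sketch only, here is how a 70-year default retention rule could be expressed with AWS S3 Object Lock (the bucket name is invented, and Object Lock has to be enabled when the bucket is created); other platforms offer comparable immutability controls.

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical example: the bucket name is made up, and Object Lock must
    # already be enabled on the bucket for this call to succeed.
    s3.put_object_lock_configuration(
        Bucket="agency-records-archive",
        ObjectLockConfiguration={
            "ObjectLockEnabled": "Enabled",
            "Rule": {
                "DefaultRetention": {
                    "Mode": "COMPLIANCE",  # compliance-mode retention cannot be shortened or removed
                    "Years": 70,
                }
            },
        },
    )

The point is not the specific service but that long-term retention now has to be designed into the platform itself rather than bolted on afterwards.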

In reality, many organizations need to better understand their data capabilities and how to take full advantage of them. What they did in a traditional tech stack doesn’t necessarily apply in a new world. The adoption of feature-rich API platforms enables greater integration, orchestration, and automation. This means that you need to start thinking about how to change the way your team works and what “good” looks like. Importantly, the way you measure success needs to change.

Finding the best mix of workforce skills is also essential. Moving from a traditional network and data center operation to more of a scripting, “DevOps” environment is no mean feat, especially amid the current skills shortage, when finding enough people with the right mix of skills is difficult.

Last but not least, it’s about taking a security-by-design approach and understanding how to protect your digital and data assets.

We’ve seen what happens when organizations fail to protect their data. Primary and secondary data have been the focus of much cybercrime recently, with personal data in particular the target of constant attacks. Additionally, many cybercriminals, particularly ransomware operators, target data backup repositories to minimize the victim’s ability to recover from an attack. Backup strategies are therefore an essential line of cyber defense.

Citizens will continue to challenge organizations to prove that their data is safe. They will want to know how their data is used and how it is stored. Having an end-to-end focus on data security will be an ongoing challenge for public and private sector organizations.

The Essential Eight, the model introduced by the Australian federal government to combat cyberthreats, is a great set of controls. While it is quite expensive to implement, it provides a good level of security. But we often see customers struggle with the trade-off between better functionality and jumping through hoops to ensure that the data they hold is safe and reliable.

However, many technology trends (and challenges) are driven by consumer demand. Consumers set expectations about how they want to deal with organizations, especially government, and that can create problems when organizations bring together data to deliver the services consumers want, in the way they want to consume them.

The government has no choice; it needs to embrace change. But leaders and purchasing managers shouldn’t rush into the wrong or a more expensive solution. A well-thought-out data strategy has to come first.

Matthew Gooden is Datacom’s Chief Innovation and Technology Officer.

