Key Factors In Making The Most Of Your Organisations’ Data In 2018

In the same way that a CFO sets out the strategy and framework for investment and cash flow, CIOs and CTOs need to take control of making data flow work for their organisations. Data is now a currency, but one that carries extra responsibility for the holder, especially where personal information is involved. This data currency comes under new regulation in 2018, and failing to achieve a clean data audit will carry reputational and monetary consequences similar to those of failing a financial audit.

Against this backdrop, as we head into 2018, below are four factors that CIOs and CTOs should consider to ensure they get the most value from their organisations' data.

Cloud control – hybrid architectures will dominate

The debate around whether to use cloud technologies is history, and multi-cloud deployment has become the norm. With growing awareness that uncapped, per-second pricing can spiral out of control, a return to hybrid architectures is underway: designs that marry the controllable cost and high performance of on-premises systems with burstable, global cloud services.

The increased requirements for data control will further boost the attractiveness of the hybrid model, a trend that will only accelerate as each business balances its own technology demands against the relative total cost of ownership (TCO) of public cloud versus on-premises platforms.
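To make that trade-off concrete, the kind of back-of-the-envelope comparison involved might look like the sketch below. The figures, rates and the cost model itself are purely illustrative assumptions for the sake of the example, not vendor pricing or a recommended methodology.

```python
# Illustrative only: a rough annual TCO comparison between an on-premises
# cluster and an equivalent public-cloud footprint. All numbers are made up.

def on_prem_annual_cost(capex, amortisation_years, annual_opex):
    """Annualised cost of owned hardware: amortised capex plus running costs
    (power, cooling, support, data-centre space)."""
    return capex / amortisation_years + annual_opex

def public_cloud_annual_cost(instances, hourly_rate, utilised_hours,
                             egress_tb_per_month, egress_rate_per_gb):
    """Annual cost of an equivalent cloud footprint: compute plus data egress."""
    compute = instances * hourly_rate * utilised_hours
    egress = egress_tb_per_month * 1024 * egress_rate_per_gb * 12
    return compute + egress

if __name__ == "__main__":
    on_prem = on_prem_annual_cost(capex=400_000, amortisation_years=4,
                                  annual_opex=60_000)
    cloud = public_cloud_annual_cost(instances=40, hourly_rate=0.50,
                                     utilised_hours=8_760,  # running 24x7
                                     egress_tb_per_month=50,
                                     egress_rate_per_gb=0.08)
    print(f"On-premises: {on_prem:,.0f}/yr   Public cloud: {cloud:,.0f}/yr")
```

Even this crude model shows why the answer differs per workload: steady, always-on workloads tend to favour owned capacity, while bursty or short-lived ones favour per-second cloud pricing.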

As the transition from virtualisation to cloud-native, containerised applications gathers momentum, there will be an increasing focus on seamless private/public cloud application and data mobility through to 2020. The technologies that can integrate security with performance and multi-tenancy to enable this mobility on demand will become the market and mind-share leaders.

Being able to efficiently migrate and repatriate data will be a key feature of cloud capability as organisations review where applications and data are most advantageously hosted from both a legal and an operational perspective. Returning data to centralised pools in each legislative region is likely to become more common, increasingly augmented (even at the edge) by directly linked, high-performance storage and compute at a local level for mission-critical, real-time or highly sensitive applications.
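As a thought experiment, the placement logic described above can be reduced to a few rules: keep highly sensitive or latency-critical data on local, directly attached storage, and send everything else to a centralised pool in its own legislative region. The sketch below illustrates that idea only; the classes, fields and rules are hypothetical and do not correspond to any real product or API.

```python
# Hypothetical sketch of a coarse data-placement policy, for illustration.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    legal_region: str        # e.g. "EU", "US" -- jurisdiction the data must stay in
    sensitivity: str         # "low" or "high"
    latency_ms_target: int   # application latency requirement

def placement(ds: Dataset) -> str:
    """Return a coarse hosting decision for a dataset."""
    if ds.sensitivity == "high" or ds.latency_ms_target < 5:
        # Mission-critical or highly sensitive: pin to local/edge
        # high-performance storage, directly linked to compute.
        return f"local-edge ({ds.legal_region})"
    # Everything else lands in the centralised regional pool, from which
    # it can burst to public cloud services in the same jurisdiction.
    return f"regional-pool ({ds.legal_region})"

for ds in [
    Dataset("telemetry-stream", "EU", "low", 2),
    Dataset("customer-records", "EU", "high", 50),
    Dataset("marketing-archive", "US", "low", 200),
]:
    print(ds.name, "->", placement(ds))
```

In practice such decisions sit inside data-management and orchestration platforms rather than hand-written scripts, but the underlying policy dimensions are the same: jurisdiction, sensitivity and performance.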
