Compliance: Data Storage in a Regulated World

Why do we need Regulatory Compliance within technology?

There are many industries that are regulated – financial services, health, insurance, and accounting and tax planning to name a few.  Now, herein lies the problem: each regulated industry follows a different set of rules according to its given regulator. There is a clear need for a bridge between the technical understanding of business requirements and the regulatory guidelines themselves.  Anyone reading this article who has read the FCA’s (formerly the FSA) handbook and tried to work out what IT governance it requires will know what I mean.

I will list a couple of examples where compliance for data storage and retrieval differs vastly:

  • Health – Meeting and Minutes details – Must be held for a minimum of 30 years.
  • Insurance – Employers Liability Policies – Must be held for a minimum of 40 years.

Now, these are just two simple examples of data retention. Add ALL of the other considerations into the mix (and there are a lot) – data access, information security, business continuity, data protection laws and so on – and you will soon see that the role of a CIO/CTO within these regulated firms is a difficult one, not least because the regulations themselves change too.

So, we have hoards of information that we need to store under our governing body – where do we store it? This creates another problem: who do we trust to store it effectively, and for this length of time? Let’s be honest, most technology firms cannot see past a 5-year business plan, let alone 40 years. Then there is the format the data has been stored on: are we really naive enough to think that in 20+ years’ time the data we stored initially can even be accessed? When I was working in the banking sector we had a crazy number of disparate systems, and over a 5-year plan we eventually standardised them onto one platform. However, we still had the problem of catering for the eventuality of recalling data from an OS/2 Warp operating system from 10 years prior.

Now, consider the financially regulated world. This is a very complicated topic, and again the policies differ massively depending on what type of activities you conduct under the regulator’s oversight.

For example, the length of time records should be kept depends on which type of business the records relate to. For MiFID (Markets in Financial Instruments Directive) business, records must be kept for at least 5 years after an individual has stopped carrying on an activity; for non-MiFID business, it is 3 years after stopping the activity; and for a pension transfer specialist the records must be kept indefinitely. This covers email, files, databases and in fact any data that has been used for the business in question. MiFID II is also on the horizon (2015), with even more significant changes expected to be introduced.
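As an illustration of how such rules could be captured in practice, here is a minimal sketch of a retention lookup in Python. The record types, the figures and the year approximation are assumptions lifted from the examples above – not legal advice – so check them against your own regulator’s current handbook:

```python
from datetime import date, timedelta
from typing import Optional

# Retention periods in years; None means "keep indefinitely".
# Figures mirror the examples quoted in this article -- verify them
# against your own regulator before relying on them.
RETENTION_YEARS = {
    "mifid": 5,                    # MiFID business: 5 years after the activity stops
    "non_mifid": 3,                # non-MiFID business: 3 years after the activity stops
    "pension_transfer": None,      # pension transfer specialist: indefinite
    "employers_liability": 40,     # insurance: employers' liability policies
    "health_meeting_minutes": 30,  # health: meeting and minutes details
}

def earliest_deletion_date(record_type: str, activity_ceased: date) -> Optional[date]:
    """Return the earliest date a record could be reviewed for deletion,
    or None if it must be kept indefinitely."""
    years = RETENTION_YEARS[record_type]
    if years is None:
        return None
    # Approximate a year as 365.25 days -- good enough to schedule a review,
    # not to calculate a legal deadline.
    return activity_ceased + timedelta(days=round(years * 365.25))

print(earliest_deletion_date("mifid", date(2014, 6, 30)))             # roughly five years on
print(earliest_deletion_date("pension_transfer", date(2014, 6, 30)))  # None: keep indefinitely
```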

Now, do you see the complications of just this one topic (data storage) within IT governance?

For me, conformity needs to stem from understanding. If you do not understand what you need to conform to, how on earth can you? Simply understanding one conformity ruling – for example, that 2 years of data storage is required rather than 5 – could save you £1000s and let you sleep at night.  Imagine if you knew the rulings for ALL of your data storage requirements and had fine-tuned them to your infrastructure, or better still had spoken to someone who already understands them.

There is one company I have spoken to recently whose approach to this challenge merits particular mention, and they stand behind a 100% guaranteed data restoration rate however long you store your data for: Arkivum.

Arkivum’s storage is based on the principle that three copies are needed for absolute certainty that data is safe. Using active integrity checking at all times, every one of your files is copied three times, with two copies held online in geographically separated data centres and the third held offline, locked away in an escrow service.
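This is not Arkivum’s implementation, but the principle of active integrity (or “fixity”) checking is easy to illustrate. A minimal sketch in Python, with hypothetical file paths standing in for the three copies: each replica’s SHA-256 checksum is recomputed and compared with the checksum recorded when the file was archived, and any mismatch flags a copy that needs repairing from a good one.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large archives never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_replicas(recorded_checksum: str, replicas: list[Path]) -> dict[Path, bool]:
    """Return, for each replica, whether it still matches the checksum taken at archive time."""
    return {replica: sha256_of(replica) == recorded_checksum for replica in replicas}

# Hypothetical paths standing in for the two online copies and the escrow copy.
copies = [Path("/dc1/archive/report.pdf"),
          Path("/dc2/archive/report.pdf"),
          Path("/escrow/archive/report.pdf")]
# results = check_replicas("ab12...ef", copies)  # any False value triggers a repair
```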

Arkivum’s Data Archiving Storage System


It was very interesting for me to talk to a technology company and discuss compliance; they even have a dedicated compliance officer.

Is this the future for IT companies, that they must have a better understanding of compliance rulings in regulated industries?

With the “internet of things” gathering momentum and even your domestic items being able to talk to each other (and maybe even talk about you to each other), let alone to the internet, my feeling is that the regulation of IT, especially the cloud, is paramount and right up there with security.  The only issue I have with regulations is that they sometimes stifle creativity and flexibility, but that’s a whole new topic that I am sure we will discuss in the future. What are your thoughts on regulated IT and compliance?



Top 10 Things to Consider Before Moving to the Cloud


The cloud is here to stay and it makes sense to use it. The market is already developing and changes will occur over the next few years that will allow you to have even more choice than today. Microsoft will inevitably offer a complete hosted solution, along with Oracle, VMware and many, many others. We are also seeing disparate systems on disparate platforms (AWS, Google and others) being linked together, managed by complicated orchestration products. The greatest problem with all of these platforms and services is likely to be the support element, so when choosing cloud for your business, make sure you are asking the right questions:

  1. What cloud services do you need?

In order to choose the option that is best suited to your business, it is vital to understand exactly what cloud services you need. There are so many on the market, from full infrastructure hosting and application delivery through to managed backup and disaster recovery services. Even if you only want to move one or two of your services to the cloud at the moment, think about whether you may want to extend this in years to come. Choosing a provider that offers them all could give you more flexibility in the long run. One of the features of a cloud solution is its ability to scale up and down to match your size, but you should still ensure that the provider’s capabilities match your plans for growth.

  2. Who am I dealing with?

Many cloud companies do not, in fact, have their own infrastructure but resell from others. This need not be a problem – it is common practice for a cloud provider to sell services via a channel of smaller resellers – but you should make sure you know who is actually providing them! It is quite possible that you will receive better support from a smaller value-add reseller, but you need to know whose customer you are, and who is ultimately responsible for the services you are buying.

  3. What about the contract?

Standard hosted contract terms are often 24 – 36 months, with shorter terms generally attracting higher costs. Some cloud providers, however, are now starting to offer contracts of 12 months or less, or a “pay as you grow” option. This can be helpful if the provider is new to you or even new to the marketplace, and will allow you to gauge the type of service you will receive without making a long-term commitment.

  4. The Service Level Agreement

This is very important. Don’t get tied into a service that just isn’t working for you. Check the terms and conditions for material breaches and downtime. Many providers offer compensation, but this is likely to be insignificant compared to a loss of service for your business if your entire company’s infrastructure is running remotely. A good provider will give you the option to terminate the service if the SLA is consistently breached, but beware: many providers will not.

  5. Where is my data?

There are so many reasons why you should know this. If you do not know where your IP (intellectual property) or data is, then how can you get this back if you fall out with your provider? It is YOUR data and YOU need to know where it is. A good provider will give you access to it, no matter what the circumstances, at very short notice. Beware “safe harbour” agreements too. Although they are designed for data protection, they often fail to stand up if challenged. If you are offered one, have it checked thoroughly by a lawyer.

  6. Security

This is a very important point and it should be right up there with “should I have cloud services for my business?” At the end of the day, your data is accessible from the internet (and we all use it in one form or another). You must ensure that your provider has the appropriate security to safeguard your business. ISO standards are a good base for grading the provider’s competency in this area, but there are many other standards that can also be adhered to. Note that if you are regulated by a governing body such as the FCA (formerly the FSA), or subject to regulations such as HIPAA (healthcare), additional security standards are required. Make sure these are not just tick-in-the-box accreditations – challenge the provider on what they offer.

  7. Internal policies

As well as being good security practice, internal security policies for your business are essential for cloud services. ‘Password123’ is not good enough!

Staff will use applications to share information, whether you know it or not. On average, internet users have 25 password-protected applications to manage, but only six (or fewer) unique passwords. Using a cloud password management platform that lets employees access all their applications with one password (single sign-on) will help to provide a better experience while securing company access and data.

  8. Check for hidden costs

One major problem with all the options available today is normalising the offerings to get a fair comparison. You can compare features and functions with a bit of research – using comparison tools such as those on Compare the Cloud – however, providers differ not only in functionality but also in costs and billing methods.

Make sure you get to the bottom of the provider’s pricing. For example:

  • CPU costs: 2 cores (@2.5GHz) with 2GB RAM = £x.xx/instance/month
  • Storage costs: 1GB usable storage (SAN/NAS, based on a 10TB base infrastructure) = £x.xx/GB/month
  • Backup costs: £x.xx/GB backed up
  • Network costs: £x.xx/GB/month transferred in and/or out
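Once you have those unit rates from each shortlisted provider, normalising them is simple arithmetic. Here is a minimal sketch in Python – the providers, rates and workload are placeholders rather than real quotes – which prints a like-for-like monthly total for each:

```python
# Placeholder monthly unit rates -- substitute each provider's actual figures.
providers = {
    "Provider A": {"instance": 35.00, "storage_gb": 0.08, "backup_gb": 0.05, "transfer_gb": 0.09},
    "Provider B": {"instance": 28.00, "storage_gb": 0.12, "backup_gb": 0.03, "transfer_gb": 0.15},
}

# The workload you actually plan to run: 4 instances, 500GB storage,
# 500GB backed up and 200GB transferred per month.
workload = {"instance": 4, "storage_gb": 500, "backup_gb": 500, "transfer_gb": 200}

def monthly_cost(rates: dict, usage: dict) -> float:
    """Multiply each unit rate by the planned usage and sum the result."""
    return sum(rates[item] * quantity for item, quantity in usage.items())

for name, rates in providers.items():
    print(f"{name}: £{monthly_cost(rates, workload):.2f}/month")
```

Remember that the cheapest normalised figure is only part of the picture: the SLA, support and security points above still apply.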

  9. Availability

Consider how your company’s business handles network, system and other failures. Does the cloud infrastructure need to be highly resilient, or can individual parts fail without causing a major service interruption?

A good cloud provider will have a replicated copy of your infrastructure (for their own internal disaster recovery plan). Some providers will charge you for this, and some will simply not have it and gloss over the discussion with you. A good start would be to ask which data centre your provider is hosting your service in, and what its Tier level is. Every data centre can (and should) be graded by this tiering, and the results will be easy for you to understand when you receive them.

Tier 1 = Non-redundant capacity components (single uplink and servers).

Tier 2 = Tier 1 + Redundant capacity components.

Tier 3 = Tier 1 + Tier 2 + Dual-powered equipment and multiple uplinks.

Tier 4 = Tier 1 + Tier 2 + Tier 3 + all components are fully fault-tolerant including uplinks, storage, chillers, HVAC systems, servers etc. Everything is dual-powered.

A Tier 4 data centre is considered the most robust and least prone to failure. Naturally, the simplest is a Tier 1 data centre, typically used by small businesses or shops.
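To put those tiers into perspective, the availability targets commonly quoted for each tier translate into very different amounts of annual downtime. The figures below are the widely cited Uptime Institute targets rather than a guarantee from any particular facility, so treat them as indicative:

```python
# Commonly quoted availability targets per tier (indicative only --
# ask the provider for the data centre's actual certification).
TIER_AVAILABILITY = {1: 99.671, 2: 99.741, 3: 99.982, 4: 99.995}

HOURS_PER_YEAR = 365.25 * 24

for tier, availability in TIER_AVAILABILITY.items():
    downtime = HOURS_PER_YEAR * (1 - availability / 100)
    print(f"Tier {tier}: {availability}% availability = about {downtime:.1f} hours of downtime per year")
```

Run it and the spread is stark: roughly 29 hours a year for Tier 1 down to well under an hour for Tier 4.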

  10. When it all goes wrong

So you now have a cloud service, or multiple services, and it all goes horribly wrong. How do you migrate away from the incumbent failing provider? Make sure that you are not handcuffed to large exit bills and contract penalty clauses. There have been cases where clients were asked to pay extortionate fees just to keep their cloud services running after the provider ran into financial hardship.

If your business is considering a move to the cloud and you need some advice, contact Compare the Cloud’s Cloud Practice Group and we’ll provide you with some free advice and details of other ways we may be able to help make your transition a smooth and happy one.



The Miniaturisation of IT and Data Centres

Today we all hear about the consumerisation of IT, but the one subject most vendors and data centres do not wish to face is what I define as the miniaturisation of IT.

I am a proud iPhone 6s Plus owner (as well as an Apple Watch, iPad Pro, MacBook Air and iPad Mini 4 – all of these in gold), and when I hold my iPhone up I wonder how…

The ‘how’ I wonder about is how this one small device has more processing power than all the Allied and Axis powers combined had during World War II. It is amazing to look at the original Turing bombes, or the later Colossus machine, and wonder how far forward the human race has come.

But as with any device, it is subject to reform, refinement, resizing or adaptation to a new format altogether. An example of this is the number of functions my iPhone performs that previously required other devices:

  • Email – this required a desktop or laptop
  • Music – this required a C90 cassette or record player
  • Music store – I have iTunes, which is digital delivery of a service rather than the traditional vinyl record or cassette
  • iBooks – I read my books as digital downloads rather than as traditionally manufactured and consumed books

The above are just a few examples of functions and forms being consolidated into a more compact format.

Let’s do a bit of future gazing – one of my personal hobbies – looking at current devices and services and predicting their future form and function:

Data centres / Servers / Storage

Future Form: Do we need such huge properties, and devices guzzling power while spinning up out-of-date components? My view is that the power of the large providers such as Microsoft Azure, IBM SoftLayer and Amazon AWS will fit onto the form factor of today’s single server within 20 years. As I predicted many years ago on this blog, we are beginning to see the fusion of biomechanics with computing. As quantum computing and synthetic DNA storage become mainstream, today’s devices will become more and more obsolete. I note with interest the many networking and storage companies moving towards being ‘software only’ companies; we will see more of these announcements in the next few years.

Hybrid, hybrid, hybrid: before we see what I term ‘the great leap’ towards cloud-based technologies, we will first see a gradual migration using hybrid technologies that resemble current architectures. These hybrid technologies, much like those used within the car industry, will allow familiarisation with future enhancements whilst retaining the current look and feel of IT hardware. The hybrid transition will reach the laggards by 2018.

Future Function: Those that invest in data centres should probably move on to another website now. In my view, data centres that do not modernise and open up their doors, embracing local and national communities, will die. Like the much-maligned 1960s tower blocks being pulled down around the UK, the data centres that do not offer more than technical real estate will be akin to lemmings walking towards a cliff. The function of the future data centre will, in my view, be very different from today’s: based upon biotech, and acting more as a technical hub for those that need premises.

My view is that within 10 years most mega-cloud providers will have built their own data centres so that they control the full stack, whilst regional data centres will lose 80% of their capacity, which will need to be replaced by other revenue streams. Servers, storage and other hardware functions will be software controlled and take up 99% less footprint than today’s technical architectures. My final thought: modernise now or you will be as extinct as the dodo.

Laptops and mobile devices

Future Form: We could make this as small as an atom if need be; the issue is whether the artificial intelligence community can allow the form to be reduced to a chip. The reason I say this is that voice dictation and thought control will be essential to miniaturising these devices, while the keyboard layout remains crucial to the many users familiar with creating spreadsheets and word-processing documents. My prediction: the laptop and mobile will be chip-sized, with embedded virtual reality functions that pop out a virtual keyboard for those who wish to type using QWERTY.

Future Function: IoT, M2M and any other acronym I can throw out there will be controlled, consumed and executed by a biochip. We are already starting to see home automation and personal health embedded via apps on these devices. My view is that, long term, everything from our passports to our movies will be embedded in every human as a biochip, with artificial intelligence functions interacting with our thoughts.


Is your Data Locked In?

Some of you may or may not have heard of the term Software Defined Networking (SDN); regardless, for me it is cloud for the network. Many may disagree with this description, but I believe it is apt and best describes both what SDN offers and where there could be problems.

Before I continue, let’s consider a simple analogy as a means of explanation. SDN is like electricity – more accurately, like how electricity is supplied and distributed.

We all buy and use electricity. We connect our appliances – TVs, the laptop I am writing on right now – through standardised wall sockets. These sockets are supplied with current through a fuse box, which is wired to the supply outside, which is in turn wired to the national grid. If we want to change energy suppliers we can do so simply by migrating our billing information to a new provider. It all sounds easy and convenient.

Now imagine those wall sockets were different. What if everywhere you went the sockets varied depending on your location? Imagine each area (not each country, but the hotel down the road, your office, the airport) had its own proprietary sockets with no agreed open standards. You wouldn’t be able to charge your laptop simply by plugging it in. To make it worse, it might not even be a case of having the right adapter: the voltage output could vary, and even the access could be regulated, with differing priorities depending on the kind of energy request.

This is what is happening every day in data centres all over the world. And why? Simply, not all SDNs are created equal. If the software is defining the network, and it is, then it is possible to throttle the connection, use differing access protocols and define traffic inequitably. Proprietary hardware and patents that do not allow for open interconnection or the free movement of data are commonplace in today’s software-defined networks.

More worryingly for businesses accessing data centre services, the use of these proprietary connections makes it very difficult to change providers. Where we can change energy suppliers quickly and easily (and readily compare provision and price), these businesses cannot.

Data centre customers should not have to rework their routing, switching, firewall protection and addressing systems just to change providers. The effect of current SDN practice is that businesses are being locked into their current providers – some might say, held to ransom. If the continuity of your business services is such that you cannot move providers without incurring significant migration fees, then your data is locked in.

Should a customer’s choice be limited to a single proprietary vendor, or should it be open and transparent? You can guess my position. An interconnected, open system allowing both the network provider and the customer easy access via a software-defined system is what the cloud marketplace needs. This is why we are seeing more MSPs and vendors going with open standards for their service provision.

Industry associations such as the CEF (Carrier Ethernet Forum) are a useful resource for understanding these issues. Take a look, but don’t stop there. The power to promote meaningful change lies with customer pressure. It should be easy – a fundamental right – for a business to be able to switch providers.

Ask your hardware or switching vendor what their adherence to open networking standards and systems is. And ask your data centre provider about choice: what are your options away from proprietary technology, and what lock-in does the deployed network hardware create?

SDN is a technology we would recommend if implemented fairly. OpenContrail is another good resource: it is open source software backed by Juniper Networks, a respected name in the networking industry.

If you’re running an OpenStack or CloudStack environment there are options available, including Neutron for OpenStack, with CloudStack supporting Big Switch. Additionally, big vendors such as IBM have created partner ecosystems; another good SDN resource can be found on the IBM website.
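To illustrate the value of an open interface: on OpenStack, Neutron exposes a vendor-neutral networking API, so the same calls should work whichever SDN plugin the operator has deployed underneath. A minimal sketch using the openstacksdk Python library – the cloud name, network name and address range are placeholders:

```python
import openstack

# Credentials come from a clouds.yaml entry called "my-cloud" (placeholder).
conn = openstack.connect(cloud="my-cloud")

# These Neutron calls stay the same whichever SDN backend sits beneath the API --
# which is precisely the portability argument made above.
network = conn.network.create_network(name="demo-net")
subnet = conn.network.create_subnet(
    network_id=network.id,
    ip_version=4,
    cidr="192.168.10.0/24",
    name="demo-subnet",
)

print(network.id, subnet.cidr)
```

If the provider’s platform forces you down a proprietary API instead, that in itself tells you something about how easy leaving will be.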

My view is that SDN will continue to be integrated into cloud platforms and control panels. But unless software-defined networking technologies move away from being proprietary and cumbersome, advancement will be stalled. In the long term an open source SDN could liberate proprietary systems and allow for simpler customer migration. One day it could be as easy as switching electricity providers.

The key question to ask your data centre provider, whether you’re an end-user or MSP, is how can you leave, back out and exit your current network deployment? If the answer is prefaced with a deep intake of breath or a big sum then the answer is no. You are locked into your data centre.

The cloud should be about freedom of choice and the ability to differentiate using both open and closed systems through seamless interconnectivity. Creating systems where information can be secured and easily accessed is a goal for any organisation looking to enjoy the benefits of the cloud. SDNs need to do more, be more open, to help realise this goal.

Feel free to agree or disagree with this article, all comments and opinions are welcome.