
Why (almost) every SAP HANA needs a data management solution

In this article, I explain how to calculate SAP HANA costs, considering factors such as data growth and changing hardware prices. I also analyze how much can be saved with proper data management and when it is a good moment to invest in a data management solution.

Data management basics

SAP HANA is the next step for many SAP customers, thanks to its unmatched speed and to the next generation of SAP products built on it, like S/4HANA and BW/4HANA. However, the enormous improvement in performance comes at a price, and a proper data management solution is needed to optimize how data is stored and to reduce storage costs.

The following data management tactics are possible:

  • Prevention of data creation
  • Application retirement / data deletion
  • Housekeeping
  • Offloading / data tiering (NLS, archiving, non-active data concept, DTO, DLM, etc.)
  • Selective system copy

Data management tactics differ based on the type of system: ECC on HANA, BW on HANA, S/4HANA, BW/4HANA or HANA native. However, the reasoning behind implementing data management is practically the same for any SAP HANA system, so for now I will not differentiate between particular system types.

The cost of SAP HANA

I will focus on pure hardware costs, as they are the most tangible and the easiest to calculate. Other costs, such as SAP licenses, labor or personnel costs, tend to vary greatly.

The hardware costs depend on the vendor and the configuration. In general, the bigger the HANA machine, the lower the price per TB. I will abstract from this to make the calculation easier.

Examples of HANA costs can be found here. They can also be computed with the AWS price calculator.

Based on my experience, the cost of 1 TB of hardware for SAP HANA is 250,000 USD over a period of 3 years. This is a very safe assumption that excludes services and SAP licenses. What should be understood is that every terabyte of data on SAP HANA needs roughly 6 to 8 times that size in memory/hardware. This is due to:

  1. Sizing requirements: Sizing requires that SAP HANA is filled to only 50% with data, so for 1 TB of data you need 2 TB of memory.
  2. Mirroring: In a standard high-availability setup, the system is mirrored and therefore needs the hardware twice to achieve HA, which doubles the 2 TB to 4 TB.
  3. Non-Production Systems (NPS): Typically, there is at least one QA system, which is created as a regular copy of the production system. It is therefore the same size, adding another 2 TB. Some customers also have other systems, such as a pre-production system or a sandbox, which increase the hardware requirements even more.

That means that 1 TB of data on SAP HANA requires at least 6 TB of hardware, and the cost of 1 TB of productive data on SAP HANA amounts to at least 1.5 million USD over a period of three years on hardware alone.
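
As a minimal sketch of this arithmetic, assuming the 250,000 USD per TB over three years figure above and the illustrative sizing factors (2x sizing, 2x HA mirroring, one QA copy), the 1.5 million USD figure can be reproduced like this:

```python
# Sketch of the hardware cost arithmetic above; the price per TB and the
# sizing factors are the assumptions stated in the text.

COST_PER_TB_3Y = 250_000  # USD per TB of HANA hardware over 3 years

def hana_hardware_cost(data_tb, sizing_factor=2.0, ha_mirror=2, non_prod_copies=1):
    """Rough 3-year hardware cost for a given amount of productive data."""
    prod_hw_tb = data_tb * sizing_factor * ha_mirror            # sizing + HA mirror: 1 TB -> 4 TB
    non_prod_hw_tb = data_tb * sizing_factor * non_prod_copies  # QA copy: another 2 TB
    total_hw_tb = prod_hw_tb + non_prod_hw_tb                   # at least 6 TB for 1 TB of data
    return total_hw_tb * COST_PER_TB_3Y

print(hana_hardware_cost(1.0))  # 1500000.0 USD over 3 years, hardware only
```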

Growth vs. hardware prices

What we see in everyday life reflects what Gordon Moore predicted in 1965 in his famous Moore’s law: the number of transistors on a chip, and with it CPU performance, doubles roughly every 2 to 2.5 years (more information here). CPU costs therefore decrease exponentially, and the costs of hard disks go down as well. However, RAM prices decrease more slowly, mostly due to high demand and the growing share of in-memory processing; in fact, the price even increased from 2012 to 2013. Therefore, I expect the price of SAP HANA hardware to drop by 50% at most every 3 to 5 years.

System and data growth is an opposing factor to the hardware price decrease. If the price drops by 50% but the system size doubles in the same time, the total hardware costs remain the same. With hardware costs decreasing by 50% every (let’s say) 3 years, the balance lies at a data growth of 26% YoY, since growing by 26% per year doubles the size in three years. If your YoY system growth is higher than 26%, the hardware will cost you even more in the future; if it is lower, the price will decrease.
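
To make the break-even figure explicit: if the price per TB halves every three years, data growth exactly offsets the price drop when (1 + g)^3 = 2, which gives g of roughly 26% YoY. A one-line check of that number:

```python
# Break-even growth rate under the assumption that hardware cost per TB
# halves every `halving_years` years: (1 + g) ** halving_years == 2.
halving_years = 3
break_even_growth = 2 ** (1 / halving_years) - 1
print(f"{break_even_growth:.1%}")  # ~26.0% YoY
```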

Here are some examples. The tables below show a prediction of the hardware needed for 1 TB of data on HANA from 2017 to 2026 with various growth ratios.

Break-even at 26% YoY growth – size growth and hardware price decrease are in balance, so costs stay even

And this is how it looks with only 10% data growth:

10% YoY data growth – the hardware cost reduction outpaces the system growth

The last example is for a massive 40% data growth:

40% YoY growth – costs will heavily increase in the future
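
The projections behind these tables can be reproduced with a short script. This is only a sketch under the assumptions used so far (1 TB of data in 2017, the 6x hardware multiplier, and a hardware price of 250,000 USD/TB that halves every three years):

```python
# Projection of hardware need and cost for 1 TB of data on HANA, 2017-2026,
# under the assumptions stated in the text.

def project(growth_yoy, years=10, start_year=2017):
    data_tb, price_per_tb = 1.0, 250_000.0
    for i in range(years):
        hw_tb = data_tb * 6                      # 6x hardware multiplier
        cost = hw_tb * price_per_tb
        print(f"{start_year + i}: {data_tb:6.2f} TB data, {hw_tb:6.2f} TB hardware, "
              f"{cost:12,.0f} USD / 3 years")
        data_tb *= 1 + growth_yoy                # yearly data growth
        price_per_tb *= 0.5 ** (1 / 3)           # 50% price drop every 3 years

for g in (0.26, 0.10, 0.40):
    print(f"--- {g:.0%} YoY data growth ---")
    project(g)
```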

How much can I save with data management?

Typical data management savings are roughly 30% of the data volume, combined with a 60% reduction in system growth (new data comes in, old and cold data goes out). For example:

For a 2 TB HANA (1 TB of data), the data size is reduced from 1 TB to 0.7 TB initially and the growth is reduced from 26% YoY to 10%. It looks like this:

Data management applied on 2 TB HANA (1 TB of data) – 30% initial reduction + 60% growth reduction

The cost in 2020 will drop from 1.5 million to 700 thousand USD. The net saving is 800 thousand USD, and that is on hardware only. In fact, if the hardware warranty can be extended, you can keep using the same hardware and avoid repurchasing in 2020. That gives a solid basis for a data management business case.

With an 8 TB HANA, the costs plummet from 6 million to 2.8 million USD, a saving of 3.2 million.

8 TB HANA with data management
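
Both savings figures can be reproduced with the same simple model. Again, this is only a sketch under the stated assumptions (30% initial data reduction, growth cut from 26% to 10% YoY, hardware price halved by 2020, 6x hardware multiplier):

```python
# Savings sketch, 2017 -> 2020, using the same cost model as above.

def cost_in_2020(data_tb_2017, growth_yoy):
    price_2020 = 250_000 * 0.5                   # price per TB after one 3-year halving
    data_2020 = data_tb_2017 * (1 + growth_yoy) ** 3
    return data_2020 * 6 * price_2020            # 6x hardware multiplier

for data_tb in (1.0, 4.0):                       # 2 TB and 8 TB HANA (data is half the appliance size)
    without_dm = cost_in_2020(data_tb, 0.26)
    with_dm = cost_in_2020(data_tb * 0.7, 0.10)  # -30% initial size, growth cut to 10%
    print(f"{data_tb:.0f} TB data: {without_dm:,.0f} USD without vs "
          f"{with_dm:,.0f} USD with data management "
          f"(saving {without_dm - with_dm:,.0f} USD)")
```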

So when should I invest in data management?

In general, the other costs that come into the equation are SAP license costs, personnel costs and the costs of the data management implementation itself. The implementation costs can be quite high, due to secondary storage costs, but also because of the complexity of the data management roll-out. Achieving a 30% reduction requires technical know-how and experience.

In fact, there have been quite a few failed data management projects where the challenge proved bigger than expected. The costs for a medium-sized data management project should be around 250,000 to 1 million USD all-in for 3 years. I looked at 25 of our customers to see where it makes sense to invest in a data management solution. The static results, without data growth, are the following (see the rough sketch after the list):

  • Below 1 TB – no need to invest in SAP HANA data management
  • 1 to 4 TB – ROI is normally below 2 years, so a good business case is available, especially with strong system growth
  • 4 to 8 TB – an ROI of less than a year is easily achieved, so the project should be high on the agenda
  • Over 8 TB – it is critical to do data management, as failing to do so will cost your organization millions of dollars
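
A rough way to arrive at these static break points is to compare the project cost against the yearly hardware savings. Note that this is my own simplification: it ignores growth, assumes a 30% saving on a hardware spend of 250,000 USD per TB over 3 years with the 6x multiplier, and uses project costs picked from the 250,000 to 1 million USD range above:

```python
# Static ROI sketch (ignoring data growth); all figures are the assumptions
# stated in the lead-in, not measured project results.

def payback_years(data_tb, project_cost):
    yearly_hw_cost = data_tb * 6 * 250_000 / 3   # hardware spend per year (6x multiplier)
    yearly_saving = 0.30 * yearly_hw_cost        # ~30% reduction from data management
    return project_cost / yearly_saving

print(payback_years(1.0, 250_000))    # ~1.7 years for a 1 TB system
print(payback_years(4.0, 500_000))    # ~0.8 years
print(payback_years(8.0, 1_000_000))  # ~0.8 years
```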

I took the “golden ratio” of 26% YoY growth to see how it looks on a time scale, and here are the results:

The next step is to add the dynamic factor – data growth. And then it looks like this:

So, the actual decision depends on the current size and the future data growth. And here comes the conclusion: based on more than 300 Datavard Fitness Tests, we have found that most small systems grow quite quickly, whereas bigger systems naturally grow more slowly in percentage terms. The situation looks like this:

The smaller the system, the faster the growth. A good data management business case is based on either a big initial system size or fast future growth.

Cloud vs. on-premise

The last thing to consider is the setup: cloud vs. on-premise. The ideal approach is to prepare for the SAP HANA migration and do the data management project before the migration; this way, you will always achieve the business case. However, this step is often skipped due to a lack of information or difficult timing, and a lot of money is wasted.

If you are already on SAP HANA, the business case is better and easier to calculate in the cloud than on premise. The reason is that the cost is normally directly connected to the storage/RAM used by SAP HANA. A storage reduction will have a direct impact on costs, as the size can normally be adjusted dynamically. However, be aware that it is not in the best interest of a cloud or hosting provider to support you in a data management implementation. Therefore, you should check the fine print and make sure that your contract allows for a data management initiative (e.g. that you do not commit yourself to a minimum system size).

The situation is different on premise. If you have a 3 TB SAP HANA, it makes no difference whether you fill it to 40% or 50%; the only difference is when you need to increase the size. A typical sizing is set up so that 50% of the SAP HANA memory is used upon migration. The critical point is when the data on SAP HANA reaches 65% to 70% of the overall memory – then you need a hardware extension. When that happens depends again on the system growth. The table below shows examples:

Fast system growth makes for a good business case, as data management postpones the need for a hardware extension for a longer period of time.
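
The timing can be sketched as follows, assuming the system starts at 50% memory usage after migration, needs an extension at roughly 65%, and that memory usage grows at the data growth rate (the growth rates in the loop are illustrative):

```python
import math

def years_until_extension(growth_yoy, start_fill=0.50, threshold=0.65):
    """Years until data grows from start_fill to threshold of total memory."""
    return math.log(threshold / start_fill) / math.log(1 + growth_yoy)

for g in (0.10, 0.26, 0.40):
    print(f"{g:.0%} YoY growth: hardware extension needed after "
          f"~{years_until_extension(g):.1f} years")
```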

Summary

(Almost) every SAP HANA needs a data management solution, because:

  • The initial costs of SAP HANA are very high: every 1 TB of data stored on HANA costs approx. 1.5 million USD over a period of 3 years on hardware alone
  • A medium-sized data management project costs between 250 thousand and 1 million USD (3 years, all-in)
  • The critical factors for building a business case are the current system size and the future data growth
  • Larger (4 TB+) systems have an ROI below 1 year
  • Medium (1 TB to 4 TB) systems have balanced growth and size and will produce an ROI below 2 years
  • Smaller (1 TB and below) systems often have a strong growth rate, so the investment will pay back fast in the future
  • Always consider a data management solution before migrating to SAP HANA
  • If you are in the cloud or in a hosted setup, make sure your contract does not limit you and that you are able to implement a data management solution
  • If you are on premise, system growth is a critical factor, as it defines how often you need to purchase new hardware
