Case Study: Resizing a Database Service

Background

The client is a billion-dollar provider of financial and marketing services to the financial services industry, undergoing a strategic realignment of its primary business lines. The client had been aggressively seeking cost-reduction opportunities within IT, had already outsourced part of its legacy environment, and was looking for further opportunities in its self-managed environments. Ravello was asked to review the client's database services running on industry-standard servers.

Environment

The client operates two compute pools for relational database services, both running software from top-tier vendors. The larger pool is dedicated to production; the other is for development and test. We worked with the client to synthesize their data into the following key metrics:

  • The cost structure for the software stack and server environment, expressed as a ratio of software costs to server costs, is 4.7; in other words, roughly $4.70 of software spend for every $1.00 of server spend, so license counts dominate the cost model. Software costs include perpetual license maintenance fees for the OS, virtualization middleware, and database.
  • The average processor utilization rate for production was 20%, with peaks of 80% during brief, non-repeating windows within normal business hours. The production environment was not considered I/O- or memory-constrained, and the client estimated that average processor utilization could rise to 35% before additional capacity would be needed.
  • The client's estimated future demand growth rate was 5%.
  • The servers were all of the same model year and over three years old.

In summary, the target database environment was considered to have excess capacity.
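
To illustrate the headroom these figures imply, a back-of-the-envelope sketch (assuming the 5% growth rate is annual and compounds on the 20% average utilization, which the source does not state explicitly) shows how many years of growth the 35% boundary can absorb:

    import math

    avg_util = 0.20   # current average processor utilization
    max_util = 0.35   # client-estimated boundary before new capacity is needed
    growth = 0.05     # estimated demand growth rate (assumed annual here)

    # Years until compounding growth pushes average utilization from
    # 20% to the 35% boundary: 0.20 * (1 + growth)**n = 0.35
    years = math.log(max_util / avg_util) / math.log(1 + growth)
    print(f"Headroom: about {years:.1f} years of growth")  # ~11.5 years

Roughly a decade of headroom at the current growth rate is consistent with the conclusion that the pools were oversized.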

Inventory Analysis

The primary objective of the inventory analysis was to determine how to reduce software license counts to an ideal lower level, minimize the cost of reaching that level, and avoid triggering re-purchases of licenses over the next four years. To simulate the evolution of inventory transactions under various assumptions, the following key data points were developed:

  • any changes to cost assumptions (e.g., license pricing terms)
  • performance benchmark ratings of the existing pool servers
  • performance benchmarks of possible refresh candidates (e.g., several server models in different configurations)
  • estimated performance gains from next-generation server models

Eight primary scenarios were developed: an “all at once” server refresh, plus a variety of incremental refreshes pegged either to fixed time intervals or to the arrival of new, higher-performance server models.
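
A minimal sketch of how such a scenario simulation might be structured appears below. The server models, benchmark ratings, demand figure, and refresh schedule are hypothetical placeholders, not the client's data; the point is the shape of the calculation, not the numbers.

    import math
    from dataclasses import dataclass

    @dataclass
    class ServerModel:
        name: str
        sockets: int
        perf_per_server: float  # benchmark rating (hypothetical units)

    def servers_needed(demand: float, model: ServerModel,
                       max_util: float = 0.35) -> int:
        """Servers required so average utilization stays at or below max_util."""
        return math.ceil(demand / (model.perf_per_server * max_util))

    def simulate(months: int, demand0: float, growth: float,
                 old: ServerModel, new: ServerModel, refresh_month: int):
        """Track per-socket license counts month by month as demand grows
        and the pool is refreshed from 'old' to 'new' servers."""
        counts = []
        for m in range(months):
            demand = demand0 * (1 + growth) ** (m / 12)  # annual growth
            model = new if m >= refresh_month else old
            counts.append(servers_needed(demand, model) * model.sockets)
        return counts

    # Hypothetical inputs: 4-way legacy servers vs. a faster 2-way refresh
    legacy = ServerModel("legacy-4way", sockets=4, perf_per_server=100.0)
    nextgen = ServerModel("next-gen-2way", sockets=2, perf_per_server=130.0)
    licenses = simulate(months=48, demand0=700.0, growth=0.05,
                        old=legacy, new=nextgen, refresh_month=4)
    print(f"licenses at start: {licenses[0]}, at month 48: {licenses[-1]}")

Each of the eight scenarios then becomes a different combination of refresh schedule and candidate model, with the cost assumptions layered on top to price the transactions.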

Results

The scenarios were narrowed down to those that changed the server configuration from 4-way to 2-way, which offered higher performance per processor and per core at essentially unchanged space costs. Waiting four months to begin the refresh and then rolling in new-model servers over six months produced a net reduction of 56% in software license counts. Utilization boundaries (minimum and maximum average utilization rates) were not shifted upward to raise overall utilization, which would have realized further savings; they were kept at current levels as a first-line buffer against unplanned growth. The secondary backstop for avoiding license re-purchases was to turn to a public cloud database service (license rental).
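
To make the boundary tradeoff concrete, the sketch below uses placeholder figures (the 45% alternative boundary, server rating, and demand are hypothetical, not the client's) to show the additional license savings the client deliberately left on the table:

    import math

    def licensed_sockets(demand: float, perf_per_server: float,
                         sockets: int, max_util: float) -> int:
        """Licensed sockets needed to keep average utilization under max_util."""
        return math.ceil(demand / (perf_per_server * max_util)) * sockets

    demand = 700.0            # workload in benchmark units (placeholder)
    perf, sockets = 130.0, 2  # hypothetical 2-way refresh server

    kept = licensed_sockets(demand, perf, sockets, max_util=0.35)
    raised = licensed_sockets(demand, perf, sockets, max_util=0.45)
    print(kept, raised)  # 32 vs. 24: raising the boundary saves more
                         # licenses but removes the buffer for unplanned growth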