## Description

- [10 points] Assume our service has a predictable daily demand where the peak requires 250 servers but the trough requires only 50 servers. Assume that, on average, we require 150 servers. Assume further that each server hour costs us $5.00 in our own data center or $4.20 with a cloud provider.
*Hint: The example on p. 10 of the supplemental PDF will help you with this problem.*

- How many server hours per day do we actually need?

- How many server hours per day must we actually provision?

- What is the potential cost difference per year between hosting the application in our own data center versus hosting it with the cloud provider?
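One way to set up these calculations is sketched below. The key assumptions here are mine, not stated outright in the problem: the data center must provision for peak demand around the clock, the cloud provider bills only for the server hours actually used, and a year is 365 days.

```python
# Capacity-planning sketch for the 250-peak / 150-average workload.
# Assumptions (mine): the data center provisions at peak 24/7, while
# the cloud bills only for average (actual) usage; 365-day year.

PEAK_SERVERS = 250
AVG_SERVERS = 150
DC_RATE = 5.00       # $/server-hour in our own data center
CLOUD_RATE = 4.20    # $/server-hour with the cloud provider
HOURS_PER_DAY = 24
DAYS_PER_YEAR = 365

# Server hours per day we actually need (average demand).
needed = AVG_SERVERS * HOURS_PER_DAY

# Server hours per day we must provision in our own data center (peak demand).
provisioned = PEAK_SERVERS * HOURS_PER_DAY

dc_cost_per_year = provisioned * DC_RATE * DAYS_PER_YEAR
cloud_cost_per_year = needed * CLOUD_RATE * DAYS_PER_YEAR
difference = dc_cost_per_year - cloud_cost_per_year

print(needed, provisioned, difference)
```

The structure mirrors the p. 10 example: over-provisioning for the peak is what makes the data-center figure so much larger than the pay-for-use cloud figure.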

- [15 points] Suppose we create 250 GB of new data each week that needs to be analyzed and we have 8 local servers for that processing. A computer the speed of one EC2 instance takes 2 hours per GB to process the new data.
*Hint: The example on p. 13 of the supplemental PDF will help you with this problem. Also, assume that 1 GB = 10^3 MB.*

- How long will it take us to process each week’s data locally?

- How much would it cost to process the data locally, assuming that each server hour costs $4.50?

- How many cloud compute instances would we need to complete the analysis in one hour?

- How much does the computation cost to process the data with our cloud provider assuming that each server hour of compute time costs $0.075?

- How much do the transfer fees cost to move the data to the cloud provider, assuming that each GB costs $0.10 to transfer?

- How long will it take to transfer the full 250 GB of data to the cloud provider assuming that they can sustain an average of 20 Mbits/second?

- How much will it cost to process each week’s data in the cloud?

- Compare the processing time and overall cost and make a recommendation as to whether or not this process should be moved to the cloud provider.
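The quantities above can be laid out in one pass, as in the sketch below. The assumptions are mine: the local servers run at the same speed as one EC2 instance, the stated 1 GB = 10^3 MB conversion applies, and 1 MB = 8 Mbits for the transfer-time calculation.

```python
# Processing-time and cost sketch for the weekly 250 GB workload.
# Assumptions (mine): local servers match one EC2 instance in speed;
# 1 GB = 10**3 MB and 1 MB = 8 Mbits for the transfer calculation.

DATA_GB = 250
HOURS_PER_GB = 2            # one EC2-speed machine, per the problem
LOCAL_SERVERS = 8
LOCAL_RATE = 4.50           # $/server-hour locally
CLOUD_RATE = 0.075          # $/server-hour in the cloud
TRANSFER_RATE_PER_GB = 0.10 # $/GB transferred
LINK_MBITS_PER_SEC = 20     # sustained transfer rate

total_compute_hours = DATA_GB * HOURS_PER_GB            # total server-hours of work

local_time_hours = total_compute_hours / LOCAL_SERVERS  # wall-clock time on 8 servers
local_cost = total_compute_hours * LOCAL_RATE

cloud_instances_for_one_hour = total_compute_hours      # finish in 1 hour => 1 hour each
cloud_compute_cost = total_compute_hours * CLOUD_RATE
transfer_cost = DATA_GB * TRANSFER_RATE_PER_GB

mbits_to_move = DATA_GB * 10**3 * 8                     # GB -> MB -> Mbits
transfer_time_hours = mbits_to_move / LINK_MBITS_PER_SEC / 3600

cloud_total_cost = cloud_compute_cost + transfer_cost

print(local_time_hours, local_cost)
print(cloud_instances_for_one_hour, transfer_time_hours, cloud_total_cost)
```

Note that the cloud's wall-clock story includes the transfer time, not just the one hour of compute, so the time comparison is transfer + compute versus the local processing time.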