Moore's law is a well-known prediction of the growth of computing power over time.
This is the formulation we will use in this problem:
The speed of new computers grows exponentially and doubles every 18 months. In this problem we will assume that reality precisely obeys this law.
Suppose that you have a hard computational task that would take 14 years to complete on a current computer. Surprisingly, starting its computation as soon as possible is not the best you can do. A better solution: Wait for 18 months and buy a better computer.
It will be twice as fast, and therefore solve the task in 7 years. You would have the result 8.5 years from now. In the best possible solution you should wait for slightly over 4 years. The computer you'll be able to buy then will solve the task in approximately 2.2 years, giving a total of 6.2 years.
You have a computational task you want to solve as quickly as possible. You will be given an int years giving the number of years it would take on a computer bought today.
Return a double giving the least number of years after which you will have the result of the task, assuming you choose the waiting time optimally.
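One way to solve this is in closed form. If you wait w years, the computer you buy is 2^(w/1.5) times faster, so the total time is total(w) = w + years / 2^(w/1.5); setting the derivative to zero gives the optimal wait. The sketch below illustrates this approach (the function name `fastest_finish` is my own choice, not part of the problem statement):

```python
import math

DOUBLING = 1.5  # years per doubling (18 months)

def fastest_finish(years: int) -> float:
    """Least total time (wait + compute) for a task that takes
    `years` on a computer bought today."""
    # total(w) = w + years * 2**(-w / DOUBLING)
    # d/dw total(w) = 0  =>  2**(w / DOUBLING) = years * ln(2) / DOUBLING
    w = DOUBLING * math.log2(years * math.log(2) / DOUBLING)
    if w <= 0:
        # For small tasks (years <= 1.5/ln 2, i.e. about 2.16),
        # waiting never pays off: start the computation now.
        return float(years)
    return w + years * 2 ** (-w / DOUBLING)
```

For the example in the statement, `fastest_finish(14)` gives about 6.204: a wait of roughly 4.04 years followed by roughly 2.16 years of computation. An equivalent alternative, if calculus is to be avoided, is a ternary search on w over the unimodal function total(w).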
