John is going to sort a sequence of numbers using a special algorithm.
First, all numbers a[0], a[1], a[2], ..., a[n - 1] are written on a blackboard. During the first iteration, John checks all numbers in the order of increasing indices (so, he checks a[0] first, followed by a[1], a[2], ..., and ends the first iteration with a[n - 1]).
At any moment, he can erase the number he is looking at from the blackboard and write it into his notebook.
When looking at number a[i], John erases it from the board and writes it into his notebook if and only if it is equal to the smallest unerased number on the blackboard.
All other iterations are similar to the first one, but, of course, John checks only the numbers which were not erased from the blackboard. The process is over when all numbers have been erased from the board and written into John's notebook in nondecreasing order.
For example, if John is given a sequence of {3, 5, 1, 4, 2}, the process will go as follows. During the first iteration, John will erase 1 and 2 from the board, writing them to the notebook and changing the sequence to {3, 5, 4}. During the second iteration, 3 and 4 will be written into his notebook, so only 5 will be on the board during the third iteration.
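The process above can be sketched with a direct simulation. The following is a minimal illustration (the function name `simulate` is my own, not part of the problem); the key detail is that the "smallest unerased number" is recomputed at every step, since it changes as numbers are erased within an iteration:

```python
def simulate(seq):
    """Run John's process; return the numbers erased in each iteration."""
    board = list(seq)
    per_iteration = []
    while board:
        erased = []   # numbers John erases during this iteration
        kept = []     # numbers that survive this iteration
        for i, x in enumerate(board):
            # The unerased numbers at this moment are those kept so far
            # plus the not-yet-visited tail of the board (including x).
            if x == min(kept + board[i:]):
                erased.append(x)
            else:
                kept.append(x)
        per_iteration.append(erased)
        board = kept
    return per_iteration
```

On the example sequence {3, 5, 1, 4, 2}, this reproduces the iterations described above: [1, 2], then [3, 4], then [5]. Recomputing `min` each step is quadratic, which is fine for a sketch; a real solution could track the minimum more cleverly.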
You will be given five ints: a0, X, Y, M and n. The sequence John will need to sort can be generated using the following algorithm:
 a[0] = a0;
 a[i] = (a[i - 1] * X + Y) mod M, for 0 < i < n (where mod denotes the remainder operation).
You will be given two more ints start and finish. Return the sum of all numbers John will erase from the board during all iterations with indices (1-based) between start and finish, inclusive. If John will need fewer than finish iterations to sort the sequence, return -1.
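Putting the pieces together, a self-contained solution sketch might look as follows (the function name `john_sort_sum` is my own; per-iteration totals are accumulated during the simulation, then summed over the requested 1-based range):

```python
def john_sort_sum(a0, X, Y, M, n, start, finish):
    """Sum of numbers erased in iterations start..finish (1-based), or -1."""
    # Generate the sequence from the given recurrence.
    a = [a0]
    for _ in range(1, n):
        a.append((a[-1] * X + Y) % M)
    # Simulate John's process, recording the total erased per iteration.
    board = a
    sums = []  # sums[k] = total erased during iteration k + 1
    while board:
        kept = []
        erased_sum = 0
        for i, x in enumerate(board):
            # Smallest unerased number = min over the survivors so far
            # and the not-yet-visited tail (including x itself).
            if x == min(kept + board[i:]):
                erased_sum += x
            else:
                kept.append(x)
        sums.append(erased_sum)
        board = kept
    # Fewer than finish iterations were needed: return -1 per the statement.
    if finish > len(sums):
        return -1
    return sum(sums[start - 1:finish])
```

For instance, a0 = 3, X = 1, Y = 1, M = 6, n = 5 generates the sequence {3, 4, 5, 0, 1}, which takes two iterations (erasing 0 and 1, then 3, 4 and 5), so start = 1, finish = 2 yields 13 and finish = 3 yields -1.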
