You took a test consisting of N questions, each of which has a distinct point value between 1 and N, inclusive, and you have finally received the results. Along with your final score, you are told which questions you answered correctly. However, you are not given the point values that were assigned to the questions. For each correct answer, you received the full point value of the question, and for each wrong answer, you received 0 points.

You must determine the minimum possible error of a valid point assignment that would result in the final score that you received. The error of a valid assignment of points is defined as follows: for each question i (where i is a 1-based index), let e(i) be the absolute value of (i minus the point value of question i). The error of the point assignment is the maximum value of e(i) over all questions.
Given a String questions and an int score, return an int representing the minimum possible error of a valid point assignment. The ith character (where i is a 1-based index) of questions is either 'C', meaning that you answered question i correctly, or 'W', meaning that you answered it incorrectly. If no valid point assignment exists, return -1.
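A valid point assignment is a permutation of 1 through N, so for small N the statement can be checked directly by exhaustive search. The sketch below (a brute force, not an efficient intended solution; the function name `min_error` is my own) enumerates every permutation, keeps those whose correct-question sum equals the score, and tracks the smallest maximum error:

```python
from itertools import permutations

def min_error(questions: str, score: int) -> int:
    """Brute-force search over all point assignments (feasible only for small N)."""
    n = len(questions)
    best = -1
    for perm in permutations(range(1, n + 1)):
        # perm[i] is the point value assigned to question i+1 (1-based)
        total = sum(p for p, q in zip(perm, questions) if q == 'C')
        if total != score:
            continue  # this assignment does not reproduce the reported score
        err = max(abs((i + 1) - p) for i, p in enumerate(perm))
        if best == -1 or err < best:
            best = err
    return best
```

For example, with questions = "CW" and score = 2, the only valid assignment gives question 1 the value 2 and question 2 the value 1, so the answer is 1.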