Chef and Memory Limit - Code Chef


Recently Chef decided to make some changes to our beloved CodeChef. As you know, each problem at CodeChef has its memory and time limits. To make problems even more challenging, he decided to measure allocated memory in a different way: the judge will now report not the maximum memory usage during the execution of all test files, but all the memory ever allocated by the solution program. As Chef is not that good at algorithms, he asks you to write a program that calculates the total memory allocated by a solution.
So, you are given N numbers M1, M2, ..., MN representing the memory consumed (in MB) on N test files; in other words, on the i-th test file the program took Mi MB of memory. Initially, no memory is allocated for your program. Before running the program on each test file, if the currently allocated memory is more than the memory needed for that test file, the excess is deallocated; if less memory is allocated than needed, the difference is allocated. For example, suppose the program needs 10 MB on the current test file. If 12 MB were allocated before running it, 2 MB are deallocated; if 8 MB were allocated before running it, 2 MB are allocated.
Calculate the total memory allocated while running the solution program on all N test files. Please see the third sample for more clarity.
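In other words, only increases count: the total is M1 plus the sum of max(0, Mi - Mi-1) over i >= 2, since shrinking the allocation never allocates anything new. Below is a minimal sketch of that rule in C, assuming the measurements are already in an array (the helper name total_allocated is illustrative, not something from the problem):

#include <stdio.h>

/* Total memory ever allocated: the first measurement plus every positive
   jump between consecutive test files; shrinking costs nothing. */
long long total_allocated(const long long *mem, int n)
{
    long long total = 0;
    for (int i = 0; i < n; i++)
    {
        if (i == 0)
            total += mem[i];                /* nothing allocated yet */
        else if (mem[i] > mem[i - 1])
            total += mem[i] - mem[i - 1];   /* allocate only the difference */
    }
    return total;
}

int main(void)
{
    long long sample[] = {1, 3, 2};                 /* third sample below */
    printf("%lld\n", total_allocated(sample, 3));   /* prints 3 */
    return 0;
}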

Input

The first line of input contains a single integer T denoting the number of test cases. The first line of each test case contains a single integer N denoting the number of measurements. The second line of each test case contains N space-separated integers, where the i-th integer denotes the memory consumed on the i-th test file.

Output

For each test case, print the total memory allocated while running the solution program.

Constraints

  • 1 ≤ T ≤ 10^5
  • 1 ≤ N ≤ 10^5
  • 1 ≤ Mi ≤ 10^9
  • the sum of N over all test cases does not exceed 10^5

Subtasks

Subtask 1 (30 points):

  • 1 ≤ T ≤ 100
  • 1 ≤ N ≤ 100
  • 1 ≤ Mi ≤ 100

Subtask 2 (70 points):
  • Original constraints.

Example

Input:
3
2
1 1
5
1 2 3 4 5
3
1 3 2

Output:
1
5
3

Explanation

Example case 1. Initially, no memory was allocated. For the first test file, 1 MB was allocated. There was no allocation or deallocation for the second test file.
Example case 2. For each test file, 1 MB more than on the previous run was allocated, so 5 MB of memory were allocated in total while running the program.
Example case 3. Initially, no memory was allocated. For the first test file, 1 MB was allocated. For the second test file, a further 2 MB were allocated to reach the 3 MB needed. For the last test file, 1 MB was deallocated to get down to the 2 MB needed. So overall, 1 + 2 = 3 MB were allocated over the whole run. Note that only allocations are counted; deallocations are not.


MY SOLUTION:

But it does not pass the last 3 test cases.

#include <stdio.h>

int main()
{
    long int t, n;
    scanf("%ld", &t);
    while (t--)
    {
        scanf("%ld", &n);
        long int a[n], i;
        /* The total can approach 10^5 * 10^9 = 10^14, so the accumulator
           must be 64-bit; a plain long may be only 32 bits on some judges. */
        long long res = 0;
        for (i = 0; i < n; i++)
        {
            scanf("%ld", &a[i]);
            if (i == 0)
                res += a[i];              /* first run: allocate everything */
            else if (a[i] > a[i - 1])
                res += a[i] - a[i - 1];   /* grow: count only the extra allocation */
            /* shrinking (a[i] < a[i-1]) only deallocates, which is not counted */
        }
        printf("%lld\n", res);
    }
    return 0;
}
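
The allocation logic itself is correct: it counts the initial allocation plus every positive difference between consecutive runs. The most likely reason the original version failed only the last (largest) test files is the width of the running sum: with N up to 10^5 and Mi up to 10^9, the answer can approach 10^14, which does not fit in a 32-bit integer, and a plain long is only guaranteed to be at least 32 bits. A quick check of the magnitudes (using INT_MAX from limits.h as the 32-bit limit):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* upper bound allowed by the constraints:
       at most 10^5 allocations of at most 10^9 MB each */
    long long worst = 100000LL * 1000000000LL;   /* 10^14 */
    printf("worst-case total: %lld\n", worst);
    printf("32-bit maximum  : %d\n", INT_MAX);   /* about 2.1 * 10^9 */
    return 0;
}

Widening the accumulator to long long (printed with %lld), as in the version above, keeps the same answers on the samples (1, 5 and 3) while covering the worst case the constraints allow.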


OUTPUT:
