By Cristhian Bonilha, UTFPR Brazil
Your favorite band just released a new album, and to make the experience even more exciting you decided to listen to the songs in random order. To do that you wrote an algorithm that builds a playlist with K songs from the album. The problem is that your algorithm is not very effective in how it chooses the songs: some songs may be played more than once before other songs are played for the first time.
Given the number of songs on the album, the duration of each song, and the playlist generated by your algorithm, determine how long it would take to listen to every song of the album at least once, if that is possible.
There will be at most 150 test cases. Each test case starts with two integers M and K, indicating the number of songs in the album and the number of songs in your playlist (1 ≤ M ≤ 100, 1 ≤ K ≤ 1000).
Following there will be M integers mi, indicating that the i-th song of the album lasts mi minutes (1 ≤ mi ≤ 300, for each 1 ≤ i ≤ M).
Following there will be K integers ki, indicating that the i-th song of the playlist is track number ki (1 ≤ ki ≤ M, for each 1 ≤ i ≤ K).
The input ends with end of file (EOF).
For each test case print one line, containing one integer, indicating how long it would take for you to listen to all the songs of the album at least once. If this is not possible, print -1.
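The required computation can be sketched as follows: walk the playlist in order, accumulating each track's duration, and stop as soon as every album track has been heard at least once. This is a hedged Python sketch of that idea (the function name and structure are my own, not part of the problem):

```python
def time_to_hear_all(durations, playlist):
    """Return total minutes elapsed, following the playlist in order,
    until every album track has played at least once; -1 if some
    track never appears in the playlist."""
    heard = set()
    total = 0
    for track in playlist:          # tracks are 1-indexed per the statement
        total += durations[track - 1]
        heard.add(track)
        if len(heard) == len(durations):
            return total            # last missing track just finished
    return -1                       # playlist ended with tracks unheard


# Example: album of 2 songs lasting 2 and 3 minutes; playlist 1, 1, 2.
# Song 2 finishes at minute 2 + 2 + 3 = 7.
print(time_to_hear_all([2, 3], [1, 1, 2]))  # prints 7
print(time_to_hear_all([5, 5], [1, 1, 1]))  # track 2 never plays: prints -1
```

Note that the answer counts the full duration of every playlist entry up to and including the one that completes the album, even entries that repeat already-heard songs.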
Input Sample | Output Sample