You can run multiple scp commands in the background, in parallel. I assume you are intending to scp multiple large files. I am going to show how, using the job control feature of bash. When I initially googled, the suggestion was to use screen (a terminal multiplexer), but I think the bash job control method is better, because it is less resource hungry.
The basic concept is as below; I improved upon it with a simple bash for loop.
scp user@host:/file1 .
ctrl-z
bg
scp user@host:/file2 .
ctrl-z
bg
scp user@host:/file3 .
ctrl-z
bg
jobs
du -h file1 file2 file3
I will explain what is happening in a bit. But first I will show the above process as a bash script.
for i in file1 file2 file3
do
    scp user@host:/$i . &
done
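Here is a fuller sketch of the same loop that you can actually run locally. user@host and the filenames above are placeholders, so in this example a local cp from a temporary "remote" directory stands in for scp, purely so the script works without a real remote host; with a real host you would use the scp line shown in the comment.

```shell
#!/usr/bin/env bash
# Stand-in for the remote host: a local directory with three files.
src=$(mktemp -d)
for i in file1 file2 file3; do
    head -c 1000000 /dev/urandom > "$src/$i"
done

# Download directory (the "." in the scp commands).
dest=$(mktemp -d)
cd "$dest"

# Launch one transfer per file, each in the background with &,
# instead of suspending with ctrl-z and resuming with bg.
for i in file1 file2 file3; do
    cp "$src/$i" . &      # with a real host: scp user@host:/$i . &
done

jobs                      # list the background transfers
wait                      # block until every transfer has finished
du -h file1 file2 file3   # see how much of each file has arrived
```

The wait at the end is useful in a script: it makes the shell pause until all the background transfers are done before moving on.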
Basically what is happening is that the ctrl-z key combination suspends the process, and then bg job-id resumes that process in the background. The jobs command lists the status of the jobs, i.e. whether they are running or not. Since you are running scp in the background you don't get to see the progress bar, but that's not an issue, since the du -h command gives you that status information: it shows how much of each file has been downloaded so far.
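You can't press ctrl-z inside a script, but you can see what jobs reports for commands started in the background. In this small sketch, sleep stands in for a long-running scp:

```shell
#!/usr/bin/env bash
# sleep stands in for a long-running scp transfer.
sleep 5 &
sleep 5 &
jobs    # lists both background jobs, marked Running
wait    # returns once both have exited
jobs    # now prints nothing: no jobs are left
```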
Now for the script part. In the manual method you have to press ctrl-z and type bg after each scp, and run jobs and du as necessary. The script avoids the manual steps by starting each scp directly in the background with a & at the end of the command.
Note that there is a dot at the end of every scp command: it means each file is downloaded into the directory where I am executing the command.
One thing I noticed is that job control is specific to the terminal (pty) you are in. So if you open another terminal and type part of the sequence here and part of it there, it won't work.