Thursday, 11 April 2013

youtube-dl: An audio/video download tool

youtube-dl seems to be a versatile tool. It can download from

CollegeHumor, Comedy Central, Dailymotion, Facebook, Metacafe, MyVideo, Photobucket, The Escapist, Vimeo, Yahoo!, YouTube, blip.tv, depositfiles.com, video.google.com, xvideos, Soundcloud, InfoQ, Mixcloud, OpenClassRoom.
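That list was current when this was written; to see exactly which sites your installed version supports, youtube-dl can print its full extractor list:

youtube-dl --list-extractors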

Since it supports SoundCloud downloads, you can get mp3 files.
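For example, here is a minimal sketch of pulling a SoundCloud track down as mp3 (the URL is a placeholder, and the mp3 conversion assumes ffmpeg or avconv is installed):

youtube-dl -x --audio-format mp3 https://soundcloud.com/some-artist/some-track

The -x flag extracts the audio, and --audio-format mp3 converts it after download.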

Wednesday, 10 April 2013

Areas of innovation

I feel the following areas have tremendous scope for improvement, and we can expect breakthrough innovation in them. At least, I am really interested:

  • Battery technology, especially for smartphones. Smartphone batteries don't last long; current technology is insufficient for long battery life. We need ways to pack more electric power per unit of space, and also to reduce input-to-output current loss, which would cut down the time a battery takes to charge.
  • Data communication networks. From my experience as a sysadmin, I strongly believe networking has miles of improvement left in it. Multipath TCP is a step in that direction, and SDN (software-defined networking) is also being developed. In this age of cloud services, networking performance becomes very important. Current technology is based on the assumption that servers sit fixed inside cages in a data centre, but virtualisation lets us create and destroy servers at will, and even live-migrate them from one physical host to another. And then there are wireless networks too.

Run scp in the background without using nohup in Linux


You can run multiple scp commands in the background, in parallel. I assume you are intending to scp multiple large files. I am going to show how, using the job-control feature of bash. When I initially googled, the suggestions were to use screen (a terminal multiplexer), but I think the bash job-control method is better, because it is less resource hungry.

The basic concept is as below; I improved upon it with a simple bash for loop.

scp user@host:/file1 .
ctrl-z    # press ctrl-z to suspend the first transfer
scp user@host:/file2 .
ctrl-z
scp user@host:/file3 .
ctrl-z
bg 1    # resume each suspended job in the background
bg 2
bg 3
jobs    # confirm all three are running
du -h file1 file2 file3    # see how much of each file has arrived so far

I will explain what is happening in a bit. But first, I will show the above process as a bash script.

# assumes key-based ssh auth; a backgrounded scp cannot answer a password prompt
for i in file1 file2 file3
do
    scp "user@host:/$i" . &    # the trailing '&' starts each scp directly in the background
done
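If you want the shell or a wrapper script to pause until every transfer has finished, bash's wait builtin does exactly that; adding it after the loop is my own extension, not part of the original recipe:

wait    # returns only once all background jobs of this shell have exited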

Basically, what is happening is that the ctrl-z key combination suspends the running process, and then bg job-id resumes it in the background. The jobs command lists the status of the jobs, i.e. whether they are running or not. Since the scp commands run in the background, we don't get to see the progress bar, but that's not an issue: du -h gives the same status information, telling you how much of each file has been downloaded so far.
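If you don't want to keep re-running du by hand, wrapping it in watch refreshes it automatically (the 5-second interval here is just an example):

watch -n 5 du -h file1 file2 file3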

Now for the script part. In the manual method, after each scp you have to press ctrl-z yourself and then run the bg, jobs, and du commands as necessary; in the script, the & at the end of each scp starts the transfer directly in the background, so there is nothing to suspend or resume.
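One gotcha with the manual method: if an scp is still waiting for a password when it lands in the background, it cannot read the prompt and will simply stop. Bringing it back to the foreground fixes that; %1 here is standard bash jobspec syntax:

fg %1    # bring job 1 back to the foreground, type the password, then ctrl-z and bg again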

Note that there is a dot at the end of every scp command; as you might be aware, that dot means the files are downloaded into the directory where I am executing the command.
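If you want the files to land somewhere other than the current directory, the dot is simply replaced with a path (the directory here is just an example):

scp user@host:/file1 /data/incoming/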

One thing I noticed is that job control is specific to the terminal (pty) it runs in. So if you open another terminal and type part of the commands here and part there, it won't work.
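A quick way to see this for yourself, with sleep standing in for a long transfer:

sleep 600 &
jobs    # in this terminal: shows [1]+ Running  sleep 600 &

Run jobs in a second terminal and it prints nothing, because the job belongs to the first shell.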