Get all the URLs in an HTML file (local or on a server).

2014-02-17 1 min read bash Fedora
To use this, you will need the lynx tool, so install that first: sudo yum install lynx. Now, to get a list of all the URLs in a local HTML file or at some URL, just run: lynx -dump -listonly
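As a minimal sketch of the full invocation (the file name and URL below are placeholders, not from the post):

  # List the links found in a local HTML file
  lynx -dump -listonly page.html

  # The same works against a live URL
  lynx -dump -listonly https://example.com/

  # Keep only the URLs, dropping the numbering lynx prints
  lynx -dump -listonly https://example.com/ | grep -Eo 'https?://[^ ]+'

The same -dump -listonly pair works whether the argument is a path on disk or a remote address.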

Manage your servers the easy way with a Perl script over SSH, with no remote client.

2013-05-06 8 min read Linux perl
For a long time I have not posted any script. It's not that I have not written anything new, just that I did not put it here for lack of time. So, here is an interesting one. The original idea came from one posted on an interesting blog here. But the problem with that one was that every time it ran from cron, it would make multiple entries in the "last" output (about 10 or more with my modifications for differentiating between Solaris and Linux). Continue reading
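The post's actual Perl script is not shown in this excerpt; as a rough, hypothetical sketch of the general idea (polling servers over SSH from cron with nothing installed on the remote side), a shell loop along these lines captures the shape of it. The host names and commands are placeholders:

  #!/bin/bash
  # Check each server over ssh; no agent or client needed on the remote end.
  for host in server1 server2; do
      os=$(ssh -o BatchMode=yes "$host" uname -s 2>/dev/null)
      case "$os" in
          Linux) ssh "$host" 'uptime; df -h /' ;;
          SunOS) ssh "$host" 'uptime; df -k /' ;;
          *)     echo "$host: unreachable or unknown OS" ;;
      esac
  done

Key-based, non-interactive authentication (BatchMode) is what keeps a cron run from hanging on a password prompt.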

Cont: Get yourself some more conkyrc files.

2011-10-12 1 min read bash Learning Linux
Last time we got ourselves some conkyrc files from the Ubuntu Forums. But that script gets the files only from the first page of the thread. Let's extend this further and make the script fetch all the conkyrc files. There are some 1048 pages in the thread; I am showing pages 1 to 3, but you can change 3 to whatever number you want 🙂 count=0 for i in {1. Continue reading
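The snippet is cut off in this excerpt; one plausible shape of such a loop (the thread URL and the page parameter are placeholders, not the post's actual script) is:

  # THREAD_URL is a placeholder for the forum thread address
  count=0
  # Walk pages 1 to 3 of the thread; raise 3 to cover more pages.
  for i in {1..3}; do
      wget -q -O "page_$i.html" "${THREAD_URL}&page=$i"
      count=$((count + 1))
  done
  echo "Downloaded $count pages"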

Get yourself some conkyrc files.

2011-10-05 2 min read Fedora Linux
If you are looking for some nice conkyrc files, then you can head over to the Ubuntu Forums. In this thread you can see some very nice conkyrc files with screenshots. You can browse through the thread and grab the one that you like. But if you are like me and would like to download all of them to see the features and commands in each, then you would need to copy each of these files and paste them separately. Continue reading
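The script itself is in the full post; as a rough, hypothetical sketch of the idea of grabbing the files from the first page automatically (the thread URL and the attachment link pattern are assumptions, not taken from the post):

  # THREAD_URL is a placeholder for the forum thread address
  wget -q -O page1.html "$THREAD_URL"
  # Pull out links that look like file attachments and fetch each one
  # (the attachment.php pattern is an assumption about the forum's markup)
  grep -Eo 'attachment\.php[^"]*' page1.html | sort -u | while read -r link; do
      wget -q "http://ubuntuforums.org/$link"
  done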