How to take a screenshot of a complete webpage?
On recent versions of Firefox, there is no need for any plugins. This capability is built into the browser.
- In Firefox, press Shift+F2 to open the Developer Toolbar (in newer versions the same :screenshot command works in the Web Console, opened with Ctrl+Shift+K).
- Type :screenshot --fullpage (this will autocomplete, so typing :s, pressing Tab, then adding --fullpage will work).
That will save the screenshot as a PNG file to your downloads folder. By default, the filename includes the date and time. You can use a different filename if you wish, simply by passing it as an argument: :screenshot --fullpage my_name.png
This will also work on pop-up windows, but each window will need to be saved individually. I know of no way to combine them into a single screenshot. (You could, of course, do that in GIMP or another image-editing tool later.)
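If you would rather combine those individual window screenshots from the terminal instead of GIMP, ImageMagick can do it. A minimal sketch, assuming the imagemagick package is installed and the filenames are just placeholders:
# stack two screenshots vertically into a single image
convert main-window.png popup-window.png -append combined.png
(Use +append instead of -append to place them side by side.)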
Here is a list of applications that you can use from the terminal:
- wkhtmltopdf (source)
wkhtmltopdf is a command-line utility that converts HTML to PDF using the WebKit rendering engine.
sudo apt-get install wkhtmltopdf
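Usage (a minimal sketch; the URL and output filename are only examples):
$ wkhtmltopdf http://www.google.com google.pdf
This renders the page with WebKit and writes google.pdf to the current directory.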
- wkhtmltoimage (source)
The wkhtmltoimage utility takes a screenshot of a given URL and saves it as a PNG image. It uses the WebKit rendering engine.
Download : http://code.google.com/p/wkhtmltopdf/downloads/list
Usage :
To use the wkhtmltoimage utility, simply run the command from the terminal, giving the URL and the name for the image file.
$ ./wkhtmltoimage-amd64 http://www.google.com google.png
It will create google.png in the home directory with a screenshot of www.google.com.
Other options :
wkhtmltoimage provides many options to customise the screenshot. Some examples are as follows :
Quality - Controls the quality/compression of the generated image. The default is 94.
$ ./wkhtmltoimage-amd64 --quality 50 http://www.google.com google.png
Disable images
$ ./wkhtmltoimage-amd64 --no-images http://www.google.com google.png
Disable JavaScript
$ ./wkhtmltoimage-amd64 --disable-javascript http://www.google.com google.png
Crop the screenshot
$ ./wkhtmltoimage-amd64 --crop-h 300 --crop-w 300 --crop-x 0 --crop-y 0 http://www.google.com google.png
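If you need screenshots of many pages, the same binary can be driven from a small loop. A minimal bash sketch, assuming a hypothetical urls.txt with one URL per line:
#!/bin/bash
# save one PNG per URL, named after the host part, e.g. www.google.com.png
while read -r url; do
  name=$(echo "$url" | sed 's|https\?://||; s|/.*$||')
  ./wkhtmltoimage-amd64 --quality 80 "$url" "$name.png"
done < urls.txt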
- cutycapt (source)
CutyCapt is a utility that takes a screenshot of a URL using the WebKit rendering engine and saves it to an image file.
Install
sudo apt-get install subversion libqt4-webkit libqt4-dev g++ cutycapt
Usage
To use CutyCapt, simply run the command from the terminal, providing the URL and the name for the output file.
$ cutycapt --url=http://www.google.com/ --out=google.png
It will create a google.png file in the home directory containing a screenshot of www.google.com.
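CutyCapt needs a running X server, so on a headless machine you can wrap it in xvfb-run. A minimal sketch, assuming the xvfb package is installed:
$ xvfb-run --server-args="-screen 0 1024x768x24" cutycapt --url=http://www.google.com/ --out=google.png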
- khtml2png (source)
khtml2png uses the Konqueror rendering engine (KHTML) to create screenshots of web pages.
Download
http://khtml2png.sourceforge.net/index.php?page=download
Install
To install khtml2png, the program has to be compiled and built on the system.
sudo apt-get install kdelibs4-dev zlib1g-dev g++ cmake
Extract the khtml2png archive.
./configure
make
sudo checkinstall (this will create a .deb file and install it, so that it can be easily uninstalled later)
Usage
To use khtml2png, run the program from the command line, providing the URL and other options.
$ khtml2png2 --width 800 --height 600 http://www.google.com/ google.png
This will create google.png in the home directory with a screenshot of www.google.com.
- PyWebShot (source)
PyWebShot uses the Python bindings for embedded Mozilla (http://www.mozilla.org/unix/gtk-embedding.html).
Install
sudo apt-get install python-gtkmozembed
Download pywebshot from https://github.com/coderholic/PyWebShot
Usage :
$ python pywebshot.py www.google.com -t 1024x768
Loading www.google.com... saved as www.google.com.png
It should create www.google.com.png in the current directory, containing a screenshot of size 1024 x 768.
For anyone who came here looking for a CLI option: there is no need for any other tool; recent versions of both Chrome and Firefox have this built in.
Chrome
/path/to/chrome --headless --screenshot="img.png" "www.stackoverflow.com"
Firefox
/path/to/firefox -screenshot img.png www.stackoverflow.com
That's it.
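If the default viewport is too small or scrollbars show up in the capture, both browsers accept extra flags in headless mode. A sketch; the window sizes below are just examples:
/path/to/chrome --headless --hide-scrollbars --window-size=1280,1696 --screenshot="img.png" "www.stackoverflow.com"
/path/to/firefox -screenshot img.png --window-size=1280,1696 www.stackoverflow.com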