I know this has been addressed before, but could someone PLEASE explain, in the simplest way, how to download just the jpgs and gifs ONLY? What parameters do I give httrack to grab only the images I'm interested in and save them to the current directory, so I don't have to move them around afterwards? I've even tried software like Extreme Picture Finder, Bulk Image Downloader and HTTrack, but all of them end up downloading only the thumbnails. On top of that, the website uses picture / source tags with srcset attributes, which HTTrack does not handle, so those pictures do not work well offline.
Try HTTrack website copier - it will load all the images on the website. You can also try tahrfoundation.org, which will grab the website as-is. HTTrack is an open-source utility that copies a website, building all directories recursively and getting the HTML, images, and other files. It is often not possible to mirror only the images, because HTTrack must follow the links on the HTML pages to find all the images you want, so the pages themselves get downloaded as well; what you can do is restrict what gets saved using scan-rule filters, as in the example below.

Two caveats, though. First, HTTrack does not handle picture / source tags with srcset attributes, so pictures referenced only through srcset will not work well offline (a workaround is sketched at the end of the thread). Second, some dynamic scripts (such as the ones at tahrfoundation.org) can generate either HTML content or image data depending on the context, so filtering purely by file extension can miss images that are served from script URLs.
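Here is roughly what that invocation can look like - a minimal sketch, assuming HTTrack's documented scan-rule syntax ("-*" refuses everything, later "+" rules re-accept what you want); the URL, the "./mirror" output path and the exact rule list are illustrative, not required values:

    # Mirror the site but keep only jpg/jpeg/gif files.
    # The HTML rules are needed so the crawler can still follow links to
    # find the images; pages served from extensionless URLs (e.g. /gallery/)
    # may need additional "+" rules of their own.
    httrack "https://example.com/" -O "./mirror" -v \
            "-*" "+*.html" "+*.htm" "+*.jpg" "+*.jpeg" "+*.gif"

As for saving everything flat into the current directory: combining -O . with a user-defined build structure such as -N "%n.%t" should do it, but double-check the structure variables against your HTTrack version's documentation - I am quoting them from memory.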
Please explain in more detail.
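To expand on the srcset problem: HTTrack simply does not parse srcset attributes, so one rough workaround is to pull those URLs out of the mirrored HTML yourself and fetch them with a second tool. A sketch, assuming GNU grep and wget are available, that the mirror lives in ./mirror, and that the URLs inside the srcset attributes are absolute (relative ones would first need the site's base URL prepended):

    # Collect every URL mentioned inside a srcset="..." attribute in the
    # mirrored HTML pages, de-duplicate the list, and hand it to wget.
    # -nc skips files that were already downloaded on a previous run.
    grep -rhoE --include='*.html' 'srcset="[^"]*"' ./mirror \
      | grep -oE 'https?://[^ ",]+' \
      | sort -u \
      | wget -nc -P ./mirror/srcset-images -i -

Note that these downloads will not be rewritten into the saved pages the way HTTrack's own links are, so this only gets the files onto disk; relinking them into the offline copy is a separate job.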