You can't do the equivalent of an ls unless the server provides such a listing itself. You could, however, retrieve the directory's index page and check it for links. For downloading a known set of files, wget has a built-in flag for this: wget -i your_list, where your_list is a text file with one URL per line (you can find this kind of thing by reading man wget). wget appears to return 0 when every download succeeds and non-zero otherwise, so you can use its exit status to detect that one of the files was not downloadable.
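A minimal sketch of that workflow, assuming a hypothetical set of numbered archive URLs (example.com is a placeholder):

```shell
# Build a list of numbered files (hypothetical URLs and naming
# scheme) and hand the whole list to wget in one invocation.
for i in $(seq 1 5); do
    echo "https://example.com/archive/part-$i.zip"
done > files.txt

# -i reads URLs from the file; -P sets the download directory.
# wget's exit status is 0 only if every download succeeded, so it
# can gate a follow-up step.
if wget -q -i files.txt -P downloads/; then
    echo "all downloads succeeded"
else
    echo "at least one download failed" >&2
fi
```

Checking the exit status of the whole run is coarser than per-file checks, but it is enough to answer "did anything fail?".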
That being said, if you absolutely want a listing, you can abuse wget's debug mode (-d) to gather the links it encounters while analyzing the HTML. More commonly, wget is used to download individual files or even mirror entire websites: put the URLs in a text file, one per line, and pass that file to wget with -i. If you try to download all files from a directory listing and wget produces no downloads, the index may not be HTML at all; you can inspect the server's response headers without downloading anything by running wget -S --spider URL and checking the Content-Type.
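A rough sketch of the debug-mode trick (example.com is a placeholder, and the exact debug messages differ between wget versions, so this just greps anything URL-shaped out of the log rather than matching specific wording):

```shell
# Fetch one page in debug mode, discard the body (-O /dev/null),
# and pull URL-shaped strings out of the debug output on stderr.
wget -d -O /dev/null https://example.com/ 2>&1 \
    | grep -Eo 'https?://[^" <>]+' \
    | sort -u > links.txt

# links.txt now holds a deduplicated list of URLs wget mentioned.
wc -l links.txt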
The wget command is an internet file downloader that can download anything from single files to entire websites. If you want to download multiple files, create a text file with the list of target URLs, one per line, and pass it to wget with -i. wget is a free utility for non-interactive download of files from the web; for FTP directories, the server's directory listing is written to a .listing file. You can also download a list of files at once: even if you can't find an entire folder of the downloads you want, wget can still help. Just put all of the download links into the list file, one per line, and start the download from there.
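When the files do live under one folder on the server, recursive retrieval can fetch the whole directory. A sketch, assuming a hypothetical directory URL and restricting the download to PDFs:

```shell
# Sketch of downloading an entire folder (hypothetical URL):
# -r recurses into linked pages, -np ("no parent") stops wget from
# climbing above the starting directory, and -A keeps only files
# whose names match the pattern.
mkdir -p mirror
wget -q -r -np -A '*.pdf' -P mirror/ https://example.com/docs/ \
    || echo "download failed (no network, or nothing matched)" >&2
```

The -np flag is the important one: without it, a recursive download can wander up and across the whole site.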