Linux Crash Course - The wget Command

  31,492 views

Learn Linux TV

2 years ago

The Linux Crash Course is a tutorial series that goes over all of the core concepts regarding Linux that you'll need to know, one video at a time. In this episode, the wget command is covered.
*Thank you to Linode for sponsoring this episode of the Linux Crash Course*
- Set up your own cloud server with Akamai Connected Cloud ➜ learnlinux.link/akamai
*Support Learn Linux TV*
Note: Commission may be earned for any and all links presented here.
- Support me on Patreon and get early access to new content! ➜ learnlinux.link/patron
- Affiliate store for Linux compatible hardware/accessories ➜ learnlinux.link/amazon
- Check out the Tiny Pilot KVM for your Homelab ➜ learnlinux.link/tinypilot
*Time Codes*
00:00 - Intro
01:25 - Get your very own cloud Linux server with Linode
02:38 - Checking if wget is installed
03:22 - Downloading a single file with wget
06:02 - Choosing the name of the file to be downloaded
07:21 - Choosing a location for the file to be downloaded
08:38 - Resuming an interrupted download with wget
11:18 - Using an input file with wget to download a list of files
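For quick reference, the segments above boil down to a handful of wget flags. A sketch, using the WordPress tarball (which the video also uses) as the example download — the exact URL and file names are illustrative:

```shell
# Check whether wget is installed
which wget

# Download a single file into the current directory
wget https://wordpress.org/latest.tar.gz

# Choose the name of the downloaded file (-O)
wget -O wordpress-latest.tar.gz https://wordpress.org/latest.tar.gz

# Choose a destination directory for the download (-P)
wget -P ~/Downloads https://wordpress.org/latest.tar.gz

# Resume an interrupted download (-c)
wget -c https://wordpress.org/latest.tar.gz

# Download every URL listed in a text file, one URL per line (-i)
echo 'https://wordpress.org/latest.zip' > download-list.txt
wget -i download-list.txt
```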
*Full Courses from Learn Linux TV*
• Linux Essentials Certification Workshop ➜ learnlinux.link/lpi-course
• Linux Crash Course series ➜ linux.video/cc
• Learn how to use tmux ➜ linux.video/tmux
• Learn how to use vim ➜ linux.video/vim
• Bash Scripting Series ➜ linux.video/bash
• Proxmox VE Cluster Full Course ➜ linux.video/pve
• Learn Ansible ➜ linux.video/ansible
*Linux-related Podcasts*
• Enterprise Linux Security ➜ enterpriselinuxsecurity.show
• The Homelab Show ➜ thehomelab.show
*Learn Linux TV on the Web*
• Main site ➜ www.learnlinux.tv
• Community ➜ community.learnlinux.tv
• Enterprise Linux Security Podcast ➜ enterpriselinuxsecurity.show
• The Homelab Show Podcast ➜ thehomelab.show
• Content Ethics ➜ www.learnlinux.tv/content-ethics
• Request Assistance ➜ www.learnlinux.tv/request-assistance
*Disclaimer*
Learn Linux TV provides technical content that will hopefully be helpful to you and teach you something new. However, this content is provided without any warranty (expressed or implied). Learn Linux TV is not responsible for any damages that may arise from any use of this content. The person viewing Learn Linux TV's content is expected to follow their best judgement and to make their best decisions while working with any related technology. Always make sure you have written permission before working with any infrastructure. Also, be sure that you're compliant with all company rules, change control procedures, and local laws.
#Linux #LearnLinux #DevOps

Comments: 57
@zauliuz · 2 years ago
Thanks for the "-i" option, that's a life saver :) Another very useful option is "-b" (background download).
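A minimal sketch of both options mentioned here (the example.com URLs and file names are placeholders):

```shell
# Build an input file with one URL per line, then hand it to wget (-i)
printf '%s\n' \
  'https://example.com/file1.iso' \
  'https://example.com/file2.iso' > urls.txt
wget -i urls.txt

# Start a download in the background (-b); wget detaches immediately
# and appends its progress output to ./wget-log
wget -b https://example.com/large-file.iso
# tail -f wget-log   # watch the background download's progress
```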
@ArniesTech · 2 years ago
You are my online encyclopedia for Linux stuff. Perfectly explained and understandable, as always 🙏
@LearnLinuxTV · 2 years ago
Wow, thanks!
@aaronperl · 2 years ago
I know you can't cover every available option, but I just wanted to mention that I first discovered wget when I needed to download the Slackware distribution, before the days that ISOs (and fast enough internet to download ISOs) were commonplace. The recursive option to wget was great for mirroring the directory structure in preparation to create an install CD.
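The recursive mode this comment describes still exists; a hedged sketch (the URL is a placeholder, and the flag combination is one common choice for mirroring a directory tree, not the only one):

```shell
# Mirror a remote directory tree:
#   -r            recurse into linked pages and directories
#   -np           never ascend to the parent directory
#   -nH           don't create a top-level directory named after the host
#   --cut-dirs=1  strip one leading path component from saved paths
wget -r -np -nH --cut-dirs=1 https://example.com/slackware/
```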
@DL-xf3ur · 1 year ago
Awesome as always, Jay! Thanks.
@moamer1468 · 4 months ago
Thanks very much, your tutorials are so simple and I could understand each piece of information in your videos.
@SirJohnHolmes_I · 1 year ago
Thanks for such a nice explanation!
@bw_merlin · 2 years ago
I have been enjoying these beginner-focused videos. One suggestion would be to explain what each argument means. A perfect example: you explained that -i was for input file, but for the other arguments you only explained what they did, not what they mean. Explaining the meaning, such as -i meaning input, helps people learn the association between the flag and its function, making it easier to remember.
@phaedrus2633 · 1 year ago
Great! I appreciate simple.
@Gabriel-of-YouTube · 2 years ago
Thank you. It would also have been useful to explain the difference between wget and curl.
@mikeonthebox · 2 years ago
_wget_ supports only http/https and ftp/ftps, while with _curl_ you can work with many other protocols.
@logyross6883 · 2 years ago
Great video as always, can you cover tar next?
@sherakhela4044 · 2 years ago
You read my mind. Thanks
@rudysal1429 · 2 years ago
Hey, I've been watching your videos lately. I saw a video you posted a year ago about migrating to AlmaLinux. I also saw your opinion on CentOS and Red Hat and not being tied to a certain distro. I was wondering if you have a video about migrating a server from one distro to another, like Ubuntu/Debian to Rocky or vice versa.
@dirkpostma2135 · 2 years ago
Thanks, I didn't know all those wget options. Perhaps this is a nice string of commands to explain some time as well: "time timeout 60 watch -n 10 ls -litra" :-)
@predragnikolic1259 · 2 years ago
Thanks!
@berkpwn · 2 years ago
wget is good for scraping sites, but I use cURL. It's preinstalled on almost every distro, works with more protocols, and is just as easy to use.
@nirabhmani9523 · 9 months ago
How are the wget and curl commands similar in certain scenarios?
@medes26 · 2 years ago
The process you used to get the URL of the WordPress download page defeated the purpose of wget as you described it at the beginning: using wget when you don't have a GUI installed on your server. Yet you had to open your browser to get the WordPress URL.
@LearnLinuxTV · 2 years ago
It doesn't defeat the purpose of wget at all. You can't download something without the URL. Can't change the impossible. You can find the URL of a resource from your phone or whatever you have. And until we learn how to predict what URLs are with some sort of ESP, we're going to have to look it up online. Unfortunately, that's just the way it is.
@0_1_2 · 2 years ago
How about using wget for fetching HTML values from web pages? For example, if you wanted to fetch the HTML value of a sports score or stock price from a website?
@lawrencedoliveiro9104 · 2 years ago
Web-scraping is better done with a toolkit such as the Python-based BeautifulSoup.
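For the quick-and-dirty version of what the question above asks, wget can be combined with standard text tools. A fragile sketch (the URL and the HTML pattern are made up for illustration; a real parser such as BeautifulSoup is the sturdier choice, as noted here, because this breaks as soon as the page's markup changes):

```shell
# Fetch a page to stdout (-q quiet, -O- write to stdout), then
# grep out the value between known tags
wget -qO- 'https://example.com/scores.html' \
  | grep -oE '<span class="score">[0-9]+</span>' \
  | grep -oE '[0-9]+'
```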
@jwspock1690 · 2 years ago
Top! Thanks for your videos. Can you cover the topic of nmap?
@judedashiell5390 · 2 years ago
The way I download a single archive from the internet shows me the download progress, and when the download completes, without those progress bars or dots on my terminal. For this example I'm going to use "archive" as the name of what to download: wget -bc archive && wc -l wget-log && grep -i saved wget-log. Once entered, every so often hit up-arrow (or ! then Enter) to check on the archive's growth on the system. Some lines will only have numbers, but a longer output line will tell you when the file has finished downloading. Once finished, just rm wget-log so the command can be used again. This is probably a good function candidate too.
@Broly_1 · 2 years ago
So funny, I was googling how to wget over a poor connection without restarting from the beginning, and you made a video 😀
@LearnLinuxTV · 2 years ago
Glad I could help!
@Joemexico · 2 months ago
Can you do one on curl?
@maheadros · 2 years ago
Dang! I just bought the 3rd edition of your book on Kindle. Is it worth buying the whole book again to get the updated info? I am studying to become a systems administrator and I don't want to miss vital info, but funds are also tight as I am doing school work to make this career transition.
@LearnLinuxTV · 2 years ago
The main change is that it's been updated to be compatible with Ubuntu 22.04. There's less change between the third edition and the fourth than there was between the second and third. I think you should be good with what you have. I hope they come out with some sort of upgrade program, though.
@HadToChangeMyName_YoutubeSucks · 2 years ago
So in continue-download mode (-c), when you canceled it was at 12% with an ETA of 2m 40s. When you continued it started at 35%, and by 44% it had an ETA of 4m 22s. I see the ETA flop around when I wget, which I do on a fairly regular basis, due to the response from their end or even lots of TCP activity on my end, but stopping at 12% and continuing at 35% was a new thing for me.
@LearnLinuxTV · 2 years ago
That's a result of editing. Sometimes, things can take so long that even speeding them up is a waste of time, so I literally cut a section of the wait in post to avoid people watching a progress bar.
@HadToChangeMyName_YoutubeSucks · 2 years ago
@@LearnLinuxTV -- Yeah, I figured it was something like that. Just noticed it and thought it was kind of funny.
@ciaopete · 2 years ago
Have you done a Gentoo install guide similar to your Arch one?
@LearnLinuxTV · 2 years ago
Not as of yet. I need to find a way to determine how many people might be interested in that, because if there's an audience fit, there's no reason not to do a video on it.
@youtubeoneverything4581 · 2 years ago
Thanks
@LearnLinuxTV · 2 years ago
You're welcome!
@drumpf4all · 2 years ago
I only download programs from the package manager and so should you.
@lawrencedoliveiro9104 · 2 years ago
WordPress is probably a good example, because it gets developed so rapidly that your distro package is likely to be out of date.
@michaelplaczek9385 · 2 years ago
What's the difference between wget and curl?
@LearnLinuxTV · 2 years ago
curl is more for pulling web pages, while wget is geared toward files. There's definitely some overlap, though.
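That split in defaults shows up directly on the command line; a small sketch (the example.com URLs are placeholders):

```shell
# Print a page to standard output:
curl https://example.com/               # curl's default behavior
wget -qO- https://example.com/          # wget needs flags for the same

# Save a remote file to disk under its remote name:
wget https://example.com/file.tar.gz    # wget's default behavior
curl -O https://example.com/file.tar.gz # curl needs -O for the same
```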
@lawrencedoliveiro9104 · 2 years ago
wget can do recursive downloads of linked web pages. I can’t see any option for that in curl.
@Froggie92 · 2 years ago
Thoughts on wget2?
@LearnLinuxTV · 2 years ago
I haven't had a chance to check that out yet.
@Froggie92 · 2 years ago
@@LearnLinuxTV It's probably brand new, I only just heard about it last week.
@Jerkwad152 · 2 years ago
Then there's wgive... *runs
@starkintux6251 · 2 years ago
like number 500 by me yo
@IgnatikVodichka · 2 years ago
Could you also please make a video on how to parse websites with wget and, let's say, get all the href links from a website? Thank you.
@runnerlinux1474 · 2 years ago
Be my Teacher... I want to be u ..
@Kroom.54 · 1 year ago
You're so, so fast 😅
@NoEgg4u · 2 years ago
I find that "youtube-dl" handles just about any download, and it's cross-platform. It will download videos from nearly any site, often without needing to right-click to get the specific link to the video. You can usually give it the link to the general page that has the video, and youtube-dl will find the video and download it.

Some sites do not seem to have their certificates in good standing, and youtube-dl will complain and not download the file. You can add the option --no-check-certificate to bypass the certificate issue and download your file(s).

"youtube-dl --help" is akin to the Linux man page and will display numerous options. youtube-dl can also download playlists from files (such as what our host demonstrated with wget), download a playlist from a site that has playlists (such as YouTube), resume partial downloads, etc. Perhaps wget has similar functionality; I never used it. I am mentioning "youtube-dl" because many viewers of this channel are probably running Windows too, and youtube-dl runs on both Windows and Linux (iOS too, I believe -- not sure).
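The options described in this comment look like this in practice (the URLs are placeholders, and youtube-dl must be installed separately):

```shell
# Download a video given the page URL (placeholder URL)
youtube-dl 'https://www.youtube.com/watch?v=XXXXXXXXXXX'

# Skip certificate validation for sites with certificate issues,
# as described above
youtube-dl --no-check-certificate 'https://example.com/video-page'

# List every option; the closest thing to a man page
youtube-dl --help
```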
@mllarson · 2 years ago
yt-dlp is a successor to youtube-dl and is more advanced. Give that a shot and see what you think.
@NoEgg4u · 2 years ago
@@mllarson I was not aware that there were forks of youtube-dl. I was concerned about youtube-dl because there has not been a new version in a while. Perhaps the developers decided to go with the forks (or figured it's not worth their time if others are making forks)? I just downloaded yt-dlp. Thanks for letting me know.
@mllarson · 2 years ago
@@NoEgg4u You're welcome. Yeah, I noticed YTDL was downloading so slowly for so long and then looked into alternatives. YT-DLP gives options for multithreaded downloads, among other things.
@NoEgg4u · 2 years ago
@@mllarson Neither youtube-dl nor yt-dlp has a solution for YouTube causing: "ERROR: Did not get any data blocks". It seems to happen on new-ish videos. A day or two later, the download works.
@mllarson · 2 years ago
@@NoEgg4u YouTube does what YouTube does :-D Trying again later does help sometimes.