20 Essential Linux Commands

This post provides examples of everyday Linux commands. I wrote this as a quick orientation to the CLI for newcomers. Keep in mind that there are usually several ways to accomplish a task, and other command combinations or programs might be better suited to your needs. Don’t be afraid to google it.


  • apropos, man
  • ls, ll
  • pwd, cd
  • mv, cp
  • rm, shred
  • head, tail
  • more, less
  • wget, curl
  • du, df
  • cat, strings
  • grep, find

Get Help With Commands By Using man

The examples on this page represent a small fraction of the possible uses for each command. Many commands have tons of flags and arguments that let them adapt to many scenarios. Since it’s impossible to remember how each program works, its arguments, etc., most come with a manual page, or man page. The man command retrieves the manual for a program and displays it on the screen. While sometimes verbose, man pages are typically your best source of initial information. This is the Manual in “RTFM”.

The man system has several different sections, each providing documentation on a specific aspect of the program:

  1. General Commands
  2. System Calls
  3. Library functions (particularly the C standard library)
  4. Special files and drivers (Typically in /dev)
  5. File Formats & Conventions
  6. Games and Screensavers
  7. Miscellaneous
  8. System Administration Commands and Daemons

Typing man <command> will generally display section 1, if it exists. Calling man on something else, for example the C function pthread_join, will display section 3 by default. To view a specific section, type man <section> <command>. Note that not all programs have manual pages, and of those that do, most don’t have a page in every section. On some systems, you can type “man <section>” and press TAB to view a list of pages available in that section.

To view the manual page for man itself, type man man.

Finding the right Linux command with apropos

The Unix Tools Philosophy favors tools that serve a specific purpose and can be chained together, which results in a seemingly endless variety of ways to accomplish the same job. The apropos command searches the manual pages for a term and returns a list of possible commands. This search isn’t full-featured and works mostly on keyword matching.

    apropos ftp
apt-ftparchive (1) - Utility to generate index files
ftp (1) - Internet file transfer program
netkit-ftp (1) - Internet file transfer program
netrc (5) - user configuration for ftp
pam_ftp (8) - PAM module for anonymous access module
pftp (1) - Internet file transfer program
sftp (1) - secure file transfer program
smbclient (1) - ftp-like client to access SMB/CIFS resources on servers

Sometimes the number of commands can be unwieldy (try “apropos user” or “apropos ip“). Since the apropos command doesn’t do well with terms like “add user” or “ftp upload”, it’s sometimes useful to filter the output with grep. You can also pipe the output to more or less.

    apropos user | grep add
adduser.conf (5) - configuration file for adduser(8) and addgroup(8) .
addgroup (8) - add a user or group to the system
adduser (8) - add a user or group to the system
pam_issue (8) - PAM module to add issue file to user prompt
useradd (8) - create a new user or update default new user information

List directories with ls

ls is the command for listing a directory. Useful flags include:

  • -a (--all), which shows dotfiles
  • -l, which provides a long listing that includes file size, permissions, and type
  • -h (--human-readable), which shows file sizes in normal units instead of bytes (1.2G instead of 1234921293)
  • -S, to sort by file size

Useful combinations:

Define the familiar ll alias, with human-readable sizes:

    alias ll="ls -alh"

List all tar files in current directory:

    ls *.tar

List tar files in long format:

    ls -l *.tar

List a directory’s contents sorted by size, largest first, with sizes in human-readable form (head keeps only the first 10 lines of output):

    ls -lhS | head -n 10
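Another combination worth knowing: -t sorts by modification time, newest first, so piping to head shows the most recently changed entries. A quick sketch:

```shell
# The 5 most recently modified entries (head -n 6 because the first
# line of "ls -l" output is a "total" summary line, not a file)
ls -lt | head -n 6
```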


Navigate the filesystem with cd and mc

Navigating around the filesystem is done with the change directory (cd) command. Instead of merely listing the contents of the /tmp directory (ls -l /tmp), you can move into the /tmp directory and list the contents of the current directory:

    cd /tmp && ls -l

To return to your home directory, simply type cd with no arguments. You can also use the shortcut ~ to refer to files relative to your home directory. The two paths below describe the same location on the filesystem, assuming they are used by the cmattoon user:

    /home/cmattoon/Documents
    ~/Documents

If you aren’t sure which directory you’re in, you can use the pwd (print working directory) command to tell you.
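Putting cd and pwd together (a small sketch; “cd -” is a shell shortcut that jumps back to the previous directory):

```shell
# Jump to /tmp, confirm where we are, then hop back to where we came from
cd /tmp
pwd      # prints /tmp
cd -     # returns to the previous directory
```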

Midnight Commander (mc) is a third-party application that some people find useful for navigating the filesystem, copying and moving files, etc. You can find more information on their site.

Move, Rename and Copy files with mv and cp

Copy a config file to a backup:

    cp config.ini config.ini.bak


The same thing, using bash brace expansion:

    cp config.ini{,.bak}

Copy the entire config directory and its contents:

    cp -r config/ config-backup
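A variation I find handy, sketched here with a hypothetical config.ini: use command substitution to stamp the backup with today’s date.

```shell
# Copy config.ini to e.g. config.ini.2024-01-31 (date +%F prints YYYY-MM-DD)
cp config.ini "config.ini.$(date +%F)"
```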

Remove files with rm and shred

The rm command removes files basically forever, so be careful: there is no “undo” command for rm. Some people choose to edit their ~/.bashrc or ~/.bash_aliases and add the following:

    alias rm="rm -i"  # Ask for confirmation before deleting files.

Linux makes the process of deleting a file forever deceptively simple:

    rm DELETEME.txt

To indiscriminately remove everything in the “/tmp” directory:

    rm -rf /tmp/*

Note: Either “rm -rf /tmp/” or “rm -rf /tmp” would delete the “/tmp” directory itself, not just its contents.

To remove a file with a space in its name, escape the space (or quote the name):

    rm My\ Document

For private information, you might consider using shred, which overwrites a file’s contents before (optionally) removing it:

    cp ~/Downloads/ImportantDocument.pdf /mnt/backup/ && shred -u ~/Downloads/ImportantDocument.pdf

The -s (--size) flag limits shredding to the given file size (e.g., “1M”, “100K”, “1G”), and -u (--remove) removes the file after it’s done shredding.
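For example, to overwrite a file three times and then delete it (a sketch; secret.txt is a placeholder name):

```shell
# -n 3: overwrite the contents 3 times; -u: truncate and remove the file after
shred -n 3 -u secret.txt
```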

Get first and last n lines from file with head and tail

The head and tail commands retrieve the first and last n lines from a file or stdin. Pipe to either of these to pass output to other commands:

Show the first 10 lines (head’s default) of README.md:

    head README.md

Continuously output (--follow) new lines appended to /var/log/apache/error.log:

    tail -f /var/log/apache/error.log

Store the last line of the log in $LAST_ENTRY:

    LAST_ENTRY=$(tail -n 1 /var/log/app/actions.log)
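Chaining head and tail extracts an arbitrary line range; a small sketch, assuming a file named notes.txt:

```shell
# Print lines 5 through 8: head keeps the first 8 lines,
# tail keeps the last 4 of those
head -n 8 notes.txt | tail -n 4
```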

Read large files one screen at a time with more and less

More and less are two programs that filter text output. They’re commonly used to page through large files, but can also be used to buffer output from other programs. It’s a helpful habit to read config files with more or less rather than opening them in a text editor. In both programs, you can find help by pressing h and exit by pressing the q key.

    cat README.md | less

Page through the output of the install process, including anything written to stderr (2>&1 redirects stderr to stdout):

    ./install.sh 2>&1 | less

Although both programs are very similar, less is the newer of the two, with more features. Specifically, less allows forward and backward navigation (via the arrow keys and PgUp/PgDn) and doesn’t have to read the entire file into memory. This makes it more efficient on large files than its predecessor, more.

Use the slash key (/) to begin a search. Once a search is active, the “n” key moves to the next result and “N” to the previous one.

Download files with curl and wget

Since both curl and wget support HTTP/HTTPS and FTP, they are especially useful for interacting with web-based services like APIs and HTML forms. Both programs use the HTTP GET method by default, but are capable of others as well (POST, HEAD, PUT, etc.), and both support SSL. cURL supports even more protocols, including Telnet, SCP, SFTP, POP3, IMAP, SMTP, and LDAP, along with a number of other features.

Generally speaking, I prefer wget for downloading files and cURL for interacting with APIs and web forms.

Note: Ubuntu comes with wget but not curl; CentOS and OS X are the opposite. You’ll probably need to install one or the other.

To download a file with wget:

    wget <URL>

If the URL contains special characters, or is pointing to a script, it’s sometimes better to wrap the URL in quotes and use the -O flag to specify an output file.

    wget "http://example.com/get_image.php?id=1234&size=130" -O image.png

Without the -O flag, wget would save the file as “get_image.php?id=1234&size=130” – which is unlikely to work as an image in any capacity.

While wget saves the file to the current directory (or the path specified by -O), curl’s default action is to write the output to stdout. To echo your current public IP address, you can run:

    curl icanhazip.com

To download a file with curl, you’ll need to redirect stdout to a file:   

    curl "http://example.com/get_image.php?id=1234&size=130" > image.png

Curl and wget both have “quiet” or “silent” modes that suppress output. This mode is particularly useful for scripts and cron jobs where you don’t want extra output cluttering the screen.

    curl -s "http://example.com/installer-x86_64-0.1.0a-rc1.tgz" > installer.tgz
    wget -q "http://example.com/installer-x86_64-0.1.0a-rc2.tgz" -O installer.tgz

If you still want to see error output, but no progress bar, you can use -sS in curl. The lowercase -s is for silent mode, the uppercase -S for “show errors”.

For more details, type “man wget” or “man curl”.

Check disk usage with du and df

To check the amount of disk space available, use the df command (think “disk free space”). The du command will show you the amount of disk space used in the specified directory. Like the ls command, both df and du can output the human-readable filesize by using the -h flag.

Running df -h shows the disk space for all mounted filesystems by default.


To see how much space the current directory is taking up, use du -sh. The -s flag means “summary” and prints the total usage of all subdirectories. Without the -s flag, du will generate a report for each subdirectory, which can be useful for finding the largest directories. The following command finds the 10 largest subdirectories of the current directory: by piping the output of du into sort (-h sorts by human-readable file size, -r reverses the order), we can sort them from largest to smallest. That output is then piped into head to retrieve the top 10 only.

    du -h . | sort -rh | head
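A similar pipeline finds the largest individual files rather than directories; a sketch that uses find to feed each regular file to du:

```shell
# Run du -h on every regular file, sort largest first, keep the top 10
find . -type f -exec du -h {} + | sort -rh | head -n 10
```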

Of course, you could pipe this output to more or less and peruse the entire list of directories, but there’s already a better tool for this: ncdu. (The “nc” alludes to the ncurses library used to render the user interface.) ncdu provides an interactive way of tracking down large files.



Get file contents with cat and strings

The cat and strings commands are used to write file contents to stdout. The cat command will dump the raw file contents, whatever they are, while strings will print only printable character sequences. This makes the strings command a useful choice for identifying a file format or other initial discovery tasks.


Print raw binary data of /bin/true to stdout:

    cat /bin/true

See all human-readable strings in the “true” binary:

    strings /bin/true

Zero a file with cat:

    cat /dev/null > README.md
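And the use that gives cat its name (concatenate), sketched with placeholder file names:

```shell
# Join two files into a third
cat part1.txt part2.txt > combined.txt
```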

Find what you’re looking for with grep and find

Grep (globally search for a regular expression and print) is useful for finding strings in files (or in stdout). The find utility is used for searching by file name, size, and other attributes.

To find all PHP files with the string “@todo” (case insensitive) in the src/ directory:

    grep -i "@todo" src/*.php

Recursively search the src/ directory for files containing the string “@todo” (case-insensitive):

    grep -ri "@todo" src/

This uses the -r (--recursive) and -i (--ignore-case) flags. As you may suspect, the --recursive flag searches the directory recursively, while the --ignore-case flag ignores the difference between uppercase and lowercase characters.
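Two more flags worth knowing, sketched against a hypothetical src/main.php: -n prefixes each match with its line number, and -c prints only a count of matching lines.

```shell
# Show matches with line numbers, then just count them
grep -n "@todo" src/main.php
grep -c "@todo" src/main.php
```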

Grep is also useful to filter output from commands or stdout:

    cat /var/log/apache2/error.log | grep -i "fatal error"

Watch the error log for lines containing a particular IP address (203.0.113.5 below is a placeholder; substitute the address you care about):

    tail -f /var/log/apache2/error.log | grep "203.0.113.5"

If a match is part of multi-line output in the log, grep will print only the matching line. If you want to see the lines on either side of the target line, use the -A (--after-context) and -B (--before-context) flags. For example, consider grepme.txt, a file containing “This is Line #n” for n from 0 to 30. Both commands produce the same output:

    grep 20 grepme.txt -A 5 -B 3
    cat grepme.txt | grep "20" -A 5 -B 3
    This is Line #17
    This is Line #18
    This is Line #19
    This is Line #20
    This is Line #21
    This is Line #22
    This is Line #23
    This is Line #24
    This is Line #25

Other useful flags include -v, which inverts the match, and -l/-L, which show the names of files that do (-l) or don’t (-L) contain a match, instead of the matching lines themselves.

Show all lines in access log that don’t include “GoogleBot”:

    tail -f /var/log/apache2/access.log | grep -v GoogleBot

Show the names of files in the current directory (and subdirectories) that don’t have “@license” in them:

    grep -riL "@license" .

Show the names of files that have “@todo” in them:

    grep -ril "@todo" .

Show all lines with “@todo” in the current directory (recursive). Exclude the “img” and “templates” directories from the search.

    grep -ri "@todo" . --exclude-dir="templates" --exclude-dir="img"

The find command is useful for finding files based on filename, size, type, or other attributes. In its simplest form, the find command searches for a filename:

    find ~/Downloads -name '*.tgz'

The above command searches the ~/Downloads directory for files matching the pattern ‘*.tgz’. Since no -type is specified, it’ll search for files or directories.

Let’s look for files (only) in ~/Downloads that are over 100 MB in size:

    find ~/Downloads/ -type f -size +100M

To find files smaller than 100 MB:

    find ~/Downloads/ -type f -size -100M

To search /var/log for files older than 30 days and delete them:

    find /var/log -type f -mtime +30 -exec rm -f {} \;

You can also use the built-in -delete flag:

    find /var/log -type f -mtime +30 -delete
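find also combines well with other tools via xargs; a sketch that greps every .log file under the current directory for “ERROR” (-print0 and -0 keep filenames with spaces intact):

```shell
# List the .log files that contain the string "ERROR"
find . -name '*.log' -print0 | xargs -0 grep -l "ERROR"
```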

