Linux Overview

Command Line Mastery


Essential Commands and Search Operations

Basic File Operations: Learn to manage files efficiently using basic commands.

# Copying files
cp source.txt destination.txt # Copies 'source.txt' to 'destination.txt'
cp -r source_directory/ destination_directory/ # Recursively copies a directory

# Moving/Renaming files
mv oldfile.txt newfile.txt # Renames or moves 'oldfile.txt' to 'newfile.txt'
mv files/* /backup/ # Moves all files from 'files' directory to '/backup/'

# Deleting files
rm unwanted.txt # Removes 'unwanted.txt' permanently
rm -r old_directory/ # Recursively deletes 'old_directory' and its contents

# Creating directories
mkdir new_directory # Creates a new directory named 'new_directory'
mkdir -p path/to/new_directory # Creates all intermediate directories as needed

# Listing files
ls -lah # Lists all files in a detailed, human-readable format
ls -ld */ # Lists only directories within the current directory

Advanced Search Tools: Utilize powerful commands to locate files and directories based on specific criteria.

# Find files larger than 100MB
find / -type f -size +100M # Finds files larger than 100MB throughout the filesystem

# Find files modified in the last 24 hours
find / -type f -mtime -1 # Locates files modified within the last day across the entire filesystem

# Find files by name ignoring case
find / -type f -iname "*pattern*" # Searches for files with 'pattern' in their names, case-insensitively

# Find directories named 'config' anywhere in the filesystem
find / -type d -name "config" # Locates all directories named 'config'

# Find empty files and directories
find / -type d -empty -o -type f -empty # Finds all empty directories and files

# Find and remove files larger than 500MB
find / -type f -size +500M -exec rm -i {} \; # Interactively deletes files over 500MB

# Rapid filesystem searches with updated database
locate pattern # Searches for files matching 'pattern' using a database updated by 'updatedb'

# Search for specific text within files
grep 'pattern' filename # Finds occurrences of 'pattern' within 'filename'
grep -ri 'pattern' /path/to/directory # Recursively and case-insensitively searches for 'pattern' in all files under the specified directory

# Search for 'pattern' in .log files, excluding lines starting with '#'
grep -v '^#' /path/to/*.log | grep 'pattern' # Filters out comments, then searches for 'pattern'

# Search files using regex and print lines before and after the match
grep -B 1 -A 1 'pattern' /path/to/file.log # Shows the line with 'pattern' and one line before and after each match

Combining Commands with Pipes and Redirects: Chain commands together to perform complex tasks more efficiently.

# Display files containing 'config' in their name
ls -la | grep 'config' # Uses 'grep' to filter the output of 'ls' to show only files containing 'config'

# Use the results of 'find' to search for 'error' in log files
find /var/log -name '*.log' | xargs grep 'error' # Searches for 'error' in all log files located in '/var/log'

# Count the number of files in a directory
ls -1 /path/to/directory | wc -l # 'wc -l' counts the lines output by 'ls', i.e. the number of entries in the directory

# Find and delete files older than 30 days
find /path/to/directory -type f -mtime +30 -exec rm {} \; # Finds files older than 30 days and deletes them using 'rm'
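
# Handle filenames containing spaces safely with null-delimited output
find /var/log -name '*.log' -print0 | xargs -0 grep 'error' # '-print0' pairs with 'xargs -0' to separate names with NUL bytes instead of whitespace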

Text Processing and Transformation

Using sed and awk for text manipulation: Powerful tools for editing and processing text streams and files.

# Replace 'old' with 'new' in 'filename'
sed 's/old/new/g' filename # Uses global replacement to change all occurrences of 'old' to 'new'

# Delete lines containing 'pattern' in 'filename'
sed '/pattern/d' filename # Deletes all lines that match 'pattern'

# Print lines 5 to 10 from 'filename'
sed -n '5,10p' filename # Uses '-n' to suppress automatic printing and 'p' to print specific lines

# Append 'text' after lines containing 'pattern'
sed '/pattern/a text' filename # Appends 'text' after lines that contain 'pattern'

# Substitute 'foo' with 'bar' only on lines containing 'baz'
sed '/baz/s/foo/bar/' filename # Performs substitution only on lines that include 'baz'

# Print lines matching 'pattern' in 'filename'
awk '/pattern/ {print $0}' filename # Prints whole lines ($0) that match 'pattern'

# Sum the values in the first column
awk '{sum += $1} END {print sum}' filename # Adds up all values in the first column ($1)

# Print lines where the second column equals 'value'
awk '$2 == "value" {print $0}' filename # Checks if the second column ($2) is 'value' and prints the line

# Replace spaces with commas in 'filename'
awk '{gsub(" ", ",", $0); print}' filename # Uses gsub function to globally replace spaces with commas

# Sort lines by the numerical value of the second column
awk '{print $2, $0}' filename | sort -n | cut -d' ' -f2- # Prepends the second column as a sort key, sorts numerically, then strips the key

Advanced grep techniques: Utilize grep to search text with complex patterns efficiently.

# Search recursively for 'pattern' in directory
grep -r 'pattern' /path # '-r' allows recursive search through all directories and subdirectories

# Exclude log files while searching recursively for 'pattern'
grep --exclude='*.log' -r 'pattern' /path # '--exclude' prevents grep from searching files ending in .log

# Count the number of occurrences of 'pattern'
grep -c 'pattern' filename # '-c' counts the occurrences of 'pattern' in 'filename'

# Display line numbers of 'pattern' in 'filename'
grep -n 'pattern' filename # '-n' shows line numbers along with the matching lines

# Highlight matches for 'pattern' in 'filename'
grep --color=always 'pattern' filename # '--color=always' highlights all occurrences of 'pattern'

# Search for lines that do not contain 'pattern'
grep -v 'pattern' filename # '-v' inverts the search, showing lines without 'pattern'

# Find patterns that are whole words only
grep -w 'pattern' filename # '-w' matches only whole words, ensuring 'pattern' is not a substring

# Display lines before and after 'pattern' matches
grep -B 1 -A 1 'pattern' filename # '-B 1' and '-A 1' show one line before and one line after each match

Command Efficiency and Customization

Optimizing with Shell Aliases and Functions: Streamline your command line operations by creating aliases and functions for frequent tasks.

# Create aliases for commonly used commands
alias ll='ls -la' # 'll' now lists all files with detailed information
alias rm='rm -i' # Prompts for confirmation before removing files

# Alias to quickly navigate to frequently accessed directories
alias docs='cd ~/Documents'

# Write functions to perform complex tasks
backup() {
  tar -czf "$1.tar.gz" "$1" # Compresses the directory
  echo "Directory $1 backed up as $1.tar.gz"
}

# Function to update and clean system with a single command
sys_update() {
  sudo apt update && sudo apt upgrade -y
  sudo apt autoremove -y
  echo "System updated and cleaned."
}

# Add these aliases and functions to your .bashrc for permanent availability
# Edit .bashrc with `nano ~/.bashrc` and paste the alias or function at the end
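
# For example, append an alias and reload the shell configuration:
echo "alias ll='ls -la'" >> ~/.bashrc
source ~/.bashrc # Reloads .bashrc so the new alias is available immediately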

Leveraging Command History: Enhance your efficiency by effectively utilizing the command history.

# Repeat the last command executed
!! # Executes the last command entered in the terminal

# Search command history for a specific command
history | grep 'install' # Finds all past instances of 'install'

# Search history using reverse-i-search
# Press Ctrl+r, then type part of a command to interactively search and reuse it from history

# Persist command history across sessions
shopt -s histappend # Append rather than overwrite the history on disk
history -a # Writes the current session's history to the disk (.bash_history)

# Restrict access to .bash_history
chmod 600 ~/.bash_history # Sets read and write permissions for the owner only

# Clear command history
history -c # Clears the terminal history

# Exclude certain commands from history
HISTIGNORE='ls:cd:cd -:pwd:exit:date:* --help' # Ignore basic commands and any command with '--help'
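
# Record a timestamp with each history entry
HISTTIMEFORMAT='%F %T ' # 'history' will now show the date and time of each command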

System Operations

File Transfers and Sharing

SCP (Secure Copy Protocol): Enables secure file transfers between hosts with SSH. Ideal for secure, encrypted transfers across networks.

# Basic transfer from local to remote
scp localfile.txt user@remote:/path/ # Copies 'localfile.txt' to the remote server at the specified path

# Transfer from remote to local
scp user@remote:/path/remotefile.txt /local/path # Downloads 'remotefile.txt' from a remote server to a local directory

# Transfer entire directory recursively from local to remote
scp -r localdir/ user@remote:/path/ # Recursively copies the entire 'localdir' directory to remote location
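
# Copy a file when the remote SSH server listens on a non-default port (2222 here is just an example)
scp -P 2222 localfile.txt user@remote:/path/ # Uppercase '-P' specifies the remote port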

SMB (Server Message Block): Facilitates file sharing across operating systems, particularly between Windows and Linux. It is useful for creating network shares that Windows and Linux clients can access without additional client-side configuration.

# Access Windows share from Linux
sudo mount -t cifs -o username=user_name,password=password //Windows_Server/share /mnt/windows_share # Mounts a Windows share on a Linux system

# Share Linux directory with Windows using Samba
# Step 1: Install Samba
sudo apt install samba

# Step 2: Configure the Samba share
# Open the Samba configuration file in a text editor, such as vim
sudo vim /etc/samba/smb.conf

# Step 3: Add the following entry at the end of the smb.conf file
# This configures a share that Windows users can access
[ShareName]
path = /path/to/share
available = yes
valid users = user_name
read only = no
browseable = yes
public = yes
writable = yes
# Step 4: Restart the Samba service to apply changes
sudo systemctl restart smbd

# Step 5: Set a Samba password for the user
sudo smbpasswd -a user_name

# Ensure the directory permissions allow the appropriate access
sudo chmod -R 775 /path/to/share
sudo chown -R user_name:group_name /path/to/share
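
# Verify the Samba configuration for syntax errors
testparm # Parses smb.conf and reports any problems before clients connect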

Data Management and Compression

Compression Tools: Simplify the process of saving space and organizing files through effective compression and extraction techniques.

# Zip multiple files into a single archive
zip compressed.zip file1.txt file2.txt # Compresses 'file1.txt' and 'file2.txt' into 'compressed.zip'

# Zip an entire directory recursively
zip -r archive.zip /path/to/directory # Recursively compresses the directory and its contents into 'archive.zip'

# Unzip files into the current directory
unzip compressed.zip # Extracts all files from 'compressed.zip' into the current directory

# List contents of a zip archive without extracting
unzip -l compressed.zip # Lists the contents of 'compressed.zip'

# Create a tar.gz archive of a directory
tar -czf archive.tar.gz folder/ # Compresses 'folder/' into 'archive.tar.gz' using gzip

# Extract a tar.gz archive
tar -xzf archive.tar.gz # Extracts 'archive.tar.gz' into the current directory

# View contents of a tar.gz archive without extracting
tar -tzf archive.tar.gz # Lists the contents of 'archive.tar.gz'

Mounts: Manage how storage devices are accessed and used within the filesystem.

# Mount a filesystem
mount /dev/sdx1 /mnt/data # Attaches the filesystem found on device '/dev/sdx1' to the directory '/mnt/data'

# Mount a USB drive with specific filesystem type
mount -t vfat /dev/sdx1 /mnt/usb # Mounts a USB drive formatted with FAT32 to '/mnt/usb'

# Unmount a device safely
umount /mnt/data # Detaches the filesystem from '/mnt/data'

# Unmount a busy device
umount -l /mnt/data # Performs a lazy unmount, detaching the filesystem as soon as it is no longer busy
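
# Make a mount persistent across reboots with an /etc/fstab entry
# (illustrative entry only; find the real UUID and filesystem type with 'lsblk -f')
# UUID=xxxx-xxxx  /mnt/data  ext4  defaults  0  2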

File Operations and System Monitoring: Enhance system performance and management through efficient file handling and monitoring.

# Copy files with verification
cp -v source.txt destination.txt # Copies 'source.txt' to 'destination.txt' and displays what is being copied

# Move and rename files
mv -v old_location.txt new_location.txt # Moves/Renames 'old_location.txt' to 'new_location.txt', showing the process

# Remove a directory and its contents
rm -r directory_name # Recursively removes 'directory_name' and all its contents

# Check disk space usage
df -h # Displays free disk space on all mounted filesystems in human-readable form

# Summarize disk usage of the directory
du -sh /path/to/directory # Shows the total space used by the specified directory in human-readable form

# Display block devices connected to the system
lsblk # Lists all block devices connected to the system along with their mount points

Managing Users and File Permissions

User Management: Essential commands for creating, modifying, and deleting user accounts, and managing group memberships effectively.

# Create a new user with a home directory and a specific shell
useradd -m -s /bin/bash newuser # '-m' creates a home directory, '-s' specifies the shell
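
# Set an initial password for the new account
passwd newuser # Prompts for and sets the password for 'newuser'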

# Add a user to the sudo group to grant administrative privileges
usermod -aG sudo newuser # '-aG' appends the user to the group without removing them from other groups

# Delete a user and their home directory
userdel -r olduser # '-r' removes the user along with their home directory

# Create a new group
groupadd newgroup # Adds a new group to the system

# Modify a user's primary group
usermod -g newgroup username # '-g' sets the primary group for the user

# Add a user to multiple groups
usermod -aG group1,group2 username # Appends the user to both 'group1' and 'group2'

File Permissions and Security: Detailed commands to manage access rights for files and directories.

# Change file permissions
chmod 755 filename # Sets read, write, and execute permissions for the owner, and read and execute for group and others

# Change ownership of a file
chown user:group filename # Changes both the owner and group of 'filename'

# Set permissions recursively for a directory and its contents
chmod -R 755 /path/to/directory # '-R' applies the permissions recursively to all files and subdirectories

# Set the SetUID permission on a file
chmod u+s /path/to/file # Enables users to execute the file with the permissions of its owner

# Set the SetGID on a directory
chmod g+s /path/to/directory # Files created in the directory inherit the group ID of the directory, not of the user who created the file

# Set sticky bit on a directory to restrict file deletion
chmod +t /path/to/directory # Only the file's owner, the directory's owner, or root can delete files within the directory

# Change the owner and group recursively for a directory
chown -R user:group /path/to/directory # '-R' changes owner and group recursively for all files within the directory

Analyzing Host Logs

Important Log Files: Essential files for monitoring system and application activities:

    `/var/log/syslog`: General system activity log.
    `/var/log/auth.log`: Authentication and authorization logs.
    `/var/log/kern.log`: Kernel logs, useful for troubleshooting and understanding system messages related to the kernel.
    `/var/log/dmesg`: Boot and system messages, including hardware and driver messages.
    `/var/log/faillog`: User login failures, useful for security auditing.
    `/var/log/boot.log`: System boot logs.
    `/var/log/mail.log`: Mail server logs.
    `/var/log/apache2/access.log`: Web server access log (specific to Apache).
    `/var/log/apache2/error.log`: Web server error log (specific to Apache).

Commands for Log Analysis:

grep 'error' /var/log/syslog # Find all error messages in the system activity log
grep -i 'fail' /var/log/auth.log # Find failed login attempts, case-insensitive search
grep 'Failed password' /var/log/auth.log # Search for failed SSH login attempts
grep 'COMMAND=' /var/log/auth.log # Check for sudo command usage
grep 'segfault' /var/log/kern.log # Find segmentation faults in kernel messages
grep '192.168.1.1' /var/log/apache2/access.log # Search for specific IP address access in web logs
grep -i 'error' /var/log/*.log # Find all occurrences of "error" in all logs, ignoring case
grep 'denied' /var/log/audit/audit.log # List denied services from SELinux
grep 'reboot' /var/log/syslog # Check for reboot records in the system activity log
grep -i 'out of memory' /var/log/kern.log # Find out of memory issues in kernel logs
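
# On systemd-based distributions, journalctl complements these log files (unit names such as 'ssh' vary by distro)
journalctl -u ssh --since today # Shows today's entries for the SSH service
journalctl -p err -b # Shows messages of priority 'err' and higher from the current boot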

Security Log Monitoring Script:

An example script to scan various critical log files for keywords related to security and output relevant lines to the console.

#!/bin/bash
echo "Checking security-related log entries..."
keywords="fail|error|denied|segfault|unauthorized|exception"
log_files="/var/log/auth.log /var/log/syslog /var/log/kern.log /var/log/apache2/error.log"
for file in $log_files; do
    echo "Scanning $file for security keywords..."
    grep -E "$keywords" "$file" | while read -r line ; do
        echo "Security alert in $file: $line"
    done
done

Performance and Resource Monitoring

Real-time monitoring: Utilize top for a dynamic view of running processes (top provides a continuous update of process statistics). htop offers an enhanced interactive interface (press F6 to sort processes by CPU, memory, etc.).

Network and Disk Statistics: dstat allows for monitoring all system resources with clear output, combining features of vmstat, iostat, and ifstat. Example: dstat -cdlmnpsy (-c for CPU, -d for disk, -l for load, -m for memory, -n for network, -p for processes, -s for swap, -y for system).
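
A quick sketch of the dstat invocation described above (assuming the dstat package is installed):

dstat -cdlmnpsy 5 12 # Reports the selected statistics every 5 seconds for 12 samples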

Common Examples:

# Disk usage checking with 'df'
df -h # Display disk space usage in human-readable format

# Directory size with 'du'
du -sh /path/to/directory # Summarize disk usage of the directory, human-readable

# Top command for process monitoring
top -o %MEM # Sort processes based on memory usage

# Memory and CPU usage with 'htop'
htop # Interactive process viewer, better visualization than 'top'

# Monitoring disk I/O with 'iostat'
iostat -mx 5 # Display extended disk I/O statistics every 5 seconds

# Network interface statistics with 'ifstat'
ifstat 5 5 # Report network interface status every 5 seconds for 5 intervals

# Viewing block device information with 'lsblk'
lsblk -f # Display filesystems in addition to block device attributes

# Checking active connections and listening ports with 'ss'
ss -tulwn # Displays listening TCP, UDP, and raw sockets with numeric addresses and ports

# Using 'vmstat' for system performance
vmstat 1 10 # Report virtual memory statistics every 1 second for 10 intervals

Custom Scripts for Monitoring: Script to check disk usage and send an email alert if usage exceeds a threshold.

#!/bin/bash
# Disk usage monitoring with a custom email alert
usage=$(df / | grep / | awk '{ print $5 }' | sed 's/%//g')
if [ $usage -gt 90 ]; then
    echo "Disk usage is above 90%, current usage is $usage%" | mail -s "Disk Usage Alert" admin@example.com
fi


Networking


Network Testing and Configuration

Network Utilities: Essential tools for setting up and diagnosing network interfaces, connections, and performance.

# Configure network interfaces with ifconfig
ifconfig eth0 up # Activates the 'eth0' network interface
ifconfig eth0 down # Deactivates the 'eth0' network interface
ifconfig eth0 192.168.1.100 netmask 255.255.255.0 # Sets IP address and netmask for 'eth0'

# Manage wireless networks with iwconfig
iwconfig wlan0 essid "ExampleNetwork" key s:password # Sets the SSID and WEP key for 'wlan0'

# Check connectivity to a host with ping
ping -c 4 google.com # Sends 4 ICMP packets to 'google.com' to check connectivity

# Trace the path packets take to a host with traceroute
traceroute google.com # Shows the route packets take to reach 'google.com'
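
# On modern distributions the 'ip' command (iproute2) replaces ifconfig; rough equivalents of the examples above:
ip link set eth0 up # Activates the 'eth0' interface
ip addr add 192.168.1.100/24 dev eth0 # Assigns an IP address and netmask to 'eth0'
ip route show # Displays the routing table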

Advanced Network Tools: Advanced commands for in-depth network analysis and troubleshooting.

# Scan network devices, ports, and services with nmap
nmap -v -A scanme.nmap.org # Performs a verbose scan with OS and service detection

# Connect to remote systems with telnet to test connectivity
telnet example.com 80 # Connects to 'example.com' on port 80 to check web server connectivity

# `curl` for testing HTTP/HTTPS endpoints.
curl http://example.com # Sends a GET request to example.com and displays the response
curl -O http://example.com/file.zip # Downloads the file from the specified URL and saves it with the same filename as on the server
curl -d "param1=value1&param2=value2" -X POST http://example.com/submit # Sends a POST request with data to the server

# Use fping to check the reachability of multiple hosts
fping -a -g 192.168.1.0/24 # Lists all alive hosts in the '192.168.1.0/24' subnet

# DNS lookup and network diagnostics with dig
dig google.com # Queries DNS information for 'google.com'
dig +trace +all google.com # Traces the path of the DNS query for 'google.com'

# Use nslookup for DNS querying
nslookup google.com # Queries DNS for records associated with 'google.com'
nslookup -type=mx google.com # Queries for mail exchange records for 'google.com'

# Capture and analyze network packets with tcpdump
tcpdump -i eth0 -w capture_file.pcap # Captures packets on 'eth0' and writes them to 'capture_file.pcap'
tcpdump -r capture_file.pcap # Reads and displays packets from a pcap file

Secure Remote Operations

SSH: Essential for secure remote operations, ensuring encrypted communication between machines.

# Basic SSH connection
ssh user@host # Logs into 'host' as 'user', establishing a secure shell session

# Generate SSH keys to use for authentication
ssh-keygen -t rsa -b 4096 # Generates a 4096-bit RSA key pair

# Copy SSH key to a remote host
ssh-copy-id user@host # Copies your public key to 'user@host' to enable key-based authentication

# Secure SSH configuration tips
# Edit the SSH config file to disable root login and enforce key-based authentication
sudo vim /etc/ssh/sshd_config
# Add or modify the following lines:
PermitRootLogin no
PasswordAuthentication no
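
# Apply the changes by restarting the SSH daemon (the service is named 'ssh' on Debian/Ubuntu, 'sshd' on RHEL-based systems)
sudo systemctl restart sshd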

SCP and SFTP: Utilize SSH for secure file transfers, protecting data integrity and privacy during transit.

# Securely copy a file to a remote server with SCP
scp localfile.txt user@host:/remote/directory/ # Copies 'localfile.txt' to the remote directory

# Start an SFTP session to interactively transfer files
sftp user@host # Initiates a secure FTP session with 'host', allowing file management commands

# Securely download a directory from a remote server with SCP
scp -r user@host:/remote/directory/ local_directory # Recursively copies a remote directory to a local directory
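
# Run SFTP non-interactively from a batch file of commands
# (a hypothetical 'commands.txt' might contain lines like 'put localfile.txt /remote/directory/')
sftp -b commands.txt user@host # '-b' runs the commands in the file and then exits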

Firewall Configuration and Security Practices

iptables and Firewalld: Master techniques for securing network traffic using robust firewall solutions.

# Using iptables for basic firewall rules
iptables -A INPUT -p tcp --dport 22 -j ACCEPT # Allows incoming SSH traffic on port 22
iptables -A OUTPUT -p tcp --dport 80 -j ACCEPT # Allows outgoing HTTP traffic on port 80
iptables -A INPUT -s 192.168.1.0/24 -j DROP # Blocks all incoming traffic from the 192.168.1.0/24 subnet

# Save iptables rules
iptables-save > /etc/iptables/rules.v4 # Saves the current IPv4 rules to a file

# Using Firewalld to manage firewall rules with zones
firewall-cmd --zone=public --add-port=80/tcp --permanent # Permanently opens HTTP port 80 in the public zone
firewall-cmd --zone=public --add-service=https --permanent # Permanently enables HTTPS service in the public zone
firewall-cmd --reload # Reloads Firewalld to apply changes

# List all rules in a specific zone
firewall-cmd --zone=public --list-all # Displays all settings for the public zone

Security Enhancements: Implementing robust security measures to fortify Linux systems.

# Regular system updates to patch vulnerabilities
sudo apt update && sudo apt upgrade # Updates and upgrades all packages on Debian-based systems

# Enforcing SELinux policies to enhance security
setenforce 1 # Puts SELinux in enforcing mode, making the security policies active
semanage port -a -t http_port_t -p tcp 8080 # Assigns TCP port 8080 a specific SELinux type for HTTP services

# Check SELinux status
sestatus # Displays the current status of SELinux, including current mode and policy version

# Using fail2ban to protect against brute force attacks
sudo apt-get install fail2ban # Installs fail2ban on Debian-based systems
sudo systemctl start fail2ban # Starts the fail2ban service
sudo fail2ban-client status # Checks the status of fail2ban to see what services are being protected
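
# Inspect a specific jail (jail names such as 'sshd' depend on your fail2ban configuration)
sudo fail2ban-client status sshd # Shows banned IPs and failure counts for the 'sshd' jail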

Scripting and Regex


Script Fundamentals & Optimization

Writing Efficient Scripts and Handling Errors: Improve script performance, manage system resources, and enhance script robustness.

# Writing Efficient Scripts
# Use shell built-ins and minimize complexity
[[ $1 -gt 100 ]] && echo "Value is greater than 100" # Efficient condition check using built-in

# Run tasks in the background to free up the terminal
nohup ./long_running_script.sh & # Runs script in the background with no hangup

# Error Handling in Scripts
# Set up traps to catch and handle errors gracefully
trap 'echo "An error occurred." >&2; exit 1' ERR # Sets a trap for errors

# Provide clear custom error messages
cp file1 file2 || echo "Failed to copy file1 to file2" >&2 # Custom error message if cp fails

# Basics of Bash Scripting
#!/bin/bash
# Script to backup logs
tar -czf /backup/logs-$(date +%F).tar.gz /var/log # Backs up logs with a timestamp

# Error checking after operations
cp /source /destination
if [ $? -eq 0 ]; then
    echo "Copy successful"
else
    echo "Copy failed" >&2
fi
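
# Fail fast: many scripts also enable strict mode near the top
# (a common hardening pattern; adjust to the script's needs)
set -euo pipefail # Exit on any error, treat unset variables as errors, and propagate failures through pipelines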

# Automating with Cron
# Cron job to check disk usage daily at midnight
0 0 * * * /home/user/checkDiskUsage.sh
#!/bin/bash
# Script to check disk usage and send email alerts
usage=$(df / | grep / | awk '{ print $5 }' | sed 's/%//g')
if [ $usage -gt 90 ]; then
    mail -s "Disk usage alert: $usage%" user@example.com <<< "Your root partition remaining free space is critically low."
fi

# Regex in Scripting
# Use regular expressions to validate and manipulate data
#!/bin/bash
# Script to find and replace phone numbers in a text file
sed -r 's/\(([0-9]{3})\) ([0-9]{3})-([0-9]{4})/+1-\1-\2-\3/g' contacts.txt # Rewrites numbers like (555) 123-4567 as +1-555-123-4567

Mastery of Regular Expressions

Understanding and Applying Regex: Regular expressions (regex) are a powerful tool for pattern matching and text manipulation in various Linux tools.

Basic regex usage:

# Character classes
grep '[A-Z]' file.txt # Matches any uppercase letter
grep '[0-9]' file.txt # Matches any digit

# Anchors
grep '^A' file.txt # Finds lines starting with 'A'
grep 'end$' file.txt # Finds lines ending with 'end'

# Quantifiers
grep -E 'o+' file.txt # Matches one or more 'o's in a line ('-E' enables the '+' quantifier)
grep '0*1' file.txt # Matches any number of '0's followed by '1'

# Exact number of characters
grep -E '^[0-9]{3}' file.txt # Finds lines starting with exactly three digits ('-E' enables '{n}' counts)
grep -E '\b[0-9]{4}\b' file.txt # Finds four-digit numbers as whole words

# Optional characters
grep -E 'colou?r' file.txt # Matches both 'color' and 'colour'

# Range of characters
grep '[0-9A-Fa-f]' file.txt # Matches any hexadecimal digit
grep -E '[1-9][0-9]{2,4}' file.txt # Matches numbers between 100 and 99999

Complex patterns:

# Groups and Ranges
grep -E 'gr(a|e)y' file.txt # Matches 'gray' or 'grey'
sed -E 's/([0-9]{4})-([0-9]{2})-([0-9]{2})/\3-\2-\1/' file.txt # Changes date format from YYYY-MM-DD to DD-MM-YYYY

# Special Characters
grep 'Co\.ltd' file.txt # Finds 'Co.ltd' (dot is treated literally)
grep -E '\$[0-9]+' file.txt # Matches dollar amounts like $100, $20

# Backreferences
grep '\([a-z]\)\1' file.txt # Matches doubled letters like 'ee' in 'see'
sed -n '/\(.*\)\1/p' file.txt # Prints lines where any sequence of characters is repeated

# Non-capturing groups
grep -oP '(?:[0-9]{1,3}\.){3}[0-9]{1,3}' file.txt # Matches IP addresses without capturing the groups

# Lookahead and lookbehind assertions
grep -P '(?<=\bUSD)\d+' file.txt # Matches amounts that follow 'USD'
grep -P '\d+(?=\s*USD)' file.txt # Matches numbers that are immediately followed by 'USD'

# Complex nested patterns
grep -P 'foo(bar|baz)(qux|quux)' file.txt # Matches 'foobarqux', 'foobarquux', 'foobazqux', 'foobazquux'

Regex in Scripting:

# Validating email input
read email
if [[ $email =~ ^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,4}$ ]]; then
  echo "Valid email"
else
  echo "Invalid email"
fi

# Extracting IP addresses from logs
grep -oP '(\d{1,3}\.){3}\d{1,3}' server.log # Extracts IP addresses using Perl-compatible regex

# Parsing structured data
awk '/ERROR [0-9]{4}/ {print $0}' system.log # Finds and prints error messages with a four-digit code