In this short guide, we will learn how to split a large folder into multiple subfolders using Bash scripts in the terminal. This technique helps organize thousands of files into manageable batches, improving file system performance, processing speed, and organization.

Here is the short answer:

(1) Split files into batches of 100

i=0; for f in *; do [ -f "$f" ] || continue; d=dir_$(printf %03d $((i/100))); mkdir -p "$d"; mv "$f" "$d/"; ((i++)); done

(2) Split files with custom batch size

n=50; i=0; for f in *; do [ -f "$f" ] || continue; d=batch_$((i/n)); mkdir -p "$d"; mv "$f" "$d/"; ((i++)); done

So let's see how to split folders efficiently using terminal commands.

Problem: Managing Folders with Too Many Files

Large folders with thousands of files cause:

  • Slow directory listing and file browsing
  • System performance degradation
  • Difficult file management and navigation
  • File system limitations on some platforms

Solution: Automatically distribute files into numbered subfolders with specified batch sizes.

1: Basic Bash Script to Split Files

Split all files in the current directory into subfolders with 100 files each:

#!/bin/bash
files_per_folder=100
i=0

for file in *; do
    [ -f "$file" ] || continue
    folder_num=$((i / files_per_folder))
    subfolder=$(printf "batch_%03d" $folder_num)
    mkdir -p "$subfolder"
    mv "$file" "$subfolder/"
    ((i++))
done

echo "Split $i files into $(((i-1)/files_per_folder + 1)) subfolders"

Output Result:

Split 2547 files into 26 subfolders

How it works:

  • Iterates through all files in current directory
  • Calculates target folder number using integer division
  • Creates zero-padded subfolder names (batch_000, batch_001, etc.)
  • Moves each file to its designated subfolder
  • Uses [ -f "$file" ] to skip directories

Real-world example: Organizing 2,547 product images from an e-commerce site into 26 folders with 100 images each for faster gallery loading.
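To try the script safely before running it on real data, you can generate dummy files in a scratch directory first (a minimal sketch; the file count of 250 is arbitrary):

```shell
#!/bin/bash
# Work in a throwaway directory containing 250 empty test files
cd "$(mktemp -d)" || exit 1
for n in $(seq 1 250); do touch "file_$n.txt"; done

# Same split logic as above: 100 files per batch folder
files_per_folder=100
i=0
for file in *; do
    [ -f "$file" ] || continue
    subfolder=$(printf "batch_%03d" $((i / files_per_folder)))
    mkdir -p "$subfolder"
    mv "$file" "$subfolder/"
    ((i++))
done
echo "Split $i files"   # prints "Split 250 files"
ls -d batch_*           # batch_000  batch_001  batch_002
```

With 250 files and a batch size of 100, the last folder (`batch_002`) ends up with the 50-file remainder.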

2: One-Liner Command for Quick Splits

For quick operations, use this one-line command in the terminal:

i=0; for f in *.jpg; do d=photos_$(printf %03d $((i/200))); mkdir -p "$d"; mv "$f" "$d/"; let i++; done

Output Result:

Created: photos_000/ photos_001/ photos_002/ ... photos_015/

Command breakdown:

  • i=0 - Initialize counter
  • for f in *.jpg - Process only JPG files
  • $((i/200)) - Calculate folder number (200 files per folder)
  • printf %03d - Format with leading zeros
  • mkdir -p "$d" - Create folder if it doesn't exist
  • let i++ - Increment counter

Use case: Quickly organizing 3,200 vacation photos into 16 folders with 200 images each.
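One caveat with the one-liner: if no `.jpg` files exist, the glob expands to the literal string `*.jpg` and the loop runs once on a nonexistent name. Enabling bash's `nullglob` option avoids this (a sketch with sample files for demonstration):

```shell
#!/bin/bash
shopt -s nullglob   # an unmatched glob expands to nothing instead of a literal "*.jpg"

cd "$(mktemp -d)" || exit 1
touch a.jpg b.jpg notes.txt   # sample files; only the .jpg files should move

i=0
for f in *.jpg; do
    d=photos_$(printf %03d $((i/200)))
    mkdir -p "$d"
    mv "$f" "$d/"
    ((i++))
done
echo "Moved $i jpg files"   # prints "Moved 2 jpg files"
```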

3: Split Files by Extension and Size

Organize files by type and split each type into batches:

#!/bin/bash
batch_size=50

for ext in jpg png pdf mp4; do
    i=0
    for file in *.$ext; do
        [ -f "$file" ] || continue
        folder="${ext}_batch_$((i / batch_size))"
        mkdir -p "$folder"
        mv "$file" "$folder/"
        ((i++))
    done
    [ $i -gt 0 ] && echo "Organized $i $ext files"
done

Output Result:

Organized 1234 jpg files
Organized 567 png files
Organized 89 pdf files
Organized 23 mp4 files

Folder structure created:

├── jpg_batch_0/
├── jpg_batch_1/
├── jpg_batch_2/
├── png_batch_0/
├── png_batch_1/
├── pdf_batch_0/
└── mp4_batch_0/

Common use: Organizing mixed media downloads with different file types into categorized, manageable batches.
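Camera files often use uppercase extensions (`IMG_001.JPG`), which `*.jpg` would miss. Bash's `nocaseglob` option makes globs match case-insensitively (a sketch of the same loop, with sample files for demonstration):

```shell
#!/bin/bash
shopt -s nullglob nocaseglob   # match .jpg, .JPG, .Jpg, etc.; skip unmatched globs

cd "$(mktemp -d)" || exit 1
touch photo1.jpg PHOTO2.JPG scan.pdf   # sample files

batch_size=50
for ext in jpg pdf; do
    i=0
    for file in *."$ext"; do
        [ -f "$file" ] || continue
        folder="${ext}_batch_$((i / batch_size))"
        mkdir -p "$folder"
        mv "$file" "$folder/"
        ((i++))
    done
    [ "$i" -gt 0 ] && echo "Organized $i $ext files"
done
```

Here both `photo1.jpg` and `PHOTO2.JPG` land in `jpg_batch_0/`.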

Advanced Options

Split with Custom Naming Pattern

#!/bin/bash
prefix="archive"
batch_size=100
i=0

for file in *.log; do
    folder="${prefix}_$(date +%Y%m)_batch_$((i / batch_size))"
    mkdir -p "$folder"
    mv "$file" "$folder/"
    ((i++))
done

Creates folders like: archive_202412_batch_0, archive_202412_batch_1

Count Files Before Processing

total_files=$(find . -maxdepth 1 -type f | wc -l)   # count only regular files (ls -1 would also count directories)
batch_size=100
num_folders=$(((total_files - 1) / batch_size + 1))

echo "Will create $num_folders folders for $total_files files"
read -p "Continue? (y/n) " -n 1 -r
echo
[[ $REPLY =~ ^[Yy]$ ]] || exit 1
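The `(total - 1) / batch + 1` expression is integer ceiling division. A quick check of its edge cases, using only bash arithmetic (the `folders_for` helper is just for illustration):

```shell
#!/bin/bash
batch_size=100

# Ceiling division: how many folders are needed for N files?
folders_for() { echo $(( ($1 - 1) / batch_size + 1 )); }

echo "$(folders_for 1)"      # 1 file -> 1 folder
echo "$(folders_for 100)"    # exactly one full batch -> 1 folder
echo "$(folders_for 101)"    # one file over -> 2 folders
echo "$(folders_for 2547)"   # the earlier example -> 26 folders
```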

Split based on file size

This snippet moves files ordered by size, largest first (`ls -S`), putting 10,000 files in each folder starting at doc_001. Note that iterating over `$(ls ...)` word-splits on whitespace, so it only works for filenames without spaces:

i=0
for f in $(ls -S1); do
    [ -f "$f" ] || continue
    d=doc_$(printf %03d $((i/10000+1)))
    mkdir -p "$d"
    mv "$f" "$d/"
    ((i++))
done
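A more robust size-ordered split reads filenames line by line instead of word-splitting `ls` output, so names with spaces survive. This sketch assumes GNU `find` (for `-printf`) and uses a small batch size for demonstration; filenames containing newlines would still break it:

```shell
#!/bin/bash
cd "$(mktemp -d)" || exit 1
# Sample files of different sizes (largest: big.bin)
printf 'xxxxxxxxxx' > big.bin
printf 'xxxxx'      > 'mid file.bin'   # name with a space
printf 'x'          > small.bin

docs_per_folder=2
i=0
# Print "size<TAB>name", sort numerically descending (largest first),
# strip the size column, then read the names safely line by line.
find . -maxdepth 1 -type f -printf '%s\t%f\n' | sort -rn | cut -f2- |
while IFS= read -r f; do
    d=$(printf 'doc_%03d' $((i / docs_per_folder + 1)))
    mkdir -p "$d"
    mv "$f" "$d/"
    ((i++))
done
```

The two largest files land in `doc_001/` and the remainder in `doc_002/`, and the space in `mid file.bin` causes no trouble.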

Common Use Cases

Photography: Split event photos into albums by batch size

Log Management: Organize server logs into date-based batches

Data Processing: Distribute CSV files for parallel processing

Media Production: Organize video clips into scene-based folders

Document Archiving: Split scanned documents into yearly/monthly batches

Backup Management: Organize backup files into manageable chunks

Performance Tips

In my tests, splitting one million files took several hours when size-based sorting was also applied.

Test with small batches first using echo instead of mv:

echo "Would move $file to $folder"

Use variables for batch size to easily adjust:

BATCH_SIZE=100

Process specific extensions to avoid moving hidden files or directories

Add progress indicators for large operations:

echo "Processing file $i of $total_files"

Preserve file timestamps with -p flag if using cp instead of mv

Don't run on directories - use [ -f "$file" ] check

Avoid spaces in folder names - use underscores or hyphens

Hidden files are excluded by default - bash globs such as * do not match dotfiles unless the dotglob shell option is set
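The dotfile behavior is easy to verify: by default a glob like `*` skips hidden files, and setting `dotglob` includes them. A small sketch (the `count` helper is just for illustration):

```shell
#!/bin/bash
cd "$(mktemp -d)" || exit 1
touch visible.txt .hidden.txt

count() { echo $#; }            # helper: counts its arguments

default_matches=$(count *)      # "*" skips dotfiles by default
shopt -s dotglob
dotglob_matches=$(count *)      # now "*" matches .hidden.txt too
shopt -u dotglob

echo "$default_matches $dotglob_matches"   # prints "1 2"
```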

Safety Checks

Create backup before running:

cp -r source_folder source_folder_backup

Dry-run mode:

DRY_RUN=true

if [ "$DRY_RUN" = true ]; then
    echo "Would move: $file -> $folder"
else
    mv "$file" "$folder/"
fi
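Wrapped into a complete loop, a dry run prints what would happen without touching anything (a sketch with sample files; set DRY_RUN=false to actually move them):

```shell
#!/bin/bash
cd "$(mktemp -d)" || exit 1
touch a.txt b.txt c.txt   # sample files

DRY_RUN=true
batch_size=2
i=0
for file in *; do
    [ -f "$file" ] || continue
    folder="batch_$((i / batch_size))"
    if [ "$DRY_RUN" = true ]; then
        echo "Would move: $file -> $folder/"   # report only, move nothing
    else
        mkdir -p "$folder"
        mv "$file" "$folder/"
    fi
    ((i++))
done
```

In dry-run mode all three files stay in place and no batch folders are created.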

Verify file count:

original_count=$(find source/ -maxdepth 1 -type f | wc -l)   # run before splitting
new_count=$(find source/ -type f | wc -l)                    # run after splitting
echo "Original: $original_count, After split: $new_count"

Quick Reference Table

Batch Size   Use Case          Example Command
50-100       Documents, PDFs   i=0; for f in *.pdf; do d=docs_$((i/100))...
100-500      Images, Photos    i=0; for f in *.jpg; do d=photos_$((i/200))...
1000+        Log files         i=0; for f in *.log; do d=logs_$((i/1000))...
10-50        Videos            i=0; for f in *.mp4; do d=videos_$((i/20))...
