Saturday, November 8, 2025

Optimising Photo Uploads: Parallel Processing for Mobile Performance

Paul (Founder)
Development
Developer optimising photo upload performance with parallel processing and adaptive concurrency

Property documentation requires dozens of photos: room views, damage evidence, meter readings, exterior shots. Staff photograph properties on mobile devices, then face frustrating upload delays as photos transfer one-by-one through potentially slow mobile connections. Uploading 20 photos sequentially at 8-10 seconds each means 2-3 minutes waiting—time wasted when staff could be travelling to the next property or handling other tasks.

Week 45 implemented parallel photo upload processing, reducing upload time by approximately 75%. The system uploads multiple photos simultaneously (up to 5 concurrently), displays individual progress bars showing per-file upload status, handles failures gracefully so a batch can partially succeed, and adapts concurrency dynamically based on connection quality. This article explores the parallel upload implementation using Stimulus controllers and XMLHttpRequest progress events, the progress tracking architecture, error handling strategies, and performance optimisation for varying network conditions.

What Your Team Will Notice

Photo upload interfaces now show multiple progress bars simultaneously instead of a single sequential progress bar. Select 20 photos for upload, and 5 begin uploading immediately whilst the rest queue. As uploads complete, queued photos start automatically, maintaining optimal concurrency without manual intervention.

Each photo displays its own progress indicator: filename, upload percentage, transfer speed, and status (uploading/complete/failed). This granular feedback helps staff understand exactly what's happening—seeing "bedroom-1.jpg 67%" and "kitchen-2.jpg 42%" simultaneously is far more informative than generic "Uploading 7 of 20..." messages.

Failed uploads don't block others: if one photo fails due to a connection interruption, the remaining uploads continue unaffected. Failed photos show clear error indicators with "Retry" buttons that allow selective re-upload without restarting the entire batch. This fault tolerance prevents the frustrating scenario where a connection drop halfway through 20 photos forces a complete restart.

The interface adapts to connection quality: fast Wi-Fi connections might upload 5 photos simultaneously, whilst slow 3G connections automatically throttle to 2-3 concurrent uploads preventing timeout failures. This adaptive behaviour happens transparently—staff never configure concurrency settings manually.

Upload time improvements are dramatic: what previously took 2-3 minutes now completes in under a minute on typical mobile connections. Staff photograph properties, initiate uploads, and move on to other tasks whilst photos transfer in the background. Completion notifications appear when all uploads finish, allowing an asynchronous workflow without waiting.

Under the Bonnet: Parallel Upload Architecture

The parallel upload system coordinates multiple simultaneous transfers:

// app/javascript/controllers/photo_uploader_controller.js
import { Controller } from "@hotwired/stimulus"

export default class extends Controller {
  static targets = ["fileInput", "uploadQueue", "progressBar"]
  static values = {
    maxConcurrent: { type: Number, default: 5 },
    uploadUrl: String,
    propertyId: Number
  }

  connect() {
    this.activeUploads = 0
    this.queuedFiles = []
    this.completedUploads = []
    this.failedUploads = []
  }

  async handleFileSelection(event) {
    const files = Array.from(event.target.files)

    if (files.length === 0) return

    // Add files to queue
    this.queuedFiles = files.map(file => ({
      file,
      id: this.generateFileId(file),
      status: 'queued',
      progress: 0,
      speed: 0
    }))

    this.renderQueue()
    this.processQueue()
  }

  async processQueue() {
    // Process queued files maintaining max concurrency
    while (this.queuedFiles.length > 0 || this.activeUploads > 0) {
      // Start new uploads if below concurrency limit
      while (this.activeUploads < this.maxConcurrentValue && this.queuedFiles.length > 0) {
        const fileData = this.queuedFiles.shift()
        this.uploadFile(fileData)
      }

      // Wait briefly before checking queue again
      await this.sleep(100)
    }

    this.handleAllUploadsComplete()
  }

  async uploadFile(fileData) {
    this.activeUploads++
    fileData.status = 'uploading'
    this.updateFileDisplay(fileData)

    try {
      const result = await this.performUpload(fileData)

      fileData.status = 'complete'
      fileData.progress = 100
      fileData.result = result
      this.completedUploads.push(fileData)

    } catch (error) {
      fileData.status = 'failed'
      fileData.error = error.message
      this.failedUploads.push(fileData)
    } finally {
      this.activeUploads--
      this.updateFileDisplay(fileData)
    }
  }

  async performUpload(fileData) {
    const formData = new FormData()
    formData.append('photo[image]', fileData.file)
    formData.append('photo[property_id]', this.propertyIdValue)

    const xhr = new XMLHttpRequest()

    // Track upload progress
    xhr.upload.addEventListener('progress', (event) => {
      if (event.lengthComputable) {
        fileData.progress = (event.loaded / event.total) * 100
        fileData.speed = this.calculateSpeed(event.loaded, fileData.startTime)
        this.updateFileDisplay(fileData)
      }
    })

    fileData.startTime = Date.now()

    return new Promise((resolve, reject) => {
      xhr.addEventListener('load', () => {
        if (xhr.status >= 200 && xhr.status < 300) {
          resolve(JSON.parse(xhr.responseText))
        } else {
          reject(new Error(`Upload failed: ${xhr.statusText}`))
        }
      })

      xhr.addEventListener('error', () => {
        reject(new Error('Network error during upload'))
      })

      xhr.addEventListener('timeout', () => {
        reject(new Error('Upload timeout'))
      })

      xhr.open('POST', this.uploadUrlValue)
      xhr.timeout = 60000 // 60 second timeout
      xhr.send(formData)
    })
  }

  calculateSpeed(bytesLoaded, startTime) {
    // Guard against a zero interval on the very first progress event
    const elapsedSeconds = Math.max((Date.now() - startTime) / 1000, 0.001)
    const bytesPerSecond = bytesLoaded / elapsedSeconds
    return this.formatBytes(bytesPerSecond) + '/s'
  }

  formatBytes(bytes) {
    if (bytes < 1024) return bytes + ' B'
    if (bytes < 1024 * 1024) return (bytes / 1024).toFixed(1) + ' KB'
    return (bytes / (1024 * 1024)).toFixed(1) + ' MB'
  }

  generateFileId(file) {
    return `${file.name}-${file.size}-${Date.now()}`
  }

  updateFileDisplay(fileData) {
    const element = this.element.querySelector(`[data-file-id="${fileData.id}"]`)
    if (!element) return

    // Update progress bar
    const progressBar = element.querySelector('.progress-bar')
    if (progressBar) {
      progressBar.style.width = `${fileData.progress}%`
      progressBar.textContent = `${Math.round(fileData.progress)}%`
    }

    // Update status badge
    const statusBadge = element.querySelector('.status-badge')
    if (statusBadge) {
      statusBadge.textContent = fileData.status
      statusBadge.className = `status-badge badge-${fileData.status}`
    }

    // Update speed indicator
    const speedIndicator = element.querySelector('.upload-speed')
    if (speedIndicator && fileData.status === 'uploading') {
      speedIndicator.textContent = fileData.speed
    }
  }

  renderQueue() {
    const queueHTML = this.queuedFiles.map(fileData => `
      <div class="upload-item" data-file-id="${fileData.id}">
        <div class="file-info">
          <span class="filename">${fileData.file.name}</span>
          <span class="filesize">${this.formatBytes(fileData.file.size)}</span>
        </div>
        <div class="progress-container">
          <div class="progress-bar" style="width: 0%">0%</div>
        </div>
        <div class="upload-meta">
          <span class="status-badge badge-queued">queued</span>
          <span class="upload-speed"></span>
        </div>
      </div>
    `).join('')

    this.uploadQueueTarget.innerHTML = queueHTML
  }

  async retryFailedUpload(event) {
    const fileId = event.target.dataset.fileId
    const fileData = this.failedUploads.find(f => f.id === fileId)

    if (!fileData) return

    // Remove from failed uploads
    this.failedUploads = this.failedUploads.filter(f => f.id !== fileId)

    // Reset file data
    fileData.status = 'queued'
    fileData.progress = 0
    fileData.error = null

    // Add back to queue
    this.queuedFiles.push(fileData)
    this.updateFileDisplay(fileData)

    // Restart processing if needed
    if (this.activeUploads === 0) {
      this.processQueue()
    }
  }

  handleAllUploadsComplete() {
    if (this.failedUploads.length === 0) {
      this.showNotification('All photos uploaded successfully', 'success')
    } else {
      this.showNotification(
        `${this.completedUploads.length} photos uploaded, ${this.failedUploads.length} failed`,
        'warning'
      )
    }

    // Broadcast update via ActionCable
    this.broadcastUploadComplete()
  }

  broadcastUploadComplete() {
    // Notify other users viewing this property
    if (window.propertyChannel) {
      window.propertyChannel.perform('photos_uploaded', {
        property_id: this.propertyIdValue,
        count: this.completedUploads.length
      })
    }
  }

  sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms))
  }

  showNotification(message, type) {
    // Show user notification
    const notification = document.createElement('div')
    notification.className = `notification notification-${type}`
    notification.textContent = message
    document.body.appendChild(notification)

    setTimeout(() => notification.remove(), 5000)
  }
}

This controller coordinates parallel uploads within the concurrency limit, tracks individual file progress, handles failures gracefully, and provides detailed feedback throughout the upload process.
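
One gap worth noting: retryFailedUpload expects each failed file to expose a "Retry" button carrying a data-file-id, but renderQueue above never creates one. A minimal sketch of a helper that updateFileDisplay could call when a file enters the failed state; the photo-uploader identifier and the class names are assumptions rather than part of the implementation above:

// Illustrative helper: surface a per-file Retry button once an upload fails.
// Stimulus binds the data-action on the dynamically added button because it
// sits inside the controller's element.
function appendRetryButton(controllerElement, fileData) {
  const meta = controllerElement.querySelector(
    `[data-file-id="${fileData.id}"] .upload-meta`
  )
  if (!meta || meta.querySelector('.retry-button')) return

  const button = document.createElement('button')
  button.type = 'button'
  button.className = 'retry-button'
  button.textContent = 'Retry'
  button.dataset.fileId = fileData.id
  button.dataset.action = 'photo-uploader#retryFailedUpload'
  meta.appendChild(button)
}

Calling appendRetryButton(this.element, fileData) at the end of updateFileDisplay whenever fileData.status === 'failed' would complete the retry flow described earlier.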

Adaptive Concurrency Based on Network Quality

The system adjusts concurrent upload count based on success rates:

// app/javascript/controllers/adaptive_uploader_controller.js
import PhotoUploaderController from "./photo_uploader_controller"

export default class extends PhotoUploaderController {
  connect() {
    super.connect()
    this.successfulUploads = 0
    this.failureCount = 0
    this.adjustConcurrency()
  }

  async uploadFile(fileData) {
    // The parent catches upload errors internally rather than rethrowing them,
    // so inspect the resulting status instead of relying on a catch block
    await super.uploadFile(fileData)

    if (fileData.status === 'complete') {
      this.successfulUploads++
      this.failureCount = 0 // Reset failure count on success
      this.considerIncreasingConcurrency()
    } else {
      this.failureCount++
      this.considerDecreasingConcurrency()
    }
  }

  considerIncreasingConcurrency() {
    // Increase concurrency after 5 successful uploads
    if (this.successfulUploads % 5 === 0 && this.maxConcurrentValue < 8) {
      this.maxConcurrentValue++
      console.log(`Increased concurrency to ${this.maxConcurrentValue}`)
    }
  }

  considerDecreasingConcurrency() {
    // Decrease concurrency after 2 consecutive failures
    if (this.failureCount >= 2 && this.maxConcurrentValue > 2) {
      this.maxConcurrentValue--
      console.log(`Decreased concurrency to ${this.maxConcurrentValue}`)
      this.failureCount = 0
    }
  }

  adjustConcurrency() {
    // Detect connection type and set initial concurrency
    if (!('connection' in navigator)) return

    const connection = navigator.connection

    if (connection.effectiveType === '4g') {
      this.maxConcurrentValue = 5
    } else if (connection.effectiveType === '3g') {
      this.maxConcurrentValue = 3
    } else {
      this.maxConcurrentValue = 2
    }

    // Register the change listener once so repeated adjustments don't stack handlers
    if (!this.connectionListenerAttached) {
      this.connectionListenerAttached = true
      connection.addEventListener('change', () => this.adjustConcurrency())
    }
  }
}

This adaptive behaviour optimises upload performance automatically without requiring user configuration or technical knowledge.
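
Neither controller does anything until it is registered with Stimulus. A minimal registration sketch, assuming the conventional controllers/index.js entry point; the file path and identifiers here are illustrative:

// app/javascript/controllers/index.js (illustrative paths and identifiers)
import { Application } from "@hotwired/stimulus"
import PhotoUploaderController from "./photo_uploader_controller"
import AdaptiveUploaderController from "./adaptive_uploader_controller"

const application = Application.start()

// The base controller provides the queue; the adaptive subclass layers
// connection-aware concurrency on top for views that want it
application.register("photo-uploader", PhotoUploaderController)
application.register("adaptive-uploader", AdaptiveUploaderController)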

Server-Side Upload Handling

The server processes uploads efficiently:

# app/controllers/property_photos_controller.rb
class PropertyPhotosController < ApplicationController
  # These XHR uploads don't send the Rails CSRF token, so verification is skipped for create
  skip_before_action :verify_authenticity_token, only: [:create]
  before_action :set_property

  def create
    @photo = @property.property_photos.build(photo_params)
    @photo.uploaded_by = current_user

    if @photo.save
      # Process image asynchronously
      PhotoProcessingJob.perform_later(@photo.id)

      render json: {
        id: @photo.id,
        url: url_for(@photo.image.variant(resize_to_limit: [800, 800])),
        thumbnail_url: url_for(@photo.image.variant(resize_to_limit: [200, 200]))
      }, status: :created
    else
      render json: { error: @photo.errors.full_messages.join(', ') },
             status: :unprocessable_entity
    end
  end

  private

  def set_property
    @property = Property.find(params[:property_id] || params.dig(:photo, :property_id))
  end

  def photo_params
    params.require(:photo).permit(:image, :caption, :room_type)
  end
end

# app/jobs/photo_processing_job.rb
class PhotoProcessingJob < ApplicationJob
  queue_as :default

  def perform(photo_id)
    photo = PropertyPhoto.find(photo_id)

    # Generate image variants
    photo.image.variant(resize_to_limit: [200, 200]).processed
    photo.image.variant(resize_to_limit: [800, 800]).processed
    photo.image.variant(resize_to_limit: [1600, 1600]).processed

    # Fix EXIF rotation
    fix_orientation(photo)

    # Broadcast completion
    ActionCable.server.broadcast(
      "property_#{photo.property_id}_photos",
      {
        event: 'photo_processed',
        photo_id: photo.id
      }
    )
  end

  private

  def fix_orientation(photo)
    # Process EXIF data to rotate images correctly
    # (Implementation depends on ImageMagick/libvips configuration)
  end
end

This server-side handling accepts uploads quickly, processes images asynchronously avoiding upload delays, and broadcasts completion notifications enabling real-time interface updates.
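
The broadcast only reaches the browser if something subscribes to the property_<id>_photos stream, and broadcastUploadComplete above assumes a window.propertyChannel. A minimal client-side sketch using @rails/actioncable, assuming a PropertyChannel exists server-side and streams from that name; the thumbnail-refresh handler and the data-photo-id selector are illustrative:

// app/javascript/channels/property_channel.js (illustrative)
import { createConsumer } from "@rails/actioncable"

const consumer = createConsumer()

export function subscribeToPropertyPhotos(propertyId) {
  // Stored on window so broadcastUploadComplete() in the uploader can reach it
  window.propertyChannel = consumer.subscriptions.create(
    { channel: "PropertyChannel", property_id: propertyId },
    {
      received(data) {
        if (data.event === "photo_processed") {
          // Bust the cached thumbnail for the processed photo
          // (the data-photo-id selector is an assumption about the gallery markup)
          const img = document.querySelector(`[data-photo-id="${data.photo_id}"] img`)
          if (img) img.src = `${img.src.split("?")[0]}?t=${Date.now()}`
        }
      }
    }
  )

  return window.propertyChannel
}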

Testing Parallel Uploads

Testing verifies concurrent upload behaviour:

// spec/javascript/controllers/photo_uploader_controller.spec.js
import { Application } from "@hotwired/stimulus"
import PhotoUploaderController from "controllers/photo_uploader_controller"

describe('PhotoUploaderController', () => {
  let application
  let controller

  beforeEach(() => {
    // Setup Stimulus application
    application = Application.start()
    application.register('photo-uploader', PhotoUploaderController)

    // Create controller element
    document.body.innerHTML = `
      <div data-controller="photo-uploader"
           data-photo-uploader-max-concurrent-value="3"
           data-photo-uploader-upload-url-value="/photos"
           data-photo-uploader-property-id-value="123">
        <input type="file" multiple data-photo-uploader-target="fileInput">
        <div data-photo-uploader-target="uploadQueue"></div>
      </div>
    `

    controller = application.getControllerForElementAndIdentifier(
      document.querySelector('[data-controller="photo-uploader"]'),
      'photo-uploader'
    )
  })

  describe('concurrent upload limiting', () => {
    beforeEach(() => {
      // Keep uploads in flight so the concurrency limit can be observed reliably
      jest.spyOn(controller, 'performUpload')
        .mockImplementation(() => new Promise(() => {}))
    })

    it('limits active uploads to maxConcurrent value', async () => {
      const files = createMockFiles(10)

      // Trigger file selection
      await controller.handleFileSelection({ target: { files } })

      // Wait for initial batch to start
      await delay(100)

      expect(controller.activeUploads).toBe(3)
    })

    it('starts new uploads as previous ones complete', async () => {
      const files = createMockFiles(5)

      await controller.handleFileSelection({ target: { files } })

      // Wait for first batch
      await delay(100)
      expect(controller.activeUploads).toBe(3)

      // Simulate one upload completing; the running queue loop tops back up
      controller.activeUploads--
      await delay(200)

      expect(controller.activeUploads).toBe(3)
    })
  })

  describe('progress tracking', () => {
    it('updates progress for each file independently', async () => {
      // Stop the queue from draining so the fileData objects stay inspectable
      jest.spyOn(controller, 'processQueue').mockImplementation(() => {})

      const files = createMockFiles(2)
      await controller.handleFileSelection({ target: { files } })

      // Simulate progress events
      const [fileData1, fileData2] = controller.queuedFiles

      fileData1.progress = 50
      fileData2.progress = 75

      controller.updateFileDisplay(fileData1)
      controller.updateFileDisplay(fileData2)

      const progressBars = document.querySelectorAll('.progress-bar')
      expect(progressBars[0].style.width).toBe('50%')
      expect(progressBars[1].style.width).toBe('75%')
    })
  })

  describe('error handling', () => {
    it('continues other uploads when one fails', async () => {
      const files = createMockFiles(3)

      // Mock one upload to fail
      jest.spyOn(controller, 'performUpload')
        .mockResolvedValueOnce({ success: true })
        .mockRejectedValueOnce(new Error('Network error'))
        .mockResolvedValueOnce({ success: true })

      await controller.handleFileSelection({ target: { files } })
      await delay(500)

      expect(controller.completedUploads.length).toBe(2)
      expect(controller.failedUploads.length).toBe(1)
    })
  })

  function createMockFiles(count) {
    return Array.from({ length: count }, (_, i) => ({
      name: `photo-${i}.jpg`,
      size: 1024 * 1024 * 2, // 2MB
      type: 'image/jpeg'
    }))
  }

  function delay(ms) {
    return new Promise(resolve => setTimeout(resolve, ms))
  }
})

These tests verify the upload controller maintains concurrency limits, tracks progress accurately, and handles failures gracefully without affecting other uploads.
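
A similar spec could exercise the adaptive subclass. A minimal sketch, assuming the same Jest and jsdom setup; navigator.connection is stubbed because jsdom does not implement the Network Information API:

// spec/javascript/controllers/adaptive_uploader_controller.spec.js (sketch)
import { Application } from "@hotwired/stimulus"
import AdaptiveUploaderController from "controllers/adaptive_uploader_controller"

describe('AdaptiveUploaderController', () => {
  let application

  beforeEach(() => {
    // Stub the Network Information API before the controller connects
    Object.defineProperty(navigator, 'connection', {
      value: { effectiveType: '3g', addEventListener: jest.fn() },
      configurable: true
    })

    document.body.innerHTML = `
      <div data-controller="adaptive-uploader"
           data-adaptive-uploader-upload-url-value="/photos"
           data-adaptive-uploader-property-id-value="123">
        <input type="file" multiple data-adaptive-uploader-target="fileInput">
        <div data-adaptive-uploader-target="uploadQueue"></div>
      </div>
    `

    application = Application.start()
    application.register('adaptive-uploader', AdaptiveUploaderController)
  })

  it('throttles concurrency on slow connections and after failures', async () => {
    await new Promise(resolve => setTimeout(resolve, 50)) // let Stimulus connect

    const controller = application.getControllerForElementAndIdentifier(
      document.querySelector('[data-controller="adaptive-uploader"]'),
      'adaptive-uploader'
    )

    // A 3g connection starts at 3 concurrent uploads
    expect(controller.maxConcurrentValue).toBe(3)

    // Two consecutive failures drop the limit by one
    controller.failureCount = 2
    controller.considerDecreasingConcurrency()
    expect(controller.maxConcurrentValue).toBe(2)
  })
})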

What's Next

The parallel upload foundation enables more sophisticated features: resumable uploads that continue after connection interruptions without restarting, image compression that reduces file sizes before upload to save bandwidth, background sync that uploads photos whilst the app is closed using service workers, and bandwidth throttling that stops uploads consuming all available bandwidth during critical tasks.

Future enhancements might include intelligent retry strategies with exponential backoff for failed uploads, upload queue persistence that survives page reloads, duplicate detection to prevent accidental re-upload of identical photos, and client-side image processing that rotates or crops photos before upload to reduce server load.
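
Of those, retry with exponential backoff is straightforward to layer over the existing performUpload. A minimal sketch; the attempt count, base delay, and jitter are illustrative starting points rather than tuned values:

// Illustrative retry wrapper with exponential backoff and jitter
async function uploadWithBackoff(uploadFn, fileData, maxAttempts = 4, baseDelayMs = 1000) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await uploadFn(fileData)
    } catch (error) {
      if (attempt === maxAttempts) throw error

      // Wait 1s, 2s, 4s... plus a little jitter to avoid synchronised retries
      const delay = baseDelayMs * 2 ** (attempt - 1) + Math.random() * 250
      await new Promise(resolve => setTimeout(resolve, delay))
    }
  }
}

Inside uploadFile, the call would then become something like await uploadWithBackoff(this.performUpload.bind(this), fileData).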

By implementing parallel photo uploads with adaptive concurrency and granular progress tracking, LetAdmin dramatically reduces the time staff spend waiting for uploads. Productivity improves, upload status stays visible throughout, and the network interruptions common on mobile connections are handled gracefully rather than forcing a restart.