preprocessing media to save bandwidth

{almost forgot:  Merry Christmas everybody!}

There are several bandwidth-related concerns for nomads:

  1. limits on data plans
  2. spotty mobile data
  3. not abusing open wifi

These concerns can be ameliorated by preprocessing media Elsewhere down to “good enough” quality levels before D/Ling. In this case, “Elsewhere” is a linux VPS.

podcasts

Podcasts are often overproduced and overencoded. It’s common for single-voice talking-head ‘casts to be encoded in stereo at outrageous bitrates.

My process:

  1. grab the podcasts Elsewhere (newsbeuter + aria2c)
  2. disassemble them to .wav one by one (ffmpeg)
  3. do any processing like voxxing, normalizing, etc.
  4. re-encode with a voice-friendly encoder (opusenc). 90% filesize reduction is common. (A sketch of steps 2 and 4 follows this list.)
  5. download
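
A minimal sketch of steps 2 and 4, assuming a placeholder episode.mp3 and a 24 kbit/s target (my numbers, not gospel); step 3’s processing would slot between the two commands:

# decode to .wav for processing
ffmpeg -i episode.mp3 episode.wav
# re-encode voice-friendly: downmix to mono at a low bitrate
opusenc --downmix-mono --bitrate 24 episode.wav episode.opus

Mono Opus at 24 kbit/s is plenty for a lone voice; music-heavy shows deserve more.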

video

I live in a van and don’t have any big-screen devices.  240p looks fine on my phone, chromebook, and pi-based dvr.  Fast-moving action scenes do pixelate, but I’m not an action flick fan.  On-screen text can be hard to read.

  1. download Elsewhere
  2. convert to 240p .mp4 files (ffmpeg). 75% filesize reduction is common. (A sketch follows this list.)
  3. download
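
A minimal sketch of the conversion, with input.mkv as a placeholder and CRF/preset values that are just my starting points:

# scale=-2:240 keeps the aspect ratio and rounds the width to an even number
ffmpeg -i input.mkv -vf scale=-2:240 \
  -c:v libx264 -crf 28 -preset slow \
  -c:a aac -b:a 64k output.mp4

Lower the CRF if 240p looks too muddy; filesize goes up accordingly.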

Talking-head YT videos are downloaded as audio only (youtube-dl) and processed with the podcasts above. If there is visual content I want to see, I pull the video down as “worstvideo+worstaudio” with youtube-dl. If something requires higher quality (rare), I have a separate script to handle that.
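
For reference, a sketch of the two grabs (URL is a placeholder; -x and -f are real youtube-dl options, but the format strings are my choices):

# audio only, feeds the podcast pipeline above
youtube-dl -f bestaudio -x URL
# lowest-quality video+audio when the visuals matter
youtube-dl -f "worstvideo+worstaudio" URL

The second form needs ffmpeg available so youtube-dl can mux the two streams.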

Video from Amazon’s Prime Video service is downloaded (not streamed) at non-peak times at the Data Saver rate.  They do a great job with the compression.  Similar sizes to my 240p conversions but better quality.  They do have access to the whole AWS infrastructure.  🙂

TV that comes in over-the-air via antenna (to the MythTV pi rig) obviously uses no mobile data. Even then I prefer SD to HD for disk space and playback purposes.

ebooks

I only started preprocessing ebooks yesterday.  Commercial ebooks tend to come with all kinds of bloaty crap in them.  I’ve been locally processing them in Calibre after d/l, but I finally started doing it from the linux command line Elsewhere.

  1. download Elsewhere
  2. run ebook-convert (calibre) to [re]convert to epub. 66% filesize reduction is common. (Full command lines appear after the option lists below.)
  3. run ebook-polish (calibre) for a final pass. Usually only a tiny further reduction.
  4. download

Example (file sizes in bytes):

3025975 There_Are_Places...epub
 989351 There_Are_Places...converted.epub
 988687 There_Are_Places...polished.epub

Conversion options:
--change-justification left \
--smarten-punctuation \
--subset-embedded-fonts \ <-- removes unused font symbols
--insert-blank-line \  <-- blank line between paragraphs
--output-profile kindle


Polish options:
--compress-images \ <-- losslessly
--remove-unused-css
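
Put together, the Elsewhere pass looks roughly like this (book.epub is a placeholder filename; the flags are the ones listed above):

ebook-convert book.epub book.converted.epub \
  --change-justification left \
  --smarten-punctuation \
  --subset-embedded-fonts \
  --insert-blank-line \
  --output-profile kindle

ebook-polish --compress-images --remove-unused-css \
  book.converted.epub book.polished.epub

Giving ebook-polish a second filename writes a new file instead of polishing in place.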


Published by frater jason

Full-time boondocker, usually in the American Southwest.

2 thoughts on “preprocessing media to save bandwidth”

  1. I’m like most people and struggle to understand and keep track of data. Just a data consumer. Not that it’s an excuse to hoard data, but will 5G make it unnecessary to conserve? I don’t know how that works. Or will people just use more data and bog down the system again? I only use cellular with a phone and tablet. But I sometimes use 30-40 gigs in a month. I don’t download much because of limited storage memory and access to WiFi. Is that a lot compared to other full-timers? I don’t know what to compare it to. Verizon limits data for us, don’t they? Thanks.

    1. While my data plan runs full speed to 50GB, most of the places I camp have relatively limited throughput. In this particular spot (Shea Rd) I have a minimal connection. Even browsing times out.
