Wednesday, 4 March 2026

Just how many spare camera batteries do you need?

There are two kinds of filmmakers:

  1. The ones who carry one battery and “trust the universe”.

  2. The ones who carry eight, plus a spreadsheet, plus a small solar farm.

I have been both. Usually on the same day.

Because here’s the truth: the number of spare batteries you need isn’t “a number”. It’s a relationship between (a) how long you’re filming, (b) what you’re filming with, and (c) how likely it is that the universe will choose today to teach you humility.

The simple rule (that actually works)

Bring enough battery for your expected filming time… then add one extra battery “for chaos”.

Chaos is: wind, cold, 4K/8K, image stabilisation, autofocus, Wi-Fi, Bluetooth, a mic receiver you forgot was powered, and that moment when someone says:
“Could you just do that again, but… better?”

Start with a quick battery reality check

Ask yourself:

  • How long will you be rolling for real? (Not “how long will you be out”.)
    A two-hour sailing session can become 30 minutes of actual recording… or two hours if you’re capturing everything “just in case”.

  • What are you filming on?
    Phones sip power until they don’t. Mirrorless cameras can be efficient… until you add external monitors, IBIS, high frame rates, and continuous AF. 360 cameras have their own special talent for eating batteries while you’re distracted by “how cool this looks”.

  • Can you charge during the day?
    Car charger? USB-C PD? Power bank? Solar? If yes, you can carry fewer batteries and more charging capability.

A practical “how many spares?” guide

Use this as a starting point (then adjust for your own kit and habits):

A) Short shoot (up to ~1 hour of actual recording)

  • Minimum: 1 spare

  • Comfortable: 2 spares

  • If it’s cold / you’re filming 4K/slow-mo: add 1

B) Half-day shoot (2–4 hours of mixed filming)

  • Minimum: 2 spares

  • Comfortable: 3–4 spares

  • If you’re using an external monitor or lots of AF: lean to 4

C) Full day / event coverage

  • Minimum: 4 spares

  • Comfortable: 6+ spares or a power solution (USB-C PD / dummy battery / V-mount style setup)

  • If you cannot charge at all: assume you’ll need more than you think

D) Boats, sailing, outdoor winter filming

  • Minimum: take your normal number and add one more
    Cold weather can make a perfectly good battery behave like it’s nearing retirement.
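
If you like your rules of thumb executable, the guide above collapses into a tiny calculator. The one-hour battery runtime and the cold/heavy-features add-ons are assumptions, so swap in numbers from your own kit:

```python
# A rough spares calculator following the guide above.
# All the numbers here are assumptions: tune them to your own kit.

import math

def spare_batteries(recording_hours: float,
                    battery_runtime_hours: float = 1.0,
                    cold: bool = False,
                    heavy_features: bool = False) -> int:
    """Estimate spares: enough for the expected recording time,
    plus one extra 'for chaos'."""
    needed = math.ceil(recording_hours / battery_runtime_hours)
    spares = max(needed - 1, 0)  # the first battery lives in the camera
    if cold:
        spares += 1              # cold saps capacity
    if heavy_features:
        spares += 1              # 4K/slow-mo, IBIS, AF, external monitors
    return spares + 1            # plus one for chaos, always

print(spare_batteries(1))                       # short shoot
print(spare_batteries(3, cold=True))            # half day, cold
print(spare_batteries(6, heavy_features=True))  # full-day event
```

Notice the last line of the function: chaos gets its battery unconditionally, exactly as the rule says.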

The hidden battery killers (a short list of villains)

If your battery life feels “mysteriously short”, it’s usually one of these:

  • High resolution / high bitrate recording

  • High frame rate (50/60/120fps)

  • Image stabilisation working hard

  • Continuous autofocus and face tracking

  • Bright screens (especially external monitors)

  • Wireless features (Wi-Fi/Bluetooth)

  • Long “camera on, not recording” time (the silent assassin)

A smarter approach than hoarding batteries

Sometimes the answer isn’t more batteries. It’s a plan.

1) Power bank + USB-C PD charging
Great for breaks between takes (and for phones/360 cameras). Even better if your camera can run/charge via USB-C while operating.

2) In-car charging
If you’re travelling between locations, that car becomes your mobile power station.

3) One “big battery” option
Depending on your setup: dummy battery to power bank, or a larger external pack for longer stints (especially for video).

4) Label and rotate
Number your batteries. Rotate use. Retire the “problem child” that always dies early.

My “leave-the-house” battery checklist

Before you go:

  • Batteries charged ✅

  • Spares packed ✅

  • Charger packed ✅

  • Cable(s) packed ✅

  • Power bank / car adapter ✅

  • Batteries stored safely (caps/case, no loose batteries rolling around) ✅

Because nothing says “professional video workflow” quite like rummaging in a bag for a battery while your subject drifts away downriver.

So… how many do you need?

If you want a one-line answer:

  • Casual filming: 1–2 spares

  • Serious filming: 3–4 spares

  • All-day / mission-critical / cold / boat day: 4–6 spares or spares + a proper charging/power strategy

And always, always pack one extra battery for chaos.

Tuesday, 3 March 2026

R&D — From test tubes to tell-tales: building an electronic burgee

There’s a myth that R&D only happens in white coats, surrounded by fume cupboards and the faint smell of “who left the hotplate on?”. In our world, research and development is just as likely to happen in a buoyancy aid, crouched in the bottom of a dinghy, wondering whether that “slight breeze” is actually a wind shift or just your hat trying to make a break for it.

Because here’s the thing: sailing is basically applied physics with occasional splashing. And once you’ve spent a lifetime building experiments, teaching students to measure the unmeasurable, and persuading sensors to behave themselves, it’s only a matter of time before you look at a masthead burgee and think: Yes… but what if it had data?

The problem: the masthead knows… but the helm doesn’t

A traditional burgee and wind indicator are brilliant—simple, reliable, and they don’t need charging. The only snag is that the truly useful information is happening at the top of the mast, while the people making the decisions are down at deck level doing impressions of a human washing line.

On rivers (hello Thames), wind can be wildly different even a few metres up. Trees, banks, moored boats, bridges—everything interferes. So the burgee might be telling the truth, but it’s telling it to the clouds.

The R&D idea: an electronic burgee you can actually see

So the concept is this:

  • A combined weather vane + anemometer at the masthead

  • Measurements relayed down the mast (wire or wireless)

  • A screen at the foot of the mast showing:

    • Wind direction (relative to the boat’s heading)

    • Wind speed (and ideally gusts)

    • A “trend” indicator (strengthening / easing)

In other words: the masthead’s gossip comes down to deck level, where it can be acted on without a crew member performing an interpretive dance while staring upwards.
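
As a feasibility doodle, here is the kind of arithmetic the deck display would run. The smoothing window and the 10% thresholds are placeholders (untested assumptions, not a finished design), but the relative-direction maths is the real thing:

```python
# Sketch of the deck-display logic: relative wind angle plus a simple
# strengthening/easing trend. Window size and thresholds are assumptions.

from collections import deque
from statistics import mean

def relative_wind(wind_deg: float, heading_deg: float) -> float:
    """Wind direction relative to the bow, in (-180, 180].
    Positive = starboard, negative = port."""
    rel = (wind_deg - heading_deg) % 360
    return rel - 360 if rel > 180 else rel

class TrendTracker:
    """Compare the recent average wind speed with an older window."""
    def __init__(self, window: int = 10):
        self.window = window
        self.samples = deque(maxlen=2 * window)

    def update(self, speed: float) -> str:
        self.samples.append(speed)
        if len(self.samples) < 2 * self.window:
            return "steady"            # not enough history yet
        old = mean(list(self.samples)[: self.window])
        new = mean(list(self.samples)[self.window:])
        if new > old * 1.1:
            return "strengthening"
        if new < old * 0.9:
            return "easing"
        return "steady"

print(relative_wind(10, 350))  # wind 010°, heading 350°: 20° to starboard
```

The modulo trick matters: naive subtraction near north (350° vs 010°) gives −340° and a very confused helm.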

Why this matters (especially for learning and coaching)

This isn’t just gadget-lust (although I admit gadgets have a strong pull, like biscuits). An electronic burgee could genuinely help with:

  • Training: showing new helms how wind direction relates to boat trim and course

  • Tacking and gybing: timing manoeuvres when the wind is behaving oddly along the river

  • Sail setup: helping the crew see what changes in downhaul, kicker/gnav, outhaul actually do

  • Safety: spotting gusts building before they smack you like a wet duvet

It turns sailing into something you can observe, measure, and learn from, rather than “feel vaguely and argue about afterwards”.

Practical design thoughts (before we invent a new type of disappointment)

If we’re doing this properly, it needs to survive real sailing life, which includes: water, shocks, UV, vibration, and that special kind of accidental abuse known as “launching”.

Key R&D questions:

  • Power: small battery? solar trickle? or run a thin cable down the mast?

  • Data link: wired (reliable) vs wireless (neat but fussy around masts, water, and interference)

  • Ruggedness: waterproofing, salt resistance (for coastal), and impact survival

  • Display: sunlight readable, simple icons, big numbers, minimal menus

  • Calibration: direction needs a reference (boat heading / mast alignment) so it doesn’t lie politely

The dream is something that’s useful in 3 seconds, not something that requires a PhD and a laptop on a spinnaker bag.

The bigger point: R&D is a mindset, not a department

This is why R&D leaks out of science and into sailing. Once you start asking “how could this be better?” you can’t switch it off. The lab and the boat are both environments where small improvements make a big difference—and where the real world happily punishes sloppy thinking.

So yes, we’re looking at building an electronic burgee. Not because we need more screens in our lives… but because if we can make wind behaviour clearer, training quicker, and sailing safer (and maybe reduce the amount of shouting during a tack), then it’s exactly the kind of R&D that’s worth doing.

And if it all goes wrong? Well… we can always fall back on the traditional system: lick a finger, hold it up, and pretend that was “data”.

Monday, 2 March 2026

Using AI tools to help with video editing (subtitles, effects, colour balancing)

There was a time when “post-production” meant tea, biscuits, and the slow realisation that you’d filmed everything… except the bit where you explain the thing. These days, AI has barged into the edit suite like an over-helpful assistant: occasionally brilliant, occasionally confident-and-wrong, but almost always faster than doing it all by hand.

Used sensibly, AI can take the boring, repetitive bits of editing (transcribing, syncing, rough selects, basic colour matching) and give you more time for the bits that actually matter: storytelling, pacing, and making your video look like you planned it that way.

1) Extracting subtitles: the fastest “instant upgrade”

Subtitles are no longer just for accessibility (although that’s reason enough). They’re also for:

  • viewers watching on mute,

  • short-form platforms that love on-screen text,

  • anyone trying to follow a technical explanation at speed.

Typical AI workflow (10–20 minutes instead of 2 hours):

  1. Auto-transcribe your timeline or selected clips.

  2. Clean up names, jargon, and units (AI always butchers “DaVinci”, “spectrophotometer”, and any student’s surname).

  3. Turn transcript into captions and choose a readable style.

  4. Export SRT for YouTube, or burn captions in for socials.
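
The SRT format in step 4 is simple enough to sketch by hand, which is handy when a tool exports something odd. Here is a minimal Python formatter; the transcript segments are made up, standing in for whatever your transcription tool returns:

```python
# Hypothetical transcript segments (start, end in seconds, text):
# the sort of thing an auto-transcribe step hands back.
segments = [
    (0.0, 2.5, "Welcome back to the channel."),
    (2.5, 6.0, "Today: colour grading in DaVinci Resolve."),
]

def to_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(segments) -> str:
    """Render numbered SRT blocks separated by blank lines."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, 1):
        blocks.append(
            f"{i}\n{to_timestamp(start)} --> {to_timestamp(end)}\n{text}\n"
        )
    return "\n".join(blocks)

print(to_srt(segments))
```

Note the comma before the milliseconds: SRT uses `00:00:02,500`, not a full stop, and YouTube will quietly reject files that get it wrong.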

If you’re using Adobe Premiere Pro, Speech-to-Text will generate a transcript and create captions directly from it.

If you’re using DaVinci Resolve Studio, the Neural Engine features cover a lot of AI-assisted workflow (and Blackmagic explicitly positions AI as part of Resolve’s toolset).

And if you want a very “edit by typing” approach (especially for talking-head, tuition, or voiceover-heavy videos), Descript is built around transcription-first editing and quick caption creation.

My rule: subtitles from AI are draft subtitles. Always do a quick skim before publishing. One wrong word can turn GCSE chemistry into interpretive poetry.


2) Graphical effects: letting AI do the donkey work

This is where AI can feel like magic—particularly for tasks that used to mean frame-by-frame misery:

  • Smart reframing (turn wide shots into vertical without chopping heads off),

  • Object tracking / region tracking (so text or blur sticks to the right thing),

  • Background removal / masking assistance (especially for quick explainers),

  • Auto “rough cut” helpers (finding pauses, removing silences, assembling selects).

Resolve’s AI tooling (Neural Engine) includes features like object detection, smart reframing, and more—useful for turning one master video into multiple platform-friendly versions.

For more experimental or generative workflows (and the “how on earth did you do that?” factor), platforms like Runway lean hard into AI-powered video creation and manipulation tools.

Practical tip: use AI effects to get you 80% of the way quickly, then finish manually. That last 20% is where “professional” lives.


3) Colour balancing: AI as your first pass, not your final grade

Colour is the sneaky time thief. AI can help you:

  • balance exposure and white balance,

  • match shots from different cameras,

  • get a consistent “base look” before you do your creative grade.

Resolve’s Neural Engine includes auto colour and colour matching among its AI-driven features.
That’s ideal when you’ve got (say) a Canon on a tripod, a 360 camera doing its own thing, and a phone clip your son grabbed at exactly the wrong colour temperature.

A sensible grading approach:

  1. AI auto-colour / shot match to normalise clips.

  2. Check with scopes (waveform/vectorscope) because your eyes lie.

  3. Then apply your creative grade (warmth, contrast, “sunny day optimism”, etc.).

AI gets you to “not embarrassing” fast. You get yourself to “that looks lovely”.


The “don’t let AI ruin your life” checklist

  • Check proper nouns (people, places, boat names, science terms).

  • Watch for confident hallucinations in captions (“jib” becoming “gym”, “gnav” becoming “navy”…).

  • Keep a house style: caption fonts, sizes, safe margins, brand colours.

  • Don’t overdo effects: if the AI effect is the most interesting part, your story has a problem.

  • Archive your transcript: it becomes blog material, revision notes, worksheet text, and searchable video metadata.

Sunday, 1 March 2026

The Leitner System for Flashcards (aka “Spaced Repetition Without the Spreadsheet Panic”)

Flashcards have a bit of an image problem. They’re either seen as the magic key to top grades, or as tiny bits of card you’ll lovingly create… and then never look at again.
The truth is: flashcards are brilliant, but only if you use them in a way that matches how memory actually works. That’s where the Leitner System comes in — a gloriously simple method that turns “random revision” into a routine that builds long-term recall.

What is the Leitner System?

The Leitner System is a spaced repetition method for flashcards. Instead of revising every card every day (which is exhausting and unnecessary), you sort your cards into boxes (or piles). Cards you find easy are reviewed less often. Cards you struggle with come back more frequently.
In other words: you spend your time where it actually helps.

The basic setup (five boxes, zero drama)

You can do this with proper flashcard boxes, envelopes, a set of labelled food tubs, or five piles on the kitchen table (if your family don’t mind living in a stationery shop).

  • Box 1 (Daily-ish): New cards and ones you got wrong.

  • Box 2: Cards you got right once.

  • Box 3: Cards you got right a few times.

  • Box 4: Cards you mostly know.

  • Box 5 (Victory Lap): Cards you know really well.

A common review schedule looks like:

  • Box 1: every day

  • Box 2: every 2 days

  • Box 3: twice a week

  • Box 4: weekly

  • Box 5: fortnightly (or “occasionally to stay smug”)

You don’t have to be perfect. Consistency beats perfection every time.

How cards move (the bit that makes it work)

This is the whole system in one rule:

  • If you get a card right: it moves up a box (reviewed less often).

  • If you get it wrong: it goes back to Box 1 (reviewed more often).

That’s it. No complicated apps required (though apps can help). The system naturally gives you more practice on weak areas, and gradually reduces the time you spend on things you already know.
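
For the spreadsheet-averse who still like seeing the rule written down, the whole system really is this small. A minimal Python sketch (the intervals approximate the schedule above, with Box 3’s “twice a week” rounded to every 3 days):

```python
# Minimal Leitner sketch: five boxes, cards move up when answered
# correctly and drop back to Box 1 when wrong. Intervals (in days)
# roughly mirror the schedule above.

INTERVALS = {1: 1, 2: 2, 3: 3, 4: 7, 5: 14}  # box -> days between reviews

def review(box: int, correct: bool) -> int:
    """Return the card's new box after one review."""
    if correct:
        return min(box + 1, 5)  # Box 5 is the ceiling (victory lap)
    return 1                    # wrong answers start again

# One card's journey: right, right, wrong, right.
box = 1
for answer in [True, True, False, True]:
    box = review(box, answer)
print(box)  # back in Box 2: the wrong answer cost it two boxes
```

That single `return 1` is doing all the pedagogy: the cards you keep fluffing keep coming back daily until they stop wobbling.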

Why it’s so effective (in plain English)

The Leitner System works because it uses two powerful learning ideas:

  1. Retrieval practice: Trying to pull an answer out of your brain strengthens memory more than re-reading notes.

  2. Spacing: Revisiting information with gaps in between helps your brain store it long-term.

If revision is like building a wall, Leitner isn’t “adding more bricks”. It’s cementing the ones that keep wobbling.

What makes a good flashcard (and what makes a terrible one)

A flashcard should test one clear thing.

Good:

  • “What is osmosis?”

  • “State Newton’s 2nd law.”

  • “What is the difference between ionic and covalent bonding?”

  • “Define opportunity cost.”

Not so good:

  • “Explain EVERYTHING about photosynthesis.”

  • “Tell me all the equations in Physics Paper 1.”

  • “Describe the entire Cold War (but keep it brief).”

If you need paragraphs, that’s not a flashcard — that’s an essay wearing a tiny hat.

How to use Leitner for GCSE and A-Level (fast and practical)

  • Make cards from exam mark schemes (gold dust for wording).

  • Use short answers that match what examiners reward.

  • Mix factual recall (definitions, equations, key terms) with mini-application:

    • “What happens to rate if temperature increases? Why?”

    • “Explain why increasing surface area increases reaction rate.”

And crucially: say the answer out loud before flipping the card. If you “sort of knew it”, that’s your brain negotiating. Make it commit.

Paper vs apps: which is better?

  • Paper cards are brilliant for younger students and hands-on learners. Also: no notifications.

  • Apps (like Anki or Quizlet) make spacing automatic and are great for busy students.

The best system is the one the student will actually use. The second best is the one they’ll use after you’ve removed TikTok from their phone (I’m joking… mostly).

A tiny warning: don’t confuse making cards with learning

Making flashcards can feel productive — and it is a bit — but the learning comes from testing yourself repeatedly over time.
If a student has made 400 gorgeous flashcards and revised none of them, they haven’t created a revision system. They’ve created an arts-and-crafts project.

The simple starting plan

If you want the smallest possible way to start:

  1. Make 20 flashcards from one topic.

  2. Use 3 boxes (New / Learning / Secure).

  3. Revise for 10 minutes a day.

  4. Let the boxes do the thinking.

You’ll be amazed how quickly “I keep forgetting this” turns into “Oh, that one again… fine.”

Saturday, 28 February 2026

Filming in UV, IR and Thermal — when your camera starts lying to you

If normal filming is “point camera at thing, press record, pretend you’re David Attenborough”… then filming in UV, IR and thermal is more like, “point camera at thing, press record, realise physics has other plans, and spend the next hour arguing with a lens cap.”

These worlds are brilliant for science, nature, kit testing and all sorts of “wow” shots — but they come with a special set of problems that don’t show up in ordinary visible light.

1) Your lens may not be telling the truth (or even letting the light through)

UV and thermal are the worst offenders here.

  • UV: Most normal glass blocks a lot of UV. Many lenses transmit almost nothing useful in UV, so you end up with dim footage, mushy contrast, and lots of noise. Some lenses also create bright “hot spots” or weird flare patterns because coatings weren’t designed for UV.

  • IR (near-IR): Many lenses work sort of fine, but you can get hot spots, reduced contrast, and soft corners. Some lenses shift focus slightly between visible and IR.

  • Thermal: This is a completely different optical system (often germanium lenses). You can’t just “use your nice Canon lens” and hope for the best. Also: thermal lenses are expensive enough to make you treat them like crown jewels.

Result: your image can look soft, blotchy, low contrast, or oddly vignetted… even when you did everything “right”.

2) Focusing becomes a dark art (sometimes literally)

Autofocus is usually trained and tuned for visible light. In UV/IR it can hunt, miss, or give up and go home.

  • IR focus shift is a classic trap: you nail focus in visible, flip a filter on, and suddenly everything is slightly off.

  • UV often forces you into manual focus because the image is dim and AF can’t see enough contrast.

  • Thermal cameras don’t focus on “detail” in the same way — they focus on temperature edges. Fine detail can vanish if temperatures are similar.

Practical fix: manual focus, focus peaking, test charts, and checking playback like a nervous parent at sports day.

3) Exposure is harder because the rules change

Your camera’s metering assumes the world behaves normally. In UV/IR/thermal… it does not.

  • IR: foliage can turn bright (the “Wood effect”), skies can go dark, and reflectivity changes. Your histogram becomes a comedian.

  • UV: everything is often dim, and you may need big light levels — but you also need to avoid frying your subject (or your eyeballs).

  • Thermal: exposure is essentially a temperature scale (often with auto-gain). The camera may constantly re-map the scene if something hot enters the frame (hello, hands), making the picture “pulse” as it rescales.

Fix: lock exposure where possible, use manual settings, and for thermal learn to control level/span (or equivalent) rather than thinking like a normal videographer.

4) Reflections and “phantom signals” ruin your day

This one catches people out fast:

  • Thermal reflections are real. Shiny surfaces can reflect heat like mirrors reflect light. You think you’re filming a warm object — but you’re actually filming the reflection of your own body heat. Congratulations: you’ve invented the thermal selfie.

  • IR reflections off water, varnish, glass, and polished surfaces can behave very differently from the way they do in visible light.

  • UV can produce strange flare and internal reflections, especially with filters.

Fix: change angles, use matte finishes, add shields, and in thermal beware of glass (often opaque to long-wave IR) and reflective metal.

5) Colour becomes “false colour” (and your audience will believe it anyway)

In IR and thermal, colour is usually:

  • mapped (false colour palettes),

  • shifted (custom white balance in IR), or

  • manufactured (UV fluorescence where the “colour” is actually visible light emitted by materials).

That’s fine — as long as you’re honest about it.

Fix: label what’s going on (“false colour thermal”, “near-IR”, “UV fluorescence”), keep palettes consistent across a project, and don’t imply “this is the real colour”.

6) Calibration, emissivity, and why thermal numbers can be wrong

Thermal cameras are amazing, but temperature readouts can be very misleading if you don’t account for:

  • Emissivity (how well a surface emits IR radiation)

  • Reflected apparent temperature

  • Distance, humidity, and atmosphere

  • Shiny surfaces (often low emissivity)

You can be “measuring” 70°C and actually be looking at a reflection or a surface that simply doesn’t emit well.

Fix: treat thermal temperature as a measurement with assumptions, not a gospel truth. Use emissivity tables, reference patches (matte tape/paint), and compare against known temperatures where possible.
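
For the curious, the grey-body correction behind those emissivity tables fits in a few lines. This is a deliberately simplified model (it ignores atmosphere, distance, and humidity), but it shows why a shiny surface “measuring” 70°C might be nothing of the sort:

```python
# Simplified grey-body correction, ignoring atmosphere and distance.
# The camera assumes a blackbody, so the apparent temperature mixes
# the object's emission with reflected background radiation:
#   T_app^4 = e * T_obj^4 + (1 - e) * T_refl^4   (temperatures in kelvin)

def true_surface_temp(t_apparent_k: float,
                      emissivity: float,
                      t_reflected_k: float) -> float:
    """Invert the grey-body mix above to recover the object temperature."""
    num = t_apparent_k**4 - (1 - emissivity) * t_reflected_k**4
    return (num / emissivity) ** 0.25

# Same 70 C apparent reading, 20 C surroundings, two different surfaces:
matte = true_surface_temp(343.0, 0.95, 293.0)   # matte tape, e ~ 0.95
shiny = true_surface_temp(343.0, 0.20, 293.0)   # bare metal, e ~ 0.20
print(round(matte - 273.15, 1))  # close to the camera's reading
print(round(shiny - 273.15, 1))  # far hotter than the camera claims
```

With high emissivity the correction barely moves; with low emissivity the same reading implies a surface well over 100°C hotter, which is exactly why the matte-tape reference patch earns its place in the kit bag.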

7) Noise, artefacts, and the “why is the picture crawling?” problem

  • UV/IR often require higher ISO or longer shutter times → noise, banding, hot pixels, motion blur.

  • Thermal sensors can show fixed-pattern noise, drift, and “stuck” pixels; some cameras do periodic NUC calibration (that little click/freeze).

Fix: more light (UV/IR), stable support, sensible shutter speeds, and in thermal plan around calibration pauses.

8) Lighting can be the biggest headache of all

  • UV lighting: needs proper UV sources, but also serious safety thinking (skin/eye protection, avoiding stray UV, and managing reflections).

  • IR lighting: IR illuminators help, but can create hotspots and uneven scenes.

  • Thermal: you don’t “light” it — you manage heat differences. Sometimes you need to create contrast (warm background, cool subject, or vice versa).

Fix: design the scene. In these bands you don’t just film reality — you engineer visibility.

9) Workflow: it’s slower, fussier, and needs more notes

When you mix UV, IR, thermal and normal footage, your edit can descend into chaos unless you keep track.

Fix: slate your clips (“IR 850nm filter”, “UV fluorescence 365nm torch”, “Thermal palette ironbow”), keep consistent LUT/palette choices, and grab some “normal light” reference shots so viewers know what they’re looking at.


The big takeaway

Filming in UV, IR and thermal is like gaining superpowers… with the small downside that your camera becomes a moody scientific instrument. Once you accept that you’re not just filming — you’re measuring, translating, and sometimes politely arguing with physics — it becomes hugely rewarding.