Monday, 2 March 2026

Using AI tools to help with video editing (subtitles, effects, colour balancing)

 



There was a time when “post-production” meant tea, biscuits, and the slow realisation that you’d filmed everything… except the bit where you explain the thing. These days, AI has barged into the edit suite like an over-helpful assistant: occasionally brilliant, occasionally confident-and-wrong, but almost always faster than doing it all by hand.

Used sensibly, AI can take the boring, repetitive bits of editing (transcribing, syncing, rough selects, basic colour matching) and give you more time for the bits that actually matter: storytelling, pacing, and making your video look like you planned it that way.

1) Extracting subtitles: the fastest “instant upgrade”

Subtitles are no longer just for accessibility (although that’s reason enough). They’re also for:

  • viewers watching on mute,

  • short-form platforms that love on-screen text,

  • anyone trying to follow a technical explanation at speed.

Typical AI workflow (10–20 minutes instead of 2 hours):

  1. Auto-transcribe your timeline or selected clips.

  2. Clean up names, jargon, and units (AI always butchers “DaVinci”, “spectrophotometer”, and any student’s surname).

  3. Turn transcript into captions and choose a readable style.

  4. Export SRT for YouTube, or burn captions in for socials.
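Incidentally, the SRT format itself is refreshingly dumb: a number, a timestamp line, the text, a blank line, repeat. If you ever need to patch a transcript programmatically — fixing a recurring butchered name before export, say — a few lines of Python will do it. This is a minimal sketch with invented example segments, not a real transcript:

```python
# Minimal SRT writer: turns (start, end, text) segments into a .srt file.
# The segments below are invented examples, not real transcript data.

def srt_timestamp(seconds):
    """Format seconds as SRT's HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def write_srt(segments, path):
    with open(path, "w", encoding="utf-8") as f:
        for i, (start, end, text) in enumerate(segments, 1):
            f.write(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n\n")

segments = [
    (0.0, 2.5, "Welcome back to the channel."),
    (2.5, 6.0, "Today: colour grading in DaVinci Resolve."),
]
write_srt(segments, "captions.srt")
```

The same loop in reverse (read, find-and-replace the jargon, write back) is how you'd batch-fix "Da Vinky" across forty files.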

If you’re using Adobe Premiere Pro, Speech-to-Text will generate a transcript and create captions directly from it.

If you’re using DaVinci Resolve Studio, the Neural Engine features cover a lot of the AI-assisted workflow (Blackmagic explicitly positions AI as part of Resolve’s toolset).

And if you want a very “edit by typing” approach (especially for talking-head, tuition, or voiceover-heavy videos), Descript is built around transcription-first editing and quick caption creation.

My rule: subtitles from AI are draft subtitles. Always do a quick skim before publishing. One wrong word can turn GCSE chemistry into interpretive poetry.


2) Graphical effects: letting AI do the donkey work

This is where AI can feel like magic—particularly for tasks that used to mean frame-by-frame misery:

  • Smart reframing (turn wide shots into vertical without chopping heads off),

  • Object tracking / region tracking (so text or blur sticks to the right thing),

  • Background removal / masking assistance (especially for quick explainers),

  • Auto “rough cut” helpers (finding pauses, removing silences, assembling selects).
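That “finding pauses” trick is less magical than it sounds: scan the audio’s loudness and flag the stretches that stay below a threshold. Here’s a toy NumPy sketch of the idea — synthetic test signal, fixed 50 ms windows, and nowhere near as robust as the real tools:

```python
import numpy as np

def find_silences(samples, rate, threshold=0.01, min_len=0.5):
    """Return (start_s, end_s) spans whose RMS stays below threshold.

    samples: mono float audio in [-1, 1]; rate: samples per second.
    Works on 50 ms windows — crude, but enough to show the idea.
    """
    win = int(rate * 0.05)
    quiet = []
    for i in range(0, len(samples) - win, win):
        rms = np.sqrt(np.mean(samples[i:i + win] ** 2))
        quiet.append(rms < threshold)
    spans, start = [], None
    for idx, q in enumerate(quiet):
        if q and start is None:
            start = idx
        elif not q and start is not None:
            spans.append((start, idx))
            start = None
    if start is not None:
        spans.append((start, len(quiet)))
    # convert window indices to seconds, keep only long-enough pauses
    to_s = win / rate
    return [(a * to_s, b * to_s) for a, b in spans if (b - a) * to_s >= min_len]

# Demo: 1 s of tone, 1 s of silence, 1 s of tone (invented signal)
rate = 16_000
t = np.arange(rate) / rate
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
audio = np.concatenate([tone, np.zeros(rate), tone])
print(find_silences(audio, rate))  # roughly [(1.0, 2.0)]
```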

Resolve’s AI tooling (Neural Engine) includes features like object detection, smart reframing, and more—useful for turning one master video into multiple platform-friendly versions.

For more experimental or generative workflows (and the “how on earth did you do that?” factor), platforms like Runway lean hard into AI-powered video creation and manipulation tools.

Practical tip: use AI effects to get you 80% of the way quickly, then finish manually. That last 20% is where “professional” lives.


3) Colour balancing: AI as your first pass, not your final grade

Colour is the sneaky time thief. AI can help you:

  • balance exposure and white balance,

  • match shots from different cameras,

  • get a consistent “base look” before you do your creative grade.

Resolve’s Neural Engine includes auto colour and colour matching among its AI-driven features.
That’s ideal when you’ve got (say) a Canon on a tripod, a 360 camera doing its own thing, and a phone clip your son grabbed at exactly the wrong colour temperature.

A sensible grading approach:

  1. AI auto-colour / shot match to normalise clips.

  2. Check with scopes (waveform/vectorscope) because your eyes lie.

  3. Then apply your creative grade (warmth, contrast, “sunny day optimism”, etc.).
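For the curious: a lot of “auto colour” starts from simple statistics. The classic gray-world assumption says the average of a scene should come out neutral grey, so you scale each channel until it does. A toy NumPy version — this is the principle, emphatically not what Resolve actually does under the bonnet:

```python
import numpy as np

def gray_world_balance(img):
    """Gray-world white balance: scale R/G/B so their means match.

    img: float array, shape (H, W, 3), values in [0, 1].
    Assumes the scene averages to neutral grey — a first pass, not a grade.
    """
    means = img.reshape(-1, 3).mean(axis=0)     # per-channel mean
    gain = means.mean() / means                 # push each channel to the overall mean
    return np.clip(img * gain, 0.0, 1.0)

# Demo: a flat "too warm" frame (red lifted, blue crushed — invented values)
warm = np.full((4, 4, 3), [0.6, 0.5, 0.4])
balanced = gray_world_balance(warm)
print(balanced[0, 0])  # all three channels now sit at the overall mean
```

It fails exactly when you’d expect (sunsets, single-colour scenes), which is another reason the scopes check in step 2 matters.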

AI gets you to “not embarrassing” fast. You get yourself to “that looks lovely”.


The “don’t let AI ruin your life” checklist

  • Check proper nouns (people, places, boat names, science terms).

  • Watch for confident hallucinations in captions (“jib” becoming “gym”, “gnav” becoming “navy”…).

  • Keep a house style: caption fonts, sizes, safe margins, brand colours.

  • Don’t overdo effects: if the AI effect is the most interesting part, your story has a problem.

  • Archive your transcript: it becomes blog material, revision notes, worksheet text, and searchable video metadata.

Sunday, 1 March 2026

The Leitner System for Flashcards (aka “Spaced Repetition Without the Spreadsheet Panic”)

 



Flashcards have a bit of an image problem. They’re either seen as the magic key to top grades, or as tiny bits of card you’ll lovingly create… and then never look at again.
The truth is: flashcards are brilliant, but only if you use them in a way that matches how memory actually works. That’s where the Leitner System comes in — a gloriously simple method that turns “random revision” into a routine that builds long-term recall.

What is the Leitner System?

The Leitner System is a spaced repetition method for flashcards. Instead of revising every card every day (which is exhausting and unnecessary), you sort your cards into boxes (or piles). Cards you find easy are reviewed less often. Cards you struggle with come back more frequently.
In other words: you spend your time where it actually helps.

The basic setup (five boxes, zero drama)

You can do this with proper flashcard boxes, envelopes, a set of labelled food tubs, or five piles on the kitchen table (if your family don’t mind living in a stationery shop).

  • Box 1 (Daily-ish): New cards and ones you got wrong.

  • Box 2: Cards you got right once.

  • Box 3: Cards you got right a few times.

  • Box 4: Cards you mostly know.

  • Box 5 (Victory Lap): Cards you know really well.

A common review schedule looks like:

  • Box 1: every day

  • Box 2: every 2 days

  • Box 3: twice a week

  • Box 4: weekly

  • Box 5: fortnightly (or “occasionally to stay smug”)

You don’t have to be perfect. Consistency beats perfection every time.

How cards move (the bit that makes it work)

This is the whole system in one rule:

  • If you get a card right: it moves up a box (reviewed less often).

  • If you get it wrong: it goes back to Box 1 (reviewed more often).

That’s it. No complicated apps required (though apps can help). The system naturally gives you more practice on weak areas, and gradually reduces the time you spend on things you already know.
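In fact, the whole system fits in a few lines of code. Here’s a sketch of the bookkeeping — box intervals from the schedule above, with “twice a week” approximated as every 3 days:

```python
# Leitner bookkeeping: right → up a box, wrong → back to Box 1.
# Review intervals in days mirror the schedule above (Box 3's
# "twice a week" approximated as every 3 days).
INTERVALS = {1: 1, 2: 2, 3: 3, 4: 7, 5: 14}
TOP_BOX = 5

def review(card, correct):
    """card is a dict like {'q': ..., 'a': ..., 'box': 1}."""
    if correct:
        card["box"] = min(card["box"] + 1, TOP_BOX)
    else:
        card["box"] = 1  # straight back to daily review
    return card

def due_every(card):
    return INTERVALS[card["box"]]

card = {"q": "What is osmosis?", "a": "...", "box": 1}
review(card, correct=True)    # box 2: every 2 days
review(card, correct=True)    # box 3
review(card, correct=False)   # wrong — back to box 1
print(card["box"], due_every(card))  # 1 1
```

Apps like Anki run a fancier scheduler, but this is the skeleton underneath.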

Why it’s so effective (in plain English)

The Leitner System works because it uses two powerful learning ideas:

  1. Retrieval practice: Trying to pull an answer out of your brain strengthens memory more than re-reading notes.

  2. Spacing: Revisiting information with gaps in between helps your brain store it long-term.

If revision is like building a wall, Leitner isn’t “adding more bricks”. It’s cementing the ones that keep wobbling.

What makes a good flashcard (and what makes a terrible one)

A flashcard should test one clear thing.

Good:

  • “What is osmosis?”

  • “State Newton’s 2nd law.”

  • “What is the difference between ionic and covalent bonding?”

  • “Define opportunity cost.”

Not so good:

  • “Explain EVERYTHING about photosynthesis.”

  • “Tell me all the equations in Physics Paper 1.”

  • “Describe the entire Cold War (but keep it brief).”

If you need paragraphs, that’s not a flashcard — that’s an essay wearing a tiny hat.

How to use Leitner for GCSE and A-Level (fast and practical)

  • Make cards from exam mark schemes (gold dust for wording).

  • Use short answers that match what examiners reward.

  • Mix factual recall (definitions, equations, key terms) with mini-application:

    • “What happens to rate if temperature increases? Why?”

    • “Explain why increasing surface area increases reaction rate.”

And crucially: say the answer out loud before flipping the card. If you “sort of knew it”, that’s your brain negotiating. Make it commit.

Paper vs apps: which is better?

  • Paper cards are brilliant for younger students and hands-on learners. Also: no notifications.

  • Apps (like Anki or Quizlet) make spacing automatic and are great for busy students.

The best system is the one the student will actually use. The second best is the one they’ll use after you’ve removed TikTok from their phone (I’m joking… mostly).

A tiny warning: don’t confuse making cards with learning

Making flashcards can feel productive — and it is a bit — but the learning comes from testing yourself repeatedly over time.
If a student has made 400 gorgeous flashcards and revised none of them, they haven’t created a revision system. They’ve created an arts-and-crafts project.

The simple starting plan

If you want the smallest possible way to start:

  1. Make 20 flashcards from one topic.

  2. Use 3 boxes (New / Learning / Secure).

  3. Revise for 10 minutes a day.

  4. Let the boxes do the thinking.

You’ll be amazed how quickly “I keep forgetting this” turns into “Oh, that one again… fine.”

Saturday, 28 February 2026

Filming in UV, IR and Thermal — when your camera starts lying to you

 



If normal filming is “point camera at thing, press record, pretend you’re David Attenborough”… then filming in UV, IR and thermal is more like, “point camera at thing, press record, realise physics has other plans, and spend the next hour arguing with a lens cap.”

These worlds are brilliant for science, nature, kit testing and all sorts of “wow” shots — but they come with a special set of problems that don’t show up in ordinary visible light.

1) Your lens may not be telling the truth (or even letting the light through)

UV and thermal are the worst offenders here.

  • UV: Most normal glass blocks a lot of UV. Many lenses transmit almost nothing useful in UV, so you end up with dim footage, mushy contrast, and lots of noise. Some lenses also create bright “hot spots” or weird flare patterns because coatings weren’t designed for UV.

  • IR (near-IR): Many lenses work sort of fine, but you can get hot spots, reduced contrast, and soft corners. Some lenses shift focus slightly between visible and IR.

  • Thermal: This is a completely different optical system (often germanium lenses). You can’t just “use your nice Canon lens” and hope for the best. Also: thermal lenses are expensive enough to make you treat them like crown jewels.

Result: your image can look soft, blotchy, low contrast, or oddly vignetted… even when you did everything “right”.

2) Focusing becomes a dark art (sometimes literally)

Autofocus is usually trained and tuned for visible light. In UV/IR it can hunt, miss, or give up and go home.

  • IR focus shift is a classic trap: you nail focus in visible, flip a filter on, and suddenly everything is slightly off.

  • UV often forces you into manual focus because the image is dim and AF can’t see enough contrast.

  • Thermal cameras don’t focus on “detail” in the same way — they focus on temperature edges. Fine detail can vanish if temperatures are similar.

Practical fix: manual focus, focus peaking, test charts, and checking playback like a nervous parent at sports day.

3) Exposure is harder because the rules change

Your camera’s metering assumes the world behaves normally. In UV/IR/thermal… it does not.

  • IR: foliage can turn bright (the “Wood effect”), skies can go dark, and reflectivity changes. Your histogram becomes a comedian.

  • UV: everything is often dim, and you may need big light levels — but you also need to avoid frying your subject (or your eyeballs).

  • Thermal: exposure is essentially a temperature scale (often with auto-gain). The camera may constantly re-map the scene if something hot enters the frame (hello, hands), making the picture “pulse” as it rescales.

Fix: lock exposure where possible, use manual settings, and for thermal learn to control level/span (or equivalent) rather than thinking like a normal videographer.
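If “level/span” sounds abstract: it’s just a window onto the temperature data. Level is the centre of the window, span is its width, and everything outside clips. A toy mapping from temperatures to 8-bit display values (invented numbers) shows why locking it stops the pulsing:

```python
import numpy as np

def apply_level_span(temps_c, level, span):
    """Map temperatures (degrees C) to 0-255 display values.

    level: centre of the window; span: its width. Anything outside
    clips — which is exactly why a hot hand entering frame no longer
    rescales the whole image once level/span are locked.
    """
    lo, hi = level - span / 2, level + span / 2
    scaled = (np.asarray(temps_c, dtype=float) - lo) / (hi - lo)
    return np.clip(scaled * 255, 0, 255).astype(np.uint8)

# Window locked at 20 degrees +/- 10: 10 -> black, 30 -> white,
# and a 36-degree hand simply clips instead of re-mapping the scene.
print(apply_level_span([10, 20, 30, 36], level=20, span=20))
```

With auto-gain, that 36° hand would instead stretch the window and visibly dim everything else.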

4) Reflections and “phantom signals” ruin your day

This one catches people out fast:

  • Thermal reflections are real. Shiny surfaces can reflect heat like mirrors reflect light. You think you’re filming a warm object — but you’re actually filming the reflection of your own body heat. Congratulations: you’ve invented the thermal selfie.

  • IR reflections off water, varnish, glass, and polished surfaces can be wildly different from what you see in visible light.

  • UV can produce strange flare and internal reflections, especially with filters.

Fix: change angles, use matte finishes, add shields, and in thermal beware of glass (often opaque to long-wave IR) and reflective metal.

5) Colour becomes “false colour” (and your audience will believe it anyway)

In IR and thermal, colour is usually:

  • mapped (false colour palettes),

  • shifted (custom white balance in IR), or

  • manufactured (UV fluorescence where the “colour” is actually visible light emitted by materials).

That’s fine — as long as you’re honest about it.

Fix: label what’s going on (“false colour thermal”, “near-IR”, “UV fluorescence”), keep palettes consistent across a project, and don’t imply “this is the real colour”.

6) Calibration, emissivity, and why thermal numbers can be wrong

Thermal cameras are amazing, but temperature readouts can be very misleading if you don’t account for:

  • Emissivity (how well a surface emits IR radiation)

  • Reflected apparent temperature

  • Distance, humidity, and atmosphere

  • Shiny surfaces (often low emissivity)

You can be “measuring” 70°C and actually be looking at a reflection or a surface that simply doesn’t emit well.

Fix: treat thermal temperature as a measurement with assumptions, not a gospel truth. Use emissivity tables, reference patches (matte tape/paint), and compare against known temperatures where possible.
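For the curious, the standard correction is simple enough to sketch. Using the Stefan-Boltzmann T⁴ proxy for radiance and ignoring atmosphere entirely (so: a simplified model, not any vendor’s actual algorithm), the measured signal is emissivity × object radiance plus (1 − emissivity) × reflected radiance, and you solve backwards:

```python
# Toy emissivity correction, using the Stefan-Boltzmann T^4 proxy for
# radiance and ignoring atmosphere — a simplified model of what thermal
# cameras do internally, not any vendor's actual algorithm.

def corrected_temp_c(apparent_c, emissivity, reflected_c):
    """Estimate true surface temperature from an apparent reading.

    Measured signal = e * W(object) + (1 - e) * W(reflected),
    with W proportional to absolute temperature to the 4th power.
    """
    to_k4 = lambda c: (c + 273.15) ** 4
    w_obj = (to_k4(apparent_c) - (1 - emissivity) * to_k4(reflected_c)) / emissivity
    return w_obj ** 0.25 - 273.15

# A camera left at e=1.0 reads 40 C off a shiny low-emissivity surface
# (e ~ 0.3) reflecting a 20 C room: the surface is actually much hotter.
print(round(corrected_temp_c(40.0, 0.3, 20.0), 1))
```

Same reading, very different answer — which is the whole “measurement with assumptions” point.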

7) Noise, artefacts, and the “why is the picture crawling?” problem

  • UV/IR often require higher ISO or longer shutter times → noise, banding, hot pixels, motion blur.

  • Thermal sensors can show fixed-pattern noise, drift, and “stuck” pixels; some cameras do periodic NUC (non-uniformity correction) calibration (that little click/freeze).

Fix: more light (UV/IR), stable support, sensible shutter speeds, and in thermal plan around calibration pauses.

8) Lighting can be the biggest headache of all

  • UV lighting: needs proper UV sources, but also serious safety thinking (skin/eye protection, avoiding stray UV, and managing reflections).

  • IR lighting: IR illuminators help, but can create hotspots and uneven scenes.

  • Thermal: you don’t “light” it — you manage heat differences. Sometimes you need to create contrast (warm background, cool subject, or vice versa).

Fix: design the scene. In these bands you don’t just film reality — you engineer visibility.

9) Workflow: it’s slower, fussier, and needs more notes

When you mix UV, IR, thermal and normal footage, your edit can descend into chaos unless you keep track.

Fix: slate your clips (“IR 850nm filter”, “UV fluorescence 365nm torch”, “Thermal palette ironbow”), keep consistent LUT/palette choices, and grab some “normal light” reference shots so viewers know what they’re looking at.


The big takeaway

Filming in UV, IR and thermal is like gaining superpowers… with the small downside that your camera becomes a moody scientific instrument. Once you accept that you’re not just filming — you’re measuring, translating, and sometimes politely arguing with physics — it becomes hugely rewarding.

Friday, 27 February 2026

Writing “Similar” Tunes Without Nicking Anyone’s Copyright

 



Every creator eventually asks the same dangerous question:

“Can you make it sound like that… but not, you know… that?”

It’s the musical equivalent of asking a baker for a Colin the Caterpillar that definitely isn’t Colin the Caterpillar, while winking so hard you sprain an eyelid.

The good news: you can write music that sits in the same “vibe family” as a reference track without breaching copyright.
The bad news: you need to do it deliberately — because your brain is a pattern-matching machine with the morals of a magpie.

1) Understand what copyright actually bites

In simple terms (and yes, lawyers will add footnotes the size of Wales):

  • Copyright protects the specific expression of a musical idea (melody/lyrics, and sometimes distinctive combinations).

  • It does not protect general style: genre, instrumentation, tempo, “80s synth vibe”, “epic trailer feel”, “lo-fi beats to mark homework to”.

Most problems happen when:

  • the melody is too close,

  • the hook rhythm is unmistakably similar,

  • or the signature combination of melody + harmony + rhythm ends up “recognisably the same song wearing a moustache”.

2) The “reference track” trick: analyse, don’t imitate

Use reference tracks like a science teacher uses a demo experiment: we’re learning the principles, not pinching the apparatus.

Make a quick checklist from the reference:

  • Tempo range (e.g., 95–105 BPM)

  • Feel (straight, swung, halftime)

  • Instrument palette (pads, plucks, guitars, piano, etc.)

  • Structure (8-bar intro, drop at bar 17, short stinger ending)

  • Energy curve (where it builds, where it breathes)

Crucially: treat the melody as radioactive. Observe it from behind glass.

3) Write a new melody first — then build the vibe around it

If you do harmony + beat + sound design first, your melody will “magically” fall into the same notes as the reference because your ear is already guided there.

Instead:

  1. Write a melody without the reference playing.

  2. Change one major “melody identity marker”:

    • Contour (shape): if theirs rises then drops, do drop then rise.

    • Rhythmic motif: if theirs goes long-long-short-short, make yours short-long-short-long.

    • Starting note: move the home base away from theirs.

A handy sanity check: sing your melody a cappella. If it instantly reminds you of the reference, it’s too close.

4) Harmony: use the same function, not the same chords

Many pop tracks share progressions. That’s fine. But copying the exact progression and the same bass movement and the same rhythmic placement is where you drift into “tribute act”.

Try:

  • Keep the same emotional function (e.g., uplifting → tension → release)

  • Swap in alternatives:

    • Change key

    • Substitute chords (relative minors/majors, add 7ths/9ths)

    • Alter bass notes (inversions)

    • Change harmonic rhythm (how often chords change)

5) Rhythm: borrow the feel, not the fingerprint

Rhythm is often what makes something recognisable.

To keep it safe:

  • Change the drum pattern accents (kick/snare placement)

  • Change the syncopation

  • Change the groove template (straight vs swung vs shuffled)

  • Change the hook rhythm if it matches the reference hook

If your listener can clap your hook and accidentally clap the other song… that’s a warning light.

6) Sound design can suggest a world — but don’t clone a signature sound

Using “a warm analogue pad with gentle chorus” is one thing. Recreating a famous, distinctive patch or a very identifiable production gimmick is another.

Safer moves:

  • Use a different lead instrument

  • Change octave/register

  • Change envelope (pluck vs swell)

  • Change articulation (legato vs staccato)

  • Change effects chain (delay timing/reverb size/modulation)

7) The “three changes” rule (practical, not legal)

For any element that feels close (melody, bass line, hook rhythm), make at least three meaningful changes:

  • Notes

  • Rhythm

  • Starting point

  • Phrasing

  • Register

  • Instrument

  • Tempo/groove

  • Harmony underneath

It’s not a legal shield — it’s a good creative discipline.

8) The safest workflow for content creators (YouTube, blogs, courses)

If you’re making background tracks for videos (science demos, sailing clips, drone footage, etc.), you want music that:

  • supports the message

  • loops cleanly

  • doesn’t distract

  • is unquestionably original

Try this recipe:

  • Two-chord or four-chord loop

  • Simple, original top-line motif (2–4 notes)

  • One “ear candy” element every 8 bars (a little riser, fill, or variation)

  • Minimal melodic density during speech

You’ll end up with something useful, repeatable, and safely yours.

9) The “someone else’s opinion” test

Before you publish:

  • Play your track to someone who knows the reference.

  • Ask: “Does this remind you of that track?”

  • If they say yes immediately, rewrite the melody/hook rhythm.

And if you want to be extra cautious, run it past a music professional or rights specialist — especially if it’s for a paid campaign.

10) The golden rule

If the goal is “make it close enough that people think it’s that”… you’re aiming at the wrong target.

Aim for: same brief, same energy, your own musical DNA.

That way you’re not just avoiding trouble — you’re building a recognisable sound of your own.


Quick “do this / avoid this” checklist

Do

  • Analyse tempo/feel/instruments/structure

  • Write melody away from reference

  • Change contour + rhythm + starting note

  • Use function-based harmony, not copy-paste chords

  • Get a “does it remind you of…?” listener test

Avoid

  • Copying the hook melody (even slightly “reshaped”)

  • Matching hook rhythm + chord rhythm + bass movement

  • Using the same signature sound + same melodic shape

  • Keeping the reference playing while composing the topline

Thursday, 26 February 2026

Filming on a boat with 1–2 (maybe 3) 360 cameras

 



(aka “How to record everything… including the bit where you forgot to press record.”)

There’s a special kind of confidence that comes from mounting a 360 camera on a boat. It whispers: “Relax — you’ll capture everything.” And then, five minutes later, you discover you captured everything except the moment you actually wanted… because the lens is covered in spray, the mount is slowly drooping, and the camera is politely overheating in the sun like a tourist in Benidorm.

Still: 360 cameras are genuinely brilliant afloat. They’re the closest thing we have to a time machine for sailing videos — you can choose the shot later, follow the action, and pull out angles you didn’t even know you needed.

Here’s how I’d set up one, two, or three 360 cameras on a boat, without turning the rig into a maritime hedgehog.


1) The big idea: what story are you filming?

Before you stick cameras everywhere, decide what you want to show:

  • Training / tuition: helm inputs, sail trim, crew coordination, manoeuvres, mistakes (and fixes).

  • Vlog / family day: faces, reactions, “we definitely meant to do that”.

  • Racing: starts, mark roundings, wind shifts, traffic, rules situations.

  • Safety boat / filming boat: wider context, rescue practice, coaching perspective.

That “story” determines where cameras go. Otherwise you end up with 4 hours of beautifully stabilised… deck non-slip.


2) The golden rules (before we talk camera count)

Keep it safe and boring

  • Don’t mount anything where it can snag sheets, trap a toe, or become a projectile in a capsize.

  • Use tethers. Always. If it’s not tethered, it’s already on the bottom.

  • Avoid blocking escape routes (especially in dinghies).

Protect the lenses (because water loves lenses)

  • A tiny smear on a 360 lens becomes a full panoramic smear.

  • Use lens guards if you can, and carry a microfibre cloth in a dry pocket.

  • Plan “wipe moments” (before a start, after a tack, after a dunking).

Audio is half the film

On-water audio is… enthusiastic. Wind and spray will try to remix your masterpiece into a sea shanty.

  • If you can, use a separate mic (even a phone in a dry bag) for narration later.

  • If relying on camera audio: use wind reduction settings, and accept that your best dialogue may be “WHEEEEE” and “WHO PUT THAT BUOY THERE?”

Power and storage: plan like a pessimist

  • Use big cards, format beforehand.

  • If you’re filming a long session, consider external power (but keep cables tidy and waterproofed).

  • Nothing ruins morale like “Storage Full” when you’ve just nailed the perfect gybe.


3) Best positions (the “invisible crew member” approach)

Think of a 360 camera as a crew member who never complains, never needs tea, and never says “are we there yet?” — but does need a good seat.

The best all-rounder: stern pole

  • Great view of helm + crew + sails + wake.

  • Works for dinghies and small powerboats.

  • Keeps it out of the way (mostly).

  • Bonus: the boat looks fast even when it isn’t.

The most useful for learning: centreline / cockpit

  • Shows hands, ropes, sail trim, body movement.

  • Risk: it can get kicked, soaked, or sat on by someone who swears they “didn’t see it”.

The cinematic shot: bow / forward-facing

  • Gorgeous for scenery, waves, “we’re going on an adventure”.

  • Less useful for sail handling unless you also capture cockpit.

The “coach view”: mast / high mount

  • Fantastic overview of sail shape and manoeuvres.

  • Harder to mount securely and safely.

  • Check it won’t interfere with rigging, sails, or launching/recovery.


4) Setups that actually work

Setup A: One 360 camera (the “simple and sensible” option)

Mount: stern pole or high-ish aft position
Goal: capture everything, choose angles later

What you’ll get:

  • Wide context of manoeuvres.

  • Crew interaction.

  • Decent “TV style” reframes in editing.

Tip: With one camera, keep it stable, high, and centred. A droopy pole turns your film into “found footage”.


Setup B: Two 360 cameras (the sweet spot)

If you do anything regularly (tuition, vlogging, racing), two cameras is where it starts feeling proper.

Camera 1 (main): stern pole
Camera 2 (detail): cockpit/centreline OR forward/bow

Why it’s great:

  • Stern cam covers the “whole boat story”.

  • Second cam gives you detail: hands, jib work, expressions, the moment someone says “I thought you had it”.

Pro tip: Synchronise with a simple method:

  • Start both, then do a loud clap / tap a winch / shout something unmistakable (“MARK!” works) so you can line up audio in editing.
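If you fancy automating that, a clap is exactly the sort of thing cross-correlation finds: slide one audio track against the other and the peak tells you the offset. A NumPy sketch, assuming both tracks are mono float arrays at the same sample rate (signals invented for the demo):

```python
import numpy as np

def sync_offset_seconds(track_a, track_b, rate):
    """Return the lag (s) such that track_a[t] aligns with track_b[t - lag].

    Cross-correlates the two signals; the peak position gives the
    offset of the shared event (the clap) between recordings.
    """
    corr = np.correlate(track_a, track_b, mode="full")
    lag = np.argmax(corr) - (len(track_b) - 1)
    return lag / rate

# Demo: same click, half a second later on camera 2 (invented signals)
rate = 1_000
click = np.hanning(50)
a = np.zeros(3 * rate); a[1000:1050] = click   # clap at 1.0 s
b = np.zeros(3 * rate); b[1500:1550] = click   # clap at 1.5 s
print(sync_offset_seconds(a, b, rate))  # -0.5: camera 2 started 0.5 s earlier
```

The editing software does something similar when you ask it to “sync by audio”, so the manual clap still earns its keep.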


Setup C: Three 360 cameras (the “small film crew”)

This is for coaching days, race analysis, or “we’re making a proper episode”.

Camera 1: stern pole (overall)
Camera 2: cockpit (technique)
Camera 3: bow or mast (context / sail shape / scenery)

The danger:

  • More cameras = more battery management, more wiping, more mounts, more things to forget.

  • You can absolutely spend the whole day filming and forget to sail.

Best use case: when you want to teach from the footage later (or make a detailed breakdown video).


5) Settings: keep it reliable, not fancy

Without getting brand-specific, the “boat basics” are:

  • Resolution: high enough for reframing (because you’ll crop a lot).

  • Frame rate: 30 fps is fine for most; 60 fps helps for fast action and smoother reframes.

  • Stabilisation: yes (boats wobble; the audience shouldn’t).

  • Horizon lock/levelling: very helpful if your camera supports it.

  • Exposure: auto is usually okay; watch out for bright water and dark faces. If you can, lock exposure once it looks good.

And please, for the love of sanity:
Do a 10-second test clip before launching.
It’s the difference between content creation and interpretive dance with a memory card.


6) Editing workflow (so it doesn’t eat your life)

360 footage is wonderful… and also enormous.

A workable workflow:

  1. Dump files into folders by date + camera position (Stern / Cockpit / Bow).

  2. Make short “selects” first (best moments).

  3. Reframe after you’ve chosen the story beats.

  4. Use on-screen labels for learning videos (“Tack: jib released late” / “Too much tiller”).

  5. Keep cuts tighter than you think. Water footage can be hypnotic — in the way watching a washing machine can be hypnotic.
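Step 1 can also be a one-off script rather than a drag-and-drop marathon. This sketch files clips by shoot date plus a position tag taken from the filename — the naming convention (“stern_0012.mp4”) is invented, so adapt it to whatever you actually use:

```python
# Sort clips into Date/Position folders. Assumes you've renamed files
# with a position prefix (e.g. "stern_0012.mp4") — an invented
# convention, not anything a camera does by default.
from datetime import datetime
from pathlib import Path
import shutil

def organise(src_dir, dest_dir):
    src, dest = Path(src_dir), Path(dest_dir)
    for clip in src.glob("*.mp4"):
        position = clip.stem.split("_")[0] or "unsorted"
        # use the file's modified time as the shoot date
        shot = datetime.fromtimestamp(clip.stat().st_mtime).strftime("%Y-%m-%d")
        target = dest / shot / position.capitalize()
        target.mkdir(parents=True, exist_ok=True)
        shutil.move(str(clip), str(target / clip.name))

# organise("SD_CARD_DUMP", "Footage")   # e.g. Footage/2026-02-26/Stern/...
```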


7) Quick checklist for the slipway

  • Batteries charged + spares

  • Cards formatted

  • Lenses cleaned + cloth packed

  • Mounts tight + tethers attached

  • Cameras started + recording confirmed

  • Quick sync clap

  • “If we capsize, nothing becomes a spear” check


Wrap-up

A single 360 camera is like having an extra pair of eyes. Two is like having a small production team. Three is like you’ve decided you’re Netflix now — which is fine, as long as you still remember to enjoy the sail.

If you want the simplest recommendation:

  • Start with one stern pole camera.

  • Add a cockpit camera when you want better learning footage.

  • Add a bow/mast camera only when you’ve got the workflow nailed.

Because the goal is to capture the day — not to spend the day arguing with batteries.