Tuesday, 3 March 2026

R&D — From test tubes to tell-tales: building an electronic burgee


There’s a myth that R&D only happens in white coats, surrounded by fume cupboards and the faint smell of “who left the hotplate on?”. In our world, research and development is just as likely to happen in a buoyancy aid, crouched in the bottom of a dinghy, wondering whether that “slight breeze” is actually a wind shift or just your hat trying to make a break for it.

Because here’s the thing: sailing is basically applied physics with occasional splashing. And once you’ve spent a lifetime building experiments, teaching students to measure the unmeasurable, and persuading sensors to behave themselves, it’s only a matter of time before you look at a masthead burgee and think: Yes… but what if it had data?

The problem: the masthead knows… but the helm doesn’t

A traditional burgee and wind indicator are brilliant—simple, reliable, and they don’t need charging. The only snag is that the truly useful information is happening at the top of the mast, while the people making the decisions are down below doing impressions of a human washing line.

On rivers (hello Thames), wind can be wildly different even a few metres up. Trees, banks, moored boats, bridges—everything interferes. So the burgee might be telling the truth, but it’s telling it to the clouds.

The R&D idea: an electronic burgee you can actually see

So the concept is this:

  • A combined weather vane + anemometer at the masthead

  • Measurements relayed down the mast (wire or wireless)

  • A screen at the foot of the mast showing:

    • Wind direction (relative to the boat’s heading)

    • Wind speed (and ideally gusts)

    • A “trend” indicator (strengthening / easing)

In other words: the masthead’s gossip comes down to deck level, where it can be acted on without a crew member performing an interpretive dance while staring upwards.

Why this matters (especially for learning and coaching)

This isn’t just gadget-lust (although I admit gadgets have a strong pull, like biscuits). An electronic burgee could genuinely help with:

  • Training: showing new helms how wind direction relates to boat trim and course

  • Tacking and gybing: timing manoeuvres when the wind is behaving oddly along the river

  • Sail setup: helping the crew see what changes in downhaul, kicker/gnav, outhaul actually do

  • Safety: spotting gusts building before they smack you like a wet duvet

It turns sailing into something you can observe, measure, and learn from, rather than “feel vaguely and argue about afterwards”.

Practical design thoughts (before we invent a new type of disappointment)

If we’re doing this properly, it needs to survive real sailing life, which includes: water, shocks, UV, vibration, and that special kind of accidental abuse known as “launching”.

Key R&D questions:

  • Power: small battery? solar trickle? or run a thin cable down the mast?

  • Data link: wired (reliable) vs wireless (neat but fussy around masts, water, and interference)

  • Ruggedness: waterproofing, salt resistance (for coastal), and impact survival

  • Display: sunlight readable, simple icons, big numbers, minimal menus

  • Calibration: direction needs a reference (boat heading / mast alignment) so it doesn’t lie politely

The dream is something that’s useful in 3 seconds, not something that requires a PhD and a laptop on a spinnaker bag.
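For anyone tempted to prototype, the maths at the heart of it is genuinely small. Here's a rough Python sketch (all names and thresholds are my own invention, not a finished design): correct the vane reading for mast alignment, then flag a trend by comparing a short rolling average of wind speed against a longer one.

```python
from collections import deque
from statistics import mean

def relative_wind(vane_deg, mast_offset_deg):
    """Vane reading corrected for mast misalignment: 0-359 relative to the bow."""
    return (vane_deg - mast_offset_deg) % 360

class TrendIndicator:
    """Flags strengthening/easing wind by comparing a short rolling
    average of speed against a longer one (window sizes are guesses)."""
    def __init__(self, short=5, long=30, threshold=0.5):
        self.recent = deque(maxlen=short)
        self.history = deque(maxlen=long)
        self.threshold = threshold  # knots of difference before we call it a trend

    def update(self, speed_knots):
        self.recent.append(speed_knots)
        self.history.append(speed_knots)
        diff = mean(self.recent) - mean(self.history)
        if diff > self.threshold:
            return "strengthening"
        if diff < -self.threshold:
            return "easing"
        return "steady"
```

The threshold is there to stop the display flip-flopping between "strengthening" and "easing" every time a duck sneezes.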

The bigger point: R&D is a mindset, not a department

This is why R&D leaks out of science and into sailing. Once you start asking “how could this be better?” you can’t switch it off. The lab and the boat are both environments where small improvements make a big difference—and where the real world happily punishes sloppy thinking.

So yes, we’re looking at building an electronic burgee. Not because we need more screens in our lives… but because if we can make wind behaviour clearer, training quicker, and sailing safer (and maybe reduce the amount of shouting during a tack), then it’s exactly the kind of R&D that’s worth doing.

And if it all goes wrong? Well… we can always fall back on the traditional system: lick a finger, hold it up, and pretend that was “data”.

Monday, 2 March 2026

Using AI tools to help with video editing (subtitles, effects, colour balancing)


There was a time when “post-production” meant tea, biscuits, and the slow realisation that you’d filmed everything… except the bit where you explain the thing. These days, AI has barged into the edit suite like an over-helpful assistant: occasionally brilliant, occasionally confident-and-wrong, but almost always faster than doing it all by hand.

Used sensibly, AI can take the boring, repetitive bits of editing (transcribing, syncing, rough selects, basic colour matching) and give you more time for the bits that actually matter: storytelling, pacing, and making your video look like you planned it that way.

1) Extracting subtitles: the fastest “instant upgrade”

Subtitles are no longer just for accessibility (although that’s reason enough). They’re also for:

  • viewers watching on mute,

  • short-form platforms that love on-screen text,

  • anyone trying to follow a technical explanation at speed.

Typical AI workflow (10–20 minutes instead of 2 hours):

  1. Auto-transcribe your timeline or selected clips.

  2. Clean up names, jargon, and units (AI always butchers “DaVinci”, “spectrophotometer”, and any student’s surname).

  3. Turn transcript into captions and choose a readable style.

  4. Export SRT for YouTube, or burn captions in for socials.
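If you ever need to roll your own export, the SRT format itself is refreshingly plain text: a counter, a timestamp range, the caption text, and a blank line. A minimal sketch (function names are mine):

```python
def srt_timestamp(seconds):
    """Format seconds as the SRT timestamp HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def write_srt(captions, path):
    """captions: list of (start_seconds, end_seconds, text) tuples."""
    with open(path, "w", encoding="utf-8") as f:
        for i, (start, end, text) in enumerate(captions, 1):
            f.write(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n\n")
```

Handy when a tool exports a transcript but not captions, or when you want to batch-fix timings outside the editor.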

If you’re using Adobe Premiere Pro, Speech-to-Text will generate a transcript and create captions directly from it.

If you’re using DaVinci Resolve Studio, the Neural Engine features cover a lot of AI-assisted workflow (and Blackmagic explicitly positions AI as part of Resolve’s toolset).

And if you want a very “edit by typing” approach (especially for talking-head, tuition, or voiceover-heavy videos), Descript is built around transcription-first editing and quick caption creation.

My rule: subtitles from AI are draft subtitles. Always do a quick skim before publishing. One wrong word can turn GCSE chemistry into interpretive poetry.


2) Graphical effects: letting AI do the donkey work

This is where AI can feel like magic—particularly for tasks that used to mean frame-by-frame misery:

  • Smart reframing (turn wide shots into vertical without chopping heads off),

  • Object tracking / region tracking (so text or blur sticks to the right thing),

  • Background removal / masking assistance (especially for quick explainers),

  • Auto “rough cut” helpers (finding pauses, removing silences, assembling selects).

Resolve’s AI tooling (Neural Engine) includes features like object detection, smart reframing, and more—useful for turning one master video into multiple platform-friendly versions.

For more experimental or generative workflows (and the “how on earth did you do that?” factor), platforms like Runway lean hard into AI-powered video creation and manipulation tools.

Practical tip: use AI effects to get you 80% of the way quickly, then finish manually. That last 20% is where “professional” lives.


3) Colour balancing: AI as your first pass, not your final grade

Colour is the sneaky time thief. AI can help you:

  • balance exposure and white balance,

  • match shots from different cameras,

  • get a consistent “base look” before you do your creative grade.

Resolve’s Neural Engine includes auto colour and colour matching among its AI-driven features.
That’s ideal when you’ve got (say) a Canon on a tripod, a 360 camera doing its own thing, and a phone clip your son grabbed at exactly the wrong colour temperature.

A sensible grading approach:

  1. AI auto-colour / shot match to normalise clips.

  2. Check with scopes (waveform/vectorscope) because your eyes lie.

  3. Then apply your creative grade (warmth, contrast, “sunny day optimism”, etc.).

AI gets you to “not embarrassing” fast. You get yourself to “that looks lovely”.


The “don’t let AI ruin your life” checklist

  • Check proper nouns (people, places, boat names, science terms).

  • Watch for confident hallucinations in captions (“jib” becoming “gym”, “gnav” becoming “navy”…).

  • Keep a house style: caption fonts, sizes, safe margins, brand colours.

  • Don’t overdo effects: if the AI effect is the most interesting part, your story has a problem.

  • Archive your transcript: it becomes blog material, revision notes, worksheet text, and searchable video metadata.

Sunday, 1 March 2026

The Leitner System for Flashcards (aka “Spaced Repetition Without the Spreadsheet Panic”)


Flashcards have a bit of an image problem. They’re either seen as the magic key to top grades, or as tiny bits of card you’ll lovingly create… and then never look at again.
The truth is: flashcards are brilliant, but only if you use them in a way that matches how memory actually works. That’s where the Leitner System comes in — a gloriously simple method that turns “random revision” into a routine that builds long-term recall.

What is the Leitner System?

The Leitner System is a spaced repetition method for flashcards. Instead of revising every card every day (which is exhausting and unnecessary), you sort your cards into boxes (or piles). Cards you find easy are reviewed less often. Cards you struggle with come back more frequently.
In other words: you spend your time where it actually helps.

The basic setup (five boxes, zero drama)

You can do this with proper flashcard boxes, envelopes, a set of labelled food tubs, or five piles on the kitchen table (if your family don’t mind living in a stationery shop).

  • Box 1 (Daily-ish): New cards and ones you got wrong.

  • Box 2: Cards you got right once.

  • Box 3: Cards you got right a few times.

  • Box 4: Cards you mostly know.

  • Box 5 (Victory Lap): Cards you know really well.

A common review schedule looks like:

  • Box 1: every day

  • Box 2: every 2 days

  • Box 3: twice a week

  • Box 4: weekly

  • Box 5: fortnightly (or “occasionally to stay smug”)

You don’t have to be perfect. Consistency beats perfection every time.

How cards move (the bit that makes it work)

This is the whole system in one rule:

  • If you get a card right: it moves up a box (reviewed less often).

  • If you get it wrong: it goes back to Box 1 (reviewed more often).

That’s it. No complicated apps required (though apps can help). The system naturally gives you more practice on weak areas, and gradually reduces the time you spend on things you already know.
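If you like seeing rules as code, the whole system really is a few lines. A Python sketch (the box intervals are the schedule above, give or take; the structure is my own):

```python
# Each card is a dict; its "box" number decides how often it comes round.
REVIEW_EVERY = {1: 1, 2: 2, 3: 3, 4: 7, 5: 14}   # box -> gap in days

def review(card, correct):
    """The whole system in one rule: right -> up a box, wrong -> back to Box 1."""
    card["box"] = min(card["box"] + 1, 5) if correct else 1
    return card

def due_today(cards, day):
    """Cards whose box interval divides today's day number are due for review."""
    return [c for c in cards if day % REVIEW_EVERY[c["box"]] == 0]
```

Notice that a Box 5 card you fluff goes all the way back to Box 1 — harsh, but that’s the point: the system doesn’t let you stay smug about something you’ve actually forgotten.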

Why it’s so effective (in plain English)

The Leitner System works because it uses two powerful learning ideas:

  1. Retrieval practice: Trying to pull an answer out of your brain strengthens memory more than re-reading notes.

  2. Spacing: Revisiting information with gaps in between helps your brain store it long-term.

If revision is like building a wall, Leitner isn’t “adding more bricks”. It’s cementing the ones that keep wobbling.

What makes a good flashcard (and what makes a terrible one)

A flashcard should test one clear thing.

Good:

  • “What is osmosis?”

  • “State Newton’s 2nd law.”

  • “What is the difference between ionic and covalent bonding?”

  • “Define opportunity cost.”

Not so good:

  • “Explain EVERYTHING about photosynthesis.”

  • “Tell me all the equations in Physics Paper 1.”

  • “Describe the entire Cold War (but keep it brief).”

If you need paragraphs, that’s not a flashcard — that’s an essay wearing a tiny hat.

How to use Leitner for GCSE and A-Level (fast and practical)

  • Make cards from exam mark schemes (gold dust for wording).

  • Use short answers that match what examiners reward.

  • Mix factual recall (definitions, equations, key terms) with mini-application:

    • “What happens to rate if temperature increases? Why?”

    • “Explain why increasing surface area increases reaction rate.”

And crucially: say the answer out loud before flipping the card. If you “sort of knew it”, that’s your brain negotiating. Make it commit.

Paper vs apps: which is better?

  • Paper cards are brilliant for younger students and hands-on learners. Also: no notifications.

  • Apps (like Anki or Quizlet) make spacing automatic and are great for busy students.

The best system is the one the student will actually use. The second best is the one they’ll use after you’ve removed TikTok from their phone (I’m joking… mostly).

A tiny warning: don’t confuse making cards with learning

Making flashcards can feel productive — and it is a bit — but the learning comes from testing yourself repeatedly over time.
If a student has made 400 gorgeous flashcards and revised none of them, they haven’t created a revision system. They’ve created an arts-and-crafts project.

The simple starting plan

If you want the smallest possible way to start:

  1. Make 20 flashcards from one topic.

  2. Use 3 boxes (New / Learning / Secure).

  3. Revise for 10 minutes a day.

  4. Let the boxes do the thinking.

You’ll be amazed how quickly “I keep forgetting this” turns into “Oh, that one again… fine.”

Saturday, 28 February 2026

Filming in UV, IR and Thermal — when your camera starts lying to you


If normal filming is “point camera at thing, press record, pretend you’re David Attenborough”… then filming in UV, IR and thermal is more like, “point camera at thing, press record, realise physics has other plans, and spend the next hour arguing with a lens cap.”

These worlds are brilliant for science, nature, kit testing and all sorts of “wow” shots — but they come with a special set of problems that don’t show up in ordinary visible light.

1) Your lens may not be telling the truth (or even letting the light through)

UV and thermal are the worst offenders here.

  • UV: Most normal glass blocks a lot of UV. Many lenses transmit almost nothing useful in UV, so you end up with dim footage, mushy contrast, and lots of noise. Some lenses also create bright “hot spots” or weird flare patterns because coatings weren’t designed for UV.

  • IR (near-IR): Many lenses work sort of fine, but you can get hot spots, reduced contrast, and soft corners. Some lenses shift focus slightly between visible and IR.

  • Thermal: This is a completely different optical system (often germanium lenses). You can’t just “use your nice Canon lens” and hope for the best. Also: thermal lenses are expensive enough to make you treat them like crown jewels.

Result: your image can look soft, blotchy, low contrast, or oddly vignetted… even when you did everything “right”.

2) Focusing becomes a dark art (sometimes literally)

Autofocus is usually trained and tuned for visible light. In UV/IR it can hunt, miss, or give up and go home.

  • IR focus shift is a classic trap: you nail focus in visible, flip a filter on, and suddenly everything is slightly off.

  • UV often forces you into manual focus because the image is dim and AF can’t see enough contrast.

  • Thermal cameras don’t focus on “detail” in the same way — they focus on temperature edges. Fine detail can vanish if temperatures are similar.

Practical fix: manual focus, focus peaking, test charts, and checking playback like a nervous parent at sports day.

3) Exposure is harder because the rules change

Your camera’s metering assumes the world behaves normally. In UV/IR/thermal… it does not.

  • IR: foliage can turn bright (the “Wood effect”), skies can go dark, and reflectivity changes. Your histogram becomes a comedian.

  • UV: everything is often dim, and you may need big light levels — but you also need to avoid frying your subject (or your eyeballs).

  • Thermal: exposure is essentially a temperature scale (often with auto-gain). The camera may constantly re-map the scene if something hot enters the frame (hello, hands), making the picture “pulse” as it rescales.

Fix: lock exposure where possible, use manual settings, and for thermal learn to control level/span (or equivalent) rather than thinking like a normal videographer.

4) Reflections and “phantom signals” ruin your day

This one catches people out fast:

  • Thermal reflections are real. Shiny surfaces can reflect heat like mirrors reflect light. You think you’re filming a warm object — but you’re actually filming the reflection of your own body heat. Congratulations: you’ve invented the thermal selfie.

  • IR reflections off water, varnish, glass, and polished surfaces can behave wildly differently from how they do in visible light.

  • UV can produce strange flare and internal reflections, especially with filters.

Fix: change angles, use matte finishes, add shields, and in thermal beware of glass (often opaque to long-wave IR) and reflective metal.

5) Colour becomes “false colour” (and your audience will believe it anyway)

In IR and thermal, colour is usually:

  • mapped (false colour palettes),

  • shifted (custom white balance in IR), or

  • manufactured (UV fluorescence where the “colour” is actually visible light emitted by materials).

That’s fine — as long as you’re honest about it.

Fix: label what’s going on (“false colour thermal”, “near-IR”, “UV fluorescence”), keep palettes consistent across a project, and don’t imply “this is the real colour”.

6) Calibration, emissivity, and why thermal numbers can be wrong

Thermal cameras are amazing, but temperature readouts can be very misleading if you don’t account for:

  • Emissivity (how well a surface emits IR radiation)

  • Reflected apparent temperature

  • Distance, humidity, and atmosphere

  • Shiny surfaces (often low emissivity)

You can be “measuring” 70°C and actually be looking at a reflection or a surface that simply doesn’t emit well.

Fix: treat thermal temperature as a measurement with assumptions, not gospel truth. Use emissivity tables, reference patches (matte tape/paint), and compare against known temperatures where possible.
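For the curious, the standard greybody correction behind those emissivity tables looks like this. A simplified Python sketch — it ignores atmospheric transmission and the camera’s actual spectral band, so treat it as teaching code, not metrology:

```python
def corrected_temperature(t_apparent_c, emissivity, t_reflected_c):
    """Greybody correction: what the camera 'sees' is a blend of the
    object's own emission and reflected background radiation, weighted
    by emissivity (Stefan-Boltzmann T^4 law, all temps in Celsius)."""
    t_app = t_apparent_c + 273.15    # to kelvin
    t_refl = t_reflected_c + 273.15
    # Subtract the reflected contribution, then undo the emissivity weighting.
    t_obj4 = (t_app**4 - (1 - emissivity) * t_refl**4) / emissivity
    return t_obj4 ** 0.25 - 273.15   # back to Celsius
```

Run it with a shiny surface’s emissivity (say 0.5) and an apparent 70 °C reading, and you’ll see why that number can be hiding a considerably hotter object.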

7) Noise, artefacts, and the “why is the picture crawling?” problem

  • UV/IR often require higher ISO or longer shutter times → noise, banding, hot pixels, motion blur.

  • Thermal sensors can show fixed-pattern noise, drift, and “stuck” pixels; some cameras do periodic NUC calibration (that little click/freeze).

Fix: more light (UV/IR), stable support, sensible shutter speeds, and in thermal plan around calibration pauses.

8) Lighting can be the biggest headache of all

  • UV lighting: needs proper UV sources, but also serious safety thinking (skin/eye protection, avoiding stray UV, and managing reflections).

  • IR lighting: IR illuminators help, but can create hotspots and uneven scenes.

  • Thermal: you don’t “light” it — you manage heat differences. Sometimes you need to create contrast (warm background, cool subject, or vice versa).

Fix: design the scene. In these bands you don’t just film reality — you engineer visibility.

9) Workflow: it’s slower, fussier, and needs more notes

When you mix UV, IR, thermal and normal footage, your edit can descend into chaos unless you keep track.

Fix: slate your clips (“IR 850nm filter”, “UV fluorescence 365nm torch”, “Thermal palette ironbow”), keep consistent LUT/palette choices, and grab some “normal light” reference shots so viewers know what they’re looking at.


The big takeaway

Filming in UV, IR and thermal is like gaining superpowers… with the small downside that your camera becomes a moody scientific instrument. Once you accept that you’re not just filming — you’re measuring, translating, and sometimes politely arguing with physics — it becomes hugely rewarding.

Friday, 27 February 2026

Writing “Similar” Tunes Without Nicking Anyone’s Copyright


Every creator eventually asks the same dangerous question:

“Can you make it sound like that… but not, you know… that?”

It’s the musical equivalent of asking a baker for a Colin the Caterpillar that definitely isn’t Colin the Caterpillar, while winking so hard you sprain an eyelid.

The good news: you can write music that sits in the same “vibe family” as a reference track without breaching copyright.
The bad news: you need to do it deliberately — because your brain is a pattern-matching machine with the morals of a magpie.

1) Understand what copyright actually bites

In simple terms (and yes, lawyers will add footnotes the size of Wales):

  • Copyright protects the specific expression of a musical idea (melody/lyrics, and sometimes distinctive combinations).

  • It does not protect general style: genre, instrumentation, tempo, “80s synth vibe”, “epic trailer feel”, “lo-fi beats to mark homework to”.

Most problems happen when:

  • the melody is too close,

  • the hook rhythm is unmistakably similar,

  • or the signature combination of melody + harmony + rhythm ends up “recognisably the same song wearing a moustache”.

2) The “reference track” trick: analyse, don’t imitate

Use reference tracks like a science teacher uses a demo experiment: we’re learning the principles, not pinching the apparatus.

Make a quick checklist from the reference:

  • Tempo range (e.g., 95–105 BPM)

  • Feel (straight, swung, halftime)

  • Instrument palette (pads, plucks, guitars, piano, etc.)

  • Structure (8-bar intro, drop at bar 17, short stinger ending)

  • Energy curve (where it builds, where it breathes)

Crucially: treat the melody as radioactive. Observe it from behind glass.

3) Write a new melody first — then build the vibe around it

If you do harmony + beat + sound design first, your melody will “magically” fall into the same notes as the reference because your ear is already guided there.

Instead:

  1. Write a melody without the reference playing.

  2. Change one major “melody identity marker”:

    • Contour (shape): if theirs rises then drops, do drop then rise.

    • Rhythmic motif: if theirs goes long-long-short-short, make yours short-long-short-long.

    • Starting note: move the home base away from theirs.

A handy sanity check: sing your melody a cappella. If it instantly reminds you of the reference, it’s too close.

4) Harmony: use the same function, not the same chords

Many pop tracks share progressions. That’s fine. But copying the exact progression and the same bass movement and the same rhythmic placement is where you drift into “tribute act”.

Try:

  • Keep the same emotional function (e.g., uplifting → tension → release)

  • Swap in alternatives:

    • Change key

    • Substitute chords (relative minors/majors, add 7ths/9ths)

    • Alter bass notes (inversions)

    • Change harmonic rhythm (how often chords change)

5) Rhythm: borrow the feel, not the fingerprint

Rhythm is often what makes something recognisable.

To keep it safe:

  • Change the drum pattern accents (kick/snare placement)

  • Change the syncopation

  • Change the groove template (straight vs swung vs shuffled)

  • Change the hook rhythm if it matches the reference hook

If your listener can clap your hook and accidentally clap the other song… that’s a warning light.

6) Sound design can suggest a world — but don’t clone a signature sound

Using “a warm analogue pad with gentle chorus” is one thing. Recreating a famous, distinctive patch or a very identifiable production gimmick is another.

Safer moves:

  • Use a different lead instrument

  • Change octave/register

  • Change envelope (pluck vs swell)

  • Change articulation (legato vs staccato)

  • Change effects chain (delay timing/reverb size/modulation)

7) The “three changes” rule (practical, not legal)

For any element that feels close (melody, bass line, hook rhythm), make at least three meaningful changes:

  • Notes

  • Rhythm

  • Starting point

  • Phrasing

  • Register

  • Instrument

  • Tempo/groove

  • Harmony underneath

It’s not a legal shield — it’s a good creative discipline.

8) The safest workflow for content creators (YouTube, blogs, courses)

If you’re making background tracks for videos (science demos, sailing clips, drone footage, etc.), you want music that:

  • supports the message

  • loops cleanly

  • doesn’t distract

  • is unquestionably original

Try this recipe:

  • Two-chord or four-chord loop

  • Simple, original top-line motif (2–4 notes)

  • One “ear candy” element every 8 bars (a little riser, fill, or variation)

  • Minimal melodic density during speech

You’ll end up with something useful, repeatable, and safely yours.

9) The “someone else’s opinion” test

Before you publish:

  • Play your track to someone who knows the reference.

  • Ask: “Does this remind you of that track?”

  • If they say yes immediately, rewrite the melody/hook rhythm.

And if you want to be extra cautious, run it past a music professional or rights specialist — especially if it’s for a paid campaign.

10) The golden rule

If the goal is “make it close enough that people think it’s that”… you’re aiming at the wrong target.

Aim for: same brief, same energy, your own musical DNA.

That way you’re not just avoiding trouble — you’re building a recognisable sound of your own.


Quick “do this / avoid this” checklist

Do

  • Analyse tempo/feel/instruments/structure

  • Write melody away from reference

  • Change contour + rhythm + starting note

  • Use function-based harmony, not copy-paste chords

  • Get a “does it remind you of…?” listener test

Avoid

  • Copying the hook melody (even slightly “reshaped”)

  • Matching hook rhythm + chord rhythm + bass movement

  • Using the same signature sound + same melodic shape

  • Keeping the reference playing while composing the topline