Saturday, 28 February 2026

Filming in UV, IR and Thermal — when your camera starts lying to you

 



If normal filming is “point camera at thing, press record, pretend you’re David Attenborough”… then filming in UV, IR and thermal is more like, “point camera at thing, press record, realise physics has other plans, and spend the next hour arguing with a lens cap.”

These worlds are brilliant for science, nature, kit testing and all sorts of “wow” shots — but they come with a special set of problems that don’t show up in ordinary visible light.

1) Your lens may not be telling the truth (or even letting the light through)

UV and thermal are the worst offenders here.

  • UV: Most normal glass blocks a lot of UV. Many lenses transmit almost nothing useful in UV, so you end up with dim footage, mushy contrast, and lots of noise. Some lenses also create bright “hot spots” or weird flare patterns because coatings weren’t designed for UV.

  • IR (near-IR): Most lenses work reasonably well, but you can still get hot spots, reduced contrast, and soft corners. Some lenses also shift focus slightly between visible and IR.

  • Thermal: This is a completely different optical system (often germanium lenses). You can’t just “use your nice Canon lens” and hope for the best. Also: thermal lenses are expensive enough to make you treat them like crown jewels.

Result: your image can look soft, blotchy, low contrast, or oddly vignetted… even when you did everything “right”.

2) Focusing becomes a dark art (sometimes literally)

Autofocus is usually trained and tuned for visible light. In UV/IR it can hunt, miss, or give up and go home.

  • IR focus shift is a classic trap: you nail focus in visible, flip a filter on, and suddenly everything is slightly off.

  • UV often forces you into manual focus because the image is dim and AF can’t see enough contrast.

  • Thermal cameras don’t focus on “detail” in the same way — they focus on temperature edges. Fine detail can vanish if temperatures are similar.

Practical fix: manual focus, focus peaking, test charts, and checking playback like a nervous parent at sports day.

3) Exposure is harder because the rules change

Your camera’s metering assumes the world behaves normally. In UV/IR/thermal… it does not.

  • IR: foliage can turn bright (the “Wood effect”), skies can go dark, and reflectivity changes. Your histogram becomes a comedian.

  • UV: everything is often dim, and you may need big light levels — but you also need to avoid frying your subject (or your eyeballs).

  • Thermal: exposure is essentially a temperature scale (often with auto-gain). The camera may constantly re-map the scene if something hot enters the frame (hello, hands), making the picture “pulse” as it rescales.

Fix: lock exposure where possible, use manual settings, and for thermal learn to control level/span (or equivalent) rather than thinking like a normal videographer.
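To see why locking level/span tames the "pulsing", here's a toy sketch (plain Python, with made-up level/span values, not any real camera's firmware) of how a thermal image maps temperature to display brightness when the range is fixed:

```python
# Sketch of a locked level/span mapping: temperatures map to 0-255 brightness
# using a fixed centre ("level") and width ("span"), so a hot hand entering
# the frame just clips to white instead of re-scaling the whole scene.

def to_grey(temp_c, level=25.0, span=20.0):
    """Map a temperature to 0-255 with a fixed centre 'level' and width 'span'."""
    lo, hi = level - span / 2, level + span / 2
    t = min(max(temp_c, lo), hi)          # clamp: out-of-range objects saturate
    return round(255 * (t - lo) / (hi - lo))

print(to_grey(25.0))  # a 25 degC scene sits at mid-grey...
print(to_grey(36.0))  # ...and a 36 degC hand clips to white without shifting it
```

With auto-gain, by contrast, `lo` and `hi` are recomputed every frame from the scene's min/max, which is exactly what makes the picture rescale when something hot wanders in.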

4) Reflections and “phantom signals” ruin your day

This one catches people out fast:

  • Thermal reflections are real. Shiny surfaces can reflect heat like mirrors reflect light. You think you’re filming a warm object — but you’re actually filming the reflection of your own body heat. Congratulations: you’ve invented the thermal selfie.

  • IR reflections off water, varnish, glass, and polished surfaces can look wildly different from what you see in visible light.

  • UV can produce strange flare and internal reflections, especially with filters.

Fix: change angles, use matte finishes, add shields, and in thermal beware of glass (often opaque to long-wave IR) and reflective metal.

5) Colour becomes “false colour” (and your audience will believe it anyway)

In IR and thermal, colour is usually:

  • mapped (false colour palettes),

  • shifted (custom white balance in IR), or

  • manufactured (UV fluorescence where the “colour” is actually visible light emitted by materials).

That’s fine — as long as you’re honest about it.

Fix: label what’s going on (“false colour thermal”, “near-IR”, “UV fluorescence”), keep palettes consistent across a project, and don’t imply “this is the real colour”.

6) Calibration, emissivity, and why thermal numbers can be wrong

Thermal cameras are amazing, but temperature readouts can be very misleading if you don’t account for:

  • Emissivity (how well a surface emits IR radiation)

  • Reflected apparent temperature

  • Distance, humidity, and atmosphere

  • Shiny surfaces (often low emissivity)

You can be “measuring” 70°C and actually be looking at a reflection or a surface that simply doesn’t emit well.

Fix: treat thermal temperature as a measurement with assumptions, not a gospel truth. Use emissivity tables, reference patches (matte tape/paint), and compare against known temperatures where possible.
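To make the emissivity point concrete, here's a simplified sketch using a total-radiation (Stefan-Boltzmann) model. Real cameras integrate over a wavelength band and apply factory calibration curves, so treat this as an illustration of why the numbers shift, not as camera firmware:

```python
# Simplified emissivity correction: what the camera "sees" is emitted radiation
# plus reflected radiation, weighted by emissivity. Assuming the wrong emissivity
# skews the recovered temperature.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def apparent_radiance(t_obj_c, emissivity, t_refl_c):
    """Total radiance reaching the camera: emitted + reflected components."""
    t_obj = t_obj_c + 273.15
    t_refl = t_refl_c + 273.15
    return emissivity * SIGMA * t_obj**4 + (1 - emissivity) * SIGMA * t_refl**4

def corrected_temperature(radiance, emissivity, t_refl_c):
    """Recover object temperature, given emissivity and reflected temperature."""
    t_refl = t_refl_c + 273.15
    t_obj4 = (radiance / SIGMA - (1 - emissivity) * t_refl**4) / emissivity
    return t_obj4 ** 0.25 - 273.15

# A matte surface (emissivity ~0.95) at 70 degC, with 20 degC reflections:
r = apparent_radiance(70.0, 0.95, 20.0)
naive = corrected_temperature(r, 1.0, 20.0)    # wrongly assume emissivity = 1
proper = corrected_temperature(r, 0.95, 20.0)  # use the right emissivity
print(f"naive reading: {naive:.1f} degC, corrected: {proper:.1f} degC")
```

Even at a generous emissivity of 0.95 the naive readout is a couple of degrees low; on shiny metal (emissivity 0.1 or worse) the "temperature" is mostly reflection.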

7) Noise, artefacts, and the “why is the picture crawling?” problem

  • UV/IR often require higher ISO or longer shutter times → noise, banding, hot pixels, motion blur.

  • Thermal sensors can show fixed-pattern noise, drift, and “stuck” pixels; some cameras do periodic NUC (non-uniformity correction) calibration (that little click/freeze).

Fix: more light (UV/IR), stable support, sensible shutter speeds, and in thermal plan around calibration pauses.

8) Lighting can be the biggest headache of all

  • UV lighting: needs proper UV sources, but also serious safety thinking (skin/eye protection, avoiding stray UV, and managing reflections).

  • IR lighting: IR illuminators help, but can create hotspots and uneven scenes.

  • Thermal: you don’t “light” it — you manage heat differences. Sometimes you need to create contrast (warm background, cool subject, or vice versa).

Fix: design the scene. In these bands you don’t just film reality — you engineer visibility.

9) Workflow: it’s slower, fussier, and needs more notes

When you mix UV, IR, thermal and normal footage, your edit can descend into chaos unless you keep track.

Fix: slate your clips (“IR 850nm filter”, “UV fluorescence 365nm torch”, “Thermal palette ironbow”), keep consistent LUT/palette choices, and grab some “normal light” reference shots so viewers know what they’re looking at.


The big takeaway

Filming in UV, IR and thermal is like gaining superpowers… with the small downside that your camera becomes a moody scientific instrument. Once you accept that you’re not just filming — you’re measuring, translating, and sometimes politely arguing with physics — it becomes hugely rewarding.

Friday, 27 February 2026

Writing “Similar” Tunes Without Nicking Anyone’s Copyright

 



Every creator eventually asks the same dangerous question:

“Can you make it sound like that… but not, you know… that?”

It’s the musical equivalent of asking a baker for a Colin the Caterpillar that definitely isn’t Colin the Caterpillar, while winking so hard you sprain an eyelid.

The good news: you can write music that sits in the same “vibe family” as a reference track without breaching copyright.
The bad news: you need to do it deliberately — because your brain is a pattern-matching machine with the morals of a magpie.

1) Understand what copyright actually bites

In simple terms (and yes, lawyers will add footnotes the size of Wales):

  • Copyright protects the specific expression of a musical idea (melody/lyrics, and sometimes distinctive combinations).

  • It does not protect general style: genre, instrumentation, tempo, “80s synth vibe”, “epic trailer feel”, “lo-fi beats to mark homework to”.

Most problems happen when:

  • the melody is too close,

  • the hook rhythm is unmistakably similar,

  • or the signature combination of melody + harmony + rhythm ends up “recognisably the same song wearing a moustache”.

2) The “reference track” trick: analyse, don’t imitate

Use reference tracks like a science teacher uses a demo experiment: we’re learning the principles, not pinching the apparatus.

Make a quick checklist from the reference:

  • Tempo range (e.g., 95–105 BPM)

  • Feel (straight, swung, halftime)

  • Instrument palette (pads, plucks, guitars, piano, etc.)

  • Structure (8-bar intro, drop at bar 17, short stinger ending)

  • Energy curve (where it builds, where it breathes)

Crucially: treat the melody as radioactive. Observe it from behind glass.

3) Write a new melody first — then build the vibe around it

If you do harmony + beat + sound design first, your melody will “magically” fall into the same notes as the reference because your ear is already guided there.

Instead:

  1. Write a melody without the reference playing.

  2. Change one major “melody identity marker”:

    • Contour (shape): if theirs rises then drops, make yours drop then rise.

    • Rhythmic motif: if theirs goes long-long-short-short, make yours short-long-short-long.

    • Starting note: move the home base away from theirs.

A handy sanity check: sing your melody a cappella. If it instantly reminds you of the reference, it’s too close.
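One mechanical way to force a contour change is classic inversion: keep the rhythm, flip every interval. Here's a toy sketch (notes as MIDI numbers, purely illustrative, not a composition tool):

```python
# Toy contour inversion: mirror a melody around its first note by negating
# each interval, so "rise then fall" becomes "fall then rise".
# Notes are MIDI numbers (60 = middle C).

def invert_contour(notes):
    """Mirror a melody around its first note by negating each interval."""
    first = notes[0]
    return [first - (n - first) for n in notes]

reference_shape = [60, 62, 64, 62, 60, 57]   # rises, then drops below home
print(invert_contour(reference_shape))        # drops, then rises above home
```

Strict inversion is only a starting point: pair it with a new rhythmic motif and a different starting note and you've made three of the changes listed above in one pass.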

4) Harmony: use the same function, not the same chords

Many pop tracks share progressions. That’s fine. But copying the exact progression and the same bass movement and the same rhythmic placement is where you drift into “tribute act”.

Try:

  • Keep the same emotional function (e.g., uplifting → tension → release)

  • Swap in alternatives:

    • Change key

    • Substitute chords (relative minors/majors, add 7ths/9ths)

    • Alter bass notes (inversions)

    • Change harmonic rhythm (how often chords change)

5) Rhythm: borrow the feel, not the fingerprint

Rhythm is often what makes something recognisable.

To keep it safe:

  • Change the drum pattern accents (kick/snare placement)

  • Change the syncopation

  • Change the groove template (straight vs swung vs shuffled)

  • Change the hook rhythm if it matches the reference hook

If your listener can clap your hook and accidentally clap the other song… that’s a warning light.

6) Sound design can suggest a world — but don’t clone a signature sound

Using “a warm analogue pad with gentle chorus” is one thing. Recreating a famous, distinctive patch or a very identifiable production gimmick is another.

Safer moves:

  • Use a different lead instrument

  • Change octave/register

  • Change envelope (pluck vs swell)

  • Change articulation (legato vs staccato)

  • Change effects chain (delay timing/reverb size/modulation)

7) The “three changes” rule (practical, not legal)

For any element that feels close (melody, bass line, hook rhythm), make at least three meaningful changes:

  • Notes

  • Rhythm

  • Starting point

  • Phrasing

  • Register

  • Instrument

  • Tempo/groove

  • Harmony underneath

It’s not a legal shield — it’s a good creative discipline.

8) The safest workflow for content creators (YouTube, blogs, courses)

If you’re making background tracks for videos (science demos, sailing clips, drone footage, etc.), you want music that:

  • supports the message

  • loops cleanly

  • doesn’t distract

  • is unquestionably original

Try this recipe:

  • Two-chord or four-chord loop

  • Simple, original top-line motif (2–4 notes)

  • One “ear candy” element every 8 bars (a little riser, fill, or variation)

  • Minimal melodic density during speech

You’ll end up with something useful, repeatable, and safely yours.

9) The “someone else’s opinion” test

Before you publish:

  • Play your track to someone who knows the reference.

  • Ask: “Does this remind you of that track?”

  • If they say yes immediately, rewrite the melody/hook rhythm.

And if you want to be extra cautious, run it past a music professional or rights specialist — especially if it’s for a paid campaign.

10) The golden rule

If the goal is “make it close enough that people think it’s that”… you’re aiming at the wrong target.

Aim for: same brief, same energy, your own musical DNA.

That way you’re not just avoiding trouble — you’re building a recognisable sound of your own.


Quick “do this / avoid this” checklist

Do

  • Analyse tempo/feel/instruments/structure

  • Write melody away from reference

  • Change contour + rhythm + starting note

  • Use function-based harmony, not copy-paste chords

  • Get a “does it remind you of…?” listener test

Avoid

  • Copying the hook melody (even slightly “reshaped”)

  • Matching hook rhythm + chord rhythm + bass movement

  • Using the same signature sound + same melodic shape

  • Keeping the reference playing while composing the topline

Thursday, 26 February 2026

Filming on a boat with 1–2 (maybe 3) 360 cameras

 



(aka “How to record everything… including the bit where you forgot to press record.”)

There’s a special kind of confidence that comes from mounting a 360 camera on a boat. It whispers: “Relax — you’ll capture everything.” And then, five minutes later, you discover you captured everything except the moment you actually wanted… because the lens is covered in spray, the mount is slowly drooping, and the camera is politely overheating in the sun like a tourist in Benidorm.

Still: 360 cameras are genuinely brilliant afloat. They’re the closest thing we have to a time machine for sailing videos — you can choose the shot later, follow the action, and pull out angles you didn’t even know you needed.

Here’s how I’d set up one, two, or three 360 cameras on a boat, without turning the rig into a maritime hedgehog.


1) The big idea: what story are you filming?

Before you stick cameras everywhere, decide what you want to show:

  • Training / tuition: helm inputs, sail trim, crew coordination, manoeuvres, mistakes (and fixes).

  • Vlog / family day: faces, reactions, “we definitely meant to do that”.

  • Racing: starts, mark roundings, wind shifts, traffic, rules situations.

  • Safety boat / filming boat: wider context, rescue practice, coaching perspective.

That “story” determines where cameras go. Otherwise you end up with 4 hours of beautifully stabilised… deck non-slip.


2) The golden rules (before we talk camera count)

Keep it safe and boring

  • Don’t mount anything where it can snag sheets, trap a toe, or become a projectile in a capsize.

  • Use tethers. Always. If it’s not tethered, it’s already on the bottom.

  • Avoid blocking escape routes (especially in dinghies).

Protect the lenses (because water loves lenses)

  • A tiny smear on a 360 lens becomes a full panoramic smear.

  • Use lens guards if you can, and carry a microfibre cloth in a dry pocket.

  • Plan “wipe moments” (before a start, after a tack, after a dunking).

Audio is half the film

On-water audio is… enthusiastic. Wind and spray will try to remix your masterpiece into a sea shanty.

  • If you can, use a separate mic (even a phone in a dry bag) for narration later.

  • If relying on camera audio: use wind reduction settings, and accept that your best dialogue may be “WHEEEEE” and “WHO PUT THAT BUOY THERE?”

Power and storage: plan like a pessimist

  • Use big cards, format beforehand.

  • If you’re filming a long session, consider external power (but keep cables tidy and waterproofed).

  • Nothing ruins morale like “Storage Full” when you’ve just nailed the perfect gybe.


3) Best positions (the “invisible crew member” approach)

Think of a 360 camera as a crew member who never complains, never needs tea, and never says “are we there yet?” — but does need a good seat.

The best all-rounder: stern pole

  • Great view of helm + crew + sails + wake.

  • Works for dinghies and small powerboats.

  • Keeps it out of the way (mostly).

  • Bonus: the boat looks fast even when it isn’t.

The most useful for learning: centreline / cockpit

  • Shows hands, ropes, sail trim, body movement.

  • Risk: it can get kicked, soaked, or sat on by someone who swears they “didn’t see it”.

The cinematic shot: bow / forward-facing

  • Gorgeous for scenery, waves, “we’re going on an adventure”.

  • Less useful for sail handling unless you also capture cockpit.

The “coach view”: mast / high mount

  • Fantastic overview of sail shape and manoeuvres.

  • Harder to mount securely and safely.

  • Check it won’t interfere with rigging, sails, or launching/recovery.


4) Setups that actually work

Setup A: One 360 camera (the “simple and sensible” option)

Mount: stern pole or high-ish aft position
Goal: capture everything, choose angles later

What you’ll get:

  • Wide context of manoeuvres.

  • Crew interaction.

  • Decent “TV style” reframes in editing.

Tip: With one camera, keep it stable, high, and centred. A droopy pole turns your film into “found footage”.


Setup B: Two 360 cameras (the sweet spot)

If you do anything regularly (tuition, vlogging, racing), two cameras is where it starts feeling proper.

Camera 1 (main): stern pole
Camera 2 (detail): cockpit/centreline OR forward/bow

Why it’s great:

  • Stern cam covers the “whole boat story”.

  • Second cam gives you detail: hands, jib work, expressions, the moment someone says “I thought you had it”.

Pro tip: Synchronise with a simple method:

  • Start both, then do a loud clap / tap a winch / shout something unmistakable (“MARK!” works) so you can line up audio in editing.
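If you fancy automating that, the standard trick is cross-correlating the two audio tracks: the lag that best lines up the shared clap is your sync offset. Here's a brute-force sketch in plain Python (for real full-length tracks you'd extract the audio first and use NumPy/SciPy correlation for speed; the function and sample data here are illustrative):

```python
# Find the offset between two cameras' audio by sliding one track over the
# other and scoring the overlap; the best-scoring lag lines up the clap.

def sync_offset_seconds(audio_a, audio_b, sample_rate):
    """Seconds by which track B's clap happens after track A's (negative = before)."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-(len(audio_b) - 1), len(audio_a)):
        score = sum(
            audio_a[n + lag] * audio_b[n]
            for n in range(len(audio_b))
            if 0 <= n + lag < len(audio_a)
        )
        if score > best_score:
            best_lag, best_score = lag, score
    return -best_lag / sample_rate

# Toy demo at 10 samples/sec: a 'clap' spike at 1.0 s on cam A, 1.5 s on cam B.
a = [0.0] * 50; a[10] = 1.0
b = [0.0] * 50; b[15] = 1.0
print(sync_offset_seconds(a, b, 10))  # cam B started recording earlier, so its clap lands later
```

Slide camera B's track by that many seconds in your editor and the clap (and everything after it) lines up.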


Setup C: Three 360 cameras (the “small film crew”)

This is for coaching days, race analysis, or “we’re making a proper episode”.

Camera 1: stern pole (overall)
Camera 2: cockpit (technique)
Camera 3: bow or mast (context / sail shape / scenery)

The danger:

  • More cameras = more battery management, more wiping, more mounts, more things to forget.

  • You can absolutely spend the whole day filming and forget to sail.

Best use case: when you want to teach from the footage later (or make a detailed breakdown video).


5) Settings: keep it reliable, not fancy

Without getting brand-specific, the “boat basics” are:

  • Resolution: high enough for reframing (because you’ll crop a lot).

  • Frame rate: 30 fps is fine for most; 60 fps helps for fast action and smoother reframes.

  • Stabilisation: yes (boats wobble; the audience shouldn’t).

  • Horizon lock/levelling: very helpful if your camera supports it.

  • Exposure: auto is usually okay; watch out for bright water and dark faces. If you can, lock exposure once it looks good.

And please, for the love of sanity:
Do a 10-second test clip before launching.
It’s the difference between content creation and interpretive dance with a memory card.


6) Editing workflow (so it doesn’t eat your life)

360 footage is wonderful… and also enormous.

A workable workflow:

  1. Dump files into folders by date + camera position (Stern / Cockpit / Bow).

  2. Make short “selects” first (best moments).

  3. Reframe after you’ve chosen the story beats.

  4. Use on-screen labels for learning videos (“Tack: jib released late” / “Too much tiller”).

  5. Keep cuts tighter than you think. Water footage can be hypnotic — in the way watching a washing machine can be hypnotic.
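Step 1 is the easiest to automate. Here's a minimal sketch, assuming (hypothetically) that you rename clips like `2026-02-26_stern_0001.mp4` when copying off each camera; the naming convention and folder layout are mine, not any camera's default:

```python
# Sort a card-dump into date + camera-position folders, e.g.
#   dump/2026-02-26_stern_0001.mp4 -> library/2026-02-26/Stern/
import shutil
from pathlib import Path

def organise_clips(dump_dir, library_dir):
    """Move date_position_number.mp4 clips into library/date/Position/ folders."""
    for clip in Path(dump_dir).glob("*.mp4"):
        parts = clip.stem.split("_")
        if len(parts) < 3:
            continue  # leave anything not matching date_position_number alone
        date, position = parts[0], parts[1].capitalize()
        dest = Path(library_dir) / date / position
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(clip), str(dest / clip.name))
```

Run it once after each session and the Stern / Cockpit / Bow folders build themselves.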


7) Quick checklist for the slipway

  • Batteries charged + spares

  • Cards formatted

  • Lenses cleaned + cloth packed

  • Mounts tight + tethers attached

  • Cameras started + recording confirmed

  • Quick sync clap

  • “If we capsize, nothing becomes a spear” check


Wrap-up

A single 360 camera is like having an extra pair of eyes. Two is like having a small production team. Three is like you’ve decided you’re Netflix now — which is fine, as long as you still remember to enjoy the sail.

If you want the simplest recommendation:

  • Start with one stern pole camera.

  • Add a cockpit camera when you want better learning footage.

  • Add a bow/mast camera only when you’ve got the workflow nailed.

Because the goal is to capture the day — not to spend the day arguing with batteries.

Wednesday, 25 February 2026

As Life Returns to the Garden… I Point a Multispectral Camera at It (Naturally)



There’s a moment every year when the garden quietly stops pretending it’s dead. One morning it’s all “grey, damp, and existential”. The next, you’re tripping over crocuses and discovering that the lawn has been photosynthesising behind your back. So, like any sensible person, I celebrated the return of spring by doing what any normal gardener would do: I brought out a multispectral camera.

Because why simply notice new growth when you can measure it, map it, and interrogate it under infrared like a plant passport officer?

1) The visible world: “Ah, lovely… green things!”

In normal light, early spring is all about subtle optimism: tiny shoots, swelling buds, and that suspiciously confident patch of weeds that clearly trained all winter. You get colour, texture, and the basic “is it alive?” check.

But visible light is also a bit of a liar. A plant can look “fine” and still be quietly struggling—too dry, too wet, nutrient-starved, or recovering from winter damage.

2) Near-infrared: “The plants are glowing. The weeds are thriving.”



Near-infrared (NIR) is where multispectral gets properly smug. Healthy vegetation reflects a lot of NIR, so vigorous leaves show up bright. Stressed plants reflect less. Suddenly the garden stops being “pretty” and becomes a health report.

This is the point where you discover:

  • That shrub you’ve been worrying about? It’s actually doing alright.

  • The lawn? Patchy in ways you hadn’t noticed.

  • The weeds? Absolutely smashing it. (They always are.)

If you’re generating something like an NDVI-style view (a vegetation index), you can spot differences in plant vitality before the human eye sees much change. It’s not magic—just physics and chlorophyll doing what it does best.
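The index itself is pleasingly simple: NDVI = (NIR − Red) / (NIR + Red), per pixel. Here's a minimal sketch, assuming you've already loaded co-registered NIR and red-band images as reflectance values (in practice via your camera's software or a library like rasterio):

```python
# Per-pixel NDVI: healthy vegetation reflects lots of NIR and absorbs red,
# so vigorous leaves score near +1 while soil and stressed plants sit lower.

def ndvi(nir, red, eps=1e-9):
    """NDVI for two equally sized 2-D lists of band reflectance values."""
    return [
        [(n - r) / (n + r + eps) for n, r in zip(nir_row, red_row)]
        for nir_row, red_row in zip(nir, red)
    ]

# Toy 1x3 'image': healthy leaf, stressed leaf, bare soil
nir = [[0.50, 0.30, 0.25]]
red = [[0.08, 0.15, 0.20]]
print([round(v, 2) for v in ndvi(nir, red)[0]])  # -> [0.72, 0.33, 0.11]
```

Repeat the same shot weekly, difference the NDVI maps, and you've got your spring timeline in numbers rather than vibes.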

3) Ultraviolet: “Everything looks like it’s been dusted in secrets.”



UV can reveal surface details and contrasts that are easy to miss in visible light—think waxy coatings, residues, and certain flower patterns. It’s not necessarily a “plant health meter” in the same direct way as NIR, but it’s brilliant for texture, detail, and ‘what on earth is that?’ moments.

Also: it makes the garden look like it’s starring in a low-budget sci-fi film. Which is a strong aesthetic choice for daffodils.

4) Practical uses (beyond ‘because it’s fun’)

Multispectral imaging can actually help with sensible garden decisions:

  • Watering: dry-stressed areas can show up differently from well-watered ones.

  • Lawn diagnosis: identify struggling patches early (compaction, shade, poor drainage).

  • Comparing beds: which areas “wake up” first and which lag behind.

  • Tracking changes: repeat the same shots weekly and you’ll build a proper spring timeline.

You don’t need to turn your garden into a research paper. But it’s strangely satisfying to match what you think is happening with what the spectrum says.

5) The real takeaway: spring is returning… and so is curiosity

The best bit isn’t the tech—it’s what it makes you do: slow down, look properly, and ask questions. Why is that corner always weaker? Why is the hedge brighter in one section? Why do the weeds appear to be receiving private tutoring?

So yes, life is returning to the garden. And this year, it’s returning in visible, infrared, and ultraviolet.

Next up: explaining to the neighbours why I’m crouched near the compost bin with a camera, whispering “Show me your chlorophyll.”

Tuesday, 24 February 2026

It Starts When You Need Something and It Doesn’t Exist

 



There’s a particular moment that triggers research and development. It isn’t a board meeting. It isn’t a grant application. It isn’t even a dramatic eureka with lightning and violins. No — real R&D starts when you’re halfway through doing a job and you realise the tool you need simply doesn’t exist. Or it does exist, but costs the same as a small bungalow and comes with a “licence fee” that makes your eyes water.

This is the point where normal people shrug and improvise. But if you’ve got the R&D itch, your brain immediately whispers: “Well… I could build it.” Suddenly you’re sketching something on the back of an envelope, mentally rummaging through your boxes of sensors, cameras, brackets, PVC tubing, spare bolts, and that mysterious bag of “useful bits” you refuse to throw away because it contains the future.

It’s rarely about being fancy. It’s about being practical. Teaching science? You need a rig that shows the experiment clearly, survives student handling, and doesn’t explode when someone plugs the wrong thing into the wrong thing. Filming a sailing session? You need a camera mount that doesn’t vibrate like a washing machine full of bricks. Trying to measure something properly? You discover that the “standard” apparatus is either missing, flimsy, or designed for a world where nobody ever drops anything.

Then comes the best bit: the prototype stage. Prototype One works brilliantly, as long as nobody breathes near it. Prototype Two is sturdier, but now it blocks the view of the very thing you wanted to film. Prototype Three is nearly perfect… except for the one tiny issue where it behaves beautifully in the lab and then immediately misbehaves outdoors because wind exists, water exists, and reality is rude.

And somewhere in all that tinkering, you end up learning far more than you would have done if you’d just bought a solution off the shelf. You learn what matters, what doesn’t, what fails first, and which “obvious” idea is only obvious until you actually try it. Most importantly, you build knowledge you can reuse: a method, a rig, a workflow, a sensor setup, a filming technique — something that becomes a repeatable system rather than a one-off bodge.

The punchline is that R&D doesn’t start with a lab coat. It starts with frustration. A missing bracket. An experiment that should be easy but isn’t. A piece of kit that’s nearly right but not quite. And if you’re the sort of person who can’t leave that alone… congratulations. You’re doing R&D. You’ve simply disguised it as “trying to get on with the job”.