On May 10 this year, the strongest geomagnetic storm in over 20 years occurred, delivering a northern lights display that reached parts of the world it rarely does. I was lucky enough to learn that it was happening with enough time to venture out with my camera gear, and it was a truly memorable evening.
The G5-rated geomagnetic storm was so strong that the aurora borealis could be seen even where I live in the south of the UK, dancing in the night sky and illuminating the landscape with other-worldly hues. I wrote about my experiences of that night, pitting a mirrorless camera against a smartphone to see which could get the best northern lights pictures.
More recently, on October 11, there was another chance to see the aurora, but sadly on this occasion I only discovered the next day that it had happened. And yes, the images on my news and social feeds were spectacular.
The night sky had been clear, I was at home with no particular plans, and I had learned a thing or two about photographing the lights from my first experience that I wanted to put into practice should I ever be lucky enough to see them again – so I was heartbroken to have missed out.
A few days after the event, as I watched a steady stream of spectacular photos appear in my social media feeds that only heightened my disappointment, Meta piped up on Threads with a sentiment that fell completely flat – and worse.
"POV: you missed the northern lights IRL, so you made your own with Meta AI" read Meta's post on Threads, together with an AI-generated image of the aurora over famous landmarks, including the Golden Gate Bridge (see the main image above). Cue a Meta roasting.
It's OK to lose some
I was one of those people who missed the northern lights IRL. However, even though I was disappointed to have missed out, especially after seeing so many incredible photos from photographers around the world and especially the UK, I'm not going to fake the experience of capturing the real thing, whether it's with AI or old-school Photoshop fakery.
For me, such events are primarily about being there and experiencing them in the moment. I echo the sentiment of my colleague Phil Berne, who gave up trying to capture April's rare total eclipse in the US with an array of camera gear in order to simply enjoy the once-in-a-lifetime event.
When I witnessed the northern lights for the first time back in May (you can see one of my photos above), there was an otherworldly feel to the night, and the atmosphere was electric. I got distracted taking photos and timelapses, much to my own detriment. I wish I had taken a photo or two that I was pleased with artistically, then put my camera away and spent more of my time simply soaking it in. I don't need a photo to prove I was somewhere, sometime, especially if it limits my enjoyment of the moment.
On the most recent occasion I missed out, but I'm okay with that; you win some, you lose some. But if you listen to Meta, you don't ever have to lose some, thanks to the magic of AI-powered photo editing. I do not echo that sentiment. FOMO is what Meta has peddled to the less secure ever since it was born as Facebook. Yet Meta's post last week wasn't just in poor taste – there's a darker side to it.
It's not OK to fake it
Look, I get it, Meta's post on Threads was simply a way to show off its fresh AI image-generator skills. But the message it contained didn't just go down like a lead balloon, it had a sinister element to it: you can fake being at an event with AI.
It's one thing to use photo-editing tools creatively – though image manipulation has been a gray area ever since the arrival of Photoshop – but to fake elements of real events with AI editing and image generation? That's not okay. Once you start using Meta AI's tools to fake reality, where does it end? The creation of fake news (with potentially catastrophic consequences) is just one example of the darker side of AI.
AI-powered photo editors and image generators can be delightful tools, helping you to realize your creative vision with ease. In and of itself, adding the northern lights to photos actually looks like a good use of Meta AI, in that you can get reasonable results (however scientifically inaccurate they may be). But presenting these images in a way that tricks people into believing you were there? That's a no from me.
In the wrong hands – and there's no control over which hands will be using it – AI image generation can be all too convincing, to the point where we simply don't know what's real. If Meta is actively promoting the deceptive use of AI image generation when it should be leading the fight against it, what hope do we have?