We’re heads down, gearing up for a big product launch next week. Wish us luck! Subscribe to our YouTube channel to receive the announcement.
In building new tech, we’re thinking a lot about the future. Or creating the future…
That's why we asked Efosa to write up a crystal ball moment—from cameras getting smaller to workflows getting smarter. No far-fetched sci-fi hypotheticals. Only early signals of what’s coming next. What do you think the future holds?
- Shamir
PS - Are you in the San Francisco Bay Area? If so, drop us a line (just reply).
CLOUD CAPTURE: POST WITHOUT DRIVES
Footage. In the cloud. Before lunch.
It’s no longer sci-fi. It’s the storm front gathering over every production.
Back in the day (well, five years ago), the process was long-winded. Shoot. Offload. Clone. Verify. Hand to post. A ballet of G-RAIDs and USB-C prayers.
But a new model is creeping in, one where the camera doesn’t hand off footage. It beams it up.
Welcome to camera-to-cloud.
The tech today:
Frame.io Camera to Cloud: It started with Teradek and Sound Devices. Now it’s RED and ARRI-friendly.
Adobe + Frame.io integration: Editors can cut dailies while the actor’s still in makeup.
5G + Satellite Uplink: Imagine a production where "footage dump" is just a tab in the browser.
The quiet revolution. A shoot with no media wrangler. A wrap without waiting on offloads. Editors in another city (or timezone) cutting the morning’s takes before the director yells “cut” on the next.
Sure, it's not perfect. It's not the quickest (yet). But the shift is here. And once it hits critical mass, the DIT may be a thing of the past. A world without drives might just arrive before one without drivers.
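For flavor, here's what a bare-bones DIY version of that model could look like: a watch-folder script that pushes each clip to cloud storage the second it lands. To be clear, this is not how Frame.io C2C or the certified hardware above actually works; it's a minimal sketch of the idea, and the folder path, bucket name, and .mov filter are invented for the example.

```python
# DIY stand-in for camera-to-cloud: watch the offload folder and push every
# new clip to an S3 bucket the moment it appears, instead of waiting for a
# wrap-time drive handoff. Paths and names below are hypothetical.
import time
from pathlib import Path

import boto3  # pip install boto3; assumes AWS credentials are already configured

WATCH_DIR = Path("/Volumes/CARD_OFFLOAD")  # hypothetical offload folder
BUCKET = "dailies-bucket"                  # hypothetical bucket name

s3 = boto3.client("s3")
seen = set()

while True:
    for clip in sorted(WATCH_DIR.glob("*.mov")):
        if clip in seen:
            continue
        # A real version would wait for the file to finish copying first.
        key = f"day-01/{clip.name}"
        s3.upload_file(str(clip), BUCKET, key)
        print(f"uploaded {clip.name} -> s3://{BUCKET}/{key}")
        seen.add(clip)
    time.sleep(5)  # poll every few seconds; real rigs push rather than poll
```

That's the whole trick: footage moves the moment it exists, and "handing off to post" becomes a folder that's already shared.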
BIG MOVIES, SMALL CAMERAS
Gareth Edwards operating a $4K FX3 on the $80M sci-fi epic The Creator
Cut to: A blockbuster. Lasers. Explosions. Alien cities in 8K. Now cut to: The BTS footage. A guy in an ill-fitting tee. A Sony FX3. Not a fantasy. This is The Creator.
Gareth Edwards’ 2023 sci-fi epic, shot on a $4K Sony FX3, a camera that fits in your carry-on. We touched on this last week but it’s worth repeating. It’s one of the clearest, boldest examples of the future we’re heading towards.
Then there’s Boiling Point, the anxiety-inducing one-shot kitchen drama captured with a DJI Ronin 4D. That’s a $6,000 gimbal-cam that gives you stabilized, follow-focus 6K in a single, shoulder-slung unit.
Tech rundown:
Sony FX3: Full-frame, low-light beast. Paired with vintage glass, it sings.
DJI Ronin 4D: Built-in gimbal. LiDAR. Follow focus. Basically a robot operator.
Upcoming mega-budget Hollywood F1 movie reportedly shot on the DJI Ronin 4D
There are murmurs paddock-side that the upcoming F1 film starring Brad Pitt, produced by Apple and helmed by Top Gun: Maverick director Joseph Kosinski, has been leaning on an unexpected piece of kit: the same DJI Ronin 4D.
Yes, the same $6K camera you can order online might be capturing scenes for a $200 million blockbuster. It's not confirmed, but multiple behind-the-scenes shots show operators wielding the Ronin like it’s standard issue.
Why? Because it's fast, nimble, stabilized, and, crucially, can keep up with Formula 1 without needing a chase car and a Steadicam arm the size of a crane.
A Tangerine Dream
Director Sean Baker operating an iPhone 5S on the set of Tangerine
It started on an iPhone. No crew van. No dolly track. Just Sean Baker, three iPhone 5S units, a handful of borrowed lenses, and a whole lot of sunlight bouncing off LA sidewalks.
Tangerine (2015) wasn't just shot on a phone; it was shot on a shoestring and still made Sundance stop and stare. Raw. Electric. Baker. It put him on the map, and where did that map lead him?
To Florida for The Florida Project. Baker had Disney World in his lens and Willem Dafoe on his call sheet.
The map then took him to the French Riviera in 2024, where he won the coveted Palme d'Or for Anora at Cannes, before leading him back to Hollywood this year to win four (yes, four) Oscars.
Tangerine wasn't Baker's first feature, but it was the first that got people talking. Shot on an iPhone, and now he's one of the most in-demand directors in Hollywood.
Today, sensors are smarter, glass is cheaper, and post workflows are more format-agnostic than ever.
It isn't about gear anymore. Because if people walk out of your film wondering what camera it was shot on, you’ve already lost.
Didn’t shoot your feature on an Alexa or a Venice? No one noticed.
And even if you did? No one cares.
AI PRODUCERS: CALLSHEETS, COMM CALLS & CONTROL
“Hey, generate tomorrow’s shoot plan, confirm crew, and reroute lighting to Stage 3.”
Not a command from Spike Jonze’s Her. Just the production office of the near future.
AI editors are fast becoming a thing (👋). Generative soundtracks too. But what about production? In 2025, the call sheet still rules. That beige PDF relic—born from an age when you literally had to call everyone to tell them where to go.
But what happens when AI replaces the Assistant Director’s clipboard?
Enter the AI Producer.
The tech (some of it already creeping in):
Moviestar.ai / Previs tools: AI-driven breakdowns from scripts
Slack + GPT agents: Dynamic shoot schedules, crew coordination
Production-specific AIs (Filmustage, StudioBinder): Tools that log, alert, and coordinate in real time
What can the tech currently do?
AI reads scripts. Tags locations, cast, props.
It builds a stripboard. Flags weather. Pings crew availability.
Crew gets personalized call times, location maps, and wardrobe notes. (A toy sketch of the script-tagging step is below.)
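To make "reads scripts, tags locations" concrete, here's a minimal sketch of that first step, written from scratch in plain Python rather than against any of the tools named above (none of their APIs are assumed here): scan a screenplay for sluglines and tag each scene's location and time of day.

```python
# Toy script breakdown: scan sluglines (INT./EXT. LOCATION - DAY/NIGHT),
# tag each scene's location and time of day, and count how often each
# location appears -- the raw material for a stripboard.
import re
from collections import Counter

SLUGLINE = re.compile(
    r"^(INT|EXT|INT\./EXT)\.?\s+(?P<location>.+?)\s*-\s*(?P<tod>DAY|NIGHT|DAWN|DUSK)\s*$"
)

def breakdown(screenplay_text):
    """Return (scenes, location_counts) pulled from raw screenplay text."""
    scenes = []
    for line in screenplay_text.splitlines():
        match = SLUGLINE.match(line.strip().upper())
        if match:
            scenes.append(
                {"location": match.group("location"), "time_of_day": match.group("tod")}
            )
    return scenes, Counter(s["location"] for s in scenes)

sample = """
INT. PRODUCTION OFFICE - DAY
The coordinator stares at a beige PDF.

EXT. STAGE 3 - NIGHT
Lights reroute themselves.
"""
scenes, counts = breakdown(sample)
print(scenes)  # [{'location': 'PRODUCTION OFFICE', ...}, {'location': 'STAGE 3', ...}]
print(counts)  # Counter({'PRODUCTION OFFICE': 1, 'STAGE 3': 1})
```

The real products layer scheduling logic and language models on top, but the core move is the same: turn a screenplay into structured data the rest of the pipeline can act on.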
The Missing Link: Why AI Isn’t a Producer Yet
Filmustage can break down a script in seconds. Studiovity builds schedules with a click. RivetAI promises to turn your screenplay into something shootable.
And if you want a basic call sheet, Taskade or Filmanize will happily oblige. But here’s the thing: real producing doesn’t live in clean dashboards.
It lives in chaos. The real production office runs on texts at midnight, WhatsApp groups buzzing during location scouts, DMs to crew asking if they can swap days. Phone calls. Email chains. Scribbled notes on the back of parking tickets.
No current AI system can tap into that messy, human network, and that's the real limitation. What if there were an AI that could monitor all those channels (Slack, WhatsApp, Gmail, voice memos) and automatically translate that mess into a clean, living production workflow?
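That system doesn't exist yet, so neither does its code, but as a toy, the shape of it might look like this: every channel feeding one inbox, every message sorted into a schedule-relevant bucket. The channels, keyword rules, and labels below are all invented for illustration.

```python
# Nothing below talks to real Slack/WhatsApp/Gmail APIs -- that missing glue is
# the whole point above -- so the channels are mocked as a plain list and a
# keyword table stands in for whatever model would do the classifying.
from dataclasses import dataclass

@dataclass
class Message:
    channel: str  # "slack", "whatsapp", "email", "voice-memo", ...
    sender: str
    text: str

RULES = {
    "schedule_change": ("swap days", "push call", "move to"),
    "location":        ("scout", "permit", "parking"),
    "availability":    ("can't make", "available", "out sick"),
}

def classify(msg):
    lowered = msg.text.lower()
    for label, keywords in RULES.items():
        if any(k in lowered for k in keywords):
            return label
    return "unsorted"

inbox = [
    Message("whatsapp", "gaffer", "Can we swap days? Thursday works better."),
    Message("slack", "location mgr", "Permit for Stage 3 came through."),
    Message("email", "2nd AC", "Out sick tomorrow, sorry."),
]

for msg in inbox:
    print(f"[{classify(msg):>15}] {msg.channel}/{msg.sender}: {msg.text}")
```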
Now that would be the future. We don’t have it yet. But it’s probably coming soon.