iPhone 11 Pro Photography, Backup Strategies, Ursula Le Guin
On computational photography, making lots of copies of data, Harold Bloom, Ursula K. Le Guin, and tap essay publishing
Recondite Rodenites —
As promised, compared to last Roden, this one is a less-theory more camera-and-grab-baggy-things edition. Let’s get going.
But first! It drives me nuts when I subscribe to a newsletter and the author sends it out expecting me to know who they are, so: I’m Craig Mod. Fighter of humidity. Photographer of pizza-shaped things. Walker of Pachinko Roads. You may know me from my how to fly article or why fast software, the best software. Unsubscription is always a quick click at the bottom of this newsletter. Archives are here. This is funded through your memberships.
iPhone 11 Pro & Computational Photography
Last weekend the strongest typhoon to hit Japan in 60 years made landfall just down the peninsula from my home. Though parts of Tokyo and Nagano and elsewhere flooded, it ended up not being nearly as bad as it could have been. Our area lost power for about 10 hours. The winds and rain died down around midnight. I went for a walk in the middle of the blackout and took this photo with an iPhone 11 Pro:
The walk was on a whim. I had been inside for 24 hours, was antsy. I wasn’t thinking about aesthetics or photography or anything really, other than, whoa, this is a rare moment on this street. So I didn’t bring my “beefier” cameras out on the walk. This shot was an off-the-cuff grab. Handheld. I believe the exposure was something like nine seconds long. The only light was moonlight. Individual stars are visible.
I think this is pretty damn impressive.
Now, if you zoom in, you can see the quality isn’t that of an 86mp Sony Super Night Vision camera, but, come. on. — handheld, tiny, in pocket at all times. We’re deep into a fascinating era of photography.
But what is the iPhone doing here? The folks from the camera app Halide have written up an excellent overall technical explainer. Germane to our night shooting, the sensors are more sensitive (33%-42% depending on the lens) but a little ISO bump won’t get you images like the above. Instead, Night Mode (much like Google’s Night Sight on the Pixel cameras) uses adaptive bracketing — multiple exposures at varying speeds — combined with a mystery stew of algorithms to account for sharpness, shadow detail, highlights, et cetera.
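To make the adaptive-bracketing idea concrete, here’s a toy sketch. This is my own illustration, not Apple’s (or Google’s) actual pipeline: capture several frames at different shutter speeds, rescale each to a common exposure, and let well-exposed pixels dominate the merge while clipped ones drop out.

```python
# Toy adaptive-bracketing merge, in the spirit of Night Mode.
# Assumption-laden sketch: real pipelines also align frames, reject
# motion blur, and use far cleverer per-pixel weighting.

def well_exposed_weight(value, lo=0.05, hi=0.95):
    """Favor pixels near mid-gray; near-clipped pixels contribute ~0."""
    if value <= lo or value >= hi:
        return 1e-6  # nearly clipped: contributes almost nothing
    return 1.0 - abs(value - 0.5) * 2.0  # triangle weight peaking at 0.5

def merge_bracket(frames):
    """frames: list of (exposure_seconds, pixels), pixel values in [0, 1].
    Returns one merged row of pixels, normalized to the shortest exposure."""
    base_exposure = min(t for t, _ in frames)
    width = len(frames[0][1])
    merged = []
    for i in range(width):
        num, den = 0.0, 0.0
        for t, pixels in frames:
            v = pixels[i]
            w = well_exposed_weight(v)
            # Scale the radiance estimate back to the base exposure
            num += w * (v * base_exposure / t)
            den += w
        merged.append(num / den)
    return merged

# A dark scene: the 1/4 s frame barely registers anything,
# while the 4 s frame sees real detail.
short = (0.25, [0.01, 0.02, 0.01])
long_ = (4.0, [0.16, 0.32, 0.16])
merged = merge_bracket([short, long_])
```

The merged result agrees with the long frame’s (rescaled) radiance estimates, because the short frame’s near-black pixels are weighted almost to zero.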
This “mystery stew” of algorithms is analogous to the chemical mixtures for film of yore. The parameters of sensors are well defined / constrained (size, pixel density, light sensitivity; of most recent curiosity: “unclippable self-resetting pixels”) but the software component of photography is wide open, unbounded, and thanks to capitalistic and market impulses, it’s being driven by intense competition between the two biggest companies in the world: Google and Apple. Which overlapping math functions and machine learning models will create a “Velvia” of computational photography?
The first iteration of Apple Velvia is “Deep Fusion.” Like Velvia (which was ISO 50, and therefore only usable for daytime or well-lit shots), Deep Fusion is a daylight algorithmic stew. It’s for devices with the new, crazily overpowered A13 chip. It was supposed to ship a few weeks ago but we’re still waiting (it arrives with iOS 13.2).
Deep Fusion is explained well in this Verge article:
By the time you press the shutter button, the camera has already grabbed four frames at a fast shutter speed to freeze motion in the shot and four standard frames. When you press the shutter it grabs one longer-exposure shot to capture detail.
Those three regular shots and long-exposure shot are merged into what Apple calls a “synthetic long.” This is a major difference from Smart HDR.
Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion merges these two frames, not more — although the synthetic long is already made of four previously-merged frames. All the component frames are also processed for noise differently than Smart HDR, in a way that’s better for Deep Fusion.
The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail — the sky and walls are in the lowest band, while skin, hair, fabrics, and so on are the highest level. This generates a series of weightings for how to blend the two images — taking detail from one and tone, color, and luminance from the other.
The final image is generated.
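The quoted pipeline (pick the sharpest short frame, build a synthetic long, then blend the two per pixel according to detail) can be caricatured in a few lines of Python. This is a hand-drawn illustration of the weighting idea only, not Apple’s implementation; `detail_weight` here is a crude local-contrast stand-in for whatever the real detail classifier does.

```python
# Toy Deep Fusion-style blend: take detail from a sharp short-exposure
# frame and tone from a "synthetic long" frame, weighted per pixel.
# An assumption: detail is approximated by local contrast.

def detail_weight(neighborhood):
    """Crude detail measure: local contrast in a small window. High-contrast
    regions (hair, fabric) lean on the sharp frame; flat regions (sky, walls)
    lean on the synthetic long."""
    return min(1.0, max(neighborhood) - min(neighborhood))

def deep_fusion_blend(sharp, synthetic_long):
    """Blend two equal-length rows of pixel values in [0, 1]."""
    out = []
    n = len(sharp)
    for i in range(n):
        lo, hi = max(0, i - 1), min(n, i + 2)
        w = detail_weight(sharp[lo:hi])
        # Detail from the sharp frame, tone from the synthetic long
        out.append(w * sharp[i] + (1.0 - w) * synthetic_long[i])
    return out
```

Flat regions, where local contrast is near zero, take their tone entirely from the synthetic long; edges and texture pull toward the sharp frame.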
It’s all about the chip and about performing these calculations on-device, in near real-time. No round-trips to servers, no dependency on bandwidth. The year-over-year increase in iPhone chip speed is, arguably, most “felt” or “experienced” in the act of taking photographs. Not in the opening of the Camera.app or pressing the shutter (though those are incrementally quicker), but in the finished artifact, the seemingly normal but until-recently impossibly-detailed image that your handheld pocket supercomputer just produced.
Back on the Night Mode front, Google announced Pixel 4 a few days ago, and headlined with photography-related improvements, including an “astrophotography” mode. It’s not very practical but, boy, does it get the mind going, and it shows why monopolies are bad and competition is good; it’s unlikely we’d see advancements like these if Apple or Google alone had “won” the entire smartphone market:
Unless you’ve been in a somnambulistic haze the past decade, none of this is out of the blue. Nearly six years ago (!) I (over-) wrote an essay called Software is the Future of Photography. At that time it was somewhat heretical to imagine non-camera-shaped-objects replacing “true” cameras. Namely: smartphones. People got Internet Upset at you for suggesting this. Especially when you suggested it in The New Yorker and were, like, pissing yourself with nerves because: the goddamned New Yorker. Anyway — six years later, software innovation in photography is chugging along at full speed and the smartphone has become the number one image production tool for most of the world.
The implications of this are protean, shifting year-by-year — from allowing average folks to capture the banalities of their beach vacation, to getting better front-line data and reports from protestors and war-zones around the world. The social and political implications of better, easier photography in the hands of more people, all of the time, can’t be overestimated. Furthermore, it’s inevitable that the algorithms will point backwards out of the phones and into the single-purpose cameras themselves. Leica is restructuring the company, in part, around computational photography. And which dork among us wouldn’t be excited to see Deep Fusion level enhancements work in real-time seamlessly integrated into an M12, or M13? Fat, fast glass, big sensors, smart chips. Yes, please.
Back in Japan, along my little night walk, here’s another shot with a flaring moon up above:
As for the actual mechanics of taking a handheld night photo: Some interesting user interface and experience tweaks occur while you’re capturing a long exposure. Before you press the shutter you see a rough preview of what the final exposure should be like in terms of details. This lets you dial in highlights and shadows. I find the standard iPhone Camera.app tends to over-expose, so I usually perform a tap-slidedown-slidedown to pull back a third or a half stop. (How nice would it be to set a blanket exposure compensation?) Release the shutter and the screen goes black and then slowly fades in as the image is processed / exposed / computed. The final image is always cleaner and richer than the preview.
It’s … neat. Like — fun. It’s fun. It’s a fun way to take a long exposure. Long exposures in my mind are fiddly, wonky, usually not very forgiving. Certainly not without a tripod. The iPhone is doing a lot to mitigate that elemental wonkiness and replace it with a process you could describe as elegant. In fact, it’s doing more than that — with a normal DSLR, a handheld nine-second exposure would almost certainly be useless, and in no way sharp. The iPhone makes it work. It’s only going to get better.
In fact: The entire photography package of the iPhone 11 Pro, taken as a whole, is kinda breathtaking in its grace. Think about it: The system juggles three sensors (of varying capabilities), three lenses (of varying speeds), multiple modes of exposure (HDR, Smart HDR, Normal, Deep Fusion, Night Mode, Flash). It records data outside of the frame for re-framing or perspective correction or slight rotation without loss of information. It allows for freakishly smoothly interpolated movement between focal lengths (despite “changing lenses,” et cetera). It’s a beast. But it feels like — and is as fun to play with as — a puppy. Kudos to the Apple camera team. To see what you can do with an iPhone 11 Pro in an exotic locale, look no further than Austin Mann’s excellent report. It’s a great moment in time to be working with images (and video, natch).
We spend three days investigating why we feel drawn to side projects, how they can compete with “traditional” career paths, how they sometimes become careers in and of themselves, and how side project impulses can become guiding forces helping us plan the next few years. Folks apply and we try to pick as diverse and talented a group as possible.
“Diverse” meaning not just racially or ethnically diverse, but also life-stage diverse, and a diversity of operating scale. This year we had attendees fly in from New Zealand, New York, Bali, California, London, India, Amsterdam, and folks with connections to other countries still: the Philippines, Malaysia. Of the 17 attendees, some created things (apps, objects, music, literature, courses) used by hundreds of millions of people, others by just hundreds. One was a professor, another a mother of four on half a dozen corporate boards, yet another a longtime Library of Congresser. We had an age span of 35+ years.
Side projects ran the gamut from obsessing over designing the platonic ideal of toasters, to indie rock albums, to angel investing, to thinking about countryside revitalization and generational duty.
Everyone was kind, warm, and immediately helpful. You never know what’s going to happen when you throw 17 strangers together, but these folks glommed and created a humming energy all their own. Sleep was scant. Conversations continued long into the night. We pretty much didn’t stop chatting from 8am until midnight.
It was exceptional and three of the best days of the year. Next year’s is already in the works. Explorer Club Members get first dibs to apply.
The Exciting World of Backups
Backblaze has been my cloud-backup provider of choice for nearly a decade. I’ve tried many others, like Crashplan, but I’ve found Backblaze to have the fastest, most reliable, least resource-intensive software of them all. I use them to back up almost 2TB of laptop data + photos. Two years ago during my Ragdale residency, my MacBook Pro died on me, I had to do a full reinstall, and Backblaze next-day mailed me an external HD with my complete, encrypted backup. I restored and mailed them back the HD. The only cost was postage.
Backblaze used to delete external-drive data after 30 days. Meaning: You had to remember to plug in and initiate a backup with external drives at least once a month. This was a painful design decision, but the service was so reliable and reasonably priced it was a pain point I willingly endured. I’m happy to announce they just updated their service to hold external HD backups for a full year between plug-ins. (This costs an extra $2 / mo.)
Backblaze is now, IMO, the best “thoughtless” cloud backup available. If you sign up with this link you and I both get a month for free.
On the topic of backups, I keep a lot of copies, all encrypted, in various places. (LOCKSS — Lots Of Copies Keeps Stuff Safe)
For cloud backups, Backblaze is my main squeeze. But I also use Arq to back up to Google Cloud Storage, which is reasonably cheap and will hold data as long as I keep paying for storage. I haven’t restored from Google Cloud, and I suspect it will be painful and slow, so I consider this a last-resort emergency backup of photography, design, and writing related work.
I don’t trust iCloud to reliably keep my photos safe, so I have Photos.app on my MacBook Pro download original copies of everything I take on my iPhone to be backed up on my terms.
Locally, I have multiple backup strategies. I have an external Time Machine compatible disk that is theoretically backing up my laptop and photos whenever I plug them in, but Time Machine has gotten so slow and unreliable these past few years, I’ve gone back to CarbonCopyCloner for my main local backups. Apple should just buy CCC and incorporate whatever tech they’re using into Time Machine. CCC is freakishly fast. So fast that you may wonder if it’s working properly. But, it is — I just used CCC to restore my system a few months ago when, yet again, my MacBook Pro had to be fixed, resulting in a new SSD and a loss of all data. (These are both the best and worst laptops Apple’s ever produced.)
I also have a RAID 1 disk array as a secondary Time Machine disk (RAID 1 mirrors the disk, so the backup itself is redundant). And I have a few cheap external HDs strategically placed with friends around the world. Whenever I visit, I perform an encrypted CCC clone. If I lose more than a year’s worth of data then it means the world has ended or I have died and stopped paying my bills.
HDs are so cheap these days (I usually buy whatever latest Western Digital is available) that it’s easy to justify an overly redundant backup strategy. If you don’t have a good backup system, I’d suggest using Backblaze at a minimum and, if you’re technical, adding Carbon Copy Cloner and an external HD to the mix. If you’re not technically inclined, use Time Machine and an external HD.
Somewhat backup related, I found this discussion on files and file systems fascinating. Files are weird. They are a skeuomorphic hangover from Physical Times. They kinda stink, but they’re also resilient — as much as iOS has tried to obscure them, the recent iPadOS updates to Files.app and external-drive support show that shuffling data around still requires distinct units and user-friendly organization. Now, do I think Finder.app on macOS needs a serious rethinking? Good god, yes.
Booze Free; Still Manly?
I’ve almost completely excised alcohol from my life. It feels good. I’ve been working on this in fits and starts for the last decade. If you’re thinking about doing so, I highly recommend giving it a go. I’ve found it to be like regaining a superpower of energy, focus, sensibility, and an easy hack against one’s worst impulses. That said, I really like alcoholic drinks, and happily drink them sans-alcohol. For beer, this has gotten easier. Almost every restaurant or bar in Japan offers non-alcoholic beer that, while not perfect, is acceptable. Germany has its varieties, which are superior to the Japanese offerings. I like Bitburger (I buy it on Amazon in Japan). And my go-to at bars is tonic water with bitters and lime.
The New Yorker just published a piece on the rise of non-alcoholic beverages. I find this trend heartening. I’d love to see the drink-or-you’re-not-a-man culture of braindead masculinity nipped in the bud by this generation. I grew up on the edge of it. I’ve probably blacked out 50 times in my life. This isn’t something I recommend. I am wired to drink. I can drink 15 pints of Guinness, knock down 20 stiff drinks over the course of an evening. My genes love it, my body hates it. As other alcoholic-inclined folks will recognize, there is a line that is crossed in an evening and the alcohol becomes a feral fuel, non-negotiable, you simply can’t get enough of it, and more often than not, you pull others into your sad orbit of the binge.
I see the rise in fake meat (and the corresponding rise in their stock prices) in conversation with this alcohol-freeness. So much identity is tied up (irrationally so in the contemporary context of an abundance of alternative proteins) in butchering and grilling red meat, polishing off a six-pack — essentially going to the mat with various poisons to show strength. Not to woo-woo you dear readers to death, but it takes a helluva lot more strength to stand in the middle of a bar and not drink than to go with the flow.
That said, if someone cracks open a $1,000 bottle of vintage blabbity-blah, will I have a glass? Hell yes. But that happens at most once or twice a year. And one glass is about all I need. Still, it feels like playing with a razor.
I bring this topic up occasionally not to preach, but to offer a counterpoint to the usual casual hagiography around alcohol. If you can modulate your drink, good on ya. I can’t, and so have mostly tapped out on this fight.
Books & Publishing
On the “publishing innovation” side of things, The New York Public Library has been publishing novels to their Instagram account via Stories. You can check some out at https://www.instagram.com/nypl/. Note the format-specific design affordances — the “hold thumb” space to keep the story from flipping to the next page.
Of course, this style of novel-ing was first done on the iPhone with Robin Sloan’s Fish. He called it a “tap essay.” Instagram Stories are simply auto-playing tap essays. Robin probably should have gotten some $FB for that one. Instead he just gets our newsletter affection.
And if you’re itching to build your own e-reader, now you can with the help of the Open Book Project. Open source schematics. Total bill of materials is about $43. (The most expensive piece is the e-ink panel, coming in at $18.26.)
The literary critic and academic Harold Bloom passed away the other day. Far more erudite folks can comment on his oeuvre, but can we just take a second to recognize what a savage producer this guy was? I mean:
… 40 books of his own authorship and hundreds of volumes he edited. And he remained prolific to the end, publishing two books in 2017, two in 2018 and two this year: “Macbeth: A Dagger of the Mind” and “Possessed by Memory: The Inward Light of Criticism.” His final book is to be released on an unspecified date by Yale University Press, his wife said.
OK. And he “could read, and absorb, a 400-page book in an hour.” I love it when pulsing brains are focused on literature. David Foster Wallace fell into that category. As does Ursula K. Le Guin.
It turns out, Le Guin out-booked Bloom! As of 2008 she was at 60 books. Obviously, the number of books published is not a high-signal proxy for value, but it becomes totemic of something in the context of clearly talented people. That is, folks who have won Hugo and PEN and National Book Awards (also potentially flawed proxies, but just roll with me), as Le Guin has.
There’s a wonderful quote towards the end of the interview that sums up the creative life (and more), about a peasant in a fairy tale doing the thing he’s not supposed to do. A fated knight could cut the cursed hedge with one swipe of Excalibur, but a peasant? Using a butter knife?
In fact he has no idea what is on the other side of the great hedge. He doesn’t know there is something he wants on the other side because he doesn’t have any idea what might be there. He is simply determined to get through. Is this to escape from his hard and hopeless life? Is it to escape from or somehow to realize his hopeless, half-admitted love for his stepmother? I don’t know, but I think he is doing what most of my male and female heroes do: butting his head against a wall, knocking down a barrier, opening a door, enlarging the space available (for life, for thought, for knowledge). Creating freedom. With passion, patience, obsession, transgression.