Dear Facet Community,
Over the last year, we have fielded an almost overwhelming interest in our product — many of you are excited about working with a fully content-aware photo editor.
In the past we have prioritized onboarding a small group of pioneers: artists and creatives who have done some amazing work and have been very thoughtful and generous with their time and product critiques. (On a personal note, I can’t underscore enough how humbling and pulse-quickening it has been to watch artists attain flow state in a tool you built from the ground up.)
We are now formalizing requirements for our Early Access program and, more importantly, rolling out Early Access to more users!
Are you an artist, designer, creative professional, or freelancer? Send us a link to your portfolio and we will fast-track you for access over the next few weeks.
Don’t have a portfolio but still want to try Facet yourself? Just fill out the form linked above and send us a note about something you find beautiful (anything at all) and tell us *why* you find it beautiful.
Already submitted your portfolio and *still* can’t wait to get in? Drop us a shout on social media. We’re @facet_ai on Twitter.
F(t): Facet in the time dimension
When we originally architected Facet, we knew that we would eventually want to target short, punchy video sequences and raster-heavy motion graphics. It felt like a lot to bite off for a small team, so instead we engineered our core backend infrastructure to work seamlessly on large batches of images.
That’s how we shipped content-aware batch-editing, and now, surprise, it turns out that videos can be thought of, at some level, as just large-ish batches of still images (with some additional, *nice* continuity structure).
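To make the framing concrete, here is a minimal sketch (not Facet’s actual pipeline; the brightness edit is a hypothetical stand-in for a content-aware operation) of how a batch-editing backend extends naturally to video: a video is just an ordered batch of frames, so any per-image edit can be mapped across it.

```python
import numpy as np

def edit_frame(frame: np.ndarray) -> np.ndarray:
    # Hypothetical per-image edit: a simple brightness boost.
    # In a real content-aware editor this would be a learned transform.
    return np.clip(frame.astype(np.float32) * 1.2, 0, 255).astype(np.uint8)

def edit_video(frames: list[np.ndarray]) -> list[np.ndarray]:
    # A video is an ordered batch of stills: apply the batch edit
    # frame by frame. Temporal continuity means consecutive frames
    # are similar, which a real pipeline can exploit (caching,
    # keyframe propagation) for speed and consistency.
    return [edit_frame(f) for f in frames]

# Toy "video": three identical 4x4 RGB frames of value 100.
video = [np.full((4, 4, 3), 100, dtype=np.uint8) for _ in range(3)]
edited = edit_video(video)
```

The continuity structure mentioned above is what separates video from an arbitrary batch: because adjacent frames overlap heavily, edits can be propagated rather than recomputed from scratch.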
Although the editor experience is still a bit rough, some of our users have been running with this feature and have already made some pretty compelling stuff.
What we’re reading
Our Seed in Facet
We raised a seed round from Accel last summer and they have been absolutely terrific partners for us through this stage of our journey.
“AI” is biased and we have to recognize that
"If the idea of tech not being neutral is new to you, or if you think of tech as just a tool (that is equally likely to be used for good or bad), I want to share some resources & examples in this thread."
The WebAssembly App Gap / Why Figma Wins
"Google docs and Figma have use cases that would have never occurred to Microsoft Word or Adobe XD users."
GANs are the second best solution to every problem
"Here’s something a lot of people still don’t know: The latest DeOldify doesn’t use GANs anymore. And I’m not being cute with terminology- NoGAN isn’t used either. We needed something more production worthy and controllable and it just wasn’t cutting it."
3K, 60fps, 130ms
Five years from now, I predict this post on using Rust for low-latency video streaming will feel obvious and boring to everyone, and that is exactly why I’m excited about Rust.
This is going to be *the* book on open source
"Hi I wrote a book!
It’s called Working in Public, and it’s the story of modern open source and its implications for online communities and the creator economy."
If you are able to, please support the Equal Justice Initiative
We the migrating they trans-lated. Draft, meant drift, meant
scheme, meant sketch
— Nathaniel Mackey, “We The Migrating They”