More on using AI to make Film Simulation Recipes

Captured with the AI-made Urban Dreams Film Simulation Recipe on my Fujifilm X100V.

In early 2023, Artificial Intelligence burst onto the Film Simulation Recipe scene, and, at the request of photographers across the world, AI began creating camera settings meant to emulate all sorts of different aesthetics for Fujifilm models. I was on the leading edge of this, and in March published Using AI to Create Film Simulation Recipes, which included the ChatGPT-created Urban Dreams Recipe.

I didn’t stop there. In April I published a YouTube video (which you’ll find below) entitled Kodachrome X Fujifilm Recipe made by ChatGPT vs Fuji X Weekly Kodachrome II. I don’t publish very many YouTube videos (only three in 2023); however, I felt this topic was important enough to dedicate a video to it. Shortly thereafter, also in April, I published a companion article called Can AI Make a Fujifilm Recipe?, which included the ChatGPT-created Kodak Ektachrome E100VS v1 Recipe (along with the non-AI v2).

I didn’t quit there—not even close! In June, on the SOOC Live broadcasts, Fujifilm X-Photographer Nathalie Boucry and I chatted twice (for about three hours in total) about ChatGPT Film Simulation Recipes. If you’re curious about this topic, that’s probably the most informative discussion you’ll find. I’ve included both of those videos below.

Also in June, as a companion piece for those two SOOC Live episodes, I published an article entitled Shooting with ChatGPT AI Fujifilm Film Simulation Recipes. That article included two more AI-made Recipes: Soft Blue Classic and Vivid Summer Glow. I’ve published a total of four ChatGPT Film Simulation Recipes on this website. I think two are pretty good, and two are very mediocre—only the two that are “good” are in the Fuji X Weekly App.

While I’ve published four AI-made Recipes, I’ve asked ChatGPT to create probably close to 50 (using multiple accounts). Most of them weren’t good, so they never saw the light of day. I’ve talked about all of this before, but just to quickly rehash, here are some of the big issues with using ChatGPT to create Film Simulation Recipes for Fujifilm cameras:

  • AI will always give you an answer, but whether that answer has any value is another question entirely.
  • AI can’t see, and doesn’t know how the various camera settings translate to real-world use. It can only attempt to match descriptions.
  • If you ask it to create the same Recipe—using the same exact wording—on multiple accounts, you will get multiple answers, often quite divergent from each other.
  • I’ve never experienced an instance where all of the required Recipe parameters were included in the first try. Each time, I’ve had to identify the missing camera settings, and ask AI to add those to the Recipe.
  • AI will sometimes include nonexistent or nonsensical settings.
  • ChatGPT is very predictable, and will not usually stray outside of a certain box. Ask it to make enough of these, and the box is easily identifiable.
  • ChatGPT doesn’t cite its sources, even when asked. It definitely operates unethically, and probably (in my opinion) illegally—eventually courts will make various rulings as they hear different lawsuits.

So why bring this up again? First, I’ve had several people over the last month comment that ChatGPT is a wonderful resource for Fujifilm Film Simulation Recipes. I strongly disagree with that for the reasons mentioned above, and I thought it worthwhile to discuss those reasons again for those who might be unaware. Second, someone asked if Google’s Bard AI was any better than ChatGPT at creating Film Simulation Recipes. I wasn’t sure, so I spent some time over Thanksgiving putting it to the test. So let’s find out!

Captured with the AI-made Soft Blue Classic Film Simulation Recipe on my Fujifilm X100V.

The first question I asked Bard was, “Make me a Film Simulation Recipe for my Fujifilm X-T5 that mimics the aesthetic of Kodachrome 64 film.” Well, I could tell very quickly that this wasn’t going to work out well, as the first setting it gave was the Acros film simulation. It included the nonsensical setting of Color +2 (you cannot select Color when using Acros). The settings that Bard failed to provide were Dynamic Range, White Balance Shift, Color Chrome Effect (although it did give Color Chrome FX Blue), and Clarity. I assume that it would have also missed Grain size, but it said to set Grain to Off.
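
To make the missing-and-nonsensical-settings problem more concrete, below is a small, purely illustrative Python sketch (not any real tool): it checks an AI-generated Recipe against the parameters that a complete Recipe on this site normally spells out, and flags in-camera impossibilities like a Color value paired with Acros. The Bard answer it tests is an approximate reconstruction; any value Bard didn’t explicitly state is just a placeholder.

```python
# Illustrative only: compare an AI-generated "Recipe" against the parameters
# a complete Recipe normally specifies, and flag impossible combinations.

# Parameter names follow a typical X-Trans V Recipe as published on this site.
REQUIRED_PARAMETERS = {
    "Film Simulation", "Grain Effect", "Color Chrome Effect", "Color Chrome FX Blue",
    "White Balance", "White Balance Shift", "Dynamic Range", "Highlight", "Shadow",
    "Color", "Sharpness", "High ISO NR", "Clarity",
}


def check_recipe(recipe: dict) -> list[str]:
    """Return a list of problems found in an AI-generated Recipe."""
    problems = [f"missing: {name}" for name in sorted(REQUIRED_PARAMETERS - recipe.keys())]
    # Acros and Monochrome are black-and-white, so a Color setting makes no sense.
    film_sim = str(recipe.get("Film Simulation", ""))
    if film_sim.startswith(("Acros", "Monochrome")) and "Color" in recipe:
        problems.append(f"nonsensical: Color cannot be selected with {film_sim}")
    return problems


# Rough reconstruction of Bard's "Kodachrome 64" answer; values not quoted in
# the article are placeholders.
bard_kodachrome_64 = {
    "Film Simulation": "Acros",
    "Color": "+2",                    # nonsensical with Acros
    "Highlight": "-1",
    "Shadow": "+0.5",
    "Grain Effect": "Off",
    "Color Chrome FX Blue": "Weak",   # placeholder strength
    "White Balance": "Auto",          # placeholder
    "Sharpness": "0",                 # placeholder
    "High ISO NR": "0",               # placeholder
}

for problem in check_recipe(bard_kodachrome_64):
    print(problem)
# missing: Clarity
# missing: Color Chrome Effect
# missing: Dynamic Range
# missing: White Balance Shift
# nonsensical: Color cannot be selected with Acros
```

Running it lists the same gaps noted above (Dynamic Range, White Balance Shift, Color Chrome Effect, and Clarity), plus the Color-with-Acros conflict.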

Obviously that “Kodachrome 64 Recipe” will not look anything like actual Kodachrome 64 slides, even if it did somehow make sense. I asked Bard to cite its sources, and to my surprise it did! It gave me (broken) links to my website, including these articles: Kodachrome 64, Kodachrome 25, and Monochrome Kodachrome. Apparently, Bard has been trained using Fuji X Weekly (or to search for it… Bard is a Google product, after all), but it’s not very good at it. In my opinion, AI should provide a bibliography of its sources upfront without being asked. Even Wikipedia does that, and Wikipedia apparently isn’t a reliable enough source for serious work; if Wikipedia can’t be taken seriously, then AI should be taken even less so. It’s super sketchy that AI doesn’t cite its sources, but at least Bard will do so when prompted, which is better than nothing; ChatGPT won’t even do that.

I asked Bard to tell me why it chose the settings that it did. This demonstrates very clearly that trying to match descriptions of camera settings with descriptions of the film is an unrealistic method for achieving accurate approximations of various looks. For example, “Acros is a black and white film simulation that provides a good starting point for replicating Kodachrome’s tonal range and contrast.” And, “Kodachrome had a tendency to lift shadows, creating a slightly brighter and more open appearance. Raising the shadow setting by 0.5 replicates this effect.” And, “Kodachrome had a subtle ability to retain detail in highlights, preventing them from becoming blown out. Lowering the highlight setting by -1 helps achieve this result.” If you know even a little about Kodachrome 64 film and Fujifilm camera settings, you should have no issues identifying the problems there.

Bard’s “Kermit the Frog juggling Elmo Recipe” is not a good choice for portraits.

I didn’t stop with that one Recipe. I asked Bard to make many more. After a number of tries, I began to ask it for ridiculous Recipes, just to see what it would do. “Make me a Film Simulation Recipe that mimics Kermit the Frog juggling Elmo.” “Make me a Film Simulation Recipe that mimics the look of digesting rotten apples that are crawling with spiders.” “Make me a Film Simulation Recipe that mimics the aesthetic of comedian Brian Regan hitchhiking across the Delaware River during the Clone Wars.” In each case—instead of responding with “you are nuts” or “that makes no sense”—it provided me with a Recipe. It always will. For the Brian Regan Recipe, it told me that, in order to capture the essence of his adventure, I should photograph him on a rural road next to a riverbank with some fog. Sounds intriguing, but there’s a problem.

You see, Bard has a bigger tendency than even ChatGPT to include nonexistent settings. Some of these are Color Chrome FX Red (which, arguably, could just be Color Chrome Effect), Color Chrome FX Green, Grain Effect Medium, WB Shift +1, and Image Enhancement On. This is in addition to the required settings that it leaves out entirely, and the occasional nonsensical ones (like Color +2 with Acros). Sometimes it’s hard to make sense of what Bard provides when asked to make a Recipe.

For those wondering about Color Chrome FX Green, Bard had this to say: “Color Chrome FX Green increases saturation and vibrancy of green pixels and slightly lifts green shadows for a more balanced tonal range. Use sparingly, as a subtle application of Color Chrome FX Green can produce pleasing results. Excessive use may make the image appear unnatural.” Of course, there is no such thing as Color Chrome FX Green, but even if there were, it wouldn’t work as described by Bard.

Captured with the AI-made Kodak Ektachrome E100VS v1 Film Simulation Recipe on my Fujifilm X-T5.

The conclusion of this whole article is that ChatGPT and Bard are not good resources for Film Simulation Recipes for Fujifilm cameras. They are both capable of making Recipes (they will always give you one if you ask), but whether or not those Recipes are worth using or resemble what they’re supposed to emulate is another story entirely. A broken clock is correct twice each day, and sometimes you can get a good Recipe by chance when using AI, but most often that won’t be the case. However, because these chatbots tend to stay within a box and don’t often stray outside of it, the Recipes they make are often not terrible; they’re usually okay-ish. But you have to put up with the missing, nonexistent, and nonsensical settings. And you have to be alright with the unethical nature of it. My personal advice, if you want a Recipe that is more likely to give you good results, is to look no further than my website.

Someone said that AI will usually do a better job than a complete novice, but never as good as an expert. I agree with that assessment. I think if you ask ChatGPT or Bard enough times for Film Simulation Recipes, you are bound to get some that are trash, a bunch that are so-so, and a couple that are actually good. You could probably have as much luck rolling dice or throwing darts to create Recipes. That’s just the way it is. However, the technology is advancing rather quickly, and it’s only a matter of time before AI will be able to analyze pictures, understand Fujifilm camera settings, and approximate an accurate replication of picture aesthetics with the available JPEG settings. We’re not there yet, though. There’s a significant gap. I don’t doubt that the technology will get there eventually, perhaps in another five or ten years.

That will put me out of business. I understand that day is eventually coming, and that’s ok. It is what it is, as they say. The sad thing is that AI is using my own words and work against me. The more I publish, the more resources OpenAI and Google have to train their AI. It’s a type of theft. I do think that the courts will eventually rule that much of what they’ve done doesn’t qualify as “fair use” under the law, but the damage will already have been done, and folks like me won’t be compensated a dime for it. That’s the way life goes sometimes. The big guy walks all over the little guy.

But, when life throws you punches, bob and weave (I’ve heard that quote attributed to Joe Frazier, but I’m not certain), which simply means that I have to evolve. As Bob Dylan sang, the times they are a-changin’. They always have been, and always will be. That’s why I spent so much time this year exploring the topic of AI Film Simulation Recipes. I had to know where Fuji X Weekly stands in all of it. This website has evolved a whole bunch since its launch in 2017, and it will continue to do so in the coming months and years. I appreciate all of those who have come along for the ride, and I look forward to seeing where this Fuji X Weekly thing goes as the future unfolds. If anything, it will be an interesting adventure, perhaps like Brian Regan’s hitchhiking escapades along the foggy Delaware River during the Clone Wars.

12 comments

  1. Randy · November 26, 2023

    Very interesting! I also tried ChatGPT and the results were a long way from what I was seeking, but I haven’t tried Bard yet. I did have success using Microsoft Bing, though, to generate recipes. Two out of three were actually pretty good, especially one where I requested a recipe to mimic aspects of Eggleston’s photographs.

    • Ritchie Roesch · November 26, 2023

      I’ve never been able to get Bing Chat (I think that’s what they’re calling it now) to make me a Recipe, only to provide ones already made by others (mostly those made by me). Maybe I’m using the wrong wording? Anyway, this is preferable, I think, because it provides a link to the source and gives credit to the Recipe creator. However, I’d pretty much get the same results just Googling it. Maybe I’m just not asking the right way, though.

  2. Nick PlanB Photo · November 26, 2023

    Ritchie,

    I was writing a document on a fairly obscure subject (women’s roller derby) a few months ago and asked ChatGPT a few questions to see if it would help. While the language it used was clumsy, the content was reasonable, and it even quoted the sources (I had to ask for that).

    When I went to read the articles it had quoted from, I couldn’t find them – the journals existed, but not the articles or their authors.

    I asked for URLs for the articles and was given some nonsense about ChatGPT not being a search engine.

    Then I asked “did you make up the names of the authors and the articles?” Short answer: Yep.

    I can see a time when this ‘glitch’ has been ‘fixed’ and it’ll be able to write my documents for me, or create wonderful film sims.

    I just hope it takes a loooong time before we get there.

    • Ritchie Roesch · November 26, 2023

      Yeah, it’s less trustworthy than most other possible sources available (but that will change with time, I’m certain). I just wish more people would see that it’s not especially trustworthy at the moment, and take what it says with a very large grain of salt, and not as gospel.

  3. Mark Scheuern · November 26, 2023

    Now that ChatGPT 4 can see, and browse the web, I gave it an Eggleston photo (the tricycle) and asked it to come up with a recipe. As it was browsing, it showed me what it was looking at—this very web site—and came up with this:

    For capturing the look of William Eggleston’s photography with your Fujifilm X-T5, you can use the “1970’s Summer” film simulation recipe which aims to reproduce the aesthetic of the 1970s American New Color movement, reminiscent of old Kodak photographs and Eggleston’s style. Here are the settings for this film simulation recipe ([1970’s Summer — Fujifilm X-T5 (X-Trans V) Film Simulation Recipe | FUJI X WEEKLY](https://fujixweekly.com/2022/11/27/1970s-summer-fujifilm-x-t5-x-trans-v-film-simulation-recipe/)):

    – Film Simulation: Nostalgic Neg.
    – Grain Effect: Strong, Large
    – Color Chrome Effect: Strong
    – Color Chrome FX Blue: Strong
    – White Balance: 6500K, -1 Red & -4 Blue
    – Dynamic Range: DR400
    – Highlight: -2
    – Shadow: -0.5
    – Color: -2
    – Sharpness: -2
    – High ISO NR: -4
    – Clarity: -3
    – ISO: Auto, up to ISO 6400
    – Exposure Compensation: +1/3 to +1 (typically)

    This recipe works best in sunny daylight and is particularly suited for midday photography, capturing the warm tones of the summer season of the 70s. It is exclusively for X-Trans V sensor cameras like the X-T5, so it should work perfectly for your needs.


    So in this case, it pretty much did a search rather than coming up with something on its own, and cited the source. I’m not sure if my providing the photo helped any, since I also told it that I was after an Eggleston look.

    • Ritchie Roesch · November 26, 2023

      From what I’ve seen on the image recognition side, there seems to be an extensive database that it can compare to and glean from (and I’m guessing that they’re using these pictures without permission, probably planning to claim “fair use” whenever they are inevitably challenged on it). Ultimately, though, it cannot “see” the pictures but can recognize objects by comparison and gather the keywords or descriptions that have been given for those pictures.

      If I were to emulate Eggleston’s tricycle picture, I would not choose 1970’s Summer. It was captured with second-era Kodachrome, so Kodachrome II would be the logical choice. The Kodachrome 64 Recipe could probably work, too. It seems to be an overcast day, so that has to be taken into consideration. Also, the picture isn’t the slide or a scan of the slide, but a reproduction of a dye-transfer print (which is an art in itself), which certainly affects the outcome.

      I’m glad to see it refer to an existing Recipe and give credit and a link. I’ve not seen ChatGPT do that before.

      • Mark Scheuern · November 27, 2023

        I was wondering if you would agree. Yeah, Kodachrome makes more sense.
        It actually did say that printing and scanning would affect the look, so it got that right, anyway.

      • Ritchie Roesch · November 27, 2023

        I’m guessing that it said that printing and scanning would affect the look because I’ve said that over and over and over and over in the various articles. In other words, I (unknowingly and unwillingly) trained it to say that (and probably others did, too). Just like it even knows what a Fujifilm Recipe is largely because of me. Without permission, compensation, and (most often) attribution, of course. To say it’s sad is such a massive understatement, because it is so very much worse than that. Thanks for sharing!

  4. Mark Scheuern · November 26, 2023

    Out of curiosity, I gave it another Eggleston picture and asked what film it thinks was used. It said the photo has a vintage look and was probably shot in the ’70s or ’80s, but then said that it was probably Kodak Portra (no, that was introduced in 1998) or Gold (not sure but I think that wasn’t available until later, too). It also guessed Kodachrome or Ektachrome.
    So, as is often the case, it kind of gets it but also gets things quite wrong.

    • Ritchie Roesch · November 26, 2023

      It probably didn’t have much information on that particular picture in its database. If there’s not much of a description available, it doesn’t have much to give you. ChatGPT also has a tendency to just make stuff up sometimes…

  5. Mark Scheuern · November 27, 2023

    Yes, I think the ‘70s era car in the picture was a big clue for it. And it definitely does make things up. When I pointed out that Portra wasn’t around then, it did its “yes, you’re right” thing.
    Always interesting how it can be amazingly good at some things but really bad at others, often in the same response. It’s only going to get better, though. It’s already come along hugely.
    I’ve tried it on some of my own shots, for keywording purposes, and it can at times be surprisingly perceptive.

    • Ritchie Roesch · November 27, 2023

      I think the pool of pictures that it can compare to is massive. Probably a lot of people are completely unaware that their own images are in that pool. 😮
