Let’s do an experiment. I want to see if I can recreate an existing stock photo, taken by a real photographer, using the AI image generator Dall-E. Is it even possible to recreate a stock photo using Dall-E or can we only create something similar?
Well, through a lot of trial and error, I found out that you can actually get pretty close to recreating a stock photo using Dall-E, but it takes a bit of work and probably won’t be identical to the original. But is that good enough? Here’s my whole process of recreating a stock photo in Dall-E.
The Experiment Parameters
Here are the parameters I’ll be following to decide if this experiment is a success or not:
- Image to be recreated must be an existing stock photo from a free stock photo website (This is because I am cheap and don’t want to buy a photo just for this experiment).
- The Dall-E generated image must be photo-realistic when complete.
- The Dall-E generated image can be edited, in Dall-E, as many times as necessary to get a realistic looking ‘photo’.
- If at some point the image edits or image generations are only getting worse, I will conclude that you cannot realistically recreate a stock photo using Dall-E.
- The end result Dall-E generated image does not need to be an exact copy of the original stock photo, but needs to be close enough that both images would work for the same design project with only minor edits in Photoshop.
- We’re ignoring, for now, the fact that Dall-E generated images are lower resolution than a professional photograph; getting a Dall-E generated image up to high resolution is another challenge altogether.
Finding A Simple Stock Photo To Replicate
Since this is my first attempt at replicating a stock photo using Dall-E, I figured we should stick to a very simple photo from Unsplash. If the experiment works and I’m able to fully recreate the image, then I’ll move up to more complicated imagery. Until then, we keep things simple.
For that reason, I think I’ll avoid people. In the image I mean, not IRL. In other words, I’ll try to find an inanimate object, landscape, vehicle, or something floral that I can easily describe to Dall-E. The image shouldn’t have a lot going on in the background either. If Dall-E can create something close enough, then maybe I’ll move on to a different, more complex image. But for now, let’s start with this landscape image I found on Unsplash by Marek Piwnicki.
Recreating The Stock Photo in Dall-E
For my first attempt at recreating this stock photo in Dall-E, I’m going to try the prompt ‘dark blue mountain range in front of an orange sunset realistic‘. I’m not sure how else to describe the image so that’ll have to do.
The results were less than spectacular. What I got looked more like vector illustrations than photographs. The second batch of images wasn’t great either, but it was somewhat more realistic, likely because I added the word ‘photo’ to the prompt.
Next I altered the prompt to ‘far away dark blue rocky mountain range in front of an orange sunset sky realistic photo’ and saw better results, though they still weren’t as close as I would like, and the colours were a bit off. I tried to generate more imagery using the same prompt, and each time came pretty close to something that would work, but not without significant editing, either in Dall-E or in Photoshop.
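(As an aside: I did all of this in the Dall-E web editor, but the same prompt-and-regenerate loop can be scripted. Below is a minimal sketch using OpenAI’s Images API in Python; the model choice, batch size, and file names are assumptions for illustration, not what I actually ran.)

```python
# A rough sketch of scripting the generate-and-compare loop with the OpenAI
# Python SDK (the `openai` package). I used the Dall-E web editor for this
# experiment, so the model, batch size, and file names are assumptions.
import base64
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

prompt = "far away dark blue rocky mountain range in front of an orange sunset sky realistic photo"

# Request a small batch of candidates, similar to the batches the web editor returns.
result = client.images.generate(
    model="dall-e-2",
    prompt=prompt,
    n=4,
    size="1024x1024",
    response_format="b64_json",
)

# Save each candidate so it can be compared against the original stock photo.
for i, image in enumerate(result.data):
    with open(f"mountain_candidate_{i}.png", "wb") as f:
        f.write(base64.b64decode(image.b64_json))
```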
I wasn’t getting much closer just by regenerating the same prompt, though the last image generated wasn’t terrible. Regardless, I had to try something different. I added a few words to the original prompt to get ‘far away pale blue very rocky mountain range in front of an orange sunset sky realistic photo’ and the results were very different. I actually liked these images better, but in many cases they were much further from the original stock photo.
I kept regenerating the prompt but wasn’t getting much closer to the original. In some images the mountains weren’t rocky enough, or there wasn’t enough texture, or the composition was off, or the sky was too blue. I felt like unless I regenerated the image about a hundred times, I wouldn’t be able to get something close to the original. Unfortunately, constantly generating Dall-E imagery costs money, so I need to keep a limit on things here.
Anyway, I added the word ‘detailed‘ into the prompt and changed ‘sunset’ to ‘sunrise‘. I basically got the same results, but the mountains seemed to have a bit more texture and the orange appeared more accurate to the original photo.
Are we there yet? Has Dall-E generated a single image that could replace the original stock photo in a design project? There seem to be a few that are close, but not close enough. I added the word ‘textured’ into the prompt and that seemed to help things a bit.
This next batch is actually pretty decent. I think if we take the second image and edit the clouds a bit, in Dall-E, we could get something close to what we’re looking for.
So using the second image in this most recent batch of photos, I erased the clouds and regenerated that area. I also expanded out the sides a bit, since the original image is rectangular and we needed to match that aspect ratio. Next, I took the newly generated image into Photoshop to get rid of the bars at the bottom right and put a bit of blue into the mountains to match the original. This is what I ended up with after some minor edits.
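For the curious, this erase-and-regenerate step is what the API calls an image edit: you supply the image plus a mask whose transparent pixels mark the area Dall-E is allowed to repaint. I did it in the web editor, but a rough sketch of the equivalent call (with hypothetical file names) would look like this:

```python
# A rough sketch of the erase-and-regenerate step as an API call. The mask is a
# PNG with the same dimensions as the image, where fully transparent pixels mark
# the region to repaint (here, the clouds I erased). File names are hypothetical.
import base64
from openai import OpenAI

client = OpenAI()

result = client.images.edit(
    image=open("mountain_candidate_1.png", "rb"),
    mask=open("clouds_erased_mask.png", "rb"),
    prompt="far away pale blue very rocky mountain range in front of a detailed "
           "textured orange sunrise sky realistic photo",
    n=1,
    size="1024x1024",
    response_format="b64_json",
)

with open("mountain_clouds_regenerated.png", "wb") as f:
    f.write(base64.b64decode(result.data[0].b64_json))
```

Expanding the canvas works the same way in principle: pad the source image with transparent pixels on the sides you want to extend and let the edit call fill them in.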
Recreating A Stock Photo using DALL-E – The Results
As you can see, the resulting photo isn’t an exact match of the original stock photo, but the colours and general composition are very similar. The Dall-E one, surprisingly, has a bit more texture than the stock photo, though its mountains do appear closer to the viewer. If I wanted an even closer match I could likely play around with it in Photoshop a bit more, but that would take much more time. For a design project, however, the two photos are similar enough that the Dall-E one should be able to replace the stock photo.
The most realistic scenario where recreating a stock photo using Dall-E would be useful is when a designer finds a stock photo they want to use for a project but there is absolutely no budget for it. The designer could recreate the photo in Dall-E, then edit it in Photoshop to get exactly what they need, or perhaps even improve upon the original. Although, until Dall-E allows you to generate dramatically higher resolution images, we’re stuck using Dall-E images in digital format only. Though you can use a tool like Img.Upscaler if you’re okay with losing some of the detail in your AI generated imagery.
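If all you need is a larger file for a layout comp, a plain resample is the quick-and-dirty route, with the caveat above that you won’t gain any real detail. Here’s a minimal Pillow sketch as a stand-in; a dedicated AI upscaler like Img.Upscaler will do a noticeably better job.

```python
# A quick sketch: doubling the pixel dimensions of a Dall-E image with Pillow.
# A plain Lanczos resample won't invent new detail the way an AI upscaler can;
# it just produces a larger, slightly softer file. The file name is hypothetical.
from PIL import Image

img = Image.open("mountain_final_edit.png")
upscaled = img.resize((img.width * 2, img.height * 2), Image.Resampling.LANCZOS)
upscaled.save("mountain_final_edit_2x.png")
```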
Recreating A Stock Photo using DALL-E – In Summary
Overall I think this experiment turned out well. I was able to nearly recreate a simple stock photo using Dall-E that I could realistically use for a design project. I suppose this means I’ll have to move on to recreating a more complex photo next; maybe a face or architecture? That actually sounds difficult.
Without being able to completely and accurately describe every single detail of the original stock photo, the only real option for replicating a stock photo using Dall-E is to keep generating until you get something close, then make minor edits in Photoshop. If Dall-E also allowed things like making rough compositional sketches before generating an image, that would dramatically increase our ability to generate something highly specific and useful for a particular project.
Have you tried to recreate a stock photo using Dall-E for a project? Is there a design scenario that you can think of where being able to recreate a stock photo using Dall-E would be beneficial? Let me know in the comments below!
–
Check out a similar post where I test the real-world capabilities of Dall-E for design projects: Can You Make Awesome Album Cover Art With Dall-E?
Photo Credits
Photo by Marek Piwnicki on Unsplash