By Ali Menard

Testing Out the Beta Version of the New Firefly Integrated Adobe Photoshop - Part 1



Whether we like it or not, using AI is going to become standard practice across creative fields. While many are not ready to embrace AI, I believe it is imperative for designers and creative types of all kinds to learn how to integrate it into their work. Not long ago, I wrote about testing the new AI features in the online beta version of Adobe Firefly. You can read that blog here. For this blog, I explored the new beta version of Adobe Photoshop with Firefly built in.


Before beginning, I did a little deep dive into social media to better understand what this software is capable of. Between Instagram, TikTok, and YouTube, I found hundreds of videos of artists doing exactly what I was about to do, with plenty of both successful and unsuccessful attempts at testing out the integrated Photoshop.


Background Replacement

I started simple with a feature I had previously used in the online beta version of Adobe Firefly: generative fill. Once again, I started with a picture of my fiancé, because I just can't resist the urge to make him look a little funny (sorry, Erik). Just like before, the more descriptive the prompt, the better the results.


For the first experiment, I decided to change the background and add a little bow tie just for kicks. Once the background was selected, I prompted the program to swap it out and place the foreground subject on a bench, using earthy tones. Then I selected a small area around his neck and prompted the program to add a bright blue bow tie with no pattern (see results below). The first big visual difference between the online version of Adobe Firefly and the integrated Photoshop was the accuracy of the lighting and shading. Take a close look at the bow tie: you can see the shadow beneath it, right where it should be. If the shadow hadn't looked right, I would have spent the time changing the direction of the lighting in Photoshop or another Adobe program. Thankfully, that extra step wasn't necessary.


Online Version of Firefly


Adobe Photoshop with Firefly


And what good would an experiment be if you didn't attempt to replicate the results a second time? Thanks to an older picture of my friend Rita sitting in an open field of grass, I was able to experiment with background replacement once more. After selecting the background with the quick selection tool, I prompted the program to "replace background with a large park filled with lots of grass, flowers, tall trees, and a small pond." I am pleased to say I was not disappointed: both images the program produced were viable replacements for the original background. You can see both results below.
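For anyone curious about eventually automating this kind of edit outside the Photoshop interface, Adobe also offers Firefly as a web service. The sketch below is a minimal, hypothetical example of what a scripted generative fill request might look like in Python; the endpoint path, payload fields, and environment variable names are assumptions for illustration, not the documented API, so check Adobe's Firefly Services documentation before relying on any of it.

```python
# Hypothetical sketch of a scripted "generative fill" request.
# The endpoint path, payload fields, and headers below are illustrative assumptions;
# consult Adobe's Firefly Services documentation for the real contract.
import os
import requests

API_BASE = "https://firefly-api.adobe.io"               # assumed base URL
ACCESS_TOKEN = os.environ["FIREFLY_ACCESS_TOKEN"]        # assumed: token from Adobe IMS OAuth
CLIENT_ID = os.environ["FIREFLY_CLIENT_ID"]              # assumed: your API client ID


def generative_fill(image_id: str, mask_id: str, prompt: str) -> dict:
    """Ask Firefly to regenerate the masked region of an uploaded image from a text prompt."""
    response = requests.post(
        f"{API_BASE}/v3/images/fill",                    # assumed endpoint
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "x-api-key": CLIENT_ID,
            "Content-Type": "application/json",
        },
        json={
            "prompt": prompt,                            # descriptive prompts produce better results
            "image": {"source": {"uploadId": image_id}},
            "mask": {"source": {"uploadId": mask_id}},   # the selected background area
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()


# Example mirroring the prompt used above (IDs are placeholders).
result = generative_fill(
    image_id="UPLOADED_PHOTO_ID",
    mask_id="BACKGROUND_MASK_ID",
    prompt="replace background with a large park filled with lots of grass, flowers, tall trees, and a small pond",
)
print(result)
```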



Expanding the Background

Moving on to a trick I saw over and over again on Instagram: expanding the picture to extend the background. To start, I found a photo I thought would work well for this experiment. Next, I expanded the canvas, selected the two blank areas on either side, and prompted generative fill to fill in the background using elements from the picture. Aside from the fact that the tall grass doesn't have flowers like the rest of the picture, this is not a bad start. To add in more flowers, I took the photo back into regular Adobe Photoshop. The end results weren't perfect, but they were satisfying enough to move on.


Before


After Using Adobe Photoshop with Firefly


After Using Adobe Photoshop
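If you wanted to set up the "expand the canvas, then fill in the blank areas" step outside of Photoshop, the preparation is plain image manipulation. Below is a minimal Python sketch using the Pillow library that pads a photo on both sides and builds a mask of the new blank regions; that mask is what you would hand to a generative fill step. The file names and pad width are placeholders, not anything from the experiment above.

```python
# Minimal sketch: expand a photo's canvas and build a mask of the new blank areas,
# the same setup used before prompting generative fill to extend the background.
# File names and pad width are placeholders.
from PIL import Image

PAD = 400  # pixels of new canvas added on each side (arbitrary choice)

photo = Image.open("field_photo.jpg").convert("RGB")
width, height = photo.size

# New, wider canvas with the original photo centered horizontally.
expanded = Image.new("RGB", (width + 2 * PAD, height), color=(255, 255, 255))
expanded.paste(photo, (PAD, 0))
expanded.save("field_photo_expanded.jpg")

# Mask: white where the canvas is blank (to be generated), black where the photo is.
mask = Image.new("L", expanded.size, color=255)
mask.paste(Image.new("L", (width, height), color=0), (PAD, 0))
mask.save("field_photo_expand_mask.png")
```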


While the online version of Adobe Firefly and the integrated Adobe Photoshop are very similar, there were several differences in using each tool. In the online version of Adobe Firefly, the additive generative fill tool was only accurate to a point; in Photoshop, the lighting and angles of the objects and backgrounds were much more accurate. Plus, since you are already in Photoshop, you can easily correct any mistakes the AI makes. In the online version of Firefly, once you added a description for the AI to generate, you controlled the look and feel of each picture with the on-screen settings. When using the generative fill tool in Photoshop, you must write more descriptive, more specific prompts to produce the results you are looking for.


What else is the new integrated Photoshop capable of? Can it generate entire images just like the online version of Firefly? Find out in my next blog.



About A3 Media

A3 transforms media from an expense into a smart investment. Since 1997, we have successfully helped regional businesses launch new products, expand into new markets and increase sales through media plans that make every dollar spent do more. Our clients include brands such as Yuengling and Ashley Furniture. For more information about how A3 Media can help your digital marketing efforts, please call A3 Media at (610) 631-5500.
