For those of you who don’t know me: Hi, I’m Kelly and I’m a graphic designer.
I’ve been using Adobe products for roughly ten years now and every year, the updates get better and better. The main products I use from Adobe are InDesign, Illustrator, Photoshop, Lightroom, After Effects, Acrobat… you get the picture, I love them all. Well, today is day 2 of the Adobe MAX Conference, and yesterday they unveiled some really cool updates — but the ones that caught my eye were Photoshop’s Sky Replacement tool and Neural Filters.
We’ve talked about artificial intelligence a lot during this semester, so imagine my excitement when I saw it brought to life through this new update. Adobe introduced an AI-powered tool that allows you *with a single click* to make facial alterations like changing someone’s expression, smoothing their skin, changing the direction of their head, transferring makeup from a reference image, and even aging them! This update, I assume, was created in response to the countless photo-editing apps social media users turn to for selfie manipulation, but man is it cool.
They’ve also introduced a Sky Replacement feature which, again, lets you replace the sky with a single click. For readers who are not familiar with Photoshop: replacing the sky in a detailed image can be very time-consuming. You have to mask out details, clip out the image, adjust the lighting… it’s a lot of work! The best part of this feature, though, is that it doesn’t just replace the sky — the AI also detects the lighting, shadows, and colorization that are all affected when the sky changes, especially if you’re shifting the time of day from a mid-morning sunny sky to a late sunset.
Let’s test these new features out.
Sky Replacement. On the left is the original image of City Hall in Philadelphia. On the right, I replaced the sky with the new Sky Replacement feature in Photoshop and made NO tweaks to it. It could definitely still be refined to brighten up the top of City Hall, but you can see that the colorization in the foreground and on the buildings echoes the colors from the new sky, making it feel more realistic.
Neural Filters. On the left is my original headshot. I decided to play around with a few of the new features in these filters. First up, the expression feature. There were a few different options for this: happiness, shock, and anger… I took the shock route since anger distorted my face in a way that was truly unflattering. Next, I tried the age filter. This is where I started to notice that the filters really only alter the face and hair within the blue frame, and I haven’t quite figured out how to adjust that frame. All I got out of this was that I will, in fact, look like my mom in 30 years. Finally, I tried out the head angle filter, because how is this even possible? Out of all the new Neural Filters, I found this one the most successful: it altered the image just enough to change the angle of my face without introducing changes I couldn’t fix with a little finessing.
All of these features were done with one click and can be adjusted and finessed afterward, but I can see this AI really changing the game for designers when it comes to editing photos quickly. The Neural Filters are still in beta, so I’m excited to see how they evolve over the next few updates.
Adobe also introduced some other new features this October. If you’re an illustrator, you’ve been wowed by Procreate (not affiliated with Adobe) since its inception. It broke the barrier between illustrating with tools like Wacom tablets and attempting to sketch with your mouse by letting you create full digital illustrations on your iPad. Procreate has made great updates over the years to ensure compatibility with Adobe programs, letting you export PSD files and edit layers in Photoshop, but there was still a disconnect between the two programs that really slowed down the design process. This year, Adobe finally introduced a competing product: Illustrator for the iPad. The beauty of this app is that you can create completely customizable vector illustrations on your iPad, in some ways even more efficiently than on desktop. It took what Procreate was doing and pushed it further, bringing back the ability to actually sketch and draw digital forms while also letting you tweak and fine-tune anchor points the way you typically would on your desktop.
Adobe also introduced a Content Authenticity Initiative that will help creatives claim their work while also allowing viewers to see how images have been digitally modified over time — reducing fake Photoshop edits and “fake news” while increasing proper exposure for the artist behind the work. Think about how many artists’ pieces currently go viral without anyone knowing who is behind them. With this new technology comes equal opportunity: artists will be able to have their work properly associated with them, which will open the door to more collaborations, opportunities, and exposure.