Midjourney's New Style Reference Tool is a Big Deal! - AIxC: 40
+ Things that Don't Help and I Have an Idea for... Custom GPT
Welcome back to newsletter #40. Today’s newsletter is exciting as we dive into a new tool Midjourney just made available. One that sets the tone for what’s to come with Ai tools.
Tips: Midjourney’s New Style Reference Tool
Crystal Ball: Things that Don’t Help
Strategy: I Have an Idea for…
Community: Consultations
01 / TIPS
Midjourney’s New Game-Changing Style Reference Tool
Midjourney recently released a new tool that gives designers more control, the Style Reference tool.
The tool references images you input and pulls out visual style cues, which it then includes in your outputs.
In the video above, you’ll see that I started with the memorable Rabbit R1 and used it to create a range of products with a similar design language, i.e. rounded corners, minimal, orange, etc.
The results aren’t 100% there, but that’s a symptom of limited time on my part, not limited capability in the tool. With more time and care, I’m confident I could get the design language to come across even more.
My key takeaways:
SREF is a big deal; it will be a crucial part of the majority of prompting.
In our courses I often talk about the importance of term value dilution/boosting. This feature helps with that immensely.
The principles of IMAGELCP are still important; however, SREF is taking on a big role in communicating the attributes (the A in IMAGELCP).
Use alongside other features like reference images / image weights for even better results.
Can get complicated with all the levers, but worth figuring out.
Process shown here:
Gather reference images.
Upload to Midjourney.
Build a trimmed-down IMAGELCP prompt, i.e. model, guidance, and maybe one more term is enough.
Add --sref and your image links (up to 3). It would look like this: /imagine: helmet, orange, front view --sref imageurl.
Add --sw with your preferred style weight (0 to 1000).
Run permutations as needed. I suggest exploring --sw, --s, and --iw to see how they impact things (there’s a small scripting sketch after this list if you want to batch those variations).
Run everything else we’ve learned to iterate: re-roll, vary subtle, vary strong, vary by region, remix, etc.
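If you like to script your permutation runs, here’s a minimal sketch of how I approach that exploration step. It’s plain Python; the build_prompt helper, the example URL, and the weight values are my own placeholders rather than anything Midjourney provides, and you’d still paste each resulting line into /imagine in Discord yourself.

    # Minimal sketch: assemble Style Reference prompt strings to paste into /imagine.
    # Swap the placeholder URL and weights for your own references and preferences.
    def build_prompt(subject, sref_urls, sw=100, stylize=None, image_weight=None):
        parts = [subject]
        parts.append("--sref " + " ".join(sref_urls))   # up to 3 style reference image links
        parts.append(f"--sw {sw}")                      # style weight, 0 to 1000
        if stylize is not None:
            parts.append(f"--s {stylize}")              # stylize value
        if image_weight is not None:
            parts.append(f"--iw {image_weight}")        # image prompt weight
        return " ".join(parts)

    # Permutation run across style weights to see how --sw changes the output.
    refs = ["https://example.com/rabbit-r1.png"]
    for sw in (50, 250, 1000):
        print(build_prompt("helmet, orange, front view", refs, sw=sw))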
I’ll be doing an entire module on this one. Coming soon.
02 / CRYSTAL BALL
This Doesn’t Help
I see posts like this almost weekly. They’re always tied to a new tool coming out and how it’s supposedly ending a profession. And almost always, they come from someone who isn’t in the profession that is set to be “destroyed”.
I get it, it gets clicks.
Will new tools impact tasks we do in our professions? Yes.
Will they alter our job roles? Yes.
Will we need to adapt? Yes.
Will we come out ahead? I strongly believe so.
I think we’re just now starting to see the first glimpses of what can happen when you combine Ai tools with professional human creative experience. It gives me a lot of hope for how this will play out. And in my eyes, it’s far from the “destroyed” narrative.
03 / STRATEGY
I Have an Idea for…
Late last year, Greg and I took part in a 4-day Ai Design Sprint. Since then, Greg has put a lot of work into refining the UX portion of this project, crafting a custom GPT that begins laying the front-end groundwork when you simply type “I have an idea for…”.
It’s 20+ years of knowledge and real world experience packed into a custom GPT.
If you haven’t yet, I’d highly suggest taking a look at Greg’s Gen Ai for UX/UI course. It’s four live sessions packed with everything you need to start down this path. Designed by professionals, for professionals.
04 / COMMUNITY
AIxC Consultation Calls
Do any of these scenarios sound like you?
Have specific issues you want to tackle with Ai?
Have questions about a specific Ai tool?
Trying to figure out an Ai enhanced workflow?
Have a project that’s run into a dead-end?
Need to get a game plan together for how to start down the Ai path?
Need to better understand the lay of the land of Ai?
If you can relate, book a consultation session with me to work through these or any other questions you may have. We can get into workflows, tool tips, how-tos, tackle a project… anything you need.
If you simply want to chat about Ai at the 10,000 ft view, or have questions about our services and how we can help, then book a free chat here on my Calendly.
As always, thanks for being a part of this community!
Hector
Hi, I'm Hector, a seasoned industrial designer, brand builder, and entrepreneur with 22+ years of diverse experience. I've worked in hard goods, soft goods, bicycles & micro-mobility, structural packaging, and proactive innovation, and I've even run my own consumer goods business.
My superpower aside from AI? Leveraging my dual-edged expertise in design and business to supercharge organizations. If you need Ai advisory support for your organization, don't hesitate to drop me a message or catch me on LinkedIn and Instagram.
Lovely article! Found you from some AI newsletter and keep coming back weekly to this post. It inspired me to start an Etsy shop selling Digital Prints. As I dug deeper, I was confused about the difference between stand alone image prompts, the blend function, and style reference. Here's what GPT-4 via Copilot has to say:
In MidJourney, the concepts of "blend" and "style reference" play distinct roles in shaping your AI-generated art. Let's explore the differences:
1. Blend:
- Function: The blend feature allows you to combine elements from different styles or eras to create a harmonious fusion.
- Objective: When using blend, you aim to merge various artistic influences, resulting in a unique blend of visual aesthetics.
- Example: Imagine taking the texture of an impressionist painting and mixing it with the colors of a vibrant abstract piece. The outcome would be a delightful hybrid of both styles.
- Analogy: Think of it as crafting a custom cocktail by blending ingredients from various flavor profiles.
2. Style Reference:
- Function: The style reference parameter acts like a magic wand, allowing you to replicate the visual aesthetics of existing images in your new creations.
- Objective: With style reference, you're not aiming to copy the base image directly. Instead, you want to capture its style, tone, or mood.
- Result: The images produced using style reference maintain a stylistic connection to the reference but are not exact copies.
- Use Case: Suppose you admire the neon glow and dynamic energy of a sci-fi artwork. By using a style reference, you can infuse these elements into your own composition.
- Flexibility: Style reference is more flexible than direct image prompts, as it focuses on styles rather than specific subjects.
In summary, blend combines influences, while style reference captures aesthetics. With the latter, you can twist and reimagine elements from various styles to create something truly unique and personal.
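In practical terms, as I understand it, the two are invoked quite differently (the image URL below is just a placeholder): /blend is its own command that merges a handful of uploaded images with no text prompt at all, while style reference rides along inside a normal prompt as a parameter.

    /blend  (then upload the images you want to merge; no text prompt)
    /imagine: city street at night --sref https://example.com/neon-scifi.png --sw 200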