🚧 In progress [Last edited: 2/24/2026]
<aside> <img src="/icons/magic-wand_gray.svg" alt="/icons/magic-wand_gray.svg" width="40px" />
A seed is the field of randomly generated noise from which Midjourney denoises an image during the diffusion process. You can retrieve the seed number from an image by reacting with the ✉️ emoji in Discord, or by using the menu on the Web. You can also specify the seed yourself in the prompt with the --seed N parameter (where N is an integer between 0 and 4294967295).
</aside>
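For reference, --seed accepts any unsigned 32-bit integer, i.e. 0 through 4294967295 (2^32 − 1). A quick sketch of that range check, using a hypothetical helper name (this is not Midjourney's code, just an illustration of the documented range):

```python
# Hypothetical validator for the documented --seed range.
# 4294967295 is 2**32 - 1, the largest unsigned 32-bit integer.
SEED_MAX = 2**32 - 1  # 4294967295

def is_valid_seed(n: int) -> bool:
    """Return True if n falls within the documented --seed range."""
    return isinstance(n, int) and 0 <= n <= SEED_MAX

print(is_valid_seed(12345))        # True
print(is_valid_seed(4294967296))   # False: one past the maximum
```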
<aside> <img src="/icons/help-alternate_red.svg" alt="/icons/help-alternate_red.svg" width="40px" />
Midjourney official doc about seeds: https://docs.midjourney.com/docs/seeds
</aside>
A seed is a field of randomly generated noise from which Midjourney denoises an image in the diffusion process. Imagine a snapshot of television static. Midjourney never starts with a blank canvas. When you send your initial text prompt ( /imagine ), a random seed is assigned to the job unless the prompt includes a seed parameter or an image. The same seed is used for an entire batch of initial images (also called a job, imagine, or grid).
The denoising process, handled by a neural network and a diffusion model, makes billions of decisions about the correct placement or removal of those random pixels, guided by lessons learned from the billions of images the model examined during training. Gradually, through these decisions, Midjourney transforms the random pixels into an image that strives to express the correct pixel placement for your prompt.
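The "TV static" intuition can be sketched with a seeded random generator: the same seed always reproduces the identical noise field, while a different seed produces a different one. This is a toy stdlib sketch, not Midjourney's actual noise source:

```python
import random

def noise_field(seed: int, width: int = 4, height: int = 4) -> list[list[float]]:
    """Generate a reproducible grid of pseudo-random 'pixel' values from a seed."""
    rng = random.Random(seed)  # independent generator, fully determined by the seed
    return [[rng.random() for _ in range(width)] for _ in range(height)]

a = noise_field(12345)
b = noise_field(12345)
c = noise_field(67890)
print(a == b)  # True: same seed, identical starting noise
print(a == c)  # False: different seed, different noise
```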
<aside> <img src="/icons/forward_lightgray.svg" alt="/icons/forward_lightgray.svg" width="40px" />
For more on diffusion and denoising, check out:
What is a "diffusion model"?
</aside>
Seeds are not used in prompts with image references, because that image serves as a "parent image" from which the denoising process begins. The same applies to Variations and Remixes: Midjourney is refining your parent image further. This is why variations and remixes can sometimes fix problems with faces or fingers; the GPU gets to spend more time refining because it isn't starting from scratch.
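That "head start" idea can be illustrated with a toy refinement loop (purely an analogy, not how diffusion actually works): each step nudges the current value toward a target, and starting from a "parent" value near the target converges in fewer steps than starting from raw noise, leaving budget for extra polish:

```python
def steps_to_converge(start: float, target: float, tol: float = 0.01, alpha: float = 0.3) -> int:
    """Count refinement steps until the current value is within tol of the target."""
    current, steps = start, 0
    while abs(current - target) >= tol:
        current += alpha * (target - current)  # nudge part of the way toward the target
        steps += 1
    return steps

from_noise  = steps_to_converge(start=0.0, target=1.0)   # starting from scratch
from_parent = steps_to_converge(start=0.9, target=1.0)   # starting near a "parent image"
print(from_parent < from_noise)  # True: the parent start converges sooner
```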
Here's what a seed might look like (see, TV static!):



<aside> <img src="/icons/help-alternate_lightgray.svg" alt="/icons/help-alternate_lightgray.svg" width="40px" />
Can seeds transfer or "bookmark" a style, character, etc.?
No, they are simply starting noise and carry no other information. Beware of anyone claiming that you can use XYZ seed to recreate some effect!
</aside>
<aside> <img src="/icons/help-alternate_lightgray.svg" alt="/icons/help-alternate_lightgray.svg" width="40px" />
Are seeds permanent?
Not really. In theory, if you use the same seed with the same prompt and settings, you'll get a comparable version of your image, which is helpful if you want to test in the short term how small changes in your prompt affect the outcome. But here's the catch: Midjourney assigns your job to a random GPU each time you generate an image, so seeds aren't useful for maintaining consistency long-term. We call this "seed drift". Seeds are also different between model versions.
</aside>
<aside> <img src="/icons/help-alternate_lightgray.svg" alt="/icons/help-alternate_lightgray.svg" width="40px" />
How much does the seed influence the image?
Seeds are the weakest force in Midjourney compared to the prompt and parameters. They may not work as expected even for testing purposes. If your prompt or parameters change significantly, the influence of the seed may not be discernible at all.
</aside>