Using Runway to animate Midjourney generations

For a while now, we’ve been wanting to revamp our DTLA space, and we were looking for some inspiration for what our small-but-mighty production office could become.

During lunch one day, we described to Midjourney what the space already looks like, along with some qualifiers plucked from the descriptors we use most often for our retail design products.

What Midjourney generated definitely isn’t suitable for our space redesign, but it did make for interesting source material for trying out Runway’s Gen-2 image-to-video features.

Screenshot of a Discord dialog box showing Midjourney generating images of imaginary modernist spaces.

If you want to learn more about getting started with Midjourney, check out our blog post here! It’s a great primer for anyone who has never used Midjourney before, with use cases for people in any career field who want to use AI image generation for inspiration and ideation.

Runway.ml dialog box for the image upscaling tool, showing an avant-garde, colorful space being upscaled to 4K resolution.

Runway has an image-upscaling tool that we’ve found superior to Photoshop’s upscaler. We exported each Midjourney generation to a local drive, upscaled it to 4K in Runway, and then re-exported the file to a local drive for backup.
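For context on what “4K” means here: Runway’s upscaler is AI-based, but if you just want to check or reproduce the pixel-dimension side of the step, a plain Lanczos resample in Python with Pillow looks like this. The filenames are placeholders, and this naive resize is not what Runway does under the hood:

```python
from PIL import Image  # pip install Pillow

SRC = "midjourney_render.png"     # placeholder: a Midjourney export
OUT = "midjourney_render_4k.png"

img = Image.open(SRC)

# Scale so the long edge lands at 3840 px ("4K"), preserving aspect ratio.
scale = 3840 / max(img.size)
new_size = (round(img.width * scale), round(img.height * scale))

img.resize(new_size, Image.LANCZOS).save(OUT)
print(f"{img.size} -> {new_size}")
```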


From Runway’s Gen-2 asset builder, you can search your library for the image you just upscaled and use it to create a video animation. You can also set some parameters around camera motion and add text instructions. (This experiment relies only on camera-motion adjustments, without text prompts.)


Runway Gen-2 takes about a minute to generate a four-second clip, and when the video has finished processing, you have the option to extend the clip by another four seconds or export it to your local drive.
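If you’d rather script this step than click through the web UI, Runway also offers a developer API for its newer video models (Gen-2 itself isn’t exposed there, so this approximates the same image-to-video flow rather than the exact tool used above). A minimal sketch with the official `runwayml` Python SDK, where the model name, image URL, and prompt text are placeholder assumptions:

```python
import time

from runwayml import RunwayML  # pip install runwayml

# The client reads the RUNWAYML_API_SECRET environment variable.
client = RunwayML()

# Kick off an image-to-video generation from an upscaled still.
task = client.image_to_video.create(
    model="gen3a_turbo",  # newer model; the Gen-2 web tool isn't in the API
    prompt_image="https://example.com/upscaled_render.png",  # hypothetical hosted image
    prompt_text="slow push-in across a sunlit modernist production office",
    ratio="1280:768",
)

# Poll until the clip finishes rendering, then grab the output URL(s).
while True:
    task = client.tasks.retrieve(task.id)
    if task.status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(10)

if task.status == "SUCCEEDED":
    print(task.output)  # list of URLs for the generated MP4
```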



GIF made from an MP4 generated in Runway, using a Midjourney image generation as the base prompt. ➿
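The GIF above was converted from Runway’s MP4 export. One common way to do that conversion is with ffmpeg; here’s a minimal sketch that shells out from Python, assuming ffmpeg is installed on your PATH (the filenames are placeholders):

```python
import subprocess

SRC = "runway_clip.mp4"  # placeholder: the MP4 exported from Runway
OUT = "runway_clip.gif"

# The two-pass palette filter keeps the GIF's colors from banding badly.
filters = (
    "fps=12,scale=480:-1:flags=lanczos,"
    "split[a][b];[a]palettegen[p];[b][p]paletteuse"
)

# -loop 0 makes the GIF repeat forever.
subprocess.run(
    ["ffmpeg", "-y", "-i", SRC, "-vf", filters, "-loop", "0", OUT],
    check=True,
)
```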

After some iterating, we had a collection of footage we could bring into Adobe Premiere Pro to cut together. Oddly enough, some of the Runway generations needed a camera-stabilization effect applied to iron out a frenetic ‘camera shake’ that appeared in the clips.

Check out the whole cut here on TikTok :)

@echoechostudio: Exploring new ways to work between #midjourneyai and #runwayml. #ai #generativeart #generativeanimation #generativephotography #vfx ♬ My Love Mine All Mine - Mitski

If you know anyone who you think would enjoy this kind of learning out loud in the AI space, please share this newsletter, check out Echo Echo Studio for more videos, design goods, client work, and experiments, and leave a comment below if you have any suggestions or ideas for us to try!
