Runway Gen-3 Alpha mini review

I spent $15 USD to test Runway's new Gen-3 Alpha engine for Gen AI video generation so that you don't have to.

Verdict: SAVE YOUR MONEY.

My prompt for this video was: “A wide angle panning shot of a tiger playing the piano.” Runway generated a person playing the piano, while a tiger with only two legs posed next to him.

I tried multiple types of prompts on different subjects. Short prompts, long prompts. Different camera angles. Different lighting scenarios. I even asked ChatGPT to study the Runway prompting guide to generate a complex prompt. Each time, I was left disappointed with Runway’s inability to generate anything remarkable or faithful to my prompt.

Like Luma AI, Runway seems to have a library of generic visuals that it slaps together to try to fit your prompt. It could not handle prompts asking it to transform a seed into a flower, make a robot dance, or get an animal to play the piano.

Runway is more adept at generating videos where things hardly move (e.g. roses swaying in the wind). If that is all it can manage, what is the point of Gen AI video apart from making animated screensavers?

Participants in my Gen AI workshops often ask me which text-to-video apps they should use for marketing. My response remains the same: “None, they’re all terrible for commercial use. Use stock videos instead.”

Will OpenAI’s Sora change the game? I hope so.

I’m cancelling my one-month Runway subscription after I post this.

Part 2: We should not have to pay to try

Which Gen AI apps should you pay for? Answer: The ones that help you get things done better.

I pay for four Gen AI apps today – ChatGPT, Midjourney, Adobe Creative Cloud, ElevenLabs – and I occasionally activate Claude when needed. These apps help me design my teaching content for university and also support my comms consultancy work.

Yesterday, I got annoyed with having to pay $15 USD for a month of Runway’s text-to-video service. I just wanted to try its new Gen-3 Alpha engine, but there’s no free trial.

After paying up, I found that Runway's video output has improved, but none of the videos met the bar for commercial use, or even for sending to friends for fun.

The prompt for this clip was “A cinematic low angle shot of a white horse running across the lake, luscious mane, reflective water, purple and orange clouds, diffused lighting, high contrast”.

At first, I was glad this video was turning out better than the others I had generated earlier. Then, as the clip ended, the horse sank helplessly into the water and, alas, we will never know its fate.

Generating such a video is not cheap due to the computing power involved. It costs subscribers $1 USD to generate 10 seconds of Gen-3 Alpha video. The cost of computing is understandable, but the nature of Gen AI means you have to keep iterating and regenerating to fine-tune the outcome. Iterating means your $15 subscription will disappear quickly, and you will probably still not get an acceptable video.
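To see how fast the budget goes, here is a rough back-of-the-envelope sketch. The $1-per-10-seconds rate is the figure quoted above; the clip length and the number of retries per usable clip are my own illustrative assumptions, not Runway's numbers.

```python
# Rough budget math, assuming the $1-per-10-seconds rate quoted above.
# Clip length and retry count are illustrative assumptions.

subscription_usd = 15.0          # the monthly plan I paid for
cost_per_second_usd = 1.0 / 10   # $1 buys 10 seconds of Gen-3 Alpha video
clip_length_s = 10               # one generation
retries_per_usable_clip = 5      # assumed iterations before a clip is acceptable

total_seconds = subscription_usd / cost_per_second_usd          # 150 seconds
clips = total_seconds / clip_length_s                           # 15 clips
usable_clips = clips / retries_per_usable_clip                  # ~3 usable clips

print(f"{total_seconds:.0f} seconds, {clips:.0f} clips, "
      f"~{usable_clips:.0f} usable clips at {retries_per_usable_clip} attempts each")
```

Under those assumptions, $15 buys roughly 15 ten-second attempts, which works out to only a handful of clips you might actually keep.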

I get it. Every AI startup wants eyeballs, signups and subscriptions. But they ought to learn from OpenAI, which opened up GPT-4o to ChatGPT's free users. That move allowed everyone to taste the power of the latest LLM engine, and also to try out the custom GPTs that other users create.

Good marketers let people try before they buy, not pay just to try.