After The Walking Dead: The Ones Who Live, What's Next for the Franchise?
The Walking Dead: The Ones Who Live might have ended, but there's still plenty more story to tell in The Walking Dead universe.
The Walking Dead: The Ones Who Live ended with a beautifully tied bow and plenty of reveals. There are no plans for a second season, and arguably no need for one. But that leaves a hole for fans of The Walking Dead who want to see the post-apocalyptic story continue in some form.