A new trend in Hollywood? It could be, given the popularity of Son of God, a Bible-themed movie about the life of Jesus. Mega-churches are helping to spread the word, and demand seems strong. What other Bible-themed movies could be coming? And what is Hollywood trying to do: make everyone religious, or simply cash in?