Prior to Hollywood, the film industry was centered in New York. Then in the early 1900s the whole thing moved to Hollywood, California, for several reasons: weather that allowed filming year-round, land that was cheap at the time, and so on.
I think it is time for the center of the film industry to move again… far away from California.
Or maybe it would be better if the film industry didn’t have a center at all. Spreading it out everywhere might be the best way to democratize it and make room for all points of view and free speech, not just one oversaturated, super-toxic, woke point of view.