So I just found out that when authors sell the film rights to their books, they basically have no say whatsoever over what happens with the adaptation. Not the casting, the locations, or anything. I feel that's unfair and horrible. I mean, if it wasn't for YOUR book, the directors and everyone else involved wouldn't be earning the money they're making. Shouldn't you have a say in what you want? Isn't it YOUR vision? Well, it is for me. I'm thinking about not only writing but also directing, acting, and becoming a screenwriter, since that's the only way you get to fully create your vision. Does anyone else agree or disagree?