I receive a lot of review books, but I have never once lied about a book just because I got a free copy. However, some authors seem to feel that if they send you their book for free, you owe them a positive review.
Do you think reviewers are obligated to post a good review of a book even if they don't like it? Have we reached the point where reviewers *need* to post disclaimers just to (hopefully) protect themselves from being harassed by authors unhappy about negative reviews?
I review books, and if you read my reviews, you will know that I am honest. I give my opinion of the work, simple as that. I will add, though, that I recognize an author has put time and effort into the title; no one should be treated badly, even if the job is not done well. So I do look for the good things: I state the weak points but highlight the strong ones so that readers can make their own informed choices.
I have had interactions with authors, but they have always been on a professional level. I do not feel the need to define the job of reviewing with disclaimers. My integrity cannot be bought for the price of a book!
What's your take on this question?
Check out the giveaways in my sidebar!