Marketing Pilgrim's "Legal" Channel


Judges Want Fair Use of Google Porn

If you’re one of the many Google Images users who have not turned on safe search, you’ll likely be glad to know that the U.S. Court of Appeals has overturned a previous ruling that would have forced Google to remove thumbnail images of porn taken from the Perfect 10 web site.

The appeals court ruled that the thumbnails fell within a “fair use” exception in copyright law because they play a role in the search process and thus have a function different from that of the original photos.

“We conclude that the significantly transformative nature of Google’s search engine, particularly in light of its public benefit, outweighs Google’s superseding and commercial uses of the thumbnails in this case,” Judge Sandra S. Ikuta wrote for the panel.

Google’s not quite out of the woods yet. The panel of judges wants the lower court to take another look at whether Google’s display of thumbnail images encourages other web sites to distribute copyrighted images.

The appeals court opinion said, “There is no dispute that Google substantially assists Web sites to distribute their infringing copies to a worldwide market and assists a worldwide audience of users to access infringing material.” The appeals court instructed the district judge to evaluate whether Google knew that unauthorized copies of Perfect 10’s photos were being made available and failed to take steps to prevent it.

  • http://www.rubyonrailsexamples.com/ Ruby on Rails Examples

    This is good news. Google rocks.

  • http://www.terryhoward.net/ Terry Howard

    I guess this really shows we need to answer some basic legal questions about web hosting and what it implies for indexing and caching. In my opinion, if you place content in a public location, linked from pages you are openly encouraging to be indexed, that implies to the spider that the linked content follows the same guidelines, unless you say otherwise through one of the many ways a developer can designate content as off limits. Sort of like files inheriting the permissions of the folder they are contained in, so do crawling rights. I think the smoking gun is that engines gather content by following links and have put in place easy and apparent means of excluding content. The common analogy that leaving your door unlocked doesn’t excuse unlawful entry is not comparable. The issue needs to be considered on its own merits, not by nonparallel abstractions or by judges who can barely charge their cell phones, much less understand networking, web serving and search technology.

  • http://www.mynetnuke.com MyNetNuke

    My opinion is similar to Terry’s. If you want to prevent indexing of content, you can secure your pages/images, or use robots.txt to filter spiders.

    But when it comes to stolen content, things change.
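The exclusion mechanism the commenters mention works roughly like this: a plain-text robots.txt file at the site root tells compliant crawlers which paths to skip. As a minimal sketch (the `/images/` path is a hypothetical example, though `Googlebot-Image` is the real name of Google’s image crawler), a site that wanted to keep its photos out of Google Images could publish:

```
# Hypothetical robots.txt served at the site root (e.g. example.com/robots.txt)

# Block Google's image crawler from the entire site
User-agent: Googlebot-Image
Disallow: /

# Ask all other crawlers to skip the images directory
User-agent: *
Disallow: /images/
```

Note that robots.txt is advisory, not enforcement: well-behaved crawlers honor it, but it does nothing against sites that copy images outright, which is the distinction the comment draws.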