
Discussions on the Legal Significance of the "Robots Exclusion Protocol" and the Robots.txt Document

Posted on: 2017-07-06    Degree: Master    Type: Thesis
Country: China    Candidate: H Chen    Full Text: PDF
GTID: 2416330590490165    Subject: Law
Abstract/Summary:
"Robots exclusion protocol is a subdivision concept of the internet technology.It is an industry standard,according to which website owners can use computer language to guide search engines.Robots.txt document is the specific guidance document made by website owners.The two are different but closely linked.The existing related cases are mostly in the field of Anti Unfair Competition Law and Copyright Law.In the sense of Anti Unfair Competition Law,the two are separately internet industry standard and the specific implementation of the industry standard.When a website has a robots.txt,whether the content of the document is reasonable,wouldn t affect the legitimacy of Robots exclusion protocol".Meanwhile,the Robots exclusion protocol" is legal doesn t mean the specific robtos.txt document is reasonable.In the sense of copyright law,"robots exclusion protocol" gave the website owners some rights to limit the spreading range and object of the website content,but if the specific robots.txt is reasonable,it wouldn t be harmful to the copyright owner.When a website has no robtos.txt document,it would be a implied license to the visit of the research engine,but not to the use of the website content.Both legal principle and cases are used in this paper to help to analysis,and to give the premise and limit of the protect from Anti Unfair Competition Law and Copyright Law,attempting to provide reference to the related judicial application and legislation in future.
Keywords/Search Tags: Robots Exclusion Protocol, unfair competition, implied license, internet technology