The current law of our country stipulates that an Internet service provider's failure to actively review users' infringing activity on its website does not in itself constitute subjective fault. In judicial practice, judges recognize that network service providers should bear a certain duty of care, on the premise that they undertake no obligation of active monitoring, but there are no specific criteria for the degree of that duty. The author argues that such a substantively low standard of the duty of care cannot guarantee effective governance of online piracy. With the continuous progress of content filtering technology, more and more network platforms have become able to take technical measures to identify and block pirated information uploaded by users. The author therefore believes that, upon the copyright owner's request and under certain special circumstances, online platforms should adopt active copyright content filtering measures.

The author first analyzes the rationale for establishing a copyright content filtering obligation. First, in the era of the Internet platform, online piracy is characterized by large numbers of scattered actors relying on Internet platforms to commit infringement, while platforms, out of consideration for their own interests, are unwilling to manage strictly and may even encourage infringement; platform governance therefore needs to be strengthened. Second, the existing "notice and takedown" model of piracy governance is costly and inefficient: it entails considerable unavoidable labor costs, and its lengthy procedure cannot remove infringing information in a timely and effective manner. Third, content filtering technology is becoming more advanced: machines can efficiently and reliably identify text, images, and audio-visual content, and the related technologies will develop further. Fourth, China's sectoral legislation and the EU's latest legislation have already made useful attempts in this regard.

Second, the author demonstrates the basis for this obligation in three dimensions: law and economics, administrative law theory, and tort law theory. First, the author argues that past research has overemphasized the cost to the platform while ignoring the negative externality costs imposed by pirates. A reasonable cost-benefit analysis should compare the negative externality costs of piracy with the cost of content filtering measures; the author believes the latter is smaller, so content filtering is the more efficient option. Second, the content filtering obligation of an Internet platform is not only a private-law obligation but also has the nature of a public-law obligation: administrative agencies transfer their own responsibilities to third-party private entities through administrative orders or departmental legislation, an arrangement that administrative law theory calls third-party obligations. The author believes that imposing a content filtering obligation on network platforms meets the requirements for establishing a sound third-party obligation. Third, if the duty of safety protection in tort law is applied to cyberspace, a network platform's duty of care should approach that of the administrator of a public place and meet the standard of the diligent care of a good father of a family. Under this standard, a copyright content filtering obligation is due.

Finally, the author believes that establishing a copyright content filtering obligation requires attention to the following aspects. First, it must be reconciled with the provisions of current law that exempt online platforms from compulsory active review. The author contends that current law only relieves network platforms of a general obligation to review all content on the network; upon the copyright owner's request and under certain special circumstances, imposing this obligation on network platforms is not contrary to the law. Second, the intelligent algorithms underlying content filtering technology have black-box characteristics; legislators should pay attention to enhancing algorithmic transparency and to avoiding harm to the rights of copyright owners and users. Third, complaint and relief mechanisms modeled on the "notice and takedown" procedure can be established to effectively protect users' freedom of speech and other legitimate rights and interests.