Technological change is one of the driving forces behind legislative change. As online content filtering technology advances through information technologies such as artificial intelligence, online platforms, as information intermediaries, now command technological resources superior to those of administrative authorities. As "technological empowerment" gradually reshapes cyberspace, online platforms are increasingly able to identify and control the content on their platforms efficiently. The content examination obligations of online platforms should therefore be appropriately increased: platforms should be required to assume differentiated illegal-content-filtering obligations on the basis of clear categories and standards of illegal content, in response to the objective requirements of China's online content governance.

This article first introduces the concept and types of online platforms. It discusses online platforms as digital organizations that provide information services and content to users on the basis of information technology, and divides them into online service providers and online content providers. Taking China's current online content governance rules as its basis, the article then classifies illegal content into content that endangers national interests, content that endangers public interests, and content that infringes the rights and interests of private subjects.

The article then analyses the challenges of establishing an obligation to filter illegal content. First, China's current online content governance rules remain imperfect, mainly in the unclear definition of illegal content and the lack of uniformity among the subjects of the obligation. Second, because content filtering suffers from "over-filtering", "under-filtering" and a lack of transparency, it often raises concerns about freedom of expression and privacy. Third, although content filtering technology has developed to a certain extent, its recognition rate in specific scenarios is still unsatisfactory, and illegal content also spreads more widely through information technologies such as deepfakes, so filtering technology still needs further development.

The reasonableness of establishing an obligation to filter illegal content can be analysed from four aspects. First, since the filtering obligation is in the nature of a public-law review obligation, its legitimacy can be justified by third-party liability under administrative law: as online platforms are closer to the source of danger than users and can control and identify danger more effectively, they should bear a duty of care within a reasonable scope. Second, from the perspective of the inherent needs of online content governance, the current challenges of regulating illegal content place higher demands on online platforms. Third, from the perspective of the public nature of online platforms, filtering illegal content is an inherent requirement of their performance of public duties, and their "quasi-public" character requires them to assume certain social responsibilities. Finally, from the perspective of technological development, filtering illegal content is technically feasible, and filtering technology has already been applied in practice. In addition, four factors must be taken into account when establishing the filtering obligation: what content is to be filtered and by what standards, as these are the basis of content filtering; the type of online platform, which is a prerequisite for clarifying the subject of the obligation and the assumption of liability; whether the platform can afford the filtering technology at an appropriate cost, which reflects the reasonableness of imposing the obligation; and the effective protection of users' fundamental rights in the filtering process.

The last part of this article proposes suggestions for improving the illegal-content-filtering obligation of online platforms. First, the principles of legality, proportionality and transparency should be observed when establishing the filtering obligation. Second, future online content regulation should further refine the scope of illegal online content, provide specific standards for content filtering, and avoid turning the filtering obligation into a general monitoring obligation. Finally, it is necessary to clarify the subject of the obligation, unify the various subjects of examination obligations in current legislation, further improve the liability regime of online platforms, and use mechanisms such as counter-notice to effectively protect the remedies to which users are entitled.