
Research on Auxiliary Court Judgment Based on Data Mining of Judgment Documents

Posted on: 2020-07-05
Degree: Master
Type: Thesis
Country: China
Candidate: C Cao
Full Text: PDF
GTID: 2416330575975831
Subject: Applied Statistics
Abstract/Summary:
Comprehensive law-based governance is a vast, systematic project that touches a wide range of areas. The report of the 19th CPC National Congress proposed establishing a central leading group for comprehensive law-based governance in order to strengthen unified leadership over the rule of law in China. Viewed strategically, legal big data is an effective instrument and plays an important role in building a rule-of-law government, country, and society. Since China formulated its national big data strategy in 2015, big data has become increasingly important to social development, its position and role as a basic strategic resource have grown ever more prominent, and it has had a significant impact on many aspects of national development. With judgment documents now being published online, legal data are growing day by day and no longer come solely from the government and the judiciary; a large amount of legal data can be obtained on the internet, which lays a solid data foundation for the development of legal big data. At the same time, the continuous improvement of computing power and the continuous development of data mining, machine learning, deep learning, and related technologies mean that the era of legal big data is bound to arrive. The ever-growing body of online judgment texts is a great treasure. In order to fully tap its value, this paper carries out deep mining of judgment documents, providing references for future court sentencing and a certain theoretical basis for the country's construction of the rule of law.

This paper first designs a crawler for China Judgments Online, targeting criminal cases of intentional injury (crimes against citizens' personal and democratic rights), and proposes solutions to the site's anti-crawler measures. Regular expressions are then used to extract structured information from the judgment documents according to fixed patterns. Second, the defendant's basic information, the facts of the crime, and other fields are mapped into variables, enabling a descriptive analysis of the defendants. A random forest model is used to predict the sentence recorded in the judgment documents: the main punishment and the suspended sentence (probation) are treated as classification problems, the length of the sentence is predicted by regression, and the predicted results are interpreted in detail. Finally, regular expressions are used to split each judgment into its component parts, and RNN and LSTM neural networks are trained on each part to generate the corresponding text.
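As a rough illustration of the extraction and prediction steps described above, the following Python sketch pulls a few fields from a judgment text with regular expressions and fits a scikit-learn random forest classifier on the resulting feature table. The regex patterns, field names, and data layout are illustrative assumptions, not the exact variables used in this thesis.

    # Minimal sketch: regex feature extraction + random forest classification.
    # Patterns and fields are hypothetical stand-ins for the thesis's variables.
    import re
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    def extract_features(text):
        """Extract a few hypothetical structured fields from one judgment document."""
        return {
            # number of mentions of death in the findings of fact (assumed pattern)
            "deaths": len(re.findall(r"死亡", text)),
            # number of minor injuries (assumed pattern)
            "minor_injuries": len(re.findall(r"轻伤", text)),
            # whether the defendant is described as a recidivist
            "recidivist": int(bool(re.search(r"累犯", text))),
        }

    def train_sentence_classifier(documents):
        """documents: list of (judgment_text, main_punishment_label) pairs,
        assumed to come from the crawler described in the thesis."""
        X = pd.DataFrame([extract_features(t) for t, _ in documents])
        y = [label for _, label in documents]
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, random_state=42
        )
        model = RandomForestClassifier(n_estimators=200, random_state=42)
        model.fit(X_train, y_train)
        print("held-out accuracy:", model.score(X_test, y_test))
        # feature_importances_ shows which extracted variables drive the prediction,
        # analogous to the importance analysis reported in the conclusions below
        print(dict(zip(X.columns, model.feature_importances_)))
        return model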
Deep exploration of the judgment documents yields the following conclusions. (1) When the random forest model is used to predict the main punishment, its average prediction accuracy is 85.86%, and the number of deaths has the greatest influence on the sentence; a second death in particular strongly affects the final judgment against the defendant. (2) When the random forest model is used to predict probation, its prediction accuracy is 69.42%, and the total number of months sentenced, the defendant's educational background, the number of minor injuries, and whether the defendant is a recidivist all have a significant impact on whether probation is granted. (3) When the random forest model is used for regression prediction of the length of the sentence, its goodness of fit reaches at most 42.3%, which is acceptable for this kind of practical problem, and whether a death occurred in the case again has the greatest influence on the final sentence. (4) Using Python and the TensorFlow deep learning framework, the crawled 2018 intentional-injury judgments from Hebei Province were first split into blocks, a model was trained on each block, and text for the defendant's basic-information section was generated. In summary, this paper designs an analysis and mining scheme for judgment documents that can play an auxiliary role in the court's final judgment.
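As a rough illustration of the text-generation step in conclusion (4), the following sketch builds a character-level LSTM language model with TensorFlow/Keras over one extracted section of the judgments. The vocabulary handling and hyperparameters are assumptions for demonstration, not the thesis's actual configuration.

    # Minimal sketch: character-level LSTM language model for one judgment section.
    # Hyperparameters and data preparation are illustrative assumptions.
    import numpy as np
    import tensorflow as tf

    def build_char_lstm(vocab_size, embed_dim=64, units=128):
        """Next-character model: embedding -> LSTM -> softmax over the vocabulary."""
        model = tf.keras.Sequential([
            tf.keras.layers.Embedding(vocab_size, embed_dim),
            tf.keras.layers.LSTM(units),
            tf.keras.layers.Dense(vocab_size, activation="softmax"),
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
        return model

    def make_dataset(text, char2idx, seq_len=40):
        """Slide a fixed-length window over the corpus to build (input, next-char) pairs."""
        ids = np.array([char2idx[c] for c in text])
        X = np.stack([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
        y = ids[seq_len:]
        return X, y

    # Usage sketch: `corpus` would be the concatenated "defendant's basic information"
    # sections extracted by the regex step from the 2018 Hebei judgments.
    # chars = sorted(set(corpus)); char2idx = {c: i for i, c in enumerate(chars)}
    # X, y = make_dataset(corpus, char2idx)
    # model = build_char_lstm(len(chars))
    # model.fit(X, y, batch_size=128, epochs=10)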
Keywords/Search Tags: judgment document, web crawler, random forest, recurrent neural network