Stemming and Removal of Stop Words - Becoc316
from nltk.tokenize import word_tokenize
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

# Requires the NLTK data packages: nltk.download('punkt'); nltk.download('stopwords')

# Read the input document and split it into word tokens.
file2 = open("C:\\Users\\Sagar Patil\\Desktop\\20_newsgroups\\alt.atheism\\abc.txt")
line2 = file2.read()
file2.close()
words = word_tokenize(line2)

# Remove stop words: append each non-stop word to filteredtext.txt.
stop_words = set(stopwords.words('english'))
appendFile = open('filteredtext.txt', 'a')
for r in words:
    if r not in stop_words:
        appendFile.write(" " + r)
appendFile.close()

# Reduce each token to its stem with the Porter stemmer.
ps = PorterStemmer()
for w in words:
    print(ps.stem(w))
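The NLTK code above depends on the punkt and stopwords data packages. As a dependency-free sketch of the same pipeline (tokenize, drop stop words, stem), the following uses an illustrative toy stop-word list and a crude suffix-stripping stemmer; these are assumptions for demonstration only, not NLTK's English stop-word list or the real Porter algorithm:

```python
import re

# Toy stop-word list and suffix list (illustrative assumptions, not NLTK's data).
STOP_WORDS = {"a", "an", "the", "is", "are", "and", "or", "to", "of", "in"}
SUFFIXES = ("ing", "edly", "ed", "ly", "es", "s")

def tokenize(text):
    """Lowercase and keep alphabetic runs (crude stand-in for word_tokenize)."""
    return re.findall(r"[a-z]+", text.lower())

def stem(word):
    """Strip the first matching suffix, keeping a stem of at least 3 letters
    (a very rough stand-in for PorterStemmer.stem)."""
    for suf in SUFFIXES:
        if word.endswith(suf) and len(word) - len(suf) >= 3:
            return word[: -len(suf)]
    return word

def filter_and_stem(text):
    """Tokenize, drop stop words, then stem each remaining token."""
    return [stem(w) for w in tokenize(text) if w not in STOP_WORDS]

print(filter_and_stem("The cats are running in the gardens"))
# → ['cat', 'runn', 'garden']
```

Note that even the real Porter stemmer produces stems that are not always dictionary words ("runn" here); stemming trades linguistic accuracy for speed and simplicity.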
INPUT FILE:
OUTPUT: