jieba word segmentation

Published: 2022-06-30

This post walks through a short script that uses jieba to segment the full text of 《西游记》 (Journey to the West), merge the different names used for each major character, and print the 20 most frequent words.

import jieba

txt = open("《西游记》.txt", "r", encoding="utf-8").read()
words = jieba.lcut(txt)  # segment the text in precise mode
counts = {}  # store each word and its count as key-value pairs

for word in words:
    if len(word) == 1:  # skip single-character tokens
        continue
    elif word in ("大圣", "老孙", "行者", "孙大圣", "孙行者", "猴王", "悟空", "齐天大圣", "猴子"):
        rword = "孙悟空"
    elif word in ("师父", "三藏", "圣僧"):
        rword = "唐僧"
    elif word in ("呆子", "八戒", "老猪"):
        rword = "猪八戒"
    elif word == "沙和尚":
        rword = "沙僧"
    elif word in ("妖精", "妖魔", "妖道"):
        rword = "妖怪"
    elif word == "佛祖":
        rword = "如来"
    elif word == "三太子":
        rword = "白马"
    else:
        rword = word
    counts[rword] = counts.get(rword, 0) + 1

items = list(counts.items())  # turn the key-value pairs into a list
items.sort(key=lambda x: x[1], reverse=True)  # sort by word frequency, descending

for i in range(20):
    word, count = items[i]
    print("{0:<10}{1:>5}".format(word, count))
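
The script calls jieba.lcut in its default precise mode. For context, here is a minimal sketch of the three segmentation modes jieba exposes, plus jieba.add_word for registering a custom token; the sample sentence is made up purely for illustration:

import jieba

sentence = "齐天大圣孙悟空保护唐僧西天取经"  # illustrative sentence, not taken from the novel

print(jieba.lcut(sentence))                # precise mode (the default): non-overlapping tokens
print(jieba.lcut(sentence, cut_all=True))  # full mode: every dictionary word found, tokens may overlap
print(jieba.lcut_for_search(sentence))     # search-engine mode: precise mode plus extra splits of long words

# If a name such as "齐天大圣" were split apart by the default dictionary,
# it could be registered before calling jieba.lcut(txt) so it stays in one piece.
jieba.add_word("齐天大圣")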

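The manual counting dictionary and the sort could also be expressed with collections.Counter from the standard library; a sketch of that variant (the name-merging step is omitted for brevity):

from collections import Counter
import jieba

# same input and segmentation as the script above
words = jieba.lcut(open("《西游记》.txt", "r", encoding="utf-8").read())

# count every token longer than one character, then print the 20 most frequent
counts = Counter(w for w in words if len(w) > 1)
for word, count in counts.most_common(20):
    print("{0:<10}{1:>5}".format(word, count))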
