Merged
fix: fix content based on the suggestion
linooohon committed Jan 27, 2022
commit a5dca7e5d05cdfc35811d601523b9a9fb3f34f2e
40 changes: 22 additions & 18 deletions library/urllib.robotparser.po
@@ -10,7 +10,7 @@ msgstr ""
"Project-Id-Version: Python 3.10\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2021-10-26 16:47+0000\n"
"PO-Revision-Date: 2022-01-23 14:36+0800\n"
"PO-Revision-Date: 2022-01-27 13:40+0800\n"
"Last-Translator: Phil Lin <linooohon@gmail.com>\n"
"Language-Team: Chinese - TAIWAN (https://github.com/python/python-docs-zh-"
"tw)\n"
@@ -23,7 +23,7 @@ msgstr ""

#: ../../library/urllib.robotparser.rst:2
msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
msgstr ":mod:`urllib.robotparser` --- robots.txt 的解析器"
msgstr ":mod:`urllib.robotparser` --- robots.txt 的剖析器"

#: ../../library/urllib.robotparser.rst:10
msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
@@ -37,17 +37,17 @@ msgid ""
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""
"此模組提供了一個單獨的類別 :class:`RobotFileParser`, 它可以知道某個特定 user "
"agent (使用者代理)是否能在有發布 :file:`robots.txt` 文件的網站 fetch(擷"
"取)此網站特定的 URL。有關 :file:`robots.txt` 文件結構的更多細節,請參閱 "
"http://www.robotstxt.org/orig.html 。"
"此模組 (module) 提供了一個單獨的類別 (class) \\ :class:`RobotFileParser`\\ ,"
"它可以知道某個特定 user agent(使用者代理)是否能在有發布 :file:`robots.txt` "
"文件的網站 fetch(擷取)特定 URL。有關 :file:`robots.txt` 文件結構的更多細"
"節,請參閱 http://www.robotstxt.org/orig.html。"

#: ../../library/urllib.robotparser.rst:28
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""
"此類別提供了一些方法可以讀取、解析和回答關於 *url* 上的 :file:`robots.txt` 文"
"此類別提供了一些方法可以讀取、剖析和回答關於 *url* 上的 :file:`robots.txt` 文"
"件的問題。"

#: ../../library/urllib.robotparser.rst:33
@@ -56,19 +56,19 @@ msgstr "設置指向 :file:`robots.txt` 文件的 URL。"

#: ../../library/urllib.robotparser.rst:37
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
msgstr "讀取 :file:`robots.txt` URL 並將其輸入到解析器。"
msgstr "讀取 :file:`robots.txt` URL 並將其輸入到剖析器。"

#: ../../library/urllib.robotparser.rst:41
msgid "Parses the lines argument."
msgstr "解析行參數(此參數為 ``robots.txt`` 文件裡的行)。"
msgstr "剖析 lines 引數。"

#: ../../library/urllib.robotparser.rst:45
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""
"如果根據被解析的 :file:`robots.txt` 文件中的規則,*useragent* 被允許 fetch "
"*url* 的話,則返回 ``True``。"
"根據從 :file:`robots.txt` 文件中剖析出的規則,如果 *useragent* 被允許 fetch "
"*url* 的話,則回傳 ``True``。"

#: ../../library/urllib.robotparser.rst:51
msgid ""
@@ -82,7 +82,7 @@ msgstr ""
#: ../../library/urllib.robotparser.rst:57
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
msgstr "將最後一次獲取 ``robots.txt`` 文件的時間設置為當前時間。"
msgstr "將最近一次 fetch ``robots.txt`` 文件的時間設置為當前時間。"

#: ../../library/urllib.robotparser.rst:62
msgid ""
@@ -92,8 +92,8 @@ msgid ""
"parameter has invalid syntax, return ``None``."
msgstr ""
"針對指定的 *useragent* 從 ``robots.txt`` 返回 ``Crawl-delay`` 參數的值。如果"
"此參數不存在或不適用於指定的 *useragent* ,或是此參數在 ``robots.txt`` 存在語"
"法錯誤,則返回 ``None``。"
"此參數不存在、不適用於指定的 *useragent* ,或是此參數在 ``robots.txt`` 中所指"
"的條目含有無效語法,則回傳 ``None``。"

#: ../../library/urllib.robotparser.rst:71
msgid ""
@@ -103,8 +103,9 @@ msgid ""
"``robots.txt`` entry for this parameter has invalid syntax, return ``None``."
msgstr ""
"以 :term:`named tuple` ``RequestRate(requests, seconds)`` 的形式從 ``robots."
"txt`` 返回 ``Request-rate`` 參數的內容。如果此參數不存在或不適用於指定的 "
"*useragent* ,或是此參數在 ``robots.txt`` 存在語法錯誤,則返回 ``None``。"
"txt`` 返回 ``Request-rate`` 參數的內容。如果此參數不存在、不適用於指定的 "
"*useragent* ,或是此參數在 ``robots.txt`` 中所指的條目含有無效語法,則回傳 "
"``None``。"

#: ../../library/urllib.robotparser.rst:81
msgid ""
@@ -113,10 +114,13 @@ msgid ""
"entry for this parameter has invalid syntax, return ``None``."
msgstr ""
"以 :func:`list` 的形式從 ``robots.txt`` 返回 ``Sitemap`` 參數的內容。如果此參"
"數不存在或此參數在 ``robots.txt`` 存在語法錯誤,則返回 ``None``。"
"數不存在或此參數在 ``robots.txt`` 中所指的條目含有無效語法,則回傳 ``None``。"

#: ../../library/urllib.robotparser.rst:89
msgid ""
"The following example demonstrates basic use of the :class:`RobotFileParser` "
"class::"
msgstr "下面的範例展示了 :class:`RobotFileParser` 類別的基本用法::"
msgstr ""
"下面的範例展示了 :class:`RobotFileParser` 類別的基本用法:\n"
"\n"
"::"