python-docs-fr/library/urllib.robotparser.po

# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2001-2016, Python Software Foundation
# This file is distributed under the same license as the Python package.
# FIRST AUTHOR <EMAIL@ADDRESS>, YEAR.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Python 3.5\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2016-10-30 10:42+0100\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
#: ../Doc/library/urllib.robotparser.rst:2
msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:10
msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:20
msgid ""
"This module provides a single class, :class:`RobotFileParser`, which answers "
"questions about whether or not a particular user agent can fetch a URL on "
"the Web site that published the :file:`robots.txt` file. For more details "
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:28
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:33
msgid "Sets the URL referring to a :file:`robots.txt` file."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:37
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:41
msgid "Parses the lines argument."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:45
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:51
msgid ""
"Returns the time the ``robots.txt`` file was last fetched. This is useful "
"for long-running web spiders that need to check for new ``robots.txt`` files "
"periodically."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:57
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
msgstr ""
#: ../Doc/library/urllib.robotparser.rst:61
msgid ""
"The following example demonstrates basic use of the RobotFileParser class."
msgstr ""