# Copyright (C) 2001-2018, Python Software Foundation
# For licence information, see README file.
#
msgid ""
msgstr ""
"Project-Id-Version: Python 3.6\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2017-12-01 07:43+0100\n"
"PO-Revision-Date: 2017-12-01 19:44+0100\n"
"Last-Translator: \n"
"Language-Team: FRENCH <traductions@lists.afpy.org>\n"
"Language: fr\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Generator: Poedit 1.6.10\n"

#: ../Doc/library/urllib.robotparser.rst:2
msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
msgstr ":mod:`urllib.robotparser` — Analyseur de fichiers *robots.txt*"

#: ../Doc/library/urllib.robotparser.rst:10
msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
msgstr "**Code source :** :source:`Lib/urllib/robotparser.py`"

#: ../Doc/library/urllib.robotparser.rst:20
msgid ""
"This module provides a single class, :class:`RobotFileParser`, which answers "
"questions about whether or not a particular user agent can fetch a URL on "
"the Web site that published the :file:`robots.txt` file. For more details "
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""
"Ce module fournit une seule classe, :class:`RobotFileParser`, qui permet de "
"savoir si un *user-agent* particulier peut accéder à une URL du site web qui "
"a publié le fichier :file:`robots.txt`. Pour plus de détails sur la "
"structure des fichiers :file:`robots.txt`, voir http://www.robotstxt.org/"
"orig.html."

#: ../Doc/library/urllib.robotparser.rst:28
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""
"Cette classe fournit des méthodes pour lire, analyser et répondre aux "
"questions à propos du fichier :file:`robots.txt` disponible à l'adresse "
"*url*."

#: ../Doc/library/urllib.robotparser.rst:33
msgid "Sets the URL referring to a :file:`robots.txt` file."
msgstr "Modifie l'URL référençant le fichier :file:`robots.txt`."

#: ../Doc/library/urllib.robotparser.rst:37
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
msgstr ""
"Lit le fichier :file:`robots.txt` depuis son URL et envoie le contenu à "
"l'analyseur."

#: ../Doc/library/urllib.robotparser.rst:41
msgid "Parses the lines argument."
msgstr "Analyse les lignes données en argument."

#: ../Doc/library/urllib.robotparser.rst:45
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""
"Renvoie ``True`` si *useragent* est autorisé à accéder à *url* selon les "
"règles contenues dans le fichier :file:`robots.txt` analysé."

#: ../Doc/library/urllib.robotparser.rst:51
msgid ""
"Returns the time the ``robots.txt`` file was last fetched. This is useful "
"for long-running web spiders that need to check for new ``robots.txt`` files "
"periodically."
msgstr ""
"Renvoie l'heure à laquelle le fichier ``robots.txt`` a été téléchargé pour "
"la dernière fois. Cela est utile pour les *web spiders* s'exécutant sur de "
"longues périodes, qui doivent vérifier périodiquement si le fichier est mis "
"à jour."

#: ../Doc/library/urllib.robotparser.rst:57
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
msgstr ""
"Indique que le fichier ``robots.txt`` a été téléchargé pour la dernière fois "
"à l'heure courante."

#: ../Doc/library/urllib.robotparser.rst:62
msgid ""
"Returns the value of the ``Crawl-delay`` parameter from ``robots.txt`` for "
"the *useragent* in question. If there is no such parameter or it doesn't "
"apply to the *useragent* specified or the ``robots.txt`` entry for this "
"parameter has invalid syntax, return ``None``."
msgstr ""
"Renvoie la valeur du paramètre ``Crawl-delay`` du ``robots.txt`` pour le "
"*useragent* en question. S'il n'y a pas de tel paramètre ou qu'il ne "
"s'applique pas au *useragent* spécifié ou si l'entrée du ``robots.txt`` pour "
"ce paramètre a une syntaxe invalide, renvoie ``None``."

#: ../Doc/library/urllib.robotparser.rst:71
msgid ""
"Returns the contents of the ``Request-rate`` parameter from ``robots.txt`` "
"as a :term:`named tuple` ``RequestRate(requests, seconds)``. If there is no "
"such parameter or it doesn't apply to the *useragent* specified or the "
"``robots.txt`` entry for this parameter has invalid syntax, return ``None``."
msgstr ""
"Renvoie le contenu du paramètre ``Request-rate`` du ``robots.txt`` sous la "
"forme d'un :term:`named tuple` ``RequestRate(requests, seconds)``. S'il n'y "
"a pas de tel paramètre ou qu'il ne s'applique pas au *useragent* spécifié ou "
"si l'entrée du ``robots.txt`` pour ce paramètre a une syntaxe invalide, "
"``None`` est renvoyé."

#: ../Doc/library/urllib.robotparser.rst:80
msgid ""
"The following example demonstrates basic use of the :class:`RobotFileParser` "
"class::"
msgstr ""
"L'exemple suivant présente une utilisation basique de la classe :class:"
"`RobotFileParser` : ::"