# SOME DESCRIPTIVE TITLE.
# Copyright (C) 1990-2016, Python Software Foundation
# This file is distributed under the same license as the Python package.
# FIRST AUTHOR <EMAIL@ADDRESS>, YEAR.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Python 2.7\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2016-10-30 10:44+0100\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
#: ../Doc/library/robotparser.rst:3
msgid ":mod:`robotparser` --- Parser for robots.txt"
msgstr ""
#: ../Doc/library/robotparser.rst:18
msgid ""
"The :mod:`robotparser` module has been renamed :mod:`urllib.robotparser` in "
"Python 3. The :term:`2to3` tool will automatically adapt imports when "
"converting your sources to Python 3."
msgstr ""
#: ../Doc/library/robotparser.rst:23
msgid ""
"This module provides a single class, :class:`RobotFileParser`, which answers "
"questions about whether or not a particular user agent can fetch a URL on "
"the Web site that published the :file:`robots.txt` file. For more details "
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""
"Ce module fournit une simple classe, :class:`RobotFileParser`, qui permet de "
"savoir si un *user-agent* particulier peut accéder à une URL du site web qui "
"a publié ce fichier :file:`robots.txt`. Pour plus de détails sur la "
"structure des fichiers :file:`robots.txt`, voir http://www.robotstxt.org/"
"orig.html."
#: ../Doc/library/robotparser.rst:31
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""
"Cette classe fournit des méthodes pour lire, analyser et répondre aux "
"questions à propos du fichier :file:`robots.txt` disponible à l'adresse "
"*url*."
#: ../Doc/library/robotparser.rst:37
msgid "Sets the URL referring to a :file:`robots.txt` file."
msgstr "Modifie l'URL référençant le fichier :file:`robots.txt`."
#: ../Doc/library/robotparser.rst:42
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
msgstr ""
"Lit le fichier :file:`robots.txt` depuis son URL et envoie le contenu à "
"l'analyseur."
#: ../Doc/library/robotparser.rst:47
msgid "Parses the lines argument."
msgstr "Analyse les lignes données en argument."
#: ../Doc/library/robotparser.rst:52
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""
"Renvoie ``True`` si *useragent* est autorisé à accéder à *url* selon les "
"règles contenues dans le fichier :file:`robots.txt` analysé."
#: ../Doc/library/robotparser.rst:59
msgid ""
"Returns the time the ``robots.txt`` file was last fetched. This is useful "
"for long-running web spiders that need to check for new ``robots.txt`` files "
"periodically."
msgstr ""
"Renvoie le temps auquel le fichier ``robots.txt`` a été téléchargé pour la "
"dernière fois. Cela est utile pour des *web spiders* de longue durée qui "
"doivent vérifier périodiquement si le fichier est mis à jour."
#: ../Doc/library/robotparser.rst:66
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
msgstr ""
"Indique que le fichier ``robots.txt`` a été téléchargé pour la dernière fois "
"au temps courant."
#: ../Doc/library/robotparser.rst:69
msgid ""
"The following example demonstrates basic use of the RobotFileParser class. ::"
msgstr ""