# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2001-2016, Python Software Foundation
# This file is distributed under the same license as the Python package.
# FIRST AUTHOR <EMAIL@ADDRESS>, YEAR.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Python 3.6\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2016-10-30 10:40+0100\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"

#: ../Doc/library/urllib.robotparser.rst:2
msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:10
msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:20
msgid ""
"This module provides a single class, :class:`RobotFileParser`, which answers "
"questions about whether or not a particular user agent can fetch a URL on "
"the Web site that published the :file:`robots.txt` file. For more details "
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:28
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:33
msgid "Sets the URL referring to a :file:`robots.txt` file."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:37
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:41
msgid "Parses the lines argument."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:45
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:51
msgid ""
"Returns the time the ``robots.txt`` file was last fetched. This is useful "
"for long-running web spiders that need to check for new ``robots.txt`` files "
"periodically."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:57
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:62
msgid ""
"Returns the value of the ``Crawl-delay`` parameter from ``robots.txt`` for "
"the *useragent* in question. If there is no such parameter or it doesn't "
"apply to the *useragent* specified or the ``robots.txt`` entry for this "
"parameter has invalid syntax, return ``None``."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:71
msgid ""
"Returns the contents of the ``Request-rate`` parameter from ``robots.txt`` "
"in the form of a :func:`~collections.namedtuple` ``(requests, seconds)``. "
"If there is no such parameter or it doesn't apply to the *useragent* "
"specified or the ``robots.txt`` entry for this parameter has invalid syntax, "
"return ``None``."
msgstr ""

#: ../Doc/library/urllib.robotparser.rst:80
msgid ""
"The following example demonstrates basic use of the :class:`RobotFileParser` "
"class::"
msgstr ""
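
# Translator note: the literal code block referenced by the entry above
# (rst:80) is not extracted into this catalog, since literal blocks are not
# translatable. As context only, a minimal sketch of such a usage example
# could look like the code below; the example.com URLs are hypothetical
# placeholders and are not taken from the source document.
#
#   import urllib.robotparser
#
#   rp = urllib.robotparser.RobotFileParser()
#   rp.set_url("http://www.example.com/robots.txt")   # hypothetical robots.txt URL
#   rp.read()                                         # fetch and parse the file
#
#   rrate = rp.request_rate("*")                      # named tuple (requests, seconds) or None
#   if rrate is not None:
#       print(rrate.requests, rrate.seconds)
#   print(rp.crawl_delay("*"))                        # Crawl-delay value or None
#   print(rp.can_fetch("*", "http://www.example.com/some/page.html"))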