Merge branch 'master' into pr_2261

Serhiy Storchaka 2018-02-08 16:31:54 +02:00
commit c0a268962a
988 changed files with 64123 additions and 24041 deletions

.gitattributes

@ -26,6 +26,7 @@ Lib/test/decimaltestdata/*.decTest -text
Lib/test/test_email/data/*.txt -text
Lib/test/xmltestdata/* -text
Lib/test/coding20731.py -text
Lib/test/test_importlib/data01/* -text
# CRLF files
*.bat text eol=crlf

.github/CODEOWNERS

@ -5,17 +5,24 @@
# https://git-scm.com/docs/gitignore#_pattern_format
# asyncio
**/*asyncio* @1st1
**/*asyncio* @1st1 @asvetlov
# Core
**/*context* @1st1
**/*genobject* @1st1
**/*hamt* @1st1
# Hashing
**/*hashlib* @python/crypto-team
**/*pyhash* @python/crypto-team
# Import (including importlib)
**/*import* @python/import-team
# Import (including importlib).
# Ignoring importlib.h so as to not get flagged on
# all pull requests that change the emitted
# bytecode.
**/*import*.c @python/import-team
**/*import*.py @python/import-team
# SSL
**/*ssl* @python/crypto-team
@ -50,4 +57,8 @@ Python/bootstrap_hash.c @python/crypto-team
**/*functools* @ncoghlan @rhettinger
**/*decimal* @rhettinger @skrah
**/*dataclasses* @ericvsmith
**/*idlelib* @terryjreedy
**/*typing* @gvanrossum @ilevkivskyi

.github/appveyor.yml

@ -1,4 +1,4 @@
version: 3.7.0a0.{build}
version: 3.8build{build}
clone_depth: 5
branches:
only:
@ -6,7 +6,7 @@ branches:
- /\d\.\d/
- buildbot-custom
cache:
- externals -> PCbuild\*
- externals -> PCbuild
build_script:
- cmd: PCbuild\build.bat -e
- cmd: PCbuild\win32\python.exe -m test.pythoninfo
@ -16,20 +16,3 @@ environment:
HOST_PYTHON: C:\Python36\python.exe
image:
- Visual Studio 2017
# Only trigger AppVeyor if actual code or its configuration changes
only_commits:
files:
- .github/appveyor.yml
- .gitattributes
- Grammar/
- Include/
- Lib/
- Modules/
- Objects/
- PC/
- PCbuild/
- Parser/
- Programs/
- Python/
- Tools/

.gitignore

@ -1,3 +1,10 @@
# added for local development
.buildaix/
Modules/python.exp
buildaix/
installp/
.gitignore
# Two-trick pony for OSX and other case insensitive file systems:
# Ignore ./python binary on Unix but still look into ./Python/ directory.
/python


@ -7,6 +7,19 @@ group: beta
cache:
- pip
- ccache
- directories:
- $HOME/multissl
env:
global:
- OPENSSL=1.1.0g
- OPENSSL_DIR="$HOME/multissl/openssl/${OPENSSL}"
- PATH="${OPENSSL_DIR}/bin:$PATH"
- CFLAGS="-I${OPENSSL_DIR}/include"
- LDFLAGS="-L${OPENSSL_DIR}/lib"
# Set rpath with env var instead of -Wl,-rpath linker flag
# OpenSSL ignores LDFLAGS when linking bin/openssl
- LD_RUN_PATH="${OPENSSL_DIR}/lib"
branches:
only:
@ -48,6 +61,10 @@ matrix:
echo "Only docs were updated, stopping build process."
exit
fi
python3 Tools/ssl/multissltests.py --steps=library \
--base-directory ${HOME}/multissl \
--openssl ${OPENSSL} >/dev/null
openssl version
./configure
make -s -j4
# Need a venv that can parse covered code.
@ -66,11 +83,32 @@ matrix:
before_script:
- |
set -e
if ! git diff --name-only $TRAVIS_COMMIT_RANGE | grep -qvE '(\.rst$)|(^Doc)|(^Misc)'
if [ "$TRAVIS_PULL_REQUEST" = "false" ]; then
files_changed=$(git diff --name-only $TRAVIS_COMMIT_RANGE)
else
# Pull requests are slightly complicated because merging the PR commit without
# rebasing causes it to retain its old commit date. Meaning in history if any
# commits have been made on master that post-date it, they will be accidentally
# included in the diff if we use the TRAVIS_COMMIT_RANGE variable.
files_changed=$(git diff --name-only HEAD $(git merge-base HEAD $TRAVIS_BRANCH))
fi
# Prints changed files in this commit to help debug doc-only build issues.
echo "Files changed: "
echo $files_changed
if ! echo $files_changed | grep -qvE '(\.rst$)|(^Doc)|(^Misc)'
then
echo "Only docs were updated, stopping build process."
exit
fi
if [ "${TESTING}" != "docs" ]; then
# clang complains about unused-parameter a lot, redirect stderr
python3 Tools/ssl/multissltests.py --steps=library \
--base-directory ${HOME}/multissl \
--openssl ${OPENSSL} >/dev/null 2>&1
fi
openssl version
./configure --with-pydebug
make -j4
make -j4 regen-all clinic


@ -68,7 +68,7 @@ taken on the bug.
.. seealso::
`How to Report Bugs Effectively <http://www.chiark.greenend.org.uk/~sgtatham/bugs.html>`_
`How to Report Bugs Effectively <https://www.chiark.greenend.org.uk/~sgtatham/bugs.html>`_
Article which goes into some detail about how to create a useful bug report.
This describes what kind of information is useful and why it is useful.


@ -13,6 +13,16 @@ the module initialisation function. The macro puts a pointer to a C structure
into a static variable, :c:data:`PyDateTimeAPI`, that is used by the following
macros.
Macro for access to the UTC singleton:
.. c:var:: PyObject* PyDateTime_TimeZone_UTC
Returns the time zone singleton representing UTC, the same object as
:attr:`datetime.timezone.utc`.
.. versionadded:: 3.7
Type-check macros:
.. c:function:: int PyDate_Check(PyObject *ob)
@ -79,27 +89,41 @@ Macros to create objects:
.. c:function:: PyObject* PyDate_FromDate(int year, int month, int day)
Return a ``datetime.date`` object with the specified year, month and day.
Return a :class:`datetime.date` object with the specified year, month and day.
.. c:function:: PyObject* PyDateTime_FromDateAndTime(int year, int month, int day, int hour, int minute, int second, int usecond)
Return a ``datetime.datetime`` object with the specified year, month, day, hour,
Return a :class:`datetime.datetime` object with the specified year, month, day, hour,
minute, second and microsecond.
.. c:function:: PyObject* PyTime_FromTime(int hour, int minute, int second, int usecond)
Return a ``datetime.time`` object with the specified hour, minute, second and
Return a :class:`datetime.time` object with the specified hour, minute, second and
microsecond.
.. c:function:: PyObject* PyDelta_FromDSU(int days, int seconds, int useconds)
Return a ``datetime.timedelta`` object representing the given number of days,
seconds and microseconds. Normalization is performed so that the resulting
number of microseconds and seconds lie in the ranges documented for
``datetime.timedelta`` objects.
Return a :class:`datetime.timedelta` object representing the given number
of days, seconds and microseconds. Normalization is performed so that the
resulting number of microseconds and seconds lie in the ranges documented for
:class:`datetime.timedelta` objects.
.. c:function:: PyObject* PyTimeZone_FromOffset(PyDateTime_DeltaType* offset)
Return a :class:`datetime.timezone` object with an unnamed fixed offset
represented by the *offset* argument.
.. versionadded:: 3.7
.. c:function:: PyObject* PyTimeZone_FromOffsetAndName(PyDateTime_DeltaType* offset, PyUnicode* name)
Return a :class:`datetime.timezone` object with a fixed offset represented
by the *offset* argument and with tzname *name*.
.. versionadded:: 3.7
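As an illustrative sketch (not something added by this commit), a C extension could build a fixed-offset timezone with these macros, assuming :c:macro:`PyDateTime_IMPORT` has already been executed in the module initialisation function::

   #include <Python.h>
   #include "datetime.h"

   /* Sketch: datetime.timezone(timedelta(minutes=offset_minutes)). */
   static PyObject *
   make_fixed_timezone(int offset_minutes)
   {
       PyObject *delta = PyDelta_FromDSU(0, offset_minutes * 60, 0);
       if (delta == NULL) {
           return NULL;
       }
       PyObject *tz = PyTimeZone_FromOffset(delta);   /* new in 3.7 */
       Py_DECREF(delta);                              /* the offset is not stolen */
       return tz;
   }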
Macros to extract fields from date objects. The argument must be an instance of
@ -199,11 +223,11 @@ Macros for the convenience of modules implementing the DB API:
.. c:function:: PyObject* PyDateTime_FromTimestamp(PyObject *args)
Create and return a new ``datetime.datetime`` object given an argument tuple
suitable for passing to ``datetime.datetime.fromtimestamp()``.
Create and return a new :class:`datetime.datetime` object given an argument
tuple suitable for passing to :meth:`datetime.datetime.fromtimestamp()`.
.. c:function:: PyObject* PyDate_FromTimestamp(PyObject *args)
Create and return a new ``datetime.date`` object given an argument tuple
suitable for passing to ``datetime.date.fromtimestamp()``.
Create and return a new :class:`datetime.date` object given an argument
tuple suitable for passing to :meth:`datetime.date.fromtimestamp()`.
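A rough sketch of how such an argument tuple might be built in C (not part of this change)::

   #include <Python.h>
   #include "datetime.h"

   /* Sketch: datetime.date.fromtimestamp(ts); assumes PyDateTime_IMPORT ran. */
   static PyObject *
   date_from_timestamp(long ts)
   {
       PyObject *args = Py_BuildValue("(l)", ts);
       if (args == NULL) {
           return NULL;
       }
       PyObject *date = PyDate_FromTimestamp(args);
       Py_DECREF(args);
       return date;
   }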


@ -7,6 +7,213 @@
Initialization, Finalization, and Threads
*****************************************
.. _pre-init-safe:
Before Python Initialization
============================
In an application embedding Python, the :c:func:`Py_Initialize` function must
be called before using any other Python/C API functions; with the exception of
a few functions and the :ref:`global configuration variables
<global-conf-vars>`.
The following functions can be safely called before Python is initialized:
* Configuration functions:
* :c:func:`PyImport_AppendInittab`
* :c:func:`PyImport_ExtendInittab`
* :c:func:`PyInitFrozenExtensions`
* :c:func:`PyMem_SetAllocator`
* :c:func:`PyMem_SetupDebugHooks`
* :c:func:`PyObject_SetArenaAllocator`
* :c:func:`Py_SetPath`
* :c:func:`Py_SetProgramName`
* :c:func:`Py_SetPythonHome`
* :c:func:`Py_SetStandardStreamEncoding`
* Informative functions:
* :c:func:`PyMem_GetAllocator`
* :c:func:`PyObject_GetArenaAllocator`
* :c:func:`Py_GetBuildInfo`
* :c:func:`Py_GetCompiler`
* :c:func:`Py_GetCopyright`
* :c:func:`Py_GetPlatform`
* :c:func:`Py_GetVersion`
* Utilities:
* :c:func:`Py_DecodeLocale`
* Memory allocators:
* :c:func:`PyMem_RawMalloc`
* :c:func:`PyMem_RawRealloc`
* :c:func:`PyMem_RawCalloc`
* :c:func:`PyMem_RawFree`
.. note::
The following functions **should not be called** before
:c:func:`Py_Initialize`: :c:func:`Py_EncodeLocale`, :c:func:`Py_GetPath`,
:c:func:`Py_GetPrefix`, :c:func:`Py_GetExecPrefix`,
:c:func:`Py_GetProgramFullPath`, :c:func:`Py_GetPythonHome`,
:c:func:`Py_GetProgramName` and :c:func:`PyEval_InitThreads`.
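Purely as an illustration (not taken from this commit), a minimal embedding program restricted to the calls listed above before :c:func:`Py_Initialize` could look like this::

   #include <Python.h>

   int
   main(int argc, char *argv[])
   {
       /* Py_DecodeLocale() and Py_SetProgramName() are both on the list of
          functions that may be called before Py_Initialize(). */
       wchar_t *program = Py_DecodeLocale(argv[0], NULL);
       if (program == NULL) {
           fprintf(stderr, "Fatal error: cannot decode argv[0]\n");
           return 1;
       }
       Py_SetProgramName(program);

       Py_Initialize();
       PyRun_SimpleString("print('embedded interpreter is up')");
       if (Py_FinalizeEx() < 0) {
           return 120;
       }
       PyMem_RawFree(program);   /* Py_DecodeLocale() uses the raw allocator */
       return 0;
   }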
.. _global-conf-vars:
Global configuration variables
==============================
Python has variables for the global configuration to control different features
and options. By default, these flags are controlled by :ref:`command line
options <using-on-interface-options>`.
When a flag is set by an option, the value of the flag is the number of times
that the option was set. For example, ``-b`` sets :c:data:`Py_BytesWarningFlag`
to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2.
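For example (an illustrative sketch, not part of this diff), an embedding application can assign these variables directly before initialization instead of relying on command line options::

   #include <Python.h>

   int
   main(void)
   {
       /* Roughly equivalent to running "python -B -q" (an arbitrary choice). */
       Py_DontWriteBytecodeFlag = 1;   /* -B: do not write .pyc files */
       Py_QuietFlag = 1;               /* -q: no banner in interactive mode */

       Py_Initialize();
       PyRun_SimpleString("import sys; print(sys.flags.dont_write_bytecode)");
       Py_Finalize();
       return 0;
   }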
.. c:var:: Py_BytesWarningFlag
Issue a warning when comparing :class:`bytes` or :class:`bytearray` with
:class:`str` or :class:`bytes` with :class:`int`. Issue an error if greater
or equal to ``2``.
Set by the :option:`-b` option.
.. c:var:: Py_DebugFlag
Turn on parser debugging output (for experts only, depending on compilation
options).
Set by the :option:`-d` option and the :envvar:`PYTHONDEBUG` environment
variable.
.. c:var:: Py_DontWriteBytecodeFlag
If set to non-zero, Python won't try to write ``.pyc`` files on the
import of source modules.
Set by the :option:`-B` option and the :envvar:`PYTHONDONTWRITEBYTECODE`
environment variable.
.. c:var:: Py_FrozenFlag
Suppress error messages when calculating the module search path in
:c:func:`Py_GetPath`.
Private flag used by ``_freeze_importlib`` and ``frozenmain`` programs.
.. c:var:: Py_HashRandomizationFlag
Set to ``1`` if the :envvar:`PYTHONHASHSEED` environment variable is set to
a non-empty string.
If the flag is non-zero, read the :envvar:`PYTHONHASHSEED` environment
variable to initialize the secret hash seed.
.. c:var:: Py_IgnoreEnvironmentFlag
Ignore all :envvar:`PYTHON*` environment variables, e.g.
:envvar:`PYTHONPATH` and :envvar:`PYTHONHOME`, that might be set.
Set by the :option:`-E` and :option:`-I` options.
.. c:var:: Py_InspectFlag
When a script is passed as first argument or the :option:`-c` option is used,
enter interactive mode after executing the script or the command, even when
:data:`sys.stdin` does not appear to be a terminal.
Set by the :option:`-i` option and the :envvar:`PYTHONINSPECT` environment
variable.
.. c:var:: Py_InteractiveFlag
Set by the :option:`-i` option.
.. c:var:: Py_IsolatedFlag
Run Python in isolated mode. In isolated mode :data:`sys.path` contains
neither the script's directory nor the user's site-packages directory.
Set by the :option:`-I` option.
.. versionadded:: 3.4
.. c:var:: Py_LegacyWindowsFSEncodingFlag
If the flag is non-zero, use the ``mbcs`` encoding instead of the UTF-8
encoding for the filesystem encoding.
Set to ``1`` if the :envvar:`PYTHONLEGACYWINDOWSFSENCODING` environment
variable is set to a non-empty string.
See :pep:`529` for more details.
Availability: Windows.
.. c:var:: Py_LegacyWindowsStdioFlag
If the flag is non-zero, use :class:`io.FileIO` instead of
:class:`WindowsConsoleIO` for :mod:`sys` standard streams.
Set to ``1`` if the :envvar:`PYTHONLEGACYWINDOWSSTDIO` environment
variable is set to a non-empty string.
See :pep:`528` for more details.
Availability: Windows.
.. c:var:: Py_NoSiteFlag
Disable the import of the module :mod:`site` and the site-dependent
manipulations of :data:`sys.path` that it entails. Also disable these
manipulations if :mod:`site` is explicitly imported later (call
:func:`site.main` if you want them to be triggered).
Set by the :option:`-S` option.
.. c:var:: Py_NoUserSiteDirectory
Don't add the :data:`user site-packages directory <site.USER_SITE>` to
:data:`sys.path`.
Set by the :option:`-s` and :option:`-I` options, and the
:envvar:`PYTHONNOUSERSITE` environment variable.
.. c:var:: Py_OptimizeFlag
Set by the :option:`-O` option and the :envvar:`PYTHONOPTIMIZE` environment
variable.
.. c:var:: Py_QuietFlag
Don't display the copyright and version messages even in interactive mode.
Set by the :option:`-q` option.
.. versionadded:: 3.2
.. c:var:: Py_UnbufferedStdioFlag
Force the stdout and stderr streams to be unbuffered.
Set by the :option:`-u` option and the :envvar:`PYTHONUNBUFFERED`
environment variable.
.. c:var:: Py_VerboseFlag
Print a message each time a module is initialized, showing the place
(filename or built-in module) from which it is loaded. If greater or equal
to ``2``, print a message for each file that is checked for when
searching for a module. Also provides information on module cleanup at exit.
Set by the :option:`-v` option and the :envvar:`PYTHONVERBOSE` environment
variable.
Initializing and finalizing the interpreter
===========================================
@ -27,9 +234,11 @@ Initializing and finalizing the interpreter
single: PySys_SetArgvEx()
single: Py_FinalizeEx()
Initialize the Python interpreter. In an application embedding Python, this
should be called before using any other Python/C API functions; with the
exception of :c:func:`Py_SetProgramName`, :c:func:`Py_SetPythonHome` and :c:func:`Py_SetPath`. This initializes
Initialize the Python interpreter. In an application embedding Python,
this should be called before using any other Python/C API functions; see
:ref:`Before Python Initialization <pre-init-safe>` for the few exceptions.
This initializes
the table of loaded modules (``sys.modules``), and creates the fundamental
modules :mod:`builtins`, :mod:`__main__` and :mod:`sys`. It also initializes
the module search path (``sys.path``). It does not set ``sys.argv``; use
@ -129,7 +338,7 @@ Process-wide parameters
.. versionadded:: 3.4
.. c:function:: void Py_SetProgramName(wchar_t *name)
.. c:function:: void Py_SetProgramName(const wchar_t *name)
.. index::
single: Py_Initialize()
@ -396,7 +605,7 @@ Process-wide parameters
.. versionchanged:: 3.4 The *updatepath* value depends on :option:`-I`.
.. c:function:: void Py_SetPythonHome(wchar_t *home)
.. c:function:: void Py_SetPythonHome(const wchar_t *home)
Set the default "home" directory, that is, the location of the standard
Python libraries. See :envvar:`PYTHONHOME` for the meaning of the
@ -478,15 +687,14 @@ This is so common that a pair of macros exists to simplify it::
The :c:macro:`Py_BEGIN_ALLOW_THREADS` macro opens a new block and declares a
hidden local variable; the :c:macro:`Py_END_ALLOW_THREADS` macro closes the
block. These two macros are still available when Python is compiled without
thread support (they simply have an empty expansion).
block.
When thread support is enabled, the block above expands to the following code::
The block above expands to the following code::
PyThreadState *_save;
_save = PyEval_SaveThread();
...Do some blocking I/O operation...
... Do some blocking I/O operation ...
PyEval_RestoreThread(_save);
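An illustrative use of the macro pair in an extension function might read as follows; ``do_blocking_copy`` is a hypothetical helper that performs plain C I/O and touches no Python objects::

   #include <Python.h>

   /* Hypothetical stand-in for some blocking C-level work. */
   static int
   do_blocking_copy(const char *src, const char *dst)
   {
       (void)src; (void)dst;
       return 1;
   }

   static PyObject *
   copy_file(PyObject *self, PyObject *args)
   {
       const char *src, *dst;
       int ok;

       if (!PyArg_ParseTuple(args, "ss", &src, &dst))
           return NULL;

       Py_BEGIN_ALLOW_THREADS      /* release the GIL, save the thread state */
       ok = do_blocking_copy(src, dst);
       Py_END_ALLOW_THREADS        /* reacquire the GIL, restore the state */

       if (!ok) {
           PyErr_SetString(PyExc_OSError, "copy failed");
           return NULL;
       }
       Py_RETURN_NONE;
   }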
.. index::
@ -609,36 +817,24 @@ code, or when embedding the Python interpreter:
This is a no-op when called for a second time.
.. versionchanged:: 3.7
This function is now called by :c:func:`Py_Initialize()`, so you don't
have to call it yourself anymore.
.. versionchanged:: 3.2
This function cannot be called before :c:func:`Py_Initialize()` anymore.
.. index:: module: _thread
.. note::
When only the main thread exists, no GIL operations are needed. This is a
common situation (most Python programs do not use threads), and the lock
operations slow the interpreter down a bit. Therefore, the lock is not
created initially. This situation is equivalent to having acquired the lock:
when there is only a single thread, all object accesses are safe. Therefore,
when this function initializes the global interpreter lock, it also acquires
it. Before the Python :mod:`_thread` module creates a new thread, knowing
that either it has the lock or the lock hasn't been created yet, it calls
:c:func:`PyEval_InitThreads`. When this call returns, it is guaranteed that
the lock has been created and that the calling thread has acquired it.
It is **not** safe to call this function when it is unknown which thread (if
any) currently has the global interpreter lock.
This function is not available when thread support is disabled at compile time.
.. c:function:: int PyEval_ThreadsInitialized()
Returns a non-zero value if :c:func:`PyEval_InitThreads` has been called. This
function can be called without holding the GIL, and therefore can be used to
avoid calls to the locking API when running single-threaded. This function is
not available when thread support is disabled at compile time.
avoid calls to the locking API when running single-threaded.
.. versionchanged:: 3.7
The :term:`GIL` is now initialized by :c:func:`Py_Initialize()`.
.. c:function:: PyThreadState* PyEval_SaveThread()
@ -646,8 +842,7 @@ code, or when embedding the Python interpreter:
Release the global interpreter lock (if it has been created and thread
support is enabled) and reset the thread state to *NULL*, returning the
previous thread state (which is not *NULL*). If the lock has been created,
the current thread must have acquired it. (This function is available even
when thread support is disabled at compile time.)
the current thread must have acquired it.
.. c:function:: void PyEval_RestoreThread(PyThreadState *tstate)
@ -655,8 +850,7 @@ code, or when embedding the Python interpreter:
Acquire the global interpreter lock (if it has been created and thread
support is enabled) and set the thread state to *tstate*, which must not be
*NULL*. If the lock has been created, the current thread must not have
acquired it, otherwise deadlock ensues. (This function is available even
when thread support is disabled at compile time.)
acquired it, otherwise deadlock ensues.
.. c:function:: PyThreadState* PyThreadState_Get()
@ -748,7 +942,7 @@ example usage in the Python source distribution.
This macro expands to ``{ PyThreadState *_save; _save = PyEval_SaveThread();``.
Note that it contains an opening brace; it must be matched with a following
:c:macro:`Py_END_ALLOW_THREADS` macro. See above for further discussion of this
macro. It is a no-op when thread support is disabled at compile time.
macro.
.. c:macro:: Py_END_ALLOW_THREADS
@ -756,29 +950,29 @@ example usage in the Python source distribution.
This macro expands to ``PyEval_RestoreThread(_save); }``. Note that it contains
a closing brace; it must be matched with an earlier
:c:macro:`Py_BEGIN_ALLOW_THREADS` macro. See above for further discussion of
this macro. It is a no-op when thread support is disabled at compile time.
this macro.
.. c:macro:: Py_BLOCK_THREADS
This macro expands to ``PyEval_RestoreThread(_save);``: it is equivalent to
:c:macro:`Py_END_ALLOW_THREADS` without the closing brace. It is a no-op when
thread support is disabled at compile time.
:c:macro:`Py_END_ALLOW_THREADS` without the closing brace.
.. c:macro:: Py_UNBLOCK_THREADS
This macro expands to ``_save = PyEval_SaveThread();``: it is equivalent to
:c:macro:`Py_BEGIN_ALLOW_THREADS` without the opening brace and variable
declaration. It is a no-op when thread support is disabled at compile time.
declaration.
Low-level API
-------------
All of the following functions are only available when thread support is enabled
at compile time, and must be called only when the global interpreter lock has
been created.
All of the following functions must be called after :c:func:`Py_Initialize`.
.. versionchanged:: 3.7
:c:func:`Py_Initialize()` now initializes the :term:`GIL`.
.. c:function:: PyInterpreterState* PyInterpreterState_New()
@ -859,8 +1053,7 @@ been created.
If this thread already has the lock, deadlock ensues.
:c:func:`PyEval_RestoreThread` is a higher-level function which is always
available (even when thread support isn't enabled or when threads have
not been initialized).
available (even when threads have not been initialized).
.. c:function:: void PyEval_ReleaseThread(PyThreadState *tstate)
@ -872,8 +1065,7 @@ been created.
reported.
:c:func:`PyEval_SaveThread` is a higher-level function which is always
available (even when thread support isn't enabled or when threads have
not been initialized).
available (even when threads have not been initialized).
.. c:function:: void PyEval_AcquireLock()
@ -1068,18 +1260,18 @@ Python-level trace functions in previous versions.
registration function as *obj*, *frame* is the frame object to which the event
pertains, *what* is one of the constants :const:`PyTrace_CALL`,
:const:`PyTrace_EXCEPTION`, :const:`PyTrace_LINE`, :const:`PyTrace_RETURN`,
:const:`PyTrace_C_CALL`, :const:`PyTrace_C_EXCEPTION`, or
:const:`PyTrace_C_RETURN`, and *arg* depends on the value of *what*:
:const:`PyTrace_C_CALL`, :const:`PyTrace_C_EXCEPTION`, :const:`PyTrace_C_RETURN`,
or :const:`PyTrace_OPCODE`, and *arg* depends on the value of *what*:
+------------------------------+--------------------------------------+
| Value of *what* | Meaning of *arg* |
+==============================+======================================+
| :const:`PyTrace_CALL` | Always *NULL*. |
| :const:`PyTrace_CALL` | Always :c:data:`Py_None`. |
+------------------------------+--------------------------------------+
| :const:`PyTrace_EXCEPTION` | Exception information as returned by |
| | :func:`sys.exc_info`. |
+------------------------------+--------------------------------------+
| :const:`PyTrace_LINE` | Always *NULL*. |
| :const:`PyTrace_LINE` | Always :c:data:`Py_None`. |
+------------------------------+--------------------------------------+
| :const:`PyTrace_RETURN` | Value being returned to the caller, |
| | or *NULL* if caused by an exception. |
@ -1090,7 +1282,8 @@ Python-level trace functions in previous versions.
+------------------------------+--------------------------------------+
| :const:`PyTrace_C_RETURN` | Function object being called. |
+------------------------------+--------------------------------------+
| :const:`PyTrace_OPCODE` | Always :c:data:`Py_None`. |
+------------------------------+--------------------------------------+
.. c:var:: int PyTrace_CALL
@ -1114,14 +1307,15 @@ Python-level trace functions in previous versions.
.. c:var:: int PyTrace_LINE
The value passed as the *what* parameter to a trace function (but not a
profiling function) when a line-number event is being reported.
The value passed as the *what* parameter to a :c:type:`Py_tracefunc` function
(but not a profiling function) when a line-number event is being reported.
It may be disabled for a frame by setting :attr:`f_trace_lines` to *0* on that frame.
.. c:var:: int PyTrace_RETURN
The value for the *what* parameter to :c:type:`Py_tracefunc` functions when a
call is returning without propagating an exception.
call is about to return.
.. c:var:: int PyTrace_C_CALL
@ -1142,22 +1336,32 @@ Python-level trace functions in previous versions.
function has returned.
.. c:var:: int PyTrace_OPCODE
The value for the *what* parameter to :c:type:`Py_tracefunc` functions (but not
profiling functions) when a new opcode is about to be executed. This event is
not emitted by default: it must be explicitly requested by setting
:attr:`f_trace_opcodes` to *1* on the frame.
.. c:function:: void PyEval_SetProfile(Py_tracefunc func, PyObject *obj)
Set the profiler function to *func*. The *obj* parameter is passed to the
function as its first parameter, and may be any Python object, or *NULL*. If
the profile function needs to maintain state, using a different value for *obj*
for each thread provides a convenient and thread-safe place to store it. The
profile function is called for all monitored events except the line-number
events.
profile function is called for all monitored events except :const:`PyTrace_LINE`,
:const:`PyTrace_OPCODE` and :const:`PyTrace_EXCEPTION`.
.. c:function:: void PyEval_SetTrace(Py_tracefunc func, PyObject *obj)
Set the tracing function to *func*. This is similar to
:c:func:`PyEval_SetProfile`, except the tracing function does receive line-number
events.
events and per-opcode events, but does not receive any event related to C function
objects being called. Any trace function registered using :c:func:`PyEval_SetTrace`
will not receive :const:`PyTrace_C_CALL`, :const:`PyTrace_C_EXCEPTION` or
:const:`PyTrace_C_RETURN` as a value for the *what* parameter.
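As a sketch (not part of the patch), a tracing function registered this way that only reacts to line events could look like this::

   #include <Python.h>
   #include <frameobject.h>        /* PyFrameObject, PyFrame_GetLineNumber() */

   /* Minimal Py_tracefunc: report line events, ignore everything else. */
   static int
   trace_lines(PyObject *obj, PyFrameObject *frame, int what, PyObject *arg)
   {
       (void)obj; (void)arg;
       if (what == PyTrace_LINE) {
           printf("line %d\n", PyFrame_GetLineNumber(frame));
       }
       return 0;                   /* returning -1 would signal an error */
   }

   static void
   install_tracer(void)
   {
       /* Must be called with the GIL held, after Py_Initialize(). */
       PyEval_SetTrace(trace_lines, NULL);
   }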
.. _advanced-debugging:


@ -10,6 +10,9 @@ Integer Objects
All integers are implemented as "long" integer objects of arbitrary size.
On error, most ``PyLong_As*`` APIs return ``(return type)-1`` which cannot be
distinguished from a number. Use :c:func:`PyErr_Occurred` to disambiguate.
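A small sketch of the recommended pattern (not part of this change)::

   #include <Python.h>

   /* Convert obj to a C long, telling a genuine result of -1 apart from an
      error, as the paragraph above recommends. */
   static int
   as_long_checked(PyObject *obj, long *result)
   {
       long value = PyLong_AsLong(obj);
       if (value == -1 && PyErr_Occurred()) {
           return -1;              /* e.g. TypeError or OverflowError is set */
       }
       *result = value;
       return 0;
   }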
.. c:type:: PyLongObject
This subtype of :c:type:`PyObject` represents a Python integer object.
@ -134,6 +137,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
Raise :exc:`OverflowError` if the value of *obj* is out of range for a
:c:type:`long`.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: long PyLong_AsLongAndOverflow(PyObject *obj, int *overflow)
@ -146,6 +151,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
return ``-1``; otherwise, set *\*overflow* to ``0``. If any other exception
occurs set *\*overflow* to ``0`` and return ``-1`` as usual.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: long long PyLong_AsLongLong(PyObject *obj)
@ -159,6 +166,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
Raise :exc:`OverflowError` if the value of *obj* is out of range for a
:c:type:`long`.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: long long PyLong_AsLongLongAndOverflow(PyObject *obj, int *overflow)
@ -171,6 +180,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
and return ``-1``; otherwise, set *\*overflow* to ``0``. If any other
exception occurs set *\*overflow* to ``0`` and return ``-1`` as usual.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. versionadded:: 3.2
@ -186,6 +197,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
Raise :exc:`OverflowError` if the value of *pylong* is out of range for a
:c:type:`Py_ssize_t`.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: unsigned long PyLong_AsUnsignedLong(PyObject *pylong)
@ -199,15 +212,25 @@ All integers are implemented as "long" integer objects of arbitrary size.
Raise :exc:`OverflowError` if the value of *pylong* is out of range for a
:c:type:`unsigned long`.
Returns ``(unsigned long)-1`` on error.
Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: size_t PyLong_AsSize_t(PyObject *pylong)
.. index::
single: SIZE_MAX
single: OverflowError (built-in exception)
Return a C :c:type:`size_t` representation of *pylong*. *pylong* must be
an instance of :c:type:`PyLongObject`.
Raise :exc:`OverflowError` if the value of *pylong* is out of range for a
:c:type:`size_t`.
Returns ``(size_t)-1`` on error.
Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: unsigned long long PyLong_AsUnsignedLongLong(PyObject *pylong)
@ -220,6 +243,9 @@ All integers are implemented as "long" integer objects of arbitrary size.
Raise :exc:`OverflowError` if the value of *pylong* is out of range for an
:c:type:`unsigned long long`.
Returns ``(unsigned long long)-1`` on error.
Use :c:func:`PyErr_Occurred` to disambiguate.
.. versionchanged:: 3.1
A negative *pylong* now raises :exc:`OverflowError`, not :exc:`TypeError`.
@ -233,6 +259,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
If the value of *obj* is out of range for an :c:type:`unsigned long`,
return the reduction of that value modulo ``ULONG_MAX + 1``.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: unsigned long long PyLong_AsUnsignedLongLongMask(PyObject *obj)
@ -243,6 +271,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
If the value of *obj* is out of range for an :c:type:`unsigned long long`,
return the reduction of that value modulo ``PY_ULLONG_MAX + 1``.
Returns -1 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: double PyLong_AsDouble(PyObject *pylong)
@ -252,6 +282,8 @@ All integers are implemented as "long" integer objects of arbitrary size.
Raise :exc:`OverflowError` if the value of *pylong* is out of range for a
:c:type:`double`.
Returns -1.0 on error. Use :c:func:`PyErr_Occurred` to disambiguate.
.. c:function:: void* PyLong_AsVoidPtr(PyObject *pylong)
@ -259,3 +291,5 @@ All integers are implemented as "long" integer objects of arbitrary size.
If *pylong* cannot be converted, an :exc:`OverflowError` will be raised. This
is only assured to produce a usable :c:type:`void` pointer for values created
with :c:func:`PyLong_FromVoidPtr`.
Returns NULL on error. Use :c:func:`PyErr_Occurred` to disambiguate.
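For instance (an illustrative sketch only), a pointer can be stored in a Python integer and recovered later, applying the same disambiguation rule::

   #include <Python.h>

   /* Sketch: wrap a C pointer in a Python int, then unwrap it again. */
   static void *
   roundtrip_pointer(void *p)
   {
       PyObject *as_int = PyLong_FromVoidPtr(p);
       if (as_int == NULL) {
           return NULL;
       }
       void *back = PyLong_AsVoidPtr(as_int);
       Py_DECREF(as_int);
       if (back == NULL && PyErr_Occurred()) {
           return NULL;
       }
       return back;
   }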


@ -100,9 +100,10 @@ The following function sets are wrappers to the system allocator. These
functions are thread-safe, the :term:`GIL <global interpreter lock>` does not
need to be held.
The default raw memory block allocator uses the following functions:
:c:func:`malloc`, :c:func:`calloc`, :c:func:`realloc` and :c:func:`free`; call
``malloc(1)`` (or ``calloc(1, 1)``) when requesting zero bytes.
The :ref:`default raw memory allocator <default-memory-allocators>` uses
the following functions: :c:func:`malloc`, :c:func:`calloc`, :c:func:`realloc`
and :c:func:`free`; call ``malloc(1)`` (or ``calloc(1, 1)``) when requesting
zero bytes.
.. versionadded:: 3.4
@ -165,7 +166,8 @@ The following function sets, modeled after the ANSI C standard, but specifying
behavior when requesting zero bytes, are available for allocating and releasing
memory from the Python heap.
By default, these functions use :ref:`pymalloc memory allocator <pymalloc>`.
The :ref:`default memory allocator <default-memory-allocators>` uses the
:ref:`pymalloc memory allocator <pymalloc>`.
.. warning::
@ -270,7 +272,8 @@ The following function sets, modeled after the ANSI C standard, but specifying
behavior when requesting zero bytes, are available for allocating and releasing
memory from the Python heap.
By default, these functions use :ref:`pymalloc memory allocator <pymalloc>`.
The :ref:`default object allocator <default-memory-allocators>` uses the
:ref:`pymalloc memory allocator <pymalloc>`.
.. warning::
@ -326,6 +329,31 @@ By default, these functions use :ref:`pymalloc memory allocator <pymalloc>`.
If *p* is *NULL*, no operation is performed.
.. _default-memory-allocators:
Default Memory Allocators
=========================
Default memory allocators:
===============================  ====================  ==================  ====================  ====================
Configuration                    Name                  PyMem_RawMalloc     PyMem_Malloc          PyObject_Malloc
===============================  ====================  ==================  ====================  ====================
Release build                    ``"pymalloc"``        ``malloc``          ``pymalloc``          ``pymalloc``
Debug build                      ``"pymalloc_debug"``  ``malloc`` + debug  ``pymalloc`` + debug  ``pymalloc`` + debug
Release build, without pymalloc  ``"malloc"``          ``malloc``          ``malloc``            ``malloc``
Debug build, without pymalloc    ``"malloc_debug"``    ``malloc`` + debug  ``malloc`` + debug    ``malloc`` + debug
===============================  ====================  ==================  ====================  ====================
Legend:
* Name: value for :envvar:`PYTHONMALLOC` environment variable
* ``malloc``: system allocators from the standard C library, C functions:
:c:func:`malloc`, :c:func:`calloc`, :c:func:`realloc` and :c:func:`free`
* ``pymalloc``: :ref:`pymalloc memory allocator <pymalloc>`
* "+ debug": with debug hooks installed by :c:func:`PyMem_SetupDebugHooks`
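Purely as an illustration (not part of this patch), the allocator currently installed for a domain can be inspected with :c:func:`PyMem_GetAllocator`, for example in order to chain to it from a custom allocator::

   #include <Python.h>

   /* Save the allocator currently installed for the "raw" domain
      (plain malloc in a default release build, per the table above). */
   static PyMemAllocatorEx saved_raw;

   static void
   remember_raw_allocator(void)
   {
       PyMem_GetAllocator(PYMEM_DOMAIN_RAW, &saved_raw);
       /* saved_raw.malloc, .calloc, .realloc and .free can now be called
          from a replacement installed with PyMem_SetAllocator(). */
   }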
Customize Memory Allocators
===========================
@ -431,7 +459,8 @@ Customize Memory Allocators
displayed if :mod:`tracemalloc` is tracing Python memory allocations and the
memory block was traced.
These hooks are installed by default if Python is compiled in debug
These hooks are :ref:`installed by default <default-memory-allocators>` if
Python is compiled in debug
mode. The :envvar:`PYTHONMALLOC` environment variable can be used to install
debug hooks on a Python compiled in release mode.
@ -453,9 +482,9 @@ to 512 bytes) with a short lifetime. It uses memory mappings called "arenas"
with a fixed size of 256 KiB. It falls back to :c:func:`PyMem_RawMalloc` and
:c:func:`PyMem_RawRealloc` for allocations larger than 512 bytes.
*pymalloc* is the default allocator of the :c:data:`PYMEM_DOMAIN_MEM` (ex:
:c:func:`PyMem_Malloc`) and :c:data:`PYMEM_DOMAIN_OBJ` (ex:
:c:func:`PyObject_Malloc`) domains.
*pymalloc* is the :ref:`default allocator <default-memory-allocators>` of the
:c:data:`PYMEM_DOMAIN_MEM` (ex: :c:func:`PyMem_Malloc`) and
:c:data:`PYMEM_DOMAIN_OBJ` (ex: :c:func:`PyObject_Malloc`) domains.
The arena allocator uses the following functions:


@ -106,6 +106,16 @@ Operating System Utilities
surrogate character, escape the bytes using the surrogateescape error
handler instead of decoding them.
Encoding, highest priority to lowest priority:
* ``UTF-8`` on macOS and Android;
* ``UTF-8`` if the Python UTF-8 mode is enabled;
* ``ASCII`` if the ``LC_CTYPE`` locale is ``"C"``,
``nl_langinfo(CODESET)`` returns the ``ASCII`` encoding (or an alias),
and the :c:func:`mbstowcs` and :c:func:`wcstombs` functions use the
``ISO-8859-1`` encoding.
* the current locale encoding.
Return a pointer to a newly allocated wide character string, use
:c:func:`PyMem_RawFree` to free the memory. If size is not ``NULL``, write
the number of wide characters excluding the null character into ``*size``
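As a sketch (the environment variable name is an assumption, not something from this diff), decoding a byte string obtained from the environment might look like this::

   #include <Python.h>
   #include <stdlib.h>     /* getenv() */

   /* Decode a hypothetical MYAPP_HOME value into a wide string suitable for
      the Py_Set*() functions; release it later with PyMem_RawFree(). */
   static wchar_t *
   decode_app_home(void)
   {
       const char *raw = getenv("MYAPP_HOME");
       if (raw == NULL) {
           return NULL;
       }
       size_t size;
       wchar_t *home = Py_DecodeLocale(raw, &size);
       /* home == NULL means a decoding or memory allocation failure. */
       return home;
   }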
@ -127,6 +137,9 @@ Operating System Utilities
.. versionadded:: 3.5
.. versionchanged:: 3.7
The function now uses the UTF-8 encoding in the UTF-8 mode.
.. c:function:: char* Py_EncodeLocale(const wchar_t *text, size_t *error_pos)
@ -134,16 +147,31 @@ Operating System Utilities
:ref:`surrogateescape error handler <surrogateescape>`: surrogate characters
in the range U+DC80..U+DCFF are converted to bytes 0x80..0xFF.
Encoding, highest priority to lowest priority:
* ``UTF-8`` on macOS and Android;
* ``UTF-8`` if the Python UTF-8 mode is enabled;
* ``ASCII`` if the ``LC_CTYPE`` locale is ``"C"``,
``nl_langinfo(CODESET)`` returns the ``ASCII`` encoding (or an alias),
and :c:func:`mbstowcs` and :c:func:`wcstombs` functions uses the
``ISO-8859-1`` encoding.
* the current locale encoding.
The function uses the UTF-8 encoding in the Python UTF-8 mode.
Return a pointer to a newly allocated byte string, use :c:func:`PyMem_Free`
to free the memory. Return ``NULL`` on encoding error or memory allocation
error.
If error_pos is not ``NULL``, ``*error_pos`` is set to the index of the
invalid character on encoding error, or set to ``(size_t)-1`` otherwise.
If error_pos is not ``NULL``, ``*error_pos`` is set to ``(size_t)-1`` on
success, or set to the index of the invalid character on encoding error.
Use the :c:func:`Py_DecodeLocale` function to decode the bytes string back
to a wide character string.
.. versionchanged:: 3.7
The function now uses the UTF-8 encoding in the UTF-8 mode.
.. seealso::
The :c:func:`PyUnicode_EncodeFSDefault` and
@ -151,6 +179,9 @@ Operating System Utilities
.. versionadded:: 3.5
.. versionchanged:: 3.7
The function now supports the UTF-8 mode.
.. _systemfunctions:


@ -760,7 +760,8 @@ system.
Py_ssize_t len, \
const char *errors)
Decode a string from the current locale encoding. The supported
Decode a string from UTF-8 on Android, or from the current locale encoding
on other platforms. The supported
error handlers are ``"strict"`` and ``"surrogateescape"``
(:pep:`383`). The decoder uses ``"strict"`` error handler if
*errors* is ``NULL``. *str* must end with a null character but
@ -770,12 +771,20 @@ system.
:c:data:`Py_FileSystemDefaultEncoding` (the locale encoding read at
Python startup).
This function ignores the Python UTF-8 mode.
.. seealso::
The :c:func:`Py_DecodeLocale` function.
.. versionadded:: 3.3
.. versionchanged:: 3.7
The function now also uses the current locale encoding for the
``surrogateescape`` error handler, except on Android. Previously, :c:func:`Py_DecodeLocale`
was used for the ``surrogateescape``, and the current locale encoding was
used for ``strict``.
.. c:function:: PyObject* PyUnicode_DecodeLocale(const char *str, const char *errors)
@ -787,7 +796,8 @@ system.
.. c:function:: PyObject* PyUnicode_EncodeLocale(PyObject *unicode, const char *errors)
Encode a Unicode object to the current locale encoding. The
Encode a Unicode object to UTF-8 on Android, or to the current locale
encoding on other platforms. The
supported error handlers are ``"strict"`` and ``"surrogateescape"``
(:pep:`383`). The encoder uses ``"strict"`` error handler if
*errors* is ``NULL``. Return a :class:`bytes` object. *unicode* cannot
@ -797,12 +807,21 @@ system.
:c:data:`Py_FileSystemDefaultEncoding` (the locale encoding read at
Python startup).
This function ignores the Python UTF-8 mode.
.. seealso::
The :c:func:`Py_EncodeLocale` function.
.. versionadded:: 3.3
.. versionchanged:: 3.7
The function now also uses the current locale encoding for the
``surrogateescape`` error handler, except on Android. Previously,
:c:func:`Py_EncodeLocale`
was used for the ``surrogateescape``, and the current locale encoding was
used for ``strict``.
File System Encoding
""""""""""""""""""""


@ -301,7 +301,7 @@ the same library that the Python runtime is using.
set to *NULL*.
.. c:function:: PyObject* PyEval_EvalCodeEx(PyObject *co, PyObject *globals, PyObject *locals, PyObject **args, int argcount, PyObject **kws, int kwcount, PyObject **defs, int defcount, PyObject *kwdefs, PyObject *closure)
.. c:function:: PyObject* PyEval_EvalCodeEx(PyObject *co, PyObject *globals, PyObject *locals, PyObject *const *args, int argcount, PyObject *const *kws, int kwcount, PyObject *const *defs, int defcount, PyObject *kwdefs, PyObject *closure)
Evaluate a precompiled code object, given a particular environment for its
evaluation. This environment consists of a dictionary of global variables,


@ -90,11 +90,10 @@ html_split_index = True
# Options for LaTeX output
# ------------------------
latex_engine = 'xelatex'
# Get LaTeX to handle Unicode correctly
latex_elements = {
'inputenc': r'\usepackage[utf8x]{inputenc}',
'utf8extra': '',
'fontenc': r'\usepackage[T1,T2A]{fontenc}',
}
# Additional stuff for the LaTeX preamble.


@ -4,7 +4,7 @@ Copyright
Python and this documentation is:
Copyright © 2001-2017 Python Software Foundation. All rights reserved.
Copyright © 2001-2018 Python Software Foundation. All rights reserved.
Copyright © 2000 BeOpen.com. All rights reserved.


@ -177,6 +177,14 @@ PyDelta_FromDSU:int:days::
PyDelta_FromDSU:int:seconds::
PyDelta_FromDSU:int:useconds::
PyTimeZone_FromOffset:PyObject*::+1:
PyTimeZone_FromOffset:PyDateTime_DeltaType*:offset:+1:Reference count not increased if offset is +00:00
PyTimeZone_FromOffsetAndName:PyObject*::+1:
PyTimeZone_FromOffsetAndName:PyDateTime_DeltaType*:offset:+1:Reference count not increased if offset is +00:00 and name == NULL
PyTimeZone_FromOffsetAndName:PyUnicode*:name:+1:
PyDescr_NewClassMethod:PyObject*::+1:
PyDescr_NewClassMethod:PyTypeObject*:type::
PyDescr_NewClassMethod:PyMethodDef*:method::


@ -62,7 +62,7 @@ Key terms
locally.
.. _setuptools: https://setuptools.readthedocs.io/en/latest/
.. _wheel: https://wheel.readthedocs.org
.. _wheel: https://wheel.readthedocs.io/
Open source licensing and collaboration
=======================================
@ -111,7 +111,7 @@ by invoking the ``pip`` module at the command line::
The Python Packaging User Guide includes more details on the `currently
recommended tools`_.
.. _currently recommended tools: https://packaging.python.org/en/latest/current/#packaging-tool-recommendations
.. _currently recommended tools: https://packaging.python.org/guides/tool-recommendations/#packaging-tool-recommendations
Reading the guide
=================
@ -124,11 +124,11 @@ involved in creating a project:
* `Uploading the project to the Python Packaging Index`_
.. _Project structure: \
https://packaging.python.org/en/latest/distributing/
https://packaging.python.org/tutorials/distributing-packages/
.. _Building and packaging the project: \
https://packaging.python.org/en/latest/distributing/#packaging-your-project
https://packaging.python.org/tutorials/distributing-packages/#packaging-your-project
.. _Uploading the project to the Python Packaging Index: \
https://packaging.python.org/en/latest/distributing/#uploading-your-project-to-pypi
https://packaging.python.org/tutorials/distributing-packages/#uploading-your-project-to-pypi
How do I...?
@ -160,7 +160,7 @@ Python Packaging User Guide for more information and recommendations.
.. seealso::
`Python Packaging User Guide: Binary Extensions
<https://packaging.python.org/en/latest/extensions>`__
<https://packaging.python.org/guides/packaging-binary-extensions/>`__
.. other topics:


@ -285,6 +285,10 @@ the full reference.
See the :func:`setup` function for a list of keyword arguments accepted by the
Distribution constructor. :func:`setup` creates a Distribution instance.
.. versionchanged:: 3.7
:class:`~distutils.core.Distribution` now warns if ``classifiers``,
``keywords`` and ``platforms`` fields are not specified as a list or
a string.
.. class:: Command


@ -22,7 +22,7 @@ very little overhead for build/release/install mechanics.
This guide only covers the basic tools for building and distributing
extensions that are provided as part of this version of Python. Third party
tools offer easier to use and more secure alternatives. Refer to the `quick
recommendations section <https://packaging.python.org/en/latest/current/>`__
recommendations section <https://packaging.python.org/guides/tool-recommendations/>`__
in the Python Packaging User Guide for more information.
.. toctree::


@ -581,17 +581,19 @@ This information includes:
| | description of the | | |
| | package | | |
+----------------------+---------------------------+-----------------+--------+
| ``long_description`` | longer description of the | long string | \(5) |
| ``long_description`` | longer description of the | long string | \(4) |
| | package | | |
+----------------------+---------------------------+-----------------+--------+
| ``download_url`` | location where the | URL | \(4) |
| ``download_url`` | location where the | URL | |
| | package may be downloaded | | |
+----------------------+---------------------------+-----------------+--------+
| ``classifiers`` | a list of classifiers | list of strings | \(4) |
| ``classifiers`` | a list of classifiers | list of strings | (6)(7) |
+----------------------+---------------------------+-----------------+--------+
| ``platforms`` | a list of platforms | list of strings | |
| ``platforms`` | a list of platforms | list of strings | (6)(8) |
+----------------------+---------------------------+-----------------+--------+
| ``license`` | license for the package | short string | \(6) |
| ``keywords`` | a list of keywords | list of strings | (6)(8) |
+----------------------+---------------------------+-----------------+--------+
| ``license`` | license for the package | short string | \(5) |
+----------------------+---------------------------+-----------------+--------+
Notes:
@ -607,22 +609,30 @@ Notes:
provided, distutils lists it as the author in :file:`PKG-INFO`.
(4)
These fields should not be used if your package is to be compatible with Python
versions prior to 2.2.3 or 2.3. The list is available from the `PyPI website
<https://pypi.python.org/pypi>`_.
(5)
The ``long_description`` field is used by PyPI when you are
:ref:`registering <package-register>` a package, to
:ref:`build its home page <package-display>`.
(6)
(5)
The ``license`` field is a text indicating the license covering the
package where the license is not a selection from the "License" Trove
classifiers. See the ``Classifier`` field. Notice that
there's a ``licence`` distribution option which is deprecated but still
acts as an alias for ``license``.
(6)
This field must be a list.
(7)
The valid classifiers are listed on
`PyPI <https://pypi.python.org/pypi?:action=list_classifiers>`_.
(8)
To preserve backward compatibility, this field also accepts a string. If
you pass a comma-separated string ``'foo, bar'``, it will be converted to
``['foo', 'bar']``. Otherwise, it will be converted to a list of one
string.
'short string'
A single line of text, not more than 200 characters.
@ -650,7 +660,7 @@ information is sometimes used to indicate sub-releases. These are
1.0.1a2
the second alpha release of the first patch version of 1.0
``classifiers`` are specified in a Python list::
``classifiers`` must be specified in a list::
setup(...,
classifiers=[
@ -671,6 +681,11 @@ information is sometimes used to indicate sub-releases. These are
],
)
.. versionchanged:: 3.7
:class:`~distutils.core.setup` now raises a :exc:`TypeError` if
``classifiers``, ``keywords`` and ``platforms`` fields are not specified
as a list.
.. _debug-setup-script:
Debugging the setup script


@ -27,7 +27,8 @@ your system setup; details are given in later chapters.
avoid writing C extensions and preserve portability to other implementations.
For example, if your use case is calling C library functions or system calls,
you should consider using the :mod:`ctypes` module or the `cffi
<https://cffi.readthedocs.org>`_ library rather than writing custom C code.
<https://cffi.readthedocs.io/>`_ library rather than writing
custom C code.
These modules let you write Python code to interface with C code and are more
portable between implementations of Python than writing and compiling a C
extension module.
@ -40,7 +41,7 @@ A Simple Example
Let's create an extension module called ``spam`` (the favorite food of Monty
Python fans...) and let's say we want to create a Python interface to the C
library function :c:func:`system`. [#]_ This function takes a null-terminated
library function :c:func:`system` [#]_. This function takes a null-terminated
character string as argument and returns an integer. We want this function to
be callable from Python as follows::
@ -917,7 +918,7 @@ It is also possible to :dfn:`borrow` [#]_ a reference to an object. The
borrower of a reference should not call :c:func:`Py_DECREF`. The borrower must
not hold on to the object longer than the owner from which it was borrowed.
Using a borrowed reference after the owner has disposed of it risks using freed
memory and should be avoided completely. [#]_
memory and should be avoided completely [#]_.
The advantage of borrowing over owning a reference is that you don't need to
take care of disposing of the reference on all possible paths through the code
@ -1088,7 +1089,7 @@ checking.
The C function calling mechanism guarantees that the argument list passed to C
functions (``args`` in the examples) is never *NULL* --- in fact it guarantees
that it is always a tuple. [#]_
that it is always a tuple [#]_.
It is a severe error to ever let a *NULL* pointer "escape" to the Python user.


@ -32,7 +32,7 @@ approaches to creating C and C++ extensions for Python.
.. seealso::
`Python Packaging User Guide: Binary Extensions <https://packaging.python.org/en/latest/extensions/>`_
`Python Packaging User Guide: Binary Extensions <https://packaging.python.org/guides/packaging-binary-extensions/>`_
The Python Packaging User Guide not only covers several available
tools that simplify the creation of binary extensions, but also
discusses the various reasons why creating an extension module may be


@ -659,7 +659,7 @@ Fortunately, Python's cyclic-garbage collector will eventually figure out that
the list is garbage and free it.
In the second version of the :class:`Noddy` example, we allowed any kind of
object to be stored in the :attr:`first` or :attr:`last` attributes. [#]_ This
object to be stored in the :attr:`first` or :attr:`last` attributes [#]_. This
means that :class:`Noddy` objects can participate in cycles::
>>> import noddy2


@ -343,7 +343,7 @@ each Python stack frame. Also, extensions can call back into Python at almost
random moments. Therefore, a complete threads implementation requires thread
support for C.
Answer 2: Fortunately, there is `Stackless Python <http://www.stackless.com>`_,
Answer 2: Fortunately, there is `Stackless Python <https://bitbucket.org/stackless-dev/stackless/wiki/Home>`_,
which has a completely redesigned interpreter loop that avoids the C stack.


@ -53,7 +53,7 @@ with a tool such as `SWIG <http://www.swig.org>`_. `SIP
<https://riverbankcomputing.com/software/sip/intro>`__, `CXX
<http://cxx.sourceforge.net/>`_ `Boost
<http://www.boost.org/libs/python/doc/index.html>`_, or `Weave
<https://scipy.github.io/devdocs/tutorial/weave.html>`_ are also
<https://github.com/scipy/weave>`_ are also
alternatives for wrapping C++ libraries.


@ -272,7 +272,7 @@ The Python project's infrastructure is located all over the world.
`www.python.org <https://www.python.org>`_ is graciously hosted by `Rackspace
<https://www.rackspace.com>`_, with CDN caching provided by `Fastly
<https://www.fastly.com>`_. `Upfront Systems
<http://www.upfrontsystems.co.za/>`_ hosts `bugs.python.org
<http://www.upfrontsoftware.co.za>`_ hosts `bugs.python.org
<https://bugs.python.org>`_. Many other Python services like `the Wiki
<https://wiki.python.org>`_ are hosted by `Oregon State
University Open Source Lab <https://osuosl.org>`_.


@ -43,7 +43,7 @@ number of platforms, with Windows, Mac OS X, GTK, X11, all listed as
current stable targets. Language bindings are available for a number
of languages including Python, Perl, Ruby, etc.
wxPython (http://www.wxpython.org) is the Python binding for
`wxPython <https://www.wxpython.org>`_ is the Python binding for
wxwidgets. While it often lags slightly behind the official wxWidgets
releases, it also offers a number of features via pure Python
extensions that are not available in other language bindings. There
@ -72,9 +72,9 @@ Gtk+
The `GObject introspection bindings <https://wiki.gnome.org/Projects/PyGObject>`_
for Python allow you to write GTK+ 3 applications. There is also a
`Python GTK+ 3 Tutorial <https://python-gtk-3-tutorial.readthedocs.org/en/latest/>`_.
`Python GTK+ 3 Tutorial <https://python-gtk-3-tutorial.readthedocs.io>`_.
The older PyGtk bindings for the `Gtk+ 2 toolkit <http://www.gtk.org>`_ have
The older PyGtk bindings for the `Gtk+ 2 toolkit <https://www.gtk.org>`_ have
been implemented by James Henstridge; see <http://www.pygtk.org>.
Kivy


@ -419,7 +419,7 @@ Python program effectively only uses one CPU, due to the insistence that
Back in the days of Python 1.5, Greg Stein actually implemented a comprehensive
patch set (the "free threading" patches) that removed the GIL and replaced it
with fine-grained locking. Adam Olsen recently did a similar experiment
in his `python-safethread <http://code.google.com/p/python-safethread/>`_
in his `python-safethread <https://code.google.com/archive/p/python-safethread>`_
project. Unfortunately, both experiments exhibited a sharp drop in single-thread
performance (at least 30% slower), due to the amount of fine-grained locking
necessary to compensate for the removal of the GIL.


@ -100,7 +100,7 @@ which don't. One is Thomas Heller's py2exe (Windows only) at
http://www.py2exe.org/
Another tool is Anthony Tuininga's `cx_Freeze <http://cx-freeze.sourceforge.net/>`_.
Another tool is Anthony Tuininga's `cx_Freeze <https://anthony-tuininga.github.io/cx_Freeze/>`_.
Are there coding standards or a style guide for Python programs?


@ -170,8 +170,8 @@ offender.
How do I make an executable from a Python script?
-------------------------------------------------
See http://cx-freeze.sourceforge.net/ for a distutils extension that allows you
to create console and GUI executables from Python code.
See `cx_Freeze <https://anthony-tuininga.github.io/cx_Freeze/>`_ for a distutils extension
that allows you to create console and GUI executables from Python code.
`py2exe <http://www.py2exe.org/>`_, the most popular extension for building
Python 2.x-based executables, does not yet support Python 3 but a version that
does is in development.


@ -126,7 +126,7 @@ Glossary
BDFL
Benevolent Dictator For Life, a.k.a. `Guido van Rossum
<https://www.python.org/~guido/>`_, Python's creator.
<https://gvanrossum.github.io/>`_, Python's creator.
binary file
A :term:`file object` able to read and write
@ -372,9 +372,11 @@ Glossary
may be accessed via the :attr:`__annotations__` special attribute of a
function object.
Python itself does not assign any particular meaning to function
annotations. They are intended to be interpreted by third-party libraries
or tools. See :pep:`3107`, which describes some of their potential uses.
See also the :term:`variable annotation` glossary entry.
Annotations are meant to provide a standard way for programmers to
document types of functions they design. See :pep:`484`, which
describes this functionality.
__future__
A pseudo-module which programmers can use to enable new language features
@ -391,7 +393,8 @@ Glossary
garbage collection
The process of freeing memory when it is not used anymore. Python
performs garbage collection via reference counting and a cyclic garbage
collector that is able to detect and break reference cycles.
collector that is able to detect and break reference cycles. The
garbage collector can be controlled using the :mod:`gc` module.
.. index:: single: generator
@ -458,6 +461,12 @@ Glossary
is believed that overcoming this performance issue would make the
implementation much more complicated and therefore costlier to maintain.
hash-based pyc
A bytecode cache file that uses the hash rather than the last-modified
time of the corresponding source file to determine its validity. See
:ref:`pyc-invalidation`.
hashable
An object is *hashable* if it has a hash value which never changes during
its lifetime (it needs a :meth:`__hash__` method), and can be compared to
@ -1014,10 +1023,11 @@ Glossary
attribute of a class or module object and can be accessed using
:func:`typing.get_type_hints`.
Python itself does not assign any particular meaning to variable
annotations. They are intended to be interpreted by third-party libraries
or type checking tools. See :pep:`526`, :pep:`484` which describe
some of their potential uses.
See also the :term:`function annotation` glossary entry.
Annotations are meant to provide a standard way for programmers to
document types of functions they design. See :pep:`484` and :pep:`526`
which describe this functionality.
virtual environment
A cooperatively isolated runtime environment that allows Python users


@ -543,7 +543,7 @@ learn more about submitting patches to Python.
* `Writing Programs with NCURSES <http://invisible-island.net/ncurses/ncurses-intro.html>`_:
a lengthy tutorial for C programmers.
* `The ncurses man page <http://linux.die.net/man/3/ncurses>`_
* `The ncurses man page <https://linux.die.net/man/3/ncurses>`_
* `The ncurses FAQ <http://invisible-island.net/ncurses/ncurses.faq.html>`_
* `"Use curses... don't swear" <https://www.youtube.com/watch?v=eN1eZtjLEnU>`_:
video of a PyCon 2013 talk on controlling terminals using curses or Urwid.


@ -941,7 +941,7 @@ Using file rotation
-------------------
.. sectionauthor:: Doug Hellmann, Vinay Sajip (changes)
.. (see <http://blog.doughellmann.com/2007/05/pymotw-logging.html>)
.. (see <https://pymotw.com/3/logging/>)
Sometimes you want to let a log file grow to a certain size, then open a new
file and log to that. You may want to keep a certain number of these files, and


@ -314,7 +314,7 @@ favourite beverage and carry on.
If your logging needs are simple, then use the above examples to incorporate
logging into your own scripts, and if you run into problems or don't
understand something, please post a question on the comp.lang.python Usenet
group (available at https://groups.google.com/group/comp.lang.python) and you
group (available at https://groups.google.com/forum/#!forum/comp.lang.python) and you
should receive help before too long.
Still here? You can carry on reading the next few sections, which provide a

View file

@ -433,12 +433,12 @@ to make sure everything functions as expected in both versions of Python.
.. _Futurize: http://python-future.org/automatic_conversion.html
.. _importlib: https://docs.python.org/3/library/importlib.html#module-importlib
.. _importlib2: https://pypi.python.org/pypi/importlib2
.. _Modernize: https://python-modernize.readthedocs.org/en/latest/
.. _Modernize: https://python-modernize.readthedocs.io/
.. _mypy: http://mypy-lang.org/
.. _Porting to Python 3: http://python3porting.com/
.. _Pylint: https://pypi.python.org/pypi/pylint
.. _Python 3 Q & A: https://ncoghlan-devs-python-notes.readthedocs.org/en/latest/python3/questions_and_answers.html
.. _Python 3 Q & A: https://ncoghlan-devs-python-notes.readthedocs.io/en/latest/python3/questions_and_answers.html
.. _pytype: https://github.com/google/pytype
.. _python-future: http://python-future.org/
@ -449,4 +449,4 @@ to make sure everything functions as expected in both versions of Python.
.. _"What's New": https://docs.python.org/3/whatsnew/index.html
.. _Why Python 3 exists: http://www.snarky.ca/why-python-3-exists
.. _Why Python 3 exists: https://snarky.ca/why-python-3-exists

View file

@ -289,6 +289,8 @@ Putting REs in strings keeps the Python language simpler, but has one
disadvantage which is the topic of the next section.
.. _the-backslash-plague:
The Backslash Plague
--------------------
@ -327,6 +329,13 @@ backslashes are not handled in any special way in a string literal prefixed with
while ``"\n"`` is a one-character string containing a newline. Regular
expressions will often be written in Python code using this raw string notation.
In addition, special escape sequences that are valid in regular expressions,
but not valid as Python string literals, now result in a
:exc:`DeprecationWarning` and will eventually become a :exc:`SyntaxError`,
which means the sequences will be invalid if raw string notation or escaping
the backslashes isn't used.
+-------------------+------------------+
| Regular String | Raw string |
+===================+==================+
@ -457,10 +466,16 @@ In actual programs, the most common style is to store the
Two pattern methods return all of the matches for a pattern.
:meth:`~re.Pattern.findall` returns a list of matching strings::
>>> p = re.compile('\d+')
>>> p = re.compile(r'\d+')
>>> p.findall('12 drummers drumming, 11 pipers piping, 10 lords a-leaping')
['12', '11', '10']
The ``r`` prefix, making the literal a raw string literal, is needed in this
example because escape sequences in a normal "cooked" string literal that are
not recognized by Python, as opposed to regular expressions, now result in a
:exc:`DeprecationWarning` and will eventually become a :exc:`SyntaxError`. See
:ref:`the-backslash-plague`.
:meth:`~re.Pattern.findall` has to create the entire list before it can be returned as the
result. The :meth:`~re.Pattern.finditer` method returns a sequence of
:ref:`match object <match-objects>` instances as an :term:`iterator`::
@ -844,7 +859,7 @@ backreferences in a RE.
For example, the following RE detects doubled words in a string. ::
>>> p = re.compile(r'(\b\w+)\s+\1')
>>> p = re.compile(r'\b(\w+)\s+\1\b')
>>> p.search('Paris in the the spring').group()
'the the'
@ -943,9 +958,9 @@ number of the group. There's naturally a variant that uses the group name
instead of the number. This is another Python extension: ``(?P=name)`` indicates
that the contents of the group called *name* should again be matched at the
current point. The regular expression for finding doubled words,
``(\b\w+)\s+\1`` can also be written as ``(?P<word>\b\w+)\s+(?P=word)``::
``\b(\w+)\s+\1\b`` can also be written as ``\b(?P<word>\w+)\s+(?P=word)\b``::
>>> p = re.compile(r'(?P<word>\b\w+)\s+(?P=word)')
>>> p = re.compile(r'\b(?P<word>\w+)\s+(?P=word)\b')
>>> p.search('Paris in the the spring').group()
'the the'
@ -1096,11 +1111,11 @@ following calls::
The module-level function :func:`re.split` adds the RE to be used as the first
argument, but is otherwise the same. ::
>>> re.split('[\W]+', 'Words, words, words.')
>>> re.split(r'[\W]+', 'Words, words, words.')
['Words', 'words', 'words', '']
>>> re.split('([\W]+)', 'Words, words, words.')
>>> re.split(r'([\W]+)', 'Words, words, words.')
['Words', ', ', 'words', ', ', 'words', '.', '']
>>> re.split('[\W]+', 'Words, words, words.', 1)
>>> re.split(r'[\W]+', 'Words, words, words.', 1)
['Words', 'words, words.']
@ -1140,12 +1155,12 @@ new string value and the number of replacements that were performed::
>>> p.subn('colour', 'no colours at all')
('no colours at all', 0)
Empty matches are replaced only when they're not adjacent to a previous match.
Empty matches are replaced only when they're not adjacent to a previous empty match.
::
>>> p = re.compile('x*')
>>> p.sub('-', 'abxd')
'-a-b-d-'
'-a-b--d-'
If *replacement* is a string, any backslash escapes in it are processed. That
is, ``\n`` is converted to a single newline character, ``\r`` is converted to a

View file

@ -214,10 +214,10 @@ difficult reading. `A chronology <http://www.unicode.org/history/>`_ of the
origin and development of Unicode is also available on the site.
To help understand the standard, Jukka Korpela has written `an introductory
guide <https://www.cs.tut.fi/~jkorpela/unicode/guide.html>`_ to reading the
guide <http://jkorpela.fi/unicode/guide.html>`_ to reading the
Unicode character tables.
Another `good introductory article <http://www.joelonsoftware.com/articles/Unicode.html>`_
Another `good introductory article <https://www.joelonsoftware.com/2003/10/08/the-absolute-minimum-every-software-developer-absolutely-positively-must-know-about-unicode-and-character-sets-no-excuses/>`_
was written by Joel Spolsky.
If this introduction didn't make things clear to you, you should try
reading this alternate article before continuing.
@ -463,7 +463,7 @@ The string in this example has the number 57 written in both Thai and
Arabic numerals::
import re
p = re.compile('\d+')
p = re.compile(r'\d+')
s = "Over \u0e55\u0e57 57 flavours"
m = p.search(s)
@ -487,7 +487,7 @@ References
Some good alternative discussions of Python's Unicode support are:
* `Processing Text Files in Python 3 <http://python-notes.curiousefficiency.org/en/latest/python3/text_file_processing.html>`_, by Nick Coghlan.
* `Pragmatic Unicode <http://nedbatchelder.com/text/unipain.html>`_, a PyCon 2012 presentation by Ned Batchelder.
* `Pragmatic Unicode <https://nedbatchelder.com/text/unipain.html>`_, a PyCon 2012 presentation by Ned Batchelder.
The :class:`str` type is described in the Python library reference at
:ref:`textseq`.

View file

@ -403,7 +403,7 @@ fetched, particularly the headers sent by the server. It is currently an
:class:`http.client.HTTPMessage` instance.
Typical headers include 'Content-length', 'Content-type', and so on. See the
`Quick Reference to HTTP Headers <https://www.cs.tut.fi/~jkorpela/http.html>`_
`Quick Reference to HTTP Headers <http://jkorpela.fi/http.html>`_
for a useful listing of HTTP headers with brief explanations of their meaning
and use.

View file

@ -21,7 +21,7 @@ print('To:', msg['to'])
print('From:', msg['from'])
print('Subject:', msg['subject'])
# If we want to print a priview of the message content, we can extract whatever
# If we want to print a preview of the message content, we can extract whatever
# the least formatted payload is and print the first three lines. Of course,
# if the message has no plain text part printing the first three lines of html
# is probably useless, but this is just a conceptual example.

View file

@ -17,7 +17,7 @@ Shoddy_increment(Shoddy *self, PyObject *unused)
static PyMethodDef Shoddy_methods[] = {
{"increment", (PyCFunction)Shoddy_increment, METH_NOARGS,
PyDoc_STR("increment state counter")},
{NULL, NULL},
{NULL},
};
static int

View file

@ -36,7 +36,7 @@ modules and extensions.
This guide only covers the basic tools for building and distributing
extensions that are provided as part of this version of Python. Third party
tools offer easier to use and more secure alternatives. Refer to the `quick
recommendations section <https://packaging.python.org/en/latest/current/>`__
recommendations section <https://packaging.python.org/guides/tool-recommendations/>`__
in the Python Packaging User Guide for more information.

View file

@ -48,7 +48,7 @@ Key terms
repository of open source licensed packages made available for use by
other Python users.
* the `Python Packaging Authority
<https://www.pypa.io/en/latest/>`__ are the group of
<https://www.pypa.io/>`__ are the group of
developers and documentation authors responsible for the maintenance and
evolution of the standard packaging tools and the associated metadata and
file format standards. They maintain a variety of tools, documentation,

View file

@ -351,7 +351,7 @@ and off individually. They are described here in more detail.
================================== =============================================
From To
================================== =============================================
``operator.isCallable(obj)`` ``hasattr(obj, '__call__')``
``operator.isCallable(obj)`` ``callable(obj)``
``operator.sequenceIncludes(obj)`` ``operator.contains(obj)``
``operator.isSequenceType(obj)`` ``isinstance(obj, collections.abc.Sequence)``
``operator.isMappingType(obj)`` ``isinstance(obj, collections.abc.Mapping)``

View file

@ -90,6 +90,11 @@ language using this mechanism:
| generator_stop | 3.5.0b1 | 3.7 | :pep:`479`: |
| | | | *StopIteration handling inside generators* |
+------------------+-------------+--------------+---------------------------------------------+
| annotations | 3.7.0b1 | 4.0 | :pep:`563`: |
| | | | *Postponed evaluation of annotations* |
+------------------+-------------+--------------+---------------------------------------------+
.. XXX Adding a new entry? Remember to update simple_stmts.rst, too.
.. seealso::

View file

@ -261,5 +261,5 @@ and classes for traversing abstract syntax trees:
.. seealso::
`Green Tree Snakes <https://greentreesnakes.readthedocs.org/>`_, an external documentation resource, has good
`Green Tree Snakes <https://greentreesnakes.readthedocs.io/>`_, an external documentation resource, has good
details on working with Python ASTs.

View file

@ -81,12 +81,11 @@ is called.
If you wait for a future, you should check early if the future was cancelled to
avoid useless operations. Example::
@coroutine
def slow_operation(fut):
async def slow_operation(fut):
if fut.cancelled():
return
# ... slow computation ...
yield from fut
await fut
# ...
The :func:`shield` function can also be used to ignore cancellation.
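For instance, a minimal sketch (reusing the ``slow_operation`` coroutine above
purely for illustration) of shielding an inner operation from an outer
cancellation::

    async def outer(fut):
        try:
            # Cancelling outer() interrupts only this await; the shielded
            # slow_operation() keeps running.
            return await asyncio.shield(slow_operation(fut))
        except asyncio.CancelledError:
            return None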
@ -99,7 +98,7 @@ Concurrency and multithreading
An event loop runs in a thread and executes all callbacks and tasks in the same
thread. While a task is running in the event loop, no other task is running in
the same thread. But when the task uses ``yield from``, the task is suspended
the same thread. But when the task uses ``await``, the task is suspended
and the event loop executes the next task.
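A short sketch of this interleaving (the coroutine names are invented for the
example): each ``await`` is a suspension point at which the loop may run the
other task::

    import asyncio

    async def worker(name):
        for i in range(2):
            print(name, i)
            await asyncio.sleep(0)   # suspension point: the other task can run

    async def main():
        await asyncio.gather(worker("a"), worker("b"))

    asyncio.run(main())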
To schedule a callback from a different thread, the
@ -192,8 +191,7 @@ Example with the bug::
import asyncio
@asyncio.coroutine
def test():
async def test():
print("never scheduled")
test()
@ -216,9 +214,9 @@ The fix is to call the :func:`ensure_future` function or the
Detect exceptions never consumed
--------------------------------
Python usually calls :func:`sys.displayhook` on unhandled exceptions. If
Python usually calls :func:`sys.excepthook` on unhandled exceptions. If
:meth:`Future.set_exception` is called, but the exception is never consumed,
:func:`sys.displayhook` is not called. Instead, :ref:`a log is emitted
:func:`sys.excepthook` is not called. Instead, :ref:`a log is emitted
<asyncio-logger>` when the future is deleted by the garbage collector, with the
traceback where the exception was raised.
@ -270,10 +268,9 @@ traceback where the task was created. Output in debug mode::
There are different options to fix this issue. The first option is to chain the
coroutine in another coroutine and use classic try/except::
@asyncio.coroutine
def handle_exception():
async def handle_exception():
try:
yield from bug()
await bug()
except Exception:
print("exception consumed")
@ -300,7 +297,7 @@ Chain coroutines correctly
--------------------------
When a coroutine function calls other coroutine functions and tasks, they
should be chained explicitly with ``yield from``. Otherwise, the execution is
should be chained explicitly with ``await``. Otherwise, the execution is
not guaranteed to be sequential.
Example with different bugs using :func:`asyncio.sleep` to simulate slow
@ -308,26 +305,22 @@ operations::
import asyncio
@asyncio.coroutine
def create():
yield from asyncio.sleep(3.0)
async def create():
await asyncio.sleep(3.0)
print("(1) create file")
@asyncio.coroutine
def write():
yield from asyncio.sleep(1.0)
async def write():
await asyncio.sleep(1.0)
print("(2) write into file")
@asyncio.coroutine
def close():
async def close():
print("(3) close file")
@asyncio.coroutine
def test():
async def test():
asyncio.ensure_future(create())
asyncio.ensure_future(write())
asyncio.ensure_future(close())
yield from asyncio.sleep(2.0)
await asyncio.sleep(2.0)
loop.stop()
loop = asyncio.get_event_loop()
@ -359,24 +352,22 @@ The loop stopped before the ``create()`` finished, ``close()`` has been called
before ``write()``, whereas coroutine functions were called in this order:
``create()``, ``write()``, ``close()``.
To fix the example, tasks must be marked with ``yield from``::
To fix the example, tasks must be marked with ``await``::
@asyncio.coroutine
def test():
yield from asyncio.ensure_future(create())
yield from asyncio.ensure_future(write())
yield from asyncio.ensure_future(close())
yield from asyncio.sleep(2.0)
async def test():
await asyncio.ensure_future(create())
await asyncio.ensure_future(write())
await asyncio.ensure_future(close())
await asyncio.sleep(2.0)
loop.stop()
Or without ``asyncio.ensure_future()``::
@asyncio.coroutine
def test():
yield from create()
yield from write()
yield from close()
yield from asyncio.sleep(2.0)
async def test():
await create()
await write()
await close()
await asyncio.sleep(2.0)
loop.stop()

View file

@ -171,7 +171,7 @@ a different clock than :func:`time.time`.
Arrange for the *callback* to be called after the given *delay*
seconds (either an int or float).
An instance of :class:`asyncio.Handle` is returned, which can be
An instance of :class:`asyncio.TimerHandle` is returned, which can be
used to cancel the callback.
*callback* will be called exactly once per call to :meth:`call_later`.
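As a minimal illustration (``loop`` is assumed to be a running event loop and
``hello`` a plain callable), the returned handle can be used to cancel the
call before the delay elapses::

    def hello():
        print("Hello, called later")

    handle = loop.call_later(2.0, hello)
    # Cancel the scheduled call if it is no longer needed:
    # handle.cancel()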
@ -193,7 +193,7 @@ a different clock than :func:`time.time`.
This method's behavior is the same as :meth:`call_later`.
An instance of :class:`asyncio.Handle` is returned, which can be
An instance of :class:`asyncio.TimerHandle` is returned, which can be
used to cancel the callback.
:ref:`Use functools.partial to pass keywords to the callback
@ -235,9 +235,6 @@ Tasks
interoperability. In this case, the result type is a subclass of
:class:`Task`.
This method was added in Python 3.4.2. Use the :func:`async` function to
support also older Python versions.
.. versionadded:: 3.4.2
.. method:: AbstractEventLoop.set_task_factory(factory)
@ -264,7 +261,7 @@ Tasks
Creating connections
--------------------
.. coroutinemethod:: AbstractEventLoop.create_connection(protocol_factory, host=None, port=None, \*, ssl=None, family=0, proto=0, flags=0, sock=None, local_addr=None, server_hostname=None)
.. coroutinemethod:: AbstractEventLoop.create_connection(protocol_factory, host=None, port=None, \*, ssl=None, family=0, proto=0, flags=0, sock=None, local_addr=None, server_hostname=None, ssl_handshake_timeout=None)
Create a streaming transport connection to a given Internet *host* and
*port*: socket family :py:data:`~socket.AF_INET` or
@ -272,9 +269,8 @@ Creating connections
socket type :py:data:`~socket.SOCK_STREAM`. *protocol_factory* must be a
callable returning a :ref:`protocol <asyncio-protocol>` instance.
This method is a :ref:`coroutine <coroutine>` which will try to
establish the connection in the background. When successful, the
coroutine returns a ``(transport, protocol)`` pair.
This method will try to establish the connection in the background.
When successful, it returns a ``(transport, protocol)`` pair.
The chronological synopsis of the underlying operation is as follows:
@ -329,6 +325,14 @@ Creating connections
to bind the socket to locally. The *local_host* and *local_port*
are looked up using getaddrinfo(), similarly to *host* and *port*.
* *ssl_handshake_timeout* is (for an SSL connection) the time in seconds
to wait for the SSL handshake to complete before aborting the connection.
``10.0`` seconds if ``None`` (default).
.. versionadded:: 3.7
The *ssl_handshake_timeout* parameter.
.. versionchanged:: 3.5
On Windows with :class:`ProactorEventLoop`, SSL/TLS is now supported.
@ -347,9 +351,8 @@ Creating connections
:py:data:`~socket.SOCK_DGRAM`. *protocol_factory* must be a
callable returning a :ref:`protocol <asyncio-protocol>` instance.
This method is a :ref:`coroutine <coroutine>` which will try to
establish the connection in the background. When successful, the
coroutine returns a ``(transport, protocol)`` pair.
This method will try to establish the connection in the background.
When successful, it returns a ``(transport, protocol)`` pair.
Options changing how the connection is created:
@ -391,30 +394,37 @@ Creating connections
:ref:`UDP echo server protocol <asyncio-udp-echo-server-protocol>` examples.
.. coroutinemethod:: AbstractEventLoop.create_unix_connection(protocol_factory, path, \*, ssl=None, sock=None, server_hostname=None)
.. coroutinemethod:: AbstractEventLoop.create_unix_connection(protocol_factory, path=None, \*, ssl=None, sock=None, server_hostname=None, ssl_handshake_timeout=None)
Create UNIX connection: socket family :py:data:`~socket.AF_UNIX`, socket
type :py:data:`~socket.SOCK_STREAM`. The :py:data:`~socket.AF_UNIX` socket
family is used to communicate between processes on the same machine
efficiently.
This method is a :ref:`coroutine <coroutine>` which will try to
establish the connection in the background. When successful, the
coroutine returns a ``(transport, protocol)`` pair.
This method will try to establish the connection in the background.
When successful, it returns a ``(transport, protocol)`` pair.
*path* is the name of a UNIX domain socket, and is required unless a *sock*
parameter is specified. Abstract UNIX sockets, :class:`str`, and
:class:`bytes` paths are supported.
parameter is specified. Abstract UNIX sockets, :class:`str`,
:class:`bytes`, and :class:`~pathlib.Path` paths are supported.
See the :meth:`AbstractEventLoop.create_connection` method for parameters.
Availability: UNIX.
.. versionadded:: 3.7
The *ssl_handshake_timeout* parameter.
.. versionchanged:: 3.7
The *path* parameter can now be a :class:`~pathlib.Path` object.
Creating listening connections
------------------------------
.. coroutinemethod:: AbstractEventLoop.create_server(protocol_factory, host=None, port=None, \*, family=socket.AF_UNSPEC, flags=socket.AI_PASSIVE, sock=None, backlog=100, ssl=None, reuse_address=None, reuse_port=None)
.. coroutinemethod:: AbstractEventLoop.create_server(protocol_factory, host=None, port=None, \*, family=socket.AF_UNSPEC, flags=socket.AI_PASSIVE, sock=None, backlog=100, ssl=None, reuse_address=None, reuse_port=None, ssl_handshake_timeout=None, start_serving=True)
Create a TCP server (socket type :data:`~socket.SOCK_STREAM`) bound to
*host* and *port*.
@ -458,7 +468,19 @@ Creating listening connections
set this flag when being created. This option is not supported on
Windows.
This method is a :ref:`coroutine <coroutine>`.
* *ssl_handshake_timeout* is (for an SSL server) the time in seconds to wait
for the SSL handshake to complete before aborting the connection.
``10.0`` seconds if ``None`` (default).
* *start_serving* set to ``True`` (the default) causes the created server
to start accepting connections immediately. When set to ``False``,
the user should await on :meth:`Server.start_serving` or
:meth:`Server.serve_forever` to make the server to start accepting
connections.
.. versionadded:: 3.7
*ssl_handshake_timeout* and *start_serving* parameters.
.. versionchanged:: 3.5
@ -474,16 +496,26 @@ Creating listening connections
The *host* parameter can now be a sequence of strings.
.. coroutinemethod:: AbstractEventLoop.create_unix_server(protocol_factory, path=None, \*, sock=None, backlog=100, ssl=None)
.. coroutinemethod:: AbstractEventLoop.create_unix_server(protocol_factory, path=None, \*, sock=None, backlog=100, ssl=None, ssl_handshake_timeout=None, start_serving=True)
Similar to :meth:`AbstractEventLoop.create_server`, but specific to the
socket family :py:data:`~socket.AF_UNIX`.
This method is a :ref:`coroutine <coroutine>`.
*path* is the name of a UNIX domain socket, and is required unless a *sock*
parameter is specified. Abstract UNIX sockets, :class:`str`,
:class:`bytes`, and :class:`~pathlib.Path` paths are supported.
Availability: UNIX.
.. coroutinemethod:: BaseEventLoop.connect_accepted_socket(protocol_factory, sock, \*, ssl=None)
.. versionadded:: 3.7
The *ssl_handshake_timeout* parameter.
.. versionchanged:: 3.7
The *path* parameter can now be a :class:`~pathlib.Path` object.
.. coroutinemethod:: BaseEventLoop.connect_accepted_socket(protocol_factory, sock, \*, ssl=None, ssl_handshake_timeout=None)
Handle an accepted connection.
@ -498,8 +530,81 @@ Creating listening connections
* *ssl* can be set to an :class:`~ssl.SSLContext` to enable SSL over the
accepted connections.
This method is a :ref:`coroutine <coroutine>`. When completed, the
coroutine returns a ``(transport, protocol)`` pair.
* *ssl_handshake_timeout* is (for an SSL connection) the time in seconds to
wait for the SSL handshake to complete before aborting the connection.
``10.0`` seconds if ``None`` (default).
When completed it returns a ``(transport, protocol)`` pair.
.. versionadded:: 3.7
The *ssl_handshake_timeout* parameter.
.. versionadded:: 3.5.3
File Transferring
-----------------
.. coroutinemethod:: AbstractEventLoop.sendfile(transport, file, \
offset=0, count=None, \
*, fallback=True)
Send a *file* to *transport* and return the total number of bytes
which were sent.
The method uses high-performance :meth:`os.sendfile` if available.
*file* must be a regular file object opened in binary mode.
*offset* tells from where to start reading the file. If specified,
*count* is the total number of bytes to transmit as opposed to
sending the file until EOF is reached. The file position is updated on
return, including in case of error, in which case :meth:`file.tell()
<io.IOBase.tell>` can be used to figure out the number of bytes
which were sent.
*fallback* set to ``True`` makes asyncio manually read and send
the file when the platform does not support the sendfile syscall
(e.g. Windows or SSL socket on Unix).
Raise :exc:`SendfileNotAvailableError` if the system does not support
the *sendfile* syscall and *fallback* is ``False``.
.. versionadded:: 3.7
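A rough sketch of the intended usage (``transport`` is assumed to come from an
earlier :meth:`create_connection` call and ``data.bin`` is a hypothetical
file name)::

    async def send_payload(loop, transport):
        with open('data.bin', 'rb') as f:
            # Falls back to a plain read/send loop when os.sendfile()
            # cannot be used for this transport.
            sent = await loop.sendfile(transport, f, fallback=True)
        return sent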
TLS Upgrade
-----------
.. coroutinemethod:: AbstractEventLoop.start_tls(transport, protocol, sslcontext, \*, server_side=False, server_hostname=None, ssl_handshake_timeout=None)
Upgrades an existing connection to TLS.
Returns a new transport instance that the *protocol* must start using
immediately after the *await*. The *transport* instance passed to
the *start_tls* method should never be used again.
Parameters:
* *transport* and *protocol* instances that methods like
:meth:`~AbstractEventLoop.create_server` and
:meth:`~AbstractEventLoop.create_connection` return.
* *sslcontext*: a configured instance of :class:`~ssl.SSLContext`.
* *server_side* pass ``True`` when a server-side connection is being
upgraded (like the one created by :meth:`~AbstractEventLoop.create_server`).
* *server_hostname*: sets or overrides the host name that the target
server's certificate will be matched against.
* *ssl_handshake_timeout* is (for an SSL connection) the time in seconds to
wait for the SSL handshake to complete before aborting the connection.
``10.0`` seconds if ``None`` (default).
.. versionadded:: 3.7
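A rough usage sketch, assuming ``transport`` and ``protocol`` were returned by
:meth:`~AbstractEventLoop.create_connection` and ``ctx`` is a configured
:class:`ssl.SSLContext`::

    async def upgrade(loop, transport, protocol, ctx):
        new_transport = await loop.start_tls(
            transport, protocol, ctx,
            server_hostname='example.com')
        # The old transport must not be used after this point; the
        # protocol should talk to new_transport instead.
        return new_transport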
Watch file descriptors
----------------------
@ -553,7 +658,10 @@ Low-level socket operations
With :class:`SelectorEventLoop` event loop, the socket *sock* must be
non-blocking.
This method is a :ref:`coroutine <coroutine>`.
.. versionchanged:: 3.7
Even though the method was always documented as a coroutine
method, before Python 3.7 it returned a :class:`Future`.
Since Python 3.7, this is an ``async def`` method.
.. coroutinemethod:: AbstractEventLoop.sock_recv_into(sock, buf)
@ -566,8 +674,6 @@ Low-level socket operations
With :class:`SelectorEventLoop` event loop, the socket *sock* must be
non-blocking.
This method is a :ref:`coroutine <coroutine>`.
.. versionadded:: 3.7
.. coroutinemethod:: AbstractEventLoop.sock_sendall(sock, data)
@ -584,7 +690,10 @@ Low-level socket operations
With :class:`SelectorEventLoop` event loop, the socket *sock* must be
non-blocking.
This method is a :ref:`coroutine <coroutine>`.
.. versionchanged:: 3.7
Even though the method was always documented as a coroutine
method, before Python 3.7 it returned a :class:`Future`.
Since Python 3.7, this is an ``async def`` method.
.. coroutinemethod:: AbstractEventLoop.sock_connect(sock, address)
@ -594,8 +703,6 @@ Low-level socket operations
With :class:`SelectorEventLoop` event loop, the socket *sock* must be
non-blocking.
This method is a :ref:`coroutine <coroutine>`.
.. versionchanged:: 3.5.2
``address`` no longer needs to be resolved. ``sock_connect``
will try to check if the *address* is already resolved by calling
@ -622,12 +729,45 @@ Low-level socket operations
The socket *sock* must be non-blocking.
This method is a :ref:`coroutine <coroutine>`.
.. versionchanged:: 3.7
Even though the method was always documented as a coroutine
method, before Python 3.7 it returned a :class:`Future`.
Since Python 3.7, this is an ``async def`` method.
.. seealso::
:meth:`AbstractEventLoop.create_server` and :func:`start_server`.
.. coroutinemethod:: AbstractEventLoop.sock_sendfile(sock, file, \
offset=0, count=None, \
*, fallback=True)
Send a file using high-performance :func:`os.sendfile` if possible
and return the total number of bytes which were sent.
Asynchronous version of :meth:`socket.socket.sendfile`.
*sock* must be a non-blocking :class:`~socket.socket` of
:const:`socket.SOCK_STREAM` type.
*file* must be a regular file object opened in binary mode.
*offset* tells from where to start reading the file. If specified,
*count* is the total number of bytes to transmit as opposed to
sending the file until EOF is reached. The file position is updated on
return, including in case of error, in which case :meth:`file.tell()
<io.IOBase.tell>` can be used to figure out the number of bytes
which were sent.
*fallback* set to ``True`` makes asyncio manually read and send
the file when the platform does not support the sendfile syscall
(e.g. Windows or SSL socket on Unix).
Raise :exc:`SendfileNotAvailableError` if the system does not support
the *sendfile* syscall and *fallback* is ``False``.
.. versionadded:: 3.7
Resolve host name
-----------------
@ -642,6 +782,12 @@ Resolve host name
This method is a :ref:`coroutine <coroutine>`, similar to
:meth:`socket.getnameinfo` function but non-blocking.
.. versionchanged:: 3.7
Both *getaddrinfo* and *getnameinfo* methods were always documented
to return a coroutine, but prior to Python 3.7 they were, in fact,
returning :class:`asyncio.Future` objects. Starting with Python 3.7
both methods are coroutines.
Connect pipes
-------------
@ -661,8 +807,6 @@ Use :class:`ProactorEventLoop` to support pipes on Windows.
With :class:`SelectorEventLoop` event loop, the *pipe* is set to
non-blocking mode.
This method is a :ref:`coroutine <coroutine>`.
.. coroutinemethod:: AbstractEventLoop.connect_write_pipe(protocol_factory, pipe)
Register write pipe in eventloop.
@ -675,8 +819,6 @@ Use :class:`ProactorEventLoop` to support pipes on Windows.
With :class:`SelectorEventLoop` event loop, the *pipe* is set to
non-blocking mode.
This method is a :ref:`coroutine <coroutine>`.
.. seealso::
The :meth:`AbstractEventLoop.subprocess_exec` and
@ -716,7 +858,7 @@ Call a function in an :class:`~concurrent.futures.Executor` (pool of threads or
pool of processes). By default, an event loop uses a thread pool executor
(:class:`~concurrent.futures.ThreadPoolExecutor`).
.. coroutinemethod:: AbstractEventLoop.run_in_executor(executor, func, \*args)
.. method:: AbstractEventLoop.run_in_executor(executor, func, \*args)
Arrange for a *func* to be called in the specified executor.
@ -726,7 +868,7 @@ pool of processes). By default, an event loop uses a thread pool executor
:ref:`Use functools.partial to pass keywords to the *func*
<asyncio-pass-keywords>`.
This method is a :ref:`coroutine <coroutine>`.
This method returns a :class:`asyncio.Future` object.
.. versionchanged:: 3.5.3
:meth:`BaseEventLoop.run_in_executor` no longer configures the
@ -827,8 +969,26 @@ Server
Server listening on sockets.
Object created by the :meth:`AbstractEventLoop.create_server` method and the
:func:`start_server` function. Don't instantiate the class directly.
Object created by :meth:`AbstractEventLoop.create_server`,
:meth:`AbstractEventLoop.create_unix_server`, :func:`start_server`,
and :func:`start_unix_server` functions. Don't instantiate the class
directly.
*Server* objects are asynchronous context managers. When used in an
``async with`` statement, it's guaranteed that the Server object is
closed and not accepting new connections when the ``async with``
statement is completed::
srv = await loop.create_server(...)
async with srv:
# some code
# At this point, srv is closed and no longer accepts new connections.
.. versionchanged:: 3.7
Server object is an asynchronous context manager since Python 3.7.
.. method:: close()
@ -841,17 +1001,74 @@ Server
The server is closed asynchronously, use the :meth:`wait_closed`
coroutine to wait until the server is closed.
.. method:: get_loop()
Return the event loop associated with the server object.
.. versionadded:: 3.7
.. coroutinemethod:: start_serving()
Start accepting connections.
This method is idempotent, so it can be called when
the server is already serving.
The new *start_serving* keyword-only parameter to
:meth:`AbstractEventLoop.create_server` and
:meth:`asyncio.start_server` allows creating a Server object
that is not accepting connections right away. In that case
this method, or :meth:`Server.serve_forever`, can be used
to make the Server object start accepting connections.
.. versionadded:: 3.7
.. coroutinemethod:: serve_forever()
Start accepting connections until the coroutine is cancelled.
Cancellation of the ``serve_forever`` task causes the server
to be closed.
This method can be called if the server is already accepting
connections. Only one ``serve_forever`` task can exist per
*Server* object.
Example::
async def client_connected(reader, writer):
# Communicate with the client with
# reader/writer streams. For example:
await reader.readline()
async def main(host, port):
srv = await asyncio.start_server(
client_connected, host, port)
await srv.serve_forever()
asyncio.run(main('127.0.0.1', 0))
.. versionadded:: 3.7
.. method:: is_serving()
Return ``True`` if the server is accepting new connections.
.. versionadded:: 3.7
.. coroutinemethod:: wait_closed()
Wait until the :meth:`close` method completes.
This method is a :ref:`coroutine <coroutine>`.
.. attribute:: sockets
List of :class:`socket.socket` objects the server is listening to, or
``None`` if the server is closed.
.. versionchanged:: 3.7
Prior to Python 3.7 ``Server.sockets`` used to return the
internal list of server's sockets directly. In 3.7 a copy
of that list is returned.
Handle
------
@ -859,8 +1076,7 @@ Handle
.. class:: Handle
A callback wrapper object returned by :func:`AbstractEventLoop.call_soon`,
:func:`AbstractEventLoop.call_soon_threadsafe`, :func:`AbstractEventLoop.call_later`,
and :func:`AbstractEventLoop.call_at`.
:func:`AbstractEventLoop.call_soon_threadsafe`.
.. method:: cancel()
@ -873,6 +1089,34 @@ Handle
.. versionadded:: 3.7
.. class:: TimerHandle
A callback wrapper object returned by :func:`AbstractEventLoop.call_later`,
and :func:`AbstractEventLoop.call_at`.
The class is inherited from :class:`Handle`.
.. method:: when()
Return a scheduled callback time as :class:`float` seconds.
The time is an absolute timestamp, using the same time
reference as :meth:`AbstractEventLoop.time`.
.. versionadded:: 3.7
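For example, the remaining delay of a scheduled callback can be estimated by
comparing the handle's scheduled time with the loop's clock (a small sketch,
assuming a running loop ``loop``)::

    handle = loop.call_later(30, print, "fired")
    remaining = handle.when() - loop.time()   # roughly 30.0 seconds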
SendfileNotAvailableError
-------------------------
.. exception:: SendfileNotAvailableError
Sendfile syscall is not available, subclass of :exc:`RuntimeError`.
Raised if the OS does not support the sendfile syscall for
the given socket or file type.
Event loop examples
-------------------
@ -952,10 +1196,7 @@ Wait until a file descriptor received some data using the
:meth:`AbstractEventLoop.add_reader` method and then close the event loop::
import asyncio
try:
from socket import socketpair
except ImportError:
from asyncio.windows_utils import socketpair
from socket import socketpair
# Create a pair of connected file descriptors
rsock, wsock = socketpair()

View file

@ -25,6 +25,13 @@ the execution of the process.
Equivalent to calling ``get_event_loop_policy().new_event_loop()``.
.. function:: get_running_loop()
Return the running event loop in the current OS thread. If there
is no running event loop a :exc:`RuntimeError` is raised.
.. versionadded:: 3.7
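A short sketch: inside a coroutine the running loop can be obtained directly,
without going through the event loop policy::

    import asyncio

    async def main():
        loop = asyncio.get_running_loop()
        print("running on", loop)

    asyncio.run(main())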
.. _asyncio-event-loops:
@ -189,10 +196,15 @@ An event loop policy must implement the following interface:
The default policy defines context as the current thread, and manages an event
loop per thread that interacts with :mod:`asyncio`. If the current thread
doesn't already have an event loop associated with it, the default policy's
:meth:`~AbstractEventLoopPolicy.get_event_loop` method creates one when
called from the main thread, but raises :exc:`RuntimeError` otherwise.
loop per thread that interacts with :mod:`asyncio`. An exception to this rule
happens when :meth:`~AbstractEventLoopPolicy.get_event_loop` is called from a
running future/coroutine, in which case it will return the current loop
running that future/coroutine.
If the current thread doesn't already have an event loop associated with it,
the default policy's :meth:`~AbstractEventLoopPolicy.get_event_loop` method
creates one when called from the main thread, but raises :exc:`RuntimeError`
otherwise.
Access to the global loop policy

View file

@ -118,17 +118,31 @@ ReadTransport
Interface for read-only transports.
.. method:: is_reading()
Return ``True`` if the transport is receiving new data.
.. versionadded:: 3.7
.. method:: pause_reading()
Pause the receiving end of the transport. No data will be passed to
the protocol's :meth:`data_received` method until :meth:`resume_reading`
is called.
.. versionchanged:: 3.7
The method is idempotent, i.e. it can be called when the
transport is already paused or closed.
.. method:: resume_reading()
Resume the receiving end. The protocol's :meth:`data_received` method
will be called once again if some data is available for reading.
.. versionchanged:: 3.7
The method is idempotent, i.e. it can be called when the
transport is already reading.
WriteTransport
--------------
@ -319,6 +333,16 @@ Protocol classes
The base class for implementing streaming protocols (for use with
e.g. TCP and SSL transports).
.. class:: BufferedProtocol
A base class for implementing streaming protocols with manual
control of the receive buffer.
.. versionadded:: 3.7
**Important:** this has been added to asyncio in Python 3.7
*on a provisional basis*! Treat it as an experimental API that
might be changed or removed in Python 3.8.
.. class:: DatagramProtocol
The base class for implementing datagram protocols (for use with
@ -414,10 +438,68 @@ and, if called, :meth:`data_received` won't be called after it.
State machine:
start -> :meth:`~BaseProtocol.connection_made`
[-> :meth:`~Protocol.data_received` \*]
[-> :meth:`~Protocol.eof_received` ?]
-> :meth:`~BaseProtocol.connection_lost` -> end
.. code-block:: none
start -> connection_made
[-> data_received]*
[-> eof_received]?
-> connection_lost -> end
Streaming protocols with manual receive buffer control
------------------------------------------------------
.. versionadded:: 3.7
**Important:** :class:`BufferedProtocol` has been added to
asyncio in Python 3.7 *on a provisional basis*! Consider it as an
experimental API that might be changed or removed in Python 3.8.
Event methods, such as :meth:`AbstractEventLoop.create_server` and
:meth:`AbstractEventLoop.create_connection`, accept factories that
return protocols that implement this interface.
The idea of BufferedProtocol is that it allows manually allocating
and controlling the receive buffer. Event loops can then use the buffer
provided by the protocol to avoid unnecessary data copies. This
can result in a noticeable performance improvement for protocols that
receive large amounts of data. Sophisticated protocols can allocate
the buffer only once at creation time.
The following callbacks are called on :class:`BufferedProtocol`
instances:
.. method:: BufferedProtocol.get_buffer()
Called to allocate a new receive buffer. Must return an object
that implements the :ref:`buffer protocol <bufferobjects>`.
.. method:: BufferedProtocol.buffer_updated(nbytes)
Called when the buffer was updated with the received data.
*nbytes* is the total number of bytes that were written to the buffer.
.. method:: BufferedProtocol.eof_received()
See the documentation of the :meth:`Protocol.eof_received` method.
:meth:`get_buffer` can be called an arbitrary number of times during
a connection. However, :meth:`eof_received` is called at most once
and, if called, :meth:`get_buffer` and :meth:`buffer_updated`
won't be called after it.
State machine:
.. code-block:: none
start -> connection_made
[-> get_buffer
[-> buffer_updated]?
]*
[-> eof_received]?
-> connection_lost -> end
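A minimal sketch of a protocol that reuses a single pre-allocated buffer (the
class name and the buffer size are arbitrary choices for this example)::

    class EchoBufferedProtocol(asyncio.BufferedProtocol):
        def __init__(self):
            self._buffer = bytearray(65536)   # allocated once

        def get_buffer(self):
            # The event loop writes received data directly into this buffer.
            return self._buffer

        def buffer_updated(self, nbytes):
            # Only the first nbytes bytes of the buffer hold new data.
            data = bytes(self._buffer[:nbytes])
            print("received", len(data), "bytes")

        def eof_received(self):
            pass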
Datagram protocols
@ -488,8 +570,9 @@ Coroutines can be scheduled in a protocol method using :func:`ensure_future`,
but there is no guarantee made about the execution order. Protocols are not
aware of coroutines created in protocol methods and so will not wait for them.
To have a reliable execution order, use :ref:`stream objects <asyncio-streams>` in a
coroutine with ``yield from``. For example, the :meth:`StreamWriter.drain`
To have a reliable execution order,
use :ref:`stream objects <asyncio-streams>` in a
coroutine with ``await``. For example, the :meth:`StreamWriter.drain`
coroutine can be used to wait until the write buffer is flushed.
@ -589,7 +672,7 @@ received data and close the connection::
:meth:`Transport.close` can be called immediately after
:meth:`WriteTransport.write` even if data are not sent yet on the socket: both
methods are asynchronous. ``yield from`` is not needed because these transport
methods are asynchronous. ``await`` is not needed because these transport
methods are not coroutines.
.. seealso::
@ -690,10 +773,7 @@ Wait until a socket receives data using the
the event loop ::
import asyncio
try:
from socket import socketpair
except ImportError:
from asyncio.windows_utils import socketpair
from socket import socketpair
# Create a pair of connected sockets
rsock, wsock = socketpair()

View file

@ -24,7 +24,7 @@ Queue
A queue, useful for coordinating producer and consumer coroutines.
If *maxsize* is less than or equal to zero, the queue size is infinite. If
it is an integer greater than ``0``, then ``yield from put()`` will block
it is an integer greater than ``0``, then ``await put()`` will block
when the queue reaches *maxsize*, until an item is removed by :meth:`get`.
Unlike the standard library :mod:`queue`, you can reliably know this Queue's

View file

@ -201,6 +201,21 @@ StreamWriter
Close the transport: see :meth:`BaseTransport.close`.
.. method:: is_closing()
Return ``True`` if the writer is closing or is closed.
.. versionadded:: 3.7
.. coroutinemethod:: wait_closed()
Wait until the writer is closed.
Should be called after :meth:`close` to wait until the underlying
connection (and the associated transport/protocol pair) is closed.
.. versionadded:: 3.7
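The intended pattern is roughly (``writer`` is assumed to be an open
:class:`StreamWriter`)::

    writer.close()
    await writer.wait_closed()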
.. coroutinemethod:: drain()
Give the write buffer of the underlying transport a chance to be flushed.
@ -208,7 +223,7 @@ StreamWriter
The intended use is to write::
w.write(data)
yield from w.drain()
await w.drain()
When the size of the transport buffer reaches the high-water limit (the
protocol is paused), block until the size of the buffer is drained down
@ -301,15 +316,14 @@ TCP echo client using the :func:`asyncio.open_connection` function::
import asyncio
@asyncio.coroutine
def tcp_echo_client(message, loop):
reader, writer = yield from asyncio.open_connection('127.0.0.1', 8888,
loop=loop)
async def tcp_echo_client(message, loop):
reader, writer = await asyncio.open_connection('127.0.0.1', 8888,
loop=loop)
print('Send: %r' % message)
writer.write(message.encode())
data = yield from reader.read(100)
data = await reader.read(100)
print('Received: %r' % data.decode())
print('Close the socket')
@ -335,16 +349,15 @@ TCP echo server using the :func:`asyncio.start_server` function::
import asyncio
@asyncio.coroutine
def handle_echo(reader, writer):
data = yield from reader.read(100)
async def handle_echo(reader, writer):
data = await reader.read(100)
message = data.decode()
addr = writer.get_extra_info('peername')
print("Received %r from %r" % (message, addr))
print("Send: %r" % message)
writer.write(data)
yield from writer.drain()
await writer.drain()
print("Close the client socket")
writer.close()
@ -387,13 +400,13 @@ Simple example querying HTTP headers of the URL passed on the command line::
connect = asyncio.open_connection(url.hostname, 443, ssl=True)
else:
connect = asyncio.open_connection(url.hostname, 80)
reader, writer = yield from connect
reader, writer = await connect
query = ('HEAD {path} HTTP/1.0\r\n'
'Host: {hostname}\r\n'
'\r\n').format(path=url.path or '/', hostname=url.hostname)
writer.write(query.encode('latin-1'))
while True:
line = yield from reader.readline()
line = await reader.readline()
if not line:
break
line = line.decode('latin1').rstrip()
@ -426,24 +439,20 @@ Coroutine waiting until a socket receives data using the
:func:`open_connection` function::
import asyncio
try:
from socket import socketpair
except ImportError:
from asyncio.windows_utils import socketpair
from socket import socketpair
@asyncio.coroutine
def wait_for_data(loop):
async def wait_for_data(loop):
# Create a pair of connected sockets
rsock, wsock = socketpair()
# Register the open socket to wait for data
reader, writer = yield from asyncio.open_connection(sock=rsock, loop=loop)
reader, writer = await asyncio.open_connection(sock=rsock, loop=loop)
# Simulate the reception of data from the network
loop.call_soon(wsock.send, 'abc'.encode())
# Wait for data
data = yield from reader.read(100)
data = await reader.read(100)
# Got data, we are done: close the socket
print("Received:", data.decode())

View file

@ -347,21 +347,20 @@ wait for the subprocess exit. The subprocess is created by the
def process_exited(self):
self.exit_future.set_result(True)
@asyncio.coroutine
def get_date(loop):
async def get_date(loop):
code = 'import datetime; print(datetime.datetime.now())'
exit_future = asyncio.Future(loop=loop)
# Create the subprocess controlled by the protocol DateProtocol,
# redirect the standard output into a pipe
create = loop.subprocess_exec(lambda: DateProtocol(exit_future),
sys.executable, '-c', code,
stdin=None, stderr=None)
transport, protocol = yield from create
transport, protocol = await loop.subprocess_exec(
lambda: DateProtocol(exit_future),
sys.executable, '-c', code,
stdin=None, stderr=None)
# Wait for the subprocess exit using the process_exited() method
# of the protocol
yield from exit_future
await exit_future
# Close the stdout pipe
transport.close()
@ -398,16 +397,16 @@ function::
code = 'import datetime; print(datetime.datetime.now())'
# Create the subprocess, redirect the standard output into a pipe
create = asyncio.create_subprocess_exec(sys.executable, '-c', code,
stdout=asyncio.subprocess.PIPE)
proc = yield from create
proc = await asyncio.create_subprocess_exec(
sys.executable, '-c', code,
stdout=asyncio.subprocess.PIPE)
# Read one line of output
data = yield from proc.stdout.readline()
data = await proc.stdout.readline()
line = data.decode('ascii').rstrip()
# Wait for the subprocess exit
yield from proc.wait()
await proc.wait()
return line
if sys.platform == "win32":

View file

@ -23,11 +23,9 @@ module (:class:`~threading.Lock`, :class:`~threading.Event`,
:class:`~threading.BoundedSemaphore`), but it has no *timeout* parameter. The
:func:`asyncio.wait_for` function can be used to cancel a task after a timeout.
Locks
-----
Lock
^^^^
----
.. class:: Lock(\*, loop=None)
@ -37,8 +35,9 @@ Lock
particular coroutine when locked. A primitive lock is in one of two states,
'locked' or 'unlocked'.
It is created in the unlocked state. It has two basic methods, :meth:`acquire`
and :meth:`release`. When the state is unlocked, acquire() changes the state to
The lock is created in the unlocked state.
It has two basic methods, :meth:`acquire` and :meth:`release`.
When the state is unlocked, acquire() changes the state to
locked and returns immediately. When the state is locked, acquire() blocks
until a call to release() in another coroutine changes it to unlocked, then
the acquire() call resets it to locked and returns. The release() method
@ -51,38 +50,12 @@ Lock
resets the state to unlocked; first coroutine which is blocked in acquire()
is being processed.
:meth:`acquire` is a coroutine and should be called with ``yield from``.
:meth:`acquire` is a coroutine and should be called with ``await``.
Locks also support the context management protocol. ``(yield from lock)``
should be used as the context manager expression.
Locks support the :ref:`context management protocol <async-with-locks>`.
This class is :ref:`not thread safe <asyncio-multithreading>`.
Usage::
lock = Lock()
...
yield from lock
try:
...
finally:
lock.release()
Context manager usage::
lock = Lock()
...
with (yield from lock):
...
Lock objects can be tested for locking state::
if not lock.locked():
yield from lock
else:
# lock is acquired
...
.. method:: locked()
Return ``True`` if the lock is acquired.
@ -110,7 +83,7 @@ Lock
Event
^^^^^
-----
.. class:: Event(\*, loop=None)
@ -151,7 +124,7 @@ Event
Condition
^^^^^^^^^
---------
.. class:: Condition(lock=None, \*, loop=None)
@ -166,6 +139,9 @@ Condition
object, and it is used as the underlying lock. Otherwise,
a new :class:`Lock` object is created and used as the underlying lock.
Conditions support the :ref:`context management protocol
<async-with-locks>`.
This class is :ref:`not thread safe <asyncio-multithreading>`.
.. coroutinemethod:: acquire()
@ -239,11 +215,8 @@ Condition
This method is a :ref:`coroutine <coroutine>`.
Semaphores
----------
Semaphore
^^^^^^^^^
---------
.. class:: Semaphore(value=1, \*, loop=None)
@ -254,12 +227,13 @@ Semaphore
counter can never go below zero; when :meth:`acquire` finds that it is zero,
it blocks, waiting until some other coroutine calls :meth:`release`.
Semaphores also support the context management protocol.
The optional argument gives the initial value for the internal counter; it
defaults to ``1``. If the value given is less than ``0``, :exc:`ValueError`
is raised.
Semaphores support the :ref:`context management protocol
<async-with-locks>`.
This class is :ref:`not thread safe <asyncio-multithreading>`.
.. coroutinemethod:: acquire()
@ -285,7 +259,7 @@ Semaphore
BoundedSemaphore
^^^^^^^^^^^^^^^^
----------------
.. class:: BoundedSemaphore(value=1, \*, loop=None)
@ -293,3 +267,39 @@ BoundedSemaphore
This raises :exc:`ValueError` in :meth:`~Semaphore.release` if it would
increase the value above the initial value.
Bounded semaphores support the :ref:`context management
protocol <async-with-locks>`.
This class is :ref:`not thread safe <asyncio-multithreading>`.
.. _async-with-locks:
Using locks, conditions and semaphores in the :keyword:`async with` statement
-----------------------------------------------------------------------------
:class:`Lock`, :class:`Condition`, :class:`Semaphore`, and
:class:`BoundedSemaphore` objects can be used in :keyword:`async with`
statements.
The :meth:`acquire` method will be called when the block is entered,
and :meth:`release` will be called when the block is exited. Hence,
the following snippet::
async with lock:
# do something...
is equivalent to::
await lock.acquire()
try:
# do something...
finally:
lock.release()
.. deprecated:: 3.7
Acquiring a lock using ``await lock`` or ``yield from lock`` and the
:keyword:`with` statement (``with await lock``, ``with (yield from
lock)``) are deprecated.

View file

@ -92,6 +92,24 @@ Coroutines (and tasks) can only run when the event loop is running.
used in a callback-style code, wrap its result with :func:`ensure_future`.
.. function:: asyncio.run(coro, \*, debug=False)
This function runs the passed coroutine, taking care of
managing the asyncio event loop and finalizing asynchronous
generators.
This function cannot be called when another asyncio event loop is
running in the same thread.
If *debug* is ``True``, the event loop will be run in debug mode.
This function always creates a new event loop and closes it at
the end. It should be used as a main entry point for asyncio
programs, and should ideally only be called once.
.. versionadded:: 3.7
.. _asyncio-hello-world-coroutine:
Example: Hello World coroutine
@ -104,10 +122,7 @@ Example of coroutine displaying ``"Hello World"``::
async def hello_world():
print("Hello World!")
loop = asyncio.get_event_loop()
# Blocking call which returns when the hello_world() coroutine is done
loop.run_until_complete(hello_world())
loop.close()
asyncio.run(hello_world())
.. seealso::
@ -127,7 +142,8 @@ using the :meth:`sleep` function::
import asyncio
import datetime
async def display_date(loop):
async def display_date():
loop = asyncio.get_running_loop()
end_time = loop.time() + 5.0
while True:
print(datetime.datetime.now())
@ -135,10 +151,7 @@ using the :meth:`sleep` function::
break
await asyncio.sleep(1)
loop = asyncio.get_event_loop()
# Blocking call which returns when the display_date() coroutine is done
loop.run_until_complete(display_date(loop))
loop.close()
asyncio.run(display_date())
.. seealso::
@ -293,6 +306,12 @@ Future
If the future is already done when this method is called, raises
:exc:`InvalidStateError`.
.. method:: get_loop()
Return the event loop the future object is bound to.
.. versionadded:: 3.7
Example: Future with run_until_complete()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@ -358,10 +377,21 @@ with the result.
Task
----
.. function:: create_task(coro)
Wrap a :ref:`coroutine <coroutine>` *coro* into a task and schedule
its execution. Return the task object.
The task is executed in the :func:`get_running_loop` context;
:exc:`RuntimeError` is raised if there is no running loop in the
current thread.
.. versionadded:: 3.7
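A small sketch of spawning a background task from a running coroutine (the
coroutine names are invented for the example)::

    import asyncio

    async def background():
        await asyncio.sleep(1)
        print("background done")

    async def main():
        task = asyncio.create_task(background())
        # ... other work can happen here while background() runs ...
        await task

    asyncio.run(main())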
.. class:: Task(coro, \*, loop=None)
Schedule the execution of a :ref:`coroutine <coroutine>`: wrap it in a
future. A task is a subclass of :class:`Future`.
A unit for concurrent running of :ref:`coroutines <coroutine>`,
subclass of :class:`Future`.
A task is responsible for executing a coroutine object in an event loop. If
the wrapped coroutine yields from a future, the task suspends the execution
@ -386,7 +416,7 @@ Task
<coroutine>` did not complete. It is probably a bug and a warning is
logged: see :ref:`Pending task destroyed <asyncio-pending-task-destroyed>`.
Don't directly create :class:`Task` instances: use the :func:`ensure_future`
Don't directly create :class:`Task` instances: use the :func:`create_task`
function or the :meth:`AbstractEventLoop.create_task` method.
This class is :ref:`not thread safe <asyncio-multithreading>`.
@ -504,6 +534,28 @@ Task functions
the event loop object used by the underlying task or coroutine. If it's
not provided, the default event loop is used.
.. function:: current_task(loop=None):
Return the current running :class:`Task` instance or ``None``, if
no task is running.
If *loop* is ``None``, :func:`get_running_loop` is used to get
the current loop.
.. versionadded:: 3.7
.. function:: all_tasks(loop=None):
Return a set of :class:`Task` objects created for the loop.
If *loop* is ``None``, :func:`get_event_loop` is used to get the
current loop.
.. versionadded:: 3.7
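A tiny sketch using both helpers from inside a running coroutine::

    import asyncio

    async def main():
        me = asyncio.current_task()
        tasks = asyncio.all_tasks()
        print(me in tasks)   # True: the running task is part of the set

    asyncio.run(main())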
.. function:: as_completed(fs, \*, loop=None, timeout=None)
Return an iterator whose values, when waited for, are :class:`Future`
@ -515,7 +567,7 @@ Task functions
Example::
for f in as_completed(fs):
result = yield from f # The 'yield from' may raise
result = await f # The 'await' may raise
# Use result
.. note::
@ -534,15 +586,15 @@ Task functions
.. versionchanged:: 3.5.1
The function accepts any :term:`awaitable` object.
.. note::
:func:`create_task` (added in Python 3.7) is the preferred way
for spawning new tasks.
.. seealso::
The :meth:`AbstractEventLoop.create_task` method.
.. function:: async(coro_or_future, \*, loop=None)
A deprecated alias to :func:`ensure_future`.
.. deprecated:: 3.4.4
The :func:`create_task` function and
:meth:`AbstractEventLoop.create_task` method.
.. function:: wrap_future(future, \*, loop=None)
@ -636,11 +688,11 @@ Task functions
The statement::
res = yield from shield(something())
res = await shield(something())
is exactly equivalent to the statement::
res = yield from something()
res = await something()
*except* that if the coroutine containing it is cancelled, the task running
in ``something()`` is not cancelled. From the point of view of
@ -653,7 +705,7 @@ Task functions
combine ``shield()`` with a try/except clause, as follows::
try:
res = yield from shield(something())
res = await shield(something())
except CancelledError:
res = None
@ -696,7 +748,7 @@ Task functions
Usage::
done, pending = yield from asyncio.wait(fs)
done, pending = await asyncio.wait(fs)
.. note::
@ -720,7 +772,7 @@ Task functions
This function is a :ref:`coroutine <coroutine>`, usage::
result = yield from asyncio.wait_for(fut, 60.0)
result = await asyncio.wait_for(fut, 60.0)
.. versionchanged:: 3.4.3
If the wait is cancelled, the future *fut* is now also cancelled.

View file

@ -20,6 +20,9 @@ at interpreter termination time they will be run in the order ``C``, ``B``,
program is killed by a signal not handled by Python, when a Python fatal
internal error is detected, or when :func:`os._exit` is called.
.. versionchanged:: 3.7
When used with C-API subinterpreters, registered functions
are local to the interpreter they were registered in.
.. function:: register(func, *args, **kwargs)

View file

@ -977,10 +977,14 @@ e.g. ``'utf-8'`` is a valid alias for the ``'utf_8'`` codec.
Some common encodings can bypass the codecs lookup machinery to
improve performance. These optimization opportunities are only
recognized by CPython for a limited set of aliases: utf-8, utf8,
latin-1, latin1, iso-8859-1, mbcs (Windows only), ascii, utf-16,
and utf-32. Using alternative spellings for these encodings may
result in slower execution.
recognized by CPython for a limited set of (case insensitive)
aliases: utf-8, utf8, latin-1, latin1, iso-8859-1, iso8859-1, mbcs
(Windows only), ascii, us-ascii, utf-16, utf16, utf-32, utf32, and
the same using underscores instead of dashes. Using alternative
aliases for these encodings may result in slower execution.
.. versionchanged:: 3.6
Optimization opportunity recognized for us-ascii.
Many of the character sets support the same languages. They vary in individual
characters (e.g. whether the EURO SIGN is supported or not), and in the

View file

@ -87,7 +87,8 @@ ABC Inherits from Abstract Methods Mixin
:class:`Set` ``__iter__``
:class:`KeysView` :class:`MappingView`, ``__contains__``,
:class:`Set` ``__iter__``
:class:`ValuesView` :class:`MappingView` ``__contains__``, ``__iter__``
:class:`ValuesView` :class:`MappingView`, ``__contains__``, ``__iter__``
:class:`Collection`
:class:`Awaitable` ``__await__``
:class:`Coroutine` :class:`Awaitable` ``send``, ``throw`` ``close``
:class:`AsyncIterable` ``__aiter__``

View file

@ -35,8 +35,8 @@ Python's general purpose built-in containers, :class:`dict`, :class:`list`,
.. versionchanged:: 3.3
Moved :ref:`collections-abstract-base-classes` to the :mod:`collections.abc` module.
For backwards compatibility, they continue to be visible in this module
as well.
For backwards compatibility, they continue to be visible in this module through
Python 3.7. Subsequently, they will be removed entirely.
:class:`ChainMap` objects
@ -509,11 +509,14 @@ or subtracting from an empty counter.
.. versionadded:: 3.2
.. method:: rotate(n)
.. method:: rotate(n=1)
Rotate the deque *n* steps to the right. If *n* is negative, rotate to
the left. Rotating one step to the right is equivalent to:
``d.appendleft(d.pop())``.
Rotate the deque *n* steps to the right. If *n* is negative, rotate
to the left.
When the deque is not empty, rotating one step to the right is equivalent
to ``d.appendleft(d.pop())``, and rotating one step to the left is
equivalent to ``d.append(d.popleft())``.
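A short interactive sketch of both equivalences::

    >>> from collections import deque
    >>> d = deque('abcde')
    >>> d.rotate(1)              # same as d.appendleft(d.pop())
    >>> d
    deque(['e', 'a', 'b', 'c', 'd'])
    >>> d.rotate(-1)             # same as d.append(d.popleft())
    >>> d
    deque(['a', 'b', 'c', 'd', 'e'])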
Deque objects also provide one read-only attribute:
@ -618,6 +621,25 @@ added elements by appending to the right and popping to the left::
d.append(elem)
yield s / n
A `round-robin scheduler
<https://en.wikipedia.org/wiki/Round-robin_scheduling>`_ can be implemented with
input iterators stored in a :class:`deque`. Values are yielded from the active
iterator in position zero. If that iterator is exhausted, it can be removed
with :meth:`~deque.popleft`; otherwise, it can be cycled back to the end with
the :meth:`~deque.rotate` method::
def roundrobin(*iterables):
"roundrobin('ABC', 'D', 'EF') --> A D E B F C"
iterators = deque(map(iter, iterables))
while iterators:
try:
while True:
yield next(iterators[0])
iterators.rotate(-1)
except StopIteration:
# Remove an exhausted iterator.
iterators.popleft()
The :meth:`rotate` method provides a way to implement :class:`deque` slicing and
deletion. For example, a pure Python implementation of ``del d[n]`` relies on
the :meth:`rotate` method to position elements to be popped::
@ -763,7 +785,7 @@ Named tuples assign meaning to each position in a tuple and allow for more reada
self-documenting code. They can be used wherever regular tuples are used, and
they add the ability to access fields by name instead of position index.
.. function:: namedtuple(typename, field_names, *, rename=False, module=None)
.. function:: namedtuple(typename, field_names, *, rename=False, defaults=None, module=None)
Returns a new tuple subclass named *typename*. The new subclass is used to
create tuple-like objects that have fields accessible by attribute lookup as
@ -786,6 +808,13 @@ they add the ability to access fields by name instead of position index.
converted to ``['abc', '_1', 'ghi', '_3']``, eliminating the keyword
``def`` and the duplicate fieldname ``abc``.
*defaults* can be ``None`` or an :term:`iterable` of default values.
Since fields with a default value must come after any fields without a
default, the *defaults* are applied to the rightmost parameters. For
example, if the fieldnames are ``['x', 'y', 'z']`` and the defaults are
``(1, 2)``, then ``x`` will be a required argument, ``y`` will default to
``1``, and ``z`` will default to ``2``.
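A short sketch of exactly that case (the ``Point`` type is hypothetical)::

    >>> from collections import namedtuple
    >>> Point = namedtuple('Point', ['x', 'y', 'z'], defaults=(1, 2))
    >>> Point(0)                 # x is required; y and z use the defaults
    Point(x=0, y=1, z=2)
    >>> Point(0, 5)              # defaults are applied right to left
    Point(x=0, y=5, z=2)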
If *module* is defined, the ``__module__`` attribute of the named tuple is
set to that value.
@ -805,6 +834,10 @@ they add the ability to access fields by name instead of position index.
.. versionchanged:: 3.7
Remove the *verbose* parameter and the :attr:`_source` attribute.
.. versionchanged:: 3.7
Added the *defaults* parameter and the :attr:`_field_defaults`
attribute.
.. doctest::
:options: +NORMALIZE_WHITESPACE
@ -892,6 +925,18 @@ field names, the method and attribute names start with an underscore.
>>> Pixel(11, 22, 128, 255, 0)
Pixel(x=11, y=22, red=128, green=255, blue=0)
.. attribute:: somenamedtuple._fields_defaults
Dictionary mapping field names to default values.
.. doctest::
>>> Account = namedtuple('Account', ['type', 'balance'], defaults=[0])
>>> Account._fields_defaults
{'balance': 0}
>>> Account('premium')
Account(type='premium', balance=0)
To retrieve a field whose name is stored in a string, use the :func:`getattr`
function:

View file

@ -21,7 +21,7 @@ spaces, the coordinates are all between 0 and 1.
.. seealso::
More information about color spaces can be found at
http://www.poynton.com/ColorFAQ.html and
http://poynton.ca/ColorFAQ.html and
https://www.cambridgeincolour.com/tutorials/color-spaces.htm.
The :mod:`colorsys` module defines the following functions:

View file

@ -83,6 +83,16 @@ compile Python sources.
If ``0`` is used, then the result of :func:`os.cpu_count()`
will be used.
.. cmdoption:: --invalidation-mode [timestamp|checked-hash|unchecked-hash]
Control how the generated pycs will be invalidated at runtime. The default
setting, ``timestamp``, means that ``.pyc`` files with the source timestamp
and size embedded will be generated. The ``checked-hash`` and
``unchecked-hash`` values cause hash-based pycs to be generated. Hash-based
pycs embed a hash of the source file contents rather than a timestamp. See
:ref:`pyc-invalidation` for more information on how Python validates bytecode
cache files at runtime.
.. versionchanged:: 3.2
Added the ``-i``, ``-b`` and ``-h`` options.
@ -91,6 +101,9 @@ compile Python sources.
was changed to a multilevel value. ``-b`` will always produce a
byte-code file ending in ``.pyc``, never ``.pyo``.
.. versionchanged:: 3.7
Added the ``--invalidation-mode`` parameter.
There is no command-line option to control the optimization level used by the
:func:`compile` function, because the Python interpreter itself already
@ -99,7 +112,7 @@ provides the option: :program:`python -O -m compileall`.
Public functions
----------------
.. function:: compile_dir(dir, maxlevels=10, ddir=None, force=False, rx=None, quiet=0, legacy=False, optimize=-1, workers=1)
.. function:: compile_dir(dir, maxlevels=10, ddir=None, force=False, rx=None, quiet=0, legacy=False, optimize=-1, workers=1, invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP)
Recursively descend the directory tree named by *dir*, compiling all :file:`.py`
files along the way. Return a true value if all the files compiled successfully,
@ -140,6 +153,10 @@ Public functions
then sequential compilation will be used as a fallback. If *workers* is
lower than ``0``, a :exc:`ValueError` will be raised.
*invalidation_mode* should be a member of the
:class:`py_compile.PycInvalidationMode` enum and controls how the generated
pycs are invalidated at runtime.
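A hedged usage sketch of the parameter; the directory name ``'myproject/'``
is hypothetical::

    import compileall
    import py_compile

    # Emit hash-based pycs that are validated against the source hash
    # at import time.
    compileall.compile_dir(
        'myproject/',
        invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH,
    )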
.. versionchanged:: 3.2
Added the *legacy* and *optimize* parameter.
@ -156,7 +173,10 @@ Public functions
.. versionchanged:: 3.6
Accepts a :term:`path-like object`.
.. function:: compile_file(fullname, ddir=None, force=False, rx=None, quiet=0, legacy=False, optimize=-1)
.. versionchanged:: 3.7
The *invalidation_mode* parameter was added.
.. function:: compile_file(fullname, ddir=None, force=False, rx=None, quiet=0, legacy=False, optimize=-1, invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP)
Compile the file with path *fullname*. Return a true value if the file
compiled successfully, and a false value otherwise.
@ -184,6 +204,10 @@ Public functions
*optimize* specifies the optimization level for the compiler. It is passed to
the built-in :func:`compile` function.
*invalidation_mode* should be a member of the
:class:`py_compile.PycInvalidationMode` enum and controls how the generated
pycs are invalidated at runtime.
.. versionadded:: 3.2
.. versionchanged:: 3.5
@ -193,7 +217,10 @@ Public functions
The *legacy* parameter only writes out ``.pyc`` files, not ``.pyo`` files
no matter what the value of *optimize* is.
.. function:: compile_path(skip_curdir=True, maxlevels=0, force=False, quiet=0, legacy=False, optimize=-1)
.. versionchanged:: 3.7
The *invalidation_mode* parameter was added.
.. function:: compile_path(skip_curdir=True, maxlevels=0, force=False, quiet=0, legacy=False, optimize=-1, invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP)
Byte-compile all the :file:`.py` files found along ``sys.path``. Return a
true value if all the files compiled successfully, and a false value otherwise.
@ -213,6 +240,9 @@ Public functions
The *legacy* parameter only writes out ``.pyc`` files, not ``.pyo`` files
no matter what the value of *optimize* is.
.. versionchanged:: 3.7
The *invalidation_mode* parameter was added.
To force a recompile of all the :file:`.py` files in the :file:`Lib/`
subdirectory and all its subdirectories::

View file

@ -40,21 +40,29 @@ Executor Objects
.. method:: map(func, *iterables, timeout=None, chunksize=1)
Equivalent to :func:`map(func, *iterables) <map>` except *func* is executed
asynchronously and several calls to *func* may be made concurrently. The
returned iterator raises a :exc:`concurrent.futures.TimeoutError` if
:meth:`~iterator.__next__` is called and the result isn't available
Similar to :func:`map(func, *iterables) <map>` except:
* the *iterables* are collected immediately rather than lazily;
* *func* is executed asynchronously and several calls to
*func* may be made concurrently.
The returned iterator raises a :exc:`concurrent.futures.TimeoutError`
if :meth:`~iterator.__next__` is called and the result isn't available
after *timeout* seconds from the original call to :meth:`Executor.map`.
*timeout* can be an int or a float. If *timeout* is not specified or
``None``, there is no limit to the wait time. If a call raises an
exception, then that exception will be raised when its value is
retrieved from the iterator. When using :class:`ProcessPoolExecutor`, this
method chops *iterables* into a number of chunks which it submits to the
pool as separate tasks. The (approximate) size of these chunks can be
specified by setting *chunksize* to a positive integer. For very long
iterables, using a large value for *chunksize* can significantly improve
performance compared to the default size of 1. With :class:`ThreadPoolExecutor`,
*chunksize* has no effect.
``None``, there is no limit to the wait time.
If a *func* call raises an exception, then that exception will be
raised when its value is retrieved from the iterator.
When using :class:`ProcessPoolExecutor`, this method chops *iterables*
into a number of chunks which it submits to the pool as separate
tasks. The (approximate) size of these chunks can be specified by
setting *chunksize* to a positive integer. For very long iterables,
using a large value for *chunksize* can significantly improve
performance compared to the default size of 1. With
:class:`ThreadPoolExecutor`, *chunksize* has no effect.
.. versionchanged:: 3.5
Added the *chunksize* argument.
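A minimal sketch of the *chunksize* behaviour described above; the work
function and the chunk size are illustrative assumptions::

    from concurrent.futures import ProcessPoolExecutor

    def square(x):
        return x * x

    if __name__ == '__main__':
        with ProcessPoolExecutor() as pool:
            # Batching the long input into chunks of 256 reduces the
            # per-item inter-process overhead.
            results = list(pool.map(square, range(10000), chunksize=256))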

View file

@ -29,6 +29,17 @@ Functions and classes provided:
.. versionadded:: 3.6
.. class:: AbstractAsyncContextManager
An :term:`abstract base class` for classes that implement
:meth:`object.__aenter__` and :meth:`object.__aexit__`. A default
implementation for :meth:`object.__aenter__` is provided which returns
``self`` while :meth:`object.__aexit__` is an abstract method which by default
returns ``None``. See also the definition of
:ref:`async-context-managers`.
.. versionadded:: 3.7
.. decorator:: contextmanager
@ -137,6 +148,28 @@ Functions and classes provided:
``page.close()`` will be called when the :keyword:`with` block is exited.
.. _simplifying-support-for-single-optional-context-managers:
.. function:: nullcontext(enter_result=None)
Return a context manager that returns *enter_result* from ``__enter__``, but
otherwise does nothing. It is intended to be used as a stand-in for an
optional context manager, for example::
def process_file(file_or_path):
if isinstance(file_or_path, str):
# If string, open file
cm = open(file_or_path)
else:
# Caller is responsible for closing file
cm = nullcontext(file_or_path)
with cm as file:
# Perform processing on the file
.. versionadded:: 3.7
.. function:: suppress(*exceptions)
Return a context manager that suppresses any of the specified exceptions
@ -402,6 +435,44 @@ Functions and classes provided:
callbacks registered, the arguments passed in will indicate that no
exception occurred.
.. class:: AsyncExitStack()
An :ref:`asynchronous context manager <async-context-managers>`, similar
to :class:`ExitStack`, that supports combining both synchronous and
asynchronous context managers, as well as having coroutines for
cleanup logic.
The :meth:`close` method is not implemented; :meth:`aclose` must be used
instead.
.. method:: enter_async_context(cm)
Similar to :meth:`enter_context` but expects an asynchronous context
manager.
.. method:: push_async_exit(exit)
Similar to :meth:`push` but expects either an asynchronous context manager
or a coroutine.
.. method:: push_async_callback(callback, *args, **kwds)
Similar to :meth:`callback` but expects a coroutine.
.. method:: aclose()
Similar to :meth:`close` but properly handles awaitables.
Continuing the example for :func:`asynccontextmanager`::
async with AsyncExitStack() as stack:
connections = [await stack.enter_async_context(get_connection())
for i in range(5)]
# All opened connections will automatically be released at the end of
# the async with statement, even if attempts to open a connection
# later in the list raise an exception.
.. versionadded:: 3.7
Examples and Recipes
--------------------
@ -433,24 +504,6 @@ statements to manage arbitrary resources that don't natively support the
context management protocol.
Simplifying support for single optional context managers
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In the specific case of a single optional context manager, :class:`ExitStack`
instances can be used as a "do nothing" context manager, allowing a context
manager to easily be omitted without affecting the overall structure of
the source code::
def debug_trace(details):
if __debug__:
return TraceContext(details)
# Don't do anything special with the context in release mode
return ExitStack()
with debug_trace():
# Suite is traced in debug mode, but runs normally otherwise
Catching exceptions from ``__enter__`` methods
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View file

@ -436,6 +436,21 @@ Other constructors, all class methods:
d``.
.. classmethod:: date.fromisoformat(date_string)
Return a :class:`date` corresponding to a *date_string* in the format emitted
by :meth:`date.isoformat`. Specifically, this function supports strings in
the format(s) ``YYYY-MM-DD``.
.. caution::
This does not support parsing arbitrary ISO 8601 strings - it is only intended
as the inverse operation of :meth:`date.isoformat`.
.. versionadded:: 3.7
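A short sketch of the round trip with :meth:`date.isoformat`::

    >>> from datetime import date
    >>> date.fromisoformat('2018-02-08')
    datetime.date(2018, 2, 8)
    >>> date.fromisoformat(date(2018, 2, 8).isoformat())
    datetime.date(2018, 2, 8)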
Class attributes:
.. attribute:: date.min
@ -819,6 +834,21 @@ Other constructors, all class methods:
Added the *tzinfo* argument.
.. classmethod:: datetime.fromisoformat(date_string)
Return a :class:`datetime` corresponding to a *date_string* in one of the
formats emitted by :meth:`date.isoformat` and :meth:`datetime.isoformat`.
Specifically, this function supports strings in the format(s)
``YYYY-MM-DD[*HH[:MM[:SS[.mmm[mmm]]]][+HH:MM[:SS[.ffffff]]]]``,
where ``*`` can match any single character.
.. caution::
This does not support parsing arbitrary ISO 8601 strings - it is only intended
as the inverse operation of :meth:`datetime.isoformat`.
.. versionadded:: 3.7
.. classmethod:: datetime.strptime(date_string, format)
Return a :class:`.datetime` corresponding to *date_string*, parsed according to
@ -1486,6 +1516,23 @@ In boolean contexts, a :class:`.time` object is always considered to be true.
error-prone and has been removed in Python 3.5. See :issue:`13936` for full
details.
Other constructor:
.. classmethod:: time.fromisoformat(time_string)
Return a :class:`time` corresponding to a *time_string* in one of the
formats emitted by :meth:`time.isoformat`. Specifically, this function supports
strings in the format(s) ``HH[:MM[:SS[.mmm[mmm]]]][+HH:MM[:SS[.ffffff]]]``.
.. caution::
This does not support parsing arbitrary ISO 8601 strings - it is only intended
as the inverse operation of :meth:`time.isoformat`.
.. versionadded:: 3.7
Instance methods:
.. method:: time.replace(hour=self.hour, minute=self.minute, second=self.second, \
@ -1587,7 +1634,6 @@ Instance methods:
``self.tzinfo.tzname(None)``, or raises an exception if the latter doesn't
return ``None`` or a string object.
Example:
>>> from datetime import time, tzinfo, timedelta

View file

@ -339,9 +339,23 @@ The module defines the following:
dumbdbm database is created, files with :file:`.dat` and :file:`.dir` extensions
are created.
The optional *flag* argument supports only the semantics of ``'c'``
and ``'n'`` values. Other values will default to database being always
opened for update, and will be created if it does not exist.
The optional *flag* argument can be:
+---------+-------------------------------------------+
| Value | Meaning |
+=========+===========================================+
| ``'r'`` | Open existing database for reading only |
| | (default) |
+---------+-------------------------------------------+
| ``'w'`` | Open existing database for reading and |
| | writing |
+---------+-------------------------------------------+
| ``'c'`` | Open database for reading and writing, |
| | creating it if it doesn't exist |
+---------+-------------------------------------------+
| ``'n'`` | Always create a new, empty database, open |
| | for reading and writing |
+---------+-------------------------------------------+
The optional *mode* argument is the Unix mode of the file, used only when the
database has to be created. It defaults to octal ``0o666`` (and will be modified
@ -351,9 +365,10 @@ The module defines the following:
:func:`.open` always creates a new database when the flag has the value
``'n'``.
.. deprecated-removed:: 3.6 3.8
Creating database in ``'r'`` and ``'w'`` modes. Modifying database in
``'r'`` mode.
.. versionchanged:: 3.8
A database opened with flags ``'r'`` is now read-only. Opening with
flags ``'r'`` and ``'w'`` no longer creates a database if it does not
exist.
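A small usage sketch of the *flag* values above; the database name
``'example'`` is hypothetical::

    import dbm.dumb

    # 'c' creates the database if it does not exist yet.
    with dbm.dumb.open('example', 'c') as db:
        db[b'key'] = b'value'

    # 'r' opens the same database for reading; with the behaviour
    # described above, writes in this mode fail.
    with dbm.dumb.open('example', 'r') as db:
        print(db[b'key'])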
In addition to the methods provided by the
:class:`collections.abc.MutableMapping` class, :class:`dumbdbm` objects

View file

@ -24,3 +24,6 @@ The list of modules described in this chapter is:
unittest.mock-examples.rst
2to3.rst
test.rst
See also the Python development mode: the :option:`-X` ``dev`` option and
:envvar:`PYTHONDEVMODE` environment variable.

View file

@ -339,12 +339,16 @@ The Python compiler currently generates the following bytecode instructions.
Duplicates the reference on top of the stack.
.. versionadded:: 3.2
.. opcode:: DUP_TOP_TWO
Duplicates the two references on top of the stack, leaving them in the
same order.
.. versionadded:: 3.2
**Unary operations**
@ -555,11 +559,14 @@ the original TOS1.
the CO_ITERABLE_COROUTINE flag, or resolves
``o.__await__``.
.. versionadded:: 3.5
.. opcode:: GET_AITER
Implements ``TOS = TOS.__aiter__()``.
.. versionadded:: 3.5
.. versionchanged:: 3.7
Returning awaitable objects from ``__aiter__`` is no longer
supported.
@ -570,17 +577,23 @@ the original TOS1.
Implements ``PUSH(get_awaitable(TOS.__anext__()))``. See ``GET_AWAITABLE``
for details about ``get_awaitable``
.. versionadded:: 3.5
.. opcode:: BEFORE_ASYNC_WITH
Resolves ``__aenter__`` and ``__aexit__`` from the object on top of the
stack. Pushes ``__aexit__`` and result of ``__aenter__()`` to the stack.
.. versionadded:: 3.5
.. opcode:: SETUP_ASYNC_WITH
Creates a new frame object.
.. versionadded:: 3.5
**Miscellaneous opcodes**
@ -618,6 +631,8 @@ the original TOS1.
Calls ``dict.setitem(TOS1[-i], TOS, TOS1)``. Used to implement dict
comprehensions.
.. versionadded:: 3.1
For all of the :opcode:`SET_ADD`, :opcode:`LIST_APPEND` and :opcode:`MAP_ADD`
instructions, while the added value or key/value pair is popped off, the
container object remains on the stack so that it is available for further
@ -640,6 +655,7 @@ iterations of the loop.
.. versionadded:: 3.3
.. opcode:: SETUP_ANNOTATIONS
Checks whether ``__annotations__`` is defined in ``locals()``, if not it is
@ -649,6 +665,7 @@ iterations of the loop.
.. versionadded:: 3.6
.. opcode:: IMPORT_STAR
Loads all symbols not starting with ``'_'`` directly from the module TOS to
@ -694,6 +711,8 @@ iterations of the loop.
store it in (a) variable(s) (:opcode:`STORE_FAST`, :opcode:`STORE_NAME`, or
:opcode:`UNPACK_SEQUENCE`).
.. versionadded:: 3.2
.. opcode:: WITH_CLEANUP_START
@ -924,23 +943,31 @@ All of the following opcodes use their arguments.
If TOS is true, sets the bytecode counter to *target*. TOS is popped.
.. versionadded:: 3.1
.. opcode:: POP_JUMP_IF_FALSE (target)
If TOS is false, sets the bytecode counter to *target*. TOS is popped.
.. versionadded:: 3.1
.. opcode:: JUMP_IF_TRUE_OR_POP (target)
If TOS is true, sets the bytecode counter to *target* and leaves TOS on the
stack. Otherwise (TOS is false), TOS is popped.
.. versionadded:: 3.1
.. opcode:: JUMP_IF_FALSE_OR_POP (target)
If TOS is false, sets the bytecode counter to *target* and leaves TOS on the
stack. Otherwise (TOS is true), TOS is popped.
.. versionadded:: 3.1
.. opcode:: JUMP_ABSOLUTE (target)
@ -993,13 +1020,6 @@ All of the following opcodes use their arguments.
Deletes local ``co_varnames[var_num]``.
.. opcode:: STORE_ANNOTATION (namei)
Stores TOS as ``locals()['__annotations__'][co_names[namei]] = TOS``.
.. versionadded:: 3.6
.. opcode:: LOAD_CLOSURE (i)
Pushes a reference to the cell contained in slot *i* of the cell and free
@ -1020,6 +1040,8 @@ All of the following opcodes use their arguments.
consulting the cell. This is used for loading free variables in class
bodies.
.. versionadded:: 3.4
.. opcode:: STORE_DEREF (i)
@ -1032,6 +1054,8 @@ All of the following opcodes use their arguments.
Empties the cell contained in slot *i* of the cell and free variable storage.
Used by the :keyword:`del` statement.
.. versionadded:: 3.2
.. opcode:: RAISE_VARARGS (argc)

View file

@ -53,7 +53,7 @@ over channels that are not "8 bit clean".
:data:`~email.policy.compat32` policy and ``False`` for all others).
*mangle_from_* is intended for use when messages are stored in unix mbox
format (see :mod:`mailbox` and `WHY THE CONTENT-LENGTH FORMAT IS BAD
<http://www.jwz.org/doc/content-length.html>`_).
<https://www.jwz.org/doc/content-length.html>`_).
If *maxheaderlen* is not ``None``, refold any header lines that are longer
than *maxheaderlen*, or if ``0``, do not rewrap any headers. If
@ -154,7 +154,7 @@ to be using :class:`BytesGenerator`, and not :class:`Generator`.
:data:`~email.policy.compat32` policy and ``False`` for all others).
*mangle_from_* is intended for use when messages are stored in unix mbox
format (see :mod:`mailbox` and `WHY THE CONTENT-LENGTH FORMAT IS BAD
<http://www.jwz.org/doc/content-length.html>`_).
<https://www.jwz.org/doc/content-length.html>`_).
If *maxheaderlen* is not ``None``, refold any header lines that are longer
than *maxheaderlen*, or if ``0``, do not rewrap any headers. If

View file

@ -379,7 +379,8 @@ The rules for what is allowed are as follows: names that start and end with
a single underscore are reserved by enum and cannot be used; all other
attributes defined within an enumeration will become members of this
enumeration, with the exception of special methods (:meth:`__str__`,
:meth:`__add__`, etc.) and descriptors (methods are also descriptors).
:meth:`__add__`, etc.), descriptors (methods are also descriptors), and
variable names listed in :attr:`_ignore_`.
Note: if your enumeration defines :meth:`__new__` and/or :meth:`__init__` then
whatever value(s) were given to the enum member will be passed into those
@ -654,7 +655,7 @@ value and let :class:`Flag` select an appropriate value.
Like :class:`IntFlag`, if a combination of :class:`Flag` members results in no
flags being set, the boolean evaluation is :data:`False`::
>>> from enum import Flag
>>> from enum import Flag, auto
>>> class Color(Flag):
... RED = auto()
... BLUE = auto()
@ -943,6 +944,25 @@ will be passed to those methods::
9.802652743337129
TimePeriod
^^^^^^^^^^
An example to show the :attr:`_ignore_` attribute in use::
>>> from datetime import timedelta
>>> class Period(timedelta, Enum):
... "different lengths of time"
... _ignore_ = 'Period i'
... Period = vars()
... for i in range(367):
... Period['day_%d' % i] = i
...
>>> list(Period)[:2]
[<Period.day_0: datetime.timedelta(0)>, <Period.day_1: datetime.timedelta(days=1)>]
>>> list(Period)[-2:]
[<Period.day_365: datetime.timedelta(days=365)>, <Period.day_366: datetime.timedelta(days=366)>]
How are Enums different?
------------------------
@ -994,6 +1014,9 @@ Supported ``_sunder_`` names
- ``_missing_`` -- a lookup function used when a value is not found; may be
overridden
- ``_ignore_`` -- a list of names, either as a :func:`list` or a :func:`str`,
that will not be transformed into members, and will be removed from the final
class
- ``_order_`` -- used in Python 2/3 code to ensure member order is consistent
(class attribute, removed during class creation)
- ``_generate_next_value_`` -- used by the `Functional API`_ and by
@ -1001,6 +1024,7 @@ Supported ``_sunder_`` names
overridden
.. versionadded:: 3.6 ``_missing_``, ``_order_``, ``_generate_next_value_``
.. versionadded:: 3.7 ``_ignore_``
To help keep Python 2 / Python 3 code in sync an :attr:`_order_` attribute can
be provided. It will be checked against the actual order of the enumeration

View file

@ -154,10 +154,7 @@ The following exceptions are the exceptions that are usually raised.
.. exception:: FloatingPointError
Raised when a floating point operation fails. This exception is always defined,
but can only be raised when Python is configured with the
``--with-fpectl`` option, or the :const:`WANT_SIGFPE_HANDLER` symbol is
defined in the :file:`pyconfig.h` file.
Not currently used.
.. exception:: GeneratorExit
@ -370,17 +367,21 @@ The following exceptions are the exceptions that are usually raised.
raised, and the value returned by the function is used as the
:attr:`value` parameter to the constructor of the exception.
If a generator function defined in the presence of a ``from __future__
import generator_stop`` directive raises :exc:`StopIteration`, it will be
converted into a :exc:`RuntimeError` (retaining the :exc:`StopIteration`
as the new exception's cause).
If a generator code directly or indirectly raises :exc:`StopIteration`,
it is converted into a :exc:`RuntimeError` (retaining the
:exc:`StopIteration` as the new exception's cause).
.. versionchanged:: 3.3
Added ``value`` attribute and the ability for generator functions to
use it to return a value.
.. versionchanged:: 3.5
Introduced the RuntimeError transformation.
Introduced the RuntimeError transformation via
``from __future__ import generator_stop``, see :pep:`479`.
.. versionchanged:: 3.7
Enable :pep:`479` for all code by default: a :exc:`StopIteration`
error raised in a generator is transformed into a :exc:`RuntimeError`.
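A short illustration of the transformation (traceback abbreviated)::

    >>> def gen():
    ...     yield 1
    ...     raise StopIteration      # leaks out of the generator body
    ...
    >>> list(gen())
    Traceback (most recent call last):
      ...
    RuntimeError: generator raised StopIteration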
.. exception:: StopAsyncIteration
@ -664,11 +665,13 @@ depending on the system error code.
:pep:`3151` - Reworking the OS and IO exception hierarchy
.. _warning-categories-as-exceptions:
Warnings
--------
The following exceptions are used as warning categories; see the :mod:`warnings`
module for more information.
The following exceptions are used as warning categories; see the
:ref:`warning-categories` documentation for more details.
.. exception:: Warning
@ -682,12 +685,14 @@ module for more information.
.. exception:: DeprecationWarning
Base class for warnings about deprecated features.
Base class for warnings about deprecated features when those warnings are
intended for other Python developers.
.. exception:: PendingDeprecationWarning
Base class for warnings about features which will be deprecated in the future.
Base class for warnings about features which will be deprecated in the
future.
.. exception:: SyntaxWarning
@ -702,8 +707,8 @@ module for more information.
.. exception:: FutureWarning
Base class for warnings about constructs that will change semantically in the
future.
Base class for warnings about deprecated features when those warnings are
intended for end users of applications that are written in Python.
.. exception:: ImportWarning
@ -723,7 +728,8 @@ module for more information.
.. exception:: ResourceWarning
Base class for warnings related to resource usage.
Base class for warnings related to resource usage. Ignored by the default
warning filters.
.. versionadded:: 3.2

View file

@ -1,121 +0,0 @@
:mod:`fpectl` --- Floating point exception control
==================================================
.. module:: fpectl
:platform: Unix
:synopsis: Provide control for floating point exception handling.
.. moduleauthor:: Lee Busby <busby1@llnl.gov>
.. sectionauthor:: Lee Busby <busby1@llnl.gov>
.. note::
The :mod:`fpectl` module is not built by default, and its usage is discouraged
and may be dangerous except in the hands of experts. See also the section
:ref:`fpectl-limitations` on limitations for more details.
.. index:: single: IEEE-754
--------------
Most computers carry out floating point operations in conformance with the
so-called IEEE-754 standard. On any real computer, some floating point
operations produce results that cannot be expressed as a normal floating point
value. For example, try ::
>>> import math
>>> math.exp(1000)
inf
>>> math.exp(1000) / math.exp(1000)
nan
(The example above will work on many platforms. DEC Alpha may be one exception.)
"Inf" is a special, non-numeric value in IEEE-754 that stands for "infinity",
and "nan" means "not a number." Note that, other than the non-numeric results,
nothing special happened when you asked Python to carry out those calculations.
That is in fact the default behaviour prescribed in the IEEE-754 standard, and
if it works for you, stop reading now.
In some circumstances, it would be better to raise an exception and stop
processing at the point where the faulty operation was attempted. The
:mod:`fpectl` module is for use in that situation. It provides control over
floating point units from several hardware manufacturers, allowing the user to
turn on the generation of :const:`SIGFPE` whenever any of the IEEE-754
exceptions Division by Zero, Overflow, or Invalid Operation occurs. In tandem
with a pair of wrapper macros that are inserted into the C code comprising your
python system, :const:`SIGFPE` is trapped and converted into the Python
:exc:`FloatingPointError` exception.
The :mod:`fpectl` module defines the following functions and may raise the given
exception:
.. function:: turnon_sigfpe()
Turn on the generation of :const:`SIGFPE`, and set up an appropriate signal
handler.
.. function:: turnoff_sigfpe()
Reset default handling of floating point exceptions.
.. exception:: FloatingPointError
After :func:`turnon_sigfpe` has been executed, a floating point operation that
raises one of the IEEE-754 exceptions Division by Zero, Overflow, or Invalid
operation will in turn raise this standard Python exception.
.. _fpectl-example:
Example
-------
The following example demonstrates how to start up and test operation of the
:mod:`fpectl` module. ::
>>> import fpectl
>>> import fpetest
>>> fpectl.turnon_sigfpe()
>>> fpetest.test()
overflow PASS
FloatingPointError: Overflow
div by 0 PASS
FloatingPointError: Division by zero
[ more output from test elided ]
>>> import math
>>> math.exp(1000)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
FloatingPointError: in math_1
.. _fpectl-limitations:
Limitations and other considerations
------------------------------------
Setting up a given processor to trap IEEE-754 floating point errors currently
requires custom code on a per-architecture basis. You may have to modify
:mod:`fpectl` to control your particular hardware.
Conversion of an IEEE-754 exception to a Python exception requires that the
wrapper macros ``PyFPE_START_PROTECT`` and ``PyFPE_END_PROTECT`` be inserted
into your code in an appropriate fashion. Python itself has been modified to
support the :mod:`fpectl` module, but many other codes of interest to numerical
analysts have not.
The :mod:`fpectl` module is not thread-safe.
.. seealso::
Some files in the source distribution may be interesting in learning more about
how this module operates. The include file :file:`Include/pyfpe.h` discusses the
implementation of this module at some length. :file:`Modules/fpetestmodule.c`
gives several examples of use. Many additional examples can be found in
:file:`Objects/floatobject.c`.

View file

@ -1423,7 +1423,7 @@ are always available. They are listed here in alphabetical order.
a regular function and do something with its result. This is needed
in some cases where you need a reference to a function from a class
body and you want to avoid the automatic transformation to instance
method. For these cases, use this idiom:
method. For these cases, use this idiom::
class C:
builtin_open = staticmethod(open)

View file

@ -281,23 +281,34 @@ The :mod:`functools` module defines the following functions:
... print(arg)
To add overloaded implementations to the function, use the :func:`register`
attribute of the generic function. It is a decorator, taking a type
parameter and decorating a function implementing the operation for that
type::
attribute of the generic function. It is a decorator. For functions
annotated with types, the decorator will infer the type of the first
argument automatically::
>>> @fun.register(int)
... def _(arg, verbose=False):
>>> @fun.register
... def _(arg: int, verbose=False):
... if verbose:
... print("Strength in numbers, eh?", end=" ")
... print(arg)
...
>>> @fun.register(list)
... def _(arg, verbose=False):
>>> @fun.register
... def _(arg: list, verbose=False):
... if verbose:
... print("Enumerate this:")
... for i, elem in enumerate(arg):
... print(i, elem)
For code which doesn't use type annotations, the appropriate type
argument can be passed explicitly to the decorator itself::
>>> @fun.register(complex)
... def _(arg, verbose=False):
... if verbose:
... print("Better than complicated.", end=" ")
... print(arg.real, arg.imag)
...
To enable registering lambdas and pre-existing functions, the
:func:`register` attribute can be used in a functional form::
@ -368,6 +379,9 @@ The :mod:`functools` module defines the following functions:
.. versionadded:: 3.4
.. versionchanged:: 3.7
The :func:`register` attribute supports using type annotations.
.. function:: update_wrapper(wrapper, wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)

View file

@ -42,8 +42,10 @@ The :mod:`getpass` module provides two functions:
Return the "login name" of the user.
This function checks the environment variables :envvar:`LOGNAME`,
:envvar:`USER`, :envvar:`LNAME` and :envvar:`USERNAME`, in order, and returns
the value of the first one which is set to a non-empty string. If none are set,
the login name from the password database is returned on systems which support
the :mod:`pwd` module, otherwise, an exception is raised.
:envvar:`USER`, :envvar:`LNAME` and :envvar:`USERNAME`, in order, and
returns the value of the first one which is set to a non-empty string. If
none are set, the login name from the password database is returned on
systems which support the :mod:`pwd` module, otherwise, an exception is
raised.
In general, this function should be preferred over :func:`os.getlogin()`.

View file

@ -482,7 +482,7 @@ Keyed hashing
Keyed hashing can be used for authentication as a faster and simpler
replacement for `Hash-based message authentication code
<http://en.wikipedia.org/wiki/Hash-based_message_authentication_code>`_ (HMAC).
<https://en.wikipedia.org/wiki/Hash-based_message_authentication_code>`_ (HMAC).
BLAKE2 can be securely used in prefix-MAC mode thanks to the
indifferentiability property inherited from BLAKE.
@ -562,7 +562,7 @@ on the hash function used in digital signatures.
by the signer.
(`NIST SP-800-106 "Randomized Hashing for Digital Signatures"
<http://csrc.nist.gov/publications/nistpubs/800-106/NIST-SP-800-106.pdf>`_)
<https://csrc.nist.gov/publications/detail/sp/800-106/final>`_)
In BLAKE2 the salt is processed as a one-time input to the hash function during
initialization, rather than as an input to each compression function.
@ -699,7 +699,7 @@ implementation, extension code, and this documentation:
You should have received a copy of the CC0 Public Domain Dedication along
with this software. If not, see
http://creativecommons.org/publicdomain/zero/1.0/.
https://creativecommons.org/publicdomain/zero/1.0/.
The following people have helped with development or contributed their changes
to the project and the public domain according to the Creative Commons Public
@ -728,7 +728,7 @@ Domain Dedication 1.0 Universal:
https://blake2.net
Official BLAKE2 website.
http://csrc.nist.gov/publications/fips/fips180-2/fips180-2.pdf
https://csrc.nist.gov/csrc/media/publications/fips/180/2/archive/2002-08-01/documents/fips180-2.pdf
The FIPS 180-2 publication on Secure Hash Algorithms.
https://en.wikipedia.org/wiki/Cryptographic_hash_function#Cryptographic_hash_algorithms

View file

@ -187,6 +187,17 @@ a tie-breaker so that two tasks with the same priority are returned in the order
they were added. And since no two entry counts are the same, the tuple
comparison will never attempt to directly compare two tasks.
Another solution to the problem of non-comparable tasks is to create a wrapper
class that ignores the task item and only compares the priority field::
from dataclasses import dataclass, field
from typing import Any
@dataclass(order=True)
class PrioritizedItem:
priority: int
item: Any=field(compare=False)
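Continuing the sketch, such wrappers order correctly on a heap even when the
wrapped items themselves are not comparable::

    import heapq

    queue = []
    heapq.heappush(queue, PrioritizedItem(2, {'task': 'write docs'}))
    heapq.heappush(queue, PrioritizedItem(1, {'task': 'fix bug'}))

    # Dicts are not orderable, but only the priority field is compared.
    heapq.heappop(queue).item        # -> {'task': 'fix bug'}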
The remaining challenges revolve around finding a pending task and making
changes to its priority or removing it entirely. Finding a task can be done
with a dictionary pointing to an entry in the queue.

View file

@ -31,6 +31,21 @@ This module implements the HMAC algorithm as described by :rfc:`2104`.
MD5 as implicit default digest for *digestmod* is deprecated.
.. function:: digest(key, msg, digest)
Return digest of *msg* for given secret *key* and *digest*. The
function is equivalent to ``HMAC(key, msg, digest).digest()``, but
uses an optimized C or inline implementation, which is faster for messages
that fit into memory. The parameters *key*, *msg*, and *digest* have
the same meaning as in :func:`~hmac.new`.
CPython implementation detail: the optimized C implementation is only used
when *digest* is a string naming a digest algorithm that is supported by
OpenSSL.
.. versionadded:: 3.7
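A small sketch showing that the one-shot form matches the incremental API::

    import hmac

    key, msg = b'secret-key', b'message'
    one_shot = hmac.digest(key, msg, 'sha256')
    incremental = hmac.new(key, msg, 'sha256').digest()
    assert hmac.compare_digest(one_shot, incremental)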
An HMAC object has the following methods:
.. method:: HMAC.update(msg)

View file

@ -67,6 +67,9 @@ generically as an :term:`importer`) to participate in the import process.
:pep:`489`
Multi-phase extension module initialization
:pep:`552`
Deterministic pycs
:pep:`3120`
Using UTF-8 as the Default Source Encoding
@ -366,6 +369,13 @@ ABC hierarchy::
An abstract base class for a :term:`loader`.
See :pep:`302` for the exact definition for a loader.
For loaders that wish to support resource reading, they should
implement a ``get_resource_reader(fullname)`` method as specified
by :class:`importlib.abc.ResourceReader`.
.. versionchanged:: 3.7
Introduced the optional ``get_resource_reader()`` method.
.. method:: create_module(spec)
A method that returns the module object to use when
@ -465,12 +475,88 @@ ABC hierarchy::
The import machinery now takes care of this automatically.
.. class:: ResourceReader
An :term:`abstract base class` to provide the ability to read
*resources*.
From the perspective of this ABC, a *resource* is a binary
artifact that is shipped within a package. Typically this is
something like a data file that lives next to the ``__init__.py``
file of the package. The purpose of this class is to help abstract
out the accessing of such data files so that it does not matter if
the package and its data file(s) are stored in, e.g., a zip file
versus on the file system.
For any of the methods of this class, a *resource* argument is
expected to be a :term:`path-like object` which represents
conceptually just a file name. This means that no subdirectory
paths should be included in the *resource* argument. This is
because the location of the package that the reader is for acts as the
"directory". Hence the metaphor for directories and file
names is packages and resources, respectively. This is also why
instances of this class are expected to directly correlate to
a specific package (instead of potentially representing multiple
packages or a module).
Loaders that wish to support resource reading are expected to
provide a method called ``get_resource_reader(fullname)`` which
returns an object implementing this ABC's interface. If the module
specified by fullname is not a package, this method should return
:const:`None`. An object compatible with this ABC should only be
returned when the specified module is a package.
.. versionadded:: 3.7
.. abstractmethod:: open_resource(resource)
Returns an opened, :term:`file-like object` for binary reading
of the *resource*.
If the resource cannot be found, :exc:`FileNotFoundError` is
raised.
.. abstractmethod:: resource_path(resource)
Returns the file system path to the *resource*.
If the resource does not concretely exist on the file system,
raise :exc:`FileNotFoundError`.
.. abstractmethod:: is_resource(name)
Returns ``True`` if *name* is considered a resource.
:exc:`FileNotFoundError` is raised if *name* does not exist.
.. abstractmethod:: contents()
Returns an :term:`iterator` of strings over the contents of
the package. Do note that it is not required that all names
returned by the iterator be actual resources, e.g. it is
acceptable to return names for which :meth:`is_resource` would
be false.
Allowing non-resource names to be returned is to allow for
situations where how a package and its resources are stored
are known a priori and the non-resource names would be useful.
For instance, returning subdirectory names is allowed so that
when it is known that the package and resources are stored on
the file system then those subdirectory names can be used
directly.
The abstract method returns an iterator of no items.
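The following is a hypothetical, minimal sketch of a reader that serves
resources from a single directory; it is not part of the standard library and
only illustrates the interface described above::

    import pathlib
    from importlib import abc

    class DirectoryResourceReader(abc.ResourceReader):
        """Hypothetical reader backed by one file system directory."""

        def __init__(self, directory):
            self._dir = pathlib.Path(directory)

        def open_resource(self, resource):
            # open() raises FileNotFoundError if the file is missing.
            return (self._dir / resource).open('rb')

        def resource_path(self, resource):
            path = self._dir / resource
            if not path.is_file():
                raise FileNotFoundError(resource)
            return str(path)

        def is_resource(self, name):
            path = self._dir / name
            if not path.exists():
                raise FileNotFoundError(name)
            return path.is_file()

        def contents(self):
            return iter(entry.name for entry in self._dir.iterdir())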
.. class:: ResourceLoader
An abstract base class for a :term:`loader` which implements the optional
:pep:`302` protocol for loading arbitrary resources from the storage
back-end.
.. deprecated:: 3.7
This ABC is deprecated in favour of supporting resource loading
through :class:`importlib.abc.ResourceReader`.
.. abstractmethod:: get_data(path)
An abstract method to return the bytes for the data located at *path*.
@ -706,6 +792,131 @@ ABC hierarchy::
itself does not end in ``__init__``.
:mod:`importlib.resources` -- Resources
---------------------------------------
.. module:: importlib.resources
:synopsis: Package resource reading, opening, and access
**Source code:** :source:`Lib/importlib/resources.py`
--------------
.. versionadded:: 3.7
This module leverages Python's import system to provide access to *resources*
within *packages*. If you can import a package, you can access resources
within that package. Resources can be opened or read, in either binary or
text mode.
Resources are roughly akin to files inside directories, though it's important
to keep in mind that this is just a metaphor. Resources and packages **do
not** have to exist as physical files and directories on the file system.
Loaders can support resources by implementing the :class:`ResourceReader`
abstract base class.
The following types are defined.
.. data:: Package
The ``Package`` type is defined as ``Union[str, ModuleType]``. This means
that where the function describes accepting a ``Package``, you can pass in
either a string or a module. Module objects must have a resolvable
``__spec__.submodule_search_locations`` that is not ``None``.
.. data:: Resource
This type describes the resource names passed into the various functions
in this package. This is defined as ``Union[str, os.PathLike]``.
The following functions are available.
.. function:: open_binary(package, resource)
Open for binary reading the *resource* within *package*.
*package* is either a name or a module object which conforms to the
``Package`` requirements. *resource* is the name of the resource to open
within *package*; it may not contain path separators and it may not have
sub-resources (i.e. it cannot be a directory). This function returns a
``typing.BinaryIO`` instance, a binary I/O stream open for reading.
.. function:: open_text(package, resource, encoding='utf-8', errors='strict')
Open for text reading the *resource* within *package*. By default, the
resource is opened for reading as UTF-8.
*package* is either a name or a module object which conforms to the
``Package`` requirements. *resource* is the name of the resource to open
within *package*; it may not contain path separators and it may not have
sub-resources (i.e. it cannot be a directory). *encoding* and *errors*
have the same meaning as with built-in :func:`open`.
This function returns a ``typing.TextIO`` instance, a text I/O stream open
for reading.
.. function:: read_binary(package, resource)
Read and return the contents of the *resource* within *package* as
``bytes``.
*package* is either a name or a module object which conforms to the
``Package`` requirements. *resource* is the name of the resource to open
within *package*; it may not contain path separators and it may not have
sub-resources (i.e. it cannot be a directory). This function returns the
contents of the resource as :class:`bytes`.
.. function:: read_text(package, resource, encoding='utf-8', errors='strict')
Read and return the contents of *resource* within *package* as a ``str``.
By default, the contents are read as strict UTF-8.
*package* is either a name or a module object which conforms to the
``Package`` requirements. *resource* is the name of the resource to open
within *package*; it may not contain path separators and it may not have
sub-resources (i.e. it cannot be a directory). *encoding* and *errors*
have the same meaning as with built-in :func:`open`. This function
returns the contents of the resource as :class:`str`.
.. function:: path(package, resource)
Return the path to the *resource* as an actual file system path. This
function returns a context manager for use in a :keyword:`with` statement.
The context manager provides a :class:`pathlib.Path` object.
Exiting the context manager cleans up any temporary file created when the
resource needs to be extracted from e.g. a zip file.
*package* is either a name or a module object which conforms to the
``Package`` requirements. *resource* is the name of the resource to open
within *package*; it may not contain path separators and it may not have
sub-resources (i.e. it cannot be a directory).
.. function:: is_resource(package, name)
Return ``True`` if there is a resource named *name* in the package,
otherwise ``False``. Remember that directories are *not* resources!
*package* is either a name or a module object which conforms to the
``Package`` requirements.
.. function:: contents(package)
Return an iterator over the named items within the package. The iterator
returns :class:`str` resources (e.g. files) and non-resources
(e.g. directories). The iterator does not recurse into subdirectories.
*package* is either a name or a module object which conforms to the
``Package`` requirements.
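A hedged usage sketch; ``mypkg`` and ``config.json`` are hypothetical names
for an importable package and a data file shipped inside it::

    from importlib import resources

    text = resources.read_text('mypkg', 'config.json')
    raw = resources.read_binary('mypkg', 'config.json')

    # path() may extract the resource to a temporary file (e.g. from a
    # zip archive) and cleans it up when the with block exits.
    with resources.path('mypkg', 'config.json') as cfg_path:
        print(cfg_path)

    print(resources.is_resource('mypkg', 'config.json'))
    print(sorted(resources.contents('mypkg')))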
:mod:`importlib.machinery` -- Importers and path hooks
------------------------------------------------------
@ -1080,7 +1291,7 @@ find and load modules.
Name of the place from which the module is loaded, e.g. "builtin" for
built-in modules and the filename for modules loaded from source.
Normally "origin" should be set, but it may be ``None`` (the default)
which indicates it is unspecified.
which indicates it is unspecified (e.g. for namespace packages).
.. attribute:: submodule_search_locations
@ -1327,6 +1538,14 @@ an :term:`importer`.
.. versionchanged:: 3.6
Accepts a :term:`path-like object`.
.. function:: source_hash(source_bytes)
Return the hash of *source_bytes* as bytes. A hash-based ``.pyc`` file embeds
the :func:`source_hash` of the corresponding source file's contents in its
header.
.. versionadded:: 3.7
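For instance (a small sketch; the source bytes are arbitrary)::

    >>> import importlib.util
    >>> h = importlib.util.source_hash(b'answer = 42\n')
    >>> len(h)    # an 8-byte hash, matching the field in the pyc header
    8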
.. class:: LazyLoader(loader)
A class which postpones the execution of the loader of a module until the

View file

@ -34,6 +34,9 @@ provided as convenient choices for the second argument to :func:`getmembers`.
They also help you determine when you can expect to find the following special
attributes:
.. this function name is too big to fit in the ascii-art table below
.. |coroutine-origin-link| replace:: :func:`sys.set_coroutine_origin_tracking_depth`
+-----------+-------------------+---------------------------+
| Type | Attribute | Description |
+===========+===================+===========================+
@ -215,6 +218,10 @@ attributes:
+-----------+-------------------+---------------------------+
| | cr_code | code |
+-----------+-------------------+---------------------------+
| | cr_origin | where coroutine was |
| | | created, or ``None``. See |
| | | |coroutine-origin-link| |
+-----------+-------------------+---------------------------+
| builtin | __doc__ | documentation string |
+-----------+-------------------+---------------------------+
| | __name__ | original name of this |
@ -234,6 +241,9 @@ attributes:
The ``__name__`` attribute of generators is now set from the function
name, instead of the code name, and it can now be modified.
.. versionchanged:: 3.7
Add ``cr_origin`` attribute to coroutines.
.. function:: getmembers(object[, predicate])
@ -592,7 +602,13 @@ function.
.. attribute:: Signature.parameters
An ordered mapping of parameters' names to the corresponding
:class:`Parameter` objects.
:class:`Parameter` objects. Parameters appear in strict definition
order, including keyword-only parameters.
.. versionchanged:: 3.7
Python only explicitly guaranteed that it preserved the declaration
order of keyword-only parameters as of version 3.7, although in practice
this order had always been preserved in Python 3.
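For example, a quick check of the ordering guarantee (sketch)::

    >>> import inspect
    >>> def f(a, b, *, kw1, kw2=0):
    ...     pass
    ...
    >>> list(inspect.signature(f).parameters)
    ['a', 'b', 'kw1', 'kw2']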
.. attribute:: Signature.return_annotation
@ -885,7 +901,7 @@ Classes and functions
*defaults* is an *n*-tuple of default argument values corresponding to the
last *n* positional parameters, or ``None`` if there are no such defaults
defined.
*kwonlyargs* is a list of keyword-only parameter names.
*kwonlyargs* is a list of keyword-only parameter names in declaration order.
*kwonlydefaults* is a dictionary mapping parameter names from *kwonlyargs*
to the default values used if no argument is supplied.
*annotations* is a dictionary mapping parameter names to annotations.
@ -911,6 +927,11 @@ Classes and functions
single-source Python 2/3 code migrating away from the legacy
:func:`getargspec` API.
.. versionchanged:: 3.7
Python only explicitly guaranteed that it preserved the declaration
order of keyword-only parameters as of version 3.7, although in practice
this order had always been preserved in Python 3.
.. function:: getargvalues(frame)

View file

@ -205,8 +205,8 @@ ABC Inherits Stub Methods Mixin M
``writable``, and ``writelines``
:class:`RawIOBase` :class:`IOBase` ``readinto`` and Inherited :class:`IOBase` methods, ``read``,
``write`` and ``readall``
:class:`BufferedIOBase` :class:`IOBase` ``detach``, ``read``, Inherited :class:`IOBase` methods, ``readinto``
``read1``, and ``write``
:class:`BufferedIOBase` :class:`IOBase` ``detach``, ``read``, Inherited :class:`IOBase` methods, ``readinto``,
``read1``, and ``write`` and ``readinto1``
:class:`TextIOBase` :class:`IOBase` ``detach``, ``read``, Inherited :class:`IOBase` methods, ``encoding``,
``readline``, and ``errors``, and ``newlines``
``write``
@ -385,14 +385,17 @@ I/O Base Classes
.. method:: read(size=-1)
Read up to *size* bytes from the object and return them. As a convenience,
if *size* is unspecified or -1, :meth:`readall` is called. Otherwise,
only one system call is ever made. Fewer than *size* bytes may be
returned if the operating system call returns fewer than *size* bytes.
if *size* is unspecified or -1, all bytes until EOF are returned.
Otherwise, only one system call is ever made. Fewer than *size* bytes may
be returned if the operating system call returns fewer than *size* bytes.
If 0 bytes are returned, and *size* was not 0, this indicates end of file.
If the object is in non-blocking mode and no bytes are available,
``None`` is returned.
The default implementation defers to :meth:`readall` and
:meth:`readinto`.
.. method:: readall()
Read and return all the bytes from the stream until EOF, using multiple
@ -901,7 +904,7 @@ Text I/O
locale encoding using :func:`locale.setlocale`, use the current locale
encoding instead of the user preferred encoding.
:class:`TextIOWrapper` provides one attribute in addition to those of
:class:`TextIOWrapper` provides these members in addition to those of
:class:`TextIOBase` and its parents:
.. attribute:: line_buffering
@ -915,11 +918,19 @@ Text I/O
.. versionadded:: 3.7
.. method:: reconfigure(*, line_buffering=None, write_through=None)
.. method:: reconfigure(*[, encoding][, errors][, newline][, \
line_buffering][, write_through])
Reconfigure this text stream using new settings for *line_buffering*
and *write_through*. Passing ``None`` as an argument will retain
the current setting for that parameter.
Reconfigure this text stream using new settings for *encoding*,
*errors*, *newline*, *line_buffering* and *write_through*.
Parameters not specified keep their current settings, except that
``errors='strict'`` is used when *encoding* is specified but
*errors* is not.
It is not possible to change the encoding or newline if some data
has already been read from the stream. On the other hand, changing
encoding after write is possible.
This method does an implicit stream flush before setting the
new parameters.
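A minimal sketch of re-encoding a freshly created text stream before any data
has been read (assumed to run on Python 3.7 or later)::

    import io

    buffer = io.BytesIO()
    stream = io.TextIOWrapper(buffer, encoding='ascii')

    # Nothing has been read yet, so the encoding may still be changed.
    stream.reconfigure(encoding='utf-8', errors='strict')
    stream.write('café')
    stream.flush()
    print(buffer.getvalue())    # b'caf\xc3\xa9'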

View file

@ -32,7 +32,7 @@ operator can be mapped across two vectors to form an efficient dot-product:
``sum(map(operator.mul, vector1, vector2))``.
**Infinite Iterators:**
**Infinite iterators:**
================== ================= ================================================= =========================================
Iterator Arguments Results Example
@ -61,7 +61,7 @@ Iterator Arguments Results
:func:`zip_longest` p, q, ... (p[0], q[0]), (p[1], q[1]), ... ``zip_longest('ABCD', 'xy', fillvalue='-') --> Ax By C- D-``
============================ ============================ ================================================= =============================================================
**Combinatoric generators:**
**Combinatoric iterators:**
============================================== ==================== =============================================================
Iterator Arguments Results
@ -753,15 +753,16 @@ which incur interpreter overhead.
def roundrobin(*iterables):
"roundrobin('ABC', 'D', 'EF') --> A D E B F C"
# Recipe credited to George Sakkis
pending = len(iterables)
num_active = len(iterables)
nexts = cycle(iter(it).__next__ for it in iterables)
while pending:
while num_active:
try:
for next in nexts:
yield next()
except StopIteration:
pending -= 1
nexts = cycle(islice(nexts, pending))
# Remove the iterator we just exhausted from the cycle.
num_active -= 1
nexts = cycle(islice(nexts, num_active))
def partition(pred, iterable):
'Use a predicate to partition entries into false entries and true entries'
@ -858,6 +859,29 @@ which incur interpreter overhead.
indices = sorted(random.randrange(n) for i in range(r))
return tuple(pool[i] for i in indices)
def nth_combination(iterable, r, index):
'Equivalent to list(combinations(iterable, r))[index]'
pool = tuple(iterable)
n = len(pool)
if r < 0 or r > n:
raise ValueError
c = 1
k = min(r, n-r)
for i in range(1, k+1):
c = c * (n - k + i) // i
if index < 0:
index += c
if index < 0 or index >= c:
raise IndexError
result = []
while r:
c, n, r = c*r//n, n-1, r-1
while index >= c:
index -= c
c, n = c*(n-r)//n, n-1
result.append(pool[-1-n])
return tuple(result)
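A quick sanity check for the ``nth_combination()`` recipe above (a doctest
sketch, assuming the recipe has been defined)::

    >>> from itertools import combinations
    >>> list(combinations('ABCD', 2))[3]
    ('B', 'C')
    >>> nth_combination('ABCD', 2, 3)
    ('B', 'C')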
Note, many of the above recipes can be optimized by replacing global lookups
with local variables defined as default values. For example, the
*dotproduct* recipe can be written as::

View file

@ -147,6 +147,16 @@ The :mod:`locale` module defines the following exception and functions:
| ``CHAR_MAX`` | Nothing is specified in this locale. |
+--------------+-----------------------------------------+
The function temporarily sets the ``LC_CTYPE`` locale to the ``LC_NUMERIC``
locale to decode ``decimal_point`` and ``thousands_sep`` byte strings if
they are non-ASCII or longer than 1 byte, and if the ``LC_NUMERIC`` locale is
different from the ``LC_CTYPE`` locale. This temporary change affects other
threads.
.. versionchanged:: 3.7
The function now temporarily sets the ``LC_CTYPE`` locale to the
``LC_NUMERIC`` locale in some cases.
.. function:: nl_langinfo(option)
@ -316,6 +326,13 @@ The :mod:`locale` module defines the following exception and functions:
preferences, so this function is not thread-safe. If invoking setlocale is not
necessary or desired, *do_setlocale* should be set to ``False``.
On Android, or when the UTF-8 mode is enabled (:option:`-X` ``utf8`` option),
this function always returns ``'UTF-8'``; the locale and the *do_setlocale*
argument are ignored.
.. versionchanged:: 3.7
The function now always returns ``UTF-8`` on Android or if the UTF-8 mode
is enabled.
.. function:: normalize(localename)

View file

@ -91,12 +91,12 @@ is the module's name in the Python package namespace.
scenario is to attach handlers only to the root logger, and to let
propagation take care of the rest.
.. method:: Logger.setLevel(lvl)
.. method:: Logger.setLevel(level)
Sets the threshold for this logger to *lvl*. Logging messages which are less
severe than *lvl* will be ignored; logging messages which have severity *lvl*
Sets the threshold for this logger to *level*. Logging messages which are less
severe than *level* will be ignored; logging messages which have severity *level*
or higher will be emitted by whichever handler or handlers service this logger,
unless a handler's level has been set to a higher severity level than *lvl*.
unless a handler's level has been set to a higher severity level than *level*.
When a logger is created, the level is set to :const:`NOTSET` (which causes
all messages to be processed when the logger is the root logger, or delegation
@ -117,7 +117,7 @@ is the module's name in the Python package namespace.
See :ref:`levels` for a list of levels.
.. versionchanged:: 3.2
The *lvl* parameter now accepts a string representation of the
The *level* parameter now accepts a string representation of the
level such as 'INFO' as an alternative to the integer constants
such as :const:`INFO`. Note, however, that levels are internally stored
as integers, and methods such as e.g. :meth:`getEffectiveLevel` and
@ -267,14 +267,14 @@ is the module's name in the Python package namespace.
message. This method should only be called from an exception handler.
.. method:: Logger.addFilter(filt)
.. method:: Logger.addFilter(filter)
Adds the specified filter *filt* to this logger.
Adds the specified filter *filter* to this logger.
.. method:: Logger.removeFilter(filt)
.. method:: Logger.removeFilter(filter)
Removes the specified filter *filt* from this logger.
Removes the specified filter *filter* from this logger.
.. method:: Logger.filter(record)
@ -393,33 +393,34 @@ subclasses. However, the :meth:`__init__` method in subclasses needs to call
Releases the thread lock acquired with :meth:`acquire`.
.. method:: Handler.setLevel(lvl)
.. method:: Handler.setLevel(level)
Sets the threshold for this handler to *lvl*. Logging messages which are less
severe than *lvl* will be ignored. When a handler is created, the level is set
to :const:`NOTSET` (which causes all messages to be processed).
Sets the threshold for this handler to *level*. Logging messages which are
less severe than *level* will be ignored. When a handler is created, the
level is set to :const:`NOTSET` (which causes all messages to be
processed).
See :ref:`levels` for a list of levels.
.. versionchanged:: 3.2
The *lvl* parameter now accepts a string representation of the
The *level* parameter now accepts a string representation of the
level such as 'INFO' as an alternative to the integer constants
such as :const:`INFO`.
.. method:: Handler.setFormatter(form)
.. method:: Handler.setFormatter(fmt)
Sets the :class:`Formatter` for this handler to *form*.
Sets the :class:`Formatter` for this handler to *fmt*.
.. method:: Handler.addFilter(filt)
.. method:: Handler.addFilter(filter)
Adds the specified filter *filt* to this handler.
Adds the specified filter *filter* to this handler.
.. method:: Handler.removeFilter(filt)
.. method:: Handler.removeFilter(filter)
Removes the specified filter *filt* from this handler.
Removes the specified filter *filter* from this handler.
.. method:: Handler.filter(record)

View file

@ -491,7 +491,7 @@ Supported mailbox formats are Maildir, mbox, MH, Babyl, and MMDF.
`Configuring Netscape Mail on Unix: Why The Content-Length Format is Bad <https://www.jwz.org/doc/content-length.html>`_
An argument for using the original mbox format rather than a variation.
`"mbox" is a family of several mutually incompatible mailbox formats <http://homepage.ntlworld.com/jonathan.deboynepollard/FGA/mail-mbox-formats.html>`_
`"mbox" is a family of several mutually incompatible mailbox formats <https://www.loc.gov/preservation/digital/formats/fdd/fdd000383.shtml>`_
A history of mbox variations.
@ -620,7 +620,7 @@ Supported mailbox formats are Maildir, mbox, MH, Babyl, and MMDF.
`nmh - Message Handling System <http://www.nongnu.org/nmh/>`_
Home page of :program:`nmh`, an updated version of the original :program:`mh`.
`MH & nmh: Email for Users & Programmers <http://rand-mh.sourceforge.net/book/>`_
`MH & nmh: Email for Users & Programmers <https://rand-mh.sourceforge.io/book/>`_
A GPL-licensed book on :program:`mh` and :program:`nmh`, with some information
on the mailbox format.

View file

@ -461,7 +461,7 @@ Constants
Tau is a circle constant equal to 2\ *π*, the ratio of a circle's circumference to
its radius. To learn more about Tau, check out Vi Hart's video `Pi is (still)
Wrong <https://www.youtube.com/watch?v=jG7vhMMXagQ>`_, and start celebrating
`Tau day <http://tauday.com/>`_ by eating twice as much pie!
`Tau day <https://tauday.com/>`_ by eating twice as much pie!
.. versionadded:: 3.6

View file

@ -1837,8 +1837,8 @@ Running the following commands creates a server for a single shared queue which
remote clients can access::
>>> from multiprocessing.managers import BaseManager
>>> import queue
>>> queue = queue.Queue()
>>> from queue import Queue
>>> queue = Queue()
>>> class QueueManager(BaseManager): pass
>>> QueueManager.register('get_queue', callable=lambda:queue)
>>> m = QueueManager(address=('', 50000), authkey=b'abracadabra')

View file

@ -20,8 +20,10 @@ the Unix :program:`ftp` program and other FTP clients.
A :class:`~netrc.netrc` instance or subclass instance encapsulates data from a netrc
file. The initialization argument, if present, specifies the file to parse. If
no argument is given, the file :file:`.netrc` in the user's home directory will
be read. Parse errors will raise :exc:`NetrcParseError` with diagnostic
no argument is given, the file :file:`.netrc` in the user's home directory --
as determined by :func:`os.path.expanduser` -- will be read. If that file
cannot be found, a :exc:`FileNotFoundError` exception will be raised.
Parse errors will raise :exc:`NetrcParseError` with diagnostic
information including the file name, line number, and terminating token.
If no argument is specified on a POSIX system, the presence of passwords in
the :file:`.netrc` file will raise a :exc:`NetrcParseError` if the file
@ -32,6 +34,10 @@ the Unix :program:`ftp` program and other FTP clients.
.. versionchanged:: 3.4 Added the POSIX permission check.
.. versionchanged:: 3.7
:func:`os.path.expanduser` is used to find the location of the
:file:`.netrc` file when *file* is not passed as argument.
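A hedged usage sketch (``example.com`` is a hypothetical machine entry; with no argument the constructor reads :file:`~/.netrc`, located via :func:`os.path.expanduser`, and raises :exc:`FileNotFoundError` if the file is absent)::

    import netrc

    creds = netrc.netrc()                       # parses ~/.netrc
    auth = creds.authenticators('example.com')  # hypothetical host entry
    if auth is not None:
        login, account, password = auth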
.. exception:: NetrcParseError
@ -82,4 +88,3 @@ Instances of :class:`~netrc.netrc` have public instance variables:
punctuation is allowed in passwords, however, note that whitespace and
non-printable characters are not allowed in passwords. This is a limitation
of the way the .netrc file is parsed and may be removed in the future.

View file

@ -567,7 +567,7 @@ An option group is obtained using the class :class:`OptionGroup`:
where
* parser is the :class:`OptionParser` instance the group will be insterted in
* parser is the :class:`OptionParser` instance the group will be inserted in
to
* title is the group title
* description, optional, is a long description of the group

View file

@ -240,8 +240,9 @@ the :mod:`glob` module.)
.. function:: isfile(path)
Return ``True`` if *path* is an existing regular file. This follows symbolic
links, so both :func:`islink` and :func:`isfile` can be true for the same path.
Return ``True`` if *path* is an :func:`existing <exists>` regular file.
This follows symbolic links, so both :func:`islink` and :func:`isfile` can
be true for the same path.
.. versionchanged:: 3.6
Accepts a :term:`path-like object`.
@ -249,8 +250,9 @@ the :mod:`glob` module.)
.. function:: isdir(path)
Return ``True`` if *path* is an existing directory. This follows symbolic
links, so both :func:`islink` and :func:`isdir` can be true for the same path.
Return ``True`` if *path* is an :func:`existing <exists>` directory. This
follows symbolic links, so both :func:`islink` and :func:`isdir` can be true
for the same path.
.. versionchanged:: 3.6
Accepts a :term:`path-like object`.
@ -258,8 +260,9 @@ the :mod:`glob` module.)
.. function:: islink(path)
Return ``True`` if *path* refers to a directory entry that is a symbolic link.
Always ``False`` if symbolic links are not supported by the Python runtime.
Return ``True`` if *path* refers to an :func:`existing <exists>` directory
entry that is a symbolic link. Always ``False`` if symbolic links are not
supported by the Python runtime.
.. versionchanged:: 3.6
Accepts a :term:`path-like object`.
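A small illustrative sketch of the wording above (assumes a platform where :func:`os.symlink` is permitted; the file names are arbitrary)::

    import os, tempfile

    with tempfile.TemporaryDirectory() as d:
        target = os.path.join(d, 'data.txt')
        link = os.path.join(d, 'alias.txt')
        open(target, 'w').close()
        os.symlink(target, link)
        # The link resolves to a regular file, so both checks are true.
        assert os.path.isfile(link) and os.path.islink(link)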

View file

@ -325,10 +325,11 @@ process and user.
.. function:: getlogin()
Return the name of the user logged in on the controlling terminal of the
process. For most purposes, it is more useful to use the environment
variables :envvar:`LOGNAME` or :envvar:`USERNAME` to find out who the user
is, or ``pwd.getpwuid(os.getuid())[0]`` to get the login name of the current
real user id.
process. For most purposes, it is more useful to use
:func:`getpass.getuser` since the latter checks the environment variables
:envvar:`LOGNAME` or :envvar:`USERNAME` to find out who the user is, and
falls back to ``pwd.getpwuid(os.getuid())[0]`` to get the login name of the
current real user id.
Availability: Unix, Windows.
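For example, a short sketch of the recommended alternative::

    import getpass

    # getpass.getuser() consults the LOGNAME, USER, LNAME and USERNAME
    # environment variables and falls back to the password database.
    print(getpass.getuser())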
@ -735,13 +736,17 @@ as internal buffering of data.
.. function:: dup2(fd, fd2, inheritable=True)
Duplicate file descriptor *fd* to *fd2*, closing the latter first if necessary.
The file descriptor *fd2* is :ref:`inheritable <fd_inheritance>` by default,
or non-inheritable if *inheritable* is ``False``.
Duplicate file descriptor *fd* to *fd2*, closing the latter first if
necessary. Return *fd2*. The new file descriptor is :ref:`inheritable
<fd_inheritance>` by default or non-inheritable if *inheritable*
is ``False``.
.. versionchanged:: 3.4
Add the optional *inheritable* parameter.
.. versionchanged:: 3.7
Return *fd2* on success. Previously, ``None`` was always returned.
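A minimal sketch of the 3.7 return-value change (the target descriptor number is arbitrary)::

    import os, tempfile

    with tempfile.TemporaryFile() as f:
        fd2 = os.dup2(f.fileno(), 10)   # 10 is closed first if already open
        assert fd2 == 10                # previously dup2() returned None
        os.close(fd2)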
.. function:: fchmod(fd, mode)
@ -1097,6 +1102,45 @@ or `the MSDN <https://msdn.microsoft.com/en-us/library/z0kc8e3z.aspx>`_ on Windo
.. versionadded:: 3.3
.. function:: pwritev(fd, buffers, offset, flags=0)
Combines the functionality of :func:`os.writev` and :func:`os.pwrite`. It
writes the contents of *buffers* to file descriptor *fd* at offset *offset*.
*buffers* must be a sequence of :term:`bytes-like objects <bytes-like object>`.
Buffers are processed in array order: the entire contents of the first buffer
are written before proceeding to the second, and so on. The operating system
may set a limit (the ``sysconf()`` value ``SC_IOV_MAX``) on the number of
buffers that can be used.
:func:`~os.pwritev` writes the contents of each object to the file descriptor
and returns the total number of bytes written.
The *flags* argument contains a bitwise OR of zero or more of the following
flags:
- RWF_DSYNC
- RWF_SYNC
Using non-zero flags requires Linux 4.7 or newer.
Availability: Linux (version 2.6.30), FreeBSD 6.0 and newer,
OpenBSD (version 2.7 and newer).
.. versionadded:: 3.7
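A hedged sketch (only where :func:`os.pwritev` is available, e.g. Linux; ``scratch.bin`` is a hypothetical file)::

    import os

    fd = os.open('scratch.bin', os.O_RDWR | os.O_CREAT)
    try:
        # Both buffers are written contiguously starting at offset 0.
        written = os.pwritev(fd, [b'hello ', b'world'], 0)
        assert written == 11
    finally:
        os.close(fd)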
.. data:: RWF_DSYNC (since Linux 4.7)
Provide a per-write equivalent of the O_DSYNC open(2) flag. This flag
is meaningful only for pwritev2(), and its effect applies only to the
data range written by the system call.
.. versionadded:: 3.7
.. data:: RWF_SYNC (since Linux 4.7)
Provide a per-write equivalent of the O_SYNC open(2) flag. This flag is
meaningful only for pwritev2(), and its effect applies only to the data
range written by the system call.
.. versionadded:: 3.7
.. function:: read(fd, n)
Read at most *n* bytes from file descriptor *fd*. Return a bytestring containing the
@ -1191,6 +1235,51 @@ or `the MSDN <https://msdn.microsoft.com/en-us/library/z0kc8e3z.aspx>`_ on Windo
.. versionadded:: 3.3
.. function:: preadv(fd, buffers, offset, flags=0)
Combines the functionality of :func:`os.readv` and :func:`os.pread`. It
reads from a file descriptor *fd* into a number of mutable :term:`bytes-like
objects <bytes-like object>` *buffers*. Like :func:`os.readv`, it will transfer
data into each buffer until it is full and then move on to the next buffer in
the sequence to hold the rest of the data. Its fourth argument, *offset*,
specifies the file offset at which the input operation is to be performed.
:func:`~os.preadv` returns the total number of bytes read (which can be less
than the total capacity of all the objects).
The *flags* argument contains a bitwise OR of zero or more of the following
flags:
- RWF_HIPRI
- RWF_NOWAIT
Using non-zero flags requires Linux 4.6 or newer.
Availability: Linux (version 2.6.30), FreeBSD 6.0 and newer,
OpenBSD (version 2.7 and newer).
.. versionadded:: 3.7
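A hedged, self-contained sketch (only where :func:`os.preadv` is available; ``scratch.bin`` is a hypothetical file)::

    import os

    fd = os.open('scratch.bin', os.O_RDWR | os.O_CREAT)
    try:
        os.write(fd, b'hello world')
        buf1, buf2 = bytearray(6), bytearray(5)
        # buf1 is filled completely before the rest spills into buf2.
        nread = os.preadv(fd, [buf1, buf2], 0)
        assert nread == 11 and bytes(buf2) == b'world'
    finally:
        os.close(fd)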
.. data:: RWF_HIPRI (since Linux 4.6)
High priority read/write. Allows block-based filesystems to use polling
of the device, which provides lower latency, but may use additional
resources. (Currently, this feature is usable only on a file descriptor
opened using the O_DIRECT flag.)
.. versionadded:: 3.7
.. data:: RWF_NOWAIT (since Linux 4.14)
Do not wait for data which is not immediately available. If this flag
is specified, the preadv2() system call will return instantly
if it would have to read data from the backing storage or wait for a lock.
If some data was successfully read, it will return the number of bytes
read. If no bytes were read, it will return -1 and set errno to EAGAIN.
Currently, this flag is meaningful only for preadv2().
.. versionadded:: 3.7
.. function:: tcgetpgrp(fd)
Return the process group associated with the terminal given by *fd* (an open
@ -2385,6 +2474,14 @@ features:
Time of file creation.
On Solaris and derivatives, the following attributes may also be
available:
.. attribute:: st_fstype
String that uniquely identifies the type of the filesystem that
contains the file.
On Mac OS systems, the following attributes may also be available:
.. attribute:: st_rsize
@ -2428,6 +2525,8 @@ features:
.. versionadded:: 3.5
Added the :attr:`st_file_attributes` member on Windows.
.. versionadded:: 3.7
Added the :attr:`st_fstype` member to Solaris/derivatives.
.. function:: statvfs(path)
@ -2436,7 +2535,7 @@ features:
correspond to the members of the :c:type:`statvfs` structure, namely:
:attr:`f_bsize`, :attr:`f_frsize`, :attr:`f_blocks`, :attr:`f_bfree`,
:attr:`f_bavail`, :attr:`f_files`, :attr:`f_ffree`, :attr:`f_favail`,
:attr:`f_flag`, :attr:`f_namemax`.
:attr:`f_flag`, :attr:`f_namemax`, :attr:`f_fsid`.
Two module-level constants are defined for the :attr:`f_flag` attribute's
bit-flags: if :const:`ST_RDONLY` is set, the filesystem is mounted
@ -2471,6 +2570,9 @@ features:
.. versionchanged:: 3.6
Accepts a :term:`path-like object`.
.. versionadded:: 3.7
Added :attr:`f_fsid`.
.. data:: supports_dir_fd

View file

@ -14,7 +14,7 @@ the standard audio interface for Linux and recent versions of FreeBSD.
.. Things will get more complicated for future Linux versions, since
ALSA is in the standard kernel as of 2.5.x. Presumably if you
use ALSA, you'll have to make sure its OSS compatibility layer
is active to use ossaudiodev, but you're gonna need it for the vast
is active to use ossaudiodev, but you're going to need it for the vast
majority of Linux audio apps anyway.
Sounds like things are also complicated for other BSDs. In response
@ -447,4 +447,3 @@ The remaining methods are specific to audio mixing:
microphone input::
mixer.setrecsrc (1 << ossaudiodev.SOUND_MIXER_MIC)

View file

@ -11,9 +11,9 @@ available for Python:
`PyGObject <https://wiki.gnome.org/Projects/PyGObject>`_
PyGObject provides introspection bindings for C libraries using
`GObject <https://developer.gnome.org/gobject/stable/>`_. One of
these libraries is the `GTK+ 3 <http://www.gtk.org/>`_ widget set.
these libraries is the `GTK+ 3 <https://www.gtk.org/>`_ widget set.
GTK+ comes with many more widgets than Tkinter provides. An online
`Python GTK+ 3 Tutorial <https://python-gtk-3-tutorial.readthedocs.org/en/latest/>`_
`Python GTK+ 3 Tutorial <https://python-gtk-3-tutorial.readthedocs.io/>`_
is available.
`PyGTK <http://www.pygtk.org/>`_
@ -35,7 +35,7 @@ available for Python:
Compared to PyQt, its licensing scheme is friendlier to non-open source
applications.
`wxPython <http://www.wxpython.org>`_
`wxPython <https://www.wxpython.org>`_
wxPython is a cross-platform GUI toolkit for Python that is built around
the popular `wxWidgets <https://www.wxwidgets.org/>`_ (formerly wxWindows)
C++ toolkit. It provides a native look and feel for applications on

View file

@ -61,6 +61,12 @@ useful than quitting the debugger upon program's exit.
:file:`pdb.py` now accepts a ``-c`` option that executes commands as if given
in a :file:`.pdbrc` file, see :ref:`debugger-commands`.
.. versionadded:: 3.7
:file:`pdb.py` now accepts a ``-m`` option that executes modules in a way
similar to ``python3 -m``. As with a script, the debugger will pause execution
just before the first line of the module.
The typical usage to break into the debugger from a running program is to
insert ::
@ -326,16 +332,19 @@ by the local file.
(com) end
(Pdb)
To remove all commands from a breakpoint, type commands and follow it
To remove all commands from a breakpoint, type ``commands`` and follow it
immediately with ``end``; that is, give no commands.
With no *bpnumber* argument, commands refers to the last breakpoint set.
With no *bpnumber* argument, ``commands`` refers to the last breakpoint set.
You can use breakpoint commands to start your program up again. Simply use
the continue command, or step, or any other command that resumes execution.
the :pdbcmd:`continue` command, or :pdbcmd:`step`,
or any other command that resumes execution.
Specifying any command resuming execution (currently continue, step, next,
return, jump, quit and their abbreviations) terminates the command list (as if
Specifying any command resuming execution
(currently :pdbcmd:`continue`, :pdbcmd:`step`, :pdbcmd:`next`,
:pdbcmd:`return`, :pdbcmd:`jump`, :pdbcmd:`quit` and their abbreviations)
terminates the command list (as if
that command was immediately followed by end). This is because any time you
resume execution (even with a simple next or step), you may encounter another
breakpoint—which could have its own command list, leading to ambiguities about

View file

@ -370,7 +370,7 @@ The :mod:`pickle` module exports two classes, :class:`Pickler` and
Python 2 names to the new names used in Python 3. The *encoding* and
*errors* tell pickle how to decode 8-bit string instances pickled by Python
2; these default to 'ASCII' and 'strict', respectively. The *encoding* can
be 'bytes' to read these ß8-bit string instances as bytes objects.
be 'bytes' to read these 8-bit string instances as bytes objects.
.. method:: load()

View file

@ -38,7 +38,7 @@ or :class:`datetime.datetime` objects.
.. seealso::
`PList manual page <https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man5/plist.5.html>`_
`PList manual page <https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/PropertyLists/>`_
Apple's documentation of the file format.

View file

@ -139,6 +139,7 @@ The :mod:`pstats` module's :class:`~pstats.Stats` class has a variety of methods
for manipulating and printing the data saved into a profile results file::
import pstats
from pstats import SortKey
p = pstats.Stats('restats')
p.strip_dirs().sort_stats(-1).print_stats()
@ -148,14 +149,14 @@ entries according to the standard module/line/name string that is printed. The
:meth:`~pstats.Stats.print_stats` method printed out all the statistics. You
might try the following sort calls::
p.sort_stats('name')
p.sort_stats(SortKey.NAME)
p.print_stats()
The first call will actually sort the list by function name, and the second call
will print out the statistics. The following are some interesting calls to
experiment with::
p.sort_stats('cumulative').print_stats(10)
p.sort_stats(SortKey.CUMULATIVE).print_stats(10)
This sorts the profile by cumulative time in a function, and then only prints
the ten most significant lines. If you want to understand what algorithms are
@ -164,20 +165,20 @@ taking time, the above line is what you would use.
If you were looking to see what functions were looping a lot, and taking a lot
of time, you would do::
p.sort_stats('time').print_stats(10)
p.sort_stats(SortKey.TIME).print_stats(10)
to sort according to time spent within each function, and then print the
statistics for the top ten functions.
You might also try::
p.sort_stats('file').print_stats('__init__')
p.sort_stats(SortKey.FILENAME).print_stats('__init__')
This will sort all the statistics by file name, and then print out statistics
for only the class init methods (since they are spelled with ``__init__`` in
them). As one final example, you could try::
p.sort_stats('time', 'cumulative').print_stats(.5, 'init')
p.sort_stats(SortKey.TIME, SortKey.CUMULATIVE).print_stats(.5, 'init')
This line sorts statistics with a primary key of time, and a secondary key of
cumulative time, and then prints out some of the statistics. To be specific, the
@ -250,12 +251,13 @@ functions:
without writing the profile data to a file::
import cProfile, pstats, io
from pstats import SortKey
pr = cProfile.Profile()
pr.enable()
# ... do something ...
pr.disable()
s = io.StringIO()
sortby = 'cumulative'
sortby = SortKey.CUMULATIVE
ps = pstats.Stats(pr, stream=s).sort_stats(sortby)
ps.print_stats()
print(s.getvalue())
@ -361,60 +363,65 @@ Analysis of the profiler data is done using the :class:`~pstats.Stats` class.
.. method:: sort_stats(*keys)
This method modifies the :class:`Stats` object by sorting it according to
the supplied criteria. The argument is typically a string identifying the
basis of a sort (example: ``'time'`` or ``'name'``).
the supplied criteria. The argument can be either a string or a SortKey
enum identifying the basis of a sort (example: ``'time'``, ``'name'``,
``SortKey.TIME`` or ``SortKey.NAME``). The SortKey enum arguments have the
advantage over the string arguments of being more robust and less
error prone.
When more than one key is provided, then additional keys are used as
secondary criteria when there is equality in all keys selected before
them. For example, ``sort_stats('name', 'file')`` will sort all the
entries according to their function name, and resolve all ties (identical
function names) by sorting by file name.
them. For example, ``sort_stats(SortKey.NAME, SortKey.FILENAME)`` will sort
all the entries according to their function name, and resolve all ties
(identical function names) by sorting by file name.
Abbreviations can be used for any key names, as long as the abbreviation
is unambiguous. The following are the keys currently defined:
For the string argument, abbreviations can be used for any key names, as
long as the abbreviation is unambiguous.
+------------------+----------------------+
| Valid Arg | Meaning |
+==================+======================+
| ``'calls'`` | call count |
+------------------+----------------------+
| ``'cumulative'`` | cumulative time |
+------------------+----------------------+
| ``'cumtime'`` | cumulative time |
+------------------+----------------------+
| ``'file'`` | file name |
+------------------+----------------------+
| ``'filename'`` | file name |
+------------------+----------------------+
| ``'module'`` | file name |
+------------------+----------------------+
| ``'ncalls'`` | call count |
+------------------+----------------------+
| ``'pcalls'`` | primitive call count |
+------------------+----------------------+
| ``'line'`` | line number |
+------------------+----------------------+
| ``'name'`` | function name |
+------------------+----------------------+
| ``'nfl'`` | name/file/line |
+------------------+----------------------+
| ``'stdname'`` | standard name |
+------------------+----------------------+
| ``'time'`` | internal time |
+------------------+----------------------+
| ``'tottime'`` | internal time |
+------------------+----------------------+
The following are the valid string and SortKey arguments:
+------------------+---------------------+----------------------+
| Valid String Arg | Valid enum Arg | Meaning |
+==================+=====================+======================+
| ``'calls'`` | SortKey.CALLS | call count |
+------------------+---------------------+----------------------+
| ``'cumulative'`` | SortKey.CUMULATIVE | cumulative time |
+------------------+---------------------+----------------------+
| ``'cumtime'`` | N/A | cumulative time |
+------------------+---------------------+----------------------+
| ``'file'`` | N/A | file name |
+------------------+---------------------+----------------------+
| ``'filename'`` | SortKey.FILENAME | file name |
+------------------+---------------------+----------------------+
| ``'module'`` | N/A | file name |
+------------------+---------------------+----------------------+
| ``'ncalls'`` | N/A | call count |
+------------------+---------------------+----------------------+
| ``'pcalls'`` | SortKey.PCALLS | primitive call count |
+------------------+---------------------+----------------------+
| ``'line'`` | SortKey.LINE | line number |
+------------------+---------------------+----------------------+
| ``'name'`` | SortKey.NAME | function name |
+------------------+---------------------+----------------------+
| ``'nfl'`` | SortKey.NFL | name/file/line |
+------------------+---------------------+----------------------+
| ``'stdname'`` | SortKey.STDNAME | standard name |
+------------------+---------------------+----------------------+
| ``'time'`` | SortKey.TIME | internal time |
+------------------+---------------------+----------------------+
| ``'tottime'`` | N/A | internal time |
+------------------+---------------------+----------------------+
Note that all sorts on statistics are in descending order (placing most
time-consuming items first), whereas name, file, and line number searches
are in ascending order (alphabetical). The subtle distinction between
``'nfl'`` and ``'stdname'`` is that the standard name is a sort of the
name as printed, which means that the embedded line numbers get compared
in an odd way. For example, lines 3, 20, and 40 would (if the file names
were the same) appear in the string order 20, 3 and 40. In contrast,
``'nfl'`` does a numeric compare of the line numbers. In fact,
``sort_stats('nfl')`` is the same as ``sort_stats('name', 'file',
'line')``.
``SortKey.NFL`` and ``SortKey.STDNAME`` is that the standard name is a
sort of the name as printed, which means that the embedded line numbers
get compared in an odd way. For example, lines 3, 20, and 40 would (if
the file names were the same) appear in the string order 20, 3 and 40.
In contrast, ``SortKey.NFL`` does a numeric compare of the line numbers.
In fact, ``sort_stats(SortKey.NFL)`` is the same as
``sort_stats(SortKey.NAME, SortKey.FILENAME, SortKey.LINE)``.
For backward-compatibility reasons, the numeric arguments ``-1``, ``0``,
``1``, and ``2`` are permitted. They are interpreted as ``'stdname'``,
@ -424,6 +431,8 @@ Analysis of the profiler data is done using the :class:`~pstats.Stats` class.
.. For compatibility with the old profiler.
.. versionadded:: 3.7
Added the SortKey enum.
.. method:: reverse_order()

View file

@ -27,7 +27,7 @@ byte-code cache files in the directory containing the source code.
Exception raised when an error occurs while attempting to compile the file.
.. function:: compile(file, cfile=None, dfile=None, doraise=False, optimize=-1)
.. function:: compile(file, cfile=None, dfile=None, doraise=False, optimize=-1, invalidation_mode=PycInvalidationMode.TIMESTAMP)
Compile a source file to byte-code and write out the byte-code cache file.
The source code is loaded from the file named *file*. The byte-code is
@ -53,6 +53,12 @@ byte-code cache files in the directory containing the source code.
:func:`compile` function. The default of ``-1`` selects the optimization
level of the current interpreter.
*invalidation_mode* should be a member of the :class:`PycInvalidationMode`
enum and controls how the generated ``.pyc`` files are invalidated at
runtime. If the :envvar:`SOURCE_DATE_EPOCH` environment variable is set,
*invalidation_mode* will be forced to
:attr:`PycInvalidationMode.CHECKED_HASH`.
.. versionchanged:: 3.2
Changed default value of *cfile* to be :PEP:`3147`-compliant. Previous
default was *file* + ``'c'`` (``'o'`` if optimization was enabled).
@ -65,6 +71,44 @@ byte-code cache files in the directory containing the source code.
caveat that :exc:`FileExistsError` is raised if *cfile* is a symlink or
non-regular file.
.. versionchanged:: 3.7
The *invalidation_mode* parameter was added as specified in :pep:`552`.
If the :envvar:`SOURCE_DATE_EPOCH` environment variable is set,
*invalidation_mode* will be forced to
:attr:`PycInvalidationMode.CHECKED_HASH`.
.. class:: PycInvalidationMode
An enumeration of possible methods the interpreter can use to determine
whether a bytecode file is up to date with a source file. The ``.pyc`` file
indicates the desired invalidation mode in its header. See
:ref:`pyc-invalidation` for more information on how Python invalidates
``.pyc`` files at runtime.
.. versionadded:: 3.7
.. attribute:: TIMESTAMP
The ``.pyc`` file includes the timestamp and size of the source file,
which Python will compare against the metadata of the source file at
runtime to determine if the ``.pyc`` file needs to be regenerated.
.. attribute:: CHECKED_HASH
The ``.pyc`` file includes a hash of the source file content, which Python
will compare against the source at runtime to determine if the ``.pyc``
file needs to be regenerated.
.. attribute:: UNCHECKED_HASH
Like :attr:`CHECKED_HASH`, the ``.pyc`` file includes a hash of the source
file content. However, Python will at runtime assume the ``.pyc`` file is
up to date and not validate the ``.pyc`` against the source file at all.
This option is useful when the ``.pyc`` files are kept up to date by some
system external to Python, such as a build system.
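A hedged usage sketch (``mymodule.py`` is a hypothetical source file)::

    import py_compile
    from py_compile import PycInvalidationMode

    # The resulting .pyc stores a hash of the source that is checked
    # at import time instead of the timestamp/size pair.
    py_compile.compile('mymodule.py',
                       invalidation_mode=PycInvalidationMode.CHECKED_HASH)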
.. function:: main(args=None)

View file

@ -24,4 +24,3 @@ overview:
gc.rst
inspect.rst
site.rst
fpectl.rst

View file

@ -23,8 +23,14 @@ the first retrieved (operating like a stack). With a priority queue,
the entries are kept sorted (using the :mod:`heapq` module) and the
lowest valued entry is retrieved first.
Internally, the module uses locks to temporarily block competing threads;
however, it is not designed to handle reentrancy within a thread.
Internally, those three types of queues use locks to temporarily block
competing threads; however, they are not designed to handle reentrancy
within a thread.
In addition, the module implements a "simple"
:abbr:`FIFO (first-in, first-out)` queue type where
specific implementations can provide additional guarantees
in exchange for reduced functionality.
The :mod:`queue` module defines the following classes and exceptions:
@ -56,6 +62,24 @@ The :mod:`queue` module defines the following classes and exceptions:
one returned by ``sorted(list(entries))[0]``). A typical pattern for entries
is a tuple in the form: ``(priority_number, data)``.
If the *data* elements are not comparable, the data can be wrapped in a class
that ignores the data item and only compares the priority number::
from dataclasses import dataclass, field
from typing import Any
@dataclass(order=True)
class PrioritizedItem:
priority: int
item: Any=field(compare=False)
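A possible usage sketch of the wrapper above (the payloads are arbitrary and deliberately unorderable)::

    from queue import PriorityQueue

    q = PriorityQueue()
    q.put(PrioritizedItem(2, {'unorderable': 'payload'}))
    q.put(PrioritizedItem(1, {'another': 'payload'}))
    # Entries compare on priority only, so the lower number comes out first.
    assert q.get().priority == 1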
.. class:: SimpleQueue()
Constructor for an unbounded :abbr:`FIFO (first-in, first-out)` queue.
Simple queues lack advanced functionality such as task tracking.
.. versionadded:: 3.7
.. exception:: Empty
@ -191,6 +215,60 @@ Example of how to wait for enqueued tasks to be completed::
t.join()
SimpleQueue Objects
-------------------
:class:`SimpleQueue` objects provide the public methods described below.
.. method:: SimpleQueue.qsize()
Return the approximate size of the queue. Note, qsize() > 0 doesn't
guarantee that a subsequent get() will not block.
.. method:: SimpleQueue.empty()
Return ``True`` if the queue is empty, ``False`` otherwise. If empty()
returns ``False`` it doesn't guarantee that a subsequent call to get()
will not block.
.. method:: SimpleQueue.put(item, block=True, timeout=None)
Put *item* into the queue. The method never blocks and always succeeds
(except for potential low-level errors such as failure to allocate memory).
The optional args *block* and *timeout* are ignored and only provided
for compatibility with :meth:`Queue.put`.
.. impl-detail::
This method has a C implementation which is reentrant. That is, a
``put()`` or ``get()`` call can be interrupted by another ``put()``
call in the same thread without deadlocking or corrupting internal
state inside the queue. This makes it appropriate for use in
destructors such as ``__del__`` methods or :mod:`weakref` callbacks.
.. method:: SimpleQueue.put_nowait(item)
Equivalent to ``put(item)``, provided for compatibility with
:meth:`Queue.put_nowait`.
.. method:: SimpleQueue.get(block=True, timeout=None)
Remove and return an item from the queue. If optional args *block* is true and
*timeout* is ``None`` (the default), block if necessary until an item is available.
If *timeout* is a positive number, it blocks at most *timeout* seconds and
raises the :exc:`Empty` exception if no item was available within that time.
Otherwise (*block* is false), return an item if one is immediately available,
else raise the :exc:`Empty` exception (*timeout* is ignored in that case).
.. method:: SimpleQueue.get_nowait()
Equivalent to ``get(False)``.
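A minimal sketch of the class (Python 3.7+)::

    import queue

    sq = queue.SimpleQueue()
    sq.put('job')                      # put() never blocks
    assert sq.get_nowait() == 'job'    # same as get(False)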
.. seealso::
Class :class:`multiprocessing.Queue`
@ -200,4 +278,3 @@ Example of how to wait for enqueued tasks to be completed::
:class:`collections.deque` is an alternative implementation of unbounded
queues with fast atomic :meth:`~collections.deque.append` and
:meth:`~collections.deque.popleft` operations that do not require locking.

View file

@ -34,9 +34,10 @@ sending a graphics file.
Encode the contents of the *input* file and write the resulting quoted-printable
data to the *output* file. *input* and *output* must be
:term:`binary file objects <file object>`. *quotetabs*, a flag which controls
whether to encode embedded spaces and tabs must be provideda and when true it
encodes such embedded whitespace, and when false it leaves them unencoded.
:term:`binary file objects <file object>`. *quotetabs*, a
non-optional flag which controls whether to encode embedded spaces
and tabs; when true it encodes such embedded whitespace, and when
false it leaves them unencoded.
Note that spaces and tabs appearing at the end of lines are always encoded,
as per :rfc:`1521`. *header* is a flag which controls if spaces are encoded
as underscores as per :rfc:`1522`.
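A hedged in-memory sketch of :func:`quopri.encode` (the sample bytes are arbitrary)::

    import io, quopri

    src = io.BytesIO(b'column one\tcolumn two')
    dst = io.BytesIO()
    quopri.encode(src, dst, quotetabs=True)
    # With quotetabs=True the embedded space and tab are escaped (=20, =09).
    print(dst.getvalue())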

View file

@ -345,7 +345,7 @@ The special characters are:
This example looks for a word following a hyphen:
>>> m = re.search('(?<=-)\w+', 'spam-egg')
>>> m = re.search(r'(?<=-)\w+', 'spam-egg')
>>> m.group(0)
'egg'
@ -692,11 +692,11 @@ form.
splits occur, and the remainder of the string is returned as the final element
of the list. ::
>>> re.split('\W+', 'Words, words, words.')
>>> re.split(r'\W+', 'Words, words, words.')
['Words', 'words', 'words', '']
>>> re.split('(\W+)', 'Words, words, words.')
>>> re.split(r'(\W+)', 'Words, words, words.')
['Words', ', ', 'words', ', ', 'words', '.', '']
>>> re.split('\W+', 'Words, words, words.', 1)
>>> re.split(r'\W+', 'Words, words, words.', 1)
['Words', 'words, words.']
>>> re.split('[a-f]+', '0a3B9', flags=re.IGNORECASE)
['0', '3', '9']
@ -705,43 +705,28 @@ form.
the string, the result will start with an empty string. The same holds for
the end of the string::
>>> re.split('(\W+)', '...words, words...')
>>> re.split(r'(\W+)', '...words, words...')
['', '...', 'words', ', ', 'words', '...', '']
That way, separator components are always found at the same relative
indices within the result list.
.. note::
Empty matches for the pattern split the string only when not adjacent
to a previous empty match.
:func:`split` doesn't currently split a string on an empty pattern match.
For example::
>>> re.split('x*', 'axbc')
['a', 'bc']
Even though ``'x*'`` also matches 0 'x' before 'a', between 'b' and 'c',
and after 'c', currently these matches are ignored. The correct behavior
(i.e. splitting on empty matches too and returning ``['', 'a', 'b', 'c',
'']``) will be implemented in future versions of Python, but since this
is a backward incompatible change, a :exc:`FutureWarning` will be raised
in the meanwhile.
Patterns that can only match empty strings currently never split the
string. Since this doesn't match the expected behavior, a
:exc:`ValueError` will be raised starting from Python 3.5::
>>> re.split("^$", "foo\n\nbar\n", flags=re.M)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
...
ValueError: split() requires a non-empty pattern match.
>>> re.split(r'\b', 'Words, words, words.')
['', 'Words', ', ', 'words', ', ', 'words', '.']
>>> re.split(r'\W*', '...words...')
['', '', 'w', 'o', 'r', 'd', 's', '', '']
>>> re.split(r'(\W*)', '...words...')
['', '...', '', '', 'w', '', 'o', '', 'r', '', 'd', '', 's', '...', '', '', '']
.. versionchanged:: 3.1
Added the optional flags argument.
.. versionchanged:: 3.5
Splitting on a pattern that could match an empty string now raises
a warning. Patterns that can only match empty strings are now rejected.
.. versionchanged:: 3.7
Added support of splitting on a pattern that could match an empty string.
.. function:: findall(pattern, string, flags=0)
@ -749,8 +734,10 @@ form.
strings. The *string* is scanned left-to-right, and matches are returned in
the order found. If one or more groups are present in the pattern, return a
list of groups; this will be a list of tuples if the pattern has more than
one group. Empty matches are included in the result unless they touch the
beginning of another match.
one group. Empty matches are included in the result.
.. versionchanged:: 3.7
Non-empty matches can now start just after a previous empty match.
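For example, a hedged illustration of the behaviour change::

    import re

    # In 3.7 the non-empty match 'two' may begin right after the empty
    # match of '^', so the whole first word is found.
    print(re.findall(r'^|\w+', 'two words'))   # ['', 'two', 'words']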
.. function:: finditer(pattern, string, flags=0)
@ -758,8 +745,10 @@ form.
Return an :term:`iterator` yielding :ref:`match objects <match-objects>` over
all non-overlapping matches for the RE *pattern* in *string*. The *string*
is scanned left-to-right, and matches are returned in the order found. Empty
matches are included in the result unless they touch the beginning of another
match.
matches are included in the result.
.. versionchanged:: 3.7
Non-empty matches can now start just after a previous empty match.
.. function:: sub(pattern, repl, string, count=0, flags=0)
@ -795,8 +784,8 @@ form.
The optional argument *count* is the maximum number of pattern occurrences to be
replaced; *count* must be a non-negative integer. If omitted or zero, all
occurrences will be replaced. Empty matches for the pattern are replaced only
when not adjacent to a previous match, so ``sub('x*', '-', 'abc')`` returns
``'-a-b-c-'``.
when not adjacent to a previous empty match, so ``sub('x*', '-', 'abxd')`` returns
``'-a-b--d-'``.
In string-type *repl* arguments, in addition to the character escapes and
backreferences described above,
@ -822,6 +811,9 @@ form.
Unknown escapes in *repl* consisting of ``'\'`` and an ASCII letter
now are errors.
Empty matches for the pattern are replaced when adjacent to a previous
non-empty match.
.. function:: subn(pattern, repl, string, count=0, flags=0)

Some files were not shown because too many files have changed in this diff Show more