index (int64, 0-731k) | package (string, 2-98 chars, nullable) | name (string, 1-76 chars) | docstring (string, 0-281k chars, nullable) | code (string, 4-1.07M chars, nullable) | signature (string, 2-42.8k chars, nullable) |
---|---|---|---|---|---|
35,960 | pathlib | __hash__ | Return :func:`hash(self) <hash>`. | def __hash__(self):
try:
return self._hash
except AttributeError:
self._hash = hash(tuple(self._cparts))
return self._hash
| (self) |
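The `__hash__` above computes the hash once and caches it in `self._hash`. A minimal sketch of the same cache-on-first-use pattern, using a made-up `Point` class rather than `PurePath`:

```python
class Point:
    """Illustrates the memoisation pattern used by PurePath.__hash__."""

    def __init__(self, x: int, y: int) -> None:
        self.x, self.y = x, y

    def __hash__(self) -> int:
        try:
            # Reuse the value cached by an earlier call.
            return self._hash
        except AttributeError:
            # First call: compute, cache, and return.
            self._hash = hash((self.x, self.y))
            return self._hash
```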
35,961 | pathlib | __le__ | Return ``self <= other``. | def __le__(self, other):
if not isinstance(other, PurePath) or self._flavour is not other._flavour:
return NotImplemented
return self._cparts <= other._cparts
| (self, other) -> bool |
35,962 | pathlib | __lt__ | Return ``self < other``. | def __lt__(self, other):
if not isinstance(other, PurePath) or self._flavour is not other._flavour:
return NotImplemented
return self._cparts < other._cparts
| (self, other) -> bool |
35,963 | pathlib | __new__ | Construct a PurePath from one or several strings and or existing
PurePath objects. The strings and path objects are combined so as
to yield a canonicalized path, which is incorporated into the
new PurePath object.
| def __new__(cls, *args):
"""Construct a PurePath from one or several strings and or existing
PurePath objects. The strings and path objects are combined so as
to yield a canonicalized path, which is incorporated into the
new PurePath object.
"""
if cls is PurePath:
cls = PureWindowsPath if os.name == 'nt' else PurePosixPath
return cls._from_parts(args)
| (cls, *args) |
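As the `__new__` above shows, instantiating `PurePath` directly picks the flavour for the running OS, while the concrete subclasses can be used on any platform. A small illustration (the printed class depends on the host system):

```python
from pathlib import PurePath, PurePosixPath, PureWindowsPath

p = PurePath("usr", "local", "bin")
print(type(p))   # PurePosixPath on Linux/macOS, PureWindowsPath on Windows

# The flavour-specific subclasses work regardless of the host OS:
print(PureWindowsPath("C:/Users", "guido"))   # C:\Users\guido
print(PurePosixPath("/usr", "local", "bin"))  # /usr/local/bin
```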
35,965 | apeye_core | __repr__ | Return a string representation of the :class:`~apeye_core.URLPath`. | def __repr__(self) -> str:
    return super().__repr__()
| (self) -> str |
35,966 | pathlib | __rtruediv__ | Return ``value / self``. | def __rtruediv__(self, key):
try:
return self._from_parts([key] + self._parts)
except TypeError:
return NotImplemented
| (self, key) |
35,967 | apeye_core | __str__ |
Return the string representation of the path, suitable for passing to system calls.
| def __str__(self) -> str:
    """
    Return the string representation of the path, suitable for passing to system calls.
    """
    if not hasattr(self, "_root") and hasattr(self, "_load_parts"):
        self._load_parts()  # type: ignore[attr-defined]
    try:
        return self._str  # type: ignore
    except AttributeError:
        if hasattr(self, "_parts"):
            parts = self._parts  # type: ignore[attr-defined]
        else:
            parts = self._tail  # type: ignore[attr-defined]
        self._str = self._format_parsed_parts(self._drv, self._root, parts) or ''  # type: ignore
        return self._str
| (self) -> str |
35,968 | pathlib | __truediv__ | Return ``self / value``. | def __truediv__(self, key):
try:
return self._make_child((key,))
except TypeError:
return NotImplemented
| (self, key) |
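`__truediv__` and `__rtruediv__` (above) are what make the `/` operator build paths with a path object on either side:

```python
from pathlib import PurePosixPath

base = PurePosixPath("etc")
print(base / "nginx" / "nginx.conf")  # etc/nginx/nginx.conf  (__truediv__)
print("/opt" / base)                  # /opt/etc              (__rtruediv__, str on the left)
```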
35,971 | apeye_core | as_uri | null | def as_uri(self, *args, **kwargs) -> "NoReturn":  # noqa: D102
    raise NotImplementedError
| (self, *args, **kwargs) -> 'NoReturn' |
35,972 | apeye_core | is_absolute |
Returns whether the path is absolute (i.e. starts with ``/``).
.. versionadded:: 1.1.0 previously raised :exc:`NotImplementedError`.
| def is_absolute(self) -> bool:
    """
    Returns whether the path is absolute (i.e. starts with ``/``).
    .. versionadded:: 1.1.0 previously raised :exc:`NotImplementedError`.
    """
    return self.root == '/'
| (self) -> bool |
35,975 | apeye_core | joinpath |
Combine this :class:`~.apeye.url.URLPath` with one or several arguments.
.. versionadded:: 1.1.0 previously raised :exc:`NotImplementedError`.
:returns: A new :class:`~.apeye.url.URLPath` representing either a subpath
(if all arguments are relative paths) or a totally different path
(if one of the arguments is absolute).
| def joinpath(self: URLPathType, *args) -> URLPathType:
    """
    Combine this :class:`~.apeye.url.URLPath` with one or several arguments.
    .. versionadded:: 1.1.0 previously raised :exc:`NotImplementedError`.
    :returns: A new :class:`~.apeye.url.URLPath` representing either a subpath
        (if all arguments are relative paths) or a totally different path
        (if one of the arguments is absolute).
    """
    return super().joinpath(*args)
| (self: ~URLPathType, *args) -> ~URLPathType |
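A short usage sketch of the `URLPath` methods above (`is_absolute` and `joinpath`), assuming the `apeye_core` module is installed; the paths are made up for illustration:

```python
from apeye_core import URLPath

path = URLPath("/news")
print(path.is_absolute())                  # True
print(path.joinpath("sport", "football"))  # /news/sport/football
print(URLPath("sport").is_absolute())      # False
```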
35,977 | apeye_core | relative_to |
Returns the relative path to another path identified by the passed arguments.
The arguments are joined together to form a single path, and therefore the following behave identically:
.. code-block:: pycon
>>> URLPath("/news/sport").relative_to("/", "news")
URLPath('sport')
>>> URLPath("/news/sport").relative_to("/news")
URLPath('sport')
.. versionadded:: 1.1.0 previously raised :exc:`NotImplementedError`.
:param \*other:
:raises ValueError: if the operation is not possible (because this is not a subpath of the other path)
.. latex:clearpage::
.. seealso::
:meth:`~.apeye.url.URL.relative_to`, which is recommended when constructing a relative path from a :class:`~URL`.
This method cannot correctly handle some cases, such as:
.. code-block:: pycon
>>> URL("https://github.com/domdfcoding").path.relative_to(URL("https://github.com").path)
Traceback (most recent call last):
ValueError: '/domdfcoding' does not start with ''
Since ``URL("https://github.com").path`` is ``URLPath('')``.
Instead, use:
>>> URL("https://github.com/domdfcoding").relative_to(URL("https://github.com"))
URLPath('domdfcoding')
| def relative_to(self: URLPathType, *other: PathLike) -> URLPathType:
    r"""
    Returns the relative path to another path identified by the passed arguments.
    The arguments are joined together to form a single path, and therefore the following behave identically:
    .. code-block:: pycon
        >>> URLPath("/news/sport").relative_to("/", "news")
        URLPath('sport')
        >>> URLPath("/news/sport").relative_to("/news")
        URLPath('sport')
    .. versionadded:: 1.1.0 previously raised :exc:`NotImplementedError`.
    :param \*other:
    :raises ValueError: if the operation is not possible (because this is not a subpath of the other path)
    .. latex:clearpage::
    .. seealso::
        :meth:`~.apeye.url.URL.relative_to`, which is recommended when constructing a relative path from a :class:`~URL`.
        This method cannot correctly handle some cases, such as:
        .. code-block:: pycon
            >>> URL("https://github.com/domdfcoding").path.relative_to(URL("https://github.com").path)
            Traceback (most recent call last):
            ValueError: '/domdfcoding' does not start with ''
        Since ``URL("https://github.com").path`` is ``URLPath('')``.
        Instead, use:
            >>> URL("https://github.com/domdfcoding").relative_to(URL("https://github.com"))
            URLPath('domdfcoding')
    """
    return super().relative_to(*other)
| (self: ~URLPathType, *other: Union[str, pathlib.Path, os.PathLike]) -> ~URLPathType |
35,982 | curatorbin | OSPlatform | Mapping of curator OS to Evergreen build variant name. | class OSPlatform(object):
"""Mapping of curator OS to Evergreen build variant name."""
MACOS_x64 = "macos"
MACOS_arm64 = "macos-arm64"
WINDOWS_x64 = "windows-64"
LINUX_x64 = "ubuntu"
LINUX_arm64 = "arm"
| () |
35,983 | curatorbin | get_curator_path |
returns path to curator binary, after checking it exists and matches
the hardcoded git hash. If this is not the case, it raises an error.
| def get_curator_path():
"""
returns path to curator binary, after checking it exists and matches
the hardcoded git hash. If this is not the case, it raises an error.
"""
current_module = __import__(__name__)
build_path = current_module.__path__[0]
os_platform = None
processor_info = "processor info unavailable"
if sys.platform == "darwin":
command = ["/usr/sbin/sysctl", "-n", "machdep.cpu.brand_string"]
processor_info = subprocess.check_output(command).decode().strip()
if "Intel" in processor_info:
os_platform = OSPlatform.MACOS_x64
elif "Apple" in processor_info: # E.g. Apple M1, Apple M2.
os_platform = OSPlatform.MACOS_arm64
elif sys.platform == "win32":
processor_info = platform.processor()
if "Intel" in processor_info:
os_platform = OSPlatform.WINDOWS_x64
elif sys.platform.startswith("linux"):
command = "cat /proc/cpuinfo"
processor_info = subprocess.check_output(command, shell=True).decode().strip()
if "Intel" in processor_info:
os_platform = OSPlatform.LINUX_x64
elif re.search("CPU implementer.*0x41", processor_info) is not None:
os_platform = OSPlatform.LINUX_arm64
if os_platform is None:
raise OSError("Unrecognized OS and/or CPU architecture. OS: {}, Processor info: {}".format(
sys.platform, processor_info))
curator_path = os.path.join(build_path, os_platform, "curator")
if sys.platform == "win32":
curator_path += ".exe"
git_hash = "67131aa196877d75cf29ca28ac4181f5d241a3a5"
curator_exists = os.path.isfile(curator_path)
if curator_exists:
curator_version = subprocess.check_output([curator_path,
"--version"]).decode('utf-8').split()
curator_same_version = git_hash in curator_version
if curator_same_version :
return curator_path
errmsg = ("Found a different version of curator. "
"Looking for '{}', but found '{}'. Something has gone terribly wrong"
" in the python wrapper for curator").format(git_hash, curator_version)
raise RuntimeError(errmsg)
else:
raise FileNotFoundError("Something has gone terribly wrong."
"curator binary not found at '{}'".format(curator_path))
| () |
35,987 | curatorbin | run_curator | runs the curator binary packaged with this module, passing along any arguments. | def run_curator(*args):
"""runs the curator binary packaged with this module, passing along any arguments."""
curator_path = get_curator_path()
return subprocess.check_output([curator_path, *args])
| (*args) |
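Together, `get_curator_path` above resolves and hash-verifies the bundled binary, and `run_curator` forwards arguments to it. A hedged usage sketch, assuming the `curatorbin` package (with a curator binary for your platform) is installed:

```python
import curatorbin

# Resolve and verify the bundled binary, then invoke it however you like...
path = curatorbin.get_curator_path()
print(path)

# ...or let the wrapper call it and capture stdout as bytes.
output = curatorbin.run_curator("--version")
print(output.decode().strip())
```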
35,990 | builtins | Proxy | null | from builtins import Proxy
| null |
35,995 | flask_dance.consumer.oauth1 | OAuth1ConsumerBlueprint |
A subclass of :class:`flask.Blueprint` that sets up OAuth 1 authentication.
| class OAuth1ConsumerBlueprint(BaseOAuthConsumerBlueprint):
"""
A subclass of :class:`flask.Blueprint` that sets up OAuth 1 authentication.
"""
def __init__(
self,
name,
import_name,
client_key=None,
client_secret=None,
*,
signature_method=SIGNATURE_HMAC,
signature_type=SIGNATURE_TYPE_AUTH_HEADER,
rsa_key=None,
client_class=None,
force_include_body=False,
static_folder=None,
static_url_path=None,
template_folder=None,
url_prefix=None,
subdomain=None,
url_defaults=None,
root_path=None,
login_url=None,
authorized_url=None,
base_url=None,
request_token_url=None,
authorization_url=None,
access_token_url=None,
redirect_url=None,
redirect_to=None,
session_class=None,
storage=None,
rule_kwargs=None,
**kwargs,
):
"""
Most of the constructor arguments are forwarded either to the
:class:`flask.Blueprint` constructor or the
:class:`requests_oauthlib.OAuth1Session` constructor, including
``**kwargs`` (which is forwarded to
:class:`~requests_oauthlib.OAuth1Session`).
Only the arguments that are relevant to Flask-Dance are documented here.
Args:
base_url: The base URL of the OAuth provider.
If specified, all URLs passed to this instance will be
resolved relative to this URL.
request_token_url: The URL specified by the OAuth provider for
obtaining a
`request token <http://oauth.net/core/1.0a/#auth_step1>`_.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
authorization_url: The URL specified by the OAuth provider for
the user to
`grant token authorization <http://oauth.net/core/1.0a/#auth_step2>`_.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
access_token_url: The URL specified by the OAuth provider for
obtaining an
`access token <http://oauth.net/core/1.0a/#auth_step3>`_.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
login_url: The URL route for the ``login`` view that kicks off
the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}``, so that the URL is based on the name
of the blueprint.
authorized_url: The URL route for the ``authorized`` view that
completes the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}/authorized``, so that the URL is
based on the name of the blueprint.
redirect_url: When the OAuth dance is complete,
redirect the user to this URL.
redirect_to: When the OAuth dance is complete,
redirect the user to the URL obtained by calling
:func:`~flask.url_for` with this argument. If you do not specify
either ``redirect_url`` or ``redirect_to``, the user will be
redirected to the root path (``/``).
session_class: The class to use for creating a Requests session
between the consumer (your website) and the provider (e.g.
Google). Defaults to
:class:`~flask_dance.consumer.requests.OAuth1Session`.
storage: A token storage class, or an instance of a token storage
class, to use for this blueprint. Defaults to
:class:`~flask_dance.consumer.storage.session.SessionStorage`.
rule_kwargs (dict, optional): Additional arguments that should be passed when adding
the login and authorized routes. Defaults to ``None``.
"""
BaseOAuthConsumerBlueprint.__init__(
self,
name,
import_name,
static_folder=static_folder,
static_url_path=static_url_path,
template_folder=template_folder,
url_prefix=url_prefix,
subdomain=subdomain,
url_defaults=url_defaults,
root_path=root_path,
login_url=login_url,
authorized_url=authorized_url,
storage=storage,
rule_kwargs=rule_kwargs,
)
self.base_url = base_url
self.session_class = session_class or OAuth1Session
# passed to OAuth1Session()
self.client_key = client_key
self.client_secret = client_secret
self.signature_method = signature_method
self.signature_type = signature_type
self.rsa_key = rsa_key
self.client_class = client_class
self.force_include_body = force_include_body
self.kwargs = kwargs
# used by view functions
self.request_token_url = request_token_url
self.authorization_url = authorization_url
self.access_token_url = access_token_url
self.redirect_url = redirect_url
self.redirect_to = redirect_to
self.teardown_app_request(self.teardown_session)
@cached_property
def session(self):
"""
This is a session between the consumer (your website) and the provider
(e.g. Google). It is *not* a session between a user of your website
and your website.
:return:
"""
return self.session_class(
client_key=self.client_key,
client_secret=self.client_secret,
signature_method=self.signature_method,
signature_type=self.signature_type,
rsa_key=self.rsa_key,
client_class=self.client_class,
force_include_body=self.force_include_body,
blueprint=self,
base_url=self.base_url,
**self.kwargs,
)
def teardown_session(self, exception=None):
try:
del self.session
except KeyError:
pass
def login(self):
callback_uri = url_for(".authorized", _external=True)
self.session._client.client.callback_uri = to_unicode(callback_uri)
try:
self.session.fetch_request_token(
self.request_token_url, should_load_token=False
)
except TokenRequestDenied as err:
message = err.args[0]
response = getattr(err, "response", None)
log.warning("OAuth 1 request token error: %s", message)
oauth_error.send(self, message=message, response=response)
# can't proceed with OAuth, have to just redirect to next_url
if self.redirect_url:
next_url = self.redirect_url
elif self.redirect_to:
next_url = url_for(self.redirect_to)
else:
next_url = "/"
return redirect(next_url)
url = self.session.authorization_url(self.authorization_url)
oauth_before_login.send(self, url=url)
return redirect(url)
def authorized(self):
"""
This is the route/function that the user will be redirected to by
the provider (e.g. Google) after the user has logged into the
provider's website and authorized your app to access their account.
"""
if self.redirect_url:
next_url = self.redirect_url
elif self.redirect_to:
next_url = url_for(self.redirect_to)
else:
next_url = "/"
try:
self.session.parse_authorization_response(request.url)
except TokenMissing as err:
message = err.args[0]
response = getattr(err, "response", None)
log.warning("OAuth 1 access token error: %s", message)
oauth_error.send(self, message=message, response=response)
return redirect(next_url)
try:
token = self.session.fetch_access_token(
self.access_token_url, should_load_token=False
)
except ValueError as err:
# can't proceed with OAuth, have to just redirect to next_url
message = err.args[0]
response = getattr(err, "response", None)
log.warning("OAuth 1 access token error: %s", message)
oauth_error.send(self, message=message, response=response)
return redirect(next_url)
results = oauth_authorized.send(self, token=token) or []
set_token = True
for func, ret in results:
if isinstance(ret, (Response, current_app.response_class)):
return ret
if ret == False:
set_token = False
if set_token:
self.token = token
return redirect(next_url)
| (name, import_name, client_key=None, client_secret=None, *, signature_method='HMAC-SHA1', signature_type='AUTH_HEADER', rsa_key=None, client_class=None, force_include_body=False, static_folder=None, static_url_path=None, template_folder=None, url_prefix=None, subdomain=None, url_defaults=None, root_path=None, login_url=None, authorized_url=None, base_url=None, request_token_url=None, authorization_url=None, access_token_url=None, redirect_url=None, redirect_to=None, session_class=None, storage=None, rule_kwargs=None, **kwargs) |
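A minimal sketch of wiring an `OAuth1ConsumerBlueprint` into a Flask app. The provider URLs and credentials below are placeholders; a real OAuth 1 provider supplies its own endpoints:

```python
from flask import Flask
from flask_dance.consumer import OAuth1ConsumerBlueprint

app = Flask(__name__)
app.secret_key = "replace-me"  # placeholder; Flask sessions back the default token storage

example_bp = OAuth1ConsumerBlueprint(
    "example", __name__,
    client_key="my-key",                      # placeholder credentials
    client_secret="my-secret",
    base_url="https://api.example.com/",      # hypothetical provider
    request_token_url="oauth/request_token",  # resolved against base_url
    authorization_url="oauth/authorize",
    access_token_url="oauth/access_token",
)
app.register_blueprint(example_bp, url_prefix="/login")
# Visiting /login/example starts the dance; /login/example/authorized completes it.
```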
35,996 | flask_dance.consumer.oauth1 | __init__ |
Most of the constructor arguments are forwarded either to the
:class:`flask.Blueprint` constructor or the
:class:`requests_oauthlib.OAuth1Session` constructor, including
``**kwargs`` (which is forwarded to
:class:`~requests_oauthlib.OAuth1Session`).
Only the arguments that are relevant to Flask-Dance are documented here.
Args:
base_url: The base URL of the OAuth provider.
If specified, all URLs passed to this instance will be
resolved relative to this URL.
request_token_url: The URL specified by the OAuth provider for
obtaining a
`request token <http://oauth.net/core/1.0a/#auth_step1>`_.
This can be a fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
authorization_url: The URL specified by the OAuth provider for
the user to
`grant token authorization <http://oauth.net/core/1.0a/#auth_step2>`_.
This can be a fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
access_token_url: The URL specified by the OAuth provider for
obtaining an
`access token <http://oauth.net/core/1.0a/#auth_step3>`_.
This can be a fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
login_url: The URL route for the ``login`` view that kicks off
the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}``, so that the URL is based on the name
of the blueprint.
authorized_url: The URL route for the ``authorized`` view that
completes the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}/authorized``, so that the URL is
based on the name of the blueprint.
redirect_url: When the OAuth dance is complete,
redirect the user to this URL.
redirect_to: When the OAuth dance is complete,
redirect the user to the URL obtained by calling
:func:`~flask.url_for` with this argument. If you do not specify
either ``redirect_url`` or ``redirect_to``, the user will be
redirected to the root path (``/``).
session_class: The class to use for creating a Requests session
between the consumer (your website) and the provider (e.g.
Google). Defaults to
:class:`~flask_dance.consumer.requests.OAuth1Session`.
storage: A token storage class, or an instance of a token storage
class, to use for this blueprint. Defaults to
:class:`~flask_dance.consumer.storage.session.SessionStorage`.
rule_kwargs (dict, optional): Additional arguments that should be passed when adding
the login and authorized routes. Defaults to ``None``.
| def __init__(
self,
name,
import_name,
client_key=None,
client_secret=None,
*,
signature_method=SIGNATURE_HMAC,
signature_type=SIGNATURE_TYPE_AUTH_HEADER,
rsa_key=None,
client_class=None,
force_include_body=False,
static_folder=None,
static_url_path=None,
template_folder=None,
url_prefix=None,
subdomain=None,
url_defaults=None,
root_path=None,
login_url=None,
authorized_url=None,
base_url=None,
request_token_url=None,
authorization_url=None,
access_token_url=None,
redirect_url=None,
redirect_to=None,
session_class=None,
storage=None,
rule_kwargs=None,
**kwargs,
):
"""
Most of the constructor arguments are forwarded either to the
:class:`flask.Blueprint` constructor or the
:class:`requests_oauthlib.OAuth1Session` constructor, including
``**kwargs`` (which is forwarded to
:class:`~requests_oauthlib.OAuth1Session`).
Only the arguments that are relevant to Flask-Dance are documented here.
Args:
base_url: The base URL of the OAuth provider.
If specified, all URLs passed to this instance will be
resolved relative to this URL.
request_token_url: The URL specified by the OAuth provider for
obtaining a
`request token <http://oauth.net/core/1.0a/#auth_step1>`_.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
authorization_url: The URL specified by the OAuth provider for
the user to
`grant token authorization <http://oauth.net/core/1.0a/#auth_step2>`_.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
access_token_url: The URL specified by the OAuth provider for
obtaining an
`access token <http://oauth.net/core/1.0a/#auth_step3>`_.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
login_url: The URL route for the ``login`` view that kicks off
the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}``, so that the URL is based on the name
of the blueprint.
authorized_url: The URL route for the ``authorized`` view that
completes the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}/authorized``, so that the URL is
based on the name of the blueprint.
redirect_url: When the OAuth dance is complete,
redirect the user to this URL.
redirect_to: When the OAuth dance is complete,
redirect the user to the URL obtained by calling
:func:`~flask.url_for` with this argument. If you do not specify
either ``redirect_url`` or ``redirect_to``, the user will be
redirected to the root path (``/``).
session_class: The class to use for creating a Requests session
between the consumer (your website) and the provider (e.g.
Google). Defaults to
:class:`~flask_dance.consumer.requests.OAuth1Session`.
storage: A token storage class, or an instance of a token storage
class, to use for this blueprint. Defaults to
:class:`~flask_dance.consumer.storage.session.SessionStorage`.
rule_kwargs (dict, optional): Additional arguments that should be passed when adding
the login and authorized routes. Defaults to ``None``.
"""
BaseOAuthConsumerBlueprint.__init__(
self,
name,
import_name,
static_folder=static_folder,
static_url_path=static_url_path,
template_folder=template_folder,
url_prefix=url_prefix,
subdomain=subdomain,
url_defaults=url_defaults,
root_path=root_path,
login_url=login_url,
authorized_url=authorized_url,
storage=storage,
rule_kwargs=rule_kwargs,
)
self.base_url = base_url
self.session_class = session_class or OAuth1Session
# passed to OAuth1Session()
self.client_key = client_key
self.client_secret = client_secret
self.signature_method = signature_method
self.signature_type = signature_type
self.rsa_key = rsa_key
self.client_class = client_class
self.force_include_body = force_include_body
self.kwargs = kwargs
# used by view functions
self.request_token_url = request_token_url
self.authorization_url = authorization_url
self.access_token_url = access_token_url
self.redirect_url = redirect_url
self.redirect_to = redirect_to
self.teardown_app_request(self.teardown_session)
| (self, name, import_name, client_key=None, client_secret=None, *, signature_method='HMAC-SHA1', signature_type='AUTH_HEADER', rsa_key=None, client_class=None, force_include_body=False, static_folder=None, static_url_path=None, template_folder=None, url_prefix=None, subdomain=None, url_defaults=None, root_path=None, login_url=None, authorized_url=None, base_url=None, request_token_url=None, authorization_url=None, access_token_url=None, redirect_url=None, redirect_to=None, session_class=None, storage=None, rule_kwargs=None, **kwargs) |
36,015 | flask_dance.consumer.oauth1 | authorized |
This is the route/function that the user will be redirected to by
the provider (e.g. Google) after the user has logged into the
provider's website and authorized your app to access their account.
| def authorized(self):
"""
This is the route/function that the user will be redirected to by
the provider (e.g. Google) after the user has logged into the
provider's website and authorized your app to access their account.
"""
if self.redirect_url:
next_url = self.redirect_url
elif self.redirect_to:
next_url = url_for(self.redirect_to)
else:
next_url = "/"
try:
self.session.parse_authorization_response(request.url)
except TokenMissing as err:
message = err.args[0]
response = getattr(err, "response", None)
log.warning("OAuth 1 access token error: %s", message)
oauth_error.send(self, message=message, response=response)
return redirect(next_url)
try:
token = self.session.fetch_access_token(
self.access_token_url, should_load_token=False
)
except ValueError as err:
# can't proceed with OAuth, have to just redirect to next_url
message = err.args[0]
response = getattr(err, "response", None)
log.warning("OAuth 1 access token error: %s", message)
oauth_error.send(self, message=message, response=response)
return redirect(next_url)
results = oauth_authorized.send(self, token=token) or []
set_token = True
for func, ret in results:
if isinstance(ret, (Response, current_app.response_class)):
return ret
if ret == False:
set_token = False
if set_token:
self.token = token
return redirect(next_url)
| (self) |
36,024 | flask_dance.consumer.base | load_config |
Used to dynamically load variables from the Flask application config
into the blueprint. To tell this blueprint to pull configuration from
the app, just set key-value pairs in the ``from_config`` dict. Keys
are the name of the local variable to set on the blueprint object,
and values are the variable name in the Flask application config.
For example:
blueprint.from_config["session.client_id"] = "GITHUB_OAUTH_CLIENT_ID"
| def load_config(self):
"""
Used to dynamically load variables from the Flask application config
into the blueprint. To tell this blueprint to pull configuration from
the app, just set key-value pairs in the ``from_config`` dict. Keys
are the name of the local variable to set on the blueprint object,
and values are the variable name in the Flask application config.
For example:
blueprint.from_config["session.client_id"] = "GITHUB_OAUTH_CLIENT_ID"
"""
for local_var, config_var in self.from_config.items():
value = flask.current_app.config.get(config_var)
if value:
if "." in local_var:
# this is a dotpath -- needs special handling
body, tail = local_var.rsplit(".", 1)
obj = getattrd(self, body)
setattr(obj, tail, value)
else:
# just use a normal setattr call
setattr(self, local_var, value)
| (self) |
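Building on the docstring's `from_config` example above, a sketch of pulling blueprint credentials from the Flask app config rather than hard-coding them (the config key names here are placeholders):

```python
import flask
from flask_dance.consumer import OAuth2ConsumerBlueprint

app = flask.Flask(__name__)
app.config["EXAMPLE_OAUTH_CLIENT_ID"] = "dummy-id"        # placeholder values
app.config["EXAMPLE_OAUTH_CLIENT_SECRET"] = "dummy-secret"

bp = OAuth2ConsumerBlueprint("example", __name__)
# Dotted keys traverse attributes, so "session.client_id" sets bp.session.client_id.
bp.from_config["session.client_id"] = "EXAMPLE_OAUTH_CLIENT_ID"
bp.from_config["client_secret"] = "EXAMPLE_OAUTH_CLIENT_SECRET"
app.register_blueprint(bp, url_prefix="/login")
```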
36,025 | flask_dance.consumer.oauth1 | login | null | def login(self):
callback_uri = url_for(".authorized", _external=True)
self.session._client.client.callback_uri = to_unicode(callback_uri)
try:
self.session.fetch_request_token(
self.request_token_url, should_load_token=False
)
except TokenRequestDenied as err:
message = err.args[0]
response = getattr(err, "response", None)
log.warning("OAuth 1 request token error: %s", message)
oauth_error.send(self, message=message, response=response)
# can't proceed with OAuth, have to just redirect to next_url
if self.redirect_url:
next_url = self.redirect_url
elif self.redirect_to:
next_url = url_for(self.redirect_to)
else:
next_url = "/"
return redirect(next_url)
url = self.session.authorization_url(self.authorization_url)
oauth_before_login.send(self, url=url)
return redirect(url)
| (self) |
36,040 | flask_dance.consumer.oauth1 | teardown_session | null | def teardown_session(self, exception=None):
try:
del self.session
except KeyError:
pass
| (self, exception=None) |
36,043 | flask_dance.consumer.oauth2 | OAuth2ConsumerBlueprint |
A subclass of :class:`flask.Blueprint` that sets up OAuth 2 authentication.
| class OAuth2ConsumerBlueprint(BaseOAuthConsumerBlueprint):
"""
A subclass of :class:`flask.Blueprint` that sets up OAuth 2 authentication.
"""
def __init__(
self,
name,
import_name,
client_id=None,
client_secret=None,
*,
client=None,
auto_refresh_url=None,
auto_refresh_kwargs=None,
scope=None,
state=None,
static_folder=None,
static_url_path=None,
template_folder=None,
url_prefix=None,
subdomain=None,
url_defaults=None,
root_path=None,
login_url=None,
authorized_url=None,
base_url=None,
authorization_url=None,
authorization_url_params=None,
token_url=None,
token_url_params=None,
redirect_url=None,
redirect_to=None,
session_class=None,
storage=None,
rule_kwargs=None,
use_pkce=False,
code_challenge_method="S256",
**kwargs,
):
"""
Most of the constructor arguments are forwarded either to the
:class:`flask.Blueprint` constructor or the
:class:`requests_oauthlib.OAuth2Session` constructor, including
``**kwargs`` (which is forwarded to
:class:`~requests_oauthlib.OAuth2Session`).
Only the arguments that are relevant to Flask-Dance are documented here.
Args:
base_url: The base URL of the OAuth provider.
If specified, all URLs passed to this instance will be
resolved relative to this URL.
authorization_url: The URL specified by the OAuth provider for
obtaining an
`authorization grant <https://datatracker.ietf.org/doc/html/rfc6749#section-1.3>`__.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
authorization_url_params (dict): A dict of extra
key-value pairs to include in the query string of the
``authorization_url``, beyond those necessary for a standard
OAuth 2 authorization grant request.
token_url: The URL specified by the OAuth provider for
obtaining an
`access token <https://datatracker.ietf.org/doc/html/rfc6749#section-1.4>`__.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
token_url_params (dict): A dict of extra
key-value pairs to include in the query string of the
``token_url``, beyond those necessary for a standard
OAuth 2 access token request.
login_url: The URL route for the ``login`` view that kicks off
the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}``, so that the URL is based on the name
of the blueprint.
authorized_url: The URL route for the ``authorized`` view that
completes the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}/authorized``, so that the URL is
based on the name of the blueprint.
redirect_url: When the OAuth dance is complete,
redirect the user to this URL.
redirect_to: When the OAuth dance is complete,
redirect the user to the URL obtained by calling
:func:`~flask.url_for` with this argument. If you do not specify
either ``redirect_url`` or ``redirect_to``, the user will be
redirected to the root path (``/``).
session_class: The class to use for creating a Requests session
between the consumer (your website) and the provider (e.g.
Google). Defaults to
:class:`~flask_dance.consumer.requests.OAuth2Session`.
storage: A token storage class, or an instance of a token storage
class, to use for this blueprint. Defaults to
:class:`~flask_dance.consumer.storage.session.SessionStorage`.
rule_kwargs (dict, optional): Additional arguments that should be passed when adding
the login and authorized routes. Defaults to ``None``.
use_pkce: If true then the authorization flow will follow the PKCE (Proof Key for Code Exchange).
For more details please refer to `RFC7636 <https://www.rfc-editor.org/rfc/rfc7636#section-4.1>`__
code_challenge_method: Code challenge method to be used in authorization code flow with PKCE
instead of client secret. It will be used only if ``use_pkce`` is set to True.
Defaults to ``S256``.
"""
BaseOAuthConsumerBlueprint.__init__(
self,
name,
import_name,
static_folder=static_folder,
static_url_path=static_url_path,
template_folder=template_folder,
url_prefix=url_prefix,
subdomain=subdomain,
url_defaults=url_defaults,
root_path=root_path,
login_url=login_url,
authorized_url=authorized_url,
storage=storage,
rule_kwargs=rule_kwargs,
)
self.base_url = base_url
self.session_class = session_class or OAuth2Session
# passed to OAuth2Session()
self._client_id = client_id
self.client = client
self.auto_refresh_url = auto_refresh_url
self.auto_refresh_kwargs = auto_refresh_kwargs
self.scope = scope
self.state = state
self.kwargs = kwargs
self.client_secret = client_secret
# used by view functions
self.authorization_url = authorization_url
self.authorization_url_params = authorization_url_params or {}
self.token_url = token_url
self.token_url_params = token_url_params or {}
self.redirect_url = redirect_url
self.redirect_to = redirect_to
self.code_challenge_method = code_challenge_method
self.use_pkce = use_pkce
self.teardown_app_request(self.teardown_session)
@property
def client_id(self):
return self.session.client_id
@client_id.setter
def client_id(self, value):
self.session.client_id = value
# due to a bug in requests-oauthlib, we need to set this manually
self.session._client.client_id = value
@cached_property
def session(self):
"""
This is a session between the consumer (your website) and the provider
(e.g. Google). It is *not* a session between a user of your website
and your website.
:return:
"""
ret = self.session_class(
client_id=self._client_id,
client=self.client,
auto_refresh_url=self.auto_refresh_url,
auto_refresh_kwargs=self.auto_refresh_kwargs,
scope=self.scope,
state=self.state,
blueprint=self,
base_url=self.base_url,
**self.kwargs,
)
def token_updater(token):
self.token = token
ret.token_updater = token_updater
return self.session_created(ret)
def session_created(self, session):
return session
def teardown_session(self, exception=None):
try:
del self.session
except KeyError:
pass
def login(self):
log.debug("client_id = %s", self.client_id)
self.session.redirect_uri = url_for(".authorized", _external=True)
if self.use_pkce:
code_verifier = generate_token(length=48)
code_challenge = self.session._client.create_code_challenge(
code_verifier=code_verifier,
code_challenge_method=self.code_challenge_method,
)
self.authorization_url_params.update(
{
"code_challenge_method": self.code_challenge_method,
"code_challenge": code_challenge,
}
)
code_verifier_key = f"{self.name}_oauth_code_verifier"
flask.session[code_verifier_key] = code_verifier
log.debug("code_verifier = %s", code_verifier)
url, state = self.session.authorization_url(
self.authorization_url, state=self.state, **self.authorization_url_params
)
state_key = f"{self.name}_oauth_state"
flask.session[state_key] = state
log.debug("state = %s", state)
log.debug("redirect URL = %s", url)
oauth_before_login.send(self, url=url)
return redirect(url)
def authorized(self):
"""
This is the route/function that the user will be redirected to by
the provider (e.g. Google) after the user has logged into the
provider's website and authorized your app to access their account.
"""
if self.redirect_url:
next_url = self.redirect_url
elif self.redirect_to:
next_url = url_for(self.redirect_to)
else:
next_url = "/"
log.debug("next_url = %s", next_url)
# check for error in request args
error = request.args.get("error")
if error:
error_desc = request.args.get("error_description")
error_uri = request.args.get("error_uri")
log.warning(
"OAuth 2 authorization error: %s description: %s uri: %s",
error,
error_desc,
error_uri,
)
results = oauth_error.send(
self, error=error, error_description=error_desc, error_uri=error_uri
)
if results:
for _, ret in results:
if isinstance(ret, (Response, current_app.response_class)):
return ret
return redirect(next_url)
state_key = f"{self.name}_oauth_state"
if state_key not in flask.session:
# can't validate state, so redirect back to login view
log.info("state not found, redirecting user to login")
return redirect(url_for(".login"))
state = flask.session[state_key]
log.debug("state = %s", state)
self.session._state = state
del flask.session[state_key]
if self.use_pkce:
code_verifier_key = f"{self.name}_oauth_code_verifier"
if code_verifier_key not in flask.session:
# can't find code_verifier, so redirect back to login view
log.info("code_verifier not found, redirecting user to login")
return redirect(url_for(".login"))
code_verifier = flask.session[code_verifier_key]
log.debug("code_verifier = %s", code_verifier)
del flask.session[code_verifier_key]
self.token_url_params["code_verifier"] = code_verifier
self.session.redirect_uri = url_for(".authorized", _external=True)
log.debug("client_id = %s", self.client_id)
log.debug("client_secret = %s", self.client_secret)
try:
token = self.session.fetch_token(
self.token_url,
authorization_response=request.url,
client_secret=self.client_secret,
**self.token_url_params,
)
except MissingCodeError as e:
e.args = (
e.args[0],
"The redirect request did not contain the expected parameters. Instead I got: {}".format(
json.dumps(request.args)
),
)
raise
results = oauth_authorized.send(self, token=token) or []
set_token = True
for func, ret in results:
if isinstance(ret, (Response, current_app.response_class)):
return ret
if ret == False:
set_token = False
if set_token:
try:
self.token = token
except ValueError as error:
log.warning("OAuth 2 authorization error: %s", str(error))
oauth_error.send(self, error=error)
return redirect(next_url)
| (name, import_name, client_id=None, client_secret=None, *, client=None, auto_refresh_url=None, auto_refresh_kwargs=None, scope=None, state=None, static_folder=None, static_url_path=None, template_folder=None, url_prefix=None, subdomain=None, url_defaults=None, root_path=None, login_url=None, authorized_url=None, base_url=None, authorization_url=None, authorization_url_params=None, token_url=None, token_url_params=None, redirect_url=None, redirect_to=None, session_class=None, storage=None, rule_kwargs=None, use_pkce=False, code_challenge_method='S256', **kwargs) |
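A minimal sketch of an OAuth 2 blueprint using the PKCE support described above, against a hypothetical provider with placeholder credentials:

```python
from flask import Flask
from flask_dance.consumer import OAuth2ConsumerBlueprint

app = Flask(__name__)
app.secret_key = "replace-me"  # placeholder; required for session-based state and storage

example_bp = OAuth2ConsumerBlueprint(
    "example", __name__,
    client_id="my-client-id",              # placeholder credentials
    base_url="https://provider.example/",  # hypothetical provider
    authorization_url="oauth2/authorize",  # resolved against base_url
    token_url="oauth2/token",
    scope=["profile", "email"],
    use_pkce=True,                         # adds an S256 code challenge to the flow
)
app.register_blueprint(example_bp, url_prefix="/login")
```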
36,044 | flask_dance.consumer.oauth2 | __init__ |
Most of the constructor arguments are forwarded either to the
:class:`flask.Blueprint` constructor or the
:class:`requests_oauthlib.OAuth2Session` constructor, including
``**kwargs`` (which is forwarded to
:class:`~requests_oauthlib.OAuth2Session`).
Only the arguments that are relevant to Flask-Dance are documented here.
Args:
base_url: The base URL of the OAuth provider.
If specified, all URLs passed to this instance will be
resolved relative to this URL.
authorization_url: The URL specified by the OAuth provider for
obtaining an
`authorization grant <https://datatracker.ietf.org/doc/html/rfc6749#section-1.3>`__.
This can be a fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
authorization_url_params (dict): A dict of extra
key-value pairs to include in the query string of the
``authorization_url``, beyond those necessary for a standard
OAuth 2 authorization grant request.
token_url: The URL specified by the OAuth provider for
obtaining an
`access token <https://datatracker.ietf.org/doc/html/rfc6749#section-1.4>`__.
This can be a fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
token_url_params (dict): A dict of extra
key-value pairs to include in the query string of the
``token_url``, beyond those necessary for a standard
OAuth 2 access token request.
login_url: The URL route for the ``login`` view that kicks off
the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}``, so that the URL is based on the name
of the blueprint.
authorized_url: The URL route for the ``authorized`` view that
completes the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}/authorized``, so that the URL is
based on the name of the blueprint.
redirect_url: When the OAuth dance is complete,
redirect the user to this URL.
redirect_to: When the OAuth dance is complete,
redirect the user to the URL obtained by calling
:func:`~flask.url_for` with this argument. If you do not specify
either ``redirect_url`` or ``redirect_to``, the user will be
redirected to the root path (``/``).
session_class: The class to use for creating a Requests session
between the consumer (your website) and the provider (e.g.
Google). Defaults to
:class:`~flask_dance.consumer.requests.OAuth2Session`.
storage: A token storage class, or an instance of a token storage
class, to use for this blueprint. Defaults to
:class:`~flask_dance.consumer.storage.session.SessionStorage`.
rule_kwargs (dict, optional): Additional arguments that should be passed when adding
the login and authorized routes. Defaults to ``None``.
use_pkce: If true then the authorization flow will follow the PKCE (Proof Key for Code Exchange).
For more details please refer to `RFC7636 <https://www.rfc-editor.org/rfc/rfc7636#section-4.1>`__
code_challenge_method: Code challenge method to be used in authorization code flow with PKCE
instead of client secret. It will be used only if ``use_pkce`` is set to True.
Defaults to ``S256``.
| def __init__(
self,
name,
import_name,
client_id=None,
client_secret=None,
*,
client=None,
auto_refresh_url=None,
auto_refresh_kwargs=None,
scope=None,
state=None,
static_folder=None,
static_url_path=None,
template_folder=None,
url_prefix=None,
subdomain=None,
url_defaults=None,
root_path=None,
login_url=None,
authorized_url=None,
base_url=None,
authorization_url=None,
authorization_url_params=None,
token_url=None,
token_url_params=None,
redirect_url=None,
redirect_to=None,
session_class=None,
storage=None,
rule_kwargs=None,
use_pkce=False,
code_challenge_method="S256",
**kwargs,
):
"""
Most of the constructor arguments are forwarded either to the
:class:`flask.Blueprint` constructor or the
:class:`requests_oauthlib.OAuth2Session` constructor, including
``**kwargs`` (which is forwarded to
:class:`~requests_oauthlib.OAuth2Session`).
Only the arguments that are relevant to Flask-Dance are documented here.
Args:
base_url: The base URL of the OAuth provider.
If specified, all URLs passed to this instance will be
resolved relative to this URL.
authorization_url: The URL specified by the OAuth provider for
obtaining an
`authorization grant <https://datatracker.ietf.org/doc/html/rfc6749#section-1.3>`__.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
authorization_url_params (dict): A dict of extra
key-value pairs to include in the query string of the
``authorization_url``, beyond those necessary for a standard
OAuth 2 authorization grant request.
token_url: The URL specified by the OAuth provider for
obtaining an
`access token <https://datatracker.ietf.org/doc/html/rfc6749#section-1.4>`__.
This can be an fully-qualified URL, or a path that is
resolved relative to the ``base_url``.
token_url_params (dict): A dict of extra
key-value pairs to include in the query string of the
``token_url``, beyond those necessary for a standard
OAuth 2 access token request.
login_url: The URL route for the ``login`` view that kicks off
the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}``, so that the URL is based on the name
of the blueprint.
authorized_url: The URL route for the ``authorized`` view that
completes the OAuth dance. This string will be
:ref:`formatted <python:formatstrings>`
with the instance so that attributes can be interpolated.
Defaults to ``/{bp.name}/authorized``, so that the URL is
based on the name of the blueprint.
redirect_url: When the OAuth dance is complete,
redirect the user to this URL.
redirect_to: When the OAuth dance is complete,
redirect the user to the URL obtained by calling
:func:`~flask.url_for` with this argument. If you do not specify
either ``redirect_url`` or ``redirect_to``, the user will be
redirected to the root path (``/``).
session_class: The class to use for creating a Requests session
between the consumer (your website) and the provider (e.g.
Google). Defaults to
:class:`~flask_dance.consumer.requests.OAuth2Session`.
storage: A token storage class, or an instance of a token storage
class, to use for this blueprint. Defaults to
:class:`~flask_dance.consumer.storage.session.SessionStorage`.
rule_kwargs (dict, optional): Additional arguments that should be passed when adding
the login and authorized routes. Defaults to ``None``.
use_pkce: If true then the authorization flow will follow the PKCE (Proof Key for Code Exchange).
For more details please refer to `RFC7636 <https://www.rfc-editor.org/rfc/rfc7636#section-4.1>`__
code_challenge_method: Code challenge method to be used in authorization code flow with PKCE
instead of client secret. It will be used only if ``use_pkce`` is set to True.
Defaults to ``S256``.
"""
BaseOAuthConsumerBlueprint.__init__(
self,
name,
import_name,
static_folder=static_folder,
static_url_path=static_url_path,
template_folder=template_folder,
url_prefix=url_prefix,
subdomain=subdomain,
url_defaults=url_defaults,
root_path=root_path,
login_url=login_url,
authorized_url=authorized_url,
storage=storage,
rule_kwargs=rule_kwargs,
)
self.base_url = base_url
self.session_class = session_class or OAuth2Session
# passed to OAuth2Session()
self._client_id = client_id
self.client = client
self.auto_refresh_url = auto_refresh_url
self.auto_refresh_kwargs = auto_refresh_kwargs
self.scope = scope
self.state = state
self.kwargs = kwargs
self.client_secret = client_secret
# used by view functions
self.authorization_url = authorization_url
self.authorization_url_params = authorization_url_params or {}
self.token_url = token_url
self.token_url_params = token_url_params or {}
self.redirect_url = redirect_url
self.redirect_to = redirect_to
self.code_challenge_method = code_challenge_method
self.use_pkce = use_pkce
self.teardown_app_request(self.teardown_session)
| (self, name, import_name, client_id=None, client_secret=None, *, client=None, auto_refresh_url=None, auto_refresh_kwargs=None, scope=None, state=None, static_folder=None, static_url_path=None, template_folder=None, url_prefix=None, subdomain=None, url_defaults=None, root_path=None, login_url=None, authorized_url=None, base_url=None, authorization_url=None, authorization_url_params=None, token_url=None, token_url_params=None, redirect_url=None, redirect_to=None, session_class=None, storage=None, rule_kwargs=None, use_pkce=False, code_challenge_method='S256', **kwargs) |
36,063 | flask_dance.consumer.oauth2 | authorized |
This is the route/function that the user will be redirected to by
the provider (e.g. Google) after the user has logged into the
provider's website and authorized your app to access their account.
| def authorized(self):
"""
This is the route/function that the user will be redirected to by
the provider (e.g. Google) after the user has logged into the
provider's website and authorized your app to access their account.
"""
if self.redirect_url:
next_url = self.redirect_url
elif self.redirect_to:
next_url = url_for(self.redirect_to)
else:
next_url = "/"
log.debug("next_url = %s", next_url)
# check for error in request args
error = request.args.get("error")
if error:
error_desc = request.args.get("error_description")
error_uri = request.args.get("error_uri")
log.warning(
"OAuth 2 authorization error: %s description: %s uri: %s",
error,
error_desc,
error_uri,
)
results = oauth_error.send(
self, error=error, error_description=error_desc, error_uri=error_uri
)
if results:
for _, ret in results:
if isinstance(ret, (Response, current_app.response_class)):
return ret
return redirect(next_url)
state_key = f"{self.name}_oauth_state"
if state_key not in flask.session:
# can't validate state, so redirect back to login view
log.info("state not found, redirecting user to login")
return redirect(url_for(".login"))
state = flask.session[state_key]
log.debug("state = %s", state)
self.session._state = state
del flask.session[state_key]
if self.use_pkce:
code_verifier_key = f"{self.name}_oauth_code_verifier"
if code_verifier_key not in flask.session:
# can't find code_verifier, so redirect back to login view
log.info("code_verifier not found, redirecting user to login")
return redirect(url_for(".login"))
code_verifier = flask.session[code_verifier_key]
log.debug("code_verifier = %s", code_verifier)
del flask.session[code_verifier_key]
self.token_url_params["code_verifier"] = code_verifier
self.session.redirect_uri = url_for(".authorized", _external=True)
log.debug("client_id = %s", self.client_id)
log.debug("client_secret = %s", self.client_secret)
try:
token = self.session.fetch_token(
self.token_url,
authorization_response=request.url,
client_secret=self.client_secret,
**self.token_url_params,
)
except MissingCodeError as e:
e.args = (
e.args[0],
"The redirect request did not contain the expected parameters. Instead I got: {}".format(
json.dumps(request.args)
),
)
raise
results = oauth_authorized.send(self, token=token) or []
set_token = True
for func, ret in results:
if isinstance(ret, (Response, current_app.response_class)):
return ret
if ret == False:
set_token = False
if set_token:
try:
self.token = token
except ValueError as error:
log.warning("OAuth 2 authorization error: %s", str(error))
oauth_error.send(self, error=error)
return redirect(next_url)
| (self) |
36,073 | flask_dance.consumer.oauth2 | login | null | def login(self):
log.debug("client_id = %s", self.client_id)
self.session.redirect_uri = url_for(".authorized", _external=True)
if self.use_pkce:
code_verifier = generate_token(length=48)
code_challenge = self.session._client.create_code_challenge(
code_verifier=code_verifier,
code_challenge_method=self.code_challenge_method,
)
self.authorization_url_params.update(
{
"code_challenge_method": self.code_challenge_method,
"code_challenge": code_challenge,
}
)
code_verifier_key = f"{self.name}_oauth_code_verifier"
flask.session[code_verifier_key] = code_verifier
log.debug("code_verifier = %s", code_verifier)
url, state = self.session.authorization_url(
self.authorization_url, state=self.state, **self.authorization_url_params
)
state_key = f"{self.name}_oauth_state"
flask.session[state_key] = state
log.debug("state = %s", state)
log.debug("redirect URL = %s", url)
oauth_before_login.send(self, url=url)
return redirect(url)
| (self) |
36,086 | flask_dance.consumer.oauth2 | session_created | null | def session_created(self, session):
return session
| (self, session) |
36,094 | keplergl.keplergl | DataException | null | class DataException(TraitError):
pass
| null |
36,148 | keplergl.keplergl | KeplerGl | An example widget. | class KeplerGl(widgets.DOMWidget):
"""An example widget."""
_view_name = Unicode('KeplerGlView').tag(sync=True)
_model_name = Unicode('KeplerGlModal').tag(sync=True)
_view_module = Unicode('keplergl-jupyter').tag(sync=True)
_model_module = Unicode('keplergl-jupyter').tag(sync=True)
_view_module_version = Unicode(EXTENSION_SPEC_VERSION).tag(sync=True)
_model_module_version = Unicode(EXTENSION_SPEC_VERSION).tag(sync=True)
value = Unicode('Hello World!').tag(sync=True)
data = Dict({}).tag(sync=True, **data_serialization)
config = Dict({}).tag(sync=True)
height = Int(400).tag(sync=True)
def __init__(self, **kwargs):
if 'show_docs' not in kwargs:
kwargs['show_docs'] = True
if kwargs['show_docs']:
print('User Guide: {}'.format(documentation))
kwargs.pop('show_docs')
super(KeplerGl, self).__init__(**kwargs)
@validate('data')
def _validate_data(self, proposal):
'''Validate data input.
Makes sure data is a dict, and each value should be either a df, a geojson dictionary / string or csv string
layers list.
'''
if type(proposal.value) is not dict:
raise DataException('[data type error]: Expecting a dictionary mapping from id to value, but got {}'.format(type(proposal.value)))
else:
for key, value in proposal.value.items():
if not isinstance(value, pd.DataFrame) and (type(value) is not str) and (type(value) is not dict):
raise DataException('[data type error]: value of {} should be a DataFrame, a Geojson Dictionary or String, a csv String, but got {}'.format(key, type(value)))
return proposal.value
def add_data(self, data, name="unnamed"):
''' Send data to Voyager
Inputs:
- data string, can be a csv string or json string
- name string
Example of use:
keplergl.add_data(data_string, name="data_1")
'''
normalized = _normalize_data(data)
copy = self.data.copy()
copy.update({name: normalized})
self.data = copy
def show(self, data=None, config=None, read_only=False, center_map=False):
''' Display current map in Google Colab
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- read_only: if read_only is True, hide side panel to disable map customization
- center_map: if center_map is True, the bound of the map will be updated acoording to the current map data
Example of use:
# this will display map in Google Colab
from keplergl import KeplerGL
map1 = KeplerGL()
map1.show()
'''
keplergl_html = resource_string(__name__, 'static/keplergl.html').decode('utf-8')
# find open of body
k = keplergl_html.find("<body>")
data_to_add = data_to_json(self.data, None) if data == None else data_to_json(data, None)
config_to_add = self.config if config == None else config
keplergl_data = json.dumps({"config": config_to_add, "data": data_to_add, "options": {"readOnly": read_only, "centerMap": center_map}})
cmd = """window.__keplerglDataConfig = {};""".format(keplergl_data)
frame_txt = keplergl_html[:k] + "<body><script>" + cmd + "</script>" + keplergl_html[k+6:]
if "google.colab" in sys.modules:
from IPython.display import HTML, Javascript
display(HTML(frame_txt))
display(Javascript(f"google.colab.output.setIframeHeight('{self.height}');"))
def _repr_html_(self, data=None, config=None, read_only=False, center_map=False):
''' Return current map in an html encoded string
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- read_only: if read_only is True, hide side panel to disable map customization
- center_map: if center_map is True, the bound of the map will be updated acoording to the current map data
Returns:
- a html encoded string
Example of use:
# this will save map with provided data and config
keplergl._repr_html_(data={"data_1": df}, config=config)
# this will save current map
keplergl._repr_html_()
'''
keplergl_html = resource_string(__name__, 'static/keplergl.html').decode('utf-8')
# find open of body
k = keplergl_html.find("<body>")
data_to_add = data_to_json(self.data, None) if data == None else data_to_json(data, None)
config_to_add = self.config if config == None else config
# for key in data_to_add:
# print(type(data_to_add[key]))
keplergl_data = json.dumps({"config": config_to_add, "data": data_to_add, "options": {"readOnly": read_only, "centerMap": center_map}})
cmd = """window.__keplerglDataConfig = {};""".format(keplergl_data)
frame_txt = keplergl_html[:k] + "<body><script>" + cmd + "</script>" + keplergl_html[k+6:]
return frame_txt.encode('utf-8')
def save_to_html(self, data=None, config=None, file_name='keplergl_map.html', read_only=False, center_map=False):
''' Save current map to an interactive html
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- file_name: the html file name, default is keplergl_map.html
- read_only: if read_only is True, hide side panel to disable map customization
Returns:
- an html file will be saved to your notebook
Example of use:
# this will save map with provided data and config
keplergl.save_to_html(data={"data_1": df}, config=config, file_name='first_map.html')
# this will save current map
keplergl.save_to_html(file_name='first_map.html')
'''
frame_txt = self._repr_html_(data=data, config=config, read_only=read_only, center_map=center_map)
with open(file_name, 'wb') as f:
f.write(frame_txt)
print("Map saved to {}!".format(file_name))
| (**kwargs) |
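A short notebook-style usage sketch of the `KeplerGl` widget above, assuming `keplergl` and `pandas` are installed (the coordinates are made up for illustration):

```python
import pandas as pd
from keplergl import KeplerGl

df = pd.DataFrame({
    "latitude": [37.7749, 37.8044],
    "longitude": [-122.4194, -122.2712],
})

m = KeplerGl(height=500)           # show_docs defaults to True and prints the user-guide URL
m.add_data(data=df, name="points")
m.save_to_html(file_name="points_map.html")
```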
36,151 | keplergl.keplergl | __init__ | null | def __init__(self, **kwargs):
if 'show_docs' not in kwargs:
kwargs['show_docs'] = True
if kwargs['show_docs']:
print('User Guide: {}'.format(documentation))
kwargs.pop('show_docs')
super(KeplerGl, self).__init__(**kwargs)
| (self, **kwargs) |
36,158 | ipywidgets.widgets.widget | _gen_repr_from_keys | null | def _gen_repr_from_keys(self, keys):
class_name = self.__class__.__name__
signature = ', '.join(
'%s=%r' % (key, getattr(self, key))
for key in keys
)
return '%s(%s)' % (class_name, signature)
| (self, keys) |
36,162 | ipywidgets.widgets.widget | _handle_displayed | Called when a view has been displayed for this widget instance | def _handle_displayed(self, **kwargs):
"""Called when a view has been displayed for this widget instance"""
self._display_callbacks(self, **kwargs)
| (self, **kwargs) |
36,163 | ipywidgets.widgets.widget | m | null | def _show_traceback(method):
"""decorator for showing tracebacks in IPython"""
def m(self, *args, **kwargs):
try:
return(method(self, *args, **kwargs))
except Exception as e:
ip = get_ipython()
if ip is None:
self.log.warning("Exception in widget method %s: %s", method, e, exc_info=True)
else:
ip.showtraceback()
return m
| (self, *args, **kwargs) |
36,164 | ipywidgets.widgets.widget | _ipython_display_ | Called when `IPython.display.display` is called on the widget. | def _ipython_display_(self, **kwargs):
"""Called when `IPython.display.display` is called on the widget."""
plaintext = repr(self)
if len(plaintext) > 110:
plaintext = plaintext[:110] + '…'
data = {
'text/plain': plaintext,
}
if self._view_name is not None:
# The 'application/vnd.jupyter.widget-view+json' mimetype has not been registered yet.
# See the registration process and naming convention at
# http://tools.ietf.org/html/rfc6838
# and the currently registered mimetypes at
# http://www.iana.org/assignments/media-types/media-types.xhtml.
data['application/vnd.jupyter.widget-view+json'] = {
'version_major': 2,
'version_minor': 0,
'model_id': self._model_id
}
display(data, raw=True)
if self._view_name is not None:
self._handle_displayed(**kwargs)
| (self, **kwargs) |
36,166 | ipywidgets.widgets.widget | _lock_property | Lock a property-value pair.
The value should be the JSON state of the property.
NOTE: This, in addition to the single lock for all state changes, is
flawed. In the future we may want to look into buffering state changes
back to the front-end. | def items(self):
for model_module, mm in sorted(self._registry.items()):
for model_version, mv in sorted(mm.items()):
for model_name, vm in sorted(mv.items()):
for view_module, vv in sorted(vm.items()):
for view_version, vn in sorted(vv.items()):
for view_name, widget in sorted(vn.items()):
yield (model_module, model_version, model_name, view_module, view_version, view_name), widget
| (self, **properties) |
36,171 | keplergl.keplergl | _repr_html_ | Return current map in an html encoded string
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- read_only: if read_only is True, hide side panel to disable map customization
- center_map: if center_map is True, the bounds of the map will be updated according to the current map data
Returns:
- an html encoded string
Example of use:
# this will save map with provided data and config
keplergl._repr_html_(data={"data_1": df}, config=config)
# this will save current map
keplergl._repr_html_()
| def _repr_html_(self, data=None, config=None, read_only=False, center_map=False):
''' Return current map in an html encoded string
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- read_only: if read_only is True, hide side panel to disable map customization
- center_map: if center_map is True, the bounds of the map will be updated according to the current map data
Returns:
- an html encoded string
Example of use:
# this will save map with provided data and config
keplergl._repr_html_(data={"data_1": df}, config=config)
# this will save current map
keplergl._repr_html_()
'''
keplergl_html = resource_string(__name__, 'static/keplergl.html').decode('utf-8')
# find open of body
k = keplergl_html.find("<body>")
data_to_add = data_to_json(self.data, None) if data is None else data_to_json(data, None)
config_to_add = self.config if config is None else config
# for key in data_to_add:
# print(type(data_to_add[key]))
keplergl_data = json.dumps({"config": config_to_add, "data": data_to_add, "options": {"readOnly": read_only, "centerMap": center_map}})
cmd = """window.__keplerglDataConfig = {};""".format(keplergl_data)
frame_txt = keplergl_html[:k] + "<body><script>" + cmd + "</script>" + keplergl_html[k+6:]
return frame_txt.encode('utf-8')
| (self, data=None, config=None, read_only=False, center_map=False) |
36,172 | ipywidgets.widgets.domwidget | _repr_keys | null | def _repr_keys(self):
for key in super(DOMWidget, self)._repr_keys():
# Exclude layout if it had the default value
if key == 'layout':
value = getattr(self, key)
if repr(value) == '%s()' % value.__class__.__name__:
continue
yield key
# We also need to include _dom_classes in repr for reproducibility
if self._dom_classes:
yield '_dom_classes'
| (self) |
36,174 | ipywidgets.widgets.widget | _should_send_property | Check the property lock (property_lock) | def _should_send_property(self, key, value):
"""Check the property lock (property_lock)"""
to_json = self.trait_metadata(key, 'to_json', self._trait_to_json)
if key in self._property_lock:
# model_state, buffer_paths, buffers
split_value = _remove_buffers({ key: to_json(value, self)})
split_lock = _remove_buffers({ key: self._property_lock[key]})
# A roundtrip conversion through json in the comparison takes care of
# idiosyncrasies of how Python data structures map to JSON, for example
# tuples get converted to lists.
if (jsonloads(jsondumps(split_value[0])) == split_lock[0]
and split_value[1] == split_lock[1]
and _buffer_list_equal(split_value[2], split_lock[2])):
return False
if self._holding_sync:
self._states_to_send.add(key)
return False
else:
return True
| (self, key, value) |
36,178 | keplergl.keplergl | add_data | Send data to Voyager
Inputs:
- data string, can be a csv string or json string
- name string
Example of use:
keplergl.add_data(data_string, name="data_1")
| def add_data(self, data, name="unnamed"):
''' Send data to Voyager
Inputs:
- data string, can be a csv string or json string
- name string
Example of use:
keplergl.add_data(data_string, name="data_1")
'''
normalized = _normalize_data(data)
copy = self.data.copy()
copy.update({name: normalized})
self.data = copy
| (self, data, name='unnamed') |
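A short sketch of add_data in use, assuming pandas is available and that _normalize_data accepts a DataFrame; the method stores the normalized value under the given name in the widget's data dict:
import pandas as pd
from keplergl import KeplerGl
df = pd.DataFrame({"lat": [37.76, 37.77], "lon": [-122.43, -122.41]})
map_1 = KeplerGl(show_docs=False)
map_1.add_data(data=df, name="points")  # now available as map_1.data["points"]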
36,179 | ipywidgets.widgets.widget | add_traits | Dynamically add trait attributes to the Widget. | def add_traits(self, **traits):
"""Dynamically add trait attributes to the Widget."""
super(Widget, self).add_traits(**traits)
for name, trait in traits.items():
if trait.get_metadata('sync'):
self.keys.append(name)
self.send_state(name)
| (self, **traits) |
36,180 | ipywidgets.widgets.widget | close | Close method.
Closes the underlying comm.
When the comm is closed, all of the widget views are automatically
removed from the front-end. | def close(self):
"""Close method.
Closes the underlying comm.
When the comm is closed, all of the widget views are automatically
removed from the front-end."""
if self.comm is not None:
Widget.widgets.pop(self.model_id, None)
self.comm.close()
self.comm = None
self._ipython_display_ = None
| (self) |
36,181 | ipywidgets.widgets.widget | get_manager_state | Returns the full state for a widget manager for embedding
:param drop_defaults: when True, it will not include default value
:param widgets: list with widgets to include in the state (or all widgets when None)
:return:
| @staticmethod
def get_manager_state(drop_defaults=False, widgets=None):
"""Returns the full state for a widget manager for embedding
:param drop_defaults: when True, it will not include default value
:param widgets: list with widgets to include in the state (or all widgets when None)
:return:
"""
state = {}
if widgets is None:
widgets = Widget.widgets.values()
for widget in widgets:
state[widget.model_id] = widget._get_embed_state(drop_defaults=drop_defaults)
return {'version_major': 2, 'version_minor': 0, 'state': state}
| (drop_defaults=False, widgets=None) |
36,182 | ipywidgets.widgets.widget | get_state | Gets the widget state, or a piece of it.
Parameters
----------
key : unicode or iterable (optional)
A single property's name or iterable of property names to get.
Returns
-------
state : dict of states
metadata : dict
metadata for each field: {key: metadata}
| def get_state(self, key=None, drop_defaults=False):
"""Gets the widget state, or a piece of it.
Parameters
----------
key : unicode or iterable (optional)
A single property's name or iterable of property names to get.
Returns
-------
state : dict of states
metadata : dict
metadata for each field: {key: metadata}
"""
if key is None:
keys = self.keys
elif isinstance(key, string_types):
keys = [key]
elif isinstance(key, Iterable):
keys = key
else:
raise ValueError("key must be a string, an iterable of keys, or None")
state = {}
traits = self.traits()
for k in keys:
to_json = self.trait_metadata(k, 'to_json', self._trait_to_json)
value = to_json(getattr(self, k), self)
if not PY3 and isinstance(traits[k], Bytes) and isinstance(value, bytes):
value = memoryview(value)
if not drop_defaults or not self._compare(value, traits[k].default_value):
state[k] = value
return state
| (self, key=None, drop_defaults=False) |
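A quick illustration of get_state on a synced widget, assuming height is one of its traits; passing a single key narrows the returned dict to that entry:
w = KeplerGl(show_docs=False)
full_state = w.get_state()               # one entry per key listed in w.keys
height_only = w.get_state(key='height')  # e.g. {'height': 400}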
36,184 | ipywidgets.widgets.widget | handle_comm_opened | Static method, called when a widget is constructed. | @staticmethod
def handle_comm_opened(comm, msg):
"""Static method, called when a widget is constructed."""
version = msg.get('metadata', {}).get('version', '')
if version.split('.')[0] != PROTOCOL_VERSION_MAJOR:
raise ValueError("Incompatible widget protocol versions: received version %r, expected version %r"%(version, __protocol_version__))
data = msg['content']['data']
state = data['state']
# Find the widget class to instantiate in the registered widgets
widget_class = Widget.widget_types.get(state['_model_module'],
state['_model_module_version'],
state['_model_name'],
state['_view_module'],
state['_view_module_version'],
state['_view_name'])
widget = widget_class(comm=comm)
if 'buffer_paths' in data:
_put_buffers(state, data['buffer_paths'], msg['buffers'])
widget.set_state(state)
| (comm, msg) |
36,186 | ipywidgets.widgets.widget | hold_sync | Hold syncing any state until the outermost context manager exits | def items(self):
for model_module, mm in sorted(self._registry.items()):
for model_version, mv in sorted(mm.items()):
for model_name, vm in sorted(mv.items()):
for view_module, vv in sorted(vm.items()):
for view_version, vn in sorted(vv.items()):
for view_name, widget in sorted(vn.items()):
yield (model_module, model_version, model_name, view_module, view_version, view_name), widget
| (self) |
36,188 | ipywidgets.widgets.widget | notify_change | Called when a property has changed. | def notify_change(self, change):
"""Called when a property has changed."""
# Send the state to the frontend before the user-registered callbacks
# are called.
name = change['name']
if self.comm is not None and getattr(self.comm, 'kernel', True) is not None:
# Make sure this isn't information that the front-end just sent us.
if name in self.keys and self._should_send_property(name, getattr(self, name)):
# Send new state to front-end
self.send_state(key=name)
super(Widget, self).notify_change(change)
| (self, change) |
36,190 | ipywidgets.widgets.widget | on_displayed | (Un)Register a widget displayed callback.
Parameters
----------
callback: method handler
Must have a signature of::
callback(widget, **kwargs)
kwargs from display are passed through without modification.
remove: bool
True if the callback should be unregistered. | def on_displayed(self, callback, remove=False):
"""(Un)Register a widget displayed callback.
Parameters
----------
callback: method handler
Must have a signature of::
callback(widget, **kwargs)
kwargs from display are passed through without modification.
remove: bool
True if the callback should be unregistered."""
self._display_callbacks.register_callback(callback, remove=remove)
| (self, callback, remove=False) |
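A small sketch of registering a displayed callback; the handler below is illustrative and simply logs the model id once a view has been rendered:
w = KeplerGl(show_docs=False)
handler = lambda widget, **kwargs: print("widget displayed:", widget.model_id)
w.on_displayed(handler)
w.on_displayed(handler, remove=True)  # passing the same reference unregisters it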
36,196 | keplergl.keplergl | save_to_html | Save current map to an interactive html
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- file_name: the html file name, default is keplergl_map.html
- read_only: if read_only is True, hide side panel to disable map customization
Returns:
- an html file will be saved to your notebook
Example of use:
# this will save map with provided data and config
keplergl.save_to_html(data={"data_1": df}, config=config, file_name='first_map.html')
# this will save current map
keplergl.save_to_html(file_name='first_map.html')
| def save_to_html(self, data=None, config=None, file_name='keplergl_map.html', read_only=False, center_map=False):
''' Save current map to an interactive html
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- file_name: the html file name, default is keplergl_map.html
- read_only: if read_only is True, hide side panel to disable map customization
Returns:
- an html file will be saved to your notebook
Example of use:
# this will save map with provided data and config
keplergl.save_to_html(data={"data_1": df}, config=config, file_name='first_map.html')
# this will save current map
keplergl.save_to_html(file_name='first_map.html')
'''
frame_txt = self._repr_html_(data=data, config=config, read_only=read_only, center_map=center_map)
with open(file_name, 'wb') as f:
f.write(frame_txt)
print("Map saved to {}!".format(file_name))
| (self, data=None, config=None, file_name='keplergl_map.html', read_only=False, center_map=False) |
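An end-to-end sketch tying add_data and save_to_html together; the file name and data are illustrative, and the call prints a confirmation once the file is written:
import pandas as pd
from keplergl import KeplerGl
map_2 = KeplerGl(show_docs=False)
map_2.add_data(data=pd.DataFrame({"lat": [37.76], "lon": [-122.43]}), name="points")
map_2.save_to_html(file_name="points_map.html", read_only=True)  # prints "Map saved to points_map.html!"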
36,202 | keplergl.keplergl | show | Display current map in Google Colab
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- read_only: if read_only is True, hide side panel to disable map customization
- center_map: if center_map is True, the bounds of the map will be updated according to the current map data
Example of use:
# this will display map in Google Colab
from keplergl import KeplerGL
map1 = KeplerGL()
map1.show()
| def show(self, data=None, config=None, read_only=False, center_map=False):
''' Display current map in Google Colab
Inputs:
- data: a data dictionary {"name": data}, if not provided, will use current map data
- config: map config dictionary, if not provided, will use current map config
- read_only: if read_only is True, hide side panel to disable map customization
- center_map: if center_map is True, the bounds of the map will be updated according to the current map data
Example of use:
# this will display map in Google Colab
from keplergl import KeplerGL
map1 = KeplerGL()
map1.show()
'''
keplergl_html = resource_string(__name__, 'static/keplergl.html').decode('utf-8')
# find open of body
k = keplergl_html.find("<body>")
data_to_add = data_to_json(self.data, None) if data is None else data_to_json(data, None)
config_to_add = self.config if config is None else config
keplergl_data = json.dumps({"config": config_to_add, "data": data_to_add, "options": {"readOnly": read_only, "centerMap": center_map}})
cmd = """window.__keplerglDataConfig = {};""".format(keplergl_data)
frame_txt = keplergl_html[:k] + "<body><script>" + cmd + "</script>" + keplergl_html[k+6:]
if "google.colab" in sys.modules:
from IPython.display import HTML, Javascript
display(HTML(frame_txt))
display(Javascript(f"google.colab.output.setIframeHeight('{self.height}');"))
| (self, data=None, config=None, read_only=False, center_map=False) |
36,235 | keplergl | _jupyter_nbextension_paths | null | def _jupyter_nbextension_paths():
return [{
'section': 'notebook',
'src': 'static',
'dest': 'keplergl-jupyter',
'require': 'keplergl-jupyter/extension'
}]
| () |
36,237 | keplergl.keplergl | data_from_json | Deserialize a Javascript date. | def data_from_json(js, manager):
'''Deserialize a Javascript date.'''
return js
| (js, manager) |
36,238 | keplergl.keplergl | data_to_json | Serialize a Python date object.
Attributes of this dictionary are to be passed to the JavaScript side.
| def data_to_json(data, manager):
'''Serialize a Python date object.
Attributes of this dictionary are to be passed to the JavaScript side.
'''
if data is None:
return None
else:
if type(data) is not dict:
print(data)
raise Exception('data type incorrect: expecting a dictionary mapping from data id to value, but got {}'.format(type(data)))
else:
dataset = {}
for key, value in data.items():
normalized = _normalize_data(value)
dataset.update({key: normalized})
return dataset
| (data, manager) |
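A behavioural sketch of the serializer pair above, assuming _normalize_data passes plain Python structures through unchanged; data_to_json forwards None, normalizes each named dataset, and data_from_json is the identity (the manager argument is unused):
records = {"points": [{"lat": 37.76, "lon": -122.43}]}
assert data_to_json(None, None) is None
payload = data_to_json(records, None)            # {"points": [...]} with each value normalized
assert data_from_json(payload, None) == payload  # deserializer returns its input as-is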
36,328 | statistics | LinearRegression | LinearRegression(slope, intercept) | from statistics import LinearRegression
| (slope, intercept) |
36,330 | namedtuple_LinearRegression | __new__ | Create new instance of LinearRegression(slope, intercept) | from builtins import function
| (_cls, slope, intercept) |
36,333 | collections | _replace | Return a new LinearRegression object replacing specified fields with new values | def namedtuple(typename, field_names, *, rename=False, defaults=None, module=None):
"""Returns a new subclass of tuple with named fields.
>>> Point = namedtuple('Point', ['x', 'y'])
>>> Point.__doc__ # docstring for the new class
'Point(x, y)'
>>> p = Point(11, y=22) # instantiate with positional args or keywords
>>> p[0] + p[1] # indexable like a plain tuple
33
>>> x, y = p # unpack like a regular tuple
>>> x, y
(11, 22)
>>> p.x + p.y # fields also accessible by name
33
>>> d = p._asdict() # convert to a dictionary
>>> d['x']
11
>>> Point(**d) # convert from a dictionary
Point(x=11, y=22)
>>> p._replace(x=100) # _replace() is like str.replace() but targets named fields
Point(x=100, y=22)
"""
# Validate the field names. At the user's option, either generate an error
# message or automatically replace the field name with a valid name.
if isinstance(field_names, str):
field_names = field_names.replace(',', ' ').split()
field_names = list(map(str, field_names))
typename = _sys.intern(str(typename))
if rename:
seen = set()
for index, name in enumerate(field_names):
if (not name.isidentifier()
or _iskeyword(name)
or name.startswith('_')
or name in seen):
field_names[index] = f'_{index}'
seen.add(name)
for name in [typename] + field_names:
if type(name) is not str:
raise TypeError('Type names and field names must be strings')
if not name.isidentifier():
raise ValueError('Type names and field names must be valid '
f'identifiers: {name!r}')
if _iskeyword(name):
raise ValueError('Type names and field names cannot be a '
f'keyword: {name!r}')
seen = set()
for name in field_names:
if name.startswith('_') and not rename:
raise ValueError('Field names cannot start with an underscore: '
f'{name!r}')
if name in seen:
raise ValueError(f'Encountered duplicate field name: {name!r}')
seen.add(name)
field_defaults = {}
if defaults is not None:
defaults = tuple(defaults)
if len(defaults) > len(field_names):
raise TypeError('Got more default values than field names')
field_defaults = dict(reversed(list(zip(reversed(field_names),
reversed(defaults)))))
# Variables used in the methods and docstrings
field_names = tuple(map(_sys.intern, field_names))
num_fields = len(field_names)
arg_list = ', '.join(field_names)
if num_fields == 1:
arg_list += ','
repr_fmt = '(' + ', '.join(f'{name}=%r' for name in field_names) + ')'
tuple_new = tuple.__new__
_dict, _tuple, _len, _map, _zip = dict, tuple, len, map, zip
# Create all the named tuple methods to be added to the class namespace
namespace = {
'_tuple_new': tuple_new,
'__builtins__': {},
'__name__': f'namedtuple_{typename}',
}
code = f'lambda _cls, {arg_list}: _tuple_new(_cls, ({arg_list}))'
__new__ = eval(code, namespace)
__new__.__name__ = '__new__'
__new__.__doc__ = f'Create new instance of {typename}({arg_list})'
if defaults is not None:
__new__.__defaults__ = defaults
@classmethod
def _make(cls, iterable):
result = tuple_new(cls, iterable)
if _len(result) != num_fields:
raise TypeError(f'Expected {num_fields} arguments, got {len(result)}')
return result
_make.__func__.__doc__ = (f'Make a new {typename} object from a sequence '
'or iterable')
def _replace(self, /, **kwds):
result = self._make(_map(kwds.pop, field_names, self))
if kwds:
raise ValueError(f'Got unexpected field names: {list(kwds)!r}')
return result
_replace.__doc__ = (f'Return a new {typename} object replacing specified '
'fields with new values')
def __repr__(self):
'Return a nicely formatted representation string'
return self.__class__.__name__ + repr_fmt % self
def _asdict(self):
'Return a new dict which maps field names to their values.'
return _dict(_zip(self._fields, self))
def __getnewargs__(self):
'Return self as a plain tuple. Used by copy and pickle.'
return _tuple(self)
# Modify function metadata to help with introspection and debugging
for method in (
__new__,
_make.__func__,
_replace,
__repr__,
_asdict,
__getnewargs__,
):
method.__qualname__ = f'{typename}.{method.__name__}'
# Build-up the class namespace dictionary
# and use type() to build the result class
class_namespace = {
'__doc__': f'{typename}({arg_list})',
'__slots__': (),
'_fields': field_names,
'_field_defaults': field_defaults,
'__new__': __new__,
'_make': _make,
'_replace': _replace,
'__repr__': __repr__,
'_asdict': _asdict,
'__getnewargs__': __getnewargs__,
'__match_args__': field_names,
}
for index, name in enumerate(field_names):
doc = _sys.intern(f'Alias for field number {index}')
class_namespace[name] = _tuplegetter(index, doc)
result = type(typename, (tuple,), class_namespace)
# For pickling to work, the __module__ variable needs to be set to the frame
# where the named tuple is created. Bypass this step in environments where
# sys._getframe is not defined (Jython for example) or sys._getframe is not
# defined for arguments greater than 0 (IronPython), or where the user has
# specified a particular module.
if module is None:
try:
module = _sys._getframe(1).f_globals.get('__name__', '__main__')
except (AttributeError, ValueError):
pass
if module is not None:
result.__module__ = module
return result
| (self, /, **kwds) |
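A doctest-style illustration of the factory above, showing how trailing defaults are applied to the rightmost fields and recorded in _field_defaults:
>>> from collections import namedtuple
>>> Point3D = namedtuple('Point3D', ['x', 'y', 'z'], defaults=[0])
>>> Point3D(1, 2)
Point3D(x=1, y=2, z=0)
>>> Point3D._field_defaults
{'z': 0}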
36,334 | statistics | NormalDist | Normal distribution of a random variable | class NormalDist:
"Normal distribution of a random variable"
# https://en.wikipedia.org/wiki/Normal_distribution
# https://en.wikipedia.org/wiki/Variance#Properties
__slots__ = {
'_mu': 'Arithmetic mean of a normal distribution',
'_sigma': 'Standard deviation of a normal distribution',
}
def __init__(self, mu=0.0, sigma=1.0):
"NormalDist where mu is the mean and sigma is the standard deviation."
if sigma < 0.0:
raise StatisticsError('sigma must be non-negative')
self._mu = float(mu)
self._sigma = float(sigma)
@classmethod
def from_samples(cls, data):
"Make a normal distribution instance from sample data."
if not isinstance(data, (list, tuple)):
data = list(data)
xbar = fmean(data)
return cls(xbar, stdev(data, xbar))
def samples(self, n, *, seed=None):
"Generate *n* samples for a given mean and standard deviation."
gauss = random.gauss if seed is None else random.Random(seed).gauss
mu, sigma = self._mu, self._sigma
return [gauss(mu, sigma) for i in range(n)]
def pdf(self, x):
"Probability density function. P(x <= X < x+dx) / dx"
variance = self._sigma ** 2.0
if not variance:
raise StatisticsError('pdf() not defined when sigma is zero')
return exp((x - self._mu)**2.0 / (-2.0*variance)) / sqrt(tau*variance)
def cdf(self, x):
"Cumulative distribution function. P(X <= x)"
if not self._sigma:
raise StatisticsError('cdf() not defined when sigma is zero')
return 0.5 * (1.0 + erf((x - self._mu) / (self._sigma * sqrt(2.0))))
def inv_cdf(self, p):
"""Inverse cumulative distribution function. x : P(X <= x) = p
Finds the value of the random variable such that the probability of
the variable being less than or equal to that value equals the given
probability.
This function is also called the percent point function or quantile
function.
"""
if p <= 0.0 or p >= 1.0:
raise StatisticsError('p must be in the range 0.0 < p < 1.0')
if self._sigma <= 0.0:
raise StatisticsError('inv_cdf() not defined when sigma is at or below zero')
return _normal_dist_inv_cdf(p, self._mu, self._sigma)
def quantiles(self, n=4):
"""Divide into *n* continuous intervals with equal probability.
Returns a list of (n - 1) cut points separating the intervals.
Set *n* to 4 for quartiles (the default). Set *n* to 10 for deciles.
Set *n* to 100 for percentiles which gives the 99 cut points that
separate the normal distribution into 100 equal-sized groups.
"""
return [self.inv_cdf(i / n) for i in range(1, n)]
def overlap(self, other):
"""Compute the overlapping coefficient (OVL) between two normal distributions.
Measures the agreement between two normal probability distributions.
Returns a value between 0.0 and 1.0 giving the overlapping area in
the two underlying probability density functions.
>>> N1 = NormalDist(2.4, 1.6)
>>> N2 = NormalDist(3.2, 2.0)
>>> N1.overlap(N2)
0.8035050657330205
"""
# See: "The overlapping coefficient as a measure of agreement between
# probability distributions and point estimation of the overlap of two
# normal densities" -- Henry F. Inman and Edwin L. Bradley Jr
# http://dx.doi.org/10.1080/03610928908830127
if not isinstance(other, NormalDist):
raise TypeError('Expected another NormalDist instance')
X, Y = self, other
if (Y._sigma, Y._mu) < (X._sigma, X._mu): # sort to assure commutativity
X, Y = Y, X
X_var, Y_var = X.variance, Y.variance
if not X_var or not Y_var:
raise StatisticsError('overlap() not defined when sigma is zero')
dv = Y_var - X_var
dm = fabs(Y._mu - X._mu)
if not dv:
return 1.0 - erf(dm / (2.0 * X._sigma * sqrt(2.0)))
a = X._mu * Y_var - Y._mu * X_var
b = X._sigma * Y._sigma * sqrt(dm**2.0 + dv * log(Y_var / X_var))
x1 = (a + b) / dv
x2 = (a - b) / dv
return 1.0 - (fabs(Y.cdf(x1) - X.cdf(x1)) + fabs(Y.cdf(x2) - X.cdf(x2)))
def zscore(self, x):
"""Compute the Standard Score. (x - mean) / stdev
Describes *x* in terms of the number of standard deviations
above or below the mean of the normal distribution.
"""
# https://www.statisticshowto.com/probability-and-statistics/z-score/
if not self._sigma:
raise StatisticsError('zscore() not defined when sigma is zero')
return (x - self._mu) / self._sigma
@property
def mean(self):
"Arithmetic mean of the normal distribution."
return self._mu
@property
def median(self):
"Return the median of the normal distribution"
return self._mu
@property
def mode(self):
"""Return the mode of the normal distribution
The mode is the value x where which the probability density
function (pdf) takes its maximum value.
"""
return self._mu
@property
def stdev(self):
"Standard deviation of the normal distribution."
return self._sigma
@property
def variance(self):
"Square of the standard deviation."
return self._sigma ** 2.0
def __add__(x1, x2):
"""Add a constant or another NormalDist instance.
If *other* is a constant, translate mu by the constant,
leaving sigma unchanged.
If *other* is a NormalDist, add both the means and the variances.
Mathematically, this works only if the two distributions are
independent or if they are jointly normally distributed.
"""
if isinstance(x2, NormalDist):
return NormalDist(x1._mu + x2._mu, hypot(x1._sigma, x2._sigma))
return NormalDist(x1._mu + x2, x1._sigma)
def __sub__(x1, x2):
"""Subtract a constant or another NormalDist instance.
If *other* is a constant, translate by the constant mu,
leaving sigma unchanged.
If *other* is a NormalDist, subtract the means and add the variances.
Mathematically, this works only if the two distributions are
independent or if they are jointly normally distributed.
"""
if isinstance(x2, NormalDist):
return NormalDist(x1._mu - x2._mu, hypot(x1._sigma, x2._sigma))
return NormalDist(x1._mu - x2, x1._sigma)
def __mul__(x1, x2):
"""Multiply both mu and sigma by a constant.
Used for rescaling, perhaps to change measurement units.
Sigma is scaled with the absolute value of the constant.
"""
return NormalDist(x1._mu * x2, x1._sigma * fabs(x2))
def __truediv__(x1, x2):
"""Divide both mu and sigma by a constant.
Used for rescaling, perhaps to change measurement units.
Sigma is scaled with the absolute value of the constant.
"""
return NormalDist(x1._mu / x2, x1._sigma / fabs(x2))
def __pos__(x1):
"Return a copy of the instance."
return NormalDist(x1._mu, x1._sigma)
def __neg__(x1):
"Negates mu while keeping sigma the same."
return NormalDist(-x1._mu, x1._sigma)
__radd__ = __add__
def __rsub__(x1, x2):
"Subtract a NormalDist from a constant or another NormalDist."
return -(x1 - x2)
__rmul__ = __mul__
def __eq__(x1, x2):
"Two NormalDist objects are equal if their mu and sigma are both equal."
if not isinstance(x2, NormalDist):
return NotImplemented
return x1._mu == x2._mu and x1._sigma == x2._sigma
def __hash__(self):
"NormalDist objects hash equal if their mu and sigma are both equal."
return hash((self._mu, self._sigma))
def __repr__(self):
return f'{type(self).__name__}(mu={self._mu!r}, sigma={self._sigma!r})'
def __getstate__(self):
return self._mu, self._sigma
def __setstate__(self, state):
self._mu, self._sigma = state
| (mu=0.0, sigma=1.0) |
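A few doctest-style checks of the class above: the cdf at the mean is exactly one half, adding a constant shifts only mu, and summing two independent distributions combines sigmas via hypot:
>>> NormalDist(mu=100.0, sigma=15.0).cdf(100.0)
0.5
>>> NormalDist(2.0, 3.0) + 4
NormalDist(mu=6.0, sigma=3.0)
>>> NormalDist(0.0, 3.0) + NormalDist(0.0, 4.0)
NormalDist(mu=0.0, sigma=5.0)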
36,335 | statistics | __add__ | Add a constant or another NormalDist instance.
If *other* is a constant, translate mu by the constant,
leaving sigma unchanged.
If *other* is a NormalDist, add both the means and the variances.
Mathematically, this works only if the two distributions are
independent or if they are jointly normally distributed.
| def __add__(x1, x2):
"""Add a constant or another NormalDist instance.
If *other* is a constant, translate mu by the constant,
leaving sigma unchanged.
If *other* is a NormalDist, add both the means and the variances.
Mathematically, this works only if the two distributions are
independent or if they are jointly normally distributed.
"""
if isinstance(x2, NormalDist):
return NormalDist(x1._mu + x2._mu, hypot(x1._sigma, x2._sigma))
return NormalDist(x1._mu + x2, x1._sigma)
| (x1, x2) |
36,336 | statistics | __eq__ | Two NormalDist objects are equal if their mu and sigma are both equal. | def __eq__(x1, x2):
"Two NormalDist objects are equal if their mu and sigma are both equal."
if not isinstance(x2, NormalDist):
return NotImplemented
return x1._mu == x2._mu and x1._sigma == x2._sigma
| (x1, x2) |
36,337 | statistics | __getstate__ | null | def __getstate__(self):
return self._mu, self._sigma
| (self) |
36,338 | statistics | __hash__ | NormalDist objects hash equal if their mu and sigma are both equal. | def __hash__(self):
"NormalDist objects hash equal if their mu and sigma are both equal."
return hash((self._mu, self._sigma))
| (self) |
36,339 | statistics | __init__ | NormalDist where mu is the mean and sigma is the standard deviation. | def __init__(self, mu=0.0, sigma=1.0):
"NormalDist where mu is the mean and sigma is the standard deviation."
if sigma < 0.0:
raise StatisticsError('sigma must be non-negative')
self._mu = float(mu)
self._sigma = float(sigma)
| (self, mu=0.0, sigma=1.0) |
36,340 | statistics | __mul__ | Multiply both mu and sigma by a constant.
Used for rescaling, perhaps to change measurement units.
Sigma is scaled with the absolute value of the constant.
| def __mul__(x1, x2):
"""Multiply both mu and sigma by a constant.
Used for rescaling, perhaps to change measurement units.
Sigma is scaled with the absolute value of the constant.
"""
return NormalDist(x1._mu * x2, x1._sigma * fabs(x2))
| (x1, x2) |
36,341 | statistics | __neg__ | Negates mu while keeping sigma the same. | def __neg__(x1):
"Negates mu while keeping sigma the same."
return NormalDist(-x1._mu, x1._sigma)
| (x1) |
36,342 | statistics | __pos__ | Return a copy of the instance. | def __pos__(x1):
"Return a copy of the instance."
return NormalDist(x1._mu, x1._sigma)
| (x1) |
36,344 | statistics | __repr__ | null | def __repr__(self):
return f'{type(self).__name__}(mu={self._mu!r}, sigma={self._sigma!r})'
| (self) |
36,346 | statistics | __rsub__ | Subtract a NormalDist from a constant or another NormalDist. | def __rsub__(x1, x2):
"Subtract a NormalDist from a constant or another NormalDist."
return -(x1 - x2)
| (x1, x2) |
36,347 | statistics | __setstate__ | null | def __setstate__(self, state):
self._mu, self._sigma = state
| (self, state) |
36,348 | statistics | __sub__ | Subtract a constant or another NormalDist instance.
If *other* is a constant, translate by the constant mu,
leaving sigma unchanged.
If *other* is a NormalDist, subtract the means and add the variances.
Mathematically, this works only if the two distributions are
independent or if they are jointly normally distributed.
| def __sub__(x1, x2):
"""Subtract a constant or another NormalDist instance.
If *other* is a constant, translate by the constant mu,
leaving sigma unchanged.
If *other* is a NormalDist, subtract the means and add the variances.
Mathematically, this works only if the two distributions are
independent or if they are jointly normally distributed.
"""
if isinstance(x2, NormalDist):
return NormalDist(x1._mu - x2._mu, hypot(x1._sigma, x2._sigma))
return NormalDist(x1._mu - x2, x1._sigma)
| (x1, x2) |
36,349 | statistics | __truediv__ | Divide both mu and sigma by a constant.
Used for rescaling, perhaps to change measurement units.
Sigma is scaled with the absolute value of the constant.
| def __truediv__(x1, x2):
"""Divide both mu and sigma by a constant.
Used for rescaling, perhaps to change measurement units.
Sigma is scaled with the absolute value of the constant.
"""
return NormalDist(x1._mu / x2, x1._sigma / fabs(x2))
| (x1, x2) |
36,350 | statistics | cdf | Cumulative distribution function. P(X <= x) | def cdf(self, x):
"Cumulative distribution function. P(X <= x)"
if not self._sigma:
raise StatisticsError('cdf() not defined when sigma is zero')
return 0.5 * (1.0 + erf((x - self._mu) / (self._sigma * sqrt(2.0))))
| (self, x) |
36,351 | statistics | inv_cdf | Inverse cumulative distribution function. x : P(X <= x) = p
Finds the value of the random variable such that the probability of
the variable being less than or equal to that value equals the given
probability.
This function is also called the percent point function or quantile
function.
| def inv_cdf(self, p):
"""Inverse cumulative distribution function. x : P(X <= x) = p
Finds the value of the random variable such that the probability of
the variable being less than or equal to that value equals the given
probability.
This function is also called the percent point function or quantile
function.
"""
if p <= 0.0 or p >= 1.0:
raise StatisticsError('p must be in the range 0.0 < p < 1.0')
if self._sigma <= 0.0:
raise StatisticsError('inv_cdf() not defined when sigma is at or below zero')
return _normal_dist_inv_cdf(p, self._mu, self._sigma)
| (self, p) |
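A doctest-style check of inv_cdf: for the standard normal the median maps back to zero, and inv_cdf inverts cdf up to rounding:
>>> NormalDist().inv_cdf(0.5)
0.0
>>> round(NormalDist().cdf(NormalDist().inv_cdf(0.975)), 3)
0.975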
36,352 | statistics | overlap | Compute the overlapping coefficient (OVL) between two normal distributions.
Measures the agreement between two normal probability distributions.
Returns a value between 0.0 and 1.0 giving the overlapping area in
the two underlying probability density functions.
>>> N1 = NormalDist(2.4, 1.6)
>>> N2 = NormalDist(3.2, 2.0)
>>> N1.overlap(N2)
0.8035050657330205
| def overlap(self, other):
"""Compute the overlapping coefficient (OVL) between two normal distributions.
Measures the agreement between two normal probability distributions.
Returns a value between 0.0 and 1.0 giving the overlapping area in
the two underlying probability density functions.
>>> N1 = NormalDist(2.4, 1.6)
>>> N2 = NormalDist(3.2, 2.0)
>>> N1.overlap(N2)
0.8035050657330205
"""
# See: "The overlapping coefficient as a measure of agreement between
# probability distributions and point estimation of the overlap of two
# normal densities" -- Henry F. Inman and Edwin L. Bradley Jr
# http://dx.doi.org/10.1080/03610928908830127
if not isinstance(other, NormalDist):
raise TypeError('Expected another NormalDist instance')
X, Y = self, other
if (Y._sigma, Y._mu) < (X._sigma, X._mu): # sort to assure commutativity
X, Y = Y, X
X_var, Y_var = X.variance, Y.variance
if not X_var or not Y_var:
raise StatisticsError('overlap() not defined when sigma is zero')
dv = Y_var - X_var
dm = fabs(Y._mu - X._mu)
if not dv:
return 1.0 - erf(dm / (2.0 * X._sigma * sqrt(2.0)))
a = X._mu * Y_var - Y._mu * X_var
b = X._sigma * Y._sigma * sqrt(dm**2.0 + dv * log(Y_var / X_var))
x1 = (a + b) / dv
x2 = (a - b) / dv
return 1.0 - (fabs(Y.cdf(x1) - X.cdf(x1)) + fabs(Y.cdf(x2) - X.cdf(x2)))
| (self, other) |
36,353 | statistics | pdf | Probability density function. P(x <= X < x+dx) / dx | def pdf(self, x):
"Probability density function. P(x <= X < x+dx) / dx"
variance = self._sigma ** 2.0
if not variance:
raise StatisticsError('pdf() not defined when sigma is zero')
return exp((x - self._mu)**2.0 / (-2.0*variance)) / sqrt(tau*variance)
| (self, x) |
36,354 | statistics | quantiles | Divide into *n* continuous intervals with equal probability.
Returns a list of (n - 1) cut points separating the intervals.
Set *n* to 4 for quartiles (the default). Set *n* to 10 for deciles.
Set *n* to 100 for percentiles which gives the 99 cut points that
separate the normal distribution into 100 equal-sized groups.
| def quantiles(self, n=4):
"""Divide into *n* continuous intervals with equal probability.
Returns a list of (n - 1) cut points separating the intervals.
Set *n* to 4 for quartiles (the default). Set *n* to 10 for deciles.
Set *n* to 100 for percentiles which gives the 99 cut points that
separate the normal distribution into 100 equal-sized groups.
"""
return [self.inv_cdf(i / n) for i in range(1, n)]
| (self, n=4) |
36,355 | statistics | samples | Generate *n* samples for a given mean and standard deviation. | def samples(self, n, *, seed=None):
"Generate *n* samples for a given mean and standard deviation."
gauss = random.gauss if seed is None else random.Random(seed).gauss
mu, sigma = self._mu, self._sigma
return [gauss(mu, sigma) for i in range(n)]
| (self, n, *, seed=None) |
36,356 | statistics | zscore | Compute the Standard Score. (x - mean) / stdev
Describes *x* in terms of the number of standard deviations
above or below the mean of the normal distribution.
| def zscore(self, x):
"""Compute the Standard Score. (x - mean) / stdev
Describes *x* in terms of the number of standard deviations
above or below the mean of the normal distribution.
"""
# https://www.statisticshowto.com/probability-and-statistics/z-score/
if not self._sigma:
raise StatisticsError('zscore() not defined when sigma is zero')
return (x - self._mu) / self._sigma
| (self, x) |
36,357 | statistics | StatisticsError | null | class StatisticsError(ValueError):
pass
| null |
36,358 | statistics | _coerce | Coerce types T and S to a common type, or raise TypeError.
Coercion rules are currently an implementation detail. See the CoerceTest
test class in test_statistics for details.
| def _coerce(T, S):
"""Coerce types T and S to a common type, or raise TypeError.
Coercion rules are currently an implementation detail. See the CoerceTest
test class in test_statistics for details.
"""
# See http://bugs.python.org/issue24068.
assert T is not bool, "initial type T is bool"
# If the types are the same, no need to coerce anything. Put this
# first, so that the usual case (no coercion needed) happens as soon
# as possible.
if T is S: return T
# Mixed int & other coerce to the other type.
if S is int or S is bool: return T
if T is int: return S
# If one is a (strict) subclass of the other, coerce to the subclass.
if issubclass(S, T): return S
if issubclass(T, S): return T
# Ints coerce to the other type.
if issubclass(T, int): return S
if issubclass(S, int): return T
# Mixed fraction & float coerces to float (or float subclass).
if issubclass(T, Fraction) and issubclass(S, float):
return S
if issubclass(T, float) and issubclass(S, Fraction):
return T
# Any other combination is disallowed.
msg = "don't know how to coerce %s and %s"
raise TypeError(msg % (T.__name__, S.__name__))
| (T, S) |
36,359 | statistics | _convert | Convert value to given numeric type T. | def _convert(value, T):
"""Convert value to given numeric type T."""
if type(value) is T:
# This covers the cases where T is Fraction, or where value is
# a NAN or INF (Decimal or float).
return value
if issubclass(T, int) and value.denominator != 1:
T = float
try:
# FIXME: what do we do if this overflows?
return T(value)
except TypeError:
if issubclass(T, Decimal):
return T(value.numerator) / T(value.denominator)
else:
raise
| (value, T) |
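A quick sketch of the private _convert helper under its current contract: exact Fractions are narrowed back to the requested numeric type, and integral ratios stay int:
>>> from fractions import Fraction
>>> _convert(Fraction(19, 2), float)
9.5
>>> _convert(Fraction(6, 3), int)
2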
36,360 | statistics | _exact_ratio | Return Real number x to exact (numerator, denominator) pair.
>>> _exact_ratio(0.25)
(1, 4)
x is expected to be an int, Fraction, Decimal or float.
| def _exact_ratio(x):
"""Return Real number x to exact (numerator, denominator) pair.
>>> _exact_ratio(0.25)
(1, 4)
x is expected to be an int, Fraction, Decimal or float.
"""
try:
return x.as_integer_ratio()
except AttributeError:
pass
except (OverflowError, ValueError):
# float NAN or INF.
assert not _isfinite(x)
return (x, None)
try:
# x may be an Integral ABC.
return (x.numerator, x.denominator)
except AttributeError:
msg = f"can't convert type '{type(x).__name__}' to numerator/denominator"
raise TypeError(msg)
| (x) |
36,361 | statistics | _fail_neg | Iterate over values, failing if any are less than zero. | def _fail_neg(values, errmsg='negative value'):
"""Iterate over values, failing if any are less than zero."""
for x in values:
if x < 0:
raise StatisticsError(errmsg)
yield x
| (values, errmsg='negative value') |
36,362 | statistics | _find_lteq | Locate the leftmost value exactly equal to x | def _find_lteq(a, x):
'Locate the leftmost value exactly equal to x'
i = bisect_left(a, x)
if i != len(a) and a[i] == x:
return i
raise ValueError
| (a, x) |
36,363 | statistics | _find_rteq | Locate the rightmost value exactly equal to x | def _find_rteq(a, l, x):
'Locate the rightmost value exactly equal to x'
i = bisect_right(a, x, lo=l)
if i != (len(a) + 1) and a[i - 1] == x:
return i - 1
raise ValueError
| (a, l, x) |
36,364 | statistics | _isfinite | null | def _isfinite(x):
try:
return x.is_finite() # Likely a Decimal.
except AttributeError:
return math.isfinite(x) # Coerces to float first.
| (x) |
36,365 | statistics | _ss | Return sum of square deviations of sequence data.
If ``c`` is None, the mean is calculated in one pass, and the deviations
from the mean are calculated in a second pass. Otherwise, deviations are
calculated from ``c`` as given. Use the second case with care, as it can
lead to garbage results.
| def _ss(data, c=None):
"""Return sum of square deviations of sequence data.
If ``c`` is None, the mean is calculated in one pass, and the deviations
from the mean are calculated in a second pass. Otherwise, deviations are
calculated from ``c`` as given. Use the second case with care, as it can
lead to garbage results.
"""
if c is not None:
T, total, count = _sum((x-c)**2 for x in data)
return (T, total)
T, total, count = _sum(data)
mean_n, mean_d = (total / count).as_integer_ratio()
partials = Counter()
for n, d in map(_exact_ratio, data):
diff_n = n * mean_d - d * mean_n
diff_d = d * mean_d
partials[diff_d * diff_d] += diff_n * diff_n
if None in partials:
# The sum will be a NAN or INF. We can ignore all the finite
# partials, and just look at this special one.
total = partials[None]
assert not _isfinite(total)
else:
total = sum(Fraction(n, d) for d, n in partials.items())
return (T, total)
| (data, c=None) |
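A doctest-style check of the private _ss helper, assuming its current two-pass behaviour: it returns the coerced type together with the exact sum of squared deviations as a Fraction:
>>> _ss([1, 2, 3])
(<class 'int'>, Fraction(2, 1))
>>> _ss([1, 2, 3], c=0)
(<class 'int'>, Fraction(14, 1))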
36,366 | statistics | _sum | _sum(data) -> (type, sum, count)
Return a high-precision sum of the given numeric data as a fraction,
together with the type to be converted to and the count of items.
Examples
--------
>>> _sum([3, 2.25, 4.5, -0.5, 0.25])
(<class 'float'>, Fraction(19, 2), 5)
Some sources of round-off error will be avoided:
# Built-in sum returns zero.
>>> _sum([1e50, 1, -1e50] * 1000)
(<class 'float'>, Fraction(1000, 1), 3000)
Fractions and Decimals are also supported:
>>> from fractions import Fraction as F
>>> _sum([F(2, 3), F(7, 5), F(1, 4), F(5, 6)])
(<class 'fractions.Fraction'>, Fraction(63, 20), 4)
>>> from decimal import Decimal as D
>>> data = [D("0.1375"), D("0.2108"), D("0.3061"), D("0.0419")]
>>> _sum(data)
(<class 'decimal.Decimal'>, Fraction(6963, 10000), 4)
Mixed types are currently treated as an error, except that int is
allowed.
| def _sum(data):
"""_sum(data) -> (type, sum, count)
Return a high-precision sum of the given numeric data as a fraction,
together with the type to be converted to and the count of items.
Examples
--------
>>> _sum([3, 2.25, 4.5, -0.5, 0.25])
(<class 'float'>, Fraction(19, 2), 5)
Some sources of round-off error will be avoided:
# Built-in sum returns zero.
>>> _sum([1e50, 1, -1e50] * 1000)
(<class 'float'>, Fraction(1000, 1), 3000)
Fractions and Decimals are also supported:
>>> from fractions import Fraction as F
>>> _sum([F(2, 3), F(7, 5), F(1, 4), F(5, 6)])
(<class 'fractions.Fraction'>, Fraction(63, 20), 4)
>>> from decimal import Decimal as D
>>> data = [D("0.1375"), D("0.2108"), D("0.3061"), D("0.0419")]
>>> _sum(data)
(<class 'decimal.Decimal'>, Fraction(6963, 10000), 4)
Mixed types are currently treated as an error, except that int is
allowed.
"""
count = 0
partials = {}
partials_get = partials.get
T = int
for typ, values in groupby(data, type):
T = _coerce(T, typ) # or raise TypeError
for n, d in map(_exact_ratio, values):
count += 1
partials[d] = partials_get(d, 0) + n
if None in partials:
# The sum will be a NAN or INF. We can ignore all the finite
# partials, and just look at this special one.
total = partials[None]
assert not _isfinite(total)
else:
# Sum all the partial sums using builtin sum.
total = sum(Fraction(n, d) for d, n in partials.items())
return (T, total, count)
| (data) |
36,367 | statistics | correlation | Pearson's correlation coefficient
Return the Pearson's correlation coefficient for two inputs. Pearson's
correlation coefficient *r* takes values between -1 and +1. It measures the
strength and direction of the linear relationship, where +1 means very
strong, positive linear relationship, -1 very strong, negative linear
relationship, and 0 no linear relationship.
>>> x = [1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> y = [9, 8, 7, 6, 5, 4, 3, 2, 1]
>>> correlation(x, x)
1.0
>>> correlation(x, y)
-1.0
| def correlation(x, y, /):
"""Pearson's correlation coefficient
Return the Pearson's correlation coefficient for two inputs. Pearson's
correlation coefficient *r* takes values between -1 and +1. It measures the
strength and direction of the linear relationship, where +1 means very
strong, positive linear relationship, -1 very strong, negative linear
relationship, and 0 no linear relationship.
>>> x = [1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> y = [9, 8, 7, 6, 5, 4, 3, 2, 1]
>>> correlation(x, x)
1.0
>>> correlation(x, y)
-1.0
"""
n = len(x)
if len(y) != n:
raise StatisticsError('correlation requires that both inputs have same number of data points')
if n < 2:
raise StatisticsError('correlation requires at least two data points')
xbar = fsum(x) / n
ybar = fsum(y) / n
sxy = fsum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = fsum((xi - xbar) ** 2.0 for xi in x)
syy = fsum((yi - ybar) ** 2.0 for yi in y)
try:
return sxy / sqrt(sxx * syy)
except ZeroDivisionError:
raise StatisticsError('at least one of the inputs is constant')
| (x, y, /) |
36,368 | statistics | covariance | Covariance
Return the sample covariance of two inputs *x* and *y*. Covariance
is a measure of the joint variability of two inputs.
>>> x = [1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> y = [1, 2, 3, 1, 2, 3, 1, 2, 3]
>>> covariance(x, y)
0.75
>>> z = [9, 8, 7, 6, 5, 4, 3, 2, 1]
>>> covariance(x, z)
-7.5
>>> covariance(z, x)
-7.5
| def covariance(x, y, /):
"""Covariance
Return the sample covariance of two inputs *x* and *y*. Covariance
is a measure of the joint variability of two inputs.
>>> x = [1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> y = [1, 2, 3, 1, 2, 3, 1, 2, 3]
>>> covariance(x, y)
0.75
>>> z = [9, 8, 7, 6, 5, 4, 3, 2, 1]
>>> covariance(x, z)
-7.5
>>> covariance(z, x)
-7.5
"""
n = len(x)
if len(y) != n:
raise StatisticsError('covariance requires that both inputs have same number of data points')
if n < 2:
raise StatisticsError('covariance requires at least two data points')
xbar = fsum(x) / n
ybar = fsum(y) / n
sxy = fsum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
return sxy / (n - 1)
| (x, y, /) |
36,369 | statistics | fmean | Convert data to floats and compute the arithmetic mean.
This runs faster than the mean() function and it always returns a float.
If the input dataset is empty, it raises a StatisticsError.
>>> fmean([3.5, 4.0, 5.25])
4.25
| def fmean(data):
"""Convert data to floats and compute the arithmetic mean.
This runs faster than the mean() function and it always returns a float.
If the input dataset is empty, it raises a StatisticsError.
>>> fmean([3.5, 4.0, 5.25])
4.25
"""
try:
n = len(data)
except TypeError:
# Handle iterators that do not define __len__().
n = 0
def count(iterable):
nonlocal n
for n, x in enumerate(iterable, start=1):
yield x
total = fsum(count(data))
else:
total = fsum(data)
try:
return total / n
except ZeroDivisionError:
raise StatisticsError('fmean requires at least one data point') from None
| (data) |
36,370 | statistics | geometric_mean | Convert data to floats and compute the geometric mean.
Raises a StatisticsError if the input dataset is empty,
if it contains a zero, or if it contains a negative value.
No special efforts are made to achieve exact results.
(However, this may change in the future.)
>>> round(geometric_mean([54, 24, 36]), 9)
36.0
| def geometric_mean(data):
"""Convert data to floats and compute the geometric mean.
Raises a StatisticsError if the input dataset is empty,
if it contains a zero, or if it contains a negative value.
No special efforts are made to achieve exact results.
(However, this may change in the future.)
>>> round(geometric_mean([54, 24, 36]), 9)
36.0
"""
try:
return exp(fmean(map(log, data)))
except ValueError:
raise StatisticsError('geometric mean requires a non-empty dataset '
'containing positive numbers') from None
| (data) |
36,372 | statistics | harmonic_mean | Return the harmonic mean of data.
The harmonic mean is the reciprocal of the arithmetic mean of the
reciprocals of the data. It can be used for averaging ratios or
rates, for example speeds.
Suppose a car travels 40 km/hr for 5 km and then speeds-up to
60 km/hr for another 5 km. What is the average speed?
>>> harmonic_mean([40, 60])
48.0
Suppose a car travels 40 km/hr for 5 km, and when traffic clears,
speeds-up to 60 km/hr for the remaining 30 km of the journey. What
is the average speed?
>>> harmonic_mean([40, 60], weights=[5, 30])
56.0
If ``data`` is empty, or any element is less than zero,
``harmonic_mean`` will raise ``StatisticsError``.
| def harmonic_mean(data, weights=None):
"""Return the harmonic mean of data.
The harmonic mean is the reciprocal of the arithmetic mean of the
reciprocals of the data. It can be used for averaging ratios or
rates, for example speeds.
Suppose a car travels 40 km/hr for 5 km and then speeds-up to
60 km/hr for another 5 km. What is the average speed?
>>> harmonic_mean([40, 60])
48.0
Suppose a car travels 40 km/hr for 5 km, and when traffic clears,
speeds-up to 60 km/hr for the remaining 30 km of the journey. What
is the average speed?
>>> harmonic_mean([40, 60], weights=[5, 30])
56.0
If ``data`` is empty, or any element is less than zero,
``harmonic_mean`` will raise ``StatisticsError``.
"""
if iter(data) is data:
data = list(data)
errmsg = 'harmonic mean does not support negative values'
n = len(data)
if n < 1:
raise StatisticsError('harmonic_mean requires at least one data point')
elif n == 1 and weights is None:
x = data[0]
if isinstance(x, (numbers.Real, Decimal)):
if x < 0:
raise StatisticsError(errmsg)
return x
else:
raise TypeError('unsupported type')
if weights is None:
weights = repeat(1, n)
sum_weights = n
else:
if iter(weights) is weights:
weights = list(weights)
if len(weights) != n:
raise StatisticsError('Number of weights does not match data size')
_, sum_weights, _ = _sum(w for w in _fail_neg(weights, errmsg))
try:
data = _fail_neg(data, errmsg)
T, total, count = _sum(w / x if w else 0 for w, x in zip(weights, data))
except ZeroDivisionError:
return 0
if total <= 0:
raise StatisticsError('Weighted sum must be positive')
return _convert(sum_weights / total, T)
| (data, weights=None) |
36,373 | operator | itemgetter | itemgetter(item, ...) --> itemgetter object
Return a callable object that fetches the given item(s) from its operand.
After f = itemgetter(2), the call f(r) returns r[2].
After g = itemgetter(2, 5, 3), the call g(r) returns (r[2], r[5], r[3]) | class itemgetter:
"""
Return a callable object that fetches the given item(s) from its operand.
After f = itemgetter(2), the call f(r) returns r[2].
After g = itemgetter(2, 5, 3), the call g(r) returns (r[2], r[5], r[3])
"""
__slots__ = ('_items', '_call')
def __init__(self, item, *items):
if not items:
self._items = (item,)
def func(obj):
return obj[item]
self._call = func
else:
self._items = items = (item,) + items
def func(obj):
return tuple(obj[i] for i in items)
self._call = func
def __call__(self, obj):
return self._call(obj)
def __repr__(self):
return '%s.%s(%s)' % (self.__class__.__module__,
self.__class__.__name__,
', '.join(map(repr, self._items)))
def __reduce__(self):
return self.__class__, self._items
| null |
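Doctest-style examples of the two branches above, a single item versus a tuple of items, plus the common sort-key use:
>>> from operator import itemgetter
>>> itemgetter(1)('ABCDEFG')
'B'
>>> itemgetter(1, 3, 5)('ABCDEFG')
('B', 'D', 'F')
>>> sorted([('b', 2), ('a', 1)], key=itemgetter(0))
[('a', 1), ('b', 2)]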
36,374 | statistics | linear_regression | Slope and intercept for simple linear regression.
Return the slope and intercept of simple linear regression
parameters estimated using ordinary least squares. Simple linear
regression describes relationship between an independent variable
*x* and a dependent variable *y* in terms of linear function:
y = slope * x + intercept + noise
where *slope* and *intercept* are the regression parameters that are
estimated, and noise represents the variability of the data that was
not explained by the linear regression (it is equal to the
difference between predicted and actual values of the dependent
variable).
The parameters are returned as a named tuple.
>>> x = [1, 2, 3, 4, 5]
>>> noise = NormalDist().samples(5, seed=42)
>>> y = [3 * x[i] + 2 + noise[i] for i in range(5)]
>>> linear_regression(x, y) #doctest: +ELLIPSIS
LinearRegression(slope=3.09078914170..., intercept=1.75684970486...)
| def linear_regression(x, y, /):
"""Slope and intercept for simple linear regression.
Return the slope and intercept of simple linear regression
parameters estimated using ordinary least squares. Simple linear
regression describes relationship between an independent variable
*x* and a dependent variable *y* in terms of linear function:
y = slope * x + intercept + noise
where *slope* and *intercept* are the regression parameters that are
estimated, and noise represents the variability of the data that was
not explained by the linear regression (it is equal to the
difference between predicted and actual values of the dependent
variable).
The parameters are returned as a named tuple.
>>> x = [1, 2, 3, 4, 5]
>>> noise = NormalDist().samples(5, seed=42)
>>> y = [3 * x[i] + 2 + noise[i] for i in range(5)]
>>> linear_regression(x, y) #doctest: +ELLIPSIS
LinearRegression(slope=3.09078914170..., intercept=1.75684970486...)
"""
n = len(x)
if len(y) != n:
raise StatisticsError('linear regression requires that both inputs have same number of data points')
if n < 2:
raise StatisticsError('linear regression requires at least two data points')
xbar = fsum(x) / n
ybar = fsum(y) / n
sxy = fsum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = fsum((xi - xbar) ** 2.0 for xi in x)
try:
slope = sxy / sxx # equivalent to: covariance(x, y) / variance(x)
except ZeroDivisionError:
raise StatisticsError('x is constant')
intercept = ybar - slope * xbar
return LinearRegression(slope=slope, intercept=intercept)
| (x, y, /) |
36,376 | statistics | mean | Return the sample arithmetic mean of data.
>>> mean([1, 2, 3, 4, 4])
2.8
>>> from fractions import Fraction as F
>>> mean([F(3, 7), F(1, 21), F(5, 3), F(1, 3)])
Fraction(13, 21)
>>> from decimal import Decimal as D
>>> mean([D("0.5"), D("0.75"), D("0.625"), D("0.375")])
Decimal('0.5625')
If ``data`` is empty, StatisticsError will be raised.
| def mean(data):
"""Return the sample arithmetic mean of data.
>>> mean([1, 2, 3, 4, 4])
2.8
>>> from fractions import Fraction as F
>>> mean([F(3, 7), F(1, 21), F(5, 3), F(1, 3)])
Fraction(13, 21)
>>> from decimal import Decimal as D
>>> mean([D("0.5"), D("0.75"), D("0.625"), D("0.375")])
Decimal('0.5625')
If ``data`` is empty, StatisticsError will be raised.
"""
if iter(data) is data:
data = list(data)
n = len(data)
if n < 1:
raise StatisticsError('mean requires at least one data point')
T, total, count = _sum(data)
assert count == n
return _convert(total / n, T)
| (data) |
36,377 | statistics | median | Return the median (middle value) of numeric data.
When the number of data points is odd, return the middle data point.
When the number of data points is even, the median is interpolated by
taking the average of the two middle values:
>>> median([1, 3, 5])
3
>>> median([1, 3, 5, 7])
4.0
| def median(data):
"""Return the median (middle value) of numeric data.
When the number of data points is odd, return the middle data point.
When the number of data points is even, the median is interpolated by
taking the average of the two middle values:
>>> median([1, 3, 5])
3
>>> median([1, 3, 5, 7])
4.0
"""
data = sorted(data)
n = len(data)
if n == 0:
raise StatisticsError("no median for empty data")
if n % 2 == 1:
return data[n // 2]
else:
i = n // 2
return (data[i - 1] + data[i]) / 2
| (data) |
36,378 | statistics | median_grouped | Return the 50th percentile (median) of grouped continuous data.
>>> median_grouped([1, 2, 2, 3, 4, 4, 4, 4, 4, 5])
3.7
>>> median_grouped([52, 52, 53, 54])
52.5
This calculates the median as the 50th percentile, and should be
used when your data is continuous and grouped. In the above example,
the values 1, 2, 3, etc. actually represent the midpoint of classes
0.5-1.5, 1.5-2.5, 2.5-3.5, etc. The middle value falls somewhere in
class 3.5-4.5, and interpolation is used to estimate it.
Optional argument ``interval`` represents the class interval, and
defaults to 1. Changing the class interval naturally will change the
interpolated 50th percentile value:
>>> median_grouped([1, 3, 3, 5, 7], interval=1)
3.25
>>> median_grouped([1, 3, 3, 5, 7], interval=2)
3.5
This function does not check whether the data points are at least
``interval`` apart.
| def median_grouped(data, interval=1):
"""Return the 50th percentile (median) of grouped continuous data.
>>> median_grouped([1, 2, 2, 3, 4, 4, 4, 4, 4, 5])
3.7
>>> median_grouped([52, 52, 53, 54])
52.5
This calculates the median as the 50th percentile, and should be
used when your data is continuous and grouped. In the above example,
the values 1, 2, 3, etc. actually represent the midpoint of classes
0.5-1.5, 1.5-2.5, 2.5-3.5, etc. The middle value falls somewhere in
class 3.5-4.5, and interpolation is used to estimate it.
Optional argument ``interval`` represents the class interval, and
defaults to 1. Changing the class interval naturally will change the
interpolated 50th percentile value:
>>> median_grouped([1, 3, 3, 5, 7], interval=1)
3.25
>>> median_grouped([1, 3, 3, 5, 7], interval=2)
3.5
This function does not check whether the data points are at least
``interval`` apart.
"""
data = sorted(data)
n = len(data)
if n == 0:
raise StatisticsError("no median for empty data")
elif n == 1:
return data[0]
# Find the value at the midpoint. Remember this corresponds to the
# centre of the class interval.
x = data[n // 2]
for obj in (x, interval):
if isinstance(obj, (str, bytes)):
raise TypeError('expected number but got %r' % obj)
try:
L = x - interval / 2 # The lower limit of the median interval.
except TypeError:
# Mixed type. For now we just coerce to float.
L = float(x) - float(interval) / 2
# Uses bisection search to search for x in data with log(n) time complexity
# Find the position of leftmost occurrence of x in data
l1 = _find_lteq(data, x)
# Find the position of rightmost occurrence of x in data[l1...len(data)]
# Assuming always l1 <= l2
l2 = _find_rteq(data, l1, x)
cf = l1
f = l2 - l1 + 1
return L + interval * (n / 2 - cf) / f
| (data, interval=1) |
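Worked check of the interpolation formula in the median_grouped record above, L + interval * (n/2 - cf) / f, using the data from its first doctest (the variable names mirror the record's code; this sketch is not part of the record):

from statistics import median_grouped

data = [1, 2, 2, 3, 4, 4, 4, 4, 4, 5]   # the first doctest's data
n = len(data)                            # 10
x = sorted(data)[n // 2]                 # 4: midpoint of the 3.5-4.5 class
L = x - 1 / 2                            # 3.5: lower limit of that class
cf = sorted(data).index(x)               # 4 values fall below the median class
f = sorted(data).count(x)                # 5 values fall inside it
assert abs(L + 1 * (n / 2 - cf) / f - 3.7) < 1e-12   # 3.5 + (5 - 4) / 5 = 3.7
assert abs(median_grouped(data) - 3.7) < 1e-12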
36,379 | statistics | median_high | Return the high median of data.
When the number of data points is odd, the middle value is returned.
When it is even, the larger of the two middle values is returned.
>>> median_high([1, 3, 5])
3
>>> median_high([1, 3, 5, 7])
5
| def median_high(data):
"""Return the high median of data.
When the number of data points is odd, the middle value is returned.
When it is even, the larger of the two middle values is returned.
>>> median_high([1, 3, 5])
3
>>> median_high([1, 3, 5, 7])
5
"""
data = sorted(data)
n = len(data)
if n == 0:
raise StatisticsError("no median for empty data")
return data[n // 2]
| (data) |
36,380 | statistics | median_low | Return the low median of numeric data.
When the number of data points is odd, the middle value is returned.
When it is even, the smaller of the two middle values is returned.
>>> median_low([1, 3, 5])
3
>>> median_low([1, 3, 5, 7])
3
| def median_low(data):
"""Return the low median of numeric data.
When the number of data points is odd, the middle value is returned.
When it is even, the smaller of the two middle values is returned.
>>> median_low([1, 3, 5])
3
>>> median_low([1, 3, 5, 7])
3
"""
data = sorted(data)
n = len(data)
if n == 0:
raise StatisticsError("no median for empty data")
if n % 2 == 1:
return data[n // 2]
else:
return data[n // 2 - 1]
| (data) |
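Quick illustrative comparison of the three median records above for even-length data (the sample list is an assumption, not taken from the records): median interpolates, while median_low and median_high each pick one of the two middle values.

from statistics import median, median_high, median_low

data = [1, 3, 5, 7]
assert median(data) == 4.0      # interpolated: average of the two middle values
assert median_low(data) == 3    # smaller of the two middle values
assert median_high(data) == 5   # larger of the two middle values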
36,381 | statistics | mode | Return the most common data point from discrete or nominal data.
``mode`` assumes discrete data, and returns a single value. This is the
standard treatment of the mode as commonly taught in schools:
>>> mode([1, 1, 2, 3, 3, 3, 3, 4])
3
This also works with nominal (non-numeric) data:
>>> mode(["red", "blue", "blue", "red", "green", "red", "red"])
'red'
If there are multiple modes with same frequency, return the first one
encountered:
>>> mode(['red', 'red', 'green', 'blue', 'blue'])
'red'
If *data* is empty, ``mode``, raises StatisticsError.
| def mode(data):
"""Return the most common data point from discrete or nominal data.
``mode`` assumes discrete data, and returns a single value. This is the
standard treatment of the mode as commonly taught in schools:
>>> mode([1, 1, 2, 3, 3, 3, 3, 4])
3
This also works with nominal (non-numeric) data:
>>> mode(["red", "blue", "blue", "red", "green", "red", "red"])
'red'
If there are multiple modes with same frequency, return the first one
encountered:
>>> mode(['red', 'red', 'green', 'blue', 'blue'])
'red'
If *data* is empty, ``mode``, raises StatisticsError.
"""
pairs = Counter(iter(data)).most_common(1)
try:
return pairs[0][0]
except IndexError:
raise StatisticsError('no mode for empty data') from None
| (data) |
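Minimal sketch of the tie-breaking behaviour described in the mode record above (the sample data is illustrative): Counter.most_common(1) keeps first-seen order among equal counts, which is why the tie resolves to the first mode encountered.

from collections import Counter
from statistics import mode

data = ['red', 'red', 'green', 'blue', 'blue']
# 'red' and 'blue' both occur twice; the first-seen value wins the tie
assert Counter(data).most_common(1) == [('red', 2)]
assert mode(data) == 'red'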
36,382 | statistics | multimode | Return a list of the most frequently occurring values.
Will return more than one result if there are multiple modes
or an empty list if *data* is empty.
>>> multimode('aabbbbbbbbcc')
['b']
>>> multimode('aabbbbccddddeeffffgg')
['b', 'd', 'f']
>>> multimode('')
[]
| def multimode(data):
"""Return a list of the most frequently occurring values.
Will return more than one result if there are multiple modes
or an empty list if *data* is empty.
>>> multimode('aabbbbbbbbcc')
['b']
>>> multimode('aabbbbccddddeeffffgg')
['b', 'd', 'f']
>>> multimode('')
[]
"""
counts = Counter(iter(data)).most_common()
maxcount, mode_items = next(groupby(counts, key=itemgetter(1)), (0, []))
return list(map(itemgetter(0), mode_items))
| (data) |
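Small sketch of how the groupby/itemgetter line in the multimode record above collects every value tied for the highest count (the sample string is illustrative and mirrors the record's doctest):

from collections import Counter
from itertools import groupby
from operator import itemgetter
from statistics import multimode

data = 'aabbbbccddddeeffffgg'
counts = Counter(data).most_common()   # descending by count: b, d, f first
maxcount, mode_items = next(groupby(counts, key=itemgetter(1)), (0, []))
# groupby batches runs of equal counts, so the first run holds every top value
assert maxcount == 4
assert [value for value, _ in mode_items] == ['b', 'd', 'f']
assert multimode(data) == ['b', 'd', 'f']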