applied-ai-018 committed on
Commit ab25032 · verified · 1 Parent(s): 5e18c37

Add files using upload-large-folder tool

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. ckpts/universal/global_step80/zero/20.mlp.dense_h_to_4h.weight/exp_avg.pt +3 -0
  2. venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/__init__.cpython-310.pyc +0 -0
  3. venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/continued_fraction.cpython-310.pyc +0 -0
  4. venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/factor_.cpython-310.pyc +0 -0
  5. venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/generate.cpython-310.pyc +0 -0
  6. venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/modular.cpython-310.pyc +0 -0
  7. venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/primetest.cpython-310.pyc +0 -0
  8. venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_ecm.cpython-310.pyc +0 -0
  9. venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_elliptic_curve.cpython-310.pyc +0 -0
  10. venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_modular.cpython-310.pyc +0 -0
  11. venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_partitions.cpython-310.pyc +0 -0
  12. venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_primetest.cpython-310.pyc +0 -0
  13. venv/lib/python3.10/site-packages/sympy/utilities/__init__.py +30 -0
  14. venv/lib/python3.10/site-packages/sympy/utilities/autowrap.py +1129 -0
  15. venv/lib/python3.10/site-packages/sympy/utilities/codegen.py +2236 -0
  16. venv/lib/python3.10/site-packages/sympy/utilities/decorator.py +330 -0
  17. venv/lib/python3.10/site-packages/sympy/utilities/enumerative.py +1157 -0
  18. venv/lib/python3.10/site-packages/sympy/utilities/exceptions.py +275 -0
  19. venv/lib/python3.10/site-packages/sympy/utilities/iterables.py +3172 -0
  20. venv/lib/python3.10/site-packages/sympy/utilities/lambdify.py +1526 -0
  21. venv/lib/python3.10/site-packages/sympy/utilities/magic.py +12 -0
  22. venv/lib/python3.10/site-packages/sympy/utilities/matchpy_connector.py +289 -0
  23. venv/lib/python3.10/site-packages/sympy/utilities/mathml/__init__.py +79 -0
  24. venv/lib/python3.10/site-packages/sympy/utilities/mathml/__pycache__/__init__.cpython-310.pyc +0 -0
  25. venv/lib/python3.10/site-packages/sympy/utilities/mathml/data/mmlctop.xsl +0 -0
  26. venv/lib/python3.10/site-packages/sympy/utilities/mathml/data/mmltex.xsl +0 -0
  27. venv/lib/python3.10/site-packages/sympy/utilities/mathml/data/simple_mmlctop.xsl +0 -0
  28. venv/lib/python3.10/site-packages/sympy/utilities/memoization.py +59 -0
  29. venv/lib/python3.10/site-packages/sympy/utilities/misc.py +565 -0
  30. venv/lib/python3.10/site-packages/sympy/utilities/pkgdata.py +56 -0
  31. venv/lib/python3.10/site-packages/sympy/utilities/pytest.py +12 -0
  32. venv/lib/python3.10/site-packages/sympy/utilities/randtest.py +12 -0
  33. venv/lib/python3.10/site-packages/sympy/utilities/runtests.py +13 -0
  34. venv/lib/python3.10/site-packages/sympy/utilities/source.py +40 -0
  35. venv/lib/python3.10/site-packages/sympy/utilities/tests/__init__.py +0 -0
  36. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/__init__.cpython-310.pyc +0 -0
  37. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_autowrap.cpython-310.pyc +0 -0
  38. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen.cpython-310.pyc +0 -0
  39. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen_julia.cpython-310.pyc +0 -0
  40. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen_octave.cpython-310.pyc +0 -0
  41. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen_rust.cpython-310.pyc +0 -0
  42. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_decorator.cpython-310.pyc +0 -0
  43. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_deprecated.cpython-310.pyc +0 -0
  44. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_enumerative.cpython-310.pyc +0 -0
  45. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_exceptions.cpython-310.pyc +0 -0
  46. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_iterables.cpython-310.pyc +0 -0
  47. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_lambdify.cpython-310.pyc +0 -0
  48. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_matchpy_connector.cpython-310.pyc +0 -0
  49. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_mathml.cpython-310.pyc +0 -0
  50. venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_misc.cpython-310.pyc +0 -0
ckpts/universal/global_step80/zero/20.mlp.dense_h_to_4h.weight/exp_avg.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:22fca2a333686e5de2f3b3e80533256452bbbd17bc1b55791ab9b55c33b9b53c
+ size 33555612
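The three added lines above form a Git LFS pointer file (spec v1): a small text stand-in for the large checkpoint tensor, recording its hash and size. As an illustration only (not part of the commit), a minimal stdlib-only parser for this format could look like:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS v1 pointer file into its key/value fields.

    Each line is "<key> <value>"; the first line must name the spec version.
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    if fields.get("version") != "https://git-lfs.github.com/spec/v1":
        raise ValueError("not a Git LFS v1 pointer")
    # The oid is "<hash-algo>:<hex digest>"; size is the object's byte count.
    algo, _, digest = fields["oid"].partition(":")
    return {"algo": algo, "digest": digest, "size": int(fields["size"])}

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:22fca2a333686e5de2f3b3e80533256452bbbd17bc1b55791ab9b55c33b9b53c
size 33555612
"""
info = parse_lfs_pointer(pointer)
```

For the checkpoint file in this commit, the pointer records a sha256 digest and a size of 33555612 bytes (about 32 MiB).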
venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (2.51 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/continued_fraction.cpython-310.pyc ADDED
Binary file (9.17 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/factor_.cpython-310.pyc ADDED
Binary file (65.3 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/generate.cpython-310.pyc ADDED
Binary file (26.9 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/modular.cpython-310.pyc ADDED
Binary file (7.43 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/__pycache__/primetest.cpython-310.pyc ADDED
Binary file (15.6 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_ecm.cpython-310.pyc ADDED
Binary file (2.09 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_elliptic_curve.cpython-310.pyc ADDED
Binary file (737 Bytes)

venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_modular.cpython-310.pyc ADDED
Binary file (1.89 kB)

venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_partitions.cpython-310.pyc ADDED
Binary file (853 Bytes)

venv/lib/python3.10/site-packages/sympy/ntheory/tests/__pycache__/test_primetest.cpython-310.pyc ADDED
Binary file (6.08 kB)
venv/lib/python3.10/site-packages/sympy/utilities/__init__.py ADDED
@@ -0,0 +1,30 @@
+ """This module contains some general purpose utilities that are used across
+ SymPy.
+ """
+ from .iterables import (flatten, group, take, subsets,
+     variations, numbered_symbols, cartes, capture, dict_merge,
+     prefixes, postfixes, sift, topological_sort, unflatten,
+     has_dups, has_variety, reshape, rotations)
+
+ from .misc import filldedent
+
+ from .lambdify import lambdify
+
+ from .decorator import threaded, xthreaded, public, memoize_property
+
+ from .timeutils import timed
+
+ __all__ = [
+     'flatten', 'group', 'take', 'subsets', 'variations', 'numbered_symbols',
+     'cartes', 'capture', 'dict_merge', 'prefixes', 'postfixes', 'sift',
+     'topological_sort', 'unflatten', 'has_dups', 'has_variety', 'reshape',
+     'rotations',
+
+     'filldedent',
+
+     'lambdify',
+
+     'threaded', 'xthreaded', 'public', 'memoize_property',
+
+     'timed',
+ ]
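Most of the names re-exported by this `__init__.py` are small iterable helpers. As an illustration of what two of them do — a stdlib-only sketch of the documented semantics, not SymPy's actual implementation — `flatten` and `sift` behave roughly like:

```python
from collections import defaultdict

def flatten(iterable):
    """Recursively flatten nested lists/tuples into one flat list."""
    result = []
    for item in iterable:
        if isinstance(item, (list, tuple)):
            result.extend(flatten(item))
        else:
            result.append(item)
    return result

def sift(seq, keyfunc):
    """Bucket the items of seq by the value keyfunc returns for each."""
    buckets = defaultdict(list)
    for item in seq:
        buckets[keyfunc(item)].append(item)
    return buckets

flat = flatten([1, [2, (3, [4])], 5])          # [1, 2, 3, 4, 5]
parity = sift(range(6), lambda n: n % 2 == 0)  # {True: [0, 2, 4], False: [1, 3, 5]}
```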
venv/lib/python3.10/site-packages/sympy/utilities/autowrap.py ADDED
@@ -0,0 +1,1129 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ """Module for compiling codegen output, and wrap the binary for use in
2
+ python.
3
+
4
+ .. note:: To use the autowrap module it must first be imported
5
+
6
+ >>> from sympy.utilities.autowrap import autowrap
7
+
8
+ This module provides a common interface for different external backends, such
9
+ as f2py, fwrap, Cython, SWIG(?) etc. (Currently only f2py and Cython are
10
+ implemented) The goal is to provide access to compiled binaries of acceptable
11
+ performance with a one-button user interface, e.g.,
12
+
13
+ >>> from sympy.abc import x,y
14
+ >>> expr = (x - y)**25
15
+ >>> flat = expr.expand()
16
+ >>> binary_callable = autowrap(flat)
17
+ >>> binary_callable(2, 3)
18
+ -1.0
19
+
20
+ Although a SymPy user might primarily be interested in working with
21
+ mathematical expressions and not in the details of wrapping tools
22
+ needed to evaluate such expressions efficiently in numerical form,
23
+ the user cannot do so without some understanding of the
24
+ limits in the target language. For example, the expanded expression
25
+ contains large coefficients which result in loss of precision when
26
+ computing the expression:
27
+
28
+ >>> binary_callable(3, 2)
29
+ 0.0
30
+ >>> binary_callable(4, 5), binary_callable(5, 4)
31
+ (-22925376.0, 25165824.0)
32
+
33
+ Wrapping the unexpanded expression gives the expected behavior:
34
+
35
+ >>> e = autowrap(expr)
36
+ >>> e(4, 5), e(5, 4)
37
+ (-1.0, 1.0)
38
+
39
+ The callable returned from autowrap() is a binary Python function, not a
40
+ SymPy object. If it is desired to use the compiled function in symbolic
41
+ expressions, it is better to use binary_function() which returns a SymPy
42
+ Function object. The binary callable is attached as the _imp_ attribute and
43
+ invoked when a numerical evaluation is requested with evalf(), or with
44
+ lambdify().
45
+
46
+ >>> from sympy.utilities.autowrap import binary_function
47
+ >>> f = binary_function('f', expr)
48
+ >>> 2*f(x, y) + y
49
+ y + 2*f(x, y)
50
+ >>> (2*f(x, y) + y).evalf(2, subs={x: 1, y:2})
51
+ 0.e-110
52
+
53
+ When is this useful?
54
+
55
+ 1) For computations on large arrays, Python iterations may be too slow,
56
+ and depending on the mathematical expression, it may be difficult to
57
+ exploit the advanced index operations provided by NumPy.
58
+
59
+ 2) For *really* long expressions that will be called repeatedly, the
60
+ compiled binary should be significantly faster than SymPy's .evalf()
61
+
62
+ 3) If you are generating code with the codegen utility in order to use
63
+ it in another project, the automatic Python wrappers let you test the
64
+ binaries immediately from within SymPy.
65
+
66
+ 4) To create customized ufuncs for use with numpy arrays.
67
+ See *ufuncify*.
68
+
69
+ When is this module NOT the best approach?
70
+
71
+ 1) If you are really concerned about speed or memory optimizations,
72
+ you will probably get better results by working directly with the
73
+ wrapper tools and the low level code. However, the files generated
74
+ by this utility may provide a useful starting point and reference
75
+ code. Temporary files will be left intact if you supply the keyword
76
+ tempdir="path/to/files/".
77
+
78
+ 2) If the array computation can be handled easily by numpy, and you
79
+ do not need the binaries for another project.
80
+
81
+ """
82
+
83
+ import sys
84
+ import os
85
+ import shutil
86
+ import tempfile
87
+ from subprocess import STDOUT, CalledProcessError, check_output
88
+ from string import Template
89
+ from warnings import warn
90
+
91
+ from sympy.core.cache import cacheit
92
+ from sympy.core.function import Lambda
93
+ from sympy.core.relational import Eq
94
+ from sympy.core.symbol import Dummy, Symbol
95
+ from sympy.tensor.indexed import Idx, IndexedBase
96
+ from sympy.utilities.codegen import (make_routine, get_code_generator,
97
+ OutputArgument, InOutArgument,
98
+ InputArgument, CodeGenArgumentListError,
99
+ Result, ResultBase, C99CodeGen)
100
+ from sympy.utilities.iterables import iterable
101
+ from sympy.utilities.lambdify import implemented_function
102
+ from sympy.utilities.decorator import doctest_depends_on
103
+
104
+ _doctest_depends_on = {'exe': ('f2py', 'gfortran', 'gcc'),
105
+ 'modules': ('numpy',)}
106
+
107
+
108
+ class CodeWrapError(Exception):
109
+ pass
110
+
111
+
112
+ class CodeWrapper:
113
+ """Base Class for code wrappers"""
114
+ _filename = "wrapped_code"
115
+ _module_basename = "wrapper_module"
116
+ _module_counter = 0
117
+
118
+ @property
119
+ def filename(self):
120
+ return "%s_%s" % (self._filename, CodeWrapper._module_counter)
121
+
122
+ @property
123
+ def module_name(self):
124
+ return "%s_%s" % (self._module_basename, CodeWrapper._module_counter)
125
+
126
+ def __init__(self, generator, filepath=None, flags=[], verbose=False):
127
+ """
128
+ generator -- the code generator to use
129
+ """
130
+ self.generator = generator
131
+ self.filepath = filepath
132
+ self.flags = flags
133
+ self.quiet = not verbose
134
+
135
+ @property
136
+ def include_header(self):
137
+ return bool(self.filepath)
138
+
139
+ @property
140
+ def include_empty(self):
141
+ return bool(self.filepath)
142
+
143
+ def _generate_code(self, main_routine, routines):
144
+ routines.append(main_routine)
145
+ self.generator.write(
146
+ routines, self.filename, True, self.include_header,
147
+ self.include_empty)
148
+
149
+ def wrap_code(self, routine, helpers=None):
150
+ helpers = helpers or []
151
+ if self.filepath:
152
+ workdir = os.path.abspath(self.filepath)
153
+ else:
154
+ workdir = tempfile.mkdtemp("_sympy_compile")
155
+ if not os.access(workdir, os.F_OK):
156
+ os.mkdir(workdir)
157
+ oldwork = os.getcwd()
158
+ os.chdir(workdir)
159
+ try:
160
+ sys.path.append(workdir)
161
+ self._generate_code(routine, helpers)
162
+ self._prepare_files(routine)
163
+ self._process_files(routine)
164
+ mod = __import__(self.module_name)
165
+ finally:
166
+ sys.path.remove(workdir)
167
+ CodeWrapper._module_counter += 1
168
+ os.chdir(oldwork)
169
+ if not self.filepath:
170
+ try:
171
+ shutil.rmtree(workdir)
172
+ except OSError:
173
+ # Could be some issues on Windows
174
+ pass
175
+
176
+ return self._get_wrapped_function(mod, routine.name)
177
+
178
+ def _process_files(self, routine):
179
+ command = self.command
180
+ command.extend(self.flags)
181
+ try:
182
+ retoutput = check_output(command, stderr=STDOUT)
183
+ except CalledProcessError as e:
184
+ raise CodeWrapError(
185
+ "Error while executing command: %s. Command output is:\n%s" % (
186
+ " ".join(command), e.output.decode('utf-8')))
187
+ if not self.quiet:
188
+ print(retoutput)
189
+
190
+
191
+ class DummyWrapper(CodeWrapper):
192
+ """Class used for testing independent of backends """
193
+
194
+ template = """# dummy module for testing of SymPy
195
+ def %(name)s():
196
+ return "%(expr)s"
197
+ %(name)s.args = "%(args)s"
198
+ %(name)s.returns = "%(retvals)s"
199
+ """
200
+
201
+ def _prepare_files(self, routine):
202
+ return
203
+
204
+ def _generate_code(self, routine, helpers):
205
+ with open('%s.py' % self.module_name, 'w') as f:
206
+ printed = ", ".join(
207
+ [str(res.expr) for res in routine.result_variables])
208
+ # convert OutputArguments to return value like f2py
209
+ args = filter(lambda x: not isinstance(
210
+ x, OutputArgument), routine.arguments)
211
+ retvals = []
212
+ for val in routine.result_variables:
213
+ if isinstance(val, Result):
214
+ retvals.append('nameless')
215
+ else:
216
+ retvals.append(val.result_var)
217
+
218
+ print(DummyWrapper.template % {
219
+ 'name': routine.name,
220
+ 'expr': printed,
221
+ 'args': ", ".join([str(a.name) for a in args]),
222
+ 'retvals': ", ".join([str(val) for val in retvals])
223
+ }, end="", file=f)
224
+
225
+ def _process_files(self, routine):
226
+ return
227
+
228
+ @classmethod
229
+ def _get_wrapped_function(cls, mod, name):
230
+ return getattr(mod, name)
231
+
232
+
233
+ class CythonCodeWrapper(CodeWrapper):
234
+ """Wrapper that uses Cython"""
235
+
236
+ setup_template = """\
237
+ from setuptools import setup
238
+ from setuptools import Extension
239
+ from Cython.Build import cythonize
240
+ cy_opts = {cythonize_options}
241
+ {np_import}
242
+ ext_mods = [Extension(
243
+ {ext_args},
244
+ include_dirs={include_dirs},
245
+ library_dirs={library_dirs},
246
+ libraries={libraries},
247
+ extra_compile_args={extra_compile_args},
248
+ extra_link_args={extra_link_args}
249
+ )]
250
+ setup(ext_modules=cythonize(ext_mods, **cy_opts))
251
+ """
252
+
253
+ _cythonize_options = {'compiler_directives':{'language_level' : "3"}}
254
+
255
+ pyx_imports = (
256
+ "import numpy as np\n"
257
+ "cimport numpy as np\n\n")
258
+
259
+ pyx_header = (
260
+ "cdef extern from '{header_file}.h':\n"
261
+ " {prototype}\n\n")
262
+
263
+ pyx_func = (
264
+ "def {name}_c({arg_string}):\n"
265
+ "\n"
266
+ "{declarations}"
267
+ "{body}")
268
+
269
+ std_compile_flag = '-std=c99'
270
+
271
+ def __init__(self, *args, **kwargs):
272
+ """Instantiates a Cython code wrapper.
273
+
274
+ The following optional parameters get passed to ``setuptools.Extension``
275
+ for building the Python extension module. Read its documentation to
276
+ learn more.
277
+
278
+ Parameters
279
+ ==========
280
+ include_dirs : [list of strings]
281
+ A list of directories to search for C/C++ header files (in Unix
282
+ form for portability).
283
+ library_dirs : [list of strings]
284
+ A list of directories to search for C/C++ libraries at link time.
285
+ libraries : [list of strings]
286
+ A list of library names (not filenames or paths) to link against.
287
+ extra_compile_args : [list of strings]
288
+ Any extra platform- and compiler-specific information to use when
289
+ compiling the source files in 'sources'. For platforms and
290
+ compilers where "command line" makes sense, this is typically a
291
+ list of command-line arguments, but for other platforms it could be
292
+ anything. Note that the attribute ``std_compile_flag`` will be
293
+ appended to this list.
294
+ extra_link_args : [list of strings]
295
+ Any extra platform- and compiler-specific information to use when
296
+ linking object files together to create the extension (or to create
297
+ a new static Python interpreter). Similar interpretation as for
298
+ 'extra_compile_args'.
299
+ cythonize_options : [dictionary]
300
+ Keyword arguments passed on to cythonize.
301
+
302
+ """
303
+
304
+ self._include_dirs = kwargs.pop('include_dirs', [])
305
+ self._library_dirs = kwargs.pop('library_dirs', [])
306
+ self._libraries = kwargs.pop('libraries', [])
307
+ self._extra_compile_args = kwargs.pop('extra_compile_args', [])
308
+ self._extra_compile_args.append(self.std_compile_flag)
309
+ self._extra_link_args = kwargs.pop('extra_link_args', [])
310
+ self._cythonize_options = kwargs.pop('cythonize_options', self._cythonize_options)
311
+
312
+ self._need_numpy = False
313
+
314
+ super().__init__(*args, **kwargs)
315
+
316
+ @property
317
+ def command(self):
318
+ command = [sys.executable, "setup.py", "build_ext", "--inplace"]
319
+ return command
320
+
321
+ def _prepare_files(self, routine, build_dir=os.curdir):
322
+ # NOTE : build_dir is used for testing purposes.
323
+ pyxfilename = self.module_name + '.pyx'
324
+ codefilename = "%s.%s" % (self.filename, self.generator.code_extension)
325
+
326
+ # pyx
327
+ with open(os.path.join(build_dir, pyxfilename), 'w') as f:
328
+ self.dump_pyx([routine], f, self.filename)
329
+
330
+ # setup.py
331
+ ext_args = [repr(self.module_name), repr([pyxfilename, codefilename])]
332
+ if self._need_numpy:
333
+ np_import = 'import numpy as np\n'
334
+ self._include_dirs.append('np.get_include()')
335
+ else:
336
+ np_import = ''
337
+
338
+ with open(os.path.join(build_dir, 'setup.py'), 'w') as f:
339
+ includes = str(self._include_dirs).replace("'np.get_include()'",
340
+ 'np.get_include()')
341
+ f.write(self.setup_template.format(
342
+ ext_args=", ".join(ext_args),
343
+ np_import=np_import,
344
+ include_dirs=includes,
345
+ library_dirs=self._library_dirs,
346
+ libraries=self._libraries,
347
+ extra_compile_args=self._extra_compile_args,
348
+ extra_link_args=self._extra_link_args,
349
+ cythonize_options=self._cythonize_options
350
+ ))
351
+
352
+ @classmethod
353
+ def _get_wrapped_function(cls, mod, name):
354
+ return getattr(mod, name + '_c')
355
+
356
+ def dump_pyx(self, routines, f, prefix):
357
+ """Write a Cython file with Python wrappers
358
+
359
+ This file contains all the definitions of the routines in c code and
360
+ refers to the header file.
361
+
362
+ Arguments
363
+ ---------
364
+ routines
365
+ List of Routine instances
366
+ f
367
+ File-like object to write the file to
368
+ prefix
369
+ The filename prefix, used to refer to the proper header file.
370
+ Only the basename of the prefix is used.
371
+ """
372
+ headers = []
373
+ functions = []
374
+ for routine in routines:
375
+ prototype = self.generator.get_prototype(routine)
376
+
377
+ # C Function Header Import
378
+ headers.append(self.pyx_header.format(header_file=prefix,
379
+ prototype=prototype))
380
+
381
+ # Partition the C function arguments into categories
382
+ py_rets, py_args, py_loc, py_inf = self._partition_args(routine.arguments)
383
+
384
+ # Function prototype
385
+ name = routine.name
386
+ arg_string = ", ".join(self._prototype_arg(arg) for arg in py_args)
387
+
388
+ # Local Declarations
389
+ local_decs = []
390
+ for arg, val in py_inf.items():
391
+ proto = self._prototype_arg(arg)
392
+ mat, ind = [self._string_var(v) for v in val]
393
+ local_decs.append(" cdef {} = {}.shape[{}]".format(proto, mat, ind))
394
+ local_decs.extend([" cdef {}".format(self._declare_arg(a)) for a in py_loc])
395
+ declarations = "\n".join(local_decs)
396
+ if declarations:
397
+ declarations = declarations + "\n"
398
+
399
+ # Function Body
400
+ args_c = ", ".join([self._call_arg(a) for a in routine.arguments])
401
+ rets = ", ".join([self._string_var(r.name) for r in py_rets])
402
+ if routine.results:
403
+ body = ' return %s(%s)' % (routine.name, args_c)
404
+ if rets:
405
+ body = body + ', ' + rets
406
+ else:
407
+ body = ' %s(%s)\n' % (routine.name, args_c)
408
+ body = body + ' return ' + rets
409
+
410
+ functions.append(self.pyx_func.format(name=name, arg_string=arg_string,
411
+ declarations=declarations, body=body))
412
+
413
+ # Write text to file
414
+ if self._need_numpy:
415
+ # Only import numpy if required
416
+ f.write(self.pyx_imports)
417
+ f.write('\n'.join(headers))
418
+ f.write('\n'.join(functions))
419
+
420
+ def _partition_args(self, args):
421
+ """Group function arguments into categories."""
422
+ py_args = []
423
+ py_returns = []
424
+ py_locals = []
425
+ py_inferred = {}
426
+ for arg in args:
427
+ if isinstance(arg, OutputArgument):
428
+ py_returns.append(arg)
429
+ py_locals.append(arg)
430
+ elif isinstance(arg, InOutArgument):
431
+ py_returns.append(arg)
432
+ py_args.append(arg)
433
+ else:
434
+ py_args.append(arg)
435
+ # Find arguments that are array dimensions. These can be inferred
436
+ # locally in the Cython code.
437
+ if isinstance(arg, (InputArgument, InOutArgument)) and arg.dimensions:
438
+ dims = [d[1] + 1 for d in arg.dimensions]
439
+ sym_dims = [(i, d) for (i, d) in enumerate(dims) if
440
+ isinstance(d, Symbol)]
441
+ for (i, d) in sym_dims:
442
+ py_inferred[d] = (arg.name, i)
443
+ for arg in args:
444
+ if arg.name in py_inferred:
445
+ py_inferred[arg] = py_inferred.pop(arg.name)
446
+ # Filter inferred arguments from py_args
447
+ py_args = [a for a in py_args if a not in py_inferred]
448
+ return py_returns, py_args, py_locals, py_inferred
449
+
450
+ def _prototype_arg(self, arg):
451
+ mat_dec = "np.ndarray[{mtype}, ndim={ndim}] {name}"
452
+ np_types = {'double': 'np.double_t',
453
+ 'int': 'np.int_t'}
454
+ t = arg.get_datatype('c')
455
+ if arg.dimensions:
456
+ self._need_numpy = True
457
+ ndim = len(arg.dimensions)
458
+ mtype = np_types[t]
459
+ return mat_dec.format(mtype=mtype, ndim=ndim, name=self._string_var(arg.name))
460
+ else:
461
+ return "%s %s" % (t, self._string_var(arg.name))
462
+
463
+ def _declare_arg(self, arg):
464
+ proto = self._prototype_arg(arg)
465
+ if arg.dimensions:
466
+ shape = '(' + ','.join(self._string_var(i[1] + 1) for i in arg.dimensions) + ')'
467
+ return proto + " = np.empty({shape})".format(shape=shape)
468
+ else:
469
+ return proto + " = 0"
470
+
471
+ def _call_arg(self, arg):
472
+ if arg.dimensions:
473
+ t = arg.get_datatype('c')
474
+ return "<{}*> {}.data".format(t, self._string_var(arg.name))
475
+ elif isinstance(arg, ResultBase):
476
+ return "&{}".format(self._string_var(arg.name))
477
+ else:
478
+ return self._string_var(arg.name)
479
+
480
+ def _string_var(self, var):
481
+ printer = self.generator.printer.doprint
482
+ return printer(var)
483
+
484
+
485
+ class F2PyCodeWrapper(CodeWrapper):
486
+ """Wrapper that uses f2py"""
487
+
488
+ def __init__(self, *args, **kwargs):
489
+
490
+ ext_keys = ['include_dirs', 'library_dirs', 'libraries',
491
+ 'extra_compile_args', 'extra_link_args']
492
+ msg = ('The compilation option kwarg {} is not supported with the f2py '
493
+ 'backend.')
494
+
495
+ for k in ext_keys:
496
+ if k in kwargs.keys():
497
+ warn(msg.format(k))
498
+ kwargs.pop(k, None)
499
+
500
+ super().__init__(*args, **kwargs)
501
+
502
+ @property
503
+ def command(self):
504
+ filename = self.filename + '.' + self.generator.code_extension
505
+ args = ['-c', '-m', self.module_name, filename]
506
+ command = [sys.executable, "-c", "import numpy.f2py as f2py2e;f2py2e.main()"]+args
507
+ return command
508
+
509
+ def _prepare_files(self, routine):
510
+ pass
511
+
512
+ @classmethod
513
+ def _get_wrapped_function(cls, mod, name):
514
+ return getattr(mod, name)
515
+
516
+
517
+ # Here we define a lookup of backends -> tuples of languages. For now, each
518
+ # tuple is of length 1, but if a backend supports more than one language,
519
+ # the most preferable language is listed first.
520
+ _lang_lookup = {'CYTHON': ('C99', 'C89', 'C'),
521
+ 'F2PY': ('F95',),
522
+ 'NUMPY': ('C99', 'C89', 'C'),
523
+ 'DUMMY': ('F95',)} # Dummy here just for testing
524
+
525
+
526
+ def _infer_language(backend):
527
+ """For a given backend, return the top choice of language"""
528
+ langs = _lang_lookup.get(backend.upper(), False)
529
+ if not langs:
530
+ raise ValueError("Unrecognized backend: " + backend)
531
+ return langs[0]
532
+
533
+
534
+ def _validate_backend_language(backend, language):
535
+ """Throws error if backend and language are incompatible"""
536
+ langs = _lang_lookup.get(backend.upper(), False)
537
+ if not langs:
538
+ raise ValueError("Unrecognized backend: " + backend)
539
+ if language.upper() not in langs:
540
+ raise ValueError(("Backend {} and language {} are "
541
+ "incompatible").format(backend, language))
542
+
543
+
544
+ @cacheit
545
+ @doctest_depends_on(exe=('f2py', 'gfortran'), modules=('numpy',))
546
+ def autowrap(expr, language=None, backend='f2py', tempdir=None, args=None,
547
+ flags=None, verbose=False, helpers=None, code_gen=None, **kwargs):
548
+ """Generates Python callable binaries based on the math expression.
549
+
550
+ Parameters
551
+ ==========
552
+
553
+ expr
554
+ The SymPy expression that should be wrapped as a binary routine.
555
+ language : string, optional
556
+ If supplied, (options: 'C' or 'F95'), specifies the language of the
557
+ generated code. If ``None`` [default], the language is inferred based
558
+ upon the specified backend.
559
+ backend : string, optional
560
+ Backend used to wrap the generated code. Either 'f2py' [default],
561
+ or 'cython'.
562
+ tempdir : string, optional
563
+ Path to directory for temporary files. If this argument is supplied,
564
+ the generated code and the wrapper input files are left intact in the
565
+         specified path.
+     args : iterable, optional
+         An ordered iterable of symbols. Specifies the argument sequence for the
+         function.
+     flags : iterable, optional
+         Additional option flags that will be passed to the backend.
+     verbose : bool, optional
+         If True, autowrap will not mute the command line backends. This can be
+         helpful for debugging.
+     helpers : 3-tuple or iterable of 3-tuples, optional
+         Used to define auxiliary expressions needed for the main expr. If the
+         main expression needs to call a specialized function it should be
+         passed in via ``helpers``. Autowrap will then make sure that the
+         compiled main expression can link to the helper routine. Items should
+         be 3-tuples with (<function_name>, <sympy_expression>,
+         <argument_tuple>). It is mandatory to supply an argument sequence to
+         helper routines.
+     code_gen : CodeGen instance
+         An instance of a CodeGen subclass. Overrides ``language``.
+     include_dirs : [string]
+         A list of directories to search for C/C++ header files (in Unix form
+         for portability).
+     library_dirs : [string]
+         A list of directories to search for C/C++ libraries at link time.
+     libraries : [string]
+         A list of library names (not filenames or paths) to link against.
+     extra_compile_args : [string]
+         Any extra platform- and compiler-specific information to use when
+         compiling the source files in 'sources'. For platforms and compilers
+         where "command line" makes sense, this is typically a list of
+         command-line arguments, but for other platforms it could be anything.
+     extra_link_args : [string]
+         Any extra platform- and compiler-specific information to use when
+         linking object files together to create the extension (or to create a
+         new static Python interpreter). Similar interpretation as for
+         'extra_compile_args'.
+ 
+     Examples
+     ========
+ 
+     >>> from sympy.abc import x, y, z
+     >>> from sympy.utilities.autowrap import autowrap
+     >>> expr = ((x - y + z)**(13)).expand()
+     >>> binary_func = autowrap(expr)
+     >>> binary_func(1, 4, 2)
+     -1.0
+ 
+     """
+     if language:
+         if not isinstance(language, type):
+             _validate_backend_language(backend, language)
+     else:
+         language = _infer_language(backend)
+ 
+     # two cases 1) helpers is an iterable of 3-tuples and 2) helpers is a
+     # 3-tuple
+     if iterable(helpers) and len(helpers) != 0 and iterable(helpers[0]):
+         helpers = helpers if helpers else ()
+     else:
+         helpers = [helpers] if helpers else ()
+     args = list(args) if iterable(args, exclude=set) else args
+ 
+     if code_gen is None:
+         code_gen = get_code_generator(language, "autowrap")
+ 
+     CodeWrapperClass = {
+         'F2PY': F2PyCodeWrapper,
+         'CYTHON': CythonCodeWrapper,
+         'DUMMY': DummyWrapper
+     }[backend.upper()]
+     code_wrapper = CodeWrapperClass(code_gen, tempdir, flags if flags else (),
+                                     verbose, **kwargs)
+ 
+     helps = []
+     for name_h, expr_h, args_h in helpers:
+         helps.append(code_gen.routine(name_h, expr_h, args_h))
+ 
+     for name_h, expr_h, args_h in helpers:
+         if expr.has(expr_h):
+             name_h = binary_function(name_h, expr_h, backend='dummy')
+             expr = expr.subs(expr_h, name_h(*args_h))
+     try:
+         routine = code_gen.routine('autofunc', expr, args)
+     except CodeGenArgumentListError as e:
+         # if all missing arguments are for pure output, we simply attach them
+         # at the end and try again, because the wrappers will silently convert
+         # them to return values anyway.
+         new_args = []
+         for missing in e.missing_args:
+             if not isinstance(missing, OutputArgument):
+                 raise
+             new_args.append(missing.name)
+         routine = code_gen.routine('autofunc', expr, args + new_args)
+ 
+     return code_wrapper.wrap_code(routine, helpers=helps)
+ 
+ 
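The `helpers` branch in `autowrap` above accepts either a single 3-tuple or an iterable of 3-tuples and normalizes both to a sequence. A minimal pure-Python sketch of that normalization rule (the name `normalize_helpers` and the simple iterability check are illustrative stand-ins, not part of SymPy's API):

```python
def normalize_helpers(helpers):
    """Normalize ``helpers`` the way autowrap's branch does: a falsy value
    becomes (), a single 3-tuple becomes a one-element list, and an
    iterable of 3-tuples passes through unchanged."""
    def is_iterable(obj):
        # crude stand-in for sympy.utilities.iterables.iterable
        return isinstance(obj, (list, tuple))

    if is_iterable(helpers) and len(helpers) != 0 and is_iterable(helpers[0]):
        return helpers if helpers else ()
    else:
        return [helpers] if helpers else ()

# A bare 3-tuple is wrapped; a list of 3-tuples is kept as-is.
single = ('f', 'x + y', ('x', 'y'))
print(normalize_helpers(single))    # [('f', 'x + y', ('x', 'y'))]
print(normalize_helpers([single]))  # [('f', 'x + y', ('x', 'y'))]
print(normalize_helpers(None))      # ()
```

The distinguishing test is whether the first element is itself iterable: a helper's first element is a function-name string, so a bare 3-tuple falls into the wrapping branch.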
+ @doctest_depends_on(exe=('f2py', 'gfortran'), modules=('numpy',))
+ def binary_function(symfunc, expr, **kwargs):
+     """Returns a SymPy function with expr as binary implementation
+ 
+     This is a convenience function that automates the steps needed to
+     autowrap the SymPy expression and attaching it to a Function object
+     with implemented_function().
+ 
+     Parameters
+     ==========
+ 
+     symfunc : SymPy Function
+         The function to bind the callable to.
+     expr : SymPy Expression
+         The expression used to generate the function.
+     kwargs : dict
+         Any kwargs accepted by autowrap.
+ 
+     Examples
+     ========
+ 
+     >>> from sympy.abc import x, y
+     >>> from sympy.utilities.autowrap import binary_function
+     >>> expr = ((x - y)**(25)).expand()
+     >>> f = binary_function('f', expr)
+     >>> type(f)
+     <class 'sympy.core.function.UndefinedFunction'>
+     >>> 2*f(x, y)
+     2*f(x, y)
+     >>> f(x, y).evalf(2, subs={x: 1, y: 2})
+     -1.0
+ 
+     """
+     binary = autowrap(expr, **kwargs)
+     return implemented_function(symfunc, binary)
+ 
+ #################################################################
+ #                           UFUNCIFY                            #
+ #################################################################
+ 
+ _ufunc_top = Template("""\
+ #include "Python.h"
+ #include "math.h"
+ #include "numpy/ndarraytypes.h"
+ #include "numpy/ufuncobject.h"
+ #include "numpy/halffloat.h"
+ #include ${include_file}
+ 
+ static PyMethodDef ${module}Methods[] = {
+         {NULL, NULL, 0, NULL}
+ };""")
+ 
+ _ufunc_outcalls = Template("*((double *)out${outnum}) = ${funcname}(${call_args});")
+ 
+ _ufunc_body = Template("""\
+ static void ${funcname}_ufunc(char **args, npy_intp *dimensions, npy_intp* steps, void* data)
+ {
+     npy_intp i;
+     npy_intp n = dimensions[0];
+     ${declare_args}
+     ${declare_steps}
+     for (i = 0; i < n; i++) {
+         ${outcalls}
+         ${step_increments}
+     }
+ }
+ PyUFuncGenericFunction ${funcname}_funcs[1] = {&${funcname}_ufunc};
+ static char ${funcname}_types[${n_types}] = ${types}
+ static void *${funcname}_data[1] = {NULL};""")
+ 
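The `_ufunc_body` template above emits a strided inner loop that reads doubles through `char*` pointers and advances each pointer by its step. A small Python sketch of the same byte-level access pattern, using `struct` to mimic one double input and one double output (the function name and buffer handling are illustrative only):

```python
import struct

def ufunc_inner_loop(in_buf, n, in_step, out_step, func):
    """Mimic the generated C loop: walk byte buffers with explicit
    strides, as the char*-based C code does."""
    out = bytearray(n * out_step)
    in_off = out_off = 0
    for _ in range(n):
        (x,) = struct.unpack_from('d', in_buf, in_off)  # *(double *)in0
        struct.pack_into('d', out, out_off, func(x))    # *(double *)out0 = f(...)
        in_off += in_step                               # in0 += in0_step
        out_off += out_step                             # out0 += out0_step
    return bytes(out)

data = struct.pack('3d', 1.0, 2.0, 3.0)
result = ufunc_inner_loop(data, 3, 8, 8, lambda x: x * x)
print(struct.unpack('3d', result))  # (1.0, 4.0, 9.0)
```

Passing the step explicitly is what lets a real ufunc handle non-contiguous inputs: a stride of 16 bytes would read every other element of a double array.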
+ _ufunc_bottom = Template("""\
+ #if PY_VERSION_HEX >= 0x03000000
+ static struct PyModuleDef moduledef = {
+     PyModuleDef_HEAD_INIT,
+     "${module}",
+     NULL,
+     -1,
+     ${module}Methods,
+     NULL,
+     NULL,
+     NULL,
+     NULL
+ };
+ 
+ PyMODINIT_FUNC PyInit_${module}(void)
+ {
+     PyObject *m, *d;
+     ${function_creation}
+     m = PyModule_Create(&moduledef);
+     if (!m) {
+         return NULL;
+     }
+     import_array();
+     import_umath();
+     d = PyModule_GetDict(m);
+     ${ufunc_init}
+     return m;
+ }
+ #else
+ PyMODINIT_FUNC init${module}(void)
+ {
+     PyObject *m, *d;
+     ${function_creation}
+     m = Py_InitModule("${module}", ${module}Methods);
+     if (m == NULL) {
+         return;
+     }
+     import_array();
+     import_umath();
+     d = PyModule_GetDict(m);
+     ${ufunc_init}
+ }
+ #endif\
+ """)
+ 
+ _ufunc_init_form = Template("""\
+ ufunc${ind} = PyUFunc_FromFuncAndData(${funcname}_funcs, ${funcname}_data, ${funcname}_types, 1, ${n_in}, ${n_out},
+             PyUFunc_None, "${module}", ${docstring}, 0);
+     PyDict_SetItemString(d, "${funcname}", ufunc${ind});
+     Py_DECREF(ufunc${ind});""")
+ 
+ _ufunc_setup = Template("""\
+ from setuptools.extension import Extension
+ from setuptools import setup
+ 
+ from numpy import get_include
+ 
+ if __name__ == "__main__":
+     setup(ext_modules=[
+         Extension('${module}',
+                   sources=['${module}.c', '${filename}.c'],
+                   include_dirs=[get_include()])])
+ """)
+ 
+ 
+ class UfuncifyCodeWrapper(CodeWrapper):
+     """Wrapper for Ufuncify"""
+ 
+     def __init__(self, *args, **kwargs):
+ 
+         ext_keys = ['include_dirs', 'library_dirs', 'libraries',
+                     'extra_compile_args', 'extra_link_args']
+         msg = ('The compilation option kwarg {} is not supported with the numpy'
+                ' backend.')
+ 
+         for k in ext_keys:
+             if k in kwargs.keys():
+                 warn(msg.format(k))
+                 kwargs.pop(k, None)
+ 
+         super().__init__(*args, **kwargs)
+ 
+     @property
+     def command(self):
+         command = [sys.executable, "setup.py", "build_ext", "--inplace"]
+         return command
+ 
+     def wrap_code(self, routines, helpers=None):
+         # This routine overrides CodeWrapper because we can't assume funcname == routines[0].name
+         # Therefore we have to break the CodeWrapper private API.
+         # There isn't an obvious way to extend multi-expr support to
+         # the other autowrap backends, so we limit this change to ufuncify.
+         helpers = helpers if helpers is not None else []
+         # We just need a consistent name
+         funcname = 'wrapped_' + str(id(routines) + id(helpers))
+ 
+         workdir = self.filepath or tempfile.mkdtemp("_sympy_compile")
+         if not os.access(workdir, os.F_OK):
+             os.mkdir(workdir)
+         oldwork = os.getcwd()
+         os.chdir(workdir)
+         try:
+             sys.path.append(workdir)
+             self._generate_code(routines, helpers)
+             self._prepare_files(routines, funcname)
+             self._process_files(routines)
+             mod = __import__(self.module_name)
+         finally:
+             sys.path.remove(workdir)
+             CodeWrapper._module_counter += 1
+             os.chdir(oldwork)
+             if not self.filepath:
+                 try:
+                     shutil.rmtree(workdir)
+                 except OSError:
+                     # Could be some issues on Windows
+                     pass
+ 
+         return self._get_wrapped_function(mod, funcname)
+ 
+     def _generate_code(self, main_routines, helper_routines):
+         all_routines = main_routines + helper_routines
+         self.generator.write(
+             all_routines, self.filename, True, self.include_header,
+             self.include_empty)
+ 
+     def _prepare_files(self, routines, funcname):
+ 
+         # C
+         codefilename = self.module_name + '.c'
+         with open(codefilename, 'w') as f:
+             self.dump_c(routines, f, self.filename, funcname=funcname)
+ 
+         # setup.py
+         with open('setup.py', 'w') as f:
+             self.dump_setup(f)
+ 
+     @classmethod
+     def _get_wrapped_function(cls, mod, name):
+         return getattr(mod, name)
+ 
+     def dump_setup(self, f):
+         setup = _ufunc_setup.substitute(module=self.module_name,
+                                         filename=self.filename)
+         f.write(setup)
+ 
+     def dump_c(self, routines, f, prefix, funcname=None):
+         """Write a C file with Python wrappers
+ 
+         This file contains all the definitions of the routines in c code.
+ 
+         Arguments
+         ---------
+         routines
+             List of Routine instances
+         f
+             File-like object to write the file to
+         prefix
+             The filename prefix, used to name the imported module.
+         funcname
+             Name of the main function to be returned.
+         """
+         if funcname is None:
+             if len(routines) == 1:
+                 funcname = routines[0].name
+             else:
+                 msg = 'funcname must be specified for multiple output routines'
+                 raise ValueError(msg)
+         functions = []
+         function_creation = []
+         ufunc_init = []
+         module = self.module_name
+         include_file = "\"{}.h\"".format(prefix)
+         top = _ufunc_top.substitute(include_file=include_file, module=module)
+ 
+         name = funcname
+ 
+         # Partition the C function arguments into categories
+         # Here we assume all routines accept the same arguments
+         r_index = 0
+         py_in, _ = self._partition_args(routines[0].arguments)
+         n_in = len(py_in)
+         n_out = len(routines)
+ 
+         # Declare Args
+         form = "char *{0}{1} = args[{2}];"
+         arg_decs = [form.format('in', i, i) for i in range(n_in)]
+         arg_decs.extend([form.format('out', i, i + n_in) for i in range(n_out)])
+         declare_args = '\n    '.join(arg_decs)
+ 
+         # Declare Steps
+         form = "npy_intp {0}{1}_step = steps[{2}];"
+         step_decs = [form.format('in', i, i) for i in range(n_in)]
+         step_decs.extend([form.format('out', i, i + n_in) for i in range(n_out)])
+         declare_steps = '\n    '.join(step_decs)
+ 
+         # Call Args
+         form = "*(double *)in{0}"
+         call_args = ', '.join([form.format(a) for a in range(n_in)])
+ 
+         # Step Increments
+         form = "{0}{1} += {0}{1}_step;"
+         step_incs = [form.format('in', i) for i in range(n_in)]
+         step_incs.extend([form.format('out', i) for i in range(n_out)])
+         step_increments = '\n        '.join(step_incs)
+ 
+         # Types
+         n_types = n_in + n_out
+         types = "{" + ', '.join(["NPY_DOUBLE"]*n_types) + "};"
+ 
+         # Docstring
+         docstring = '"Created in SymPy with Ufuncify"'
+ 
+         # Function Creation
+         function_creation.append("PyObject *ufunc{};".format(r_index))
+ 
+         # Ufunc initialization
+         init_form = _ufunc_init_form.substitute(module=module,
+                                                 funcname=name,
+                                                 docstring=docstring,
+                                                 n_in=n_in, n_out=n_out,
+                                                 ind=r_index)
+         ufunc_init.append(init_form)
+ 
+         outcalls = [_ufunc_outcalls.substitute(
+             outnum=i, call_args=call_args, funcname=routines[i].name) for i in
+             range(n_out)]
+ 
+         body = _ufunc_body.substitute(module=module, funcname=name,
+                                       declare_args=declare_args,
+                                       declare_steps=declare_steps,
+                                       call_args=call_args,
+                                       step_increments=step_increments,
+                                       n_types=n_types, types=types,
+                                       outcalls='\n        '.join(outcalls))
+         functions.append(body)
+ 
+         body = '\n\n'.join(functions)
+         ufunc_init = '\n    '.join(ufunc_init)
+         function_creation = '\n    '.join(function_creation)
+         bottom = _ufunc_bottom.substitute(module=module,
+                                           ufunc_init=ufunc_init,
+                                           function_creation=function_creation)
+         text = [top, body, bottom]
+         f.write('\n\n'.join(text))
+ 
+     def _partition_args(self, args):
+         """Group function arguments into categories."""
+         py_in = []
+         py_out = []
+         for arg in args:
+             if isinstance(arg, OutputArgument):
+                 py_out.append(arg)
+             elif isinstance(arg, InOutArgument):
+                 raise ValueError("Ufuncify doesn't support InOutArguments")
+             else:
+                 py_in.append(arg)
+         return py_in, py_out
+ 
+ 
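`_partition_args` above splits a routine's arguments into inputs and outputs, rejecting in/out arguments outright. A standalone sketch of that categorization with minimal stand-in classes (the class names mirror SymPy's `codegen` argument types, but these tiny versions are defined here purely for illustration):

```python
class InputArgument:  # minimal stand-ins for sympy.utilities.codegen classes
    def __init__(self, name):
        self.name = name

class OutputArgument(InputArgument):
    pass

class InOutArgument(InputArgument):
    pass

def partition_args(args):
    """Split arguments into (inputs, outputs); InOutArgument is rejected,
    matching _partition_args above."""
    py_in, py_out = [], []
    for arg in args:
        if isinstance(arg, InOutArgument):
            raise ValueError("Ufuncify doesn't support InOutArguments")
        elif isinstance(arg, OutputArgument):
            py_out.append(arg)
        else:
            py_in.append(arg)
    return py_in, py_out

args = [InputArgument('x'), InputArgument('y'), OutputArgument('out')]
ins, outs = partition_args(args)
print([a.name for a in ins], [a.name for a in outs])  # ['x', 'y'] ['out']
```

The input/output split is what determines `n_in` and `n_out` in `dump_c`, and through them the generated declarations, step increments, and the NPY_DOUBLE type signature.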
992
+ @cacheit
993
+ @doctest_depends_on(exe=('f2py', 'gfortran', 'gcc'), modules=('numpy',))
994
+ def ufuncify(args, expr, language=None, backend='numpy', tempdir=None,
995
+ flags=None, verbose=False, helpers=None, **kwargs):
996
+ """Generates a binary function that supports broadcasting on numpy arrays.
997
+
998
+ Parameters
999
+ ==========
1000
+
1001
+ args : iterable
1002
+ Either a Symbol or an iterable of symbols. Specifies the argument
1003
+ sequence for the function.
1004
+ expr
1005
+ A SymPy expression that defines the element wise operation.
1006
+ language : string, optional
1007
+ If supplied, (options: 'C' or 'F95'), specifies the language of the
1008
+ generated code. If ``None`` [default], the language is inferred based
1009
+ upon the specified backend.
1010
+ backend : string, optional
1011
+ Backend used to wrap the generated code. Either 'numpy' [default],
1012
+ 'cython', or 'f2py'.
1013
+ tempdir : string, optional
1014
+ Path to directory for temporary files. If this argument is supplied,
1015
+ the generated code and the wrapper input files are left intact in
1016
+ the specified path.
1017
+ flags : iterable, optional
1018
+ Additional option flags that will be passed to the backend.
1019
+ verbose : bool, optional
1020
+ If True, autowrap will not mute the command line backends. This can
1021
+ be helpful for debugging.
1022
+ helpers : iterable, optional
1023
+ Used to define auxiliary expressions needed for the main expr. If
1024
+ the main expression needs to call a specialized function it should
1025
+ be put in the ``helpers`` iterable. Autowrap will then make sure
1026
+ that the compiled main expression can link to the helper routine.
1027
+ Items should be tuples with (<funtion_name>, <sympy_expression>,
1028
+ <arguments>). It is mandatory to supply an argument sequence to
1029
+ helper routines.
1030
+ kwargs : dict
1031
+ These kwargs will be passed to autowrap if the `f2py` or `cython`
1032
+ backend is used and ignored if the `numpy` backend is used.
1033
+
1034
+ Notes
1035
+ =====
1036
+
1037
+ The default backend ('numpy') will create actual instances of
1038
+ ``numpy.ufunc``. These support ndimensional broadcasting, and implicit type
1039
+ conversion. Use of the other backends will result in a "ufunc-like"
1040
+ function, which requires equal length 1-dimensional arrays for all
1041
+ arguments, and will not perform any type conversions.
1042
+
1043
+ References
1044
+ ==========
1045
+
1046
+ .. [1] https://numpy.org/doc/stable/reference/ufuncs.html
1047
+
1048
+ Examples
1049
+ ========
1050
+
1051
+ >>> from sympy.utilities.autowrap import ufuncify
1052
+ >>> from sympy.abc import x, y
1053
+ >>> import numpy as np
1054
+ >>> f = ufuncify((x, y), y + x**2)
1055
+ >>> type(f)
1056
+ <class 'numpy.ufunc'>
1057
+ >>> f([1, 2, 3], 2)
1058
+ array([ 3., 6., 11.])
1059
+ >>> f(np.arange(5), 3)
1060
+ array([ 3., 4., 7., 12., 19.])
1061
+
1062
+ For the 'f2py' and 'cython' backends, inputs are required to be equal length
1063
+ 1-dimensional arrays. The 'f2py' backend will perform type conversion, but
1064
+ the Cython backend will error if the inputs are not of the expected type.
1065
+
1066
+ >>> f_fortran = ufuncify((x, y), y + x**2, backend='f2py')
1067
+ >>> f_fortran(1, 2)
1068
+ array([ 3.])
1069
+ >>> f_fortran(np.array([1, 2, 3]), np.array([1.0, 2.0, 3.0]))
1070
+ array([ 2., 6., 12.])
1071
+ >>> f_cython = ufuncify((x, y), y + x**2, backend='Cython')
1072
+ >>> f_cython(1, 2) # doctest: +ELLIPSIS
1073
+ Traceback (most recent call last):
1074
+ ...
1075
+ TypeError: Argument '_x' has incorrect type (expected numpy.ndarray, got int)
1076
+ >>> f_cython(np.array([1.0]), np.array([2.0]))
1077
+ array([ 3.])
1078
+
1079
+ """
1080
+
1081
+ if isinstance(args, Symbol):
1082
+ args = (args,)
1083
+ else:
1084
+ args = tuple(args)
1085
+
1086
+ if language:
1087
+ _validate_backend_language(backend, language)
1088
+ else:
1089
+ language = _infer_language(backend)
1090
+
1091
+ helpers = helpers if helpers else ()
1092
+ flags = flags if flags else ()
1093
+
1094
+ if backend.upper() == 'NUMPY':
1095
+ # maxargs is set by numpy compile-time constant NPY_MAXARGS
1096
+ # If a future version of numpy modifies or removes this restriction
1097
+ # this variable should be changed or removed
1098
+ maxargs = 32
1099
+ helps = []
1100
+ for name, expr, args in helpers:
1101
+ helps.append(make_routine(name, expr, args))
1102
+ code_wrapper = UfuncifyCodeWrapper(C99CodeGen("ufuncify"), tempdir,
1103
+ flags, verbose)
1104
+ if not isinstance(expr, (list, tuple)):
1105
+ expr = [expr]
1106
+ if len(expr) == 0:
1107
+ raise ValueError('Expression iterable has zero length')
1108
+ if len(expr) + len(args) > maxargs:
1109
+ msg = ('Cannot create ufunc with more than {0} total arguments: '
1110
+ 'got {1} in, {2} out')
1111
+ raise ValueError(msg.format(maxargs, len(args), len(expr)))
1112
+ routines = [make_routine('autofunc{}'.format(idx), exprx, args) for
1113
+ idx, exprx in enumerate(expr)]
1114
+ return code_wrapper.wrap_code(routines, helpers=helps)
1115
+ else:
1116
+ # Dummies are used for all added expressions to prevent name clashes
1117
+ # within the original expression.
1118
+ y = IndexedBase(Dummy('y'))
1119
+ m = Dummy('m', integer=True)
1120
+ i = Idx(Dummy('i', integer=True), m)
1121
+ f_dummy = Dummy('f')
1122
+ f = implemented_function('%s_%d' % (f_dummy.name, f_dummy.dummy_index), Lambda(args, expr))
1123
+ # For each of the args create an indexed version.
1124
+ indexed_args = [IndexedBase(Dummy(str(a))) for a in args]
1125
+ # Order the arguments (out, args, dim)
1126
+ args = [y] + indexed_args + [m]
1127
+ args_with_indices = [a[i] for a in indexed_args]
1128
+ return autowrap(Eq(y[i], f(*args_with_indices)), language, backend,
1129
+ tempdir, args, flags, verbose, helpers, **kwargs)
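The 'numpy' branch of `ufuncify` caps the total argument count at NumPy's compile-time NPY_MAXARGS limit (32) and rejects empty expression lists. A minimal sketch of that validation in isolation (the function name `check_ufunc_arity` is illustrative, not part of SymPy):

```python
NPY_MAXARGS = 32  # numpy's compile-time limit on ufunc inputs + outputs

def check_ufunc_arity(n_in, n_out):
    """Raise, as ufuncify's numpy branch does, when the expression list is
    empty or inputs + outputs exceed NPY_MAXARGS."""
    if n_out == 0:
        raise ValueError('Expression iterable has zero length')
    if n_in + n_out > NPY_MAXARGS:
        msg = ('Cannot create ufunc with more than {0} total arguments: '
               'got {1} in, {2} out')
        raise ValueError(msg.format(NPY_MAXARGS, n_in, n_out))

check_ufunc_arity(3, 1)       # fine: 4 total
try:
    check_ufunc_arity(30, 3)  # 33 total: rejected
except ValueError as exc:
    print(exc)
```

Each expression in the list becomes one output of the generated ufunc, which is why the check counts `len(expr)` as outputs alongside the symbol arguments as inputs.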
venv/lib/python3.10/site-packages/sympy/utilities/codegen.py ADDED
@@ -0,0 +1,2236 @@
+ """
+ module for generating C, C++, Fortran77, Fortran90, Julia, Rust
+ and Octave/Matlab routines that evaluate SymPy expressions.
+ This module is work in progress.
+ Only the milestones with a '+' character in the list below have been completed.
+ 
+ --- How is sympy.utilities.codegen different from sympy.printing.ccode? ---
+ 
+ We considered the idea to extend the printing routines for SymPy functions in
+ such a way that it prints complete compilable code, but this leads to a few
+ insurmountable issues that can only be tackled with a dedicated code generator:
+ 
+ - For C, one needs both a code and a header file, while the printing routines
+   generate just one string. This code generator can be extended to support
+   .pyf files for f2py.
+ 
+ - SymPy functions are not concerned with programming-technical issues, such
+   as input, output and input-output arguments. Other examples are contiguous
+   or non-contiguous arrays, including headers of other libraries such as gsl
+   or others.
+ 
+ - It is highly interesting to evaluate several SymPy functions in one C
+   routine, eventually sharing common intermediate results with the help
+   of the cse routine. This is more than just printing.
+ 
+ - From the programming perspective, expressions with constants should be
+   evaluated in the code generator as much as possible. This is different
+   for printing.
+ 
+ --- Basic assumptions ---
+ 
+ * A generic Routine data structure describes the routine that must be
+   translated into C/Fortran/... code. This data structure covers all
+   features present in one or more of the supported languages.
+ 
+ * Descendants from the CodeGen class transform multiple Routine instances
+   into compilable code. Each derived class translates into a specific
+   language.
+ 
+ * In many cases, one wants a simple workflow. The friendly functions in the
+   last part are a simple api on top of the Routine/CodeGen stuff. They are
+   easier to use, but are less powerful.
+ 
+ --- Milestones ---
+ 
+ + First working version with scalar input arguments, generating C code,
+   tests
+ + Friendly functions that are easier to use than the rigorous
+   Routine/CodeGen workflow.
+ + Integer and Real numbers as input and output
+ + Output arguments
+ + InputOutput arguments
+ + Sort input/output arguments properly
+ + Contiguous array arguments (numpy matrices)
+ + Also generate .pyf code for f2py (in autowrap module)
+ + Isolate constants and evaluate them beforehand in double precision
+ + Fortran 90
+ + Octave/Matlab
+ 
+ - Common Subexpression Elimination
+ - User defined comments in the generated code
+ - Optional extra include lines for libraries/objects that can eval special
+   functions
+ - Test other C compilers and libraries: gcc, tcc, libtcc, gcc+gsl, ...
+ - Contiguous array arguments (SymPy matrices)
+ - Non-contiguous array arguments (SymPy matrices)
+ - ccode must raise an error when it encounters something that cannot be
+   translated into c. ccode(integrate(sin(x)/x, x)) does not make sense.
+ - Complex numbers as input and output
+ - A default complex datatype
+ - Include extra information in the header: date, user, hostname, sha1
+   hash, ...
+ - Fortran 77
+ - C++
+ - Python
+ - Julia
+ - Rust
+ - ...
+ 
+ """
+ 
+ import os
+ import textwrap
+ from io import StringIO
+ 
+ from sympy import __version__ as sympy_version
+ from sympy.core import Symbol, S, Tuple, Equality, Function, Basic
+ from sympy.printing.c import c_code_printers
+ from sympy.printing.codeprinter import AssignmentError
+ from sympy.printing.fortran import FCodePrinter
+ from sympy.printing.julia import JuliaCodePrinter
+ from sympy.printing.octave import OctaveCodePrinter
+ from sympy.printing.rust import RustCodePrinter
+ from sympy.tensor import Idx, Indexed, IndexedBase
+ from sympy.matrices import (MatrixSymbol, ImmutableMatrix, MatrixBase,
+                             MatrixExpr, MatrixSlice)
+ from sympy.utilities.iterables import is_sequence
+ 
+ 
+ __all__ = [
+     # description of routines
+     "Routine", "DataType", "default_datatypes", "get_default_datatype",
+     "Argument", "InputArgument", "OutputArgument", "Result",
+     # routines -> code
+     "CodeGen", "CCodeGen", "FCodeGen", "JuliaCodeGen", "OctaveCodeGen",
+     "RustCodeGen",
+     # friendly functions
+     "codegen", "make_routine",
+ ]
+ 
+ 
+ #
+ # Description of routines
+ #
+ 
+ 
+ class Routine:
+     """Generic description of evaluation routine for set of expressions.
+ 
+     A CodeGen class can translate instances of this class into code in a
+     particular language. The routine specification covers all the features
+     present in these languages. The CodeGen part must raise an exception
+     when certain features are not present in the target language. For
+     example, multiple return values are possible in Python, but not in C or
+     Fortran. Another example: Fortran and Python support complex numbers,
+     while C does not.
+ 
+     """
+ 
+     def __init__(self, name, arguments, results, local_vars, global_vars):
+         """Initialize a Routine instance.
+ 
+         Parameters
+         ==========
+ 
+         name : string
+             Name of the routine.
+ 
+         arguments : list of Arguments
+             These are things that appear in arguments of a routine, often
+             appearing on the right-hand side of a function call. These are
+             commonly InputArguments but in some languages, they can also be
+             OutputArguments or InOutArguments (e.g., pass-by-reference in C
+             code).
+ 
+         results : list of Results
+             These are the return values of the routine, often appearing on
+             the left-hand side of a function call. The difference between
+             Results and OutputArguments and when you should use each is
+             language-specific.
+ 
+         local_vars : list of Results
+             These are variables that will be defined at the beginning of the
+             function.
+ 
+         global_vars : list of Symbols
+             Variables which will not be passed into the function.
+ 
+         """
+ 
+         # extract all input symbols and all symbols appearing in an expression
+         input_symbols = set()
+         symbols = set()
+         for arg in arguments:
+             if isinstance(arg, OutputArgument):
+                 symbols.update(arg.expr.free_symbols - arg.expr.atoms(Indexed))
+             elif isinstance(arg, InputArgument):
+                 input_symbols.add(arg.name)
+             elif isinstance(arg, InOutArgument):
+                 input_symbols.add(arg.name)
+                 symbols.update(arg.expr.free_symbols - arg.expr.atoms(Indexed))
+             else:
+                 raise ValueError("Unknown Routine argument: %s" % arg)
+ 
+         for r in results:
+             if not isinstance(r, Result):
+                 raise ValueError("Unknown Routine result: %s" % r)
+             symbols.update(r.expr.free_symbols - r.expr.atoms(Indexed))
+ 
+         local_symbols = set()
+         for r in local_vars:
+             if isinstance(r, Result):
+                 symbols.update(r.expr.free_symbols - r.expr.atoms(Indexed))
+                 local_symbols.add(r.name)
+             else:
+                 local_symbols.add(r)
+ 
+         symbols = {s.label if isinstance(s, Idx) else s for s in symbols}
+ 
+         # Check that all symbols in the expressions are covered by
+         # InputArguments/InOutArguments---subset because user could
+         # specify additional (unused) InputArguments or local_vars.
+         notcovered = symbols.difference(
+             input_symbols.union(local_symbols).union(global_vars))
+         if notcovered != set():
+             raise ValueError("Symbols needed for output are not in input " +
+                              ", ".join([str(x) for x in notcovered]))
+ 
+         self.name = name
+         self.arguments = arguments
+         self.results = results
+         self.local_vars = local_vars
+         self.global_vars = global_vars
+ 
+     def __str__(self):
+         return self.__class__.__name__ + "({name!r}, {arguments}, {results}, {local_vars}, {global_vars})".format(**self.__dict__)
+ 
+     __repr__ = __str__
+ 
+     @property
+     def variables(self):
+         """Returns a set of all variables possibly used in the routine.
+ 
+         For routines with unnamed return values, the dummies that may or
+         may not be used will be included in the set.
+ 
+         """
+         v = set(self.local_vars)
+         for arg in self.arguments:
+             v.add(arg.name)
+         for res in self.results:
+             v.add(res.result_var)
+         return v
+ 
+     @property
+     def result_variables(self):
+         """Returns a list of OutputArgument, InOutArgument and Result.
+ 
+         If return values are present, they are at the end of the list.
+         """
+         args = [arg for arg in self.arguments if isinstance(
+             arg, (OutputArgument, InOutArgument))]
+         args.extend(self.results)
+         return args
+ 
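`Routine.__init__` above verifies that every free symbol needed by the output expressions is covered by the inputs, local variables, or globals. A set-based sketch of that coverage check with plain strings standing in for SymPy symbols (the function name `check_symbol_coverage` is illustrative only):

```python
def check_symbol_coverage(needed, inputs, local_vars=(), global_vars=()):
    """Raise, as Routine.__init__ does, if an output expression needs a
    symbol that is neither an input, a local variable, nor a global.
    Extra (unused) inputs are allowed, so this is a subset check."""
    notcovered = set(needed) - (set(inputs) | set(local_vars) | set(global_vars))
    if notcovered:
        raise ValueError("Symbols needed for output are not in input " +
                         ", ".join(sorted(str(x) for x in notcovered)))

check_symbol_coverage({'x', 'y'}, inputs={'x', 'y', 'z'})  # ok: extra inputs allowed
try:
    check_symbol_coverage({'x', 'y'}, inputs={'x'})
except ValueError as exc:
    print(exc)  # Symbols needed for output are not in input y
```

The check is deliberately one-sided: unused inputs or locals are harmless, but an uncovered symbol would produce code referencing an undeclared variable, so it fails early.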
+
237
+ class DataType:
238
+ """Holds strings for a certain datatype in different languages."""
239
+ def __init__(self, cname, fname, pyname, jlname, octname, rsname):
240
+ self.cname = cname
241
+ self.fname = fname
242
+ self.pyname = pyname
243
+ self.jlname = jlname
244
+ self.octname = octname
245
+ self.rsname = rsname
246
+
247
+
248
+ default_datatypes = {
249
+ "int": DataType("int", "INTEGER*4", "int", "", "", "i32"),
250
+ "float": DataType("double", "REAL*8", "float", "", "", "f64"),
251
+ "complex": DataType("double", "COMPLEX*16", "complex", "", "", "float") #FIXME:
252
+ # complex is only supported in fortran, python, julia, and octave.
253
+ # So to not break c or rust code generation, we stick with double or
254
+ # float, respectively (but actually should raise an exception for
255
+ # explicitly complex variables (x.is_complex==True))
256
+ }
257
+
258
+
259
+ COMPLEX_ALLOWED = False
260
+ def get_default_datatype(expr, complex_allowed=None):
261
+ """Derives an appropriate datatype based on the expression."""
262
+ if complex_allowed is None:
263
+ complex_allowed = COMPLEX_ALLOWED
264
+ if complex_allowed:
265
+ final_dtype = "complex"
266
+ else:
267
+ final_dtype = "float"
268
+ if expr.is_integer:
269
+ return default_datatypes["int"]
270
+ elif expr.is_real:
271
+ return default_datatypes["float"]
272
+ elif isinstance(expr, MatrixBase):
273
+ #check all entries
274
+ dt = "int"
275
+ for element in expr:
276
+ if dt == "int" and not element.is_integer:
277
+ dt = "float"
278
+ if dt == "float" and not element.is_real:
279
+ return default_datatypes[final_dtype]
280
+ return default_datatypes[dt]
281
+ else:
282
+ return default_datatypes[final_dtype]
283
+
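The dispatch in `get_default_datatype` can be sketched with plain booleans standing in for the SymPy assumptions `is_integer`/`is_real` (the helper name `infer_dtype` is illustrative, not part of the module):

```python
# Sketch of the datatype dispatch: integers map to "int", reals to
# "float", and everything else falls back to "complex" only when
# complex output is explicitly allowed.
def infer_dtype(is_integer, is_real, complex_allowed=False):
    if is_integer:
        return "int"
    if is_real:
        return "float"
    return "complex" if complex_allowed else "float"
```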
284
+
285
+ class Variable:
286
+ """Represents a typed variable."""
287
+
288
+ def __init__(self, name, datatype=None, dimensions=None, precision=None):
289
+ """Return a new variable.
290
+
291
+ Parameters
292
+ ==========
293
+
294
+ name : Symbol or MatrixSymbol
295
+
296
+ datatype : optional
297
+ When not given, the data type will be guessed based on the
298
+ assumptions on the symbol argument.
299
+
300
+ dimensions : sequence containing tuples, optional
301
+ If present, the argument is interpreted as an array, where this
302
+ sequence of tuples specifies (lower, upper) bounds for each
303
+ index of the array.
304
+
305
+ precision : int, optional
306
+ Controls the precision of floating point constants.
307
+
308
+ """
309
+ if not isinstance(name, (Symbol, MatrixSymbol)):
310
+ raise TypeError("The first argument must be a SymPy symbol.")
311
+ if datatype is None:
312
+ datatype = get_default_datatype(name)
313
+ elif not isinstance(datatype, DataType):
314
+ raise TypeError("The (optional) `datatype' argument must be an "
315
+ "instance of the DataType class.")
316
+ if dimensions and not isinstance(dimensions, (tuple, list)):
317
+ raise TypeError(
318
+ "The dimension argument must be a sequence of tuples")
319
+
320
+ self._name = name
321
+ self._datatype = {
322
+ 'C': datatype.cname,
323
+ 'FORTRAN': datatype.fname,
324
+ 'JULIA': datatype.jlname,
325
+ 'OCTAVE': datatype.octname,
326
+ 'PYTHON': datatype.pyname,
327
+ 'RUST': datatype.rsname,
328
+ }
329
+ self.dimensions = dimensions
330
+ self.precision = precision
331
+
332
+ def __str__(self):
333
+ return "%s(%r)" % (self.__class__.__name__, self.name)
334
+
335
+ __repr__ = __str__
336
+
337
+ @property
338
+ def name(self):
339
+ return self._name
340
+
341
+ def get_datatype(self, language):
342
+ """Returns the datatype string for the requested language.
343
+
344
+ Examples
345
+ ========
346
+
347
+ >>> from sympy import Symbol
348
+ >>> from sympy.utilities.codegen import Variable
349
+ >>> x = Variable(Symbol('x'))
350
+ >>> x.get_datatype('c')
351
+ 'double'
352
+ >>> x.get_datatype('fortran')
353
+ 'REAL*8'
354
+
355
+ """
356
+ try:
357
+ return self._datatype[language.upper()]
358
+ except KeyError:
359
+ raise CodeGenError("Has datatypes for languages: %s" %
360
+ ", ".join(self._datatype))
361
+
362
+
363
+ class Argument(Variable):
364
+ """An abstract Argument data structure: a name and a data type.
365
+
366
+ This structure is refined in the descendants below.
367
+
368
+ """
369
+ pass
370
+
371
+
372
+ class InputArgument(Argument):
373
+ pass
374
+
375
+
376
+ class ResultBase:
377
+ """Base class for all "outgoing" information from a routine.
378
+
379
+ Objects of this class store a SymPy expression and a SymPy object
380
+ representing a result variable that will be used in the generated code
381
+ only if necessary.
382
+
383
+ """
384
+ def __init__(self, expr, result_var):
385
+ self.expr = expr
386
+ self.result_var = result_var
387
+
388
+ def __str__(self):
389
+ return "%s(%r, %r)" % (self.__class__.__name__, self.expr,
390
+ self.result_var)
391
+
392
+ __repr__ = __str__
393
+
394
+
395
+ class OutputArgument(Argument, ResultBase):
396
+ """OutputArgument are always initialized in the routine."""
397
+
398
+ def __init__(self, name, result_var, expr, datatype=None, dimensions=None, precision=None):
399
+ """Return a new variable.
400
+
401
+ Parameters
402
+ ==========
403
+
404
+ name : Symbol, MatrixSymbol
405
+ The name of this variable. When used for code generation, this
406
+ might appear, for example, in the prototype of function in the
407
+ argument list.
408
+
409
+ result_var : Symbol, Indexed
410
+ Something that can be used to assign a value to this variable.
411
+ Typically the same as `name` but for Indexed this should be e.g.,
412
+ "y[i]" whereas `name` should be the Symbol "y".
413
+
414
+ expr : object
415
+ The expression that should be output, typically a SymPy
416
+ expression.
417
+
418
+ datatype : optional
419
+ When not given, the data type will be guessed based on the
420
+ assumptions on the symbol argument.
421
+
422
+ dimensions : sequence containing tuples, optional
423
+ If present, the argument is interpreted as an array, where this
424
+ sequence of tuples specifies (lower, upper) bounds for each
425
+ index of the array.
426
+
427
+ precision : int, optional
428
+ Controls the precision of floating point constants.
429
+
430
+ """
431
+
432
+ Argument.__init__(self, name, datatype, dimensions, precision)
433
+ ResultBase.__init__(self, expr, result_var)
434
+
435
+ def __str__(self):
436
+ return "%s(%r, %r, %r)" % (self.__class__.__name__, self.name, self.result_var, self.expr)
437
+
438
+ __repr__ = __str__
439
+
440
+
441
+ class InOutArgument(Argument, ResultBase):
442
+ """InOutArgument are never initialized in the routine."""
443
+
444
+ def __init__(self, name, result_var, expr, datatype=None, dimensions=None, precision=None):
445
+ if not datatype:
446
+ datatype = get_default_datatype(expr)
447
+ Argument.__init__(self, name, datatype, dimensions, precision)
448
+ ResultBase.__init__(self, expr, result_var)
449
+ __init__.__doc__ = OutputArgument.__init__.__doc__
450
+
451
+
452
+ def __str__(self):
453
+ return "%s(%r, %r, %r)" % (self.__class__.__name__, self.name, self.expr,
454
+ self.result_var)
455
+
456
+ __repr__ = __str__
457
+
458
+
459
+ class Result(Variable, ResultBase):
460
+ """An expression for a return value.
461
+
462
+ The name result is used to avoid conflicts with the reserved word
463
+ "return" in the Python language. It is also shorter than ReturnValue.
464
+
465
+ These may or may not need a name in the destination (e.g., "return(x*y)"
466
+ might return a value without ever naming it).
467
+
468
+ """
469
+
470
+ def __init__(self, expr, name=None, result_var=None, datatype=None,
471
+ dimensions=None, precision=None):
472
+ """Initialize a return value.
473
+
474
+ Parameters
475
+ ==========
476
+
477
+ expr : SymPy expression
478
+
479
+ name : Symbol, MatrixSymbol, optional
480
+ The name of this return variable. When used for code generation,
481
+ this might appear, for example, in the prototype of function in a
482
+ list of return values. A dummy name is generated if omitted.
483
+
484
+ result_var : Symbol, Indexed, optional
485
+ Something that can be used to assign a value to this variable.
486
+ Typically the same as `name` but for Indexed this should be e.g.,
487
+ "y[i]" whereas `name` should be the Symbol "y". Defaults to
488
+ `name` if omitted.
489
+
490
+ datatype : optional
491
+ When not given, the data type will be guessed based on the
492
+ assumptions on the expr argument.
493
+
494
+ dimensions : sequence containing tuples, optional
495
+ If present, this variable is interpreted as an array,
496
+ where this sequence of tuples specifies (lower, upper)
497
+ bounds for each index of the array.
498
+
499
+ precision : int, optional
500
+ Controls the precision of floating point constants.
501
+
502
+ """
503
+ # Basic because it is the base class for all types of expressions
504
+ if not isinstance(expr, (Basic, MatrixBase)):
505
+ raise TypeError("The first argument must be a SymPy expression.")
506
+
507
+ if name is None:
508
+ name = 'result_%d' % abs(hash(expr))
509
+
510
+ if datatype is None:
511
+ #try to infer data type from the expression
512
+ datatype = get_default_datatype(expr)
513
+
514
+ if isinstance(name, str):
515
+ if isinstance(expr, (MatrixBase, MatrixExpr)):
516
+ name = MatrixSymbol(name, *expr.shape)
517
+ else:
518
+ name = Symbol(name)
519
+
520
+ if result_var is None:
521
+ result_var = name
522
+
523
+ Variable.__init__(self, name, datatype=datatype,
524
+ dimensions=dimensions, precision=precision)
525
+ ResultBase.__init__(self, expr, result_var)
526
+
527
+ def __str__(self):
528
+ return "%s(%r, %r, %r)" % (self.__class__.__name__, self.expr, self.name,
529
+ self.result_var)
530
+
531
+ __repr__ = __str__
532
+
533
+
534
+ #
535
+ # Transformation of routine objects into code
536
+ #
537
+
538
+ class CodeGen:
539
+ """Abstract class for the code generators."""
540
+
541
+ printer = None # will be set to an instance of a CodePrinter subclass
542
+
543
+ def _indent_code(self, codelines):
544
+ return self.printer.indent_code(codelines)
545
+
546
+ def _printer_method_with_settings(self, method, settings=None, *args, **kwargs):
547
+ settings = settings or {}
548
+ ori = {k: self.printer._settings[k] for k in settings}
549
+ for k, v in settings.items():
550
+ self.printer._settings[k] = v
551
+ result = getattr(self.printer, method)(*args, **kwargs)
552
+ for k, v in ori.items():
553
+ self.printer._settings[k] = v
554
+ return result
555
+
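The save-override-restore pattern used by `_printer_method_with_settings` can be sketched on its own with a plain dict standing in for the printer's `_settings` (the helper name `with_settings` is illustrative):

```python
# Sketch of the temporary-settings pattern: remember the overridden
# keys, apply the new values, call the function, then restore the
# originals so later calls see the printer's default settings.
def with_settings(settings, store, fn):
    original = {k: store[k] for k in settings}
    store.update(settings)
    try:
        return fn()
    finally:
        store.update(original)

store = {"human": True, "precision": 15}
result = with_settings({"human": False}, store, lambda: store["human"])
```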
556
+ def _get_symbol(self, s):
557
+ """Returns the symbol as fcode prints it."""
558
+ if self.printer._settings['human']:
559
+ expr_str = self.printer.doprint(s)
560
+ else:
561
+ constants, not_supported, expr_str = self.printer.doprint(s)
562
+ if constants or not_supported:
563
+ raise ValueError("Failed to print %s" % str(s))
564
+ return expr_str.strip()
565
+
566
+ def __init__(self, project="project", cse=False):
567
+ """Initialize a code generator.
568
+
569
+ Derived classes will offer more options that affect the generated
570
+ code.
571
+
572
+ """
573
+ self.project = project
574
+ self.cse = cse
575
+
576
+ def routine(self, name, expr, argument_sequence=None, global_vars=None):
577
+ """Creates an Routine object that is appropriate for this language.
578
+
579
+ This implementation is appropriate for at least C/Fortran. Subclasses
580
+ can override this if necessary.
581
+
582
+ Here, we assume at most one return value (the l-value) which must be
583
+ scalar. Additional outputs are OutputArguments (e.g., pointers on
584
+ right-hand-side or pass-by-reference). Matrices are always returned
585
+ via OutputArguments. If ``argument_sequence`` is None, arguments will
586
+ be ordered alphabetically, but with all InputArguments first, and then
587
+ OutputArgument and InOutArguments.
588
+
589
+ """
590
+
591
+ if self.cse:
592
+ from sympy.simplify.cse_main import cse
593
+
594
+ if is_sequence(expr) and not isinstance(expr, (MatrixBase, MatrixExpr)):
595
+ if not expr:
596
+ raise ValueError("No expression given")
597
+ for e in expr:
598
+ if not e.is_Equality:
599
+ raise CodeGenError("Lists of expressions must all be Equalities. {} is not.".format(e))
600
+
601
+ # create a list of right hand sides and simplify them
602
+ rhs = [e.rhs for e in expr]
603
+ common, simplified = cse(rhs)
604
+
605
+ # pack the simplified expressions back up with their left hand sides
606
+ expr = [Equality(e.lhs, rhs) for e, rhs in zip(expr, simplified)]
607
+ else:
608
+ if isinstance(expr, Equality):
609
+ common, simplified = cse(expr.rhs) #, ignore=in_out_args)
610
+ expr = Equality(expr.lhs, simplified[0])
611
+ else:
612
+ common, simplified = cse(expr)
613
+ expr = simplified
614
+
615
+ local_vars = [Result(b,a) for a,b in common]
616
+ local_symbols = {a for a,_ in common}
617
+ local_expressions = Tuple(*[b for _,b in common])
618
+ else:
619
+ local_expressions = Tuple()
620
+
621
+ if is_sequence(expr) and not isinstance(expr, (MatrixBase, MatrixExpr)):
622
+ if not expr:
623
+ raise ValueError("No expression given")
624
+ expressions = Tuple(*expr)
625
+ else:
626
+ expressions = Tuple(expr)
627
+
628
+ if self.cse:
629
+ if {i.label for i in expressions.atoms(Idx)} != set():
630
+ raise CodeGenError("CSE and Indexed expressions do not play well together yet")
631
+ else:
632
+ # local variables for indexed expressions
633
+ local_vars = {i.label for i in expressions.atoms(Idx)}
634
+ local_symbols = local_vars
635
+
636
+ # global variables
637
+ global_vars = set() if global_vars is None else set(global_vars)
638
+
639
+ # symbols that should be arguments
640
+ symbols = (expressions.free_symbols | local_expressions.free_symbols) - local_symbols - global_vars
641
+ new_symbols = set()
642
+ new_symbols.update(symbols)
643
+
644
+ for symbol in symbols:
645
+ if isinstance(symbol, Idx):
646
+ new_symbols.remove(symbol)
647
+ new_symbols.update(symbol.args[1].free_symbols)
648
+ if isinstance(symbol, Indexed):
649
+ new_symbols.remove(symbol)
650
+ symbols = new_symbols
651
+
652
+ # Decide whether to use output argument or return value
653
+ return_val = []
654
+ output_args = []
655
+ for expr in expressions:
656
+ if isinstance(expr, Equality):
657
+ out_arg = expr.lhs
658
+ expr = expr.rhs
659
+ if isinstance(out_arg, Indexed):
660
+ dims = tuple([ (S.Zero, dim - 1) for dim in out_arg.shape])
661
+ symbol = out_arg.base.label
662
+ elif isinstance(out_arg, Symbol):
663
+ dims = []
664
+ symbol = out_arg
665
+ elif isinstance(out_arg, MatrixSymbol):
666
+ dims = tuple([ (S.Zero, dim - 1) for dim in out_arg.shape])
667
+ symbol = out_arg
668
+ else:
669
+ raise CodeGenError("Only Indexed, Symbol, or MatrixSymbol "
670
+ "can define output arguments.")
671
+
672
+ if expr.has(symbol):
673
+ output_args.append(
674
+ InOutArgument(symbol, out_arg, expr, dimensions=dims))
675
+ else:
676
+ output_args.append(
677
+ OutputArgument(symbol, out_arg, expr, dimensions=dims))
678
+
679
+ # remove duplicate arguments when they are not local variables
680
+ if symbol not in local_vars:
681
+ # avoid duplicate arguments
682
+ symbols.remove(symbol)
683
+ elif isinstance(expr, (ImmutableMatrix, MatrixSlice)):
684
+ # Create a "dummy" MatrixSymbol to use as the Output arg
685
+ out_arg = MatrixSymbol('out_%s' % abs(hash(expr)), *expr.shape)
686
+ dims = tuple([(S.Zero, dim - 1) for dim in out_arg.shape])
687
+ output_args.append(
688
+ OutputArgument(out_arg, out_arg, expr, dimensions=dims))
689
+ else:
690
+ return_val.append(Result(expr))
691
+
692
+ arg_list = []
693
+
694
+ # setup input argument list
695
+
696
+ # helper to get dimensions for data for array-like args
697
+ def dimensions(s):
698
+ return [(S.Zero, dim - 1) for dim in s.shape]
699
+
700
+ array_symbols = {}
701
+ for array in expressions.atoms(Indexed) | local_expressions.atoms(Indexed):
702
+ array_symbols[array.base.label] = array
703
+ for array in expressions.atoms(MatrixSymbol) | local_expressions.atoms(MatrixSymbol):
704
+ array_symbols[array] = array
705
+
706
+ for symbol in sorted(symbols, key=str):
707
+ if symbol in array_symbols:
708
+ array = array_symbols[symbol]
709
+ metadata = {'dimensions': dimensions(array)}
710
+ else:
711
+ metadata = {}
712
+
713
+ arg_list.append(InputArgument(symbol, **metadata))
714
+
715
+ output_args.sort(key=lambda x: str(x.name))
716
+ arg_list.extend(output_args)
717
+
718
+ if argument_sequence is not None:
719
+ # if the user has supplied IndexedBase instances, we'll accept that
720
+ new_sequence = []
721
+ for arg in argument_sequence:
722
+ if isinstance(arg, IndexedBase):
723
+ new_sequence.append(arg.label)
724
+ else:
725
+ new_sequence.append(arg)
726
+ argument_sequence = new_sequence
727
+
728
+ missing = [x for x in arg_list if x.name not in argument_sequence]
729
+ if missing:
730
+ msg = "Argument list didn't specify: {0} "
731
+ msg = msg.format(", ".join([str(m.name) for m in missing]))
732
+ raise CodeGenArgumentListError(msg, missing)
733
+
734
+ # create redundant arguments to produce the requested sequence
735
+ name_arg_dict = {x.name: x for x in arg_list}
736
+ new_args = []
737
+ for symbol in argument_sequence:
738
+ try:
739
+ new_args.append(name_arg_dict[symbol])
740
+ except KeyError:
741
+ if isinstance(symbol, (IndexedBase, MatrixSymbol)):
742
+ metadata = {'dimensions': dimensions(symbol)}
743
+ else:
744
+ metadata = {}
745
+ new_args.append(InputArgument(symbol, **metadata))
746
+ arg_list = new_args
747
+
748
+ return Routine(name, arg_list, return_val, local_vars, global_vars)
749
+
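The default argument ordering described in the `routine()` docstring (inputs first, then output/in-out arguments, each group alphabetical) can be sketched with plain name lists (the names here are illustrative):

```python
# Sketch of the default argument ordering: InputArguments sorted by
# name come first, followed by Output/InOutArguments sorted by name.
inputs = ["y", "x"]
outputs = ["r2", "r1"]
arg_list = sorted(inputs, key=str) + sorted(outputs, key=str)
```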
750
+ def write(self, routines, prefix, to_files=False, header=True, empty=True):
751
+ """Writes all the source code files for the given routines.
752
+
753
+ The generated source is returned as a list of (filename, contents)
754
+ tuples, or is written to files (see below). Each filename consists
755
+ of the given prefix, appended with an appropriate extension.
756
+
757
+ Parameters
758
+ ==========
759
+
760
+ routines : list
761
+ A list of Routine instances to be written
762
+
763
+ prefix : string
764
+ The prefix for the output files
765
+
766
+ to_files : bool, optional
767
+ When True, the output is written to files. Otherwise, a list
768
+ of (filename, contents) tuples is returned. [default: False]
769
+
770
+ header : bool, optional
771
+ When True, a header comment is included on top of each source
772
+ file. [default: True]
773
+
774
+ empty : bool, optional
775
+ When True, empty lines are included to structure the source
776
+ files. [default: True]
777
+
778
+ """
779
+ if to_files:
780
+ for dump_fn in self.dump_fns:
781
+ filename = "%s.%s" % (prefix, dump_fn.extension)
782
+ with open(filename, "w") as f:
783
+ dump_fn(self, routines, f, prefix, header, empty)
784
+ else:
785
+ result = []
786
+ for dump_fn in self.dump_fns:
787
+ filename = "%s.%s" % (prefix, dump_fn.extension)
788
+ contents = StringIO()
789
+ dump_fn(self, routines, contents, prefix, header, empty)
790
+ result.append((filename, contents.getvalue()))
791
+ return result
792
+
793
+ def dump_code(self, routines, f, prefix, header=True, empty=True):
794
+ """Write the code by calling language specific methods.
795
+
796
+ The generated file contains all the definitions of the routines in
797
+ low-level code and refers to the header file if appropriate.
798
+
799
+ Parameters
800
+ ==========
801
+
802
+ routines : list
803
+ A list of Routine instances.
804
+
805
+ f : file-like
806
+ Where to write the file.
807
+
808
+ prefix : string
809
+ The filename prefix, used to refer to the proper header file.
810
+ Only the basename of the prefix is used.
811
+
812
+ header : bool, optional
813
+ When True, a header comment is included on top of each source
814
+ file. [default : True]
815
+
816
+ empty : bool, optional
817
+ When True, empty lines are included to structure the source
818
+ files. [default : True]
819
+
820
+ """
821
+
822
+ code_lines = self._preprocessor_statements(prefix)
823
+
824
+ for routine in routines:
825
+ if empty:
826
+ code_lines.append("\n")
827
+ code_lines.extend(self._get_routine_opening(routine))
828
+ code_lines.extend(self._declare_arguments(routine))
829
+ code_lines.extend(self._declare_globals(routine))
830
+ code_lines.extend(self._declare_locals(routine))
831
+ if empty:
832
+ code_lines.append("\n")
833
+ code_lines.extend(self._call_printer(routine))
834
+ if empty:
835
+ code_lines.append("\n")
836
+ code_lines.extend(self._get_routine_ending(routine))
837
+
838
+ code_lines = self._indent_code(''.join(code_lines))
839
+
840
+ if header:
841
+ code_lines = ''.join(self._get_header() + [code_lines])
842
+
843
+ if code_lines:
844
+ f.write(code_lines)
845
+
846
+
847
+ class CodeGenError(Exception):
848
+ pass
849
+
850
+
851
+ class CodeGenArgumentListError(Exception):
852
+ @property
853
+ def missing_args(self):
854
+ return self.args[1]
855
+
856
+
857
+ header_comment = """Code generated with SymPy %(version)s
858
+
859
+ See http://www.sympy.org/ for more information.
860
+
861
+ This file is part of '%(project)s'
862
+ """
863
+
864
+
865
+ class CCodeGen(CodeGen):
866
+ """Generator for C code.
867
+
868
+ The .write() method inherited from CodeGen will output a code file and
869
+ an interface file, <prefix>.c and <prefix>.h respectively.
870
+
871
+ """
872
+
873
+ code_extension = "c"
874
+ interface_extension = "h"
875
+ standard = 'c99'
876
+
877
+ def __init__(self, project="project", printer=None,
878
+ preprocessor_statements=None, cse=False):
879
+ super().__init__(project=project, cse=cse)
880
+ self.printer = printer or c_code_printers[self.standard.lower()]()
881
+
882
+ self.preprocessor_statements = preprocessor_statements
883
+ if preprocessor_statements is None:
884
+ self.preprocessor_statements = ['#include <math.h>']
885
+
886
+ def _get_header(self):
887
+ """Writes a common header for the generated files."""
888
+ code_lines = []
889
+ code_lines.append("/" + "*"*78 + '\n')
890
+ tmp = header_comment % {"version": sympy_version,
891
+ "project": self.project}
892
+ for line in tmp.splitlines():
893
+ code_lines.append(" *%s*\n" % line.center(76))
894
+ code_lines.append(" " + "*"*78 + "/\n")
895
+ return code_lines
896
+
897
+ def get_prototype(self, routine):
898
+ """Returns a string for the function prototype of the routine.
899
+
900
+ If the routine has multiple result objects, an CodeGenError is
901
+ raised.
902
+
903
+ See: https://en.wikipedia.org/wiki/Function_prototype
904
+
905
+ """
906
+ if len(routine.results) > 1:
907
+ raise CodeGenError("C only supports a single or no return value.")
908
+ elif len(routine.results) == 1:
909
+ ctype = routine.results[0].get_datatype('C')
910
+ else:
911
+ ctype = "void"
912
+
913
+ type_args = []
914
+ for arg in routine.arguments:
915
+ name = self.printer.doprint(arg.name)
916
+ if arg.dimensions or isinstance(arg, ResultBase):
917
+ type_args.append((arg.get_datatype('C'), "*%s" % name))
918
+ else:
919
+ type_args.append((arg.get_datatype('C'), name))
920
+ arguments = ", ".join([ "%s %s" % t for t in type_args])
921
+ return "%s %s(%s)" % (ctype, routine.name, arguments)
922
+
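The string assembly in `get_prototype` can be sketched in isolation: arguments that are arrays or results get a pointer type, the rest are passed by value (the argument names here are illustrative):

```python
# Sketch of C prototype assembly: (type, name) pairs are joined into
# an argument list, with "*" prepended for pass-by-reference args.
ctype = "double"
type_args = [("double", "x"), ("double", "*y")]  # y passed by pointer
arguments = ", ".join("%s %s" % t for t in type_args)
prototype = "%s %s(%s)" % (ctype, "f", arguments)
```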
923
+ def _preprocessor_statements(self, prefix):
924
+ code_lines = []
925
+ code_lines.append('#include "{}.h"'.format(os.path.basename(prefix)))
926
+ code_lines.extend(self.preprocessor_statements)
927
+ code_lines = ['{}\n'.format(l) for l in code_lines]
928
+ return code_lines
929
+
930
+ def _get_routine_opening(self, routine):
931
+ prototype = self.get_prototype(routine)
932
+ return ["%s {\n" % prototype]
933
+
934
+ def _declare_arguments(self, routine):
935
+ # arguments are declared in prototype
936
+ return []
937
+
938
+ def _declare_globals(self, routine):
939
+ # global variables are not explicitly declared within C functions
940
+ return []
941
+
942
+ def _declare_locals(self, routine):
943
+
944
+ # Compose a list of symbols to be dereferenced in the function
945
+ # body. These are the arguments that were passed by a reference
946
+ # pointer, excluding arrays.
947
+ dereference = []
948
+ for arg in routine.arguments:
949
+ if isinstance(arg, ResultBase) and not arg.dimensions:
950
+ dereference.append(arg.name)
951
+
952
+ code_lines = []
953
+ for result in routine.local_vars:
954
+
955
+ # local variables that are simple symbols such as those used as indices into
956
+ # for loops are declared elsewhere.
957
+ if not isinstance(result, Result):
958
+ continue
959
+
960
+ if result.name != result.result_var:
961
+ raise CodeGenError("Result variable and name should match: {}".format(result))
962
+ assign_to = result.name
963
+ t = result.get_datatype('c')
964
+ if isinstance(result.expr, (MatrixBase, MatrixExpr)):
965
+ dims = result.expr.shape
966
+ code_lines.append("{} {}[{}];\n".format(t, str(assign_to), dims[0]*dims[1]))
967
+ prefix = ""
968
+ else:
969
+ prefix = "const {} ".format(t)
970
+
971
+ constants, not_c, c_expr = self._printer_method_with_settings(
972
+ 'doprint', {"human": False, "dereference": dereference},
973
+ result.expr, assign_to=assign_to)
974
+
975
+ for name, value in sorted(constants, key=str):
976
+ code_lines.append("double const %s = %s;\n" % (name, value))
977
+
978
+ code_lines.append("{}{}\n".format(prefix, c_expr))
979
+
980
+ return code_lines
981
+
982
+ def _call_printer(self, routine):
983
+ code_lines = []
984
+
985
+ # Compose a list of symbols to be dereferenced in the function
986
+ # body. These are the arguments that were passed by a reference
987
+ # pointer, excluding arrays.
988
+ dereference = []
989
+ for arg in routine.arguments:
990
+ if isinstance(arg, ResultBase) and not arg.dimensions:
991
+ dereference.append(arg.name)
992
+
993
+ return_val = None
994
+ for result in routine.result_variables:
995
+ if isinstance(result, Result):
996
+ assign_to = routine.name + "_result"
997
+ t = result.get_datatype('c')
998
+ code_lines.append("{} {};\n".format(t, str(assign_to)))
999
+ return_val = assign_to
1000
+ else:
1001
+ assign_to = result.result_var
1002
+
1003
+ try:
1004
+ constants, not_c, c_expr = self._printer_method_with_settings(
1005
+ 'doprint', {"human": False, "dereference": dereference},
1006
+ result.expr, assign_to=assign_to)
1007
+ except AssignmentError:
1008
+ assign_to = result.result_var
1009
+ code_lines.append(
1010
+ "%s %s;\n" % (result.get_datatype('c'), str(assign_to)))
1011
+ constants, not_c, c_expr = self._printer_method_with_settings(
1012
+ 'doprint', {"human": False, "dereference": dereference},
1013
+ result.expr, assign_to=assign_to)
1014
+
1015
+ for name, value in sorted(constants, key=str):
1016
+ code_lines.append("double const %s = %s;\n" % (name, value))
1017
+ code_lines.append("%s\n" % c_expr)
1018
+
1019
+ if return_val:
1020
+ code_lines.append(" return %s;\n" % return_val)
1021
+ return code_lines
1022
+
1023
+ def _get_routine_ending(self, routine):
1024
+ return ["}\n"]
1025
+
1026
+ def dump_c(self, routines, f, prefix, header=True, empty=True):
1027
+ self.dump_code(routines, f, prefix, header, empty)
1028
+ dump_c.extension = code_extension # type: ignore
1029
+ dump_c.__doc__ = CodeGen.dump_code.__doc__
1030
+
1031
+ def dump_h(self, routines, f, prefix, header=True, empty=True):
1032
+ """Writes the C header file.
1033
+
1034
+ This file contains all the function declarations.
1035
+
1036
+ Parameters
1037
+ ==========
1038
+
1039
+ routines : list
1040
+ A list of Routine instances.
1041
+
1042
+ f : file-like
1043
+ Where to write the file.
1044
+
1045
+ prefix : string
1046
+ The filename prefix, used to construct the include guards.
1047
+ Only the basename of the prefix is used.
1048
+
1049
+ header : bool, optional
1050
+ When True, a header comment is included on top of each source
1051
+ file. [default : True]
1052
+
1053
+ empty : bool, optional
1054
+ When True, empty lines are included to structure the source
1055
+ files. [default : True]
1056
+
1057
+ """
1058
+ if header:
1059
+ print(''.join(self._get_header()), file=f)
1060
+ guard_name = "%s__%s__H" % (self.project.replace(
1061
+ " ", "_").upper(), prefix.replace("/", "_").upper())
1062
+ # include guards
1063
+ if empty:
1064
+ print(file=f)
1065
+ print("#ifndef %s" % guard_name, file=f)
1066
+ print("#define %s" % guard_name, file=f)
1067
+ if empty:
1068
+ print(file=f)
1069
+ # declaration of the function prototypes
1070
+ for routine in routines:
1071
+ prototype = self.get_prototype(routine)
1072
+ print("%s;" % prototype, file=f)
1073
+ # end of include guards
1074
+ if empty:
1075
+ print(file=f)
1076
+ print("#endif", file=f)
1077
+ if empty:
1078
+ print(file=f)
1079
+ dump_h.extension = interface_extension # type: ignore
1080
+
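The include-guard name built in `dump_h` from the project name and file prefix can be sketched on its own (the project and prefix values are illustrative):

```python
# Sketch of include-guard naming: spaces in the project name and "/"
# in the prefix are replaced with "_", then everything is uppercased.
project, prefix = "my project", "out/file"
guard_name = "%s__%s__H" % (project.replace(" ", "_").upper(),
                            prefix.replace("/", "_").upper())
```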
1081
+ # This list of dump functions is used by CodeGen.write to know which dump
1082
+ # functions it has to call.
1083
+ dump_fns = [dump_c, dump_h]
1084
+
1085
+ class C89CodeGen(CCodeGen):
1086
+ standard = 'C89'
1087
+
1088
+ class C99CodeGen(CCodeGen):
1089
+ standard = 'C99'
1090
+
1091
+ class FCodeGen(CodeGen):
1092
+ """Generator for Fortran 95 code
1093
+
1094
+ The .write() method inherited from CodeGen will output a code file and
1095
+ an interface file, <prefix>.f90 and <prefix>.h respectively.
1096
+
1097
+ """
1098
+
1099
+ code_extension = "f90"
1100
+ interface_extension = "h"
1101
+
1102
+ def __init__(self, project='project', printer=None):
1103
+ super().__init__(project)
1104
+ self.printer = printer or FCodePrinter()
1105
+
1106
+ def _get_header(self):
1107
+ """Writes a common header for the generated files."""
1108
+ code_lines = []
1109
+ code_lines.append("!" + "*"*78 + '\n')
1110
+ tmp = header_comment % {"version": sympy_version,
1111
+ "project": self.project}
1112
+ for line in tmp.splitlines():
1113
+ code_lines.append("!*%s*\n" % line.center(76))
1114
+ code_lines.append("!" + "*"*78 + '\n')
1115
+ return code_lines
1116
+
1117
+ def _preprocessor_statements(self, prefix):
1118
+ return []
1119
+
1120
+ def _get_routine_opening(self, routine):
1121
+ """Returns the opening statements of the fortran routine."""
1122
+ code_list = []
1123
+ if len(routine.results) > 1:
1124
+ raise CodeGenError(
1125
+ "Fortran only supports a single or no return value.")
1126
+ elif len(routine.results) == 1:
1127
+ result = routine.results[0]
1128
+ code_list.append(result.get_datatype('fortran'))
1129
+ code_list.append("function")
1130
+ else:
1131
+ code_list.append("subroutine")
1132
+
1133
+ args = ", ".join("%s" % self._get_symbol(arg.name)
1134
+ for arg in routine.arguments)
1135
+
1136
+ call_sig = "{}({})\n".format(routine.name, args)
1137
+ # Fortran 95 requires all lines be less than 132 characters, so wrap
1138
+ # this line before appending.
1139
+ call_sig = ' &\n'.join(textwrap.wrap(call_sig,
1140
+ width=60,
1141
+ break_long_words=False)) + '\n'
1142
+ code_list.append(call_sig)
1143
+ code_list = [' '.join(code_list)]
1144
+ code_list.append('implicit none\n')
1145
+ return code_list
1146
+
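The continuation-line wrapping used above for long Fortran signatures can be sketched on its own; free-form Fortran limits lines to 132 characters, and the generator wraps well below that, joining pieces with " &" continuations (the signature below is illustrative):

```python
# Sketch of Fortran line wrapping: break the call signature at word
# boundaries and join the pieces with " &\n" continuation markers.
import textwrap

call_sig = "subroutine f(" + ", ".join("arg%d" % i for i in range(20)) + ")"
wrapped = ' &\n'.join(textwrap.wrap(call_sig,
                                    width=60,
                                    break_long_words=False)) + '\n'
```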
1147
+ def _declare_arguments(self, routine):
1148
+ # argument type declarations
1149
+ code_list = []
1150
+ array_list = []
1151
+ scalar_list = []
1152
+ for arg in routine.arguments:
1153
+
1154
+ if isinstance(arg, InputArgument):
1155
+ typeinfo = "%s, intent(in)" % arg.get_datatype('fortran')
1156
+ elif isinstance(arg, InOutArgument):
1157
+ typeinfo = "%s, intent(inout)" % arg.get_datatype('fortran')
1158
+ elif isinstance(arg, OutputArgument):
1159
+ typeinfo = "%s, intent(out)" % arg.get_datatype('fortran')
1160
+ else:
1161
+ raise CodeGenError("Unknown Argument type: %s" % type(arg))
1162
+
1163
+ fprint = self._get_symbol
1164
+
1165
+ if arg.dimensions:
1166
+ # fortran arrays start at 1
1167
+ dimstr = ", ".join(["%s:%s" % (
1168
+ fprint(dim[0] + 1), fprint(dim[1] + 1))
1169
+ for dim in arg.dimensions])
1170
+ typeinfo += ", dimension(%s)" % dimstr
1171
+ array_list.append("%s :: %s\n" % (typeinfo, fprint(arg.name)))
1172
+ else:
1173
+ scalar_list.append("%s :: %s\n" % (typeinfo, fprint(arg.name)))
1174
+
1175
+ # scalars first, because they can be used in array declarations
1176
+ code_list.extend(scalar_list)
1177
+ code_list.extend(array_list)
1178
+
1179
+ return code_list
1180
+
1181
+ def _declare_globals(self, routine):
1182
+ # Global variables not explicitly declared within Fortran 90 functions.
1183
+ # Note: a future F77 mode may need to generate "common" blocks.
1184
+ return []
1185
+
1186
+ def _declare_locals(self, routine):
1187
+ code_list = []
1188
+ for var in sorted(routine.local_vars, key=str):
1189
+ typeinfo = get_default_datatype(var)
1190
+ code_list.append("%s :: %s\n" % (
1191
+ typeinfo.fname, self._get_symbol(var)))
1192
+ return code_list
1193
+
1194
+ def _get_routine_ending(self, routine):
1195
+ """Returns the closing statements of the fortran routine."""
1196
+ if len(routine.results) == 1:
1197
+ return ["end function\n"]
1198
+ else:
1199
+ return ["end subroutine\n"]
1200
+
1201
+ def get_interface(self, routine):
+ """Returns a string for the function interface.
+
+ The routine should have a single result object, which can be None.
+ If the routine has multiple result objects, a CodeGenError is
+ raised.
+
+ See: https://en.wikipedia.org/wiki/Function_prototype
+
+ """
+ prototype = [ "interface\n" ]
+ prototype.extend(self._get_routine_opening(routine))
+ prototype.extend(self._declare_arguments(routine))
+ prototype.extend(self._get_routine_ending(routine))
+ prototype.append("end interface\n")
+
+ return "".join(prototype)
+
+ def _call_printer(self, routine):
+ declarations = []
+ code_lines = []
+ for result in routine.result_variables:
+ if isinstance(result, Result):
+ assign_to = routine.name
+ elif isinstance(result, (OutputArgument, InOutArgument)):
+ assign_to = result.result_var
+
+ constants, not_fortran, f_expr = self._printer_method_with_settings(
+ 'doprint', {"human": False, "source_format": 'free', "standard": 95},
+ result.expr, assign_to=assign_to)
+
+ for obj, v in sorted(constants, key=str):
+ t = get_default_datatype(obj)
+ declarations.append(
+ "%s, parameter :: %s = %s\n" % (t.fname, obj, v))
+ for obj in sorted(not_fortran, key=str):
+ t = get_default_datatype(obj)
+ if isinstance(obj, Function):
+ name = obj.func
+ else:
+ name = obj
+ declarations.append("%s :: %s\n" % (t.fname, name))
+
+ code_lines.append("%s\n" % f_expr)
+ return declarations + code_lines
+
+ def _indent_code(self, codelines):
+ return self._printer_method_with_settings(
+ 'indent_code', {"human": False, "source_format": 'free'}, codelines)
+
+ def dump_f95(self, routines, f, prefix, header=True, empty=True):
+ # check that symbols are unique with ignorecase
+ for r in routines:
+ lowercase = {str(x).lower() for x in r.variables}
+ orig_case = {str(x) for x in r.variables}
+ if len(lowercase) < len(orig_case):
+ raise CodeGenError("Fortran ignores case. Got symbols: %s" %
+ (", ".join([str(var) for var in r.variables])))
+ self.dump_code(routines, f, prefix, header, empty)
+ dump_f95.extension = code_extension  # type: ignore
+ dump_f95.__doc__ = CodeGen.dump_code.__doc__
+
+ def dump_h(self, routines, f, prefix, header=True, empty=True):
+ """Writes the interface to a header file.
+
+ This file contains all the function declarations.
+
+ Parameters
+ ==========
+
+ routines : list
+ A list of Routine instances.
+
+ f : file-like
+ Where to write the file.
+
+ prefix : string
+ The filename prefix.
+
+ header : bool, optional
+ When True, a header comment is included on top of each source
+ file. [default : True]
+
+ empty : bool, optional
+ When True, empty lines are included to structure the source
+ files. [default : True]
+
+ """
+ if header:
+ print(''.join(self._get_header()), file=f)
+ if empty:
+ print(file=f)
+ # declaration of the function prototypes
+ for routine in routines:
+ prototype = self.get_interface(routine)
+ f.write(prototype)
+ if empty:
+ print(file=f)
+ dump_h.extension = interface_extension  # type: ignore
+
+ # This list of dump functions is used by CodeGen.write to know which dump
+ # functions it has to call.
+ dump_fns = [dump_f95, dump_h]
+
+
+ class JuliaCodeGen(CodeGen):
+ """Generator for Julia code.
+
+ The .write() method inherited from CodeGen will output a code file
+ <prefix>.jl.
+
+ """
+
+ code_extension = "jl"
+
+ def __init__(self, project='project', printer=None):
+ super().__init__(project)
+ self.printer = printer or JuliaCodePrinter()
+
+ def routine(self, name, expr, argument_sequence, global_vars):
+ """Specialized Routine creation for Julia."""
+
+ if is_sequence(expr) and not isinstance(expr, (MatrixBase, MatrixExpr)):
+ if not expr:
+ raise ValueError("No expression given")
+ expressions = Tuple(*expr)
+ else:
+ expressions = Tuple(expr)
+
+ # local variables
+ local_vars = {i.label for i in expressions.atoms(Idx)}
+
+ # global variables
+ global_vars = set() if global_vars is None else set(global_vars)
+
+ # symbols that should be arguments
+ old_symbols = expressions.free_symbols - local_vars - global_vars
+ symbols = set()
+ for s in old_symbols:
+ if isinstance(s, Idx):
+ symbols.update(s.args[1].free_symbols)
+ elif not isinstance(s, Indexed):
+ symbols.add(s)
+
+ # Julia supports multiple return values
+ return_vals = []
+ output_args = []
+ for (i, expr) in enumerate(expressions):
+ if isinstance(expr, Equality):
+ out_arg = expr.lhs
+ expr = expr.rhs
+ symbol = out_arg
+ if isinstance(out_arg, Indexed):
+ dims = tuple([ (S.One, dim) for dim in out_arg.shape])
+ symbol = out_arg.base.label
+ output_args.append(InOutArgument(symbol, out_arg, expr, dimensions=dims))
+ if not isinstance(out_arg, (Indexed, Symbol, MatrixSymbol)):
+ raise CodeGenError("Only Indexed, Symbol, or MatrixSymbol "
+ "can define output arguments.")
+
+ return_vals.append(Result(expr, name=symbol, result_var=out_arg))
+ if not expr.has(symbol):
+ # this is a pure output: remove from the symbols list, so
+ # it doesn't become an input.
+ symbols.remove(symbol)
+
+ else:
+ # we have no name for this output
+ return_vals.append(Result(expr, name='out%d' % (i+1)))
+
+ # setup input argument list
+ output_args.sort(key=lambda x: str(x.name))
+ arg_list = list(output_args)
+ array_symbols = {}
+ for array in expressions.atoms(Indexed):
+ array_symbols[array.base.label] = array
+ for array in expressions.atoms(MatrixSymbol):
+ array_symbols[array] = array
+
+ for symbol in sorted(symbols, key=str):
+ arg_list.append(InputArgument(symbol))
+
+ if argument_sequence is not None:
+ # if the user has supplied IndexedBase instances, we'll accept that
+ new_sequence = []
+ for arg in argument_sequence:
+ if isinstance(arg, IndexedBase):
+ new_sequence.append(arg.label)
+ else:
+ new_sequence.append(arg)
+ argument_sequence = new_sequence
+
+ missing = [x for x in arg_list if x.name not in argument_sequence]
+ if missing:
+ msg = "Argument list didn't specify: {0} "
+ msg = msg.format(", ".join([str(m.name) for m in missing]))
+ raise CodeGenArgumentListError(msg, missing)
+
+ # create redundant arguments to produce the requested sequence
+ name_arg_dict = {x.name: x for x in arg_list}
+ new_args = []
+ for symbol in argument_sequence:
+ try:
+ new_args.append(name_arg_dict[symbol])
+ except KeyError:
+ new_args.append(InputArgument(symbol))
+ arg_list = new_args
+
+ return Routine(name, arg_list, return_vals, local_vars, global_vars)
+
+ def _get_header(self):
+ """Writes a common header for the generated files."""
+ code_lines = []
+ tmp = header_comment % {"version": sympy_version,
+ "project": self.project}
+ for line in tmp.splitlines():
+ if line == '':
+ code_lines.append("#\n")
+ else:
+ code_lines.append("# %s\n" % line)
+ return code_lines
+
+ def _preprocessor_statements(self, prefix):
+ return []
+
+ def _get_routine_opening(self, routine):
+ """Returns the opening statements of the routine."""
+ code_list = []
+ code_list.append("function ")
+
+ # Inputs
+ args = []
+ for i, arg in enumerate(routine.arguments):
+ if isinstance(arg, OutputArgument):
+ raise CodeGenError("Julia: invalid argument of type %s" %
+ str(type(arg)))
+ if isinstance(arg, (InputArgument, InOutArgument)):
+ args.append("%s" % self._get_symbol(arg.name))
+ args = ", ".join(args)
+ code_list.append("%s(%s)\n" % (routine.name, args))
+ code_list = [ "".join(code_list) ]
+
+ return code_list
+
+ def _declare_arguments(self, routine):
+ return []
+
+ def _declare_globals(self, routine):
+ return []
+
+ def _declare_locals(self, routine):
+ return []
+
+ def _get_routine_ending(self, routine):
+ outs = []
+ for result in routine.results:
+ if isinstance(result, Result):
+ # Note: name not result_var; want `y` not `y[i]` for Indexed
+ s = self._get_symbol(result.name)
+ else:
+ raise CodeGenError("unexpected object in Routine results")
+ outs.append(s)
+ return ["return " + ", ".join(outs) + "\nend\n"]
+
+ def _call_printer(self, routine):
+ declarations = []
+ code_lines = []
+ for i, result in enumerate(routine.results):
+ if isinstance(result, Result):
+ assign_to = result.result_var
+ else:
+ raise CodeGenError("unexpected object in Routine results")
+
+ constants, not_supported, jl_expr = self._printer_method_with_settings(
+ 'doprint', {"human": False}, result.expr, assign_to=assign_to)
+
+ for obj, v in sorted(constants, key=str):
+ declarations.append(
+ "%s = %s\n" % (obj, v))
+ for obj in sorted(not_supported, key=str):
+ if isinstance(obj, Function):
+ name = obj.func
+ else:
+ name = obj
+ declarations.append(
+ "# unsupported: %s\n" % (name))
+ code_lines.append("%s\n" % (jl_expr))
+ return declarations + code_lines
+
+ def _indent_code(self, codelines):
+ # Note that indenting seems to happen twice, first
+ # statement-by-statement by JuliaPrinter then again here.
+ p = JuliaCodePrinter({'human': False})
+ return p.indent_code(codelines)
+
+ def dump_jl(self, routines, f, prefix, header=True, empty=True):
+ self.dump_code(routines, f, prefix, header, empty)
+
+ dump_jl.extension = code_extension  # type: ignore
+ dump_jl.__doc__ = CodeGen.dump_code.__doc__
+
+ # This list of dump functions is used by CodeGen.write to know which dump
+ # functions it has to call.
+ dump_fns = [dump_jl]
+
+
+ class OctaveCodeGen(CodeGen):
+ """Generator for Octave code.
+
+ The .write() method inherited from CodeGen will output a code file
+ <prefix>.m.
+
+ Octave .m files usually contain one function. That function name should
+ match the filename (``prefix``). If you pass multiple ``name_expr`` pairs,
+ the latter ones are presumed to be private functions accessed by the
+ primary function.
+
+ You should only pass inputs to ``argument_sequence``: outputs are ordered
+ according to their order in ``name_expr``.
+
+ """
+
+ code_extension = "m"
+
+ def __init__(self, project='project', printer=None):
+ super().__init__(project)
+ self.printer = printer or OctaveCodePrinter()
+
+ def routine(self, name, expr, argument_sequence, global_vars):
+ """Specialized Routine creation for Octave."""
+
+ # FIXME: this is probably general enough for other high-level
+ # languages; perhaps it's the C/Fortran one that is specialized!
+
+ if is_sequence(expr) and not isinstance(expr, (MatrixBase, MatrixExpr)):
+ if not expr:
+ raise ValueError("No expression given")
+ expressions = Tuple(*expr)
+ else:
+ expressions = Tuple(expr)
+
+ # local variables
+ local_vars = {i.label for i in expressions.atoms(Idx)}
+
+ # global variables
+ global_vars = set() if global_vars is None else set(global_vars)
+
+ # symbols that should be arguments
+ old_symbols = expressions.free_symbols - local_vars - global_vars
+ symbols = set()
+ for s in old_symbols:
+ if isinstance(s, Idx):
+ symbols.update(s.args[1].free_symbols)
+ elif not isinstance(s, Indexed):
+ symbols.add(s)
+
+ # Octave supports multiple return values
+ return_vals = []
+ for (i, expr) in enumerate(expressions):
+ if isinstance(expr, Equality):
+ out_arg = expr.lhs
+ expr = expr.rhs
+ symbol = out_arg
+ if isinstance(out_arg, Indexed):
+ symbol = out_arg.base.label
+ if not isinstance(out_arg, (Indexed, Symbol, MatrixSymbol)):
+ raise CodeGenError("Only Indexed, Symbol, or MatrixSymbol "
+ "can define output arguments.")
+
+ return_vals.append(Result(expr, name=symbol, result_var=out_arg))
+ if not expr.has(symbol):
+ # this is a pure output: remove from the symbols list, so
+ # it doesn't become an input.
+ symbols.remove(symbol)
+
+ else:
+ # we have no name for this output
+ return_vals.append(Result(expr, name='out%d' % (i+1)))
+
+ # setup input argument list
+ arg_list = []
+ array_symbols = {}
+ for array in expressions.atoms(Indexed):
+ array_symbols[array.base.label] = array
+ for array in expressions.atoms(MatrixSymbol):
+ array_symbols[array] = array
+
+ for symbol in sorted(symbols, key=str):
+ arg_list.append(InputArgument(symbol))
+
+ if argument_sequence is not None:
+ # if the user has supplied IndexedBase instances, we'll accept that
+ new_sequence = []
+ for arg in argument_sequence:
+ if isinstance(arg, IndexedBase):
+ new_sequence.append(arg.label)
+ else:
+ new_sequence.append(arg)
+ argument_sequence = new_sequence
+
+ missing = [x for x in arg_list if x.name not in argument_sequence]
+ if missing:
+ msg = "Argument list didn't specify: {0} "
+ msg = msg.format(", ".join([str(m.name) for m in missing]))
+ raise CodeGenArgumentListError(msg, missing)
+
+ # create redundant arguments to produce the requested sequence
+ name_arg_dict = {x.name: x for x in arg_list}
+ new_args = []
+ for symbol in argument_sequence:
+ try:
+ new_args.append(name_arg_dict[symbol])
+ except KeyError:
+ new_args.append(InputArgument(symbol))
+ arg_list = new_args
+
+ return Routine(name, arg_list, return_vals, local_vars, global_vars)
+
+ def _get_header(self):
+ """Writes a common header for the generated files."""
+ code_lines = []
+ tmp = header_comment % {"version": sympy_version,
+ "project": self.project}
+ for line in tmp.splitlines():
+ if line == '':
+ code_lines.append("%\n")
+ else:
+ code_lines.append("%% %s\n" % line)
+ return code_lines
+
+ def _preprocessor_statements(self, prefix):
+ return []
+
+ def _get_routine_opening(self, routine):
+ """Returns the opening statements of the routine."""
+ code_list = []
+ code_list.append("function ")
+
+ # Outputs
+ outs = []
+ for i, result in enumerate(routine.results):
+ if isinstance(result, Result):
+ # Note: name not result_var; want `y` not `y(i)` for Indexed
+ s = self._get_symbol(result.name)
+ else:
+ raise CodeGenError("unexpected object in Routine results")
+ outs.append(s)
+ if len(outs) > 1:
+ code_list.append("[" + (", ".join(outs)) + "]")
+ else:
+ code_list.append("".join(outs))
+ code_list.append(" = ")
+
+ # Inputs
+ args = []
+ for i, arg in enumerate(routine.arguments):
+ if isinstance(arg, (OutputArgument, InOutArgument)):
+ raise CodeGenError("Octave: invalid argument of type %s" %
+ str(type(arg)))
+ if isinstance(arg, InputArgument):
+ args.append("%s" % self._get_symbol(arg.name))
+ args = ", ".join(args)
+ code_list.append("%s(%s)\n" % (routine.name, args))
+ code_list = [ "".join(code_list) ]
+
+ return code_list
+
+ def _declare_arguments(self, routine):
+ return []
+
+ def _declare_globals(self, routine):
+ if not routine.global_vars:
+ return []
+ s = " ".join(sorted([self._get_symbol(g) for g in routine.global_vars]))
+ return ["global " + s + "\n"]
+
+ def _declare_locals(self, routine):
+ return []
+
+ def _get_routine_ending(self, routine):
+ return ["end\n"]
+
+ def _call_printer(self, routine):
+ declarations = []
+ code_lines = []
+ for i, result in enumerate(routine.results):
+ if isinstance(result, Result):
+ assign_to = result.result_var
+ else:
+ raise CodeGenError("unexpected object in Routine results")
+
+ constants, not_supported, oct_expr = self._printer_method_with_settings(
+ 'doprint', {"human": False}, result.expr, assign_to=assign_to)
+
+ for obj, v in sorted(constants, key=str):
+ declarations.append(
+ "  %s = %s;  %% constant\n" % (obj, v))
+ for obj in sorted(not_supported, key=str):
+ if isinstance(obj, Function):
+ name = obj.func
+ else:
+ name = obj
+ declarations.append(
+ "  %% unsupported: %s\n" % (name))
+ code_lines.append("%s\n" % (oct_expr))
+ return declarations + code_lines
+
+ def _indent_code(self, codelines):
+ return self._printer_method_with_settings(
+ 'indent_code', {"human": False}, codelines)
+
+ def dump_m(self, routines, f, prefix, header=True, empty=True, inline=True):
+ # Note: this used to call self.dump_code(), but we need more control
+ # over the header here.
+
+ code_lines = self._preprocessor_statements(prefix)
+
+ for i, routine in enumerate(routines):
+ if i > 0:
+ if empty:
+ code_lines.append("\n")
+ code_lines.extend(self._get_routine_opening(routine))
+ if i == 0:
+ if routine.name != prefix:
+ raise ValueError('Octave function name should match prefix')
+ if header:
+ code_lines.append("%" + prefix.upper() +
+ " Autogenerated by SymPy\n")
+ code_lines.append(''.join(self._get_header()))
+ code_lines.extend(self._declare_arguments(routine))
+ code_lines.extend(self._declare_globals(routine))
+ code_lines.extend(self._declare_locals(routine))
+ if empty:
+ code_lines.append("\n")
+ code_lines.extend(self._call_printer(routine))
+ if empty:
+ code_lines.append("\n")
+ code_lines.extend(self._get_routine_ending(routine))
+
+ code_lines = self._indent_code(''.join(code_lines))
+
+ if code_lines:
+ f.write(code_lines)
+
+ dump_m.extension = code_extension  # type: ignore
+ dump_m.__doc__ = CodeGen.dump_code.__doc__
+
+ # This list of dump functions is used by CodeGen.write to know which dump
+ # functions it has to call.
+ dump_fns = [dump_m]
+
+ class RustCodeGen(CodeGen):
+ """Generator for Rust code.
+
+ The .write() method inherited from CodeGen will output a code file
+ <prefix>.rs
+
+ """
+
+ code_extension = "rs"
+
+ def __init__(self, project="project", printer=None):
+ super().__init__(project=project)
+ self.printer = printer or RustCodePrinter()
+
+ def routine(self, name, expr, argument_sequence, global_vars):
+ """Specialized Routine creation for Rust."""
+
+ if is_sequence(expr) and not isinstance(expr, (MatrixBase, MatrixExpr)):
+ if not expr:
+ raise ValueError("No expression given")
+ expressions = Tuple(*expr)
+ else:
+ expressions = Tuple(expr)
+
+ # local variables
+ local_vars = {i.label for i in expressions.atoms(Idx)}
+
+ # global variables
+ global_vars = set() if global_vars is None else set(global_vars)
+
+ # symbols that should be arguments
+ symbols = expressions.free_symbols - local_vars - global_vars - expressions.atoms(Indexed)
+
+ # Rust supports multiple return values
+ return_vals = []
+ output_args = []
+ for (i, expr) in enumerate(expressions):
+ if isinstance(expr, Equality):
+ out_arg = expr.lhs
+ expr = expr.rhs
+ symbol = out_arg
+ if isinstance(out_arg, Indexed):
+ dims = tuple([ (S.One, dim) for dim in out_arg.shape])
+ symbol = out_arg.base.label
+ output_args.append(InOutArgument(symbol, out_arg, expr, dimensions=dims))
+ if not isinstance(out_arg, (Indexed, Symbol, MatrixSymbol)):
+ raise CodeGenError("Only Indexed, Symbol, or MatrixSymbol "
+ "can define output arguments.")
+
+ return_vals.append(Result(expr, name=symbol, result_var=out_arg))
+ if not expr.has(symbol):
+ # this is a pure output: remove from the symbols list, so
+ # it doesn't become an input.
+ symbols.remove(symbol)
+
+ else:
+ # we have no name for this output
+ return_vals.append(Result(expr, name='out%d' % (i+1)))
+
+ # setup input argument list
+ output_args.sort(key=lambda x: str(x.name))
+ arg_list = list(output_args)
+ array_symbols = {}
+ for array in expressions.atoms(Indexed):
+ array_symbols[array.base.label] = array
+ for array in expressions.atoms(MatrixSymbol):
+ array_symbols[array] = array
+
+ for symbol in sorted(symbols, key=str):
+ arg_list.append(InputArgument(symbol))
+
+ if argument_sequence is not None:
+ # if the user has supplied IndexedBase instances, we'll accept that
+ new_sequence = []
+ for arg in argument_sequence:
+ if isinstance(arg, IndexedBase):
+ new_sequence.append(arg.label)
+ else:
+ new_sequence.append(arg)
+ argument_sequence = new_sequence
+
+ missing = [x for x in arg_list if x.name not in argument_sequence]
+ if missing:
+ msg = "Argument list didn't specify: {0} "
+ msg = msg.format(", ".join([str(m.name) for m in missing]))
+ raise CodeGenArgumentListError(msg, missing)
+
+ # create redundant arguments to produce the requested sequence
+ name_arg_dict = {x.name: x for x in arg_list}
+ new_args = []
+ for symbol in argument_sequence:
+ try:
+ new_args.append(name_arg_dict[symbol])
+ except KeyError:
+ new_args.append(InputArgument(symbol))
+ arg_list = new_args
+
+ return Routine(name, arg_list, return_vals, local_vars, global_vars)
+
+
+ def _get_header(self):
+ """Writes a common header for the generated files."""
+ code_lines = []
+ code_lines.append("/*\n")
+ tmp = header_comment % {"version": sympy_version,
+ "project": self.project}
+ for line in tmp.splitlines():
+ code_lines.append((" *%s" % line.center(76)).rstrip() + "\n")
+ code_lines.append(" */\n")
+ return code_lines
+
+ def get_prototype(self, routine):
+ """Returns a string for the function prototype of the routine.
+
+ If the routine has multiple result objects, a CodeGenError is
+ raised.
+
+ See: https://en.wikipedia.org/wiki/Function_prototype
+
+ """
+ results = [i.get_datatype('Rust') for i in routine.results]
+
+ if len(results) == 1:
+ rstype = " -> " + results[0]
+ elif len(routine.results) > 1:
+ rstype = " -> (" + ", ".join(results) + ")"
+ else:
+ rstype = ""
+
+ type_args = []
+ for arg in routine.arguments:
+ name = self.printer.doprint(arg.name)
+ if arg.dimensions or isinstance(arg, ResultBase):
+ type_args.append(("*%s" % name, arg.get_datatype('Rust')))
+ else:
+ type_args.append((name, arg.get_datatype('Rust')))
+ arguments = ", ".join([ "%s: %s" % t for t in type_args])
+ return "fn %s(%s)%s" % (routine.name, arguments, rstype)
+
+ def _preprocessor_statements(self, prefix):
+ code_lines = []
+ # code_lines.append("use std::f64::consts::*;\n")
+ return code_lines
+
+ def _get_routine_opening(self, routine):
+ prototype = self.get_prototype(routine)
+ return ["%s {\n" % prototype]
+
+ def _declare_arguments(self, routine):
+ # arguments are declared in prototype
+ return []
+
+ def _declare_globals(self, routine):
+ # global variables are not explicitly declared within Rust functions
+ return []
+
+ def _declare_locals(self, routine):
+ # loop variables are declared in loop statement
+ return []
+
+ def _call_printer(self, routine):
+
+ code_lines = []
+ declarations = []
+ returns = []
+
+ # Compose a list of symbols to be dereferenced in the function
+ # body. These are the arguments that were passed by a reference
+ # pointer, excluding arrays.
+ dereference = []
+ for arg in routine.arguments:
+ if isinstance(arg, ResultBase) and not arg.dimensions:
+ dereference.append(arg.name)
+
+ for i, result in enumerate(routine.results):
+ if isinstance(result, Result):
+ assign_to = result.result_var
+ returns.append(str(result.result_var))
+ else:
+ raise CodeGenError("unexpected object in Routine results")
+
+ constants, not_supported, rs_expr = self._printer_method_with_settings(
+ 'doprint', {"human": False}, result.expr, assign_to=assign_to)
+
+ for name, value in sorted(constants, key=str):
+ declarations.append("const %s: f64 = %s;\n" % (name, value))
+
+ for obj in sorted(not_supported, key=str):
+ if isinstance(obj, Function):
+ name = obj.func
+ else:
+ name = obj
+ declarations.append("// unsupported: %s\n" % (name))
+
+ code_lines.append("let %s\n" % rs_expr)
+
+ if len(returns) > 1:
+ returns = ['(' + ', '.join(returns) + ')']
+
+ returns.append('\n')
+
+ return declarations + code_lines + returns
+
+ def _get_routine_ending(self, routine):
+ return ["}\n"]
+
+ def dump_rs(self, routines, f, prefix, header=True, empty=True):
+ self.dump_code(routines, f, prefix, header, empty)
+
+ dump_rs.extension = code_extension  # type: ignore
+ dump_rs.__doc__ = CodeGen.dump_code.__doc__
+
+ # This list of dump functions is used by CodeGen.write to know which dump
+ # functions it has to call.
+ dump_fns = [dump_rs]
+
+
+ def get_code_generator(language, project=None, standard=None, printer=None):
+ if language == 'C':
+ if standard is None:
+ pass
+ elif standard.lower() == 'c89':
+ language = 'C89'
+ elif standard.lower() == 'c99':
+ language = 'C99'
+ CodeGenClass = {"C": CCodeGen, "C89": C89CodeGen, "C99": C99CodeGen,
+ "F95": FCodeGen, "JULIA": JuliaCodeGen,
+ "OCTAVE": OctaveCodeGen,
+ "RUST": RustCodeGen}.get(language.upper())
+ if CodeGenClass is None:
+ raise ValueError("Language '%s' is not supported." % language)
+ return CodeGenClass(project, printer)
+
+
+ #
+ # Friendly functions
+ #
+
+
+ def codegen(name_expr, language=None, prefix=None, project="project",
+ to_files=False, header=True, empty=True, argument_sequence=None,
+ global_vars=None, standard=None, code_gen=None, printer=None):
+ """Generate source code for expressions in a given language.
+
+ Parameters
+ ==========
+
+ name_expr : tuple, or list of tuples
+ A single (name, expression) tuple or a list of (name, expression)
+ tuples. Each tuple corresponds to a routine. If the expression is
+ an equality (an instance of class Equality) the left hand side is
+ considered an output argument. If expression is an iterable, then
+ the routine will have multiple outputs.
+
+ language : string
+ A string that indicates the source code language. This is case
+ insensitive. Currently, 'C', 'F95' and 'Octave' are supported.
+ 'Octave' generates code compatible with both Octave and Matlab.
+
+ prefix : string, optional
+ A prefix for the names of the files that contain the source code.
+ Language-dependent suffixes will be appended. If omitted, the name
+ of the first name_expr tuple is used.
+
+ project : string, optional
+ A project name, used for making unique preprocessor instructions.
+ [default: "project"]
+
+ to_files : bool, optional
+ When True, the code will be written to one or more files with the
+ given prefix, otherwise strings with the names and contents of
+ these files are returned. [default: False]
+
+ header : bool, optional
+ When True, a header is written on top of each source file.
+ [default: True]
+
+ empty : bool, optional
+ When True, empty lines are used to structure the code.
+ [default: True]
+
+ argument_sequence : iterable, optional
+ Sequence of arguments for the routine in a preferred order. A
+ CodeGenError is raised if required arguments are missing.
+ Redundant arguments are used without warning. If omitted,
+ arguments will be ordered alphabetically, but with all input
+ arguments first, and then output or in-out arguments.
+
+ global_vars : iterable, optional
+ Sequence of global variables used by the routine. Variables
+ listed here will not show up as function arguments.
+
+ standard : string, optional
+ Only used when ``language`` is 'C': selects the C standard,
+ either 'c89' or 'c99' (case insensitive).
+
+ code_gen : CodeGen instance
+ An instance of a CodeGen subclass. Overrides ``language``.
+
+ Examples
+ ========
+
+ >>> from sympy.utilities.codegen import codegen
+ >>> from sympy.abc import x, y, z
+ >>> [(c_name, c_code), (h_name, c_header)] = codegen(
+ ... ("f", x+y*z), "C89", "test", header=False, empty=False)
+ >>> print(c_name)
+ test.c
+ >>> print(c_code)
+ #include "test.h"
+ #include <math.h>
+ double f(double x, double y, double z) {
+ double f_result;
+ f_result = x + y*z;
+ return f_result;
+ }
+ <BLANKLINE>
+ >>> print(h_name)
+ test.h
+ >>> print(c_header)
+ #ifndef PROJECT__TEST__H
+ #define PROJECT__TEST__H
+ double f(double x, double y, double z);
+ #endif
+ <BLANKLINE>
+
+ Another example using Equality objects to give named outputs. Here the
+ filename (prefix) is taken from the first (name, expr) pair.
+
+ >>> from sympy.abc import f, g
+ >>> from sympy import Eq
+ >>> [(c_name, c_code), (h_name, c_header)] = codegen(
+ ... [("myfcn", x + y), ("fcn2", [Eq(f, 2*x), Eq(g, y)])],
+ ... "C99", header=False, empty=False)
+ >>> print(c_name)
+ myfcn.c
+ >>> print(c_code)
+ #include "myfcn.h"
+ #include <math.h>
+ double myfcn(double x, double y) {
+ double myfcn_result;
+ myfcn_result = x + y;
+ return myfcn_result;
+ }
+ void fcn2(double x, double y, double *f, double *g) {
+ (*f) = 2*x;
+ (*g) = y;
+ }
+ <BLANKLINE>
+
+ If the generated function(s) will be part of a larger project where various
+ global variables have been defined, the 'global_vars' option can be used
+ to remove the specified variables from the function signature
+
+ >>> from sympy.utilities.codegen import codegen
+ >>> from sympy.abc import x, y, z
+ >>> [(f_name, f_code), header] = codegen(
+ ... ("f", x+y*z), "F95", header=False, empty=False,
+ ... argument_sequence=(x, y), global_vars=(z,))
+ >>> print(f_code)
+ REAL*8 function f(x, y)
+ implicit none
+ REAL*8, intent(in) :: x
+ REAL*8, intent(in) :: y
+ f = x + y*z
+ end function
+ <BLANKLINE>
+
+ """
+
2121
+ # Initialize the code generator.
2122
+ if language is None:
2123
+ if code_gen is None:
2124
+ raise ValueError("Need either language or code_gen")
2125
+ else:
2126
+ if code_gen is not None:
2127
+ raise ValueError("You cannot specify both language and code_gen.")
2128
+ code_gen = get_code_generator(language, project, standard, printer)
2129
+
2130
+ if isinstance(name_expr[0], str):
2131
+ # single tuple is given, turn it into a singleton list with a tuple.
2132
+ name_expr = [name_expr]
2133
+
2134
+ if prefix is None:
2135
+ prefix = name_expr[0][0]
2136
+
2137
+ # Construct Routines appropriate for this code_gen from (name, expr) pairs.
2138
+ routines = []
2139
+ for name, expr in name_expr:
2140
+ routines.append(code_gen.routine(name, expr, argument_sequence,
2141
+ global_vars))
2142
+
2143
+ # Write the code.
2144
+ return code_gen.write(routines, prefix, to_files, header, empty)
2145
+
2146
+
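The initialization logic above (mutual exclusion of ``language`` and ``code_gen``, and the default prefix taken from the first routine name) can be exercised directly. A small usage sketch, assuming a standard SymPy installation:

```python
from sympy.utilities.codegen import codegen, get_code_generator
from sympy.abc import x

# A CodeGen instance already fixes the target language.
cg = get_code_generator("C99", "project")

# code_gen alone works; prefix defaults to the first routine name ("f").
results = codegen(("f", x + 1), code_gen=cg, header=False, empty=False)
print(results[0][0])  # f.c

# Supplying both language and code_gen is rejected, as is supplying neither.
try:
    codegen(("f", x + 1), language="C99", code_gen=cg)
except ValueError as e:
    print(e)
```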
2147
+ def make_routine(name, expr, argument_sequence=None,
2148
+ global_vars=None, language="F95"):
2149
+ """A factory that makes an appropriate Routine from an expression.
2150
+
2151
+ Parameters
2152
+ ==========
2153
+
2154
+ name : string
2155
+ The name of this routine in the generated code.
2156
+
2157
+ expr : expression or list/tuple of expressions
2158
+ A SymPy expression that the Routine instance will represent. If
2159
+ given a list or tuple of expressions, the routine will be
2160
+ considered to have multiple return values and/or output arguments.
2161
+
2162
+ argument_sequence : list or tuple, optional
2163
+ List arguments for the routine in a preferred order. If omitted,
2164
+ the results are language dependent, for example, alphabetical order
2165
+ or in the same order as the given expressions.
2166
+
2167
+ global_vars : iterable, optional
2168
+ Sequence of global variables used by the routine. Variables
2169
+ listed here will not show up as function arguments.
2170
+
2171
+ language : string, optional
2172
+ Specify a target language. The Routine itself should be
2173
+ language-agnostic but the precise way one is created, error
2174
+ checking, etc depend on the language. [default: "F95"].
2175
+
2176
+ Notes
2177
+ =====
2178
+
2179
+ A decision about whether to use output arguments or return values is made
2180
+ depending on both the language and the particular mathematical expressions.
2181
+ For an expression of type Equality, the left hand side is typically made
2182
+ into an OutputArgument (or perhaps an InOutArgument if appropriate).
2183
+ Otherwise, typically, the calculated expression is made a return value of
2184
+ the routine.
2185
+
2186
+ Examples
2187
+ ========
2188
+
2189
+ >>> from sympy.utilities.codegen import make_routine
2190
+ >>> from sympy.abc import x, y, f, g
2191
+ >>> from sympy import Eq
2192
+ >>> r = make_routine('test', [Eq(f, 2*x), Eq(g, x + y)])
2193
+ >>> [arg.result_var for arg in r.results]
2194
+ []
2195
+ >>> [arg.name for arg in r.arguments]
2196
+ [x, y, f, g]
2197
+ >>> [arg.name for arg in r.result_variables]
2198
+ [f, g]
2199
+ >>> r.local_vars
2200
+ set()
2201
+
2202
+ Another more complicated example with a mixture of specified and
2203
+ automatically-assigned names. Also has Matrix output.
2204
+
2205
+ >>> from sympy import Matrix
2206
+ >>> r = make_routine('fcn', [x*y, Eq(f, 1), Eq(g, x + g), Matrix([[x, 2]])])
2207
+ >>> [arg.result_var for arg in r.results] # doctest: +SKIP
2208
+ [result_5397460570204848505]
2209
+ >>> [arg.expr for arg in r.results]
2210
+ [x*y]
2211
+ >>> [arg.name for arg in r.arguments] # doctest: +SKIP
2212
+ [x, y, f, g, out_8598435338387848786]
2213
+
2214
+ We can examine the various arguments more closely:
2215
+
2216
+ >>> from sympy.utilities.codegen import (InputArgument, OutputArgument,
2217
+ ... InOutArgument)
2218
+ >>> [a.name for a in r.arguments if isinstance(a, InputArgument)]
2219
+ [x, y]
2220
+
2221
+ >>> [a.name for a in r.arguments if isinstance(a, OutputArgument)] # doctest: +SKIP
2222
+ [f, out_8598435338387848786]
2223
+ >>> [a.expr for a in r.arguments if isinstance(a, OutputArgument)]
2224
+ [1, Matrix([[x, 2]])]
2225
+
2226
+ >>> [a.name for a in r.arguments if isinstance(a, InOutArgument)]
2227
+ [g]
2228
+ >>> [a.expr for a in r.arguments if isinstance(a, InOutArgument)]
2229
+ [g + x]
2230
+
2231
+ """
2232
+
2233
+ # initialize a new code generator
2234
+ code_gen = get_code_generator(language)
2235
+
2236
+ return code_gen.routine(name, expr, argument_sequence, global_vars)
venv/lib/python3.10/site-packages/sympy/utilities/decorator.py ADDED
@@ -0,0 +1,330 @@
1
+ """Useful utility decorators. """
2
+
3
+ import sys
4
+ import types
5
+ import inspect
6
+ from functools import wraps, update_wrapper
7
+
8
+ from sympy.utilities.exceptions import sympy_deprecation_warning
9
+
10
+ def threaded_factory(func, use_add):
11
+ """A factory for ``threaded`` decorators. """
12
+ from sympy.core import sympify
13
+ from sympy.matrices import MatrixBase
14
+ from sympy.utilities.iterables import iterable
15
+
16
+ @wraps(func)
17
+ def threaded_func(expr, *args, **kwargs):
18
+ if isinstance(expr, MatrixBase):
19
+ return expr.applyfunc(lambda f: func(f, *args, **kwargs))
20
+ elif iterable(expr):
21
+ try:
22
+ return expr.__class__([func(f, *args, **kwargs) for f in expr])
23
+ except TypeError:
24
+ return expr
25
+ else:
26
+ expr = sympify(expr)
27
+
28
+ if use_add and expr.is_Add:
29
+ return expr.__class__(*[ func(f, *args, **kwargs) for f in expr.args ])
30
+ elif expr.is_Relational:
31
+ return expr.__class__(func(expr.lhs, *args, **kwargs),
32
+ func(expr.rhs, *args, **kwargs))
33
+ else:
34
+ return func(expr, *args, **kwargs)
35
+
36
+ return threaded_func
37
+
38
+
39
+ def threaded(func):
40
+ """Apply ``func`` to sub--elements of an object, including :class:`~.Add`.
41
+
42
+ This decorator is intended to make it uniformly possible to apply a
43
+ function to all elements of composite objects, e.g. matrices, lists, tuples
44
+ and other iterable containers, or just expressions.
45
+
46
+ This version of :func:`threaded` decorator allows threading over
47
+ elements of :class:`~.Add` class. If this behavior is not desirable
48
+ use :func:`xthreaded` decorator.
49
+
50
+ Functions using this decorator must have the following signature::
51
+
52
+ @threaded
53
+ def function(expr, *args, **kwargs):
54
+
55
+ """
56
+ return threaded_factory(func, True)
57
+
58
+
59
+ def xthreaded(func):
60
+ """Apply ``func`` to sub--elements of an object, excluding :class:`~.Add`.
61
+
62
+ This decorator is intended to make it uniformly possible to apply a
63
+ function to all elements of composite objects, e.g. matrices, lists, tuples
64
+ and other iterable containers, or just expressions.
65
+
66
+ This version of :func:`threaded` decorator disallows threading over
67
+ elements of :class:`~.Add` class. If this behavior is not desirable
68
+ use :func:`threaded` decorator.
69
+
70
+ Functions using this decorator must have the following signature::
71
+
72
+ @xthreaded
73
+ def function(expr, *args, **kwargs):
74
+
75
+ """
76
+ return threaded_factory(func, False)
77
+
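The difference between the two decorators shows up on sums: a short sketch, assuming SymPy is installed (``add_one`` is a made-up example function):

```python
from sympy.utilities.decorator import threaded, xthreaded
from sympy.abc import x, y

@threaded
def add_one(expr):
    return expr + 1

@xthreaded
def add_one_whole(expr):
    return expr + 1

# threaded distributes over the terms of an Add and over containers...
r1 = add_one(x + y)        # (x + 1) + (y + 1) == x + y + 2
r2 = add_one([x, y])       # [x + 1, y + 1]
# ...while xthreaded treats an Add as a single expression.
r3 = add_one_whole(x + y)  # x + y + 1
```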
78
+
79
+ def conserve_mpmath_dps(func):
80
+ """After the function finishes, resets the value of mpmath.mp.dps to
81
+ the value it had before the function was run."""
82
+ import mpmath
83
+
84
+ def func_wrapper(*args, **kwargs):
85
+ dps = mpmath.mp.dps
86
+ try:
87
+ return func(*args, **kwargs)
88
+ finally:
89
+ mpmath.mp.dps = dps
90
+
91
+ func_wrapper = update_wrapper(func_wrapper, func)
92
+ return func_wrapper
93
+
94
+
95
+ class no_attrs_in_subclass:
96
+ """Don't 'inherit' certain attributes from a base class
97
+
98
+ >>> from sympy.utilities.decorator import no_attrs_in_subclass
99
+
100
+ >>> class A(object):
101
+ ... x = 'test'
102
+
103
+ >>> A.x = no_attrs_in_subclass(A, A.x)
104
+
105
+ >>> class B(A):
106
+ ... pass
107
+
108
+ >>> hasattr(A, 'x')
109
+ True
110
+ >>> hasattr(B, 'x')
111
+ False
112
+
113
+ """
114
+ def __init__(self, cls, f):
115
+ self.cls = cls
116
+ self.f = f
117
+
118
+ def __get__(self, instance, owner=None):
119
+ if owner == self.cls:
120
+ if hasattr(self.f, '__get__'):
121
+ return self.f.__get__(instance, owner)
122
+ return self.f
123
+ raise AttributeError
124
+
125
+
126
+ def doctest_depends_on(exe=None, modules=None, disable_viewers=None, python_version=None):
127
+ """
128
+ Adds metadata about the dependencies which need to be met for doctesting
129
+ the docstrings of the decorated objects.
130
+
131
+ exe should be a list of executables
132
+
133
+ modules should be a list of modules
134
+
135
+ disable_viewers should be a list of viewers for preview() to disable
136
+
137
+ python_version should be the minimum Python version required, as a tuple
138
+ (like (3, 0))
139
+ """
140
+ dependencies = {}
141
+ if exe is not None:
142
+ dependencies['executables'] = exe
143
+ if modules is not None:
144
+ dependencies['modules'] = modules
145
+ if disable_viewers is not None:
146
+ dependencies['disable_viewers'] = disable_viewers
147
+ if python_version is not None:
148
+ dependencies['python_version'] = python_version
149
+
150
+ def skiptests():
151
+ from sympy.testing.runtests import DependencyError, SymPyDocTests, PyTestReporter # lazy import
152
+ r = PyTestReporter()
153
+ t = SymPyDocTests(r, None)
154
+ try:
155
+ t._check_dependencies(**dependencies)
156
+ except DependencyError:
157
+ return True # Skip doctests
158
+ else:
159
+ return False # Run doctests
160
+
161
+ def depends_on_deco(fn):
162
+ fn._doctest_depends_on = dependencies
163
+ fn.__doctest_skip__ = skiptests
164
+
165
+ if inspect.isclass(fn):
166
+ fn._doctest_depends_on = no_attrs_in_subclass(
167
+ fn, fn._doctest_depends_on)
168
+ fn.__doctest_skip__ = no_attrs_in_subclass(
169
+ fn, fn.__doctest_skip__)
170
+ return fn
171
+
172
+ return depends_on_deco
173
+
174
+
175
+ def public(obj):
176
+ """
177
+ Append ``obj``'s name to global ``__all__`` variable (call site).
178
+
179
+ By using this decorator on functions or classes you achieve the same goal
+ as by filling the ``__all__`` variable manually; you just do not have to
+ repeat yourself (the object's name). You also know whether an object is
+ public at its definition site, not at some other location (wherever
+ ``__all__`` is set).
183
+
184
+ Note that in a multiple-decorator setup ``@public`` must (in almost all
+ cases) be applied before any other decorators, because it relies on the
+ pointer to the object's global namespace. If you apply other decorators
+ first, ``@public`` may end up modifying the wrong namespace.
188
+
189
+ Examples
190
+ ========
191
+
192
+ >>> from sympy.utilities.decorator import public
193
+
194
+ >>> __all__ # noqa: F821
195
+ Traceback (most recent call last):
196
+ ...
197
+ NameError: name '__all__' is not defined
198
+
199
+ >>> @public
200
+ ... def some_function():
201
+ ... pass
202
+
203
+ >>> __all__ # noqa: F821
204
+ ['some_function']
205
+
206
+ """
207
+ if isinstance(obj, types.FunctionType):
208
+ ns = obj.__globals__
209
+ name = obj.__name__
210
+ elif isinstance(obj, (type(type), type)):
211
+ ns = sys.modules[obj.__module__].__dict__
212
+ name = obj.__name__
213
+ else:
214
+ raise TypeError("expected a function or a class, got %s" % obj)
215
+
216
+ if "__all__" not in ns:
217
+ ns["__all__"] = [name]
218
+ else:
219
+ ns["__all__"].append(name)
220
+
221
+ return obj
222
+
223
+
224
+ def memoize_property(propfunc):
225
+ """Property decorator that caches the value of potentially expensive
226
+ `propfunc` after the first evaluation. The cached value is stored in
227
+ the corresponding property name with an attached underscore."""
228
+ attrname = '_' + propfunc.__name__
229
+ sentinel = object()
230
+
231
+ @wraps(propfunc)
232
+ def accessor(self):
233
+ val = getattr(self, attrname, sentinel)
234
+ if val is sentinel:
235
+ val = propfunc(self)
236
+ setattr(self, attrname, val)
237
+ return val
238
+
239
+ return property(accessor)
240
+
241
+
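A brief usage sketch of this decorator, assuming SymPy is installed (``Circle`` is a made-up class for illustration):

```python
from sympy.utilities.decorator import memoize_property

class Circle:
    def __init__(self, r):
        self.r = r
        self.calls = 0

    @memoize_property
    def area(self):
        self.calls += 1                  # count evaluations of the body
        return 3.14159 * self.r ** 2

c = Circle(2)
first, second = c.area, c.area           # second access hits the cache
# The cached value lives under the underscored name, here ``c._area``.
```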
242
+ def deprecated(message, *, deprecated_since_version,
243
+ active_deprecations_target, stacklevel=3):
244
+ '''
245
+ Mark a function as deprecated.
246
+
247
+ This decorator should be used if an entire function or class is
248
+ deprecated. If only a certain functionality is deprecated, you should use
249
+ :func:`~.warns_deprecated_sympy` directly. This decorator is just a
250
+ convenience. There is no functional difference between using this
251
+ decorator and calling ``warns_deprecated_sympy()`` at the top of the
252
+ function.
253
+
254
+ The decorator takes the same arguments as
255
+ :func:`~.warns_deprecated_sympy`. See its
256
+ documentation for details on what the keywords to this decorator do.
257
+
258
+ See the :ref:`deprecation-policy` document for details on when and how
259
+ things should be deprecated in SymPy.
260
+
261
+ Examples
262
+ ========
263
+
264
+ >>> from sympy.utilities.decorator import deprecated
265
+ >>> from sympy import simplify
266
+ >>> @deprecated("""\
267
+ ... The simplify_this(expr) function is deprecated. Use simplify(expr)
268
+ ... instead.""", deprecated_since_version="1.1",
269
+ ... active_deprecations_target='simplify-this-deprecation')
270
+ ... def simplify_this(expr):
271
+ ... """
272
+ ... Simplify ``expr``.
273
+ ...
274
+ ... .. deprecated:: 1.1
275
+ ...
276
+ ... The ``simplify_this`` function is deprecated. Use :func:`simplify`
277
+ ... instead. See its documentation for more information. See
278
+ ... :ref:`simplify-this-deprecation` for details.
279
+ ...
280
+ ... """
281
+ ... return simplify(expr)
282
+ >>> from sympy.abc import x
283
+ >>> simplify_this(x*(x + 1) - x**2) # doctest: +SKIP
284
+ <stdin>:1: SymPyDeprecationWarning:
285
+ <BLANKLINE>
286
+ The simplify_this(expr) function is deprecated. Use simplify(expr)
287
+ instead.
288
+ <BLANKLINE>
289
+ See https://docs.sympy.org/latest/explanation/active-deprecations.html#simplify-this-deprecation
290
+ for details.
291
+ <BLANKLINE>
292
+ This has been deprecated since SymPy version 1.1. It
293
+ will be removed in a future version of SymPy.
294
+ <BLANKLINE>
295
+ simplify_this(x)
296
+ x
297
+
298
+ See Also
299
+ ========
300
+ sympy.utilities.exceptions.SymPyDeprecationWarning
301
+ sympy.utilities.exceptions.sympy_deprecation_warning
302
+ sympy.utilities.exceptions.ignore_warnings
303
+ sympy.testing.pytest.warns_deprecated_sympy
304
+
305
+ '''
306
+ decorator_kwargs = {"deprecated_since_version": deprecated_since_version,
307
+ "active_deprecations_target": active_deprecations_target}
308
+ def deprecated_decorator(wrapped):
309
+ if hasattr(wrapped, '__mro__'): # wrapped is actually a class
310
+ class wrapper(wrapped):
311
+ __doc__ = wrapped.__doc__
312
+ __module__ = wrapped.__module__
313
+ _sympy_deprecated_func = wrapped
314
+ if '__new__' in wrapped.__dict__:
315
+ def __new__(cls, *args, **kwargs):
316
+ sympy_deprecation_warning(message, **decorator_kwargs, stacklevel=stacklevel)
317
+ return super().__new__(cls, *args, **kwargs)
318
+ else:
319
+ def __init__(self, *args, **kwargs):
320
+ sympy_deprecation_warning(message, **decorator_kwargs, stacklevel=stacklevel)
321
+ super().__init__(*args, **kwargs)
322
+ wrapper.__name__ = wrapped.__name__
323
+ else:
324
+ @wraps(wrapped)
325
+ def wrapper(*args, **kwargs):
326
+ sympy_deprecation_warning(message, **decorator_kwargs, stacklevel=stacklevel)
327
+ return wrapped(*args, **kwargs)
328
+ wrapper._sympy_deprecated_func = wrapped
329
+ return wrapper
330
+ return deprecated_decorator
venv/lib/python3.10/site-packages/sympy/utilities/enumerative.py ADDED
@@ -0,0 +1,1157 @@
1
+ """
2
+ Algorithms and classes to support enumerative combinatorics.
3
+
4
+ Currently just multiset partitions, but more could be added.
5
+
6
+ Terminology (following Knuth, algorithm 7.1.2.5M TAOCP)
7
+ *multiset* aaabbcccc has a *partition* aaabc | bccc
8
+
9
+ The submultisets, aaabc and bccc of the partition are called
10
+ *parts*, or sometimes *vectors*. (Knuth notes that multiset
11
+ partitions can be thought of as partitions of vectors of integers,
12
+ where the ith element of the vector gives the multiplicity of
13
+ element i.)
14
+
15
+ The values a, b and c are *components* of the multiset. These
16
+ correspond to elements of a set, but in a multiset can be present
17
+ with a multiplicity greater than 1.
18
+
19
+ The algorithm deserves some explanation.
20
+
21
+ Think of the part aaabc from the multiset above. If we impose an
22
+ ordering on the components of the multiset, we can represent a part
23
+ with a vector, in which the value of the first element of the vector
24
+ corresponds to the multiplicity of the first component in that
25
+ part. Thus, aaabc can be represented by the vector [3, 1, 1]. We
26
+ can also define an ordering on parts, based on the lexicographic
27
+ ordering of the vector (leftmost vector element, i.e., the element
28
+ with the smallest component number, is the most significant), so
29
+ that [3, 1, 1] > [3, 1, 0] and [3, 1, 1] > [2, 1, 4]. The ordering
30
+ on parts can be extended to an ordering on partitions: First, sort
31
+ the parts in each partition, left-to-right in decreasing order. Then
32
+ partition A is greater than partition B if A's leftmost/greatest
33
+ part is greater than B's leftmost part. If the leftmost parts are
34
+ equal, compare the second parts, and so on.
35
+
36
+ In this ordering, the greatest partition of a given multiset has only
37
+ one part. The least partition is the one in which the components
38
+ are spread out, one per part.
39
+
40
+ The enumeration algorithms in this file yield the partitions of the
41
+ argument multiset in decreasing order. The main data structure is a
42
+ stack of parts, corresponding to the current partition. An
43
+ important invariant is that the parts on the stack are themselves in
44
+ decreasing order. This data structure is decremented to find the
45
+ next smaller partition. Most often, decrementing the partition will
46
+ only involve adjustments to the smallest parts at the top of the
47
+ stack, much as adjacent integers *usually* differ only in their last
48
+ few digits.
49
+
50
+ Knuth's algorithm uses two main operations on parts:
51
+
52
+ Decrement - change the part so that it is smaller in the
53
+ (vector) lexicographic order, but reduced by the smallest amount possible.
54
+ For example, if the multiset has vector [5,
55
+ 3, 1], and the bottom/greatest part is [4, 2, 1], this part would
56
+ decrement to [4, 2, 0], while [4, 0, 0] would decrement to [3, 3,
57
+ 1]. A singleton part is never decremented -- [1, 0, 0] is not
58
+ decremented to [0, 3, 1]. Instead, the decrement operator needs
59
+ to fail for this case. In Knuth's pseudocode, the decrement
60
+ operator is step m5.
61
+
62
+ Spread unallocated multiplicity - Once a part has been decremented,
63
+ it cannot be the rightmost part in the partition. There is some
64
+ multiplicity that has not been allocated, and new parts must be
65
+ created above it in the stack to use up this multiplicity. To
66
+ maintain the invariant that the parts on the stack are in
67
+ decreasing order, these new parts must be less than or equal to
68
+ the decremented part.
69
+ For example, if the multiset is [5, 3, 1], and its most
70
+ significant part has just been decremented to [5, 3, 0], the
71
+ spread operation will add a new part so that the stack becomes
72
+ [[5, 3, 0], [0, 0, 1]]. If the most significant part (for the
73
+ same multiset) has been decremented to [2, 0, 0] the stack becomes
74
+ [[2, 0, 0], [2, 0, 0], [1, 3, 1]]. In the pseudocode, the spread
75
+ operation for one part is step m2. The complete spread operation
76
+ is a loop of steps m2 and m3.
77
+
78
+ In order to facilitate the spread operation, Knuth stores, for each
79
+ component of each part, not just the multiplicity of that component
80
+ in the part, but also the total multiplicity available for this
81
+ component in this part or any lesser part above it on the stack.
82
+
83
+ One added twist is that Knuth does not represent the part vectors as
84
+ arrays. Instead, he uses a sparse representation, in which a
85
+ component of a part is represented as a component number (c), plus
86
+ the multiplicity of the component in that part (v) as well as the
87
+ total multiplicity available for that component (u). This saves
88
+ time that would be spent skipping over zeros.
89
+
90
+ """
91
+
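The part and partition orderings described above are ordinary lexicographic comparisons, so they can be checked directly with plain Python tuples (a small illustration, not part of the module):

```python
# Parts as multiplicity vectors; tuple comparison is exactly the
# lexicographic order described above (leftmost component most significant).
assert (3, 1, 1) > (3, 1, 0)
assert (3, 1, 1) > (2, 1, 4)

# For the multiset aaabbcccc (vector [3, 2, 4]) the greatest partition has
# a single part, while the least spreads the components one per part.
greatest = [(3, 2, 4)]
least = sorted([(1, 0, 0)] * 3 + [(0, 1, 0)] * 2 + [(0, 0, 1)] * 4,
               reverse=True)            # parts kept in decreasing order
assert greatest > least                 # part-by-part, lexicographic
```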
92
+ class PartComponent:
93
+ """Internal class used in support of the multiset partitions
94
+ enumerators and the associated visitor functions.
95
+
96
+ Represents one component of one part of the current partition.
97
+
98
+ A stack of these, plus an auxiliary frame array, f, represents a
99
+ partition of the multiset.
100
+
101
+ Knuth's pseudocode makes c, u, and v separate arrays.
102
+ """
103
+
104
+ __slots__ = ('c', 'u', 'v')
105
+
106
+ def __init__(self):
107
+ self.c = 0 # Component number
108
+ self.u = 0 # The as yet unpartitioned amount in component c
109
+ # *before* it is allocated by this triple
110
+ self.v = 0 # Amount of c component in the current part
111
+ # (v<=u). An invariant of the representation is
112
+ # that the next higher triple for this component
113
+ # (if there is one) will have a value of u-v in
114
+ # its u attribute.
115
+
116
+ def __repr__(self):
117
+ "for debug/algorithm animation purposes"
118
+ return 'c:%d u:%d v:%d' % (self.c, self.u, self.v)
119
+
120
+ def __eq__(self, other):
121
+ """Define value oriented equality, which is useful for testers"""
122
+ return (isinstance(other, self.__class__) and
123
+ self.c == other.c and
124
+ self.u == other.u and
125
+ self.v == other.v)
126
+
127
+ def __ne__(self, other):
128
+ """Defined for consistency with __eq__"""
129
+ return not self == other
130
+
131
+
132
+ # This function tries to be a faithful implementation of algorithm
133
+ # 7.1.2.5M in Volume 4A, Combinatorial Algorithms, Part 1, of The Art
134
+ # of Computer Programming, by Donald Knuth. This includes using
135
+ # (mostly) the same variable names, etc. This makes for rather
136
+ # low-level Python.
137
+
138
+ # Changes from Knuth's pseudocode include
139
+ # - use PartComponent struct/object instead of 3 arrays
140
+ # - make the function a generator
141
+ # - map (with some difficulty) the GOTOs to Python control structures.
142
+ # - Knuth uses 1-based numbering for components, this code is 0-based
143
+ # - renamed variable l to lpart.
144
+ # - flag variable x takes on values True/False instead of 1/0
145
+ #
146
+ def multiset_partitions_taocp(multiplicities):
147
+ """Enumerates partitions of a multiset.
148
+
149
+ Parameters
150
+ ==========
151
+
152
+ multiplicities
153
+ list of integer multiplicities of the components of the multiset.
154
+
155
+ Yields
156
+ ======
157
+
158
+ state
159
+ Internal data structure which encodes a particular partition.
160
+ This output is then usually processed by a visitor function
161
+ which combines the information from this data structure with
162
+ the components themselves to produce an actual partition.
163
+
164
+ Unless they wish to create their own visitor function, users will
165
+ have little need to look inside this data structure. But, for
166
+ reference, it is a 3-element list with components:
167
+
168
+ f
169
+ is a frame array, which is used to divide pstack into parts.
170
+
171
+ lpart
172
+ points to the base of the topmost part.
173
+
174
+ pstack
175
+ is an array of PartComponent objects.
176
+
177
+ The ``state`` output offers a peek into the internal data
178
+ structures of the enumeration function. The client should
179
+ treat this as read-only; any modification of the data
180
+ structure will cause unpredictable (and almost certainly
181
+ incorrect) results. Also, the components of ``state`` are
182
+ modified in place at each iteration. Hence, the visitor must
183
+ be called at each loop iteration. Accumulating the ``state``
184
+ instances and processing them later will not work.
185
+
186
+ Examples
187
+ ========
188
+
189
+ >>> from sympy.utilities.enumerative import list_visitor
190
+ >>> from sympy.utilities.enumerative import multiset_partitions_taocp
191
+ >>> # variables components and multiplicities represent the multiset 'abb'
192
+ >>> components = 'ab'
193
+ >>> multiplicities = [1, 2]
194
+ >>> states = multiset_partitions_taocp(multiplicities)
195
+ >>> list(list_visitor(state, components) for state in states)
196
+ [[['a', 'b', 'b']],
197
+ [['a', 'b'], ['b']],
198
+ [['a'], ['b', 'b']],
199
+ [['a'], ['b'], ['b']]]
200
+
201
+ See Also
202
+ ========
203
+
204
+ sympy.utilities.iterables.multiset_partitions: Takes a multiset
205
+ as input and directly yields multiset partitions. It
206
+ dispatches to a number of functions, including this one, for
207
+ implementation. Most users will find it more convenient to
208
+ use than multiset_partitions_taocp.
209
+
210
+ """
211
+
212
+ # Important variables.
213
+ # m is the number of components, i.e., number of distinct elements
214
+ m = len(multiplicities)
215
+ # n is the cardinality, total number of elements whether or not distinct
216
+ n = sum(multiplicities)
217
+
218
+ # The main data structure, f segments pstack into parts. See
219
+ # list_visitor() for example code indicating how this internal
220
+ # state corresponds to a partition.
221
+
222
+ # Note: allocation of space for stack is conservative. Knuth's
223
+ # exercise 7.2.1.5.68 gives some indication of how to tighten this
224
+ # bound, but this is not implemented.
225
+ pstack = [PartComponent() for i in range(n * m + 1)]
226
+ f = [0] * (n + 1)
227
+
228
+ # Step M1 in Knuth (Initialize)
229
+ # Initial state - entire multiset in one part.
230
+ for j in range(m):
231
+ ps = pstack[j]
232
+ ps.c = j
233
+ ps.u = multiplicities[j]
234
+ ps.v = multiplicities[j]
235
+
236
+ # Other variables
237
+ f[0] = 0
238
+ a = 0
239
+ lpart = 0
240
+ f[1] = m
241
+ b = m # in general, current stack frame is from a to b - 1
242
+
243
+ while True:
244
+ while True:
245
+ # Step M2 (Subtract v from u)
246
+ j = a
247
+ k = b
248
+ x = False
249
+ while j < b:
250
+ pstack[k].u = pstack[j].u - pstack[j].v
251
+ if pstack[k].u == 0:
252
+ x = True
253
+ elif not x:
254
+ pstack[k].c = pstack[j].c
255
+ pstack[k].v = min(pstack[j].v, pstack[k].u)
256
+ x = pstack[k].u < pstack[j].v
257
+ k = k + 1
258
+ else: # x is True
259
+ pstack[k].c = pstack[j].c
260
+ pstack[k].v = pstack[k].u
261
+ k = k + 1
262
+ j = j + 1
263
+ # Note: x is True iff v has changed
264
+
265
+ # Step M3 (Push if nonzero.)
266
+ if k > b:
267
+ a = b
268
+ b = k
269
+ lpart = lpart + 1
270
+ f[lpart + 1] = b
271
+ # Return to M2
272
+ else:
273
+ break # Continue to M4
274
+
275
+ # M4 Visit a partition
276
+ state = [f, lpart, pstack]
277
+ yield state
278
+
279
+ # M5 (Decrease v)
280
+ while True:
281
+ j = b-1
282
+ while (pstack[j].v == 0):
283
+ j = j - 1
284
+ if j == a and pstack[j].v == 1:
285
+ # M6 (Backtrack)
286
+ if lpart == 0:
287
+ return
288
+ lpart = lpart - 1
289
+ b = a
290
+ a = f[lpart]
291
+ # Return to M5
292
+ else:
293
+ pstack[j].v = pstack[j].v - 1
294
+ for k in range(j + 1, b):
295
+ pstack[k].v = pstack[k].u
296
+ break # GOTO M2
297
+
+ # --------------- Visitor functions for multiset partitions ---------------
+ # A visitor takes the partition state generated by
+ # multiset_partitions_taocp or other enumerator, and produces useful
+ # output (such as the actual partition).
+
+
+ def factoring_visitor(state, primes):
+     """Use with multiset_partitions_taocp to enumerate the ways a
+     number can be expressed as a product of factors.  For this usage,
+     the exponents of the prime factors of a number are arguments to
+     the partition enumerator, while the corresponding prime factors
+     are input here.
+
+     Examples
+     ========
+
+     To enumerate the factorings of a number we can think of the elements of the
+     partition as being the prime factors and the multiplicities as being their
+     exponents.
+
+     >>> from sympy.utilities.enumerative import factoring_visitor
+     >>> from sympy.utilities.enumerative import multiset_partitions_taocp
+     >>> from sympy import factorint
+     >>> primes, multiplicities = zip(*factorint(24).items())
+     >>> primes
+     (2, 3)
+     >>> multiplicities
+     (3, 1)
+     >>> states = multiset_partitions_taocp(multiplicities)
+     >>> list(factoring_visitor(state, primes) for state in states)
+     [[24], [8, 3], [12, 2], [4, 6], [4, 2, 3], [6, 2, 2], [2, 2, 2, 3]]
+     """
+     f, lpart, pstack = state
+     factoring = []
+     for i in range(lpart + 1):
+         factor = 1
+         for ps in pstack[f[i]: f[i + 1]]:
+             if ps.v > 0:
+                 factor *= primes[ps.c] ** ps.v
+         factoring.append(factor)
+     return factoring
+
+
+ def list_visitor(state, components):
+     """Return a list of lists to represent the partition.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.enumerative import list_visitor
+     >>> from sympy.utilities.enumerative import multiset_partitions_taocp
+     >>> states = multiset_partitions_taocp([1, 2, 1])
+     >>> s = next(states)
+     >>> list_visitor(s, 'abc')  # for multiset 'a b b c'
+     [['a', 'b', 'b', 'c']]
+     >>> s = next(states)
+     >>> list_visitor(s, [1, 2, 3])  # for multiset '1 2 2 3'
+     [[1, 2, 2], [3]]
+     """
+     f, lpart, pstack = state
+
+     partition = []
+     for i in range(lpart + 1):
+         part = []
+         for ps in pstack[f[i]:f[i + 1]]:
+             if ps.v > 0:
+                 part.extend([components[ps.c]] * ps.v)
+         partition.append(part)
+
+     return partition
+
+
+ class MultisetPartitionTraverser():
+     """
+     Has methods to ``enumerate`` and ``count`` the partitions of a multiset.
+
+     This implements a refactored and extended version of Knuth's algorithm
+     7.1.2.5M [AOCP]_.
+
+     The enumeration methods of this class are generators and return
+     data structures which can be interpreted by the same visitor
+     functions used for the output of ``multiset_partitions_taocp``.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.enumerative import MultisetPartitionTraverser
+     >>> m = MultisetPartitionTraverser()
+     >>> m.count_partitions([4,4,4,2])
+     127750
+     >>> m.count_partitions([3,3,3])
+     686
+
+     See Also
+     ========
+
+     multiset_partitions_taocp
+     sympy.utilities.iterables.multiset_partitions
+
+     References
+     ==========
+
+     .. [AOCP] Algorithm 7.1.2.5M in Volume 4A, Combinatorial Algorithms,
+            Part 1, of The Art of Computer Programming, by Donald Knuth.
+
+     .. [Factorisatio] On a Problem of Oppenheim concerning
+            "Factorisatio Numerorum" E. R. Canfield, Paul Erdos, Carl
+            Pomerance, JOURNAL OF NUMBER THEORY, Vol. 17, No. 1. August
+            1983.  See section 7 for a description of an algorithm
+            similar to Knuth's.
+
+     .. [Yorgey] Generating Multiset Partitions, Brent Yorgey, The
+            Monad.Reader, Issue 8, September 2007.
+
+     """
+
+     def __init__(self):
+         self.debug = False
+         # TRACING variables.  These are useful for gathering
+         # statistics on the algorithm itself, but have no particular
+         # benefit to a user of the code.
+         self.k1 = 0
+         self.k2 = 0
+         self.p1 = 0
+         self.pstack = None
+         self.f = None
+         self.lpart = 0
+         self.discarded = 0
+         # dp_stack is list of lists of (part_key, start_count) pairs
+         self.dp_stack = []
+
+         # dp_map maps part_key -> count, where count represents the
+         # number of multisets which are descendants of a part with this
+         # key, **or any of its decrements**
+
+         # Thus, when we find a part in the map, we add its count
+         # value to the running total, cut off the enumeration, and
+         # backtrack
+
+         if not hasattr(self, 'dp_map'):
+             self.dp_map = {}
+
+     def db_trace(self, msg):
+         """Useful for understanding/debugging the algorithms.  Not
+         generally activated in end-user code."""
+         if self.debug:
+             # XXX: animation_visitor is undefined... Clearly this does not
+             # work and was not tested. Previous code in comments below.
+             raise RuntimeError
+             #letters = 'abcdefghijklmnopqrstuvwxyz'
+             #state = [self.f, self.lpart, self.pstack]
+             #print("DBG:", msg,
+             #      ["".join(part) for part in list_visitor(state, letters)],
+             #      animation_visitor(state))
+
+     #
+     # Helper methods for enumeration
+     #
+     def _initialize_enumeration(self, multiplicities):
+         """Allocates and initializes the partition stack.
+
+         This is called from the enumeration/counting routines, so
+         there is no need to call it separately."""
+
+         num_components = len(multiplicities)
+         # cardinality is the total number of elements, whether or not distinct
+         cardinality = sum(multiplicities)
+
+         # pstack is the partition stack, which is segmented by
+         # f into parts.
+         self.pstack = [PartComponent() for i in
+                        range(num_components * cardinality + 1)]
+         self.f = [0] * (cardinality + 1)
+
+         # Initial state - entire multiset in one part.
+         for j in range(num_components):
+             ps = self.pstack[j]
+             ps.c = j
+             ps.u = multiplicities[j]
+             ps.v = multiplicities[j]
+
+         self.f[0] = 0
+         self.f[1] = num_components
+         self.lpart = 0
+
+     # The decrement_part() method corresponds to step M5 in Knuth's
+     # algorithm.  This is the base version for enum_all().  Modified
+     # versions of this method are needed if we want to restrict
+     # sizes of the partitions produced.
+     def decrement_part(self, part):
+         """Decrements part (a subrange of pstack), if possible, returning
+         True iff the part was successfully decremented.
+
+         If you think of the v values in the part as a multi-digit
+         integer (least significant digit on the right) this is
+         basically decrementing that integer, but with the extra
+         constraint that the leftmost digit cannot be decremented to 0.
+
+         Parameters
+         ==========
+
+         part
+            The part, represented as a list of PartComponent objects,
+            which is to be decremented.
+
+         """
+         plen = len(part)
+         for j in range(plen - 1, -1, -1):
+             if j == 0 and part[j].v > 1 or j > 0 and part[j].v > 0:
+                 # found val to decrement
+                 part[j].v -= 1
+                 # Reset trailing parts back to maximum
+                 for k in range(j + 1, plen):
+                     part[k].v = part[k].u
+                 return True
+         return False
+
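The multi-digit-integer analogy in the ``decrement_part`` docstring can be sketched on plain lists, independent of the ``PartComponent`` machinery. The following is illustrative code written for this note (the names ``decrement``, ``vs``, ``us`` are not part of SymPy), showing the mixed-radix count-down with the leftmost-digit-stays-positive constraint:

```python
def decrement(vs, us):
    """Decrement the 'digit' list vs in place, mixed-radix style.

    Each digit vs[j] may range over 0..us[j], except the leftmost,
    which must stay >= 1.  Returns True iff a decrement happened.
    (Illustrative analogue of MultisetPartitionTraverser.decrement_part.)
    """
    for j in range(len(vs) - 1, -1, -1):
        lower = 1 if j == 0 else 0
        if vs[j] > lower:
            vs[j] -= 1
            # reset trailing digits to their maxima
            for k in range(j + 1, len(vs)):
                vs[k] = us[k]
            return True
    return False

# Walk all decrements of v = [2, 1] with maxima u = [2, 1]:
us = [2, 1]
vs = [2, 1]
seen = [list(vs)]
while decrement(vs, us):
    seen.append(list(vs))
# seen == [[2, 1], [2, 0], [1, 1], [1, 0]]
```

Note that [1, 0] is the last value reached: the leftmost digit cannot drop to 0, exactly as in the real method.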
+     # Version to allow number of parts to be bounded from above.
+     # Corresponds to (a modified) step M5.
+     def decrement_part_small(self, part, ub):
+         """Decrements part (a subrange of pstack), if possible, returning
+         True iff the part was successfully decremented.
+
+         Parameters
+         ==========
+
+         part
+             part to be decremented (topmost part on the stack)
+
+         ub
+             the maximum number of parts allowed in a partition
+             returned by the calling traversal.
+
+         Notes
+         =====
+
+         The goal of this modification of the ordinary decrement method
+         is to fail (meaning that the subtree rooted at this part is to
+         be skipped) when it can be proved that this part can only have
+         child partitions which are larger than allowed by ``ub``.  If a
+         decision is made to fail, it must be accurate, otherwise the
+         enumeration will miss some partitions.  But, it is OK not to
+         capture all the possible failures -- if a part is passed that
+         should not be, the resulting too-large partitions are filtered
+         by the enumeration one level up.  However, as is usual in
+         constrained enumerations, failing early is advantageous.
+
+         The tests used by this method catch the most common cases,
+         although this implementation is by no means the last word on
+         this problem.  The tests include:
+
+         1) ``lpart`` must be less than ``ub`` by at least 2.  This is because
+            once a part has been decremented, the partition
+            will gain at least one child in the spread step.
+
+         2) If the leading component of the part is about to be
+            decremented, check for how many parts will be added in
+            order to use up the unallocated multiplicity in that
+            leading component, and fail if this number is greater than
+            allowed by ``ub``.  (See code for the exact expression.)  This
+            test is given in the answer to Knuth's problem 7.2.1.5.69.
+
+         3) If there is *exactly* enough room to expand the leading
+            component by the above test, check the next component (if
+            it exists) once decrementing has finished.  If this has
+            ``v == 0``, this next component will push the expansion over the
+            limit by 1, so fail.
+         """
+         if self.lpart >= ub - 1:
+             self.p1 += 1  # increment to keep track of usefulness of tests
+             return False
+         plen = len(part)
+         for j in range(plen - 1, -1, -1):
+             # Knuth's mod, (answer to problem 7.2.1.5.69)
+             if j == 0 and (part[0].v - 1) * (ub - self.lpart) < part[0].u:
+                 self.k1 += 1
+                 return False
+
+             if j == 0 and part[j].v > 1 or j > 0 and part[j].v > 0:
+                 # found val to decrement
+                 part[j].v -= 1
+                 # Reset trailing parts back to maximum
+                 for k in range(j + 1, plen):
+                     part[k].v = part[k].u
+
+                 # Have now decremented part, but are we doomed to
+                 # failure when it is expanded?  Check one oddball case
+                 # that turns out to be surprisingly common - exactly
+                 # enough room to expand the leading component, but no
+                 # room for the second component, which has v=0.
+                 if (plen > 1 and part[1].v == 0 and
+                     (part[0].u - part[0].v) ==
+                         ((ub - self.lpart - 1) * part[0].v)):
+                     self.k2 += 1
+                     self.db_trace("Decrement fails test 3")
+                     return False
+                 return True
+         return False
+
+     def decrement_part_large(self, part, amt, lb):
+         """Decrements part, while respecting size constraint.
+
+         A part can have no children which are of sufficient size (as
+         indicated by ``lb``) unless that part has sufficient
+         unallocated multiplicity.  When enforcing the size constraint,
+         this method will decrement the part (if necessary) by an
+         amount needed to ensure sufficient unallocated multiplicity.
+
+         Returns True iff the part was successfully decremented.
+
+         Parameters
+         ==========
+
+         part
+             part to be decremented (topmost part on the stack)
+
+         amt
+             Can only take values 0 or 1.  A value of 1 means that the
+             part must be decremented, and then the size constraint is
+             enforced.  A value of 0 means just to enforce the ``lb``
+             size constraint.
+
+         lb
+             The partitions produced by the calling enumeration must
+             have more parts than this value.
+
+         """
+
+         if amt == 1:
+             # In this case we always need to decrement, *before*
+             # enforcing the "sufficient unallocated multiplicity"
+             # constraint.  Easiest for this is just to call the
+             # regular decrement method.
+             if not self.decrement_part(part):
+                 return False
+
+         # Next, perform any needed additional decrementing to respect
+         # "sufficient unallocated multiplicity" (or fail if this is
+         # not possible).
+         min_unalloc = lb - self.lpart
+         if min_unalloc <= 0:
+             return True
+         total_mult = sum(pc.u for pc in part)
+         total_alloc = sum(pc.v for pc in part)
+         if total_mult <= min_unalloc:
+             return False
+
+         deficit = min_unalloc - (total_mult - total_alloc)
+         if deficit <= 0:
+             return True
+
+         for i in range(len(part) - 1, -1, -1):
+             if i == 0:
+                 if part[0].v > deficit:
+                     part[0].v -= deficit
+                     return True
+                 else:
+                     return False  # This shouldn't happen, due to above check
+             else:
+                 if part[i].v >= deficit:
+                     part[i].v -= deficit
+                     return True
+                 else:
+                     deficit -= part[i].v
+                     part[i].v = 0
+
+     def decrement_part_range(self, part, lb, ub):
+         """Decrements part (a subrange of pstack), if possible, returning
+         True iff the part was successfully decremented.
+
+         Parameters
+         ==========
+
+         part
+             part to be decremented (topmost part on the stack)
+
+         ub
+             the maximum number of parts allowed in a partition
+             returned by the calling traversal.
+
+         lb
+             The partitions produced by the calling enumeration must
+             have more parts than this value.
+
+         Notes
+         =====
+
+         Combines the constraints of the _small and _large decrement
+         methods.  If it returns success, the part has been decremented at
+         least once, but perhaps by quite a bit more if needed to meet
+         the lb constraint.
+         """
+
+         # Constraint in the range case is just enforcing both the
+         # constraints from the _small and _large cases.  Note the 0 as the
+         # second argument to the _large call -- this is the signal to
+         # decrement only as needed for constraint enforcement.  The
+         # short circuiting and left-to-right order of the 'and'
+         # operator is important for this to work correctly.
+         return self.decrement_part_small(part, ub) and \
+             self.decrement_part_large(part, 0, lb)
+
+     def spread_part_multiplicity(self):
+         """Returns True if a new part has been created, and
+         adjusts pstack, f and lpart as needed.
+
+         Notes
+         =====
+
+         Spreads unallocated multiplicity from the current top part
+         into a new part created above the current on the stack.  This
+         new part is constrained to be less than or equal to the old in
+         terms of the part ordering.
+
+         This call does nothing (and returns False) if the current top
+         part has no unallocated multiplicity.
+
+         """
+         j = self.f[self.lpart]  # base of current top part
+         k = self.f[self.lpart + 1]  # ub of current; potential base of next
+         base = k  # save for later comparison
+
+         changed = False  # Set to True when the new part (so far) is
+                          # strictly less than (as opposed to less than
+                          # or equal to) the old.
+         for j in range(self.f[self.lpart], self.f[self.lpart + 1]):
+             self.pstack[k].u = self.pstack[j].u - self.pstack[j].v
+             if self.pstack[k].u == 0:
+                 changed = True
+             else:
+                 self.pstack[k].c = self.pstack[j].c
+                 if changed:  # Put all available multiplicity in this part
+                     self.pstack[k].v = self.pstack[k].u
+                 else:  # Still maintaining ordering constraint
+                     if self.pstack[k].u < self.pstack[j].v:
+                         self.pstack[k].v = self.pstack[k].u
+                         changed = True
+                     else:
+                         self.pstack[k].v = self.pstack[j].v
+                 k = k + 1
+         if k > base:
+             # Adjust for the new part on stack
+             self.lpart = self.lpart + 1
+             self.f[self.lpart + 1] = k
+             return True
+         return False
+
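The spread operation above can be sketched without the stack bookkeeping. The function below is illustrative code written for this note (the name ``spread`` and the ``[component, u, v]`` list representation are not part of SymPy); it builds just the new part that would be pushed above a given top part:

```python
def spread(part):
    """Given a part as a list of [component, u, v] triples, build the
    new part that the spread step creates above it: each new u is the
    unallocated multiplicity u - v, and the new v values are the
    largest choice that keeps the new part <= the old in the part
    ordering.  (Illustrative analogue of spread_part_multiplicity.)
    """
    new = []
    changed = False  # new part already strictly smaller than the old?
    for c, u, v in part:
        nu = u - v
        if nu == 0:
            changed = True   # dropping a component makes it smaller
            continue
        if changed:
            nv = nu          # no constraint left; take everything
        elif nu < v:
            nv = nu
            changed = True
        else:
            nv = v
        new.append([c, nu, nv])
    return new

# Top part keeps one 'a' (v=1) and no 'b' (v=0); one 'a' and two
# 'b's remain unallocated and are spread into the new part:
print(spread([['a', 2, 1], ['b', 2, 0]]))   # [['a', 1, 1], ['b', 2, 0]]
```

Once a component of the new part is strictly smaller than (or missing from) the old, the ordering constraint is satisfied and all remaining unallocated multiplicity can be taken, which is what the ``changed`` flag tracks here and in the real method.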
+     def top_part(self):
+         """Return current top part on the stack, as a slice of pstack.
+
+         """
+         return self.pstack[self.f[self.lpart]:self.f[self.lpart + 1]]
+
+     # Same interface and functionality as multiset_partitions_taocp(),
+     # but some might find this refactored version easier to follow.
+     def enum_all(self, multiplicities):
+         """Enumerate the partitions of a multiset.
+
+         Examples
+         ========
+
+         >>> from sympy.utilities.enumerative import list_visitor
+         >>> from sympy.utilities.enumerative import MultisetPartitionTraverser
+         >>> m = MultisetPartitionTraverser()
+         >>> states = m.enum_all([2,2])
+         >>> list(list_visitor(state, 'ab') for state in states)
+         [[['a', 'a', 'b', 'b']],
+         [['a', 'a', 'b'], ['b']],
+         [['a', 'a'], ['b', 'b']],
+         [['a', 'a'], ['b'], ['b']],
+         [['a', 'b', 'b'], ['a']],
+         [['a', 'b'], ['a', 'b']],
+         [['a', 'b'], ['a'], ['b']],
+         [['a'], ['a'], ['b', 'b']],
+         [['a'], ['a'], ['b'], ['b']]]
+
+         See Also
+         ========
+
+         multiset_partitions_taocp:
+             which provides the same result as this method, but is
+             about twice as fast.  Hence, enum_all is primarily useful
+             for testing.  Also see the function for a discussion of
+             states and visitors.
+
+         """
+         self._initialize_enumeration(multiplicities)
+         while True:
+             while self.spread_part_multiplicity():
+                 pass
+
+             # M4  Visit a partition
+             state = [self.f, self.lpart, self.pstack]
+             yield state
+
+             # M5 (Decrease v)
+             while not self.decrement_part(self.top_part()):
+                 # M6 (Backtrack)
+                 if self.lpart == 0:
+                     return
+                 self.lpart -= 1
+
+     def enum_small(self, multiplicities, ub):
+         """Enumerate multiset partitions with no more than ``ub`` parts.
+
+         Equivalent to enum_range(multiplicities, 0, ub)
+
+         Parameters
+         ==========
+
+         multiplicities
+             list of multiplicities of the components of the multiset.
+
+         ub
+             Maximum number of parts
+
+         Examples
+         ========
+
+         >>> from sympy.utilities.enumerative import list_visitor
+         >>> from sympy.utilities.enumerative import MultisetPartitionTraverser
+         >>> m = MultisetPartitionTraverser()
+         >>> states = m.enum_small([2,2], 2)
+         >>> list(list_visitor(state, 'ab') for state in states)
+         [[['a', 'a', 'b', 'b']],
+         [['a', 'a', 'b'], ['b']],
+         [['a', 'a'], ['b', 'b']],
+         [['a', 'b', 'b'], ['a']],
+         [['a', 'b'], ['a', 'b']]]
+
+         The implementation is based, in part, on the answer given to
+         exercise 69, in Knuth [AOCP]_.
+
+         See Also
+         ========
+
+         enum_all, enum_large, enum_range
+
+         """
+
+         # Keep track of iterations which do not yield a partition.
+         # Clearly, we would like to keep this number small.
+         self.discarded = 0
+         if ub <= 0:
+             return
+         self._initialize_enumeration(multiplicities)
+         while True:
+             while self.spread_part_multiplicity():
+                 self.db_trace('spread 1')
+                 if self.lpart >= ub:
+                     self.discarded += 1
+                     self.db_trace('  Discarding')
+                     self.lpart = ub - 2
+                     break
+             else:
+                 # M4  Visit a partition
+                 state = [self.f, self.lpart, self.pstack]
+                 yield state
+
+             # M5 (Decrease v)
+             while not self.decrement_part_small(self.top_part(), ub):
+                 self.db_trace("Failed decrement, going to backtrack")
+                 # M6 (Backtrack)
+                 if self.lpart == 0:
+                     return
+                 self.lpart -= 1
+                 self.db_trace("Backtracked to")
+             self.db_trace("decrement ok, about to expand")
+
+     def enum_large(self, multiplicities, lb):
+         """Enumerate the partitions of a multiset with lb < num(parts)
+
+         Equivalent to enum_range(multiplicities, lb, sum(multiplicities))
+
+         Parameters
+         ==========
+
+         multiplicities
+             list of multiplicities of the components of the multiset.
+
+         lb
+             Number of parts in the partition must be greater than
+             this lower bound.
+
+         Examples
+         ========
+
+         >>> from sympy.utilities.enumerative import list_visitor
+         >>> from sympy.utilities.enumerative import MultisetPartitionTraverser
+         >>> m = MultisetPartitionTraverser()
+         >>> states = m.enum_large([2,2], 2)
+         >>> list(list_visitor(state, 'ab') for state in states)
+         [[['a', 'a'], ['b'], ['b']],
+         [['a', 'b'], ['a'], ['b']],
+         [['a'], ['a'], ['b', 'b']],
+         [['a'], ['a'], ['b'], ['b']]]
+
+         See Also
+         ========
+
+         enum_all, enum_small, enum_range
+
+         """
+         self.discarded = 0
+         if lb >= sum(multiplicities):
+             return
+         self._initialize_enumeration(multiplicities)
+         self.decrement_part_large(self.top_part(), 0, lb)
+         while True:
+             good_partition = True
+             while self.spread_part_multiplicity():
+                 if not self.decrement_part_large(self.top_part(), 0, lb):
+                     # Failure here should be rare/impossible
+                     self.discarded += 1
+                     good_partition = False
+                     break
+
+             # M4  Visit a partition
+             if good_partition:
+                 state = [self.f, self.lpart, self.pstack]
+                 yield state
+
+             # M5 (Decrease v)
+             while not self.decrement_part_large(self.top_part(), 1, lb):
+                 # M6 (Backtrack)
+                 if self.lpart == 0:
+                     return
+                 self.lpart -= 1
+
+     def enum_range(self, multiplicities, lb, ub):
+         """Enumerate the partitions of a multiset with
+         ``lb < num(parts) <= ub``.
+
+         In particular, if partitions with exactly ``k`` parts are
+         desired, call with ``(multiplicities, k - 1, k)``.  This
+         method generalizes enum_all, enum_small, and enum_large.
+
+         Examples
+         ========
+
+         >>> from sympy.utilities.enumerative import list_visitor
+         >>> from sympy.utilities.enumerative import MultisetPartitionTraverser
+         >>> m = MultisetPartitionTraverser()
+         >>> states = m.enum_range([2,2], 1, 2)
+         >>> list(list_visitor(state, 'ab') for state in states)
+         [[['a', 'a', 'b'], ['b']],
+         [['a', 'a'], ['b', 'b']],
+         [['a', 'b', 'b'], ['a']],
+         [['a', 'b'], ['a', 'b']]]
+
+         """
+         # combine the constraints of the _large and _small
+         # enumerations.
+         self.discarded = 0
+         if ub <= 0 or lb >= sum(multiplicities):
+             return
+         self._initialize_enumeration(multiplicities)
+         self.decrement_part_large(self.top_part(), 0, lb)
+         while True:
+             good_partition = True
+             while self.spread_part_multiplicity():
+                 self.db_trace("spread 1")
+                 if not self.decrement_part_large(self.top_part(), 0, lb):
+                     # Failure here - possible in range case?
+                     self.db_trace("  Discarding (large cons)")
+                     self.discarded += 1
+                     good_partition = False
+                     break
+                 elif self.lpart >= ub:
+                     self.discarded += 1
+                     good_partition = False
+                     self.db_trace("  Discarding small cons")
+                     self.lpart = ub - 2
+                     break
+
+             # M4  Visit a partition
+             if good_partition:
+                 state = [self.f, self.lpart, self.pstack]
+                 yield state
+
+             # M5 (Decrease v)
+             while not self.decrement_part_range(self.top_part(), lb, ub):
+                 self.db_trace("Failed decrement, going to backtrack")
+                 # M6 (Backtrack)
+                 if self.lpart == 0:
+                     return
+                 self.lpart -= 1
+                 self.db_trace("Backtracked to")
+             self.db_trace("decrement ok, about to expand")
+
+     def count_partitions_slow(self, multiplicities):
+         """Returns the number of partitions of a multiset whose elements
+         have the multiplicities given in ``multiplicities``.
+
+         Primarily for comparison purposes.  It follows the same path as
+         enumerate, and counts, rather than generates, the partitions.
+
+         See Also
+         ========
+
+         count_partitions
+             Has the same calling interface, but is much faster.
+
+         """
+         # number of partitions so far in the enumeration
+         self.pcount = 0
+         self._initialize_enumeration(multiplicities)
+         while True:
+             while self.spread_part_multiplicity():
+                 pass
+
+             # M4  Visit (count) a partition
+             self.pcount += 1
+
+             # M5 (Decrease v)
+             while not self.decrement_part(self.top_part()):
+                 # M6 (Backtrack)
+                 if self.lpart == 0:
+                     return self.pcount
+                 self.lpart -= 1
+
+     def count_partitions(self, multiplicities):
+         """Returns the number of partitions of a multiset whose components
+         have the multiplicities given in ``multiplicities``.
+
+         For larger counts, this method is much faster than calling one
+         of the enumerators and counting the result.  Uses dynamic
+         programming to cut down on the number of nodes actually
+         explored.  The dictionary used in order to accelerate the
+         counting process is stored in the ``MultisetPartitionTraverser``
+         object and persists across calls.  If the user does not
+         expect to call ``count_partitions`` for any additional
+         multisets, the object should be cleared to save memory.  On
+         the other hand, the cache built up from one count run can
+         significantly speed up subsequent calls to ``count_partitions``,
+         so it may be advantageous not to clear the object.
+
+         Examples
+         ========
+
+         >>> from sympy.utilities.enumerative import MultisetPartitionTraverser
+         >>> m = MultisetPartitionTraverser()
+         >>> m.count_partitions([9,8,2])
+         288716
+         >>> m.count_partitions([2,2])
+         9
+         >>> del m
+
+         Notes
+         =====
+
+         If one looks at the workings of Knuth's algorithm M [AOCP]_, it
+         can be viewed as a traversal of a binary tree of parts.  A
+         part has (up to) two children, the left child resulting from
+         the spread operation, and the right child from the decrement
+         operation.  The ordinary enumeration of multiset partitions is
+         an in-order traversal of this tree, with the partitions
+         corresponding to paths from the root to the leaves.  The
+         mapping from paths to partitions is a little complicated,
+         since the partition would contain only those parts which are
+         leaves or the parents of a spread link, not those which are
+         parents of a decrement link.
+
+         For counting purposes, it is sufficient to count leaves, and
+         this can be done with a recursive in-order traversal.  The
+         number of leaves of a subtree rooted at a particular part is a
+         function only of that part itself, so memoizing has the
+         potential to speed up the counting dramatically.
+
+         This method follows a computational approach which is similar
+         to the hypothetical memoized recursive function, but with two
+         differences:
+
+         1) This method is iterative, borrowing its structure from the
+            other enumerations and maintaining an explicit stack of
+            parts which are in the process of being counted.  (There
+            may be multisets which can be counted reasonably quickly by
+            this implementation, but which would overflow the default
+            Python recursion limit with a recursive implementation.)
+
+         2) Instead of using the part data structure directly, a more
+            compact key is constructed.  This saves space, but more
+            importantly coalesces some parts which would remain
+            separate with physical keys.
+
+         Unlike the enumeration functions, there is currently no _range
+         version of count_partitions.  If someone wants to stretch
+         their brain, it should be possible to construct one by
+         memoizing with a histogram of counts rather than a single
+         count, and combining the histograms.
+         """
+         # number of partitions so far in the enumeration
+         self.pcount = 0
+
+         # dp_stack is list of lists of (part_key, start_count) pairs
+         self.dp_stack = []
+
+         self._initialize_enumeration(multiplicities)
+         pkey = part_key(self.top_part())
+         self.dp_stack.append([(pkey, 0), ])
+         while True:
+             while self.spread_part_multiplicity():
+                 pkey = part_key(self.top_part())
+                 if pkey in self.dp_map:
+                     # Already have a cached value for the count of the
+                     # subtree rooted at this part.  Add it to the
+                     # running counter, and break out of the spread
+                     # loop.  The -1 below is to compensate for the
+                     # leaf that this code path would otherwise find,
+                     # and which gets incremented below.
+
+                     self.pcount += (self.dp_map[pkey] - 1)
+                     self.lpart -= 1
+                     break
+                 else:
+                     self.dp_stack.append([(pkey, self.pcount), ])
+
+             # M4  count a leaf partition
+             self.pcount += 1
+
+             # M5 (Decrease v)
+             while not self.decrement_part(self.top_part()):
+                 # M6 (Backtrack)
+                 for key, oldcount in self.dp_stack.pop():
+                     self.dp_map[key] = self.pcount - oldcount
+                 if self.lpart == 0:
+                     return self.pcount
+                 self.lpart -= 1
+
+             # At this point have successfully decremented the part on
+             # the stack and it does not appear in the cache.  It needs
+             # to be added to the list at the top of dp_stack
+             pkey = part_key(self.top_part())
+             self.dp_stack[-1].append((pkey, self.pcount),)
+
+
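The "hypothetical memoized recursive function" mentioned in the Notes of ``count_partitions`` can be sketched on the simpler case of ordinary integer partitions. This is illustrative code written for this note (the name ``count_partitions_leq`` is not part of SymPy): the count below a node depends only on the node's own state ``(n, largest)``, so cached subtree counts can be reused, which is exactly the property ``part_key`` exploits for multiset parts.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def count_partitions_leq(n, largest):
    """Number of partitions of n into parts each <= largest.

    Memoized recursive analogue of the subtree-count caching used by
    MultisetPartitionTraverser.count_partitions (illustration only).
    """
    if n == 0:
        return 1            # the empty partition
    if largest == 0:
        return 0            # nothing left to partition with
    total = 0
    # choose the first (largest) part, then recurse on the remainder
    for first in range(min(n, largest), 0, -1):
        total += count_partitions_leq(n - first, first)
    return total

print(count_partitions_leq(6, 6))   # p(6) = 11
```

Without the cache this recursion revisits the same ``(n, largest)`` states exponentially often; with it, each state is computed once, mirroring the speedup ``count_partitions`` gets over ``count_partitions_slow``.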
+ def part_key(part):
+     """Helper for MultisetPartitionTraverser.count_partitions that
+     creates a key for ``part``, that only includes information which can
+     affect the count for that part.  (Any irrelevant information just
+     reduces the effectiveness of dynamic programming.)
+
+     Notes
+     =====
+
+     This function is a candidate for future exploration.  There
+     are likely symmetries that can be exploited to coalesce some
+     ``part_key`` values, and thereby save space and improve
+     performance.
+
+     """
+     # The component number is irrelevant for counting partitions, so
+     # leave it out of the memo key.
+     rval = []
+     for ps in part:
+         rval.append(ps.u)
+         rval.append(ps.v)
+     return tuple(rval)
venv/lib/python3.10/site-packages/sympy/utilities/exceptions.py ADDED
@@ -0,0 +1,275 @@
+ """
+ General SymPy exceptions and warnings.
+ """
+
+ import warnings
+ import contextlib
+
+ from textwrap import dedent
+
+
+ class SymPyDeprecationWarning(DeprecationWarning):
+     r"""
+     A warning for deprecated features of SymPy.
+
+     See the :ref:`deprecation-policy` document for details on when and how
+     things should be deprecated in SymPy.
+
+     Note that simply constructing this class will not cause a warning to be
+     issued. To do that, you must call the :func:`sympy_deprecation_warning`
+     function. For this reason, it is not recommended to ever construct this
+     class directly.
+
+     Explanation
+     ===========
+
+     The ``SymPyDeprecationWarning`` class is a subclass of
+     ``DeprecationWarning`` that is used for all deprecations in SymPy. A
+     special subclass is used so that we can automatically augment the warning
+     message with additional metadata about the version the deprecation was
+     introduced in and a link to the documentation. This also allows users to
+     explicitly filter deprecation warnings from SymPy using ``warnings``
+     filters (see :ref:`silencing-sympy-deprecation-warnings`).
+
+     Additionally, ``SymPyDeprecationWarning`` is enabled to be shown by
+     default, unlike normal ``DeprecationWarning``\s, which are only shown by
+     default in interactive sessions. This ensures that deprecation warnings in
+     SymPy will actually be seen by users.
+
+     See the documentation of :func:`sympy_deprecation_warning` for a
+     description of the parameters to this function.
+
+     To mark a function as deprecated, you can use the :func:`@deprecated
+     <sympy.utilities.decorator.deprecated>` decorator.
+
+     See Also
+     ========
+     sympy.utilities.exceptions.sympy_deprecation_warning
+     sympy.utilities.exceptions.ignore_warnings
+     sympy.utilities.decorator.deprecated
+     sympy.testing.pytest.warns_deprecated_sympy
+
+     """
+     def __init__(self, message, *, deprecated_since_version, active_deprecations_target):
+
+         super().__init__(message, deprecated_since_version,
+                          active_deprecations_target)
+         self.message = message
+         if not isinstance(deprecated_since_version, str):
+             raise TypeError(f"'deprecated_since_version' should be a string, got {deprecated_since_version!r}")
+         self.deprecated_since_version = deprecated_since_version
+         self.active_deprecations_target = active_deprecations_target
+         if any(i in active_deprecations_target for i in '()='):
+             raise ValueError("active_deprecations_target must be the part inside of the '(...)='")
+
+         self.full_message = f"""
+
+ {dedent(message).strip()}
+
+ See https://docs.sympy.org/latest/explanation/active-deprecations.html#{active_deprecations_target}
+ for details.
+
+ This has been deprecated since SymPy version {deprecated_since_version}. It
+ will be removed in a future version of SymPy.
+ """
+
+     def __str__(self):
+         return self.full_message
+
+     def __repr__(self):
+         return f"{self.__class__.__name__}({self.message!r}, deprecated_since_version={self.deprecated_since_version!r}, active_deprecations_target={self.active_deprecations_target!r})"
+
+     def __eq__(self, other):
+         return isinstance(other, SymPyDeprecationWarning) and self.args == other.args
+
+     # Make pickling work. By default, pickle tries to recreate the exception
+     # from its args, but this does not work because of our keyword-only
+     # arguments.
+     @classmethod
+     def _new(cls, message, deprecated_since_version,
+              active_deprecations_target):
+         return cls(message, deprecated_since_version=deprecated_since_version, active_deprecations_target=active_deprecations_target)
+
+     def __reduce__(self):
+         return (self._new, (self.message, self.deprecated_since_version, self.active_deprecations_target))
+
+ # Python by default hides DeprecationWarnings, which we do not want.
+ warnings.simplefilter("once", SymPyDeprecationWarning)
+
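The `_new`/`__reduce__` pair above exists because default pickling recreates an exception as `cls(*args)`, which fails when the constructor takes keyword-only arguments. A standalone sketch of the same pattern (the class name `KwOnlyWarning` is hypothetical, not part of SymPy):

```python
import pickle


class KwOnlyWarning(UserWarning):
    """A warning whose constructor takes a keyword-only argument,
    mirroring the SymPyDeprecationWarning pickling pattern."""
    def __init__(self, message, *, since):
        super().__init__(message, since)
        self.message = message
        self.since = since

    # Default pickling would call cls(*self.args) and fail on the
    # keyword-only argument, so route reconstruction through a
    # classmethod that passes it by keyword.
    @classmethod
    def _new(cls, message, since):
        return cls(message, since=since)

    def __reduce__(self):
        return (self._new, (self.message, self.since))


w = KwOnlyWarning("old API", since="1.10")
w2 = pickle.loads(pickle.dumps(w))
print(w2.since)  # "1.10": the keyword-only attribute survives the round trip
```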
+ def sympy_deprecation_warning(message, *, deprecated_since_version,
+                               active_deprecations_target, stacklevel=3):
+     r'''
+     Warn that a feature is deprecated in SymPy.
+
+     See the :ref:`deprecation-policy` document for details on when and how
+     things should be deprecated in SymPy.
+
+     To mark an entire function or class as deprecated, you can use the
+     :func:`@deprecated <sympy.utilities.decorator.deprecated>` decorator.
+
+     Parameters
+     ==========
+
+     message: str
+
+         The deprecation message. This may span multiple lines and contain
+         code examples. Messages should be wrapped to 80 characters. The
+         message is automatically dedented and leading and trailing whitespace
+         stripped. Messages may include dynamic content based on the user
+         input, but avoid using ``str(expression)`` if an expression can be
+         arbitrary, as it might be huge and make the warning message
+         unreadable.
+
+     deprecated_since_version: str
+
+         The version of SymPy the feature has been deprecated since. For new
+         deprecations, this should be the version in `sympy/release.py
+         <https://github.com/sympy/sympy/blob/master/sympy/release.py>`_
+         without the ``.dev``. If the next SymPy version ends up being
+         different from this, the release manager will need to update any
+         ``SymPyDeprecationWarning``\s using the incorrect version. This
+         argument is required and must be passed as a keyword argument.
+         (example: ``deprecated_since_version="1.10"``).
+
+     active_deprecations_target: str
+
+         The Sphinx target corresponding to the section for the deprecation in
+         the :ref:`active-deprecations` document (see
+         ``doc/src/explanation/active-deprecations.md``). This is used to
+         automatically generate a URL to the page in the warning message. This
+         argument is required and must be passed as a keyword argument.
+         (example: ``active_deprecations_target="deprecated-feature-abc"``)
+
+     stacklevel: int (default: 3)
+
+         The ``stacklevel`` parameter that is passed to ``warnings.warn``. If
+         you create a wrapper that calls this function, this should be
+         increased so that the warning message shows the user line of code that
+         produced the warning. Note that in some cases there will be multiple
+         possible different user code paths that could result in the warning.
+         In that case, just choose the smallest common stacklevel.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.exceptions import sympy_deprecation_warning
+     >>> def is_this_zero(x, y=0):
+     ...     """
+     ...     Determine if x = 0.
+     ...
+     ...     Parameters
+     ...     ==========
+     ...
+     ...     x : Expr
+     ...       The expression to check.
+     ...
+     ...     y : Expr, optional
+     ...       If provided, check if x = y.
+     ...
+     ...       .. deprecated:: 1.1
+     ...
+     ...          The ``y`` argument to ``is_this_zero`` is deprecated. Use
+     ...          ``is_this_zero(x - y)`` instead.
+     ...
+     ...     """
+     ...     from sympy import simplify
+     ...
+     ...     if y != 0:
+     ...         sympy_deprecation_warning("""
+     ...     The y argument to is_zero() is deprecated. Use is_zero(x - y) instead.""",
+     ...             deprecated_since_version="1.1",
+     ...             active_deprecations_target='is-this-zero-y-deprecation')
+     ...     return simplify(x - y) == 0
+     >>> is_this_zero(0)
+     True
+     >>> is_this_zero(1, 1) # doctest: +SKIP
+     <stdin>:1: SymPyDeprecationWarning:
+     <BLANKLINE>
+     The y argument to is_zero() is deprecated. Use is_zero(x - y) instead.
+     <BLANKLINE>
+     See https://docs.sympy.org/latest/explanation/active-deprecations.html#is-this-zero-y-deprecation
+     for details.
+     <BLANKLINE>
+     This has been deprecated since SymPy version 1.1. It
+     will be removed in a future version of SymPy.
+     <BLANKLINE>
+     is_this_zero(1, 1)
+     True
+
+     See Also
+     ========
+
+     sympy.utilities.exceptions.SymPyDeprecationWarning
+     sympy.utilities.exceptions.ignore_warnings
+     sympy.utilities.decorator.deprecated
+     sympy.testing.pytest.warns_deprecated_sympy
+
+     '''
+     w = SymPyDeprecationWarning(message,
+                                 deprecated_since_version=deprecated_since_version,
+                                 active_deprecations_target=active_deprecations_target)
+     warnings.warn(w, stacklevel=stacklevel)
+
+
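The ``stacklevel`` parameter described above controls which frame a warning is attributed to. A minimal stdlib sketch (the names `api` and `_warn_deprecated` are hypothetical): with the warning raised two calls below the public function, ``stacklevel=3`` attributes it to the user's call site.

```python
import warnings


def _warn_deprecated(message):
    # Hypothetical wrapper one call below the public function; with
    # stacklevel=3 the warning is attributed to the caller of api(),
    # not to this helper or to api() itself.
    warnings.warn(message, DeprecationWarning, stacklevel=3)


def api():
    _warn_deprecated("api() is deprecated")


with warnings.catch_warnings(record=True) as rec:
    warnings.simplefilter("always")
    api()  # the recorded warning points at this line

print(str(rec[0].message))  # api() is deprecated
```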
+ @contextlib.contextmanager
+ def ignore_warnings(warningcls):
+     '''
+     Context manager to suppress warnings during tests.
+
+     .. note::
+
+        Do not use this with SymPyDeprecationWarning in the tests.
+        warns_deprecated_sympy() should be used instead.
+
+     This function is useful for suppressing warnings during tests. The warns
+     function should be used to assert that a warning is raised. The
+     ignore_warnings function is useful in situations when the warning is not
+     guaranteed to be raised (e.g. on importing a module) or if the warning
+     comes from third-party code.
+
+     This function is also useful to prevent the same or similar warnings from
+     being issued twice due to recursive calls.
+
+     When the warning is coming (reliably) from SymPy the warns function should
+     be preferred to ignore_warnings.
+
+     >>> from sympy.utilities.exceptions import ignore_warnings
+     >>> import warnings
+
+     Here's a warning:
+
+     >>> with warnings.catch_warnings():  # reset warnings in doctest
+     ...     warnings.simplefilter('error')
+     ...     warnings.warn('deprecated', UserWarning)
+     Traceback (most recent call last):
+       ...
+     UserWarning: deprecated
+
+     Let's suppress it with ignore_warnings:
+
+     >>> with warnings.catch_warnings():  # reset warnings in doctest
+     ...     warnings.simplefilter('error')
+     ...     with ignore_warnings(UserWarning):
+     ...         warnings.warn('deprecated', UserWarning)
+
+     (No warning emitted)
+
+     See Also
+     ========
+     sympy.utilities.exceptions.SymPyDeprecationWarning
+     sympy.utilities.exceptions.sympy_deprecation_warning
+     sympy.utilities.decorator.deprecated
+     sympy.testing.pytest.warns_deprecated_sympy
+
+     '''
+     # Absorbs all warnings in warnrec
+     with warnings.catch_warnings(record=True) as warnrec:
+         # Make sure our warning doesn't get filtered
+         warnings.simplefilter("always", warningcls)
+         # Now run the test
+         yield
+
+     # Reissue any warnings that we aren't testing for
+     for w in warnrec:
+         if not issubclass(w.category, warningcls):
+             warnings.warn_explicit(w.message, w.category, w.filename, w.lineno)
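The record-then-reissue pattern above is pure stdlib, so it can be exercised standalone (copied here so the snippet runs without SymPy): warnings of the suppressed class are swallowed, while everything else is re-emitted after the block.

```python
import contextlib
import warnings


@contextlib.contextmanager
def ignore_warnings(warningcls):
    # Record every warning raised in the block, then reissue only the
    # ones that are NOT of the suppressed class.
    with warnings.catch_warnings(record=True) as warnrec:
        warnings.simplefilter("always", warningcls)
        yield
    for w in warnrec:
        if not issubclass(w.category, warningcls):
            warnings.warn_explicit(w.message, w.category, w.filename, w.lineno)


with warnings.catch_warnings(record=True) as outer:
    warnings.simplefilter("always")
    with ignore_warnings(UserWarning):
        warnings.warn("suppressed", UserWarning)
        warnings.warn("kept", RuntimeWarning)

print([str(w.message) for w in outer])  # ['kept']
```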
venv/lib/python3.10/site-packages/sympy/utilities/iterables.py ADDED
@@ -0,0 +1,3172 @@
+ from collections import Counter, defaultdict, OrderedDict
+ from itertools import (
+     chain, combinations, combinations_with_replacement, cycle, islice,
+     permutations, product, groupby
+ )
+ # For backwards compatibility
+ from itertools import product as cartes # noqa: F401
+ from operator import gt
+
+
+ # this is the logical location of these functions
+ from sympy.utilities.enumerative import (
+     multiset_partitions_taocp, list_visitor, MultisetPartitionTraverser)
+
+ from sympy.utilities.misc import as_int
+ from sympy.utilities.decorator import deprecated
+
+
+ def is_palindromic(s, i=0, j=None):
21
+ """
22
+ Return True if the sequence is the same from left to right as it
23
+ is from right to left in the whole sequence (default) or in the
24
+ Python slice ``s[i: j]``; else False.
25
+
26
+ Examples
27
+ ========
28
+
29
+ >>> from sympy.utilities.iterables import is_palindromic
30
+ >>> is_palindromic([1, 0, 1])
31
+ True
32
+ >>> is_palindromic('abcbb')
33
+ False
34
+ >>> is_palindromic('abcbb', 1)
35
+ False
36
+
37
+ Normal Python slicing is performed in place so there is no need to
38
+ create a slice of the sequence for testing:
39
+
40
+ >>> is_palindromic('abcbb', 1, -1)
41
+ True
42
+ >>> is_palindromic('abcbb', -4, -1)
43
+ True
44
+
45
+ See Also
46
+ ========
47
+
48
+ sympy.ntheory.digits.is_palindromic: tests integers
49
+
50
+ """
51
+ i, j, _ = slice(i, j).indices(len(s))
52
+ m = (j - i)//2
53
+ # if length is odd, middle element will be ignored
54
+ return all(s[i + k] == s[j - 1 - k] for k in range(m))
55
+
56
+
+ def flatten(iterable, levels=None, cls=None):  # noqa: F811
+     """
+     Recursively denest iterable containers.
+
+     >>> from sympy import flatten
+
+     >>> flatten([1, 2, 3])
+     [1, 2, 3]
+     >>> flatten([1, 2, [3]])
+     [1, 2, 3]
+     >>> flatten([1, [2, 3], [4, 5]])
+     [1, 2, 3, 4, 5]
+     >>> flatten([1.0, 2, (1, None)])
+     [1.0, 2, 1, None]
+
+     If you want to denest only a specified number of levels of
+     nested containers, then set ``levels`` flag to the desired
+     number of levels::
+
+     >>> ls = [[(-2, -1), (1, 2)], [(0, 0)]]
+
+     >>> flatten(ls, levels=1)
+     [(-2, -1), (1, 2), (0, 0)]
+
+     If cls argument is specified, it will only flatten instances of that
+     class, for example:
+
+     >>> from sympy import Basic, S
+     >>> class MyOp(Basic):
+     ...     pass
+     ...
+     >>> flatten([MyOp(S(1), MyOp(S(2), S(3)))], cls=MyOp)
+     [1, 2, 3]
+
+     adapted from https://kogs-www.informatik.uni-hamburg.de/~meine/python_tricks
+     """
+     from sympy.tensor.array import NDimArray
+     if levels is not None:
+         if not levels:
+             return iterable
+         elif levels > 0:
+             levels -= 1
+         else:
+             raise ValueError(
+                 "expected non-negative number of levels, got %s" % levels)
+
+     if cls is None:
+         def reducible(x):
+             return is_sequence(x, set)
+     else:
+         def reducible(x):
+             return isinstance(x, cls)
+
+     result = []
+
+     for el in iterable:
+         if reducible(el):
+             if hasattr(el, 'args') and not isinstance(el, NDimArray):
+                 el = el.args
+             result.extend(flatten(el, levels=levels, cls=cls))
+         else:
+             result.append(el)
+
+     return result
+
+
+ def unflatten(iter, n=2):
+     """Group ``iter`` into tuples of length ``n``. Raise an error if
+     the length of ``iter`` is not a multiple of ``n``.
+     """
+     if n < 1 or len(iter) % n:
+         raise ValueError('iter length is not a multiple of %i' % n)
+     return list(zip(*(iter[i::n] for i in range(n))))
+
+
+ def reshape(seq, how):
+     """Reshape the sequence according to the template in ``how``.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities import reshape
+     >>> seq = list(range(1, 9))
+
+     >>> reshape(seq, [4])  # lists of 4
+     [[1, 2, 3, 4], [5, 6, 7, 8]]
+
+     >>> reshape(seq, (4,))  # tuples of 4
+     [(1, 2, 3, 4), (5, 6, 7, 8)]
+
+     >>> reshape(seq, (2, 2))  # tuples of 4
+     [(1, 2, 3, 4), (5, 6, 7, 8)]
+
+     >>> reshape(seq, (2, [2]))  # (i, i, [i, i])
+     [(1, 2, [3, 4]), (5, 6, [7, 8])]
+
+     >>> reshape(seq, ((2,), [2]))  # etc....
+     [((1, 2), [3, 4]), ((5, 6), [7, 8])]
+
+     >>> reshape(seq, (1, [2], 1))
+     [(1, [2, 3], 4), (5, [6, 7], 8)]
+
+     >>> reshape(tuple(seq), ([[1], 1, (2,)],))
+     (([[1], 2, (3, 4)],), ([[5], 6, (7, 8)],))
+
+     >>> reshape(tuple(seq), ([1], 1, (2,)))
+     (([1], 2, (3, 4)), ([5], 6, (7, 8)))
+
+     >>> reshape(list(range(12)), [2, [3], {2}, (1, (3,), 1)])
+     [[0, 1, [2, 3, 4], {5, 6}, (7, (8, 9, 10), 11)]]
+
+     """
+     m = sum(flatten(how))
+     n, rem = divmod(len(seq), m)
+     if m < 0 or rem:
+         raise ValueError('template must sum to positive number '
+                          'that divides the length of the sequence')
+     i = 0
+     container = type(how)
+     rv = [None]*n
+     for k in range(len(rv)):
+         _rv = []
+         for hi in how:
+             if isinstance(hi, int):
+                 _rv.extend(seq[i: i + hi])
+                 i += hi
+             else:
+                 n = sum(flatten(hi))
+                 hi_type = type(hi)
+                 _rv.append(hi_type(reshape(seq[i: i + n], hi)[0]))
+                 i += n
+         rv[k] = container(_rv)
+     return type(seq)(rv)
+
+
+ def group(seq, multiple=True):
+     """
+     Splits a sequence into a list of lists of equal, adjacent elements.
+
+     Examples
+     ========
+
+     >>> from sympy import group
+
+     >>> group([1, 1, 1, 2, 2, 3])
+     [[1, 1, 1], [2, 2], [3]]
+     >>> group([1, 1, 1, 2, 2, 3], multiple=False)
+     [(1, 3), (2, 2), (3, 1)]
+     >>> group([1, 1, 3, 2, 2, 1], multiple=False)
+     [(1, 2), (3, 1), (2, 2), (1, 1)]
+
+     See Also
+     ========
+
+     multiset
+
+     """
+     if multiple:
+         return [(list(g)) for _, g in groupby(seq)]
+     return [(k, len(list(g))) for k, g in groupby(seq)]
+
+
+ def _iproduct2(iterable1, iterable2):
+     '''Cartesian product of two possibly infinite iterables'''
+
+     it1 = iter(iterable1)
+     it2 = iter(iterable2)
+
+     elems1 = []
+     elems2 = []
+
+     sentinel = object()
+     def append(it, elems):
+         e = next(it, sentinel)
+         if e is not sentinel:
+             elems.append(e)
+
+     n = 0
+     append(it1, elems1)
+     append(it2, elems2)
+
+     while n <= len(elems1) + len(elems2):
+         for m in range(n-len(elems1)+1, len(elems2)):
+             yield (elems1[n-m], elems2[m])
+         n += 1
+         append(it1, elems1)
+         append(it2, elems2)
+
+
+ def iproduct(*iterables):
+     '''
+     Cartesian product of iterables.
+
+     Generator of the Cartesian product of iterables. This is analogous to
+     itertools.product except that it works with infinite iterables and will
+     yield any item from the infinite product eventually.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import iproduct
+     >>> sorted(iproduct([1,2], [3,4]))
+     [(1, 3), (1, 4), (2, 3), (2, 4)]
+
+     With an infinite iterator:
+
+     >>> from sympy import S
+     >>> (3,) in iproduct(S.Integers)
+     True
+     >>> (3, 4) in iproduct(S.Integers, S.Integers)
+     True
+
+     .. seealso::
+
+        `itertools.product
+        <https://docs.python.org/3/library/itertools.html#itertools.product>`_
+     '''
+     if len(iterables) == 0:
+         yield ()
+         return
+     elif len(iterables) == 1:
+         for e in iterables[0]:
+             yield (e,)
+     elif len(iterables) == 2:
+         yield from _iproduct2(*iterables)
+     else:
+         first, others = iterables[0], iterables[1:]
+         for ef, eo in _iproduct2(first, iproduct(*others)):
+             yield (ef,) + eo
+
+
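The reason `_iproduct2` works on infinite inputs is that it walks the product grid along anti-diagonals, so every pair sits at a finite position in the output. A standalone copy of that scheme (duplicated here so the snippet runs without SymPy), driven by an infinite `itertools.count`:

```python
from itertools import count, islice


def iproduct2(iterable1, iterable2):
    # Copy of the _iproduct2 diagonal scheme above: pairs are enumerated
    # along anti-diagonals n = index1 + index2 of the product grid, so
    # any fixed pair is reached after finitely many steps.
    it1, it2 = iter(iterable1), iter(iterable2)
    elems1, elems2 = [], []
    sentinel = object()

    def append(it, elems):
        e = next(it, sentinel)
        if e is not sentinel:
            elems.append(e)

    n = 0
    append(it1, elems1)
    append(it2, elems2)
    while n <= len(elems1) + len(elems2):
        for m in range(n - len(elems1) + 1, len(elems2)):
            yield (elems1[n - m], elems2[m])
        n += 1
        append(it1, elems1)
        append(it2, elems2)


# Even with an infinite first factor, a fixed pair shows up eventually.
pairs = list(islice(iproduct2(count(0), 'ab'), 20))
print(pairs[0])        # (0, 'a')
print((3, 'b') in pairs)
```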
+ def multiset(seq):
+     """Return the hashable sequence in multiset form with values being the
+     multiplicity of the item in the sequence.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import multiset
+     >>> multiset('mississippi')
+     {'i': 4, 'm': 1, 'p': 2, 's': 4}
+
+     See Also
+     ========
+
+     group
+
+     """
+     return dict(Counter(seq).items())
+
+
+ def ibin(n, bits=None, str=False):
+     """Return a list of length ``bits`` corresponding to the binary value
+     of ``n`` with small bits to the right (last). If bits is omitted, the
+     length will be the number required to represent ``n``. If the bits are
+     desired in reversed order, use the ``[::-1]`` slice of the returned list.
+
+     If a sequence of all bits-length lists starting from ``[0, 0,..., 0]``
+     through ``[1, 1, ..., 1]`` are desired, pass a non-integer for bits, e.g.
+     ``'all'``.
+
+     If the bit *string* is desired pass ``str=True``.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import ibin
+     >>> ibin(2)
+     [1, 0]
+     >>> ibin(2, 4)
+     [0, 0, 1, 0]
+
+     If all lists corresponding to 0 to 2**n - 1 are desired, pass a
+     non-integer for bits:
+
+     >>> bits = 2
+     >>> for i in ibin(2, 'all'):
+     ...     print(i)
+     (0, 0)
+     (0, 1)
+     (1, 0)
+     (1, 1)
+
+     If a bit string is desired of a given length, use str=True:
+
+     >>> n = 123
+     >>> bits = 10
+     >>> ibin(n, bits, str=True)
+     '0001111011'
+     >>> ibin(n, bits, str=True)[::-1]  # small bits left
+     '1101111000'
+     >>> list(ibin(3, 'all', str=True))
+     ['000', '001', '010', '011', '100', '101', '110', '111']
+
+     """
+     if n < 0:
+         raise ValueError("negative numbers are not allowed")
+     n = as_int(n)
+
+     if bits is None:
+         bits = 0
+     else:
+         try:
+             bits = as_int(bits)
+         except ValueError:
+             bits = -1
+         else:
+             if n.bit_length() > bits:
+                 raise ValueError(
+                     "`bits` must be >= {}".format(n.bit_length()))
+
+     if not str:
+         if bits >= 0:
+             return [1 if i == "1" else 0 for i in bin(n)[2:].rjust(bits, "0")]
+         else:
+             return variations(range(2), n, repetition=True)
+     else:
+         if bits >= 0:
+             return bin(n)[2:].rjust(bits, "0")
+         else:
+             return (bin(i)[2:].rjust(n, "0") for i in range(2**n))
+
+
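The fixed-width branch of `ibin` is essentially a formatted `bin()` call; this standalone sketch shows the padding logic for the docstring's `n=123, bits=10` case:

```python
# Zero-pad bin(n) on the left to the requested width, then map each
# character to an int (same expression as ibin's fixed-width branch).
def to_bits(n, bits):
    return [1 if c == "1" else 0 for c in bin(n)[2:].rjust(bits, "0")]


print(to_bits(123, 10))                      # small bits to the right
print("".join(map(str, to_bits(123, 10))))   # '0001111011'
```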
+ def variations(seq, n, repetition=False):
+     r"""Returns an iterator over the n-sized variations of ``seq`` (size N).
+     ``repetition`` controls whether items in ``seq`` can appear more than once.
+
+     Examples
+     ========
+
+     ``variations(seq, n)`` will return `\frac{N!}{(N - n)!}` permutations without
+     repetition of ``seq``'s elements:
+
+     >>> from sympy import variations
+     >>> list(variations([1, 2], 2))
+     [(1, 2), (2, 1)]
+
+     ``variations(seq, n, True)`` will return the `N^n` permutations obtained
+     by allowing repetition of elements:
+
+     >>> list(variations([1, 2], 2, repetition=True))
+     [(1, 1), (1, 2), (2, 1), (2, 2)]
+
+     If you ask for more items than are in the set you get the empty set unless
+     you allow repetitions:
+
+     >>> list(variations([0, 1], 3, repetition=False))
+     []
+     >>> list(variations([0, 1], 3, repetition=True))[:4]
+     [(0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 1, 1)]
+
+     .. seealso::
+
+        `itertools.permutations
+        <https://docs.python.org/3/library/itertools.html#itertools.permutations>`_,
+        `itertools.product
+        <https://docs.python.org/3/library/itertools.html#itertools.product>`_
+     """
+     if not repetition:
+         seq = tuple(seq)
+         if len(seq) < n:
+             return iter(())  # 0 length iterator
+         return permutations(seq, n)
+     else:
+         if n == 0:
+             return iter(((),))  # yields 1 empty tuple
+         else:
+             return product(seq, repeat=n)
+
+
+ def subsets(seq, k=None, repetition=False):
+     r"""Generates all `k`-subsets (combinations) from an `n`-element set, ``seq``.
+
+     A `k`-subset of an `n`-element set is any subset of length exactly `k`. The
+     number of `k`-subsets of an `n`-element set is given by ``binomial(n, k)``,
+     whereas there are `2^n` subsets all together. If `k` is ``None`` then all
+     `2^n` subsets will be returned from shortest to longest.
+
+     Examples
+     ========
+
+     >>> from sympy import subsets
+
+     ``subsets(seq, k)`` will return the
+     `\frac{n!}{k!(n - k)!}` `k`-subsets (combinations)
+     without repetition, i.e. once an item has been removed, it can no
+     longer be "taken":
+
+     >>> list(subsets([1, 2], 2))
+     [(1, 2)]
+     >>> list(subsets([1, 2]))
+     [(), (1,), (2,), (1, 2)]
+     >>> list(subsets([1, 2, 3], 2))
+     [(1, 2), (1, 3), (2, 3)]
+
+     ``subsets(seq, k, repetition=True)`` will return the
+     `\frac{(n - 1 + k)!}{k!(n - 1)!}`
+     combinations *with* repetition:
+
+     >>> list(subsets([1, 2], 2, repetition=True))
+     [(1, 1), (1, 2), (2, 2)]
+
+     If you ask for more items than are in the set you get the empty set unless
+     you allow repetitions:
+
+     >>> list(subsets([0, 1], 3, repetition=False))
+     []
+     >>> list(subsets([0, 1], 3, repetition=True))
+     [(0, 0, 0), (0, 0, 1), (0, 1, 1), (1, 1, 1)]
+
+     """
+     if k is None:
+         if not repetition:
+             return chain.from_iterable((combinations(seq, k)
+                                         for k in range(len(seq) + 1)))
+         else:
+             return chain.from_iterable((combinations_with_replacement(seq, k)
+                                         for k in range(len(seq) + 1)))
+     else:
+         if not repetition:
+             return combinations(seq, k)
+         else:
+             return combinations_with_replacement(seq, k)
+
+
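The counts quoted in the docstring above can be sanity-checked directly against the `itertools` primitives `subsets` delegates to: `C(n, k)` combinations without repetition and `C(n - 1 + k, k)` with repetition.

```python
from itertools import combinations, combinations_with_replacement
from math import comb

# Check the binomial formulas from the subsets docstring for n=5, k=3.
n, k = 5, 3
seq = range(n)
assert len(list(combinations(seq, k))) == comb(n, k)
assert len(list(combinations_with_replacement(seq, k))) == comb(n - 1 + k, k)
print(comb(n, k), comb(n - 1 + k, k))  # 10 35
```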
+ def filter_symbols(iterator, exclude):
+     """
+     Only yield elements from `iterator` that do not occur in `exclude`.
+
+     Parameters
+     ==========
+
+     iterator : iterable
+         iterator to take elements from
+
+     exclude : iterable
+         elements to exclude
+
+     Returns
+     =======
+
+     iterator : iterator
+         filtered iterator
+     """
+     exclude = set(exclude)
+     for s in iterator:
+         if s not in exclude:
+             yield s
+
+
+ def numbered_symbols(prefix='x', cls=None, start=0, exclude=(), *args, **assumptions):
+     """
+     Generate an infinite stream of Symbols consisting of a prefix and
+     increasing subscripts provided that they do not occur in ``exclude``.
+
+     Parameters
+     ==========
+
+     prefix : str, optional
+         The prefix to use. By default, this function will generate symbols of
+         the form "x0", "x1", etc.
+
+     cls : class, optional
+         The class to use. By default, it uses ``Symbol``, but you can also use ``Wild``
+         or ``Dummy``.
+
+     start : int, optional
+         The start number. By default, it is 0.
+
+     Returns
+     =======
+
+     sym : Symbol
+         The subscripted symbols.
+     """
+     exclude = set(exclude or [])
+     if cls is None:
+         # We can't just make the default cls=Symbol because it isn't
+         # imported yet.
+         from sympy.core import Symbol
+         cls = Symbol
+
+     while True:
+         name = '%s%s' % (prefix, start)
+         s = cls(name, *args, **assumptions)
+         if s not in exclude:
+             yield s
+         start += 1
+
+
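The `numbered_symbols` loop can be sketched at the string level without SymPy (the name `numbered_names` is hypothetical): an infinite generator of `prefix0, prefix1, ...` that skips anything in `exclude`.

```python
from itertools import islice


def numbered_names(prefix="x", start=0, exclude=()):
    # String-level sketch of the numbered_symbols loop: an infinite
    # stream of numbered names, skipping anything in exclude.
    exclude = set(exclude)
    while True:
        name = "%s%s" % (prefix, start)
        if name not in exclude:
            yield name
        start += 1


print(list(islice(numbered_names(exclude={"x1"}), 4)))  # ['x0', 'x2', 'x3', 'x4']
```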
+ def capture(func):
+     """Return the printed output of func().
+
+     ``func`` should be a function without arguments that produces output with
+     print statements.
+
+     >>> from sympy.utilities.iterables import capture
+     >>> from sympy import pprint
+     >>> from sympy.abc import x
+     >>> def foo():
+     ...     print('hello world!')
+     ...
+     >>> 'hello' in capture(foo)  # foo, not foo()
+     True
+     >>> capture(lambda: pprint(2/x))
+     '2\\n-\\nx\\n'
+
+     """
+     from io import StringIO
+     import sys
+
+     stdout = sys.stdout
+     sys.stdout = file = StringIO()
+     try:
+         func()
+     finally:
+         sys.stdout = stdout
+     return file.getvalue()
+
+
579
+ def sift(seq, keyfunc, binary=False):
+     """
+     Sift the sequence, ``seq``, according to ``keyfunc``.
+
+     Returns
+     =======
+
+     When ``binary`` is ``False`` (default), the output is a dictionary
+     where elements of ``seq`` are stored in a list keyed to the value
+     of keyfunc for that element. If ``binary`` is True then a tuple
+     with lists ``T`` and ``F`` are returned where ``T`` is a list
+     containing elements of seq for which ``keyfunc`` was ``True`` and
+     ``F`` containing those elements for which ``keyfunc`` was ``False``;
+     a ValueError is raised if the ``keyfunc`` is not binary.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities import sift
+     >>> from sympy.abc import x, y
+     >>> from sympy import sqrt, exp, pi, Tuple
+
+     >>> sift(range(5), lambda x: x % 2)
+     {0: [0, 2, 4], 1: [1, 3]}
+
+     sift() returns a defaultdict() object, so any key that has no matches will
+     give [].
+
+     >>> sift([x], lambda x: x.is_commutative)
+     {True: [x]}
+     >>> _[False]
+     []
+
+     Sometimes you will not know how many keys you will get:
+
+     >>> sift([sqrt(x), exp(x), (y**x)**2],
+     ...      lambda x: x.as_base_exp()[0])
+     {E: [exp(x)], x: [sqrt(x)], y: [y**(2*x)]}
+
+     Sometimes you expect the results to be binary; the
+     results can be unpacked by setting ``binary`` to True:
+
+     >>> sift(range(4), lambda x: x % 2, binary=True)
+     ([1, 3], [0, 2])
+     >>> sift(Tuple(1, pi), lambda x: x.is_rational, binary=True)
+     ([1], [pi])
+
+     A ValueError is raised if the predicate was not actually binary
+     (which is a good test for the logic where sifting is used and
+     binary results were expected):
+
+     >>> unknown = exp(1) - pi  # the rationality of this is unknown
+     >>> args = Tuple(1, pi, unknown)
+     >>> sift(args, lambda x: x.is_rational, binary=True)
+     Traceback (most recent call last):
+     ...
+     ValueError: keyfunc gave non-binary output
+
+     The non-binary sifting shows that there were 3 keys generated:
+
+     >>> set(sift(args, lambda x: x.is_rational).keys())
+     {None, False, True}
+
+     If you need to sort the sifted items it might be better to use
+     ``ordered`` which can economically apply multiple sort keys
+     to a sequence while sorting.
+
+     See Also
+     ========
+
+     ordered
+
+     """
+     if not binary:
+         m = defaultdict(list)
+         for i in seq:
+             m[keyfunc(i)].append(i)
+         return m
+     sift = F, T = [], []
+     for i in seq:
+         try:
+             sift[keyfunc(i)].append(i)
+         except (IndexError, TypeError):
+             raise ValueError('keyfunc gave non-binary output')
+     return T, F
+
+
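The non-binary branch of ``sift`` is just the classic defaultdict grouping idiom; a minimal standalone sketch of the same idea (plain Python, no SymPy dependency; ``sift_sketch`` is an illustrative name, not part of the library) is:

```python
from collections import defaultdict

def sift_sketch(seq, keyfunc):
    # Group each element of seq under keyfunc(element),
    # mirroring sift(..., binary=False).
    groups = defaultdict(list)
    for item in seq:
        groups[keyfunc(item)].append(item)
    return groups

# Parity grouping, as in the docstring example:
groups = sift_sketch(range(5), lambda x: x % 2)
```

Because the result is a ``defaultdict(list)``, looking up an absent key yields ``[]`` rather than raising ``KeyError``.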
+ def take(iter, n):
+     """Return ``n`` items from ``iter`` iterator. """
+     return [value for _, value in zip(range(n), iter)]
+
+
+ def dict_merge(*dicts):
+     """Merge dictionaries into a single dictionary. """
+     merged = {}
+
+     for dict in dicts:
+         merged.update(dict)
+
+     return merged
+
+
+ def common_prefix(*seqs):
+     """Return the subsequence that is a common start of sequences in ``seqs``.
+
+     >>> from sympy.utilities.iterables import common_prefix
+     >>> common_prefix(list(range(3)))
+     [0, 1, 2]
+     >>> common_prefix(list(range(3)), list(range(4)))
+     [0, 1, 2]
+     >>> common_prefix([1, 2, 3], [1, 2, 5])
+     [1, 2]
+     >>> common_prefix([1, 2, 3], [1, 3, 5])
+     [1]
+     """
+     if not all(seqs):
+         return []
+     elif len(seqs) == 1:
+         return seqs[0]
+     i = 0
+     for i in range(min(len(s) for s in seqs)):
+         if not all(seqs[j][i] == seqs[0][i] for j in range(len(seqs))):
+             break
+     else:
+         i += 1
+     return seqs[0][:i]
+
+
+ def common_suffix(*seqs):
+     """Return the subsequence that is a common ending of sequences in ``seqs``.
+
+     >>> from sympy.utilities.iterables import common_suffix
+     >>> common_suffix(list(range(3)))
+     [0, 1, 2]
+     >>> common_suffix(list(range(3)), list(range(4)))
+     []
+     >>> common_suffix([1, 2, 3], [9, 2, 3])
+     [2, 3]
+     >>> common_suffix([1, 2, 3], [9, 7, 3])
+     [3]
+     """
+
+     if not all(seqs):
+         return []
+     elif len(seqs) == 1:
+         return seqs[0]
+     i = 0
+     for i in range(-1, -min(len(s) for s in seqs) - 1, -1):
+         if not all(seqs[j][i] == seqs[0][i] for j in range(len(seqs))):
+             break
+     else:
+         i -= 1
+     if i == -1:
+         return []
+     else:
+         return seqs[0][i + 1:]
+
+
+ def prefixes(seq):
+     """
+     Generate all prefixes of a sequence.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import prefixes
+
+     >>> list(prefixes([1,2,3,4]))
+     [[1], [1, 2], [1, 2, 3], [1, 2, 3, 4]]
+
+     """
+     n = len(seq)
+
+     for i in range(n):
+         yield seq[:i + 1]
+
+
+ def postfixes(seq):
+     """
+     Generate all postfixes of a sequence.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import postfixes
+
+     >>> list(postfixes([1,2,3,4]))
+     [[4], [3, 4], [2, 3, 4], [1, 2, 3, 4]]
+
+     """
+     n = len(seq)
+
+     for i in range(n):
+         yield seq[n - i - 1:]
+
+
+ def topological_sort(graph, key=None):
+     r"""
+     Topological sort of graph's vertices.
+
+     Parameters
+     ==========
+
+     graph : tuple[list, list[tuple[T, T]]
+         A tuple consisting of a list of vertices and a list of edges of
+         a graph to be sorted topologically.
+
+     key : callable[T] (optional)
+         Ordering key for vertices on the same level. By default the natural
+         (e.g. lexicographic) ordering is used (in this case the base type
+         must implement ordering relations).
+
+     Examples
+     ========
+
+     Consider a graph::
+
+         +---+     +---+       +---+
+         | 7 |\    | 5 |       | 3 |
+         +---+ \   +---+       +---+
+           |   _\___/  ____   _/ |
+           |  /   \___/    \ /   |
+           V  V            V V   |
+          +----+         +---+   |
+          | 11 |         | 8 |   |
+          +----+         +---+   |
+           | | \____   ___/ _    |
+           | \      \ /    / \   |
+           V  \      V V  /   V  V
+         +---+ \   +---+ |  +----+
+         | 2 |  |  | 9 | |  | 10 |
+         +---+  |  +---+ |  +----+
+                \________/
+
+     where vertices are integers. This graph can be encoded using
+     Python's elementary data structures as follows::
+
+         >>> V = [2, 3, 5, 7, 8, 9, 10, 11]
+         >>> E = [(7, 11), (7, 8), (5, 11), (3, 8), (3, 10),
+         ...      (11, 2), (11, 9), (11, 10), (8, 9)]
+
+     To compute a topological sort for graph ``(V, E)`` issue::
+
+         >>> from sympy.utilities.iterables import topological_sort
+
+         >>> topological_sort((V, E))
+         [3, 5, 7, 8, 11, 2, 9, 10]
+
+     If a specific tie-breaking approach is needed, use the ``key`` parameter::
+
+         >>> topological_sort((V, E), key=lambda v: -v)
+         [7, 5, 11, 3, 10, 8, 9, 2]
+
+     Only acyclic graphs can be sorted. If the input graph has a cycle,
+     then ``ValueError`` will be raised::
+
+         >>> topological_sort((V, E + [(10, 7)]))
+         Traceback (most recent call last):
+         ...
+         ValueError: cycle detected
+
+     References
+     ==========
+
+     .. [1] https://en.wikipedia.org/wiki/Topological_sorting
+
+     """
+     V, E = graph
+
+     L = []
+     S = set(V)
+     E = list(E)
+
+     for v, u in E:
+         S.discard(u)
+
+     if key is None:
+         def key(value):
+             return value
+
+     S = sorted(S, key=key, reverse=True)
+
+     while S:
+         node = S.pop()
+         L.append(node)
+
+         for u, v in list(E):
+             if u == node:
+                 E.remove((u, v))
+
+                 for _u, _v in E:
+                     if v == _v:
+                         break
+                 else:
+                     kv = key(v)
+
+                     for i, s in enumerate(S):
+                         ks = key(s)
+
+                         if kv > ks:
+                             S.insert(i, v)
+                             break
+                     else:
+                         S.append(v)
+
+     if E:
+         raise ValueError("cycle detected")
+     else:
+         return L
+
+
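The routine above maintains a sorted candidate list directly; the same ordering can be produced with Kahn's algorithm using in-degree counts and a min-heap in place of the ``key``-sorted list. A minimal standalone sketch (``kahn_toposort`` is an illustrative name, not SymPy API):

```python
import heapq

def kahn_toposort(V, E):
    # Kahn's algorithm: repeatedly emit the smallest vertex with
    # in-degree zero, then decrement the in-degree of its successors.
    indeg = {v: 0 for v in V}
    succ = {v: [] for v in V}
    for u, v in E:
        succ[u].append(v)
        indeg[v] += 1
    heap = [v for v in V if indeg[v] == 0]
    heapq.heapify(heap)
    out = []
    while heap:
        v = heapq.heappop(heap)
        out.append(v)
        for w in succ[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                heapq.heappush(heap, w)
    if len(out) != len(V):
        raise ValueError("cycle detected")
    return out
```

On the docstring's example graph this reproduces ``[3, 5, 7, 8, 11, 2, 9, 10]``, and a leftover edge set (some vertex never reaches in-degree zero) signals a cycle just as in the function above.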
+ def strongly_connected_components(G):
+     r"""
+     Strongly connected components of a directed graph in reverse topological
+     order.
+
+
+     Parameters
+     ==========
+
+     graph : tuple[list, list[tuple[T, T]]
+         A tuple consisting of a list of vertices and a list of edges of
+         a graph whose strongly connected components are to be found.
+
+
+     Examples
+     ========
+
+     Consider a directed graph (in dot notation)::
+
+         digraph {
+             A -> B
+             A -> C
+             B -> C
+             C -> B
+             B -> D
+         }
+
+     .. graphviz::
+
+         digraph {
+             A -> B
+             A -> C
+             B -> C
+             C -> B
+             B -> D
+         }
+
+     where vertices are the letters A, B, C and D. This graph can be encoded
+     using Python's elementary data structures as follows::
+
+         >>> V = ['A', 'B', 'C', 'D']
+         >>> E = [('A', 'B'), ('A', 'C'), ('B', 'C'), ('C', 'B'), ('B', 'D')]
+
+     The strongly connected components of this graph can be computed as
+
+         >>> from sympy.utilities.iterables import strongly_connected_components
+
+         >>> strongly_connected_components((V, E))
+         [['D'], ['B', 'C'], ['A']]
+
+     This also gives the components in reverse topological order.
+
+     Since the subgraph containing B and C has a cycle they must be together in
+     a strongly connected component. A and D are connected to the rest of the
+     graph but not in a cyclic manner so they appear as their own strongly
+     connected components.
+
+
+     Notes
+     =====
+
+     The vertices of the graph must be hashable for the data structures used.
+     If the vertices are unhashable replace them with integer indices.
+
+     This function uses Tarjan's algorithm to compute the strongly connected
+     components in `O(|V|+|E|)` (linear) time.
+
+
+     References
+     ==========
+
+     .. [1] https://en.wikipedia.org/wiki/Strongly_connected_component
+     .. [2] https://en.wikipedia.org/wiki/Tarjan%27s_strongly_connected_components_algorithm
+
+
+     See Also
+     ========
+
+     sympy.utilities.iterables.connected_components
+
+     """
+     # Map from a vertex to its neighbours
+     V, E = G
+     Gmap = {vi: [] for vi in V}
+     for v1, v2 in E:
+         Gmap[v1].append(v2)
+     return _strongly_connected_components(V, Gmap)
+
+
+ def _strongly_connected_components(V, Gmap):
+     """More efficient internal routine for strongly_connected_components"""
+     #
+     # Here V is an iterable of vertices and Gmap is a dict mapping each vertex
+     # to a list of neighbours e.g.:
+     #
+     #     V = [0, 1, 2, 3]
+     #     Gmap = {0: [2, 3], 1: [0]}
+     #
+     # For a large graph these data structures can often be created more
+     # efficiently than those expected by strongly_connected_components() which
+     # in this case would be
+     #
+     #     V = [0, 1, 2, 3]
+     #     E = [(0, 2), (0, 3), (1, 0)]
+     #
+     # XXX: Maybe this should be the recommended function to use instead...
+     #
+
+     # Non-recursive Tarjan's algorithm:
+     lowlink = {}
+     indices = {}
+     stack = OrderedDict()
+     callstack = []
+     components = []
+     nomore = object()
+
+     def start(v):
+         index = len(stack)
+         indices[v] = lowlink[v] = index
+         stack[v] = None
+         callstack.append((v, iter(Gmap[v])))
+
+     def finish(v1):
+         # Finished a component?
+         if lowlink[v1] == indices[v1]:
+             component = [stack.popitem()[0]]
+             while component[-1] is not v1:
+                 component.append(stack.popitem()[0])
+             components.append(component[::-1])
+         v2, _ = callstack.pop()
+         if callstack:
+             v1, _ = callstack[-1]
+             lowlink[v1] = min(lowlink[v1], lowlink[v2])
+
+     for v in V:
+         if v in indices:
+             continue
+         start(v)
+         while callstack:
+             v1, it1 = callstack[-1]
+             v2 = next(it1, nomore)
+             # Finished children of v1?
+             if v2 is nomore:
+                 finish(v1)
+             # Recurse on v2
+             elif v2 not in indices:
+                 start(v2)
+             elif v2 in stack:
+                 lowlink[v1] = min(lowlink[v1], indices[v2])
+
+     # Reverse topological sort order:
+     return components
+
+
+ def connected_components(G):
+     r"""
+     Connected components of an undirected graph or weakly connected components
+     of a directed graph.
+
+
+     Parameters
+     ==========
+
+     graph : tuple[list, list[tuple[T, T]]
+         A tuple consisting of a list of vertices and a list of edges of
+         a graph whose connected components are to be found.
+
+
+     Examples
+     ========
+
+
+     Given an undirected graph::
+
+         graph {
+             A -- B
+             C -- D
+         }
+
+     .. graphviz::
+
+         graph {
+             A -- B
+             C -- D
+         }
+
+     We can find the connected components using this function if we include
+     each edge in both directions::
+
+         >>> from sympy.utilities.iterables import connected_components
+
+         >>> V = ['A', 'B', 'C', 'D']
+         >>> E = [('A', 'B'), ('B', 'A'), ('C', 'D'), ('D', 'C')]
+         >>> connected_components((V, E))
+         [['A', 'B'], ['C', 'D']]
+
+     The weakly connected components of a directed graph can be found the
+     same way.
+
+
+     Notes
+     =====
+
+     The vertices of the graph must be hashable for the data structures used.
+     If the vertices are unhashable replace them with integer indices.
+
+     This function uses Tarjan's algorithm to compute the connected components
+     in `O(|V|+|E|)` (linear) time.
+
+
+     References
+     ==========
+
+     .. [1] https://en.wikipedia.org/wiki/Component_%28graph_theory%29
+     .. [2] https://en.wikipedia.org/wiki/Tarjan%27s_strongly_connected_components_algorithm
+
+
+     See Also
+     ========
+
+     sympy.utilities.iterables.strongly_connected_components
+
+     """
+     # Duplicate edges both ways so that the graph is effectively undirected
+     # and return the strongly connected components:
+     V, E = G
+     E_undirected = []
+     for v1, v2 in E:
+         E_undirected.extend([(v1, v2), (v2, v1)])
+     return strongly_connected_components((V, E_undirected))
+
+
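For undirected inputs the same components can also be found without Tarjan's algorithm, using a disjoint-set (union-find) structure. A minimal standalone sketch (``connected_components_uf`` is an illustrative name, not SymPy API; component order follows the order roots are first seen in ``V``):

```python
def connected_components_uf(V, E):
    # Union-find with path compression; returns components as sorted lists.
    parent = {v: v for v in V}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path compression
            v = parent[v]
        return v

    for u, v in E:
        parent[find(u)] = find(v)  # union the two trees

    comps = {}
    for v in V:
        comps.setdefault(find(v), []).append(v)
    return [sorted(c) for c in comps.values()]
```

Unlike ``connected_components`` above, this does not require duplicating each edge in both directions, since union is symmetric.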
+ def rotate_left(x, y):
+     """
+     Left rotates a list x by the number of steps specified
+     in y.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import rotate_left
+     >>> a = [0, 1, 2]
+     >>> rotate_left(a, 1)
+     [1, 2, 0]
+     """
+     if len(x) == 0:
+         return []
+     y = y % len(x)
+     return x[y:] + x[:y]
+
+
+ def rotate_right(x, y):
+     """
+     Right rotates a list x by the number of steps specified
+     in y.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import rotate_right
+     >>> a = [0, 1, 2]
+     >>> rotate_right(a, 1)
+     [2, 0, 1]
+     """
+     if len(x) == 0:
+         return []
+     y = len(x) - y % len(x)
+     return x[y:] + x[:y]
+
+
+ def least_rotation(x, key=None):
+     '''
+     Returns the number of steps of left rotation required to
+     obtain lexicographically minimal string/list/tuple, etc.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import least_rotation, rotate_left
+     >>> a = [3, 1, 5, 1, 2]
+     >>> least_rotation(a)
+     3
+     >>> rotate_left(a, _)
+     [1, 2, 3, 1, 5]
+
+     References
+     ==========
+
+     .. [1] https://en.wikipedia.org/wiki/Lexicographically_minimal_string_rotation
+
+     '''
+     from sympy.functions.elementary.miscellaneous import Id
+     if key is None:
+         key = Id
+     S = x + x  # Concatenate string to itself to avoid modular arithmetic
+     f = [-1] * len(S)  # Failure function
+     k = 0  # Least rotation of string found so far
+     for j in range(1, len(S)):
+         sj = S[j]
+         i = f[j-k-1]
+         while i != -1 and sj != S[k+i+1]:
+             if key(sj) < key(S[k+i+1]):
+                 k = j-i-1
+             i = f[i]
+         if sj != S[k+i+1]:
+             if key(sj) < key(S[k]):
+                 k = j
+             f[j-k] = -1
+         else:
+             f[j-k] = i+1
+     return k
+
+
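Booth's algorithm above runs in linear time; for small inputs the answer can be cross-checked against the obvious quadratic definition, i.e. the rotation index minimizing the rotated sequence. A standalone sketch (``least_rotation_naive`` is an illustrative name, not SymPy API):

```python
def least_rotation_naive(x):
    # Index of the lexicographically smallest left rotation,
    # found by materializing and comparing every rotation.
    n = len(x)
    return min(range(n), key=lambda i: x[i:] + x[:i])
```

On the docstring's example ``[3, 1, 5, 1, 2]`` this also returns 3, since ``[1, 2, 3, 1, 5]`` is the smallest of the five rotations.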
+ def multiset_combinations(m, n, g=None):
+     """
+     Return the unique combinations of size ``n`` from multiset ``m``.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import multiset_combinations
+     >>> from itertools import combinations
+     >>> [''.join(i) for i in multiset_combinations('baby', 3)]
+     ['abb', 'aby', 'bby']
+
+     >>> def count(f, s): return len(list(f(s, 3)))
+
+     The number of combinations depends on the number of letters; the
+     number of unique combinations depends on how the letters are
+     repeated.
+
+     >>> s1 = 'abracadabra'
+     >>> s2 = 'banana tree'
+     >>> count(combinations, s1), count(multiset_combinations, s1)
+     (165, 23)
+     >>> count(combinations, s2), count(multiset_combinations, s2)
+     (165, 54)
+
+     """
+     from sympy.core.sorting import ordered
+     if g is None:
+         if isinstance(m, dict):
+             if any(as_int(v) < 0 for v in m.values()):
+                 raise ValueError('counts cannot be negative')
+             N = sum(m.values())
+             if n > N:
+                 return
+             g = [[k, m[k]] for k in ordered(m)]
+         else:
+             m = list(m)
+             N = len(m)
+             if n > N:
+                 return
+             try:
+                 m = multiset(m)
+                 g = [(k, m[k]) for k in ordered(m)]
+             except TypeError:
+                 m = list(ordered(m))
+                 g = [list(i) for i in group(m, multiple=False)]
+         del m
+     else:
+         # not checking counts since g is intended for internal use
+         N = sum(v for k, v in g)
+     if n > N or not n:
+         yield []
+     else:
+         for i, (k, v) in enumerate(g):
+             if v >= n:
+                 yield [k]*n
+                 v = n - 1
+             for v in range(min(n, v), 0, -1):
+                 for j in multiset_combinations(None, n - v, g[i + 1:]):
+                     rv = [k]*v + j
+                     if len(rv) == n:
+                         yield rv
+
+ def multiset_permutations(m, size=None, g=None):
+     """
+     Return the unique permutations of multiset ``m``.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import multiset_permutations
+     >>> from sympy import factorial
+     >>> [''.join(i) for i in multiset_permutations('aab')]
+     ['aab', 'aba', 'baa']
+     >>> factorial(len('banana'))
+     720
+     >>> len(list(multiset_permutations('banana')))
+     60
+     """
+     from sympy.core.sorting import ordered
+     if g is None:
+         if isinstance(m, dict):
+             if any(as_int(v) < 0 for v in m.values()):
+                 raise ValueError('counts cannot be negative')
+             g = [[k, m[k]] for k in ordered(m)]
+         else:
+             m = list(ordered(m))
+             g = [list(i) for i in group(m, multiple=False)]
+         del m
+     do = [gi for gi in g if gi[1] > 0]
+     SUM = sum([gi[1] for gi in do])
+     if not do or size is not None and (size > SUM or size < 1):
+         if not do and size is None or size == 0:
+             yield []
+         return
+     elif size == 1:
+         for k, v in do:
+             yield [k]
+     elif len(do) == 1:
+         k, v = do[0]
+         v = v if size is None else (size if size <= v else 0)
+         yield [k for i in range(v)]
+     elif all(v == 1 for k, v in do):
+         for p in permutations([k for k, v in do], size):
+             yield list(p)
+     else:
+         size = size if size is not None else SUM
+         for i, (k, v) in enumerate(do):
+             do[i][1] -= 1
+             for j in multiset_permutations(None, size - 1, do):
+                 if j:
+                     yield [k] + j
+             do[i][1] += 1
+
+
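For small inputs the counts produced by ``multiset_permutations`` can be sanity-checked against the naive approach of deduplicating ``itertools.permutations``, which enumerates all n! orderings and is viable only for tiny inputs (``multiset_permutations_naive`` is an illustrative name, not SymPy API):

```python
from itertools import permutations

def multiset_permutations_naive(m):
    # Generate every permutation, deduplicate via a set, and sort.
    # Exponential in len(m); for checking small cases only.
    return sorted(set(permutations(m)))
```

For ``'banana'`` this yields 6!/(3!·2!) = 60 distinct orderings, matching the docstring count.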
+ def _partition(seq, vector, m=None):
+     """
+     Return the partition of seq as specified by the partition vector.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import _partition
+     >>> _partition('abcde', [1, 0, 1, 2, 0])
+     [['b', 'e'], ['a', 'c'], ['d']]
+
+     Specifying the number of bins in the partition is optional:
+
+     >>> _partition('abcde', [1, 0, 1, 2, 0], 3)
+     [['b', 'e'], ['a', 'c'], ['d']]
+
+     The output of _set_partitions can be passed as follows:
+
+     >>> output = (3, [1, 0, 1, 2, 0])
+     >>> _partition('abcde', *output)
+     [['b', 'e'], ['a', 'c'], ['d']]
+
+     See Also
+     ========
+
+     combinatorics.partitions.Partition.from_rgs
+
+     """
+     if m is None:
+         m = max(vector) + 1
+     elif isinstance(vector, int):  # entered as m, vector
+         vector, m = m, vector
+     p = [[] for i in range(m)]
+     for i, v in enumerate(vector):
+         p[v].append(seq[i])
+     return p
+
+
+ def _set_partitions(n):
+     """Cycle through all partitions of n elements, yielding the
+     current number of partitions, ``m``, and a mutable list, ``q``
+     such that ``element[i]`` is in part ``q[i]`` of the partition.
+
+     NOTE: ``q`` is modified in place and generally should not be changed
+     between function calls.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.iterables import _set_partitions, _partition
+     >>> for m, q in _set_partitions(3):
+     ...     print('%s %s %s' % (m, q, _partition('abc', q, m)))
+     1 [0, 0, 0] [['a', 'b', 'c']]
+     2 [0, 0, 1] [['a', 'b'], ['c']]
+     2 [0, 1, 0] [['a', 'c'], ['b']]
+     2 [0, 1, 1] [['a'], ['b', 'c']]
+     3 [0, 1, 2] [['a'], ['b'], ['c']]
+
+     Notes
+     =====
+
+     This algorithm is similar to, and solves the same problem as,
+     Algorithm 7.2.1.5H, from volume 4A of Knuth's The Art of Computer
+     Programming. Knuth uses the term "restricted growth string" where
+     this code refers to a "partition vector". In each case, the meaning is
+     the same: the value in the ith element of the vector specifies to
+     which part the ith set element is to be assigned.
+
+     At the lowest level, this code implements an n-digit big-endian
+     counter (stored in the array q) which is incremented (with carries) to
+     get the next partition in the sequence. A special twist is that a
+     digit is constrained to be at most one greater than the maximum of all
+     the digits to the left of it. The array p maintains this maximum, so
+     that the code can efficiently decide when a digit can be incremented
+     in place or whether it needs to be reset to 0 and trigger a carry to
+     the next digit. The enumeration starts with all the digits 0 (which
+     corresponds to all the set elements being assigned to the same 0th
+     part), and ends with 0123...n, which corresponds to each set element
+     being assigned to a different, singleton, part.
+
+     This routine was rewritten to use 0-based lists while trying to
+     preserve the beauty and efficiency of the original algorithm.
+
+     References
+     ==========
+
+     .. [1] Nijenhuis, Albert and Wilf, Herbert. (1978) Combinatorial Algorithms,
+         2nd Ed, p 91, algorithm "nexequ". Available online from
+         https://www.math.upenn.edu/~wilf/website/CombAlgDownld.html (viewed
+         November 17, 2012).
+
+     """
+     p = [0]*n
+     q = [0]*n
+     nc = 1
+     yield nc, q
+     while nc != n:
+         m = n
+         while 1:
+             m -= 1
+             i = q[m]
+             if p[i] != 1:
+                 break
+             q[m] = 0
+         i += 1
+         q[m] = i
+         m += 1
+         nc += m - n
+         p[0] += n - m
+         if i == nc:
+             p[nc] = 0
+             nc += 1
+         p[i - 1] -= 1
+         p[i] += 1
+         yield nc, q
+
+
+
1434
+ def multiset_partitions(multiset, m=None):
1435
+ """
1436
+ Return unique partitions of the given multiset (in list form).
1437
+ If ``m`` is None, all multisets will be returned, otherwise only
1438
+ partitions with ``m`` parts will be returned.
1439
+
1440
+ If ``multiset`` is an integer, a range [0, 1, ..., multiset - 1]
1441
+ will be supplied.
1442
+
1443
+ Examples
1444
+ ========
1445
+
1446
+ >>> from sympy.utilities.iterables import multiset_partitions
1447
+ >>> list(multiset_partitions([1, 2, 3, 4], 2))
1448
+ [[[1, 2, 3], [4]], [[1, 2, 4], [3]], [[1, 2], [3, 4]],
1449
+ [[1, 3, 4], [2]], [[1, 3], [2, 4]], [[1, 4], [2, 3]],
1450
+ [[1], [2, 3, 4]]]
1451
+ >>> list(multiset_partitions([1, 2, 3, 4], 1))
1452
+ [[[1, 2, 3, 4]]]
1453
+
1454
+ Only unique partitions are returned and these will be returned in a
1455
+ canonical order regardless of the order of the input:
1456
+
1457
+ >>> a = [1, 2, 2, 1]
1458
+ >>> ans = list(multiset_partitions(a, 2))
1459
+ >>> a.sort()
1460
+ >>> list(multiset_partitions(a, 2)) == ans
1461
+ True
1462
+ >>> a = range(3, 1, -1)
1463
+ >>> (list(multiset_partitions(a)) ==
1464
+ ... list(multiset_partitions(sorted(a))))
1465
+ True
1466
+
1467
+ If m is omitted then all partitions will be returned:
1468
+
1469
+ >>> list(multiset_partitions([1, 1, 2]))
1470
+ [[[1, 1, 2]], [[1, 1], [2]], [[1, 2], [1]], [[1], [1], [2]]]
1471
+ >>> list(multiset_partitions([1]*3))
1472
+ [[[1, 1, 1]], [[1], [1, 1]], [[1], [1], [1]]]
1473
+
1474
+ Counting
1475
+ ========
1476
+
1477
+ The number of partitions of a set is given by the bell number:
1478
+
1479
+ >>> from sympy import bell
1480
+ >>> len(list(multiset_partitions(5))) == bell(5) == 52
1481
+ True
1482
+
1483
+ The number of partitions of length k from a set of size n is given by the
1484
+ Stirling Number of the 2nd kind:
1485
+
1486
+ >>> from sympy.functions.combinatorial.numbers import stirling
1487
+ >>> stirling(5, 2) == len(list(multiset_partitions(5, 2))) == 15
1488
+ True
1489
+
1490
+ These comments on counting apply to *sets*, not multisets.
1491
+
1492
+ Notes
1493
+ =====
1494
+
1495
+ When all the elements are the same in the multiset, the order
1496
+ of the returned partitions is determined by the ``partitions``
1497
+ routine. If one is counting partitions then it is better to use
1498
+ the ``nT`` function.
1499
+
1500
+ See Also
1501
+ ========
1502
+
1503
+ partitions
1504
+ sympy.combinatorics.partitions.Partition
1505
+ sympy.combinatorics.partitions.IntegerPartition
1506
+ sympy.functions.combinatorial.numbers.nT
1507
+
1508
+ """
1509
+ # This function looks at the supplied input and dispatches to
1510
+ # several special-case routines as they apply.
1511
+ if isinstance(multiset, int):
1512
+ n = multiset
1513
+ if m and m > n:
1514
+ return
1515
+ multiset = list(range(n))
1516
+ if m == 1:
1517
+ yield [multiset[:]]
1518
+ return
1519
+
1520
+ # If m is not None, it can sometimes be faster to use
1521
+ # MultisetPartitionTraverser.enum_range() even for inputs
1522
+ # which are sets. Since the _set_partitions code is quite
1523
+ # fast, this is only advantageous when the overall set
1524
+ # partitions outnumber those with the desired number of parts
1525
+ # by a large factor. (At least 60.) Such a switch is not
1526
+ # currently implemented.
1527
+ for nc, q in _set_partitions(n):
1528
+ if m is None or nc == m:
1529
+ rv = [[] for i in range(nc)]
1530
+ for i in range(n):
1531
+ rv[q[i]].append(multiset[i])
1532
+ yield rv
1533
+ return
1534
+
1535
+ if len(multiset) == 1 and isinstance(multiset, str):
1536
+ multiset = [multiset]
1537
+
1538
+ if not has_variety(multiset):
1539
+ # Only one component, repeated n times. The resulting
1540
+ # partitions correspond to partitions of integer n.
1541
+ n = len(multiset)
1542
+ if m and m > n:
1543
+ return
1544
+ if m == 1:
1545
+ yield [multiset[:]]
1546
+ return
1547
+ x = multiset[:1]
1548
+ for size, p in partitions(n, m, size=True):
1549
+ if m is None or size == m:
1550
+ rv = []
1551
+ for k in sorted(p):
1552
+ rv.extend([x*k]*p[k])
1553
+ yield rv
1554
+ else:
1555
+ from sympy.core.sorting import ordered
1556
+ multiset = list(ordered(multiset))
1557
+ n = len(multiset)
1558
+ if m and m > n:
1559
+ return
1560
+ if m == 1:
1561
+ yield [multiset[:]]
1562
+ return
1563
+
1564
+ # Split the information of the multiset into two lists -
1565
+ # one of the elements themselves, and one (of the same length)
1566
+ # giving the number of repeats for the corresponding element.
1567
+ elements, multiplicities = zip(*group(multiset, False))
1568
+
1569
+ if len(elements) < len(multiset):
1570
+ # General case - multiset with more than one distinct element
1571
+ # and at least one element repeated more than once.
1572
+ if m:
1573
+ mpt = MultisetPartitionTraverser()
1574
+ for state in mpt.enum_range(multiplicities, m-1, m):
1575
+ yield list_visitor(state, elements)
1576
+ else:
1577
+ for state in multiset_partitions_taocp(multiplicities):
1578
+ yield list_visitor(state, elements)
1579
+ else:
1580
+ # Set partitions case - no repeated elements. Pretty much
1581
+ # same as int argument case above, with same possible, but
1582
+ # currently unimplemented optimization for some cases when
1583
+ # m is not None
1584
+ for nc, q in _set_partitions(n):
1585
+ if m is None or nc == m:
1586
+ rv = [[] for i in range(nc)]
1587
+ for i in range(n):
1588
+ rv[q[i]].append(i)
1589
+ yield [[multiset[j] for j in i] for i in rv]
1590
+
1591
+
1592
+ def partitions(n, m=None, k=None, size=False):
1593
+ """Generate all partitions of positive integer, n.
1594
+
1595
+ Parameters
1596
+ ==========
1597
+
1598
+ m : integer (default gives partitions of all sizes)
1599
+ limits number of parts in partition (mnemonic: m, maximum parts)
1600
+ k : integer (default gives partitions number from 1 through n)
1601
+ limits the numbers that are kept in the partition (mnemonic: k, keys)
1602
+ size : bool (default False, only partition is returned)
1603
+ when ``True`` then (M, P) is returned where M is the sum of the
1604
+ multiplicities and P is the generated partition.
1605
+
1606
+ Each partition is represented as a dictionary, mapping an integer
1607
+ to the number of copies of that integer in the partition. For example,
1608
+ the first partition of 4 returned is {4: 1}, "4: one of them".
1609
+
1610
+ Examples
1611
+ ========
1612
+
1613
+ >>> from sympy.utilities.iterables import partitions
1614
+
1615
+ The numbers appearing in the partition (the key of the returned dict)
1616
+ are limited with k:
1617
+
1618
+ >>> for p in partitions(6, k=2): # doctest: +SKIP
1619
+ ... print(p)
1620
+ {2: 3}
1621
+ {1: 2, 2: 2}
1622
+ {1: 4, 2: 1}
1623
+ {1: 6}
1624
+
1625
+ The maximum number of parts in the partition (the sum of the values in
1626
+ the returned dict) are limited with m (default value, None, gives
1627
+ partitions from 1 through n):
1628
+
1629
+ >>> for p in partitions(6, m=2): # doctest: +SKIP
1630
+ ... print(p)
1631
+ ...
1632
+ {6: 1}
1633
+ {1: 1, 5: 1}
1634
+ {2: 1, 4: 1}
1635
+ {3: 2}
1636
+
1637
+ References
1638
+ ==========
1639
+
1640
+ .. [1] modified from Tim Peter's version to allow for k and m values:
1641
+ https://code.activestate.com/recipes/218332-generator-for-integer-partitions/
1642
+
1643
+ See Also
1644
+ ========
1645
+
1646
+ sympy.combinatorics.partitions.Partition
1647
+ sympy.combinatorics.partitions.IntegerPartition
1648
+
1649
+ """
1650
+ if (n <= 0 or
1651
+ m is not None and m < 1 or
1652
+ k is not None and k < 1 or
1653
+ m and k and m*k < n):
1654
+ # the empty set is the only way to handle these inputs
1655
+ # and returning {} to represent it is consistent with
1656
+ # the counting convention, e.g. nT(0) == 1.
1657
+ if size:
1658
+ yield 0, {}
1659
+ else:
1660
+ yield {}
1661
+ return
1662
+
1663
+ if m is None:
1664
+ m = n
1665
+ else:
1666
+ m = min(m, n)
1667
+ k = min(k or n, n)
1668
+
1669
+ n, m, k = as_int(n), as_int(m), as_int(k)
1670
+ q, r = divmod(n, k)
1671
+ ms = {k: q}
1672
+ keys = [k] # ms.keys(), from largest to smallest
1673
+ if r:
1674
+ ms[r] = 1
1675
+ keys.append(r)
1676
+ room = m - q - bool(r)
1677
+ if size:
1678
+ yield sum(ms.values()), ms.copy()
1679
+ else:
1680
+ yield ms.copy()
1681
+
1682
+ while keys != [1]:
1683
+ # Reuse any 1's.
1684
+ if keys[-1] == 1:
1685
+ del keys[-1]
1686
+ reuse = ms.pop(1)
1687
+ room += reuse
1688
+ else:
1689
+ reuse = 0
1690
+
1691
+ while 1:
1692
+ # Let i be the smallest key larger than 1. Reuse one
1693
+ # instance of i.
1694
+ i = keys[-1]
1695
+ newcount = ms[i] = ms[i] - 1
1696
+ reuse += i
1697
+ if newcount == 0:
1698
+ del keys[-1], ms[i]
1699
+ room += 1
1700
+
1701
+ # Break the remainder into pieces of size i-1.
1702
+ i -= 1
1703
+ q, r = divmod(reuse, i)
1704
+ need = q + bool(r)
1705
+ if need > room:
1706
+ if not keys:
1707
+ return
1708
+ continue
1709
+
1710
+ ms[i] = q
1711
+ keys.append(i)
1712
+ if r:
1713
+ ms[r] = 1
1714
+ keys.append(r)
1715
+ break
1716
+ room -= need
1717
+ if size:
1718
+ yield sum(ms.values()), ms.copy()
1719
+ else:
1720
+ yield ms.copy()
1721
+
1722
+
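The multiplicity-dict convention and the ``k``/``m`` restrictions above can be cross-checked with a much simpler (and slower) recursive sketch; ``gen_partitions`` is a hypothetical stand-in for testing, not the library routine:

```python
def gen_partitions(n, m=None, k=None):
    # Yield partitions of n as {part: multiplicity} dicts with parts
    # no larger than k and at most m parts (both optional, as above).
    k = k or n
    m = m or n

    def rec(left, largest, parts_left):
        if left == 0:
            yield {}
            return
        if parts_left == 0 or largest == 0:
            return
        # choose the next (largest remaining) part in nonincreasing order
        for p in range(min(left, largest), 0, -1):
            for rest in rec(left - p, p, parts_left - 1):
                d = dict(rest)
                d[p] = d.get(p, 0) + 1
                yield d

    yield from rec(n, k, m)
```

The counts agree with the doctests above: four partitions of 6 with parts at most 2, four with at most 2 parts, and seven unrestricted partitions of 5.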
1723
+ def ordered_partitions(n, m=None, sort=True):
1724
+ """Generates ordered partitions of integer ``n``.
1725
+
1726
+ Parameters
1727
+ ==========
1728
+
1729
+ m : integer (default None)
1730
+ The default value gives partitions of all sizes; otherwise only
1731
+ those with size m. In addition, if ``m`` is not None then
1732
+ partitions are generated *in place* (see examples).
1733
+ sort : bool (default True)
1734
+ Controls whether partitions are
1735
+ returned in sorted order when ``m`` is not None; when False,
1736
+ the partitions are returned as fast as possible with elements
1737
+ sorted, but when m|n the partitions will not be in
1738
+ ascending lexicographical order.
1739
+
1740
+ Examples
1741
+ ========
1742
+
1743
+ >>> from sympy.utilities.iterables import ordered_partitions
1744
+
1745
+ All partitions of 5 in ascending lexicographical order:
1746
+
1747
+ >>> for p in ordered_partitions(5):
1748
+ ... print(p)
1749
+ [1, 1, 1, 1, 1]
1750
+ [1, 1, 1, 2]
1751
+ [1, 1, 3]
1752
+ [1, 2, 2]
1753
+ [1, 4]
1754
+ [2, 3]
1755
+ [5]
1756
+
1757
+ Only partitions of 5 with two parts:
1758
+
1759
+ >>> for p in ordered_partitions(5, 2):
1760
+ ... print(p)
1761
+ [1, 4]
1762
+ [2, 3]
1763
+
1764
+ When ``m`` is given, a given list object will be used more than
1765
+ once for speed reasons, so you will not see the correct partitions
1766
+ unless you make a copy of each as it is generated:
1767
+
1768
+ >>> [p for p in ordered_partitions(7, 3)]
1769
+ [[1, 1, 1], [1, 1, 1], [1, 1, 1], [2, 2, 2]]
1770
+ >>> [list(p) for p in ordered_partitions(7, 3)]
1771
+ [[1, 1, 5], [1, 2, 4], [1, 3, 3], [2, 2, 3]]
1772
+
1773
+ When ``n`` is a multiple of ``m``, the elements are still sorted
1774
+ but the partitions themselves will be *unordered* if sort is False;
1775
+ the default is to return them in ascending lexicographical order.
1776
+
1777
+ >>> for p in ordered_partitions(6, 2):
1778
+ ... print(p)
1779
+ [1, 5]
1780
+ [2, 4]
1781
+ [3, 3]
1782
+
1783
+ But if speed is more important than ordering, sort can be set to
1784
+ False:
1785
+
1786
+ >>> for p in ordered_partitions(6, 2, sort=False):
1787
+ ... print(p)
1788
+ [1, 5]
1789
+ [3, 3]
1790
+ [2, 4]
1791
+
1792
+ References
1793
+ ==========
1794
+
1795
+ .. [1] Generating Integer Partitions, [online],
1796
+ Available: https://jeromekelleher.net/generating-integer-partitions.html
1797
+ .. [2] Jerome Kelleher and Barry O'Sullivan, "Generating All
1798
+ Partitions: A Comparison Of Two Encodings", [online],
1799
+ Available: https://arxiv.org/pdf/0909.2331v2.pdf
1800
+ """
1801
+ if n < 1 or m is not None and m < 1:
1802
+ # the empty set is the only way to handle these inputs
1803
+ # and returning {} to represent it is consistent with
1804
+ # the counting convention, e.g. nT(0) == 1.
1805
+ yield []
1806
+ return
1807
+
1808
+ if m is None:
1809
+ # The list `a`'s leading elements contain the partition in which
1810
+ # y is the biggest element and x is either the same as y or the
1811
+ # 2nd largest element; v and w are adjacent element indices
1812
+ # to which x and y are being assigned, respectively.
1813
+ a = [1]*n
1814
+ y = -1
1815
+ v = n
1816
+ while v > 0:
1817
+ v -= 1
1818
+ x = a[v] + 1
1819
+ while y >= 2 * x:
1820
+ a[v] = x
1821
+ y -= x
1822
+ v += 1
1823
+ w = v + 1
1824
+ while x <= y:
1825
+ a[v] = x
1826
+ a[w] = y
1827
+ yield a[:w + 1]
1828
+ x += 1
1829
+ y -= 1
1830
+ a[v] = x + y
1831
+ y = a[v] - 1
1832
+ yield a[:w]
1833
+ elif m == 1:
1834
+ yield [n]
1835
+ elif n == m:
1836
+ yield [1]*n
1837
+ else:
1838
+ # recursively generate partitions of size m
1839
+ for b in range(1, n//m + 1):
1840
+ a = [b]*m
1841
+ x = n - b*m
1842
+ if not x:
1843
+ if sort:
1844
+ yield a
1845
+ elif not sort and x <= m:
1846
+ for ax in ordered_partitions(x, sort=False):
1847
+ mi = len(ax)
1848
+ a[-mi:] = [i + b for i in ax]
1849
+ yield a
1850
+ a[-mi:] = [b]*mi
1851
+ else:
1852
+ for mi in range(1, m):
1853
+ for ax in ordered_partitions(x, mi, sort=True):
1854
+ a[-mi:] = [i + b for i in ax]
1855
+ yield a
1856
+ a[-mi:] = [b]*mi
1857
+
1858
+
1859
+ def binary_partitions(n):
1860
+ """
1861
+ Generates the binary partitions of n.
1862
+
1863
+ A binary partition consists only of numbers that are
1864
+ powers of two. Each step reduces a `2^{k+1}` to `2^k` and
1865
+ `2^k`. Thus 16 is converted to 8 and 8.
1866
+
1867
+ Examples
1868
+ ========
1869
+
1870
+ >>> from sympy.utilities.iterables import binary_partitions
1871
+ >>> for i in binary_partitions(5):
1872
+ ... print(i)
1873
+ ...
1874
+ [4, 1]
1875
+ [2, 2, 1]
1876
+ [2, 1, 1, 1]
1877
+ [1, 1, 1, 1, 1]
1878
+
1879
+ References
1880
+ ==========
1881
+
1882
+ .. [1] TAOCP 4, section 7.2.1.5, problem 64
1883
+
1884
+ """
1885
+ from math import ceil, log
1886
+ power = int(2**(ceil(log(n, 2))))
1887
+ acc = 0
1888
+ partition = []
1889
+ while power:
1890
+ if acc + power <= n:
1891
+ partition.append(power)
1892
+ acc += power
1893
+ power >>= 1
1894
+
1895
+ last_num = len(partition) - 1 - (n & 1)
1896
+ while last_num >= 0:
1897
+ yield partition
1898
+ if partition[last_num] == 2:
1899
+ partition[last_num] = 1
1900
+ partition.append(1)
1901
+ last_num -= 1
1902
+ continue
1903
+ partition.append(1)
1904
+ partition[last_num] >>= 1
1905
+ x = partition[last_num + 1] = partition[last_num]
1906
+ last_num += 1
1907
+ while x > 1:
1908
+ if x <= len(partition) - last_num - 1:
1909
+ del partition[-x + 1:]
1910
+ last_num += 1
1911
+ partition[last_num] = x
1912
+ else:
1913
+ x >>= 1
1914
+ yield [1]*n
1915
+
1916
+
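As a sanity check on the in-place routine above, here is a short recursive sketch (a hypothetical pure-stdlib helper) that yields binary partitions as nonincreasing lists of powers of two:

```python
def bin_parts(n, max_power=None):
    # Yield binary partitions of n (all parts are powers of two)
    # as nonincreasing lists, largest leading part first.
    if n == 0:
        yield []
        return
    if max_power is None:
        # largest power of two not exceeding n
        max_power = 1
        while max_power * 2 <= n:
            max_power *= 2
    p = max_power
    while p >= 1:
        if p <= n:
            for rest in bin_parts(n - p, p):
                yield [p] + rest
        p //= 2
```

For ``n = 5`` this reproduces the four partitions in the doctest above, in the same order.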
1917
+ def has_dups(seq):
1918
+ """Return True if there are any duplicate elements in ``seq``.
1919
+
1920
+ Examples
1921
+ ========
1922
+
1923
+ >>> from sympy import has_dups, Dict, Set
1924
+ >>> has_dups((1, 2, 1))
1925
+ True
1926
+ >>> has_dups(range(3))
1927
+ False
1928
+ >>> all(has_dups(c) is False for c in (set(), Set(), dict(), Dict()))
1929
+ True
1930
+ """
1931
+ from sympy.core.containers import Dict
1932
+ from sympy.sets.sets import Set
1933
+ if isinstance(seq, (dict, set, Dict, Set)):
1934
+ return False
1935
+ unique = set()
1936
+ try:
1937
+ return any(True for s in seq if s in unique or unique.add(s))
1938
+ except TypeError:
1939
+ return len(seq) != len(list(uniq(seq)))
1940
+
1941
+
1942
+ def has_variety(seq):
1943
+ """Return True if there are any different elements in ``seq``.
1944
+
1945
+ Examples
1946
+ ========
1947
+
1948
+ >>> from sympy import has_variety
1949
+
1950
+ >>> has_variety((1, 2, 1))
1951
+ True
1952
+ >>> has_variety((1, 1, 1))
1953
+ False
1954
+ """
1955
+ for i, s in enumerate(seq):
1956
+ if i == 0:
1957
+ sentinel = s
1958
+ else:
1959
+ if s != sentinel:
1960
+ return True
1961
+ return False
1962
+
1963
+
1964
+ def uniq(seq, result=None):
1965
+ """
1966
+ Yield unique elements from ``seq`` as an iterator. The second
1967
+ parameter ``result`` is used internally; it is not necessary
1968
+ to pass anything for this.
1969
+
1970
+ Note: changing the sequence during iteration will raise a
1971
+ RuntimeError if the size of the sequence is known; if you pass
1972
+ an iterator and advance the iterator you will change the
1973
+ output of this routine but there will be no warning.
1974
+
1975
+ Examples
1976
+ ========
1977
+
1978
+ >>> from sympy.utilities.iterables import uniq
1979
+ >>> dat = [1, 4, 1, 5, 4, 2, 1, 2]
1980
+ >>> type(uniq(dat)) in (list, tuple)
1981
+ False
1982
+
1983
+ >>> list(uniq(dat))
1984
+ [1, 4, 5, 2]
1985
+ >>> list(uniq(x for x in dat))
1986
+ [1, 4, 5, 2]
1987
+ >>> list(uniq([[1], [2, 1], [1]]))
1988
+ [[1], [2, 1]]
1989
+ """
1990
+ try:
1991
+ n = len(seq)
1992
+ except TypeError:
1993
+ n = None
1994
+ def check():
1995
+ # check that size of seq did not change during iteration;
1996
+ # if n == None the object won't support size changing, e.g.
1997
+ # an iterator can't be changed
1998
+ if n is not None and len(seq) != n:
1999
+ raise RuntimeError('sequence changed size during iteration')
2000
+ try:
2001
+ seen = set()
2002
+ result = result or []
2003
+ for i, s in enumerate(seq):
2004
+ if not (s in seen or seen.add(s)):
2005
+ yield s
2006
+ check()
2007
+ except TypeError:
2008
+ if s not in result:
2009
+ yield s
2010
+ check()
2011
+ result.append(s)
2012
+ if hasattr(seq, '__getitem__'):
2013
+ yield from uniq(seq[i + 1:], result)
2014
+ else:
2015
+ yield from uniq(seq, result)
2016
+
2017
+
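The hash-then-fallback strategy used by ``uniq`` can be illustrated with a compact standalone sketch (``dedup`` is a hypothetical name) that keeps a set for hashable items and a plain list for unhashable ones:

```python
def dedup(seq):
    # Order-preserving de-duplication; falls back to a linear scan
    # for unhashable items such as lists.
    seen_hashable = set()
    seen_unhashable = []
    for item in seq:
        try:
            is_new = item not in seen_hashable
            if is_new:
                seen_hashable.add(item)
        except TypeError:  # unhashable element
            is_new = item not in seen_unhashable
            if is_new:
                seen_unhashable.append(item)
        if is_new:
            yield item
```

Unlike ``uniq``, this sketch does not guard against the input sequence changing size during iteration.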
2018
+ def generate_bell(n):
2019
+ """Return permutations of [0, 1, ..., n - 1] such that each permutation
2020
+ differs from the last by the exchange of a single pair of neighbors.
2021
+ The ``n!`` permutations are returned as an iterator. In order to obtain
2022
+ the next permutation from a random starting permutation, use the
2023
+ ``next_trotterjohnson`` method of the Permutation class (which generates
2024
+ the same sequence in a different manner).
2025
+
2026
+ Examples
2027
+ ========
2028
+
2029
+ >>> from itertools import permutations
2030
+ >>> from sympy.utilities.iterables import generate_bell
2031
+ >>> from sympy import zeros, Matrix
2032
+
2033
+ This is the sort of permutation used in the ringing of physical bells,
2034
+ and does not produce permutations in lexicographical order. Rather, the
2035
+ permutations differ from each other by exactly one inversion, and the
2036
+ position at which the swapping occurs varies periodically in a simple
2037
+ fashion. Consider the first few permutations of 4 elements generated
2038
+ by ``permutations`` and ``generate_bell``:
2039
+
2040
+ >>> list(permutations(range(4)))[:5]
2041
+ [(0, 1, 2, 3), (0, 1, 3, 2), (0, 2, 1, 3), (0, 2, 3, 1), (0, 3, 1, 2)]
2042
+ >>> list(generate_bell(4))[:5]
2043
+ [(0, 1, 2, 3), (0, 1, 3, 2), (0, 3, 1, 2), (3, 0, 1, 2), (3, 0, 2, 1)]
2044
+
2045
+ Notice how the 2nd and 3rd lexicographical permutations have 3 elements
2046
+ out of place whereas each "bell" permutation always has only two
2047
+ elements out of place relative to the previous permutation (and so the
2048
+ signature (+/-1) of a permutation is opposite of the signature of the
2049
+ previous permutation).
2050
+
2051
+ How the position of inversion varies across the elements can be seen
2052
+ by tracing out where the largest number appears in the permutations:
2053
+
2054
+ >>> m = zeros(4, 24)
2055
+ >>> for i, p in enumerate(generate_bell(4)):
2056
+ ... m[:, i] = Matrix([j - 3 for j in list(p)]) # make largest zero
2057
+ >>> m.print_nonzero('X')
2058
+ [XXX XXXXXX XXXXXX XXX]
2059
+ [XX XX XXXX XX XXXX XX XX]
2060
+ [X XXXX XX XXXX XX XXXX X]
2061
+ [ XXXXXX XXXXXX XXXXXX ]
2062
+
2063
+ See Also
2064
+ ========
2065
+
2066
+ sympy.combinatorics.permutations.Permutation.next_trotterjohnson
2067
+
2068
+ References
2069
+ ==========
2070
+
2071
+ .. [1] https://en.wikipedia.org/wiki/Method_ringing
2072
+
2073
+ .. [2] https://stackoverflow.com/questions/4856615/recursive-permutation/4857018
2074
+
2075
+ .. [3] https://web.archive.org/web/20160313023044/http://programminggeeks.com/bell-algorithm-for-permutation/
2076
+
2077
+ .. [4] https://en.wikipedia.org/wiki/Steinhaus%E2%80%93Johnson%E2%80%93Trotter_algorithm
2078
+
2079
+ .. [5] Generating involutions, derangements, and relatives by ECO
2080
+ Vincent Vajnovszki, DMTCS vol 1 issue 12, 2010
2081
+
2082
+ """
2083
+ n = as_int(n)
2084
+ if n < 1:
2085
+ raise ValueError('n must be a positive integer')
2086
+ if n == 1:
2087
+ yield (0,)
2088
+ elif n == 2:
2089
+ yield (0, 1)
2090
+ yield (1, 0)
2091
+ elif n == 3:
2092
+ yield from [(0, 1, 2), (0, 2, 1), (2, 0, 1), (2, 1, 0), (1, 2, 0), (1, 0, 2)]
2093
+ else:
2094
+ m = n - 1
2095
+ op = [0] + [-1]*m
2096
+ l = list(range(n))
2097
+ while True:
2098
+ yield tuple(l)
2099
+ # find biggest element with op
2100
+ big = None, -1 # idx, value
2101
+ for i in range(n):
2102
+ if op[i] and l[i] > big[1]:
2103
+ big = i, l[i]
2104
+ i, _ = big
2105
+ if i is None:
2106
+ break # there are no ops left
2107
+ # swap it with neighbor in the indicated direction
2108
+ j = i + op[i]
2109
+ l[i], l[j] = l[j], l[i]
2110
+ op[i], op[j] = op[j], op[i]
2111
+ # if it landed at the end or if the neighbor in the same
2112
+ # direction is bigger then turn off op
2113
+ if j == 0 or j == m or l[j + op[j]] > l[j]:
2114
+ op[j] = 0
2115
+ # any element bigger to the left gets +1 op
2116
+ for i in range(j):
2117
+ if l[i] > l[j]:
2118
+ op[i] = 1
2119
+ # any element bigger to the right gets -1 op
2120
+ for i in range(j + 1, n):
2121
+ if l[i] > l[j]:
2122
+ op[i] = -1
2123
+
2124
+
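The loop above is a variant of the Steinhaus-Johnson-Trotter scheme; a plain textbook version (a sketch, not the exact code path used above) makes the "one adjacent swap per step" property easy to verify:

```python
def sjt(n):
    # Plain Steinhaus-Johnson-Trotter: each permutation differs from the
    # previous one by a single adjacent transposition.
    perm = list(range(n))
    dirs = [-1] * n  # every element initially "points" left
    yield tuple(perm)
    while True:
        # find the largest mobile element (arrow points at a smaller neighbor)
        mobile, mi = -1, -1
        for i, x in enumerate(perm):
            j = i + dirs[i]
            if 0 <= j < n and perm[j] < x and x > mobile:
                mobile, mi = x, i
        if mobile == -1:
            return  # no mobile element: all permutations produced
        j = mi + dirs[mi]
        perm[mi], perm[j] = perm[j], perm[mi]
        dirs[mi], dirs[j] = dirs[j], dirs[mi]
        # flip the direction of everything larger than the moved element
        for i, x in enumerate(perm):
            if x > mobile:
                dirs[i] = -dirs[i]
        yield tuple(perm)
```

Every consecutive pair of outputs differs in exactly two adjacent positions, which is the property the bell-ringing discussion above relies on.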
2125
+ def generate_involutions(n):
2126
+ """
2127
+ Generates involutions.
2128
+
2129
+ An involution is a permutation that when multiplied
2130
+ by itself equals the identity permutation. In this
2131
+ implementation the involutions are generated using
2132
+ Fixed Points.
2133
+
2134
+ Alternatively, an involution can be considered as
2135
+ a permutation that does not contain any cycles with
2136
+ a length that is greater than two.
2137
+
2138
+ Examples
2139
+ ========
2140
+
2141
+ >>> from sympy.utilities.iterables import generate_involutions
2142
+ >>> list(generate_involutions(3))
2143
+ [(0, 1, 2), (0, 2, 1), (1, 0, 2), (2, 1, 0)]
2144
+ >>> len(list(generate_involutions(4)))
2145
+ 10
2146
+
2147
+ References
2148
+ ==========
2149
+
2150
+ .. [1] https://mathworld.wolfram.com/PermutationInvolution.html
2151
+
2152
+ """
2153
+ idx = list(range(n))
2154
+ for p in permutations(idx):
2155
+ for i in idx:
2156
+ if p[p[i]] != i:
2157
+ break
2158
+ else:
2159
+ yield p
2160
+
2161
+
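A quick cross-check of the fixed-point filter above: the involution counts are the telephone numbers, which satisfy T(n) = T(n-1) + (n-1)*T(n-2). Both helpers below are standalone sketches:

```python
from itertools import permutations

def involutions(n):
    # An involution satisfies p[p[i]] == i for every i.
    for p in permutations(range(n)):
        if all(p[p[i]] == i for i in range(n)):
            yield p

def telephone(n):
    # Involution counts obey T(n) = T(n-1) + (n-1)*T(n-2).
    a, b = 1, 1  # T(0), T(1)
    for k in range(2, n + 1):
        a, b = b, b + (k - 1) * a
    return b
```

For ``n = 3`` this reproduces the four involutions listed in the doctest, and ``telephone(4) == 10`` matches the length check.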
2162
+ def multiset_derangements(s):
2163
+ """Generate derangements of the elements of s *in place*.
2164
+
2165
+ Examples
2166
+ ========
2167
+
2168
+ >>> from sympy.utilities.iterables import multiset_derangements, uniq
2169
+
2170
+ Because the derangements of multisets (not sets) are generated
2171
+ in place, copies of the return value must be made if a collection
2172
+ of derangements is desired or else all values will be the same:
2173
+
2174
+ >>> list(uniq([i for i in multiset_derangements('1233')]))
2175
+ [[None, None, None, None]]
2176
+ >>> [i.copy() for i in multiset_derangements('1233')]
2177
+ [['3', '3', '1', '2'], ['3', '3', '2', '1']]
2178
+ >>> [''.join(i) for i in multiset_derangements('1233')]
2179
+ ['3312', '3321']
2180
+ """
2181
+ from sympy.core.sorting import ordered
2182
+ # create multiset dictionary of hashable elements or else
2183
+ # remap elements to integers
2184
+ try:
2185
+ ms = multiset(s)
2186
+ except TypeError:
2187
+ # give each element a canonical integer value
2188
+ key = dict(enumerate(ordered(uniq(s))))
2189
+ h = []
2190
+ for si in s:
2191
+ for k in key:
2192
+ if key[k] == si:
2193
+ h.append(k)
2194
+ break
2195
+ for i in multiset_derangements(h):
2196
+ yield [key[j] for j in i]
2197
+ return
2198
+
2199
+ mx = max(ms.values()) # max repetition of any element
2200
+ n = len(s) # the number of elements
2201
+
2202
+ ## special cases
2203
+
2204
+ # 1) one element has more than half the total cardinality of s: no
2205
+ # derangements are possible.
2206
+ if mx*2 > n:
2207
+ return
2208
+
2209
+ # 2) all elements appear once: singletons
2210
+ if len(ms) == n:
2211
+ yield from _set_derangements(s)
2212
+ return
2213
+
2214
+ # find the first element that is repeated the most to place
2215
+ # in the following two special cases where the selection
2216
+ # is unambiguous: either there are two elements with multiplicity
2217
+ # of mx or else there is only one with multiplicity mx
2218
+ for M in ms:
2219
+ if ms[M] == mx:
2220
+ break
2221
+
2222
+ inonM = [i for i in range(n) if s[i] != M] # location of non-M
2223
+ iM = [i for i in range(n) if s[i] == M] # locations of M
2224
+ rv = [None]*n
2225
+
2226
+ # 3) half are the same
2227
+ if 2*mx == n:
2228
+ # M goes into non-M locations
2229
+ for i in inonM:
2230
+ rv[i] = M
2231
+ # permutations of non-M go to M locations
2232
+ for p in multiset_permutations([s[i] for i in inonM]):
2233
+ for i, pi in zip(iM, p):
2234
+ rv[i] = pi
2235
+ yield rv
2236
+ # clean-up (and encourages proper use of routine)
2237
+ rv[:] = [None]*n
2238
+ return
2239
+
2240
+ # 4) single repeat covers all but 1 of the non-repeats:
2241
+ # if there is one repeat then the multiset of the values
2242
+ # of ms would be {mx: 1, 1: n - mx}, i.e. there would
2243
+ # be n - mx + 1 values with the condition that n - 2*mx = 1
2244
+ if n - 2*mx == 1 and len(ms.values()) == n - mx + 1:
2245
+ for i, i1 in enumerate(inonM):
2246
+ ifill = inonM[:i] + inonM[i+1:]
2247
+ for j in ifill:
2248
+ rv[j] = M
2249
+ for p in permutations([s[j] for j in ifill]):
2250
+ rv[i1] = s[i1]
2251
+ for j, pi in zip(iM, p):
2252
+ rv[j] = pi
2253
+ k = i1
2254
+ for j in iM:
2255
+ rv[j], rv[k] = rv[k], rv[j]
2256
+ yield rv
2257
+ k = j
2258
+ # clean-up (and encourages proper use of routine)
2259
+ rv[:] = [None]*n
2260
+ return
2261
+
2262
+ ## general case is handled with 3 helpers:
2263
+ # 1) `finish_derangements` will place the last two elements
2264
+ # which have arbitrary multiplicities, e.g. for multiset
2265
+ # {c: 3, a: 2, b: 2}, the last two elements are a and b
2266
+ # 2) `iopen` will tell where a given element can be placed
2267
+ # 3) `do` will recursively place elements into subsets of
2268
+ # valid locations
2269
+
2270
+ def finish_derangements():
2271
+ """Place the last two elements into the partially completed
2272
+ derangement, and yield the results.
2273
+ """
2274
+
2275
+ a = take[1][0] # penultimate element
2276
+ a_ct = take[1][1]
2277
+ b = take[0][0] # last element to be placed
2278
+ b_ct = take[0][1]
2279
+
2280
+ # split the indexes of the not-already-assigned elements of rv into
2281
+ # three categories
2282
+ forced_a = [] # positions which must have an a
2283
+ forced_b = [] # positions which must have a b
2284
+ open_free = [] # positions which could take either
2285
+ for i in range(len(s)):
2286
+ if rv[i] is None:
2287
+ if s[i] == a:
2288
+ forced_b.append(i)
2289
+ elif s[i] == b:
2290
+ forced_a.append(i)
2291
+ else:
2292
+ open_free.append(i)
2293
+
2294
+ if len(forced_a) > a_ct or len(forced_b) > b_ct:
2295
+ # No derangement possible
2296
+ return
2297
+
2298
+ for i in forced_a:
2299
+ rv[i] = a
2300
+ for i in forced_b:
2301
+ rv[i] = b
2302
+ for a_place in combinations(open_free, a_ct - len(forced_a)):
2303
+ for a_pos in a_place:
2304
+ rv[a_pos] = a
2305
+ for i in open_free:
2306
+ if rv[i] is None: # anything not in the subset is set to b
2307
+ rv[i] = b
2308
+ yield rv
2309
+ # Clean up/undo the final placements
2310
+ for i in open_free:
2311
+ rv[i] = None
2312
+
2313
+ # additional cleanup - clear forced_a, forced_b
2314
+ for i in forced_a:
2315
+ rv[i] = None
2316
+ for i in forced_b:
2317
+ rv[i] = None
2318
+
2319
+ def iopen(v):
2320
+ # return indices at which element v can be placed in rv:
2321
+ # locations which are not already occupied if that location
2322
+ # does not already contain v in the same location of s
2323
+ return [i for i in range(n) if rv[i] is None and s[i] != v]
2324
+
2325
+ def do(j):
2326
+ if j == 1:
2327
+ # handle the last two elements (regardless of multiplicity)
2328
+ # with a special method
2329
+ yield from finish_derangements()
2330
+ else:
2331
+ # place the mx elements of M into a subset of places
2332
+ # into which it can be placed
2333
+ M, mx = take[j]
2334
+ for i in combinations(iopen(M), mx):
2335
+ # place M
2336
+ for ii in i:
2337
+ rv[ii] = M
2338
+ # recursively place the next element
2339
+ yield from do(j - 1)
2340
+ # mark positions where M was placed as once again
2341
+ # open for placement of other elements
2342
+ for ii in i:
2343
+ rv[ii] = None
2344
+
2345
+ # process elements in order of canonically decreasing multiplicity
2346
+ take = sorted(ms.items(), key=lambda x:(x[1], x[0]))
2347
+ yield from do(len(take) - 1)
2348
+ rv[:] = [None]*n
2349
+
2350
+
2351
+ def random_derangement(t, choice=None, strict=True):
2352
+ """Return a list of elements in which none are in the same positions
2353
+ as they were originally. If an element fills more than half of the positions
2354
+ then an error will be raised since no derangement is possible. To obtain
2355
+ a derangement of as many items as possible--with some of the most numerous
2356
+ remaining in their original positions--pass `strict=False`. To produce a
2357
+ pseudorandom derangement, pass a pseudorandom selector like `choice` (see
2358
+ below).
2359
+
2360
+ Examples
2361
+ ========
2362
+
2363
+ >>> from sympy.utilities.iterables import random_derangement
2364
+ >>> t = 'SymPy: a CAS in pure Python'
2365
+ >>> d = random_derangement(t)
2366
+ >>> all(i != j for i, j in zip(d, t))
2367
+ True
2368
+
2369
+ A predictable result can be obtained by using a pseudorandom
2370
+ generator for the choice:
2371
+
2372
+ >>> from sympy.core.random import seed, choice as c
2373
+ >>> seed(1)
2374
+ >>> d = [''.join(random_derangement(t, c)) for i in range(5)]
2375
+ >>> assert len(set(d)) != 1 # we got different values
2376
+
2377
+ By reseeding, the same sequence can be obtained:
2378
+
2379
+ >>> seed(1)
2380
+ >>> d2 = [''.join(random_derangement(t, c)) for i in range(5)]
2381
+ >>> assert d == d2
2382
+ """
2383
+ if choice is None:
2384
+ import secrets
2385
+ choice = secrets.choice
2386
+ def shuffle(rv):
2387
+ '''Knuth shuffle'''
2388
+ for i in range(len(rv) - 1, 0, -1):
2389
+ x = choice(rv[:i + 1])
2390
+ j = rv.index(x)
2391
+ rv[i], rv[j] = rv[j], rv[i]
2392
+ def pick(rv, n):
2393
+ '''shuffle rv and return the first n values
2394
+ '''
2395
+ shuffle(rv)
2396
+ return rv[:n]
2397
+ ms = multiset(t)
2398
+ tot = len(t)
2399
+ ms = sorted(ms.items(), key=lambda x: x[1])
2400
+ # if there are not enough spaces for the most
2401
+ # plentiful element to move to then some of them
2402
+ # will have to stay in place
2403
+ M, mx = ms[-1]
2404
+ n = len(t)
2405
+ xs = 2*mx - tot
2406
+ if xs > 0:
2407
+ if strict:
2408
+ raise ValueError('no derangement possible')
2409
+ opts = [i for (i, c) in enumerate(t) if c == ms[-1][0]]
2410
+ pick(opts, xs)
2411
+ stay = sorted(opts[:xs])
2412
+ rv = list(t)
2413
+ for i in reversed(stay):
2414
+ rv.pop(i)
2415
+ rv = random_derangement(rv, choice)
2416
+ for i in stay:
2417
+ rv.insert(i, ms[-1][0])
2418
+ return ''.join(rv) if type(t) is str else rv
2419
+ # the normal derangement calculated from here
2420
+ if n == len(ms):
2421
+ # approx 1/3 will succeed
2422
+ rv = list(t)
2423
+ while True:
2424
+ shuffle(rv)
2425
+ if all(i != j for i,j in zip(rv, t)):
2426
+ break
2427
+ else:
2428
+ # general case
2429
+ rv = [None]*n
2430
+ while True:
2431
+ j = 0
2432
+ while j > -len(ms): # do most numerous first
2433
+ j -= 1
2434
+ e, c = ms[j]
2435
+ opts = [i for i in range(n) if rv[i] is None and t[i] != e]
2436
+ if len(opts) < c:
2437
+ for i in range(n):
2438
+ rv[i] = None
2439
+ break # try again
2440
+ pick(opts, c)
2441
+ for i in range(c):
2442
+ rv[opts[i]] = e
2443
+ else:
2444
+ return rv
2445
+ return rv
2446
+
2447
+
2448
+ def _set_derangements(s):
2449
+ """
2450
+ yield derangements of items in ``s`` which are assumed to contain
2451
+ no repeated elements
2452
+ """
2453
+ if len(s) < 2:
2454
+ return
2455
+ if len(s) == 2:
2456
+ yield [s[1], s[0]]
2457
+ return
2458
+ if len(s) == 3:
2459
+ yield [s[1], s[2], s[0]]
2460
+ yield [s[2], s[0], s[1]]
2461
+ return
2462
+ for p in permutations(s):
2463
+ if not any(i == j for i, j in zip(p, s)):
2464
+ yield list(p)
2465
+
2466
+
2467
+ def generate_derangements(s):
2468
+ """
2469
+ Return unique derangements of the elements of iterable ``s``.
2470
+
2471
+ Examples
2472
+ ========
2473
+
2474
+ >>> from sympy.utilities.iterables import generate_derangements
2475
+ >>> list(generate_derangements([0, 1, 2]))
2476
+ [[1, 2, 0], [2, 0, 1]]
2477
+ >>> list(generate_derangements([0, 1, 2, 2]))
2478
+ [[2, 2, 0, 1], [2, 2, 1, 0]]
2479
+ >>> list(generate_derangements([0, 1, 1]))
2480
+ []
2481
+
2482
+ See Also
2483
+ ========
2484
+
2485
+ sympy.functions.combinatorial.factorials.subfactorial
2486
+
2487
+ """
2488
+ if not has_dups(s):
2489
+ yield from _set_derangements(s)
2490
+ else:
2491
+ for p in multiset_derangements(s):
2492
+ yield list(p)
2493
+
2494
+
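For small inputs the routine above can be cross-checked against a brute-force filter, and for distinct elements the count is the subfactorial; both helpers below are illustrative sketches:

```python
from itertools import permutations

def brute_derangements(seq):
    # Unique derangements by filtering permutations (fine for small inputs).
    seq = list(seq)
    seen = set()
    for p in permutations(seq):
        if p not in seen and all(a != b for a, b in zip(p, seq)):
            seen.add(p)
            yield list(p)

def subfactorial(n):
    # !n via the recurrence !n = (n - 1) * (!(n - 1) + !(n - 2)).
    a, b = 1, 0  # !0, !1
    for k in range(2, n + 1):
        a, b = b, (k - 1) * (a + b)
    return b if n else a
```

This reproduces the doctest outputs above for both the set and multiset cases.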
2495
+ def necklaces(n, k, free=False):
2496
+ """
2497
+ A routine to generate necklaces that may (free=True) or may not
2498
+ (free=False) be turned over to be viewed. The "necklaces" returned
2499
+ consist of ``n`` integers (beads) with ``k`` different
2500
+ values (colors). Only unique necklaces are returned.
2501
+
2502
+ Examples
2503
+ ========
2504
+
2505
+ >>> from sympy.utilities.iterables import necklaces, bracelets
2506
+ >>> def show(s, i):
2507
+ ... return ''.join(s[j] for j in i)
2508
+
2509
+ The "unrestricted necklace" is sometimes also referred to as a
2510
+ "bracelet" (an object that can be turned over, a sequence that can
2511
+ be reversed) and the term "necklace" is used to imply a sequence
2512
+ that cannot be reversed. So ACB == ABC for a bracelet (rotate and
2513
+ reverse) while the two are different for a necklace since rotation
2514
+ alone cannot make the two sequences the same.
2515
+
2516
+ (mnemonic: Bracelets can be viewed Backwards, but Not Necklaces.)
2517
+
2518
+ >>> B = [show('ABC', i) for i in bracelets(3, 3)]
2519
+ >>> N = [show('ABC', i) for i in necklaces(3, 3)]
2520
+ >>> set(N) - set(B)
2521
+ {'ACB'}
2522
+
2523
+ >>> list(necklaces(4, 2))
2524
+ [(0, 0, 0, 0), (0, 0, 0, 1), (0, 0, 1, 1),
2525
+ (0, 1, 0, 1), (0, 1, 1, 1), (1, 1, 1, 1)]
2526
+
2527
+ >>> [show('.o', i) for i in bracelets(4, 2)]
2528
+ ['....', '...o', '..oo', '.o.o', '.ooo', 'oooo']
2529
+
2530
+ References
2531
+ ==========
2532
+
2533
+ .. [1] https://mathworld.wolfram.com/Necklace.html
2534
+
2535
+ .. [2] Frank Ruskey, Carla Savage, and Terry Min Yih Wang,
2536
+ Generating necklaces, Journal of Algorithms 13 (1992), 414-430;
2537
+ https://doi.org/10.1016/0196-6774(92)90047-G
2538
+
2539
+ """
2540
+ # The FKM algorithm
2541
+ if k == 0 and n > 0:
2542
+ return
2543
+ a = [0]*n
2544
+ yield tuple(a)
2545
+ if n == 0:
2546
+ return
2547
+ while True:
2548
+ i = n - 1
2549
+ while a[i] == k - 1:
2550
+ i -= 1
2551
+ if i == -1:
2552
+ return
2553
+ a[i] += 1
2554
+ for j in range(n - i - 1):
2555
+ a[j + i + 1] = a[j]
2556
+ if n % (i + 1) == 0 and (not free or all(a <= a[j::-1] + a[-1:j:-1] for j in range(n - 1))):
2557
+ # No need to test j = n - 1.
2558
+ yield tuple(a)
2559
+
2560
+
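The FKM output above can be checked against a brute-force definition (a necklace is the lexicographically smallest of its rotations) and against a Burnside count over rotations; both helpers are standalone sketches:

```python
from itertools import product
from math import gcd

def brute_necklaces(n, k):
    # Keep only tuples that are minimal among all of their rotations.
    return [s for s in product(range(k), repeat=n)
            if all(s <= s[i:] + s[:i] for i in range(1, n))]

def necklace_count(n, k):
    # Burnside's lemma over the cyclic group of n rotations.
    return sum(k ** gcd(i, n) for i in range(n)) // n
```

``brute_necklaces(4, 2)`` reproduces the six necklaces listed in the doctest above.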
2561
+ def bracelets(n, k):
2562
+ """Wrapper to necklaces to return a free (unrestricted) necklace."""
2563
+ return necklaces(n, k, free=True)
2564
+
2565
+
2566
+ def generate_oriented_forest(n):
2567
+ """
2568
+ This algorithm generates oriented forests.
2569
+
2570
+ An oriented graph is a directed graph having no symmetric pair of directed
2571
+ edges. A forest is an acyclic graph, i.e., it has no cycles. A forest can
2572
+ also be described as a disjoint union of trees, which are graphs in which
2573
+ any two vertices are connected by exactly one simple path.
2574
+
2575
+ Examples
2576
+ ========
2577
+
2578
+ >>> from sympy.utilities.iterables import generate_oriented_forest
2579
+ >>> list(generate_oriented_forest(4))
2580
+ [[0, 1, 2, 3], [0, 1, 2, 2], [0, 1, 2, 1], [0, 1, 2, 0], \
2581
+ [0, 1, 1, 1], [0, 1, 1, 0], [0, 1, 0, 1], [0, 1, 0, 0], [0, 0, 0, 0]]
2582
+
2583
+ References
2584
+ ==========
2585
+
2586
+ .. [1] T. Beyer and S.M. Hedetniemi: constant time generation of
2587
+ rooted trees, SIAM J. Computing Vol. 9, No. 4, November 1980
2588
+
2589
+ .. [2] https://stackoverflow.com/questions/1633833/oriented-forest-taocp-algorithm-in-python
2590
+
2591
+ """
2592
+ P = list(range(-1, n))
2593
+ while True:
2594
+ yield P[1:]
2595
+ if P[n] > 0:
2596
+ P[n] = P[P[n]]
2597
+ else:
2598
+ for p in range(n - 1, 0, -1):
2599
+ if P[p] != 0:
2600
+ target = P[p] - 1
2601
+ for q in range(p - 1, 0, -1):
2602
+ if P[q] == target:
2603
+ break
2604
+ offset = p - q
2605
+ for i in range(p, n + 1):
2606
+ P[i] = P[i - offset]
2607
+ break
2608
+ else:
2609
+ break
2610
+
2611
+
2612
+ def minlex(seq, directed=True, key=None):
2613
+ r"""
2614
+ Return the rotation of the sequence in which the lexically smallest
2615
+ elements appear first, e.g. `cba \rightarrow acb`.
2616
+
2617
+ The sequence returned is a tuple, unless the input sequence is a string
2618
+ in which case a string is returned.
2619
+
2620
+ If ``directed`` is False then the smaller of the sequence and the
2621
+ reversed sequence is returned, e.g. `cba \rightarrow abc`.
2622
+
2623
+ If ``key`` is not None then it is used to extract a comparison key from each element in iterable.
2624
+
2625
+ Examples
2626
+ ========
2627
+
2628
+ >>> from sympy.combinatorics.polyhedron import minlex
2629
+ >>> minlex((1, 2, 0))
2630
+ (0, 1, 2)
2631
+ >>> minlex((1, 0, 2))
2632
+ (0, 2, 1)
2633
+ >>> minlex((1, 0, 2), directed=False)
2634
+ (0, 1, 2)
2635
+
2636
+ >>> minlex('11010011000', directed=True)
2637
+ '00011010011'
2638
+ >>> minlex('11010011000', directed=False)
2639
+ '00011001011'
2640
+
2641
+ >>> minlex(('bb', 'aaa', 'c', 'a'))
2642
+ ('a', 'bb', 'aaa', 'c')
2643
+ >>> minlex(('bb', 'aaa', 'c', 'a'), key=len)
2644
+ ('c', 'a', 'bb', 'aaa')
2645
+
2646
+ """
2647
+ from sympy.functions.elementary.miscellaneous import Id
2648
+ if key is None: key = Id
2649
+ best = rotate_left(seq, least_rotation(seq, key=key))
2650
+ if not directed:
2651
+ rseq = seq[::-1]
2652
+ rbest = rotate_left(rseq, least_rotation(rseq, key=key))
2653
+ best = min(best, rbest, key=key)
2654
+
2655
+ # Convert to tuple, unless we started with a string.
2656
+ return tuple(best) if not isinstance(seq, str) else best
2657
+
2658
+
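``minlex`` relies on ``least_rotation`` (Booth's algorithm) to find the starting index in linear time; for the directed case a brute-force equivalent is a one-liner and handy for testing (``min_rotation`` is a hypothetical helper name):

```python
def min_rotation(seq):
    # Smallest rotation by checking all n rotations: O(n^2) total work,
    # versus Booth's O(n); works on tuples and strings alike.
    return min(seq[i:] + seq[:i] for i in range(len(seq)))
```

This agrees with the directed doctest outputs above for both the tuple and string examples.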
2659
+ def runs(seq, op=gt):
2660
+ """Group the sequence into lists in which successive elements
2661
+ all compare the same with the comparison operator, ``op``:
2662
+ op(seq[i + 1], seq[i]) is True for all elements in a run.
2663
+
2664
+ Examples
2665
+ ========
2666
+
2667
+ >>> from sympy.utilities.iterables import runs
2668
+ >>> from operator import ge
2669
+ >>> runs([0, 1, 2, 2, 1, 4, 3, 2, 2])
2670
+ [[0, 1, 2], [2], [1, 4], [3], [2], [2]]
2671
+ >>> runs([0, 1, 2, 2, 1, 4, 3, 2, 2], op=ge)
2672
+ [[0, 1, 2, 2], [1, 4], [3], [2, 2]]
2673
+ """
2674
+ cycles = []
2675
+ seq = iter(seq)
2676
+ try:
2677
+ run = [next(seq)]
2678
+ except StopIteration:
2679
+ return []
2680
+ while True:
2681
+ try:
2682
+ ei = next(seq)
2683
+ except StopIteration:
2684
+ break
2685
+ if op(ei, run[-1]):
2686
+ run.append(ei)
2687
+ continue
2688
+ else:
2689
+ cycles.append(run)
2690
+ run = [ei]
2691
+ if run:
2692
+ cycles.append(run)
2693
+ return cycles
2694
+
2695
+
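The grouping logic above can be expressed without the explicit iterator protocol; this non-lazy sketch simply extends the current run while ``op`` keeps holding:

```python
from operator import gt, ge

def runs_list(seq, op=gt):
    # Split seq into maximal runs where op(next, previous) keeps holding.
    out = []
    for x in seq:
        if out and op(x, out[-1][-1]):
            out[-1].append(x)
        else:
            out.append([x])
    return out
```

Both doctest examples above are reproduced, including the ``op=ge`` variant.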
2696
+ def sequence_partitions(l, n, /):
2697
+ r"""Returns the partition of sequence $l$ into $n$ bins
2698
+
2699
+ Explanation
2700
+ ===========
2701
+
2702
+ Given the sequence $l_1 \cdots l_m \in V^+$ where
2703
+ $V^+$ is the Kleene plus of $V$
2704
+
2705
+ The set of $n$ partitions of $l$ is defined as:
2706
+
2707
+ .. math::
2708
+ \{(s_1, \cdots, s_n) | s_1 \in V^+, \cdots, s_n \in V^+,
2709
+ s_1 \cdots s_n = l_1 \cdots l_m\}
2710
+
2711
+ Parameters
2712
+ ==========
2713
+
2714
+ l : Sequence[T]
2715
+ A nonempty sequence of any Python objects
2716
+
2717
+ n : int
2718
+ A positive integer
2719
+
2720
+ Yields
2721
+ ======
2722
+
2723
+ out : list[Sequence[T]]
2724
+ A list of sequences with concatenation equals $l$.
2725
+ This should conform with the type of $l$.
2726
+
2727
+ Examples
2728
+ ========
2729
+
2730
+ >>> from sympy.utilities.iterables import sequence_partitions
2731
+ >>> for out in sequence_partitions([1, 2, 3, 4], 2):
2732
+ ... print(out)
2733
+ [[1], [2, 3, 4]]
2734
+ [[1, 2], [3, 4]]
2735
+ [[1, 2, 3], [4]]
2736
+
2737
+ Notes
2738
+ =====
2739
+
2740
+ This is modified version of EnricoGiampieri's partition generator
2741
+ from https://stackoverflow.com/questions/13131491/partition-n-items-into-k-bins-in-python-lazily
2742
+
2743
+ See Also
2744
+ ========
2745
+
2746
+ sequence_partitions_empty
2747
+ """
2748
+ # Asserting l is nonempty is done only for sanity check
2749
+ if n == 1 and l:
2750
+ yield [l]
2751
+ return
2752
+ for i in range(1, len(l)):
2753
+ for part in sequence_partitions(l[i:], n - 1):
2754
+ yield [l[:i]] + part
2755
+
2756
+
2757
+ def sequence_partitions_empty(l, n, /):
2758
+ r"""Returns the partition of sequence $l$ into $n$ bins with
2759
+ empty sequence
2760
+
2761
+ Explanation
2762
+ ===========
2763
+
2764
+ Given the sequence $l_1 \cdots l_m \in V^*$ where
2765
+ $V^*$ is the Kleene star of $V$
2766
+
2767
+ The set of $n$ partitions of $l$ is defined as:
2768
+
2769
+ .. math::
2770
+ \{(s_1, \cdots, s_n) | s_1 \in V^*, \cdots, s_n \in V^*,
2771
+ s_1 \cdots s_n = l_1 \cdots l_m\}
2772
+
2773
+ There are more combinations than :func:`sequence_partitions` because
2774
+ empty sequence can fill everywhere, so we try to provide different
2775
+ utility for this.
2776
+
2777
+ Parameters
2778
+ ==========
2779
+
2780
+ l : Sequence[T]
2781
+ A sequence of any Python objects (can be possibly empty)
2782
+
2783
+ n : int
2784
+ A positive integer
2785
+
2786
+ Yields
2787
+ ======
2788
+
2789
+ out : list[Sequence[T]]
2790
+ A list of sequences with concatenation equals $l$.
2791
+ This should conform with the type of $l$.
2792
+
2793
+ Examples
2794
+ ========
2795
+
2796
+ >>> from sympy.utilities.iterables import sequence_partitions_empty
2797
+ >>> for out in sequence_partitions_empty([1, 2, 3, 4], 2):
2798
+ ... print(out)
2799
+ [[], [1, 2, 3, 4]]
2800
+ [[1], [2, 3, 4]]
2801
+ [[1, 2], [3, 4]]
2802
+ [[1, 2, 3], [4]]
2803
+ [[1, 2, 3, 4], []]
2804
+
2805
+ See Also
2806
+ ========
2807
+
2808
+ sequence_partitions
2809
+ """
2810
+ if n < 1:
2811
+ return
2812
+ if n == 1:
2813
+ yield [l]
2814
+ return
2815
+ for i in range(0, len(l) + 1):
2816
+ for part in sequence_partitions_empty(l[i:], n - 1):
2817
+ yield [l[:i]] + part
2818
+
2819
+
2820
+ def kbins(l, k, ordered=None):
2821
+ """
2822
+ Return sequence ``l`` partitioned into ``k`` bins.
2823
+
2824
+ Examples
2825
+ ========
2826
+
2827
+ The default is to give the items in the same order, but grouped
2828
+ into k partitions without any reordering:
2829
+
2830
+ >>> from sympy.utilities.iterables import kbins
2831
+ >>> for p in kbins(list(range(5)), 2):
2832
+ ... print(p)
2833
+ ...
2834
+ [[0], [1, 2, 3, 4]]
2835
+ [[0, 1], [2, 3, 4]]
2836
+ [[0, 1, 2], [3, 4]]
2837
+ [[0, 1, 2, 3], [4]]
2838
+
2839
+ The ``ordered`` flag is either None (to give the simple partition
2840
+ of the elements) or is a 2 digit integer indicating whether the order of
2841
+ the bins and the order of the items in the bins matters. Given::
2842
+
2843
+ A = [[0], [1, 2]]
2844
+ B = [[1, 2], [0]]
2845
+ C = [[2, 1], [0]]
2846
+ D = [[0], [2, 1]]
2847
+
2848
+ the following values for ``ordered`` have the shown meanings::
2849
+
2850
+ 00 means A == B == C == D
2851
+ 01 means A == B
2852
+ 10 means A == D
2853
+ 11 means A == A
2854
+
2855
+ >>> for ordered_flag in [None, 0, 1, 10, 11]:
2856
+ ... print('ordered = %s' % ordered_flag)
2857
+ ... for p in kbins(list(range(3)), 2, ordered=ordered_flag):
2858
+ ... print(' %s' % p)
2859
+ ...
2860
+ ordered = None
2861
+ [[0], [1, 2]]
2862
+ [[0, 1], [2]]
2863
+ ordered = 0
2864
+ [[0, 1], [2]]
2865
+ [[0, 2], [1]]
2866
+ [[0], [1, 2]]
2867
+ ordered = 1
2868
+ [[0], [1, 2]]
2869
+ [[0], [2, 1]]
2870
+ [[1], [0, 2]]
2871
+ [[1], [2, 0]]
2872
+ [[2], [0, 1]]
2873
+ [[2], [1, 0]]
2874
+ ordered = 10
2875
+ [[0, 1], [2]]
2876
+ [[2], [0, 1]]
2877
+ [[0, 2], [1]]
2878
+ [[1], [0, 2]]
2879
+ [[0], [1, 2]]
2880
+ [[1, 2], [0]]
2881
+ ordered = 11
2882
+ [[0], [1, 2]]
2883
+ [[0, 1], [2]]
2884
+ [[0], [2, 1]]
2885
+ [[0, 2], [1]]
2886
+ [[1], [0, 2]]
2887
+ [[1, 0], [2]]
2888
+ [[1], [2, 0]]
2889
+ [[1, 2], [0]]
2890
+ [[2], [0, 1]]
2891
+ [[2, 0], [1]]
2892
+ [[2], [1, 0]]
2893
+ [[2, 1], [0]]
2894
+
2895
+ See Also
2896
+ ========
2897
+
2898
+ partitions, multiset_partitions
2899
+
2900
+ """
2901
+ if ordered is None:
2902
+ yield from sequence_partitions(l, k)
2903
+ elif ordered == 11:
2904
+ for pl in multiset_permutations(l):
2905
+ pl = list(pl)
2906
+ yield from sequence_partitions(pl, k)
2907
+ elif ordered == 00:
2908
+ yield from multiset_partitions(l, k)
2909
+ elif ordered == 10:
2910
+ for p in multiset_partitions(l, k):
2911
+ for perm in permutations(p):
2912
+ yield list(perm)
2913
+ elif ordered == 1:
2914
+ for kgot, p in partitions(len(l), k, size=True):
2915
+ if kgot != k:
2916
+ continue
2917
+ for li in multiset_permutations(l):
2918
+ rv = []
2919
+ i = j = 0
2920
+ li = list(li)
2921
+ for size, multiplicity in sorted(p.items()):
2922
+ for m in range(multiplicity):
2923
+ j = i + size
2924
+ rv.append(li[i: j])
2925
+ i = j
2926
+ yield rv
2927
+ else:
2928
+ raise ValueError(
2929
+ 'ordered must be one of 00, 01, 10 or 11, not %s' % ordered)
2930
+
2931
+
2932
+ def permute_signs(t):
2933
+ """Return iterator in which the signs of non-zero elements
2934
+ of t are permuted.
2935
+
2936
+ Examples
2937
+ ========
2938
+
2939
+ >>> from sympy.utilities.iterables import permute_signs
2940
+ >>> list(permute_signs((0, 1, 2)))
2941
+ [(0, 1, 2), (0, -1, 2), (0, 1, -2), (0, -1, -2)]
2942
+ """
2943
+ for signs in product(*[(1, -1)]*(len(t) - t.count(0))):
2944
+ signs = list(signs)
2945
+ yield type(t)([i*signs.pop() if i else i for i in t])
2946
+
2947
+
2948
+ def signed_permutations(t):
2949
+ """Return iterator in which the signs of non-zero elements
2950
+ of t and the order of the elements are permuted.
2951
+
2952
+ Examples
2953
+ ========
2954
+
2955
+ >>> from sympy.utilities.iterables import signed_permutations
2956
+ >>> list(signed_permutations((0, 1, 2)))
2957
+ [(0, 1, 2), (0, -1, 2), (0, 1, -2), (0, -1, -2), (0, 2, 1),
2958
+ (0, -2, 1), (0, 2, -1), (0, -2, -1), (1, 0, 2), (-1, 0, 2),
2959
+ (1, 0, -2), (-1, 0, -2), (1, 2, 0), (-1, 2, 0), (1, -2, 0),
2960
+ (-1, -2, 0), (2, 0, 1), (-2, 0, 1), (2, 0, -1), (-2, 0, -1),
2961
+ (2, 1, 0), (-2, 1, 0), (2, -1, 0), (-2, -1, 0)]
2962
+ """
2963
+ return (type(t)(i) for j in permutations(t)
2964
+ for i in permute_signs(j))
2965
+
2966
+
2967
+ def rotations(s, dir=1):
2968
+ """Return a generator giving the items in s as list where
2969
+ each subsequent list has the items rotated to the left (default)
2970
+ or right (``dir=-1``) relative to the previous list.
2971
+
2972
+ Examples
2973
+ ========
2974
+
2975
+ >>> from sympy import rotations
2976
+ >>> list(rotations([1,2,3]))
2977
+ [[1, 2, 3], [2, 3, 1], [3, 1, 2]]
2978
+ >>> list(rotations([1,2,3], -1))
2979
+ [[1, 2, 3], [3, 1, 2], [2, 3, 1]]
2980
+ """
2981
+ seq = list(s)
2982
+ for i in range(len(seq)):
2983
+ yield seq
2984
+ seq = rotate_left(seq, dir)
2985
+
2986
+
2987
+ def roundrobin(*iterables):
2988
+ """roundrobin recipe taken from itertools documentation:
2989
+ https://docs.python.org/3/library/itertools.html#itertools-recipes
2990
+
2991
+ roundrobin('ABC', 'D', 'EF') --> A D E B F C
2992
+
2993
+ Recipe credited to George Sakkis
2994
+ """
2995
+ nexts = cycle(iter(it).__next__ for it in iterables)
2996
+
2997
+ pending = len(iterables)
2998
+ while pending:
2999
+ try:
3000
+ for nxt in nexts:
3001
+ yield nxt()
3002
+ except StopIteration:
3003
+ pending -= 1
3004
+ nexts = cycle(islice(nexts, pending))
3005
+
3006
+
3007
+
3008
+ class NotIterable:
3009
+ """
3010
+ Use this as mixin when creating a class which is not supposed to
3011
+ return true when iterable() is called on its instances because
3012
+ calling list() on the instance, for example, would result in
3013
+ an infinite loop.
3014
+ """
3015
+ pass
3016
+
3017
+
3018
+ def iterable(i, exclude=(str, dict, NotIterable)):
3019
+ """
3020
+ Return a boolean indicating whether ``i`` is SymPy iterable.
3021
+ True also indicates that the iterator is finite, e.g. you can
3022
+ call list(...) on the instance.
3023
+
3024
+ When SymPy is working with iterables, it is almost always assuming
3025
+ that the iterable is not a string or a mapping, so those are excluded
3026
+ by default. If you want a pure Python definition, make exclude=None. To
3027
+ exclude multiple items, pass them as a tuple.
3028
+
3029
+ You can also set the _iterable attribute to True or False on your class,
3030
+ which will override the checks here, including the exclude test.
3031
+
3032
+ As a rule of thumb, some SymPy functions use this to check if they should
3033
+ recursively map over an object. If an object is technically iterable in
3034
+ the Python sense but does not desire this behavior (e.g., because its
3035
+ iteration is not finite, or because iteration might induce an unwanted
3036
+ computation), it should disable it by setting the _iterable attribute to False.
3037
+
3038
+ See also: is_sequence
3039
+
3040
+ Examples
3041
+ ========
3042
+
3043
+ >>> from sympy.utilities.iterables import iterable
3044
+ >>> from sympy import Tuple
3045
+ >>> things = [[1], (1,), set([1]), Tuple(1), (j for j in [1, 2]), {1:2}, '1', 1]
3046
+ >>> for i in things:
3047
+ ... print('%s %s' % (iterable(i), type(i)))
3048
+ True <... 'list'>
3049
+ True <... 'tuple'>
3050
+ True <... 'set'>
3051
+ True <class 'sympy.core.containers.Tuple'>
3052
+ True <... 'generator'>
3053
+ False <... 'dict'>
3054
+ False <... 'str'>
3055
+ False <... 'int'>
3056
+
3057
+ >>> iterable({}, exclude=None)
3058
+ True
3059
+ >>> iterable({}, exclude=str)
3060
+ True
3061
+ >>> iterable("no", exclude=str)
3062
+ False
3063
+
3064
+ """
3065
+ if hasattr(i, '_iterable'):
3066
+ return i._iterable
3067
+ try:
3068
+ iter(i)
3069
+ except TypeError:
3070
+ return False
3071
+ if exclude:
3072
+ return not isinstance(i, exclude)
3073
+ return True
3074
+
3075
+
3076
+ def is_sequence(i, include=None):
3077
+ """
3078
+ Return a boolean indicating whether ``i`` is a sequence in the SymPy
3079
+ sense. If anything that fails the test below should be included as
3080
+ being a sequence for your application, set 'include' to that object's
3081
+ type; multiple types should be passed as a tuple of types.
3082
+
3083
+ Note: although generators can generate a sequence, they often need special
3084
+ handling to make sure their elements are captured before the generator is
3085
+ exhausted, so these are not included by default in the definition of a
3086
+ sequence.
3087
+
3088
+ See also: iterable
3089
+
3090
+ Examples
3091
+ ========
3092
+
3093
+ >>> from sympy.utilities.iterables import is_sequence
3094
+ >>> from types import GeneratorType
3095
+ >>> is_sequence([])
3096
+ True
3097
+ >>> is_sequence(set())
3098
+ False
3099
+ >>> is_sequence('abc')
3100
+ False
3101
+ >>> is_sequence('abc', include=str)
3102
+ True
3103
+ >>> generator = (c for c in 'abc')
3104
+ >>> is_sequence(generator)
3105
+ False
3106
+ >>> is_sequence(generator, include=(str, GeneratorType))
3107
+ True
3108
+
3109
+ """
3110
+ return (hasattr(i, '__getitem__') and
3111
+ iterable(i) or
3112
+ bool(include) and
3113
+ isinstance(i, include))
3114
+
3115
+
3116
+ @deprecated(
3117
+ """
3118
+ Using postorder_traversal from the sympy.utilities.iterables submodule is
3119
+ deprecated.
3120
+
3121
+ Instead, use postorder_traversal from the top-level sympy namespace, like
3122
+
3123
+ sympy.postorder_traversal
3124
+ """,
3125
+ deprecated_since_version="1.10",
3126
+ active_deprecations_target="deprecated-traversal-functions-moved")
3127
+ def postorder_traversal(node, keys=None):
3128
+ from sympy.core.traversal import postorder_traversal as _postorder_traversal
3129
+ return _postorder_traversal(node, keys=keys)
3130
+
3131
+
3132
+ @deprecated(
3133
+ """
3134
+ Using interactive_traversal from the sympy.utilities.iterables submodule
3135
+ is deprecated.
3136
+
3137
+ Instead, use interactive_traversal from the top-level sympy namespace,
3138
+ like
3139
+
3140
+ sympy.interactive_traversal
3141
+ """,
3142
+ deprecated_since_version="1.10",
3143
+ active_deprecations_target="deprecated-traversal-functions-moved")
3144
+ def interactive_traversal(expr):
3145
+ from sympy.interactive.traversal import interactive_traversal as _interactive_traversal
3146
+ return _interactive_traversal(expr)
3147
+
3148
+
3149
+ @deprecated(
3150
+ """
3151
+ Importing default_sort_key from sympy.utilities.iterables is deprecated.
3152
+ Use from sympy import default_sort_key instead.
3153
+ """,
3154
+ deprecated_since_version="1.10",
3155
+ active_deprecations_target="deprecated-sympy-core-compatibility",
3156
+ )
3157
+ def default_sort_key(*args, **kwargs):
3158
+ from sympy import default_sort_key as _default_sort_key
3159
+ return _default_sort_key(*args, **kwargs)
3160
+
3161
+
3162
+ @deprecated(
3163
+ """
3164
+ Importing default_sort_key from sympy.utilities.iterables is deprecated.
3165
+ Use from sympy import default_sort_key instead.
3166
+ """,
3167
+ deprecated_since_version="1.10",
3168
+ active_deprecations_target="deprecated-sympy-core-compatibility",
3169
+ )
3170
+ def ordered(*args, **kwargs):
3171
+ from sympy import ordered as _ordered
3172
+ return _ordered(*args, **kwargs)
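
The grouping and partitioning logic added above can be sketched in plain Python without importing SymPy. The names ``runs_sketch`` and ``seq_partitions_sketch`` are illustrative stand-ins, not part of SymPy; they mirror the ``runs`` and ``sequence_partitions`` bodies in this hunk:

```python
from operator import gt

def runs_sketch(seq, op=gt):
    # Group consecutive elements while op(next, previous) holds,
    # starting a new run as soon as the comparison fails.
    out, run = [], []
    for x in seq:
        if run and not op(x, run[-1]):
            out.append(run)
            run = []
        run.append(x)
    if run:
        out.append(run)
    return out

def seq_partitions_sketch(l, n):
    # Yield every split of l into n nonempty contiguous bins by
    # choosing the first bin and recursing on the remainder.
    if n == 1 and l:
        yield [l]
        return
    for i in range(1, len(l)):
        for part in seq_partitions_sketch(l[i:], n - 1):
            yield [l[:i]] + part

print(runs_sketch([0, 1, 2, 2, 1, 4, 3, 2, 2]))
# [[0, 1, 2], [2], [1, 4], [3], [2], [2]]
print(list(seq_partitions_sketch([1, 2, 3, 4], 2)))
# [[[1], [2, 3, 4]], [[1, 2], [3, 4]], [[1, 2, 3], [4]]]
```

The recursion shows why ``sequence_partitions`` requires nonempty bins: the first cut index starts at 1, so each bin receives at least one element.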
venv/lib/python3.10/site-packages/sympy/utilities/lambdify.py ADDED
@@ -0,0 +1,1526 @@
1
+ """
2
+ This module provides convenient functions to transform SymPy expressions to
3
+ lambda functions which can be used to calculate numerical values very fast.
4
+ """
5
+
6
+ from __future__ import annotations
7
+ from typing import Any
8
+
9
+ import builtins
10
+ import inspect
11
+ import keyword
12
+ import textwrap
13
+ import linecache
14
+
15
+ # Required despite static analysis claiming it is not used
16
+ from sympy.external import import_module # noqa:F401
17
+ from sympy.utilities.exceptions import sympy_deprecation_warning
18
+ from sympy.utilities.decorator import doctest_depends_on
19
+ from sympy.utilities.iterables import (is_sequence, iterable,
20
+ NotIterable, flatten)
21
+ from sympy.utilities.misc import filldedent
22
+
23
+ __doctest_requires__ = {('lambdify',): ['numpy', 'tensorflow']}
24
+
25
+ # Default namespaces, letting us define translations that can't be defined
26
+ # by simple variable maps, like I => 1j
27
+ MATH_DEFAULT: dict[str, Any] = {}
28
+ MPMATH_DEFAULT: dict[str, Any] = {}
29
+ NUMPY_DEFAULT: dict[str, Any] = {"I": 1j}
30
+ SCIPY_DEFAULT: dict[str, Any] = {"I": 1j}
31
+ CUPY_DEFAULT: dict[str, Any] = {"I": 1j}
32
+ JAX_DEFAULT: dict[str, Any] = {"I": 1j}
33
+ TENSORFLOW_DEFAULT: dict[str, Any] = {}
34
+ SYMPY_DEFAULT: dict[str, Any] = {}
35
+ NUMEXPR_DEFAULT: dict[str, Any] = {}
36
+
37
+ # These are the namespaces the lambda functions will use.
38
+ # These are separate from the names above because they are modified
39
+ # throughout this file, whereas the defaults should remain unmodified.
40
+
41
+ MATH = MATH_DEFAULT.copy()
42
+ MPMATH = MPMATH_DEFAULT.copy()
43
+ NUMPY = NUMPY_DEFAULT.copy()
44
+ SCIPY = SCIPY_DEFAULT.copy()
45
+ CUPY = CUPY_DEFAULT.copy()
46
+ JAX = JAX_DEFAULT.copy()
47
+ TENSORFLOW = TENSORFLOW_DEFAULT.copy()
48
+ SYMPY = SYMPY_DEFAULT.copy()
49
+ NUMEXPR = NUMEXPR_DEFAULT.copy()
50
+
51
+
52
+ # Mappings between SymPy and other modules function names.
53
+ MATH_TRANSLATIONS = {
54
+ "ceiling": "ceil",
55
+ "E": "e",
56
+ "ln": "log",
57
+ }
58
+
59
+ # NOTE: This dictionary is reused in Function._eval_evalf to allow subclasses
60
+ # of Function to automatically evalf.
61
+ MPMATH_TRANSLATIONS = {
62
+ "Abs": "fabs",
63
+ "elliptic_k": "ellipk",
64
+ "elliptic_f": "ellipf",
65
+ "elliptic_e": "ellipe",
66
+ "elliptic_pi": "ellippi",
67
+ "ceiling": "ceil",
68
+ "chebyshevt": "chebyt",
69
+ "chebyshevu": "chebyu",
70
+ "E": "e",
71
+ "I": "j",
72
+ "ln": "log",
73
+ #"lowergamma":"lower_gamma",
74
+ "oo": "inf",
75
+ #"uppergamma":"upper_gamma",
76
+ "LambertW": "lambertw",
77
+ "MutableDenseMatrix": "matrix",
78
+ "ImmutableDenseMatrix": "matrix",
79
+ "conjugate": "conj",
80
+ "dirichlet_eta": "altzeta",
81
+ "Ei": "ei",
82
+ "Shi": "shi",
83
+ "Chi": "chi",
84
+ "Si": "si",
85
+ "Ci": "ci",
86
+ "RisingFactorial": "rf",
87
+ "FallingFactorial": "ff",
88
+ "betainc_regularized": "betainc",
89
+ }
90
+
91
+ NUMPY_TRANSLATIONS: dict[str, str] = {
92
+ "Heaviside": "heaviside",
93
+ }
94
+ SCIPY_TRANSLATIONS: dict[str, str] = {}
95
+ CUPY_TRANSLATIONS: dict[str, str] = {}
96
+ JAX_TRANSLATIONS: dict[str, str] = {}
97
+
98
+ TENSORFLOW_TRANSLATIONS: dict[str, str] = {}
99
+
100
+ NUMEXPR_TRANSLATIONS: dict[str, str] = {}
101
+
102
+ # Available modules:
103
+ MODULES = {
104
+ "math": (MATH, MATH_DEFAULT, MATH_TRANSLATIONS, ("from math import *",)),
105
+ "mpmath": (MPMATH, MPMATH_DEFAULT, MPMATH_TRANSLATIONS, ("from mpmath import *",)),
106
+ "numpy": (NUMPY, NUMPY_DEFAULT, NUMPY_TRANSLATIONS, ("import numpy; from numpy import *; from numpy.linalg import *",)),
107
+ "scipy": (SCIPY, SCIPY_DEFAULT, SCIPY_TRANSLATIONS, ("import scipy; import numpy; from scipy.special import *",)),
108
+ "cupy": (CUPY, CUPY_DEFAULT, CUPY_TRANSLATIONS, ("import cupy",)),
109
+ "jax": (JAX, JAX_DEFAULT, JAX_TRANSLATIONS, ("import jax",)),
110
+ "tensorflow": (TENSORFLOW, TENSORFLOW_DEFAULT, TENSORFLOW_TRANSLATIONS, ("import tensorflow",)),
111
+ "sympy": (SYMPY, SYMPY_DEFAULT, {}, (
112
+ "from sympy.functions import *",
113
+ "from sympy.matrices import *",
114
+ "from sympy import Integral, pi, oo, nan, zoo, E, I",)),
115
+ "numexpr" : (NUMEXPR, NUMEXPR_DEFAULT, NUMEXPR_TRANSLATIONS,
116
+ ("import_module('numexpr')", )),
117
+ }
118
+
119
+
120
+ def _import(module, reload=False):
121
+ """
122
+ Creates a global translation dictionary for module.
123
+
124
+ The argument module has to be one of the following strings: "math",
125
+ "mpmath", "numpy", "sympy", "tensorflow", "jax".
126
+ These dictionaries map names of Python functions to their equivalent in
127
+ other modules.
128
+ """
129
+ try:
130
+ namespace, namespace_default, translations, import_commands = MODULES[
131
+ module]
132
+ except KeyError:
133
+ raise NameError(
134
+ "'%s' module cannot be used for lambdification" % module)
135
+
136
+ # Clear namespace or exit
137
+ if namespace != namespace_default:
138
+ # The namespace was already generated, don't do it again if not forced.
139
+ if reload:
140
+ namespace.clear()
141
+ namespace.update(namespace_default)
142
+ else:
143
+ return
144
+
145
+ for import_command in import_commands:
146
+ if import_command.startswith('import_module'):
147
+ module = eval(import_command)
148
+
149
+ if module is not None:
150
+ namespace.update(module.__dict__)
151
+ continue
152
+ else:
153
+ try:
154
+ exec(import_command, {}, namespace)
155
+ continue
156
+ except ImportError:
157
+ pass
158
+
159
+ raise ImportError(
160
+ "Cannot import '%s' with '%s' command" % (module, import_command))
161
+
162
+ # Add translated names to namespace
163
+ for sympyname, translation in translations.items():
164
+ namespace[sympyname] = namespace[translation]
165
+
166
+ # For computing the modulus of a SymPy expression we use the builtin abs
167
+ # function, instead of the previously used fabs function for all
168
+ # translation modules. This is because the fabs function in the math
169
+ # module does not accept complex valued arguments. (see issue 9474). The
170
+ # only exception, where we don't use the builtin abs function is the
171
+ # mpmath translation module, because mpmath.fabs returns mpf objects in
172
+ # contrast to abs().
173
+ if 'Abs' not in namespace:
174
+ namespace['Abs'] = abs
175
+
176
+ # Used for dynamically generated filenames that are inserted into the
177
+ # linecache.
178
+ _lambdify_generated_counter = 1
179
+
180
+
181
+ @doctest_depends_on(modules=('numpy', 'scipy', 'tensorflow',), python_version=(3,))
182
+ def lambdify(args, expr, modules=None, printer=None, use_imps=True,
183
+ dummify=False, cse=False, docstring_limit=1000):
184
+ """Convert a SymPy expression into a function that allows for fast
185
+ numeric evaluation.
186
+
187
+ .. warning::
188
+ This function uses ``exec``, and thus should not be used on
189
+ unsanitized input.
190
+
191
+ .. deprecated:: 1.7
192
+ Passing a set for the *args* parameter is deprecated as sets are
193
+ unordered. Use an ordered iterable such as a list or tuple.
194
+
195
+ Explanation
196
+ ===========
197
+
198
+ For example, to convert the SymPy expression ``sin(x) + cos(x)`` to an
199
+ equivalent NumPy function that numerically evaluates it:
200
+
201
+ >>> from sympy import sin, cos, symbols, lambdify
202
+ >>> import numpy as np
203
+ >>> x = symbols('x')
204
+ >>> expr = sin(x) + cos(x)
205
+ >>> expr
206
+ sin(x) + cos(x)
207
+ >>> f = lambdify(x, expr, 'numpy')
208
+ >>> a = np.array([1, 2])
209
+ >>> f(a)
210
+ [1.38177329 0.49315059]
211
+
212
+ The primary purpose of this function is to provide a bridge from SymPy
213
+ expressions to numerical libraries such as NumPy, SciPy, NumExpr, mpmath,
214
+ and tensorflow. In general, SymPy functions do not work with objects from
215
+ other libraries, such as NumPy arrays, and functions from numeric
216
+ libraries like NumPy or mpmath do not work on SymPy expressions.
217
+ ``lambdify`` bridges the two by converting a SymPy expression to an
218
+ equivalent numeric function.
219
+
220
+ The basic workflow with ``lambdify`` is to first create a SymPy expression
221
+ representing whatever mathematical function you wish to evaluate. This
222
+ should be done using only SymPy functions and expressions. Then, use
223
+ ``lambdify`` to convert this to an equivalent function for numerical
224
+ evaluation. For instance, above we created ``expr`` using the SymPy symbol
225
+ ``x`` and SymPy functions ``sin`` and ``cos``, then converted it to an
226
+ equivalent NumPy function ``f``, and called it on a NumPy array ``a``.
227
+
228
+ Parameters
229
+ ==========
230
+
231
+ args : List[Symbol]
232
+ A variable or a list of variables whose nesting represents the
233
+ nesting of the arguments that will be passed to the function.
234
+
235
+ Variables can be symbols, undefined functions, or matrix symbols.
236
+
237
+ >>> from sympy import Eq
238
+ >>> from sympy.abc import x, y, z
239
+
240
+ The list of variables should match the structure of how the
241
+ arguments will be passed to the function. Simply enclose the
242
+ parameters as they will be passed in a list.
243
+
244
+ To call a function like ``f(x)`` then ``[x]``
245
+ should be the first argument to ``lambdify``; for this
246
+ case a single ``x`` can also be used:
247
+
248
+ >>> f = lambdify(x, x + 1)
249
+ >>> f(1)
250
+ 2
251
+ >>> f = lambdify([x], x + 1)
252
+ >>> f(1)
253
+ 2
254
+
255
+ To call a function like ``f(x, y)`` then ``[x, y]`` will
256
+ be the first argument of the ``lambdify``:
257
+
258
+ >>> f = lambdify([x, y], x + y)
259
+ >>> f(1, 1)
260
+ 2
261
+
262
+ To call a function with a single 3-element tuple like
263
+ ``f((x, y, z))`` then ``[(x, y, z)]`` will be the first
264
+ argument of the ``lambdify``:
265
+
266
+ >>> f = lambdify([(x, y, z)], Eq(z**2, x**2 + y**2))
267
+ >>> f((3, 4, 5))
268
+ True
269
+
270
+ If two args will be passed and the first is a scalar but
271
+ the second is a tuple with two arguments then the items
272
+ in the list should match that structure:
273
+
274
+ >>> f = lambdify([x, (y, z)], x + y + z)
275
+ >>> f(1, (2, 3))
276
+ 6
277
+
278
+ expr : Expr
279
+ An expression, list of expressions, or matrix to be evaluated.
280
+
281
+ Lists may be nested.
282
+ If the expression is a list, the output will also be a list.
283
+
284
+ >>> f = lambdify(x, [x, [x + 1, x + 2]])
285
+ >>> f(1)
286
+ [1, [2, 3]]
287
+
288
+ If it is a matrix, an array will be returned (for the NumPy module).
289
+
290
+ >>> from sympy import Matrix
291
+ >>> f = lambdify(x, Matrix([x, x + 1]))
292
+ >>> f(1)
293
+ [[1]
294
+ [2]]
295
+
296
+ Note that the argument order here (variables then expression) is used
297
+ to emulate the Python ``lambda`` keyword. ``lambdify(x, expr)`` works
298
+ (roughly) like ``lambda x: expr``
299
+ (see :ref:`lambdify-how-it-works` below).
300
+
301
+ modules : str, optional
302
+ Specifies the numeric library to use.
303
+
304
+ If not specified, *modules* defaults to:
305
+
306
+ - ``["scipy", "numpy"]`` if SciPy is installed
307
+ - ``["numpy"]`` if only NumPy is installed
308
+ - ``["math", "mpmath", "sympy"]`` if neither is installed.
309
+
310
+ That is, SymPy functions are replaced as far as possible by
311
+ either ``scipy`` or ``numpy`` functions if available, and Python's
312
+ standard library ``math``, or ``mpmath`` functions otherwise.
313
+
314
+ *modules* can be one of the following types:
315
+
316
+ - The strings ``"math"``, ``"mpmath"``, ``"numpy"``, ``"numexpr"``,
317
+ ``"scipy"``, ``"sympy"``, or ``"tensorflow"`` or ``"jax"``. This uses the
318
+ corresponding printer and namespace mapping for that module.
319
+ - A module (e.g., ``math``). This uses the global namespace of the
320
+ module. If the module is one of the above known modules, it will
321
+ also use the corresponding printer and namespace mapping
322
+ (i.e., ``modules=numpy`` is equivalent to ``modules="numpy"``).
323
+ - A dictionary that maps names of SymPy functions to arbitrary
324
+ functions
325
+ (e.g., ``{'sin': custom_sin}``).
326
+ - A list that contains a mix of the arguments above, with higher
327
+ priority given to entries appearing first
328
+ (e.g., to use the NumPy module but override the ``sin`` function
329
+ with a custom version, you can use
330
+ ``[{'sin': custom_sin}, 'numpy']``).
331
+
332
+ dummify : bool, optional
333
+ Whether or not the variables in the provided expression that are not
334
+ valid Python identifiers are substituted with dummy symbols.
335
+
336
+ This allows for undefined functions like ``Function('f')(t)`` to be
337
+ supplied as arguments. By default, the variables are only dummified
338
+ if they are not valid Python identifiers.
339
+
340
+ Set ``dummify=True`` to replace all arguments with dummy symbols
341
+ (if ``args`` is not a string) - for example, to ensure that the
342
+ arguments do not redefine any built-in names.
343
+
344
+ cse : bool, or callable, optional
345
+ Large expressions can be computed more efficiently when
346
+ common subexpressions are identified and precomputed before
347
+ being used multiple time. Finding the subexpressions will make
348
+ creation of the 'lambdify' function slower, however.
349
+
350
+ When ``True``, ``sympy.simplify.cse`` is used, otherwise (the default)
351
+ the user may pass a function matching the ``cse`` signature.
352
+
353
+ docstring_limit : int or None
354
+ When lambdifying large expressions, a significant proportion of the time
355
+ spent inside ``lambdify`` is spent producing a string representation of
356
+ the expression for use in the automatically generated docstring of the
357
+ returned function. For expressions containing hundreds or more nodes the
358
+ resulting docstring often becomes so long and dense that it is difficult
359
+ to read. To reduce the runtime of lambdify, the rendering of the full
360
+ expression inside the docstring can be disabled.
361
+
362
+ When ``None``, the full expression is rendered in the docstring. When
363
+ ``0`` or a negative ``int``, an ellipsis is rendering in the docstring
364
+ instead of the expression. When a strictly positive ``int``, if the
365
+ number of nodes in the expression exceeds ``docstring_limit`` an
366
+ ellipsis is rendered in the docstring, otherwise a string representation
367
+ of the expression is rendered as normal. The default is ``1000``.
368
+
369
+ Examples
370
+ ========
371
+
372
+ >>> from sympy.utilities.lambdify import implemented_function
373
+ >>> from sympy import sqrt, sin, Matrix
374
+ >>> from sympy import Function
375
+ >>> from sympy.abc import w, x, y, z
376
+
377
+ >>> f = lambdify(x, x**2)
378
+ >>> f(2)
379
+ 4
380
+ >>> f = lambdify((x, y, z), [z, y, x])
381
+ >>> f(1,2,3)
382
+ [3, 2, 1]
383
+ >>> f = lambdify(x, sqrt(x))
384
+ >>> f(4)
385
+ 2.0
386
+ >>> f = lambdify((x, y), sin(x*y)**2)
387
+ >>> f(0, 5)
388
+ 0.0
389
+ >>> row = lambdify((x, y), Matrix((x, x + y)).T, modules='sympy')
390
+ >>> row(1, 2)
391
+ Matrix([[1, 3]])
392
+
393
+ ``lambdify`` can be used to translate SymPy expressions into mpmath
394
+ functions. This may be preferable to using ``evalf`` (which uses mpmath on
395
+ the backend) in some cases.
396
+
397
+ >>> f = lambdify(x, sin(x), 'mpmath')
398
+ >>> f(1)
399
+ 0.8414709848078965
400
+
401
+ Tuple arguments are handled and the lambdified function should
402
+ be called with the same type of arguments as were used to create
403
+ the function:
404
+
405
+ >>> f = lambdify((x, (y, z)), x + y)
406
+ >>> f(1, (2, 4))
407
+ 3
408
+
409
+ The ``flatten`` function can be used to always work with flattened
410
+ arguments:
411
+
412
+ >>> from sympy.utilities.iterables import flatten
413
+ >>> args = w, (x, (y, z))
414
+ >>> vals = 1, (2, (3, 4))
415
+ >>> f = lambdify(flatten(args), w + x + y + z)
416
+ >>> f(*flatten(vals))
417
+ 10
418
+
419
+ Functions present in ``expr`` can also carry their own numerical
420
+ implementations, in a callable attached to the ``_imp_`` attribute. This
421
+ can be used with undefined functions using the ``implemented_function``
422
+ factory:
423
+
424
+ >>> f = implemented_function(Function('f'), lambda x: x+1)
425
+ >>> func = lambdify(x, f(x))
426
+ >>> func(4)
427
+ 5
428
+
429
+ ``lambdify`` always prefers ``_imp_`` implementations to implementations
430
+ in other namespaces, unless the ``use_imps`` input parameter is False.
431
+
432
+ Usage with Tensorflow:
433
+
434
+ >>> import tensorflow as tf
435
+ >>> from sympy import Max, sin, lambdify
436
+ >>> from sympy.abc import x
437
+
438
+ >>> f = Max(x, sin(x))
439
+ >>> func = lambdify(x, f, 'tensorflow')
440
+
441
+ Since tensorflow v2, eager execution is enabled by default.
442
+ If you want to get results that are compatible across tensorflow v1
443
+ and v2, as in this tutorial, run this line:
444
+
445
+ >>> tf.compat.v1.enable_eager_execution()
446
+
447
+ If you have eager execution enabled, you get the result
448
+ immediately, just as you would with numpy.
449
+
450
+ If you pass tensorflow objects, you may get an ``EagerTensor``
451
+ object instead of a plain value.
452
+
453
+ >>> result = func(tf.constant(1.0))
454
+ >>> print(result)
455
+ tf.Tensor(1.0, shape=(), dtype=float32)
456
+ >>> print(result.__class__)
457
+ <class 'tensorflow.python.framework.ops.EagerTensor'>
458
+
459
+ You can use ``.numpy()`` to get the numpy value of the tensor.
460
+
461
+ >>> result.numpy()
462
+ 1.0
463
+
464
+ >>> var = tf.Variable(2.0)
465
+ >>> result = func(var) # also works for tf.Variable and tf.Placeholder
466
+ >>> result.numpy()
467
+ 2.0
468
+
469
+ It also works with arrays of any shape.
470
+
471
+ >>> tensor = tf.constant([[1.0, 2.0], [3.0, 4.0]])
472
+ >>> result = func(tensor)
473
+ >>> result.numpy()
474
+ [[1. 2.]
475
+ [3. 4.]]
476
+
477
+ Notes
478
+ =====
479
+
480
+ - For functions involving large array calculations, numexpr can provide a
481
+ significant speedup over numpy. Please note that the available functions
482
+ for numexpr are more limited than those of numpy, but can be expanded with
483
+ ``implemented_function`` and user-defined subclasses of Function. If
484
+ specified, numexpr must be the only item in ``modules``. The official list
485
+ of numexpr functions can be found at:
486
+ https://numexpr.readthedocs.io/projects/NumExpr3/en/latest/user_guide.html#supported-functions
487
+
488
+ - In the above examples, the generated functions can accept scalar
489
+ values or numpy arrays as arguments. However, in some cases
490
+ the generated function relies on the input being a numpy array:
491
+
492
+ >>> import numpy
493
+ >>> from sympy import Piecewise
494
+ >>> from sympy.testing.pytest import ignore_warnings
495
+ >>> f = lambdify(x, Piecewise((x, x <= 1), (1/x, x > 1)), "numpy")
496
+
497
+ >>> with ignore_warnings(RuntimeWarning):
498
+ ... f(numpy.array([-1, 0, 1, 2]))
499
+ [-1. 0. 1. 0.5]
500
+
501
+ >>> f(0)
502
+ Traceback (most recent call last):
503
+ ...
504
+ ZeroDivisionError: division by zero
505
+
506
+ In such cases, the input should be wrapped in a numpy array:
507
+
508
+ >>> with ignore_warnings(RuntimeWarning):
509
+ ... float(f(numpy.array([0])))
510
+ 0.0
511
+
512
+ Or if numpy functionality is not required another module can be used:
513
+
514
+ >>> f = lambdify(x, Piecewise((x, x <= 1), (1/x, x > 1)), "math")
515
+ >>> f(0)
516
+ 0
517
+
518
+ .. _lambdify-how-it-works:
519
+
520
+ How it works
521
+ ============
522
+
523
+ When using this function, it helps a great deal to have an idea of what it
524
+ is doing. At its core, lambdify is nothing more than a namespace
525
+ translation, on top of a special printer that makes some corner cases work
526
+ properly.
527
+
528
+ To understand lambdify, first we must properly understand how Python
529
+ namespaces work. Say we had two files. One called ``sin_cos_sympy.py``,
530
+ with
531
+
532
+ .. code:: python
533
+
534
+ # sin_cos_sympy.py
535
+
536
+ from sympy.functions.elementary.trigonometric import (cos, sin)
537
+
538
+ def sin_cos(x):
539
+ return sin(x) + cos(x)
540
+
541
+
542
+ and one called ``sin_cos_numpy.py`` with
543
+
544
+ .. code:: python
545
+
546
+ # sin_cos_numpy.py
547
+
548
+ from numpy import sin, cos
549
+
550
+ def sin_cos(x):
551
+ return sin(x) + cos(x)
552
+
553
+ The two files define an identical function ``sin_cos``. However, in the
554
+ first file, ``sin`` and ``cos`` are defined as the SymPy ``sin`` and
555
+ ``cos``. In the second, they are defined as the NumPy versions.
556
+
557
+ If we were to import the first file and use the ``sin_cos`` function, we
558
+ would get something like
559
+
560
+ >>> from sin_cos_sympy import sin_cos # doctest: +SKIP
561
+ >>> sin_cos(1) # doctest: +SKIP
562
+ cos(1) + sin(1)
563
+
564
+ On the other hand, if we imported ``sin_cos`` from the second file, we
565
+ would get
566
+
567
+ >>> from sin_cos_numpy import sin_cos # doctest: +SKIP
568
+ >>> sin_cos(1) # doctest: +SKIP
569
+ 1.38177329068
570
+
571
+ In the first case we got a symbolic output, because it used the symbolic
572
+ ``sin`` and ``cos`` functions from SymPy. In the second, we got a numeric
573
+ result, because ``sin_cos`` used the numeric ``sin`` and ``cos`` functions
574
+ from NumPy. But notice that the versions of ``sin`` and ``cos`` that were
575
+ used was not inherent to the ``sin_cos`` function definition. Both
576
+ ``sin_cos`` definitions are exactly the same. Rather, it was based on the
577
+ names defined in the module where the ``sin_cos`` function was defined.
578
+
579
+ The key point here is that when a function in Python references a name that
580
+ is not defined in the function, that name is looked up in the "global"
581
+ namespace of the module where that function is defined.
582
+
583
+ Now, in Python, we can emulate this behavior without actually writing a
584
+ file to disk using the ``exec`` function. ``exec`` takes a string
585
+ containing a block of Python code, and a dictionary that should contain
586
+ the global variables of the module. It then executes the code "in" that
587
+ dictionary, as if it were the module globals. The following is equivalent
588
+ to the ``sin_cos`` defined in ``sin_cos_sympy.py``:
589
+
590
+ >>> import sympy
591
+ >>> module_dictionary = {'sin': sympy.sin, 'cos': sympy.cos}
592
+ >>> exec('''
593
+ ... def sin_cos(x):
594
+ ... return sin(x) + cos(x)
595
+ ... ''', module_dictionary)
596
+ >>> sin_cos = module_dictionary['sin_cos']
597
+ >>> sin_cos(1)
598
+ cos(1) + sin(1)
599
+
600
+ and similarly with ``sin_cos_numpy``:
601
+
602
+ >>> import numpy
603
+ >>> module_dictionary = {'sin': numpy.sin, 'cos': numpy.cos}
604
+ >>> exec('''
605
+ ... def sin_cos(x):
606
+ ... return sin(x) + cos(x)
607
+ ... ''', module_dictionary)
608
+ >>> sin_cos = module_dictionary['sin_cos']
609
+ >>> sin_cos(1)
610
+ 1.38177329068
611
+
612
+ So now we can get an idea of how ``lambdify`` works. The name "lambdify"
613
+ comes from the fact that we can think of something like ``lambdify(x,
614
+ sin(x) + cos(x), 'numpy')`` as ``lambda x: sin(x) + cos(x)``, where
615
+ ``sin`` and ``cos`` come from the ``numpy`` namespace. This is also why
616
+ the symbols argument is first in ``lambdify``, as opposed to most SymPy
617
+ functions where it comes after the expression: to better mimic the
618
+ ``lambda`` keyword.
619
+
620
+ ``lambdify`` takes the input expression (like ``sin(x) + cos(x)``) and
621
+
622
+ 1. Converts it to a string
623
+ 2. Creates a module globals dictionary based on the modules that are
624
+ passed in (by default, it uses the NumPy module)
625
+ 3. Creates the string ``"def func({vars}): return {expr}"``, where ``{vars}`` is the
626
+ list of variables separated by commas, and ``{expr}`` is the string
627
+ created in step 1., then ``exec``s that string with the module globals
628
+ namespace and returns ``func``.
629
+
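The three steps above can be sketched in miniature (a toy emulation, not lambdify's actual code; it handles a single argument and uses the ``math`` module as the namespace):

```python
import math

def toy_lambdify(varname, expr_str):
    # Step 2: build a module-globals dictionary.
    namespace = {'sin': math.sin, 'cos': math.cos}
    # Step 3: create the function source and exec it in that namespace.
    funcstr = 'def func({}): return {}'.format(varname, expr_str)
    exec(funcstr, namespace)
    return namespace['func']

# Step 1 (stringifying the expression) is done by hand here.
f = toy_lambdify('x', 'sin(x) + cos(x)')
assert abs(f(1) - (math.sin(1) + math.cos(1))) < 1e-12
```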
630
+ In fact, functions returned by ``lambdify`` support inspection. So you can
631
+ see exactly how they are defined by using ``inspect.getsource``, or ``??`` if you
632
+ are using IPython or the Jupyter notebook.
633
+
634
+ >>> f = lambdify(x, sin(x) + cos(x))
635
+ >>> import inspect
636
+ >>> print(inspect.getsource(f))
637
+ def _lambdifygenerated(x):
638
+ return sin(x) + cos(x)
639
+
640
+ This shows us the source code of the function, but not the namespace it
641
+ was defined in. We can inspect that by looking at the ``__globals__``
642
+ attribute of ``f``:
643
+
644
+ >>> f.__globals__['sin']
645
+ <ufunc 'sin'>
646
+ >>> f.__globals__['cos']
647
+ <ufunc 'cos'>
648
+ >>> f.__globals__['sin'] is numpy.sin
649
+ True
650
+
651
+ This shows us that ``sin`` and ``cos`` in the namespace of ``f`` will be
652
+ ``numpy.sin`` and ``numpy.cos``.
653
+
654
+ Note that there are some convenience layers in each of these steps, but at
655
+ the core, this is how ``lambdify`` works. Step 1 is done using the
656
+ ``LambdaPrinter`` printers defined in the printing module (see
657
+ :mod:`sympy.printing.lambdarepr`). This allows different SymPy expressions
658
+ to define how they should be converted to a string for different modules.
659
+ You can change which printer ``lambdify`` uses by passing a custom printer
660
+ in to the ``printer`` argument.
661
+
662
+ Step 2 is augmented by certain translations. There are default
663
+ translations for each module, but you can provide your own by passing a
664
+ list to the ``modules`` argument. For instance,
665
+
666
+ >>> def mysin(x):
667
+ ... print('taking the sin of', x)
668
+ ... return numpy.sin(x)
669
+ ...
670
+ >>> f = lambdify(x, sin(x), [{'sin': mysin}, 'numpy'])
671
+ >>> f(1)
672
+ taking the sin of 1
673
+ 0.8414709848078965
674
+
675
+ The globals dictionary is generated from the list by merging the
676
+ dictionary ``{'sin': mysin}`` and the module dictionary for NumPy. The
677
+ merging is done so that earlier items take precedence, which is why
678
+ ``mysin`` is used above instead of ``numpy.sin``.
679
+
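The precedence rule can be seen in miniature with plain dictionaries (a sketch of the merging order, not lambdify's internals):

```python
import math

# Earlier items in the modules list take precedence, so the merge runs
# from last to first, letting later updates overwrite earlier ones.
modules = [{'sin': 'my_sin'}, {'sin': math.sin, 'cos': math.cos}]
namespace = {}
for m in reversed(modules):
    namespace.update(m)

assert namespace['sin'] == 'my_sin'   # first list item won
assert namespace['cos'] is math.cos
```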
680
+ If you want to modify the way ``lambdify`` works for a given function, it
681
+ is usually easiest to do so by modifying the globals dictionary in this way.
682
+ In more complicated cases, it may be necessary to create and pass in a
683
+ custom printer.
684
+
685
+ Finally, step 3 is augmented with certain convenience operations, such as
686
+ the addition of a docstring.
687
+
688
+ Understanding how ``lambdify`` works can make it easier to avoid certain
689
+ gotchas when using it. For instance, a common mistake is to create a
690
+ lambdified function for one module (say, NumPy), and pass it objects from
691
+ another (say, a SymPy expression).
692
+
693
+ For instance, say we create
694
+
695
+ >>> from sympy.abc import x
696
+ >>> f = lambdify(x, x + 1, 'numpy')
697
+
698
+ Now if we pass in a NumPy array, we get that array plus 1
699
+
700
+ >>> import numpy
701
+ >>> a = numpy.array([1, 2])
702
+ >>> f(a)
703
+ [2 3]
704
+
705
+ But what happens if you make the mistake of passing in a SymPy expression
706
+ instead of a NumPy array:
707
+
708
+ >>> f(x + 1)
709
+ x + 2
710
+
711
+ This worked, but it was only by accident. Now take a different lambdified
712
+ function:
713
+
714
+ >>> from sympy import sin
715
+ >>> g = lambdify(x, x + sin(x), 'numpy')
716
+
717
+ This works as expected on NumPy arrays:
718
+
719
+ >>> g(a)
720
+ [1.84147098 2.90929743]
721
+
722
+ But if we try to pass in a SymPy expression, it fails
723
+
724
+ >>> try:
725
+ ... g(x + 1)
726
+ ... # NumPy releases after 1.17 raise TypeError instead of
727
+ ... # AttributeError
728
+ ... except (AttributeError, TypeError):
729
+ ... raise AttributeError() # doctest: +IGNORE_EXCEPTION_DETAIL
730
+ Traceback (most recent call last):
731
+ ...
732
+ AttributeError:
733
+
734
+ Now, let's look at what happened. The reason this fails is that ``g``
735
+ calls ``numpy.sin`` on the input expression, and ``numpy.sin`` does not
736
+ know how to operate on a SymPy object. **As a general rule, NumPy
737
+ functions do not know how to operate on SymPy expressions, and SymPy
738
+ functions do not know how to operate on NumPy arrays. This is why lambdify
739
+ exists: to provide a bridge between SymPy and NumPy.**
740
+
741
+ However, why is it that ``f`` did work? That's because ``f`` does not call
742
+ any functions, it only adds 1. So the resulting function that is created,
743
+ ``def _lambdifygenerated(x): return x + 1`` does not depend on the globals
744
+ namespace it is defined in. Thus it works, but only by accident. A future
745
+ version of ``lambdify`` may remove this behavior.
746
+
747
+ Be aware that certain implementation details described here may change in
748
+ future versions of SymPy. The API of passing in custom modules and
749
+ printers will not change, but the details of how a lambda function is
750
+ created may change. However, the basic idea will remain the same, and
751
+ understanding it will be helpful to understanding the behavior of
752
+ lambdify.
753
+
754
+ **In general: you should create lambdified functions for one module (say,
755
+ NumPy), and only pass it input types that are compatible with that module
756
+ (say, NumPy arrays).** Remember that by default, if the ``modules``
757
+ argument is not provided, ``lambdify`` creates functions using the NumPy
758
+ and SciPy namespaces.
759
+ """
760
+ from sympy.core.symbol import Symbol
761
+ from sympy.core.expr import Expr
762
+
763
+ # If the user hasn't specified any modules, use what is available.
764
+ if modules is None:
765
+ try:
766
+ _import("scipy")
767
+ except ImportError:
768
+ try:
769
+ _import("numpy")
770
+ except ImportError:
771
+ # Use either numpy (if available) or python.math where possible.
772
+ # XXX: This leads to different behaviour on different systems and
773
+ # might be the reason for irreproducible errors.
774
+ modules = ["math", "mpmath", "sympy"]
775
+ else:
776
+ modules = ["numpy"]
777
+ else:
778
+ modules = ["numpy", "scipy"]
779
+
780
+ # Get the needed namespaces.
781
+ namespaces = []
782
+ # First find any function implementations
783
+ if use_imps:
784
+ namespaces.append(_imp_namespace(expr))
785
+ # Check for dict before iterating
786
+ if isinstance(modules, (dict, str)) or not hasattr(modules, '__iter__'):
787
+ namespaces.append(modules)
788
+ else:
789
+ # consistency check
790
+ if _module_present('numexpr', modules) and len(modules) > 1:
791
+ raise TypeError("numexpr must be the only item in 'modules'")
792
+ namespaces += list(modules)
793
+ # fill namespace with first having highest priority
794
+ namespace = {}
795
+ for m in namespaces[::-1]:
796
+ buf = _get_namespace(m)
797
+ namespace.update(buf)
798
+
799
+ if hasattr(expr, "atoms"):
800
+ # Try to extract symbols from the expression.
801
+ # Move on if expr.atoms is not implemented.
802
+ syms = expr.atoms(Symbol)
803
+ for term in syms:
804
+ namespace.update({str(term): term})
805
+
806
+ if printer is None:
807
+ if _module_present('mpmath', namespaces):
808
+ from sympy.printing.pycode import MpmathPrinter as Printer # type: ignore
809
+ elif _module_present('scipy', namespaces):
810
+ from sympy.printing.numpy import SciPyPrinter as Printer # type: ignore
811
+ elif _module_present('numpy', namespaces):
812
+ from sympy.printing.numpy import NumPyPrinter as Printer # type: ignore
813
+ elif _module_present('cupy', namespaces):
814
+ from sympy.printing.numpy import CuPyPrinter as Printer # type: ignore
815
+ elif _module_present('jax', namespaces):
816
+ from sympy.printing.numpy import JaxPrinter as Printer # type: ignore
817
+ elif _module_present('numexpr', namespaces):
818
+ from sympy.printing.lambdarepr import NumExprPrinter as Printer # type: ignore
819
+ elif _module_present('tensorflow', namespaces):
820
+ from sympy.printing.tensorflow import TensorflowPrinter as Printer # type: ignore
821
+ elif _module_present('sympy', namespaces):
822
+ from sympy.printing.pycode import SymPyPrinter as Printer # type: ignore
823
+ else:
824
+ from sympy.printing.pycode import PythonCodePrinter as Printer # type: ignore
825
+ user_functions = {}
826
+ for m in namespaces[::-1]:
827
+ if isinstance(m, dict):
828
+ for k in m:
829
+ user_functions[k] = k
830
+ printer = Printer({'fully_qualified_modules': False, 'inline': True,
831
+ 'allow_unknown_functions': True,
832
+ 'user_functions': user_functions})
833
+
834
+ if isinstance(args, set):
835
+ sympy_deprecation_warning(
836
+ """
837
+ Passing the function arguments to lambdify() as a set is deprecated. This
838
+ leads to unpredictable results since sets are unordered. Instead, use a list
839
+ or tuple for the function arguments.
840
+ """,
841
+ deprecated_since_version="1.6.3",
842
+ active_deprecations_target="deprecated-lambdify-arguments-set",
843
+ )
844
+
845
+ # Get the names of the args, for creating a docstring
846
+ iterable_args = (args,) if isinstance(args, Expr) else args
847
+ names = []
848
+
849
+ # Grab the callers frame, for getting the names by inspection (if needed)
850
+ callers_local_vars = inspect.currentframe().f_back.f_locals.items() # type: ignore
851
+ for n, var in enumerate(iterable_args):
852
+ if hasattr(var, 'name'):
853
+ names.append(var.name)
854
+ else:
855
+ # It's an iterable. Try to get name by inspection of calling frame.
856
+ name_list = [var_name for var_name, var_val in callers_local_vars
857
+ if var_val is var]
858
+ if len(name_list) == 1:
859
+ names.append(name_list[0])
860
+ else:
861
+ # Cannot infer name with certainty. arg_# will have to do.
862
+ names.append('arg_' + str(n))
863
+
864
+ # Create the function definition code and execute it
865
+ funcname = '_lambdifygenerated'
866
+ if _module_present('tensorflow', namespaces):
867
+ funcprinter = _TensorflowEvaluatorPrinter(printer, dummify)
868
+ else:
869
+ funcprinter = _EvaluatorPrinter(printer, dummify)
870
+
871
+ if cse == True:
872
+ from sympy.simplify.cse_main import cse as _cse
873
+ cses, _expr = _cse(expr, list=False)
874
+ elif callable(cse):
875
+ cses, _expr = cse(expr)
876
+ else:
877
+ cses, _expr = (), expr
878
+ funcstr = funcprinter.doprint(funcname, iterable_args, _expr, cses=cses)
879
+
880
+ # Collect the module imports from the code printers.
881
+ imp_mod_lines = []
882
+ for mod, keys in (getattr(printer, 'module_imports', None) or {}).items():
883
+ for k in keys:
884
+ if k not in namespace:
885
+ ln = "from %s import %s" % (mod, k)
886
+ try:
887
+ exec(ln, {}, namespace)
888
+ except ImportError:
889
+ # Tensorflow 2.0 has issues with importing a specific
890
+ # function from its submodule.
891
+ # https://github.com/tensorflow/tensorflow/issues/33022
892
+ ln = "%s = %s.%s" % (k, mod, k)
893
+ exec(ln, {}, namespace)
894
+ imp_mod_lines.append(ln)
895
+
896
+ # Provide lambda expression with builtins, and compatible implementation of range
897
+ namespace.update({'builtins':builtins, 'range':range})
898
+
899
+ funclocals = {}
900
+ global _lambdify_generated_counter
901
+ filename = '<lambdifygenerated-%s>' % _lambdify_generated_counter
902
+ _lambdify_generated_counter += 1
903
+ c = compile(funcstr, filename, 'exec')
904
+ exec(c, namespace, funclocals)
905
+ # mtime has to be None or else linecache.checkcache will remove it
906
+ linecache.cache[filename] = (len(funcstr), None, funcstr.splitlines(True), filename) # type: ignore
907
+
908
+ func = funclocals[funcname]
909
+
910
+ # Apply the docstring
911
+ sig = "func({})".format(", ".join(str(i) for i in names))
912
+ sig = textwrap.fill(sig, subsequent_indent=' '*8)
913
+ if _too_large_for_docstring(expr, docstring_limit):
914
+ expr_str = 'EXPRESSION REDACTED DUE TO LENGTH'
915
+ src_str = 'SOURCE CODE REDACTED DUE TO LENGTH'
916
+ else:
917
+ expr_str = str(expr)
918
+ if len(expr_str) > 78:
919
+ expr_str = textwrap.wrap(expr_str, 75)[0] + '...'
920
+ src_str = funcstr
921
+ func.__doc__ = (
922
+ "Created with lambdify. Signature:\n\n"
923
+ "{sig}\n\n"
924
+ "Expression:\n\n"
925
+ "{expr}\n\n"
926
+ "Source code:\n\n"
927
+ "{src}\n\n"
928
+ "Imported modules:\n\n"
929
+ "{imp_mods}"
930
+ ).format(sig=sig, expr=expr_str, src=src_str, imp_mods='\n'.join(imp_mod_lines))
931
+ return func
932
+
933
+ def _module_present(modname, modlist):
934
+ if modname in modlist:
935
+ return True
936
+ for m in modlist:
937
+ if hasattr(m, '__name__') and m.__name__ == modname:
938
+ return True
939
+ return False
940
+
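For example, ``_module_present`` matches either a module-name string or an actual module object in the list. A standalone re-statement of the helper above:

```python
import math

def module_present(modname, modlist):
    # A name string matches directly; a module object matches on __name__.
    if modname in modlist:
        return True
    return any(getattr(m, '__name__', None) == modname for m in modlist)

assert module_present('math', ['math', 'mpmath'])
assert module_present('math', [math])
assert not module_present('numpy', ['math'])
```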
941
+ def _get_namespace(m):
942
+ """
943
+ This is used by ``lambdify`` to parse its arguments.
944
+ """
945
+ if isinstance(m, str):
946
+ _import(m)
947
+ return MODULES[m][0]
948
+ elif isinstance(m, dict):
949
+ return m
950
+ elif hasattr(m, "__dict__"):
951
+ return m.__dict__
952
+ else:
953
+ raise TypeError("Argument must be either a string, dict or module but it is: %s" % m)
954
+
955
+
956
+ def _recursive_to_string(doprint, arg):
957
+ """Functions in lambdify accept both SymPy types and non-SymPy types such as python
958
+ lists and tuples. This method ensures that we only call the doprint method of the
959
+ printer with SymPy types (so that the printer safely can use SymPy-methods)."""
960
+ from sympy.matrices.common import MatrixOperations
961
+ from sympy.core.basic import Basic
962
+
963
+ if isinstance(arg, (Basic, MatrixOperations)):
964
+ return doprint(arg)
965
+ elif iterable(arg):
966
+ if isinstance(arg, list):
967
+ left, right = "[", "]"
968
+ elif isinstance(arg, tuple):
969
+ left, right = "(", ",)"
970
+ else:
971
+ raise NotImplementedError("unhandled type: %s, %s" % (type(arg), arg))
972
+ return left + ', '.join(_recursive_to_string(doprint, e) for e in arg) + right
973
+ elif isinstance(arg, str):
974
+ return arg
975
+ else:
976
+ return doprint(arg)
977
+
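The recursion above can be illustrated with plain ``str`` standing in for a printer's ``doprint`` (note the trailing comma emitted for tuples, which keeps one-element tuples valid Python):

```python
def to_string(doprint, arg):
    # Simplified analogue of _recursive_to_string for lists and tuples.
    if isinstance(arg, list):
        return '[' + ', '.join(to_string(doprint, e) for e in arg) + ']'
    elif isinstance(arg, tuple):
        return '(' + ', '.join(to_string(doprint, e) for e in arg) + ',)'
    elif isinstance(arg, str):
        return arg
    return doprint(arg)

assert to_string(str, [1, (2, 3)]) == '[1, (2, 3,)]'
assert to_string(str, (1,)) == '(1,)'
```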
978
+
979
+ def lambdastr(args, expr, printer=None, dummify=None):
980
+ """
981
+ Returns a string that can be evaluated to a lambda function.
982
+
983
+ Examples
984
+ ========
985
+
986
+ >>> from sympy.abc import x, y, z
987
+ >>> from sympy.utilities.lambdify import lambdastr
988
+ >>> lambdastr(x, x**2)
989
+ 'lambda x: (x**2)'
990
+ >>> lambdastr((x,y,z), [z,y,x])
991
+ 'lambda x,y,z: ([z, y, x])'
992
+
993
+ Although tuples may not appear as arguments to lambda in Python 3,
994
+ lambdastr will create a lambda function that will unpack the original
995
+ arguments so that nested arguments can be handled:
996
+
997
+ >>> lambdastr((x, (y, z)), x + y)
998
+ 'lambda _0,_1: (lambda x,y,z: (x + y))(_0,_1[0],_1[1])'
999
+ """
1000
+ # Transforming everything to strings.
1001
+ from sympy.matrices import DeferredVector
1002
+ from sympy.core.basic import Basic
1003
+ from sympy.core.function import (Derivative, Function)
1004
+ from sympy.core.symbol import (Dummy, Symbol)
1005
+ from sympy.core.sympify import sympify
1006
+
1007
+ if printer is not None:
1008
+ if inspect.isfunction(printer):
1009
+ lambdarepr = printer
1010
+ else:
1011
+ if inspect.isclass(printer):
1012
+ lambdarepr = lambda expr: printer().doprint(expr)
1013
+ else:
1014
+ lambdarepr = lambda expr: printer.doprint(expr)
1015
+ else:
1016
+ #XXX: This has to be done here because of circular imports
1017
+ from sympy.printing.lambdarepr import lambdarepr
1018
+
1019
+ def sub_args(args, dummies_dict):
1020
+ if isinstance(args, str):
1021
+ return args
1022
+ elif isinstance(args, DeferredVector):
1023
+ return str(args)
1024
+ elif iterable(args):
1025
+ dummies = flatten([sub_args(a, dummies_dict) for a in args])
1026
+ return ",".join(str(a) for a in dummies)
1027
+ else:
1028
+ # replace these with Dummy symbols
1029
+ if isinstance(args, (Function, Symbol, Derivative)):
1030
+ dummies = Dummy()
1031
+ dummies_dict.update({args : dummies})
1032
+ return str(dummies)
1033
+ else:
1034
+ return str(args)
1035
+
1036
+ def sub_expr(expr, dummies_dict):
1037
+ expr = sympify(expr)
1038
+ # dict/tuple are sympified to Basic
1039
+ if isinstance(expr, Basic):
1040
+ expr = expr.xreplace(dummies_dict)
1041
+ # list is not sympified to Basic
1042
+ elif isinstance(expr, list):
1043
+ expr = [sub_expr(a, dummies_dict) for a in expr]
1044
+ return expr
1045
+
1046
+ # Transform args
1047
+ def isiter(l):
1048
+ return iterable(l, exclude=(str, DeferredVector, NotIterable))
1049
+
1050
+ def flat_indexes(iterable):
1051
+ n = 0
1052
+
1053
+ for el in iterable:
1054
+ if isiter(el):
1055
+ for ndeep in flat_indexes(el):
1056
+ yield (n,) + ndeep
1057
+ else:
1058
+ yield (n,)
1059
+
1060
+ n += 1
1061
+
1062
+ if dummify is None:
1063
+ dummify = any(isinstance(a, Basic) and
1064
+ a.atoms(Function, Derivative) for a in (
1065
+ args if isiter(args) else [args]))
1066
+
1067
+ if isiter(args) and any(isiter(i) for i in args):
1068
+ dum_args = [str(Dummy(str(i))) for i in range(len(args))]
1069
+
1070
+ indexed_args = ','.join([
1071
+ dum_args[ind[0]] + ''.join(["[%s]" % k for k in ind[1:]])
1072
+ for ind in flat_indexes(args)])
1073
+
1074
+ lstr = lambdastr(flatten(args), expr, printer=printer, dummify=dummify)
1075
+
1076
+ return 'lambda %s: (%s)(%s)' % (','.join(dum_args), lstr, indexed_args)
1077
+
1078
+ dummies_dict = {}
1079
+ if dummify:
1080
+ args = sub_args(args, dummies_dict)
1081
+ else:
1082
+ if isinstance(args, str):
1083
+ pass
1084
+ elif iterable(args, exclude=DeferredVector):
1085
+ args = ",".join(str(a) for a in args)
1086
+
1087
+ # Transform expr
1088
+ if dummify:
1089
+ if isinstance(expr, str):
1090
+ pass
1091
+ else:
1092
+ expr = sub_expr(expr, dummies_dict)
1093
+ expr = _recursive_to_string(lambdarepr, expr)
1094
+ return "lambda %s: (%s)" % (args, expr)
1095
+
1096
+ class _EvaluatorPrinter:
1097
+ def __init__(self, printer=None, dummify=False):
1098
+ self._dummify = dummify
1099
+
1100
+ #XXX: This has to be done here because of circular imports
1101
+ from sympy.printing.lambdarepr import LambdaPrinter
1102
+
1103
+ if printer is None:
1104
+ printer = LambdaPrinter()
1105
+
1106
+ if inspect.isfunction(printer):
1107
+ self._exprrepr = printer
1108
+ else:
1109
+ if inspect.isclass(printer):
1110
+ printer = printer()
1111
+
1112
+ self._exprrepr = printer.doprint
1113
+
1114
+ #if hasattr(printer, '_print_Symbol'):
1115
+ # symbolrepr = printer._print_Symbol
1116
+
1117
+ #if hasattr(printer, '_print_Dummy'):
1118
+ # dummyrepr = printer._print_Dummy
1119
+
1120
+ # Used to print the generated function arguments in a standard way
1121
+ self._argrepr = LambdaPrinter().doprint
1122
+
1123
+ def doprint(self, funcname, args, expr, *, cses=()):
1124
+ """
1125
+ Returns the function definition code as a string.
1126
+ """
1127
+ from sympy.core.symbol import Dummy
1128
+
1129
+ funcbody = []
1130
+
1131
+ if not iterable(args):
1132
+ args = [args]
1133
+
1134
+ if cses:
1135
+ subvars, subexprs = zip(*cses)
1136
+ exprs = [expr] + list(subexprs)
1137
+ argstrs, exprs = self._preprocess(args, exprs)
1138
+ expr, subexprs = exprs[0], exprs[1:]
1139
+ cses = zip(subvars, subexprs)
1140
+ else:
1141
+ argstrs, expr = self._preprocess(args, expr)
1142
+
1143
+ # Generate argument unpacking and final argument list
1144
+ funcargs = []
1145
+ unpackings = []
1146
+
1147
+ for argstr in argstrs:
1148
+ if iterable(argstr):
1149
+ funcargs.append(self._argrepr(Dummy()))
1150
+ unpackings.extend(self._print_unpacking(argstr, funcargs[-1]))
1151
+ else:
1152
+ funcargs.append(argstr)
1153
+
1154
+ funcsig = 'def {}({}):'.format(funcname, ', '.join(funcargs))
1155
+
1156
+ # Wrap input arguments before unpacking
1157
+ funcbody.extend(self._print_funcargwrapping(funcargs))
1158
+
1159
+ funcbody.extend(unpackings)
1160
+
1161
+ for s, e in cses:
1162
+ if e is None:
1163
+ funcbody.append('del {}'.format(s))
1164
+ else:
1165
+ funcbody.append('{} = {}'.format(s, self._exprrepr(e)))
1166
+
1167
+ str_expr = _recursive_to_string(self._exprrepr, expr)
1168
+
1169
+ if '\n' in str_expr:
1170
+ str_expr = '({})'.format(str_expr)
1171
+ funcbody.append('return {}'.format(str_expr))
1172
+
1173
+ funclines = [funcsig]
1174
+ funclines.extend([' ' + line for line in funcbody])
1175
+
1176
+ return '\n'.join(funclines) + '\n'
1177
+
1178
+ @classmethod
1179
+ def _is_safe_ident(cls, ident):
1180
+ return isinstance(ident, str) and ident.isidentifier() \
1181
+ and not keyword.iskeyword(ident)
1182
+
1183
+ def _preprocess(self, args, expr):
1184
+ """Preprocess args, expr to replace arguments that do not map
1185
+ to valid Python identifiers.
1186
+
1187
+ Returns string form of args, and updated expr.
1188
+ """
1189
+ from sympy.core.basic import Basic
1190
+ from sympy.core.sorting import ordered
1191
+ from sympy.core.function import (Derivative, Function)
1192
+ from sympy.core.symbol import Dummy, uniquely_named_symbol
1193
+ from sympy.matrices import DeferredVector
1194
+ from sympy.core.expr import Expr
1195
+
1196
+ # Args of type Dummy can cause name collisions with args
1197
+ # of type Symbol. Force dummify of everything in this
1198
+ # situation.
1199
+ dummify = self._dummify or any(
1200
+ isinstance(arg, Dummy) for arg in flatten(args))
1201
+
1202
+ argstrs = [None]*len(args)
1203
+ for arg, i in reversed(list(ordered(zip(args, range(len(args)))))):
+             if iterable(arg):
+                 s, expr = self._preprocess(arg, expr)
+             elif isinstance(arg, DeferredVector):
+                 s = str(arg)
+             elif isinstance(arg, Basic) and arg.is_symbol:
+                 s = self._argrepr(arg)
+                 if dummify or not self._is_safe_ident(s):
+                     dummy = Dummy()
+                     if isinstance(expr, Expr):
+                         dummy = uniquely_named_symbol(
+                             dummy.name, expr, modify=lambda s: '_' + s)
+                     s = self._argrepr(dummy)
+                     expr = self._subexpr(expr, {arg: dummy})
+             elif dummify or isinstance(arg, (Function, Derivative)):
+                 dummy = Dummy()
+                 s = self._argrepr(dummy)
+                 expr = self._subexpr(expr, {arg: dummy})
+             else:
+                 s = str(arg)
+             argstrs[i] = s
+         return argstrs, expr
+
+     def _subexpr(self, expr, dummies_dict):
+         from sympy.matrices import DeferredVector
+         from sympy.core.sympify import sympify
+
+         expr = sympify(expr)
+         xreplace = getattr(expr, 'xreplace', None)
+         if xreplace is not None:
+             expr = xreplace(dummies_dict)
+         else:
+             if isinstance(expr, DeferredVector):
+                 pass
+             elif isinstance(expr, dict):
+                 k = [self._subexpr(sympify(a), dummies_dict) for a in expr.keys()]
+                 v = [self._subexpr(sympify(a), dummies_dict) for a in expr.values()]
+                 expr = dict(zip(k, v))
+             elif isinstance(expr, tuple):
+                 expr = tuple(self._subexpr(sympify(a), dummies_dict) for a in expr)
+             elif isinstance(expr, list):
+                 expr = [self._subexpr(sympify(a), dummies_dict) for a in expr]
+         return expr
+
+     def _print_funcargwrapping(self, args):
+         """Generate argument wrapping code.
+
+         args is the argument list of the generated function (strings).
+
+         Return value is a list of lines of code that will be inserted at
+         the beginning of the function definition.
+         """
+         return []
+
+     def _print_unpacking(self, unpackto, arg):
+         """Generate argument unpacking code.
+
+         arg is the function argument to be unpacked (a string), and
+         unpackto is a list or nested lists of the variable names (strings) to
+         unpack to.
+         """
+         def unpack_lhs(lvalues):
+             return '[{}]'.format(', '.join(
+                 unpack_lhs(val) if iterable(val) else val for val in lvalues))
+
+         return ['{} = {}'.format(unpack_lhs(unpackto), arg)]
+
+ class _TensorflowEvaluatorPrinter(_EvaluatorPrinter):
+     def _print_unpacking(self, lvalues, rvalue):
+         """Generate argument unpacking code.
+
+         This method is used when the input value is not iterable,
+         but can be indexed (see issue #14655).
+         """
+
+         def flat_indexes(elems):
+             n = 0
+
+             for el in elems:
+                 if iterable(el):
+                     for ndeep in flat_indexes(el):
+                         yield (n,) + ndeep
+                 else:
+                     yield (n,)
+
+                 n += 1
+
+         indexed = ', '.join('{}[{}]'.format(rvalue, ']['.join(map(str, ind)))
+             for ind in flat_indexes(lvalues))
+
+         return ['[{}] = [{}]'.format(', '.join(flatten(lvalues)), indexed)]
+
+ def _imp_namespace(expr, namespace=None):
+     """ Return namespace dict with function implementations
+
+     We need to search for functions in anything that can be thrown at
+     us - that is - anything that could be passed as ``expr``. Examples
+     include SymPy expressions, as well as tuples, lists and dicts that may
+     contain SymPy expressions.
+
+     Parameters
+     ----------
+     expr : object
+         Something passed to lambdify, that will generate valid code from
+         ``str(expr)``.
+     namespace : None or mapping
+         Namespace to fill. None results in new empty dict
+
+     Returns
+     -------
+     namespace : dict
+         dict with keys of implemented function names within ``expr`` and
+         corresponding values being the numerical implementation of
+         function
+
+     Examples
+     ========
+
+     >>> from sympy.abc import x
+     >>> from sympy.utilities.lambdify import implemented_function, _imp_namespace
+     >>> from sympy import Function
+     >>> f = implemented_function(Function('f'), lambda x: x+1)
+     >>> g = implemented_function(Function('g'), lambda x: x*10)
+     >>> namespace = _imp_namespace(f(g(x)))
+     >>> sorted(namespace.keys())
+     ['f', 'g']
+     """
+     # Delayed import to avoid circular imports
+     from sympy.core.function import FunctionClass
+     if namespace is None:
+         namespace = {}
+     # tuples, lists, dicts are valid expressions
+     if is_sequence(expr):
+         for arg in expr:
+             _imp_namespace(arg, namespace)
+         return namespace
+     elif isinstance(expr, dict):
+         for key, val in expr.items():
+             # functions can be in dictionary keys
+             _imp_namespace(key, namespace)
+             _imp_namespace(val, namespace)
+         return namespace
+     # SymPy expressions may be Functions themselves
+     func = getattr(expr, 'func', None)
+     if isinstance(func, FunctionClass):
+         imp = getattr(func, '_imp_', None)
+         if imp is not None:
+             name = expr.func.__name__
+             if name in namespace and namespace[name] != imp:
+                 raise ValueError('We found more than one '
+                                  'implementation with name '
+                                  '"%s"' % name)
+             namespace[name] = imp
+     # and / or they may take Functions as arguments
+     if hasattr(expr, 'args'):
+         for arg in expr.args:
+             _imp_namespace(arg, namespace)
+     return namespace
+
+
+ def implemented_function(symfunc, implementation):
+     """ Add numerical ``implementation`` to function ``symfunc``.
+
+     ``symfunc`` can be an ``UndefinedFunction`` instance, or a name string.
+     In the latter case we create an ``UndefinedFunction`` instance with that
+     name.
+
+     Be aware that this is a quick workaround, not a general method to create
+     special symbolic functions. If you want to create a symbolic function to be
+     used by all the machinery of SymPy you should subclass the ``Function``
+     class.
+
+     Parameters
+     ----------
+     symfunc : ``str`` or ``UndefinedFunction`` instance
+         If ``str``, then create new ``UndefinedFunction`` with this as
+         name. If ``symfunc`` is an Undefined function, create a new function
+         with the same name and the implemented function attached.
+     implementation : callable
+         numerical implementation to be called by ``evalf()`` or ``lambdify``
+
+     Returns
+     -------
+     afunc : sympy.FunctionClass instance
+         function with attached implementation
+
+     Examples
+     ========
+
+     >>> from sympy.abc import x
+     >>> from sympy.utilities.lambdify import implemented_function
+     >>> from sympy import lambdify
+     >>> f = implemented_function('f', lambda x: x+1)
+     >>> lam_f = lambdify(x, f(x))
+     >>> lam_f(4)
+     5
+     """
+     # Delayed import to avoid circular imports
+     from sympy.core.function import UndefinedFunction
+     # if name, create function to hold implementation
+     kwargs = {}
+     if isinstance(symfunc, UndefinedFunction):
+         kwargs = symfunc._kwargs
+         symfunc = symfunc.__name__
+     if isinstance(symfunc, str):
+         # Keyword arguments to UndefinedFunction are added as attributes to
+         # the created class.
+         symfunc = UndefinedFunction(
+             symfunc, _imp_=staticmethod(implementation), **kwargs)
+     elif not isinstance(symfunc, UndefinedFunction):
+         raise ValueError(filldedent('''
+             symfunc should be either a string or
+             an UndefinedFunction instance.'''))
+     return symfunc
+
+
+ def _too_large_for_docstring(expr, limit):
+     """Decide whether an ``Expr`` is too large to be fully rendered in a
+     ``lambdify`` docstring.
+
+     This is a fast alternative to ``count_ops``, which can become prohibitively
+     slow for large expressions, because in this instance we only care whether
+     ``limit`` is exceeded rather than counting the exact number of nodes in the
+     expression.
+
+     Parameters
+     ==========
+     expr : ``Expr``, (nested) ``list`` of ``Expr``, or ``Matrix``
+         The same objects that can be passed to the ``expr`` argument of
+         ``lambdify``.
+     limit : ``int`` or ``None``
+         The threshold above which an expression contains too many nodes to be
+         usefully rendered in the docstring. If ``None`` then there is no limit.
+
+     Returns
+     =======
+     bool
+         ``True`` if the number of nodes in the expression exceeds the limit,
+         ``False`` otherwise.
+
+     Examples
+     ========
+
+     >>> from sympy.abc import x, y, z
+     >>> from sympy.utilities.lambdify import _too_large_for_docstring
+     >>> expr = x
+     >>> _too_large_for_docstring(expr, None)
+     False
+     >>> _too_large_for_docstring(expr, 100)
+     False
+     >>> _too_large_for_docstring(expr, 1)
+     False
+     >>> _too_large_for_docstring(expr, 0)
+     True
+     >>> _too_large_for_docstring(expr, -1)
+     True
+
+     The limit also counts nodes across (nested) lists of expressions:
+
+     >>> expr = [x, y, z]
+     >>> _too_large_for_docstring(expr, None)
+     False
+     >>> _too_large_for_docstring(expr, 100)
+     False
+     >>> _too_large_for_docstring(expr, 1)
+     True
+     >>> _too_large_for_docstring(expr, 0)
+     True
+     >>> _too_large_for_docstring(expr, -1)
+     True
+
+     >>> expr = [x, [y], z, [[x+y], [x*y*z, [x+y+z]]]]
+     >>> _too_large_for_docstring(expr, None)
+     False
+     >>> _too_large_for_docstring(expr, 100)
+     False
+     >>> _too_large_for_docstring(expr, 1)
+     True
+     >>> _too_large_for_docstring(expr, 0)
+     True
+     >>> _too_large_for_docstring(expr, -1)
+     True
+
+     >>> expr = ((x + y + z)**5).expand()
+     >>> _too_large_for_docstring(expr, None)
+     False
+     >>> _too_large_for_docstring(expr, 100)
+     True
+     >>> _too_large_for_docstring(expr, 1)
+     True
+     >>> _too_large_for_docstring(expr, 0)
+     True
+     >>> _too_large_for_docstring(expr, -1)
+     True
+
+     >>> from sympy import Matrix
+     >>> expr = Matrix([[(x + y + z), ((x + y + z)**2).expand(),
+     ...                 ((x + y + z)**3).expand(), ((x + y + z)**4).expand()]])
+     >>> _too_large_for_docstring(expr, None)
+     False
+     >>> _too_large_for_docstring(expr, 1000)
+     False
+     >>> _too_large_for_docstring(expr, 100)
+     True
+     >>> _too_large_for_docstring(expr, 1)
+     True
+     >>> _too_large_for_docstring(expr, 0)
+     True
+     >>> _too_large_for_docstring(expr, -1)
+     True
+
+     """
+     # Must be imported here to avoid a circular import error
+     from sympy.core.traversal import postorder_traversal
+
+     if limit is None:
+         return False
+
+     i = 0
+     for _ in postorder_traversal(expr):
+         i += 1
+         if i > limit:
+             return True
+     return False
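The two unpacking strategies above (recursive nested unpacking in `_print_unpacking`, and the index-based variant for non-iterable but indexable values in `_TensorflowEvaluatorPrinter`) can be sketched standalone. This is a simplified illustration, not the library code: `isinstance` checks stand in for SymPy's `iterable()` helper, and `arg` is a hypothetical argument name.

```python
def unpack_lhs(lvalues):
    # Recursively build a nested left-hand side, e.g. '[x, [y, z]]'.
    return '[{}]'.format(', '.join(
        unpack_lhs(v) if isinstance(v, (list, tuple)) else v for v in lvalues))

def flat_indexes(elems):
    # Yield the index path of every leaf, e.g. (1, 0) for 'y' below.
    n = 0
    for el in elems:
        if isinstance(el, (list, tuple)):
            for ndeep in flat_indexes(el):
                yield (n,) + ndeep
        else:
            yield (n,)
        n += 1

lvalues = ['x', ['y', 'z']]

# Strategy 1: nested tuple unpacking, requires arg to be iterable.
iterable_style = '{} = {}'.format(unpack_lhs(lvalues), 'arg')

# Strategy 2: flat names assigned from explicit index chains into arg.
indexed_style = '[{}] = [{}]'.format(
    'x, y, z',
    ', '.join('arg[{}]'.format(']['.join(map(str, ind)))
              for ind in flat_indexes(lvalues)))

print(iterable_style)  # [x, [y, z]] = arg
print(indexed_style)   # [x, y, z] = [arg[0], arg[1][0], arg[1][1]]
```

The second form is what makes lambdified functions work with TensorFlow-style objects that support `__getitem__` but not iteration.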
venv/lib/python3.10/site-packages/sympy/utilities/magic.py ADDED
@@ -0,0 +1,12 @@
+ """Functions that involve magic. """
+
+ def pollute(names, objects):
+     """Pollute the global namespace with symbols -> objects mapping. """
+     from inspect import currentframe
+     frame = currentframe().f_back.f_back
+
+     try:
+         for name, obj in zip(names, objects):
+             frame.f_globals[name] = obj
+     finally:
+         del frame  # break cyclic dependencies as stated in inspect docs
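Because `pollute` walks two frames up (`f_back.f_back`), it targets the globals of whoever called *its caller*. A minimal self-contained sketch of the same trick (the `setup`/`answer` names are invented for illustration):

```python
from inspect import currentframe

def pollute(names, objects):
    # Same logic as the module above: write into the caller's caller's globals.
    frame = currentframe().f_back.f_back
    try:
        for name, obj in zip(names, objects):
            frame.f_globals[name] = obj
    finally:
        del frame  # break the frame reference cycle, per the inspect docs

def setup():
    # pollute() skips this frame and injects into whoever called setup().
    pollute(['answer'], [42])

setup()
print(answer)  # the name 'answer' now exists at module level
```

SymPy uses this from interactive-session helpers, where defining names in the user's namespace is the whole point; in ordinary library code this pattern is best avoided.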
venv/lib/python3.10/site-packages/sympy/utilities/matchpy_connector.py ADDED
@@ -0,0 +1,289 @@
+ """
+ The objects in this module allow the usage of the MatchPy pattern matching
+ library on SymPy expressions.
+ """
+ import re
+ from typing import List, Callable
+
+ from sympy.core.sympify import _sympify
+ from sympy.external import import_module
+ from sympy.functions import (log, sin, cos, tan, cot, csc, sec, erf, gamma, uppergamma)
+ from sympy.functions.elementary.hyperbolic import acosh, asinh, atanh, acoth, acsch, asech, cosh, sinh, tanh, coth, sech, csch
+ from sympy.functions.elementary.trigonometric import atan, acsc, asin, acot, acos, asec
+ from sympy.functions.special.error_functions import fresnelc, fresnels, erfc, erfi, Ei
+ from sympy.core.add import Add
+ from sympy.core.basic import Basic
+ from sympy.core.expr import Expr
+ from sympy.core.mul import Mul
+ from sympy.core.power import Pow
+ from sympy.core.relational import (Equality, Unequality)
+ from sympy.core.symbol import Symbol
+ from sympy.functions.elementary.exponential import exp
+ from sympy.integrals.integrals import Integral
+ from sympy.printing.repr import srepr
+ from sympy.utilities.decorator import doctest_depends_on
+
+ matchpy = import_module("matchpy")
+
+ if matchpy:
+     from matchpy import Operation, CommutativeOperation, AssociativeOperation, OneIdentityOperation
+     from matchpy.expressions.functions import op_iter, create_operation_expression, op_len
+
+     Operation.register(Integral)
+     Operation.register(Pow)
+     OneIdentityOperation.register(Pow)
+
+     Operation.register(Add)
+     OneIdentityOperation.register(Add)
+     CommutativeOperation.register(Add)
+     AssociativeOperation.register(Add)
+
+     Operation.register(Mul)
+     OneIdentityOperation.register(Mul)
+     CommutativeOperation.register(Mul)
+     AssociativeOperation.register(Mul)
+
+     Operation.register(Equality)
+     CommutativeOperation.register(Equality)
+     Operation.register(Unequality)
+     CommutativeOperation.register(Unequality)
+
+     Operation.register(exp)
+     Operation.register(log)
+     Operation.register(gamma)
+     Operation.register(uppergamma)
+     Operation.register(fresnels)
+     Operation.register(fresnelc)
+     Operation.register(erf)
+     Operation.register(Ei)
+     Operation.register(erfc)
+     Operation.register(erfi)
+     Operation.register(sin)
+     Operation.register(cos)
+     Operation.register(tan)
+     Operation.register(cot)
+     Operation.register(csc)
+     Operation.register(sec)
+     Operation.register(sinh)
+     Operation.register(cosh)
+     Operation.register(tanh)
+     Operation.register(coth)
+     Operation.register(csch)
+     Operation.register(sech)
+     Operation.register(asin)
+     Operation.register(acos)
+     Operation.register(atan)
+     Operation.register(acot)
+     Operation.register(acsc)
+     Operation.register(asec)
+     Operation.register(asinh)
+     Operation.register(acosh)
+     Operation.register(atanh)
+     Operation.register(acoth)
+     Operation.register(acsch)
+     Operation.register(asech)
+
+     @op_iter.register(Integral)  # type: ignore
+     def _(operation):
+         return iter((operation._args[0],) + operation._args[1])
+
+     @op_iter.register(Basic)  # type: ignore
+     def _(operation):
+         return iter(operation._args)
+
+     @op_len.register(Integral)  # type: ignore
+     def _(operation):
+         return 1 + len(operation._args[1])
+
+     @op_len.register(Basic)  # type: ignore
+     def _(operation):
+         return len(operation._args)
+
+     @create_operation_expression.register(Basic)
+     def sympy_op_factory(old_operation, new_operands, variable_name=True):
+         return type(old_operation)(*new_operands)
+
+
+ if matchpy:
+     from matchpy import Wildcard
+ else:
+     class Wildcard:  # type: ignore
+         def __init__(self, min_length, fixed_size, variable_name, optional):
+             self.min_count = min_length
+             self.fixed_size = fixed_size
+             self.variable_name = variable_name
+             self.optional = optional
+
+
+ @doctest_depends_on(modules=('matchpy',))
+ class _WildAbstract(Wildcard, Symbol):
+     min_length: int  # abstract field required in subclasses
+     fixed_size: bool  # abstract field required in subclasses
+
+     def __init__(self, variable_name=None, optional=None, **assumptions):
+         min_length = self.min_length
+         fixed_size = self.fixed_size
+         if optional is not None:
+             optional = _sympify(optional)
+         Wildcard.__init__(self, min_length, fixed_size, str(variable_name), optional)
+
+     def __getstate__(self):
+         return {
+             "min_length": self.min_length,
+             "fixed_size": self.fixed_size,
+             "min_count": self.min_count,
+             "variable_name": self.variable_name,
+             "optional": self.optional,
+         }
+
+     def __new__(cls, variable_name=None, optional=None, **assumptions):
+         cls._sanitize(assumptions, cls)
+         return _WildAbstract.__xnew__(cls, variable_name, optional, **assumptions)
+
+     def __getnewargs__(self):
+         return self.variable_name, self.optional
+
+     @staticmethod
+     def __xnew__(cls, variable_name=None, optional=None, **assumptions):
+         obj = Symbol.__xnew__(cls, variable_name, **assumptions)
+         return obj
+
+     def _hashable_content(self):
+         if self.optional:
+             return super()._hashable_content() + (self.min_count, self.fixed_size, self.variable_name, self.optional)
+         else:
+             return super()._hashable_content() + (self.min_count, self.fixed_size, self.variable_name)
+
+     def __copy__(self) -> '_WildAbstract':
+         return type(self)(variable_name=self.variable_name, optional=self.optional)
+
+     def __repr__(self):
+         return str(self)
+
+     def __str__(self):
+         return self.name
+
+
+ @doctest_depends_on(modules=('matchpy',))
+ class WildDot(_WildAbstract):
+     min_length = 1
+     fixed_size = True
+
+
+ @doctest_depends_on(modules=('matchpy',))
+ class WildPlus(_WildAbstract):
+     min_length = 1
+     fixed_size = False
+
+
+ @doctest_depends_on(modules=('matchpy',))
+ class WildStar(_WildAbstract):
+     min_length = 0
+     fixed_size = False
+
+
+ def _get_srepr(expr):
+     s = srepr(expr)
+     s = re.sub(r"WildDot\('(\w+)'\)", r"\1", s)
+     s = re.sub(r"WildPlus\('(\w+)'\)", r"*\1", s)
+     s = re.sub(r"WildStar\('(\w+)'\)", r"*\1", s)
+     return s
+
+
+ @doctest_depends_on(modules=('matchpy',))
+ class Replacer:
+     """
+     Replacer object to perform multiple pattern matching and subexpression
+     replacements in SymPy expressions.
+
+     Examples
+     ========
+
+     Example to construct a simple first degree equation solver:
+
+     >>> from sympy.utilities.matchpy_connector import WildDot, Replacer
+     >>> from sympy import Equality, Symbol
+     >>> x = Symbol("x")
+     >>> a_ = WildDot("a_", optional=1)
+     >>> b_ = WildDot("b_", optional=0)
+
+     The lines above define two wildcards, ``a_`` and ``b_``, the
+     coefficients of the equation `a x + b = 0`. The specified optional
+     values indicate which expression to return when no match is found;
+     they are necessary for equations like `a x = 0` and `x + b = 0`.
+
+     Create two constraints to make sure that ``a_`` and ``b_`` will not match
+     any expression containing ``x``:
+
+     >>> from matchpy import CustomConstraint
+     >>> free_x_a = CustomConstraint(lambda a_: not a_.has(x))
+     >>> free_x_b = CustomConstraint(lambda b_: not b_.has(x))
+
+     Now create the rule replacer with the constraints:
+
+     >>> replacer = Replacer(common_constraints=[free_x_a, free_x_b])
+
+     Add the matching rule:
+
+     >>> replacer.add(Equality(a_*x + b_, 0), -b_/a_)
+
+     Let's try it:
+
+     >>> replacer.replace(Equality(3*x + 4, 0))
+     -4/3
+
+     Notice that it will not match equations expressed with other patterns:
+
+     >>> eq = Equality(3*x, 4)
+     >>> replacer.replace(eq)
+     Eq(3*x, 4)
+
+     In order to extend the matching patterns, define another one (we also need
+     to clear the cache, because the previous result has already been memorized
+     and the pattern matcher will not iterate again if given the same expression)
+
+     >>> replacer.add(Equality(a_*x, b_), b_/a_)
+     >>> replacer._replacer.matcher.clear()
+     >>> replacer.replace(eq)
+     4/3
+     """
+
+     def __init__(self, common_constraints: list = []):
+         self._replacer = matchpy.ManyToOneReplacer()
+         self._common_constraint = common_constraints
+
+     def _get_lambda(self, lambda_str: str) -> Callable[..., Expr]:
+         exec("from sympy import *")
+         return eval(lambda_str, locals())
+
+     def _get_custom_constraint(self, constraint_expr: Expr, condition_template: str) -> Callable[..., Expr]:
+         wilds = [x.name for x in constraint_expr.atoms(_WildAbstract)]
+         lambdaargs = ', '.join(wilds)
+         fullexpr = _get_srepr(constraint_expr)
+         condition = condition_template.format(fullexpr)
+         return matchpy.CustomConstraint(
+             self._get_lambda(f"lambda {lambdaargs}: ({condition})"))
+
+     def _get_custom_constraint_nonfalse(self, constraint_expr: Expr) -> Callable[..., Expr]:
+         return self._get_custom_constraint(constraint_expr, "({}) != False")
+
+     def _get_custom_constraint_true(self, constraint_expr: Expr) -> Callable[..., Expr]:
+         return self._get_custom_constraint(constraint_expr, "({}) == True")
+
+     def add(self, expr: Expr, result: Expr, conditions_true: List[Expr] = [], conditions_nonfalse: List[Expr] = []) -> None:
+         expr = _sympify(expr)
+         result = _sympify(result)
+         lambda_str = f"lambda {', '.join((x.name for x in expr.atoms(_WildAbstract)))}: {_get_srepr(result)}"
+         lambda_expr = self._get_lambda(lambda_str)
+         constraints = self._common_constraint[:]
+         constraint_conditions_true = [
+             self._get_custom_constraint_true(cond) for cond in conditions_true]
+         constraint_conditions_nonfalse = [
+             self._get_custom_constraint_nonfalse(cond) for cond in conditions_nonfalse]
+         constraints.extend(constraint_conditions_true)
+         constraints.extend(constraint_conditions_nonfalse)
+         self._replacer.add(
+             matchpy.ReplacementRule(matchpy.Pattern(expr, *constraints), lambda_expr))
+
+     def replace(self, expr: Expr) -> Expr:
+         return self._replacer.replace(expr)
venv/lib/python3.10/site-packages/sympy/utilities/mathml/__init__.py ADDED
@@ -0,0 +1,79 @@
+ """Module with some functions for MathML, like transforming MathML
+ content into MathML presentation.
+
+ To use this module, you will need lxml.
+ """
+
+ from sympy.utilities.pkgdata import get_resource
+ from sympy.utilities.decorator import doctest_depends_on
+
+
+ __doctest_requires__ = {('apply_xsl', 'c2p'): ['lxml']}
+
+
+ def add_mathml_headers(s):
+     return """<math xmlns:mml="http://www.w3.org/1998/Math/MathML"
+           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+           xsi:schemaLocation="http://www.w3.org/1998/Math/MathML
+           http://www.w3.org/Math/XMLSchema/mathml2/mathml2.xsd">""" + s + "</math>"
+
+
+ @doctest_depends_on(modules=('lxml',))
+ def apply_xsl(mml, xsl):
+     """Apply an xsl to a MathML string.
+
+     Parameters
+     ==========
+
+     mml
+         A string with MathML code.
+     xsl
+         A string representing a path to an xsl (xml stylesheet) file.
+         This file name is relative to the PYTHONPATH.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.mathml import apply_xsl
+     >>> xsl = 'mathml/data/simple_mmlctop.xsl'
+     >>> mml = '<apply> <plus/> <ci>a</ci> <ci>b</ci> </apply>'
+     >>> res = apply_xsl(mml,xsl)
+     >>> ''.join(res.splitlines())
+     '<?xml version="1.0"?><mrow xmlns="http://www.w3.org/1998/Math/MathML"> <mi>a</mi> <mo> + </mo> <mi>b</mi></mrow>'
+     """
+     from lxml import etree
+
+     parser = etree.XMLParser(resolve_entities=False)
+     ac = etree.XSLTAccessControl.DENY_ALL
+
+     s = etree.XML(get_resource(xsl).read(), parser=parser)
+     transform = etree.XSLT(s, access_control=ac)
+     doc = etree.XML(mml, parser=parser)
+     result = transform(doc)
+     s = str(result)
+     return s
+
+
+ @doctest_depends_on(modules=('lxml',))
+ def c2p(mml, simple=False):
+     """Transform a document in MathML content markup (like the one that SymPy
+     produces) into a document in MathML presentation markup, which is more
+     suitable for printing and more widely accepted.
+
+     Examples
+     ========
+
+     >>> from sympy.utilities.mathml import c2p
+     >>> mml = '<apply> <exp/> <cn>2</cn> </apply>'
+     >>> c2p(mml,simple=True) != c2p(mml,simple=False)
+     True
+
+     """
+
+     if not mml.startswith('<math'):
+         mml = add_mathml_headers(mml)
+
+     if simple:
+         return apply_xsl(mml, 'mathml/data/simple_mmlctop.xsl')
+
+     return apply_xsl(mml, 'mathml/data/mmlctop.xsl')
venv/lib/python3.10/site-packages/sympy/utilities/mathml/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (2.59 kB).
 
venv/lib/python3.10/site-packages/sympy/utilities/mathml/data/mmlctop.xsl ADDED
The diff for this file is too large to render.
 
venv/lib/python3.10/site-packages/sympy/utilities/mathml/data/mmltex.xsl ADDED
The diff for this file is too large to render.
 
venv/lib/python3.10/site-packages/sympy/utilities/mathml/data/simple_mmlctop.xsl ADDED
The diff for this file is too large to render.
 
venv/lib/python3.10/site-packages/sympy/utilities/memoization.py ADDED
@@ -0,0 +1,59 @@
+ from functools import wraps
+
+
+ def recurrence_memo(initial):
+     """
+     Memo decorator for sequences defined by recurrence
+
+     See usage examples e.g. in the specfun/combinatorial module
+     """
+     cache = initial
+
+     def decorator(f):
+         @wraps(f)
+         def g(n):
+             L = len(cache)
+             if n <= L - 1:
+                 return cache[n]
+             for i in range(L, n + 1):
+                 cache.append(f(i, cache))
+             return cache[-1]
+         return g
+     return decorator
+
+
+ def assoc_recurrence_memo(base_seq):
+     """
+     Memo decorator for associated sequences defined by recurrence starting from base
+
+     base_seq(n) -- callable to get base sequence elements
+
+     XXX works only for Pn0 = base_seq(0) cases
+     XXX works only for m <= n cases
+     """
+
+     cache = []
+
+     def decorator(f):
+         @wraps(f)
+         def g(n, m):
+             L = len(cache)
+             if n < L:
+                 return cache[n][m]
+
+             for i in range(L, n + 1):
+                 # get base sequence
+                 F_i0 = base_seq(i)
+                 F_i_cache = [F_i0]
+                 cache.append(F_i_cache)
+
+                 # XXX only works for m <= n cases
+                 # generate assoc sequence
+                 for j in range(1, i + 1):
+                     F_ij = f(i, j, cache)
+                     F_i_cache.append(F_ij)
+
+             return cache[n][m]
+
+         return g
+     return decorator
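A minimal use of `recurrence_memo`, in the spirit of how the combinatorial module applies it: the decorator argument seeds the cache with the base cases, and the decorated function receives the growing cache as its second argument. The Fibonacci definition here is an illustrative sketch, not code from the module.

```python
from functools import wraps

def recurrence_memo(initial):
    # Same structure as the decorator above: the cache list doubles
    # as the memo table, seeded with the base cases.
    cache = initial

    def decorator(f):
        @wraps(f)
        def g(n):
            L = len(cache)
            if n <= L - 1:
                return cache[n]        # already computed
            for i in range(L, n + 1):  # extend the sequence up to n
                cache.append(f(i, cache))
            return cache[-1]
        return g
    return decorator

@recurrence_memo([0, 1])  # F(0) = 0, F(1) = 1 seed the recurrence
def fib(n, prev):
    # prev is the cache of all earlier values, so the recurrence is O(1) per term
    return prev[n - 1] + prev[n - 2]

print([fib(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Note the cache is shared across all calls to the decorated function, which is exactly what makes repeated evaluations cheap (and why the decorator is reserved for pure recurrences).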
venv/lib/python3.10/site-packages/sympy/utilities/misc.py ADDED
@@ -0,0 +1,565 @@
1
+ """Miscellaneous stuff that does not really fit anywhere else."""
2
+
3
+ from __future__ import annotations
4
+
5
+ import operator
6
+ import sys
7
+ import os
8
+ import re as _re
9
+ import struct
10
+ from textwrap import fill, dedent
11
+
12
+
13
+ class Undecidable(ValueError):
14
+ # an error to be raised when a decision cannot be made definitively
15
+ # where a definitive answer is needed
16
+ pass
17
+
18
+
19
+ def filldedent(s, w=70, **kwargs):
20
+ """
21
+ Strips leading and trailing empty lines from a copy of ``s``, then dedents,
22
+ fills and returns it.
23
+
24
+ Empty line stripping serves to deal with docstrings like this one that
25
+ start with a newline after the initial triple quote, inserting an empty
26
+ line at the beginning of the string.
27
+
28
+ Additional keyword arguments will be passed to ``textwrap.fill()``.
29
+
30
+ See Also
31
+ ========
32
+ strlines, rawlines
33
+
34
+ """
35
+ return '\n' + fill(dedent(str(s)).strip('\n'), width=w, **kwargs)
36
+
37
+
38
+ def strlines(s, c=64, short=False):
39
+ """Return a cut-and-pastable string that, when printed, is
40
+ equivalent to the input. The lines will be surrounded by
41
+ parentheses and no line will be longer than c (default 64)
42
+ characters. If the line contains newlines characters, the
43
+ `rawlines` result will be returned. If ``short`` is True
44
+ (default is False) then if there is one line it will be
45
+ returned without bounding parentheses.
46
+
47
+ Examples
48
+ ========
49
+
50
+ >>> from sympy.utilities.misc import strlines
51
+ >>> q = 'this is a long string that should be broken into shorter lines'
52
+ >>> print(strlines(q, 40))
53
+ (
54
+ 'this is a long string that should be b'
55
+ 'roken into shorter lines'
56
+ )
57
+ >>> q == (
58
+ ... 'this is a long string that should be b'
59
+ ... 'roken into shorter lines'
60
+ ... )
61
+ True
62
+
63
+ See Also
64
+ ========
65
+ filldedent, rawlines
66
+ """
67
+ if not isinstance(s, str):
68
+ raise ValueError('expecting string input')
69
+ if '\n' in s:
70
+ return rawlines(s)
71
+ q = '"' if repr(s).startswith('"') else "'"
72
+ q = (q,)*2
73
+ if '\\' in s: # use r-string
74
+ m = '(\nr%s%%s%s\n)' % q
75
+ j = '%s\nr%s' % q
76
+ c -= 3
77
+ else:
78
+ m = '(\n%s%%s%s\n)' % q
79
+ j = '%s\n%s' % q
80
+ c -= 2
81
+ out = []
82
+ while s:
83
+ out.append(s[:c])
84
+ s=s[c:]
85
+ if short and len(out) == 1:
86
+ return (m % out[0]).splitlines()[1] # strip bounding (\n...\n)
87
+ return m % j.join(out)
88
+
89
+
90
+ def rawlines(s):
91
+ """Return a cut-and-pastable string that, when printed, is equivalent
92
+ to the input. Use this when there is more than one line in the
93
+ string. The string returned is formatted so it can be indented
94
+ nicely within tests; in some cases it is wrapped in the dedent
95
+ function which has to be imported from textwrap.
96
+
97
+ Examples
98
+ ========
99
+
100
+ Note: because there are characters in the examples below that need
101
+ to be escaped because they are themselves within a triple quoted
102
+ docstring, expressions below look more complicated than they would
103
+ be if they were printed in an interpreter window.
104
+
105
+ >>> from sympy.utilities.misc import rawlines
106
+ >>> from sympy import TableForm
107
+ >>> s = str(TableForm([[1, 10]], headings=(None, ['a', 'bee'])))
108
+ >>> print(rawlines(s))
109
+ (
110
+ 'a bee\\n'
111
+ '-----\\n'
112
+ '1 10 '
113
+ )
114
+ >>> print(rawlines('''this
115
+ ... that'''))
116
+ dedent('''\\
117
+ this
118
+ that''')
119
+
120
+ >>> print(rawlines('''this
121
+ ... that
122
+ ... '''))
123
+ dedent('''\\
124
+ this
125
+ that
126
+ ''')
127
+
128
+ >>> s = \"\"\"this
129
+ ... is a triple '''
130
+ ... \"\"\"
131
+ >>> print(rawlines(s))
132
+ dedent(\"\"\"\\
133
+ this
134
+ is a triple '''
135
+ \"\"\")
136
+
137
+ >>> print(rawlines('''this
138
+ ... that
139
+ ... '''))
140
+ (
141
+ 'this\\n'
142
+ 'that\\n'
143
+ ' '
144
+ )
145
+
146
+ See Also
147
+ ========
148
+ filldedent, strlines
149
+ """
150
+ lines = s.split('\n')
151
+ if len(lines) == 1:
152
+ return repr(lines[0])
153
+ triple = ["'''" in s, '"""' in s]
154
+ if any(li.endswith(' ') for li in lines) or '\\' in s or all(triple):
155
+ rv = []
156
+ # add on the newlines
157
+ trailing = s.endswith('\n')
158
+ last = len(lines) - 1
159
+ for i, li in enumerate(lines):
160
+ if i != last or trailing:
161
+ rv.append(repr(li + '\n'))
162
+ else:
163
+ rv.append(repr(li))
164
+ return '(\n %s\n)' % '\n '.join(rv)
165
+ else:
166
+ rv = '\n '.join(lines)
167
+ if triple[0]:
168
+ return 'dedent("""\\\n %s""")' % rv
169
+ else:
170
+ return "dedent('''\\\n %s''')" % rv
171
+
172
+ ARCH = str(struct.calcsize('P') * 8) + "-bit"
173
+
174
+
175
+ # XXX: PyPy does not support hash randomization
176
+ HASH_RANDOMIZATION = getattr(sys.flags, 'hash_randomization', False)
177
+
178
+ _debug_tmp: list[str] = []
179
+ _debug_iter = 0
180
+
181
+ def debug_decorator(func):
182
+ """If SYMPY_DEBUG is True, it will print a nice execution tree with
183
+ arguments and results of all decorated functions, else do nothing.
184
+ """
185
+ from sympy import SYMPY_DEBUG
186
+
187
+ if not SYMPY_DEBUG:
188
+ return func
189
+
190
+ def maketree(f, *args, **kw):
191
+ global _debug_tmp
192
+ global _debug_iter
193
+ oldtmp = _debug_tmp
194
+ _debug_tmp = []
195
+ _debug_iter += 1
196
+
197
+ def tree(subtrees):
198
+ def indent(s, variant=1):
199
+ x = s.split("\n")
200
+ r = "+-%s\n" % x[0]
201
+ for a in x[1:]:
202
+ if a == "":
203
+ continue
204
+ if variant == 1:
205
+ r += "| %s\n" % a
206
+ else:
207
+ r += " %s\n" % a
208
+ return r
209
+ if len(subtrees) == 0:
210
+ return ""
211
+ f = []
212
+ for a in subtrees[:-1]:
213
+ f.append(indent(a))
214
+ f.append(indent(subtrees[-1], 2))
215
+ return ''.join(f)
216
+
217
+ # If there is a bug and the algorithm enters an infinite loop, enable the
218
+ # following lines. It will print the names and parameters of all major functions
219
+ # that are called, *before* they are called
220
+ #from functools import reduce
221
+ #print("%s%s %s%s" % (_debug_iter, reduce(lambda x, y: x + y, \
222
+ # map(lambda x: '-', range(1, 2 + _debug_iter))), f.__name__, args))
223
+
224
+ r = f(*args, **kw)
225
+
226
+ _debug_iter -= 1
227
+ s = "%s%s = %s\n" % (f.__name__, args, r)
228
+ if _debug_tmp != []:
229
+ s += tree(_debug_tmp)
230
+ _debug_tmp = oldtmp
231
+ _debug_tmp.append(s)
232
+ if _debug_iter == 0:
233
+ print(_debug_tmp[0])
234
+ _debug_tmp = []
235
+ return r
236
+
237
+ def decorated(*args, **kwargs):
238
+ return maketree(func, *args, **kwargs)
239
+
240
+ return decorated
241
+
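The call-tree printing above (depth tracking plus `+-`/`| ` indentation) can be sketched in miniature. This is a simplified illustration, not SymPy's implementation: it logs into a list instead of printing, and `trace`, `trace_log`, and `fib` are names invented for the sketch.

```python
import functools

_depth = 0
trace_log = []  # collect lines instead of printing, so the sketch is testable


def trace(func):
    """Minimal sketch of the debug_decorator idea: record each call and
    its result, indented by the current recursion depth."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        global _depth
        trace_log.append("%s+-%s%s" % ("| " * _depth, func.__name__, args))
        _depth += 1
        result = func(*args, **kwargs)
        _depth -= 1
        trace_log.append("%s+-%s%s = %s" % ("| " * _depth, func.__name__, args, result))
        return result
    return wrapper


@trace
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)


total = fib(3)  # trace_log now holds the indented call tree
```

As in `maketree`, the outermost call is only reported once every nested call has returned.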
242
+
243
+ def debug(*args):
244
+ """
245
+ Print ``*args`` if SYMPY_DEBUG is True, else do nothing.
246
+ """
247
+ from sympy import SYMPY_DEBUG
248
+ if SYMPY_DEBUG:
249
+ print(*args, file=sys.stderr)
250
+
251
+
252
+ def debugf(string, args):
253
+ """
254
+ Print ``string%args`` if SYMPY_DEBUG is True, else do nothing. This is
255
+ intended for debug messages using formatted strings.
256
+ """
257
+ from sympy import SYMPY_DEBUG
258
+ if SYMPY_DEBUG:
259
+ print(string%args, file=sys.stderr)
260
+
261
+
262
+ def find_executable(executable, path=None):
263
+ """Try to find 'executable' in the directories listed in 'path' (a
264
+ string listing directories separated by 'os.pathsep'; defaults to
265
+ os.environ['PATH']). Returns the complete filename or None if not
266
+ found
267
+ """
268
+ from .exceptions import sympy_deprecation_warning
269
+ sympy_deprecation_warning(
270
+ """
271
+ sympy.utilities.misc.find_executable() is deprecated. Use the standard
272
+ library shutil.which() function instead.
273
+ """,
274
+ deprecated_since_version="1.7",
275
+ active_deprecations_target="deprecated-find-executable",
276
+ )
277
+ if path is None:
278
+ path = os.environ['PATH']
279
+ paths = path.split(os.pathsep)
280
+ extlist = ['']
281
+ if os.name == 'os2':
282
+ (base, ext) = os.path.splitext(executable)
283
+ # executable files on OS/2 can have an arbitrary extension, but
284
+ # .exe is automatically appended if no dot is present in the name
285
+ if not ext:
286
+ executable = executable + ".exe"
287
+ elif sys.platform == 'win32':
288
+ pathext = os.environ['PATHEXT'].lower().split(os.pathsep)
289
+ (base, ext) = os.path.splitext(executable)
290
+ if ext.lower() not in pathext:
291
+ extlist = pathext
292
+ for ext in extlist:
293
+ execname = executable + ext
294
+ if os.path.isfile(execname):
295
+ return execname
296
+ else:
297
+ for p in paths:
298
+ f = os.path.join(p, execname)
299
+ if os.path.isfile(f):
300
+ return f
301
+
302
+ return None
303
+
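The deprecation message points at `shutil.which`, which performs the same PATH (and, on Windows, PATHEXT) search as the function above. A minimal usage sketch; the tool names queried here are arbitrary examples:

```python
import shutil

# Returns the absolute path of the executable, or None if not found on PATH.
git_or_none = shutil.which("git")
missing = shutil.which("no-such-tool-xq9z")  # -> None
```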
304
+
305
+ def func_name(x, short=False):
306
+ """Return function name of `x` (if defined) else the `type(x)`.
307
+ If short is True and there is a shorter alias for the result,
308
+ return the alias.
309
+
310
+ Examples
311
+ ========
312
+
313
+ >>> from sympy.utilities.misc import func_name
314
+ >>> from sympy import Matrix
315
+ >>> from sympy.abc import x
316
+ >>> func_name(Matrix.eye(3))
317
+ 'MutableDenseMatrix'
318
+ >>> func_name(x < 1)
319
+ 'StrictLessThan'
320
+ >>> func_name(x < 1, short=True)
321
+ 'Lt'
322
+ """
323
+ alias = {
324
+ 'GreaterThan': 'Ge',
325
+ 'StrictGreaterThan': 'Gt',
326
+ 'LessThan': 'Le',
327
+ 'StrictLessThan': 'Lt',
328
+ 'Equality': 'Eq',
329
+ 'Unequality': 'Ne',
330
+ }
331
+ typ = type(x)
332
+ if str(typ).startswith("<type '"):
333
+ typ = str(typ).split("'")[1].split("'")[0]
334
+ elif str(typ).startswith("<class '"):
335
+ typ = str(typ).split("'")[1].split("'")[0]
336
+ rv = getattr(getattr(x, 'func', x), '__name__', typ)
337
+ if '.' in rv:
338
+ rv = rv.split('.')[-1]
339
+ if short:
340
+ rv = alias.get(rv, rv)
341
+ return rv
342
+
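The lookup order used above can be pared down to one line: prefer `x.func.__name__` (present on SymPy expressions), otherwise fall back to the type name. `simple_func_name` is a name invented for this sketch and drops the alias table and the `<class '...'>` string parsing:

```python
def simple_func_name(x):
    """Sketch of func_name's core lookup: x.func.__name__ if available,
    else x.__name__, else the name of type(x)."""
    return getattr(getattr(x, 'func', x), '__name__', type(x).__name__)
```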
343
+
344
+ def _replace(reps):
345
+ """Return a function that can make the replacements, given in
346
+ ``reps``, on a string. The replacements should be given as mapping.
347
+
348
+ Examples
349
+ ========
350
+
351
+ >>> from sympy.utilities.misc import _replace
352
+ >>> f = _replace(dict(foo='bar', d='t'))
353
+ >>> f('food')
354
+ 'bart'
355
+ >>> f = _replace({})
356
+ >>> f('food')
357
+ 'food'
358
+ """
359
+ if not reps:
360
+ return lambda x: x
361
+ D = lambda match: reps[match.group(0)]
362
+ pattern = _re.compile("|".join(
363
+ [_re.escape(k) for k, v in reps.items()]), _re.M)
364
+ return lambda string: pattern.sub(D, string)
365
+
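The technique `_replace` relies on is a single alternation pattern plus a lookup callback, so all replacements happen in one pass over the string. A hedged sketch (`multi_replace` is a name invented here); sorting the keys longest-first makes the "longer strings first" behavior explicit, since `re` alternation tries branches left to right:

```python
import re


def multi_replace(string, reps):
    """One-pass multi-substring replacement via a union regex."""
    if not reps:
        return string
    keys = sorted(reps, key=len, reverse=True)  # longest keys win ties
    pattern = re.compile("|".join(re.escape(k) for k in keys))
    return pattern.sub(lambda m: reps[m.group(0)], string)


out1 = multi_replace('food', {'foo': 'bar', 'd': 't'})
out2 = multi_replace('spamham sha', {'sha': 'md5', 'spam': 'eggs'})
```

Because every key is matched in the same pass, an earlier replacement's output is never re-scanned, unlike chained `str.replace` calls.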
366
+
367
+ def replace(string, *reps):
368
+ """Return ``string`` with all keys in ``reps`` replaced with
369
+ their corresponding values, longer strings first, irrespective
370
+ of the order they are given. ``reps`` may be passed as tuples
371
+ or a single mapping.
372
+
373
+ Examples
374
+ ========
375
+
376
+ >>> from sympy.utilities.misc import replace
377
+ >>> replace('foo', {'oo': 'ar', 'f': 'b'})
378
+ 'bar'
379
+ >>> replace("spamham sha", ("spam", "eggs"), ("sha","md5"))
380
+ 'eggsham md5'
381
+
382
+ There is no guarantee that a unique answer will be
383
+ obtained if keys in a mapping overlap (i.e. are the same
384
+ length and have some identical sequence at the
385
+ beginning/end):
386
+
387
+ >>> reps = [
388
+ ... ('ab', 'x'),
389
+ ... ('bc', 'y')]
390
+ >>> replace('abc', *reps) in ('xc', 'ay')
391
+ True
392
+
393
+ References
394
+ ==========
395
+
396
+ .. [1] https://stackoverflow.com/questions/6116978/how-to-replace-multiple-substrings-of-a-string
397
+ """
398
+ if len(reps) == 1:
399
+ kv = reps[0]
400
+ if isinstance(kv, dict):
401
+ reps = kv
402
+ else:
403
+ return string.replace(*kv)
404
+ else:
405
+ reps = dict(reps)
406
+ return _replace(reps)(string)
407
+
408
+
409
+ def translate(s, a, b=None, c=None):
410
+ """Return ``s`` where characters have been replaced or deleted.
411
+
412
+ SYNTAX
413
+ ======
414
+
415
+ translate(s, None, deletechars):
416
+ all characters in ``deletechars`` are deleted
417
+ translate(s, map [,deletechars]):
418
+ all characters in ``deletechars`` (if provided) are deleted
419
+ then the replacements defined by map are made; if the keys
420
+ of map are strings then the longer ones are handled first.
421
+ Multicharacter deletions should have a value of ''.
422
+ translate(s, oldchars, newchars, deletechars)
423
+ all characters in ``deletechars`` are deleted
424
+ then each character in ``oldchars`` is replaced with the
425
+ corresponding character in ``newchars``
426
+
427
+ Examples
428
+ ========
429
+
430
+ >>> from sympy.utilities.misc import translate
431
+ >>> abc = 'abc'
432
+ >>> translate(abc, None, 'a')
433
+ 'bc'
434
+ >>> translate(abc, {'a': 'x'}, 'c')
435
+ 'xb'
436
+ >>> translate(abc, {'abc': 'x', 'a': 'y'})
437
+ 'x'
438
+
439
+ >>> translate('abcd', 'ac', 'AC', 'd')
440
+ 'AbC'
441
+
442
+ There is no guarantee that a unique answer will be
443
+ obtained if keys in a mapping overlap (i.e. are the same
444
+ length and have some identical sequence at the
445
+ beginning/end):
446
+
447
+ >>> translate(abc, {'ab': 'x', 'bc': 'y'}) in ('xc', 'ay')
448
+ True
449
+ """
450
+
451
+ mr = {}
452
+ if a is None:
453
+ if c is not None:
454
+ raise ValueError('c should be None when a=None is passed, instead got %s' % c)
455
+ if b is None:
456
+ return s
457
+ c = b
458
+ a = b = ''
459
+ else:
460
+ if isinstance(a, dict):
461
+ short = {}
462
+ for k in list(a.keys()):
463
+ if len(k) == 1 and len(a[k]) == 1:
464
+ short[k] = a.pop(k)
465
+ mr = a
466
+ c = b
467
+ if short:
468
+ a, b = [''.join(i) for i in list(zip(*short.items()))]
469
+ else:
470
+ a = b = ''
471
+ elif len(a) != len(b):
472
+ raise ValueError('oldchars and newchars have different lengths')
473
+
474
+ if c:
475
+ val = str.maketrans('', '', c)
476
+ s = s.translate(val)
477
+ s = replace(s, mr)
478
+ n = str.maketrans(a, b)
479
+ return s.translate(n)
480
+
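The last two steps of `translate` are plain `str.maketrans`/`str.translate` calls: deletion first, then the one-to-one character mapping. Sketched directly, reproducing the `translate('abcd', 'ac', 'AC', 'd')` doctest above:

```python
s = 'abcd'
s = s.translate(str.maketrans('', '', 'd'))  # delete 'd' -> 'abc'
s = s.translate(str.maketrans('ac', 'AC'))   # map a->A, c->C -> 'AbC'
```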
481
+
482
+ def ordinal(num):
483
+ """Return ordinal number string of num, e.g. 1 becomes 1st.
484
+ """
485
+ # modified from https://codereview.stackexchange.com/questions/41298/producing-ordinal-numbers
486
+ n = as_int(num)
487
+ k = abs(n) % 100
488
+ if 11 <= k <= 13:
489
+ suffix = 'th'
490
+ elif k % 10 == 1:
491
+ suffix = 'st'
492
+ elif k % 10 == 2:
493
+ suffix = 'nd'
494
+ elif k % 10 == 3:
495
+ suffix = 'rd'
496
+ else:
497
+ suffix = 'th'
498
+ return str(n) + suffix
499
+
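The suffix logic above handles the 11th/12th/13th exception before applying the last-digit rule. The same rule, sketched compactly (`suffix` is a name invented for this sketch):

```python
def suffix(n):
    """Ordinal suffix for an integer, mirroring ordinal()'s rules."""
    k = abs(n) % 100
    if 11 <= k <= 13:  # 11th, 12th, 13th override the last-digit rule
        return 'th'
    return {1: 'st', 2: 'nd', 3: 'rd'}.get(k % 10, 'th')
```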
500
+
501
+ def as_int(n, strict=True):
502
+ """
503
+ Convert the argument to a builtin integer.
504
+
505
+ The return value is guaranteed to be equal to the input. ValueError is
506
+ raised if the input has a non-integral value. When ``strict`` is True, this
507
+ uses `__index__ <https://docs.python.org/3/reference/datamodel.html#object.__index__>`_
508
+ and when it is False it uses ``int``.
509
+
510
+
511
+ Examples
512
+ ========
513
+
514
+ >>> from sympy.utilities.misc import as_int
515
+ >>> from sympy import sqrt, S
516
+
517
+ The function is primarily concerned with sanitizing input for
518
+ functions that need to work with builtin integers, so anything that
519
+ is unambiguously an integer should be returned as an int:
520
+
521
+ >>> as_int(S(3))
522
+ 3
523
+
524
+ Floats, being of limited precision, are not assumed to be exact and
525
+ will raise an error unless the ``strict`` flag is False. This
526
+ precision issue becomes apparent for large floating point numbers:
527
+
528
+ >>> big = 1e23
529
+ >>> type(big) is float
530
+ True
531
+ >>> big == int(big)
532
+ True
533
+ >>> as_int(big)
534
+ Traceback (most recent call last):
535
+ ...
536
+ ValueError: ... is not an integer
537
+ >>> as_int(big, strict=False)
538
+ 99999999999999991611392
539
+
540
+ Input that might be a complex representation of an integer value is
541
+ also rejected by default:
542
+
543
+ >>> one = sqrt(3 + 2*sqrt(2)) - sqrt(2)
544
+ >>> int(one) == 1
545
+ True
546
+ >>> as_int(one)
547
+ Traceback (most recent call last):
548
+ ...
549
+ ValueError: ... is not an integer
550
+ """
551
+ if strict:
552
+ try:
553
+ if isinstance(n, bool):
554
+ raise TypeError
555
+ return operator.index(n)
556
+ except TypeError:
557
+ raise ValueError('%s is not an integer' % (n,))
558
+ else:
559
+ try:
560
+ result = int(n)
561
+ except TypeError:
562
+ raise ValueError('%s is not an integer' % (n,))
563
+ if n != result:
564
+ raise ValueError('%s is not an integer' % (n,))
565
+ return result
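The strict/non-strict split above comes down to `operator.index` versus `int`: floats define no `__index__`, so the strict path rejects them, while the non-strict path accepts the float's exact integer value. A small sketch of both paths using the `1e23` example from the docstring:

```python
import operator

big = 1e23
try:
    operator.index(big)  # strict path: floats have no __index__
    strict_ok = True
except TypeError:
    strict_ok = False

loose = int(big)  # non-strict path: exact int value of the nearest double
```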
venv/lib/python3.10/site-packages/sympy/utilities/pkgdata.py ADDED
@@ -0,0 +1,56 @@
1
+ """
2
+ pkgdata is a simple, extensible way for a package to acquire data file
3
+ resources.
4
+
5
+ The getResource function is equivalent to the standard idioms, such as
6
+ the following minimal implementation::
7
+
8
+ import sys, os
9
+
10
+ def getResource(identifier, pkgname=__name__):
11
+ pkgpath = os.path.dirname(sys.modules[pkgname].__file__)
12
+ path = os.path.join(pkgpath, identifier)
13
+ return open(os.path.normpath(path), mode='rb')
14
+
15
+ When a __loader__ is present on the module given by __name__, it will defer
16
+ getResource to its get_data implementation and return it as a file-like
17
+ object (such as StringIO).
18
+ """
19
+
20
+ import sys
21
+ import os
22
+ from io import StringIO
23
+
24
+
25
+ def get_resource(identifier, pkgname=__name__):
26
+ """
27
+ Acquire a readable object for a given package name and identifier.
28
+ An IOError will be raised if the resource cannot be found.
29
+
30
+ For example::
31
+
32
+ mydata = get_resource('mypkgdata.jpg').read()
33
+
34
+ Note that the package name must be fully qualified, if given, such
35
+ that it would be found in sys.modules.
36
+
37
+ In some cases, getResource will return a real file object. In that
38
+ case, it may be useful to use its name attribute to get the path
39
+ rather than use it as a file-like object. For example, you may
40
+ be handing data off to a C API.
41
+ """
42
+
43
+ mod = sys.modules[pkgname]
44
+ fn = getattr(mod, '__file__', None)
45
+ if fn is None:
46
+ raise OSError("%r has no __file__!" % pkgname)
47
+ path = os.path.join(os.path.dirname(fn), identifier)
48
+ loader = getattr(mod, '__loader__', None)
49
+ if loader is not None:
50
+ try:
51
+ data = loader.get_data(path)
52
+ except (OSError, AttributeError):
53
+ pass
54
+ else:
55
+ return StringIO(data.decode('utf-8'))
56
+ return open(os.path.normpath(path), 'rb')
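The "minimal implementation" idiom from the module docstring can be made runnable as below. `get_resource_minimal` is a name invented for this sketch; it assumes `pkgname` names an already-imported module with a real `__file__` on disk and omits the `__loader__.get_data` fallback:

```python
import os
import sys


def get_resource_minimal(identifier, pkgname):
    """Open a data file that lives next to the named module's source."""
    pkgpath = os.path.dirname(sys.modules[pkgname].__file__)
    return open(os.path.normpath(os.path.join(pkgpath, identifier)), mode='rb')


# e.g. read the first bytes of the os module's own source file
with get_resource_minimal(os.path.basename(os.__file__), 'os') as fh:
    head = fh.read(16)
```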
venv/lib/python3.10/site-packages/sympy/utilities/pytest.py ADDED
@@ -0,0 +1,12 @@
1
+ """
2
+ .. deprecated:: 1.6
3
+
4
+ sympy.utilities.pytest has been renamed to sympy.testing.pytest.
5
+ """
6
+ from sympy.utilities.exceptions import sympy_deprecation_warning
7
+
8
+ sympy_deprecation_warning("The sympy.utilities.pytest submodule is deprecated. Use sympy.testing.pytest instead.",
9
+ deprecated_since_version="1.6",
10
+ active_deprecations_target="deprecated-sympy-utilities-submodules")
11
+
12
+ from sympy.testing.pytest import * # noqa:F401
venv/lib/python3.10/site-packages/sympy/utilities/randtest.py ADDED
@@ -0,0 +1,12 @@
1
+ """
2
+ .. deprecated:: 1.6
3
+
4
+ sympy.utilities.randtest has been renamed to sympy.core.random.
5
+ """
6
+ from sympy.utilities.exceptions import sympy_deprecation_warning
7
+
8
+ sympy_deprecation_warning("The sympy.utilities.randtest submodule is deprecated. Use sympy.core.random instead.",
9
+ deprecated_since_version="1.6",
10
+ active_deprecations_target="deprecated-sympy-utilities-submodules")
11
+
12
+ from sympy.core.random import * # noqa:F401
venv/lib/python3.10/site-packages/sympy/utilities/runtests.py ADDED
@@ -0,0 +1,13 @@
1
+ """
2
+ .. deprecated:: 1.6
3
+
4
+ sympy.utilities.runtests has been renamed to sympy.testing.runtests.
5
+ """
6
+
7
+ from sympy.utilities.exceptions import sympy_deprecation_warning
8
+
9
+ sympy_deprecation_warning("The sympy.utilities.runtests submodule is deprecated. Use sympy.testing.runtests instead.",
10
+ deprecated_since_version="1.6",
11
+ active_deprecations_target="deprecated-sympy-utilities-submodules")
12
+
13
+ from sympy.testing.runtests import * # noqa:F401
venv/lib/python3.10/site-packages/sympy/utilities/source.py ADDED
@@ -0,0 +1,40 @@
1
+ """
2
+ This module adds several functions for interactive source code inspection.
3
+ """
4
+
5
+
6
+ def get_class(lookup_view):
7
+ """
8
+ Convert a string version of a class name to the object.
9
+
10
+ For example, get_class('sympy.core.Basic') will return
11
+ class Basic located in module sympy.core
12
+ """
13
+ if isinstance(lookup_view, str):
14
+ mod_name, func_name = get_mod_func(lookup_view)
15
+ if func_name != '':
16
+ lookup_view = getattr(
17
+ __import__(mod_name, {}, {}, ['*']), func_name)
18
+ if not callable(lookup_view):
19
+ raise AttributeError(
20
+ "'%s.%s' is not a callable." % (mod_name, func_name))
21
+ return lookup_view
22
+
23
+
24
+ def get_mod_func(callback):
25
+ """
26
+ Splits the string path to a class into a string path to the module
27
+ and the name of the class.
28
+
29
+ Examples
30
+ ========
31
+
32
+ >>> from sympy.utilities.source import get_mod_func
33
+ >>> get_mod_func('sympy.core.basic.Basic')
34
+ ('sympy.core.basic', 'Basic')
35
+
36
+ """
37
+ dot = callback.rfind('.')
38
+ if dot == -1:
39
+ return callback, ''
40
+ return callback[:dot], callback[dot + 1:]
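The `get_class`/`get_mod_func` pair amounts to: split a dotted path at the last dot, import the module part, then fetch the attribute. A hedged sketch using `importlib` instead of raw `__import__` (`load_object` is a name invented here):

```python
import importlib


def load_object(dotted_path):
    """Resolve 'pkg.module.Name' to the object it names."""
    mod_name, _, obj_name = dotted_path.rpartition('.')
    return getattr(importlib.import_module(mod_name), obj_name)


joined = load_object('os.path.join')('a', 'b')
```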
venv/lib/python3.10/site-packages/sympy/utilities/tests/__init__.py ADDED
File without changes
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (189 Bytes). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_autowrap.cpython-310.pyc ADDED
Binary file (14.5 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen.cpython-310.pyc ADDED
Binary file (44.2 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen_julia.cpython-310.pyc ADDED
Binary file (15.4 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen_octave.cpython-310.pyc ADDED
Binary file (15.2 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_codegen_rust.cpython-310.pyc ADDED
Binary file (10.4 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_decorator.cpython-310.pyc ADDED
Binary file (5.03 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_deprecated.cpython-310.pyc ADDED
Binary file (704 Bytes). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_enumerative.cpython-310.pyc ADDED
Binary file (5.33 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_exceptions.cpython-310.pyc ADDED
Binary file (862 Bytes). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_iterables.cpython-310.pyc ADDED
Binary file (39 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_lambdify.cpython-310.pyc ADDED
Binary file (60.5 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_matchpy_connector.cpython-310.pyc ADDED
Binary file (4.43 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_mathml.cpython-310.pyc ADDED
Binary file (1.11 kB). View file
 
venv/lib/python3.10/site-packages/sympy/utilities/tests/__pycache__/test_misc.cpython-310.pyc ADDED
Binary file (5.29 kB). View file