applied-ai-018 committed
Commit 3aaa86e · verified · 1 parent: fcc998f

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. ckpts/universal/global_step80/zero/24.mlp.dense_h_to_4h.weight/exp_avg.pt +3 -0
  2. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/__init__.cpython-310.pyc +0 -0
  3. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/betweenness.cpython-310.pyc +0 -0
  4. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/betweenness_subset.cpython-310.pyc +0 -0
  5. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/closeness.cpython-310.pyc +0 -0
  6. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/current_flow_betweenness.cpython-310.pyc +0 -0
  7. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/current_flow_betweenness_subset.cpython-310.pyc +0 -0
  8. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/current_flow_closeness.cpython-310.pyc +0 -0
  9. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/degree_alg.cpython-310.pyc +0 -0
  10. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/dispersion.cpython-310.pyc +0 -0
  11. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/eigenvector.cpython-310.pyc +0 -0
  12. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/flow_matrix.cpython-310.pyc +0 -0
  13. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/group.cpython-310.pyc +0 -0
  14. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/harmonic.cpython-310.pyc +0 -0
  15. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/katz.cpython-310.pyc +0 -0
  16. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/laplacian.cpython-310.pyc +0 -0
  17. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/load.cpython-310.pyc +0 -0
  18. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/percolation.cpython-310.pyc +0 -0
  19. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/reaching.cpython-310.pyc +0 -0
  20. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/second_order.cpython-310.pyc +0 -0
  21. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/subgraph_alg.cpython-310.pyc +0 -0
  22. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/trophic.cpython-310.pyc +0 -0
  23. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/voterank_alg.cpython-310.pyc +0 -0
  24. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/betweenness.py +435 -0
  25. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/current_flow_betweenness_subset.py +226 -0
  26. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/dispersion.py +107 -0
  27. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/eigenvector.py +341 -0
  28. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/flow_matrix.py +130 -0
  29. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/harmonic.py +80 -0
  30. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/laplacian.py +149 -0
  31. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/load.py +199 -0
  32. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/percolation.py +128 -0
  33. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/reaching.py +206 -0
  34. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/__init__.cpython-310.pyc +0 -0
  35. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_betweenness_centrality.cpython-310.pyc +0 -0
  36. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_betweenness_centrality_subset.cpython-310.pyc +0 -0
  37. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_closeness_centrality.cpython-310.pyc +0 -0
  38. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_current_flow_betweenness_centrality.cpython-310.pyc +0 -0
  39. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_current_flow_betweenness_centrality_subset.cpython-310.pyc +0 -0
  40. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_current_flow_closeness.cpython-310.pyc +0 -0
  41. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_degree_centrality.cpython-310.pyc +0 -0
  42. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_dispersion.cpython-310.pyc +0 -0
  43. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_eigenvector_centrality.cpython-310.pyc +0 -0
  44. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_group.cpython-310.pyc +0 -0
  45. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_harmonic_centrality.cpython-310.pyc +0 -0
  46. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_katz_centrality.cpython-310.pyc +0 -0
  47. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_laplacian_centrality.cpython-310.pyc +0 -0
  48. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_load_centrality.cpython-310.pyc +0 -0
  49. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_percolation_centrality.cpython-310.pyc +0 -0
  50. venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_reaching.cpython-310.pyc +0 -0
ckpts/universal/global_step80/zero/24.mlp.dense_h_to_4h.weight/exp_avg.pt ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2d95348d63ff37f429c12caa2847fa139f4b005d19b8b49a3efdbafe9ceed34c
+size 33555612
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (681 Bytes).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/betweenness.cpython-310.pyc ADDED
Binary file (12.3 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/betweenness_subset.cpython-310.pyc ADDED
Binary file (8.22 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/closeness.cpython-310.pyc ADDED
Binary file (9.18 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/current_flow_betweenness.cpython-310.pyc ADDED
Binary file (11.3 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/current_flow_betweenness_subset.cpython-310.pyc ADDED
Binary file (7.97 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/current_flow_closeness.cpython-310.pyc ADDED
Binary file (3.38 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/degree_alg.cpython-310.pyc ADDED
Binary file (4.57 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/dispersion.cpython-310.pyc ADDED
Binary file (3.3 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/eigenvector.cpython-310.pyc ADDED
Binary file (12.7 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/flow_matrix.cpython-310.pyc ADDED
Binary file (5.09 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/group.cpython-310.pyc ADDED
Binary file (21.5 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/harmonic.cpython-310.pyc ADDED
Binary file (2.84 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/katz.cpython-310.pyc ADDED
Binary file (10.8 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/laplacian.cpython-310.pyc ADDED
Binary file (5.52 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/load.cpython-310.pyc ADDED
Binary file (5.68 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/percolation.cpython-310.pyc ADDED
Binary file (4.17 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/reaching.cpython-310.pyc ADDED
Binary file (6.71 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/second_order.cpython-310.pyc ADDED
Binary file (5.4 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/subgraph_alg.cpython-310.pyc ADDED
Binary file (9.29 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/trophic.cpython-310.pyc ADDED
Binary file (4.3 kB).

venv/lib/python3.10/site-packages/networkx/algorithms/centrality/__pycache__/voterank_alg.cpython-310.pyc ADDED
Binary file (2.91 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/betweenness.py ADDED
@@ -0,0 +1,435 @@
+"""Betweenness centrality measures."""
+from collections import deque
+from heapq import heappop, heappush
+from itertools import count
+
+import networkx as nx
+from networkx.algorithms.shortest_paths.weighted import _weight_function
+from networkx.utils import py_random_state
+from networkx.utils.decorators import not_implemented_for
+
+__all__ = ["betweenness_centrality", "edge_betweenness_centrality"]
+
+
+@py_random_state(5)
+@nx._dispatchable(edge_attrs="weight")
+def betweenness_centrality(
+    G, k=None, normalized=True, weight=None, endpoints=False, seed=None
+):
+    r"""Compute the shortest-path betweenness centrality for nodes.
+
+    Betweenness centrality of a node $v$ is the sum of the
+    fraction of all-pairs shortest paths that pass through $v$
+
+    .. math::
+
+       c_B(v) =\sum_{s,t \in V} \frac{\sigma(s, t|v)}{\sigma(s, t)}
+
+    where $V$ is the set of nodes, $\sigma(s, t)$ is the number of
+    shortest $(s, t)$-paths, and $\sigma(s, t|v)$ is the number of
+    those paths passing through some node $v$ other than $s, t$.
+    If $s = t$, $\sigma(s, t) = 1$, and if $v \in {s, t}$,
+    $\sigma(s, t|v) = 0$ [2]_.
+
+    Parameters
+    ----------
+    G : graph
+        A NetworkX graph.
+
+    k : int, optional (default=None)
+        If k is not None use k node samples to estimate betweenness.
+        The value of k <= n where n is the number of nodes in the graph.
+        Higher values give better approximation.
+
+    normalized : bool, optional
+        If True the betweenness values are normalized by `2/((n-1)(n-2))`
+        for graphs, and `1/((n-1)(n-2))` for directed graphs where `n`
+        is the number of nodes in G.
+
+    weight : None or string, optional (default=None)
+        If None, all edge weights are considered equal.
+        Otherwise holds the name of the edge attribute used as weight.
+        Weights are used to calculate weighted shortest paths, so they are
+        interpreted as distances.
+
+    endpoints : bool, optional
+        If True include the endpoints in the shortest path counts.
+
+    seed : integer, random_state, or None (default)
+        Indicator of random number generation state.
+        See :ref:`Randomness<randomness>`.
+        Note that this is only used if k is not None.
+
+    Returns
+    -------
+    nodes : dictionary
+        Dictionary of nodes with betweenness centrality as the value.
+
+    See Also
+    --------
+    edge_betweenness_centrality
+    load_centrality
+
+    Notes
+    -----
+    The algorithm is from Ulrik Brandes [1]_.
+    See [4]_ for the original first published version and [2]_ for details on
+    algorithms for variations and related metrics.
+
+    For approximate betweenness calculations set k=#samples to use
+    k nodes ("pivots") to estimate the betweenness values. For an estimate
+    of the number of pivots needed see [3]_.
+
+    For weighted graphs the edge weights must be greater than zero.
+    Zero edge weights can produce an infinite number of equal length
+    paths between pairs of nodes.
+
+    The total number of paths between source and target is counted
+    differently for directed and undirected graphs. Directed paths
+    are easy to count. Undirected paths are tricky: should a path
+    from "u" to "v" count as 1 undirected path or as 2 directed paths?
+
+    For betweenness_centrality we report the number of undirected
+    paths when G is undirected.
+
+    For betweenness_centrality_subset the reporting is different.
+    If the source and target subsets are the same, then we want
+    to count undirected paths. But if the source and target subsets
+    differ -- for example, if sources is {0} and targets is {1},
+    then we are only counting the paths in one direction. They are
+    undirected paths but we are counting them in a directed way.
+    To count them as undirected paths, each should count as half a path.
+
+    This algorithm is not guaranteed to be correct if edge weights
+    are floating point numbers. As a workaround you can use integer
+    numbers by multiplying the relevant edge attributes by a convenient
+    constant factor (e.g. 100) and converting to integers.
+
+    References
+    ----------
+    .. [1] Ulrik Brandes:
+       A Faster Algorithm for Betweenness Centrality.
+       Journal of Mathematical Sociology 25(2):163-177, 2001.
+       https://doi.org/10.1080/0022250X.2001.9990249
+    .. [2] Ulrik Brandes:
+       On Variants of Shortest-Path Betweenness
+       Centrality and their Generic Computation.
+       Social Networks 30(2):136-145, 2008.
+       https://doi.org/10.1016/j.socnet.2007.11.001
+    .. [3] Ulrik Brandes and Christian Pich:
+       Centrality Estimation in Large Networks.
+       International Journal of Bifurcation and Chaos 17(7):2303-2318, 2007.
+       https://dx.doi.org/10.1142/S0218127407018403
+    .. [4] Linton C. Freeman:
+       A set of measures of centrality based on betweenness.
+       Sociometry 40: 35-41, 1977
+       https://doi.org/10.2307/3033543
+    """
+    betweenness = dict.fromkeys(G, 0.0)  # b[v]=0 for v in G
+    if k is None:
+        nodes = G
+    else:
+        nodes = seed.sample(list(G.nodes()), k)
+    for s in nodes:
+        # single source shortest paths
+        if weight is None:  # use BFS
+            S, P, sigma, _ = _single_source_shortest_path_basic(G, s)
+        else:  # use Dijkstra's algorithm
+            S, P, sigma, _ = _single_source_dijkstra_path_basic(G, s, weight)
+        # accumulation
+        if endpoints:
+            betweenness, _ = _accumulate_endpoints(betweenness, S, P, sigma, s)
+        else:
+            betweenness, _ = _accumulate_basic(betweenness, S, P, sigma, s)
+    # rescaling
+    betweenness = _rescale(
+        betweenness,
+        len(G),
+        normalized=normalized,
+        directed=G.is_directed(),
+        k=k,
+        endpoints=endpoints,
+    )
+    return betweenness
+
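The function above follows Brandes' two-phase scheme: a shortest-path phase that records visit order `S`, predecessors `P`, and path counts `sigma`, then an accumulation phase back along `S`. A self-contained, stdlib-only sketch of the unweighted case, using a plain adjacency dict rather than a NetworkX graph (illustrative, not the library API):

```python
from collections import deque

def brandes_betweenness(adj):
    """Unnormalized shortest-path betweenness for an undirected graph
    given as {node: iterable_of_neighbors} (sketch of Brandes' algorithm)."""
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:
        # Phase 1 (BFS): visit order S, predecessors P, path counts sigma.
        S, P = [], {v: [] for v in adj}
        sigma, D = dict.fromkeys(adj, 0.0), {s: 0}
        sigma[s] = 1.0
        Q = deque([s])
        while Q:
            v = Q.popleft()
            S.append(v)
            for w in adj[v]:
                if w not in D:
                    D[w] = D[v] + 1
                    Q.append(w)
                if D[w] == D[v] + 1:  # w is one step farther: count paths
                    sigma[w] += sigma[v]
                    P[w].append(v)
        # Phase 2: accumulate dependencies in reverse BFS order.
        delta = dict.fromkeys(S, 0.0)
        while S:
            w = S.pop()
            for v in P[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # Each unordered pair was counted from both endpoints; halve, as the
    # unnormalized undirected branch of _rescale does with scale = 0.5.
    return {v: b / 2 for v, b in bc.items()}

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # path graph 0-1-2-3
print(brandes_betweenness(adj))  # {0: 0.0, 1: 2.0, 2: 2.0, 3: 0.0}
```

The interior nodes of the path each lie on two of the six unordered shortest paths, matching `normalized=False` output.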
+
+@py_random_state(4)
+@nx._dispatchable(edge_attrs="weight")
+def edge_betweenness_centrality(G, k=None, normalized=True, weight=None, seed=None):
+    r"""Compute betweenness centrality for edges.
+
+    Betweenness centrality of an edge $e$ is the sum of the
+    fraction of all-pairs shortest paths that pass through $e$
+
+    .. math::
+
+       c_B(e) =\sum_{s,t \in V} \frac{\sigma(s, t|e)}{\sigma(s, t)}
+
+    where $V$ is the set of nodes, $\sigma(s, t)$ is the number of
+    shortest $(s, t)$-paths, and $\sigma(s, t|e)$ is the number of
+    those paths passing through edge $e$ [2]_.
+
+    Parameters
+    ----------
+    G : graph
+        A NetworkX graph.
+
+    k : int, optional (default=None)
+        If k is not None use k node samples to estimate betweenness.
+        The value of k <= n where n is the number of nodes in the graph.
+        Higher values give better approximation.
+
+    normalized : bool, optional
+        If True the betweenness values are normalized by $2/(n(n-1))$
+        for graphs, and $1/(n(n-1))$ for directed graphs where $n$
+        is the number of nodes in G.
+
+    weight : None or string, optional (default=None)
+        If None, all edge weights are considered equal.
+        Otherwise holds the name of the edge attribute used as weight.
+        Weights are used to calculate weighted shortest paths, so they are
+        interpreted as distances.
+
+    seed : integer, random_state, or None (default)
+        Indicator of random number generation state.
+        See :ref:`Randomness<randomness>`.
+        Note that this is only used if k is not None.
+
+    Returns
+    -------
+    edges : dictionary
+        Dictionary of edges with betweenness centrality as the value.
+
+    See Also
+    --------
+    betweenness_centrality
+    edge_load
+
+    Notes
+    -----
+    The algorithm is from Ulrik Brandes [1]_.
+
+    For weighted graphs the edge weights must be greater than zero.
+    Zero edge weights can produce an infinite number of equal length
+    paths between pairs of nodes.
+
+    References
+    ----------
+    .. [1] A Faster Algorithm for Betweenness Centrality. Ulrik Brandes,
+       Journal of Mathematical Sociology 25(2):163-177, 2001.
+       https://doi.org/10.1080/0022250X.2001.9990249
+    .. [2] Ulrik Brandes: On Variants of Shortest-Path Betweenness
+       Centrality and their Generic Computation.
+       Social Networks 30(2):136-145, 2008.
+       https://doi.org/10.1016/j.socnet.2007.11.001
+    """
+    betweenness = dict.fromkeys(G, 0.0)  # b[v]=0 for v in G
+    # b[e]=0 for e in G.edges()
+    betweenness.update(dict.fromkeys(G.edges(), 0.0))
+    if k is None:
+        nodes = G
+    else:
+        nodes = seed.sample(list(G.nodes()), k)
+    for s in nodes:
+        # single source shortest paths
+        if weight is None:  # use BFS
+            S, P, sigma, _ = _single_source_shortest_path_basic(G, s)
+        else:  # use Dijkstra's algorithm
+            S, P, sigma, _ = _single_source_dijkstra_path_basic(G, s, weight)
+        # accumulation
+        betweenness = _accumulate_edges(betweenness, S, P, sigma, s)
+    # rescaling
+    for n in G:  # remove nodes to only return edges
+        del betweenness[n]
+    betweenness = _rescale_e(
+        betweenness, len(G), normalized=normalized, directed=G.is_directed()
+    )
+    if G.is_multigraph():
+        betweenness = _add_edge_keys(G, betweenness, weight=weight)
+    return betweenness
+
+
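The $2/(n(n-1))$ normalization from the docstring can be checked by hand. A plain-Python sketch, with the per-edge shortest-path counts for a small path graph worked out in the comments (the graph and numbers are illustrative, not from the source):

```python
# Unnormalized edge-betweenness counts for the path graph 0-1-2-3,
# counting each unordered (s, t) pair once:
#   edge (0, 1) carries pairs (0,1), (0,2), (0,3)         -> 3
#   edge (1, 2) carries pairs (0,2), (0,3), (1,2), (1,3)  -> 4
#   edge (2, 3) carries pairs (0,3), (1,3), (2,3)         -> 3
raw = {(0, 1): 3.0, (1, 2): 4.0, (2, 3): 3.0}
n = 4
scale = 2 / (n * (n - 1))  # undirected normalization from the docstring
normalized = {e: b * scale for e, b in raw.items()}
print(normalized[(1, 2)])  # the central edge scores highest (2/3)
```

The outer edges come out at 0.5 and the central edge at 2/3, which is what the undirected, normalized variant should report for this graph.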
+# helpers for betweenness centrality
+
+
+def _single_source_shortest_path_basic(G, s):
+    S = []
+    P = {}
+    for v in G:
+        P[v] = []
+    sigma = dict.fromkeys(G, 0.0)  # sigma[v]=0 for v in G
+    D = {}
+    sigma[s] = 1.0
+    D[s] = 0
+    Q = deque([s])
+    while Q:  # use BFS to find shortest paths
+        v = Q.popleft()
+        S.append(v)
+        Dv = D[v]
+        sigmav = sigma[v]
+        for w in G[v]:
+            if w not in D:
+                Q.append(w)
+                D[w] = Dv + 1
+            if D[w] == Dv + 1:  # this is a shortest path, count paths
+                sigma[w] += sigmav
+                P[w].append(v)  # predecessors
+    return S, P, sigma, D
+
+
+def _single_source_dijkstra_path_basic(G, s, weight):
+    weight = _weight_function(G, weight)
+    # modified from Eppstein
+    S = []
+    P = {}
+    for v in G:
+        P[v] = []
+    sigma = dict.fromkeys(G, 0.0)  # sigma[v]=0 for v in G
+    D = {}
+    sigma[s] = 1.0
+    push = heappush
+    pop = heappop
+    seen = {s: 0}
+    c = count()
+    Q = []  # use Q as heap with (distance,node id) tuples
+    push(Q, (0, next(c), s, s))
+    while Q:
+        (dist, _, pred, v) = pop(Q)
+        if v in D:
+            continue  # already searched this node.
+        sigma[v] += sigma[pred]  # count paths
+        S.append(v)
+        D[v] = dist
+        for w, edgedata in G[v].items():
+            vw_dist = dist + weight(v, w, edgedata)
+            if w not in D and (w not in seen or vw_dist < seen[w]):
+                seen[w] = vw_dist
+                push(Q, (vw_dist, next(c), v, w))
+                sigma[w] = 0.0
+                P[w] = [v]
+            elif vw_dist == seen[w]:  # handle equal paths
+                sigma[w] += sigma[v]
+                P[w].append(v)
+    return S, P, sigma, D
+
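Note that the Dijkstra helper defers path counting to pop time (`sigma[v] += sigma[pred]`), so the source itself picks up a second count when it is popped and the absolute `sigma` values appear uniformly scaled; that is harmless because the accumulation phase only uses ratios of `sigma`. A simplified stand-alone sketch that instead counts at relaxation time, giving the literal path counts (assumes strictly positive weights, as the docstrings require, and a `{node: {neighbor: weight}}` adjacency dict):

```python
from heapq import heappush, heappop
from itertools import count

def dijkstra_sigma(adj, s):
    """Distances and shortest-path counts sigma[v] from source s for a
    positively weighted graph given as {node: {neighbor: weight}}."""
    sigma = dict.fromkeys(adj, 0.0)
    sigma[s] = 1.0
    dist, seen, c = {}, {s: 0}, count()
    Q = [(0, next(c), s)]
    while Q:
        d, _, v = heappop(Q)
        if v in dist:
            continue  # already finalized
        dist[v] = d
        for w, wt in adj[v].items():
            vw = d + wt
            if w not in seen or vw < seen[w]:
                seen[w] = vw
                sigma[w] = sigma[v]    # strictly shorter route found: reset
                heappush(Q, (vw, next(c), w))
            elif vw == seen[w] and w not in dist:
                sigma[w] += sigma[v]   # another equally short route
    return sigma, dist

# 4-cycle with unit weights: two equally short routes from 0 to 2
adj = {0: {1: 1, 3: 1}, 1: {0: 1, 2: 1}, 2: {1: 1, 3: 1}, 3: {0: 1, 2: 1}}
sigma, dist = dijkstra_sigma(adj, 0)
print(sigma[2], dist[2])  # 2.0 2
```

The `(distance, counter, node)` heap entries mirror the helper's use of `itertools.count` to break distance ties without comparing node objects.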
+
+def _accumulate_basic(betweenness, S, P, sigma, s):
+    delta = dict.fromkeys(S, 0)
+    while S:
+        w = S.pop()
+        coeff = (1 + delta[w]) / sigma[w]
+        for v in P[w]:
+            delta[v] += sigma[v] * coeff
+        if w != s:
+            betweenness[w] += delta[w]
+    return betweenness, delta
+
+
+def _accumulate_endpoints(betweenness, S, P, sigma, s):
+    betweenness[s] += len(S) - 1
+    delta = dict.fromkeys(S, 0)
+    while S:
+        w = S.pop()
+        coeff = (1 + delta[w]) / sigma[w]
+        for v in P[w]:
+            delta[v] += sigma[v] * coeff
+        if w != s:
+            betweenness[w] += delta[w] + 1
+    return betweenness, delta
+
+
+def _accumulate_edges(betweenness, S, P, sigma, s):
+    delta = dict.fromkeys(S, 0)
+    while S:
+        w = S.pop()
+        coeff = (1 + delta[w]) / sigma[w]
+        for v in P[w]:
+            c = sigma[v] * coeff
+            if (v, w) not in betweenness:
+                betweenness[(w, v)] += c
+            else:
+                betweenness[(v, w)] += c
+            delta[v] += c
+        if w != s:
+            betweenness[w] += delta[w]
+    return betweenness
+
+
+def _rescale(betweenness, n, normalized, directed=False, k=None, endpoints=False):
+    if normalized:
+        if endpoints:
+            if n < 2:
+                scale = None  # no normalization
+            else:
+                # Scale factor should include endpoint nodes
+                scale = 1 / (n * (n - 1))
+        elif n <= 2:
+            scale = None  # no normalization b=0 for all nodes
+        else:
+            scale = 1 / ((n - 1) * (n - 2))
+    else:  # rescale by 2 for undirected graphs
+        if not directed:
+            scale = 0.5
+        else:
+            scale = None
+    if scale is not None:
+        if k is not None:
+            scale = scale * n / k
+        for v in betweenness:
+            betweenness[v] *= scale
+    return betweenness
+
+
+def _rescale_e(betweenness, n, normalized, directed=False, k=None):
+    if normalized:
+        if n <= 1:
+            scale = None  # no normalization b=0 for all nodes
+        else:
+            scale = 1 / (n * (n - 1))
+    else:  # rescale by 2 for undirected graphs
+        if not directed:
+            scale = 0.5
+        else:
+            scale = None
+    if scale is not None:
+        if k is not None:
+            scale = scale * n / k
+        for v in betweenness:
+            betweenness[v] *= scale
+    return betweenness
+
+
+@not_implemented_for("graph")
+def _add_edge_keys(G, betweenness, weight=None):
+    r"""Adds the corrected betweenness centrality (BC) values for multigraphs.
+
+    Parameters
+    ----------
+    G : NetworkX graph.
+
+    betweenness : dictionary
+        Dictionary mapping adjacent node tuples to betweenness centrality values.
+
+    weight : string or function
+        See `_weight_function` for details. Defaults to `None`.
+
+    Returns
+    -------
+    edges : dictionary
+        The parameter `betweenness` including edges with keys and their
+        betweenness centrality values.
+
+    The BC value is divided among edges of equal weight.
+    """
+    _weight = _weight_function(G, weight)
+
+    edge_bc = dict.fromkeys(G.edges, 0.0)
+    for u, v in betweenness:
+        d = G[u][v]
+        wt = _weight(u, v, d)
+        keys = [k for k in d if _weight(u, v, {k: d[k]}) == wt]
+        bc = betweenness[(u, v)] / len(keys)
+        for k in keys:
+            edge_bc[(u, v, k)] = bc
+
+    return edge_bc
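`_add_edge_keys` resolves a multigraph subtlety: `_weight_function` reports the minimum weight across parallel edges, so the `(u, v)` betweenness value is split evenly among the keys attaining that minimum while heavier parallel edges get nothing. A small stand-alone sketch of just that splitting rule (the helper name and data are hypothetical, for illustration only):

```python
def split_among_min_weight(edge_value, key_weights):
    """Divide one (u, v) betweenness value among the parallel keys of a
    multigraph edge that attain the minimum weight (cf. _add_edge_keys)."""
    wmin = min(key_weights.values())
    keys = [k for k, w in key_weights.items() if w == wmin]
    return {k: edge_value / len(keys) for k in keys}

# Three parallel edges between the same node pair; two tie at weight 1.
print(split_among_min_weight(0.5, {0: 1.0, 1: 1.0, 2: 5.0}))
# {0: 0.25, 1: 0.25} -- the weight-5 edge carries no shortest paths
```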
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/current_flow_betweenness_subset.py ADDED
@@ -0,0 +1,226 @@
+"""Current-flow betweenness centrality measures for subsets of nodes."""
+import networkx as nx
+from networkx.algorithms.centrality.flow_matrix import flow_matrix_row
+from networkx.utils import not_implemented_for, reverse_cuthill_mckee_ordering
+
+__all__ = [
+    "current_flow_betweenness_centrality_subset",
+    "edge_current_flow_betweenness_centrality_subset",
+]
+
+
+@not_implemented_for("directed")
+@nx._dispatchable(edge_attrs="weight")
+def current_flow_betweenness_centrality_subset(
+    G, sources, targets, normalized=True, weight=None, dtype=float, solver="lu"
+):
+    r"""Compute current-flow betweenness centrality for subsets of nodes.
+
+    Current-flow betweenness centrality uses an electrical current
+    model for information spreading in contrast to betweenness
+    centrality which uses shortest paths.
+
+    Current-flow betweenness centrality is also known as
+    random-walk betweenness centrality [2]_.
+
+    Parameters
+    ----------
+    G : graph
+        A NetworkX graph
+
+    sources: list of nodes
+        Nodes to use as sources for current
+
+    targets: list of nodes
+        Nodes to use as sinks for current
+
+    normalized : bool, optional (default=True)
+        If True the betweenness values are normalized by b=b/(n-1)(n-2) where
+        n is the number of nodes in G.
+
+    weight : string or None, optional (default=None)
+        Key for edge data used as the edge weight.
+        If None, then use 1 as each edge weight.
+        The weight reflects the capacity or the strength of the
+        edge.
+
+    dtype: data type (float)
+        Default data type for internal matrices.
+        Set to np.float32 for lower memory consumption.
+
+    solver: string (default='lu')
+        Type of linear solver to use for computing the flow matrix.
+        Options are "full" (uses most memory), "lu" (recommended), and
+        "cg" (uses least memory).
+
+    Returns
+    -------
+    nodes : dictionary
+        Dictionary of nodes with betweenness centrality as the value.
+
+    See Also
+    --------
+    approximate_current_flow_betweenness_centrality
+    betweenness_centrality
+    edge_betweenness_centrality
+    edge_current_flow_betweenness_centrality
+
+    Notes
+    -----
+    Current-flow betweenness can be computed in $O(I(n-1)+mn \log n)$
+    time [1]_, where $I(n-1)$ is the time needed to compute the
+    inverse Laplacian. For a full matrix this is $O(n^3)$ but using
+    sparse methods you can achieve $O(nm{\sqrt k})$ where $k$ is the
+    Laplacian matrix condition number.
+
+    The space required is $O(nw)$ where $w$ is the width of the sparse
+    Laplacian matrix. Worst case is $w=n$ for $O(n^2)$.
+
+    If the edges have a 'weight' attribute they will be used as
+    weights in this algorithm. Unspecified weights are set to 1.
+
+    References
+    ----------
+    .. [1] Centrality Measures Based on Current Flow.
+       Ulrik Brandes and Daniel Fleischer,
+       Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05).
+       LNCS 3404, pp. 533-544. Springer-Verlag, 2005.
+       https://doi.org/10.1007/978-3-540-31856-9_44
+
+    .. [2] A measure of betweenness centrality based on random walks,
+       M. E. J. Newman, Social Networks 27, 39-54 (2005).
+    """
+    import numpy as np
+
+    from networkx.utils import reverse_cuthill_mckee_ordering
+
+    if not nx.is_connected(G):
+        raise nx.NetworkXError("Graph not connected.")
+    N = G.number_of_nodes()
+    ordering = list(reverse_cuthill_mckee_ordering(G))
+    # make a copy with integer labels according to rcm ordering
+    # this could be done without a copy if we really wanted to
+    mapping = dict(zip(ordering, range(N)))
+    H = nx.relabel_nodes(G, mapping)
+    betweenness = dict.fromkeys(H, 0.0)  # b[n]=0 for n in H
+    for row, (s, t) in flow_matrix_row(H, weight=weight, dtype=dtype, solver=solver):
+        for ss in sources:
+            i = mapping[ss]
+            for tt in targets:
+                j = mapping[tt]
+                betweenness[s] += 0.5 * abs(row.item(i) - row.item(j))
+                betweenness[t] += 0.5 * abs(row.item(i) - row.item(j))
+    if normalized:
+        nb = (N - 1.0) * (N - 2.0)  # normalization factor
+    else:
+        nb = 2.0
+    for node in H:
+        betweenness[node] = betweenness[node] / nb + 1.0 / (2 - N)
+    return {ordering[node]: value for node, value in betweenness.items()}
+
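The electrical model behind this function can be sketched densely with numpy for a single source/target pair: solve the Laplacian system for node potentials, read edge currents off potential differences, and credit each node with half the absolute current through it. This is an illustrative sketch only; the real code solves row by row via `flow_matrix_row` with the chosen solver, and further normalizes and averages over source/target pairs.

```python
import numpy as np

def pair_current_flow(A, s, t):
    """Per-node current throughput when one unit of current enters at
    node s and leaves at node t; A is a dense symmetric adjacency matrix."""
    n = len(A)
    L = np.diag(A.sum(axis=1)) - A            # graph Laplacian
    b = np.zeros(n)
    b[s], b[t] = 1.0, -1.0                    # unit supply and demand
    p = np.linalg.pinv(L) @ b                 # node potentials (L is singular)
    current = A * (p[:, None] - p[None, :])   # edge currents I_vw
    return 0.5 * np.abs(current).sum(axis=1)  # half the absolute flow at v

# Path graph 0-1-2-3 with current from 0 to 3: the full unit current
# crosses both interior nodes; each endpoint sees only its half share.
A = np.array(
    [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float
)
through = pair_current_flow(A, 0, 3)
print(through)  # approximately [0.5, 1.0, 1.0, 0.5]
```

On a tree there is a single route, so current flow and shortest-path betweenness agree; the two measures diverge once cycles offer alternative routes for the current to spread over.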
+
+@not_implemented_for("directed")
+@nx._dispatchable(edge_attrs="weight")
+def edge_current_flow_betweenness_centrality_subset(
+    G, sources, targets, normalized=True, weight=None, dtype=float, solver="lu"
+):
+    r"""Compute current-flow betweenness centrality for edges using subsets
+    of nodes.
+
+    Current-flow betweenness centrality uses an electrical current
+    model for information spreading in contrast to betweenness
+    centrality which uses shortest paths.
+
+    Current-flow betweenness centrality is also known as
+    random-walk betweenness centrality [2]_.
+
+    Parameters
+    ----------
+    G : graph
+        A NetworkX graph
+
+    sources: list of nodes
+        Nodes to use as sources for current
+
+    targets: list of nodes
+        Nodes to use as sinks for current
+
+    normalized : bool, optional (default=True)
+        If True the betweenness values are normalized by b=b/(n-1)(n-2) where
+        n is the number of nodes in G.
+
+    weight : string or None, optional (default=None)
+        Key for edge data used as the edge weight.
+        If None, then use 1 as each edge weight.
+        The weight reflects the capacity or the strength of the
+        edge.
+
+    dtype: data type (float)
+        Default data type for internal matrices.
+        Set to np.float32 for lower memory consumption.
+
+    solver: string (default='lu')
+        Type of linear solver to use for computing the flow matrix.
+        Options are "full" (uses most memory), "lu" (recommended), and
+        "cg" (uses least memory).
+
+    Returns
+    -------
+    nodes : dict
+        Dictionary of edge tuples with betweenness centrality as the value.
+
+    See Also
+    --------
+    betweenness_centrality
+    edge_betweenness_centrality
+    current_flow_betweenness_centrality
+
+    Notes
+    -----
+    Current-flow betweenness can be computed in $O(I(n-1)+mn \log n)$
+    time [1]_, where $I(n-1)$ is the time needed to compute the
+    inverse Laplacian. For a full matrix this is $O(n^3)$ but using
+    sparse methods you can achieve $O(nm{\sqrt k})$ where $k$ is the
+    Laplacian matrix condition number.
+
+    The space required is $O(nw)$ where $w$ is the width of the sparse
+    Laplacian matrix. Worst case is $w=n$ for $O(n^2)$.
+
+    If the edges have a 'weight' attribute they will be used as
+    weights in this algorithm. Unspecified weights are set to 1.
+
+    References
+    ----------
+    .. [1] Centrality Measures Based on Current Flow.
+       Ulrik Brandes and Daniel Fleischer,
+       Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05).
+       LNCS 3404, pp. 533-544. Springer-Verlag, 2005.
+       https://doi.org/10.1007/978-3-540-31856-9_44
+
+    .. [2] A measure of betweenness centrality based on random walks,
+       M. E. J. Newman, Social Networks 27, 39-54 (2005).
+    """
+    import numpy as np
+
+    if not nx.is_connected(G):
+        raise nx.NetworkXError("Graph not connected.")
+    N = G.number_of_nodes()
+    ordering = list(reverse_cuthill_mckee_ordering(G))
+    # make a copy with integer labels according to rcm ordering
+    # this could be done without a copy if we really wanted to
+    mapping = dict(zip(ordering, range(N)))
+    H = nx.relabel_nodes(G, mapping)
+    edges = (tuple(sorted((u, v))) for u, v in H.edges())
+    betweenness = dict.fromkeys(edges, 0.0)
+    if normalized:
+        nb = (N - 1.0) * (N - 2.0)  # normalization factor
+    else:
+        nb = 2.0
+    for row, (e) in flow_matrix_row(H, weight=weight, dtype=dtype, solver=solver):
+        for ss in sources:
+            i = mapping[ss]
+            for tt in targets:
+                j = mapping[tt]
+                betweenness[e] += 0.5 * abs(row.item(i) - row.item(j))
+ betweenness[e] /= nb
226
+ return {(ordering[s], ordering[t]): value for (s, t), value in betweenness.items()}
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/dispersion.py ADDED
@@ -0,0 +1,107 @@
+from itertools import combinations
+
+import networkx as nx
+
+__all__ = ["dispersion"]
+
+
+@nx._dispatchable
+def dispersion(G, u=None, v=None, normalized=True, alpha=1.0, b=0.0, c=0.0):
+    r"""Calculate dispersion between `u` and `v` in `G`.
+
+    A link between two actors (`u` and `v`) has a high dispersion when their
+    mutual ties (`s` and `t`) are not well connected with each other.
+
+    Parameters
+    ----------
+    G : graph
+        A NetworkX graph.
+    u : node, optional
+        The source for the dispersion score (e.g. ego node of the network).
+    v : node, optional
+        The target of the dispersion score if specified.
+    normalized : bool
+        If True (default) normalize by the embeddedness of the nodes (u and v).
+    alpha, b, c : float
+        Parameters for the normalization procedure. When `normalized` is True,
+        the dispersion value is normalized by::
+
+            result = ((dispersion + b) ** alpha) / (embeddedness + c)
+
+        as long as the denominator is nonzero.
+
+    Returns
+    -------
+    nodes : dictionary
+        If u (v) is specified, returns a dictionary of nodes with dispersion
+        score for all "target" ("source") nodes. If neither u nor v is
+        specified, returns a dictionary of dictionaries for all nodes 'u' in the
+        graph with a dispersion score for each node 'v'.
+
+    Notes
+    -----
+    This implementation follows Lars Backstrom and Jon Kleinberg [1]_. Typical
+    usage would be to run dispersion on the ego network $G_u$ if $u$ were
+    specified. Running :func:`dispersion` with neither $u$ nor $v$ specified
+    can take some time to complete.
+
+    References
+    ----------
+    .. [1] Romantic Partnerships and the Dispersion of Social Ties:
+       A Network Analysis of Relationship Status on Facebook.
+       Lars Backstrom, Jon Kleinberg.
+       https://arxiv.org/pdf/1310.6753v1.pdf
+
+    """
+
+    def _dispersion(G_u, u, v):
+        """dispersion for all nodes 'v' in an ego network G_u of node 'u'"""
+        u_nbrs = set(G_u[u])
+        ST = {n for n in G_u[v] if n in u_nbrs}
+        set_uv = {u, v}
+        # all possible ties of connections that u and v share
+        possib = combinations(ST, 2)
+        total = 0
+        for s, t in possib:
+            # neighbors of s that are in G_u, not including u and v
+            nbrs_s = u_nbrs.intersection(G_u[s]) - set_uv
+            # s and t are not directly connected
+            if t not in nbrs_s:
+                # s and t do not share a connection
+                if nbrs_s.isdisjoint(G_u[t]):
+                    # tick for disp(u, v)
+                    total += 1
+        # neighbors that u and v share
+        embeddedness = len(ST)
+
+        dispersion_val = total
+        if normalized:
+            dispersion_val = (total + b) ** alpha
+            if embeddedness + c != 0:
+                dispersion_val /= embeddedness + c
+
+        return dispersion_val
+
+    if u is None:
+        # v and u are not specified
+        if v is None:
+            results = {n: {} for n in G}
+            for u in G:
+                for v in G[u]:
+                    results[u][v] = _dispersion(G, u, v)
+        # u is not specified, but v is
+        else:
+            results = dict.fromkeys(G[v], {})
+            for u in G[v]:
+                results[u] = _dispersion(G, v, u)
+    else:
+        # u is specified with no target v
+        if v is None:
+            results = dict.fromkeys(G[u], {})
+            for v in G[u]:
+                results[v] = _dispersion(G, u, v)
+        # both u and v are specified
+        else:
+            results = _dispersion(G, u, v)
+
+    return results
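As a quick sanity check of the `_dispersion` helper above: the core of the computation is a few set operations over the common neighbors of `u` and `v`. The following stdlib-only sketch (a hypothetical toy graph as a plain adjacency dict, not code from this module) counts the same "far apart" pairs `disp(u, v)`:

```python
from itertools import combinations

# Hypothetical toy graph: u and v share three mutual ties (s, t, w)
# that are pairwise unconnected, so disp(u, v) should be 3.
adj = {
    "u": {"v", "s", "t", "w"},
    "v": {"u", "s", "t", "w"},
    "s": {"u", "v"},
    "t": {"u", "v"},
    "w": {"u", "v"},
}


def dispersion_uv(adj, u, v):
    """Count pairs of common neighbors of u and v that are neither
    adjacent nor linked through another common connector."""
    common = adj[u] & adj[v]  # mutual ties of u and v
    exclude = {u, v}
    total = 0
    for s, t in combinations(common, 2):
        nbrs_s = (adj[u] & adj[s]) - exclude
        # count the pair only if s and t are not adjacent and
        # share no connector outside {u, v}
        if t not in adj[s] and nbrs_s.isdisjoint(adj[t]):
            total += 1
    return total
```

With the toy data above, `dispersion_uv(adj, "u", "v")` returns 3; normalizing by the embeddedness (3 common neighbors) would give 1.0, as in the `normalized=True` branch of the function.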
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/eigenvector.py ADDED
@@ -0,0 +1,341 @@
+"""Functions for computing eigenvector centrality."""
+import math
+
+import networkx as nx
+from networkx.utils import not_implemented_for
+
+__all__ = ["eigenvector_centrality", "eigenvector_centrality_numpy"]
+
+
+@not_implemented_for("multigraph")
+@nx._dispatchable(edge_attrs="weight")
+def eigenvector_centrality(G, max_iter=100, tol=1.0e-6, nstart=None, weight=None):
+    r"""Compute the eigenvector centrality for the graph G.
+
+    Eigenvector centrality computes the centrality for a node by adding
+    the centrality of its predecessors. The centrality for node $i$ is the
+    $i$-th element of a left eigenvector associated with the eigenvalue $\lambda$
+    of maximum modulus that is positive. Such an eigenvector $x$ is
+    defined up to a multiplicative constant by the equation
+
+    .. math::
+
+         \lambda x^T = x^T A,
+
+    where $A$ is the adjacency matrix of the graph G. By definition of
+    row-column product, the equation above is equivalent to
+
+    .. math::
+
+        \lambda x_i = \sum_{j\to i}x_j.
+
+    That is, adding the eigenvector centralities of the predecessors of
+    $i$ one obtains the eigenvector centrality of $i$ multiplied by
+    $\lambda$. In the case of undirected graphs, $x$ also solves the familiar
+    right-eigenvector equation $Ax = \lambda x$.
+
+    By virtue of the Perron–Frobenius theorem [1]_, if G is strongly
+    connected there is a unique eigenvector $x$, and all its entries
+    are strictly positive.
+
+    If G is not strongly connected there might be several left
+    eigenvectors associated with $\lambda$, and some of their elements
+    might be zero.
+
+    Parameters
+    ----------
+    G : graph
+      A networkx graph.
+
+    max_iter : integer, optional (default=100)
+      Maximum number of power iterations.
+
+    tol : float, optional (default=1.0e-6)
+      Error tolerance (in Euclidean norm) used to check convergence in
+      power iteration.
+
+    nstart : dictionary, optional (default=None)
+      Starting value of power iteration for each node. Must have a nonzero
+      projection on the desired eigenvector for the power method to converge.
+      If None, this implementation uses an all-ones vector, which is a safe
+      choice.
+
+    weight : None or string, optional (default=None)
+      If None, all edge weights are considered equal. Otherwise holds the
+      name of the edge attribute used as weight. In this measure the
+      weight is interpreted as the connection strength.
+
+    Returns
+    -------
+    nodes : dictionary
+      Dictionary of nodes with eigenvector centrality as the value. The
+      associated vector has unit Euclidean norm and the values are
+      nonnegative.
+
+    Examples
+    --------
+    >>> G = nx.path_graph(4)
+    >>> centrality = nx.eigenvector_centrality(G)
+    >>> sorted((v, f"{c:0.2f}") for v, c in centrality.items())
+    [(0, '0.37'), (1, '0.60'), (2, '0.60'), (3, '0.37')]
+
+    Raises
+    ------
+    NetworkXPointlessConcept
+      If the graph G is the null graph.
+
+    NetworkXError
+      If each value in `nstart` is zero.
+
+    PowerIterationFailedConvergence
+      If the algorithm fails to converge to the specified tolerance
+      within the specified number of iterations of the power iteration
+      method.
+
+    See Also
+    --------
+    eigenvector_centrality_numpy
+    :func:`~networkx.algorithms.link_analysis.pagerank_alg.pagerank`
+    :func:`~networkx.algorithms.link_analysis.hits_alg.hits`
+
+    Notes
+    -----
+    Eigenvector centrality was introduced by Landau [2]_ for chess
+    tournaments. It was later rediscovered by Wei [3]_ and then
+    popularized by Kendall [4]_ in the context of sport ranking. Berge
+    introduced a general definition for graphs based on social connections
+    [5]_. Bonacich [6]_ reintroduced eigenvector centrality and made
+    it popular in link analysis.
+
+    This function computes the left dominant eigenvector, which corresponds
+    to adding the centrality of predecessors: this is the usual approach.
+    To add the centrality of successors first reverse the graph with
+    ``G.reverse()``.
+
+    The implementation uses power iteration [7]_ to compute a dominant
+    eigenvector starting from the provided vector `nstart`. Convergence is
+    guaranteed as long as `nstart` has a nonzero projection on a dominant
+    eigenvector, which certainly happens using the default value.
+
+    The method stops when the change in the computed vector between two
+    iterations is smaller than an error tolerance of ``G.number_of_nodes()
+    * tol`` or after ``max_iter`` iterations, but in the second case it
+    raises an exception.
+
+    This implementation uses $(A + I)$ rather than the adjacency matrix
+    $A$ because the change preserves eigenvectors, but it shifts the
+    spectrum, thus guaranteeing convergence even for networks with
+    negative eigenvalues of maximum modulus.
+
+    References
+    ----------
+    .. [1] Abraham Berman and Robert J. Plemmons.
+       "Nonnegative Matrices in the Mathematical Sciences."
+       Classics in Applied Mathematics. SIAM, 1994.
+
+    .. [2] Edmund Landau.
+       "Zur relativen Wertbemessung der Turnierresultate."
+       Deutsches Wochenschach, 11:366–369, 1895.
+
+    .. [3] Teh-Hsing Wei.
+       "The Algebraic Foundations of Ranking Theory."
+       PhD thesis, University of Cambridge, 1952.
+
+    .. [4] Maurice G. Kendall.
+       "Further contributions to the theory of paired comparisons."
+       Biometrics, 11(1):43–62, 1955.
+       https://www.jstor.org/stable/3001479
+
+    .. [5] Claude Berge
+       "Théorie des graphes et ses applications."
+       Dunod, Paris, France, 1958.
+
+    .. [6] Phillip Bonacich.
+       "Technique for analyzing overlapping memberships."
+       Sociological Methodology, 4:176–185, 1972.
+       https://www.jstor.org/stable/270732
+
+    .. [7] Power iteration: https://en.wikipedia.org/wiki/Power_iteration
+
+    """
+    if len(G) == 0:
+        raise nx.NetworkXPointlessConcept(
+            "cannot compute centrality for the null graph"
+        )
+    # If no initial vector is provided, start with the all-ones vector.
+    if nstart is None:
+        nstart = {v: 1 for v in G}
+    if all(v == 0 for v in nstart.values()):
+        raise nx.NetworkXError("initial vector cannot have all zero values")
+    # Normalize the initial vector so that each entry is in [0, 1]. This is
+    # guaranteed to never have a divide-by-zero error by the previous line.
+    nstart_sum = sum(nstart.values())
+    x = {k: v / nstart_sum for k, v in nstart.items()}
+    nnodes = G.number_of_nodes()
+    # make up to max_iter iterations
+    for _ in range(max_iter):
+        xlast = x
+        x = xlast.copy()  # Start with xlast times I to iterate with (A+I)
+        # do the multiplication y^T = x^T A (left eigenvector)
+        for n in x:
+            for nbr in G[n]:
+                w = G[n][nbr].get(weight, 1) if weight else 1
+                x[nbr] += xlast[n] * w
+        # Normalize the vector. The normalization denominator `norm`
+        # should never be zero by the Perron--Frobenius
+        # theorem. However, in case it is due to numerical error, we
+        # assume the norm to be one instead.
+        norm = math.hypot(*x.values()) or 1
+        x = {k: v / norm for k, v in x.items()}
+        # Check for convergence (in the L_1 norm).
+        if sum(abs(x[n] - xlast[n]) for n in x) < nnodes * tol:
+            return x
+    raise nx.PowerIterationFailedConvergence(max_iter)
+
+
+@nx._dispatchable(edge_attrs="weight")
+def eigenvector_centrality_numpy(G, weight=None, max_iter=50, tol=0):
+    r"""Compute the eigenvector centrality for the graph G.
+
+    Eigenvector centrality computes the centrality for a node by adding
+    the centrality of its predecessors. The centrality for node $i$ is the
+    $i$-th element of a left eigenvector associated with the eigenvalue $\lambda$
+    of maximum modulus that is positive. Such an eigenvector $x$ is
+    defined up to a multiplicative constant by the equation
+
+    .. math::
+
+         \lambda x^T = x^T A,
+
+    where $A$ is the adjacency matrix of the graph G. By definition of
+    row-column product, the equation above is equivalent to
+
+    .. math::
+
+        \lambda x_i = \sum_{j\to i}x_j.
+
+    That is, adding the eigenvector centralities of the predecessors of
+    $i$ one obtains the eigenvector centrality of $i$ multiplied by
+    $\lambda$. In the case of undirected graphs, $x$ also solves the familiar
+    right-eigenvector equation $Ax = \lambda x$.
+
+    By virtue of the Perron–Frobenius theorem [1]_, if G is strongly
+    connected there is a unique eigenvector $x$, and all its entries
+    are strictly positive.
+
+    If G is not strongly connected there might be several left
+    eigenvectors associated with $\lambda$, and some of their elements
+    might be zero.
+
+    Parameters
+    ----------
+    G : graph
+      A networkx graph.
+
+    max_iter : integer, optional (default=50)
+      Maximum number of Arnoldi update iterations allowed.
+
+    tol : float, optional (default=0)
+      Relative accuracy for eigenvalues (stopping criterion).
+      The default value of 0 implies machine precision.
+
+    weight : None or string, optional (default=None)
+      If None, all edge weights are considered equal. Otherwise holds the
+      name of the edge attribute used as weight. In this measure the
+      weight is interpreted as the connection strength.
+
+    Returns
+    -------
+    nodes : dictionary
+      Dictionary of nodes with eigenvector centrality as the value. The
+      associated vector has unit Euclidean norm and the values are
+      nonnegative.
+
+    Examples
+    --------
+    >>> G = nx.path_graph(4)
+    >>> centrality = nx.eigenvector_centrality_numpy(G)
+    >>> print([f"{node} {centrality[node]:0.2f}" for node in centrality])
+    ['0 0.37', '1 0.60', '2 0.60', '3 0.37']
+
+    Raises
+    ------
+    NetworkXPointlessConcept
+      If the graph G is the null graph.
+
+    ArpackNoConvergence
+      When the requested convergence is not obtained. The currently
+      converged eigenvalues and eigenvectors can be found as
+      eigenvalues and eigenvectors attributes of the exception object.
+
+    See Also
+    --------
+    :func:`scipy.sparse.linalg.eigs`
+    eigenvector_centrality
+    :func:`~networkx.algorithms.link_analysis.pagerank_alg.pagerank`
+    :func:`~networkx.algorithms.link_analysis.hits_alg.hits`
+
+    Notes
+    -----
+    Eigenvector centrality was introduced by Landau [2]_ for chess
+    tournaments. It was later rediscovered by Wei [3]_ and then
+    popularized by Kendall [4]_ in the context of sport ranking. Berge
+    introduced a general definition for graphs based on social connections
+    [5]_. Bonacich [6]_ reintroduced eigenvector centrality and made
+    it popular in link analysis.
+
+    This function computes the left dominant eigenvector, which corresponds
+    to adding the centrality of predecessors: this is the usual approach.
+    To add the centrality of successors first reverse the graph with
+    ``G.reverse()``.
+
+    This implementation uses the
+    :func:`SciPy sparse eigenvalue solver<scipy.sparse.linalg.eigs>` (ARPACK)
+    to find the largest eigenvalue/eigenvector pair using Arnoldi iterations
+    [7]_.
+
+    References
+    ----------
+    .. [1] Abraham Berman and Robert J. Plemmons.
+       "Nonnegative Matrices in the Mathematical Sciences."
+       Classics in Applied Mathematics. SIAM, 1994.
+
+    .. [2] Edmund Landau.
+       "Zur relativen Wertbemessung der Turnierresultate."
+       Deutsches Wochenschach, 11:366–369, 1895.
+
+    .. [3] Teh-Hsing Wei.
+       "The Algebraic Foundations of Ranking Theory."
+       PhD thesis, University of Cambridge, 1952.
+
+    .. [4] Maurice G. Kendall.
+       "Further contributions to the theory of paired comparisons."
+       Biometrics, 11(1):43–62, 1955.
+       https://www.jstor.org/stable/3001479
+
+    .. [5] Claude Berge
+       "Théorie des graphes et ses applications."
+       Dunod, Paris, France, 1958.
+
+    .. [6] Phillip Bonacich.
+       "Technique for analyzing overlapping memberships."
+       Sociological Methodology, 4:176–185, 1972.
+       https://www.jstor.org/stable/270732
+
+    .. [7] Arnoldi iteration: https://en.wikipedia.org/wiki/Arnoldi_iteration
+
+    """
+    import numpy as np
+    import scipy as sp
+
+    if len(G) == 0:
+        raise nx.NetworkXPointlessConcept(
+            "cannot compute centrality for the null graph"
+        )
+    M = nx.to_scipy_sparse_array(G, nodelist=list(G), weight=weight, dtype=float)
+    _, eigenvector = sp.sparse.linalg.eigs(
+        M.T, k=1, which="LR", maxiter=max_iter, tol=tol
+    )
+    largest = eigenvector.flatten().real
+    norm = np.sign(largest.sum()) * sp.linalg.norm(largest)
+    return dict(zip(G, (largest / norm).tolist()))
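The power iteration described in the docstring above is short enough to sketch without NetworkX. The following stdlib-only version (a toy path graph as adjacency lists, not the module's code) iterates with `(A + I)` and normalizes to unit Euclidean norm, and reproduces the 0.37 / 0.60 values from the docstring example:

```python
import math

# Path graph 0-1-2-3 as plain adjacency lists (hypothetical toy data).
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}


def eigenvector_centrality_sketch(adj, max_iter=100, tol=1e-6):
    # Start from a normalized all-ones vector.
    x = {n: 1.0 / len(adj) for n in adj}
    for _ in range(max_iter):
        xlast = x
        x = dict(xlast)  # copy = multiply by I, so we iterate with (A + I)
        for n in xlast:
            for nbr in adj[n]:
                x[nbr] += xlast[n]  # unweighted edges
        norm = math.hypot(*x.values()) or 1  # unit Euclidean norm
        x = {n: v / norm for n, v in x.items()}
        # Convergence check in the L1 norm, scaled by the node count.
        if sum(abs(x[n] - xlast[n]) for n in x) < len(adj) * tol:
            return x
    raise RuntimeError("power iteration failed to converge")


c = eigenvector_centrality_sketch(adj)
```

For the path graph the exact dominant eigenvector entries are proportional to sin(kπ/5), giving approximately 0.372 for the endpoints and 0.601 for the middle nodes.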
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/flow_matrix.py ADDED
@@ -0,0 +1,130 @@
+# Helpers for current-flow betweenness and current-flow closeness
+# Lazy computations for inverse Laplacian and flow-matrix rows.
+import networkx as nx
+
+
+@nx._dispatchable(edge_attrs="weight")
+def flow_matrix_row(G, weight=None, dtype=float, solver="lu"):
+    # Generate a row of the current-flow matrix
+    import numpy as np
+
+    solvername = {
+        "full": FullInverseLaplacian,
+        "lu": SuperLUInverseLaplacian,
+        "cg": CGInverseLaplacian,
+    }
+    n = G.number_of_nodes()
+    L = nx.laplacian_matrix(G, nodelist=range(n), weight=weight).asformat("csc")
+    L = L.astype(dtype)
+    C = solvername[solver](L, dtype=dtype)  # initialize solver
+    w = C.w  # w is the Laplacian matrix width
+    # row-by-row flow matrix
+    for u, v in sorted(sorted((u, v)) for u, v in G.edges()):
+        B = np.zeros(w, dtype=dtype)
+        c = G[u][v].get(weight, 1.0)
+        B[u % w] = c
+        B[v % w] = -c
+        # get only the rows needed in the inverse laplacian
+        # and multiply to get the flow matrix row
+        row = B @ C.get_rows(u, v)
+        yield row, (u, v)
+
+
+# Class to compute the inverse laplacian only for specified rows
+# Allows computation of the current-flow matrix without storing entire
+# inverse laplacian matrix
+class InverseLaplacian:
+    def __init__(self, L, width=None, dtype=None):
+        global np
+        import numpy as np
+
+        (n, n) = L.shape
+        self.dtype = dtype
+        self.n = n
+        if width is None:
+            self.w = self.width(L)
+        else:
+            self.w = width
+        self.C = np.zeros((self.w, n), dtype=dtype)
+        self.L1 = L[1:, 1:]
+        self.init_solver(L)
+
+    def init_solver(self, L):
+        pass
+
+    def solve(self, r):
+        raise nx.NetworkXError("Implement solver")
+
+    def solve_inverse(self, r):
+        raise nx.NetworkXError("Implement solver")
+
+    def get_rows(self, r1, r2):
+        for r in range(r1, r2 + 1):
+            self.C[r % self.w, 1:] = self.solve_inverse(r)
+        return self.C
+
+    def get_row(self, r):
+        self.C[r % self.w, 1:] = self.solve_inverse(r)
+        return self.C[r % self.w]
+
+    def width(self, L):
+        m = 0
+        for i, row in enumerate(L):
+            w = 0
+            x, y = np.nonzero(row)
+            if len(y) > 0:
+                v = y - i
+                w = v.max() - v.min() + 1
+            m = max(w, m)
+        return m
+
+
+class FullInverseLaplacian(InverseLaplacian):
+    def init_solver(self, L):
+        self.IL = np.zeros(L.shape, dtype=self.dtype)
+        self.IL[1:, 1:] = np.linalg.inv(self.L1.todense())
+
+    def solve(self, rhs):
+        s = np.zeros(rhs.shape, dtype=self.dtype)
+        s = self.IL @ rhs
+        return s
+
+    def solve_inverse(self, r):
+        return self.IL[r, 1:]
+
+
+class SuperLUInverseLaplacian(InverseLaplacian):
+    def init_solver(self, L):
+        import scipy as sp
+
+        self.lusolve = sp.sparse.linalg.factorized(self.L1.tocsc())
+
+    def solve_inverse(self, r):
+        rhs = np.zeros(self.n, dtype=self.dtype)
+        rhs[r] = 1
+        return self.lusolve(rhs[1:])
+
+    def solve(self, rhs):
+        s = np.zeros(rhs.shape, dtype=self.dtype)
+        s[1:] = self.lusolve(rhs[1:])
+        return s
+
+
+class CGInverseLaplacian(InverseLaplacian):
+    def init_solver(self, L):
+        global sp
+        import scipy as sp
+
+        ilu = sp.sparse.linalg.spilu(self.L1.tocsc())
+        n = self.n - 1
+        self.M = sp.sparse.linalg.LinearOperator(shape=(n, n), matvec=ilu.solve)
+
+    def solve(self, rhs):
+        s = np.zeros(rhs.shape, dtype=self.dtype)
+        s[1:] = sp.sparse.linalg.cg(self.L1, rhs[1:], M=self.M, atol=0)[0]
+        return s
+
+    def solve_inverse(self, r):
+        rhs = np.zeros(self.n, self.dtype)
+        rhs[r] = 1
+        return sp.sparse.linalg.cg(self.L1, rhs[1:], M=self.M, atol=0)[0]
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/harmonic.py ADDED
@@ -0,0 +1,80 @@
+"""Functions for computing the harmonic centrality of a graph."""
+from functools import partial
+
+import networkx as nx
+
+__all__ = ["harmonic_centrality"]
+
+
+@nx._dispatchable(edge_attrs="distance")
+def harmonic_centrality(G, nbunch=None, distance=None, sources=None):
+    r"""Compute harmonic centrality for nodes.
+
+    Harmonic centrality [1]_ of a node `u` is the sum of the reciprocal
+    of the shortest path distances from all other nodes to `u`
+
+    .. math::
+
+        C(u) = \sum_{v \neq u} \frac{1}{d(v, u)}
+
+    where `d(v, u)` is the shortest-path distance between `v` and `u`.
+
+    If `sources` is given as an argument, the returned harmonic centrality
+    values are calculated as the sum of the reciprocals of the shortest
+    path distances from the nodes specified in `sources` to `u` instead
+    of from all nodes to `u`.
+
+    Notice that higher values indicate higher centrality.
+
+    Parameters
+    ----------
+    G : graph
+      A NetworkX graph
+
+    nbunch : container (default: all nodes in G)
+      Container of nodes for which harmonic centrality values are calculated.
+
+    sources : container (default: all nodes in G)
+      Container of nodes `v` over which reciprocal distances are computed.
+      Nodes not in `G` are silently ignored.
+
+    distance : edge attribute key, optional (default=None)
+      Use the specified edge attribute as the edge distance in shortest
+      path calculations. If `None`, then each edge will have distance equal to 1.
+
+    Returns
+    -------
+    nodes : dictionary
+      Dictionary of nodes with harmonic centrality as the value.
+
+    See Also
+    --------
+    betweenness_centrality, load_centrality, eigenvector_centrality,
+    degree_centrality, closeness_centrality
+
+    Notes
+    -----
+    If the 'distance' keyword is set to an edge attribute key then the
+    shortest-path length will be computed using Dijkstra's algorithm with
+    that edge attribute as the edge weight.
+
+    References
+    ----------
+    .. [1] Boldi, Paolo, and Sebastiano Vigna. "Axioms for centrality."
+       Internet Mathematics 10.3-4 (2014): 222-262.
+    """
+
+    nbunch = set(G.nbunch_iter(nbunch)) if nbunch is not None else set(G.nodes)
+    sources = set(G.nbunch_iter(sources)) if sources is not None else G.nodes
+
+    spl = partial(nx.shortest_path_length, G, weight=distance)
+    centrality = {u: 0 for u in nbunch}
+    for v in sources:
+        dist = spl(v)
+        for u in nbunch.intersection(dist):
+            d = dist[u]
+            if d == 0:  # handle u == v and edges with 0 weight
+                continue
+            centrality[u] += 1 / d
+
+    return centrality
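The formula `C(u) = Σ_{v≠u} 1/d(v, u)` above only needs shortest-path lengths, so for an unweighted graph it can be sketched with a plain BFS and no NetworkX at all (a stdlib-only illustration on a hypothetical toy path graph, not the module's code):

```python
from collections import deque

# Path graph 0-1-2-3 as adjacency lists (hypothetical toy data).
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}


def bfs_lengths(adj, source):
    """Unweighted shortest-path lengths from `source` via BFS."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        n = queue.popleft()
        for nbr in adj[n]:
            if nbr not in dist:
                dist[nbr] = dist[n] + 1
                queue.append(nbr)
    return dist


def harmonic_centrality_sketch(adj):
    # Sum reciprocal distances from every source v to every node u != v.
    centrality = {u: 0.0 for u in adj}
    for v in adj:
        for u, d in bfs_lengths(adj, v).items():
            if d > 0:  # skip u == v
                centrality[u] += 1 / d
    return centrality


c = harmonic_centrality_sketch(adj)
```

On the path graph, node 0 is at distances 1, 2, 3 from the others, so its harmonic centrality is 1 + 1/2 + 1/3 = 11/6, while node 1 gets 1 + 1 + 1/2 = 5/2.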
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/laplacian.py ADDED
@@ -0,0 +1,149 @@
1
+ """
2
+ Laplacian centrality measures.
3
+ """
4
+ import networkx as nx
5
+
6
+ __all__ = ["laplacian_centrality"]
7
+
8
+
9
+ @nx._dispatchable(edge_attrs="weight")
10
+ def laplacian_centrality(
11
+ G, normalized=True, nodelist=None, weight="weight", walk_type=None, alpha=0.95
12
+ ):
13
+ r"""Compute the Laplacian centrality for nodes in the graph `G`.
14
+
15
+ The Laplacian Centrality of a node ``i`` is measured by the drop in the
16
+ Laplacian Energy after deleting node ``i`` from the graph. The Laplacian Energy
17
+ is the sum of the squared eigenvalues of a graph's Laplacian matrix.
18
+
19
+ .. math::
20
+
21
+ C_L(u_i,G) = \frac{(\Delta E)_i}{E_L (G)} = \frac{E_L (G)-E_L (G_i)}{E_L (G)}
22
+
23
+ E_L (G) = \sum_{i=0}^n \lambda_i^2
24
+
25
+ Where $E_L (G)$ is the Laplacian energy of graph `G`,
26
+ E_L (G_i) is the Laplacian energy of graph `G` after deleting node ``i``
27
+ and $\lambda_i$ are the eigenvalues of `G`'s Laplacian matrix.
28
+ This formula shows the normalized value. Without normalization,
29
+ the numerator on the right side is returned.
30
+
31
+ Parameters
32
+ ----------
33
+ G : graph
34
+ A networkx graph
35
+
36
+ normalized : bool (default = True)
37
+ If True the centrality score is scaled so the sum over all nodes is 1.
38
+ If False the centrality score for each node is the drop in Laplacian
39
+ energy when that node is removed.
40
+
41
+ nodelist : list, optional (default = None)
42
+ The rows and columns are ordered according to the nodes in nodelist.
43
+ If nodelist is None, then the ordering is produced by G.nodes().
44
+
45
+ weight: string or None, optional (default=`weight`)
46
+ Optional parameter `weight` to compute the Laplacian matrix.
47
+ The edge data key used to compute each value in the matrix.
48
+ If None, then each edge has weight 1.
49
+
50
+ walk_type : string or None, optional (default=None)
51
+ Optional parameter `walk_type` used when calling
52
+ :func:`directed_laplacian_matrix <networkx.directed_laplacian_matrix>`.
53
+ One of ``"random"``, ``"lazy"``, or ``"pagerank"``. If ``walk_type=None``
54
+ (the default), then a value is selected according to the properties of `G`:
55
+ - ``walk_type="random"`` if `G` is strongly connected and aperiodic
56
+ - ``walk_type="lazy"`` if `G` is strongly connected but not aperiodic
57
+ - ``walk_type="pagerank"`` for all other cases.
58
+
59
+ alpha : real (default = 0.95)
60
+ Optional parameter `alpha` used when calling
61
+ :func:`directed_laplacian_matrix <networkx.directed_laplacian_matrix>`.
62
+     (1 - alpha) is the teleportation probability used with pagerank.
+
+     Returns
+     -------
+     nodes : dictionary
+         Dictionary of nodes with Laplacian centrality as the value.
+
+     Examples
+     --------
+     >>> G = nx.Graph()
+     >>> edges = [(0, 1, 4), (0, 2, 2), (2, 1, 1), (1, 3, 2), (1, 4, 2), (4, 5, 1)]
+     >>> G.add_weighted_edges_from(edges)
+     >>> sorted((v, f"{c:0.2f}") for v, c in laplacian_centrality(G).items())
+     [(0, '0.70'), (1, '0.90'), (2, '0.28'), (3, '0.22'), (4, '0.26'), (5, '0.04')]
+
+     Notes
+     -----
+     The algorithm is implemented based on [1]_ with an extension to directed graphs
+     using the ``directed_laplacian_matrix`` function.
+
+     Raises
+     ------
+     NetworkXPointlessConcept
+         If the graph `G` is the null graph.
+     ZeroDivisionError
+         If the graph `G` has no edges (is empty) and normalization is requested.
+
+     References
+     ----------
+     .. [1] Qi, X., Fuller, E., Wu, Q., Wu, Y., and Zhang, C.-Q. (2012).
+        Laplacian centrality: A new centrality measure for weighted networks.
+        Information Sciences, 194:240-253.
+        https://math.wvu.edu/~cqzhang/Publication-files/my-paper/INS-2012-Laplacian-W.pdf
+
+     See Also
+     --------
+     :func:`~networkx.linalg.laplacianmatrix.directed_laplacian_matrix`
+     :func:`~networkx.linalg.laplacianmatrix.laplacian_matrix`
+     """
+     import numpy as np
+     import scipy as sp
+
+     if len(G) == 0:
+         raise nx.NetworkXPointlessConcept("null graph has no centrality defined")
+     if G.size(weight=weight) == 0:
+         if normalized:
+             raise ZeroDivisionError("graph with no edges has zero full energy")
+         return {n: 0 for n in G}
+
+     if nodelist is not None:
+         nodeset = set(G.nbunch_iter(nodelist))
+         if len(nodeset) != len(nodelist):
+             raise nx.NetworkXError("nodelist has duplicate nodes or nodes not in G")
+         nodes = nodelist + [n for n in G if n not in nodeset]
+     else:
+         nodelist = nodes = list(G)
+
+     if G.is_directed():
+         lap_matrix = nx.directed_laplacian_matrix(G, nodes, weight, walk_type, alpha)
+     else:
+         lap_matrix = nx.laplacian_matrix(G, nodes, weight).toarray()
+
+     full_energy = np.power(sp.linalg.eigh(lap_matrix, eigvals_only=True), 2).sum()
+
+     # calculate laplacian centrality
+     laplace_centralities_dict = {}
+     for i, node in enumerate(nodelist):
+         # remove row and col i from lap_matrix
+         all_but_i = list(np.arange(lap_matrix.shape[0]))
+         all_but_i.remove(i)
+         A_2 = lap_matrix[all_but_i, :][:, all_but_i]
+
+         # Adjust diagonal for removed row
+         new_diag = lap_matrix.diagonal() - abs(lap_matrix[:, i])
+         np.fill_diagonal(A_2, new_diag[all_but_i])
+
+         if len(all_but_i) > 0:  # catches degenerate case of single node
+             new_energy = np.power(sp.linalg.eigh(A_2, eigvals_only=True), 2).sum()
+         else:
+             new_energy = 0.0
+
+         lapl_cent = full_energy - new_energy
+         if normalized:
+             lapl_cent = lapl_cent / full_energy
+
+         laplace_centralities_dict[node] = float(lapl_cent)
+
+     return laplace_centralities_dict
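The `laplacian_centrality` function added above can be exercised on the weighted graph from its own docstring. A minimal sketch (it assumes `networkx` is installed together with `numpy` and `scipy`, which the function imports internally):

```python
import networkx as nx

# Weighted graph from the docstring example above.
G = nx.Graph()
edges = [(0, 1, 4), (0, 2, 2), (2, 1, 1), (1, 3, 2), (1, 4, 2), (4, 5, 1)]
G.add_weighted_edges_from(edges)

# Laplacian centrality of a node is the (here normalized) drop in Laplacian
# energy -- the sum of squared Laplacian eigenvalues -- when it is removed.
cent = nx.laplacian_centrality(G)
rounded = sorted((v, f"{c:0.2f}") for v, c in cent.items())
print(rounded)
# [(0, '0.70'), (1, '0.90'), (2, '0.28'), (3, '0.22'), (4, '0.26'), (5, '0.04')]
```

Node 1, which carries most of the edge weight, dominates; the pendant node 5 contributes least to the graph's energy.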
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/load.py ADDED
@@ -0,0 +1,199 @@
+ """Load centrality."""
2
+ from operator import itemgetter
3
+
4
+ import networkx as nx
5
+
6
+ __all__ = ["load_centrality", "edge_load_centrality"]
7
+
8
+
9
+ @nx._dispatchable(edge_attrs="weight")
10
+ def newman_betweenness_centrality(G, v=None, cutoff=None, normalized=True, weight=None):
11
+ """Compute load centrality for nodes.
12
+
13
+ The load centrality of a node is the fraction of all shortest
14
+ paths that pass through that node.
15
+
16
+ Parameters
17
+ ----------
18
+ G : graph
19
+ A networkx graph.
20
+
21
+ normalized : bool, optional (default=True)
22
+ If True the betweenness values are normalized by b=b/(n-1)(n-2) where
23
+ n is the number of nodes in G.
24
+
25
+ weight : None or string, optional (default=None)
26
+ If None, edge weights are ignored.
27
+ Otherwise holds the name of the edge attribute used as weight.
28
+ The weight of an edge is treated as the length or distance between the two sides.
29
+
30
+ cutoff : bool, optional (default=None)
31
+ If specified, only consider paths of length <= cutoff.
32
+
33
+ Returns
34
+ -------
35
+ nodes : dictionary
36
+ Dictionary of nodes with centrality as the value.
37
+
38
+ See Also
39
+ --------
40
+ betweenness_centrality
41
+
42
+ Notes
43
+ -----
44
+ Load centrality is slightly different than betweenness. It was originally
45
+ introduced by [2]_. For this load algorithm see [1]_.
46
+
47
+ References
48
+ ----------
49
+ .. [1] Mark E. J. Newman:
50
+ Scientific collaboration networks. II.
51
+ Shortest paths, weighted networks, and centrality.
52
+ Physical Review E 64, 016132, 2001.
53
+ http://journals.aps.org/pre/abstract/10.1103/PhysRevE.64.016132
54
+ .. [2] Kwang-Il Goh, Byungnam Kahng and Doochul Kim
55
+ Universal behavior of Load Distribution in Scale-Free Networks.
56
+ Physical Review Letters 87(27):1–4, 2001.
57
+ https://doi.org/10.1103/PhysRevLett.87.278701
58
+ """
59
+ if v is not None: # only one node
60
+ betweenness = 0.0
61
+ for source in G:
62
+ ubetween = _node_betweenness(G, source, cutoff, False, weight)
63
+ betweenness += ubetween[v] if v in ubetween else 0
64
+ if normalized:
65
+ order = G.order()
66
+ if order <= 2:
67
+ return betweenness # no normalization b=0 for all nodes
68
+ betweenness *= 1.0 / ((order - 1) * (order - 2))
69
+ else:
70
+ betweenness = {}.fromkeys(G, 0.0)
71
+ for source in betweenness:
72
+ ubetween = _node_betweenness(G, source, cutoff, False, weight)
73
+ for vk in ubetween:
74
+ betweenness[vk] += ubetween[vk]
75
+ if normalized:
76
+ order = G.order()
77
+ if order <= 2:
78
+ return betweenness # no normalization b=0 for all nodes
79
+ scale = 1.0 / ((order - 1) * (order - 2))
80
+ for v in betweenness:
81
+ betweenness[v] *= scale
82
+ return betweenness # all nodes
83
+
84
+
85
+ def _node_betweenness(G, source, cutoff=False, normalized=True, weight=None):
86
+ """Node betweenness_centrality helper:
87
+
88
+ See betweenness_centrality for what you probably want.
89
+ This actually computes "load" and not betweenness.
90
+ See https://networkx.lanl.gov/ticket/103
91
+
92
+ This calculates the load of each node for paths from a single source.
93
+ (The fraction of number of shortests paths from source that go
94
+ through each node.)
95
+
96
+ To get the load for a node you need to do all-pairs shortest paths.
97
+
98
+ If weight is not None then use Dijkstra for finding shortest paths.
99
+ """
100
+ # get the predecessor and path length data
101
+ if weight is None:
102
+ (pred, length) = nx.predecessor(G, source, cutoff=cutoff, return_seen=True)
103
+ else:
104
+ (pred, length) = nx.dijkstra_predecessor_and_distance(G, source, cutoff, weight)
105
+
106
+ # order the nodes by path length
107
+ onodes = [(l, vert) for (vert, l) in length.items()]
108
+ onodes.sort()
109
+ onodes[:] = [vert for (l, vert) in onodes if l > 0]
110
+
111
+ # initialize betweenness
112
+ between = {}.fromkeys(length, 1.0)
113
+
114
+ while onodes:
115
+ v = onodes.pop()
116
+ if v in pred:
117
+ num_paths = len(pred[v]) # Discount betweenness if more than
118
+ for x in pred[v]: # one shortest path.
119
+ if x == source: # stop if hit source because all remaining v
120
+ break # also have pred[v]==[source]
121
+ between[x] += between[v] / num_paths
122
+ # remove source
123
+ for v in between:
124
+ between[v] -= 1
125
+ # rescale to be between 0 and 1
126
+ if normalized:
127
+ l = len(between)
128
+ if l > 2:
129
+ # scale by 1/the number of possible paths
130
+ scale = 1 / ((l - 1) * (l - 2))
131
+ for v in between:
132
+ between[v] *= scale
133
+ return between
134
+
135
+
136
+ load_centrality = newman_betweenness_centrality
137
+
138
+
139
+ @nx._dispatchable
140
+ def edge_load_centrality(G, cutoff=False):
141
+ """Compute edge load.
142
+
143
+ WARNING: This concept of edge load has not been analysed
144
+ or discussed outside of NetworkX that we know of.
145
+ It is based loosely on load_centrality in the sense that
146
+ it counts the number of shortest paths which cross each edge.
147
+ This function is for demonstration and testing purposes.
148
+
149
+ Parameters
150
+ ----------
151
+ G : graph
152
+ A networkx graph
153
+
154
+ cutoff : bool, optional (default=False)
155
+ If specified, only consider paths of length <= cutoff.
156
+
157
+ Returns
158
+ -------
159
+ A dict keyed by edge 2-tuple to the number of shortest paths
160
+ which use that edge. Where more than one path is shortest
161
+ the count is divided equally among paths.
162
+ """
163
+ betweenness = {}
164
+ for u, v in G.edges():
165
+ betweenness[(u, v)] = 0.0
166
+ betweenness[(v, u)] = 0.0
167
+
168
+ for source in G:
169
+ ubetween = _edge_betweenness(G, source, cutoff=cutoff)
170
+ for e, ubetweenv in ubetween.items():
171
+ betweenness[e] += ubetweenv # cumulative total
172
+ return betweenness
173
+
174
+
175
+ def _edge_betweenness(G, source, nodes=None, cutoff=False):
176
+ """Edge betweenness helper."""
177
+ # get the predecessor data
178
+ (pred, length) = nx.predecessor(G, source, cutoff=cutoff, return_seen=True)
179
+ # order the nodes by path length
180
+ onodes = [n for n, d in sorted(length.items(), key=itemgetter(1))]
181
+ # initialize betweenness, doesn't account for any edge weights
182
+ between = {}
183
+ for u, v in G.edges(nodes):
184
+ between[(u, v)] = 1.0
185
+ between[(v, u)] = 1.0
186
+
187
+ while onodes: # work through all paths
188
+ v = onodes.pop()
189
+ if v in pred:
190
+ # Discount betweenness if more than one shortest path.
191
+ num_paths = len(pred[v])
192
+ for w in pred[v]:
193
+ if w in pred:
194
+ # Discount betweenness, mult path
195
+ num_paths = len(pred[w])
196
+ for x in pred[w]:
197
+ between[(w, x)] += between[(v, w)] / num_paths
198
+ between[(x, w)] += between[(w, v)] / num_paths
199
+ return between
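A quick smoke test for the `load_centrality` alias defined above. On a tree, shortest paths are unique, so load and betweenness centrality agree, which gives a handy cross-check (a sketch assuming a standard networkx install):

```python
import networkx as nx

# Star graph: hub 0, leaves 1..4.  Every leaf-to-leaf shortest path
# passes through the hub, so the normalized load of the hub is 1.0.
G = nx.star_graph(4)
load = nx.load_centrality(G)
print(load[0], load[1])  # 1.0 0.0

# Shortest paths in a tree are unique, so load matches betweenness here.
bet = nx.betweenness_centrality(G)
assert all(abs(load[v] - bet[v]) < 1e-12 for v in G)
```

On graphs with multiple shortest paths between some pairs, the two measures can diverge, which is the distinction the Notes section above draws between [1]_ and [2]_.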
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/percolation.py ADDED
@@ -0,0 +1,128 @@
+ """Percolation centrality measures."""
2
+
3
+ import networkx as nx
4
+ from networkx.algorithms.centrality.betweenness import (
5
+ _single_source_dijkstra_path_basic as dijkstra,
6
+ )
7
+ from networkx.algorithms.centrality.betweenness import (
8
+ _single_source_shortest_path_basic as shortest_path,
9
+ )
10
+
11
+ __all__ = ["percolation_centrality"]
12
+
13
+
14
+ @nx._dispatchable(node_attrs="attribute", edge_attrs="weight")
15
+ def percolation_centrality(G, attribute="percolation", states=None, weight=None):
16
+ r"""Compute the percolation centrality for nodes.
17
+
18
+ Percolation centrality of a node $v$, at a given time, is defined
19
+ as the proportion of ‘percolated paths’ that go through that node.
20
+
21
+ This measure quantifies relative impact of nodes based on their
22
+ topological connectivity, as well as their percolation states.
23
+
24
+ Percolation states of nodes are used to depict network percolation
25
+ scenarios (such as during infection transmission in a social network
26
+ of individuals, spreading of computer viruses on computer networks, or
27
+ transmission of disease over a network of towns) over time. In this
28
+ measure usually the percolation state is expressed as a decimal
29
+ between 0.0 and 1.0.
30
+
31
+ When all nodes are in the same percolated state this measure is
32
+ equivalent to betweenness centrality.
33
+
34
+ Parameters
35
+ ----------
36
+ G : graph
37
+ A NetworkX graph.
38
+
39
+ attribute : None or string, optional (default='percolation')
40
+ Name of the node attribute to use for percolation state, used
41
+ if `states` is None. If a node does not set the attribute the
42
+ state of that node will be set to the default value of 1.
43
+ If all nodes do not have the attribute all nodes will be set to
44
+ 1 and the centrality measure will be equivalent to betweenness centrality.
45
+
46
+ states : None or dict, optional (default=None)
47
+ Specify percolation states for the nodes, nodes as keys states
48
+ as values.
49
+
50
+ weight : None or string, optional (default=None)
51
+ If None, all edge weights are considered equal.
52
+ Otherwise holds the name of the edge attribute used as weight.
53
+ The weight of an edge is treated as the length or distance between the two sides.
54
+
55
+
56
+ Returns
57
+ -------
58
+ nodes : dictionary
59
+ Dictionary of nodes with percolation centrality as the value.
60
+
61
+ See Also
62
+ --------
63
+ betweenness_centrality
64
+
65
+ Notes
66
+ -----
67
+ The algorithm is from Mahendra Piraveenan, Mikhail Prokopenko, and
68
+ Liaquat Hossain [1]_
69
+ Pair dependencies are calculated and accumulated using [2]_
70
+
71
+ For weighted graphs the edge weights must be greater than zero.
72
+ Zero edge weights can produce an infinite number of equal length
73
+ paths between pairs of nodes.
74
+
75
+ References
76
+ ----------
77
+ .. [1] Mahendra Piraveenan, Mikhail Prokopenko, Liaquat Hossain
78
+ Percolation Centrality: Quantifying Graph-Theoretic Impact of Nodes
79
+ during Percolation in Networks
80
+ http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0053095
81
+ .. [2] Ulrik Brandes:
82
+ A Faster Algorithm for Betweenness Centrality.
83
+ Journal of Mathematical Sociology 25(2):163-177, 2001.
84
+ https://doi.org/10.1080/0022250X.2001.9990249
85
+ """
86
+ percolation = dict.fromkeys(G, 0.0) # b[v]=0 for v in G
87
+
88
+ nodes = G
89
+
90
+ if states is None:
91
+ states = nx.get_node_attributes(nodes, attribute, default=1)
92
+
93
+ # sum of all percolation states
94
+ p_sigma_x_t = 0.0
95
+ for v in states.values():
96
+ p_sigma_x_t += v
97
+
98
+ for s in nodes:
99
+ # single source shortest paths
100
+ if weight is None: # use BFS
101
+ S, P, sigma, _ = shortest_path(G, s)
102
+ else: # use Dijkstra's algorithm
103
+ S, P, sigma, _ = dijkstra(G, s, weight)
104
+ # accumulation
105
+ percolation = _accumulate_percolation(
106
+ percolation, S, P, sigma, s, states, p_sigma_x_t
107
+ )
108
+
109
+ n = len(G)
110
+
111
+ for v in percolation:
112
+ percolation[v] *= 1 / (n - 2)
113
+
114
+ return percolation
115
+
116
+
117
+ def _accumulate_percolation(percolation, S, P, sigma, s, states, p_sigma_x_t):
118
+ delta = dict.fromkeys(S, 0)
119
+ while S:
120
+ w = S.pop()
121
+ coeff = (1 + delta[w]) / sigma[w]
122
+ for v in P[w]:
123
+ delta[v] += sigma[v] * coeff
124
+ if w != s:
125
+ # percolation weight
126
+ pw_s_w = states[s] / (p_sigma_x_t - states[w])
127
+ percolation[w] += delta[w] * pw_s_w
128
+ return percolation
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/reaching.py ADDED
@@ -0,0 +1,206 @@
+ """Functions for computing reaching centrality of a node or a graph."""
2
+
3
+ import networkx as nx
4
+ from networkx.utils import pairwise
5
+
6
+ __all__ = ["global_reaching_centrality", "local_reaching_centrality"]
7
+
8
+
9
+ def _average_weight(G, path, weight=None):
10
+ """Returns the average weight of an edge in a weighted path.
11
+
12
+ Parameters
13
+ ----------
14
+ G : graph
15
+ A networkx graph.
16
+
17
+ path: list
18
+ A list of vertices that define the path.
19
+
20
+ weight : None or string, optional (default=None)
21
+ If None, edge weights are ignored. Then the average weight of an edge
22
+ is assumed to be the multiplicative inverse of the length of the path.
23
+ Otherwise holds the name of the edge attribute used as weight.
24
+ """
25
+ path_length = len(path) - 1
26
+ if path_length <= 0:
27
+ return 0
28
+ if weight is None:
29
+ return 1 / path_length
30
+ total_weight = sum(G.edges[i, j][weight] for i, j in pairwise(path))
31
+ return total_weight / path_length
32
+
33
+
34
+ @nx._dispatchable(edge_attrs="weight")
35
+ def global_reaching_centrality(G, weight=None, normalized=True):
36
+ """Returns the global reaching centrality of a directed graph.
37
+
38
+ The *global reaching centrality* of a weighted directed graph is the
39
+ average over all nodes of the difference between the local reaching
40
+ centrality of the node and the greatest local reaching centrality of
41
+ any node in the graph [1]_. For more information on the local
42
+ reaching centrality, see :func:`local_reaching_centrality`.
43
+ Informally, the local reaching centrality is the proportion of the
44
+ graph that is reachable from the neighbors of the node.
45
+
46
+ Parameters
47
+ ----------
48
+ G : DiGraph
49
+ A networkx DiGraph.
50
+
51
+ weight : None or string, optional (default=None)
52
+ Attribute to use for edge weights. If ``None``, each edge weight
53
+ is assumed to be one. A higher weight implies a stronger
54
+ connection between nodes and a *shorter* path length.
55
+
56
+ normalized : bool, optional (default=True)
57
+ Whether to normalize the edge weights by the total sum of edge
58
+ weights.
59
+
60
+ Returns
61
+ -------
62
+ h : float
63
+ The global reaching centrality of the graph.
64
+
65
+ Examples
66
+ --------
67
+ >>> G = nx.DiGraph()
68
+ >>> G.add_edge(1, 2)
69
+ >>> G.add_edge(1, 3)
70
+ >>> nx.global_reaching_centrality(G)
71
+ 1.0
72
+ >>> G.add_edge(3, 2)
73
+ >>> nx.global_reaching_centrality(G)
74
+ 0.75
75
+
76
+ See also
77
+ --------
78
+ local_reaching_centrality
79
+
80
+ References
81
+ ----------
82
+ .. [1] Mones, Enys, Lilla Vicsek, and Tamás Vicsek.
83
+ "Hierarchy Measure for Complex Networks."
84
+ *PLoS ONE* 7.3 (2012): e33799.
85
+ https://doi.org/10.1371/journal.pone.0033799
86
+ """
87
+ if nx.is_negatively_weighted(G, weight=weight):
88
+ raise nx.NetworkXError("edge weights must be positive")
89
+ total_weight = G.size(weight=weight)
90
+ if total_weight <= 0:
91
+ raise nx.NetworkXError("Size of G must be positive")
92
+
93
+ # If provided, weights must be interpreted as connection strength
94
+ # (so higher weights are more likely to be chosen). However, the
95
+ # shortest path algorithms in NetworkX assume the provided "weight"
96
+ # is actually a distance (so edges with higher weight are less
97
+ # likely to be chosen). Therefore we need to invert the weights when
98
+ # computing shortest paths.
99
+ #
100
+ # If weight is None, we leave it as-is so that the shortest path
101
+ # algorithm can use a faster, unweighted algorithm.
102
+ if weight is not None:
103
+
104
+ def as_distance(u, v, d):
105
+ return total_weight / d.get(weight, 1)
106
+
107
+ shortest_paths = nx.shortest_path(G, weight=as_distance)
108
+ else:
109
+ shortest_paths = nx.shortest_path(G)
110
+
111
+ centrality = local_reaching_centrality
112
+ # TODO This can be trivially parallelized.
113
+ lrc = [
114
+ centrality(G, node, paths=paths, weight=weight, normalized=normalized)
115
+ for node, paths in shortest_paths.items()
116
+ ]
117
+
118
+ max_lrc = max(lrc)
119
+ return sum(max_lrc - c for c in lrc) / (len(G) - 1)
120
+
121
+
122
+ @nx._dispatchable(edge_attrs="weight")
123
+ def local_reaching_centrality(G, v, paths=None, weight=None, normalized=True):
124
+ """Returns the local reaching centrality of a node in a directed
125
+ graph.
126
+
127
+ The *local reaching centrality* of a node in a directed graph is the
128
+ proportion of other nodes reachable from that node [1]_.
129
+
130
+ Parameters
131
+ ----------
132
+ G : DiGraph
133
+ A NetworkX DiGraph.
134
+
135
+ v : node
136
+ A node in the directed graph `G`.
137
+
138
+ paths : dictionary (default=None)
139
+ If this is not `None` it must be a dictionary representation
140
+ of single-source shortest paths, as computed by, for example,
141
+ :func:`networkx.shortest_path` with source node `v`. Use this
142
+ keyword argument if you intend to invoke this function many
143
+ times but don't want the paths to be recomputed each time.
144
+
145
+ weight : None or string, optional (default=None)
146
+ Attribute to use for edge weights. If `None`, each edge weight
147
+ is assumed to be one. A higher weight implies a stronger
148
+ connection between nodes and a *shorter* path length.
149
+
150
+ normalized : bool, optional (default=True)
151
+ Whether to normalize the edge weights by the total sum of edge
152
+ weights.
153
+
154
+ Returns
155
+ -------
156
+ h : float
157
+ The local reaching centrality of the node ``v`` in the graph
158
+ ``G``.
159
+
160
+ Examples
161
+ --------
162
+ >>> G = nx.DiGraph()
163
+ >>> G.add_edges_from([(1, 2), (1, 3)])
164
+ >>> nx.local_reaching_centrality(G, 3)
165
+ 0.0
166
+ >>> G.add_edge(3, 2)
167
+ >>> nx.local_reaching_centrality(G, 3)
168
+ 0.5
169
+
170
+ See also
171
+ --------
172
+ global_reaching_centrality
173
+
174
+ References
175
+ ----------
176
+ .. [1] Mones, Enys, Lilla Vicsek, and Tamás Vicsek.
177
+ "Hierarchy Measure for Complex Networks."
178
+ *PLoS ONE* 7.3 (2012): e33799.
179
+ https://doi.org/10.1371/journal.pone.0033799
180
+ """
181
+ if paths is None:
182
+ if nx.is_negatively_weighted(G, weight=weight):
183
+ raise nx.NetworkXError("edge weights must be positive")
184
+ total_weight = G.size(weight=weight)
185
+ if total_weight <= 0:
186
+ raise nx.NetworkXError("Size of G must be positive")
187
+ if weight is not None:
188
+ # Interpret weights as lengths.
189
+ def as_distance(u, v, d):
190
+ return total_weight / d.get(weight, 1)
191
+
192
+ paths = nx.shortest_path(G, source=v, weight=as_distance)
193
+ else:
194
+ paths = nx.shortest_path(G, source=v)
195
+ # If the graph is unweighted, simply return the proportion of nodes
196
+ # reachable from the source node ``v``.
197
+ if weight is None and G.is_directed():
198
+ return (len(paths) - 1) / (len(G) - 1)
199
+ if normalized and weight is not None:
200
+ norm = G.size(weight=weight) / G.size()
201
+ else:
202
+ norm = 1
203
+ # TODO This can be trivially parallelized.
204
+ avgw = (_average_weight(G, path, weight=weight) for path in paths.values())
205
+ sum_avg_weight = sum(avgw) / norm
206
+ return sum_avg_weight / (len(G) - 1)
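The reaching-centrality functions above can be checked against their own docstring examples; this sketch is a direct transcription of those doctests:

```python
import networkx as nx

# A perfect two-level hierarchy: node 1 reaches everything.
G = nx.DiGraph([(1, 2), (1, 3)])
grc = nx.global_reaching_centrality(G)
print(grc)  # 1.0

# Adding the edge (3, 2) flattens the hierarchy somewhat.
G.add_edge(3, 2)
print(nx.global_reaching_centrality(G))  # 0.75

# Node 3 now reaches one of the two other nodes.
print(nx.local_reaching_centrality(G, 3))  # 0.5
```

Global reaching centrality averages, over all nodes, the gap between the best local reaching centrality and each node's own, so a star-like hierarchy scores 1.0 and flatter structures score lower.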
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (204 Bytes).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_betweenness_centrality.cpython-310.pyc ADDED
Binary file (21.1 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_betweenness_centrality_subset.cpython-310.pyc ADDED
Binary file (10.9 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_closeness_centrality.cpython-310.pyc ADDED
Binary file (9.32 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_current_flow_betweenness_centrality.cpython-310.pyc ADDED
Binary file (7.98 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_current_flow_betweenness_centrality_subset.cpython-310.pyc ADDED
Binary file (4.37 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_current_flow_closeness.cpython-310.pyc ADDED
Binary file (2.1 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_degree_centrality.cpython-310.pyc ADDED
Binary file (3.99 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_dispersion.cpython-310.pyc ADDED
Binary file (2.27 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_eigenvector_centrality.cpython-310.pyc ADDED
Binary file (5.81 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_group.cpython-310.pyc ADDED
Binary file (10.1 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_harmonic_centrality.cpython-310.pyc ADDED
Binary file (5.07 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_katz_centrality.cpython-310.pyc ADDED
Binary file (9.51 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_laplacian_centrality.cpython-310.pyc ADDED
Binary file (6.06 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_load_centrality.cpython-310.pyc ADDED
Binary file (9.55 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_percolation_centrality.cpython-310.pyc ADDED
Binary file (2.76 kB).
venv/lib/python3.10/site-packages/networkx/algorithms/centrality/tests/__pycache__/test_reaching.cpython-310.pyc ADDED
Binary file (5.68 kB).