
Documentation branch #38

Open
wants to merge 8 commits into master
57 changes: 55 additions & 2 deletions brainconn/centrality/centrality.py
@@ -7,13 +7,39 @@
from ..distance import reachdist
from ..utils import invert

def Degree_centrality (G):
"""
Degree centrality is the simplest measure of centrality. It assumes
that nodes with many connections exert more influence over the network.
However, a limitation of degree centrality is that all connections are
treated as having the same strength.

"""

def Delta_centrality (G):
Member: Function names should be lower snake case, so Delta_centrality --> delta_centrality, Degree_centrality --> degree_centrality, Leverage_centrality --> leverage_centrality.

"""
Another way of thinking about centrality is that it does not depend
on either degree, closeness or betweenness but is based on the effect
of the removal of a node has on the structure and function of the rest
of the network.

def betweenness_bin(G):
The intuition behind this measure is that inactivation of nodes will
exert a disproportionate impact on remaining network elements.

Delta centrality is closely related to analysis of network robustness.

Used to creat putative scaffold. Refer to Sporns (2016) paper to see an
example of structural scaffold of human brain.
"""

def betweenness(G):
"""
Node betweenness centrality is the fraction of all shortest paths in
the network that contain a given node. Nodes with high values of
betweenness centrality participate in a large number of shortest paths.



Member: We also don't want these extra spaces here.

Parameters
----------
A : NxN :obj:`numpy.ndarray`
@@ -26,6 +52,10 @@ def betweenness_bin(G):
-----
Betweenness centrality may be normalised to the range [0,1] as
BC/[(N-1)(N-2)], where N is the number of nodes in the network.

Betweenness centrality assumes that information is routed along the
shortest paths, which may not be an appropriate assumption for
information transmission in the brain.
"""
G = np.array(G, dtype=float) # force G to have float type so it can be
# compared to float np.inf
@@ -185,7 +215,7 @@ def entropy(w_):
return Hpos, Hneg


def edge_betweenness_bin(G):
def edge_betweenness(G):
"""
Edge betweenness centrality is the fraction of all shortest paths in
the network that contain a given edge. Edges with high values of
@@ -341,6 +371,13 @@ def eigenvector_centrality_und(CIJ):
node i is equivalent to the ith element in the eigenvector
corresponding to the largest eigenvalue of the adjacency matrix.

Eigenvector centrality is based on the notion that a given node is highly
central if its neighbors also share the same property. However, it does
not account for the disparity between the degree of a node and the degrees
of its neighbours. This has different implications depending on the
network's assortativity, that is, the tendency of a node to be connected
to nodes with similar degrees (Joyce, 2010).

Parameters
----------
CIJ : NxN :obj:`numpy.ndarray`
@@ -655,6 +692,9 @@ def module_degree_zscore(W, ci, flag=0):

def pagerank_centrality(A, d, falff=None):
"""
Interesting fact: PageRank is used to rank websites in the results of the
Google search engine.

The PageRank centrality is a variant of eigenvector centrality. This
function computes the PageRank centrality of each vertex in a graph.

@@ -817,3 +857,16 @@ def subgraph_centrality(CIJ):
# compute eigenvector centr.
Cs = np.real(np.dot(vecs * vecs, np.exp(vals)))
return Cs # imaginary part from precision error

def Leverage_centrality:
Member: You need to define functions with parentheses, even if they're empty. We also need to include the word "pass" when the function isn't running yet.

So instead of

def Leverage_centrality:
    """Documentation...
    """

we should have:

def Leverage_centrality():
    """Documentation...
    """
    pass

"""
This concept considers the centrality of a node depending on whether
its neighbours rely on that node for information.

Leverage centrality does not assume that information travels only along
shortest paths or in a serial fashion. It focuses on the disparity in
node degrees to quantify information flow locally.

For further information on the utility of leverage centrality in brain
networks, refer to Joyce (2010).
"""