Two network transformations
In this paper, we study relational networks. They may be as large as social networks or as small as neural networks. We employ the concepts of closure and closure operators to describe their structures, and introduce the idea of functional transformation to model their dynamic qualities. One transformation, ω, reduces a complex network to a much simpler form, yet preserves important properties such as path connectivity and centrality measures. The other transformation, ε, expands a network by using grammar-like productions. Both are continuous (with respect to closure), and we show that ε is effectively ω⁻¹ in that ω·ε·ω = ω. It is thought that ω may model human memory consolidation and that ε may model memory reconstruction.
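The retraction law ω·ε·ω = ω can be illustrated with a toy pair of graph maps. These are not the paper's transformations (which are built from closure operators); they are hypothetical stand-ins chosen only because they satisfy the same law while preserving path connectivity: here ω contracts every degree-2 node and ε subdivides every edge.

```python
# Toy illustration of the retraction law omega . epsilon . omega = omega.
# NOT the paper's transformations: omega here smooths out degree-2 nodes,
# epsilon subdivides each edge with a fresh midpoint node (assumptions).
# A graph is a set of frozenset edges over hashable node labels.

def omega(edges):
    """Reduce: contract each degree-2 node, merging its two incident edges."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed:
        changed = False
        adj = {}
        for e in edges:
            u, v = tuple(e)
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        for v, nbrs in adj.items():
            if len(nbrs) == 2:
                u, w = tuple(nbrs)
                # skip if contraction would create a self-loop or multi-edge
                if u != w and frozenset((u, w)) not in edges:
                    edges -= {frozenset((v, u)), frozenset((v, w))}
                    edges.add(frozenset((u, w)))
                    changed = True
                    break
    return edges

def epsilon(edges):
    """Expand: subdivide every edge by inserting a fresh midpoint node."""
    out, fresh = set(), 0
    for e in edges:
        u, v = tuple(e)
        m = ("mid", fresh)
        fresh += 1
        out.add(frozenset((u, m)))
        out.add(frozenset((m, v)))
    return out

# On a path a-b-c-d-e, omega collapses it to the single edge a-e;
# epsilon re-inserts a midpoint, and applying omega again recovers
# the reduced form, so omega(epsilon(omega(G))) == omega(G).
G = {frozenset(p) for p in [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")]}
print(omega(epsilon(omega(G))) == omega(G))  # True
```

Note that ε is only a one-sided inverse: ε(ω(G)) is generally a different (subdivided) graph from G, which is exactly why the paper states the relationship as ω·ε·ω = ω rather than ε = ω⁻¹ outright.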
Document type: Peer reviewed
Document version: Final PDF
Source: Mathematics for Applications, 2019, vol. 8, no. 1, pp. 79-96. ISSN 1805-3629