An alternative kind of ANN introduced by George Delaportas in 2006 that predates many of the innovations later found in TensorFlow and other ANN libraries as of 2022.
GANN is the precursor and basis of:
- Neural Architecture Search (NAS)
- Encapsulation of layers and transformer blocks
- Auto-optimization of ANN layers (topology)
- Dynamic computation graphs
- Heuristic-based layer management
In fact, GANN is not just another ANN but a framework that creates and trains such networks automatically, based on criteria and mathematical models invented explicitly for this purpose.
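To make the idea concrete, here is a minimal sketch of what "creating a network automatically based on certain criteria" can look like. This is not GANN's actual API or algorithm; the function names, the candidate layer sizes, and the scoring criteria are all hypothetical, chosen only to illustrate a heuristic topology search.

```python
# Illustrative sketch only (not the real GANN implementation): a heuristic
# search that proposes ANN topologies and keeps the one best satisfying
# user-defined criteria, in the spirit of auto-optimized layer topology.
import random

def score_topology(layers, max_params=5000):
    """Toy fitness criterion (hypothetical): penalize parameter count,
    reward depth, and reject topologies that exceed a size budget."""
    # Weight count of a fully connected net (biases ignored for simplicity).
    params = sum(a * b for a, b in zip(layers, layers[1:]))
    if params > max_params:
        return float("-inf")  # violates the size criterion outright
    return len(layers) - params / max_params  # deeper and cheaper is better

def search_topology(n_inputs, n_outputs, trials=200, seed=0):
    """Randomly propose hidden-layer layouts and keep the best-scoring one."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(trials):
        hidden = [rng.choice([4, 8, 16, 32]) for _ in range(rng.randint(1, 4))]
        layers = [n_inputs] + hidden + [n_outputs]
        s = score_topology(layers)
        if s > best_score:
            best, best_score = layers, s
    return best

# The returned list is a full layer layout, e.g. [10, <hidden sizes>, 2].
print(search_topology(n_inputs=10, n_outputs=2))
```

A real framework would train each candidate and score it on validation performance; the random proposal step is where smarter heuristics or mathematical models would plug in.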