Since their inception, artificial neural networks have relied on manually designed architectures and inductive biases to better adapt to data and tasks. With the rise of deep learning and the expansion of parameter spaces, they have begun to exhibit brain-like functional behaviors. Nevertheless, artificial neural networks remain fundamentally different from biological neural systems in structural organization, learning mechanisms, and evolutionary pathways.
From the perspective of neuroscience, we rethink the formation and evolution of intelligence and propose a new neural network paradigm, the Brain-like Neural Network (BNN). We further present the first instantiation of a BNN, termed LuminaNet, which operates without convolutions or self-attention and is capable of autonomously modifying its architecture. Extensive experiments demonstrate that LuminaNet can achieve self-evolution through dynamic architectural changes. On CIFAR-10, LuminaNet improves top-1 accuracy by 11.19% over LeNet-5 and 5.46% over AlexNet, and outperforms MLP-Mixer, ResMLP, and DeiT-Tiny among MLP/ViT architectures. On the TinyStories text generation task, LuminaNet attains a perplexity of 8.4, comparable to a single-layer GPT-2-style Transformer, while reducing computational cost by approximately 25% and peak memory usage by nearly 50%.
We provide interactive visualizations of the LuminaNet structures mentioned in the paper.

Image Recognition Task (CIFAR-10)
- d_hidden = 10: link
- d_hidden = 32: link
- d_hidden = 64: link
- d_hidden = 84: link
- d_hidden = 128: link

Text Generation Task (TinyStories)
- d_hidden = 128: link
- d_hidden = 256: link
- d_hidden = 384: link
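The structures above are indexed by their hidden width `d_hidden`. As a rough illustration of what this parameter controls, here is a minimal NumPy sketch of a convolution-free, attention-free fully connected block of width `d_hidden` applied to flattened CIFAR-10 inputs. This is a hypothetical sketch, not the LuminaNet architecture (which modifies its own structure and is not specified in this README); the function name `mlp_forward` and all shapes are illustrative assumptions.

```python
import numpy as np

def mlp_forward(x, d_in, d_hidden, d_out, rng):
    # Hypothetical illustration only: one fully connected hidden layer of
    # width d_hidden, the parameter varied across the structures listed above.
    W1 = rng.standard_normal((d_in, d_hidden)) / np.sqrt(d_in)
    b1 = np.zeros(d_hidden)
    W2 = rng.standard_normal((d_hidden, d_out)) / np.sqrt(d_hidden)
    b2 = np.zeros(d_out)
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden activations
    return h @ W2 + b2                 # class logits

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3 * 32 * 32))   # four flattened 32x32 RGB images
for d_hidden in (10, 32, 64, 84, 128):      # the CIFAR-10 widths listed above
    logits = mlp_forward(x, 3 * 32 * 32, d_hidden, 10, rng)
    print(d_hidden, logits.shape)
```

Each width yields a `(4, 10)` batch of logits; only the size of the intermediate representation changes.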
```shell
conda create -n LuminaNet python=3.10.16
conda activate LuminaNet
pip install -r requirements.txt
python test_<cifar10/tinystories>.py
```

Training code will be released soon.
This repository is licensed for research and non-commercial use only.
Commercial use requires a separate license from the author.
Please refer to LICENSE for more information.
Contact:

