Ok, enough jokes. Computational neuroscience is neuroscience. But instead of doing experiments in real life, you do them on a computer. Sometimes, you don't even need a computer, because you might be busy developing some crazy theories like old-school mathematicians.
My current research focuses on neural representation and Recurrent Neural Networks (RNNs). More specifically, I'm interested in neural networks with biological properties (e.g., sparsity, Dale's law).
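If you haven't met Dale's law before: it says each neuron is either excitatory or inhibitory, so all of its outgoing weights share one sign. Here's a minimal toy sketch of what that constraint (plus sparsity) looks like in an RNN's recurrent weight matrix. The function and parameters are purely illustrative, not taken from any of the papers below:

```python
import numpy as np

def dales_law_weights(n_neurons=100, frac_exc=0.8, sparsity=0.9, seed=0):
    """Toy recurrent weight matrix obeying Dale's law: each presynaptic
    neuron (column j) is either excitatory (all outgoing weights >= 0)
    or inhibitory (all outgoing weights <= 0)."""
    rng = np.random.default_rng(seed)
    n_exc = int(frac_exc * n_neurons)
    # Nonnegative magnitudes; W[i, j] = connection from neuron j to neuron i.
    magnitudes = np.abs(rng.normal(scale=1.0 / np.sqrt(n_neurons),
                                   size=(n_neurons, n_neurons)))
    # One sign per presynaptic neuron, broadcast down each column.
    signs = np.ones(n_neurons)
    signs[n_exc:] = -1.0  # remaining (1 - frac_exc) fraction is inhibitory
    W = magnitudes * signs
    # Sparsity: keep only a random (1 - sparsity) fraction of connections.
    mask = rng.random(W.shape) > sparsity
    return W * mask

W = dales_law_weights()
# Sanity check: every column has a single sign.
assert all((W[:, j] >= 0).all() or (W[:, j] <= 0).all()
           for j in range(W.shape[1]))
```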
Before my Ph.D. program at Georgia Tech, I worked on simulations of the fruit fly's visual system at Columbia University. Before that, at Wuhan University, I worked on kinetic PDE models of spiking networks and even some biochemical materials development (yes, my undergraduate major was chemistry).
Here is my CV in case you are interested.
Publications and Posters
- Alex Q. Wang, Soon Ho Kim, Hannah Choi. "Rank Dependency of Rescaled Pruning in Recurrent Neural Networks." Working paper (2025).
- Aishwarya H. Balwani, Alex Q. Wang, Farzaneh Najafi, Hannah Choi. "Constructing Biologically Constrained RNNs via Dale's Backprop and Topologically-Informed Pruning." bioRxiv, under revision at Science Advances (2024).
- Ai-Nv Zhang, Wei Wu, Chi Zhang, Qiu-yang Wang, Ze-Nan Zhuang, Han Cheng, Xian-Zheng Zhang. "A Versatile Bacterial Membrane-Binding Chimeric Peptide with Enhanced Photodynamic Antimicrobial Activity." Journal of Materials Chemistry B (2019).
- Alex Q. Wang, Vidit Tripathi, Hannah Choi. "Unraveling the Effects of Different Pruning Methods on Recurrent Neural Networks." Georgia Tech Frontiers in Science Conference and Symposium (2025).
- Vidit Tripathi, Alex Q. Wang, Hannah Choi. "Unraveling the Effects of Different Pruning Rules on Network Dynamics." Computational and Systems Neuroscience (2025).