#spikingneuralnetworks

New preprint on our "collaborative modelling of the brain" (COMOB) project. Over the last two years, a group of us (led by @marcusghosh) have been working together, openly, online, with anyone free to join, on a computational neuroscience research project.

biorxiv.org/content/10.1101/20

This was an experiment in a more bottom-up, collaborative way of doing science, rather than the hierarchical PI-led model. So how did we do it?

We started from the tutorial I gave at @CosyneMeeting 2022 on spiking neural networks, which included a starter Jupyter notebook that let you train a spiking neural network model on a sound localisation task.

neural-reckoning.github.io/cos

youtube.com/watch?v=GTXTQ_sOxa

Participants were free to use and adapt this to any question they were interested in (we gave some ideas for starting points, but there was no constraint). Participants worked in groups or individually, sharing their work on our repository and joining us for monthly meetings.

The repository was set up to automatically build a website using @mystmarkdown showing the current work in progress of all projects, and (later in the project) the paper as we wrote it. This kept everyone up to date with what was going on.

comob-project.github.io/snn-so

We started from a simple feedforward network of leaky integrate-and-fire neurons, but others adapted it to include learnable delays, alternative neuron models, biophysically detailed models, Dale's law, and more.
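As a rough illustration of this common starting point (a minimal sketch only, not the project's actual code; the sizes, rates and parameter values here are made up), the forward dynamics of a discretised feedforward layer of leaky integrate-and-fire neurons can be written in a few lines of Python:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameters (not the values used in the project)
n_in, n_out, n_steps = 100, 30, 200
dt, tau, v_th = 1e-3, 20e-3, 1.0               # time step (s), membrane time constant (s), threshold
alpha = np.exp(-dt / tau)                      # per-step membrane decay ("leak")

W = rng.normal(0.0, 0.1, size=(n_in, n_out))   # feedforward weights
in_spikes = (rng.random((n_steps, n_in)) < 0.02).astype(float)  # random input spike trains

v = np.zeros(n_out)
out_spikes = np.zeros((n_steps, n_out), dtype=bool)
for t in range(n_steps):
    v = alpha * v + in_spikes[t] @ W           # leaky integration of synaptic input
    out_spikes[t] = v >= v_th                  # spike where the threshold is crossed
    v[out_spikes[t]] = 0.0                     # reset membrane potential after a spike

print("spikes per output neuron:", out_spikes.sum(axis=0))
```

The starter notebook linked above builds a trainable version of this kind of model for the sound localisation task; the snippet only shows the forward dynamics.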

We found some interesting results, including that shorter time constants improved performance (consistent with what we see in the auditory system). Surprisingly, the network seemed to be using an "equalisation cancellation" strategy rather than the expected coincidence detection.

Ultimately, our scientific results were not incredibly strong, but we think this was a valuable experiment for a number of reasons. Firstly, it shows that there are other ways of doing science. Secondly, many people got to engage in a research experience they otherwise wouldn't have had. Several participants have been motivated to continue their work beyond this project. It also proved useful for generating teaching material, and a number of MSc projects were based on it.

With that said, we learned some lessons about how to do this better, and yes, we will be doing this again (call for participation in September/October hopefully). The main challenge will be to keep the project more focussed without making it top down / hierarchical.

We believe this is possible, and we are inspired by the recent success of the Busy Beaver challenge, a bottom-up project of amateur mathematicians that found a proof of a 40-year-old conjecture.

quantamagazine.org/amateur-mat

We will be calling for proposals for the next project, engaging in an open discussion with all participants to refine the ideas before starting, and then inviting the proposer of the most popular project to act as a 'project lead', keeping it focussed without being hierarchical.

If you're interested in being involved in that, please join our (currently fairly quiet) new discord server, or follow me or @marcusghosh for announcements.

discord.gg/kUzh5MHjVE

I'm excited for a future where scientists work more collaboratively, and where everyone can participate. Diversity will lead to exciting new ideas and progress. Computational science has huge potential here, something we're also pursuing at @neuromatch.

Let's make it happen!

In 2000, Nicolas Brunel presented a framework for studying sparsely connected #SpikingNeuralNetworks (#SNN) with random connectivity & varied excitation-inhibition balance. The model, characterized by high sparseness & low firing rates, captures diverse neural dynamics such as synchronized regular and asynchronous irregular activity and global oscillations. Here is a brief summary of these concepts & a #PythonTutorial using the #NESTsimulator.

🌍 fabriziomusacchio.com/blog/202
#CompNeuro #Neuroscience
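For a flavour of what such a simulation looks like in code, here is a minimal sketch assuming NEST 3; the population sizes, rates and weights below are illustrative and deliberately scaled down, not the values from the blog post or from Brunel (2000):

```python
import nest

nest.ResetKernel()

# Illustrative, scaled-down parameters in the spirit of the Brunel (2000) network
N_E, N_I = 800, 200                   # excitatory / inhibitory population sizes
C_E, C_I = 80, 20                     # fixed in-degrees (10% connectivity)
J, g, delay = 0.1, 5.0, 1.5           # EPSP amplitude (mV), relative inhibition, delay (ms)

exc = nest.Create("iaf_psc_delta", N_E)
inh = nest.Create("iaf_psc_delta", N_I)
noise = nest.Create("poisson_generator", params={"rate": 20000.0})
rec = nest.Create("spike_recorder")

# Sparse random connectivity with fixed in-degree, as in the Brunel model
nest.Connect(exc, exc + inh, {"rule": "fixed_indegree", "indegree": C_E},
             {"weight": J, "delay": delay})
nest.Connect(inh, exc + inh, {"rule": "fixed_indegree", "indegree": C_I},
             {"weight": -g * J, "delay": delay})
nest.Connect(noise, exc + inh, syn_spec={"weight": J, "delay": delay})
nest.Connect(exc[:50], rec)           # record a subset of excitatory neurons

nest.Simulate(1000.0)                 # simulate for 1000 ms
print(nest.GetStatus(rec, "events")[0])
```

Depending on the excitation-inhibition balance (g) and the external drive, such a network settles into the different regimes (synchronous regular, asynchronous irregular, global oscillations) described in the linked post.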

SPIKING NEURAL NETWORKS!

If you love them, join us at SNUFA24. Free, online workshop, Nov 5-6 (2-6pm CET). Usually ~700 participants.

Invited speakers: Chiara Bartolozzi, David Kappel, Anna Levina, Christian Machens

Posters + 8 contributed talks selected by participant vote.

Abstract submission is quick and easy (300 words max), and is now open until the deadline of Sept 27.

Registration is free, but mandatory.

Hope to see you there!

snufa.net/2024/

Due to its computational efficiency and biological plausibility, the #IzhikevichModel is an exceptional tool for understanding #neuronal interactions within #SpikingNeuralNetworks (#SNN). Here’s a quick #Python implementation of Izhikevich's original #Matlab code along with examples using different synaptic weights and neuron types, each leading to diverse spiking behaviors and network dynamics:

🌍fabriziomusacchio.com/posts/iz
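For reference, the core of the model fits in a few lines; this is a minimal single-neuron sketch with regular-spiking parameters and an illustrative step current, not the blog post's full network code:

```python
import numpy as np
import matplotlib.pyplot as plt

# Regular-spiking parameters from Izhikevich (2003); other cell types only change a, b, c, d
a, b, c, d = 0.02, 0.2, -65.0, 8.0
dt, T = 0.25, 1000.0                  # time step and duration (ms)
steps = int(T / dt)

v, u = -65.0, b * -65.0               # membrane potential and recovery variable
v_trace = np.empty(steps)

for t in range(steps):
    I = 10.0 if t * dt > 100.0 else 0.0          # step current switched on after 100 ms
    v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                                # spike: clip the trace, then reset
        v_trace[t] = 30.0
        v, u = c, u + d
    else:
        v_trace[t] = v

plt.plot(np.arange(steps) * dt, v_trace)
plt.xlabel("time (ms)")
plt.ylabel("v (mV)")
plt.show()
```

Swapping in other (a, b, c, d) sets gives the other neuron types (fast spiking, bursting, chattering, ...) that lead to the diverse spiking behaviours discussed in the post.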

Dear colleagues,

It's a pleasure to share with you this fully-funded #PhD position in #computational neuroscience in interaction with #neuromorphic engineering and #neuroscience:

laurentperrinet.github.io/post

TL;DR: This PhD subject focuses on the association between #attention and #SpikingNeuralNetworks for defining new, efficient AI models for embedded systems such as drones, robots and, more generally, autonomous systems. The thesis will take place between the LEAT research lab in Sophia-Antipolis and the INT institute in Marseille, which both develop complementary approaches to bio-inspired AI, from neuroscience to embedded systems design.

The application should include:
• a curriculum vitæ,
• a motivation letter,
• a letter of recommendation from the Master's supervisor,

and should be sent to Benoit Miramond (benoit.miramond@unice.fr), Laurent Perrinet (Laurent.Perrinet@univ-amu.fr), and Laurent Rodriguez (laurent.rodriguez@univ-cotedazur.fr).

Cheers,
Laurent

PS: related references:

  • Emmanuel Daucé, Pierre Albigès, Laurent U Perrinet (2020). A dual foveal-peripheral visual processing model implements efficient saccade selection. Journal of Vision. doi: doi.org/10.1167/jov.20.8.22

  • Jean-Nicolas Jérémie, Emmanuel Daucé, Laurent U Perrinet (2024). Retinotopic Mapping Enhances the Robustness of Convolutional Neural Networks. arXiv: arxiv.org/abs/2402.15480

Just in time for your weekend, we released Brian 2.6, the new version of your friendly spiking network simulator. 🚀
It comes with many small improvements, bug and compatibility fixes, and offers a major new feature for running standalone simulations repeatedly (or in parallel) without recompiling code. It also brings general infrastructure improvements all around (official wheels for Python 3.12! Docker images on Docker Hub! Apple Silicon builds/tests!).
Enjoy (and let us know if you run into any issues, of course…) 🥳

briansimulator.org/posts/2024/
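For context, standalone mode itself looks like the sketch below (minimal and illustrative; the new repeated-run / parallel-run workflow builds on this mode, and its exact API is described in the release notes rather than guessed at here):

```python
from brian2 import *

# C++ standalone mode: Brian generates, compiles and runs a complete C++ project
# for the whole simulation instead of driving it step by step from Python.
set_device('cpp_standalone', directory='standalone_demo')

tau = 10*ms
G = NeuronGroup(100, 'dv/dt = (1.1 - v) / tau : 1',
                threshold='v > 1', reset='v = 0', method='exact')
G.v = 'rand()'
M = SpikeMonitor(G)

run(100*ms)                           # triggers code generation, compilation and execution
print(f'{M.num_spikes} spikes in total')
```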

Special Session on #SpikingNeuralNetworks and #Neuromorphic Computing at the 33rd International Conference on Artificial Neural Networks (ICANN) 2024 - Call for Papers

Sep 17 - 20, Lugano, Switzerland

The special session invites contributions on recent advances in spiking neural networks. Spiking neural networks have recently gained substantial attention as a candidate substrate for low-latency, low-power AI, with implementations being explored in neuromorphic hardware. This special session aims to bring together practitioners interested in efficient learning algorithms, data representations, and applications.

ORGANIZERS:

  • Sander Bohté (CWI Amsterdam, Netherlands)
  • Sebastian Otte (University of Lübeck, Germany)

Find more details at: e-nns.org/wp-content/uploads/2

#icann #enns #ai

I'm on the latest episode of Brain Inspired talking about #Neuroscience, #SpikingNeuralNetworks, #MachineLearning and #Metascience! Thanks Paul Middlebrooks (not on Mastodon I think) for the invite and the extremely fun conversation. For the explanation of why this picture you'll have to listen to the episode. 😉

braininspired.co/podcast/183/

Also, if you're not yet listening to Brain Inspired you should be - and support Paul on Patreon. He provides this free for the community with no adverts. What a hero!

We are finally on Mastodon, time for a little #introduction 👋!

Brian is a #FOSS simulator for biological #SpikingNeuralNetworks, for research in #ComputationalNeuroscience and beyond. It makes it easy to go from a high-level model description in Python, based on mathematical equations and physical units, to a simulation running efficiently on the CPU or GPU.

We have a friendly community and extensive documentation, links to everything on our homepage: briansimulator.org
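To give a concrete idea of the equations-and-units workflow (a minimal sketch with made-up parameter values; see the documentation linked above for proper tutorials):

```python
from brian2 import *

# The model is defined by differential equations with physical units; Brian checks
# dimensional consistency and generates efficient simulation code from the description.
tau = 10*ms
El = -70*mV
eqs = '''
dv/dt = (El - v + I_drive) / tau : volt
I_drive : volt (constant)
'''

G = NeuronGroup(5, eqs, threshold='v > -50*mV', reset='v = El', method='exact')
G.v = El
G.I_drive = '(10 + 5*i)*mV'           # a different constant drive per neuron (i = neuron index)
spikes = SpikeMonitor(G)

run(200*ms)
print(spikes.count[:])                # spike count per neuron
```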

This account will mostly announce news (releases, other notable events), but we're also looking forward to discussing with y'all 💬