
On 13–14 February 2025, I organised a Hackathon that explored various methodological avenues for my project’s data acquisition.

Here’s a short thread summarising what we dealt with and achieved during these days.

CCing everyone interested in these and related topics.

Let’s jump right into it!

[Note: See alt text for more info about the pictures attached to this thread]

🧵1/20

The Hackathon brought together 10 scientists of different backgrounds – from linguists to music scholars and computer scientists.

To warm up intellectually,* we began with a general discussion of the phenomenon of Romani chords.

–––
*
Do you want to warm up with us to make this thread easier to digest? Watch the bit of the project’s video abstract from 7:17–7:58 and then try to answer the question: “Why does the dimenzovaný chord matter?”

=>

youtube.com/watch?v=aEXkI2br3c

🧵2/20

The discussion helped to identify the biggest challenges for the Hackathon:

On the one hand, the project aims to collect data on an ambitious scale, which requires a pragmatic data-driven approach (a “musical laboratory”). But on the other, the phenomenon of Romani chords is embedded in peculiar layers of sociocultural reality, the uncovering of which requires a sensitive ethnographic approach.

This dilemma became the central theme for our methodological thoughts.

🧵3/20

Then, we discussed and tried out various contemporary music information retrieval methods, with a special focus on accompanying harmonies.

The first method to come up was “Automatic chord recognition” (ACR), a method that scientists have worked on for over 25 years.

A nice summary of this ongoing research (with the most significant challenges identified) can be found in this 5-year-old article => archives.ismir.net/ismir2019/p

🧵4/20

Nevertheless, we found the use of ACR for the project to be limited.

ACR is a classification task that analyses audio signals to compare and categorise harmonic content into *predefined* chord labels.

While this method estimates common progressions in functional harmony very well, it doesn’t quite know how to handle less common chords and voicings (such as the “dimenzovaný” chord mentioned above), as they are not usually found among the *predefined* labels.
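
(For readers who want to see what “*predefined* chord labels” means in practice, here is a minimal, hedged sketch of template-based chord labelling over chroma features – an illustration of the general ACR idea, not the project’s pipeline. librosa and the file name “example.wav” are assumptions.)

```python
# Minimal chord-template matching sketch (illustration only, not the project's pipeline).
# It shows why ACR is tied to *predefined* labels: anything outside the label set
# (e.g. an unusual voicing) gets forced onto the nearest predefined chord.
# Assumes librosa is installed and "example.wav" is a local recording.
import numpy as np
import librosa

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def triad_template(root, quality):
    """Binary chroma template for a major or minor triad."""
    intervals = [0, 4, 7] if quality == "maj" else [0, 3, 7]
    t = np.zeros(12)
    t[[(root + i) % 12 for i in intervals]] = 1.0
    return t / np.linalg.norm(t)

# The *predefined* label set: only 24 major/minor triads.
labels, templates = [], []
for root in range(12):
    for quality in ("maj", "min"):
        labels.append(f"{NOTE_NAMES[root]}:{quality}")
        templates.append(triad_template(root, quality))
templates = np.stack(templates)                      # shape (24, 12)

y, sr = librosa.load("example.wav")                  # hypothetical input file
chroma = librosa.feature.chroma_cqt(y=y, sr=sr)      # shape (12, n_frames)
chroma = chroma / (np.linalg.norm(chroma, axis=0, keepdims=True) + 1e-9)

# Cosine similarity between each frame and each template; pick the best label per frame.
scores = templates @ chroma                          # shape (24, n_frames)
frame_labels = [labels[i] for i in scores.argmax(axis=0)]
print(frame_labels[:20])
```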

🧵5/20

Another method we considered was automatic music transcription (note extraction).

This method focuses on identifying and extracting individual notes from musical textures by analysing frequency components, onset times, and timbral features using techniques such as spectral decomposition, pitch tracking, and machine learning-based transcription methods.
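
(As a toy illustration of two of these building blocks – onset detection and monophonic pitch tracking – here is a hedged librosa sketch; the input file is hypothetical, and real transcription of polyphonic accompaniment is far harder than this.)

```python
# Toy sketch of onset detection plus monophonic pitch tracking with librosa.
# Real note-level transcription of accompaniment is polyphonic and much harder;
# this only illustrates the building blocks. "example.wav" is a hypothetical file.
import numpy as np
import librosa

y, sr = librosa.load("example.wav")

# Onset times: where individual notes (or strums) likely begin.
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")

# Frame-wise fundamental frequency via probabilistic YIN (monophonic only).
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("E2"), fmax=librosa.note_to_hz("E6"), sr=sr
)
times = librosa.times_like(f0, sr=sr)

# Report the estimated pitch right after each detected onset.
for t in onsets:
    i = np.searchsorted(times, t)
    if i < len(f0) and voiced_flag[i] and not np.isnan(f0[i]):
        print(f"{t:6.2f}s  ~{librosa.hz_to_note(f0[i])}")
```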

🧵6/20

However, this method did not produce a satisfactory data output for the project’s purposes either. Even the state-of-the-art polyphonic detection algorithms (such as the one in Celemony’s Melodyne) produce a lot of noise, and cleaning the data to a standard usable for the project would be enormously laborious.

(Although it is possible that, in a few years, progress in machine learning will make this method much more effective and efficient.)

🧵7/20

Then, we moved on to exploring musical instruments with MIDI output.

This option seemed to be a great compromise: [relatively] clean sonic data output while keeping the fragile layers of ethnographic context [relatively] intact.

Since the RomChords project focuses primarily on harmonic accompaniments for guitar and accordion, we worked with a Roland FR-1x V-Accordion and a MIDI-equipped guitar.
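
(Purely for illustration, this is roughly how MIDI events from such instruments could be captured and timestamped for later analysis; the default input port, the one-minute limit, and the output file name are assumptions, and mido with the python-rtmidi backend is assumed to be installed – not necessarily the setup used at the Hackathon.)

```python
# Rough sketch of capturing timestamped MIDI events from an instrument with MIDI output.
# Assumes mido (with the python-rtmidi backend) is installed; the default input port,
# the one-minute limit, and the output file name are illustrative assumptions.
import time
import mido

events = []
with mido.open_input() as port:              # default MIDI input port
    start = time.time()
    for msg in port:                         # blocks until the next message arrives
        if msg.type in ("note_on", "note_off"):
            events.append((time.time() - start, msg.channel, msg.note, msg.velocity, msg.type))
        if time.time() - start > 60:         # stop after roughly one minute
            break

# Persist the raw events for later cleaning and alignment with other data streams.
with open("midi_capture.csv", "w") as f:
    for t, channel, note, velocity, kind in events:
        f.write(f"{t:.4f},{channel},{note},{velocity},{kind}\n")
```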

🧵8/20

Another challenge of our Hackathon was the attempt to track finger movement automatically and synchronise it with the sonic data.

We wanted to test the current state of the art using the Leap Motion Controller 2 for finger tracking.

And since neither the accordion nor the guitar stays still while being played, we also planned to track the movement of the instruments themselves.
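
(To make the synchronisation idea concrete, here is a purely illustrative sketch of aligning a timestamped tracking stream with note onsets from the sonic data; the sampling rate, joint count, and onset times are made up, and a shared clock – or a marked sync event – is assumed.)

```python
# Purely illustrative alignment of a motion-tracking stream with note onsets by timestamp.
# Assumes both streams share a clock (e.g. after marking a clap/sync event at the start);
# the 100 Hz tracking rate, 21 hand joints, and onset times below are made-up values.
import numpy as np

tracking_times = np.arange(0.0, 10.0, 0.01)                    # seconds, ~100 Hz
tracking_frames = np.random.rand(len(tracking_times), 21, 3)   # e.g. 21 joints, xyz
note_onsets = np.array([0.52, 1.10, 1.67, 2.31])               # seconds, from the sonic data

# For each note onset, find the index of the first tracking frame at or after it.
idx = np.clip(np.searchsorted(tracking_times, note_onsets), 0, len(tracking_times) - 1)
for t, i in zip(note_onsets, idx):
    print(f"onset {t:.2f}s -> tracking frame {i} at {tracking_times[i]:.2f}s")
```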

🧵9/20

However, we were not quite successful in these ambitions.

It turns out that the left hand is no longer recognised by the controller once it is on the guitar fingerboard, very likely because the fretboard gets in the way of the camera’s frame.

In other words, we didn’t find a sweet spot where the controller could cover the fretboard while properly recognising the hand and, thus, tracking the movement of all fingers as needed.

🧵10/20

We experienced similar issues with the accordion.

Especially in positions where the fingers were close to each other, or where the thumb was hidden under the palm, the hand slipped out of the camera’s view very quickly.

Also, the estimated finger positions were often inconsistent with the actual ones. The distance between the keyboard keys seemed too small for the camera to determine correctly which key a finger was on.

🧵11/20

We could probably fix this if we:

* Set up several controllers at different angles to overcome their blind spots (although setting this up properly would require a great deal of additional programming work).

* Combined the controller with more accurate tracking tools, such as motion-tracking gloves (although this could quickly make playing very uncomfortable for the musicians).

Any motion-tracking specialists here willing to give us their thoughts?

🧵12/20

Since the finger-movement data were a “nice to have” rather than a “must have,” we decided to give up on this challenge and focus exclusively on the sonic data.

In the evening, we welcomed three Romani artists, thanks to whom we were able to record some firsthand sonic data on accompanying harmonies. The session was accompanied by an insightful conversation with the musicians, allowing us to engage with the sociocultural layers of the phenomenon.

🧵13/20

And, of course, what would a workshop on Romani music be *without* Romani music?

We wrapped up the first evening in the “Za školou” café, where our dear guests prepared a set of Romani songs for us. The concert was open to the public, and the venue was soon filled with a crowd of dancing people.

That brilliant concert transitioned into a fantastic jam session that lasted until the morning hours…

🧵14/20

Our Hackathon continued the next day.

We reconvened to review yesterday’s pilot data, polish it, and get it ready for analysis.

🧵15/20

For the guitar-based data, we mainly needed to tackle the abundant spurious short notes – data noise likely resulting from imperfect onboard processing.

We managed to fix this with the MIDI transform functions in Logic Pro.

The accordion was already giving us usable data, so we only worked on separating the MIDI channels for trebles and basses.

(Kudos to Šimon Libřický, our Logic Pro Expert!)
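
(For those without Logic Pro, a scripted equivalent of the short-note cleanup could look roughly like this; pretty_midi, the file names, and the 50 ms threshold are assumptions rather than the project’s actual settings.)

```python
# Scripted equivalent (not Logic Pro) of the short-note cleanup described above.
# Drops spurious ultra-short notes from a guitar MIDI capture. The file names and
# the 50 ms threshold are assumptions, not the project's actual settings.
import pretty_midi

MIN_DURATION = 0.05                                   # seconds; tune to the noise level

pm = pretty_midi.PrettyMIDI("guitar_take.mid")        # hypothetical capture
for inst in pm.instruments:
    before = len(inst.notes)
    inst.notes = [n for n in inst.notes if (n.end - n.start) >= MIN_DURATION]
    print(f"{inst.name or 'instrument'}: kept {len(inst.notes)} of {before} notes")

pm.write("guitar_take_clean.mid")
```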

🧵16/20

Overall, in about 27 hours, we tackled various research problems and challenges.

Some we managed to solve, others we just named (which is equally important for future work).

And yes, there may be another Hackathon for data analysis as soon as the data is collected (perhaps in late 2027).

If you are interested, just keep an eye on the project’s hashtag or sign up for e-mail updates on the project’s website (=> romanichords.eu/)

🧵17/20

I wish to warmly thank all Hackathon participants for their valuable insights and intellectual input, namely Martin Gális, Jan Hajič, Šimon Libřický, Tomáš Klapka, Jonatán Müller, Adam Pospíšil, German Angel Puerto, Ondřej Skovajsa and Ieva Weaver.

It was a genuine pleasure to work with you all!

🧵18/20

Thanks to the Prague Music Computing Group, whose activities greatly inspired this event (I highly recommend checking out their exciting and ever-growing portfolio => ufal.mff.cuni.cz/pmcg).

Thanks to the civic association Gilora, which helped to organise such a memorable concert and which continues to do essential work in popularising Romani music (=> gilora.eu/).

Also, thanks to the “Za školou” café for allowing us to use its premises and bar to host the concert!

🧵19/20

Last but not least, I wish to thank @EUCommission for funding my project, allowing interdisciplinary research projects (such as this one) to thrive and even to try out some unconventional methods.

🧵20/20