MIDI Export vs MusicXML (MIDI) Export

• May 5, 2017 - 20:14

Hello. I have a general question regarding MusicXML exports and how the MIDI data in these exports is handled by MuseScore 2 (at least, I think that's what I'm asking!). The specific scenario I have in mind involves exporting from MuseScore for import into a DAW (in my case Logic Pro X, though I suppose this would apply to most DAWs):

If I create a score in MuseScore that contains note articulations, those articulations are nicely represented in MuseScore's MIDI playback - i.e. staccato notes are shortened, accented notes have higher velocity, and so on.

Now: if I export this score as a MIDI file and bring it into Logic, the imported MIDI file retains the articulations (nice!) but the score's articulation symbols have mostly been lost (not so nice but hey it's MIDI data, not notation, so we understand).

Conversely, if I export from MuseScore as MusicXML, the imported score retains most of its articulation symbols (MusicXML doing what it does best) but the MIDI data has now dropped all articulations and plays back in a metronomic fashion - literally as written in the score. This *sort of* makes sense ('makes sense' because MusicXML is focused on notation... but 'sort of' because it does, after all, have built-in support for MIDI... and tempo changes, for example, remain intact).

My question: is this due to (a) a limitation in the way MusicXML handles MIDI data; (b) a limitation in the way MuseScore converts its own MIDI data to MusicXML; or (c) perhaps a limitation in the way the DAW is interpreting the imported XML?

I'm not really expecting an answer to (c), as that would be a question for Apple. But I would like to better understand where in the export process MuseScore's original MIDI data gets stripped away.
I've done a little bit of poking around in the MusicXML API, enough to understand that notation elements and 'sound' elements are kept separate by design: note elements, for example, appear to be focused on specifying the score's visual appearance, not on MIDI playback. But the API also defines a sound element that can have attributes like 'dynamics' (which translates to MIDI velocity if I understand correctly). Finally, I've had a peek or two at xmlwriter.cpp but I'm not seeing anything there specific to either MIDI or the writing of 'sound' tags.
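
To illustrate what I mean (this is a hand-written fragment for illustration, not an actual MuseScore export): the note element describes the notation, including a staccato articulation, while a playback hint such as dynamics would sit in a separate sound element inside a direction:

    <direction placement="below">
      <direction-type>
        <dynamics><p/></dynamics>
      </direction-type>
      <!-- playback hint: dynamics as a percentage of the default forte velocity -->
      <sound dynamics="54"/>
    </direction>
    <note>
      <pitch><step>C</step><octave>5</octave></pitch>
      <duration>1</duration>
      <type>quarter</type>
      <notations>
        <articulations><staccato/></articulations>
      </notations>
    </note>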

Could it be that MusicXML is generating new MIDI data based on a literal interpretation of the notation, while ignoring MuseScore's playback MIDI data (and if so, why would articulation symbols be ignored in the generation of that new data)? Or is it that MuseScore's export process is doing something along the same lines, generating new MIDI data based on its score specification rather than using its original internal playback data?

Any thoughts would be greatly appreciated... Many thanks in advance!

btw: Wonderful work, folks...


Comments

MusicXML has nothing to do with MIDI. A MusicXML file contains no MIDI data whatsoever. Nor does a MuseScore file contain MIDI data. So I don't understand your basic statement about exporting a MusicXML file and finding the MIDI data has lost its articulations - again, a MusicXML file has no MIDI data in the first place. What is it you are really asking here? Are you saying some third-party program reads that MusicXML file and displays the articulations but doesn't play them? If so, that has nothing to do with MIDI data within the MusicXML file, because there is no such data. That's just a limitation of the program you are using to import that MusicXML file - apparently it knows how to display articulations but not play them back.

If that's not what you mean, maybe you could try explaining again, with more specific examples.

In reply to by Marc Sabatella

Right, well it sounds like that answers my question and the issue may indeed be on the DAW's import end, or perhaps simply down to my misunderstanding of how all the parts fit together.

I should mention: I'm new to MuseScore and I've not dealt with MusicXML before (I'm coming to this from the DAW perspective: Logic just recently added MusicXML import to its toolkit and I'm simply exploring potential workflows using it). And: yes, I'm aware MuseScore is a notation application, and that playback is therefore a secondary concern.

As to my basic statement about MIDI data being exported as part of the xml file: I was basing that on two (probably faulty) assumptions: (a) since MuseScore is able to trigger playback through its internal synths, and is also able to export MIDI data, there must be MIDI data stored or generated somewhere behind the scenes; and (b) that the exported MusicXML file contained a description of the MIDI data along with the score data (I never suspected that it held MIDI data itself), which, when imported to Logic, produced a MIDI region as well as a score view. So I was wondering if MuseScore, when it performs the export, was somehow specifying what that xml description of the MIDI data should be, based on its own internal data.

It sounds like that's all wrong, and that the DAW is simply creating the MIDI data on import, based on whatever is being described in the xml score data.

Finally, as to the difference in the resulting MIDI files (so no one thinks I'm just making this up), see the following screen grabs of a simple test score:

[screenshot: test-score-grab.png]

Exporting this as a MIDI file produces the following in Logic:

[screenshot: midi-import.png]

The score, of course, looks pretty bad - the notation is poor and most of the symbols have been dropped - but we expect this because it's a MIDI file import. The MIDI data itself, on the other hand, honors most of MuseScore's articulations: the piano-roll view shows the shortened staccato notes and changing dynamics (as MIDI velocity, denoted by the various note colors) and the event list on the left shows those changing velocities (2nd column from the right). Playback is quite faithful to MuseScore's internal playback, except now I can use my own selection of virtual instruments. All of this is as one would expect and I should think it's a fair guess to say that MuseScore is producing this MIDI data, on export, based on whatever is in the .mscz file.
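
Incidentally, this can be checked without Logic at all. A quick sketch using the third-party Python library mido (the file name is just whatever you exported under) prints each note's velocity and length straight from the exported .mid, confirming the shortened staccato notes and the changing velocities:

    import mido

    # Print velocity and length (in ticks) for every note in the exported file,
    # to confirm that staccato notes are shorter and dynamics change the velocity.
    mid = mido.MidiFile('test-score.mid')   # assumed file name
    for i, track in enumerate(mid.tracks):
        tick = 0
        started = {}                        # pitch -> (start tick, velocity)
        for msg in track:
            tick += msg.time                # delta times accumulate to absolute ticks
            if msg.type == 'note_on' and msg.velocity > 0:
                started[msg.note] = (tick, msg.velocity)
            elif msg.type in ('note_off', 'note_on') and msg.note in started:
                start, velocity = started.pop(msg.note)
                print(f'track {i}: pitch {msg.note} '
                      f'velocity {velocity} length {tick - start} ticks')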

Exporting it as MusicXML, however, produces a very different MIDI result:

[screenshot: musicxml-grab.png]

Here, the score is not bad (though not nearly as nice as the MuseScore original) and you can see some meta events (here representing a few of the score symbols) in the event list on the left. But the MIDI region contains no articulations (except for the two accented notes in bar 2) and no dynamics.

So my entire question was based on: why is this so, and at which stage in the process (the MuseScore export end, the MusicXML file format end, or the DAW import end) is this difference in MIDI data being produced?

From your reply it sounds like MuseScore simply generates, on export, either MIDI (based on its internal playback) or xml (based on its score definition), and there's no way to capture both at the same time. I was also mistaken, I take it, in thinking that additional MIDI note data (velocity, altered durations, etc.) could be described in the xml file using MusicXML's 'sound' element tags. Forgive my naivety here - as I say, some of this stuff is new territory for me.

What I'm looking for, btw, is the best of both worlds: composing raw materials in MuseScore, then importing to Logic - for further development and sound production - while retaining both the score data and the 'real' MIDI data (and perhaps being able to shuttle things between MuseScore and Logic as well). And yeah, I realize this may not be the exact use case MuseScore is intended for, but I do think there could be some useful scenarios if this were possible. Logic has some wonderful MIDI-manipulating capabilities stemming from the old Emagic days, and MuseScore strikes me as a fabulous venue, not just for beautiful notation, but also for the development of interesting tools for composing.

I hope that helps clarify things. Thanks for the reply, Marc. Any further comments you care to add to the above would be most appreciated!

In reply to by bonedog

If a score is exported as a MIDI file, all the dynamics and articulations are baked into the notes themselves and saved that way.
For example: when you put a "p" dynamic on a note, its effect is already captured in the velocity of the corresponding MIDI notes; no separate "p" marking is required.
----
Maybe in the future (I'm just thinking aloud), these marks could be carried as "meta-tags" in the MIDI file, using text meta events only.
But then another challenge arises: even if the exporting software can write this, can the importing software read it? (There is no standard for it yet.)
For crescendo and diminuendo only, CC #11 (expression) messages could be written and read, while CC #7 (volume) remains constant. But I don't know of any software that does this.
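
Just to make the idea concrete, here is a rough sketch of what writing such a crescendo as CC #11 data could look like (my own illustration using the third-party Python library mido; the note and the ramp values are made up), with CC #7 simply left alone:

    import mido

    mid = mido.MidiFile(ticks_per_beat=480)
    track = mido.MidiTrack()
    mid.tracks.append(track)

    # Hold one note while expression (CC #11) ramps up over two beats;
    # channel volume (CC #7) is never touched.
    track.append(mido.Message('note_on', note=60, velocity=80, time=0))
    for step in range(8):
        # Eight CC #11 steps, 120 ticks apart, rising from 40 to 110.
        track.append(mido.Message('control_change', control=11,
                                  value=40 + step * 10, time=120))
    track.append(mido.Message('note_off', note=60, velocity=0, time=0))

    mid.save('crescendo_sketch.mid')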

Note: I'm not on the MuseScore development team; I'm just a regular user, and what I'm describing here is my own experience.

In reply to by bonedog

What MuseScore has internally is a mostly "semantic" representation of the score - not unlike MusicXML. So it keeps track of things like "staccato quarter note on beat 2" combined with "most recent dynamic we saw was mf", not "note on message starting at tick 480 at velocity 96 and lasting for 120 ticks". Well, it's actually somewhere between those extremes, but closer to the first. When it is time to play your score, we generate the necessary MIDI info, but we don't save that or export it to other formats (other than MIDI of course). I don't know that MusicXML would have a way to record it even if we wanted to. So the MusicXML file just has the info more like "staccato quarter note on beat 2".
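
To make that distinction concrete, here is a toy sketch of the kind of translation that happens at playback time (this is not actual MuseScore code; the velocity table and the staccato shortening factor are just illustrative guesses):

    TICKS_PER_QUARTER = 480
    DYNAMIC_VELOCITY = {'p': 49, 'mf': 80, 'f': 96}    # assumed mapping

    def render_note(pitch, beats, dynamic, staccato=False, accent=False):
        """Turn 'staccato quarter note at mf' into (pitch, velocity, ticks)."""
        velocity = DYNAMIC_VELOCITY[dynamic]
        if accent:
            velocity = min(127, velocity + 20)         # accent: play louder
        ticks = int(beats * TICKS_PER_QUARTER)
        if staccato:
            ticks = ticks // 2                         # staccato: play shorter
        return pitch, velocity, ticks

    # "staccato quarter note, most recent dynamic was mf"
    print(render_note(pitch=60, beats=1.0, dynamic='mf', staccato=True))
    # -> (60, 80, 240): concrete note data derived on the fly for playback,
    #    never stored in the .mscz file or written to the MusicXML.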

So yes, I think it is safe to say Logic simply doesn't have any concept of how to interpret the articulations for playback. It's possible MusicXML has some way for the exporting program to provide playback hints - I'm not sure. If so, then in theory we could include some of that in our MusicXML export. But I have no idea if that's actually a thing or not. Really, since Logic is the program more concerned with playback, it makes much more sense for *them* to simply do a better job of interpreting the symbols already present in the MusicXML.

In reply to by Marc Sabatella

Very helpful, thank you both.

Agree this sounds more and more like a limitation on Logic's end.

As it is, some of Logic's internal score symbols have 'special' meanings which trigger meta events that affect playback (as in my screen shot above). But some - in fact, most - are simply ignored insofar as playback is concerned. I suspect on import they're simply mapping xml symbols to internal symbols, with the result that a few affect playback (like those accented notes, also above) but most don't. And that's why I was thinking that the ability to encode articulations in xml as additional playback data had to be the key.

As for MusicXML even supporting playback articulations (and the possibility of MuseScore recording this data on export), I have to say I find the MusicXML docs rather vague. Regarding the 'sound' element, for example, we're told (emphasis is mine):

"The sound element contains general playback parameters. They can stand alone within a part/measure, or be a component element within a direction. Tempo is expressed in quarter notes per minute. If 0, the sound-generating program should prompt the user at the time of compiling a sound (MIDI) file. Dynamics (or MIDI velocity) are expressed as a percentage of the default forte value (90 for MIDI 1.0)."

...but very little else, and there are no examples showing usage. I've tried manually adding some of these tags to an xml file but so far I've had no luck in getting them to do anything at all.
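
For the record, the kind of hand-edit I was attempting was something along these lines (values made up; per the quoted docs, dynamics is a percentage of the default forte velocity of 90, so 106.67 should come out to roughly velocity 96):

    <measure number="1">
      <!-- standalone sound element: tempo in quarter notes per minute -->
      <sound tempo="120"/>
      <direction placement="below">
        <direction-type>
          <dynamics><f/></dynamics>
        </direction-type>
        <!-- sound as a component of a direction: 106.67% of forte (90) -->
        <sound dynamics="106.67"/>
      </direction>
      <note>
        <pitch><step>G</step><octave>4</octave></pitch>
        <duration>1</duration>
        <type>quarter</type>
      </note>
    </measure>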

In any case, thanks again - this at least points me in the right direction.

Marc, one last off-topic question: do you guys have a rough timeline for the release of 3.0, and will the plugin API be enhanced/expanded as part of that release?

In reply to by bonedog

We do have some people on the forums here who are more knowledgeable about MusicXML; hopefully someone will step in.

Regarding your last question: no timetable on 3.0, and no idea what the plans are for the plugin framework. But I can say based on the current state of development, 3.0 is still quite a ways off.

In reply to by [DELETED] 5

Thanks, Nicolas. Yeah, I figured MuseScore currently does not export that type of playback data to MusicXML, though I suppose part of my original question was wondering if it potentially could.

But I'm now also wondering how completely Logic honors - or doesn't - the MusicXML API on import. Those 'sound' elements, for example, which I mention above (and which seemingly affect playback in some cases - e.g. tempo - but not in others - e.g. dynamics): is that an incomplete MusicXML implementation thing or an incomplete Logic import thing? I begin to suspect the latter.

In reply to by bonedog

What we would really need is a script that would alter the MIDI notes in Logic according to the graphically correct score view that results from the MusicXML import. There's a "MIDI Meaning" setting of some sort that could probably transform the MIDI information to more closely resemble an actual player performing the accents and staccato marks, etc. But it seems to be a work in progress: Logic is adding articulations to its stock instruments, but they aren't utilised, not even when importing MusicXML. Hopefully this gets better in the near future.
