Technical Info on USB MIDI - 7 Bit and 14 Bit

I studied analog and digital communications in college, and I’ve done a little research on USB and MIDI I/O. I wrote up a blog post (since it is too long for a forum) that explains the differences between 5-pin MIDI and USB MIDI, as well as the difference between 7-bit and 14-bit MIDI.

This should answer most of your questions about what happens between hitting a button on your USB MIDI controller and Traktor receiving a MIDI message. Obviously the mapper maps the message to the software function.

I wrote this up to hopefully help someone who is converting to the digital world, or who just wants to know what’s happening with his or her devices. Feel free to comment in the forum or in the comments of the blog. I may have left some things out. See the following URL:

Edit: This blog post does not get into the nitty gritty of tech-details, and it leaves out some crucial information on processing, A/D conversion, software, and hardware. It should still help you understand the basic differences though.

Good read!

Your blog keeps getting better :slight_smile: good job on this article :wink:

Dude, hardcore props for doing this. You have my appreciation :slight_smile:

Great article - I’m sure it’s going to help a lot of people.

I think your statement about 14-bit MIDI and zero latency is a bit off the mark though.

Zero latency is absolutely not possible. While the actual time for 14 bits of data to be transferred at 10 Mb/s is 1.4 µs, both computer operating systems and USB are far from real-time. When it comes to MIDI data, latency on the order of milliseconds is still the norm, regardless of whether it is 7-bit or 14-bit.
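A quick back-of-the-envelope sketch of the figure above: the raw time to clock 14 bits onto a wire at the assumed 10 Mb/s rate, ignoring any USB framing or protocol overhead. The point is how tiny this is next to millisecond-scale OS and USB scheduling delays.

```python
# Raw wire time for n bits at a given bit rate.
# Assumes a flat 10 Mb/s link with no framing overhead (as in the post above).

BIT_RATE = 10_000_000  # bits per second (assumed)

def wire_time_us(n_bits: int, bit_rate: int = BIT_RATE) -> float:
    """Time to clock n_bits onto the wire, in microseconds."""
    return n_bits / bit_rate * 1e6

print(f"{wire_time_us(14):.1f} us on the wire")  # vs. milliseconds of OS/USB latency
```

Even if the real link overhead doubled or tripled this, it would still be three orders of magnitude below the millisecond-scale latency the operating system adds.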

Interesting article

So who is responsible for doing the AD conversion of a fader or a knob? The MIDI controller or the MIDI interface? Who is truncating the data?

I’m going to assume you mean that all MIDI is sent as 14 bits and gets truncated down to 7? AD conversion happens on the controller. The data isn’t truncated, because most interfaces use 7-bit AD converters for the standard MIDI definitions. Usually it’s only the pitch bend controls on keyboards that run in 14-bit, with some exceptions.
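For reference, pitch bend is 14-bit in the standard MIDI spec because its single message carries two 7-bit data bytes. A minimal sketch of how a 14-bit value is packed into the three-byte pitch-bend message (the status byte is 0xE0 plus the channel; data bytes must stay below 0x80):

```python
# Pack a 14-bit pitch-bend value (0..16383) into a standard 3-byte
# MIDI pitch-bend message: status, LSB (low 7 bits), MSB (high 7 bits).

def encode_pitch_bend(value: int, channel: int = 0) -> bytes:
    assert 0 <= value <= 0x3FFF, "pitch bend is a 14-bit value"
    lsb = value & 0x7F          # low 7 bits
    msb = (value >> 7) & 0x7F   # high 7 bits
    return bytes([0xE0 | (channel & 0x0F), lsb, msb])

# Centre position (no bend) is 8192 = 0x2000:
print(encode_pitch_bend(8192).hex())  # e00040
```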

I wasn’t saying the data was 14 bits, but that it was converted to 7 bits, lowering the resolution. Yeah, my English is strange… sorry, I’m Spanish.

You think the 7 bits come from the AD conversion stage?

I work with Arduinos, which are really cheap circuits with AD/DA conversion for controlling electronics and such, and they work at 10-bit resolution. I’d think the VCI, not being cheap, would have the bottleneck somewhere in the firmware rather than in a hardware limitation.

If what you say is true, then the supposed 1.4 firmware will never be able to upgrade the knobs/faders to 14 bits, because it would obviously be a hardware limitation.

It would be nice to know where exactly the 7 bits come from.

It would be really interesting to take a look at firmware code. But I really don’t know where to start to get that.

I don’t know what the ADC’s resolution is, but it’s probably 8 or 10 bits on the VCI, which is less than 14 bits and so only compatible with the 7-bit MIDI standard. From what I understand, the bits get truncated in the firmware even at higher resolutions, because the MIDI standard separates the signal into command and data bytes at 7 bits of resolution, and unless it’s two sets of 7-bit data combined to make 14 bits, the extra resolution is no good.
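The "two sets of 7-bit data" mechanism mentioned above is how standard MIDI sends a 14-bit control change: the coarse (MSB) value goes on a controller number 0–31 and the fine (LSB) value on the matching number 32 higher. A sketch, assuming channel 0 and a generic CC number:

```python
# Split a 14-bit value into the standard MIDI MSB/LSB control-change pair:
# CC n (0-31) carries the high 7 bits, CC n+32 carries the low 7 bits.

def encode_cc14(cc: int, value: int, channel: int = 0) -> list[bytes]:
    assert 0 <= cc <= 31, "14-bit CCs use controller numbers 0-31 for the MSB"
    assert 0 <= value <= 0x3FFF
    status = 0xB0 | (channel & 0x0F)   # control-change status byte
    msb = (value >> 7) & 0x7F
    lsb = value & 0x7F
    return [bytes([status, cc, msb]), bytes([status, cc + 32, lsb])]
```

A receiver that only understands 7-bit CCs can simply ignore the LSB message, which is why the pairing scheme stays backward compatible.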

Any ADC resolution is compatible with 14-bit MIDI. I would bet that nearly all MIDI controllers with high-resolution (14-bit message) faders etc. are sampling at a lower resolution, most probably 10 bits, which is the most common resolution on mid-range microcontrollers. That is still 8 times the resolution of 7-bit, giving better than 0.1 % control on pitch, which is the analog control with the highest resolution requirement.
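To illustrate the point that any ADC resolution can feed either message format: for 7-bit you drop the low bits of the sample, and for 14-bit you left-shift (pad) it into the wider range. The padding adds no real resolution, which is why a 10-bit ADC behind a 14-bit message is still only 10-bit accurate. A sketch, assuming a 10-bit ADC:

```python
# Map a 10-bit ADC sample (0..1023) onto 7-bit and 14-bit MIDI ranges.
# Assumes ADC_BITS = 10, as on typical mid-range microcontrollers.

ADC_BITS = 10

def to_7bit(sample: int) -> int:
    return sample >> (ADC_BITS - 7)    # keep only the top 7 bits

def to_14bit(sample: int) -> int:
    return sample << (14 - ADC_BITS)   # pad up; true resolution stays 10-bit

print(to_7bit(1023), to_14bit(1023))  # 127 16368
```

Note that a full-scale 10-bit sample maps to 16368, not 16383: the bottom 4 bits of the 14-bit range are simply never exercised.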

Also, a 14-bit MIDI message adds roughly 1 ms more latency than a 7-bit MIDI message. This is because one 7-bit MIDI message takes about 1 ms to send, and a 14-bit value is sent as two 7-bit messages, so that’s where the extra 1 ms comes from.
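The "roughly 1 ms per message" figure can be checked from the classic 5-pin MIDI wire format: 31250 baud, with each byte framed as 10 bits on the wire (start bit + 8 data bits + stop bit), and 3 bytes per message:

```python
# Transmission time for an n-byte message on classic 31250-baud DIN MIDI.
# Each byte is framed as 10 bits on the wire: start + 8 data + stop.

BAUD = 31250
BITS_PER_BYTE = 10

def message_time_ms(n_bytes: int = 3) -> float:
    return n_bytes * BITS_PER_BYTE / BAUD * 1000

print(message_time_ms())   # one 3-byte message: about 0.96 ms
print(message_time_ms(6))  # a 14-bit MSB+LSB pair: about 1.92 ms
```

So a 14-bit MSB/LSB pair takes about 1.92 ms versus 0.96 ms for a single message, matching the "roughly 1 ms extra" estimate (USB MIDI transfers are faster, but the millisecond-scale scheduling latency discussed above still dominates there).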

I’m glad you added this, midifidler. I was only describing the difference between USB and MIDI cables as far as protocols and wiring are concerned. Overall system speed and processing is ENTIRELY separate, as it depends on the software and hardware being used. So yes, my numbers only estimate the time it takes to go from USB output to USB input, nothing else. It would be too complex to calculate for every setup; sorry for any confusion.

As far as A/D conversion and MIDI bit-lengths are concerned, this goes beyond the scope of my blog post, since I will admit that I don’t know enough about the subject to post. You COULD have an analog input that sends an analog signal to a separate A/D converter… but many modern devices simply sample your analog input on the control itself, much like your camcorder samples real life at 30 frames per second.

Your analog control input can be converted into a digital output by an encoder; I’ve seen them with up to around 20 bits or so. The encoder sends the data to the MIDI microchip processor, which must be capable of handling at least 14-bit messages. The messages are queued alongside the other controls’ MIDI messages and sent down the USB line. The time it takes for the actual USB interface to send and receive data is very short. The time for the data to be separated and processed is another story.

BTW thanks for all the reads and comments guys.

It’s always hard when writing an article like this to walk the line between a simplified technical explanation and too much information!

Well done!