right, stupid question. by now i should know the answer. but until now i have had no need to fiddle with this, just used the default settings that my dj i/o installed with.
however, since getting and setting up a dicer in traktor i have experienced the odd 'glitch' in sound (similar to the apc40 issue with send monitor state) when some of the LEDs go on/off - mainly loading the cue points on mode 1 and sometimes when i change a loop on mode 3.
after a tweak of my system settings to see if i could streamline the paths the signals take, i looked at my audio latency.
on the dj i/o you get a drop down with the number of samples (128, 256, 512 etc...) and the latency in ms next to it. my first thought was to go with the lowest number of samples (40 something), as this would mean less processing and things might work better - no. it got all glitchy at the slightest touch of anything.
so i went the other way, to 512 samples - i could go to 1024, but the high end of the latency range listed was 99ms. my latency now sits around 14ms and things are behaving.
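for what it's worth, the relationship between the samples number and the ms figure is just arithmetic: buffer latency = buffer size / sample rate. a rough sketch (assuming a 44.1 kHz sample rate, which the dj i/o may or may not actually be running at - and note the driver's reported figure usually adds extra overhead on top of the raw buffer time, which would explain 14ms at 512 samples):

```python
# rough sketch: raw buffer latency in ms for a given buffer size
# (assumes 44.1 kHz; real reported latency includes driver/interface overhead)

def buffer_latency_ms(samples, sample_rate=44100):
    return samples / sample_rate * 1000

for n in (128, 256, 512, 1024):
    print(f"{n} samples -> {buffer_latency_ms(n):.1f} ms")
```

so doubling the buffer size doubles the raw buffer latency, but gives the driver twice as long to fill each buffer before the sound card runs dry.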
anyway, long story short: am i right in thinking that more samples means better sound and so more latency? and if so, how come the answer to my glitch seems to be 'make the sound better'? i would've thought i'd need to compromise somewhere?
confused