Get your geek on- linux discussion

131
rayj wrote:
To me, digital audio is digital audio. Last I looked, Pro Tools and Tracktion were the only suites that rendered audio in 64-bit. I'm a little fuzzy on that stat, but I'll be running my system on a 64-bit architecture, which is potentially exciting. Of course, I won't really know whether it matters much until I run a few tests.



Curious - I had a bit of a squiz, and Pro Tools looks about as 64-bit as my trousers. I haven't even been able to get it to render a 32-bit float. As far as addressable memory space is concerned, no idea - maybe it is, maybe it isn't, though I think it very unlikely. My guess is that it invokes a PAE-style hack on the hardware, using something like internal 36-bit or 40-bit memory addressing, to nudge over the 4 GB limit.
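For what it's worth, the address-space arithmetic is easy to sanity-check - plain 32-bit addressing tops out at 4 GiB, PAE-style 36-bit addressing at 64 GiB. Back-of-envelope numbers only, nothing Pro Tools specific:

```python
def addressable_gib(address_bits):
    """GiB reachable with a flat physical address of the given width."""
    return 2 ** address_bits / 2 ** 30

# Plain 32-bit addressing: the familiar 4 GiB ceiling.
print(addressable_gib(32))   # 4.0
# PAE widens the physical address to 36 bits: 64 GiB.
print(addressable_gib(36))   # 64.0
# Some chipsets use 40-bit physical addresses: 1024 GiB.
print(addressable_gib(40))   # 1024.0
```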
The only thing Google picks up about Pro Tools and 64-bit is some crappy expansion card designed for 64-bit PCI slots - so we're talking physical topology and bandwidth, not the new sliced bread. If we're talking the mixing engine... maybe, best-case scenario. This all hinges on what we really mean when we say 64-bit: bandwidth? memory space? precision level? OS compatibility? I think it's misleading when 64-bit is used as a marketing term to denote quality by inference, the way "HD" is slapped on everything from TVs to T-shirts.
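If "precision level" is the meaning on the table, the float32-vs-float64 difference is at least easy to demonstrate. A quick numpy sketch - a naive running sum standing in for a mix-bus accumulator, which is an illustration only, not how any particular DAW actually sums:

```python
import numpy as np

# 100,000 "samples" of 0.1, held as 32-bit floats.
samples = np.full(100_000, 0.1, dtype=np.float32)

# Naive running sum in float32: rounding error accumulates
# as the total grows and each addend loses low bits.
acc32 = np.float32(0.0)
for s in samples:
    acc32 += s

# Same data summed in float64: effectively exact at this scale.
acc64 = samples.astype(np.float64).sum()

print(float(acc32), float(acc64))
# The two totals visibly disagree; that gap is the
# accumulated float32 rounding error.
```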

Have you got the inside line on this RayJ? I'm genuinely curious.


In other news, I am getting so fucking sick of the letters H and D used in close conjunction with each other.


132
skinny honkie wrote:
Have you got the inside line on this RayJ? I'm genuinely curious.


In other news, I am getting so fucking sick of the letters H and D used in close conjunction with each other.


The 64-bit intel I'm running on largely stems from sales propaganda. I'm pretty sure it's solely in the mix engine. I stopped caring about it when I decided to go Linux...not to mention that I don't have an oscilloscope, and the two I have access to aren't calibrated worth a shit (one's tube, for chrissakes)...

I did run across more information on VIA chipsets and JACK, though. I can't recall where, and maybe the issues have been dealt with since, but apparently there is (or used to be) some slop in the way those chipsets handle PCI traffic...

Damn. Wish I remembered where that was. Somewhere in the documentation on preparing 'real-time' kernels and syncing apps...


134
From the Ardour site, under 'system configuration':

Avoid VIA motherboards and chipsets wherever possible. This company has demonstrated an almost complete disregard for reasonable use of the PCI bus. Their hardware has repeatedly been implicated in a failure to achieve low latency performance. Here is one example of the kinds of problems you can expect.


There's a link in the quote that didn't show. Check it out, if you're into it, and tell me what you think...


135
skinny honkie wrote: I just did my swot - on the Avid site, they state that Pro Tools HD has a 48-bit mixing architecture, so they're adding and summing stereo 24s.


Not to mention that those are theoretical bits. According to Bob Katz, regarding playback (not software mixing or anything), 24-bit usually means 18-20 bits in a practical sense. Most electronics require enormous power rails and unbelievably tight tolerances to actually deliver the theoretical range of the stream. The jump from 20-bit to 24-bit seems to require a lot of accurate processing...
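The rough arithmetic behind those numbers is the standard ~6 dB-per-bit rule (20·log10(2) ≈ 6.02 dB of dynamic range per bit) - back-of-envelope only, not a measurement of any real converter:

```python
import math

def dynamic_range_db(bits):
    # Each extra bit doubles the representable range:
    # 20 * log10(2) ~ 6.02 dB per bit.
    return 20 * math.log10(2 ** bits)

for bits in (16, 20, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.1f} dB")
# 16-bit lands near 96 dB, 20-bit near 120 dB, 24-bit near 144 dB -
# so "18-20 effective bits" means roughly 108-120 dB of real range.
```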

But I don't really know what I'm talking about here. Playback and software processing are largely two different things, I know...


137
I know the VIAs have a non-standard PCI implementation, and I wouldn't choose one if I knew I was doing something extremely PCI-sensitive...
but: I think the variance in their PCI is overstated as a cause of problems. Any app/function that's that sensitive to stuff in the physical layer is a flawed model imho anyway, but that's by the by -
I had to set up a Pro Tools rig on a VIA board circa early 2k3 - it worked without issues. I've also set up a Canopus DV Raptor on a box with a Delta 1010LT, on a VIA motherboard - the DV card had a reputation as a very sensitive device, but it all worked fine, and that box now runs Ubuntu Studio 7.10.
I haven't seen problems come out of the woodwork in either of those scenarios, but I have seen VIA PCI problems raise their heads in other situations... fortunately, more mundane ones.

VIA's PCI was pretty shitty circa 2000 and seemed to improve a lot by 2003, but I haven't had enough problems since then to keep following it. I'm a little outta the loop now.

If in doubt - avoid.


138
Yeah, I think I can handle a little slop anyway. There are other places to look for issues...

It's weird. Some configurations show no VIA slop whatsoever. I'm more worried about implementing Firewire, which is probably a bad idea anyway, as it seems susceptible to jitter even under all the commercially popular architectures...

But I would love to be able to swap cheap recording interfaces hither and yon. And I'm even more leery of USB 2.0, although I might just be being superstitious...


140
rayj wrote:
It's weird. Some configurations show no VIA slop whatsoever. I'm more worried about implementing Firewire, which is probably a bad idea anyway, as it seems susceptible to jitter even under all the commercially popular architectures...

But I would love to be able to swap cheap recording interfaces hither and yon. And I'm even more leery of USB 2.0, although I might just be being superstitious...


If you look closely enough, you will always find jitter. Firewire at least has implicit error-checking and correction for the individual packets transmitted between the host and the device.
And yeah, you're right to be leery of USB (1 or 2) - in practice it's firewire without the error-checking.
