In response to #19 sprockkets, sorry this is such a late response, I just checked for responses to my original post. The reason I want FireWire is for audio interface purposes; everything from the new Hercules FireWire audio device to Yamaha's mLAN 01X uses FireWire. Not everything of course, but FireWire is getting very pervasive in pro audio.
It really depends on whether gaming is your primary use of sound. An Audigy is good for gaming, PERIOD. Music aficionados need not apply. Furthermore, Creative has never really fixed their PCI bus bandwidth issues (possibly irrelevant once PCI Express arrives), and their cards can be problematic with other devices due to a crappy ACPI implementation.
Your diss on the Envy also pretty much ignores its roots in the high end. It is not software audio. It does not do everything that the Audigy does for *gaming* in hardware, but for other functions it's all in hardware. It is the ONLY card on the market that not only meets its specs, it exceeds them. The Audigy falls significantly short in several areas (signal to noise, for one; and remember the original Audigy only had 19-bit sound despite their 24-bit claims, no idea if they fixed that on the Audigy 2 or not).
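For readers wondering how a "24-bit" card can end up delivering only ~19 usable bits, the standard converter rule of thumb ties measured signal-to-noise to effective resolution. A quick illustrative sketch (the SNR figures below are hypothetical, not measurements of any particular card):

```python
# Rule of thumb for an ideal N-bit converter: SNR ~= 6.02 * N + 1.76 dB.
# Inverting it gives the effective number of bits for a measured SNR.

def effective_bits(snr_db: float) -> float:
    return (snr_db - 1.76) / 6.02

for snr in (96, 106, 116):
    print(f"{snr} dB SNR ~= {effective_bits(snr):.1f} effective bits")

# An advertised "24-bit" card that only measures ~116 dB SNR is delivering
# roughly 19 usable bits, which is the kind of shortfall described above.
```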
For someone serious about sound, an Audigy is not a choice. For a pure gamer, it is an option (although honestly the difference between it and an Envy-based solution is negligible). In gaming the Audigy has slightly lower CPU utilization and a few more effects, but the sound quality is mediocre at best.
Personally I do not find that the few effects it adds are worth the downsides of Creative cards. Also, I am more likely to listen to music on my PC than play games, although I do game occasionally. Soooooo... Creative is a poor choice in *my* situation. Your mileage may vary.
In the great words of Woody Paige, "How many times do I have to straighten you guys out?"
Soundstorm: a great DSP (which only matters for 3D sound rendering), but it has absolutely NO impact on the audio quality; that's the job of the codec chip. Since ALL motherboard manufacturers insist on using the piss-poor Realtek ALC650 chip to do the sound output, the sound quality suffers.
To see what Soundstorm can REALLY do, check out the Asus A7N266-C, which put 5.1 out on an ACR card that featured a Sigmatel codec, not the ALC650. By moving the analog part of the implementation away from the motherboard, and using quality analog parts, the sound quality (i.e. noise / frequency response / dynamic range) was greatly improved.
Dolby Digital encoding: don't forget that DD is COMPRESSED. You can't fit six channels of even 16-bit/44.1kHz audio into a single SPDIF stream. By utilizing DD, you're taking this nice audio generated for you and MP3'ing it on the fly.
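To put rough numbers on why six channels of PCM won't fit down a single S/PDIF link, here is a back-of-the-envelope calculation (consumer S/PDIF carries two PCM channels; the Dolby Digital rate shown is the common 448 kbit/s consumer bitrate, used here only for illustration):

```python
# Raw bandwidth of 5.1-channel, 16-bit, 44.1 kHz PCM audio
channels, bits, rate = 6, 16, 44_100
pcm_mbps = channels * bits * rate / 1e6
print(f"6ch 16-bit/44.1kHz PCM: {pcm_mbps:.2f} Mbit/s")        # ~4.23 Mbit/s

# A consumer S/PDIF link carries two PCM channels at the same format
spdif_stereo_mbps = 2 * bits * rate / 1e6
print(f"S/PDIF stereo payload:  {spdif_stereo_mbps:.2f} Mbit/s")  # ~1.41 Mbit/s

# Dolby Digital squeezes all six channels into a lossy stream around 0.45 Mbit/s
print("Dolby Digital 5.1:      ~0.45 Mbit/s (lossy)")
```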
3DSoundSurge.com reviewed the Soundstorm APU and found that the Dolby Digital generated was just six independent streams compressed and "wrapped into" a DD stream. Things like joint stereo weren't utilized at all to share audio information between channels in order to raise the effective bitrate (i.e. if I use 1/2 the bandwidth to describe what's common between two channels, and 1/4 the bandwidth to describe the differences for each channel, then each channel uses an effective 75% of the bandwidth, instead of just 50%). Ceteris paribus, bitrate = kwalitee. So, DD encoding is a neat idea, but it's a flawed one.

That said, why not just integrate six or eight digital outputs on a soundcard using VersaJacks? That way, we harness just the 3D audio rendering power of Soundstorm but leave the analog part to external DACs and amplifiers that are chosen by the user.
It would eliminate the single-cable convenience, but you'd be getting bit-perfect digital output, and it'd be up to the user to pick the DACs and amps he likes. Unfortunately, there don't seem to be any receivers with multichannel digital inputs, but a man can dream of optimal solutions, can't he? :)
That said, a gamer should still have an Audigy. Since every game out there now uses some form of EAX, you get the best results from using hardware that was designed to support that API, not third-party hardware using someone else's drivers (e.g. Sensaura).
Speaking of 3D audio rendering, the Via Envy SUCKS. You guys need to realize that Via Envy is just a C-Media 8738 with 7.1 and nice DACs. It's SOFTWARE AUDIO, people, it's AC'97 that sounds a little better than most. It's an eight channel, 24/192- and 24/96-supporting Sound Blaster freakin' Pro! Not that there's anything wrong with that, but, again, all things being equal, playing an EAX-supporting game will have an Audigy2-equipped machine in front, followed by the Soundstorm-equipped machine, followed by a Via Envy-equipped machine.
Finally, firewire. Firewire = good. Chipset-level firewire = gooder. Keep in mind that Firewire has bus-mastering capability, whereas with USB and USB2, the CPU has to handhold every bit going across the bus. Do you really want your shiny new Athlon64 playing crossing guard with USB2 streams, or would you rather have the bits maneuver themselves across independently? Thought so :)
Chipset-level firewire is good for a simple reason: you only have 133MB/sec of maximum theoretical PCI bandwidth. A 400Mb/sec (or 50MB/sec) FireWire link can eat up to half of your practical PCI bandwidth. Whereas, if it IS integrated, you're only taxing the intra-chipset bandwidth, which is plentiful on A64 boards, and has been plentiful ever since we've gone away from using the PCI bus as the NB/SB interconnect (i.e. the AMD 760 chipset on the AMD side and the Intel BX, which were the last two chipsets to do that).
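The bandwidth arithmetic behind that claim, sketched out (the ~100MB/sec "practical" PCI figure is a commonly quoted real-world ceiling, assumed here for illustration):

```python
pci_theoretical_mb = 133            # 32-bit/33MHz PCI, MB/s
pci_practical_mb   = 100            # typical sustained throughput, MB/s (assumed)

firewire_mbit = 400                 # FireWire 400
firewire_mb   = firewire_mbit / 8   # = 50 MB/s

print(f"FireWire peak: {firewire_mb:.0f} MB/s")
print(f"Share of practical PCI: {firewire_mb / pci_practical_mb:.0%}")  # ~50%
```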
Whoops, you are correct, I was getting SPDIF mixed up with Toslink cables. My mistake. Heh, I do make those occasionally it seems.
My point was about the optical Toslink cables, not the digital output itself. However, all that aside, the Soundstorm is still a very low quality integrated sound solution...
I don't know what you're talking about. SPDIF is not an optical output, and you don't use optical cable at all. There is also no converter: you run a coax cable directly from the sound card to your receiver's coax input, and it's all digital. There will be no signal loss even if you convert them. However, if you're talking about sound quality issues caused by re-sampling to a different sample rate, that is true for most SPDIF ports on board or on sound cards. But that has more to do with the design of the sound card than anything else.
#62: If you are going from an optical output to a coax input, you *are* converting the signal. In a straight optical-to-optical link, it is being converted first inside the source device and again in the receiver. So yes, you are converting the signal.
While it is true that most people do not base their mobo purchase decision on APU capability, when it comes to using the PC as an HTPC, or if you simply want to play games on your big-screen HDTV, DD real-time encoding plays a big role in choosing which mobo goes in your HTPC. Instead of having to connect three analog sound cables and pay big $$ for a receiver that supports multi-channel analog input, you can use a single SPDIF/coax digital connection to get all your sound (desktop, games and DVDs) from the PC to the HT.
SPDIF is compatible with coax, and all you need is a mono mini-jack to RCA adapter so that you can connect it directly to the coax input on the receiver. There is no double conversion needed. I believe that's how most people connect their PC to the receiver.
This chipset looks promising, I like it. And a great article about it :)
I'm a bit curious about the raid - do you guys think it may be possible to implement a hot-swappable raid array with integrated raid controllers anytime soon?
Maybe you can make an article testing the performance boost from using a 4-drive raid 0 array with this baby?
Another thing that interests me - are there any mobos with an IGP for Athlon64? I know it won't be a performer, I'm just curious if it even exists. Also, is anything being heard about a new DX9 IGP anytime soon (hopefully with this chipset)? It'd also be cool if having an AGP card doesn't disable the IGP, like the ATI and Intel chipsets... Well I guess I'm dreaming now, but I'd like to see your comments or any info you have on nVidia's IGP plans. I guess you AT folks could ask nVidia about this :)
#59: Try measuring your bandwidth with a 4 drive RAID 0 array using fast drives on that setup and then put the same array in an Intel or AMD chipset system. nVidia's PCI implementation is not very good at all.
[q] Actually, to date nVidia has had a *very* troublesome PCI implementation, anyone with a PCI RAID controller and a 4 disk RAID 0 array can tell you that. It is so bad, in fact, that prototype NF3-150 boards for Opteron used AMD PCI chips just to avoid using the nForce3 integrated PCI bus. I am not certain if these boards ever reached production status however.[/q]
Uh, no. Not in my experience. On my 8RDA+, I've used: a Highpoint ATA133 controller card, a 3Ware 7000-2 two-channel IDE RAID card, and an LSI MegaRAID 1600 SCSI RAID card. I've had zero problems. Wha'chu talkin' bout, Willis?
#55: I did not say DDR2 was needed right now; it's not, and AMD is making the right decision. I was just pointing out that the latency penalty should not be a real issue since it moves more data. But time will tell.
#54: I have not checked out the Catalina yet; however, if it does not have a coax output, it will not find a home in my setup. SPDIF is a consumer-level technology, championed by Sony, but it is not as high quality as coax simply due to the fact that the signal must be converted twice (to and from optical), which is never a good thing. Furthermore, the cables are frail and expensive. Professional-level equipment never has SPDIF; it uses coax exclusively.
Wesley: Glad they are dropping SoundStorm. Waste of time and effort in my opinion.
1 - nVidia is committed to the one-chip chipset for Athlon 64. They are firmly convinced that the one-chip design eliminates the potential bottlenecks of a north-south bridge communications bus. Even with the memory controller on the chip, there is only so much real estate practically available on a single-chip chipset.
2 - Customer surveys by nVidia found that most buyers did not use Sound Storm, and that Sound Storm did not enter heavily into the decision to buy nForce. So the decision was made to choose the on-chip LAN, firewall, and much-expanded RAID capabilities which benefit greatly from being moved off the bus.
3 - There are new sound solutions in the works for nVidia. You may see them in a future chipset or on a sound card. Final decisions have not been made.
#53, I'll believe it when I see the tests. It sounds like RAMBUS, which was supposed to be better at latency but turned out the opposite, at over twice the cost at the time. Read the last paragraph of Wesley's post (#50); he's closer to the industry, and there are others expressing similar concerns. All these are things that Intel, with its resources, should iron out; AMD can come in when it's sorted. If AMD gets to a third of the market and into the black, then it can show leadership in these areas. Meanwhile, they should stick to what they are best at: CPUs.
#48: The Turtle Beach Catalina, which I suspect is a newer card than the SC (it's more expensive :) ), seems to tout optical SPDIF output as a feature (it doesn't mention coax at all), and it's merely pass-through SPDIF at that (no hardware Dolby encoding -- thus I'll end up with the additional three audio cables again). Are you sure you have all your facts straight?
If you're a professional musician -- I agree, the SS isn't for you, but I thought nForce was primarily a chipset targeted at gamers?
#52: Latency ends up about the same due to the fact that twice the operations per clock are happening in the same span as regular DDR. It does not, however, give you any real benefit, just higher scalability. The lack of DDR2 support also really has nothing to do with the chipset; it's a CPU feature on the Athlon64/FX architecture, not a chipset one, so people bemoaning the lack of DDR2 need to look at AMD, not nVidia.
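One way to see the "latency ends up about the same" point is to convert CAS latency from clock cycles into nanoseconds. The parts below are illustrative examples, not specific modules:

```python
def cas_ns(cas_clocks: float, data_rate_mt_s: float) -> float:
    # The I/O clock runs at half the effective data rate (MT/s)
    clock_mhz = data_rate_mt_s / 2
    return cas_clocks / clock_mhz * 1000

print(f"DDR-400  CL3:   {cas_ns(3, 400):.1f} ns")    # 15.0 ns
print(f"DDR2-533 CL4:   {cas_ns(4, 533):.1f} ns")    # ~15.0 ns
print(f"DDR-400  CL2.5: {cas_ns(2.5, 400):.1f} ns")  # 12.5 ns (low-latency DDR1 still wins)
```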
Like I said before, the only feature needed from my point of view is PCI Express. I refuse to buy any more PCI or AGP devices at this point, knowing that in a year or two they will be useless. Unlike my CPU, I don't often change out my sound card, motherboard, SCSI card, or other such devices, so when it comes time to upgrade my system, PCI Express will be the order of the day for me.
Good to see your real opinions, Wesley (#50). I too am worried about this high-latency DDR2, particularly with the A64, where I see system memory latency as being the bottleneck for improved gaming speed. AMD have got themselves a winner with the A64/Newcastle but still have mainboard issues as well as heavy debt. In these conditions, good poker dictates that you play conservatively. So I'm quite happy to see only DDR1 and PCI on the nF3-250 for the moment.
Yes, the lack of PCI Express is a disappointment. But so is the lack of PCI-X. It means that system integrators and post-production facilities will be hesitant about using NF3-250 motherboards for workstations, because a significant portion of the current NLE cards want at least a 64-bit PCI slot, if not a PCI-X slot at 66, 100, or 133MHz.
This lack of PCI-X slots on Athlon64 motherboards (you have to get a dual Opteron board to get them) means I may have to go Intel for my next systems, and I was really hoping to get an Athlon64 because Lightwave runs best on them overall.
#49 - I heartily DISAGREE with your conclusions. As you will see soon enough DDR2 is at present the same performance as DDR (at best) at twice the price or more. While I do appreciate the potential of DDR2, the current execution is like Prescott - much ado about very little.
As for your bandwidth point, we are talking about an Athlon 64 and NOT an Intel CPU. Intel's design and deep pipelines keep it constantly starved for bandwidth; the A64, on the other hand, has been shown to perform just about as well with current single-channel DDR as it does with much-greater-bandwidth dual-channel DDR. This actual performance certainly refutes your claim that the A64 "needs DDR2". Even dual-channel is more a checklist item most consumers demand than it is a huge performance booster on A64. But dual-channel will indeed be a part of Socket 939 - doubling memory bandwidth for an Athlon 64 that already competes quite well with single-channel memory.
I do agree with your point about hard-drive throughput, and there is little to complain about in the nF3-250Gb design in that regard.
Talk to memory manufacturers about DDR2. Most are extremely frustrated at having to add huge buffering to even get the 533 stuff to work. In addition latencies are so high at 4-4-4-8 that any performance gain is pretty much nullified. And the cost is prohibitive (sound like early Rambus?). Things WILL improve with time on DDR2, but your sweeping pronouncements are just misinformed.
PCI Express and Hyperthreading won't make a bit of difference in today's games. The only benefit I can think of with nForce3 is *maybe* better sound, and gigabit LAN. PCI Express has been shown to produce only minimal effects on fps, and who cares about Hyperthreading unless you enjoy burning CDs and compressing your latest movie while playing an FPS. What this chipset really needed, and what the reference board doesn't support, is DDR2. Memory bandwidth and SATA hard drives are the only things that are going to unleash the power of our already-overkill video cards and load the expansive levels in an acceptable time. Why this article failed to acknowledge this I don't understand.
#46: For purely gaming purposes the Soundstorm does an adequate job. No complaints there. But many people use their PC for more than gaming, and anyone who cares about the actual quality of the sound coming out, especially for music playback, would care about the differences. Yes, the S/N ratio is very poor on SoundStorm setups. Anyone who cares about excellent reproduction would not be using optical SPDIF cables either; they would demand a coax solution for digital output (the Turtle Beach SC, for instance, offers this).
Like I said, it was a leap over what was included on motherboards when it was first released, but it has stagnated since then and the competition is far ahead now. Even Creative Labs, which is not even remotely close to being a leader in sound quality, is far beyond the SoundStorm nowadays. Now give me an SS solution with 24/96 capabilities and a 106dB S/N ratio and they would be back in the hunt. But that won't happen; nVidia is not an audio company.
The dually is good if you're running a game and other apps, even if they are single-threaded. I don't, of course, but many do, to switch quickly to avoid the boss or for 10 minutes of relaxation while working. There is some loss of performance as a result of the CPUs watching each other, but with the present design and power of the Opteron it wouldn't be noticeable. I'd like a dually.
#41: Soundstorm=poor quality in what way? S/N? I'm using the SPDIF connector and get 5.1 surround in most of today's games and DVD movies. What other audio solution features Dolby encoding in hardware? I have not seen (heard) one yet.
SoundStorm is the only audio solution that offers Audigy2 much competition when it comes to CPU usage in games.
When something better appears, I'll switch in a second, but for now I dread my next motherboard upgrade as it'll mean I'll have to go back to standard audio cables again (and no less than three at that, in addition to the SPDIF cable!). :-(
As for USB2: it sucks. Compare external drive solutions; the old FireWire 400 interface wins every time. If nVidia has really cut FireWire support, let's at least hope they get USB 2.0 support right this time. I had to install an extra USB 2.0 controller to get my Thrustmaster FF wheel working for more than five minutes at a time (I tried with both Epox 8RDA3+ and ABit AN7 motherboards).
Personally I wouldn't mind a dual-CPU A64 solution. In my experience, it means a hell of a lot more time between upgrades. Hell, I've even still got a dual Celeron 500 BP6 setup that is quite usable still, even though it's running BeOS, i.e. support is kinda dead :)
Trogdor: Yes, multi-threading is more complicated, however it's a shift that everyone *is* making. There is really very little excuse to make single-threaded applications on today's hardware and operating system environments; it's more an issue of an established method of doing things giving way *very* slowly to new ways. For an industry that embraces most new technology, it's strange that they did not change their design philosophies long ago; really, once Win9x (and Pentium CPUs) became a standard, the infrastructure was in place...
#39: In my honest opinion, the lack of Soundstorm is an improvement. The APU they were using was a lot of marketing, but relatively poor quality. Even the 'cheap' off brands had better chips available, and nowadays with Via's Envy line the Soundstorm is very, very out of date. I think its absence represents the reality that nVidia did not see enough of a benefit in trying to become a full-fledged audio processing company, and since most motherboards without nForce chipsets have other solutions it wasn't a huge value-add (many NF2 boards did not even utilize the nVidia solution).
Any serious enthusiast would be using a Turtle Beach, M-Audio(or other Via Envy solution), or Audigy anyways, at least if sound quality mattered to them at all. Soundstorm was decent in its time, but they did not try to compete when the next generation arrived(Audigy/Envy) and they weren't top of the line when they were introduced(TB Santa Cruz had that crown).
It's a risk/reward scenario, and the rewards did not outweigh the risks of the heavy investment it would take to keep up with the big boys.
I don't understand why they don't have FireWire. It can't be that hard to include it, and MB manufacturers would be very happy with that since they wouldn't need to mess with another chip and leads. It would also help in the whole SFF and laptop areas.
For all the people whining about the sound, I still think they are aiming this at servers and workstations. Plus gamers would want PCI sound anyways; I know people who add PCI sound cards even with the awesome nForce2 sound, go figure.
Finally, enough bitching about the typos, once is enough. I don’t see you with a reference board in hand!
I'm disappointed there's no PCI Express support. What's the deal with that? When will nVidia make a chipset like the nF3-250 plus PCI Express? Geez, even SiS has a good chipset w/ PCI-E.
I'm fairly certain that this is just a generic board to test the chipset out with, it's not going to be the final product put out by GigaByte or Abit... After all, most nForce2 boards have 3 DIMM slots, while the GigaByte GA-7Nxxx series all had 4...
Now that nVidia's shown that they can still make motherboard chipsets, I think it's time they showed us they can still make video cards that rock your pants off.
Wow @ 2.4 GHz. But only 2 DIMM slots for RAM? Please tell me other boards will have more than 2! I'm running with 2x256MB + 1x512MB DIMMs. It would kill my bank account to waste another 100 bucks on RAM.
#31 - You ever tried to make a gaming engine multi-threaded? How about making it really multi-threaded so that you might get a 50-100% boost in performance by adding a second processor? I won't say it can't be done, but it is a *major* change in design philosophy and coding. My experience with multi-threaded applications is that they are much more complex to get working properly. The only game so far that I've heard of trying to use multi-threading was Quake 3, and it didn't work very well. I think the estimate of 3 or more years before games start taking advantage of multi-threading is pretty optimistic, but we'll see.
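A minimal sketch of why this is harder than it sounds: as soon as simulation and rendering run on different threads, every piece of shared game state needs synchronization. The class and names below are made up purely for illustration:

```python
import threading

class GameState:
    def __init__(self):
        self.lock = threading.Lock()
        self.positions = {}          # entity id -> (x, y)

    def update_physics(self, dt):
        # Writer: must hold the lock, or the renderer can observe half-updated state
        with self.lock:
            for eid, (x, y) in self.positions.items():
                self.positions[eid] = (x + dt, y)

    def snapshot_for_render(self):
        # Reader: copies under the lock so drawing works on a consistent frame
        with self.lock:
            return dict(self.positions)

# Splitting work this way adds locking overhead and subtle ordering bugs,
# which is the design and coding cost described in the comment above.
```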
Wow, this is the first product in a few months that has been interesting (though, the coming NV40/R420 war will be fun to watch).
The GigE interests me because I'm looking at a home media network that would be separate from my normal network, and putting out simultaneous DVD/HDTV feeds over the network was kinda iffy on 100Mbit networks (HD can be up to 19Mbit/s, DVDs are probably anywhere from 2Mbit/s to 4 or 5Mbit/s).
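Rough stream math for that scenario (the stream counts and the usable-throughput figure are assumptions for illustration, not measurements):

```python
hd_stream   = 19     # Mbit/s, HD broadcast worst case quoted above
dvd_stream  = 5      # Mbit/s, high-end DVD bitrate quoted above
usable_100m = 80     # Mbit/s, assumed realistic ceiling on switched Fast Ethernet

load = 2 * hd_stream + 3 * dvd_stream
print(f"2 HD + 3 DVD feeds: {load} Mbit/s of ~{usable_100m} Mbit/s usable")
# 53 of ~80 Mbit/s: workable but tight on 100Mbit, trivial headroom on gigabit.
```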
My only gripe is that the Socket 939 chips aren't ready yet. I'm waiting for those to show up before I make a move.
Once again, the only person who said anything about gaming performance and dual-CPU rigs in reference to today's environment is you, Prisoner. I fire up a game on my PC maybe once a month, so honestly buying ANY PC component for gaming reasons is more than a little ridiculous in my case (that's what I have an Xbox for).
However I have plenty of reasons to run dual CPUs; I mess around with making my own DVDs, occasionally I am known to compile a kernel, etc. These tasks are becoming increasingly popular in the average home as well, especially with DVD recorders getting cheap and people wanting to convert those home movies.
As for games, my only point was that the installed base is being created now. I'd reckon that at this point there are more HT-compatible P4s sold than there are NV30- or R300-class and higher graphics cards on the market, and developers are already making games targeted at those platforms. All it really would take is Epic and id making their next-generation engines more multi-threading friendly and you'd see mass adoption, since those engines form the basis for a huge number of games. The potential for major increases in gaming performance is there, it just has not been tapped yet.
However, as I said, gaming is a relatively *minor* reason for dual CPU adoption. Believe it or not, most people don't do any sort of serious gaming on their PC, so it would really never be much of a selling point...
#27 what is your obsession with games? Anyway my argument that dual CPU systems are highly relevant to enthusiasts stands and that has very little to do with games and more with multi-tasking and highly demanding applications such as video editing, image rendering, code compilation, server duties, etc...
Anyway, the gap between dual and single CPU systems with regard to games really is quite small these days, and mostly it is down to the board in question being focused on stability and reliability rather than outright performance. I'm guessing you wouldn't want for gaming performance from a dual Athlon FX-53 system on an nVidia nForce3-250 chipset.
I'll be waiting for PCI Express versions too. It doesn't appear that the jump to A64 is going to give me enough of a speed increase over an OC'd Barton until I'm ready to replace my 9800 Pro anyway.
#24 and #25, the idea of buying "ahead of the curve" for technology has historically been a stupid, cost-ineffective idea. Buying a duallie system today (at mucho $$$) because you expect to find duallie-ready games in the next three to five years is just dumb use of your money. I say three to five years because that's how long it's going to be before gaming companies produce software that either demands dual CPU's or demands Hyperthreading. In the meantime, you'll have one very expensive processor on a very expensive motherboard just sitting around twiddling its thumbs. And by the time these games DO come out, both of your CPU's (and very likely your motherboard as well) will be obsolete. Such is the way of things.
Now, one of you DID touch on a good reason to get a duallie system, namely if you're doing compute-intensive stuff like 3D rendering. I happen to do that for a living, and I've got 8 dual Athlon systems in a render farm. Much more cost effective than single CPU systems, but none of them will ever win any points in a gaming match.
#22 I didn't say specifically for games, I said enthusiast. A dual-CPU system is inherently more flexible, be it compiling code faster, rendering pictures quicker, or multi-tasking with many apps. How many enthusiasts simply run one program at a time? I know I don't, and I could make use of a powerful dual-CPU system.
Dual CPU systems do not need to run with ECC/Registered memory although typically due to the target market this is a feature. Running a dual processor FX system with standard DDR memory could be a very fast and cost effective machine.
#22: I agree with you until you get to the part about 'never will'. HyperThreading is making developers consider making their apps multi-threaded, and starting sometime next year multi-core CPUs will most likely be introduced. When most machines sold have the ability to process more than one thread at a time, it would be pretty stupid to ignore that factor.
So for now, multiple CPU's is not that helpful for *gaming*, although it is for many other applications. In the future, however, I expect it to be very helpful for everything, including gaming.
#4 - Ass-kissing has never been my forte. I consider myself an equal-opportunity offender. After finding none of the AGP locks worked on Round 1 chipsets, you better believe I would test for myself whatever I am told about the new boards.
Frankly I really like nF3-250GB, but I also hear good things about SiS 755FX for 939 (1200HT) and VIA's update for 939. After some of the crap we've had to endure with Round 1 chipsets, it will be nice to have some good Athlon 64 choices in Round 2.
#20, gamers that buy dual-CPU systems are just being stupid. Practically no game out there makes good use of more than one CPU, and none are planned. Add to that the overhead of having additional CPU's in the system, the cost of a dual system versus a single, and the slower memory (Reg'd ECC), and you've got a tremendous waste of money. I have *never* seen a dual-CPU game box outrun a single-CPU game box, and I doubt we ever will.
#18, I know it's full duplex, but even then you will have a hard time getting full utilization under normal working conditions. Benchmarks are designed to run things at unrealistic rates. The point is, although I don't encourage it, you can certainly put Gigabit on the PCI bus and get very usable performance out of it. In most cases, the limiting factor is going to be CPU utilization anyway.
External HDDs could make good use of a FireWire connection, especially now that it is whizzing along at 800Mbit/s.
The multi-CPU implementation sounds interesting; of course AMD will completely fail to capitalise on it by not making the FX dual-processor capable. How many enthusiasts (AMD-wise) could resist the chance of dual FX-53s, especially with the possibility of overclocking them? You have the distinction between the 2xx series and the FX due to the removal of ECC/Registered memory support in the FX 939 series, so they essentially serve two different markets.
Why would you need firewire with USB2? OK, ipod and camcorders.
I have one question. Since you use a browser to configure the firewall, does this mean it is OS independent, i.e., I can use it in Linux without needing drivers to run it? Soundstorm not present on here, oh well; almost all uATX boards had the MCP and not the MCP-T so it didn't matter anyhow, and it doesn't work in Linux anyway. VIA sound is troublesome in Linux too. I'd rather use my own sound card. Just hope there is a driver for the cool LAN adapter.
#10 - LAN is full duplex. Gigabit on PCI with overhead can do about 820Mb/sec in industry standard tests. nVidia's on-chip LAN could output about 1840Mb/sec in the benchmarks we have seen. This is more than twice as fast IF you have a source that can actually output 1Gb/sec in both directions.
#11 - PCI Express will be seen on Intel boards very soon. AMD boards will not move as rapidly to the Intel PCI Express standard.
#12 - Firewire is not on-chip. Undoubtedly many mfgs will add firewire with an additional chip on-board nF3-250.
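On the full-duplex LAN numbers in point #10 above, a rough sketch of the arithmetic (using the figures quoted there; the practical-throughput comment is an assumption, not a benchmark):

```python
gige_one_way = 1000                      # Mbit/s per direction
full_duplex_ceiling = 2 * gige_one_way   # 2000 Mbit/s aggregate across both directions

pci_bus = 133 * 8                        # 32-bit/33MHz PCI ~= 1064 Mbit/s, shared
print(f"Full-duplex GigE ceiling: {full_duplex_ceiling} Mbit/s")
print(f"Shared PCI bus ceiling:   {pci_bus} Mbit/s")
# A PCI NIC tops out near the ~820 Mbit/s figure above because traffic in both
# directions competes for one shared bus; the on-chip MAC bypasses PCI entirely.
```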
"No one can possibly complain about the feature-set of nForce3-250"
To add my vote to what's already been said: no FireWire for my iPod and no SoundStorm/Dolby Digital for that lovely Yamaha amp I just bought mean I think someone needs to calm down a little about all that excitement (and learn a little about the difference between megabits and megabytes, by the sound of things).
I wonder if they'll release Soundstorm as a PCI Express card....
#8: Actually, to date nVidia has had a *very* troublesome PCI implementation, anyone with a PCI RAID controller and a 4 disk RAID 0 array can tell you that. It is so bad, in fact, that prototype NF3-150 boards for Opteron used AMD PCI chips just to avoid using the nForce3 integrated PCI bus. I am not certain if these boards ever reached production status however.
As for this chipset, it looks nice, but honestly I'll wait until there is a PCI Express solution out there. I was only forced to upgrade my motherboard prematurely because power problems destroyed my equipment, and I don't intend to buy another until the next wave of features is available...
Looks like another error on the "Conclusion" page. Last sentence, second paragraph says "We expect that some enterprising companies, which specializes in catering to the computer enthusiast, will slip in some Socket 954 boards based on the Ultra chipset with a Gigahertz HyperTransport."
There's a huge gaffe on the On-Chip Gigabit page. It states that Fast Ethernet runs at "100MB/sec" and Gigabit runs at "1000MB/sec." "MB" is shorthand for mega<i>bytes</i>, not mega<i>bits</i>. Megabits should be abbreviated "Mb."
Normally I wouldn't be this anally-retentive, but the poor usage leads to another problem later on down the page. The article states that Gigabit Ethernet running at "1000MB/sec" is faster than the PCI bus, which runs at "133MB/sec." The PCI rate figure is correct, but the Gigabit figure makes it look like Gigabit is about 8 times faster than the PCI bus itself. <i>It's not!</i> The PCI bus runs at (133Mbytes/sec X 8 bits/byte = ) 1064Mbit/sec, which is faster than Gigabit. The article is very misleading in this respect.
In truth, the PCI bus can almost never reach its peak 133MB/sec rate (usually it's around 100MB/sec), but then again Gigabit can't reach its peak either.
Regardless, the article is completely incorrect when it indicates a Gigabit card would overwhelm a PCI bus. This is not true.
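The unit conversion spelled out, using the same numbers as the comment above:

```python
gigabit_mbit = 1000
gigabit_mbyte = gigabit_mbit / 8          # 125 MB/s

pci_mbyte = 133                           # 32-bit/33MHz PCI
pci_mbit = pci_mbyte * 8                  # 1064 Mbit/s

print(f"Gigabit Ethernet: {gigabit_mbit} Mbit/s = {gigabit_mbyte:.0f} MB/s")
print(f"PCI bus:          {pci_mbit} Mbit/s = {pci_mbyte} MB/s")
# On paper PCI (1064 Mbit/s) is slightly faster than gigabit (1000 Mbit/s),
# which is why "gigabit overwhelms the PCI bus" overstates the case.
```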
First off: GB is GigaByte. Wesley wrote "GB" more than once while actually referring to Gigabit (bit has lowercase b).
Next, 1000Mbps is roughly 125MB/s (theoretical peak, I expect). 33MHz 32-bit PCI is roughly 133MB/s. I dislike PCI Gb implementations as much as the next guy, but I'd still like to know how nVidia managed to come up with the half-speed figure? Perhaps nVidia's PCI-bus implementation is sub-par? (which is a real issue! Via has struggled with really bad PCI performance for years :-( )
Finally there's 6-channel audio; What happened with Soundstorm and Dolby encoding implemented in hardware? (I currently use only the SPDIF connectors on my nForce2 and get surround sound both in games and while playing DVDs -- is there no way to get this functionality with Athlon64?)
Hopefully the next article will shed some light on some of these issues. Cheers! :)
Yeah, the SiS 755FX plug at the end was sort of a red herring - it didn't fit at all with the article, which was solely about nVidia; it didn't need SiS's recent efforts tacked on at the last second.
A couple things:
1) to all you nay-sayers about the worth of gigabit ethernet - I thumb my nose at you! Let's not play chicken or the egg games here, let's just usher in new *desired* technology as smoothly as possible - having gigabit ethernet will push me to replace my netgear 10/100 switched hub, not the other way around.
2) Anandtech, what's with the nVidia ass-kissing? When you say things like "nVidia assured us..." and "We did test nVidia's claim... [and we believe it]" - come on, a little healthy doubt is a good thing. Just because they supplied you with a reference nForce3-250 mobo doesn't mean you have to see how far you can stick your tongue up their butt. Honestly, the article felt like it leaned toward nVidia a bit. Believe it or not, you can report on a product without it sounding like some money changed hands or something.
What's with the SiS 755 crap at the end of the article? Someone didn't proofread, huh? That is also obvious in the spelling errors. Excellent article, better than recent ones. I do wish that you had been able to include the performance portion, cuz now I'm itching to see the numbers. One thing though, how many people have several gigabit systems at home? I know I will not upgrade any of mine until they are replaced, so it will be a while. Therefore I am not too excited at this point, especially if the high-speed wireless standards work out to high enough throughput to allow real-time multimedia transfers. Love the on-chip firewall, but ZoneAlarm is still the only useful application-specific solution I know of. Not that I'm an expert, I am far from it, but the BlackICE debacle was seen coming long ago.
arswihart - Monday, April 5, 2004 - link
In response to #19 sprockkets, sorry this is such a late response, I just checked for responses to my original post. The reason I want Firewire is for Audio Interface purposes, everything from the new Hercules Firewire audio device to Yamaha's MLAN 01X use fireware. Not everything of course, but Firewire is getting very pervasive in pro audio.draven31 - Saturday, March 27, 2004 - link
note that the S/PDIF spec says that a 'fiber'interface is available... that is a optical S/PDIF. TOSLINK is a type od S/PDIF optical connector.Reflex - Saturday, March 27, 2004 - link
It really depends on if gaming is your primary use of sound. An Audigy is good for gaming, PEROID. Music affecianado's need not apply. Furthermore, Creative has never really fixed their PCI bus bandwidth issues(possibly will become irrelevant with PCI Express), and can be problematic with other devices due to a crappy ACPI implementation.Your diss on the Envy also pretty much ignores its roots in the high end. It is not software audio. It does not do everything that the Audigy does for *gaming* in hardware, but for other functions its all in hardware. It is the ONLY card on the market that not only meets its specs, it exceeds them. The Audigy falls significantly short in several areas(signal to noise, and remember the original Audigy only had 19bit sound despite their 24bit claims, no idea if they fixed that on the Audigy 2 or not).
For someone serious about sound, an Audigy is not a choice. For a pure gamer, it is an option(although honestly the difference between it and a Envy based solution is negligible). In gaming the Audigy has slightly less CPU utilization and a few more effects, but the sound quality is mediocre at best.
Personally I do not find that the few effects it adds are worth the downsides of Creative cards. Also, I am more likely to listen to music on my PC than play games, although I do game occasionally. Soooooo....Creative is a poor choice in *my* situation. Your mileage may vary.
Odeen - Saturday, March 27, 2004 - link
In the great words of Woody Paige, "How many times do I have to straighten you guys out?"Soundstorm:
Great DSP (which only matters for 3d sound rendering), and has absolutely NO impact on the audio quality, that's the job of the codec chip. Since ALL motherboard manufacturers insist on using the piss-poor Realtek ALC650 chip to do the sound output, the sound quality suffers.
To see what Soundstorm can REALLY do, check out the Asus A7N266-C, which put 5.1 out on an ACR card that featured a Sigmatel codec, not the ALC650. By moving the analog part of the implementation away from the motherboard, and using quality analog parts, the sound quality (i.e. noise / frequency response / dynamic range)was greatly improved.
Dolby Digital encoding:
Don't forget that DD is COMPRESSED. You can't fit six channels of even 16bit/44.1khz audio into a single SPDIF stream. By utilizing DD, you're taking this nice audio generated for you and mp3'ing it on the fly.
3DSoundSurge.com reviewed the Soundstorm APU and found that the Dolby Digital generated was just six independent streams compressed and "wrapped into" a DD stream. Things like joint stereo weren't utilized at all to share audio information between channels in order to raise the effective bitrate (i.e. if I use 1/2 the bandwidth to describe what's common between two channels, and 1/4 the bandwidth to describe the differences for each channel, then each channel uses an effective 75% bandwidth, instead of just 50%. Ceteris paribus, bitrate = kwalitee. So, DD encoding is a neat idea, but it's a flawed one.
That said, why not just integrate six or eight digital outputs on a soundcard using VersaJacks? That way, we harness just the 3D audio rendering power of Soundstorm but leave the analog part to external DACs and amplifiers that are chosen by the user.
It would eliminate the single-cable convinience, but you'd be getting bit-perfect digital output, and it'd be up to the user to pick the DACs and amps he likes. Unfortunately, there don't seem to be any receivers with multichannel digital inputs, but a man can dream of optimal solutions, can't he? :)
That said, a gamer should still have an Audigy. Since every game out there now uses some form of EAX, you get the best results from using hardware that was designed to support that API, not third-party hardware using someone else's drivers (e.g. Sensaura)
Speaking of 3D audio rendering, the Via Envy SUCKS. You guys need to realize that Via Envy is just a C-Media 8738 with 7.1 and nice DACs. It's SOFTWARE AUDIO, people, it's AC'97 that sounds a little better than most. It's an eight channel, 24/192- and 24/96-supporting Sound Blaster freakin' Pro! Not that there's anything wrong with that, but, again, all things being equal, playing an EAX-supporting game will have an Audigy2-equipped machine in front, followed by the Soundstorm-equipped machine, followed by a Via Envy-equipped machine.
Finally, firewire.
Firewire = good. Chipset-level firewire = gooder. Keep in mind that Firewire has bus-mastering capability, whereas with USB and USB2, the CPU has to handhold every bit going across the bus. Do you really want your shiny new Athlon64 playing crossing guard with USB2 streams, or would you rather have the bits maneuver themselves across independently? Thought so :)
Chipset-level firewire is good for a simple reason that you only have 133MB/sec maximum theoretical bandwidth. A 400Mb/sec (or 50MB/sec) can eat up to half of your practical PCI bandwidth. Whereas, if it IS integrated, you're only taxing the intra-chipset bandwidth, which is plentiful on A64 boards, and has been plentiful ever since we've gone away from using the PCI bus as the NB/SB interconnect (i.e. the AMD 760 chipset on the AMD side and the Intel BX, which were the last two chipsets to do that).
WHEW.
Reflex - Saturday, March 27, 2004 - link
Whoops, you are correct, I was getting SPDIF mixed up with Toslink cables. My mistake. Heh, I do make those occasionally it seems.My point was about the optical Toslink cables, not the digital output itself. However, all that aside, the Soundstorm is still a very low quality integrated sound solution...
Foxbat121 - Friday, March 26, 2004 - link
Please check this link for S/PDIF information:http://www.mtsu.edu/~dsmitche/rim420/materials/Int...
Foxbat121 - Friday, March 26, 2004 - link
#64,I don't know what you're talking about. SPDIF is not an optical output. And you don't use optical cable at all. There is also no converter. You ran a coax cable directly from sound card to your receiver's coax input. And it's all digital. There will be no signal loss even if you convert them. However, if you're talking about the different sample rate that causes sound quality issue due to the re-sampling, that is true for most SPDIF ports on board or on sound cards. But that has much to do with the design of the sound card rather than anything else.
Reflex - Friday, March 26, 2004 - link
#62: If you are going from an optical output to a coax input, you *are* converting the signal. In a straight optical to optical link, it is being converted first inside the source device and again on the reciever. So yes you are converting the signal.Foxbat121 - Friday, March 26, 2004 - link
#56,While it is true that most people do not base their mobo purchase decision on APU capability, however when it comes to use the PC as HTPC or simply want to play games on your big screen HDTV, the DD real-time encoding plays a big role on chose which mobo to be in your HTPC. Instead of have to connect 3 analog sound wires and pay big $$ to have a receiver to support multi-channel analog input, you can use a SPDIF/Coax digital connection to get all your sound (desktop, game and DVDs) from PC to the HT.
Foxbat121 - Friday, March 26, 2004 - link
#58,SPDIF is compatible with coax and all you need is a mono mini-jack to RCA adapter so that you can connect it directly to your coax input on the receiver. There is no double conversion needed. I believe that how most people connect their PC to the receiver.
Visual - Friday, March 26, 2004 - link
This chipset looks promising, I like it. And a great article about it :)I'm a bit curious about the raid - do you guys think it may be possible to implement a hot-swappable raid array with integrated raid controllers anytime soon?
Maybe you can make an article testing the performance boost from using a 4-drive raid 0 array with this baby?
Another thing that interests me - are there any mobos with IGP for Athlon64? I know it won't be a performer, I'm just curious if it even exists. Also is anything being heared about some new DX9 IGP anytime soon(hopefully with this chipset)? It'd also be cool if having an AGP card doesn't disable the IGP, like the ati-intel chipsets... Well I guess I'm dreaming now, but I'd like to see your comments or any info you have on nVidia's IGP plans. I guess you AT folks could ask nVidia about this :)
Thanks,
Visual
Reflex - Thursday, March 25, 2004 - link
#59: Try measuring your bandwidth with a 4 drive RAID 0 array using fast drives on that setup and then put the same array in an Intel or AMD chipset system. nVidia's PCI implementation is not very good at all.MichaelD - Thursday, March 25, 2004 - link
[q] Actually, to date nVidia has had a *very* troublesome PCI implementation, anyone with a PCI RAID controller and a 4 disk RAID 0 array can tell you that. It is so bad, in fact, that prototype NF3-150 boards for Opteron used AMD PCI chips just to avoid using the nForce3 integrated PCI bus. I am not certain if these boards ever reached production status however.[/q]Uh, no. Not in my experience. On my 8RDA+, I've used:
Highpoint ATA133 Contoller Card
3Ware7000-2 Two-channel IDE RAID card
LSI Megaraid 1600 SCSI RAID card
I've had zero problems. Wha'chu talkin' bout, Willis?
Reflex - Thursday, March 25, 2004 - link
#55: I did not say DDR2 was needed right now, its not and AMD is making the right decision. I was just pointing out that the latency penalty should not be a real issue since it moves more data. But time will tell.#54: I have not checked out the Catalina yet, however if it does not have a coax output, it will not find a home in my setup. SPDIF is a consumer level technology, championed by Sony, but it is not as high quality as coax simply due to the fact that the signal must be converted twice(to and from optical) which is never a good thing. Furthermore, the cables are frail and expensive. Professional level equipment never has SPDIF, it uses coax exclusively.
Wesley: Glad they are dropping SoundStorm. Waste of time and effort in my opinion.
BikeDude - Thursday, March 25, 2004 - link
Thanks Wesley; a single chip implementation makes sense. Now show us the benchmarks! :)Wesley Fink - Thursday, March 25, 2004 - link
#54 and others regarding Sound Storm -1 - nVidia is committed to the one-chip chipset for Athlon 64. They are firmly convinced that the one-chip eliminates the potential bottlenecks of a north-south bridge communications bus. Even with the the memory controller on the chip there is only so much real estate practically available on a single-chip chipset.
2 - Customer surveys by nVidia found that most buyers did not use Sound Storm, and that Sound Storm did not enter heavily into the decision to buy nForce. So the decision was made to choose the on-chip LAN, firewall, and much-expanded RAID capabilities which benefit greatly from being moved off the bus.
3 - There are new sound solutions in the works for nVidia. You may see them in a future chipset or on a sound card. Final decisions have not been made.
Pumpkinierre - Thursday, March 25, 2004 - link
#53, I'll believe it when I see the tests. It sounds like RAMBUS- that was supposed to be better at latency but turned out the opposite at over twice the cost at the time. Read the last paragraph of Wesley's post(#50)- he's closer to the industry and there are others expressing similar concerns. All these are things that Intel with its resources should iron out and AMD come in when its sorted, If AMD get to a third of the market and in the black then it can show leadership in these areas. Meanwhile stick to what they are best at cpus.BikeDude - Thursday, March 25, 2004 - link
#48: Turtle Beach Catalina which I suspect is a newer card (it's more expensive :) ) than SC, seem to tout optical SPDIF output as a feature (doesn't mention coax at all) and it's merely pass-through SPDIF at that (no hardware Dolby encoding -- thus I'll end up with the additional three audio cables again). Are you sure you have all your facts straight?If you're a professional musician -- I agree, the SS isn't for you, but I thought nForce was primarily a chipset targetted at gamers?
Reflex - Thursday, March 25, 2004 - link
#52: Latency ends up about the same due to the fact that twice the operations per clock are happening in the same span as regular DDR. It does not, however, give you any real benefit, just higher scalability. The lack of DDR2 support also really has nothing to do with the chipset, its a CPU feature on Athlon64/FX architecture's, not a chipset one, so people bemoaning the lack of DDR2 need to look at AMD, not nVidia.Like I said before, the only feature needed from my point of view is PCI Express. I refuse to buy anymore PCI or AGP devices at this point knowing that in a year or two they will be useless. Unlike my CPU, I don't often change out my sound card, motherboard, SCSI card, or other such devices, so when it comes time to upgrade my system, PCI Express will be the order of the day for me.
Pumpkinierre - Thursday, March 25, 2004 - link
Good to see your real opinions, wesley #50. I too am worried about this slow latency DDR2 particularly with the a64 where I see system memory latency as being the bottleneck for improved gaming speed. AMD have got themselves a winner with a64/newcastle but still have mainboard issues as well as heavy debt. In these conditions, good poker dictates that you play conservatively. So I'm quite happy to see only DDR1 and PCI on the nF3-250 for the moment.draven31 - Thursday, March 25, 2004 - link
Yes, the lack of PCI Express is a disappointmentBut, so is the lack of PCI-X. It means that system integrators and postproduction facilities will be hesitant about using NF3-250 motherboards for workstations because a significant portion of the current NLE cards want at least a 64-bit PCI slot, if not a PCI-X 66, 100, or 133.
This lack of PCI-X slots on Athlon64 motherboards (you have to get a dual opteron board to get them) means i may have to go Intel for my next systems, and i was really hoping to get an Athlon64 because Lightwave runs best on them overall.
Wesley Fink - Wednesday, March 24, 2004 - link
#49 -I heartily DISAGREE with your conclusions. As you will see soon enough DDR2 is at present the same performance as DDR (at best) at twice the price or more. While I do appreciate the potential of DDR2, the current execution is like Prescott - much ado about very little.
As for your bandwidth, we are talking about an Athlon 64 and NOT an Intel CPU. Intel design and deep pipes keep it constantly starved for bandwidth; A64 on the other hand has been shown to perform just about as well with current single-channel DDR as it does with much greater bandwidth dual-channel DDR. This actual performance certainly refutes your claim for the A64 "needing DDR2". Even dual-channel is more a checklist item most consumers demand than it is a huge performance booster on A64. But dual-cahnnel will indeed be a part of socket 939 - doubling memory bandwidth for an Athlon 64 that already competes quite well with single-channel memory.
I do agree with your point about hard-drive throughput, and there is little to complain about in the nF3-250Gb design in that regard.
Talk to memory manufacturers about DDR2. Most are extremely frustrated at having to add huge buffering to even get the 533 stuff to work. In addition latencies are so high at 4-4-4-8 that any performance gain is pretty much nullified. And the cost is prohibitive (sound like early Rambus?). Things WILL improve with time on DDR2, but your sweeping pronouncements are just misinformed.
jcoltrin - Wednesday, March 24, 2004 - link
PCI Express and Hyperthreading won't make a bit of difference in today's games. The only benefit I can think of with nForce3 is *maybe* better sound, and gigabit LAN. PCI Express has been shown to only produce minimal effects on fps, and who cares about hyperthreading unless you enjoy burning CD's and compressing your latest movie while playing a FPS. What this chipset really needed, and the ref . board doesn't support is DDR2. Memory bandwidth and SATA hard drives are the only thing that's going to unleash the power of our already over-kill video cards and load the expansive levels in an acceptable time. Why this article failed to acknowledge this I don't understand.Reflex - Wednesday, March 24, 2004 - link
#46: For purely gaming purposes the Soundstorm does an adequate job. No complaints there. But many people use their PC for more than gaming, and anyone who cares about the actual quality of the sound coming out, especially for music playback would care about the differences. Yes the S/N ratio is very poor on SoundStorm setups. Anyone who cares about excellent reproduction would not be using SPDIF cables as well, they would demand a coax solution for digital output(Turtle Beach SC for instance offers this).Like I said, it was a leap over what was included on motherboards when it was first released, but it has stagnated since then and the competition is far ahead now. Even Creative Labs, which is not even remotely close to being a leader in sound quality, is far beyond the SoundStorm nowadays. Now give me a SS solution with 24/96 capabilities and 106 S/N ratio and they would be back in the hunt. But that won't happen, nVidia is not a audio company.
Pumpkinierre - Wednesday, March 24, 2004 - link
The dually is good if you're running a game and other apps even if they are single threaded. I don't of course but many do, to switch quickly to avoid the boss or for 10 minutes relaxation while working. There is some loss of performance as a result of the cpus watching each other but with the present design and power of the opteron it wouldnt be noticeable. I'd like a dually.BikeDude - Wednesday, March 24, 2004 - link
#41: Soundstorm=poor quality in what way? S/N? I'm using the SPDIF connector and get 5.1 surround in most of today's games and DVD movies. What other audio solution features Dolby encoding in hardware? I have not seen (heard) one yet.SoundStorm is the only audio solution that offers Audigy2 much competition when it comes to CPU usage in games.
When something better appears, I'll switch in a second, but for now I dread my next motherboard upgrade as it'll mean I'll have to go back to standard audiocables again (and no less than three at that, in addition to the SPDIF cable!). :-(
As for USB2: It sucks. Compare external drive solutions, the old firewire400 interface wins every time. If nVidia has really cut firewire support, lets atleast hope they get USB 2.0 support right this time. I had to install an extra USB 2.0 controller to get my Thrustmaster FF wheel working for more than five minutes at a time (I tried with both Epox 8RDA3+ and ABit AN7 motherboards).
GoatHerderEd - Wednesday, March 24, 2004 - link
#44:My bro is a BeOS fan too! How fun is that?
iwantedT - Wednesday, March 24, 2004 - link
personally i wouldnt mind a dual cpu a64 solution. In my experience, it means a hell of a lot more time between upgrades. Hell, i've even still got a dual celeron 500 bp6 setup that is quite usable still, even tho its running BeOS, ie. support is kinda dead :)ripdude - Wednesday, March 24, 2004 - link
Good article I must say, though the lack of PCI-Express is a small disappointment.Also, the conclusion states that socket 939 is a couple of months away, is there a bit more certain release date? Perhaps somewhere in april/may?
Reflex - Wednesday, March 24, 2004 - link
Trogdor: Yes multi-threading is more complicated, however its a shift that everyone *is* making. There is really very little excuse to make single threaded applications on today's hardware and operating system environments, its an issue more of an established method of doing things giving way *very* slowly to new ways. For an industry that embraces most new technology, its strange that they did not change their design philosophies long ago, really once Win9x(and Pentium CPU's) became a standard the infrastructure was in place...Reflex - Wednesday, March 24, 2004 - link
#39: In my honest opinion, the lack of Soundstorm is an improvement. The APU they were using was a lot of marketing, but relatively poor quality. Even the 'cheap' off brands had better chips available, and nowadays with Via's Envy line the Soundstorm is very, very out of date. I think its absence represents the reality that nVidia did not see enough of a benefit in trying to become a full fledged audio processing company, and since most motherboards without nForce chipsets have other solutions it wasn't a huge value-add(many NF2 boards did not even utilize the nVidia solution).Any serious enthusiast would be using a Turtle Beach, M-Audio(or other Via Envy solution), or Audigy anyways, at least if sound quality mattered to them at all. Soundstorm was decent in its time, but they did not try to compete when the next generation arrived(Audigy/Envy) and they weren't top of the line when they were introduced(TB Santa Cruz had that crown).
Its a risk/reward scenerio, and the rewards did not outweigh the risks of the heavy investment it would take to keep up with the big boys.
GoatHerderEd - Wednesday, March 24, 2004 - link
Why did I say it is mostly for servers, and also it would be good for laptops. erg! You get the point.GoatHerderEd - Wednesday, March 24, 2004 - link
I don't understand why they don't have FireWire. It can't be that hard to include it, and MB manufacturers would be very happy with that since they wouldn't need to mess with another chip and leads. It would also help in the whole SFF and laptop areas. For all the people whining about the sound, I still think they are aiming this at servers and workstations. Plus gamers would want PCI sound anyways; I know people who add PCI sound even with the awesome nForce2 sound, go figure.
Finally, enough bitching about the typos; once is enough. I don't see you with a reference board in hand!
jlfowler78 - Tuesday, March 23, 2004 - link
I'm disappointed there's no PCI Express support. What's the deal with that? When will nVidia make a chipset like the nF3-250 plus PCI Express? Geez, even SiS has a good chipset w/ PCI-E.
xt8088 - Tuesday, March 23, 2004 - link
Have a look at another nForce3 250 review at http://www.hexus.net/content/reviews/review.php?dX... That review mentioned the lack of APU, and it had the benchmark tests.
Shinei - Tuesday, March 23, 2004 - link
I'm fairly certain that this is just a generic board to test the chipset out with; it's not going to be the final product put out by GigaByte or Abit... After all, most nForce2 boards have 3 DIMM slots, while the GigaByte GA-7Nxxx series all had 4... Now that nVidia's shown that they can still make motherboard chipsets, I think it's time they showed us they can still make video cards that rock your pants off.
Regs - Tuesday, March 23, 2004 - link
Wow @ 2.4 GHz. But only 2 DIMMs for RAM? Please tell me other boards will have more than 2! I'm running with 2x 256 + 1x 512 DIMMs. It would kill my bank account to waste another 100 bucks on RAM.
TrogdorJW - Tuesday, March 23, 2004 - link
#31 - You ever tried to make a gaming engine multi-threaded? How about making it really multi-threaded so that you might get a 50-100% boost in performance by adding a second processor? I won't say it can't be done, but it is a *major* change in design philosophy and coding. My experience with multi-threaded applications is that they are much more complex to get working properly. The only game so far that I've heard of trying to use multi-threading was Quake 3, and it didn't work very well. I think the estimate of 3 or more years before games start taking advantage of multi-threading is pretty optimistic, but we'll see.
Doormat - Tuesday, March 23, 2004 - link
Wow, this is the first product in a few months that has been interesting (though the coming NV40/R420 war will be fun to watch). The GigE interests me because I'm looking at a home media network that would be separate from my normal network, and putting out simultaneous DVD/HDTV feeds looked kinda iffy on 100Mbit networks (HD can be up to 19Mbit/s, DVDs are probably anywhere from 2Mbit/s to 4 or 5Mbit/s).
My only gripe is that the Socket 939 chips aren't ready yet. I'm waiting for those to show up before I make a move.
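A quick back-of-the-envelope sketch, reusing the bitrate estimates quoted above (19Mbit/s for HDTV, up to 5Mbit/s for DVD) and assuming roughly 65% of the nominal line rate is actually usable, shows why 100Mbit gets iffy with several simultaneous feeds while gigabit has headroom to spare. The 65% figure is only an assumption, not a measured number.

```python
# Back-of-the-envelope: how many simultaneous video feeds fit on a link?
# The bitrates are the estimates from the comment above; the usable-throughput
# factor is an assumption, since real Ethernet never hits its full line rate.

HDTV_MBIT = 19.0        # MPEG-2 HDTV stream, worst case (Mbit/s)
DVD_MBIT = 5.0          # DVD stream, upper estimate (Mbit/s)
USABLE_FRACTION = 0.65  # assumed usable share of the nominal line rate

def max_streams(link_mbit, stream_mbit):
    """How many streams of a given bitrate fit on a link, after overhead."""
    return int(link_mbit * USABLE_FRACTION // stream_mbit)

for link in (100, 1000):
    print(f"{link} Mbit/s link: {max_streams(link, HDTV_MBIT)} HDTV feeds "
          f"or {max_streams(link, DVD_MBIT)} DVD feeds")

# Prints roughly:
#   100 Mbit/s link: 3 HDTV feeds or 13 DVD feeds
#   1000 Mbit/s link: 34 HDTV feeds or 130 DVD feeds
```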
wassup4u2 - Tuesday, March 23, 2004 - link
Then again, the NF3-150 reference board had a "working" AGP/PCI lock...
Reflex - Tuesday, March 23, 2004 - link
Once again, the only person who said anything about gaming performance and dual-CPU rigs in reference to today's environment is you, Prisoner. I fire up a game on my PC maybe once a month, so honestly buying ANY PC component for gaming reasons is more than a little ridiculous in my case (that's what I have an Xbox for). However, I have plenty of reasons to run dual CPUs: I mess around with making my own DVDs, occasionally I am known to compile a kernel, etc. These are becoming increasingly popular in the average home as well, especially with DVD recorders getting cheap and people wanting to convert those home movies.
As for games, my only point was that the installed base is being created now. I'd reckon that at this point there are more HT-compatible P4s sold than there are NV30 or R300 class and higher graphics cards on the market, and developers are already making games targeted at those platforms. All it would really take is Epic and id making their next-generation engines more multi-threading friendly and you'd see mass adoption, since those engines form the basis for a huge number of games. The potential for major increases in gaming performance is there, it just has not been tapped yet.
However, as I said, gaming is a relatively *minor* reason for dual CPU adoption. Believe it or not, most people don't do any sort of serious gaming on their PC, so it would really never be much of a selling point...
JADS - Tuesday, March 23, 2004 - link
#27, what is your obsession with games? Anyway, my argument that dual-CPU systems are highly relevant to enthusiasts stands, and that has very little to do with games and more with multi-tasking and highly demanding applications such as video editing, image rendering, code compilation, server duties, etc... Anyway, the gap between dual and single CPU systems with regards to games really is quite small these days, and mostly it is down to the board in question being focused on stability and reliability rather than outright performance. I'm guessing you wouldn't want for games performance from a dual Athlon FX-53 system on an nVidia nForce3-250 chipset.
AMDfreak - Tuesday, March 23, 2004 - link
I'll be waiting for PCI Express versions too. It doesn't appear that the jump to A64 is going to give me enough of a speed increase over an OC'd Barton until I'm ready to replace my 9800 Pro anyway.
truApostle - Tuesday, March 23, 2004 - link
all your base belong to them
prisoner881 - Tuesday, March 23, 2004 - link
#24 and #25, the idea of buying "ahead of the curve" for technology has historically been a stupid, cost-ineffective idea. Buying a duallie system today (at mucho $$$) because you expect to find duallie-ready games in the next three to five years is just dumb use of your money. I say three to five years because that's how long it's going to be before gaming companies produce software that either demands dual CPUs or demands Hyperthreading. In the meantime, you'll have one very expensive processor on a very expensive motherboard just sitting around twiddling its thumbs. And by the time these games DO come out, both of your CPUs (and very likely your motherboard as well) will be obsolete. Such is the way of things. Now, one of you DID touch on a good reason to get a duallie system, namely if you're doing compute-intensive stuff like 3D rendering. I happen to do that for a living, and I've got 8 dual Athlon systems in a render farm. Much more cost effective than single-CPU systems, but none of them will ever win any points in a gaming match.
agent2099 - Tuesday, March 23, 2004 - link
AC97 audio? This is a step backwards from nForce2. Where is the MCP-T?
JADS - Tuesday, March 23, 2004 - link
#22 I didn't say specifically for games, I said enthusiast. A dual-CPU system is inherently more flexible, be it compiling code faster, rendering pictures quicker, or multi-tasking with many apps. How many enthusiasts simply run one program at a time? I know I don't, and I could make use of a powerful dual-CPU system. Dual-CPU systems do not need to run with ECC/Registered memory, although typically due to the target market this is a feature. Running a dual-processor FX system with standard DDR memory could be a very fast and cost-effective machine.
Reflex - Tuesday, March 23, 2004 - link
#22: I agree with you until you get to the part about 'never will'. HyperThreading is making developers consider making their apps multi-threaded, and most likely multi-core CPUs will start being introduced sometime next year. When most machines sold have the ability to process more than one thread at a time, it would be pretty stupid to ignore that factor. So for now, multiple CPUs are not that helpful for *gaming*, although they are for many other applications. In the future, however, I expect them to be very helpful for everything, including gaming.
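As a rough illustration of what that shift looks like in the simplest case, here is a minimal, hypothetical sketch (not from any real engine; the subsystem names are made up): per-frame work with no data dependencies gets handed to a thread pool instead of running in sequence. Real engines need much finer-grained job systems and careful synchronization, which is exactly why the change in design philosophy is slow.

```python
# Minimal, hypothetical sketch of a multi-threaded game loop (not real engine
# code): independent per-frame subsystems are submitted to a thread pool and
# joined before the frame is presented, instead of running one after another.
from concurrent.futures import ThreadPoolExecutor

def update_physics(dt): pass   # stand-in subsystems; a real engine does work here
def update_ai(dt): pass
def update_audio(dt): pass

def single_threaded_frame(dt):
    # The traditional game loop: everything runs in order on one core.
    update_physics(dt)
    update_ai(dt)
    update_audio(dt)

def multi_threaded_frame(pool, dt):
    # Subsystems with no data dependencies on each other can run concurrently.
    # The hard part in a real engine is that most subsystems DO share data,
    # which is why this shift in design philosophy is slow and error-prone.
    futures = [pool.submit(fn, dt) for fn in (update_physics, update_ai, update_audio)]
    for fut in futures:
        fut.result()   # wait for every subsystem before rendering the frame

pool = ThreadPoolExecutor(max_workers=2)   # e.g. a dual-CPU or Hyper-Threaded box
multi_threaded_frame(pool, 1.0 / 60)
```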
Wesley Fink - Tuesday, March 23, 2004 - link
#4 - Ass-kissing has never been my forte. I consider myself an equal-opportunity offender. After finding none of the AGP locks worked on Round 1 chipsets, you better believe I would test for myself whatever I am told about the new boards.
Frankly, I really like the nF3-250Gb, but I also hear good things about the SiS 755FX for 939 (1200HT) and VIA's update for 939. After some of the crap we've had to endure with Round 1 chipsets, it will be nice to have some good Athlon 64 choices in Round 2.
prisoner881 - Tuesday, March 23, 2004 - link
#20, gamers that buy dual-CPU systems are just being stupid. Practically no game out there makes good use of more than one CPU, and none are planned. Add to that the overhead of having additional CPUs in the system, the cost of a dual system versus a single, and the slower memory (Reg'd ECC), and you've got a tremendous waste of money. I have *never* seen a dual-CPU game box outrun a single-CPU game box, and I doubt we ever will.
prisoner881 - Tuesday, March 23, 2004 - link
#18, I know it's full duplex, but even then you will have a hard time getting full utilization under normal working conditions. Benchmarks are designed to run things at unrealistic rates. The point is, although I don't encourage it, you can certainly put Gigabit on the PCI bus and get very usable performance out of it. In most cases, the limiting factor is going to be CPU utilization anyway.
JADS - Tuesday, March 23, 2004 - link
External HDDs could make good use of a FireWire connection, especially now it is whizzing along at 800Mbit/s. The multi-CPU implementation sounds interesting; of course AMD will completely fail to capitalise on it by not making the FX dual-processor capable. How many enthusiasts (AMD-wise) could resist the chance of dual FX-53s, especially with the possibility of overclocking them? You have the distinction between the 2xx series and the FX due to the removal of ECC/Registered memory in the FX 939 series, so they essentially serve two different markets.
sprockkets - Tuesday, March 23, 2004 - link
Why would you need FireWire with USB2? OK, iPod and camcorders. I have one question: since you use a browser to configure the firewall, does this mean it is OS independent, i.e., I can use it in Linux without needing drivers to run it?
Soundstorm not present on here, oh well. Almost all uATX boards had the MCP and not the MCP-T so it didn't matter much, and it doesn't work in Linux anyhow. VIA sound is troublesome in Linux too. I'd rather use my own sound card. Just hope there is a driver for the cool LAN adapter.
Wesley Fink - Tuesday, March 23, 2004 - link
#10 - LAN is full duplex. Gigabit on PCI with overhead can do about 820Mbit/sec in industry-standard tests. nVidia's on-chip LAN could output about 1840Mbit/sec in the benchmarks we have seen. This is more than twice as fast IF you have a source that can actually drive 1Gbit in both directions.
#11 -
PCI Express will be seen on Intel boards very soon. AMD boards will not move as rapidly to the Intel PCI Express standard.
#12 -
FireWire is not on-chip. Undoubtedly many manufacturers will add FireWire with an additional chip on their nF3-250 boards.
fla56 - Tuesday, March 23, 2004 - link
"No one can possibly complain about the feature-set of nForce3-250" - to add my vote to what's already been said, no FireWire for my iPod and no SoundStorm/Dolby Digital for that lovely Yamaha amp I just bought mean I think someone needs to calm down a little about all that excitement (and learn a little about the difference between megabits and megabytes, by the sound of things).
I wonder if they'll release SoundStorm as a PCI Express card...
Reflex - Tuesday, March 23, 2004 - link
#8: Actually, to date nVidia has had a *very* troublesome PCI implementation; anyone with a PCI RAID controller and a 4-disk RAID 0 array can tell you that. It is so bad, in fact, that prototype NF3-150 boards for Opteron used AMD PCI chips just to avoid using the nForce3 integrated PCI bus. I am not certain if those boards ever reached production status, however. As for this chipset, it looks nice, but honestly I'll wait until there is a PCI Express solution out there. I was just forced to upgrade my motherboard prematurely due to power problems destroying my equipment, and I don't intend to buy another until the next wave of features is available...
DAPUNISHER - Tuesday, March 23, 2004 - link
Keep your eyes open for my AN50R listing for sale at rock bottom pricing in the FS/FT forum when the 250 is on shelves :D
fla56 - Tuesday, March 23, 2004 - link
prisoner881 - Tuesday, March 23, 2004 - link
Looks like another error on the "Conclusion" page. Last sentence, second paragraph says "We expect that some enterprising companies, which specializes in catering to the computer enthusiast, will slip in some Socket 954 boards based on the Ultra chipset with a Gigahertz HyperTransport."Socket 954? Methinks that ought to be Socket 754.
arswihart - Tuesday, March 23, 2004 - link
What about FireWire connectors? Do you guys think they'll be added to production boards?
Curt Oien - Tuesday, March 23, 2004 - link
PCI EXPRESS?
prisoner881 - Tuesday, March 23, 2004 - link
There's a huge gaffe on the On-Chip Gigabit page. It states that Fast Ethernet runs at "100MB/sec" and Gigabit runs at "1000MB/sec." "MB" is shorthand for mega*bytes*, not mega*bits*. Megabits should be abbreviated "Mb." Normally I wouldn't be this anally retentive, but the poor usage leads to another problem later on down the page. The article states that Gigabit Ethernet running at "1000MB/sec" is faster than the PCI bus, which runs at "133MB/sec." The PCI rate figure is correct, but the Gigabit figure makes it look like Gigabit is about 8 times faster than the PCI bus itself. *It's not!* The PCI bus runs at (133Mbytes/sec x 8 bits/byte = ) 1064Mbit/sec, which is faster than Gigabit. The article is very misleading in this respect.
In truth, the PCI bus can almost never reach its peak 133MB/sec rate (usually it's around 100MB/sec), but then again Gigabit can't reach its peak either.
Regardless, the article is completely incorrect when it indicates a Gigabit card would overwhelm a PCI bus. This is not true.
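For what it's worth, a quick unit check of the figures being debated in this sub-thread, using theoretical peaks only and folding in the full-duplex point raised in the other comments, comes out roughly like this:

```python
# Unit check for the gigabit-vs-PCI debate (theoretical peaks only; real-world
# PCI and Ethernet throughput both fall well short of these numbers).

PCI_MB_S = 33e6 * 4 / 1e6          # 33MHz x 32-bit (4 bytes) ~= 132 MB/s, shared bus
GIGE_MB_S = 1000 / 8               # 1000 Mbit/s = 125 MB/s in one direction
GIGE_DUPLEX_MB_S = GIGE_MB_S * 2   # full duplex: up to 250 MB/s aggregate

print(f"PCI 32/33 bus:          {PCI_MB_S:.0f} MB/s (shared with every other PCI device)")
print(f"Gigabit, one direction: {GIGE_MB_S:.0f} MB/s (fits on PCI, barely)")
print(f"Gigabit, full duplex:   {GIGE_DUPLEX_MB_S:.0f} MB/s (roughly double what PCI offers)")
```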
BikeDude - Tuesday, March 23, 2004 - link
Argh... I keep forgetting that it's 1000Mbps _full duplex_... nVidia are indeed correct, the PCI bus is only half that speed. :-/
--
Rune
BikeDude - Tuesday, March 23, 2004 - link
First off: GB is GigaByte. Wesley wrote "GB" more than once while actually referring to Gigabit (bit has a lowercase b). Next, 1000Mbps is roughly 125MB/s (theoretical peak, I expect). 33MHz 32-bit PCI is roughly 133MB/s. I dislike PCI Gb implementations as much as the next guy, but I'd still like to know how nVidia managed to come up with the half-speed figure. Perhaps nVidia's PCI-bus implementation is sub-par? (which is a real issue! VIA has struggled with really bad PCI performance for years :-( )
Finally, there's 6-channel audio; what happened to Soundstorm and Dolby encoding implemented in hardware? (I currently use only the SPDIF connectors on my nForce2 and get surround sound both in games and while playing DVDs -- is there no way to get this functionality with Athlon64?)
Hopefully the next article will shed some light on some of these issues. Cheers! :)
--
Rune
KristopherKubicki - Tuesday, March 23, 2004 - link
GigE is awesome and worth it. I dunno about the firewall, but eh. 45MB/s network transfers are fun.
Kristopher
Verdant - Tuesday, March 23, 2004 - link
Schweet... when is my 16x nForce 250 mobo coming in the mail?
klah - Tuesday, March 23, 2004 - link
hmmm.. seems that last page was slipped in from the November SiS article. weird.
Phiro - Tuesday, March 23, 2004 - link
Yeah, the SiS 755FX plug at the end was sort of a red herring - it didn't fit at all with the article, which was solely about nVidia, and it didn't need SiS's recent efforts tacked on at the last second. A couple of things:
1) To all you naysayers about the worth of gigabit ethernet - I thumb my nose at you! Let's not play chicken-or-the-egg games here, let's just usher in new *desired* technology as smoothly as possible. Having gigabit ethernet will push me to replace my Netgear 10/100 switched hub, not the other way around.
2) AnandTech, what's with the nVidia ass-kissing? When you say things like "nVidia assured us..." and "We did test nVidia's claim... [and we believe it]" - come on, a little healthy doubt is a good thing. Just because they supplied you with a reference nForce3 250 mobo doesn't mean you have to see how far you can stick your tongue up their butt. Honestly, the article felt like it leaned toward nVidia a bit. Believe it or not, you can report on a product without it sounding like some money changed hands or something.
mechBgon - Tuesday, March 23, 2004 - link
*drool*
bldkc - Tuesday, March 23, 2004 - link
What's with the SiS 755 crap at the end of the article? Someone didn't proofread, huh? That is also obvious in the spelling errors. Excellent article otherwise, better than recent ones. I do wish that you had been able to include the performance portion, 'cuz now I'm itching to see the numbers. One thing though: how many people have several gigabit systems at home? I know I will not upgrade any of mine until they are replaced, so it will be a while. Therefore I am not too excited at this point, especially if the high-speed wireless standards work out to high enough throughput to allow real-time multimedia transfers. Love the on-chip firewall, but ZoneAlarm is still the only useful application-specific solution I know of. Not that I'm an expert, I am far from it, but the BlackICE debacle was seen coming long ago.
wicktron - Tuesday, March 23, 2004 - link
weeeeeee!! pci/agp lock!