3dfx Archive
http://www.falconfly.de/cgi-bin/yabb2/YaBB.pl?num=1080494071

Message started by whodovoodoo2man on 28.03.04 at 19:14:31

Title: Opinions on Duron processor
Post by whodovoodoo2man on 28.03.04 at 19:14:31
Opinions on the Duron processor please.

Considering building a glide box with my Voodoo 5500.

Duron 1.6 is about £25 at the moment - dirt cheap.

Already have an Athlon XP 2600+ based system with twin Voodoo2's and a GeForce 4600 card, so I don't need this system as a main gaming rig.

Cheers

Andy

Title: Re: Opinions on Duron processor
Post by FalconFly on 28.03.04 at 19:18:49
The Duron 1.6GHz is a nice CPU, but I'd look out for used AthlonXP's as well.

Depending on the deal, you might get quite a bit better 'bang for the buck' with those.

Title: Re: Opinions on Duron processor
Post by DenisF on 28.03.04 at 20:08:45
I'ma get beaten to death with a yardstick for this, but Duron CPUs suck ass.

They're like.. eek.. Celerons..
might as well get a cheap Celeron 1.4GHz (Tualatin) or a Celeron 2.4GHz, they cost about the same..

Title: Re: Opinions on Duron processor
Post by amp_man on 28.03.04 at 21:09:45
Hmm, I've built only one Duron system. I wasn't extremely impressed, but it did the job. Of course, it was for an internet/email/word processing machine, so gaming wasn't of overly great importance, and it was only a 1.2GHz CPU. OTOH, I think you can probably find an Athlon XP for a decent price and get much better bang for your buck, as someone else said.

Also, the Celeron and Duron really shouldn't be compared too much... a Celeron is a much better CPU, unless you're one of those nuts who wants to try modding your Duron (don't ask me about this, I know very little about it). But you can almost undoubtedly get an AXP for the price of a Celeron, which would waste a Celeron any day. I wouldn't even think, though, about getting a Socket 370 CPU; they're so old and expensive for the performance you get from 'em.

Title: Re: Opinions on Duron processor
Post by FalconFly on 28.03.04 at 21:39:01
*lol*

The intel Celeron is known to have by far the worst IPC ever observed in any x86 CPU design to date :P

The AMD Duron, however, still packs its 128k of L1 Cache, plus the exclusive 64k L2 Cache (exclusive meaning the two don't duplicate data, so 192k of on-die cache total), bundled with the far superior FPU of the Athlon family.
Add SSE to that, with a fairly low specific power consumption (Celerons have almost the same TDP as Pentium4's!), and you have a really decent CPU in the very low price region.

At 1600MHz, it is measured almost consistently faster than a Celeron 2600MHz (!)

And I don't know where you guys live, but here, the Celeron 2600MHz is at least 2x (rather 2.5-3x) the price of a Duron 1600MHz :P
-----------------
All in all, that's why tech-experienced Users normally won't even touch a Celeron.

Against the AMD Duron, there is practically only one competitor :
...the AMD AthlonXP

To make a long story short :
If you buy a Celeron, you either want to spend 2-3x the money on equal or less performance, or really have an odd reason to do so (e.g. keeping an existing intel platform at all costs).

Title: Re: Opinions on Duron processor
Post by Blazkowicz on 28.03.04 at 21:39:36
er, the Celeron P4 is the worst Intel CPU since the Celeron 266 and 300 without L2 cache ;)
you can get an XP2400+ at the same price as a Celeron 2.4GHz

that's what I chose for my main PC, because it maxes out my K7S5A, which is one of the last mobos to support my V5 5500 AGP

Title: Re: Opinions on Duron processor
Post by DenisF on 29.03.04 at 07:58:59
http://www.vanshardware.com/reviews/2002/01/020123_Duron_13/020123_Duron_13.htm

^the only review i could find of a 1.6ghz duron


BTW nothing beats the good ol' Tualatin P3.
Those babies can give the P4 2.4GHz a run for its money.

And the only difference between a P3 Tualatin and a Tualatin Celeron is the cache: the P3 has 512kb, celery has 256kb.

Same core, same features, same speed etc., the ONLY difference is the cache.

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 30.03.04 at 09:17:02
In fact, if anyone remembers, there was a set of comparison tests between the last P3 (1.4 or 1.5 GHz) and a P4 at 2 GHz. Despite all its huge bandwidth and the 400 MHz bus, the P4 was seriously slower in some apps, even in some games. The only good thing about the P4 is that it was somewhat cheaper for Intel to build, in fact seriously cheaper (fewer transistors, less cache), to compete with the Athlon.

Even now, the P4s look kind of frozen in that MHz battle; they always seek higher and higher frequencies, but overall performance is not scaling so well. And when your CPU hogs 100 W whereas an Athlon uses 60-70, it MAKES a difference in cost-effective performance.

Unfortunately I've heard nothing about VIA C3 processors. They give a run for the money: even if they are slower than a Duron, the lowest power consumption around would have made them the best mobile solution ever. But given that Intel is still in 80% of mobile computers, there is almost no place for AMD, not even talking about VIA.

Title: Re: Opinions on Duron processor
Post by Micha on 30.03.04 at 11:54:35
Pentium4:
remember HyperThreading, SSE3 (Prescott only) etc. - technologies not used extensively yet, but the days will come...believe ;) should i remind you all why the p4 was slower in its very beginning than a p3? the fsb (P3 133, P4 100), but i think we've overcome those days, & with intel having licenses for AMD's 64bit technology, let's see what'll come :) (anyway, i'm not considering buying crappity smacking intel)

Celeron:
sucky cpu, the only good thing is the o/c capacity. i heard of one guy overclocking a 1.5GHz celeron by 100% to 3GHz just by fitting a water cooler on it.

Duron:
faster than a celeron, yeah, but no real o/c at all. at least from what i know (a 1GHz morgan wasn't able to)

AthlonXP:
might be worth the try, if you get a used & cheap one, as somebody replied already.

let's not forget: a voodoo5 needs a highly clocked cpu nowadays if you consider playing modern games, i.e. mesafx & 3d analyzer profit a lot from it when making the cpu do the t&l stuff. so you'd run better with the fastest & cheapest you can get ;D if you only want to give your old glide games a decent hardware environment, you're of course satisfied with a duron (at least i was w/ a TB 1GHz)

Title: Re: Opinions on Duron processor
Post by Micha on 30.03.04 at 12:00:21
er, andrei, i don't think anybody would buy a via cpu for a gaming rig. i wouldn't even buy one at all! of course, cheap stuff, but wasted money anyway. no further comment on that.

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 30.03.04 at 12:20:49
I wasn't even thinking of a gaming rig when talking about the VIA C3. As I said, it is a good CPU for desktop and office apps, and a very low power consumption CPU for laptops. For those things it is very good. Not for games, 3D graphics or video editing.

Comment on Micha's post: I agree almost totally. The only thing is that, in the past as well as now, there are cases in which an older CPU might perform slightly better than you'd expect. It's CPU architecture and design, rather than raw, brute MHz muscle, that counts more sometimes. Besides that, CPU performance is sometimes not as scalable as you might expect it to be...

Title: Re: Opinions on Duron processor
Post by amp_man on 30.03.04 at 22:33:53

Quote:
The Celeron ? yes, i knew : the 300A !

lol, the only celeron I've ever bought! $5, and now it runs at 450 whenever I use it


Quote:
The Celeron better than the Duron ??
And in my town, for the price of a Celeron 2400, i can obtain an Athlon XP 2400+ ... ???


Alright, I'm sorry, but cost-wise and at stock speed, the Duron kicks a**. I meant that if you get the choice between the two for about the same cost, then yeah, the Celeron would probably be a better choice. I have a friend with a 700MHz Celeron running at 784MHz on stock Dell cooling, using SoftFSB I believe. And I'll take Micha's word that Durons don't OC, although I don't OC other ppl's PCs. But you can, as I stated before, get an AthlonXP for the same price as that Celeron, so the whole subject is really pointless.

Title: Re: Opinions on Duron processor
Post by Blazkowicz on 31.03.04 at 21:15:25
the modern celeron is a bastardized P4, even worse than the original Willamette P4, and the duron is now a 0.13µ part. Some people have a duron 1.4 @ 2.2GHz with the missing L2 cache enabled (!) by connecting some bridges on the cpu

Title: Re: Opinions on Duron processor
Post by Micha on 01.04.04 at 15:13:14

Andrew Boiu wrote on 30.03.04 at 12:20:49:
I wasn't even thinking of a gaming rig when talking about the VIA C3. As I said, it is a good CPU for desktop and office apps, and a very low power consumption CPU for laptops. For those things it is very good. Not for games, 3D graphics or video editing.

i suppose the topic was about a cpu for a gaming rig...at least, whodovoodoo2man wrote about a glide box..


Andrew Boiu wrote on 30.03.04 at 12:20:49:
It's CPU architecture and design, rather than raw, brute MHz muscle, that counts more sometimes. Besides that, CPU performance is sometimes not as scalable as you might expect it to be...

guess what? why do you think i mentioned the p3/p4 fsb again? or hyper threading & sse3? of course, most implementations count more than clock --> that's why we'd rather buy an athlonxp than a p4 for hardcore performance, wouldn't we?

concerning this duron/celeron struggle:
as far as i know, both are just "small" models of their mainstream brothers (i.e. p4 & athlonxp) with a cut-down, but still current core. so the duron applebred is a barton with only 64kb l2 cache enabled (somebody out there might have unlocked da sh*t already), and it's the same thing w/ the celeron. just @ lower clocks. :)

>>adding<<
well, the o/clocked duron @ 2.3GHz is interesting, patience! but we know every cpu has its own specific behaviour when o/clocked..so it might be hard to find one that's able to o/c to these levels (without some cooling other than air or water ;D )

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 06.04.04 at 09:55:41
Actually, SSE and HT didn't, and perhaps never will, revolutionise the world, no matter what. The only good use for them is marketing...

What would be more useful than 5GHz would be a CPU, and in fact a whole PC, designed from the ground up with the future in mind. Until now nothing moves this way; just a bit more, a bit more, and it works (or not quite).

At the moment I suppose there are black holes in the minds of CPU developers, as no one actually knows 100% what is inside the CPU, because it has become far too complicated. And really, I see no efficiency when you need a 5GHz CPU. Make some calculations on paper, and even if you take everything into account, you will find that the expected performance and time to complete some operations are far from the real situation. If the above is wrong, why are we talking about scalable performance? If all CPUs were scalable, wouldn't that word be meaningless?

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 08.04.04 at 12:16:45
Still, 5GHz is 5GHz. I see nothing spectacular in that, and frankly I will see even less when thinking that in order to run Word 2020 with pictures alongside text, you will need at least 5GHz (OSes are great at eating CPUs), without running anything else. Sounds logical, economical, fair?

The power of these CPUs will never be used in full. First, they have bugs (1 or 2, but they DO have at least one, cosmetized as "different, new gen out"), BIOSes these days DO have some bugs (5-10 at least), otherwise there wouldn't be hardware compatibility problems, Windows has TONS TONS TONS of bugs, programs made by gaming companies have TONS (x4)... And so on...

The next thing is: why bother making something use the CPU to the max, when a new, faster one will appear next month anyway? All the software and hardware producers know that, and know they shouldn't spend money and time improving something as long as it works. And finally, the only time CPUs would be used near their max would be when we can't produce faster CPUs anymore, when the limit has been reached. But then again, soon another "PC" would be available, so this mad-crazy-SF "science and efficiency is back" script will never be real.

Title: Re: Opinions on Duron processor
Post by FalconFly on 08.04.04 at 19:38:59
*ugh*

As soon as I launch any decent Game, my CPU is utilized to 100% (and with lots of room for improvement actually).

My Network (35700MHz total) is under 100% load all the time, and likewise has lots of room for performance improvements left to be desired.

CPU makers do know quite exactly what is going on in their Designs, but indeed, they do find errors (with intel by far in the lead for some odd reason, judging from their errata sheets).
Yet those aren't "fixed" by introducing new Models (as you assumed), but are fixed in more or less silence with the Introduction of new Steppings.
Only major re-works or improvements of a design are actually publicized to the normal end-user (the AMD Athlon64 CG-Stepping is a rare but good example)

SIMD Instructions are extremely useful and powerful Extensions.
Nothing in the CPU world before ever allowed computing power to be increased by 100% or more.
Unfortunately, not all Problems can be translated into an ideal, large Matrix for SIMD to handle.

Today, most are silently used to boost performance without making much fuss about it (practically all SFX and most AI Engines rely on them heavily); only when a Program can boost performance dramatically (e.g. Video post-processing) is it used as a real marketing feature.
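For illustration, a minimal sketch of the idea in C with SSE intrinsics (the function and names are made up, not from any real Engine - just the pattern a SIMD-using loop boils down to):

Code:
#include <xmmintrin.h>  /* SSE intrinsics */

/* Scale an array of floats, four elements per instruction.
   Assumes n is a multiple of 4 and data is 16-byte aligned. */
void scale_sse(float *data, int n, float factor)
{
    __m128 f = _mm_set1_ps(factor);        /* broadcast factor into all 4 lanes */
    for (int i = 0; i < n; i += 4) {
        __m128 v = _mm_load_ps(&data[i]);  /* load 4 floats at once */
        v = _mm_mul_ps(v, f);              /* 4 multiplies in one instruction */
        _mm_store_ps(&data[i], v);         /* store 4 results */
    }
}

One instruction doing four multiplies is exactly where the "100% or more" figure comes from - provided the data lines up into that ideal Matrix.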

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 14.04.04 at 09:41:29
The first thing to notice is that, for example, when using a text processing program, you never use the full 32 bits. Actually you are far from that need. When working in a graphics design app, you really use 32 bits only when you are working on the graphics themselves (scaling, effects). In a sound processor you use 32 bits only when you load/save a file, and perhaps in some effects.

Finally, in movie making or inside a 3D app, you never use 32 bits for something like AI. Of course you use bigger precision, but I don't remember a game acting on 0.00000000... distance precision. More than that, tons of decisions are things like if I=0... and if I=1 (Boolean), where you don't even use 8 bits (and these things still reserve the whole 32 bit bus)... Only when you calculate geometry or transfer data from CPU to GPU do you really use the full 32bit or wider bus, but most of the time the CPU is not making transfers or calculating geometry...

Even when 64 bit appears on the market, the first thing you get is bigger programs, but you very rarely need to cope with such big numbers.

The conclusion is that, if nothing changes, 32/64/128 bit will never bring real performance increases of more than 20% (commonly 10%), unless we design better compilers, interpreters and assemblers that are able to fit and execute more simultaneous operations on a 32/64/128 bit bus, filling the bus according to the length of the numbers and the complexity of the operations involved, instead of letting a single operation take the whole 32 bit bus for a split second...
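(For what it's worth, the "filling the bus" idea described above does exist as a real technique, usually called SWAR - SIMD within a register. A minimal sketch in C, adding four packed 8-bit values in one 32-bit operation; the masking constants are the standard SWAR trick, not anything from this thread:

Code:
#include <stdint.h>

/* Add four packed 8-bit lanes of a and b in a single 32-bit add,
   masking so carries can't spill from one lane into the next. */
uint32_t paddb_swar(uint32_t a, uint32_t b)
{
    uint32_t low  = (a & 0x7F7F7F7FU) + (b & 0x7F7F7F7FU); /* add low 7 bits per lane */
    uint32_t high = (a ^ b) & 0x80808080U;                 /* recompute each lane's top bit */
    return low ^ high;
}
)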

Title: Re: Opinions on Duron processor
Post by FalconFly on 15.04.04 at 09:09:50

Andrew Boiu wrote on 14.04.04 at 09:41:29:
The first thing to notice is that, for example, when using a text processing program, you never use the full 32 bits. Actually you are far from that need. When working in a graphics design app, you really use 32 bits only when you are working on the graphics themselves (scaling, effects). In a sound processor you use 32 bits only when you load/save a file, and perhaps in some effects.


False. Period.
I could explain, but I feel you're falling back into your old scheme of "I don't know anything about what I'm writing, but I write it nonetheless".

Do you even know what 16bit Memory Addressing is limited to ?
It's a whopping 1 MegaByte !

I bet your OS wouldn't even boot with that amount of RAM.

It took a lot of tricks and performance to virtually extend this Address range, which caused a lot of Coders nervous breakdowns and probably increased Aspirin/Headache medication sales by a fair margin.
But whom am I telling this, you wouldn't understand anyway.
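To put a number on it, here's a sketch of the real-mode segment:offset scheme (the constants are the standard x86 ones; the function itself is just for illustration):

Code:
#include <stdint.h>
#include <stdio.h>

/* Real-mode x86 builds a 20-bit physical address from two 16-bit values:
   physical = segment * 16 + offset. 2^20 bytes = 1 MegaByte addressable. */
uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* Highest reachable address: FFFF:FFFF = 0x10FFEF, just past 1 MB
       (that extra ~64 KB is the HMA, which needed the A20-gate trick). */
    printf("FFFF:FFFF -> 0x%05X\n", real_mode_addr(0xFFFF, 0xFFFF));
    printf("addressable: %u bytes\n", 1u << 20);
    return 0;
}

Every one of those "tricks" (EMS, XMS, the A20 gate) existed only to climb past that 20-bit ceiling.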


Quote:
Finally, in movie making or inside a 3D app, you never use 32 bits for something like AI. Of course you use bigger precision, but I don't remember a game acting on 0.00000000... distance precision. More than that, tons of decisions are things like if I=0... and if I=1 (Boolean), where you don't even use 8 bits (and these things still reserve the whole 32 bit bus)... Only when you calculate geometry or transfer data from CPU to GPU do you really use the full 32bit or wider bus, but most of the time the CPU is not making transfers or calculating geometry...


Hm, so what you're writing is that you have no clue about movie making (Filesize limits, Color precision, post-processing anyone?), nor anything about 3D Engines. Congratulations on your "coming out", but this might be the *hint* wrong place to publish it.

In fact, you can't even use any of your PCI Cards, since they're... evil 32bit *omg* !
Go and try to build your 8bit/16bit 3D Engine that handles 500k Polygons inside a TrueColor (darn, can't! since you won't use 32bit, right?) Engine, and make it run.
See you next life then; you'll still be busy coding in your grave.

I could literally tear apart the nonsense you wrote, word by word, but I'm already wasting my time on your limitless ignorance.


Quote:
Even when 64 bit appears on the market, the first thing you get is bigger programs, but you very rarely need to cope with such big numbers.


Please leave that to those who actually need 64bit. And in case you haven't noticed (which I would deem a miracle), the only relevant current 64bit design is sold mainly because of its excellent 32bit performance, compatibility, and 64bit scalability as a future Option ...


Quote:
The conclusion is that, if nothing changes, 32/64/128 bit will never bring real performance increases of more than 20% (commonly 10%), unless we design better compilers, interpreters and assemblers that are able to fit and execute more simultaneous operations on a 32/64/128 bit bus, filling the bus according to the length of the numbers and the complexity of the operations involved, instead of letting a single operation take the whole 32 bit bus for a split second...


I shall give you a last hint there :
This is what actually happened during the last 10 years (!)
Unless my eyes are dorked, the gcc and intel Compilers have actually superseded V1.0 and 80286 code levels.

If you are still working off an 8bit System using 64k of Memory, then that's you living hopelessly in the past.

And this being a mainly GFX oriented Forum, I shall hint that present designs use 256bit Memory busses (which are used heavily).

PLEASE spare us your whining FFS.
If you have no clue about 5 year old, let alone current, Hardware technology and the developments of the last decade, please log onto your favorite 300 Baud BBS and share your experiences there.
--------------------------
This is nothing personal, but I'm very close to increasing the average Forum quality by removing and banning a single User from the Database.

IMHO, you already deserve a new, unique Rank by now :
Troll

--- edit ----
Alright, here's your new Rank.
After all, you worked very hard for it.

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 16.04.04 at 09:26:45
Anyway, I know very well what I am saying. Then can you prove why 80% of the performance increases were achieved by raising MHz? Try using a P2 (333 MHz) at 133 MHz. There you will see clearly that MMX is not so important, that Quake3 crawls, that your movie compression goes down, down, and that your wonderful 3D animation takes a whole hour to render one frame. Not convinced, right? Even if that CPU were 64 bit it wouldn't change much.

I didn't say anything about memory addressing. Everyone knows what EMS meant back in the 80's DOS era: "page-switching memory", and it was clearly very slow. Also, I see a very bright future for the Athlon 64, as the integrated memory controller will bring very big performance increases. 512 bit accessing is quite normal for the future. What I am saying is that we really have few programs that use 100% 32 bit code, where everything inside them uses 32 bit precision (effectively). Let alone 64 bit, at least for the moment.

The CPU is the machine inside the machine that does everything. This is the main problem. When you are talking about a GPU, almost every innovation can bring benefits; for example, 64 bit color will be useful in the future, along with higher resolutions. The CPU's evolution is much more problematic, and even includes the forgotten part, backward compatibility, which at some point creates big constraints on the design.

And about the rank, leave the joke aside. Is it so hard to prove that so many things depend on MHz these days for CPUs? I don't think it would even take a ton of tests (seconds to do) to prove to you and others what a big difference the CPU operating frequency makes, and, to a lesser degree, FSB and latencies. Of course an older CPU could beat a newer one in performance because of its design, but in general this is not likely to happen, as one CPU enhancement can overcome one problem.

Title: Re: Opinions on Duron processor
Post by amp_man on 17.04.04 at 05:22:32

Andrew Boiu wrote on 16.04.04 at 09:26:45:
Anyway, I know very well what I am saying. Then can you prove why 80% of the performance increases were achieved by raising MHz? Try using a P2 (333 MHz) at 133 MHz. There you will see clearly that MMX is not so important, that Quake3 crawls, that your movie compression goes down, down, and that your wonderful 3D animation takes a whole hour to render one frame. Not convinced, right? Even if that CPU were 64 bit it wouldn't change much.


Omfg...of course when you do something like that you're going to decrease performance. I think what they're trying to say is that clock speed is not the only thing that matters. Which it isn't. Compare, if you will, a 486 DX4 100MHz with a Pentium at even 75MHz. The Pentium will waste the 486 any day. Then do the same for a P2 233 and a P233 MMX. The P2 will win. If you can't tell why, then you need to take a break and learn a few things.

Title: Re: Opinions on Duron processor
Post by FalconFly on 17.04.04 at 13:06:20

Andrew Boiu wrote on 16.04.04 at 09:26:45:
Anyway, I know very well what I am saying. Then can you prove why 80% of the performance increases were achieved by raising MHz? Try using a P2 (333 MHz) at 133 MHz. There you will see clearly that MMX is not so important, that Quake3 crawls, that your movie compression goes down, down, and that your wonderful 3D animation takes a whole hour to render one frame. Not convinced, right? Even if that CPU were 64 bit it wouldn't change much.


Well, if you knew at least a bit about x86 CPU Cores, you'd see what errors you're making again.

So let's see :
- Branch Prediction
- Branch Target Cache
- Register Extensions and renaming
- Speculative Execution
- SuperScalar Execution
- Out-of-Order Execution
- L1 and L2 Code/Data Caches
- Cache Strategies
- n-way Cache association
- Single/Double-Precision Integer and Math SIMD
- integrated Integer/Floating Point ALU
- n-layer CPU Designs
- HyperThreading (intel)
- HyperTransport Channels (AMD)
- Integrated Memory Controller
- Pipelining

The above are only a few reasons why a massive factor of today's CPU Design capabilities is NOT derived from Clock alone ::)
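Branch Prediction, to pick just one from the list, is easy to observe even from plain C. A sketch (timings are illustrative; compile without heavy optimization, or the compiler may replace the branch entirely):

Code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Sum only the elements >= 128. The if() is the branch the CPU's
   predictor has to guess; with sorted data it guesses almost perfectly. */
static long sum_big(const int *v, int n)
{
    long s = 0;
    for (int i = 0; i < n; i++)
        if (v[i] >= 128)
            s += v[i];
    return s;
}

static int cmp_int(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

int main(void)
{
    enum { N = 1 << 20 };
    static int v[N];
    for (int i = 0; i < N; i++)
        v[i] = rand() % 256;            /* random values 0..255 */

    clock_t t0 = clock();
    long r1 = sum_big(v, N);            /* random order: many mispredictions */
    clock_t t1 = clock();

    qsort(v, N, sizeof v[0], cmp_int);  /* sort -> branch becomes predictable */
    clock_t t2 = clock();
    long r2 = sum_big(v, N);
    clock_t t3 = clock();

    printf("random: sum=%ld  %ld ticks\n", r1, (long)(t1 - t0));
    printf("sorted: sum=%ld  %ld ticks\n", r2, (long)(t3 - t2));
    return 0;
}

Same instruction count, same Clock, very different runtime - the difference is the predictor, not the MHz.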

AGAIN :
Dig up some Information before making bogus statements!


Quote:
I didn't say anything about memory addressing.


*sigh*
Memory Addressing has always been a Key Factor for moving from old 8 bit Designs to 16/32/64 bit.
It goes hand in hand with the new Designs, and cannot simply be 'left out' when bashing 64, or even 32bit, like you did.


Quote:
What I am saying is that we really have few programs that use 100% 32 bit code, where everything inside them uses 32 bit precision (effectively). Let alone 64 bit, at least for the moment.


The correct statement would read :
At this time, we have almost no native 16bit Code remaining in currently designed or upcoming Software. Apart from the fact that no sane Coder or Compiler will generate 16bit Code in the first place, it is of no use anymore, and implies terrible restrictions (if anyone were masochistic enough to still use it for whatever reason).

Whether all 32 bits are needed all the time is completely irrelevant.
You can't, for instance, deactivate 2 or even 4 Cylinders of a running V8 Engine on the fly, just because you don't need all the horsepower every now and then. They are there, and they will run along even if they aren't required for isolated strokes.

Feel free to propose your own CPU design that magically and variably switches from an 8bit Core up to a 64bit Core a Billion times a second.

The Binary System has some fixed rules, in case you haven't noticed, which can't be overcome (except for abusing a free Bit for Status Flagging or the like)
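That "free Bit" trick, by the way, is everyday practice; a sketch in C of packing Boolean status flags into one 32-bit word (the flag names are made up for illustration):

Code:
#include <stdint.h>

/* Hypothetical status flags, one bit each, packed into a 32-bit word.
   Each set/clear/test is a single full-width AND/OR - the unused bits
   ride along for free, like the idle cylinders in the V8 above. */
enum {
    FLAG_READY = 1u << 0,
    FLAG_BUSY  = 1u << 1,
    FLAG_ERROR = 1u << 2
};

static inline void set_flag(uint32_t *s, uint32_t f)   { *s |= f; }
static inline void clear_flag(uint32_t *s, uint32_t f) { *s &= ~f; }
static inline int  test_flag(uint32_t s, uint32_t f)   { return (s & f) != 0; }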


Quote:
The CPU is the machine inside the machine that does everything. This is the main problem. When you are talking about a GPU, almost every innovation can bring benefits; for example, 64 bit color will be useful in the future, along with higher resolutions. The CPU's evolution is much more problematic, and even includes the forgotten part, backward compatibility, which at some point creates big constraints on the design.


*ack*
No, the CPU doesn't do it all. Actually, today's CPUs are more free of "generic processes" than ever before, assisted by a myriad of external Logic Chips.
64bit color is already available, albeit only useful for heavy Shader Operations.

I loved your sentence about the "forgotten backward compatibility".
Name any CPU Family apart from the x86 that is still able to execute native Code from its very first design ???
Answer : there is none !

The x86 CPU Design's success is built upon its absolutely unique backwards compatibility!


Quote:
And about the rank, leave the joke aside. Is it so hard to prove that so many things depend on MHz these days for CPUs?


Ehm, you're contradicting yourself there. How can you bash the "GHz frenzy" and 32/64bit, while now claiming that MHz is a major factor in today's CPU power ?!

Anyway, be very careful here....
That wasn't a Joke, but an ample, last warning in your direction.

One more step in the direction you're walking (actually, since the first day you got here), and all countermeasures I listed in the above Posting will go into effect without any further Warning. I've given you more than half a dozen warnings, but you continue to fall back into your old scheme.

Title: Re: Opinions on Duron processor
Post by Andrew Boiu on 20.04.04 at 11:01:50
Since the discussion is getting a bit off topic, it shall be carried on in another thread: http://www.falconfly-central.de/cgi-bin/yabb/YaBB.pl?board=offtopic;action=display;num=1082450366;start=0#0.

I would like to point out that parts of the above are theory, not real figures, because not enough information has been collected so far to be sure of them.
