Opinions on Duron processor (Read 622 times)
Andrew Boiu
Senior Member
LDE-BDreams
Posts: 267
Re: Opinions on Duron processor
Reply #15 - 08.04.04 at 12:16:45
Still, 5 GHz is 5 GHz. I see nothing spectacular in that, and frankly I will see even less when I consider that just to run Word 2020 with pictures alongside the text you will need at least 5 GHz (OSes are great at eating CPU cycles), without running anything else. Does that sound logical, economical, fair?

The power of these CPUs will never be used in full. First, they have bugs (one or two, but they DO have at least one, cosmetically hidden as "different, new generation out"). BIOSes these days DO have some bugs (5-10 at least), otherwise there wouldn't be hardware compatibility problems. Windows has TONS TONS TONS of bugs, and programs made by gaming companies have TONS (x4)... And so on...

The next thing is: why bother making something use the CPU to the max when a new, faster one will appear next month anyway? All the software and hardware producers know that, and know they shouldn't spend money and time improving something as long as it works. And finally, the only time CPUs would be used near their max would be when we can't produce faster CPUs any more, when the limit has been reached. But then again, soon another "PC" would be available, so this mad-crazy-SF "science and efficiency is back" script will never become real.
FalconFly
YaBB Administrator
3dfx Archivist
Posts: 2445
5335N 00745E
Re: Opinions on Duron processor
Reply #16 - 08.04.04 at 19:38:59
*ugh*

As soon as I launch any decent game, my CPU is utilized to 100% (and with lots of room for improvement, actually).

My network (35700 MHz in total) is under 100% load all the time, and likewise has plenty of room left for performance improvement.

CPU makers do know quite exactly what is going on in their designs, but indeed they do find errors (with intel by far in the lead for some odd reason, judging from their errata sheets).
Yet those aren't "fixed" by introducing new models (as you assumed); they are fixed more or less silently with the introduction of new steppings.
Only major re-works or improvements of a design are actually announced to the normal end-user (the AMD Athlon 64 CG-stepping is a rare but good example).

SIMD instructions are extremely useful and powerful extensions.
Nothing in the CPU world before ever allowed computing power to be increased by 100% or more.
Unfortunately, not every problem can be translated into the ideal, large matrix that SIMD handles best.

Today, they are mostly used silently to boost performance without much fuss being made about it (practically all SFX and most AI engines rely on them heavily); only when a program gains an extreme boost (e.g. video post-processing) is SIMD used as a real marketing feature.
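To make the SIMD idea concrete (this is a generic illustration, not any particular CPU's instruction set): the sketch below uses the SWAR trick in Python, adding four packed 8-bit values in one pass over a single 32-bit word, which is the same principle hardware SIMD applies with dedicated wide registers.

```python
def swar_add_u8x4(a: int, b: int) -> int:
    """Add four packed unsigned 8-bit lanes at once (wrapping per lane)."""
    H = 0x80808080              # the high bit of every 8-bit lane
    L = 0x7F7F7F7F              # the low 7 bits of every lane
    low = (a & L) + (b & L)     # lane sums cannot carry into a neighbour
    return (low ^ ((a ^ b) & H)) & 0xFFFFFFFF  # patch the high bits back in

# Four additions for the price of one operation sequence:
packed = swar_add_u8x4(0x40302010, 0x01010101)  # lanes 0x40,0x30,0x20,0x10, each +1
```

Real SIMD units do this over 64- or 128-bit registers with many more lane widths and operations, which is where the "100% or more" gains come from on matrix-shaped workloads.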
Andrew Boiu
Re: Opinions on Duron processor
Reply #17 - 14.04.04 at 09:41:29
The first thing to notice is that, for example, when using a text processing program, you never fully use 32 bits. Actually, you are far from needing that. When working in a graphics design app, you really use 32 bits only when you are working with graphics (scaling, effects). In a sound processor you use 32 bits only when you load/save a file, and perhaps for some effects.

Finally, in movie making or inside a 3D app, you never use 32 bits for something like AI. Of course you use higher precision, but I don't remember a game acting on 0.00000000... distance precision. More than that, tons of decisions come down to things like if I=0 and if I=1 (Boolean), where you don't even use 8 bits (and these things fit nicely into a reserved 32-bit slot)... Only when you calculate geometry or transfer data from CPU to GPU do you really use the full 32-bit or wider bus, but most of the time the CPU is not making transfers or calculating geometry...

Even when 64 bit arrives on the market, the first thing you get is bigger programs; you very rarely need to cope with such big numbers.

The conclusion is that, if nothing changes, 32/64/128 bit will never bring real performance increases of more than 20% (commonly 10%), unless we design better compilers, interpreters, and assemblers that can fit and execute more operations simultaneously on a 32/64/128-bit bus, filling the bus according to the length of the numbers and the complexity of the operations involved, instead of letting a single operation take the whole 32-bit bus for a split second...
« Last Edit: 14.04.04 at 09:43:22 by Andrew Boiu »
FalconFly
Re: Opinions on Duron processor
Reply #18 - 15.04.04 at 09:09:50
Quote:
The first thing to notice is that, for example, when using a text processing program, you never fully use 32 bits. Actually, you are far from needing that. When working in a graphics design app, you really use 32 bits only when you are working with graphics (scaling, effects). In a sound processor you use 32 bits only when you load/save a file, and perhaps for some effects.


False. Period.
I could explain, but I feel you're falling back into the old scheme of "I don't know anything about what I'm writing, but I write it nonetheless".

Do you even know what 16-bit memory addressing is limited to?
It's a whopping 1 MegaByte!

I bet your OS wouldn't even boot with that amount of RAM.

It took a lot of tricks and performance to virtually extend this address range, which caused a lot of coders nervous breakdowns and probably increased aspirin sales by a fair margin.
But who am I telling this; you wouldn't understand anyway.
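For reference, the 1 MB figure falls straight out of the 8086's real-mode segment:offset scheme, sketched below as a simplified Python model (it deliberately ignores A20-gate details):

```python
def real_mode_linear(segment: int, offset: int) -> int:
    """8086 real mode: 20-bit linear address = segment * 16 + offset."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at the 1 MiB boundary

# Two 16-bit registers combined still reach only 2^20 bytes:
top_of_memory = real_mode_linear(0xF000, 0xFFFF)  # 0xFFFFF, the last byte of 1 MiB
one_megabyte = 1 << 20                            # 1048576 bytes
```

EMS and later XMS were exactly the "tricks" mentioned above: bank-switching extra memory in and out of this fixed 20-bit window.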

Quote:
Finally, in movie making or inside a 3D app, you never use 32 bits for something like AI. Of course you use higher precision, but I don't remember a game acting on 0.00000000... distance precision. More than that, tons of decisions come down to things like if I=0 and if I=1 (Boolean), where you don't even use 8 bits (and these things fit nicely into a reserved 32-bit slot)... Only when you calculate geometry or transfer data from CPU to GPU do you really use the full 32-bit or wider bus, but most of the time the CPU is not making transfers or calculating geometry...


Hm, so what you're writing is that you have no clue about movie making (file size limits, color precision, post-processing, anyone?), nor anything about 3D engines. Congratulations on your "coming out", but this might be the *hint* wrong place to publish it.

In fact, you can't even use any of your PCI cards, since they're... evil 32-bit *omg*!
Go and try to build your 8-bit/16-bit 3D engine that handles 500k polygons inside a truecolor (darn, can't! since you won't use 32 bit, right?) engine, and make it run.
See you next life then; you'll still be busy coding in your grave.

I could literally tear apart the nonsense you wrote, word by word, but I'm already wasting my time on your limitless ignorance.

Quote:
Even when 64 bit arrives on the market, the first thing you get is bigger programs; you very rarely need to cope with such big numbers.


Please leave that to those who actually need 64 bit. And in case you haven't noticed (which I would deem a miracle), the only relevant current 64-bit design sells mainly because of its excellent 32-bit performance, compatibility, and 64-bit scalability as a future option...

Quote:
The conclusion is that, if nothing changes, 32/64/128 bit will never bring real performance increases of more than 20% (commonly 10%), unless we design better compilers, interpreters, and assemblers that can fit and execute more operations simultaneously on a 32/64/128-bit bus, filling the bus according to the length of the numbers and the complexity of the operations involved, instead of letting a single operation take the whole 32-bit bus for a split second...


I shall give you a last hint there:
this is what actually happened during the last 10 years (!)
Unless my eyes are dorked, the gcc and intel compilers have actually superseded V1.0 and 80286 code levels.

If you are still working on an 8-bit system using 64k of memory, then that's you living hopelessly in the past.

And this being a mainly GFX-oriented forum, I shall hint that present designs use 256-bit memory buses (which are used heavily).

PLEASE spare us your whining, FFS.
If you have no clue about 5-year-old, let alone current, hardware technology and the developments of the last decade, please log onto your favorite 300 Baud BBS and share your experiences there.
--------------------------
This is nothing personal, but I'm very close to increasing the average forum quality by removing and banning a single user from the database.

IMHO, you already deserve a new, unique rank by now:
Troll

--- edit ----
Alright, here's your new Rank.
After all, you worked very hard for it.
« Last Edit: 15.04.04 at 15:38:34 by FalconFly »
Andrew Boiu
Re: Opinions on Duron processor
Reply #19 - 16.04.04 at 09:26:45
Anyway, I know very well what I am saying. Then can you explain why 80% of the performance increases were achieved by raising MHz? Try using a P2 (333 MHz) at 133 MHz. There you will see clearly that MMX is not so important, that Quake 3 is crippled, that your movie compression goes down, down, and that your wonderful 3D animation takes a whole hour to render one frame. Not convinced, right? Even if that CPU were 64 bit, it wouldn't change much.

I didn't say anything about memory addressing. Everyone knows what EMS meant back in the 80's DOS era: "memory page switching", and it was clearly very slow. Also, I see a very bright future for the Athlon 64, as the integrated memory controller will bring very big performance increases. 512-bit access is very normal for the future. What I am saying is that we really have few programs that use 100% 32-bit code, with everything inside them using 32-bit precision (to be effective). Let alone 64 bit, at least for the moment.

The CPU is the machine inside the machine that does everything. That is the main problem. When you are talking about a GPU, almost every innovation can bring benefits; for example, 64-bit color will be useful in the future, along with higher resolutions. CPU evolution is much more problematic, and even includes the forgotten matter of backward compatibility, which at some point creates big constraints on the design.

And about the rank, drop the joke. Is it so hard to accept that so many things depend on MHz for today's CPUs? I don't think even a ton of tests (a second to run them) would be needed to prove to you and others what a big difference the CPU operating frequency makes, and, to a lesser degree, the FSB and latencies. Of course an older CPU could beat a newer one on performance because of its design, but in general this is not likely to happen, as a single CPU enhancement can overcome a single problem.
amp_man
Ex Member
Re: Opinions on Duron processor
Reply #20 - 17.04.04 at 05:22:32
Quote:
Anyway, I know very well what I am saying. Then can you explain why 80% of the performance increases were achieved by raising MHz? Try using a P2 (333 MHz) at 133 MHz. There you will see clearly that MMX is not so important, that Quake 3 is crippled, that your movie compression goes down, down, and that your wonderful 3D animation takes a whole hour to render one frame. Not convinced, right? Even if that CPU were 64 bit, it wouldn't change much.


Omfg... of course when you do something like that you're going to decrease performance. I think what they're trying to say is that clock speed is not the only thing that matters. Which it isn't. Compare, if you will, a 486 DX4 at 100 MHz with a Pentium at even 75 MHz. The Pentium will waste the 486 any day. Then do the same for a P2 233 and a P233 MMX: the P2 will win. If you can't tell why, then you need to take a break and learn a few things.
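The comparison above boils down to one line: delivered performance is roughly clock multiplied by instructions-per-clock (IPC). A toy Python model follows; the IPC figures are illustrative assumptions only (the superscalar Pentium retires more instructions per cycle than a 486), not measured values.

```python
def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Toy model: millions of instructions/second = clock * IPC."""
    return clock_mhz * ipc

i486_dx4_100 = throughput_mips(100, 0.8)  # assumed IPC ~0.8, one pipeline
pentium_75   = throughput_mips(75, 1.6)   # assumed IPC ~1.6, two pipelines

# Despite the lower clock, the Pentium comes out ahead in this model.
faster = pentium_75 > i486_dx4_100
```

The same reasoning explains the P2 233 vs P233 MMX case: equal clock, higher IPC, higher real performance.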
FalconFly
Re: Opinions on Duron processor
Reply #21 - 17.04.04 at 13:06:20
Quote:
Anyway, I know very well what I am saying. Then can you explain why 80% of the performance increases were achieved by raising MHz? Try using a P2 (333 MHz) at 133 MHz. There you will see clearly that MMX is not so important, that Quake 3 is crippled, that your movie compression goes down, down, and that your wonderful 3D animation takes a whole hour to render one frame. Not convinced, right? Even if that CPU were 64 bit, it wouldn't change much.


Well, if you knew at least a bit about x86 CPU cores, you'd see what errors you're making again.

So let's see :
- Branch Prediction
- Branch Target Cache
- Register Extensions and renaming
- Speculative Execution
- SuperScalar Execution
- Out-of-Order Execution
- L1 and L2 Code/Data Caches
- Cache Strategies
- n-way Cache association
- Single/Double-Precision Integer and Math SIMD
- integrated Integer/Floating Point ALU
- n-layer CPU Designs
- HyperThreading (intel)
- HyperTransport Channels (AMD)
- Integrated Memory Controller
- Pipelining

The above are only a few of the reasons why a massive share of today's CPU design capability is NOT derived from clock speed alone.

AGAIN :
Dig up some Information before making bogus statements!

Quote:
I didn't say anything about memory addressing.


*sigh*
Memory addressing has always been a key factor in moving from old 8-bit designs to 16/32/64 bit.
It goes hand in hand with the new designs and cannot simply be left out when bashing 64, or even 32, bit as you did.

Quote:
What I am saying is that we really have few programs that use 100% 32-bit code, with everything inside them using 32-bit precision (to be effective). Let alone 64 bit, at least for the moment.


The correct statement would read:
at this time, we have almost no native 16-bit code remaining in currently designed or upcoming software. Apart from the fact that no sane coder or compiler will generate 16-bit code in the first place, it is of no use anymore and implies terrible restrictions (if anyone were masochistic enough to still use it for whatever reason).

Whether all 32 bits are needed all the time is completely irrelevant.
You can't, for instance, deactivate 2 or even 4 cylinders of a running V8 engine on the fly just because you don't need all the horsepower every now and then. They are there, and they will run along even when they aren't required for isolated strokes.

Feel free to propose your own CPU design that magically and variably switches from an 8-bit core up to a 64-bit core a billion times a second.

The binary system has some fixed rules, in case you haven't noticed, which can't be overcome (except for abusing a free bit for status flagging or the like).
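That "abusing a free bit" aside can be sketched too: with 4-byte alignment the two low bits of an address are always zero, so one of them can carry a status flag. A hypothetical Python illustration (the names `tag`/`untag` and the "dirty" flag are invented for the example):

```python
def tag(addr: int, dirty: bool) -> int:
    """Pack a status flag into bit 0 of a 4-byte-aligned address."""
    assert addr % 4 == 0        # alignment guarantees the low bits are free
    return addr | int(dirty)

def untag(word: int):
    """Recover the original address and the flag."""
    return word & ~0x3, bool(word & 0x1)

addr, dirty = untag(tag(0x1000, True))   # -> (0x1000, True)
```

This is the one legitimate way to squeeze extra meaning out of a fixed-width word without changing the word's width.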

Quote:
The CPU is the machine inside the machine that does everything. That is the main problem. When you are talking about a GPU, almost every innovation can bring benefits; for example, 64-bit color will be useful in the future, along with higher resolutions. CPU evolution is much more problematic, and even includes the forgotten matter of backward compatibility, which at some point creates big constraints on the design.


*ack*
No, the CPU doesn't do it all. Actually, today's CPUs are freer from "generic processing" than they ever were before, assisted by a myriad of external logic chips.
64-bit color is already available, albeit only useful for intensive shader operations.

I loved your sentence about the "forgotten backward compatibility".
Name any CPU family apart from x86 that is still able to execute native code from its very first design???
Answer: there is none!

The x86 design's success is built upon its absolutely unique backwards compatibility!

Quote:
And about the rank, drop the joke. Is it so hard to accept that so many things depend on MHz for today's CPUs?


Ehm, you're contradicting yourself there. How can you bash the "GHz frenzy" and 32/64 bit, while now claiming that MHz is a major factor in today's CPU power?!

Anyway, be very careful here....
This wasn't a joke, but an ample, last warning in your direction.

One more step in the direction you're walking (actually, since the first day you got here) and all the countermeasures I listed in the above posting will go into effect without any further warning. I've given you more than half a dozen warnings, but you keep falling back into your old scheme.
Andrew Boiu
Re: Opinions on Duron processor
Reply #22 - 20.04.04 at 11:01:50
Since the discussion is getting a bit off topic, it will be carried on in another thread: http://www.falconfly-central.de/cgi-bin/yabb/YaBB.pl?board=offtopic;action=displ....

I would like to point out that parts of the above are theory, not real figures, because there is not enough information collected so far to be sure of them.