If heat or power matters to you, the new ATI chips use a LOT less power than nVidia ones.
Also, I never really trusted ATI's driver stories.
Besides, a 5850 at full load still consumes considerably less power than a GTX 460 (and outstrips it hands down performance-wise). The only reason the GTX 460 is deemed a success in terms of power consumption is because it doesn't suck as much as the other Nvidia cards, and not because it's better than the ATI ones.
http://www.tomshardware.com/reviews/geforce-gtx-460-gf104-fermi,2684-14.html
One or two watts is considerable? Kishy, do some research for yourself; brand loyalty causes more fanboyism with video cards than anything else I've seen. I've got the Nvidia GTX 260, GTX 275, and GTX 470. Because Nvidia's CEO is such a **** (and he is), there is much hatred towards Nvidia. The cards are fine and mine don't run hot, although I know how to set them up correctly. The drivers are great and make a difference.
It's absolutely true that the current ATI/AMD cards are great and are more efficient; however, their driver team is tiny compared to Nvidia's. I read almost all the video card forums and there is quite a battle between both VC makers.
ATI has some severe driver problems with certain games and especially with Crossfire & Eyefinity. The forums show this a lot. I have owned many of both and would have liked to buy an ATI this last year, but (for ME) the driver problems affected the type of games I was interested in. This may not be an issue for you at all. I could easily recommend either card depending on how big you wanted to go. I'm not fanboyish at all. I just tell people that Nvidias are not the junk they seem to be advertised as. I LIKE ATI too.
I use the 4870x2. I want another one so my rig is maxed out for GPUs, but the next gen is the ATI HD 5970 (http://www.pricewatch.com/gallery/video_cards/radeon_hd_5970)
Don't SLI two 9800GTs. SLIing two bad cards isn't a good idea. The GTX 460 appears to be promising but I don't think anyone should buy it until the 1GB version is available for $200 or less. If you're willing to buy two 9800GTs, you can buy a 5850 for cheaper than two would cost you and get basically the best card on the market, all factors included (power consumption, yada yada).
I'm wary of that motherboard. I'd rather buy a better one, but they stopped making all the good X58 boards (Bloodrage, anyone?).
I would buy the 5850 at a $250 price point but apparently that's not the retail price right now:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102884&cm_re=5850-_-14-102-884-_-Product
If you can get one for $250 in any way possible, that's the best buy for your money.
In fairness, in the year Vista came out, 30% of all BSODs reported to MS were caused by the nVidia driver failing.
My pro-nVidia stance is largely based on nVidia's superior driver support for non-Windows operating systems. But I have heard that ATI's drivers for Windows are pretty good by this stage.
not some huge expensive gaming rig, since there aren't really any good games coming out for pc's anyway.
Bah, who needs them 3D games when you have SVGA:
I played most of those and got bored of them already, like 15-20 years ago. lol Stonekeep brings back some memories. lol Hexen was fun too back then. I thought it was so cool. lol
I'm surprised you played them to be honest! In that case, you get the MS-DOS gaming nerd award, for playing some of the best DOS games released:
http://geekhack.org/attachment.php?attachmentid=11770&stc=1&d=1279424515
EVERYONE GIVE CHIMERA A BIG APPLAUSE!
I never got to play Hexen or Heretic 1 (only the demo of Heretic 1). So I still have lots of gaming ahead of me.
You know, they could easily make games in that SAME FASHION with updated graphics, it would be so amazing. But no... stupid game industry.
Consult me before buying a power supply. I am like Kishy, Ripster, and Ricercar rolled into one, only for power supplies.
A.) Ebay = no
B.) Open box = meh
C.) Not that great a power supply, though not as bad as some of BB's other in-house PSUs (Dynex and early Rocketfish). The X7 is one of Huntkey's better products, but I still don't feel I can recommend it, especially open box on ebay.
By the way, it seems he sells a lot of stuff from Best Buy, and for significantly below even the employee-discount price (cost + 5-10%). I'm guessing he steals from BB and lists the stuff on ebay. The problem with that is if you need it replaced and bring it in, it'll be registered as stolen (via UPC). So double bad there.
You only need a decent 400W power supply for your system (even with the GTX460), and I highly recommend a quality 400W over a cheap 900-1000W.
No risk is worth it when it comes to power supplies.
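To put rough numbers on that 400W recommendation, here's a back-of-the-envelope sketch in Python. Every wattage below is a ballpark assumption for a 2010-era single-GPU build, not a measured figure from the thread:

```python
# Rough PSU sizing sketch. All wattages are ballpark assumptions
# for a 2010-era single-GPU build, not measured values.
parts = {
    "CPU (quad core, stock)": 95,
    "GTX 460 (peak)": 160,
    "Motherboard + RAM": 50,
    "HDD + optical + fans": 35,
}

peak_draw = sum(parts.values())   # worst-case simultaneous load, in watts
headroom = 0.8                    # keep sustained load <= 80% of the rating
min_rating = peak_draw / headroom

print(f"Estimated peak draw: {peak_draw} W")
print(f"Recommended minimum PSU rating: {min_rating:.0f} W")
```

With these made-up numbers the peak lands around 340 W, which is why a quality 400W unit covers the build with margin while a cheap 900-1000W unit buys nothing extra.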
Clearly Chimera's opinions on PSUs are as sophisticated as his opinions on trackballs.
That, and he has never had one fail on him, taking out two graphics cards in the process.
I'm pretty sure I've seen a bad PSU take out some RAM too... Can't remember the specifics though.
Either way, life is too short to play a lottery with low end PSUs, especially when the price of a good PSU is less than the parts that you could end up replacing.
If you read the thread, I explained that I had seen those same brownout symptoms, so I'm aware of them. That type of failure is more common, but as you stated, putting in a new psu fixed the problem, and the parts weren't destroyed.
That type of failure is a heck of a lot more common than the PSU failing and permanently destroying parts.
Google may not be a credible source for reports, but it certainly can be looked at for the frequency of reported events such as this.
There should have been more than one complaint turning up when searching for that term if it were at all common.
Even still, I don't want to have to figure out whether there is some software problem, RAM problem, heating problem, PSU problem etc etc. They tend to cause weird problems that can manifest themselves as failures of other parts. In the example that I used about the DVD drive, if that happened to me, I would have gone and bought a new drive unnecessarily. Just a gigantic waste of time and money even if they don't fry parts (which they do).
You seem to imply that this is a regular occurrence? Would it not make more economic sense to buy a good PSU that will last as long as the other components rather than replacing **** PSUs every once in a while? Those cheap things add up.
So there's no reason to pay out the nose for something basic like a power supply if you can meet the necessary requirements of your build for less.
That's where you're wrong. There is reason, but you just don't see it.
That's not really a valid comparison, because buying replacement parts from vendors is extortion.
And I don't buy the most expensive PSU on the market. I read around and I find one that is going to be reliable. If it happens to be expensive, so be it.
But $30 DVD drives cannot harm your other parts. And keep in mind that we're having this argument for your benefit; maybe I should just stop wasting my time.
To me it's a lot like you're arguing that I should buy a $200 dvdrw, when there are ones out now for $30, that are perfectly fine and do what I want. The return on the cost from $30 to $200 just isn't that much.
My point is that you can't tell from reviews what is going to be reliable or not. Companies can pay people to review products, reviews can be wrong, and it just isn't a major factor on an individual basis, as you may get the one car out of 1000 that doesn't have brakes even so.
Paying out the nose for what people think won't fail 2-3 years down the road is no different from buying a PSU from Apple that costs $400 and carries a 1000% markup on the parts.
Besides, there's a lot of over-engineering that goes into most of those overpriced PSUs. Why pay $200 for a PSU built to last 10 years when the average lifespan of a computer is 3? My mother's system is 8 years old because she doesn't need much power for what she does (email and whatnot). She's had one PSU failure in the last few months on an 8-year-old system that's been running continuously, and I didn't pay anywhere near a major amount for the original one that failed. It died because it was full of dust, not from manufacturer fault. Her system was never anywhere near a gaming rig, of course.
If I were going to build 50000 systems, I would want to review every part that went into a psu to make sure it was quality, but on an individual basis stuff like that just isn't that reliable or important.
Oh, if you're only gaming, there's no reason to go i7.
I read the last few pages, skipped this one, blah blah. Just one thing stuck out, about not being able to find good reviews? Here you are.
http://www.overclock.net/power-supplies/738097-psu-review-database.html
Shinji2k went to all the trouble of listing every "good" review from every reliable (i.e., has testing equipment, knows what they're doing, isn't paid off) reviewer in the industry. Not every power supply has been reviewed, of course, but there are a couple of hundred units listed there with competent testing, dissections, and informed opinions.
I would recommend a Phenom II 955/890FX platform. You'll save money, overclock the **** out of it, lose no performance in gaming when compared to i7.
I was talking about graphics cards, I think, but nice site. They don't have the RF900, but they have the 700... that one isn't modular though.
No, not into amd stuff, except maybe in laptops.
You have? I think that's pretty rare. A google search of "power supply failure destroyed graphics cards" returned one result on the subject from 2008. And that was a Pentium 2. lol It's clear you're as much of a snob about PSUs as you are about trackballs. Although I don't know why being left-handed would affect your choice of them. lol
I've had lots of power supply failures over the years. None on C2D+ generations. None have destroyed components on my boards. In addition, I worked as a repair tech for a short time and my main job was repairing systems that would not boot. I repaired hundreds of systems. It was a rare machine that had a power supply problem, and the few that did didn't have any fried components. It was normally the mobo that got fried from a lightning strike over LAN or modem lines. Power supplies are pretty rugged these days; you really have to stress them out, even the cheap ones, to blow them, and even then they rarely take any components with them in my experience.
The fact that there are a number of bad ones out there is without doubt. The fact that there are ones banned in certain countries because they don't have standard protections, and that they would explode, doesn't surprise me, but I think those are pretty rare as well. There are also batteries that explode and cause fires in laptops, and Toyotas that don't have brakes, but what are the chances of you actually seeing and getting stuff like that? It's like 1-in-1000 odds at best.
Unless you're designing a new model line for a company or something, I don't see the point of knowing about every component of a power supply, and testing it for quality. If it does the job, and has reasonable protection, it should be good enough. It's just a power supply, people have been making them for 80 years. Before computers they were made for radios and junk.
Hell, the power supply on my apple ][ died on me, in 1979 and didn't take any components with it. Back then, that was immature technology, and power supply failures were very common. These days it just doesn't happen that often relative to the number of people using computers and the amount of psu's out there.
Wrong again. Intel is clearly superior to AMD when it comes to mobile products, so if you really want to be a fanboy, laptops are the best place to do it. For gaming desktops, the difference becomes trivial.
Besides, what does "not into AMD stuff" even mean? Are you into wasting money? I wouldn't think so based on your reluctance to buy a decent power supply. So why do you want to spend $100 more on CPU power you don't need, and refuse to spend $50 more on a power supply. Overclocking doesn't cost too much money. A $30 Scythe Mugen 2 is enough to push any Phenom II (maybe even the 6-cores but I haven't played with one yet) to 4GHz.
AMDs are unreliable, man.
*coughs*
There is a limit to how "cheap" manufacturers can make PSUs, as they have to fit within certain safety guidelines.
It means I hate the Intel graphics in laptops that Intel always seems to want to package with the majority of their processors. There's exactly one Intel-based tablet PC (which I own) that has ever been produced with a dedicated graphics card, and that was before the AMD/ATI partnerships. (Although the new tm2's do have ULVs (yuck) and ATI graphics cards.)
I'm not into AMD desktops cause I've owned early ones, couldn't stand them, seen and had tons of problems with them, and not willing to sacrifice capability for cost. A cheap power supply doesn't sacrifice any of the capabilities of the system, which an amd motherboard/processor would, except perhaps extreme overclocking, which I'm not going to do anyway.
No they don't. The only cert they need to be sold in the US is FCC, which says nothing about safety or quality; it just makes sure the unit won't interfere with other devices. Leadman LP8860 units cost less than $5 to produce, and it shows.
Also, there's a difference between getting a deal on a nice unit on ebay, and buying a generic piece of ****.
We just had a huge discussion on all the ways a cheap PSU screws you over. And now you say there's no difference? Failed reading comprehension, perhaps?
On the topic of PSUs taking out other hardware when they go, would it be possible to create...um, filters to put in between each connector and the target device (inlet and outlet plugs on said filters)?
I don't imagine such a thing would need to be too complicated. I'm not an EE guy but I think capacitors can have a smoothing effect and eat up overvoltages, right?
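To illustrate the capacitor idea, here's a toy first-order RC low-pass simulation with made-up component values. This only shows the smoothing effect asked about above; it's not a real protection circuit design (real surge protection also relies on MOVs, as mentioned later in the thread):

```python
# Toy simulation: a first-order RC low-pass filter smoothing a brief
# overvoltage spike on a nominal 12 V rail. Component values are
# illustrative only, not a real filter design.
R = 0.1        # ohms, series resistance (assumed)
C = 0.01       # farads (assumed)
dt = 1e-5      # seconds per simulation step
tau = R * C    # time constant: 1 ms

v_out = 12.0
peak = 0.0
for step in range(2000):
    t = step * dt
    # 0.2 ms spike to 20 V starting at t = 1 ms, otherwise a clean 12 V
    v_in = 20.0 if 0.001 <= t < 0.0012 else 12.0
    v_out += (v_in - v_out) * dt / tau   # dV/dt = (Vin - Vout) / (R*C)
    peak = max(peak, v_out)

print(f"Input peaked at 20.0 V; filtered output peaked near {peak:.2f} V")
```

In this toy model a spike much shorter than the time constant gets knocked down substantially, but a sustained overvoltage would still pass straight through, which is why a capacitor alone isn't real protection.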
So chimera, when's the PSU coming? I want to see how well it all goes.
that surge is hitting your components dead on because of the lack of a MOV.
No gtx 460 that I could find, but interesting that the gtx lines are so much better. It seems like the 3dmarks correlate roughly to the direct draw crystalmark scores.
The most typical fanboy justification: "I had so much problems with Company X 47 years ago so it must mean their products are horrible." You know what, my Pentium 4 rig sucked **** and that doesn't mean i7 sucks ****.
Also, like I said, for gaming purposes, a Phenom II system is no worse than i7. There was a review, if I can find it, that showed a Phenom II actually edging out an i7 in games because of some funky architecture. Buying a 9800GT is a foolproof way of sacrificing gaming capability though. And if the whole shebang fries, even better. I'm sure a working Phenom II system is more capable in games than a dead i7 system.
Obviously what you said is true, but at this point I know intel systems, I like them, and I'm sticking to them. The i7 is the top right now, so until you can show me an amd system in crystalmarks or something that isn't risking blowing its hardware to keep up, I'm sticking with plans to build an i7.
He speaks truth. Only reason I'm on a Core 2 Quad is because I started out upgrading from an HP Pavilion with an E2200 and I got the Q9550 for $170, vs. $260 usual price (thank god for Microcenter).
AMD chips don't cook themselves... period. Except when overclocked with insufficient cooling, and Intel will suffer the same fate there anyway. Any modern CPU will overheat without cooling these days; anything but some of the very low-end Atoms needs active cooling.
They also don't go changing the socket every few months, or require new chipsets every time a new chip comes out on an old socket... Like all those LGA775 chips.

This. My Socket AM2 (not even AM2+) MSI board which I purchased at the end of 2007 supports all the way up to the AM3 Phenom II X4 and Athlon II X4 CPUs, as long as you use a processor with a maximum 95W TDP.
No, intel chips start sending 0's if they overheat.
No, the 9800GT was succeeded by the GTS250. Which was a rebranded 9800GTX+
The GTX 285 was quite expensive for what you got. The cards to get back then were either the GTX 275 or HD 4890.
Also, people wore onions on their belts because it was the style back then.
In many cases, the AMD chips represent much better value for money compared with Intel ones. They also don't go changing the socket every few months, or require new chipsets every time a new chip comes out on an old socket... Like all those LGA775 chips.
No, intel chips start sending 0's if they overheat.
Binary processors often send 0s during normal operating conditions too.
Yeah, this is a real old argument:
http://webcache.googleusercontent.com/search?q=cache:mm1TUOYg_B8J:www.tomshardware.com/forum/65831-28-7ghz-there+intel+vs+amd+overheat+protection+throttle&cd=1&hl=en&ct=clnk&gl=us&client=firefox-a
I really don't care about amd. Like I said, show me a system that isn't blowing its hardware trying to keep up with a modestly clocked i7 and is cheaper for parts and I'll consider it.
You can't use nvidia sli on amd boards either though right? I'd have to switch to ati crossfire? That's really the issue I have right now, probably more than anything. I'd rather not switch over to ATI graphics cards, as I'm relatively used to Nvidia as well. I'd have no clue even where to start with ATI desktop cards; I haven't used them since the late 90s.
The demanding games I play do not scale to multi-GPU setups properly.
Unless you have a 30" screen, I do not see why you'd need it anyway.
You can't use nvidia sli on amd boards either though right? I'd have to switch to ati crossfire? That's really the issue I have right now probably more than anything.

You can use an nVidia SLI setup on a motherboard with an nVidia nForce chipset and an AMD processor.
By the time a single graphics card is obsolete, the second hand price for a second one has usually reached a stage that it's cheaper to replace the single card with one that is about twice as fast as it. Sell your original for bonus lulz.
It also doesn't put too much load on your Chingchongic PSUs.
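The upgrade math in that post can be sketched with hypothetical prices; none of these dollar figures come from the thread, they just illustrate the comparison:

```python
# Hypothetical numbers comparing two upgrade routes:
#   1) buy a second copy of your old card for SLI/Crossfire
#   2) sell the old card and buy one new card that's ~2x as fast
new_fast_card = 160     # assumed price of a current ~2x-faster card
second_hand_old = 90    # assumed second-hand price of your old card

sli_route = second_hand_old                       # cost of a second old card
upgrade_route = new_fast_card - second_hand_old   # net cost after selling yours

print(f"SLI route: ${sli_route}, and only when the game scales")
print(f"Single-card route: ${upgrade_route} net, no multi-GPU headaches")
```

Once the new card's price drops below roughly twice the old card's used value, the single-card route wins outright, which is the point being made above.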
Damn. Sounds just like what I need for that machine I have in the corner that runs benchmarking software all day.
3.4? On a Yorkfield? Weak.
I hear Intel chips divide by zero during floating point operations.
No, intel chips start sending 0's if they overheat. The amd can't tell heat from anything, so it just keeps on going. I don't believe those were overclocked at all. They were just normal settings playing a game.
VLC Media Player.
Buggy as all hell - don't listen to anyone who says otherwise - but the format support is awesome.
God, I don't have crappy psu's on my c2d systems. Almost all of them are 500+ and have had good reviews. I'm just not obsessive about knowing every stat of them. Just like the one I picked here, pretty much at random, I've been pretty good at picking out ones that are reasonable quality for decent prices.
3 pages about psu's that in my mind aren't worth giving that much thought to at this point.
I have 3 desktops that I use on a daily basis. I like one to have a lot of video cards to drive the cocoon of LCDs that I sit in. lol It's my media PC that I watch shows on, and comment to you guys on. That unit doesn't have to have a lot of power, but I like to switch to a single screen to play a game on it sometimes, which is when the SLI comes in. XFX overclocked 7600GTs and 8600s dropped below $30 a long time ago, and they work fine for the majority of apps and games in SLI.
'Sides, they're black:
http://img.tomshardware.com/us/2006/07/17/summer_2006_geforce_7_graphics_gear/xfx-7600gt-angle.jpg
Then I have a gaming system with a larger 1080p LCD on it (it's no 30" but it's still a decent size), and then I have a productivity unit that I do most of my schoolwork/apps on, plus tablet PCs also driving external LCDs. They're all in various stages of obsolescence and upgrade, so as one thing becomes obsolete I might switch it to another unit, and so forth. SLI just gives me another option within that framework.
Wattage has nothing to do with quality. Case in point: you just bought a 900W PSU for $50. Heck, I can't find an attractive 400W unit for that cheap (for me).
3.7GHz in winter. In summer it gets too warm. I can give you my validation for 4GHz... At 82C.
So... the real problem you have is psychological.
So quite possibly the craziest person (not in a good way) I've ever seen in the world has deemed that I have a psychological problem.
This is not good.
In a world that's gone crazy, the only sane people are the ones who are crazy. You might be ok. ;)
The more watts you're dealing with, the more components you need to control the higher current.
Crazy people use these:
http://images.apple.com/pr/photos/iMac/imac_flowershot.jpg
I bought one of those at a thrift store like 5 years ago for like $20. It had a non functioning drive. They used some crap brand that was prone to failure. I replaced the hd and rebuilt the system from scratch. Put osx Jaguar on it. It had all kinds of problems, way worse than any pc I've ever had. It's the most advanced mac I have. lol It was useful to learn osx and mac had some games and stuff from the early days that I played that were pretty interesting and addicting.
I have a purple one of those <3
http://img.photobucket.com/albums/v104/enthauptet/bin/imac_setup.jpg
Fireballs, at least of the age you'd find in a gen-1 iMac, aren't bad drives.
Later Quantums and pretty much any Bigfoot are supposed to be the bad ones.
There weren't any colored versions in the 1st-gen iMacs; they were all Bondi Blue. Tangerine iMacs apparently were the only models widely available for a while because nobody wanted them except in cities where the local sports franchise had orange as one of its main colors. There aren't a lot of teams that fit that bill. Denver comes to mind... not sure what else.
1000 watt? That's like running a dryer!
If you burn your house down, can I have your keyboards?
It seems like with this i7, though, the processor is the main cost.
It's not crappy at all. It's like a $160 retail psu that I got for $50.
Well, since you don't overclock and don't see the merit of getting a decent power supply, honestly that pre-built Gateway doesn't seem like a bad idea. Normally I'd never let anyone buy pre-built rigs but it's not like you're listening to my custom part recommendations anyway so it'll probably be easier for you just to get that.
Can you return the parts you already bought for the i7 build?
so build with an amd x6... I know, you have already shot that idea down. Bang for buck dude.
Like I said, show me an amd system that comes anywhere near an i7 and isn't oc'd with water coolers and junk and I might do that, but until I see one I'm not even going to consider it.
Here's an amd phenom and it only has 150k crystalmarks that I'm sure I could get with one of my c2ds if I overclocked them, and got a newer graphics card.
http://crystalrank.info/CrystalMark/09/ranking.php?ID=132716
They're just nowhere near i7 levels.
Since i7's are one of the first systems meant to be overclocked I probably will overclock the i7 at least to some degree, but probably not past a reasonable safe zone.
And those are all synthetic benchmarks which mean ****-all and which i7s are supposed to dominate. I'll dig up the gaming ones in a second but why do you think the people recommending AMD are lying to you? Do you think instantkamera or I get commission from every AMD CPU sold?
http://www.neoseeker.com/Articles/Hardware/Reviews/pii_965/
No 920's or 940's though.
The ECS motherboard I want is $145, and it's a solid board,
and the 1156 boards are robust and very inexpensive. That one 500k crystalmark was on a $100 asus board.
Motherboards have been really cheap since c2d's.
I think I paid like $50 for my 570 slit-a's and asus mobos that I've been using for like 2 or 3 years now.
http://cgi.ebay.com/Asus-P7P55D-PRO-Motherboard-Supports-Intel-Turbo-boost-/120579279704?cmd=ViewItem&pt=Motherboards&hash=item1c1315cb58
I would think the Crysis tests have more to do with the graphics card than the CPU... CrystalMark is a more CPU-intensive test, which is why I'd use it for selecting a CPU, and 3DMark for selecting a graphics card.
What's the name of the one you would suggest would compete with an i7? Do they cost less than $200?
This is probably a bit more like it:
http://hardware-infos.com/tests.php?test=64&seite=10
Usually, when a CPU beats an i7 960, logic dictates it can beat 920s and 940s.
How do you know it's solid? $145 isn't cheap. More expensive than a good 890FX board.
Yes because the CPU wasn't overclocked. You don't need an expensive motherboard to run stock. If you're gaming, why do you care about Crystalmark? What the **** is Crystalmark anyway? I've only ever seen you use it.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=X58&x=0&y=0
http://www.omgwtfimages.com/uploads/thumb/thumb_185.jpg
Another option is i5. You can get an i5 750 with a P55 board instead of i7.
So you linked the part where i7s beat Phenom IIs in a synthetic benchmark, and I linked the part where Phenom IIs beat i7 in gaming which is what you will be using the computer for. Lolwut?
You can't bench a system based on a game when we're talking about the CPU, when the majority of the game's benchmark score is coming from the GPU.
If you want to talk graphics cards, we can talk crysis ratings all you want, it doesn't have that much bearing on the quality of the cpu. You're trying to compare apples with oranges.
Crystalmark is a Japanese benchmark software.
Since i7's are one of the first systems meant to be overclocked I probably will overclock the i7 at least to some degree, but probably not past a reasonable safe zone.
Seriously, can anyone explain what's going on here?
Wow. So those Athlon 64s with unlocked multipliers that were touted as a feature of the chip, and the extreme ease of overclocking computers made before the late 90s was just something I was imagining?
I think I need to see a doctor.
3. Has your child asked for new hardware?
Computer hackers are often limited by conventional computer hardware. They may request "faster" video cards, and larger hard drives, or even more memory. If your son starts requesting these devices, it is possible that he has a legitimate need. You can best ensure that you are buying legal, trustworthy hardware by only buying replacement parts from your computer's manufacturer.
If your son has requested a new "processor" from a company called "AMD", this is genuine cause for alarm. AMD is a third-world based company who make inferior, "knock-off" copies of American processor chips. They use child labor extensively in their third world sweatshops, and they deliberately disable the security features that American processor makers, such as Intel, use to prevent hacking. AMD chips are never sold in stores, and you will most likely be told that you have to order them from internet sites. Do not buy this chip! This is one request that you must refuse your son, if you are to have any hope of raising him well.
You can't bench a system based on a game when we're talking about the CPU, when the majority of the game's benchmark score is coming from the GPU.
So, in your own twisted mind, you agree. If you want to game, and the CPU doesn't mean **** in games, (your) logic still dictates you'd save your damn money and buy an AMD. Just sayin'.
Phenom II X4, GTX460, an SSD. There's your gaming rig.
Except that all the amd's on ebay cost exactly the same as the i7, or more, and as I've said all along, I don't know amd's, and don't want to find out that it's missing some feature that I like with intels.
Actually, with one exception, the only unlocked-multiplier chips Intel makes are expensive "Extreme Edition" ones. This practice dates back a few years; I think the first such chip was either a Pentium 4 or a Pentium D.
The one exception is the Core i7 875K (I think that's the name), which was designed to compete with the fact that AMD's top-end chips are about $200-300, giving overclockers that extra freedom without having to re-mortgage their house or whatever.
Some could say that this shows that AMD can't compete with Intel's top end. I would say that AMD has no interest in doing so because only nutjobs buy $1,000 consumer desktop CPUs anyway.
And qualify "meant to be overclocked". I'm pretty sure they tell you that you void your warranty if you fry the chip (although, in reality, they have no way of knowing if a chip has been fried by overclocking). So I don't see how "you can overclock at your own risk" is any different from "don't overclock, but here are all the prerequisite tools". Q6600 much?
What feature, exactly, can a CPU be missing? HT? Used to compensate for overly long instruction pipeline. Speedstep? Cool'n'Quiet. Functioning and overclocking?
I guess I just have to say that you have no business building computers if you don't even understand what differentiates components.
Yeah, SpeedStep for one, but it's not just the features of the cpu, it's the mobos as well. I don't understand i7's, or the amd competitors these days, as I've said over and over again, so I appreciate all this talk about them; it's making me do a lot of research.
Minus the SSD mind you because if there's ever been a bigger waste for a pure gaming rig, it's an SSD. Or how about those special gamer Network cards?
Please tell me, what features do you think AMD motherboards are missing? I have an AMD motherboard and can dig around my BIOS for you.
If you say I can get an amd system that has the same performance as an intel for like 50% of the cost, then I might agree, but it's not looking that way based on the ebay prices I'm seeing. They look to be about the same cost, and less power in most situations, so it's not going to convince me.
I'm sold on using ssds. I think they're more stable and less likely to just drop dead without warning, and start clicking and junk on you which I'm sick of with normal drives. Drive click gives me nightmares. If for that alone, let alone they are faster, I'm going to get a 60gb ssd for the system drive of the next desktop I build.
Ehrmm... sorta. What you're saying was true of the HT used in the Pentium 4, but when done right, it can be a good idea. You do know what it actually does, right?
Last I checked, current Intel and AMD chips have very similar pipeline lengths, and there aren't really any particular fundamental differences between one and the other.
The arguments made by AMD against hyperthreading include complexity, power consumption, and issues with cache misses. So it isn't an objectively brilliant idea, it so happens that Intel thinks it's good and AMD does not. Some non x86 vendors have similar things in their chips. The IBM POWER7 for example can support four threads per core.
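To make the HT/SMT point concrete: an SMT-capable chip (a hyperthreaded Intel, or something like the POWER7 mentioned above) exposes more logical CPUs than it has physical cores. Here's a minimal Python sketch of checking that on your own box; the function names are my own, and the `/proc/cpuinfo` parsing assumes Linux with an x86-style layout (it falls back to assuming no SMT elsewhere):

```python
# Hedged sketch: count logical CPUs, and estimate SMT threads-per-core
# by grouping logical CPUs that share a (physical id, core id) pair.
import os
from collections import defaultdict

def logical_cpus():
    return os.cpu_count() or 1

def threads_per_core():
    """Parse /proc/cpuinfo (Linux/x86 assumption) to count logical CPUs per core."""
    try:
        cores = defaultdict(int)
        phys = core = None
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("physical id"):
                    phys = line.split(":")[1].strip()
                elif line.startswith("core id"):
                    core = line.split(":")[1].strip()
                elif line.strip() == "":
                    # blank line ends one logical-CPU block
                    if phys is not None and core is not None:
                        cores[(phys, core)] += 1
                    phys = core = None
        return max(cores.values()) if cores else 1
    except OSError:
        return 1  # can't tell on this platform; assume no SMT

print(logical_cpus(), "logical CPUs,", threads_per_core(), "thread(s) per core")
```

On a hyperthreaded quad core this would typically report 8 logical CPUs and 2 threads per core; on a chip without SMT, threads per core comes out as 1.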
"AMD is from the same people who make GeForce and Asus mother boards. So they make A lot of money to begin with so they don't need more. Don't worry."
The sad reality is that AMD has been losing too much money to even be able to compete with a pimple on Intel's ass. Intel, by contrast, is expanding. So much so that when I'm done with my EE degree, I'm definitely going to try and get a job at Intel. Show Image(http://media.bestofmicro.com/V/8/233108/original/feature_image07.jpg)
this coming from a guy who doesn't bat an eye at buying a PSU off sketchBay.
I'm disappointed with all the hexacore and octacore nonsense. Most software runs perfectly fine on 2-4 cores. I'd like to see more SSE optimizations, logic core optimizations to bring down the transistor count, smaller die size, smaller process, more of this stuff where idle cores downclock and busy ones overclock... Improvements to the manufacturing process so that there are more higher-binned chips downclocked so we can get a chip that can overclock to the same speed as a more expensive one... :p
It won't happen. Please explain why AMD would sell CPUs for half the price of Intels if they were just as good? Like, this is not an AMD fail on your part - this is a logic fail. You're saying that you will only buy Product X if it is as good as Product Y and costs half as much. How does that make sense for the makers of Product X? At all?
Your justification for not buying AMD is because it isn't as good as Intel while costing half as much at the same time. Again, that makes no sense.
However, the alternative is that you buy Product X which is as good as Product Y in games which is what you will be using the computer for and costs half as much.
Try this on for size. If you're adventurous enough to trust open-box Ebay power supplies, then you should give unlocking cores a shot:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103846
http://www.newegg.com/Product/Product.aspx?Item=N82E16813138194
Chances are that you will be able to unlock the 555 into a full quad core Phenom II.
I'm happy with intel. They've been working relatively flawlessly for me for the last 15 years, while the motherboards I've had for AMD chips have been crap.
I'm happy with intel. They've been working relatively flawlessly for me for the last 15 years, while the amd's I've tried have been crap. It would take that price difference in order for me to switch. I see no reason to switch if the difference in price is negligible.
where are you taking that EE degree? Mtl, so I assume either conc or McGill. I'll assume not UQAM as your written English has me thinking you are a native English speaker - forgive me if I'm wrong (I suppose you could be a hybrid from the west island though ;)).
Fixed.
Were the AMD motherboards from the same manufacturer and price range as the Intel ones you had?
So for you to use AMD, they have to beat Intel and charge much less. Right. You do know how dumb that is, right? And to think this whole time I've been trying to have a somewhat intelligent discussion with you. Waste of time but at least my post count gets upped.
Yeah, McGill, and yeah 100% Anglophone. I live in St. Laurent (which is very French if you live anywhere except in the Jewish part, which is very English and which is where I live).
AMD is pointless as a company as far as I'm concerned if they can't do that. They have no point in the marketplace.
Intel processors are better and cost more.
AMD processors are not as good and cost less. That's a pretty balanced marketplace.
Intel is the standard, not AMD, it's always been that way. They have to offer more for less or they won't be able to compete at all. Even then. I mean look at Tucker, he offered a better car for less, and still wasn't able to compete with the big 3 auto makers. It's the same with processors. Brand recognition and customer loyalty have a lot to do with it.
Yeah, except from what I'm seeing, on the used/second hand market at least, they cost the same and do less.
Show me an amd chip that I don't need a huge water cooling or hs/fan that will compete with a i7 on ebay or somewhere else and will cost significantly less and I'll consider it.
But what does any of that have to do with the fact that you can save money with AMD and get the same gaming performance as Intel?
Show me an amd chip that I don't need a huge water cooling or hs/fan that will compete with a i7
Did you ignore those Crysis links? You clearly didn't because you cried about them and also agreed that AMD is on par with Intel when it comes to games.
I didn't ignore them. I disagreed with them, because I wasn't sure if they were testing the cpu, or the gpu, and all the little disclaimers and junk were in German, so I have no idea what they were actually comparing.
They were testing the overall system, which is what you need to play a game. The disclaimers did not say "Warning: These benchmarks are a joke and Intel is in reality better than AMD".
There are too many factors to go amd vs intel based on a whole system. It's unlikely you can get any really fair results doing stuff this way, especially testing with a game like crysis, which is really an atypical game.
Does anyone else smell a troll here?
How is it unfair? Because it showed AMD beating Intel?
I just have never considered building an amd desktop for like 10-15 years.
No, but given I saw no parts list there of the units they were testing, I'm not sure it was a fair comparison. Is there a list of what they were testing exactly? Even the motherboard cpu combination can have big effects on performance, within brands, let alone amd vs intel. Even from chip to chip with intel, you'll get different results, and different level of overclock capability. One way or another it'll be unfair.
I smelled one days ago. I mainly hang out in this thread to talk around him.
I chime in when interesting questions are raised. But yeah, most of this looks like bull****.
Little things matter too, like what system it is or even what programs are installed on it; you'll get different results for the benchmarks. So it's very difficult to get fair comparisons; you need a large sample size.
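That variance point is easy to demonstrate: any single benchmark run is noisy, so a fairer comparison repeats the run and reports the spread. A minimal Python sketch (the `bench` helper and the toy workload are made up for illustration):

```python
# Repeat a benchmark several times and report mean +/- standard deviation,
# instead of trusting a single noisy run.
import statistics
import timeit

def bench(fn, repeats=5, number=1000):
    # timeit.repeat runs the callable `number` times per repeat and
    # returns one total wall time per repeat.
    times = timeit.repeat(fn, repeat=repeats, number=number)
    return statistics.mean(times), statistics.stdev(times)

mean, stdev = bench(lambda: sorted(range(1000), reverse=True))
print(f"mean {mean:.4f}s +/- {stdev:.4f}s over 5 repeats")
```

If the standard deviation is a large fraction of the mean, the two systems being compared probably can't be separated by that benchmark at all.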
It is bs, cause a bunch of amd fanboys decided to troll my post on graphics cards. Next we'll hear from the mac enthusiasts telling me I should just buy a mac.
Seriously, you are basing your judgement of AMD on 15 year old data? Sheeeeit, you might as well be talking about cyrix here...
I made up my mind a long time ago that instead of learning ANYTHING, I'm simply going to pretend I know everything. That way, instead of having to actually acquire information and process it, I can allow my brain to atrophy until I eventually can't remember to breathe. Basically, what I believe is totally true, because I say it is. BTW, the world is most definitely flat.
You should. They're shiny and so easy to use.
Have good PSUs too.
awesome.
Come over to the Big Rig thread, it's a better party. Chimera isn't invited due to low intelligence.
Actually, to be fair, a bunch of pro-good-PSU(or anti-bad-PSU) fanboys decided to troll your post on graphics cards.
I think it is relevant to point out that no one HAS recommended a Mac, despite the fact that you have placed yourself squarely within their targeted user-base. This would lend credence to our suggestions to anyone willing to listen ...
Anti-bad PSU fanboys? That's like - "Those car safety people are seatbelt fanboys".
So because I don't want to know every component of a psu, that I can buy for $50, I'm a Mac user? lol Or can't afford some massive graphics card, and don't want to buy an Amd? AMD users are closer to mac users than I am.
Those PSU fanboys are more like, people who want to pay 20k for an engine in their car that can make it go 200mph, but never drive more than 60mph.
Personally, I'll pay $1000 for an engine that will go 200mph, as I did, and laugh as I pass them on the road.
Having a quality psu is more like, people who want to pay 20k for an engine in their car that can make it go 200mph, but never drive more than 60mph.
Personally, I'll pay $1000 for an engine that will go 200mph, as I did, and laugh as I pass them on the road.
AMD user posts in this thread have been tests and benchmarks. Your posts in this thread have been "AMD sucked 15 years ago I ain't gwain' buy sheeyit, Intel dun treat me right well". Tell me which one is more fanboyish?
I've at least used, serviced, and built both. I didn't like them. I gave them a try, and they sucked. I'm supposed to try them again, why? Cause some anonymous people on the net said they've changed, and showed me a benchmark that has no details?
Anonymous people who, by your own admission, know more about computers than you. Yes you are.
I think most of them know a lot more than me in theory, but most of them lack real world experience.
I've used Phenom II and you haven't. How's that for lack of real world experience? Or is real world experience in your books defined as "good experience with Intel, bad with AMD, everyone else be trollin' and dun' goofed". Or is experience with processors 15 years ago a requirement to build gaming computers in 2010?
Two. One for a guy who does 3D work and encodes media, and another for a guy who does a lot of burning/ripping for piracy purposes. If you pull the "I don't believe you" card I can get proof in a few days. The media encoding guy has a GTX 260 and his gaming performance cannot compare to my computer's because I have a much more powerful GPU. The other has a 512mb 4870, another relative featherweight compared to my 990MHz core 4890 and my computer consistently gets higher frames (4870 512mb and GTX 260 though are too weak compared to 4890s, but that should give you an idea of how important GPUs are for gaming rigs).
Obviously I know how important gpus are for gaming purposes. The problem I've had is with people using gaming benchmarks when comparing cpus. If they had equal gpus, which would win?
I remember the days when Intel or AMD releasing a new generation of chip actually counted for ****. Nowadays, there's only very certain things that benefit from faster CPUs, and unless you're gaming, the wait times on other CPU related things can be solved by the user going off to get a coffee.
Or at least that's how I see it...
What gets me these days is the widespread misuse of the word "multitasking". Watching Youtube while typing up an essay in Word and listening to music is not multitasking. Well, it is, but it's not the type of multitasking that requires people to buy 6-core processors or i7s with hyperthreading. An Atom can multitask those things.
What you really need i7s and 6-cores for are megatasking, i.e. doing multiple pieces of important, intensive **** at once. I can't name a thing because I don't really do important **** on my computers. Maybe like encoding two videos, ripping a CD and compiling a program at the same time while playing GTAIV on your second monitor. IDK.
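The distinction is easy to show in code. A hedged Python sketch (the workloads and sizes are made up): background "multitasking" like music plus a browser is mostly idle time, but several CPU-bound jobs at once only finish faster when there are spare cores for a process pool to use:

```python
# Four CPU-bound jobs (stand-ins for encoding/ripping/compiling), run
# serially and then spread across cores with a process pool.
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n):
    # Pure-Python busy loop as a stand-in for real CPU-heavy work.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [200_000] * 4  # hypothetical workload sizes
    t0 = time.perf_counter()
    serial = [burn(n) for n in jobs]
    t1 = time.perf_counter()
    with ProcessPoolExecutor() as pool:  # one worker per core by default
        parallel = list(pool.map(burn, jobs))
    t2 = time.perf_counter()
    assert serial == parallel  # same answers, different wall time
    print(f"serial: {t1 - t0:.3f}s, pooled: {t2 - t1:.3f}s")
```

On a dual core you'd expect at best roughly a 2x gap between the two timings, and an Atom with two cores really can handle the Youtube-plus-Word case, which is the point.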
I've had tons of hard drives fail on me over the years and take sometimes years of work with them. I've never had a psu fail and take any components with it, in 30 years of using, building, and maintaining other people's computers as well.
the wait times on other CPU related things can be solved by the user going off to get a coffee.
The German review used equal GPUs. The point of the reviews I linked were to show this:
- With the same GPUs, there is no difference in CPU platform
- With the money you save with a cheaper CPU platform, you can get a better GPU
- Your cheap-CPU+expensive-GPU system now performs the same as the expensive-CPU+expensive-GPU platform, and performs better than the expensive-CPU+cheap-GPU platform in games
The GTX 260 guy is the perfect example. The poor bugger can't play BC2 smoothly. Shoulda' bought a Phenom II and a 5850. So what if your project takes an extra 20 minutes to complete.
Does this mean my next computer will come with a complimentary coffee card or year's supply of my favorite roast?
If you have a Pentium 4 machine, your computer is the coffee machine.
I was going to say, I still have my old Athlon Thunderbird 1.4GHz somewhere. I'm sure I could mate a Lian Li with a Rancilio Silvia and use the CPU as the boiler.
Not at 2048x1152 with full AA and AF. GTX 260s aren't good cards by any current standards and those settings are just too high. I don't fancy my 4890s chances with it either but a 4890 does actually get scarily close to GTX 285 performance when everything is turned up at 2560x1600 in Crysis so who knows.
That would probably be an eerily effective way of dissipating its heat at the same time.
4890 competes with the GTX 275. GTX 260 is comparable with a low end 4870 IIRC.
The 4890 beats the 5830 by a small margin.
I'd say the GTX260 and Radeon 4870 are still high end cards, just not if you want to play on 1920x1200 or up. The only game I can't max with my 4870 at 1680x1050 is Crysis.
I think most of them know a lot more than me in theory, but most of them lack real world experience. The lack of theory knowledge on my part is temporary until I have time to come back up to speed.
"Take the 3DMark06 tests at face value (as you should any synthetic benchmark), because in our next section we begin real-world testing on a cadre of popular video games known for taxing the graphics processor, and the performance curve is expected to change."
That's a quote from the review you linked. You should really stop depending too much on synthetic benchmarks.
All in all this is a more than respectable performance delivered by the Radeon - in matters of pure fps power it can get extremely close to the Geforce, which is noticeably more expensive, and in some cases the Atomic can even beat the Nvidia card.
Okay mate... let's give you a bit of real world experience.
Started on an 8088
Recently specced and built a 250 node Renderfarm with 48 cores per node.
I rock an X4 620 in my main rig, and am building myself one of the aforementioned 48-core servers (just the one node).
Oh, and build/repair/spec/upgrade a whole variety of systems, every day, because it's how I make my living.
My "real world experience" sorted and out of the way.
For pure performance, go i7.
For performance/cost ratio, go quad or hex from AMD.
That, my friend, sums it up in its entirety. If you won't go for AMD, given your stated goal of a cost-effective gaming rig, then it's nothing but pure fanboyism. It will play games as well, if not slightly better, given the rest of the system is the same.
Personally, I don't care what name is on the box, as long as it fills the criteria I set out when building the thing. I ran Intel's throughout the C2D/C2Q era, and AMD stomped intel into the curb with the Athlon XP series. Both have highs and lows. Thus is life... deal with it.
That's why I say the 5830 is ATI's worst card in years. It performs worse than the card it's replacing and costs more (even after the price cut when the GTX460 came out), has the power consumption and heat of the card two tiers above it, and is utterly stomped by Nvidia's competition to it, the GTX460. ATI really screwed up there, $200 is a crucial price point and if you aren't strong there you're going to have a lot of trouble. They started out strong with the 5770, but now that's at $150, and the 5830 just can't take up that mantle.
Professional experience isn't exactly the same as what I do, and isn't really germane to the conversation. Buying new you may have a point, or when building a server that has to be there for customers. I said earlier that if I were doing this I would have a completely different thought process and would be more concerned about details. Putting together a unit from used/refurbed/open-box parts off ebay is a completely different world. The fact is that I'm not seeing the price advantage, so why should I worry about AMD?
http://cgi.ebay.com/AMD-X4-955-3-2GHZ-USED-/120598867230?cmd=ViewItem&pt=CPUs&hash=item1c1440ad1e#ht_500wt_1154
Sup Mr. Ebay. The very same CPU that was seen beating the i7s in games not only in the German benchmark I linked but the other two that you seem to have ignored. $110 buy it now. Surely you can't get a 920 for that cheap?
http://cgi.ebay.com/AMD-Phenom-II-X4-955-3-2-GHz-CPU-AM3-Black-Edition-DDR3-/300434452166?cmd=ViewItem&pt=CPUs&hash=item45f349eec6
Looks like this is the lowest from any reputable seller. Still not bad at $150 if it will indeed compete with an i7.
I should have put it a bit clearer.
In the last 20 years I have built literally thousands of systems, ranging from charity builds out of my spares pile through to the monster renderfarm. The bulk of my work is "normal" systems (yours falls squarely into that category).
My current machine is largely second-hand, bought on forums from early adopters. What I am saying is very germane to what you are trying to do, because I do it 320+ days a year. The server/farm builds happen on average once or twice a year.
Last week I built a system, with an identical purpose to yours. Quality gaming, without spending silly money. PhenomII, 5770, 8GB, Intel X25M SSD+2TB, 600W Seasonic PSU. Job Done. All the parts were used bar the PSU. It came in about 20% cheaper than I could source an i7 rig that would perform in the same manner. the facts are, I simply could not build an i7 rig for the same money and keep performance on par.
Well now that I have a more solid idea of what the amd competitor is, it may be that I could save about $100...
Part of the problem I've had with AMD's is that they have like 80 processors from different generations and lines all named exactly alike.
As far as I know, you can't find any previous gen AMDs in retail channels any more. Same can't be said for Intel...
Show Image(http://media.bestofmicro.com/8/R/206379/original/image031.png)
$110 on Ebay. Show Image(http://media.bestofmicro.com/8/V/206383/original/image035.png)
$110 on Ebay. Show Image(http://media.bestofmicro.com/9/0/206388/original/image040.png)
$110 on Ebay (safer for your power supply).
http://www.hardwarecanucks.com/charts/cpu.php?pid=69,70,71,76,77&tid=2
$110 on Ebay with a trivial overclock that can be done on the stock cooler unlike i7s (that I can walk you through via PMs).
http://www.hardwarecanucks.com/charts/cpu.php?pid=69,70,71,76,77&tid=4
$110 on Ebay with that same overclock.
http://www.modreactor.com/english/Reviews/Test-ATI-HD-4890-1GB-CrossFire-AMD-Phenom-II-955-BE-vs-Intel-Core-i7-920/Page-4-Performance-Crysis-Warhead.html
$110 on Ebay, look at that mmmmm.
http://www.modreactor.com/english/Reviews/Test-ATI-HD-4890-1GB-CrossFire-AMD-Phenom-II-955-BE-vs-Intel-Core-i7-920/Page-5-Performance-S.T.A.L.K.E.R.-Clear-Sky.html
$110 on Ebay.
http://www.modreactor.com/english/Reviews/Test-ATI-HD-4890-1GB-CrossFire-AMD-Phenom-II-955-BE-vs-Intel-Core-i7-920/Page-7-Performance-Devil-May-Cry-4.html
$110 on Ebay.
Alright, having trouble finding a mobo...
Is this one?
I think I'd almost rather buy a c2q for my current systems than buy a whole new amd system.
Did you have the ATI card in the same AMD machine that sucked 15 years ago? 2010 and my 2-year-old ATI card is still well supported and beats many of the new generation video cards. Too bad, you could have had the same. Also, it's not slightly better. Did you not read the reviews I linked, or the real-world experience I talked from? The 4890 is a viable high-end card today that rivals a GTX 285.
Whatever, the GTX 260 isn't a bad card. Do you know if the one you bought is the one with 192 cores or the updated one with 216?
http://cgi.ebay.com/GIGABYTE-GA-MA790XT-UD4P-Motherboard-AMD-790X-AM3-DDR3-/170491257032?cmd=ViewItem&pt=Motherboards&hash=item27b21220c8#ht_640wt_911
That's my motherboard. People hate on Gigabyte but in my opinion and in the opinions of everyone who owns one, it's a flagship AM3 motherboard. FSB beasts (although with black editions, FSBs don't matter).
What's wrong with Gigabyte? I don't bother buying anything else.
1.) Ask question like you're actually curious
2.) See what everyone thinks is good
3.) Say that what everyone else thinks is good is actually garbage because of stupid reason X
4.) ?????????
5.) trollprofit
So you seriously passed up a reference 4890 for $5 more than you bought a 192 GTX 260 for?
Sorry man but there's fanboyism and then there's outright idiocy. Really. I'll put it in simpler terms since you seem to blank out when the word "ATI" is mentioned: you passed up a GTX 285 to save $5 and go with a 192-core GTX 260.
RMA problems as of late. They're on the OCN boycott list next to EVGA and XFX now (and MSI too for some reason) (I haven't used EVGA and I support the XFX boycott, in fact I led the damn thing). I'd still buy a Gigabyte board over anything. Asus? Don't make me laugh. Overrated. People who use Asus boards are the same people that think Nvidia is better than ATI and AMD is a third-world company that makes Intel knockoffs. MSI? Maybe. DFI? The only board maker I actually liked before it closed down.
I wouldn't compare the 4890 to a GTX285. It gets close in some comparisons, but it's closest to the GTX275.
But yeah, imagine passing up a GTX 275 1GB that cost only $5 more than a GTX 260 192 896MB.
So at the end of the day:
You still bought a shoddy power supply
You still bought a GTX 260
You're still going to buy i7
I'm really wondering why I bother. I suppose I can blame a bit of it on you too. Why bother making a help thread if you know everything?
X7 900W isn't excellent, just solid.
Intel sacks employees over 50. Really horrible company to work for; it's easy to tell from their business structure.
Okay, first of all, you need to learn manners and stop criticizing products when someone says they work well for them. Just because YOU think something is bad or not "the best" doesn't mean you have to push that on everyone.
BECAUSE the product works well for whoever is using it, they can say it's excellent, as it gets the job done well.
Definitions of excellent on the Web:
very good; of the highest quality: "made an excellent speech"; "the school has excellent teachers"; "a first-class mind"
wordnetweb.princeton.edu/perl/webwn
Is this about your G70 thread? When you got called out for calling it an excellent CRT when in fact, according to the specifications and common opinion, it's a piece of ****?
If something works it doesn't make it excellent. Not at all. You need to learn that "This thing works" is different from "This thing works exceptionally". I've had many products that work. My Pentium 4 worked and it was a piece of ****. Onboard sound works and it sounds like ****. Semprons work and they're pieces of ****; far from excellent. You know what, I can go to Walmart right now and buy an electric guitar for like $30 bucks. It will work. Excellent guitar? Not by a long shot.
If you want to call a power supply excellent, you should have the numbers on your side, which in this case you do not. On top of that, it was purchased on Ebay, open box; if that isn't enough to earn it the shoddy moniker, this should do it: ITS OWN INCLUDED MODULAR CABLES DID NOT FIT INTO THE POWER SUPPLY and had to be modified by chimera15.
One of my issues with amd, is that every amd I've ever used has had a different feel of the way programs open, and the desktop in general, even my turion x2 ultra laptops, and I don't like it, like more jarring or something, I don't know, maybe it's just me impressing my opinions of amd on the machine or something. lol
You're angry kid.
No, chimera is satisfied with the product. You're vehemently trying to say it's of poor quality. Your opinion is void if the product IS indeed working to satisfaction.
It doesn't matter if he has to do a few things here and there, it's working. I had to cut some plastic tabs off of fan connectors. Some manufacturers just make things differently.
Give it time, he hasn't used it 24/7 yet. Bad PSUs aren't supposed to die on the first day. That would just make them broken PSUs and not bad PSUs. How am I being vehement? The word "shoddy" does not really have strong emotional connotations.
Also, it was the power supply's own included cables that didn't fit into the power supply. I mean, sure, some manufacturers make things differently but even two parts of the same product?
AMD just isn't as mainstream.
The slower low-end semprons ran like mud...
The numbers and reviews do support it being an excellent power supply for its class. And it wasn't a big modification. You probably wouldn't even notice it if you didn't know I'd done it. It was probably a result of the seller putting in cables for a different supply or something, not the manufacturer.
Yeah. I'm all for liking things which aren't necessarily good (hell, I like my 286, and it isn't even a 'good' 286). The thing is, if you like inferior things, you need to admit they are inferior, otherwise you look like a jackass.
"I really like ______ despite it being technically inferior to such things as ______, _______ or _____. For me, ______ attribute is more important than _______ overall presentation" is generally a good form to follow when bragging about liking not-the-best things.
PSUs, no exception. Buy a reputable brand at an appropriate price, NEW, or face the risk of destroying your stuff. I'm running a pretty questionable unit in my desktop (it's a Thermaltake, but not a particularly good one) and I understand the risks involved. I simply don't have the cash to go buying a PSU that costs more than $40. Sooner or later I'll end up with a good one, but until such a time, it's a gamble every time I flip the back switch on and press that power button.
The fact in this case is that power supplies all tend to appear to work alike when they're new, even trashy ones. The difference is in how they die and how long it takes to die. These things aren't known until the death occurs.
Most probably. I haven't noticed regular operation differences between Intel and AMD rigs. Especially no jarring.
It might not be the CPU causing this. I'm not entirely sure I know what you're talking about but I think I might - slow/jerky screen redraw although everything else is running smoothly?
Wait, I'm lost. Are we talking about CPUs or electric garage doors here?
It wasn't like it was a quality issue that I had to repair. The plugs on the cables were clearly differently shaped. Like triangular shapes where square should be and such, and tabs that were larger than they should be. I'm pretty sure the cables were for a different although perhaps similar supply, as they were very close. I just had to trim down some of the shapes here and there.
I thought for a while I was doing something wrong, like had the wrong plug that went into the hole or something, but triple checked it, with a multimeter as well. None of the connectors on a single line would fit the holes, so it had to be wrong connectors.
ok seriously, how did I over-look this? That sounds ****ing SKETCH-EYE!
Forget the power supply quality question for a second, how do you nonchalantly continue to be like:
"Yeah, ebay rocks for buying hundreds of dollars worth of computer equipment. Sure, I might have to whittle something new out of the garbage they send me, but I'll do anything to save a buck or two."
I think getting something for 1/3 the cost is worth a little whittling time. Besides, I was a professional whittler, so it's no big deal to me.
(image: http://img.photobucket.com/albums/v104/enthauptet/bin/cummarks.jpg)
Nvidia: still in business
ATI: owned by AMD
3DFX: out of business, employees went to Nvidia
S3: very small-time 2D/basic 3D, not a true going concern
Matrox: out of business
actually (having lived in Mtl) I know for a fact Matrox is NOT out of business. I knew about the rest, save for the surprising status of S3.
When I read that, I thought automatically of the random Maxtor products I see sometimes in Futureshops but that's probably not what you meant is it? Do they have an office here?
random maxtor futureshop product (http://www.futureshop.ca/en-CA/product/maxtor-onetouch-4-3-5-640gb-external-hard-drive-stm306404ota3e1-rk-black/10121059.aspx?path=a7fc1a9fd6e0d917dbb550b249b65302en02)
these cards are almost as long as that (image: http://img.photobucket.com/albums/v104/enthauptet/bin/FXs.jpg)
Nvidia SLI = 3DFX SLI with a different name. Remember, all the 3DFX folks went to Nvidia when they shut down. Nvidia just changed the words behind the acronym to avoid trademark infringement while keeping the meaning clear. The new wording doesn't even make sense, "Scalable Link Interface"? That doesn't mean anything, whereas "scan line interleave" is a perfectly sensible technical term.
Those ARE huge, what are they? I would guess voodoos in SLI(scan line interleave, not the newfangled SLI), but I can see DVI interfaces, which I dont think they had, plus I dont see the bridge connector thingy that connects the two. Do tell.
They're nothing compared to MY voodoo card (It's one of the first ones; a real footlong!).
Yeah, but how vague is that? You have a link interface. Ok. And it's scalable. Ok. SO what's it a link interface for? Why should I care?
arent the latest (voodoo5) cards the longest of the 3dfx card? Sorry, everything you have IS awesome.
Almost a foot? My Sapphire 4870 is 11.5", and that's considered just on the long side of average. Look at the 5970s.
I remember those!
I got one of those Quadro's lying around. I use my GeForce 5200 that I picked out of a busted computer a few years ago for its VGA port. It's a decent card.
The problem is all of my monitors are VGA only. I doubt DVI would really enhance the picture anyhow, but Nvidia could have at least had the courtesy to add a VGA port. My DVI to VGA cable (that I use) only supports 60 Hz, as opposed to my adapter + VGA cable which supports 75 Hz.
Why do I even bother. I'm going to get a good ATI card that has VGA one of these days.
I don't know much about LCD refresh rates. I know I run mine at the normal 60Hz because it's a normal LCD but some of my games have options like "1920x1080@59Hz". What advantage is there to picking the 59Hz over the 60Hz option?
But, like, why would a game even have that option? To me it's the weirdest thing. It's like buying a 1920x1080 monitor and having the option to run it at 1919x1079.
Refresh rates on LCD's, like CRT's, are simple. It's how often the screen refreshes. If it refreshes at 60Hz, that's 60 times each second. 100Hz, that's 100 times a second. The higher it is, the smoother things are on the screen.
Many gamers prefer CRT's for that reason. Since the CRT's can handle better refresh rates, they can achieve very high FPS.
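The refresh-rate arithmetic in the post above is easy to sketch out. A rough illustration (not tied to any particular monitor; the rates below are just common examples, and the 59.94 Hz entry is the NTSC-derived timing that Windows often rounds down and shows as "59Hz"):

```python
# A display refreshing at f Hz redraws every 1000/f milliseconds,
# since Hz is just "cycles per second" (1/s).

def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds between screen refreshes at a given rate."""
    return 1000.0 / refresh_hz

for hz in (59.94, 60.0, 75.0, 100.0):
    print(f"{hz:6.2f} Hz -> {frame_time_ms(hz):.2f} ms per refresh")
```

The difference between "59Hz" (really 59.94 Hz) and 60 Hz works out to well under a tenth of a millisecond per frame, which is why nobody can see it; the option exists mainly so video content mastered at NTSC rates plays without judder.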
nobody cares about high-end CRTs
Much of the discussion of refresh rate does not apply to the liquid crystal portion of an LCD monitor. This is because while a CRT monitor uses the same mechanism for both illumination and imaging, LCDs employ a separate backlight to illuminate the image being portrayed by the LCD's liquid crystal shutters. The shutters themselves do not have a "refresh rate" as such due to the fact that they always stay at whatever opacity they were last instructed to continuously, and do not become more or less transparent until instructed to produce a different opacity. Most of the TFT LCDs used in portable devices and computer monitors need a continuous refresh. The driving voltage determines the transmittance of the liquid crystal.
nobody cares about high-end CRTs and stop saying you know something you clearly don't
Jesus people. The refresh rate on an LCD is the exact same as a CRT. The reason they don't FEEL (or look) the same is due to the way a CRT draws AND illuminates each scan line as it scans VS. the fact that the LCD backlight is ALWAYS on (and in reality (CCFL) has a "refresh rate" of about 200Hz, well above NORMAL human perception).
There's plenty of people who like high-end CRT's. Including me.
the response time kind of makes the refresh rate pointless though doesn't it?
Since the CRT's can handle better refresh rates, they can achieve very high FPS.
It's not just refresh rates -
To paraphrase one of my great internet heroes - "he spewed enough BS to cover a football field full of babies 3 feet deep in bull****, which sounds cool because he could have potentially murdered a football field full of babies, but he passed on this opportunity by talking about CRTs instead."
LCDs still utilize Hz (the slower the Hz, the worse the response & redrawing). They don't "refresh", but do draw the picture... which is why you can still get that "line" going down vertically with LCDs when viewing fast moving images.
Do you understand that Hz is a unit of measurement that means 1/s? That's like saying, "hey look, my measuring tape is really cool, it utilizes meters and centimeters!"
One reason why I do not prefer **** LCDs is that my eyes can detect little diagonal lines going through certain colors. It drives me nuts. CRT's don't do that.
I'll continue to troll this forum by any means possible. Keeping with that theme, I will create a pointless thread to cover ground that has already been discussed to death. Don't actually bother participating.