http://gamingtrend.com

Author Topic: GeForce GTX TITAN announced - GTX 780/770

CeeKay
« on: February 19, 2013, 10:48:37 PM »

holy sweet jeebus..... 6 gigs of RAM.  fuck, at a grand I don't need it, but that is a sexy beast they've got there:



here's a bunch of previews:

Anandtech
Guru3D
Legit Reviews
Overclockers Club

Teggy
« Reply #1 on: February 20, 2013, 01:28:51 AM »

NVIDIA put up a graph that showed it still couldn't play Metro 2033 at 60fps at max settings.

TheAtomicKid
« Reply #2 on: February 22, 2013, 03:16:22 AM »

It's not a thousand bucks, it's TWO-thousand bucks... because there's almost no point in buying just one of the things. (Two-card SLI works pretty well these days, though there ARE cases where it doesn't.)

Kudos to nvidia. They've put a lot of work, effort, and quality into these cards, regardless of the 'ridiculous' pricing.
* TheAtomicKid twitches
* TheAtomicKid calculates budget... a $500/year upgrade budget means I would need the cards to last at least 4 years to break even on the normal cycle... that's extremely achievable in this case. Anyone else notice the upgrade cycle is slowing down?

Atomic

Scraper
« Reply #3 on: February 22, 2013, 12:48:30 PM »

I'm still waiting on the proper 700 series to be announced. I'm still running two 460s in SLI and have no problems running games at high to max settings. Heck, I may even wait for the next ATI cards, since they will no doubt be close in architecture to what the next-gen consoles will have.

Turtle
« Reply #4 on: February 23, 2013, 10:17:41 PM »

This just reminds me of 3dfx's ridiculous last gasp before crumbling: they made a monster card that was overpriced, and the company folded shortly after. And this was after they had already invested way too much in proprietary APIs that went nowhere and were obsoleted, as is happening now.

Of course, NVidia is too big and diversified to crumple the way 3dfx did, but it seems like the same thing is happening. I think what keeps them afloat is the mobile market, where their Tegra chips will likely do well.

PhysX, CUDA, and the other proprietary stuff are being ignored by developers, except the ones that want NVidia to pay them to use it. Instead, as seen with the PS4, the industry is moving toward open GPU computing APIs.

CeeKay
« Reply #5 on: February 23, 2013, 10:28:48 PM »

Quote from: TheAtomicKid on February 22, 2013, 03:16:22 AM

* TheAtomicKid calculates budget... a $500/year upgrade budget means I would need the cards to last at least 4 years to break even on the normal cycle... that's extremely achievable in this case. Anyone else notice the upgrade cycle is slowing down?
from what I understand this is a very limited run, so you may have to factor luck into it too.

Purge
« Reply #6 on: February 24, 2013, 03:17:32 AM »

After playing Borderlands 2 on a single 460 with everything turned up, including full PhysX - the next-gen consoles had better have this in the bag. Nothing is as sexy as a phase shift where all the debris gets pulled into a swirling mass of pain (as I shoot and/or stab said victim to death).

Do I need this card? No.

Will I get one? No.

I'd rather own an Xbox NEXT and a PS4. My PC is doing just fine ATM.

The next upgrade will be a new system entirely, save perhaps the brand new Samsung 840 250GB SSD I got.

CeeKay
« Reply #7 on: March 20, 2013, 05:28:05 PM »

damn, this was done with a Titan.

mashakos
« Reply #8 on: March 20, 2013, 09:33:37 PM »

Quote from: CeeKay on March 20, 2013, 05:28:05 PM

damn, this was done with a Titan.
it is impressive, but notice how the only geometry shown on screen was the head? That's as much as the Titan can handle. Probably another 15-20 years before a PC can render game characters that detailed.

Misguided
« Reply #9 on: March 20, 2013, 10:19:55 PM »

Hey mashakos,

Wouldn't it be the case that the rest of the body would need drastically less in terms of the geometry and processing, though?

mashakos
« Reply #10 on: March 20, 2013, 10:34:12 PM »

Quote from: Misguided on March 20, 2013, 10:19:55 PM

Hey mashakos,

Wouldn't it be the case that the rest of the body would need drastically less in terms of the geometry and processing, though?
you'll get the dreaded uncanny valley effect - especially if the hands look like they were taken off a mannequin. It's the biggest problem with L.A. Noire.

shon
« Reply #11 on: April 18, 2013, 12:34:51 AM »

I just set up a 3-monitor system and decided to go with the Titan.  It was a lot of money, but for once I wanted the best.  The card is amazing: powerful but very quiet.  I did a little overclocking just for fun and it was pretty easy.  Now all my games run smooth even at the high resolution I have to run at.  Playing on a 3-monitor setup is just awesome; from racing to FPS the experience is unbelievable, and I'm thankful I could afford it.

 


CeeKay
« Reply #12 on: May 19, 2013, 08:20:20 PM »

Quote from: Scraper on February 22, 2013, 12:48:30 PM

I'm still waiting on the proper 700 series to be announced.

they're coming.

Knightshade Dragon
« Reply #13 on: May 20, 2013, 10:20:35 PM »

Quote from: CeeKay on May 19, 2013, 08:20:20 PM

Quote from: Scraper on February 22, 2013, 12:48:30 PM

I'm still waiting on the proper 700 series to be announced.

they're coming.

I'm running a 570 on my desktop and doing ok.  I'm sure come Witcher 3 time I'll need a 7xx series card though. 

CeeKay
« Reply #14 on: May 23, 2013, 06:25:44 PM »

Quote from: Knightshade Dragon on May 20, 2013, 10:20:35 PM

I'm running a 570 on my desktop and doing ok.  I'm sure come Witcher 3 time I'll need a 7xx series card though.

they'll probably be at 8xx by then.

nVidia has officially launched the 780.

CeeKay
« Reply #15 on: May 30, 2013, 11:55:39 PM »

the 770 series is coming out next.

TiLT
« Reply #16 on: June 09, 2013, 12:19:55 PM »

Just in time for my new PC, which now has a 780.

CeeKay
« Reply #17 on: June 15, 2013, 07:37:01 PM »

I know the RAM doesn't matter unless you're multi-monitoring, but I find it odd that the 780 maxes out at 3 gigs of RAM while you can get a 770 with 4 gigs (or at least 4-gig 780s aren't being advertised at the places I'm looking, though I concede they may exist).  you'd think they'd market some 4-gig 780s towards the enthusiast crowd; it'd probably get all the buyers who don't want to spend a grand on a Titan but are willing to spend a bit more for the extra RAM.

TheAtomicKid
« Reply #18 on: June 17, 2013, 12:35:28 AM »

GTX 770/680:   256-bit memory interface (4 x 64-bit channels). 512MB per channel gets you 2GB of memory total.

GTX 780/Titan: 384-bit memory interface (6 x 64-bit channels). 512MB per channel gets you 3GB of memory total.

A GTX 770/680 with 4GB of memory uses 2 sets of chips per channel (backside of the card ftw!), so for a 780 the next step up via the same trick is... 6GB... which is what you get on a Titan. And eventually, someone WILL sell a 6GB 780 at a premium, no doubt.

All said? 3GB is more than enough atm. 2GB is sometimes not enough... 1 or 2 games currently can max out a 2GB card with textures, etc., and deep levels of AA and anisotropy will also consume additional memory.

I was looking at the 4GB 680/670 units before the current cycle, but 3GB is going to be the sweet spot for some time to come. You'll note that AMD cards aren't offering any more than that either. It's just not needed, save in very special cases.

Atomic
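
For anyone who wants to play with the channel math above, here's a minimal Python sketch. The 64-bit channel width and 512MB-per-chip-set figures come straight from Atomic's breakdown; the function name and everything else is just illustrative.

Code:
# VRAM capacity from memory bus width, per the breakdown above.
CHANNEL_WIDTH_BITS = 64  # width of one memory channel
MB_PER_CHIP_SET = 512    # one set of memory chips per channel

def vram_gb(bus_width_bits, chip_sets_per_channel=1):
    """Total VRAM in GB for a given memory bus width."""
    channels = bus_width_bits // CHANNEL_WIDTH_BITS
    return channels * chip_sets_per_channel * MB_PER_CHIP_SET / 1024

print(vram_gb(256))     # GTX 770/680: 4 channels -> 2.0 GB
print(vram_gb(384))     # GTX 780/Titan: 6 channels -> 3.0 GB
print(vram_gb(256, 2))  # 4GB 770/680: second set of chips -> 4.0 GB
print(vram_gb(384, 2))  # 6GB Titan (or an eventual 6GB 780) -> 6.0 GB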

CeeKay
« Reply #19 on: July 29, 2013, 12:41:23 AM »

a bit late, but thanks for the breakdown, Atomic.

Quote from: TiLT on June 09, 2013, 12:19:55 PM

Just in time for my new PC, which now has a 780.

any impressions?  I'm about to pull the trigger on one myself, and use my GTX 580 as a dedicated PhysX card.  not sure how much of a boost it will give, but I've seen benchmarks of a 680 coupled with a 580 where it gave a 15fps boost, and every bit will help if I go 3D with my new 120Hz monitor from BenQ.

CeeKay
« Reply #20 on: July 29, 2013, 04:34:45 PM »

NM, said what the heck and pulled the trigger.  they're also giving away free copies of Splinter Cell Blacklist Deluxe Edition with them, so in effect it's only costing me $590 for the card since I already had a pre-order on Amazon for the game.

Scraper
« Reply #21 on: July 29, 2013, 05:37:02 PM »

I just upgraded from a pair of 460s in SLI to one 760; it should get here today.  I figured that card was a good sweet spot for performance vs. price.  Plus I could always get a second one, put them in SLI, and they'd be more powerful than any single card on the market, all for $500.  I'll more than likely wait about a year to grab the second one though, which should make it even cheaper.

CeeKay
« Reply #22 on: July 29, 2013, 05:53:47 PM »

thought about doing SLI or even a Titan, but my previous experience with CrossFired ATIs kinda turned me off of dual GPUs.  it seemed to be a crapshoot whether a new game would work with them or whether I'd have to wait for a patch or driver update to get them working right.  I remember being all excited to fire up Crysis 2 with two GPUs, only to be treated to flickering graphics that forced me to disable one of the cards whenever I ran the game until they fixed the issue.  it happened with a few other games as well, so I figure it'll be awhile before I go back to two GPUs.  of course that was ATI; maybe nVidia has better luck.

Scraper
« Reply #23 on: July 29, 2013, 06:05:46 PM »

Quote from: CeeKay on July 29, 2013, 05:53:47 PM

thought about doing SLI or even a Titan, but my previous experience with CrossFired ATIs kinda turned me off of dual GPUs.

Nvidia is a lot better about it. I have run SLI with my last two card sets. I ran 8800 GTs for years, then upgraded to the 460s, and honestly I could still run everything smooth and maxed with the 460s to this day. But I had the upgrade itch and the money to spare, so I figured what the hell.

CeeKay
« Reply #24 on: July 29, 2013, 06:09:44 PM »

yeah, that's the bad thing with money to spare, it's too easy to go 'what the hell' and dive in.  I almost dropped $2000+ on a new PC build over the weekend before I slapped myself, did some research, and realized I wouldn't be getting much out of it other than a shiny new rig.

TheAtomicKid
« Reply #25 on: July 29, 2013, 06:13:31 PM »

Oh I forgot to mention!

I... uhh... caved in and pulled the trigger...

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127746

The reasons were a combination of reviews, expectations from the way the cooler is built, price, and clockspeeds.

The stock coolers are really good, but some of the aftermarket ones cool better, even if they're not as snazzy. I was torn trying to decide between this model and the Gigabyte Windforce 3 model... went with this one because two large fans will tend to run more quietly than three smaller ones, assuming similar amounts of airflow. Also, the black dye in the fan blades will tend to make the blades quieter... apparently clear plastic is more brittle? And to be honest the blades on the Gigabyte unit don't look clear... more smoke, which indicates there is SOME kind of dye in there... just not enough to make the plastic opaque. (Which may or may not matter, and the GB unit may or may not be noisier in the end... I just went with the MSI.)

Final results? Happy, though admittedly it was expensive. Thankfully, I've been saving up spare cash on the side of everything else to pay for computer upgrades... and it covered about 2/3rds of the cost of this card, so it's only partial misbehavior. The card runs very quiet as a matter of course. MSI has done an interesting thing with this card: it's a normal card in all respects, but they've released an app that lets you quick-set silent, gaming, and overclock modes. TBH, I haven't used it.

The card is advertised at 954/1006, and I've no doubt it can do that. It defaults, however, to 902MHz... which would be enough to annoy people, but not me... because Boost 2.0 has me covered. The card is boosting without help to 1006 when put to the test, so the app is completely unnecessary in my case. According to GPU-Z, my ASIC is judged 'average', which is a little disappointing, but it bodes well for anyone else picking up one of these if 'average' results in 'don't really need anything else to make it perform'.

I'd have been happier if the card had been a little cheaper? But let's put this in perspective. At one time I had a GTX 680 Lightning in this machine build (Intel 3820, X79 mobo, etc.). I happened to run a copy of the Heaven 1.0 benchmark at the time and saved the result... and Heaven 1.0 was still installed on the machine. This card is benching Heaven at 25% better speeds. And it's a LOT faster than the GTX 260 I had in its place. (The 680 failed; I had to return it.)

I thought about waiting for the '780 Lightning' from MSI, but it's another month away or so, and at a price premium to boot. I decided it wouldn't be worth it, especially with Boost 2.0 on the job.... it will tend to make up the difference for people automagically.

The only caveat I have for people: GPUs have a tendency not to max out their fans, and especially for the 780/Titan this is a bad thing. At 80c, the nvidia firmware starts slowing the cards down in order to keep temps under control... but sadly, most of the stock fan profiles don't use all of the fan headroom available to help keep things cool... resulting in a card that runs slower and/or hotter than it otherwise might. If this bothers you, you can use a copy of MSI's Afterburner software to manually alter your fan profile... you have to keep Afterburner running for it to work, but it has a setting to start with Windows, so it just sits in the tray with 5 million other things.

(I set mine to ramp up slowly, peaking at 100% fan when the card hits 77c... you can hear it when the card starts getting hot, but the only time it does that is when you're stressing the card... aka heavy gaming... which means you're not really noticing the card anyway... and THIS card is not particularly loud even at that point. The GTX 260 before it, by comparison, was a howler.)
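
To make the fan-profile idea concrete, here's a small Python sketch of a temperature-to-fan-speed curve in the spirit of what Atomic describes. The breakpoints are made-up placeholders; only the "hit 100% fan just before the 80c throttle point" rule comes from the post.

Code:
# Illustrative Afterburner-style fan curve: (temp C, fan %) points,
# linearly interpolated, reaching 100% fan at 77c, i.e. just before
# the 80c point where the firmware starts throttling the clocks.
FAN_CURVE = [(30, 30), (50, 45), (65, 70), (77, 100)]

def fan_speed(temp_c):
    """Fan % for a given temperature, interpolated between points."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return 100.0  # pinned at full fan above the last point

print(fan_speed(70))  # ~82.5% on this illustrative curve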

All in all? Satisfied. And the card is not exhibiting any bad behavior at all, so it's a keeper.

9/10 (based solely on 'personal feeling' rather than any technical measurements).

Atomic

CeeKay
« Reply #26 on: July 29, 2013, 06:28:55 PM »

I'll have to keep an eye on the temps with the new card.  I ordered the EVGA GTX 780 Superclocked with their new ACX cooler; I've seen a review or two saying it works pretty well and keeps the card below 80c under load.

TheAtomicKid
« Reply #27 on: July 30, 2013, 12:50:05 AM »

Basically yeah, just keep an eye on the temps.

Btw... '80c or under during load' doesn't really mean anything. The silicon starts throttling at 80c, so in fact it WON'T go over 80c unless something breaks. Ideally you want everything to max out just before 80c, thus maximizing the capabilities of Boost 2.0.

Atomic

edit: reading comprehension ftw. 'Below 80c' does not equal '80c or below'

CeeKay
« Reply #28 on: August 01, 2013, 09:34:48 PM »

well, got my 780, installed it, and ran a few benchmarks.  I have to say I'm impressed.  I ran some benchmarks on certain games last night with my GTX 580 and I'm in the process of making a comparison.

the only issue I'm having is my PC isn't recognizing the 580 I put in a separate slot to work as a PhysX card.  my mobo has 2 extra PCI-E slots, and I put it in the x8 one, which is furthest from the x16 so the two cards wouldn't be on top of each other.  I may try the other slot, but only after I run some benchmarks of games that use PhysX to see if I should even bother.  it definitely is powering up, and I have a 1200-watt PSU, so power shouldn't be an issue.

BTW, I was expecting this card to be huge, but was surprised to see it was smaller than my MSI GTX 580 Lightning Extreme.

CeeKay
« Reply #29 on: August 01, 2013, 11:50:20 PM »

Benchmarks (all run at 1920x1080 w/ max settings on an i5-2500K @ 3.3GHz w/ 8 gigs of RAM):

Bioshock Infinite
GTX 580   ---- 59.64 Avg/ 97.62 Max
GTX 780 ---- 110.04 Avg/ 290.9 Max

Arkham City
GTX 580   ---- 4 Min/ 94 Max/ 55 Avg
GTX 780 ---- 19 Min/ 110 Max/ 63 Avg

Unigine Valley Benchmark
GTX 580   ---- 43.2 Avg/ 1205 Score
GTX 780 ---- 48.5 Avg/ 2029 Score

Sleeping Dogs
GTX 580   ---- 30.5 Avg/ 41.5 Max
GTX 780 ---- 63.5 Avg/ 84.8 Max

Tomb Raider
GTX 580   ---- 58.5
GTX 780 ---- 105.8

Tomb Raider w/ Tress FX
GTX 580   ---- 36.1
GTX 780 ---- 74.7

Metro Last Light
GTX 580   ---- 21.08 Avg/ 37.05 Max
GTX 780 ---- 26.68 Avg/ 68.47 Max

Hitman Absolution
GTX 580   ---- 35 Max/ 27 Avg
GTX 780 ---- 64 Max / 47 Avg
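
A quick Python sketch that turns the average-FPS numbers above into percentage gains; every figure is copied from the list, nothing else is assumed.

Code:
# 780-over-580 gains from the posted average FPS.
avg_fps = {  # game: (GTX 580 avg, GTX 780 avg)
    "Bioshock Infinite":      (59.64, 110.04),
    "Arkham City":            (55,    63),
    "Unigine Valley":         (43.2,  48.5),
    "Sleeping Dogs":          (30.5,  63.5),
    "Tomb Raider":            (58.5,  105.8),
    "Tomb Raider w/ TressFX": (36.1,  74.7),
    "Metro Last Light":       (21.08, 26.68),
    "Hitman Absolution":      (27,    47),
}

for game, (g580, g780) in avg_fps.items():
    print(f"{game:24s} +{(g780 - g580) / g580 * 100:5.1f}%")
# Sleeping Dogs and TressFX roughly double (+108%, +107%);
# Arkham City (+14.5%) and Unigine Valley (+12.3%) barely move.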

============

some observations:

I think the Bioshock Infinite benchmark is slightly screwy, as the first time with the 780 it gave me a max FPS of over 500.

I was surprised there weren't greater improvements in Metro Last Light or Arkham City.  Metro is weird because the average FPS only went up a few frames while the max FPS almost doubled.

TressFX in Tomb Raider looks kind of silly.

Sleeping Dogs would push the 580 to 81 degrees celsius, but with the ACX cooling it only hit 70.

I didn't have a benchmark for Crysis 3, but now I seem to be hitting 30FPS on average.

Unigine Valley is supposedly the latest benchmark everyone is using, and it is even included on the disc sent with the card.  not much of a gain there either on average FPS but the score shot up.

a possible caveat:  I had to upgrade my drivers to 326.41 as the ones I was using were about a year old and didn't support the 780.

an unrelated note:  I noticed that in my BIOS the RAM is set at 1333 instead of 1600; would it cause any issues if I changed it to what it is supposed to be?

TheAtomicKid
« Reply #30 on: August 02, 2013, 04:07:33 AM »

Decided to futz with my card a bit this evening. Used a combination of GPU-Z and Afterburner: GPU-Z's windowed render test kept the GPU at its top power state while I bumped the clock speed in Afterburner.

It's perfectly stable in Windows, even up to +216MHz (at 103% power budget for Boost 2.0, btw... don't forget that, free performance there)... however, it fails when trying to run the Valley benchmark after locking in the upgraded speed.

It is, however, stable with no visible artifacts at +108MHz, all the way through the run.

Granted, there was NO interim testing to find where exactly between +108 and +216 it becomes unstable, but we can take away that the card can handle a 10% overclock... which means the card is running at roughly 90% of its actual capability in terms of GPU clockspeed.

Which, in fact, puts it smack in my personal comfort zone at the default settings. (If I overclock things, I tend to run them at about 90% of what they max out at, in order to boost the longevity of the parts.)

I could probably run it at the elevated settings, but there doesn't seem to be any real need for it anyway. I just bump the power budget to 103%, use my manual fan profile rather than letting the card BIOS handle it, and leave everything else alone.
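
A minimal sketch of the interim testing skipped above: a binary search between the known-good +108MHz and the known-bad +216MHz. The stable_at callback is a hypothetical stand-in for a pass/fail Valley run, not a real API.

Code:
# Bisect the highest stable clock offset between a known-good and a
# known-bad value. Each probe costs one full benchmark run, so
# log2(216 - 108) means roughly 7 runs to pin down the exact limit.
def find_max_stable(lo, hi, stable_at):
    """Highest stable offset in [lo, hi), given lo stable, hi unstable."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if stable_at(mid):  # e.g. run Valley, watch for artifacts/crash
            lo = mid
        else:
            hi = mid
    return lo

# Demo with a fake card whose true limit is +160MHz:
print(find_max_stable(108, 216, lambda offset: offset <= 160))  # -> 160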

When you ran your valley run at 'max settings' did that include 8x AA, etc?

And did you remember to go into the nvidia control center and tweak the settings there? In particular the power management mode, tweaking from 'auto' to 'prefer maximum performance' will give you some free fps. There's also an optimization for single display vs multiple displays, though not sure how much difference that one makes.

Atomic

TheAtomicKid
« Reply #31 on: August 02, 2013, 04:10:45 AM »

btw, the nvidia control panel has an entry for telling the driver how to decide what PhysX hardware to use. It may be defaulting to the CPU, though I don't see why it would. Try forcing it to the 580, assuming it can see it.

Atomic

CeeKay
« Reply #32 on: August 02, 2013, 04:59:39 AM »

yep, Valley was at 8x AA with quality set to ultra.  I forgot about the power management mode, thanks for reminding me, but in that benchmark it didn't make a difference.  I did notice one bit about the benchmark:

I'm guessing that's GPU Boost 2.0's doing; I haven't played around with any overclocking yet.

as for the PhysX, it is defaulting to the 780 (it is set to auto, but underneath the box it says PhysX > GTX 780), and the 580 is not showing up.  I don't even see it in the device manager.  I'm going to try the x4 slot tomorrow to see if it works differently.  I'm not sure if it's normal, but the mobo goes x8, then x4 in the middle, and finally x16 right next to the CPU.  I've never had to use either slot before, so it could be an issue with the x8.  I also checked the manual to make sure there were no switches or jumpers I needed to play with, and I didn't see anything in the BIOS that sounded like it would affect it.

I've had some beers this evening and I've found working on hardware while buzzed isn't a good idea.

on the plus side pretty much everything is running smooth, definitely smoother than with the 580, so it is all good.

TheAtomicKid
« Reply #33 on: August 02, 2013, 05:03:38 AM »

actually, it appears the program is reading the GPU speed incorrectly, I'm sad to say.

I've monitored my speeds with several programs... none of them go over my rated 1006MHz... Valley reads my card in the 1200s.

Atomic

CeeKay
« Reply #34 on: August 02, 2013, 05:24:15 AM »

I may have to try their Heaven benchmark to see what it says, but that will be for tomorrow since I'm supposed to be up at 7am.  blasted shiny new toy

CeeKay
« Reply #35 on: August 02, 2013, 05:07:16 PM »

I looked up the 580's system requirements:

Quote
PCI Express or PCI Express 2.0 - compliant motherboard with one dual-width x16 graphics slot

it is a beast of a card, maybe it won't work right in anything under x16.

CeeKay
« Reply #36 on: August 02, 2013, 07:21:16 PM »

tried the other slot, no go.  took the card out altogether for now.  while it would be nice to be able to use it as a PhysX card, it's not a necessity.

TheAtomicKid
« Reply #37 on: August 03, 2013, 10:45:56 PM »

I've never heard of them not working in an x8 slot... after all, that's all that's required for SLI. Are you _certain_ the slot you're plugging it into is x8? What you described... the furthest one from the main GPU... makes it sound like it's the extra x4 slot they like to tack on... which is fine as far as AMD is concerned, but which nvidia has a tendency to turn its nose up at.

Anyone else want to chime in here? I have limited experience with multiple GPUs in a box. Anyone know if x8 is the minimum required even for just PhysX processing? (Which would be insane, as it actually takes very little... apparently an 8800GT is more than enough to fill any PhysX requirements, at least the last time I looked into it.)

Atomic

CeeKay
« Reply #38 on: August 03, 2013, 11:19:44 PM »

here's the board I have, and the markings on the inside match up.  I double-checked the manual and saw that all 3 slots are physically x16; they just run at x16, x8 and x4 respectively, so that killed my theory that the slot wouldn't be good enough.

one thing I did notice when I gave it one last shot is that when I put in the screws to hold the card in place, they seem to pull the end of the card up slightly (I thought it might be due to both screws, but it happens with just one screw in), something that didn't happen in the original slot.  I'm now wondering if the connections there are pulled up just enough to cause issues.

[edit] I forgot to add that the card was still locking into the slot when this happened; it wasn't loose at all, and I still needed to press the slot release to get it out.

Biyobi
« Reply #39 on: August 04, 2013, 06:30:26 AM »

I've never run a PhysX setup, but do you maybe still need to use the SLI connector?

From reading around, using the 580 in anything less than x16 may actually bottleneck your main card.  Running the 580 in the x8 slot will drop the main card's slot down to x8 as well.
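
For reference, a back-of-envelope Python sketch of PCIe 2.0 link ceilings per slot width. PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, i.e. roughly 500 MB/s per lane per direction; whether x8 actually bottlenecks a given card and game is workload-dependent.

Code:
# PCIe 2.0 bandwidth ceiling per direction for each slot width.
MB_PER_LANE = 500  # PCIe 2.0: 5 GT/s with 8b/10b -> ~500 MB/s per lane

for lanes in (16, 8, 4):
    print(f"x{lanes:<2} -> {lanes * MB_PER_LANE / 1000:.1f} GB/s per direction")
# x16 -> 8.0 GB/s, x8 -> 4.0 GB/s, x4 -> 2.0 GB/s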