Bulldozer vs Ivy Bridge vs GPUs for Gameplay Videos
Out of Bulldozer, Ivy Bridge, and a latest-gen GPU, which do you recommend for rendering videos to be published on YouTube?
Let's make a couple of assumptions:
-the software being used is Dxtory and Sony Vegas Pro 11
-the game is BF3 being played at 1200p
-the GPU is a GTX 660 Ti (feel free to point out a more appropriate GPU though)
-the system has 32GB of 1333MHz RAM
-keep in mind the video is only being put on YouTube (1080p res); I have no idea what constraints and optimizations this use-case offers
-video quality is king (keep in mind there will be text overlays), but it only counts if it makes a difference on YouTube ;)
-mobo, CPU, and GPU cost is capped at around $700, but preferably $500.
-overclocking IS allowed, but I'm honestly not that great at it (did 4.4GHz on a 2600K with the stock cooler) and I've never tried it on AMD before.
To be Captain Obvious, I need a GPU anyway to play BF3*. By 'I,' I mean my little brother (the one who actually plays games and has used media software before - I've never touched it personally).
*I'm not sure if CPU performance affects rendering on the GPU pipeline
Feel free to talk about overall performance, cost, and balance. I'm very unknowledgeable about this use scenario and would like to learn as many tidbits as possible.
The CPU is definitely a factor in that; the GPU less so. Usually the best thing to do is to assign a specific core (or, in the case of BD, two) to Dxtory for recording. As the recording itself is at 30 fps, the GPU won't bottleneck you there, but you'd definitely want more horsepower to enjoy smooth gameplay yourself on max settings.
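For the curious, "assigning a core" comes down to a CPU affinity bitmask (Dxtory exposes this in its own settings; the core choices below are just an illustration of how the mask works):

```python
# Sketch: computing a CPU affinity bitmask to reserve cores for a recorder.
# Each set bit allows the process to run on that core index.

def affinity_mask(cores):
    """Return a bitmask with one bit set per allowed core index."""
    mask = 0
    for c in cores:
        mask |= 1 << c
    return mask

# Example split on a quad-core: reserve core 3 for recording,
# leave cores 0-2 for the game.
recorder_mask = affinity_mask([3])      # 0b1000
game_mask = affinity_mask([0, 1, 2])    # 0b0111

print(hex(recorder_mask), hex(game_mask))
```

On an "8-core" Bulldozer you could reserve two cores the same way, e.g. `affinity_mask([6, 7])`.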
Assuming you live in Canada:
CPU - Intel Core i5-3570K Unlocked Quad Core Processor LGA1155 3.4GHz Ivy Bridge 6MB Retail (BX80637I53570K) - $145 after rebate.
Mobo - ASUS P8Z77-V LE ATX LGA1155 Z77 DDR3 2PCI-E16 2PCI-E1 3PCI SATA3 DVI HDMI DP USB3.0 Motherboard (P8Z77-V LE) - $219
GPU - Galaxy GeForce GTX 670 GC 1006MHz 2GB 3004MHz GDDR5 SLI 2xDVI HDMI DisplayPort PCI-E DX11 Video Card (67NPH6DV5ZVX) - $345 after rebate.
Total ~ $710. If budget is tight you can find a cheaper but still solid ASRock board. The 670 (unlike the 660 Ti) will give you the ability to run ultra at 1080p @ 60 fps.
I agree with a 3570K / Z77-based motherboard of your choice / GTX 660 Ti or 670. 16GB of RAM should be more than enough (I would go for 1600MHz for cost/performance).
Yes, because you don't want to load your GPU even more while gaming. Assigning a single core from the CPU is the best solution IMO, as the gameplay would not be affected much by it. At least that was my experience from Frapsing BF3 (I was using a 580 back then).
I think there's some confusion going on. Keep in mind, I am NOT the one who will be using this machine (I barely even game); that will be my brother. I have no experience with any of the software being used or really any sort of media-creation software in general. I'm just here to learn some more about the process and how to optimize the hardware.
I was talking about GPU acceleration for Sony Vegas Pro. He says he uses it to add in text/graphics, cut the video, and encode it in the right format.
I'm under the impression that Dxtory captures the game window and outputs it into some sort of lossless video format. From there, he takes that file into SVP, does his work, and then uploads to YouTube.
EDIT: Anybody have any experience with using Bulldozer for encoding? I'm under the impression this was the only performance/dollar win for BD. If so, it seems like it would be a decent choice since BF3 really doesn't seem to give a shit about CPU performance
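As an aside on that lossless capture: the raw bandwidth is substantial, which is part of why a fast capture drive (and plenty of RAM for buffering) matters. A back-of-envelope sketch (the compression ratio is an assumption; real Dxtory codec results vary):

```python
# Back-of-envelope: disk bandwidth needed for lossless capture at 1200p / 30 fps.

width, height, fps = 1920, 1200, 30
bytes_per_pixel = 3                 # 24-bit RGB

raw_mb_per_s = width * height * bytes_per_pixel * fps / 1e6
print(f"uncompressed: {raw_mb_per_s:.0f} MB/s")

assumed_ratio = 3                   # assumed lossless compression ratio
print(f"at ~{assumed_ratio}x lossless compression: "
      f"{raw_mb_per_s / assumed_ratio:.0f} MB/s")
```

Even at a generous compression ratio, minutes of footage run into tens of gigabytes, so the 32GB of RAM is less important here than sustained disk write speed.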
What about Intel Quick Sync? Does anyone have any experience with that? I'm just suggesting it; I've never used it myself.
What's the Best Platform for Transcoding Media? Find Out NOW!! Linus Tech Tips - YouTube
Intel Quick Sync Video Transcoding versus CUDA & CPU Alone NCIX Tech Tips - YouTube
GPGPU encoding is awful; I have tried it many times. I encode videos quite a bit, and CPU-based encoding is consistent. GPGPU tends to throw random artifacts in; I tried many different pieces of software thinking it was bad code. It needs to come a long way before I'll try it again.
It's probably just the code maturity right now, since CPU encoding has been around forever.
I have a Bulldozer for encoding and it chews through video. I can see it being weak for some things, but it's not bad at all for this type of stuff if your encoder uses all "8" cores.
Bulldozers are OC friendly; a cheap $30 cooler will get you running 4.5GHz. That being said, any Intel will OC and trounce the AMD in lightly threaded apps while using less power. Either Intel or AMD will serve your purpose, so go with whatever is cheapest - just stay away from GPGPU encoding for right now.
You seem to have experience using both Bulldozer and Intel. Are there any interesting price points/products? Is the FX-8120 a good buy compared to competing Intel chips at its price?
The 8120 is $149 at NCIX, a 3570K is ~$200, and a 3770K is ~$320. An Intel solution would be ~$400 minimum for board and CPU. This 8120 & MSI GD65 bundle for $275 looks pretty good to me for an encoding box.
If your encoding software can take advantage of 8 threads, then an overclocked 8120 will provide the best value with your budget. Keep in mind that power consumption goes up a bit when overclocking the BD chips.
8 Bulldozer threads at 4.5GHz will be roughly comparable to an older, non-overclocked 6-core Intel chip (i7-970, E5645 Xeon) when using all threads - maybe even slightly ahead in some tests.
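Whether those "8" cores actually pay off depends on how much of the encode parallelizes. Amdahl's law gives the rough shape of it (the 90% parallel fraction below is just an assumption for illustration; real encoders differ):

```python
# Amdahl's law: speedup from n cores when a fraction p of the work
# parallelizes perfectly and the rest stays serial.

def speedup(p, n):
    return 1 / ((1 - p) + p / n)

# Assumed 90% parallel encoding workload:
print(f"4 cores: {speedup(0.9, 4):.2f}x")   # ~3.08x
print(f"8 cores: {speedup(0.9, 8):.2f}x")   # ~4.71x
```

The takeaway: doubling core count never doubles encode speed, so the 8120's value case rests on the encoder scaling well past 4 threads, which modern x264-style encoders generally do.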