www.dead-donkey.com https://forum.dead-donkey.com/
Highest quality questions https://forum.dead-donkey.com/viewtopic.php?f=1&t=14052
Author: | Slayer [ Thu Mar 08, 2007 6:19 pm ]
Post subject: | Highest quality questions |
Just out of curiosity: what kind of stuff would I need in my home to make viewing highest quality movies worthwhile? I ask because I take it that it's not much use watching them on my PC or a regular TV. Some other questions concerning highest quality (also out of curiosity):

- What does 1080i mean?
- What is a watermark and what is it for?
- How many people actually watch highest quality stuff because they have the "machines" to do it?

Haha, when I read over that last line I get the feeling I'm getting really old(-fashioned), while I'm only from the '70s.
Author: | PC_Arcade [ Thu Mar 08, 2007 6:55 pm ]
Post subject: | Re: Highest quality questions
A PC with TV-out and/or an HDTV.

They'll be fine on your PC, but they'll look better on an HDTV set (unless you have a HUGE monitor).

It's the vertical resolution: in this case, 1080 lines, interlaced. You'll see either 720 or 1080, which is the number of lines, followed by a "p" or an "i" for progressive or interlaced.

Channel identifiers, usually; they're used to identify which channel you're watching.

Dunno, I don't, although I will when I get a decent HD television.
Author: | spudthedestroyer [ Thu Mar 08, 2007 9:52 pm ]
Post subject: |
Yup, a media center, ideally a dead quiet one, rather than running long cables.
What PCA said, plus: 1080p is the full 1080 lines per frame, whilst 1080i is the 1080 lines shared between two interlaced fields. 1080p requires an ultra high bitrate; ideally you'll need HDMI, or DVI with HDCP, to carry a stream like that.
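To give a feel for the bitrates involved, here's a rough back-of-envelope sketch (my numbers, assuming uncompressed video at 60 Hz and 24-bit colour; real streams are heavily compressed, but the 2:1 ratio is the point):

```python
# Rough uncompressed bandwidth for 1080p vs 1080i (assumed 60 Hz, 24-bit colour).
def raw_mbps(width, height, rate_hz, bits_per_pixel=24):
    return width * height * rate_hz * bits_per_pixel / 1e6

p60 = raw_mbps(1920, 1080, 60)   # 1080p60: 60 full 1080-line frames per second
i60 = raw_mbps(1920, 540, 60)    # 1080i60: 60 fields, each only 540 lines

print(f"1080p60: ~{p60:,.0f} Mbit/s raw")   # ~2,986 Mbit/s
print(f"1080i60: ~{i60:,.0f} Mbit/s raw")   # ~1,493 Mbit/s
```

So 1080p pushes twice the raw pixel data of 1080i at the same refresh rate, which is why the link carrying it matters.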
"You are watching FOX!"

I was looking at an awesome TV yesterday... £5500!

At the moment, the inconvenience of switching wires means I don't bother much with HD; I tend to go for rips. My monitor supports 1080p though, as far as I'm aware.
Author: | Slayer [ Thu Mar 08, 2007 10:56 pm ]
Post subject: |
Far out, I'm gonna buy a big-ass TV for my new house AND have TV-out on my graphics card! Maybe I'll try one out to see what it's like. I like quality, but it's not my main concern (that being the amount of gore).
Author: | spudthedestroyer [ Fri Mar 09, 2007 1:39 am ]
Post subject: |
TV-out will be far worse quality than HDMI or DVI with HDCP. Any decent current TV will have HDMI; personally I'd get one with an HDCP-capable DVI input too, and then you just use your secondary DVI output and treat the TV as a second monitor (I'd have to have three DVI outputs; dual monitor setup == no going back for me).

Whether your graphics card will correctly support encrypted video is another matter. I think GeForce 8800s should, but as long as you don't mess up when you choose a TV, IMO you should look at HDCP/DVI inputs.
Author: | Slayer [ Fri Mar 09, 2007 6:01 pm ]
Post subject: |
OK, that sounds not so promising. I don't have a lot of money to buy a brand new HD TV; I was just thinking about taking over a second-hand one, because most of my money will be spent on the floorboards. Right now I have a very tiny TV with no extra options for PC input, and I wanted to replace it with a bigger screen that has those options (like a Trinitron; everyone is buying plasma screens now and has to get rid of their Trinitrons).
I have a Radeon 9800 Pro, not sure whether that is good enough, but it seems to be for normal rips at least.
How do I not mess up? Is it OK if the TV has those HDCP/DVI inputs? How do I recognise them? What about lower quality rips played on an HD TV? Will they look worse than on a normal TV? I have this feeling that a TV which properly supports highest quality will be too expensive for me.
Author: | spudthedestroyer [ Fri Mar 09, 2007 9:56 pm ]
Post subject: |
HDMI is a next-gen video connector. It's a digital connector and the step up from Component (which is analogue). If you follow consoles, a big problem for Microsoft's console push is that there is no HDMI on the Xbox2, and they are outputting video over the analogue Component cables. The PS3, on the other hand, has HDMI. I think revision 2.1 of the Xbox will have HDMI to one-up Sony. Essentially HDMI is near the top of the pecking order.
HDCP is encrypted transmission so that you can view protected HD-DVDs and Blu-ray discs; in the future, Blu-ray and HD-DVD will only play over HDMI, or DVI with HDCP. PC monitors come with HDCP these days, and graphics card output is capable of doing this. You must use DVI though, not VGA or TV-out. You want to be using DVI output anyway, since it's substantially higher quality and supports a greater bandwidth.

The Radeon 9800 Pro does not have HDCP, but if you've got dual DVI outputs, just use your second one to link to your future TV rather than TV-out. You'll be able to do a resolution of up to 1600x1200, versus TV-out's 800x600 (well below HD spec). TV-out is a lesser solution; use DVI. You'll need a better graphics card for HDCP, which will be needed for decoding encrypted content, for example from an HD disc. If you never intend to buy into that crap, your rips won't have encryption, so you can just use ordinary DVI to play your video.
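To put those resolutions side by side (a quick sketch of mine; the pixel counts are simple arithmetic, nothing card-specific):

```python
# Pixel counts behind the DVI (1600x1200) vs TV-out (800x600) comparison,
# with the two HD modes listed for reference.
modes = [
    ("TV-out (800x600)", 800 * 600),
    ("HD 720p (1280x720)", 1280 * 720),
    ("DVI (1600x1200)", 1600 * 1200),
    ("HD 1080 (1920x1080)", 1920 * 1080),
]
for name, pixels in modes:
    print(f"{name}: {pixels:,} pixels")
# TV-out (800x600): 480,000 pixels
# HD 720p (1280x720): 921,600 pixels
# DVI (1600x1200): 1,920,000 pixels
# HD 1080 (1920x1080): 2,073,600 pixels
```

TV-out carries only a quarter of the pixels a 1600x1200 DVI link can, and it's nowhere near either HD mode.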
Author: | Slayer [ Fri Mar 09, 2007 11:21 pm ]
Post subject: |
I checked what the DVI output looks like. You speak of dual DVI output, but I see only one DVI connector, like in this picture. Is that one connector enough? And what about this:
Or maybe better? Or does it not make any difference? |
Author: | ViSCeRaL [ Sat Mar 10, 2007 1:17 am ]
Post subject: |
HDTV will show up the flaws in a rip much more readily than a standard one. I watch rips on a 37" HD LCD, and pixellation is much more apparent than on my previous 32" standard-def CRT TV. Fortunately, I have a 20ft long living room, so I just sit further away from the screen.
Author: | spudthedestroyer [ Sat Mar 10, 2007 8:57 pm ]
Post subject: |
Ah, yours is the old kind; modern ones usually have dual DVIs, and the DVIs support HDCP. Yours is old and has VGA and DVI... what do you link your monitor to? If it's VGA, yes, you can use the DVI to hook up to a good TV. Ideally you'd have a nice monitor linked via DVI, and then your second DVI, with HDCP, linked to your great TV. Otherwise you'd have a piddly low resolution and not something for watching HDTV content through. Rips would be okay, but TV-out these days isn't generally that good unless you get a dedicated card. If your monitor is using DVI, maybe you should get a new graphics card with dual outputs. If you don't intend to play HD-DVD or Blu-ray from your PC, or any encrypted WMV files, then HDCP isn't overly important; but generally something that is HDCP certified can take high bandwidth, so it's suitable for playing HDTV over.

HDTV will change your perspective on the quality of certain rips, as does using a projector. Large HDTVs are great for HDTV, but low-res stuff doesn't look so good. Basically you're having to stretch content to fit a higher resolution; the higher the res, the worse a rip looks, unless it's above or close to the native resolution.

Short version: if you're wanting a TV to link your PC to, you want to be using digital output = DVI, and not low-resolution analogue kit.
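To illustrate the stretching point (a little sketch; the rip sizes are just typical examples I've picked, not anything specific from this thread, and I'm assuming a 1920x1080 panel):

```python
# How much common source resolutions get stretched to fill an assumed
# 1920x1080 panel; each source pixel gets smeared across many screen pixels.
def upscale_factor(src_w, src_h, panel_w=1920, panel_h=1080):
    # Fit inside the panel while preserving aspect ratio.
    return min(panel_w / src_w, panel_h / src_h)

for w, h in [(640, 272), (720, 576), (1280, 720), (1920, 1080)]:
    f = upscale_factor(w, h)
    print(f"{w}x{h}: stretched {f:.2f}x, ~{f*f:.1f} screen pixels per source pixel")
# 640x272: stretched 3.00x, ~9.0 screen pixels per source pixel
# 720x576: stretched 1.88x, ~3.5 screen pixels per source pixel
# 1280x720: stretched 1.50x, ~2.2 screen pixels per source pixel
# 1920x1080: stretched 1.00x, ~1.0 screen pixels per source pixel
```

The closer the source is to the panel's native resolution, the less each pixel gets smeared, which is why rips that looked fine on a CRT fall apart on a big HD panel.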
Author: | Slayer [ Sat Mar 10, 2007 10:40 pm ]
Post subject: |
OK, I think I'm not gonna buy an HDTV. So it will be a low-res TV anyway (for example the Trinitron). Is it still useful to use DVI instead of TV-out then?
Author: | GrindCallus [ Sun Mar 11, 2007 7:48 am ]
Post subject: |
Well, if your card came with a DVI-to-VGA adapter and your Trinitron has a VGA input, then yes it will. Most Trinitrons have an S-Video input, which is better than using Coaxial (CATV) or Composite (RCA red/white/yellow). Some places have DVI-to-S-Video cables, but I wouldn't trust them.

Cablewise, from best to worst (A/V meaning audio/video, obviously):

- HDMI (A and V)
- DVI (V)
- VGA (V)
- Component (V)
- S-Video (V)
- Composite (A and V)
- Coaxial (A and V)
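If it helps, that ranking turns into a one-liner for picking the best shared connector (the ranking is from the list above; the helper itself is just a sketch of mine, with made-up example inputs):

```python
# GrindCallus's ranking, best to worst; pick the best connector both ends share.
CABLE_RANKING = ["HDMI", "DVI", "VGA", "Component", "S-Video", "Composite", "Coaxial"]

def best_connection(tv_inputs, card_outputs):
    shared = set(tv_inputs) & set(card_outputs)
    for cable in CABLE_RANKING:  # iterate best-first
        if cable in shared:
            return cable
    return None  # no common connector

# e.g. a Trinitron with S-Video and Composite, a card with DVI and S-Video TV-out:
print(best_connection(["S-Video", "Composite"], ["DVI", "S-Video"]))  # S-Video
```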
Author: | EthrielTD [ Mon Apr 02, 2007 5:32 pm ]
Post subject: |
Hmm, then I guess I've been able to do something odd, as I have a GF5200FX with a standard TV-out (S-Video) on my second machine, connected to my Samsung LE26R74BDX, that'll let me set the second screen size to 1280x720, which works fine (not sure if my screen downscales or not), after previously having a GF4400MX that only supported 800x600.

I can also set my screen size to HD resolutions in the "Advanced Timings" section of the Nvidia driver panel, but I haven't played with this yet.

...so am I blessed with a weird video card, or am I just missing something?
Author: | Slayer [ Mon Apr 02, 2007 9:15 pm ]
Post subject: |
Oh, on this topic: I'm getting a TV for free, so I'm dropping all my demands and will just see what I get. I think it will be rather OK, not top-notch (if it's utter shit I'll replace it ASAP).
Author: | BadBugs [ Mon Apr 02, 2007 9:42 pm ]
Post subject: |
I have a pair of GeForce 6800s (256 MB) in my dual-core gaming rig. I've had them just over a year now, and it took Nvidia nine months to release any drivers that would power the TV-out sockets. I still can't get them to work on a normal telly, yet the Radeon card that I had prior to my SLI upgrade was so simple to set up, so in my experience I'd recommend Radeon for CRT/projection TV-out.
Author: | spudthedestroyer [ Mon Apr 02, 2007 11:39 pm ]
Post subject: |
A GF5200FX is not a Radeon 9800 Pro. If you don't have an HDTV set, though, then it will be downscaling to 480i/p on your TV. I've noticed TV-out on a lot of graphics cards is horrendously blurry and messy.
A projector without DVI or D-sub isn't worth shit; use one of those inputs, since you'll have a far, far better chipset handling the actual video than a crummy TV-out chip.

Nvidias aren't too hard to get working: update the drivers and keep a-clickin' until it works. I had to change the PAL region for one I set up a while back, or something stupid, but I just wouldn't use TV-out any more. All my video sets have DVI input now.
Author: | EthrielTD [ Tue Apr 03, 2007 12:54 am ]
Post subject: |
I have my main PC (with a GFX 5700LE) connected to the Samsung via HDMI, by way of a DVI-to-HDMI adapter. Connecting the other PC was just me playing around so I could use the Winamp visualisations on my bedroom set and view the second unit's desktop without any plugging/unplugging (the second PC is in a cupboard bolted to the wall to hold all my tunes/movies... I had a break-in a few years back and they stole my PC... thank god for backups), so I stream stuff to my main PC and watch it off there via the DVI/HDMI, and that works well.

The Samsung is an HDTV, and you can tell the difference when swapping channels from the HDMI to the S-Video, but not quite as much as I thought. It just surprised me when I tried it that the GF5200 did go over 800x600 after the previous GF4400 didn't, as I got the impression that it was a limit of the TV-out interface rather than being card-specific. (I was using a CRT at the time.) Now that I think about it, my mate had a Radeon (dunno what model) and that only went to 800x600 too.

So would I be right in thinking that the 9800 Pro is limited to 800x600 over TV-out by the card, rather than by the interface itself?

...and now I've confused myself, if not everyone else.
All times are UTC [ DST ]
What's blood for, if not for shedding? |