NEC XG-75 + 8800GT
 
yonexsp
Joined: 25 Apr 2006
Posts: 311

Posted: Wed Oct 08, 2008 3:24 am


Satanier wrote:
yonexsp wrote:
Satanier, just saw your email.

OK, I think Mark and I, over the last four years, have probably tried the most and had the most experience with XGs using HTPCs.

Here is what I have learnt; Mark, chime in with any corrections:

1) XGs will accept a standard 1080p@60 signal but won't resolve it properly, and it leaves a lot of unused raster, which is bad! Hence the need for custom timings, via either an HTPC or an external scaler like a VP50, to use all of the raster.
2) Powerstrip with interlaced signals only works with ATI cards and non-8xxx-series NVidia cards. Which sucks if you have an 8800GT, or an 8500GT in my case! BUT...
3) ATI seem to have dropped gamma correction for Vista (that may have changed with the latest Catalyst drivers, I have not checked), which is really freakin' bad, because to be honest the main benefit of an HTPC for me at this point is gamma correction. I hate Vista!
4) 1080i@96 is the best and the closest to a progressive-looking image you will get with interlaced, but refer to #2 above.
5) 1080i@72 is the next best. Both rates are really only better because they get rid of judder (see the sketch after this list).
6) 1080p@48 flickers too much.
7) XGs will do 1080p AND resolve the lines properly at 60Hz, BUT, and it's a big BUT, it takes a perfectly set-up XG: perfect astig, custom timings, etc. Not worth the hassle to be honest, and you want a non-LC version for the best 1080p resolving as well. Non-LCs are sharper, BUT the LC versions are awesome because practically all the halos are gone.
8) XGs are infinitely tunable; colours are great and just look stunning when done right.
9) 1080i@60Hz from the component output of an NVidia card like the 8800GT is really, really good. BUT you need AnyDVD to get past the DRM crap, which you don't need over VGA; but you can't use an 8800GT over VGA in interlaced mode, aggghhhh! Oh, and you need a component input card or transcoder. BUT there is no raster size issue, and plenty of bandwidth to play with.
10) Things were so much easier before the need for fancy video cards for HD DVD and BD. :(
11) PowerDVD is not as good as it used to be. The Toshiba external HD DVD player, the HD-A3, is the best for DVD, but you need gamma correction, and for 1080i support you need a DVI/HDMI input card.
12) Vista sucks! I said that already, right? Well, it sucks!
13) Powerstrip basically does not work with 8xxx-series NVidia cards. What a nightmare!
14) NVidia's custom resolution feature sucks ass! The lack of real-time adjustments a la Powerstrip is a huge loss, and effectively makes the software almost useless.
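
To make the judder point in #5 concrete: film is 24fps, so a refresh rate that is an integer multiple of 24 (48, 72, 96) holds every film frame for the same number of fields, while 60Hz forces the uneven 3:2 cadence that reads as judder on slow pans. A quick sketch (an editorial illustration, not from the original post):

Code:
# Check whether a display rate divides evenly into 24fps film,
# i.e. whether 3:2 pulldown judder is avoided.
FILM_FPS = 24

for mode, hz in [("1080i@60", 60), ("1080i@72", 72),
                 ("1080i@96", 96), ("1080p@48", 48)]:
    fields_per_frame, remainder = divmod(hz, FILM_FPS)
    if remainder == 0:
        print(f"{mode}: each film frame held {fields_per_frame} fields -> no judder")
    else:
        print(f"{mode}: uneven cadence (3:2 pulldown) -> judder on slow pans")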

What do I use now:

An 8500GT with component output at 1080i@60Hz, running AnyDVD. It is a drop-dead gorgeous image, and so sharp right now that I can see scanlines from 8ft on a 100" screen, but they aren't really distracting in any meaningful way.

What is needed to solve all of the above?

A cheap ATI card that supports gamma correction in Windows Vista. The ATI cards support interlaced modes and work well with Powerstrip; just the gamma needs to be fixed. But as I just spent the money on the PC and card, I will wait. 1080i@60 over component looks stunning already, and having seen all the other resolutions, it matches up really, really well. Yes, there is some slow-pan judder, but I never notice it now anyway.
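
For what it's worth, the kind of driver-level gamma correction Ken is missing on Vista can also be pushed through the Windows GDI gamma ramp directly. A minimal sketch via ctypes; SetDeviceGammaRamp is a real Win32 call, but using it as a stand-in for Catalyst's gamma controls, and the example exponent, are my assumptions (and many drivers clamp how far the ramp may bend):

Code:
# Apply a simple power-law gamma ramp to the primary display (Windows only).
import ctypes

def set_gamma(exponent=1.0):
    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    hdc = user32.GetDC(0)                      # screen device context
    ramp = (ctypes.c_ushort * 256 * 3)()       # WORD[3][256]: R, G, B curves
    for ch in range(3):
        for i in range(256):
            v = int(65535 * (i / 255.0) ** (1.0 / exponent))
            ramp[ch][i] = min(65535, v)
    ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    user32.ReleaseDC(0, hdc)
    return bool(ok)

# e.g. set_gamma(1.1) to lift a slightly dark picture (illustrative value)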

So my advice: try a copy of AnyDVD, use the component output, and away you go.

Ken



I don't have component inputs on my XG-75. I think my rasters are being fully used as well, and I guess I'm running 1080p, but it looks very sharp, about as good as my Dell 2407WFP-HC 24" LCD, which has a native resolution of 1920x1200. I don't really want to buy more stuff if I can avoid it; I'm saving up for a gamma box and an HDMI option in the far future. Vista certainly does suck, which is why I still use XP =). NVidia custom resolutions suck too, I agree. So I guess my question is: my raster seems totally used and the picture looks as sharp as my LCD monitor (aside from the left side and corners, because my green tube won't focus right there; the issue is electronic), so is there any real problem with just leaving it as is? It's also nice to just use the Windows resolution config, since I can easily manage my multiple monitors, one of which is the CRT. Is my XG special somehow, that I was able to get an incredible-looking setup in 1080p as a newbie? I'm a bit confused, I guess.


Sounds good; if you enjoy it, then that is all that matters. But no XG will use the full raster with 1080p. 100% Size in the menu is not the same as full raster. Take a look in the lens and see how much space is left beside the image on the sides. You'll see how much is wasted.

K./
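
A rough way to see the "wasted raster" point above (an editorial sketch, not from the thread): picture is only drawn during the active portion of each line and frame, so a timing with big blanking porches paints a smaller image inside the scanned area. Comparing the standard CEA-861 1080p@60 timing with a hypothetical tightened custom timing (the custom numbers are invented for illustration):

Code:
# Fraction of the scanned raster that actually carries picture.
def active_fraction(h_active, h_total, v_active, v_total):
    return (h_active / h_total) * (v_active / v_total)

std = active_fraction(1920, 2200, 1080, 1125)      # CEA-861 1080p60, 148.5MHz clock
custom = active_fraction(1920, 2040, 1080, 1100)   # hypothetical tighter porches

print(f"standard 1080p60: {std:.1%} of the raster is picture")
print(f"tightened custom timing: {custom:.1%}")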
Satanier
Joined: 25 Aug 2008
Posts: 185

Posted: Thu Oct 09, 2008 4:02 am

yonexsp wrote:
[earlier quotes snipped]

Sounds good; if you enjoy it, then that is all that matters. But no XG will use the full raster with 1080p. 100% Size in the menu is not the same as full raster. Take a look in the lens and see how much space is left beside the image on the sides. You'll see how much is wasted.

K./

You're referring to raster size on the tube face? I was under the impression that a slightly smaller raster was recommended by some for the XG. I was told there are pros and cons to using the entire tube face, and pros and cons to using a smaller raster. My rasters go to about 1/2", maybe a bit less, on the left and right sides, although this is not uniform all the way up the side due to keystone, so it's slightly more at the bottom.
MikeEby
Joined: 24 Jun 2007
Posts: 5238
Location: Osceola, Indiana

Posted: Thu Oct 09, 2008 4:26 am

There are two schools of thought on this. Better focus can most likely be achieved by using a smaller raster area, while others want to maximize resolution by using most of the tube face. Both ideas have merit.

Mike

_________________
Doing HD since the last century!
Mark_A_W
Joined: 15 Mar 2006
Posts: 3068
Location: Sunny Melbourne Australia

Posted: Fri Oct 17, 2008 10:02 pm

Satanier wrote:
[earlier quotes snipped]

You're referring to raster size on the tube face? I was under the impression that a slightly smaller raster was recommended by some for the XG. I was told there are pros and cons to using the entire tube face, and pros and cons to using a smaller raster. My rasters go to about 1/2", maybe a bit less, on the left and right sides, although this is not uniform all the way up the side due to keystone, so it's slightly more at the bottom.


No, he's referring to the unused pixels at the sides of the image, i.e. the gap between the image and the raster edge. They are called the porch pixels.

As for what Ken said, I wouldn't use component out. Just set 1080i output over VGA; it works on ATI and NVidia cards.

For ATI you need to get 1080i at 30Hz working in Catalyst Control Centre first, then tweak the timings to 1080i at 96Hz in Powerstrip.

Timings:
"C:\Program Files\PowerStrip\PStrip.exe" /t:1920,112,128,115,1080,46,5,30,126655,542
yonexsp
Joined: 25 Apr 2006
Posts: 311

Posted: Mon Oct 20, 2008 2:31 pm

How do you get 1080i over VGA on an 8xxx NVidia card? I can't get that to work at all.
Satanier
Joined: 25 Aug 2008
Posts: 185

Posted: Mon Oct 20, 2008 8:03 pm

yonexsp wrote:
How do you get 1080i over VGA on an 8xxx NVidia card? I can't get that to work at all.




I'm starting to think it may be impossible. I gave up, and I'm using 1080p (I think) now.