Cannot stream from Lumagen to Sony G90/Moome
Madmanaenewman




Joined: 11 Dec 2006
Posts: 21
Location: Canada


PostLink    Posted: Sat Sep 09, 2017 11:50 pm    Post subject: Cannot stream from Lumagen to Sony G90/Moome Reply with quote


Sorry for the length of this post, but I have tried to anticipate questions by giving as much information as may be relevant.

I am trying to get my computer to display content on my G90 using a Moome card and a Radiance video processor, without success.

Equipment: a desktop computer running Windows 10, built on an Asus TUF Z270 MARK 2 motherboard with an NVIDIA GeForce GTX 1070 graphics card, connected to a Lumagen Radiance 2143, which is in turn connected through Output 2 via a 10 m RedMere (active, unidirectional) HDMI cable to a Moome IFB-FULLHD V3 HDMI V1.4 card installed in the "B" slot of the Sony G90.

I am using a smart TV as my computer monitor; it is currently attached to Output 1 of the Lumagen because the graphics card has only one HDMI port (needed to feed the Lumagen) and the TV does not accept a DisplayPort cable. I have a DisplayPort-to-HDMI cable on order which will let me feed the TV/monitor directly from the graphics card, freeing up the Lumagen's Output 1 to feed the receiver (also on order).

The G90 and the Moome are used but were fully tested and verified by Curt Palme. The Moome card did not come with a remote, so I have ordered a new one which should arrive in about a week. The remaining components are all brand new (the Lumagen was an open-box item). I also have 4 pairs of 3D NOW active shutter glasses.
Other than the computer/graphics card everything was purchased through http://www.curtpalme.com/ with the clear understanding that these components are all compatible with each other. It is my understanding that I should be able to watch 2D and 3D content stored on the computer without issue, assuming I set everything up correctly.
Problem 1: I cannot find any combination of settings that sends a usable signal to the G90.
Problem 2: I am not good at this stuff, but I have been reading the manuals and trying my best to absorb and comprehend. I am, however, stuck.
Isolation strategies performed:
1. When I attached my laptop directly to the MOOME using the same RedMere HDMI cable the G90 displayed properly through HDMI1 but not HDMI2. This told me the cable, the MOOME and the G90 all work and that the Moome card must be set to HDMI1.
2. The TV/monitor works perfectly whether attached directly to the graphics card or to the Lumagen. This tells me that those components also work. It apparently is not a hardware issue, plus everything so far is “plug and play” easy.
3. When I attach the desktop computer directly to the MOOME I get mixed results. Most of the time I get a flickering image or no image. I did, however, finally manage to get a stable picture (settings to follow). I don't know how good the picture is, as my screen is on back order and I did not agonize over perfect registration on my dark red wall.
4. When I attach the laptop or desktop or my nephew’s PlayStation to the Lumagen and the Lumagen to the Moome I am always back to getting flickering or no image. This suggests that the settings on the Lumagen cannot be interpreted by the MOOME.
5. I tried using input 8 and got the same results.
6. I reset the Lumagen and got the same results.
My understanding is that the Lumagen tries to obtain EDID information from the display, and that the MOOME does indeed (per Curt) send EDID information when asked. Moreover, I have tried various custom settings on the Lumagen (basically ensuring it is sending 1080p, 60 Hz) to no avail. If my understanding of EDID is accurate, then I should be able to use the default "Auto 2,1" output setting on the Lumagen with Output 2 going to the G90 and everything should work, which is now all I am trying.
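For anyone who wants to verify what the card is actually advertising, here is a minimal sketch of pulling the preferred mode out of a raw EDID dump. It assumes Python 3 and that the EDID has been saved to a binary file with one of the free EDID-dump utilities; the file name is only an example.

Code:
# Minimal sketch: read the preferred video mode out of a raw 128-byte EDID dump.
import sys

def preferred_mode(edid: bytes):
    # A valid base block starts with the fixed 8-byte header and checksums to 0.
    if len(edid) < 128 or edid[0:8] != bytes.fromhex("00ffffffffffff00"):
        raise ValueError("Not a valid EDID base block")
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed - dump may be corrupt")

    d = edid[54:72]                         # first detailed timing descriptor
    pclk_khz = int.from_bytes(d[0:2], "little") * 10   # stored in 10 kHz units
    if pclk_khz == 0:
        raise ValueError("First descriptor is not a timing descriptor")

    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)

    refresh = (pclk_khz * 1000) / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, refresh

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:      # e.g. "moome_edid.bin" (example name)
        w, h, hz = preferred_mode(f.read())
    print(f"Preferred mode: {w}x{h} @ {hz:.2f} Hz")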
These are my conclusions:
7. Most of the video settings on my desktop must be incompatible. At the time I started writing this, the stock Windows "Display Settings" offered the following options: 1920x1080, 1768x92, 1680x1050, 1600x1024, 1600x900, 1366x768 (recommended), 1360x768, 1280x1024, and 1280x960. I used the recommended 1366x768 setting and finally got a working signal (for a while), but when I look at the options now they have changed, both in terms of choices and recommendation. The current top choice and recommendation is "1920 x 1080 (Recommended, 3D)". These choices must vary depending on the NVIDIA settings.
8. These are the current NVIDIA options:
a. Under the title “HD 3D” I can choose from “1080p, 1920 x 1080” (23 or 24 Hz) or “720p, 1280 x 720” (50, 59 or 60Hz).
b. Under the title “Ultra HD, HD, SD” my choices are “1080p, 1920 x 1080” (25, 30, 50, 59 or 60Hz); “1080i, 1920 x 1080 (Native)” (29 or 30 Hz), and a bunch more
c. There are also some “PC” settings
d. I have the ability to create a custom setting (with a ton of warnings that it might damage something).
9. I have managed to get working settings, but sometimes it has required rebooting the computer, and right now I cannot get a working setting. Regardless, any time I do get a working setting, it stops working as soon as I try to feed it through the Lumagen. (A quick way to confirm exactly what the computer is outputting at any moment is sketched just below.)
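A minimal sketch of one way to confirm what Windows is actually outputting at a given moment, assuming Python 3 with the pywin32 package installed (pip install pywin32). This only reads the current mode; it changes nothing, and it can be run before and after changing the NVIDIA settings to see what actually took effect.

Code:
# Minimal sketch: print the resolution and refresh rate Windows reports for
# the primary display right now. Assumes the pywin32 package is installed.
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"{dm.PelsWidth}x{dm.PelsHeight} @ {dm.DisplayFrequency} Hz, "
      f"{dm.BitsPerPel}-bit colour")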
Does anyone have any idea what my settings should be:
• On the computer, in order to send a working signal directly to the G90.
• On the computer, in order to send to the Lumagen.
• On the Lumagen, in order to send a working signal to the G90.
Any help would be much appreciated.
Jules
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Sun Sep 10, 2017 12:46 pm    Post subject: Reply with quote

Hi Jules,

One suggestion I have is to make sure you're not feeding the CRT projector 23 or 24 Hz. You need to do 59/60 Hz.
Most film-based sources will default to 24 Hz (such as when playing a Blu-ray disc), but that doesn't work on CRT projectors as they cannot sync that low. A PlayStation will also usually default to 24 Hz - you need to turn off the "auto" refresh rate and force it to 60 Hz. See the section in the Moome manual called "Installation example on SONY VPH-G90 and play with SONY PS3". It'll walk you through getting the PlayStation working right. The manual is here: http://www.curtpalme.com/docs/ifb_fhdv3_v11_20140228.pdf
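To illustrate why the refresh rate matters here, the standard CTA-861 timings an HDMI source uses for 1080p work out as follows. These are published timing numbers, shown purely as a worked example of the arithmetic.

Code:
# Standard CTA/CEA-861 1080p timings: (pixel clock MHz, total pixels per line,
# total lines per frame). These are the modes an HDMI source normally emits
# when set to "1920x1080".
MODES = {
    "1080p24": (74.25, 2750, 1125),
    "1080p30": (74.25, 2200, 1125),
    "1080p60": (148.5, 2200, 1125),
}

for name, (pclk_mhz, h_total, v_total) in MODES.items():
    f_h = pclk_mhz * 1e6 / h_total      # horizontal scan rate in Hz
    f_v = f_h / v_total                 # vertical refresh rate in Hz
    print(f"{name}: fH = {f_h / 1000:.2f} kHz, fV = {f_v:.2f} Hz")

# 1080p24 -> fH = 27.00 kHz, fV = 24.00 Hz  (below the ~30 Hz floor the Moome
#                                             manual quotes for most CRTs)
# 1080p60 -> fH = 67.50 kHz, fV = 60.00 Hz  (what the projector should be fed;
#                                             the G90's own FH/FV readout is a
#                                             handy way to check this)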

The same sorts of settings would apply to other sources too.

My guess is your laptop, which works when connected directly to the Moome card / CRT projector, is older and is doing 60 Hz, but your desktop is newer and is doing a lower refresh rate, as it has a much newer video card with newer drivers / more features.

Before getting into 3D or even introducing the Lumagen, if you want to get the desktop (or any source) working right, I'd suggest simplifying: go directly from the source to the Moome card / CRT projector, leave out the Lumagen, and don't try 3D (yet). Get all sources working reliably, then introduce the other items.

Given that your issues are not really Lumagen related, I'm moving this to the CRT Projectors forum where likely more people will see it.

Good luck!

Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
cmjohnson




Joined: 03 Apr 2006
Posts: 5180
Location: Buried under G90s


PostLink    Posted: Sun Sep 10, 2017 1:16 pm    Post subject: Reply with quote

FYI, a G90 WILL sync to a 24 Hz refresh rate. I will not call the picture particularly watchable, but all my G90s will do it.
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Sun Sep 10, 2017 1:56 pm    Post subject: Reply with quote

I'm surprised it will actually sync that low, but like you said, it'll be unwatchable.

To quote Moome's Sony HDMI card manual about 24HZ, 25HZ 29.9HZ, 30HZ:

" IFB-FULLHD V3 support these modes, and will output these signal with same
refresh rate as same as input source, but most CRT projector are only support
30Hz-150Hz, using such low refresh rate may let SONY projector working with
incorrect behavior. "


Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Sun Sep 10, 2017 2:10 pm    Post subject: Re: Cannot stream from Lumagen to Sony G90/Moome Reply with quote

Madmanaenewman wrote:
... to a Moome IFB-FULLHD V3 HDMI V1.4 installed in “B” slot of Sony G90...

Are you sure you have this card and not an older one? The reason I ask is that you just ordered this card 3 days ago, so it can't have arrived yet.

Older cards do not have the same functionality.

Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
wanderer




Joined: 11 Jan 2015
Posts: 63



PostLink    Posted: Sun Sep 10, 2017 4:10 pm    Post subject: Re: Cannot stream from Lumagen to Sony G90/Moome Reply with quote

Madmanaenewman wrote:
Does anyone have any idea what my settings should be:
• On the computer in order to send a working signal directly to the G90.

With a GTX 1070 card this option probably won't be possible. Nvidia reportedly removed all VGA output capabilities from the current line of cards to go fully digital. You could certainly try another PC, earlier graphics card or a laptop though as those should work on a direct connection.
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Sun Sep 10, 2017 4:36 pm    Post subject: Re: Cannot stream from Lumagen to Sony G90/Moome Reply with quote

wanderer wrote:
Madmanaenewman wrote:
Does anyone have any idea what my settings should be:
• On the computer in order to send a working signal directly to the G90.

With a GTX 1070 card this option probably won't be possible. Nvidia reportedly removed all VGA output capabilities from the current line of cards to go fully digital. You could certainly try another PC, earlier graphics card or a laptop though as those should work on a direct connection.

I imagine he means directly to the G90 through the Moome HDMI card?

Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
wanderer




Joined: 11 Jan 2015
Posts: 63



PostLink    Posted: Sun Sep 10, 2017 6:38 pm    Post subject: Re: Cannot stream from Lumagen to Sony G90/Moome Reply with quote

kal wrote:
I imagine he means directly to the G90 through the Moome HDMI card?
Kal

I think you are right - I just interpreted the last part as being direct from PC > G90 over VGA, which would work unless you have one of the latest GTX cards.

It sounds like the digital Nvidia GTX 1070 > HDMI > Moome > G90 chain is working, as outlined in step #3 above, for some resolutions. That would confirm the GTX works (digitally) and that the Moome and G90 work, so it's just the Radiance in the mix that's left.
cmjohnson




Joined: 03 Apr 2006
Posts: 5180
Location: Buried under G90s


PostLink    Posted: Sun Sep 10, 2017 8:10 pm    Post subject: Reply with quote

It may be that the Moome card is very restrictive about the specific formats of the resolutions it accepts, and the video card may only be capable of outputting "PC spec" formats when the card requires "video spec" formats.

Just a thought.
Madmanaenewman




Joined: 11 Dec 2006
Posts: 21
Location: Canada


PostLink    Posted: Sun Sep 10, 2017 10:23 pm    Post subject: Reply with quote

Hello and many thanks to all those who have offered input. Time now to respond to some of the tips and questions put to me:
1) I hope I will never post a question whose answer is covered multiple times in the manuals, on the internet, and in the forums. Please rest assured that I have not been sending 24 Hz signals to the G90. While I have dabbled with a few options just to see, for the most part I am only sending 60 Hz signals.
2) Eliminating the Lumagen from the equation is very wise, and while that is what I was already trying, it is good to hear confirmation. As mentioned, I did manage to get a signal of sorts once or twice, but I forget the settings and have not been able to duplicate the results. So this is what I just tried. I booted my computer and went to the Windows "Display Settings", which had defaulted to "1366 x 768 (recommended)". I then went to the NVIDIA Control Panel and chose the matching PC-type "1366 x 768 (native)" setting, ensuring it was also on 60 Hz. I verified from the Moome manual that this is in fact a supported setting. I plugged in the HDMI card and voila... a picture briefly appeared, then disappeared, appeared again with many horizontal lines dancing throughout, then disappeared, and so on. Am I missing something here? I am sending a signal that goes to my TV just fine and is supposedly supported by the Moome card, but it doesn't actually work. And of course, who wants to send such a poor resolution when everything is supposed to work with 1080p? So let's try that... I set the NVIDIA Control Panel to "1080p, 1920 x 1080, 60 Hz", checked the Windows display setting (which set itself automatically to 1920 x 1080), and tried feeding that to the G90. Same result.
3) You (Kal) are correct... I did just order a new card. Right now I am using a second-hand card which Curt said was the current model, but perhaps he was mistaken or perhaps the card is defective. I struggle with manuals before asking for help, and I have been struggling with this for well over a week; I tentatively concluded that the Moome was the problem, hence my purchase of a new one. I will be returning the used one today and will not be able to test further until the new one arrives.

As to the suggestion that the card is looking for video specs as opposed to PC specs: my understanding is that the Moome card accepts both (as per the list in the manual), that the 1366 x 768 I just tried is a PC spec, and that the 1080p I just tried is a video spec, so while that seems like an interesting possibility I don't imagine that could be the issue.

Hopefully the new card will solve the problems, as I am gathering from the responses and from everything I have tried that my settings should work. If not, I fear I will have to delve deeper into the problem than the manuals seem to cover.

I will post an update once the new card arrives.
Jules
cmjohnson




Joined: 03 Apr 2006
Posts: 5180
Location: Buried under G90s


PostLink    Posted: Sun Sep 10, 2017 10:36 pm    Post subject: Reply with quote

Do you have a dvd or blu-ray player that outputs 1080p over HDMI?

If you do, does it work when connected directly to the G90 via the Moome card?
Madmanaenewman




Joined: 11 Dec 2006
Posts: 21
Location: Canada


PostLink    Posted: Sun Sep 10, 2017 10:40 pm    Post subject: Reply with quote

Ha... just tried one more thing before taking the Moome card out: I rebooted the computer to see what would happen. This time the G90 said "frequency out of range". It also shows FH = 4.88 and FV = x, where x changed every time I unplugged and re-plugged the cable. Very weird. My nephew's Wii (game console) works, though.
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Sun Sep 10, 2017 11:18 pm    Post subject: Reply with quote

Sounds like you may have an intermittent connection somewhere. Possibly a bad cable.

How long are your cables? What brands are they? Can you map out exactly what you're using, something like this:

PS3 source -> Brand X HDMI cable X feet long -> moome card (model?) -> Sony G90

Heed the Moome manual too with regard to the recommended Sony settings - there are some specific settings in the G90 you should be using.
See my previous link to the manual.

Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
AnalogRocks
Forum Moderator



Joined: 08 Mar 2006
Posts: 26690
Location: Toronto, Ontario, Canada

TV/Projector: Sony 1252Q, AMPRO 4000G


PostLink    Posted: Sun Sep 10, 2017 11:39 pm    Post subject: Reply with quote

Quote:
I plugged in the HDMI card and voila...a picture briefly appeared, then disappeared, appeared again with many horizontal lines dancing throughout, then disappeared and so on. Am I missing something here?


I've seen this one on a G90 when running a Faroudja quadrupler/scaler via RGBHV; the video needed to be shifted to the right to get a stable image. If there are custom settings in the Nvidia driver, try adding porch pixels to shift the image to the right. I remember the newer crop of HDMI digital outputs were stingy when it comes to front and back porch pixels. Essentially, if there aren't enough pixels preceding the image you get the dark, liney screen. CRTs need some time for the scanning beam to settle down, vs fixed-pixel displays that don't use scanning.
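A rough sketch of that arithmetic, using the published CTA-861 1080p60 numbers, in case it helps when filling in a custom resolution in the NVIDIA panel. The 40-pixel shift is only an illustrative figure, and the driver's dialog may label the fields differently.

Code:
# Sketch: shift the picture right by moving pixels from the horizontal front
# porch to the back porch, keeping the total per line (and so the pixel clock
# and refresh rate) unchanged. Values are the standard CTA-861 1080p60 timing.
h_active, h_front, h_sync, h_back = 1920, 88, 44, 148   # standard 1080p60
pixel_clock_hz = 148_500_000
shift = 40                                               # pixels to move (example)

new_front = h_front - shift
new_back = h_back + shift
h_total = h_active + new_front + h_sync + new_back       # still 2200

assert new_front > 0, "front porch must stay positive"
print(f"front porch {h_front} -> {new_front}, back porch {h_back} -> {new_back}")
print(f"H total {h_total} px, fH = {pixel_clock_hz / h_total / 1000:.2f} kHz")
# The image moves right on a CRT because the beam spends longer in blanking
# before active video starts on each line.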
Also check your Video sync in the G90 menu and try switching from one to the other (tip and ???). This might do something.

I remember back in the olden days (c. 2007) I had trouble getting the V.2 DVI/YPbPr Moome to work with a GeForce card. It seems the GeForce didn't put out a very strong signal. Not sure if this applies to your situation. Tell me, will this work: PC---Lumagen---PC monitor?

Also try a second cable to the G90, maybe a 6-12 footer. Set your PC to 1920x1080@60HZ then -----Lumagen at 1080P60----->G90

The other thing you might try is PC at 1080p---HDMI booster/equalizer---G90, to see if you get a stable signal.

Booster: http://amzn.to/2wUoimR

Back in 2007 I ran from a PC to a SONY switcher with a Moome card and out to a SONY G520P monitor. I added an Extron 202xi analog box to horizontally shift the signal to the right - basically what adding more porch pixels from the video card will also do.

_________________
Tech support for nothing

CRT.

HD done right!
Madmanaenewman




Joined: 11 Dec 2006
Posts: 21
Location: Canada


PostLink    Posted: Mon Sep 11, 2017 8:10 pm    Post subject: Reply with quote

Prior to returning the second-hand Moome card I compared it to the picture in the manual and noted several differences. It may not mean much, or it could account for the issues I am experiencing. I cannot know until the new card arrives and I test to see whether or not the problems are solved.
My cable is this one: http://amzn.to/2jhAxVA . While 40 feet is long (I think 30 feet would have just made it), it is supposed to support "the full 10.2 Gbps high speed HDMI data rate even with 22 AWG conductors", to provide for "a total of over 16 million color variations", and to be "capable of handling the high bandwidth requirements of 3d signals", all compliments of RedMere technology.

Moreover, quoting from CNET's discussion of HDMI cables: "when something goes wrong, it goes really wrong. It's often said that with an HDMI signal, you either get everything and it's perfect, or it isn't perfect and you get nothing. In fact, I've said this. If you're getting an image that looks correct, and there are no dropouts in the audio or video, then you're getting everything that's being sent. If the cable is faulty, or it's a really long run with an under-built cable, most of the time you'll just get nothing. No picture at all."

Given that I can use this cable from my computer to my TV without issue, I feel about 99% positive that the cable isn't the problem. However, if the new Moome card shares the existing issues I will remove the computer from my rack so that I can try a shorter cable. After eliminating the cable once and for all, I will take a look at shifting the image. I have no idea how to do that, but will figure it out if needed.
Cheers for now.
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Mon Sep 11, 2017 8:49 pm    Post subject: Reply with quote

Interesting timing, as just a couple of days ago I got an email from Jim, the president of Lumagen (the Radiance manufacturer), that included some updated recommendations on HDMI cables. I've now added these to our order page here: http://www.curtpalme.com/Radiance.shtm

See the section called What HDMI cables are recommended?.

Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
Madmanaenewman




Joined: 11 Dec 2006
Posts: 21
Location: Canada


PostLink    Posted: Mon Sep 11, 2017 9:44 pm    Post subject: Reply with quote

I had to read your updated recommendations a few times before they sunk in, and they still leave me with questions. I do recall reading somewhere, most likely in the Radiance manual (I am at work now so cannot check), that cables under 2 meters or 6 feet (I forget which) were not recommended at all (I don't think a reason was given). Your recommendations state that passive cables under 1.8 meters are not recommended, possibly leaving one to wonder if active cables are OK. I am also wondering what this applies to... input, output, or both?

My computer is rack-mounted directly under the Radiance, and a 6-foot cable (which I am currently using just because I had one) is not ideal from a wire-management perspective, so if a short cable is OK for the input that would be preferred. A short summary might be helpful, along the lines of: "Summary (applies to both input and output cables): less than 2 meters never recommended; 2 to 3 meters should be passive; over 3 meters can be passive, up to 15 meters if boosted with an Ethereal HDM-GA1 Gigabit Accelerator, otherwise should be active." It would also be informative to know why short cables are not recommended.

I am also puzzled by the links, as none of them leads to a specific cable boasting 18 GHz. Many, however, do talk about supporting 18 or 18.2 Gbps at 60 Hz, or something to that effect.

I see that my cable boasts only 10.2 Gbps. Do I conclude from this that my cable does not meet the recommended specifications?
kal
Forum Administrator



Joined: 06 Mar 2006
Posts: 17850
Location: Ottawa, Canada

TV/Projector: JVC DLA-NZ7


PostLink    Posted: Mon Sep 11, 2017 9:57 pm    Post subject: Reply with quote

Madmanaenewman wrote:
I had to read your updated recommendations a few times before it sunk in, and it still leaves me with questions. I do recall reading somewhere, most likely the radiance manual (I am at work now so cannot check), that cables under 2 meters or 6 feet (I forget which) were not recommended at all (I don't think the reason was given).

Signal timing/bounce - they're just too short.

Note that this is not just a Radiance thing. I have half a dozen HDMI boxes (splitters, switchers, converters, etc.) behind my screen and couldn't get things to work reliably until I replaced a lot of the really short cables with 10-12' ones. My devices were right next to each other, so years ago when I designed the layout I bought a bunch of short 1-2' cables, but it just didn't work. (Most of the boxes were from Monoprice.)

Quote:
Your recommendations state that passive cables under 1.8 meters are not recommended possibly leaving one to wonder if active cables are OK.

You most likely won't find any active cables that short. Active cables are meant for longer runs.

Quote:
I am also wondering what this applies to...input, output or both?

Both.

Quote:
I am also puzzled by the links as none of them direct to a specific cable boasting 18 Gh.

Products change often. I can't link to specific product pages without having to constantly update links. So I link to searches instead.

Quote:
I see that my cable boasts only 10.2 GBPS. Do I conclude from this that my cable does not meet the recommended specifications?

Correct, but remember that 18 GHz is a Radiance Pro requirement with 18 GHz cards. You don't have a Pro/18 GHz unit. Nothing wrong with using better "certified to 18 GHz" cables on your 'regular' Radiance (1080p) setup.
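For reference, a rough sketch of the arithmetic behind those figures. TMDS puts 10 bits on the wire per 8-bit colour component, across three data channels, so a 1080p60 signal uses only a fraction of what a "10.2 Gbps" class cable is rated for.

Code:
# Rough check of the HDMI bandwidth needed for 1080p60 at 8 bits per colour.
pixel_clock_hz = 148_500_000          # standard CTA-861 1080p60 pixel clock
bits_per_pixel_on_wire = 10 * 3       # 8b/10b TMDS encoding x 3 data channels

needed_gbps = pixel_clock_hz * bits_per_pixel_on_wire / 1e9
print(f"1080p60 needs about {needed_gbps:.2f} Gbps")   # ~4.46 Gbps

# A "10.2 Gbps" (HDMI 1.3/1.4 class) cable therefore has plenty of headroom
# for 1080p60; the 18 Gbps figure only matters for 4K/HDR-class signals.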

Keep in mind too that the text I put on the order page is a direct cut & paste from the manufacturer (it's not my words) - if you have further questions about their recommendations or why they make them you can email them too (support@lumagen.com).

Kal

_________________

Support our site by using our affiliate links. We thank you!
My basement/HT/bar/brewery build 2.0
Madmanaenewman




Joined: 11 Dec 2006
Posts: 21
Location: Canada


PostLink    Posted: Sat Sep 16, 2017 4:24 am    Post subject: Reply with quote

Progress report:
I now have the new Moome card. The good news is that I can finally get a steady image that appears to be of decent quality, but (it took me a while to figure this out) only when watching a video of some sort. When viewing a stationary image the success is mixed at best. The word "appears" is there simply because I don't yet have a screen, I am projecting onto a red wall, and the settings are not fine-tuned.
Regardless, this proves that the used Moome card was a major part of the initial problem, that the new card works fine (when viewing movies etc.), and that the cable is not an issue.
Not being able to properly project a still image, while frustrating, is something I can live with. After all, I would be extremely foolish to use the G90 to project a still image for more than a few seconds anyway. I will use my small TV/monitor to select content and then switch to the G90 to view my selection.
The issue of the Lumagen, however, remains in full force. As soon as I put it into the mix, the best I can get is a picture where random sections of what is supposed to be displayed flash in and out.
I guess I need to know what the settings on the Lumagen should be. I am feeding it 1080p at 60 Hz, which is supposed to work just fine.


Last edited by Madmanaenewman on Sat Sep 16, 2017 5:05 pm; edited 1 time in total
AnalogRocks
Forum Moderator



Joined: 08 Mar 2006
Posts: 26690
Location: Toronto, Ontario, Canada

TV/Projector: Sony 1252Q, AMPRO 4000G


PostLink    Posted: Sat Sep 16, 2017 5:14 am    Post subject: Reply with quote

Your unstable still image may be what's causing the no-sync with the Lumagen.

On the other hand, maybe you can hook the Lumagen up to another TV, get your 1080p settings, and then post them here. Maybe someone can tell you the correct porch settings etc...

_________________
Tech support for nothing

CRT.

HD done right!