Light-O-Rama Forums

Jerky fades only when running a show


k6ccc


My first controllers arrived a few days ago, and I am doing some testing on the bench. All of the lighting circuits in my test exhibit a jerkiness that I describe in more detail below - but only when running a show. When I test the sequence from the Sequence Editor, the lights fade correctly.

Hardware setup: I am testing a single CMB16D-QC DC controller with 3 lighting circuits currently hooked up. Channel 2 is a string of 3 Christmas light LEDs from a Walmart string that I butchered to extract just 3 LEDs, with a 330 ohm resistor added to bring the current to about 25 mA. Channel 6 is a Malibu light (incandescent), and channel 7 is a Home Depot special 3 W 12 V LED spotlight. All three lighting circuits have less than 10 feet of wire between the controller and the lights. The controller is powered from an Astron 12 V DC power supply rated for 9 A continuous and 12 A intermittent.
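As a side note for anyone sizing a similar dropping resistor: the current through a series LED string is set by Ohm's law across the resistor. Here is a minimal sketch; the 12 V supply, 3 LEDs, and 330 ohm values come from the description above, but the per-LED forward voltage is an assumption (roughly 2 V for red/yellow LEDs, closer to 3 V for white/blue), so the actual current depends on the LEDs and how they are wired:

```python
def led_current_ma(supply_v, n_leds, vf_per_led, resistor_ohms):
    """Approximate current (mA) through a series LED string with one resistor.

    Assumes all LEDs are in series and that vf_per_led is the forward
    drop per LED (an assumption that varies by LED color and part).
    """
    return (supply_v - n_leds * vf_per_led) / resistor_ohms * 1000.0

# Values from the post: 12 V supply, 3 LEDs, 330 ohms.
# With an assumed 2 V forward drop per LED:
print(round(led_current_ma(12.0, 3, 2.0, 330), 1))  # ~18 mA
```

With 3 V white LEDs the same formula gives closer to 9 mA, so hitting exactly 25 mA suggests a different forward drop or wiring than assumed here; treat the sketch as a ballpark check, not a reproduction of the original circuit.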
The computer is a high-end Dell desktop running nothing but the LOR software and, currently, a web browser to type this message. I have a single LOR USB485 interface using the supplied cable, and a 6 foot Cat-5 network cable between the USB485 and the controller. The USB cable is plugged directly into a USB port on the computer (not via a USB hub).

Software: I am using S3 version 3.02 with an advanced license.

The sequence I'm running has a total of 7 channels defined and consists of a 1 minute series of 1 second fade ups, 1 second fade downs, some 4 second fade ups and 4 second fade downs, a few seconds of shimmer and twinkle, and some 1 second on / off / on / off sequences. Really basic.

When I run the sequence from the Sequence Editor, the lights do exactly what I expect them to do. However, I created a show containing just that one sequence and set a schedule for it to run for a while. While the show is running, a fade up starts from 0%, rises to maybe 50%, drops back to about 20%, and then completes the fade to 100%. A fade down does the opposite. The percentages are not necessarily the same every time, but the pattern is. All three lighting circuits are doing the same thing.

I have read quite a few forum topics about jerky fades with LEDs, but I really don't think that is my problem. First, this is a DC controller, not an AC controller feeding LEDs through rectifiers. Second, the incandescent Malibu light is doing the same thing. Third, and most important, the fades work perfectly when controlled from the Sequence Editor.

Anyone got any ideas?


Solved! I figured out my own problem - amazing what a few hours of sleep will accomplish. Last night when I discovered this "problem", I had never even looked at the Show Editor before, and when I set up my test show, I managed to leave the test sequence in two places in the show - once under Background and once under Animation. That left the same sequence running twice at almost exactly the same time. Had I been more awake, I might have noticed things going slightly wrong with "on" or "off" commands too, but I didn't; the fades were the only symptom I noticed. I came up with the solution after a decent night's sleep, just tried it, and it's working perfectly.
:]
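For anyone curious why a doubled sequence produces exactly this symptom: if two copies of the same fade run slightly out of step and the channel simply takes whichever intensity command arrived last, the apparent brightness repeatedly falls back before climbing again. This is only a toy illustration of that race (it is not the LOR protocol; the tick counts and linear fade are made up for the example):

```python
def fade_up(start_tick, duration_ticks):
    """Yield (tick, intensity%) commands for a linear 0-100% fade up."""
    for i in range(duration_ticks + 1):
        yield (start_tick + i, round(100 * i / duration_ticks))

def channel_history(*streams):
    """Apply command streams in time order; the last command at any
    moment wins, as a single dimmer channel would behave."""
    commands = sorted((tick, pct) for s in streams for tick, pct in s)
    return [pct for _, pct in commands]

# One copy of the fade rises smoothly (monotonically)...
alone = channel_history(fade_up(0, 10))
assert alone == sorted(alone)

# ...but two copies offset by a few ticks fight over the channel,
# and the intensity jumps backward mid-fade: the "jerky fade" symptom.
doubled = channel_history(fade_up(0, 10), fade_up(3, 10))
assert doubled != sorted(doubled)
print(doubled)
```

Removing the duplicate copy from the show, as described above, removes the second command stream and the fades become smooth again.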

