05-13-2013 08:00 AM
I have a problem with the clock source on the LX9 MicroBoard.
Measuring the 66.666 MHz clock pin with an oscilloscope, I see the frequency vary between 66.000 and 67.000 MHz.
The clock source is not stable.
I am measuring the signal from U1, component Y2 (pin 9).
Do you have any idea about this?
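For scale, the variation described above can be expressed as a fraction of the nominal frequency. A quick sketch (plain Python, using only the numbers from this post) shows the swing is roughly 1.5% peak to peak, far larger than any crystal tolerance and consistent with deliberate spread-spectrum modulation:

```python
# Values from the measurement above (MHz).
nominal_mhz = 66.666
f_min_mhz, f_max_mhz = 66.0, 67.0

# Peak-to-peak swing as a percentage of the nominal frequency.
spread_pct = 100.0 * (f_max_mhz - f_min_mhz) / nominal_mhz
print(f"peak-to-peak spread: {spread_pct:.2f}% of nominal")
```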
05-13-2013 10:18 PM
To be clear, it is not a "problem" that needs fixing. It was designed that way on purpose. Avnet has no intention of changing this on our default design. If this design doesn't meet your needs, you are welcome to change it. See the following thread:
04-06-2014 06:33 AM
It is a problem... a big problem...
By default you should provide a stable oscillator, not an emulation of a low-cost RC oscillator.
I tried to solve the problem by following your advice and the links you posted, but those links no longer work.
As a result I have spent all day trying to find a way to get a stable quartz oscillator so I can evaluate and test the board.
Can we please get a working solution for this problem? Please consider that it is a real problem for those of us who are just starting out.
I need to check delays and timing. There is so much to learn, and on top of that I have to deal with an oscillator that does not allow me to measure any timing properly.
A spread-spectrum clock should not be the default; it causes many problems for beginner applications.
04-07-2014 11:10 PM
I think there may be some confusion in the terminology being used here, and I would like to expand on it so that other community members can better understand the design principles behind the LX9 MicroBoard. Where you refer to this clock behavior as a big problem, perhaps you mean that it is an inconvenience to the way you have approached your design? Coming from the software world, I have struggled myself to understand the 'why' of hardware implementation around FPGA design, where a lot of things seem counter-intuitive at first look. However, one of the biggest things that has helped me understand the methodology behind good design practice is developing an appreciation for the complexity of designs from a system-level perspective. I have worked on several real-world systems where a spread-spectrum clock is used as the system clock for an FPGA design. In talking to senior hardware design engineers about the 'why' of this implementation, I have found that the reason comes from the necessity for real-world products to pass radiated emissions testing, which you would find under CE requirements or (here in the USA) FCC rules. Take a look here for an example of some of these testing requirements:
Spread-spectrum clocking devices help 'soften' the primary and harmonic frequency peaks of electromagnetic energy (as seen on a spectrum analyzer) radiated from a PCB, thus increasing its degree of electromagnetic compatibility (EMC). Softening these frequency peaks can mean the difference between passing one of these certification bodies (and shipping a product) and months of design work to suppress the emissions through other methods.
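The 'softening' effect can be demonstrated numerically. The sketch below (plain Python; the sample rate, tone, and modulation figures are illustrative assumptions scaled down from the real clock, not CDCE913 values) compares the spectral energy concentrated at the nominal frequency for a fixed tone versus the same tone frequency-modulated the way a spread-spectrum clock generator modulates its output:

```python
import math

fs = 1_000_000        # sample rate, Hz (illustrative)
n = 100_000           # 0.1 s of samples
f0 = 100_000.0        # nominal "clock" tone, Hz (scaled-down stand-in)
f_mod = 100.0         # spread-spectrum modulation rate, Hz
dev = 1_000.0         # +/-1% peak frequency deviation, Hz

def sample(k, spread):
    t = k / fs
    if spread:
        # FM phase 2*pi*f0*t - (dev/f_mod)*cos(2*pi*f_mod*t), i.e. the
        # instantaneous frequency sweeps as f0 + dev*sin(2*pi*f_mod*t).
        return math.sin(2*math.pi*f0*t - (dev/f_mod)*math.cos(2*math.pi*f_mod*t))
    return math.sin(2*math.pi*f0*t)

def mag_at(f, spread):
    # Single-bin DFT: normalized spectral magnitude at frequency f.
    re = im = 0.0
    for k in range(n):
        x = sample(k, spread)
        re += x * math.cos(2*math.pi*f*k/fs)
        im -= x * math.sin(2*math.pi*f*k/fs)
    return math.hypot(re, im) / n

peak_fixed = mag_at(f0, spread=False)   # all energy piled at one frequency
peak_spread = mag_at(f0, spread=True)   # energy smeared across sidebands
print(f"energy at the nominal frequency drops by "
      f"{20*math.log10(peak_fixed/peak_spread):.1f} dB with spreading")
```

The energy has not disappeared; it has been redistributed into sidebands spaced at the modulation rate, which is exactly why the peaks seen by a spectrum analyzer (and by an EMC test receiver) come down.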
I myself have made the mistake of starting new FPGA designs by clocking my logic directly from the clock source supplied on a board. I have also found myself frustrated with buggy programmable logic in those designs, where state machines frequently end up in invalid states, logic sections are never reached, and other problems appear that are very difficult to track down. If you are doing something simple, perhaps the design can get by using the clock source in this manner, but any moderately complex design can easily be affected. One thing I have learned from more senior design engineers is that it is good design practice to take advantage of the clock management primitives within the FPGA to avoid those hard-to-track-down clocking problems. In the Spartan-6 architecture, you can easily implement a Digital Clock Manager (DCM) block to recover the clock input and clean up missing pulses, phase shift, and other jitter problems.
Here is a piece of VHDL code from one of my designs which implements a DCM_SP primitive and a clock buffer BUFG primitive to help recover the system clock input:
DCM_SP_inst : DCM_SP
generic map (
  CLKIN_PERIOD => 15.0 -- Input clock period specified in ns
)
port map (
  CLK0   => clk_system_bufg, -- 1-bit output: 0 degree clock output
  LOCKED => locked,          -- 1-bit output: DCM_SP lock output
  CLKFB  => clk_system,      -- 1-bit input: Clock feedback input
  CLKIN  => CLK_66_7MHZ,     -- 1-bit input: Clock input
  RST    => USER_RESET       -- 1-bit input: Active high reset input
);

BUFG_inst : BUFG
port map (
  O => clk_system,     -- 1-bit output: Clock buffer output
  I => clk_system_bufg -- 1-bit input: Clock buffer input
);

reset <= not(locked); -- Generate a system reset that is held while the DCM is not locked
Here is the Verilog equivalent of the same code:

// DCM instantiation for the system clock.
DCM_SP #(
  .CLKIN_PERIOD (15.0)       // Input clock period specified in ns
) DCM_SP_inst (
  .CLK0   (clk_system_bufg), // 1-bit output: 0 degree clock output
  .LOCKED (locked),          // 1-bit output: DCM_SP lock output
  .CLKFB  (clk_system),      // 1-bit input: Clock feedback input
  .CLKIN  (CLK_66_7MHZ),     // 1-bit input: Clock input
  .RST    (USER_RESET)       // 1-bit input: Active high reset input
);

// BUFG instantiation for the system clock.
BUFG BUFG_inst (
  .O (clk_system),     // 1-bit output: Clock buffer output
  .I (clk_system_bufg) // 1-bit input: Clock buffer input
);

assign reset = ~locked; // Generate a system reset that is held while the DCM is not locked
And here are the related constraints from my UCF:
# Clock and reset
NET "CLK_66_7MHZ" LOC = K15 | IOSTANDARD = LVCMOS33; # "CLOCK_Y2"
NET "CLK_66_7MHZ" TNM_NET = "TNM667";
TIMESPEC "TS_TNM667" = PERIOD "TNM667" 15 ns;
NET "USER_RESET" LOC = V4 | IOSTANDARD = LVCMOS33 | PULLDOWN; # "USER_RESET"
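A side note on the TIMESPEC value: the 15 ns period is simply the reciprocal of the nominal 66.7 MHz clock, as this trivial check (plain Python) shows:

```python
# Convert the nominal clock frequency to the PERIOD value used in the UCF.
freq_mhz = 66.7
period_ns = 1000.0 / freq_mhz  # 1 MHz <-> 1000 ns
print(f"PERIOD constraint for {freq_mhz} MHz: {period_ns:.2f} ns")
```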
I recommend trying this in your design.
If you insist upon using a more stable clock source for your LX9 MicroBoard design, please take a look at this other thread:
There are some bitstreams supplied further down in that thread which will allow the clocking device to be programmed to operate with spread spectrum mode disabled. Also, as JBethurem mentions in the other thread, there is an unpopulated oscillator footprint on the LX9 MicroBoard. So you could put down a fixed oscillator if that would better meet your design needs.
Another thing that I have found in my career is that there seems to be a big difference between what is taught in academic circles versus what is practiced in the real world by senior FPGA engineers who do this sort of thing for a living. There are FPGA design practices which I was taught in school which will synthesize and work well for simulation but are very difficult to implement properly onto an actual device. There is a whole section of the Xilinx community forums which contains a wealth of information regarding these practical good design practices:
The LX9 MicroBoard does pass CE testing and is an excellent reference design to use as a starting point to accelerate your own Spartan-6 PCB design. For Avnet to change the default programming of the clock source on the LX9 MicroBoard going forward would simply confuse the matter further, because there would then be one set of MicroBoards with one default behavior and another set with a different default behavior.
11-14-2014 03:01 AM
This is a problem. The reason it is a problem is that you do not disclose anywhere in your documentation that you are using spread-spectrum clocking. You simply define your clocks as 40 MHz, 66.7 MHz, and 100 MHz, and those figures are factually incorrect.
You are supposed to be delivering an open platform, presumably one that is interoperable with other platforms, yet you do not provide the MOST important aspect of FPGA design: consistent clocking!
The board should be delivered with spread spectrum off, or at least with proper documentation of the clocking system and a how-to covering how to turn spread spectrum off.
Please excuse me for digging up an old thread.
03-30-2015 08:33 AM
I agree with the complaints of others and struggled with the same problem for many precious hours, until I started searching for the answer here after noticing the spread-spectrum feature mentioned for the TI CDCE913 PLL on the small leaflet that comes with the EVM.
I realize very well what the regulatory advantages of spread-spectrum clocking are, but I could not imagine that a clock not specified as such would be spread spectrum by default. This EVM is used as a prototyping platform, typically not for products offered for sale, so a much better starting point would be without spread spectrum, where you can see signals clearly on an oscilloscope. Furthermore, such prototypes typically do not need to comply with FCC rules (and I would argue that FCC rules would often be met even without spread spectrum). In my case, the clock system is tied to the generation of a carrier for a communication system (and to symbol timing/clocking), where this spreading could not be tolerated.
As suggested by others before me: spread spectrum should be turned off by default on this platform, or there should at least be clear warnings about it that one cannot miss, along with clear instructions on how it can be disabled in the TI PLL part (and not merely filtered/suppressed by the PLL in the Xilinx device).
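To put the impact on a communication carrier in perspective: the deviation reported earlier in the thread amounts to thousands of ppm, orders of magnitude beyond an ordinary crystal. A quick sketch (plain Python; the 50 ppm crystal figure is a typical assumed value for comparison, not a spec of this board):

```python
# Spread-spectrum deviation expressed in ppm, using the +/-0.5 MHz
# variation reported earlier in the thread.
nominal_hz = 66.666e6
deviation_hz = 0.5e6
ss_ppm = 1e6 * deviation_hz / nominal_hz

crystal_ppm = 50.0  # assumed typical crystal tolerance, for comparison only
print(f"spread-spectrum deviation: ~{ss_ppm:.0f} ppm "
      f"(vs ~{crystal_ppm:.0f} ppm for a typical crystal)")
```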
04-24-2015 04:25 PM
A full reference design to Read and Program the CDCE913 is now available.