Is Verilog's Non-determinism Really a Problem?

A series of blog posts by Jan Decaluwe criticizes Verilog for being "non-deterministic" and therefore fundamentally broken.

One of those posts has a code example that illustrates the problem. During a lively Twitter conversation about it, I fleshed out the example and put it on EDA Playground so we could all view it and, even better, run it with all the simulators that EDA Playground provides and see what actually happens. Click here to see and run the example code yourself.

If you run the example you'll see that it behaves the same on all the simulators available on EDA Playground except one. All of them are nevertheless compliant, because the Verilog specification allows either behavior; Jan expertly explains in his post how that can be.

Despite these clear examples and explanations, I'm left with the feeling of, why should I care? Apparently a lot of other users of the Verilog language have the same feeling. I tried to see Jan's point and asked some honest questions in comments and on Twitter, and I learned some more. My first question was: if we were to synthesize his example Verilog code, what kind of hardware would we get? Wouldn't it be non-deterministic in exactly the same way as the Verilog code? And therefore wouldn't the non-deterministic Verilog be an accurate model rather than a sign of Verilog's brokenness? The tweets I got in reply agreed that nobody would design hardware like this, so that's a non-issue.

In other words, if you are writing your Verilog in normal RTL style, the non-determinism is not a problem. When writing Verilog that will not be synthesized (simulation-only code), people rightfully abandon the restrictions of RTL. As Chris pointed out, they can then fall into the trap of writing code like this example. I believe that's true, so let's look more closely at the example (sketched below) and see what's going on.
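
Here is a minimal sketch of the kind of code we're talking about. This is my reconstruction from the description in the next paragraph, not Jan's exact EDA Playground code; the names result and ready are mine.

    module race_example;
      integer result;
      reg ready = 0;

      // Process 1: produce a value, then signal that it is ready.
      initial begin
        result = 5;
        ready  = 1;
      end

      // Process 2: wait for ready to rise, then read the value.
      // If the simulator runs process 1 to completion first, the
      // posedge has already happened and this block waits forever.
      initial begin
        @(posedge ready);
        $display("result = %0d", result);
      end
    endmodule

The standard does not say which initial block starts first at time zero, so a simulator is free to print the value or to hang forever, and either behavior is compliant.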

Each initial block is a process executing concurrently. The first process assigns a value to result and signals the second process that result can be read by assigning a value of one to ready. The second process blocks until the value of ready changes from zero to one, then immediately reads the value of result. Now, I have extensive experience writing embedded C code with multiple threads and processes. In that world you would be insane to synchronize two threads using simple shared variables like this. That's because, as in Verilog, you can't predict when your two threads will be scheduled and run by the OS, when interrupts will occur, and so forth. Instead, you would use a synchronization construct provided by the operating system, such as a semaphore or mailbox. So again I ask: why do we care about this Verilog non-determinism? Isn't it just the same as in other software environments?

I think the answer to my own question might be: no, it's not the same in plain Verilog. Sure, SystemVerilog added semaphores and mailboxes (for just this reason, I assume), but plain Verilog does not have those. Shared variables are the only way to synchronize and share information between processes (really?). If I'm not wrong, then that is indeed a problem for those who want to write Verilog code at a higher level of abstraction than RTL. In fact, I'm starting to wonder about the body of verification code my team has written at work. Do we have any cases of code like this that could suffer from Verilog's non-deterministic behavior? We are using SystemVerilog and the UVM, whose TLM interfaces give you safe ways to communicate between processes, so probably not, but I can imagine where someone could be tempted to work outside the nice safe structure of the UVM.
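
To show what I mean about SystemVerilog's constructs, here is a minimal sketch (my own example, assuming a SystemVerilog simulator) of the same handshake written with a mailbox. The blocking get() removes the race:

    module safe_example;
      mailbox #(int) mbx = new();

      // Producer: put the value; no separate ready flag is needed.
      initial begin
        mbx.put(5);
      end

      // Consumer: get() blocks until a value is available, regardless
      // of which initial block the simulator happens to run first.
      initial begin
        int result;
        mbx.get(result);
        $display("result = %0d", result);
      end
    endmodule

Because put() and get() are atomic and get() blocks until data exists, the outcome no longer depends on scheduling order.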

I'm hopeful that others will read this and chime in with any needed clarifications, corrections, and help. I have some ideas for modifying the example code to make it safer that I will explore in a separate blog entry. Stay tuned.

UPDATE: I have written the follow-on post that shows the fix for this particular code example.

Comments

Unknown said…
Great analysis. Thank you for posting.
Bryan said…
Jan, there is no tool or programming environment without pitfalls and traps. Not even our shared favorite (if I assume correctly), Python. Good engineers learn the strengths and weaknesses of the various tools and make decisions about which to use based on those, and then carefully keep the weaknesses in mind as they work with their chosen tool.

I get it that you have chosen not to use Verilog or SystemVerilog. I applaud you for attempting to warn others of the shortcomings of Verilog, but I feel the need to offer you some advice on how you warn others. Vague and ominous statements such as "SystemVerilog introduced additional sources of nondeterminism" and "with Verilog you can never be sure" are not very helpful. They are what is called F.U.D.: Fear, Uncertainty, and Doubt. Concrete explanations, like your VHDL's crown jewel post (http://www.sigasi.com/content/vhdls-crown-jewel), are much better.

You hope that people won't try to "fix" the example code (scare quotes yours), but I hope they do. As I explained, non-determinism among concurrent processes is nothing new and there are ways to deal with it. For better or for worse, most of us in this industry are using Verilog or SystemVerilog for one reason or another, and simply switching to VHDL or MyHDL is not a viable option. We need to understand the pitfalls of our current tool and how to deal with them.
Jan Decaluwe said…
My example of an innocent event-driven model that behaves unexpectedly differently on a relevant simulator proves that nondeterminism is a real issue. So does my reference to the Chronologic experience, when early adopters had to rewrite their RTL.

Therefore, "with Verilog you can never be sure", is a fact. Not all language flaws are equal, and non-trivial nondeterminism is a fundamental flaw in an HDL.

As you acknowledged, I try hard to be complete and accurate in blog posts. Expecting the same standard in a comment section and immediately calling it FUD is unreasonable. It would be reasonable to look at my track record, give me some credit and assume that I don't make statements in vain.

The only speculative statement here is your assumption that "I have chosen not to use Verilog or SystemVerilog." You have no basis for that, and furthermore it is completely wrong. For the record, I am Verilog's biggest fan when compared to the latest arrivals of braindead concurrency-only HDLs.

So you feel the need to criticize me on how I say something, regardless of how relevant it is. Within the bounds of respect, and among engineers, I find that ridiculous. But I respect your right to set the house style here, so I will find a better place for my material.
Bryan said…
Jan, I'm sorry. The F.U.D. comment was rude. The thing is, I know you aren't some know-nothing crank. If I thought that I wouldn't be engaging you in conversation at all :-) It's just confusing and frustrating when in one breath you say RTL Verilog is safe and in another you say "you can never be sure with Verilog" (emphasis mine). To me, never means never, not even RTL. In my frustration I got rude with you. Forgive me. You are generous to spend as much time as you do explaining these things to those of us who still have a thing or two to learn :-)

I also hope you can excuse my assumption that you wouldn't use a language that you assert has a fundamental flaw, especially given your work with VHDL and MyHDL. It only seemed logical to me.

Jan Decaluwe said…
Ok.

For an accurate account of my thoughts on the subject, I refer to my blog posts.

I will try to remember that trying to improve on that in comment sections only tends to weaken the original content.

I also hope that people will run their regression suites on various (System)Verilog simulators, and report findings. Much more interesting and to the point than lengthy discussions.
