Thursday, December 10, 2015

Linux Environment Management

Linux Environment Considerations

The Linux environments that ASIC and SoC (chip) design teams use are often messy and confusing. When team members work on multiple ASIC projects that each require different sets of tools, the problem is even worse. Every day that engineers spend fighting the environment, the development of our chips slows down little by little. This doesn't need to be the case. This post explains:
  • What a Linux Environment is
  • Why it's important, especially for ASIC projects
  • Techniques to configure and manage the environment
What is an Environment?
In Linux, every process runs with a set of environment variables available to it. This set of environment variables is often referred to simply as the environment. Here are some examples of how programs use the environment:

  • The command-line shell uses the PATH environment variable to find the programs you ask it to run
  • Programs use LD_LIBRARY_PATH to find compiled libraries that they rely on
  • ASIC design and verification tools use the LM_LICENSE_FILE environment variable to determine how to contact their required license servers.
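
Since every process inherits its parent's environment, you can inspect these variables from any shell, and a child process started from that shell sees the same values. A quick sketch:

```shell
# Print the directories the shell searches for programs
echo "$PATH"

# Dump the entire environment of the current shell
env | sort

# A child process inherits the environment, so it sees the same PATH
sh -c 'echo "$PATH"'
```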

For most Linux users the environment isn't much of a concern. When they log in it gets configured by shell initialization files for the common programs and libraries that they use, and they are good to go. We ASIC engineers are much more demanding of our environment. We generally use a wide array of software tools that are not included in our Linux distribution. We also keep multiple versions of each of those tools installed so we can try new versions out and revert back to old versions when needed. Our environment needs to be configured for each of these tools and reconfigured whenever we want to switch which version of a tool we are using. Making matters worse, most of these ASIC tools require more than just the PATH and LM_LICENSE_FILE environment variables; they have a wide assortment of other variables they expect to be set in your environment for proper operation.
Managing The Environment
There are several ways to manage your Linux shell environment. Let's take a look at them.
Default Shell Initialization Files
This was mentioned in the introduction. With this technique you simply put all configuration in the shell initialization files (~/.profile, ~/.bashrc, ~/.cshrc, etc.).

  • This is the standard way of managing your environment in Linux
  • Simple, easy to understand for everyone
  • You can use standard shell commands to inspect your environment: env, echo $VARIABLE_NAME, etc.
  • Only supports one version of any given tool. To use a different version of a tool you have to edit your shell initialization files, then start a new shell for them to take effect.
  • Everyone has their own initialization files, which makes it hard to ensure everyone is using the same environment
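
As a concrete sketch, configuring a single simulator version this way might look like the following lines in ~/.bashrc (the install path, version, and license server below are illustrative, not from any real tool):

```shell
# Hypothetical ~/.bashrc additions for one fixed simulator version.
# Switching versions means editing these lines and starting a new shell.
export SIM_HOME="$HOME/tools/simulator/2015.06"      # illustrative install path
export PATH="$SIM_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$SIM_HOME/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LM_LICENSE_FILE="1717@license-server"         # illustrative license server
```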
Explicit Environment Files
Another approach is to have explicit environment configuration files. When you log in or start a new terminal session a minimal environment will be configured by the normal shell initialization files, and then you issue a command to configure that shell instance with the desired environment. The environment configuration files can be simple shell-syntax files (and the command as simple as . environment-init or source environment-init). Alternatively, there is an open source tool named Environment Modules that teams often use for this.

  • Environment configuration files can be centralized so there is one file that everyone uses
  • You can maintain multiple configuration files as needed: one per tool version, one per project, one per engineering role, etc.
  • Project leads and/or tool administrators can easily create and maintain the configuration files so individual engineers don't have to
  • You can use standard shell commands to inspect your environment: env, echo $VARIABLE_NAME, etc.
  • Once an environment configuration is loaded it's difficult to unload (you need to start a fresh shell to be sure)1
  • If you use Environment Modules for this, environment configuration files have to be written in Tcl2
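
For the simple shell-syntax variant, an environment configuration file and the command to load it might look like this (the file location and variable values are made up for illustration):

```shell
# A hypothetical centralized environment file, written in plain sh syntax.
cat > /tmp/projectA-env <<'EOF'
export PROJECT=projectA
export SIM_VERSION=2015.06                            # illustrative version
export PATH="/tools/simulator/$SIM_VERSION/bin:$PATH"
EOF

# Load it into the current shell ("." is the portable form of "source")
. /tmp/projectA-env
echo "$PROJECT"    # this shell is now configured for projectA
```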
Per-command Environment Files
Another approach is to prefix every command that needs a special environment with a command that spawns a subshell, sets up the necessary environment, and then runs the intended command. It looks sort of like this:
envA simulation-command
This way your interactive shell is never polluted with project- or tool-specific settings (just the subshell is) and you can easily switch to a different environment on a per-command basis:
envA simulation-command
cd <another-project-area>
envB synthesis-command
  • All the same benefits of Explicit Environment Files mentioned above
  • Easy to use different environment configurations, even on a per-command basis (you don't have to start a fresh shell for each new environment)
  • Sometimes hard to remember to prefix every command
  • If you don't often use different environments the prefix feels like unnecessary awkwardness
  • Inspecting the per-command shell environment is not as simple as typing echo $VARIABLE_NAME or env; you have to do something like envA sh -c 'echo $PATH' or envA env
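
A minimal sketch of what an envA-style wrapper could look like, assuming it only needs to export a few variables (everything here is illustrative; a real wrapper would likely source a project's full environment file):

```shell
# Hypothetical per-command environment wrapper.
cat > /tmp/envA <<'EOF'
#!/bin/sh
# Configure projectA's environment, then replace this subshell with
# the requested command; the caller's shell is left untouched.
export PROJECT=projectA
export PATH="/tools/simulator/2015.06/bin:$PATH"   # illustrative path
exec "$@"
EOF
chmod +x /tmp/envA

/tmp/envA sh -c 'echo "$PROJECT"'   # the command sees projectA's environment
echo "${PROJECT:-unset}"            # the interactive shell stays clean
```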
Smart Environment Manager Tool
This is basically the same as using Explicit Environment Files above, but instead of a simple source command or Environment Modules, you can use a tool that has the ability to load an environment and to safely and completely undo (unload) an environment configuration when you want to switch from one environment to another. An open source tool that does this is named albion (full disclosure: I wrote albion). Using it looks like this:
albion env projectA
albion env projectB
  • All the same benefits of Explicit Environment Files mentioned above
  • Easy to use different environment configurations and switch between them
  • Environment configuration files use sh syntax, not Tcl
  • You can't switch environments in a single command like Per-command Environment Files allows you to, but a future version of albion could support this
  • albion is still somewhat new and might need a little work or customization to fit your specific needs
A messy Linux environment can confuse engineers and slow down a project. With some thought and the use of a good tool, the Linux environment can be tamed. A tamed environment will make your engineers happier, and your project will go more smoothly and quickly.


1. Environment Modules claims it can cleanly undo (unload, in their terminology) an environment by simply inverting every command in the modulefile (e.g., setting a variable becomes unsetting the variable). If someone has removed or changed a command in the modulefile, or deleted it altogether, in the time after you loaded it, this technique obviously does not work.

2. If this seems OK to you, consider that most ASIC tools provide you with an environment configuration file in csh or sh syntax that you will then have to translate into Tcl, for every version of each tool you install.

Tuesday, April 14, 2015

Mercurial Offers More Choice Than Git

I know, I know, this is the emacs vs. vi debate of our age, never to be settled and pointless to continue arguing.  I'm going to write just a little more about mercurial as compared to git (or git as compared to mercurial, whichever is less inflammatory to say) because today I realized something and I need to write about it.

I've been using mercurial daily at work for a good 5 years now, and not much git. I recently had cause to use git a little more and I realized something. The common belief is that git is Freedom and mercurial is tightly constrained Bondage. I think that is mostly based on a very outdated understanding of how mercurial works and what you can do with it. Today it has all the same commit and history editing functionality as git, with record, amend, graft, rebase, histedit, and so forth built in. Mercurial is not missing flexibility and freedom-enhancing functionality that git has, as far as I know. The thing I realized is that today's mercurial actually has more freedom and flexibility than git. Using git I felt like I was being constrained to work the way Linus does. For example, I really don't feel the need for the index, especially when you are able to amend commits or do all sorts of other editing of commits with rebase; yet there's really no way around it: with git you have to use the index. With mercurial you can choose to have index-like functionality, or not. With git you have one way to keep track of branches, using the reference-like things that git calls branches. These come with all the complexity and confusion of remote-tracking branches and local branches and fast-forward merges (which are not really merges) and so forth. With mercurial you have three choices for keeping track of branches, a couple of which are much simpler and easier to use than git branches. I like having these options.

I guess I can't think of anything else right now, both tools are very similar in functionality and both offer far, far more freedom and power than any other version control tool that I know of.  I'm sure git users will comment and tell us where git is more flexible than mercurial.

Monday, February 23, 2015

Why Open Source Has Not Taken Over EDA

We all know that in the world of general software, Open Source has all but won. Linux is everywhere. Nearly all programming languages are Open Source (meaning their compilers/interpreters). IDEs, build tools, revision control, syntax highlighters, refactoring tools, you name it: if it's a development tool, it's Open Source. Major infrastructure is Open Source as well. I mentioned the Linux operating system, but all the major applications are Open Source too: web servers, databases, queueing systems, messaging systems, load balancers, caching systems, GUI frameworks, encryption, authentication, email servers, instant messaging servers, blog engines, you name it.

In the world of Electronic Design Automation (EDA), Open Source has not won. We do now develop on Linux and we use a lot of ancillary tools from the software world that are Open Source such as scripting languages, text editors, databases, web frameworks, build systems, and so forth, but our core tools are still very much closed source and commercial, namely simulation and synthesis tools.

Why haven't those tools followed the same trend that the general software world has seen? Why don't we have a solid Open Source simulator that everyone in the industry rallies around, similar to the way gcc is the de facto standard C compiler? Every time I have posed this question there are a few answers that always come up. One is that our industry is small. There are far more software developers, the argument goes, and so the likelihood of a leader like Linus Torvalds or Richard Stallman emerging is small. Another answer is that we aren't software developers. Our primary skill is designing circuits, not software; therefore we just don't have the skills or drive to write our own tools like software developers have done.

While the logic in those answers seems sound, I don't believe they are the primary reason we still lack quality Open Source tools. There have been hobbyists like Linus and RMS who have started Open Source EDA tool projects that are probably at least as functional and easy to use as early versions of Linux or gcc were, but people just haven't flocked to them. Yes, the number of EDA people capable of writing a synthesis tool is small, but how many software developers really have the chops to write operating systems or compilers? Also not very many. There are far more consumers than producers of general Open Source software. I don't believe that's the reason there aren't any successful Open Source EDA tools.

I believe the real reason we don't have Open Source tools is that we haven't had an oppressive monopoly in EDA like Microsoft was in its day. Microsoft was so good at locking people in to their proprietary tools and crushing competition that the only answer was to not play their game. You had two choices: go the Microsoft route or go Open Source. We in the EDA world, on the other hand, actually have a (somewhat) functioning capitalist software economy. There are three (three! not just two, like Coke and Pepsi) big EDA companies that sell competing software (these companies are lovingly referred to as "the big three"). Unlike the days of Microsoft, these companies actually have to listen to their customers[1] and compete on the merits and price of their products[2]. Since we have been able to play the big three against each other and generally get at least the bare minimum of what we need, there hasn't been a strong need for a Linux- or gcc-like project to set us free.

Hooray for capitalism! Yet somehow I'm not satisfied. I am very pro-capitalism in general, don't get me wrong. I think my dissatisfaction comes from the fact that I am even more pro-freedom, and EDA tools are not Free. They are not Open Source. I'm not free to dig into the code if I desire to. I'm not even free to use the tools as I choose. They are offered under onerous license agreements that don't even allow us to publish performance numbers or talk about their costs publicly[3]. When they don't perform or I encounter bugs, my only recourse is to beg for mercy from the vendor or embark on the onerous task of switching vendors (just because it's possible doesn't mean it's easy). If the tools were Open Source, there would be mailing lists and forums where not just the vendor's customer support people participated; other tool users and the tool developers themselves would be available to collaborate with on solutions.

This would sound like an outrageous and silly utopian dream if it weren't already working in the software world. It's not just young single dudes living in their parents' basement or slumming it in graduate cubicles at MIT anymore, either. Red Hat Software is a publicly traded company making an 11% profit margin on about $1.8 billion in revenue, which puts it right in among the Big Three. There are plenty of other businesses that make real profits doing various combinations of producing, supporting, and providing services based on Open Source software. It can be done. In fact, I'm pretty sure that if one of the Big Three released just their simulator as Open Source, they would quickly grab all of the market share for simulators and a whole lot of new customers for their other tools. They would also make the world a much better place for all of EDA.


1. Well, at least their biggest customers.

2. Well, not publicly, because their license agreements don't allow us to publish their tools' merits or prices.

3. See footnote above.

Friday, February 20, 2015

Bitcoin vs. Credit Cards

Stripe announced bitcoin support. There has been much discussion on Hacker News about it. Lots of people are not seeing the benefits of bitcoin over credit cards for individual shoppers. They talk about the fraud protection that credit cards provide them as customers, and the 1% or whatever cash back they get on each purchase from the credit card companies, that they would be giving up. I don't think they fully understand the trade-offs here.

First of all, the cash back. Credit card companies charge merchants a percentage of each transaction larger than the 1% or whatever they are giving you back (how else do they stay in business?). What people may be forgetting is that retailers are not just eating the percentage the credit card companies charge them. They are most definitely passing that cost on to all of us consumers in the form of higher prices. If enough transactions happened with bitcoin, prices would come down for all of us. Merchants could also offer a cash/bitcoin discount.

Next is the fraud protection. There are two kinds of fraud protection that the credit card companies provide. The first kind is if someone steals my credit card and starts making charges that I didn't authorize. The credit card companies protect me from the fraudulent charges the thief makes. Hopefully we all realize that if it weren't for the credit card in the first place, this kind of fraud wouldn't be possible. If Target, for example, hadn't had a bunch of their customers' credit card information stored on their servers when they got hacked, their customers wouldn't have needed this kind of protection.

The thing to understand here is that paying with a credit card is a pull operation and paying with bitcoin is a push operation.  To pay a merchant I have to give them my credit card information and they pull money from my card.  Anyone that has that information can pull money from my card, making this kind of fraud easy.  With bitcoin a merchant gives me their bitcoin address and I send (push) bitcoin to them.  I don't have to fork over any sensitive information. The merchant is safe too, by the way, because that bitcoin address they gave me cannot be used to withdraw funds from them, only to give them funds.

Another cost of this kind of fraud protection that I discovered recently is that once it is discovered that someone has stolen your credit card information you lose the use of that credit card until a new one can be issued.  That can take up to two weeks.  Over this last Christmas both of my credit cards were cancelled due to fraud.  I was leaving on a trip soon and didn't have a credit card to take with me.  I really didn't like the idea of carrying a bunch of cash around on a trip instead of a credit card (fortunately I still had a debit card).

The other kind of fraud protection that credit cards provide is chargebacks. If someone doesn't give me the thing I thought I had paid for with my credit card, I can essentially take my money back by asking the credit card company for a chargeback. You can't do this with bitcoin because it works like cash. Once the bitcoin leaves your hand (so to speak) you don't have any control over it anymore. The thing to keep in mind, again, is that this fraud protection does not come for free. Those fees I already talked about are partly there to pay for this[1]. Now think about it: how often are you required to resort to a chargeback? I personally have never needed to. I have had disputes with merchants where I thought I might need to, but I have always been able to resolve the problem by talking things out with them. Merchants are very motivated to protect their reputation and their relationship with their customers. Since this is the case, why are we all paying for the ability to request a chargeback with every transaction? Could I say to Visa when buying from someone I trust, such as Amazon, "hey, I don't want to pay for chargeback protection for this purchase"? The answer, unfortunately, is no.

I do understand that there will be times when buyers will want or even need chargeback protection.  You could simply use your credit card for times like that, but could we ever get chargeback fraud protection on a purchase made with bitcoin?  I don't personally know of an easy way to do that right now, but escrow services have existed since before credit cards were invented precisely for this reason.  Escrow services could easily work with bitcoin.

I titled this "Bitcoin vs. Credit Cards" but I'm not trying to say that There Can Only Be One. More options are better. I see a lot of people dismissing bitcoin out of hand due to misunderstandings about the costs and benefits of bitcoin compared to other forms of moving money around. Hopefully this helps clear things up for a few people.


1. Another costly thing about chargebacks is people can use them to defraud merchants. They can ask for a chargeback even when the merchant did deliver the thing they promised.  This is another cost the merchant bears when accepting credit cards.  A cost that, surprise, gets passed back to you as a customer.

Sunday, December 21, 2014

Adventure in Building a Home Gym

Last year I started assembling a little garage gym so I could do some weight lifting in the comfort of home. I scoured the online classifieds and found good deals on used power racks, a bench, and some weights. I was doing pretty well finding these deals, I thought. The one oddity was with the weights. The seller wouldn't sell them separately from this monster:

I had no use for this because I had bought into the idea that compound lifts were best, and this enormous, heavy thing served only to isolate your shoulders. I took it anyway because the guy was nice and he was giving me a good deal on the weights whether this was included or not. I was also thinking in the back of my head that I have a friend who likes metalworking who could probably help me turn it into something more useful. Nearly a full year later my hope was finally realized.

I stored it as-is in my garage for about 6 months. My basement was being remodeled, so there were lots of things in the garage with it (on it, under it, around it) and it wasn't a big deal. Once the basement was finished and we moved everything back in, it was painfully in the way. I moved it to the side of the house, where it sat for a couple of months, and then finally my next-door neighbor showed me how his acetylene torch worked, and this thing was now in a much more compact state:

The flat bench I originally bought was a little rickety and cheap, and my idea was to turn this into a nice sturdy one. A couple of weeks ago my schedule and my metalworking friend's schedule finally synced up, and he graciously helped me use his chop saw, MIG welder, and drill press to reform the original beast into this:

It turns out welding is pretty dang fun. Once this metal frame was done I cut out and sanded some 1/2 inch plywood for the actual bench and attached it using bolts and t-nuts. My wife graciously painted the metal with some Rustoleum, found some padding and vinyl and helped me cover the wood. Here are the last few progress pictures: