SymPy 0.7.1 Released

July 30, 2011

Cross posted on the official SymPy Blog

SymPy 0.7.1 was released on July 29, 2011. It is available at

The source distribution can be downloaded from:

You can get the Windows installer here:

And the html documentation here:

About SymPy

SymPy is a Python library for symbolic mathematics. It aims to become a full-featured computer algebra system (CAS) while keeping the code as simple as possible in order to be comprehensible and easily extensible. SymPy is written entirely in Python.

Release notes

Major changes

– Python 2.4 is no longer supported. SymPy will not work at all in
Python 2.4. If you still need to use SymPy under Python 2.4 for some
reason, you will need to use SymPy 0.7.0 or earlier.

– The Pyglet plotting library is now an (optional) external dependency.
Previously, we shipped a version of Pyglet with SymPy, but this was
old and buggy. The plan is to eventually make the plotting in SymPy
much more modular, so that it supports many backends, but this has not
been done yet. For now, still only Pyglet is directly supported.
Note that Pyglet is only an optional dependency and is only needed for
plotting. The rest of SymPy can still be used without any dependencies
(except for Python).

– isympy now works with the new IPython 0.11.

– mpmath has been updated to 0.17. See the corresponding mpmath release
notes at

– Added a Subs object for representing unevaluated substitutions. This
finally lets us represent derivatives evaluated at a point, i.e.,
`diff(f(x), x).subs(x, 0)` returns `Subs(Derivative(f(_x), _x), (_x,), (0,))`.
This also means that SymPy can now correctly compute the chain rule
when this functionality is required, such as with `f(g(x)).diff(x)`.

Hypergeometric functions/Meijer G-Functions

– Added classes hyper() and meijerg() to represent hypergeometric functions and Meijer G-functions, respectively. They support numerical evaluation (using mpmath) and symbolic differentiation (though not with respect to the parameters).

– Added an algorithm for rewriting hypergeometric and Meijer G-functions in terms of more familiar, named special functions. It is accessible via the function hyperexpand(), or via expand_func(). This algorithm recognises many elementary functions, as well as complete and incomplete gamma functions, Bessel functions, and error functions. It can easily be extended to handle more classes of special functions.
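For example, a sketch using hyperexpand(), relying on the classical identity 2F1(1, 1; 2; x) = -log(1 - x)/x:

```python
from sympy import hyper, hyperexpand, symbols

x = symbols('x')

# 2F1(1, 1; 2; x) is recognized as an elementary function
h = hyper([1, 1], [2], x)
print(hyperexpand(h))  # -log(1 - x)/x
```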


Sets

– Added FiniteSet class to mimic Python set behavior while also interacting with existing Intervals and Unions.

– FiniteSets and Intervals interact so that, for example, `Interval(0, 10) - FiniteSet(0, 5)` produces `(0, 5) U (5, 10]`

– FiniteSets also handle non-numerical objects, so the following is possible: `{1, 2, 'one', 'two', {a, b}}`

– Added ProductSet to handle Cartesian products of sets

– Create one using the `*` operator, e.g. `twodice = FiniteSet(1, 2, 3, 4, 5, 6) * FiniteSet(1, 2, 3, 4, 5, 6)` or `square = Interval(0, 1) * Interval(0, 1)`

– The pow operator also works as expected: `R3 = Interval(-oo, oo)**3; (3, -5, 0) in R3` gives `True`

– Subtraction, union, measurement all work taking complex intersections into account.

– Added an as_relational method to sets, producing boolean statements using And, Or, Eq, Lt, Gt, etc.

– Changed reduce_poly_inequalities to return unions of sets rather than lists of sets
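The behavior described in these bullets can be sketched in a short session (a minimal illustration; exact printed forms vary by version):

```python
from sympy import FiniteSet, Interval, Symbol, oo

x = Symbol('x')

# Interval minus FiniteSet punches holes in the interval
s = Interval(0, 10) - FiniteSet(0, 5)
assert 5 not in s and 7 in s

# Cartesian products via * and **
square = Interval(0, 1) * Interval(0, 1)
assert (0.5, 0.5) in square
R3 = Interval(-oo, oo)**3
assert (3, -5, 0) in R3

# Sets convert to boolean statements
rel = Interval(0, 1).as_relational(x)
assert rel.subs(x, 0.5) == True and rel.subs(x, 2) == False
```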


Iterables

– Added generating routines for integer partitions and binary partitions. The routine for integer partitions takes three arguments: the number itself, the maximum element allowed in the generated partitions, and the maximum number of elements allowed in a partition. Binary partitions are characterized by containing only powers of two.

– Added a generating routine for multiset partitions. Given a multiset, the algorithm implemented will generate all possible partitions of that multiset.

– Added generating routines for Bell permutations, derangements, and involutions. A Bell permutation is one in which the cycles that compose it consist of integers in decreasing order. A derangement is a permutation such that the ith element is not in the ith position. An involution is a permutation that, when multiplied by itself, gives the identity permutation.

– Added a generating routine for unrestricted necklaces. An unrestricted necklace is a k-ary string of n characters, each of k possible types; the routine is parameterized by n and k.

– Added generating routine for oriented forests. This is an implementation of algorithm S in TAOCP Vol 4A.
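A few of these routines can be exercised from sympy.utilities.iterables; this sketch uses the function names found in current SymPy, which may differ slightly from the 0.7.1 spellings:

```python
from sympy.utilities.iterables import (binary_partitions,
                                       generate_derangements,
                                       multiset_partitions)

# Binary partitions: partitions of n containing only powers of two
for p in binary_partitions(5):
    assert sum(p) == 5 and all(x & (x - 1) == 0 for x in p)

# Derangements: no element stays in its original position
for d in generate_derangements([0, 1, 2]):
    assert all(d[i] != i for i in range(3))

# Multiset partitions: all partitions of a multiset
parts = list(multiset_partitions([1, 1, 2]))
print(parts)
```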

xyz Spin bases

– The represent, rewrite, and InnerProduct logic has been improved to work between any two spin bases. This was done by utilizing the Wigner D-matrix, implemented in the WignerD class, to define the changes between the various bases. Representing a state, e.g. `represent(JzKet(1,0), basis=Jx)`, gives the vector representation of any ket in any of the x/y/z bases, for numerical values of j and m in the spin eigenstate. Similarly, rewriting states into different bases, e.g. `JzKet(1,0).rewrite('Jx')`, will write the states as a linear combination of elements of the given basis. Because this relies on the represent function, it only works for numerical j and m values. The inner product of two eigenstates in different bases can be evaluated, e.g. `InnerProduct(JzKet(1,0), JxKet(1,1))`. When two different bases are used, one state is rewritten into the other basis, so this requires numerical values of j and m, but inner products of states in the same basis can still be done symbolically.

– The `Rotation.D` and `Rotation.d` methods, representing the Wigner-D function and the Wigner small-d function, return an instance of the WignerD class, which can be evaluated with the `doit()` method to give the corresponding matrix element of the Wigner-D matrix.
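A sketch of these operations for numerical j and m, using the current SymPy API (printed forms vary by version):

```python
from sympy import pi
from sympy.physics.quantum.represent import represent
from sympy.physics.quantum.spin import JzKet, Jx, Rotation

# Column-vector representation of a Jz eigenstate in the Jx basis
v = represent(JzKet(1, 0), basis=Jx)
assert v.shape == (3, 1)  # dimension 2*j + 1 for j = 1

# Rewrite the state as a linear combination of Jx eigenstates
print(JzKet(1, 0).rewrite('Jx'))

# Wigner small-d matrix element d^1_{0,0}(beta) = cos(beta)
assert Rotation.d(1, 0, 0, pi/2).doit() == 0
```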

Other changes

– We now use MathJax in our docs. MathJax renders LaTeX math entirely in
the browser using JavaScript. This means that the math is much more
readable than the previous PNG math, which used images. MathJax is
only supported on modern browsers, so LaTeX math in the docs may not
work on older browsers.

– nroots() now lets you set the precision of computations
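For instance, a sketch of the precision option:

```python
from sympy import Poly, Symbol, sqrt

x = Symbol('x')

# Compute the roots of x**2 - 2 to 30 significant digits (the default is 15)
roots = Poly(x**2 - 2, x).nroots(n=30)
target = sqrt(2).evalf(35)
assert any(abs(r - target) < 1e-25 for r in roots)
```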

– Added support for gmpy and mpmath’s types to sympify()

– Fix some bugs with lambdify()

– Fix a bug with as_independent and non-commutative symbols.

– Fix a bug with collect (issue 2516)

– Many fixes relating to porting SymPy to Python 3. Thanks to our GSoC
student Vladimir Perić, this task is almost completed.

– Some people were retroactively added to the AUTHORS file.

– Added a solver for a special case of the Riccati equation to the ODE module.

– Iterated derivatives are pretty printed in a concise way.

– Fix a bug with integrating functions with multiple DiracDeltas.

– Added support for Matrix.norm() that works for matrices (not just vectors).

– Improvements to the Groebner bases algorithm.

– Plot.saveimage now supports a StringIO outfile

– Expr.as_ordered_terms now supports non-lex orderings.

– diff now canonicalizes the order of differentiation symbols. This is
so it can simplify expressions like `f(x, y).diff(x, y) - f(x,
y).diff(y, x)`. If you want to create a Derivative object without
sorting the args, you should create it explicitly with `Derivative`,
so that you will get `Derivative(f(x, y), x, y) != Derivative(f(x, y), y, x)`.
Note that internally, derivatives that can be computed are always
computed in the order in which they are given.
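A minimal sketch of the canonicalization:

```python
from sympy import Function, symbols

x, y = symbols('x y')
f = Function('f')

# Mixed partials are put in a canonical order, so this is zero
assert f(x, y).diff(x, y) - f(x, y).diff(y, x) == 0
```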

– Added functions `is_sequence()` and `iterable()` for determining if
something is an ordered iterable or normal iterable, respectively.
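For example:

```python
from sympy.utilities.iterables import is_sequence, iterable

assert iterable([1, 2, 3]) and iterable({1, 2, 3})
assert is_sequence([1, 2, 3])      # ordered: supports indexing
assert not is_sequence({1, 2, 3})  # sets are unordered
assert not iterable("abc")         # strings are excluded by default
```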

– Enabled an option in Sphinx that adds a `source` link next to each function, which links to a copy of the source code for that function.

In addition to the more noticeable changes listed above, there have been numerous other smaller additions, improvements and bug fixes in the ~300 commits in this release. See the git log for a full list of all changes. The command `git log sympy-0.7.0..sympy-0.7.1` will show all commits made between this release and the last. You can also see the issues closed since the last release [here](


The following people contributed at least one patch to this release (names are given in alphabetical order by last name). A total of 26 people contributed to this release. People with a * by their names contributed a patch for the first time for this release. Five people contributed for the first time for this release.

Thanks to everyone who contributed to this release!

* Tom Bachmann
* Ondřej Čertík
* Renato Coutinho
* Bill Flynn
* Bradley Froehle*
* Gilbert Gede
* Brian Granger
* Emma Hogan*
* Yuri Karadzhov
* Stefan Krastanov*
* Ronan Lamy
* Tomo Lazovich
* Sam Magura*
* Saptarshi Mandal
* Aaron Meurer
* Sherjil Ozair
* Mateusz Paprocki
* Vladimir Perić
* Mario Pernici
* Nicolas Pourcelot
* Min Ragan-Kelley*
* Matthew Rocklin
* Chris Smith
* Vinzent Steinberg
* Sean Vig
* Thomas Wiecki

Merging integration3 with sympy-0.7.0 nightmare

July 25, 2011

For a long time, there have been several problems in my integration3 branch that were fixed in master. I decided that as an incentive to finish the release, I would hold off on merging master into my branch until the 0.7.0 release was finished. Well, here’s a little timeline:

  • June 28, 2011: SymPy 0.7.0 final is released.
  • June 29, 2011: I type git merge sympy-0.7.0 in my integration3 branch.
  • July 24, 2011 (today; technically July 25 because it’s 2 AM): I finish merging sympy-0.7.0 into integration3.

That’s right, it took me over three weeks—almost a month—to merge sympy-0.7.0 into integration3 (granted, I worked on other things at the same time, such as the SciPy 2011 conference, but to me, any merge that takes longer than a day to complete is a problem). This is because git decided that I needed to fix as a merge conflict just about every single change in the release branch since the base of integration3. The total was over 100 files. You can see the final merge commit here.

So I started git mergetool, without which this whole ordeal would have been 10 times worse. The mergetool, which on my computer is opendiff (i.e., FileMerge), gave the correct change by default in most cases, so I actually did not have to manually fix the majority of the conflicts. But I did have to go through a lot of them by hand. I had to manually check each difference in the polys, as I had made several changes there in the course of working on integration3. On several occasions, I had to research a change using `git log -S` and other fancy methods. And I noticed at least two regressions in the polys, which I fixed.

mergetool was useless against a couple of files, because in my branch I had renamed them. Fortunately, these were not really modified much by me, so I could basically just replace them with the sympy-0.7.0 versions.

Once I finished merging, I had to deal with test failures. This was partly expected, as my branch has always had test failures due to my hack disabling algebraic substitution in exp, which is required for risch_integrate() to work, but there were also several unrelated ones.

Some of these were caused by wrong merge conflict resolutions by me. So I went through git diff sympy-0.7.0 change by change and made sure that nothing was different that I didn’t want to be. I would recommend doing this for any big merge.

Then, I had to fix a few bugs that caused test failures. Several semantics were changed in the release. I think the ones that I had to change were the renaming of has_any_symbols to just has, the renaming of Poly.as_basic() to Poly.as_expr(), and the swapping of the meanings of quo and exquo in the polys. There were also some doctest failures due to the change to lexicographic ordering in the printer.
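In current SymPy terms, the renamed APIs look like this (a sketch):

```python
from sympy import Poly, exquo, quo, symbols

x, y = symbols('x y')

# has_any_symbols(...) became simply has(...)
assert (x**2 + y).has(x)

# Poly.as_basic() became Poly.as_expr()
assert Poly(x**2 + y, x, y).as_expr() == x**2 + y

# quo returns the quotient, discarding any remainder;
# exquo requires the division to be exact
assert quo(x**2 + 1, x) == x
assert exquo(x**2 - 1, x - 1) == x + 1
```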

After all that, there were two regressions that caused test failures. The first was that the following substitution stopped being applied. It used to give:

In [1]: Integral((exp(x*log(x))*log(x)), x).subs(exp(x*log(x)), x**x)
⌠
⎮  x
⎮ x ⋅log(x) dx
⌡

but after the merge it gives:

In [1]: Integral((exp(x*log(x))*log(x)), x).subs(exp(x*log(x)), x**x)
⌠
⎮  x⋅log(x)
⎮ ℯ        ⋅log(x) dx
⌡

This substitution is necessary because the Risch algorithm requires expressions like x^x to be rewritten as e^{x\log(x)} before it can integrate them, but I try to convert them back after integrating so that the user gets the same thing in the result that he entered. I created issue 2571 for this.

The second was that I had several places in my docstrings with things like

Given a derivation D on k[t] and f, g in k(t) with f weakly normalized with respect to t, either raise NonElementaryIntegralException, in which case the equation Dy + f*y == g has no solution in k(t), or the quadruplet (a, b, c, h) such that a, h in k[t], b, c in k, and for any solution y in k(t) of Dy + f*y == g, q = y*h in k satisfies a*Dq + b*q == c.

The problem here is the “raise NonElementaryIntegralException,” part. The code quality checker thinks that this is an old-style exception (like `raise Exception, message`), due to a poorly formed regular expression. I fixed this in a pull request.

The good news is that now a lot of stuff works that didn’t before because of fixes that were required that only existed in master. For example, the following did not work before, but now does due to improvements to RootSum:

In [1]: risch_integrate(1/(exp(5*x) + exp(x) + 1), x)
           ⎛    2
x + RootSum⎝21⋅z  + 6⋅z + 1, Lambda(_i, _i*log(-3381*_i**4/4 - 3381*_i**3/4

                                       ⎞          ⎛     3        2
- 625*_i**2/2 - 125*_i/2 + exp(x) - 5))⎠ + RootSum⎝161⋅z  + 115⋅z  + 19⋅z +

1, Lambda(_i, _i*log(-3381*_i**4/4 - 3381*_i**3/4 - 625*_i**2/2 - 125*_i/2 +

 exp(x) - 5))⎠

In [2]: cancel(risch_integrate(1/(exp(5*x) + exp(x) + 1), x).diff(x))
      1
─────────────
 5⋅x    x
ℯ    + ℯ  + 1

The general definition of the logarithmic part of an integral is a sum over the roots of a polynomial, which must be expressed as a RootSum in the general case. Previously, RootSum.diff did not work, but thanks to Mateusz, an algorithm was implemented for exactly computing the derivative of a RootSum whose Lambda expression is a rational function (see this bit from our SciPy tutorial for an idea of how this works), so now the Risch algorithm can work with RootSum objects just as well as with an ordinary sum of logarithms.

Also, there was a bug in the square-free algorithm in my branch that was fixed in master, which was causing wrong results (I don’t remember the expression that produced them right now). There was also a fix by me in master to make is_rational_function() faster, as it was significantly slowing down the calculation of some integrals (for example, risch_integrate(Add(*(exp(i*x) for i in range(1000)))), which is still slow to calculate, but now because of other things).

About big branches

So this merge, along with the polys12 fiasco (by the way, I think part of the reason git made me resolve all these merge conflicts is that polys12 was rebased from the polys11 I had merged into integration3), has shown me very clearly that it is a bad idea to go off on your own branch, do a lot of work, and wait a long time before merging it back into the main repo.

This is what was done with polys12. Mateusz had a lot of new polynomials code that he developed in one big branch, and when it finally came to merging it back in, it was a mess. This was for several reasons, which I do not want to discuss too much here, but I think it became clear to everyone that doing this was bad, and that it would have been better to submit changes as pull requests as they were made than to keep them all together in one big branch for a long time.

This model also affected my work, as I had to work off of the latest polys branch, not master, since my work relied heavily on the latest and greatest in the polys.

Well, with this merge of the main repo into my branch, I see that my branch is starting to become the same way. I originally thought that I should finish the Risch algorithm before submitting it to be merged into master. I now know that this is the wrong approach. Development in master is too fast to keep code away from it for too long. The divergence makes it more and more difficult to merge back each time. Furthermore, there are regressions that were never noticed as regressions because the code that would have revealed them existed only in my branch. Now I have to fix these, whereas if the code were in master, the regressions would never have happened in the first place, because the author would have seen them immediately from the test failures.

I also thought that I should wait to merge because there were so many bugs in my code. But I see now that this is also wrong. Merging with master will help me find these bugs, as people will actually use my code. Sure, I’ve asked people to try out risch_integrate(), and some people have (and I thank you), but having it in the default integrate() in master will result in finding far more bugs than I ever would with the code living only in my own branch, which is basically the situation right now.

I would prepare my code for merging with master today, if it weren’t for this exp.subs hack, which causes test failures and is technically a regression, but is required for the preparsing code to the Risch algorithm to work. This is why I wrote to the list two weeks ago asking for advice on how to structure the substitution code so that we can nicely have various kinds of substitutions (e.g., exact like I need and algebraic like currently exists in exp) living together without cluttering up the code.

Therefore, I am going to focus my energies on fixing this subs problem so I can get my code merged with master. Then, when this is done, I will continue my work on implementing the remaining cases of the Risch algorithm.

So let this tale be a warning to people working on a lot of code in a big branch. This especially applies to our GSoC students, as it’s extremely easy to let your code accumulate when you’re a GSoC student (technically this branch of mine is a GSoC branch). I see that some of our students are doing a better job of this than others. To those who have your code all in one big branch that hasn’t been merged, I recommend you ready your branch for merge now. And in the future, try to break your code up into small but still meaningful chunks and submit those as pull requests. With git, it’s easy to base the code you are currently working on on code that hasn’t been merged yet, while still keeping things in small chunks for the pull requests.

On the other hand, git will only take you so far if you keep everything in a big branch, because there are going to be changes in master that will affect your work, no matter how isolated you think it is, and these are the sorts of things that it is impossible for git to fix for you. But if your code is in master, it will be supported by everyone, and any major change that affects it will have to fix it. For example, if someone changes a printer and the doctests change, then he will have to change your doctest too if it’s in master, but if it’s in your branch, then you will have to fix it when you next merge/rebase with/against master.

The SciPy 2011 Conference

July 17, 2011

So this past week, I attended the SciPy 2011 conference in Austin, TX, which was my first conference ever. Here are some highlights of the conference for me:

  • I met a ton of cool people. This included meeting in person, for the first time, several people who I had previously known only from mailing lists. I met the SymPy developers Mateusz Paprocki and Andy Terrel, and I also met people I had already known or heard about, like Fernando Perez, Gael Varoquaux, and Robert Kern. There are a lot of people out there who are excited to be using Python for their research, which is a refreshing change from my university, where everyone uses Matlab and Maple.
  • Mateusz and I gave a tutorial on SymPy, one of the four introductory-track tutorials. It was a great experience teaching SymPy to people. You can see the Sphinx document that we used, and there should eventually be a video posted at the SciPy 2011 website.
  • In addition to our tutorial, I attended some of the other tutorials. I particularly enjoyed the NumPy tutorial. Having never used NumPy before, I now feel comfortable with the basics. I also attended Gael Varoquaux’s tutorial on scikits.learn and Corran Webster’s tutorial on Matplotlib, Traits, and Chaco. My only regret is that the advanced track and introductory track tutorials were held at the same time, so I could not attend half of them. I plan to watch the ones I missed online.
  • The general conference was excellent. Some of the talks that I particularly enjoyed were:
    • The keynotes. I found Eric Jones’s keynote particularly relevant as the leader of SymPy, as he talked about things to do and things to avoid when leading a scientific project. I also enjoyed Perry Greenfield’s talk about how the astronomy community moved from an old proprietary system to Python.
    • Mateusz gave a talk on his FEMhub online lab, which is a very impressive system for using Python entirely in the web browser.
    • By far the best talk of the entire conference was Fernando Perez’s talk on the new IPython 0.11, which will be coming out in about a week or so. His demo of the new features such as the QT console and html notebook were very impressive. If you want to watch just one video from the conference, I would recommend that one.
    • Mark Dewing gave a talk about a system he wrote using SymPy to do automated derivation of equations. The system is impressive, and contains some features that would be nice to backport to SymPy. He told me that he wants to do this, so follow the mailing list. You can see what he has so far on his derivation_modeling branch at GitHub.
    • The lightning talks. These are short, five-minute talks given at the end of the conference. In addition to many interesting talks by others, both Mateusz and I gave one. Mateusz talked about SymPy Live, which he recently improved to do things like give LaTeX output, and I talked about my work on the Risch algorithm. I would also highly recommend watching these once the videos are posted.
    • Again, regrettably, I could not attend half of the talks because they were held at the same time. Fortunately, they filmed all of them, so I hope to watch them all online when they are posted (and I recommend that you do too).
  • The sprints were a great time for getting together and hacking. I worked with Min Ragan-Kelley to make isympy work with the new IPython. Having fixed this, I now want to release 0.7.1 very soon, so I used some of the time during the sprints getting ready for that. We already have preliminary release notes, and my hope is to create a release candidate on Monday (tomorrow). I also finished up my MathJax branch, and finished reviewing and pushed in Tom’s first GSoC pull request, which has a lot of really cool stuff relating to converting hypergeometric functions and Meijer G-functions into standard elementary functions. This will all be in the release.

    Also at the sprints, Mateusz worked on an extension for our Sphinx docs that puts a SymPy Live console right in the docs. You can then click on “evaluate” next to any of the code examples, and it will run in SymPy Live. And of course, you can then edit it and play around with it. He already had a working version of this by the end of the sprints (with a few bugs still), but I don’t think he has pushed it to GitHub yet. I think this is going to be a landmark change for our documentation. SymPy Live runs on App Engine, so this approach can be applied to any library that can run in pure Python 2.5, and I think a lot of such projects are going to be jealous of this and want to start using it, because it’s very impressive and useful.

    We also had a couple of people from the conference come to our table and work on SymPy. These were people who were new to SymPy, and I think attended our tutorial. One of them, Emma Hogan, worked a little bit on improving our documentation, and has submitted a pull request.

  • Austin, TX is a nice city with lots of fun places to go, but it is also very humid, which is something I could barely stand (I am used to the same heat, but in Albuquerque it is dry heat). One interesting thing that some of us went and saw was the bats. The bridge over this lake in Austin has over a million bats living under it, and at night they all fly out to feed.

There’s all kinds of fun and interesting stuff that happened that I did not mention here. If you are interested in science and Python, I would highly recommend attending a future SciPy conference.
