How do I sell out with abstract algebra?

First, let me say that I'm impressed by your maturity and wisdom. It's not easy to recognize your own limitations, accept them, and adapt. Most people have to learn the hard way, by living through a few decades of struggle and frustration. Some people actually enjoy struggle and frustration, though. Your choice.

I have been an "industrial" mathematician for 40 years (I "sold out" long ago). I work in the software industry. I don't have a Ph.D, and I don't write research papers (not very often, anyway). I don't spend a great deal of my time doing mathematics, and almost no time at all doing original mathematics, but I do write "mathematician" on my tax return every year. My work is interesting (to me), and I've made quite a lot of money.

From the suggestions below, it's clear that some people judge the fabric of a profession by reading its research literature. This is a hopelessly misleading approach. What happens in day-to-day work in any industry is very far removed from what you read in research papers. If you want to know what it's like working in industry, you should ask people who work in industry. And this is not a very good place to do that. Most of the people who hang around here are university faculty, grad students, and (recently) kids trying to get someone to do their homework for them. If you want to know what software developers do, for example, ask at StackOverflow.

I'll repeat some of the advice from others. Learn some computer science. Learn about basic algorithms, and get good at programming in some mainstream language like C/C++ or Java (not Haskell or OCaml). It's not that difficult, and it's great fun when your code works.

Accept that no-one is likely to pay you to do original research (except on a very small scale as part of a larger project). Especially not mathematical research. People in industry are expected to create working saleable products/processes/systems with a high degree of predictability. Research is too risky. If it were less risky, and the results were more certain, then it wouldn't be research.

Think about what it means to "sell out". One definition says that selling out is doing what society (and your employer) want you to do, rather than what you want to do. But society (or your employer) will only be willing to pay you if your work is valuable to them. So, in some sense, selling out is inevitable unless you're going to be a hermit poet or you're independently wealthy. The best you can hope for is that your work is interesting and fulfilling (in addition to being valuable), and that you don't have to do anything that you find morally distasteful. If you think that making money is distasteful, stick to academia.

To answer your question, I'm not personally aware of any places in industry where significant numbers of people spend time pondering the workings of abstract algebraic constructs. I don't say that they don't exist -- just that I'm not personally aware of them. My mathematical work mostly involves differential geometry (in 2D and 3D space), approximation of functions, numerical methods (root finding, minimization, etc.), very simple linear algebra, and occasionally a bit of algebraic geometry. I very rarely do any original mathematics. I typically use software packages written by other people, and I only need to know enough mathematics to understand the limitations of these packages and their applicability to my problems. If you want to work on the development of the mathematical software tools used by people like me, check out companies like Wolfram, MathWorks, MapleSoft, Rogue Wave, NAG. But be aware that these are (mostly) fairly small companies and they don't employ very many people. And they won't hire you unless you have good programming skills.
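For a flavor of the kind of routine numerical work described above, here is a minimal root-finding sketch (a textbook bisection method in Scala; the test function and tolerance are made up purely for illustration):

    // Bisection: find a root of f on [lo, hi], assuming f changes sign on the interval.
    // This is the sort of bread-and-butter numerical routine mentioned above, not any
    // particular library's implementation.
    def bisect(f: Double => Double, lo: Double, hi: Double, tol: Double = 1e-12): Double = {
      require(f(lo) * f(hi) <= 0, "f must change sign on [lo, hi]")
      var (a, b) = (lo, hi)
      while (b - a > tol) {
        val mid = (a + b) / 2
        if (f(a) * f(mid) <= 0) b = mid else a = mid
      }
      (a + b) / 2
    }

    // Example: a root of x^3 - 2x - 5 (a classic test function), near x = 2.0945
    val root = bisect(x => x * x * x - 2 * x - 5, 1.0, 3.0)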

I mostly work with manufacturing companies -- people who design cars, airplanes, consumer electronics gadgets and so on. Think about what those companies are trying to do -- they want to create more attractive products, more quickly, with lower costs. How can you (and your expertise) help them do that? Contemplate this until you identify some place where you can imagine that you might fit in and be happy. Or, pick some other industry and go through the same sort of reasoning. The key is to find some place where your skills can add value.

Stop thinking of your work as your life. You'll still be the same person, regardless of whether you're winning Fields Medals or hacking code. Your children will love you just as much either way.


Twitter develops and uses a software library called Algebird. From its GitHub page:

Abstract algebra for Scala. This code is targeted at building aggregation systems (via Scalding or Storm). It was originally developed as part of Scalding's Matrix API, where Matrices had values which are elements of Monoids, Groups, or Rings. Subsequently, it was clear that the code had broader application within Scalding and on other projects within Twitter.

Their discussion goes on to explain why they needed to write such a software library:

Implementations of Monoids for interesting approximation algorithms, such as Bloom filter, HyperLogLog and CountMinSketch. These allow you to think of these sophisticated operations like you might numbers, and add them up in hadoop or online to produce powerful statistics and analytics.
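To make the monoid idea concrete, here is a small self-contained sketch of the abstraction such a library is built around. The trait and names below are my own illustration, not Algebird's actual API:

    // A minimal Monoid abstraction in Scala, in the spirit of (but not identical to) Algebird.
    trait Monoid[T] {
      def zero: T                 // identity element
      def plus(l: T, r: T): T     // associative combining operation
    }

    // Ordinary addition is a monoid...
    val intSum: Monoid[Int] = new Monoid[Int] {
      def zero = 0
      def plus(l: Int, r: Int) = l + r
    }

    // ...and so is set union, which lets the same aggregation code count distinct items.
    def setUnion[A]: Monoid[Set[A]] = new Monoid[Set[A]] {
      def zero = Set.empty[A]
      def plus(l: Set[A], r: Set[A]) = l union r
    }

    // Generic aggregation: this works for any Monoid, so a Scalding or Storm job can
    // combine partial results from many machines without caring what the values are.
    def sumAll[T](xs: Iterable[T])(m: Monoid[T]): T =
      xs.foldLeft(m.zero)(m.plus)

The same sumAll works whether the values are counts, sets, or the approximate sketches (Bloom filter, HyperLogLog, CountMinSketch) mentioned above, because each of them forms a monoid.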

I asked around about why Twitter needs an abstract algebra library. One of the authors, Oscar Boykin, said it had to do with databases.

CS.StackExchange: What are uses of Groups, Monoids and Rings in Databases?

Cardinality can be thought of as a functor from the category Set to the groupoid of isomorphism classes in that category, which we identify with the natural numbers.

In their case, they need to estimate cardinalities of subsets on a scale where it's impossible to check the membership criterion on every single element of the set. So probabilistic counting methods come to the rescue, taking advantage of how these values are stored in a computer.

The resulting probabilistic counters can be added, multiplied by scalars, and so on, behaving much like natural numbers.
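To make that concrete: a HyperLogLog counter is essentially a fixed array of small registers, and the "sum" of two counters is the elementwise maximum of their registers, an operation that is associative, commutative, and idempotent. Here is a toy sketch of just the merge step (my own illustration; the actual cardinality-estimation formula, a harmonic mean over the registers, is omitted):

    // Toy model of HyperLogLog merging in Scala: each counter is a vector of registers,
    // and the union of two counters is the elementwise maximum. Because this merge is
    // associative and commutative, partial counts from many machines can be combined
    // in any order, which is exactly the monoid structure Algebird packages up.
    case class HLLSketch(registers: Vector[Int]) {
      def merge(that: HLLSketch): HLLSketch = {
        require(registers.length == that.registers.length)
        HLLSketch(registers.zip(that.registers).map { case (a, b) => a max b })
      }
    }

    object HLLSketch {
      def empty(size: Int): HLLSketch = HLLSketch(Vector.fill(size)(0))
    }

    // Combining partial sketches collected on several servers:
    //   val total = sketches.foldLeft(HLLSketch.empty(1024))(_ merge _)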


I will jump on the bandwagon of answers suggesting computer science. Algebraic thinking is deeply embedded in the design of programming languages -- especially categorical structures like functors and monads. As a teaser, the Java language was invented by James Gosling, whose thesis was titled "Algebraic Constraints". I know that Microsoft Research does a lot of programming language theory, and I suspect it would be a good place to apply your algebraic skills to real software and make some good money in the process. You might try learning the Haskell programming language to get a feel for how some of these ideas fit together; Haskell puts many of these algebraic concepts right on the surface.
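As a small taste of what "algebraic structure in a language" looks like (written in Scala here, to match the Algebird sketch above; Haskell exposes the same idea natively as its Functor type class), here is a sketch of the functor abstraction and the laws it is expected to satisfy:

    import scala.language.higherKinds

    // A Functor is "something you can map over", subject to two equational laws:
    //   identity:     map(fa)(x => x) == fa
    //   composition:  map(map(fa)(f))(g) == map(fa)(f andThen g)
    // Checking that instances satisfy such laws is exactly the kind of reasoning
    // an algebraist finds comfortable.
    trait Functor[F[_]] {
      def map[A, B](fa: F[A])(f: A => B): F[B]
    }

    // Two unrelated containers satisfy the same laws, so generic code works for both.
    val listFunctor: Functor[List] = new Functor[List] {
      def map[A, B](fa: List[A])(f: A => B): List[B] = fa.map(f)
    }

    val optionFunctor: Functor[Option] = new Functor[Option] {
      def map[A, B](fa: Option[A])(f: A => B): Option[B] = fa.map(f)
    }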

You would also probably do well at a company that uses functional programming languages, instead of designing them. For example, I know that the Wall Street firm Jane Street writes its software exclusively in OCaml, that it does research on effective functional software design, and that it customizes the language to suit its needs. These tasks can be algebraic in flavor, and while they involve more structural design and less proof, a similar set of skills applies. I bet they pay good money for people who love algebra.

There are many other areas of computer science that rely on algebra. Others have mentioned graphics and robotics, but I would point to the common ancestor of those two fields, which is computational geometry. If you take a look at the Computational Geometry Algorithms Library (CGAL), which is the most widely used geometry library, you will note that it is based on an algebraic core (with concepts like "group", "ring", and "field"). As a shameless plug, doing computational geometry for fun led me to develop this very algebraic library. Computational geometry has to answer very discrete questions like "is this point on this line", and so a common approach is to represent numbers exactly instead of approximately. This means that you get to ignore all of those annoying analysis problems that come up when using approximation. CGAL has a pretty extensive list of projects that use it --- this may be a good place to find employers.
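To illustrate the exact-representation point, here is a sketch of the classic orientation predicate computed with arbitrary-precision integers, so the answer to "is this point on this line?" is never corrupted by rounding. This is my own illustration of the idea, not CGAL's interface:

    // Orientation test for points with integer coordinates, computed with BigInt so
    // there is no floating-point rounding. The sign of the determinant
    //   | bx - ax   by - ay |
    //   | cx - ax   cy - ay |
    // is positive if (a, b, c) turn left, negative if they turn right, and exactly
    // zero if c lies on the line through a and b.
    def orientation(ax: BigInt, ay: BigInt,
                    bx: BigInt, by: BigInt,
                    cx: BigInt, cy: BigInt): Int =
      ((bx - ax) * (cy - ay) - (by - ay) * (cx - ax)).signum

    // With doubles, a nearly collinear triple can report the wrong sign; here the test
    // orientation(0, 0, 2, 2, 1, 1) == 0 identifies collinearity exactly.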

These two fields rely on algebra in different ways. Programming languages will use concepts like "algebraic structure", "functor", and "formal proof", whereas geometry uses concepts like "field", "ring", "matrix". So if you like designing algebra, the former might be a better fit, whereas if you like using algebra, the latter may be. Of course using something and understanding how it fits together always go hand in hand, so in either area you will have opportunities to both use and design algebra. Both of these fields also have a range of people working on them, from pure academic research to very applied software development, so you should be able to find a way to fit yourself in.

One more thought is that advanced physics relies heavily on algebra (although you also have to do integrals!) My senior-level course in Quantum Mechanics certainly relied on linear transformations, eigenspaces, and a number of related concepts. I don't know how you can "sell out" with that, but I'm sure it's possible.