Is there a better (more modern) tool than lex/flex for generating a tokenizer for C++?

I recently added source-file parsing to an existing tool that generates output files from complex command-line arguments.

The command line arguments got to be so complex that we started allowing them to be supplied as a file that was parsed as if it was a very large command line, but the syntax was still awkward. So I added the ability to parse a source file using a more reasonable syntax.

I used flex 2.5.4 for Windows to generate the tokenizer for this custom source file format, and it worked. But I hated the code: global variables, a weird naming convention, and the C++ code it generated was awful. The existing code-generation backend was glued to the output of flex - I don't use yacc or bison.

I'm about to dive back into that code, and I'd like to use a better/more modern tool. Does anyone know of something that:

  • Runs in a Windows command prompt (Visual Studio integration is OK, but I use makefiles to build)
  • Generates a properly encapsulated C++ tokenizer (no global variables)
  • Uses regular expressions to describe the tokenizing rules (compatibility with lex syntax is a plus)
  • Does not force me to use the C runtime (or fake it) for file reading (parses from memory)
  • Warns me when my rules force the tokenizer to backtrack (or fixes it automatically)
  • Gives me full control over variable and method names (so I can conform to my existing naming convention)
  • Allows me to link multiple parsers into a single .exe without name collisions
  • Can generate a Unicode (16-bit UCS-2) parser if I want it to
  • Is NOT an integrated tokenizer + parser generator (I want a lex replacement, not a lex+yacc replacement)

I could probably live with a tool that just generated the tokenizing tables if that was the only thing available.
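To make the requirements concrete, here is a hypothetical sketch of the kind of interface those bullets describe - an encapsulated tokenizer with no globals that scans from a memory buffer. All names (`Tokenizer`, `TokenKind`, `next`) are invented for illustration; they are not from any generator:

```cpp
#include <cstddef>
#include <string>

// Illustrative token kinds; a real tool would generate these from rules.
enum class TokenKind { Identifier, Number, EndOfInput };

struct Token {
    TokenKind kind;
    std::string text;
};

// An encapsulated tokenizer: no globals, no C-runtime file I/O.
// All state lives in the object, so several independent tokenizers
// can coexist in one executable without name collisions.
class Tokenizer {
public:
    Tokenizer(const char* begin, const char* end)
        : cur_(begin), end_(end) {}

    Token next() {
        while (cur_ != end_ && isSpace(*cur_)) ++cur_;
        if (cur_ == end_) return {TokenKind::EndOfInput, ""};
        const char* start = cur_;
        if (isDigit(*cur_)) {
            while (cur_ != end_ && isDigit(*cur_)) ++cur_;
            return {TokenKind::Number, std::string(start, cur_)};
        }
        while (cur_ != end_ && !isSpace(*cur_)) ++cur_;
        return {TokenKind::Identifier, std::string(start, cur_)};
    }

private:
    static bool isSpace(char c) {
        return c == ' ' || c == '\t' || c == '\n' || c == '\r';
    }
    static bool isDigit(char c) { return c >= '0' && c <= '9'; }

    const char* cur_;
    const char* end_;
};
```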


Ragel: It fits most of your requirements.

  • It runs on Windows
  • It doesn't declare the variables, so you can put them inside a class or inside a function as you like.
  • It has nice tools for analyzing regular expressions to see when they would backtrack. (I don't know about this very much, since I never use syntax in Ragel that would create a backtracking parser.)
  • Variable names can't be changed.
  • Table names are prefixed with the machine name, and they're declared "const static", so you can put more than one in the same file and have more than one with the same name in a single program (as long as they're in different files).
  • You can declare the variables as any integer type, including UChar (or whatever UTF-16 type you prefer). It doesn't automatically handle surrogate pairs, though. It doesn't have special character classes for Unicode either (I think).
  • It only does regular expressions... has no bison/yacc features.

The code it generates interferes very little with a program. The code is also incredibly fast, and the Ragel syntax is more flexible and readable than anything I've ever seen. It's a rock solid piece of software. It can generate a table-driven parser or a goto-driven parser.
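To give a feel for the Ragel syntax mentioned above, here is a minimal scanner sketch. The machine name and the `emitNumber`/`emitIdentifier` callbacks are placeholders I made up; a real `.rl` file would wire the actions into your own token-handling code:

```ragel
// tokenizer.rl -- hypothetical sketch, not production rules.
%%{
    machine config_scanner;

    main := |*
        [0-9]+                  => { emitNumber(ts, te);     };
        [a-zA-Z_][a-zA-Z0-9_]*  => { emitIdentifier(ts, te); };
        space+;                 # skip whitespace
    *|;
}%%

%% write data;   // emits the "const static" tables, prefixed with the machine name

// The host code supplies the state variables, so they can live inside a class:
//   int cs, act;
//   const char *p = begin, *pe = end, *eof = end, *ts, *te;
//   %% write init;
//   %% write exec;
```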

Boost.Spirit.Qi (parser-tokenizer) or Boost.Spirit.Lex (tokenizer only). I absolutely love Qi, and Lex is not bad either, but I just tend to take Qi for my parsing needs...

The only real drawback with Qi tends to be an increase in compile time, and it also runs slightly slower than hand-written parsing code. It is generally much faster than parsing with regex, though.

Flex also has a C++ output option. The result is a set of classes that do the parsing.

Just add the following to the top of your lex file:

%option c++
%option yyclass="Lexer"

Then in your source:

std::ifstream file("config");
Lexer         lexer(&file);
while (int token = lexer.yylex())
{
    // handle token
}

Two tools come to mind, although you would need to find out for yourself which is suitable: ANTLR and GOLD Parser. Both have language bindings that let them be plugged into a C++ runtime environment.

Boost.Spirit and the YARD parser come to mind. Note that the lexer-generator approach is increasingly being replaced by C++ inner DSLs (domain-specific languages) for specifying tokens: the grammar becomes part of your code, with no external utility involved, just a set of rules written in C++ itself.
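The inner-DSL idea can be as lightweight as a table of named regex rules kept in ordinary C++. Here is a minimal sketch of that style (the `Rule` type, rule names, and longest-match policy are all my own invention, not from any library) using only `std::regex`:

```cpp
#include <regex>
#include <string>
#include <utility>
#include <vector>

// A token rule is just a name plus a pattern; the whole "grammar"
// is ordinary C++ data, with no external generator involved.
struct Rule {
    std::string name;
    std::regex pattern;
};

// Scan the input by trying each rule at the current position and
// taking the longest match. Slower than generated tables, but the
// rules live directly in the source.
std::vector<std::pair<std::string, std::string>>
tokenize(const std::string& input, const std::vector<Rule>& rules) {
    std::vector<std::pair<std::string, std::string>> tokens;
    auto it = input.cbegin();
    while (it != input.cend()) {
        if (*it == ' ' || *it == '\t' || *it == '\n') { ++it; continue; }
        std::string bestName;
        std::string bestText;
        for (const Rule& rule : rules) {
            std::smatch m;
            if (std::regex_search(it, input.cend(), m, rule.pattern,
                                  std::regex_constants::match_continuous)
                && m.length(0) > static_cast<std::ptrdiff_t>(bestText.size())) {
                bestName = rule.name;
                bestText = m.str(0);
            }
        }
        if (bestText.empty()) break;  // no rule matched: stop scanning
        tokens.emplace_back(bestName, bestText);
        it += bestText.size();
    }
    return tokens;
}
```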
