The following are some limitations with the current release of gperf:
- The gperf utility is tuned to execute quickly, and works well for
  small to medium-sized data sets (around 1,000 keywords). It is
  extremely useful for maintaining perfect hash functions for compiler
  keyword sets. Several recent enhancements now enable gperf to work
  efficiently on much larger keyword sets (over 15,000 keywords). When
  processing large keyword sets it helps greatly to have over 8
  megabytes of RAM.
- The size of the generated static keyword array can get extremely
  large if the input keyword file is large or if the keywords are
  quite similar. This tends to slow down the compilation of the
  generated C code, and greatly inflates the object code size. If this
  situation occurs, consider using the ‘-S’ option to reduce data
  size, potentially increasing keyword recognition time by a
  negligible amount. Since many C compilers cannot correctly generate
  code for large switch statements, it is important to qualify the
  ‘-S’ option with an appropriate numerical argument that controls the
  number of switch statements generated.
- The maximum number of selected byte positions has an arbitrary
  limit of 255. This restriction should be removed; if anyone
  considers this a problem, write me and let me know so I can remove
  the constraint.
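As a sketch of the ‘-S’ workaround described above, a run might look
like the following. The input file name and the switch count of 2 are
illustrative choices, not taken from this manual; the guard on the last
line simply skips the generation step on systems where gperf is not
installed.

```shell
# Create a hypothetical gperf input file: an empty declarations
# section, then the '%%' separator, then one keyword per line.
cat > c-keywords.gperf <<'EOF'
%%
break
case
continue
default
EOF

# Ask gperf to emit switch-statement dispatch split across 2 switch
# statements (instead of one large static array), writing the
# generated perfect hash function to lookup.c.
command -v gperf >/dev/null && gperf -S 2 c-keywords.gperf > lookup.c
```

The generated lookup.c can then be compiled into the program that
needs the keyword recognizer; raising the numeric argument to ‘-S’
splits the dispatch across more, smaller switch statements.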