One of the disadvantages of the gtl package is that it does not consider the character codes of { and }.
(It also cannot convert a gtl back to a normal token list, although that is actually easy to do: just take the middle part -- read the documentation to understand its internal structure.)
expl3 actually has functions for this already, the \tl_analysis_... family of functions:
\ExplSyntaxOn
\def \f #1 {
  \def \a {0} % token counter
  \def \b {}  % result for the first token
  \tl_analysis_map_inline:nn {#1} {
    % ##1 = the token, ##2 = its char code, ##3 = its catcode (one hex digit)
    % only run this once, for the first token.
    \tl_if_empty:NT \b {
      \int_compare:nNnTF {"##3} = {1} {
        \def \b {first~is~cat1}
      } {
        \def \b {first~is~not~cat1}
      }
    }
    \tl_set:Nx \a {\int_eval:n {\a+1}}
  }
}
\ExplSyntaxOff
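If you only want to inspect a token list interactively, expl3 also provides \tl_analysis_show:n, which prints every token together with its character code and category code to the terminal (the exact wording of the output below is a sketch and may differ between kernel versions):

```latex
\ExplSyntaxOn
\tl_analysis_show:n { {a}b }
% The terminal shows one line per token, roughly:
%   {  ( begin-group character { )
%   a  ( the letter a )
%   }  ( end-group character } )
%   b  ( the letter b )
\ExplSyntaxOff
```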
Example usage:
\f{{a}b}
number~of~tokens:~\a \par % prints 4 (the braces count as tokens)
\b \par                   % prints "first is cat1"
\f{x{a}b}
number~of~tokens:~\a \par % prints 5
\b \par                   % prints "first is not cat1"
Needless to say, in real code do not redefine important LaTeX macros such as \f, \a, \b, etc.
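For real code, here is a sketch of the same idea using freshly declared expl3 variables and a new command instead of overwriting existing macros (all the names -- \my_analyse:n, \l_my_count_int, \l_my_first_tl -- are made up for illustration):

```latex
\ExplSyntaxOn
\int_new:N \l_my_count_int  % number of tokens seen
\tl_new:N  \l_my_first_tl   % description of the first token
\cs_new_protected:Npn \my_analyse:n #1
  {
    \int_zero:N \l_my_count_int
    \tl_clear:N \l_my_first_tl
    \tl_analysis_map_inline:nn {#1}
      {
        % ##3 is the catcode as a hexadecimal digit; "##3 reads it as a number
        \tl_if_empty:NT \l_my_first_tl
          {
            \int_compare:nNnTF {"##3} = {1}
              { \tl_set:Nn \l_my_first_tl { first~is~cat1 } }
              { \tl_set:Nn \l_my_first_tl { first~is~not~cat1 } }
          }
        \int_incr:N \l_my_count_int
      }
  }
\ExplSyntaxOff
```

After \my_analyse:n { {a}b }, the counter \l_my_count_int should hold 4 and \l_my_first_tl should hold the "first is cat1" text.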
expl3 gets written mainly as the team or others suggest use cases. For token lists, we have mappings which can extract 'items' (to 'do stuff' with), and some expandable code to do 'manipulations' (see \tl_upper_case:n, etc.). However, parsing in a token-by-token way is unlikely to be generic, as each use case will likely have different 'rules', and so it has not been the case that we've had requests that fit within our scope (to date). – Joseph Wright Sep 27 '17 at 08:04