"""Implements a Jinja / Python combination lexer. The ``Lexer`` class
is used to do some preprocessing. It filters out invalid operators like
the bitshift operators we don't allow in templates. It separates
template code and python code in expressions.
"""

import re
import typing as t
from ast import literal_eval
from collections import deque
from sys import intern

from ._identifier import pattern as name_re
from .exceptions import TemplateSyntaxError
from .utils import LRUCache

if t.TYPE_CHECKING:
    import typing_extensions as te

    from .environment import Environment

# cache for the lexers. Exists in order to be able to have multiple
# environments with the same lexer
_lexer_cache: t.MutableMapping[t.Tuple, "Lexer"] = LRUCache(50)  # type: ignore

# static regular expressions
whitespace_re = re.compile(r"\s+")
newline_re = re.compile(r"(\r\n|\r|\n)")
string_re = re.compile(
    r"('([^'\\]*(?:\\.[^'\\]*)*)'" r'|"([^"\\]*(?:\\.[^"\\]*)*)")', re.S
)
integer_re = re.compile(
    r"""
    (
        0b(_?[0-1])+ # binary
    |
        0o(_?[0-7])+ # octal
    |
        0x(_?[\da-f])+ # hex
    |
        [1-9](_?\d)* # decimal
    |
        0(_?0)* # decimal zero
    )
    """,
    re.IGNORECASE | re.VERBOSE,
)
float_re = re.compile(
    r"""
    (?<!\.)  # doesn't start with a .
    (\d+_)*\d+  # digits, possibly _ separated
    (
        (\.(\d+_)*\d+)?  # decimal part
        e[+\-]?(\d+_)*\d+  # exponent part
    |
        \.(\d+_)*\d+  # required decimal part
    )
    """,
    re.IGNORECASE | re.VERBOSE,
)

# internal the tokens and keep references to them
TOKEN_ADD = intern("add")
TOKEN_ASSIGN = intern("assign")
TOKEN_COLON = intern("colon")
TOKEN_COMMA = intern("comma")
TOKEN_DIV = intern("div")
TOKEN_DOT = intern("dot")
TOKEN_EQ = intern("eq")
TOKEN_FLOORDIV = intern("floordiv")
TOKEN_GT = intern("gt")
TOKEN_GTEQ = intern("gteq")
TOKEN_LBRACE = intern("lbrace")
TOKEN_LBRACKET = intern("lbracket")
TOKEN_LPAREN = intern("lparen")
TOKEN_LT = intern("lt")
TOKEN_LTEQ = intern("lteq")
TOKEN_MOD = intern("mod")
TOKEN_MUL = intern("mul")
TOKEN_NE = intern("ne")
TOKEN_PIPE = intern("pipe")
TOKEN_POW = intern("pow")
TOKEN_RBRACE = intern("rbrace")
TOKEN_RBRACKET = intern("rbracket")
TOKEN_RPAREN = intern("rparen")
TOKEN_SEMICOLON = intern("semicolon")
TOKEN_SUB = intern("sub")
TOKEN_TILDE = intern("tilde")
TOKEN_WHITESPACE = intern("whitespace")
TOKEN_FLOAT = intern("float")
TOKEN_INTEGER = intern("integer")
TOKEN_NAME = intern("name")
TOKEN_STRING = intern("string")
TOKEN_OPERATOR = intern("operator")
TOKEN_BLOCK_BEGIN = intern("block_begin")
TOKEN_BLOCK_END = intern("block_end")
TOKEN_VARIABLE_BEGIN = intern("variable_begin")
TOKEN_VARIABLE_END = intern("variable_end")
TOKEN_RAW_BEGIN = intern("raw_begin")
TOKEN_RAW_END = intern("raw_end")
TOKEN_COMMENT_BEGIN = intern("comment_begin")
TOKEN_COMMENT_END = intern("comment_end")
TOKEN_COMMENT = intern("comment")
TOKEN_LINESTATEMENT_BEGIN = intern("linestatement_begin")
TOKEN_LINESTATEMENT_END = intern("linestatement_end")
TOKEN_LINECOMMENT_BEGIN = intern("linecomment_begin")
TOKEN_LINECOMMENT_END = intern("linecomment_end")
TOKEN_LINECOMMENT = intern("linecomment")
TOKEN_DATA = intern("data")
TOKEN_INITIAL = intern("initial")
TOKEN_EOF = intern("eof")

# bind operators to token types
operators = {
    "+": TOKEN_ADD,
    "-": TOKEN_SUB,
    "/": TOKEN_DIV,
    "//": TOKEN_FLOORDIV,
    "*": TOKEN_MUL,
    "%": TOKEN_MOD,
    "**": TOKEN_POW,
    "~": TOKEN_TILDE,
    "[": TOKEN_LBRACKET,
    "]": TOKEN_RBRACKET,
    "(": TOKEN_LPAREN,
    ")": TOKEN_RPAREN,
    "{": TOKEN_LBRACE,
    "}": TOKEN_RBRACE,
    "==": TOKEN_EQ,
    "!=": TOKEN_NE,
    ">": TOKEN_GT,
    ">=": TOKEN_GTEQ,
    "<": TOKEN_LT,
    "<=": TOKEN_LTEQ,
    "=": TOKEN_ASSIGN,
    ".": TOKEN_DOT,
    ":": TOKEN_COLON,
    "|": TOKEN_PIPE,
    ",": TOKEN_COMMA,
    ";": TOKEN_SEMICOLON,
}

reverse_operators = {v: k for k, v in operators.items()}
assert len(operators) == len(reverse_operators), "operators dropped"
operator_re = re.compile(
    f"({'|'.join(re.escape(x) for x in sorted(operators, key=lambda x: -len(x)))})"
)

ignored_tokens = frozenset(
    [
        TOKEN_COMMENT_BEGIN,
        TOKEN_COMMENT,
        TOKEN_COMMENT_END,
        TOKEN_WHITESPACE,
        TOKEN_LINECOMMENT_BEGIN,
        TOKEN_LINECOMMENT_END,
        TOKEN_LINECOMMENT,
    ]
)
ignore_if_empty = frozenset(
    [TOKEN_WHITESPACE, TOKEN_DATA, TOKEN_COMMENT, TOKEN_LINECOMMENT]
)


def _describe_token_type(token_type: str) -> str:
    if token_type in reverse_operators:
        return reverse_operators[token_type]

    return {
        TOKEN_COMMENT_BEGIN: "begin of comment",
        TOKEN_COMMENT_END: "end of comment",
        TOKEN_COMMENT: "comment",
        TOKEN_LINECOMMENT: "comment",
        TOKEN_BLOCK_BEGIN: "begin of statement block",
        TOKEN_BLOCK_END: "end of statement block",
        TOKEN_VARIABLE_BEGIN: "begin of print statement",
        TOKEN_VARIABLE_END: "end of print statement",
        TOKEN_LINESTATEMENT_BEGIN: "begin of line statement",
        TOKEN_LINESTATEMENT_END: "end of line statement",
        TOKEN_DATA: "template data / text",
        TOKEN_EOF: "end of template",
    }.get(token_type, token_type)


def describe_token(token: "Token") -> str:
    """Returns a description of the token."""
    if token.type is TOKEN_NAME:
        return token.value

    return _describe_token_type(token.type)


def describe_token_expr(expr: str) -> str:
    """Like `describe_token` but for token expressions."""
    if ":" in expr:
        type, value = expr.split(":", 1)

        if type is TOKEN_NAME:
            return value
    else:
        type = expr

    return _describe_token_type(type)


def count_newlines(value: str) -> int:
    """Count the number of newline characters in the string. This is
    useful for extensions that filter a stream.
    """
    return len(newline_re.findall(value))


def compile_rules(environment: "Environment") -> t.List[t.Tuple[str, str]]:
    """Compiles all the rules from the environment into a list of rules."""
    e = re.escape
    rules = [
        (
            len(environment.comment_start_string),
            TOKEN_COMMENT_BEGIN,
            e(environment.comment_start_string),
        ),
        (
            len(environment.block_start_string),
            TOKEN_BLOCK_BEGIN,
            e(environment.block_start_string),
        ),
        (
            len(environment.variable_start_string),
            TOKEN_VARIABLE_BEGIN,
            e(environment.variable_start_string),
        ),
    ]

    if environment.line_statement_prefix is not None:
        rules.append(
            (
                len(environment.line_statement_prefix),
                TOKEN_LINESTATEMENT_BEGIN,
                r"^[ \t\v]*" + e(environment.line_statement_prefix),
            )
        )
    if environment.line_comment_prefix is not None:
        rules.append(
            (
                len(environment.line_comment_prefix),
                TOKEN_LINECOMMENT_BEGIN,
                r"(?:^|(?<=\S))[^\S\r\n]*" + e(environment.line_comment_prefix),
            )
        )

    return [x[1:] for x in sorted(rules, reverse=True)]


class Failure:
    """Class that raises a `TemplateSyntaxError` if called.
    Used by the `Lexer` to specify known errors.
    """

    def __init__(
        self, message: str, cls: t.Type[TemplateSyntaxError] = TemplateSyntaxError
    ) -> None:
        self.message = message
        self.error_class = cls

    def __call__(self, lineno: int, filename: str) -> "te.NoReturn":
        raise self.error_class(self.message, lineno, filename)


class Token(t.NamedTuple):
    lineno: int
    type: str
    value: str

    def __str__(self) -> str:
        return describe_token(self)

    def test(self, expr: str) -> bool:
        """Test a token against a token expression.  This can either be a
        token type or ``'token_type:token_value'``.  This can only test
        against string values and types.
        """
        # here we do a regular string equality check as test_any is usually
        # passed an iterable of not interned strings.
        if self.type == expr:
            return True

        if ":" in expr:
            return expr.split(":", 1) == [self.type, self.value]

        return False

    def test_any(self, *iterable: str) -> bool:
        """Test against multiple token expressions."""
        return any(self.test(expr) for expr in iterable)


class TokenStreamIterator:
    """The iterator for tokenstreams.  Iterate over the stream
    until the eof token is reached.
    """

    def __init__(self, stream: "TokenStream") -> None:
        self.stream = stream

    def __iter__(self) -> "TokenStreamIterator":
        return self

    def __next__(self) -> Token:
        token = self.stream.current

        if token.type is TOKEN_EOF:
            self.stream.close()
            raise StopIteration

        next(self.stream)
        return token


class TokenStream:
    """A token stream is an iterable that yields :class:`Token`\\s.  The
    parser however does not iterate over it but calls :meth:`next` to go
    one token ahead.  The current active token is stored as :attr:`current`.
    """

    def __init__(
        self,
        generator: t.Iterable[Token],
        name: t.Optional[str],
        filename: t.Optional[str],
    ):
        self._iter = iter(generator)
        self._pushed: "te.Deque[Token]" = deque()
        self.name = name
        self.filename = filename
        self.closed = False
        self.current = Token(1, TOKEN_INITIAL, "")
        next(self)

    def __iter__(self) -> TokenStreamIterator:
        return TokenStreamIterator(self)

    def __bool__(self) -> bool:
        return bool(self._pushed) or self.current.type is not TOKEN_EOF

    @property
    def eos(self) -> bool:
        """Are we at the end of the stream?"""
        return not self

    def push(self, token: Token) -> None:
        """Push a token back to the stream."""
        self._pushed.append(token)

    def look(self) -> Token:
        """Look at the next token."""
        old_token = next(self)
        result = self.current
        self.push(result)
        self.current = old_token
        return result

    def skip(self, n: int = 1) -> None:
        """Got n tokens ahead."""
        for _ in range(n):
            next(self)

    def next_if(self, expr: str) -> t.Optional[Token]:
        """Perform the token test and return the token if it matched.
        Otherwise the return value is `None`.
        """
        if self.current.test(expr):
            return next(self)

        return None

    def skip_if(self, expr: str) -> bool:
        """Like :meth:`next_if` but only returns `True` or `False`."""
        return self.next_if(expr) is not None

    def __next__(self) -> Token:
        """Go one token ahead and return the old one.

        Use the built-in :func:`next` instead of calling this directly.
        """
        rv = self.current

        if self._pushed:
            self.current = self._pushed.popleft()
        elif self.current.type is not TOKEN_EOF:
            try:
                self.current = next(self._iter)
            except StopIteration:
                self.close()

        return rv

    def close(self) -> None:
        """Close the stream."""
        self.current = Token(self.current.lineno, TOKEN_EOF, "")
        self._iter = iter(())
        self.closed = True

    def expect(self, expr: str) -> Token:
        """Expect a given token type and return it.  This accepts the same
        argument as :meth:`jinja2.lexer.Token.test`.
        """
        if not self.current.test(expr):
            expr = describe_token_expr(expr)

            if self.current.type is TOKEN_EOF:
                raise TemplateSyntaxError(
                    f"unexpected end of template, expected {expr!r}.",
                    self.current.lineno,
                    self.name,
                    self.filename,
                )

            raise TemplateSyntaxError(
                f"expected token {expr!r}, got {describe_token(self.current)!r}",
                self.current.lineno,
                self.name,
                self.filename,
            )

        return next(self)


def get_lexer(environment: "Environment") -> "Lexer":
    """Return a lexer which is probably cached."""
    key = (
        environment.block_start_string,
        environment.block_end_string,
        environment.variable_start_string,
        environment.variable_end_string,
        environment.comment_start_string,
        environment.comment_end_string,
        environment.line_statement_prefix,
        environment.line_comment_prefix,
        environment.trim_blocks,
        environment.lstrip_blocks,
        environment.newline_sequence,
        environment.keep_trailing_newline,
    )
    lexer = _lexer_cache.get(key)

    if lexer is None:
        _lexer_cache[key] = lexer = Lexer(environment)

    return lexer


class OptionalLStrip(tuple):
    """A special tuple for marking a point in the state that can have
    lstrip applied.
    """

    __slots__ = ()

    # Even though it looks like a no-op, creating instances fails
    # without this.
    def __new__(cls, *members, **kwargs):  # type: ignore
        return super().__new__(cls, members)


class _Rule(t.NamedTuple):
    pattern: t.Pattern[str]
    tokens: t.Union[str, t.Tuple[str, ...], t.Tuple[str, str, str]]
    command: t.Optional[str]


class Lexer:
    """Class that implements a lexer for a given environment. Automatically
    created by the environment class, usually you don't have to do that.

    Note that the lexer is not automatically bound to an environment.
    Multiple environments can share the same lexer.
    """

    def __init__(self, environment: "Environment") -> None:
        # shortcuts
        e = re.escape

        def c(x: str) -> t.Pattern[str]:
            return re.compile(x, re.M | re.S)

        # lexing rules for tags
        tag_rules: t.List[_Rule] = [
            _Rule(whitespace_re, TOKEN_WHITESPACE, None),
            _Rule(float_re, TOKEN_FLOAT, None),
            _Rule(integer_re, TOKEN_INTEGER, None),
            _Rule(name_re, TOKEN_NAME, None),
            _Rule(string_re, TOKEN_STRING, None),
            _Rule(operator_re, TOKEN_OPERATOR, None),
        ]

        # assemble the root lexing rule. because "|" is ungreedy
        # we have to sort by length so that the lexer continues working
        # as expected when we have parsing rules like <% for block and
        # <%= for variables. (if someone wants asp like syntax)
        # variables are just part of the rules if variable processing
        # is required.
        root_tag_rules = compile_rules(environment)

        block_start_re = e(environment.block_start_string)
        block_end_re = e(environment.block_end_string)
        comment_end_re = e(environment.comment_end_string)
        variable_end_re = e(environment.variable_end_string)

        # block suffix if trimming is enabled
        block_suffix_re = "\\n?" if environment.trim_blocks else ""

        # If lstrip is enabled, it is not allowed if there is any
        # non-whitespace between the newline and block.
        self.lstrip_unless_re = c(r"[^ \t]") if environment.lstrip_blocks else None

        self.newline_sequence = environment.newline_sequence
        self.keep_trailing_newline = environment.keep_trailing_newline

        root_raw_re = (
            rf"(?P<raw_begin>{block_start_re}(\-|\+|)\s*raw\s*"
            rf"(?:\-{block_end_re}\s*|{block_end_re}))"
        )
        root_parts_re = "|".join(
            [root_raw_re] + [rf"(?P<{n}>{r}(\-|\+|))" for n, r in root_tag_rules]
        )

        # global lexing rules
        self.rules: t.Dict[str, t.List[_Rule]] = {
            "root": [
                # directives
                _Rule(
                    c(rf"(.*?)(?:{root_parts_re})"),
                    OptionalLStrip(TOKEN_DATA, "#bygroup"),  # type: ignore
                    "#bygroup",
                ),
                # data
                _Rule(c(".+"), TOKEN_DATA, None),
            ],
            # comments
            TOKEN_COMMENT_BEGIN: [
                _Rule(
                    c(
                        rf"(.*?)((?:\+{comment_end_re}|\-{comment_end_re}\s*"
                        rf"|{comment_end_re}{block_suffix_re}))"
                    ),
                    (TOKEN_COMMENT, TOKEN_COMMENT_END),
                    "#pop",
                ),
                _Rule(c(r"(.)"), (Failure("Missing end of comment tag"),), None),
            ],
            # blocks
            TOKEN_BLOCK_BEGIN: [
                _Rule(
                    c(
                        rf"(?:\+{block_end_re}|\-{block_end_re}\s*"
                        rf"|{block_end_re}{block_suffix_re})"
                    ),
                    TOKEN_BLOCK_END,
                    "#pop",
                ),
            ]
            + tag_rules,
            # variables
            TOKEN_VARIABLE_BEGIN: [
                _Rule(
                    c(rf"\-{variable_end_re}\s*|{variable_end_re}"),
                    TOKEN_VARIABLE_END,
                    "#pop",
                )
            ]
            + tag_rules,
            # raw block
            TOKEN_RAW_BEGIN: [
                _Rule(
                    c(
                        rf"(.*?)((?:{block_start_re}(\-|\+|))\s*endraw\s*"
                        rf"(?:\+{block_end_re}|\-{block_end_re}\s*"
                        rf"|{block_end_re}{block_suffix_re}))"
                    ),
                    OptionalLStrip(TOKEN_DATA, TOKEN_RAW_END),  # type: ignore
                    "#pop",
                ),
                _Rule(c(r"(.)"), (Failure("Missing end of raw directive"),), None),
            ],
            # line statements
            TOKEN_LINESTATEMENT_BEGIN: [
                _Rule(c(r"\s*(\n|$)"), TOKEN_LINESTATEMENT_END, "#pop")
            ]
            + tag_rules,
            # line comments
            TOKEN_LINECOMMENT_BEGIN: [
                _Rule(
                    c(r"(.*?)()(?=\n|$)"),
                    (TOKEN_LINECOMMENT, TOKEN_LINECOMMENT_END),
                    "#pop",
                )
            ],
        }

    def _normalize_newlines(self, value: str) -> str:
        """Replace all newlines with the configured sequence in strings
        and template data.
        """
        return newline_re.sub(self.newline_sequence, value)

    def tokenize(
        self,
        source: str,
        name: t.Optional[str] = None,
        filename: t.Optional[str] = None,
        state: t.Optional[str] = None,
    ) -> TokenStream:
        """Calls tokeniter + tokenize and wraps it in a token stream."""
        stream = self.tokeniter(source, name, filename, state)
        return TokenStream(self.wrap(stream, name, filename), name, filename)

    def wrap(
        self,
        stream: t.Iterable[t.Tuple[int, str, str]],
        name: t.Optional[str] = None,
        filename: t.Optional[str] = None,
    ) -> t.Iterator[Token]:
        """This is called with the stream as returned by `tokenize` and wraps
        every token in a :class:`Token` and converts the value.
        """
        for lineno, token, value_str in stream:
            if token in ignored_tokens:
                continue

            value: t.Any = value_str

            if token == TOKEN_LINESTATEMENT_BEGIN:
                token = TOKEN_BLOCK_BEGIN
            elif token == TOKEN_LINESTATEMENT_END:
                token = TOKEN_BLOCK_END
            # we are not interested in those tokens in the parser
            elif token in (TOKEN_RAW_BEGIN, TOKEN_RAW_END):
                continue
            elif token == TOKEN_DATA:
                value = self._normalize_newlines(value_str)
            elif token == "keyword":
                token = value_str
            elif token == TOKEN_NAME:
                value = value_str

                if not value.isidentifier():
                    raise TemplateSyntaxError(
                        "Invalid character in identifier", lineno, name, filename
                    )
            elif token == TOKEN_STRING:
                # try to unescape string
                try:
                    value = (
                        self._normalize_newlines(value_str[1:-1])
                        .encode("ascii", "backslashreplace")
                        .decode("unicode-escape")
                    )
                except Exception as exc:
                    msg = str(exc).split(":")[-1].strip()
                    raise TemplateSyntaxError(msg, lineno, name, filename) from exc
            elif token == TOKEN_INTEGER:
                value = int(value_str.replace("_", ""), 0)
            elif token == TOKEN_FLOAT:
                # remove all "_" first to support more Python versions
                value = literal_eval(value_str.replace("_", ""))
            elif token == TOKEN_OPERATOR:
                token = operators[value_str]

            yield Token(lineno, token, value)

    def tokeniter(
        self,
        source: str,
        name: t.Optional[str],
        filename: t.Optional[str] = None,
        state: t.Optional[str] = None,
    ) -> t.Iterator[t.Tuple[int, str, str]]:
        """This method tokenizes the text and returns the tokens in a
        generator. Use this method if you just want to tokenize a template.

        .. versionchanged:: 3.0
            Only ``\\n``, ``\\r\\n`` and ``\\r`` are treated as line
            breaks.
        """
        lines = newline_re.split(source)[::2]

        if not self.keep_trailing_newline and lines[-1] == "":
            del lines[-1]

        source = "\n".join(lines)
        pos = 0
        lineno = 1
        stack = ["root"]

        if state is not None and state != "root":
            assert state in ("variable", "block"), "invalid state"
            stack.append(state + "_begin")

        statetokens = self.rules[stack[-1]]
        source_length = len(source)
        balancing_stack: t.List[str] = []
        lstrip_unless_re = self.lstrip_unless_re
        newlines_stripped = 0
        line_starting = True

        while True:
            # tokenizer loop
            for regex, tokens, new_state in statetokens:
                m = regex.match(source, pos)

                # if no match we try again with the next rule
                if m is None:
                    continue

                # we only match blocks and variables if braces / parentheses
                # are balanced. continue parsing with the lower rule which
                # is the operator rule. do this only if the end tags look
                # like operators
                if balancing_stack and tokens in (
                    TOKEN_VARIABLE_END,
                    TOKEN_BLOCK_END,
                    TOKEN_LINESTATEMENT_END,
                ):
                    continue

                # tuples support more options
                if isinstance(tokens, tuple):
                    groups: t.Sequence[str] = m.groups()

                    if isinstance(tokens, OptionalLStrip):
                        # Rule supports lstrip. Match will look like
                        # text + block or variable begin tag.
                        text = groups[0]
                        # Skipping the text and first match group.
                        # Getting the stripped whitespace flags
                        strip_sign = next(g for g in groups[2::2] if g is not None)

                        if strip_sign == "-":
                            # Strip all whitespace between the text and the tag.
                            stripped = text.rstrip()
                            newlines_stripped = text[len(stripped) :].count("\n")
                            groups = [stripped, *groups[1:]]
                        elif (
                            # Not marked for preserving whitespace.
                            strip_sign != "+"
                            # lstrip is enabled.
                            and lstrip_unless_re is not None
                            # Not a variable expression.
                            and not m.groupdict().get(TOKEN_VARIABLE_BEGIN)
                        ):
                            # The start of text between the last newline and the tag.
                            l_pos = text.rfind("\n") + 1

                            if l_pos > 0 or line_starting:
                                # If there's only whitespace between the newline
                                # and the tag, strip it.
                                if not lstrip_unless_re.search(text, l_pos):
                                    groups = [text[:l_pos], *groups[1:]]

                    for idx, token in enumerate(tokens):
                        # failure group
                        if token.__class__ is Failure:
                            raise token(lineno, filename)
                        # bygroup is a bit more complex, in that case we
                        # yield for the current token the first named
                        # group that matched
                        elif token == "#bygroup":
                            for key, value in m.groupdict().items():
                                if value is not None:
                                    yield lineno, key, value
                                    lineno += value.count("\n")
                                    break
                            else:
                                raise RuntimeError(
                                    f"{regex!r} wanted to resolve the token dynamically"
                                    " but no group matched"
                                )
                        # normal group
                        else:
                            data = groups[idx]

                            if data or token not in ignore_if_empty:
                                yield lineno, token, data

                            lineno += data.count("\n") + newlines_stripped
                            newlines_stripped = 0
                # strings as token just are yielded as it
                else:
                    data = m.group()

                    # update brace/parenthesis balance
                    if tokens == TOKEN_OPERATOR:
                        if data == "{":
                            balancing_stack.append("}")
                        elif data == "(":
                            balancing_stack.append(")")
                        elif data == "[":
                            balancing_stack.append("]")
                        elif data in ("}", ")", "]"):
                            if not balancing_stack:
                                raise TemplateSyntaxError(
                                    f"unexpected '{data}'", lineno, name, filename
                                )

                            expected_op = balancing_stack.pop()

                            if expected_op != data:
                                raise TemplateSyntaxError(
                                    f"unexpected '{data}', expected '{expected_op}'",
                                    lineno,
                                    name,
                                    filename,
                                )

                    # yield items
                    if data or tokens not in ignore_if_empty:
                        yield lineno, tokens, data

                    lineno += data.count("\n")

                line_starting = m.group()[-1:] == "\n"
                # fetch new position into new variable so that we can check
                # if there is a internal parsing error which would result
                # in an infinite loop
                pos2 = m.end()

                # handle state changes
                if new_state is not None:
                    # remove the uppermost state
                    if new_state == "#pop":
                        stack.pop()
                    # resolve the new state by group checking
                    elif new_state == "#bygroup":
                        for key, value in m.groupdict().items():
                            if value is not None:
                                stack.append(key)
                                break
                        else:
                            raise RuntimeError(
                                f"{regex!r} wanted to resolve the new state dynamically"
                                " but no group matched"
                            )
                    # direct state name given
                    else:
                        stack.append(new_state)

                    statetokens = self.rules[stack[-1]]
                # we are still at the same position and no stack change.
                # this means a loop without break condition, avoid that and
                # raise error
                elif pos2 == pos:
                    raise RuntimeError(
                        f"{regex!r} yielded empty string without stack change"
                    )

                # publish new function and start again
                pos = pos2
                break
            # if loop terminated without break we haven't found a single match
            # either we are at the end of the file or we have a problem
            else:
                # end of text
                if pos >= source_length:
                    return

                # something went wrong
                raise TemplateSyntaxError(
                    f"unexpected char {source[pos]!r} at {pos}", lineno, name, filename
                )