How to use the normalize_whitespace method in Robot Framework

The Python snippet below, taken from an autograder for a cellular-automaton exercise, compares whitespace-heavy grid output using the +NORMALIZE_WHITESPACE option.
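The `# doctest: +NORMALIZE_WHITESPACE` directive used throughout the snippet below is an option of Python's standard `doctest` module. A minimal, self-contained sketch of what the flag does (`render_row` is a hypothetical helper, not part of the autograder):

```python
import doctest

def render_row():
    # Hypothetical helper: prints a row with irregular runs of spaces,
    # as grid-shaped program output often does.
    print("0  0   1 0")

def demo():
    """
    >>> render_row()  # doctest: +NORMALIZE_WHITESPACE
    0 0 1 0
    """

# +NORMALIZE_WHITESPACE makes doctest split both the expected and the
# actual output on whitespace and compare the token sequences, so
# differing runs of spaces or line wrapping no longer cause a failure.
failures = doctest.testmod().failed
assert failures == 0
```

Without the flag, the comparison above would fail, because doctest normally requires the output to match character for character.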

autograder.py

Source: autograder.py (GitHub)
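The doctests below drive a `generate(rule, n)` function imported from a `cellular` module that is not shown on this page. As a rough sketch of what a compatible implementation could look like (the grid geometry and the plain-text PBM "P1" header are inferred from the expected output; the real `cellular` module may differ):

```python
def generate(rule, n):
    # Hypothetical re-implementation, for illustration only: run an
    # elementary cellular automaton for n steps on a zero-padded row of
    # width 2*n + 1, starting from a single live cell in the centre,
    # and print the history as a plain-text PBM ("P1") bitmap.
    width, height = 2 * n + 1, n + 1
    row = [0] * width
    row[n] = 1                                   # single seed cell
    print(f"P1 {width} {height}")
    for _ in range(height):
        print(" ".join(str(cell) for cell in row))
        # Wolfram coding: the new cell is bit (4*left + 2*centre + right)
        # of the rule number; cells beyond the edges count as dead.
        padded = [0] + row + [0]
        row = [
            (rule >> (4 * padded[i] + 2 * padded[i + 1] + padded[i + 2])) & 1
            for i in range(width)
        ]
```

For example, rule 2 (binary 00000010) turns a cell on only when its right neighbour was on, so the single seed cell drifts one column left per step, matching the `generate(2, 5)` doctest below.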


import doctest
from cellular import generate
def test_generate():
    """
    >>> generate(0, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    >>> generate(0, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(1, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 0 0 0 1 1 1 1
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 0 0 0 1 1 1 1
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 0 0 0 1 1 1 1
    >>> generate(1, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(2, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 0 0 0 0 0 0
    0 0 0 1 0 0 0 0 0 0 0
    0 0 1 0 0 0 0 0 0 0 0
    0 1 0 0 0 0 0 0 0 0 0
    1 0 0 0 0 0 0 0 0 0 0
    >>> generate(2, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(3, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 1 0 0 1 1 1 1
    0 0 0 0 0 0 1 0 0 0 0
    1 1 1 1 1 1 0 0 1 1 1
    0 0 0 0 0 0 0 1 0 0 0
    1 1 1 1 1 1 1 0 0 1 1
    >>> generate(3, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
    >>> generate(4, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    >>> generate(4, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(7, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 1 1 0 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1
    >>> generate(7, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(12, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    >>> generate(12, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(14, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 1 0 0 0 0 0
    0 0 0 1 1 0 0 0 0 0 0
    0 0 1 1 0 0 0 0 0 0 0
    0 1 1 0 0 0 0 0 0 0 0
    1 1 0 0 0 0 0 0 0 0 0
    >>> generate(14, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(15, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 1 1 0 1 1 1 1
    1 0 0 0 0 0 0 1 0 0 0
    1 0 1 1 1 1 1 1 0 1 1
    1 0 1 0 0 0 0 0 0 1 0
    1 0 1 0 1 1 1 1 1 1 0
    >>> generate(15, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1
    1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
    1 0 1 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1
    1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
    1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
    >>> generate(18, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 0 1 0 0 0 0
    0 0 0 1 0 0 0 1 0 0 0
    0 0 1 0 1 0 1 0 1 0 0
    0 1 0 0 0 0 0 0 0 1 0
    1 0 1 0 0 0 0 0 1 0 1
    >>> generate(18, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0
    0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0
    0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
    0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0
    0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0
    0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0
    1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1
    >>> generate(22, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 1 1 0 0 0 0
    0 0 0 1 0 0 0 1 0 0 0
    0 0 1 1 1 0 1 1 1 0 0
    0 1 0 0 0 0 0 0 0 1 0
    1 1 1 0 0 0 0 0 1 1 1
    >>> generate(22, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0
    0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0
    0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
    0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0
    0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0
    0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0
    1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1
    >>> generate(30, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 1 1 0 0 0 0
    0 0 0 1 1 0 0 1 0 0 0
    0 0 1 1 0 1 1 1 1 0 0
    0 1 1 0 0 1 0 0 0 1 0
    1 1 0 1 1 1 1 0 1 1 1
    >>> generate(30, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 0 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 1 1 0 0 1 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 1 1 1 1 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 0 1 1 0 1 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 1 1 0 0 1 1 0 0 1 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 1 1 0 1 1 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 1 1 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 0 0 0 0 0 0
    0 0 0 0 0 1 1 0 1 1 1 1 0 0 1 1 0 1 0 0 1 0 1 1 1 1 1 0 0 1 1 1 1 1 1 1 0 0 0 0 0
    0 0 0 0 1 1 0 0 1 0 0 0 1 1 1 0 0 1 1 1 1 0 1 0 0 0 0 1 1 1 0 0 0 0 0 0 1 0 0 0 0
    0 0 0 1 1 0 1 1 1 1 0 1 1 0 0 1 1 1 0 0 0 0 1 1 0 0 1 1 0 0 1 0 0 0 0 1 1 1 0 0 0
    0 0 1 1 0 0 1 0 0 0 0 1 0 1 1 1 0 0 1 0 0 1 1 0 1 1 1 0 1 1 1 1 0 0 1 1 0 0 1 0 0
    0 1 1 0 1 1 1 1 0 0 1 1 0 1 0 0 1 1 1 1 1 1 0 0 1 0 0 0 1 0 0 0 1 1 1 0 1 1 1 1 0
    1 1 0 0 1 0 0 0 1 1 1 0 0 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 1 1 0 1 1 0 0 0 1 0 0 0 1
    >>> generate(32, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    >>> generate(32, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(41, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 0 0 0 1 1 1 1
    1 0 0 0 0 1 0 1 0 0 0
    0 0 1 1 0 0 1 0 0 1 1
    1 0 1 0 0 0 0 0 0 1 0
    0 1 0 0 1 1 1 1 0 0 0
    >>> generate(41, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1
    1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
    0 1 0 0 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1
    0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0
    1 1 1 0 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1
    1 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
    0 0 1 0 0 1 0 0 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1
    1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0
    0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1
    1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(45, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 0 1 0 1 1 1 1
    1 0 0 0 1 1 1 1 0 0 0
    1 0 1 0 1 0 0 0 0 1 1
    1 1 1 1 1 0 1 1 0 1 0
    1 0 0 0 0 1 1 0 1 1 0
    >>> generate(45, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 1 1 0 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    1 0 1 1 0 1 0 0 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 1 0 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 0 1 1 0 0 1 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0
    1 0 0 1 1 0 0 0 1 0 1 1 1 1 0 1 0 0 0 0 1 1 1 1 1 0 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1
    1 0 0 1 0 0 1 0 1 1 1 0 0 0 1 1 0 1 1 0 1 0 0 0 0 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0
    1 0 0 1 0 0 1 1 1 0 0 0 1 0 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 1 1 0 1 1 1 1 1 1 1
    1 0 0 1 0 0 1 0 0 0 1 0 1 1 1 1 1 0 1 1 0 1 1 0 1 1 1 1 0 0 0 1 0 1 1 0 0 0 0 0 0
    1 0 0 1 0 0 1 0 1 0 1 1 1 0 0 0 0 1 1 0 1 1 0 1 1 0 0 0 0 1 0 1 1 1 0 0 1 1 1 1 1
    1 0 0 1 0 0 1 1 1 1 1 0 0 0 1 1 0 1 0 1 1 0 1 1 0 0 1 1 0 1 1 1 0 0 0 0 1 0 0 0 0
    1 0 0 1 0 0 1 0 0 0 0 0 1 0 1 0 1 1 1 1 0 1 1 0 0 0 1 0 1 1 0 0 0 1 1 0 1 0 1 1 1
    1 0 0 1 0 0 1 0 1 1 1 0 1 1 1 1 1 0 0 0 1 1 0 0 1 0 1 1 1 0 0 1 0 1 0 1 1 1 1 0 0
    1 0 0 1 0 0 1 1 1 0 0 1 1 0 0 0 0 0 1 0 1 0 0 0 1 1 1 0 0 0 0 1 1 1 1 1 0 0 0 0 1
    1 0 0 1 0 0 1 0 0 0 0 1 0 0 1 1 1 0 1 1 1 0 1 0 1 0 0 0 1 1 0 1 0 0 0 0 0 1 1 0 1
    >>> generate(48, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 1 0 0 0 0
    0 0 0 0 0 0 0 1 0 0 0
    0 0 0 0 0 0 0 0 1 0 0
    0 0 0 0 0 0 0 0 0 1 0
    0 0 0 0 0 0 0 0 0 0 1
    >>> generate(48, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
    >>> generate(50, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 0 1 0 0 0 0
    0 0 0 1 0 1 0 1 0 0 0
    0 0 1 0 1 0 1 0 1 0 0
    0 1 0 1 0 1 0 1 0 1 0
    1 0 1 0 1 0 1 0 1 0 1
    >>> generate(50, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0
0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0519 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0520 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0521 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0522 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0523 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0524 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0525 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0526 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0527 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0528 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0529 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0530 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0531 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1532 >>> generate(51, 5) # doctest: +NORMALIZE_WHITESPACE533 P1 11 6534 0 0 0 0 0 1 0 0 0 0 0535 1 1 1 1 1 0 1 1 1 1 1536 0 0 0 0 0 1 0 0 0 0 0537 1 1 1 1 1 0 1 1 1 1 1538 0 0 0 0 0 1 0 0 0 0 0539 1 1 1 1 1 0 1 1 1 1 1540 >>> generate(51, 20) # doctest: +NORMALIZE_WHITESPACE541 P1 41 21542 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0543 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1544 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0545 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1546 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0547 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1548 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0549 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1550 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0551 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1552 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0553 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1554 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0555 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1556 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0557 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1558 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0559 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1560 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0561 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1562 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0563 >>> generate(54, 5) # doctest: +NORMALIZE_WHITESPACE564 P1 11 6565 0 0 0 0 0 1 0 0 0 0 0566 0 0 0 0 1 1 1 0 0 0 0567 0 0 0 1 0 0 0 1 0 0 0568 0 0 1 1 1 0 1 1 1 0 0569 0 1 0 0 0 1 0 0 0 1 0570 1 1 1 0 1 1 1 0 1 1 1571 >>> generate(54, 20) # doctest: +NORMALIZE_WHITESPACE572 P1 41 21573 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0574 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0575 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0576 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0577 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0578 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 
0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0579 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0580 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0581 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0582 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0583 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0584 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0585 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0586 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0587 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0588 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0589 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0590 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0591 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0592 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0593 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1594 >>> generate(56, 5) # doctest: +NORMALIZE_WHITESPACE595 P1 11 6596 0 0 0 0 0 1 0 0 0 0 0597 0 0 0 0 0 0 1 0 0 0 0598 0 0 0 0 0 0 0 1 0 0 0599 0 0 0 0 0 0 0 0 1 0 0600 0 0 0 0 0 0 0 0 0 1 0601 0 0 0 0 0 0 0 0 0 0 1602 >>> generate(56, 20) # doctest: +NORMALIZE_WHITESPACE603 P1 41 21604 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0605 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0606 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0607 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0608 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0609 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0610 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0611 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0612 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0613 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0614 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0615 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0616 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0617 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0618 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0619 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0620 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0621 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0622 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0623 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0624 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1625 >>> generate(57, 5) # doctest: +NORMALIZE_WHITESPACE626 P1 11 6627 0 0 0 0 0 1 0 0 0 0 0628 1 1 1 1 0 0 1 1 1 1 1629 1 0 0 0 1 0 1 0 0 0 0630 0 1 1 0 0 1 0 1 1 1 1631 0 1 0 1 0 0 1 1 0 0 0632 0 0 1 0 1 0 1 0 1 1 1633 >>> generate(57, 20) # doctest: +NORMALIZE_WHITESPACE634 P1 41 21635 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0636 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1637 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0638 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 
1 1 1 1 1639 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0640 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1641 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0642 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1643 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0644 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1645 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0646 0 0 1 1 0 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 0 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1647 1 0 1 0 1 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0648 0 1 0 1 1 0 1 1 1 1 1 1 1 0 0 1 0 0 1 0 0 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1649 0 0 1 1 0 1 1 0 0 0 0 0 0 1 0 0 1 0 0 1 0 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0650 1 0 1 0 1 1 0 1 1 1 1 1 0 0 1 0 0 1 0 0 1 0 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1651 0 1 0 1 1 0 1 1 0 0 0 0 1 0 0 1 0 0 1 0 0 1 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0652 0 0 1 1 0 1 1 0 1 1 1 0 0 1 0 0 1 0 0 1 0 1 0 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1653 1 0 1 0 1 1 0 1 1 0 0 1 0 0 1 0 0 1 0 0 1 0 1 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0654 0 1 0 1 1 0 1 1 0 1 0 0 1 0 0 1 0 0 1 0 0 1 1 0 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1655 0 0 1 1 0 1 1 0 1 0 1 0 0 1 0 0 1 0 0 1 0 1 0 1 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0656 >>> generate(60, 5) # doctest: +NORMALIZE_WHITESPACE657 P1 11 6658 0 0 0 0 0 1 0 0 0 0 0659 0 0 0 0 0 1 1 0 0 0 0660 0 0 0 0 0 1 0 1 0 0 0661 0 0 0 0 0 1 1 1 1 0 0662 0 0 0 0 0 1 0 0 0 1 0663 0 0 0 0 0 1 1 0 0 1 1664 >>> generate(60, 20) # doctest: +NORMALIZE_WHITESPACE665 P1 41 21666 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0667 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0668 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0669 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0670 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0671 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0672 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0673 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0674 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0675 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0676 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0677 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0678 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0679 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0680 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0681 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0682 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0683 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0684 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0685 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0686 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1687 >>> generate(62, 5) # doctest: +NORMALIZE_WHITESPACE688 P1 11 6689 0 0 0 0 0 1 0 0 0 0 0690 0 0 0 0 1 1 1 0 0 0 0691 0 0 0 1 1 0 0 1 0 0 0692 0 0 1 1 0 1 1 1 1 0 0693 0 1 1 0 1 1 0 0 0 1 0694 1 1 0 1 1 0 1 0 1 1 1695 >>> generate(62, 20) # doctest: +NORMALIZE_WHITESPACE696 P1 41 21697 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0698 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0699 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0700 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0701 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0702 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0703 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0704 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0705 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0706 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 1 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0707 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 0 0 1 1 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0708 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0709 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 1 1 0 1 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0710 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 0 0 1 1 1 1 0 1 0 1 1 1 0 0 0 0 0 0 0711 0 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0 0 1 0 0 0 0 0 0712 0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0 0 0 0 0713 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 0 0 1 1 1 1 0 1 0 1 1 0 0 0 1 0 0 0 0714 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0 1 0 1 1 1 0 0 0715 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0 0 1 0 0716 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 0 0 1 1 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0717 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 0 1 1 1 1 0 1 0 1 1 0 0 0 1718 >>> generate(73, 5) # doctest: +NORMALIZE_WHITESPACE719 P1 11 6720 0 0 0 0 0 1 0 0 0 0 0721 1 1 1 1 0 0 0 1 1 1 1722 1 0 0 1 0 1 0 1 0 0 1723 0 0 0 0 0 0 0 0 0 0 0724 1 1 1 1 1 1 1 1 1 1 1725 1 0 0 0 0 0 0 0 0 0 1726 >>> generate(73, 20) # doctest: +NORMALIZE_WHITESPACE727 P1 41 21728 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0729 1 1 1 1 1 1 1 1 1 1 1 1 1 1 
1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1730 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1731 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0732 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1733 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 0 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0734 1 1 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 1 1735 1 0 1 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 1 0 1736 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 1 1 0 0 0 1 0 0 0 1 1 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0737 1 1 1 0 0 0 0 0 1 1 1 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 0 0 1 1 1 0 0 0 0 0 1 1 1738 1 0 1 0 1 1 1 0 1 0 1 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 1 0 1 1 1 0 1 0 1739 0 0 0 0 1 0 1 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0740 1 1 1 0 0 0 0 0 1 1 1 1 1 0 1 0 0 0 1 0 1 0 1 0 0 0 1 0 1 1 1 1 1 0 0 0 0 0 1 1 1741 1 0 1 0 1 1 1 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 1 1 1 0 1 0 1742 0 0 0 0 1 0 1 0 0 0 1 0 0 0 1 0 0 0 1 1 1 1 1 0 0 0 1 0 0 0 1 0 0 0 1 0 1 0 0 0 0743 1 1 1 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 1 0 0 0 1 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 1 1 1744 1 0 1 0 1 1 1 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 1 1 1 0 1 0 1745 0 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 1 1 0 0 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 0746 1 1 1 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 1 1 1747 1 0 1 0 1 1 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 1 1 1 0 1 0 1748 0 0 0 0 1 0 0 0 1 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 0 0 0 1 0 0 0 0749 >>> generate(85, 5) # doctest: +NORMALIZE_WHITESPACE750 P1 11 6751 0 0 0 0 0 1 0 0 0 0 0752 1 1 1 1 0 1 1 1 1 1 1753 0 0 0 1 0 0 0 0 0 0 1754 1 1 0 1 1 1 1 1 1 0 1755 0 1 0 0 0 0 0 0 1 0 1756 0 1 1 1 1 1 1 0 1 0 1757 >>> generate(85, 20) # doctest: +NORMALIZE_WHITESPACE758 P1 41 21759 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0760 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1761 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1762 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1763 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1764 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1765 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1766 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1767 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1768 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 1769 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1770 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 1 0 1771 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1772 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1773 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1774 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1775 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1776 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1777 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1778 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1779 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1780 >>> generate(86, 5) # doctest: +NORMALIZE_WHITESPACE781 P1 11 6782 0 0 0 0 0 1 0 0 0 0 0783 0 0 0 0 1 1 1 0 0 0 0784 0 0 0 1 0 0 1 1 0 0 0785 0 0 1 1 1 1 0 1 1 0 0786 0 1 0 0 0 1 0 0 1 1 0787 1 1 1 0 1 1 1 1 0 1 1788 >>> generate(86, 20) # doctest: 
+NORMALIZE_WHITESPACE789 P1 41 21790 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0791 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0792 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0793 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0794 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0795 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0796 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0797 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 0 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0798 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0799 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 1 0 0 1 1 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0800 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 1 1 1 1 0 1 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0801 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 0 0 0 0 1 0 1 1 0 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0802 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 1 1 0 0 1 1 1 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0803 0 0 0 0 0 0 0 1 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 1 1 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0804 0 0 0 0 0 0 1 0 0 1 0 0 1 1 1 0 0 1 0 0 0 1 1 1 0 1 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0805 0 0 0 0 0 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 1 0 0 1 0 1 1 0 0 1 1 1 1 0 1 1 0 0 0 0 0806 0 0 0 0 1 0 0 0 0 0 0 1 1 1 0 0 0 0 1 0 1 1 1 1 0 0 1 1 1 0 0 0 1 0 0 1 1 0 0 0 0807 0 0 0 1 1 1 0 0 0 0 1 0 0 1 1 0 0 1 1 0 0 0 0 1 1 1 0 0 1 1 0 1 1 1 1 0 1 1 0 0 0808 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 1 1 0 1 1 0 0 1 0 0 1 1 1 0 1 0 0 0 0 1 0 0 1 1 0 0809 0 1 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 0 1 1 1 1 1 1 0 0 1 0 1 1 0 0 1 1 1 1 0 1 1 0810 1 0 0 0 1 0 0 0 1 1 0 1 1 1 0 1 1 1 1 0 0 0 0 0 1 1 1 1 0 0 1 1 1 0 0 0 1 0 0 1 1811 >>> generate(90, 5) # doctest: +NORMALIZE_WHITESPACE812 P1 11 6813 0 0 0 0 0 1 0 0 0 0 0814 0 0 0 0 1 0 1 0 0 0 0815 0 0 0 1 0 0 0 1 0 0 0816 0 0 1 0 1 0 1 0 1 0 0817 0 1 0 0 0 
0 0 0 0 1 0818 1 0 1 0 0 0 0 0 1 0 1819 >>> generate(90, 20) # doctest: +NORMALIZE_WHITESPACE820 P1 41 21821 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0822 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0823 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0824 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0825 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0826 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0827 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0828 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0829 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0830 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0831 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0832 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0833 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0834 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0835 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0836 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0837 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0838 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0839 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0840 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0841 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1842 >>> generate(94, 5) # doctest: +NORMALIZE_WHITESPACE843 P1 11 6844 0 0 0 0 0 1 0 0 0 0 0845 0 0 0 0 1 1 1 
0 0 0 0846 0 0 0 1 1 0 1 1 0 0 0847 0 0 1 1 1 0 1 1 1 0 0848 0 1 1 0 1 0 1 0 1 1 0849 1 1 1 0 1 0 1 0 1 1 1850 >>> generate(94, 20) # doctest: +NORMALIZE_WHITESPACE851 P1 41 21852 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0853 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0854 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0855 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0856 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0857 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0858 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0859 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0860 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0861 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0862 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0863 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0864 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 0 0 0 0 0 0 0 0865 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 0 0 0 0 0 0 0866 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 0 0 0 0 0 0867 0 0 0 0 0 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 0 0 0 0 0868 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 0 0 0 0869 0 0 0 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 0 0 0870 0 0 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 0 0871 0 1 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1 1 0872 1 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 1873 >>> generate(102, 5) # doctest: 
+NORMALIZE_WHITESPACE874 P1 11 6875 0 0 0 0 0 1 0 0 0 0 0876 0 0 0 0 1 1 0 0 0 0 0877 0 0 0 1 0 1 0 0 0 0 0878 0 0 1 1 1 1 0 0 0 0 0879 0 1 0 0 0 1 0 0 0 0 0880 1 1 0 0 1 1 0 0 0 0 0881 >>> generate(102, 20) # doctest: +NORMALIZE_WHITESPACE882 P1 41 21883 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0884 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0885 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0886 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0887 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0888 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0889 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0890 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0891 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0892 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0893 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0894 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0895 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0896 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0897 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0898 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0899 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0900 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0901 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0902 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0903 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0904 >>> generate(103, 5) # doctest: +NORMALIZE_WHITESPACE905 P1 11 6906 0 0 0 0 0 1 0 0 0 0 0907 1 1 1 1 1 1 0 1 1 1 1908 0 0 0 0 0 1 1 0 0 0 1909 1 1 1 1 1 0 1 0 1 1 1910 0 0 0 0 1 1 1 1 0 0 1911 1 1 1 1 0 0 0 1 0 1 1912 >>> generate(103, 20) # doctest: +NORMALIZE_WHITESPACE913 P1 41 21914 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0915 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1916 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1917 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1918 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1919 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1920 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1921 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1922 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1923 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1924 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1925 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1926 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1927 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1928 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1929 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1930 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1931 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1932 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 1933 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 
0 1 0 1 1 1 1 1 1 1 1 1 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 1
>>> generate(105, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 0 0 0 1 1 1 1
1 0 0 1 0 1 0 1 0 0 1
0 0 0 0 1 0 1 0 0 0 0
1 1 1 0 0 1 0 0 1 1 1
1 0 1 0 0 0 0 0 1 0 1
>>> generate(105, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0
1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1
0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 0 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0
0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0
1 1 1 0 0 0 1 1 1 1 1 1 1 0 0 1 0 0 1 0 0 0 1 0 0 1 0 0 1 1 1 1 1 1 1 0 0 0 1 1 1
1 0 1 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 1 0 1 0 1
0 1 0 1 0 1 0 0 1 1 1 0 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 0 0 1 1 1 0 0 1 0 1 0 1 0
0 0 1 0 1 0 0 0 1 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 1 0 0 0 1 0 1 0 0
1 0 0 1 0 0 1 0 0 1 0 1 0 1 0 0 1 0 0 1 0 1 0 0 1 0 0 1 0 1 0 1 0 0 1 0 0 1 0 0 1
0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1
0 0 1 1 1 1 1 0 0 0 1 1 1 0 0 0 1 0 0 1 0 1 0 0 1 0 0 0 1 1 1 0 0 0 1 1 1 1 1 0 0
1 0 1 0 0 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 0 0 1 0 1
0 1 0 0 1 0 0 1 0 1 0 1 0 1 0 0 1 1 1 0 0 0 1 1 1 0 0 1 0 1 0 1 0 1 0 0 1 0 0 1 0
0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 0 0 1 0 1 0 0 1 0 0 1 0 1 0 1 0 1 0 0 1 0 0 1 0 1 0 0 1 1 1 1 1 1 1
1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1
>>> generate(107, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 1 0 0 1 1 1 1
1 0 0 0 1 0 1 1 0 0 1
0 0 1 1 0 1 1 1 0 1 0
1 1 1 1 1 1 0 1 1 0 0
1 0 0 0 0 1 1 1 1 0 1
>>> generate(107, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0
1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0
0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1
1 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 0 1 0
0 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 1 1 0 0
1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 0 1
1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 1 1 0
0 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 0
1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 1 0
1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 0 0
0 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 1 0 1
1 0 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 0 1 0
0 1 1 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 1 1 0 0
1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 1 0 1
1 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 0 1 1 0 1 0
0 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 1 1 1 0 0
1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 0 0 1 0 1
>>> generate(108, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
>>> generate(108, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>> generate(109, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 0 1 0 1 1 1 1
1 0 0 1 1 1 1 1 0 0 1
1 0 0 1 0 0 0 1 0 0 1
1 0 0 1 0 1 0 1 0 0 1
1 0 0 1 1 1 1 1 0 0 1
>>> generate(109, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1
1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1
1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1
1 1 1 1 1 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 1 1 1 1 1
1 0 0 0 1 0 1 1 1 1 1 1 1 0 1 0 0 0 1 0 1 0 1 0 0 0 1 0 1 1 1 1 1 1 1 0 1 0 0 0 1
1 0 1 0 1 1 1 0 0 0 0 0 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 0 0 0 0 0 1 1 1 0 1 0 1
1 1 1 1 1 0 1 0 1 1 1 0 1 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 1 0 1 1 1 0 1 0 1 1 1 1 1
1 0 0 0 1 1 1 1 1 0 1 1 1 1 1 0 0 0 1 0 1 0 1 0 0 0 1 1 1 1 1 0 1 1 1 1 1 0 0 0 1
1 0 1 0 1 0 0 0 1 1 1 0 0 0 1 0 1 0 1 1 1 1 1 0 1 0 1 0 0 0 1 1 1 0 0 0 1 0 1 0 1
1 1 1 1 1 0 1 0 1 0 1 0 1 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 1 0 1 0 1 0 1 0 1 1 1 1 1
1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1
1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 1 1 1 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1
1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 0 0 0 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1
1 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 1 0 1 0 1 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 1
1 0 1 0 1 0 1 0 1 1 1 0 1 0 1 0 1 0 1 1 1 1 1 0 1 0 1 0 1 0 1 1 1 0 1 0 1 0 1 0 1
1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 1
1 0 1 1 1 1 1 0 1 0 1 0 1 1 1 1 1 0 1 1 1 1 1 0 1 1 1 1 1 0 1 0 1 0 1 1 1 1 1 0 1
>>> generate(110, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 1 1 0 0 0 0 0
0 0 0 1 1 1 0 0 0 0 0
0 0 1 1 0 1 0 0 0 0 0
0 1 1 1 1 1 0 0 0 0 0
1 1 0 0 0 1 0 0 0 0 0
>>> generate(110, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 1 1 1 0 0 1 1 1 1 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 1 1 0 1 0 1 1 0 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 1 1 1 1 1 1 1 1 0 1 1 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 1 1 0 0 0 0 0 0 1 1 1 1 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 1 1 1 0 0 0 0 0 1 1 0 0 1 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 1 1 0 1 0 0 0 0 1 1 1 0 1 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 0 0 0 1 1 0 1 1 1 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>> generate(121, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 0 0 1 1 1 1 1
1 0 0 1 1 0 1 0 0 0 1
0 1 0 1 1 1 0 1 1 0 0
0 0 1 1 0 1 1 1 1 1 1
1 0 1 1 1 1 0 0 0 0 1
>>> generate(121, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0
0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1
0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1
1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0
0 1 0 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1
0 0 1 1 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 0
1 0 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1
0 1 1 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1
0 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 0
0 1 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1
0 0 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1
1 0 1 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 0
0 1 0 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 0 1
0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 1 1 0
1 0 1 0 0 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1
0 1 0 1 1 0 1 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 1
0 0 1 1 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 0
1 0 1 0 0 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1
>>> generate(123, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 1 0 1 1 1 1 1
1 0 0 0 1 1 1 0 0 0 1
0 1 1 1 1 0 1 1 1 1 0
1 1 0 0 1 1 1 0 0 1 1
1 1 1 1 1 0 1 1 1 1 1
>>> generate(123, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
>>> generate(124, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 1 0 0 0 0
0 0 0 0 0 1 1 1 0 0 0
0 0 0 0 0 1 0 1 1 0 0
0 0 0 0 0 1 1 1 1 1 0
0 0 0 0 0 1 0 0 0 1 1
>>> generate(124, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 1 0 1 0 0 0 1 1 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 1 1 1 0 0 1 1 1 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 0 0 1 1 0 1 0 1 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 1 1 0 1 1 1 1 1 1 1 1 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 1 1 1 1 0 0 0 0 0 0 1 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 1 0 0 1 1 0 0 0 0 0 1 1 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 1 1 0 1 1 1 0 0 0 0 1 0 1 1 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 1 1 0 1 1 0 0 0 1 1 1 1 1
>>> generate(126, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 1 1 1 0 0 0 0
0 0 0 1 1 0 1 1 0 0 0
0 0 1 1 1 1 1 1 1 0 0
0 1 1 0 0 0 0 0 1 1 0
1 1 1 1 0 0 0 1 1 1 1
>>> generate(126, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0
0 0 0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 0 0 0 0 0 0
0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0
0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0
0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0
0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0
0 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 0
1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1
>>> generate(127, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1
>>> generate(127, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
>>> generate(128, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
>>> generate(128, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>> generate(129, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 0 0 0 1 1 1 1
0 1 1 0 0 1 0 0 1 1 0
0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1
0 1 1 1 1 1 1 1 1 1 0
>>> generate(129, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0
1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1
0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 1 1 1 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0
1 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 1 1 1
0 1 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 1 0
0 0 0 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 0 0 0
1 1 0 0 0 0 0 0 1 1 1 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 1 1 1 0 0 0 0 0 0 1 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 1 1 0 0 1 0 0 1 1 1 1 0 0 0
1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1
0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0
1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 0 0 0 0 1 1 1 0 0 0 0 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1
0 1 1 1 1 0 0 0 0 1 1 1 1 1 0 0 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 1 0 0 0 0 1 1 1 1 0
0 0 1 1 0 0 1 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 1 0 0 1 1 0 0
1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1
0 0 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0
1 0 0 1 1 1 1 1 1 0 0 1 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 1 1 1 0 0 1 1 1 1 1 1 0 0 1
0 0 0 0 1 1 1 1 0 0 0 0 1 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 1 0 0 0 0 1 1 1 1 0 0 0 0
1 1 1 0 0 1 1 0 0 1 1 0 0 0 1 1 0 0 1 1 1 1 1 0 0 1 1 0 0 0 1 1 0 0 1 1 0 0 1 1 1
>>> generate(132, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 0 0 0 0 0
>>> generate(132, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>> generate(136, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0
>>> generate(136, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>> generate(137, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
1 1 1 1 0 0 0 1 1 1 1
1 1 1 0 0 1 0 1 1 1 0
1 1 0 0 0 0 0 1 1 0 0
1 0 0 1 1 1 0 1 0 0 1
0 0 0 1 1 0 0 0 0 0 0
>>> generate(137, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0
1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 0
1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 0 0
1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 1
1 1 1 1 1 1 1 1 1 1 0 0 1 0 1 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 0
1 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 0 0 1 1 1 0 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 0 0 1
1 1 1 1 1 1 1 1 0 0 1 1 1 0 1 0 0 0 1 1 0 0 1 1 1 1 1 1 1 1 0 0 1 0 1 1 1 0 0 0 0
1 1 1 1 1 1 1 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 1 1 0 0 1 1 1
1 1 1 1 1 1 0 0 1 0 1 0 0 1 1 0 0 0 0 0 1 0 1 1 1 1 1 1 0 0 1 1 1 0 1 0 0 0 1 1 0
1 1 1 1 1 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 0 0 1 1 1 1 1 0 0 0 1 1 0 0 0 0 1 0 1 0 0
1 1 1 1 0 0 1 1 1 1 1 1 0 0 0 0 1 1 0 0 1 0 1 1 1 1 0 0 1 0 1 0 0 1 1 0 0 0 0 0 1
1 1 1 0 0 0 1 1 1 1 1 0 0 1 1 0 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 0
1 1 0 0 1 0 1 1 1 1 0 0 0 1 0 0 0 0 1 1 1 0 1 1 0 0 1 1 1 1 1 1 0 0 0 0 1 1 0 0 1
1 0 0 0 0 0 1 1 1 0 0 1 0 0 0 1 1 0 1 1 0 0 1 0 0 0 1 1 1 1 1 0 0 1 1 0 1 0 0 0 0
0 0 1 1 1 0 1 1 0 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 1 0 1 1 1 1 0 0 0 1 0 0 0 0 1 1 1
>>> generate(144, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 0 1 0 0 0 0
0 0 0 0 0 0 0 1 0 0 0
0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 1 0
0 0 0 0 0 0 0 0 0 0 1
>>> generate(144, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
>>> generate(146, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 1 0 1 0 0 0 0
0 0 0 1 0 0 0 1 0 0 0
0 0 1 0 1 0 1 0 1 0 0
0 1 0 0 0 0 0 0 0 1 0
1 0 1 0 0 0 0 0 1 0 1
>>> generate(146, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0
0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0
0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0
0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0
0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0
0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0
1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1
>>> generate(148, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 0 1 1 0 0 0 0
0 0 0 0 0 0 0 1 0 0 0
0 0 0 0 0 0 0 1 1 0 0
0 0 0 0 0 0 0 0 0 1 0
0 0 0 0 0 0 0 0 0 1 1
>>> generate(148, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
>>> generate(150, 5) # doctest: +NORMALIZE_WHITESPACE
P1 11 6
0 0 0 0 0 1 0 0 0 0 0
0 0 0 0 1 1 1 0 0 0 0
0 0 0 1 0 1 0 1 0 0 0
0 0 1 1 0 1 0 1 1 0 0
0 1 0 0 0 1 0 0 0 1 0
1 1 1 0 1 1 1 0 1 1 1
>>> generate(150, 20) # doctest: +NORMALIZE_WHITESPACE
P1 41 21
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 1 1 1 0 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 1 0 1 1 0 1 0 1 1 0 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0
0 0 0 0 0 0 1 0 1 0 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0 0 1 0 1 0 0 0 0 0 0
0 0 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 0 0 0 0
0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0
0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0
0 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 1 0
1 0 0 0 1 0 0
0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 11524 >>> generate(152, 5) # doctest: +NORMALIZE_WHITESPACE1525 P1 11 61526 0 0 0 0 0 1 0 0 0 0 01527 0 0 0 0 0 0 1 0 0 0 01528 0 0 0 0 0 0 0 1 0 0 01529 0 0 0 0 0 0 0 0 1 0 01530 0 0 0 0 0 0 0 0 0 1 01531 0 0 0 0 0 0 0 0 0 0 11532 >>> generate(152, 20) # doctest: +NORMALIZE_WHITESPACE1533 P1 41 211534 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01535 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01536 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01537 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01538 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01539 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01540 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01541 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 01542 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 01543 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 01544 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 01545 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 01546 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 01547 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 01548 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 01549 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 01550 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 01551 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 01552 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 
01553 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 01554 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11555 >>> generate(160, 5) # doctest: +NORMALIZE_WHITESPACE1556 P1 11 61557 0 0 0 0 0 1 0 0 0 0 01558 0 0 0 0 0 0 0 0 0 0 01559 0 0 0 0 0 0 0 0 0 0 01560 0 0 0 0 0 0 0 0 0 0 01561 0 0 0 0 0 0 0 0 0 0 01562 0 0 0 0 0 0 0 0 0 0 01563 >>> generate(160, 20) # doctest: +NORMALIZE_WHITESPACE1564 P1 41 211565 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01566 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01567 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01568 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01569 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01570 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01571 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01572 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01573 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01574 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01575 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01576 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01577 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01578 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01579 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01580 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01581 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01582 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 01583 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01584 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01585 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01586 >>> generate(170, 5) # doctest: +NORMALIZE_WHITESPACE1587 P1 11 61588 0 0 0 0 0 1 0 0 0 0 01589 0 0 0 0 1 0 0 0 0 0 01590 0 0 0 1 0 0 0 0 0 0 01591 0 0 1 0 0 0 0 0 0 0 01592 0 1 0 0 0 0 0 0 0 0 01593 1 0 0 0 0 0 0 0 0 0 01594 >>> generate(170, 20) # doctest: +NORMALIZE_WHITESPACE1595 P1 41 211596 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01597 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01598 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01599 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01600 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01601 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01602 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01603 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01604 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01605 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01606 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01607 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01608 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01609 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01610 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01611 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01612 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01613 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01614 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01615 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01616 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01617 >>> generate(172, 5) # doctest: +NORMALIZE_WHITESPACE1618 P1 11 61619 0 0 0 0 0 1 0 0 0 0 01620 0 0 0 0 0 1 0 0 0 0 01621 0 0 0 0 0 1 0 0 0 0 01622 0 0 0 0 0 1 0 0 0 0 01623 0 0 0 0 0 1 0 0 0 0 01624 0 0 0 0 0 1 0 0 0 0 01625 >>> generate(172, 20) # doctest: +NORMALIZE_WHITESPACE1626 P1 41 211627 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01628 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01629 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01630 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01631 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01632 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01633 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01634 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01635 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01636 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01637 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01638 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01639 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01640 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01641 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01642 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01643 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01644 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01645 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01646 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01647 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01648 >>> generate(176, 5) # doctest: +NORMALIZE_WHITESPACE1649 P1 11 61650 0 0 0 0 0 1 0 0 0 0 01651 0 0 0 0 0 0 1 0 0 0 01652 0 0 0 0 0 0 0 1 0 0 01653 0 0 0 0 0 0 0 0 1 0 01654 0 0 0 0 0 0 0 0 0 1 01655 0 0 0 0 0 0 0 0 0 0 11656 >>> generate(176, 20) # doctest: +NORMALIZE_WHITESPACE1657 P1 41 211658 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01659 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01660 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01661 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01662 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01663 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01664 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01665 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 01666 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 01667 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 01668 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 01669 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 01670 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 01671 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 01672 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 01673 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 01674 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 01675 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 01676 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 01677 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 01678 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11679 >>> generate(182, 5) # doctest: +NORMALIZE_WHITESPACE1680 P1 11 61681 0 0 0 0 0 1 0 0 0 0 01682 0 0 0 0 1 1 1 0 0 0 01683 0 0 0 1 0 1 0 1 0 0 01684 0 0 1 1 1 1 1 1 1 0 01685 0 1 0 1 1 1 1 1 0 1 01686 1 1 1 0 1 1 1 0 1 1 11687 >>> generate(182, 20) # doctest: +NORMALIZE_WHITESPACE1688 P1 41 211689 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01690 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01691 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01692 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01693 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01694 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01695 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01696 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 01697 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 01698 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 01699 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 01700 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 01701 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 0 0 0 0 0 
0 0 01702 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 01703 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 01704 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 01705 0 0 0 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 01706 0 0 0 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 01707 0 0 1 0 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 1 0 01708 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 01709 1 0 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 11710 >>> generate(184, 5) # doctest: +NORMALIZE_WHITESPACE1711 P1 11 61712 0 0 0 0 0 1 0 0 0 0 01713 0 0 0 0 0 0 1 0 0 0 01714 0 0 0 0 0 0 0 1 0 0 01715 0 0 0 0 0 0 0 0 1 0 01716 0 0 0 0 0 0 0 0 0 1 01717 0 0 0 0 0 0 0 0 0 0 11718 >>> generate(184, 20) # doctest: +NORMALIZE_WHITESPACE1719 P1 41 211720 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01721 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01722 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01723 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01724 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01725 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01726 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01727 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 01728 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 01729 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 01730 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 01731 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 1 0 0 0 0 0 0 0 0 01732 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 01733 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 01734 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 01735 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 01736 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 01737 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 01738 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 01739 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 01740 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11741 >>> generate(188, 5) # doctest: +NORMALIZE_WHITESPACE1742 P1 11 61743 0 0 0 0 0 1 0 0 0 0 01744 0 0 0 0 0 1 1 0 0 0 01745 0 0 0 0 0 1 0 1 0 0 01746 0 0 0 0 0 1 1 1 1 0 01747 0 0 0 0 0 1 1 1 0 1 01748 0 0 0 0 0 1 1 0 1 1 11749 >>> generate(188, 20) # doctest: +NORMALIZE_WHITESPACE1750 P1 41 211751 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01752 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01753 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01754 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01755 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01756 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01757 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01758 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 01759 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 01760 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 01761 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 01762 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 01763 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 01764 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 01765 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 01766 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 01767 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 01768 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 01769 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 01770 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 01771 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 11772 >>> generate(190, 5) # doctest: +NORMALIZE_WHITESPACE1773 P1 11 61774 0 0 0 0 0 1 0 0 0 0 01775 0 0 0 0 1 1 1 0 0 0 01776 0 0 0 1 1 1 0 1 0 0 01777 0 0 1 1 1 0 1 1 1 0 01778 0 1 1 1 0 1 1 1 0 1 01779 1 1 1 0 1 1 1 0 1 1 11780 >>> generate(190, 20) # doctest: +NORMALIZE_WHITESPACE1781 P1 41 211782 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01783 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01784 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01785 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01786 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01787 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01788 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01789 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 01790 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 01791 0 0 0 0 0 0 0 0 0 0 0 
1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 01792 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 01793 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 01794 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 01795 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 01796 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 01797 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 0 0 01798 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 0 0 01799 0 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 0 01800 0 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 0 01801 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 01802 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0 11803 >>> generate(192, 5) # doctest: +NORMALIZE_WHITESPACE1804 P1 11 61805 0 0 0 0 0 1 0 0 0 0 01806 0 0 0 0 0 0 0 0 0 0 01807 0 0 0 0 0 0 0 0 0 0 01808 0 0 0 0 0 0 0 0 0 0 01809 0 0 0 0 0 0 0 0 0 0 01810 0 0 0 0 0 0 0 0 0 0 01811 >>> generate(192, 20) # doctest: +NORMALIZE_WHITESPACE1812 P1 41 211813 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01814 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01815 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01816 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01817 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01818 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01819 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01820 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01821 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01822 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01823 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01824 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01825 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01826 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01827 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01828 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01829 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01830 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01831 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01832 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01833 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01834 >>> generate(193, 5) # doctest: +NORMALIZE_WHITESPACE1835 P1 11 61836 0 0 0 0 0 1 0 0 0 0 01837 1 1 1 1 0 0 0 1 1 1 11838 0 1 1 1 0 1 0 0 1 1 11839 0 0 1 1 0 0 0 0 0 1 11840 1 0 0 1 0 1 1 1 0 0 11841 0 0 0 0 0 0 1 1 0 0 01842 >>> generate(193, 20) # doctest: +NORMALIZE_WHITESPACE1843 P1 41 211844 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01845 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11846 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11847 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11848 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11849 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11850 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 0 0 1 1 1 1 1 1 1 1 1 
1 1 1 1 11851 0 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 11852 0 0 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 11853 1 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 11854 0 0 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 1 11855 1 0 0 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 11856 0 0 0 0 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 0 0 1 1 0 0 0 1 0 1 1 1 0 0 1 1 1 1 1 1 1 11857 1 1 1 0 0 1 1 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 1 0 1 0 0 0 0 1 1 0 0 0 1 1 1 1 1 1 11858 0 1 1 0 0 0 1 0 1 1 1 0 0 1 1 1 1 1 1 0 1 0 0 0 0 0 1 1 0 0 1 0 1 0 0 1 1 1 1 1 11859 0 0 1 0 1 0 0 0 0 1 1 0 0 0 1 1 1 1 1 0 0 0 1 1 1 0 0 1 0 0 0 0 0 0 0 0 1 1 1 1 11860 1 0 0 0 0 0 1 1 0 0 1 0 1 0 0 1 1 1 1 0 1 0 0 1 1 0 0 0 0 1 1 1 1 1 1 0 0 1 1 1 11861 0 0 1 1 1 0 0 1 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 0 1 1 0 0 1 1 1 1 1 0 0 0 1 1 11862 1 0 0 1 1 0 0 0 0 1 1 1 1 1 1 0 0 1 1 0 1 1 1 0 0 0 0 1 0 0 0 1 1 1 1 0 1 0 0 1 11863 0 0 0 0 1 0 1 1 0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 0 1 1 0 0 0 1 0 0 1 1 1 0 0 0 0 0 11864 1 1 1 0 0 0 0 1 0 0 0 1 1 1 1 0 1 0 0 0 0 0 1 0 0 1 0 1 0 0 0 0 0 1 1 0 1 1 1 0 01865 >>> generate(204, 5) # doctest: +NORMALIZE_WHITESPACE1866 P1 11 61867 0 0 0 0 0 1 0 0 0 0 01868 0 0 0 0 0 1 0 0 0 0 01869 0 0 0 0 0 1 0 0 0 0 01870 0 0 0 0 0 1 0 0 0 0 01871 0 0 0 0 0 1 0 0 0 0 01872 0 0 0 0 0 1 0 0 0 0 01873 >>> generate(204, 20) # doctest: +NORMALIZE_WHITESPACE1874 P1 41 211875 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01876 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01877 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01878 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01879 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01880 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 01881 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01882 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01883 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01884 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01885 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01886 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01887 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01888 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01889 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01890 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01891 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01892 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01893 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01894 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01895 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01896 >>> generate(218, 5) # doctest: +NORMALIZE_WHITESPACE1897 P1 11 61898 0 0 0 0 0 1 0 0 0 0 01899 0 0 0 0 1 0 1 0 0 0 01900 0 0 0 1 0 0 0 1 0 0 01901 0 0 1 0 1 0 1 0 1 0 01902 0 1 0 0 0 0 0 0 0 1 01903 1 0 1 0 0 0 0 0 1 0 11904 >>> generate(218, 20) # doctest: +NORMALIZE_WHITESPACE1905 P1 41 211906 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01907 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01908 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01909 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01910 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 
0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01911 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01912 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 01913 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 01914 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 01915 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 01916 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 01917 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 01918 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 01919 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 01920 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 01921 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 01922 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 01923 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 01924 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 01925 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 01926 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 11927 >>> generate(225, 5) # doctest: +NORMALIZE_WHITESPACE1928 P1 11 61929 0 0 0 0 0 1 0 0 0 0 01930 1 1 1 1 0 0 0 1 1 1 11931 0 1 1 1 0 1 0 0 1 1 11932 0 0 1 1 1 0 0 0 0 1 11933 1 0 0 1 1 0 1 1 0 0 11934 0 0 0 0 1 1 0 1 0 0 01935 >>> generate(225, 20) # doctest: +NORMALIZE_WHITESPACE1936 P1 41 211937 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 01938 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11939 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 11940 0 0 1 1 1 1 1 1 1 
    1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1
    0 0 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1
    1 0 0 1 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1
    0 0 0 0 0 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 0 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 0 0 0 0 1 1 1 1 1 1 1 1 1
    0 1 1 1 0 0 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 0 0 1 1 1 1 1 1 1 1
    0 0 1 1 0 1 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 0 0 0 1 1 1 1 1 1 1
    1 0 0 1 1 0 1 0 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 1 0 0 1 1 1 1 1 1
    0 0 0 0 1 1 0 1 0 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 1 1 1 1 1
    1 1 1 0 0 1 1 0 1 0 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 0 0 1 1 1 1
    0 1 1 0 0 0 1 1 0 1 0 1 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 0 1 1 1
    0 0 1 0 1 0 0 1 1 0 1 0 0 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 0 0 1 1
    1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 0 0 1
    0 0 0 0 0 1 1 0 0 1 0 1 1 0 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 1 0 0
    >>> generate(232, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0
    >>> generate(232, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(236, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 1 0 0 0 0 0
    >>> generate(236, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(238, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 1 0 0 0 0 0
    0 0 0 1 1 1 0 0 0 0 0
    0 0 1 1 1 1 0 0 0 0 0
    0 1 1 1 1 1 0 0 0 0 0
    1 1 1 1 1 1 0 0 0 0 0
    >>> generate(238, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    >>> generate(240, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 1 0 0 0 0
    0 0 0 0 0 0 0 1 0 0 0
    0 0 0 0 0 0 0 0 1 0 0
    0 0 0 0 0 0 0 0 0 1 0
    0 0 0 0 0 0 0 0 0 0 1
    >>> generate(240, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
    >>> generate(250, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 0 1 0 0 0 0
    0 0 0 1 0 1 0 1 0 0 0
    0 0 1 0 1 0 1 0 1 0 0
    0 1 0 1 0 1 0 1 0 1 0
    1 0 1 0 1 0 1 0 1 0 1
    >>> generate(250, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0
    0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0
    0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 0
    0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0
    0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 0
    0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0
    1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1
    >>> generate(254, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    0 0 0 0 1 1 1 0 0 0 0
    0 0 0 1 1 1 1 1 0 0 0
    0 0 1 1 1 1 1 1 1 0 0
    0 1 1 1 1 1 1 1 1 1 0
    1 1 1 1 1 1 1 1 1 1 1
    >>> generate(254, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0
    0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0
    0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0
    0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0
    0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0
    0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0
    0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    >>> generate(255, 5) # doctest: +NORMALIZE_WHITESPACE
    P1 11 6
    0 0 0 0 0 1 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1
    >>> generate(255, 20) # doctest: +NORMALIZE_WHITESPACE
    P1 41 21
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
    """
if __name__ == '__main__':
    test_result = doctest.testmod(exclude_empty=True)
    print("Autograder Result: {}/{} success, {} failed."...
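The doctests above fully pin down the contract of `generate(rule, steps)`: it prints a PBM "P1" header with width 2*steps + 1 and height steps + 1, a first row with a single live cell in the centre, and then one Wolfram-style update of the elementary rule per row. The graded implementation lives in the `cellular` module imported at the top of the file; the following is only a minimal sketch consistent with the expected outputs (the helper name `next_row` and the wrap-around edge handling are assumptions, not taken from the snippet):

```python
def next_row(rule, row):
    # Wolfram numbering: bit (4*left + 2*centre + right) of `rule`
    # gives the next state of each cell; the edges wrap around.
    w = len(row)
    return [(rule >> (4 * row[i - 1] + 2 * row[i] + row[(i + 1) % w])) & 1
            for i in range(w)]

def generate(rule, steps):
    # Print a PBM ("P1") image: header, then one generation per line.
    width = 2 * steps + 1
    row = [0] * width
    row[steps] = 1  # single live cell in the centre
    print("P1", width, steps + 1)
    print(" ".join(map(str, row)))
    for _ in range(steps):
        row = next_row(rule, row)
        print(" ".join(map(str, row)))
```

With rule 250 a lone 1 maps to "1 0 1" on the next row, giving the expanding checkerboard in the doctest, while rule 232 (the majority rule) immediately kills the single live cell, matching the all-zero rows above.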
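The `# doctest: +NORMALIZE_WHITESPACE` directive used throughout these tests is what makes the large 0/1 grids easy to compare: doctest collapses every run of spaces and newlines to a single space in both the expected and the actual output before comparing. A self-contained illustration (the `grid` function here is a made-up example, not part of the autograder):

```python
import doctest

def grid():
    """Print a small 0/1 matrix.

    The ragged indentation in the expected output below would fail an
    exact comparison, but +NORMALIZE_WHITESPACE collapses whitespace
    runs on both sides before comparing.

    >>> grid()  # doctest: +NORMALIZE_WHITESPACE
    0 0 1
        0 1 0
      1 0 0
    """
    print("0 0 1")
    print("0 1 0")
    print("1 0 0")

# Run the docstring example explicitly; doctest.testmod() does the same
# for a whole module, as in the autograder's __main__ block above.
runner = doctest.DocTestRunner()
for test in doctest.DocTestFinder().find(grid):
    runner.run(test)
print(runner.failures, "failures out of", runner.tries, "examples")
```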


augmentation.py

Source:augmentation.py Github

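Before the full listing, it may help to see the core idea of the `Flip` class below in isolation: draw one uniform number per configured axis and call `np.flip` whenever the draw falls under that axis's probability. This stripped-down helper (`flip_maybe` is an invented name; the `data_format`, locking, and argument-validation bookkeeping is omitted) is not part of the file:

```python
import numpy as np

def flip_maybe(image, probabilities, axes, rng):
    # One uniform draw per axis; flip along that axis when the draw
    # falls below the configured probability (cf. Flip.__call__ below).
    for p, axis in zip(probabilities, axes):
        if rng.rand() < p:
            image = np.flip(image, axis=axis)
    return image

# Probability 1.0 along axis 1 always reverses the columns.
X = np.arange(6).reshape(2, 3)
Y = flip_maybe(X, [1.0], [1], np.random.RandomState(42))
```

The listed class additionally caches its draws so that `lock()` can replay the same random decision across several calls, e.g. over multiple channels.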

# -*- coding: utf-8 -*-
"""
Created on Thu Nov 30 10:09:30 2017
Copyright (c) 2017, Tommy Löfstedt. All rights reserved.
@author: Tommy Löfstedt
@email: tommy.lofstedt@umu.se
@license: BSD 3-clause.
"""
import abc
import warnings
import numpy as np
import scipy.ndimage
__all__ = ["BaseAugmentation",
           "Flip", "Resize", "Rotate", "Crop", "Shear",
           "ImageHistogramShift", "ImageHistogramScale",
           "ImageHistogramAffineTransform", "ImageHistogramTransform",
           "ImageTransform",
           "Pipeline"]
class BaseAugmentation(metaclass=abc.ABCMeta):
    """Base class for data augmentation functions.
    Parameters
    ----------
    data_format : str, optional
        One of `channels_last` (default) or `channels_first`. The ordering of
        the dimensions in the inputs. `channels_last` corresponds to inputs
        with shape `(batch, height, width, [depth], channels)` while
        `channels_first` corresponds to inputs with shape `(batch, channels,
        height, [depth], width)`. It defaults to the `image_data_format` value
        found in your Keras config file at `~/.keras/keras.json`. If you never
        set it, then it will be "channels_last".
    random_state : int, float, array_like or numpy.random.RandomState, optional
        A random state to use when sampling pseudo-random numbers. If int,
        float or array_like, a new random state is created with the provided
        value as seed. If None, the default numpy random state (np.random) is
        used. Default is None, use the default numpy random state.
    """
    def __init__(self,
                 data_format=None,
                 random_state=None):
        from tensorflow.python import keras as tf_keras
        self.data_format = tf_keras.utils.conv_utils.normalize_data_format(
            data_format)
        if random_state is None:
            self.random_state = np.random.random.__self__  # Numpy built-in
        else:
            if isinstance(random_state, (int, float, np.ndarray)):
                self.random_state = np.random.RandomState(seed=random_state)
            elif isinstance(random_state, np.random.RandomState):
                self.random_state = random_state
            elif hasattr(random_state, "rand"):  # E.g., np.random
                self.random_state = random_state
                # Note: You may need to augment this "list" of required
                # functions in your subclasses.
            else:  # May crash here..
                self.random_state = np.random.RandomState(seed=random_state)
        self._lock = False
        self._random = None
    def lock(self):
        """Use this function to reuse the same augmentation multiple times.
        Useful if e.g. the same transform need to be applied to multiple
        channels even when randomness is involved.
        """
        self._lock = True
    def unlock(self):
        """Use this function to stop using the same augmentation.
        """
        self._lock = False
    def __call__(self, inputs):
        """The function performing the data augmentation.
        Specialise this function in your subclasses.
        """
        return inputs
class Flip(BaseAugmentation):
    """Flips an image in any direction.
    Parameters
    ----------
    probability : float or list/tuple of floats
        The probability of a flip. If a float, flip with probability
        ``probability`` in the horizontal direction (second image dimension)
        when ``axis=1`` or ``axis=None``, and otherwise flip with probability
        ``probability`` along all axes defined by ``axis``. If a list/tuple and
        ``axis=None``, flip with ``probability[d]`` in the direction of
        dimension ``d``, and if ``axis`` is a list or tuple, flip with
        probability ``probability[i]`` along dimension ``axis[i]`` (this case
        requires that ``len(probability) == len(axis)``). If fewer
        probabilities given than axes present, only the first given axes will
        be considered. Default is 0.5, which means to flip with probability
        ``0.5`` in the horizontal direction (along ``axis=1``), or to flip with
        probability ``0.5`` along the axes defined by ``axis``.
    axis : None or int or tuple of ints, optional
        Axis or axes along which to flip the image. If axis is a tuple
        of ints, flipping is performed with the provided probabilities on all
        of the axes specified in the tuple. If an axis is negative it counts
        from the last to the first axis. If ``axis=None``, the axes are
        determined from ``probability``. Default is ``axis=None``, which means
        to flip along the second images axis (the assumed horizontal axis), or
        to flip along the axes using indices ``0, ...,
        len(probabilities) - 1.``
    data_format : str, optional
        One of ``"channels_last"`` (default) or ``"channels_first"``. The
        ordering of the dimensions in the inputs. ``"channels_last"``
        corresponds to inputs with shape ``(batch, [image dimensions ...],
        channels)`` while ``channels_first`` corresponds to inputs with shape
        ``(batch, channels, [image dimensions ...])``. It defaults to the
        ``image_data_format`` value found in your Keras config file at
        ``~/.keras/keras.json``. If you never set it, then it will be
        ``"channels_last"``.
    random_state : int, float, array_like or numpy.random.RandomState, optional
        A random state to use when sampling pseudo-random numbers. If int,
        float or array_like, a new random state is created with the provided
        value as seed. If None, the default numpy random state (np.random) is
        used. Default is None, use the default numpy random state.
    Examples
    --------
    >>> from nethin.augmentation import Flip
    >>> import numpy as np
    >>> np.random.seed(42)
    >>>
    >>> X = np.random.rand(2, 3, 1)
    >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.37454012, 0.95071431, 0.73199394],
           [ 0.59865848, 0.15601864, 0.15599452]])
    >>> flip_h = Flip(probability=1.0,
    ...               random_state=42, data_format="channels_last")
    >>> flip_h(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.73199394, 0.95071431, 0.37454012],
           [ 0.15599452, 0.15601864, 0.59865848]])
    >>> flip_v = Flip(probability=[1.0, 0.0],
    ...               random_state=42, data_format="channels_last")
    >>> flip_v(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.59865848, 0.15601864, 0.15599452],
           [ 0.37454012, 0.95071431, 0.73199394]])
    >>> flip_hv = Flip(probability=[0.5, 0.5],
    ...                random_state=42, data_format="channels_last")
    >>> flip_hv(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.59865848, 0.15601864, 0.15599452],
           [ 0.37454012, 0.95071431, 0.73199394]])
    >>> flip_hv(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.37454012, 0.95071431, 0.73199394],
           [ 0.59865848, 0.15601864, 0.15599452]])
    >>> flip_hv(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.15599452, 0.15601864, 0.59865848],
           [ 0.73199394, 0.95071431, 0.37454012]])
    >>> np.random.seed(42)
    >>> X = np.random.rand(2, 3, 1)
    >>> flip = Flip(probability=1.0, random_state=42)
    >>> flip(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.73199394, 0.95071431, 0.37454012],
           [ 0.15599452, 0.15601864, 0.59865848]])
    >>> np.random.seed(42)
    >>> X = np.random.rand(1, 2, 3)
    >>> flip = Flip(probability=1.0, random_state=42)
    >>> flip(X)[:, :, 0] # Wrong
    ... # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.59865848, 0.37454012]])
    >>> flip(X)[0, :, :] # Wrong
    ... # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.59865848, 0.15601864, 0.15599452],
           [ 0.37454012, 0.95071431, 0.73199394]])
    >>> flip = Flip(probability=1.0,
    ...             random_state=42, data_format="channels_first")
    >>> flip(X)[0, :, :] # Right
    ... # doctest: +NORMALIZE_WHITESPACE
    array([[ 0.73199394, 0.95071431, 0.37454012],
           [ 0.15599452, 0.15601864, 0.59865848]])
    >>>
    >>> np.random.seed(42)
    >>> X = np.random.rand(5, 4, 3, 2)
    >>> X[:, :, 0, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[0.37454012, 0.05808361, 0.83244264, 0.43194502],
           [0.45606998, 0.60754485, 0.30461377, 0.03438852],
           [0.54671028, 0.59789998, 0.38867729, 0.14092422],
           [0.00552212, 0.35846573, 0.31098232, 0.11959425],
           [0.52273283, 0.31435598, 0.22879817, 0.63340376]])
    >>> flip = Flip(probability=1.0, random_state=42)
    >>> flip(X)[:, :, 0, 0]
    array([[0.43194502, 0.83244264, 0.05808361, 0.37454012],
           [0.03438852, 0.30461377, 0.60754485, 0.45606998],
           [0.14092422, 0.38867729, 0.59789998, 0.54671028],
           [0.11959425, 0.31098232, 0.35846573, 0.00552212],
           [0.63340376, 0.22879817, 0.31435598, 0.52273283]])
    >>> flip = Flip(probability=1.0, axis=[0, 1],
    ...             random_state=42)
    >>> flip(X)[:, :, 0, 0]
    array([[0.52273283, 0.31435598, 0.22879817, 0.63340376],
           [0.00552212, 0.35846573, 0.31098232, 0.11959425],
           [0.54671028, 0.59789998, 0.38867729, 0.14092422],
           [0.45606998, 0.60754485, 0.30461377, 0.03438852],
           [0.37454012, 0.05808361, 0.83244264, 0.43194502]])
    >>> flip = Flip(probability=[1.0, 1.0], axis=[1],
    ...             random_state=42) # doctest: +ELLIPSIS
    Traceback (most recent call last):
    ...
    ValueError: Number of probabilities suppled does not match ...
    >>> X[0, :, 0, :] # doctest: +NORMALIZE_WHITESPACE
    array([[0.37454012, 0.95071431],
           [0.05808361, 0.86617615],
           [0.83244264, 0.21233911],
           [0.43194502, 0.29122914]])
    >>> flip = Flip(probability=[1.0, 1.0], axis=[1, 3],
    ...             random_state=42) # doctest: +ELLIPSIS
    >>> flip(X)[0, :, 0, :]
    array([[0.43194502, 0.29122914],
           [0.83244264, 0.21233911],
           [0.05808361, 0.86617615],
           [0.37454012, 0.95071431]])
    >>>
    >>> np.random.seed(42)
    >>>
    >>> X = np.random.rand(2, 3)
    >>> X # doctest: +NORMALIZE_WHITESPACE
    array([[0.37454012, 0.95071431, 0.73199394],
           [0.59865848, 0.15601864, 0.15599452]])
    >>> flip = Flip(probability=[0.0, 0.5],
    ...             random_state=42) # doctest: +ELLIPSIS
    >>> flip(X)
    array([[0.37454012, 0.95071431, 0.73199394],
           [0.59865848, 0.15601864, 0.15599452]])
    >>> flip(X)
    array([[0.37454012, 0.95071431, 0.73199394],
           [0.59865848, 0.15601864, 0.15599452]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip.lock()
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip(X)
    array([[0.73199394, 0.95071431, 0.37454012],
           [0.15599452, 0.15601864, 0.59865848]])
    >>> flip.unlock()
    >>> flip(X)
    array([[0.37454012, 0.95071431, 0.73199394],
           [0.59865848, 0.15601864, 0.15599452]])
    """
    def __init__(self,
                 probability=0.5,
                 axis=None,
                 data_format=None,
                 random_state=None):
        super().__init__(data_format=data_format,
                         random_state=random_state)
        if axis is None:
            if isinstance(probability, (float, int)):
                self.axis = [1]
            else:
                self.axis = None
        elif isinstance(axis, int):
            self.axis = [axis]
        elif isinstance(axis, (tuple, list)):
            self.axis = [int(a) for a in axis]
        else:
            raise ValueError("The value of axis must be either None, int or "
                             "list/tuple.")
        if isinstance(probability, (float, int)):  # self.axis != None here
            probability = [float(probability) for i in range(len(self.axis))]
        elif isinstance(probability, (list, tuple)):
            if self.axis is None:
                probability = [float(probability[i])
                               for i in range(len(probability))]
                self.axis = [i for i in range(len(probability))]
            else:
                if len(probability) != len(self.axis):
                    raise ValueError("Number of probabilities suppled does "
                                     "not match the number of axes.")
                else:
                    probability = [float(probability[i])
                                   for i in range(len(probability))]
        # Normalise
        for i in range(len(probability)):
            probability[i] = max(0.0, min(float(probability[i]), 1.0))
        self.probability = probability
        if self.data_format == "channels_last":
            self._axis_offset = 0
        else:  # data_format == "channels_first":
            self._axis_offset = 1
        self._random = [None] * len(self.probability)
    def __call__(self, inputs):
        outputs = inputs
        for i in range(len(self.probability)):
            p = self.probability[i]
            a = self._axis_offset + self.axis[i]
            if (not self._lock) or (self._random[i] is None):
                self._random[i] = self.random_state.rand()
            if self._random[i] < p:
                outputs = np.flip(outputs, axis=a)
        return outputs
class Resize(BaseAugmentation):
    """Resizes an image.
    Parameters
    ----------
    size : list of int
        List of positive int. The size to re-size the images to.
    random_size : int or list of int, optional
        An int or a list of positive int the same length as ``size``. The upper
        bounds on the amount of extra size to randomly add to the size. If a
        single int, the same random size will be added to all axes. Default is
        0, which means to not add any extra random size.
    keep_aspect_ratio : bool, optional
        Whether or not to keep the aspect ratio of the image when resizing.
        Default is False, do not keep the aspect ratio of the original image.
    minimum_size : bool, optional
        If ``keep_aspect_ratio=True``, then ``minimum_size`` determines if the
        given size is the minimum size (scaled image is equal to or larger than
        the given ``size``) or the maximum size (scaled image is equal to or
        smaller than the given ``size``) of the scaled image. Default is True,
        the scaled image will be at least as large as ``size``. See also
        ``keep_aspect_ratio``.
    order : int, optional
        Integer in [0, 5], the order of the spline used in the interpolation.
        The order corresponds to the following interpolations:
        0: Nearest-neighbor
        1: Bi-linear (default)
        2: Bi-quadratic
        3: Bi-cubic
        4: Bi-quartic
        5: Bi-quintic
        Beware! Higher orders than 1 may cause the values to be outside of the
        allowed range of values for your data. This must be handled manually.
        Default is 1, i.e. bi-linear interpolation.
    mode : {"reflect", "constant", "nearest", "mirror", "wrap"}, optional
        Determines how the border should be handled. Default is "nearest".
        The behavior for each option is:
        "reflect": (d c b a | a b c d | d c b a)
            The input is extended by reflecting about the edge of the last
            pixel.
        "constant": (k k k k | a b c d | k k k k)
            The input is extended by filling all values beyond the edge
            with the same constant value, defined by the cval parameter.
        "nearest": (a a a a | a b c d | d d d d)
            The input is extended by replicating the last pixel.
        "mirror": (d c b | a b c d | c b a)
            The input is extended by reflecting about the center of the
            last pixel.
        "wrap": (a b c d | a b c d | a b c d)
            The input is extended by wrapping around to the opposite edge.
    cval : float, optional
        Value to fill past edges of input if mode is "constant". Default is
        0.0.
    prefilter : bool, optional
        Whether or not to prefilter the input array with a spline filter before
        interpolation. Default is True.
    data_format : str, optional
        One of `channels_last` (default) or `channels_first`. The ordering of
        the dimensions in the inputs. `channels_last` corresponds to inputs
        with shape `(batch, height, width, channels)` while `channels_first`
        corresponds to inputs with shape `(batch, channels, height, width)`. It
        defaults to the `image_data_format` value found in your Keras config
        file at `~/.keras/keras.json`. If you never set it, then it will be
        "channels_last".
    random_state : int, float, array_like or numpy.random.RandomState, optional
        A random state to use when sampling pseudo-random numbers. If int,
        float or array_like, a new random state is created with the provided
        value as seed. If None, the default numpy random state (np.random) is
        used. Default is None, use the default numpy random state.
    Examples
    --------
    >>> from nethin.augmentation import Resize
    >>> import numpy as np
    >>>
    >>> np.random.seed(42)
    >>> X = np.array([[1, 2],
    ...               [2, 3]]).astype(float)
    >>> X = np.reshape(X, [2, 2, 1])
    >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[1., 2.],
           [2., 3.]])
    >>> resize = Resize([4, 4], order=1, data_format="channels_last")
    >>> Y = resize(X)
    >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize = Resize([2, 2], order=1, data_format="channels_last")
    >>> X_ = resize(Y)
    >>> X_[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[1., 2.],
           [2., 3.]])
    >>>
    >>> X = np.array([[1, 2],
    ...               [2, 3]]).reshape((1, 2, 2)).astype(float)
    >>> X[0, :, :] # doctest: +NORMALIZE_WHITESPACE
    array([[1., 2.],
           [2., 3.]])
    >>> resize = Resize([4, 4], order=1, data_format="channels_first")
    >>> Y = resize(X)
    >>> Y[0, :, :] # doctest: +NORMALIZE_WHITESPACE
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>>
    >>> X = np.random.rand(10, 20, 1)
    >>> resize = Resize([5, 5], keep_aspect_ratio=False, order=1)
    >>> Y = resize(X)
    >>> Y.shape # doctest: +NORMALIZE_WHITESPACE
    (5, 5, 1)
    >>> resize = Resize([5, 5], keep_aspect_ratio=True,
    ...                 minimum_size=True, order=1)
    >>> Y = resize(X)
    >>> Y.shape # doctest: +NORMALIZE_WHITESPACE
    (5, 10, 1)
    >>> resize = Resize([5, 5], keep_aspect_ratio=True,
    ...                 minimum_size=False, order=1)
    >>> Y = resize(X)
    >>> Y.shape # doctest: +NORMALIZE_WHITESPACE
    (3, 5, 1)
    >>>
    >>> X = np.random.rand(10, 20, 30, 1)
    >>> resize = Resize([5, 5, 5],
    ...                 keep_aspect_ratio=True,
    ...                 minimum_size=False,
    ...                 order=1)
    >>> Y = resize(X)
    >>> Y.shape # doctest: +NORMALIZE_WHITESPACE
    (2, 3, 5, 1)
    >>> resize = Resize([5, 5, 5],
    ...                 keep_aspect_ratio=True,
    ...                 minimum_size=True,
    ...                 order=1)
    >>> Y = resize(X)
    >>> Y.shape # doctest: +NORMALIZE_WHITESPACE
    (5, 10, 15, 1)
    >>> X = np.arange(27).reshape((3, 3, 3, 1)).astype(float)
    >>> resize = Resize([5, 5, 5],
    ...                 keep_aspect_ratio=True)
    >>> Y = resize(X)
    >>> Y.shape # doctest: +NORMALIZE_WHITESPACE
    (5, 5, 5, 1)
    >>> X[:5, :5, 0, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0., 3., 6.],
           [ 9., 12., 15.],
           [18., 21., 24.]])
    >>> Y[:5, :5, 0, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[ 0. , 1.5, 3. , 4.5, 6. ],
           [ 4.5, 6. , 7.5, 9. , 10.5],
           [ 9. , 10.5, 12. , 13.5, 15. ],
           [13.5, 15. , 16.5, 18. , 19.5],
           [18. , 19.5, 21. , 22.5, 24. ]])
    >>> X[0, :5, :5, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[0., 1., 2.],
           [3., 4., 5.],
           [6., 7., 8.]])
    >>> Y[0, :5, :5, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[0. , 0.5, 1. , 1.5, 2. ],
           [1.5, 2. , 2.5, 3. , 3.5],
           [3. , 3.5, 4. , 4.5, 5. ],
           [4.5, 5. , 5.5, 6. , 6.5],
           [6. , 6.5, 7. , 7.5, 8. ]])
    >>>
    >>> np.random.seed(42)
    >>> X = np.array([[1, 2],
    ...               [2, 3]]).astype(float)
    >>> X = np.reshape(X, [2, 2, 1])
    >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE
    array([[1., 2.],
           [2., 3.]])
    >>> resize = Resize([2, 2],
    ...                 random_size=[3, 3],
    ...                 keep_aspect_ratio=True,
    ...                 order=1,
    ...                 data_format="channels_last")
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1., 2.],
           [2., 3.]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1., 2.],
           [2., 3.]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize.lock()
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize.unlock()
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.33333333, 1.66666667, 2. ],
           [1.33333333, 1.66666667, 2. , 2.33333333],
           [1.66666667, 2. , 2.33333333, 2.66666667],
           [2. , 2.33333333, 2.66666667, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.25, 1.5 , 1.75, 2. ],
           [1.25, 1.5 , 1.75, 2. , 2.25],
           [1.5 , 1.75, 2. , 2.25, 2.5 ],
           [1.75, 2. , 2.25, 2.5 , 2.75],
           [2. , 2.25, 2.5 , 2.75, 3. ]])
    >>> resize(X)[:, :, 0]
    array([[1. , 1.25, 1.5 , 1.75, 2. ],
           [1.25, 1.5 , 1.75, 2. , 2.25],
           [1.5 , 1.75, 2. , 2.25, 2.5 ],
           [1.75, 2. , 2.25, 2.5 , 2.75],
           [2. 
, 2.25, 2.5 , 2.75, 3. ]])550 """551 def __init__(self,552 size,553 random_size=0,554 keep_aspect_ratio=False,555 minimum_size=True,556 order=1,557 mode="nearest",558 cval=0.0,559 prefilter=True,560 data_format=None,561 random_state=None):562 super().__init__(data_format=data_format,563 random_state=random_state)564 self.size = [max(1, int(size[i])) for i in range(len(list(size)))]565 if isinstance(random_size, int):566 self.random_size = max(0, int(random_size))567 elif isinstance(random_size, (list, tuple)):568 self.random_size = [max(0, int(random_size[i]))569 for i in range(len(list(random_size)))]570 if len(self.random_size) != len(self.size):571 raise ValueError("random_size and size must have the same "572 "lengths.")573 else:574 raise ValueError("random_size must be an int, or a list/tuple of "575 "int.")576 self.keep_aspect_ratio = bool(keep_aspect_ratio)577 self.minimum_size = bool(minimum_size)578 if int(order) not in [0, 1, 2, 3, 4, 5]:579 raise ValueError('``order`` must be in [0, 5].')580 self.order = int(order)581 if str(mode).lower() in {"reflect", "constant", "nearest", "mirror",582 "wrap"}:583 self.mode = str(mode).lower()584 else:585 raise ValueError('``mode`` must be one of "reflect", "constant", '586 '"nearest", "mirror", or "wrap".')587 self.cval = float(cval)588 self.prefilter = bool(prefilter)589 if isinstance(self.random_size, int):590 self._random = None591 else:592 self._random = [None] * len(self.size)593 def __call__(self, inputs):594 shape = inputs.shape595 if self.data_format == "channels_last":596 shape = shape[:-1]597 else: # self.data_format == "channels_first"598 shape = shape[1:]599 ndim = len(shape) # inputs.ndim - 1600 size_ = [0] * len(self.size)601 if len(size_) < ndim:602 size_.extend(shape[len(size_):ndim]) # Add dims from the data603 elif len(size_) > ndim:604 raise ValueError("The given size specifies more dimensions than "605 "what is present in the data.")606 if isinstance(self.random_size, int):607 # random_size = 
self.random_state.randint(0, self.random_size + 1)608 if (not self._lock) or (self._random is None):609 self._random = self.random_state.randint(610 0, self.random_size + 1)611 for i in range(len(self.size)): # Recall: May be fewer than ndim612 size_[i] = self.size[i] + self._random613 else: # List or tuple614 for i in range(len(self.size)): # Recall: May be fewer than ndim615 if (not self._lock) or (self._random[i] is None):616 # random_size = self.random_state.randint(617 # 0, self.random_size[i] + 1)618 self._random[i] = self.random_state.randint(619 0, self.random_size[i] + 1)620 size_[i] = self.size[i] + self._random[i]621 if self.keep_aspect_ratio:622 if self.minimum_size:623 val_i = np.argmin(shape)624 else:625 val_i = np.argmax(shape)626 factor = size_[val_i] / shape[val_i]627 new_size = [int((shape[i] * factor) + 0.5)628 for i in range(len(shape))]629 new_factor = [new_size[i] / shape[i] for i in range(len(shape))]630 else:631 new_size = size_632 new_factor = [size_[i] / shape[i] for i in range(len(shape))]633 if self.data_format == "channels_last":634 num_channels = inputs.shape[-1]635 outputs = None # np.zeros(new_size + [num_channels])636 for c in range(num_channels):637 im = scipy.ndimage.zoom(inputs[..., c],638 new_factor,639 order=self.order,640 # = "edge"641 mode=self.mode,642 cval=self.cval,643 prefilter=self.prefilter)644 if outputs is None:645 outputs = np.zeros(list(im.shape) + [num_channels])646 outputs[..., c] = im647 else: # data_format == "channels_first":648 num_channels = inputs.shape[0]649 outputs = None650 for c in range(num_channels):651 im = scipy.ndimage.zoom(inputs[c, ...],652 new_factor,653 order=self.order,654 mode=self.mode,655 cval=self.cval,656 prefilter=self.prefilter)657 if outputs is None:658 outputs = np.zeros([num_channels] + list(im.shape))659 outputs[c, ...] 
= im660 return outputs661class Rotate(BaseAugmentation):662 """Rotates an image about all standard planes (pairwise standard basis663 vectors).664 The general rotation of the ndimage is implemented as a series of plane665 rotations as666 ``R(I) = R_{n-1, n}(... R_{0, 2}(R_{0, 1}(I))...),``667 for ``I`` and ``n``-dimensional image, where ``R`` is the total rotation,668 and ``R_{i, j}`` is a rotation in the plane defined by axes ``i`` and669 ``j``.670 Hence, a 2-dimensional image will be rotated in the plane defined by the671 axes ``(0, 1)`` (i.e., the image plane), and a 3-dimensional image with672 axes ``(0, 1, 2)`` will be rotated first in the plane defined by ``(0,673 1)``, then in the plane defined by ``(0, 2)``, and finally in the plane674 defined by ``(1, 2)``.675 The order in which the rotations are applied by this class are676 well-defined, but does not commute for dimensions higher than two.677 Rotations in 2-dimensional spatial dimensions, i.e. in ``R^2``, will work678 precisely as expected, and be identical to the expected rotation.679 More information can be found e.g. here:680 http://www.euclideanspace.com/maths/geometry/rotations/theory/nDimensions/index.htm681 Parameters682 ----------683 angles : float, or list/tuple of float684 The rotation angles in degrees. If a single float, rotates all axes by685 this number of degrees. If a list/tuple, rotates the corresponding686 planes by this many degrees. The planes are rotated in the order687 defined by the pairwise axes in increasing indices like ``(0, 1), ...,688 (0, n), ..., (1, 2), ..., (n - 1, n)`` and should thus be of length ``n689 * (n - 1) / 2`` where ``n`` is the number of dimensions in the image.690 reshape : bool, optional691 If True, the output image is reshaped such that the input image is692 contained completely in the output image. 
Default is True.693 order : int, optional694 Integer in [0, 5], the order of the spline used in the interpolation.695 The order corresponds to the following interpolations:696 0: Nearest-neighbor697 1: Bi-linear (default)698 2: Bi-quadratic699 3: Bi-cubic700 4: Bi-quartic701 5: Bi-quintic702 Beware! Higher orders than 1 may cause the values to be outside of the703 allowed range of values for your data. This must be handled manually.704 Default is 1, i.e. bi-linear interpolation.705 mode : {"reflect", "constant", "nearest", "mirror", "wrap"}, optional706 Determines how the border should be handled. Default is "nearest".707 The behavior for each option is:708 "reflect": (d c b a | a b c d | d c b a)709 The input is extended by reflecting about the edge of the last710 pixel.711 "constant": (k k k k | a b c d | k k k k)712 The input is extended by filling all values beyond the edge713 with the same constant value, defined by the cval parameter.714 "nearest": (a a a a | a b c d | d d d d)715 The input is extended by replicating the last pixel.716 "mirror": (d c b | a b c d | c b a)717 The input is extended by reflecting about the center of the718 last pixel.719 "wrap": (a b c d | a b c d | a b c d)720 The input is extended by wrapping around to the opposite edge.721 cval : float, optional722 Value to fill past edges of input if mode is "constant". Default is723 0.0.724 prefilter : bool, optional725 Whether or not to prefilter the input array with a spline filter before726 interpolation. Default is True.727 data_format : str, optional728 One of `channels_last` (default) or `channels_first`. The ordering of729 the dimensions in the inputs. `channels_last` corresponds to inputs730 with shape `(batch, height, width, channels)` while `channels_first`731 corresponds to inputs with shape `(batch, channels, height, width)`. It732 defaults to the `image_data_format` value found in your Keras config733 file at `~/.keras/keras.json`. 
If you never set it, then it will be734 "channels_last".735 random_state : int, float, array_like or numpy.random.RandomState, optional736 A random state to use when sampling pseudo-random numbers. If int,737 float or array_like, a new random state is created with the provided738 value as seed. If None, the default numpy random state (np.random) is739 used. Default is None, use the default numpy random state.740 Examples741 --------742 >>> from nethin.augmentation import Rotate743 >>> import numpy as np744 >>>745 >>> X = np.zeros((5, 5, 1))746 >>> X[1:-1, 1:-1] = 1747 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE748 array([[0., 0., 0., 0., 0.],749 [0., 1., 1., 1., 0.],750 [0., 1., 1., 1., 0.],751 [0., 1., 1., 1., 0.],752 [0., 0., 0., 0., 0.]])753 >>> rotate = Rotate(45, order=1, data_format="channels_last")754 >>> Y = rotate(X)755 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE756 array([[0., 0., 0., 0., 0., 0., 0.],757 [0., 0., 0., 0.34314575, 0., 0., 0.],758 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],759 [0., 0.34314575, 1., 1., 1., 0.34314575, 0.],760 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],761 [0., 0., 0., 0.34314575, 0., 0., 0.],762 [0., 0., 0., 0., 0., 0., 0.]])763 >>> X = X.reshape((1, 5, 5))764 >>> X[0, :, :] # doctest: +NORMALIZE_WHITESPACE765 array([[0., 0., 0., 0., 0.],766 [0., 1., 1., 1., 0.],767 [0., 1., 1., 1., 0.],768 [0., 1., 1., 1., 0.],769 [0., 0., 0., 0., 0.]])770 >>> rotate = Rotate(25, order=1, data_format="channels_first")771 >>> Y = rotate(X)772 >>> Y[0, :, :] # doctest: +NORMALIZE_WHITESPACE773 array([[0., 0., 0., 0., 0., 0., 0.],774 [0., 0., 0., 0.18738443, 0.15155864, 0., 0.],775 [0., 0.15155864, 0.67107395, 1., 0.67107395, 0., 0.],776 [0., 0.18738443, 1., 1., 1., 0.18738443, 0.],777 [0., 0., 0.67107395, 1., 0.67107395, 0.15155864, 0.],778 [0., 0., 0.15155864, 0.18738443, 0., 0., 0.],779 [0., 0., 0., 0., 0., 0., 0.]])780 >>> rotate = Rotate(-25, order=1, data_format="channels_first")781 >>> Y = rotate(X)782 >>> Y[0, :, :] # 
doctest: +NORMALIZE_WHITESPACE783 array([[0., 0., 0., 0., 0., 0., 0.],784 [0., 0., 0.15155864, 0.18738443, 0., 0., 0.],785 [0., 0., 0.67107395, 1., 0.67107395, 0.15155864, 0.],786 [0., 0.18738443, 1., 1., 1., 0.18738443, 0.],787 [0., 0.15155864, 0.67107395, 1., 0.67107395, 0., 0.],788 [0., 0., 0., 0.18738443, 0.15155864, 0., 0.],789 [0., 0., 0., 0., 0., 0., 0.]])790 >>>791 >>> X = np.zeros((5, 5, 5, 1))792 >>> X[1:-1, 1:-1, 1:-1] = 1793 >>> X[:, :, 1, 0] # doctest: +NORMALIZE_WHITESPACE794 array([[0., 0., 0., 0., 0.],795 [0., 1., 1., 1., 0.],796 [0., 1., 1., 1., 0.],797 [0., 1., 1., 1., 0.],798 [0., 0., 0., 0., 0.]])799 >>> rotate = Rotate([45, 0, 0], order=1, data_format="channels_last")800 >>> Y = rotate(X)801 >>> Y[:, :, 2, 0] # doctest: +NORMALIZE_WHITESPACE802 array([[0., 0., 0., 0., 0., 0., 0.],803 [0., 0., 0., 0.34314575, 0., 0., 0.],804 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],805 [0., 0.34314575, 1., 1., 1., 0.34314575, 0.],806 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],807 [0., 0., 0., 0.34314575, 0., 0., 0.],808 [0., 0., 0., 0., 0., 0., 0.]])809 >>> Y[:, 2, :, 0] # doctest: +NORMALIZE_WHITESPACE810 array([[0. , 0. , 0. , 0. , 0. ],811 [0. , 0. , 0. , 0. , 0. ],812 [0. , 0.58578644, 0.58578644, 0.58578644, 0. ],813 [0. , 1. , 1. , 1. , 0. ],814 [0. , 0.58578644, 0.58578644, 0.58578644, 0. ],815 [0. , 0. , 0. , 0. , 0. ],816 [0. , 0. , 0. , 0. , 0. ]])817 >>> Y[2, :, :, 0] # doctest: +NORMALIZE_WHITESPACE818 array([[0. , 0. , 0. , 0. , 0. ],819 [0. , 0. , 0. , 0. , 0. ],820 [0. , 0.58578644, 0.58578644, 0.58578644, 0. ],821 [0. , 1. , 1. , 1. , 0. ],822 [0. , 0.58578644, 0.58578644, 0.58578644, 0. ],823 [0. , 0. , 0. , 0. , 0. ],824 [0. , 0. , 0. , 0. , 0. ]])825 >>> rotate = Rotate([0, 45, 0], order=1, data_format="channels_last")826 >>> Y = rotate(X)827 >>> Y[:, :, 2, 0] # doctest: +NORMALIZE_WHITESPACE828 array([[0. , 0. , 0. , 0. , 0. ],829 [0. , 0. , 0. , 0. , 0. ],830 [0. , 0.58578644, 0.58578644, 0.58578644, 0. ],831 [0. , 1. , 1. , 1. , 0. 
],832 [0. , 0.58578644, 0.58578644, 0.58578644, 0. ],833 [0. , 0. , 0. , 0. , 0. ],834 [0. , 0. , 0. , 0. , 0. ]])835 >>> Y[:, 2, :, 0] # doctest: +NORMALIZE_WHITESPACE836 array([[0., 0., 0., 0., 0., 0., 0.],837 [0., 0., 0., 0.34314575, 0., 0., 0.],838 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],839 [0., 0.34314575, 1., 1., 1., 0.34314575, 0.],840 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],841 [0., 0., 0., 0.34314575, 0., 0., 0.],842 [0., 0., 0., 0., 0., 0., 0.]])843 >>> Y[2, :, :, 0] # doctest: +NORMALIZE_WHITESPACE844 array([[0., 0., 0., 0., 0., 0., 0.],845 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],846 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],847 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],848 [0., 0., 0., 0., 0., 0., 0.]])849 >>> rotate = Rotate([0, 0, 45], order=1, data_format="channels_last")850 >>> Y = rotate(X)851 >>> Y[:, :, 2, 0] # doctest: +NORMALIZE_WHITESPACE852 array([[0., 0., 0., 0., 0., 0., 0.],853 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],854 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],855 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],856 [0., 0., 0., 0., 0., 0., 0.]])857 >>> Y[:, 2, :, 0] # doctest: +NORMALIZE_WHITESPACE858 array([[0., 0., 0., 0., 0., 0., 0.],859 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],860 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],861 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],862 [0., 0., 0., 0., 0., 0., 0.]])863 >>> Y[2, :, :, 0] # doctest: +NORMALIZE_WHITESPACE864 array([[0., 0., 0., 0., 0., 0., 0.],865 [0., 0., 0., 0.34314575, 0., 0., 0.],866 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],867 [0., 0.34314575, 1., 1., 1., 0.34314575, 0.],868 [0., 0., 0.58578644, 1., 0.58578644, 0., 0.],869 [0., 0., 0., 0.34314575, 0., 0., 0.],870 [0., 0., 0., 0., 0., 0., 0.]])871 """872 def __init__(self,873 angles,874 reshape=True,875 order=1,876 mode="nearest",877 cval=0.0,878 prefilter=True,879 data_format=None,880 random_state=None):881 super().__init__(data_format=data_format,882 random_state=random_state)883 if 
isinstance(angles, (int, float)):884 self.angles = float(angles)885 elif isinstance(angles, (list, tuple)):886 self.angles = [float(angles[i]) for i in range(len(angles))]887 else:888 raise ValueError("angles must be a float, or a list of floats.")889 self.reshape = bool(reshape)890 if int(order) not in [0, 1, 2, 3, 4, 5]:891 raise ValueError('``order`` must be in [0, 5].')892 self.order = int(order)893 if str(mode).lower() in {"reflect", "constant", "nearest", "mirror",894 "wrap"}:895 self.mode = str(mode).lower()896 else:897 raise ValueError('``mode`` must be one of "reflect", "constant", '898 '"nearest", "mirror", or "wrap".')899 self.cval = float(cval)900 self.prefilter = bool(prefilter)901 def __call__(self, inputs):902 n = inputs.ndim - 1 # Channel dimension excluded903 nn2 = n * (n-1) // 2904 if isinstance(self.angles, float):905 angles = [self.angles] * nn2906 else:907 angles = self.angles908 if len(angles) != nn2:909 warnings.warn("The number of provided angles (%d) does not match "910 "the required number of angles (n * (n - 1) / 2 = "911 "%d). 
The result may suffer."912 % (len(angles), nn2))913 if self.data_format == "channels_last":914 num_channels = inputs.shape[-1]915 else: # data_format == "channels_first":916 num_channels = inputs.shape[0]917 plane_i = 0918 # i = 0919 for i in range(n - 1):920 # j = 1921 for j in range(i + 1, n):922 if plane_i < len(angles): # Only rotate if specified923 outputs = None924 # c = 0925 for c in range(num_channels):926 if self.data_format == "channels_last":927 inputs_ = inputs[..., c]928 else: # data_format == "channels_first":929 inputs_ = inputs[c, ...]930 im = scipy.ndimage.rotate(inputs_,931 angles[plane_i],932 axes=(i, j),933 reshape=self.reshape,934 output=None,935 order=self.order,936 mode=self.mode,937 cval=self.cval,938 prefilter=self.prefilter)939 if self.data_format == "channels_last":940 if outputs is None:941 outputs = np.zeros(942 list(im.shape) + [num_channels])943 outputs[..., c] = im944 else: # data_format == "channels_first":945 if outputs is None:946 outputs = np.zeros(947 [num_channels] + list(im.shape))948 outputs[c, ...] = im949 inputs = outputs # Next pair of axes will use output950 plane_i += 1951 return outputs952class Crop(BaseAugmentation):953 """Crops an image.954 Parameters955 ----------956 crop : int, or list/tuple of int957 A subimage size to crop from the image. If a single int, use the same958 crop size for all dimensions. If an image is smaller than crop in any959 direction, no cropping will be performed in that direction.960 random : bool, optional961 Whether to select a random crop position, or to crop the middle portion962 of the image when cropping. Default is True, select a random crop.963 data_format : str, optional964 One of `channels_last` (default) or `channels_first`. The ordering of965 the dimensions in the inputs. `channels_last` corresponds to inputs966 with shape `(batch, height, width, channels)` while `channels_first`967 corresponds to inputs with shape `(batch, channels, height, width)`. 
It968 defaults to the `image_data_format` value found in your Keras config969 file at `~/.keras/keras.json`. If you never set it, then it will be970 "channels_last".971 random_state : int, float, array_like or numpy.random.RandomState, optional972 A random state to use when sampling pseudo-random numbers. If int,973 float or array_like, a new random state is created with the provided974 value as seed. If None, the default numpy random state (np.random) is975 used. Default is None, use the default numpy random state.976 Examples977 --------978 >>> from nethin.augmentation import Crop979 >>> import numpy as np980 >>>981 >>> np.random.seed(42)982 >>> X = np.random.rand(4, 4, 1)983 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE984 array([[ 0.37454012, 0.95071431, 0.73199394, 0.59865848],985 [ 0.15601864, 0.15599452, 0.05808361, 0.86617615],986 [ 0.60111501, 0.70807258, 0.02058449, 0.96990985],987 [ 0.83244264, 0.21233911, 0.18182497, 0.18340451]])988 >>> crop = Crop([2, 2], random=False)989 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE990 array([[ 0.15599452, 0.05808361],991 [ 0.70807258, 0.02058449]])992 >>> crop = Crop([2, 2], random=True)993 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE994 array([[ 0.15599452, 0.05808361],995 [ 0.70807258, 0.02058449]])996 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE997 array([[ 0.37454012, 0.95071431],998 [ 0.15601864, 0.15599452]])999 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1000 array([[ 0.73199394, 0.59865848],1001 [ 0.05808361, 0.86617615]])1002 >>>1003 >>> np.random.seed(42)1004 >>> X = np.random.rand(3, 3, 1)1005 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1006 array([[ 0.37454012, 0.95071431, 0.73199394],1007 [ 0.59865848, 0.15601864, 0.15599452],1008 [ 0.05808361, 0.86617615, 0.60111501]])1009 >>> crop = Crop([2, 2], random=False)1010 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1011 array([[0.15601864, 0.15599452],1012 [0.86617615, 0.60111501]])1013 >>> crop = Crop([2, 2], 
random=True)1014 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1015 array([[ 0.59865848, 0.15601864],1016 [ 0.05808361, 0.86617615]])1017 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1018 array([[ 0.59865848, 0.15601864],1019 [ 0.05808361, 0.86617615]])1020 >>> crop(X)[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1021 array([[ 0.15601864, 0.15599452],1022 [ 0.86617615, 0.60111501]])1023 >>>1024 >>> np.random.seed(42)1025 >>> X = np.random.rand(3, 3, 3, 1)1026 >>> X[:, :, :, 0] # doctest: +NORMALIZE_WHITESPACE1027 array([[[ 0.37454012, 0.95071431, 0.73199394],1028 [ 0.59865848, 0.15601864, 0.15599452],1029 [ 0.05808361, 0.86617615, 0.60111501]],1030 [[ 0.70807258, 0.02058449, 0.96990985],1031 [ 0.83244264, 0.21233911, 0.18182497],1032 [ 0.18340451, 0.30424224, 0.52475643]],1033 [[ 0.43194502, 0.29122914, 0.61185289],1034 [ 0.13949386, 0.29214465, 0.36636184],1035 [ 0.45606998, 0.78517596, 0.19967378]]])1036 >>> crop = Crop([2, 2, 2], random=False)1037 >>> crop(X) # doctest: +NORMALIZE_WHITESPACE1038 array([[[[0.21233911],1039 [0.18182497]],1040 [[0.30424224],1041 [0.52475643]]],1042 [[[0.29214465],1043 [0.36636184]],1044 [[0.78517596],1045 [0.19967378]]]])1046 >>> crop = Crop([2, 2, 2], random=True)1047 >>> crop(X) # doctest: +NORMALIZE_WHITESPACE1048 array([[[[0.37454012],1049 [0.95071431]],1050 [[0.59865848],1051 [0.15601864]]],1052 [[[0.70807258],1053 [0.02058449]],1054 [[0.83244264],1055 [0.21233911]]]])1056 >>> crop = Crop([2, 2, 2], random=False, data_format="channels_last")1057 >>> crop(X) # doctest: +NORMALIZE_WHITESPACE1058 array([[[[0.21233911],1059 [0.18182497]],1060 [[0.30424224],1061 [0.52475643]]],1062 [[[0.29214465],1063 [0.36636184]],1064 [[0.78517596],1065 [0.19967378]]]])1066 >>> np.all(crop(X) == X[1:3, 1:3, 1:3, :])1067 True1068 >>> crop(X).shape == X[1:3, 1:3, 1:3, :].shape1069 True1070 >>> crop = Crop([2, 2, 2], random=False, data_format="channels_first")1071 >>> crop(X) # doctest: +NORMALIZE_WHITESPACE1072 array([[[[0.15601864],1073 
[0.15599452]],1074 [[0.86617615],1075 [0.60111501]]],1076 [[[0.21233911],1077 [0.18182497]],1078 [[0.30424224],1079 [0.52475643]]],1080 [[[0.29214465],1081 [0.36636184]],1082 [[0.78517596],1083 [0.19967378]]]])1084 >>> np.all(crop(X) == X[:, 1:3, 1:3, 1:3])1085 True1086 >>> crop(X).shape == X[:, 1:3, 1:3, :].shape1087 True1088 >>> np.random.seed(42)1089 >>> X = np.random.rand(4, 4, 1)1090 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1091 array([[0.37454012, 0.95071431, 0.73199394, 0.59865848],1092 [0.15601864, 0.15599452, 0.05808361, 0.86617615],1093 [0.60111501, 0.70807258, 0.02058449, 0.96990985],1094 [0.83244264, 0.21233911, 0.18182497, 0.18340451]])1095 >>> crop = Crop([2, 2], random=True, data_format="channels_last")1096 >>> crop(X)[:, :, 0]1097 array([[0.15599452, 0.05808361],1098 [0.70807258, 0.02058449]])1099 >>> crop(X)[:, :, 0]1100 array([[0.37454012, 0.95071431],1101 [0.15601864, 0.15599452]])1102 >>> crop.lock()1103 >>> crop(X)[:, :, 0]1104 array([[0.37454012, 0.95071431],1105 [0.15601864, 0.15599452]])1106 >>> crop(X)[:, :, 0]1107 array([[0.37454012, 0.95071431],1108 [0.15601864, 0.15599452]])1109 >>> crop(X)[:, :, 0]1110 array([[0.37454012, 0.95071431],1111 [0.15601864, 0.15599452]])1112 >>> crop(X)[:, :, 0]1113 array([[0.37454012, 0.95071431],1114 [0.15601864, 0.15599452]])1115 >>> crop(X)[:, :, 0]1116 array([[0.37454012, 0.95071431],1117 [0.15601864, 0.15599452]])1118 >>> crop.unlock()1119 >>> crop(X)[:, :, 0]1120 array([[0.73199394, 0.59865848],1121 [0.05808361, 0.86617615]])1122 >>> crop(X)[:, :, 0]1123 array([[0.02058449, 0.96990985],1124 [0.18182497, 0.18340451]])1125 """1126 def __init__(self,1127 crop,1128 random=True,1129 data_format=None,1130 random_state=None):1131 super().__init__(data_format=data_format,1132 random_state=random_state)1133 if isinstance(crop, (int, float)):1134 self.crop = int(crop)1135 elif isinstance(crop, (list, tuple)):1136 self.crop = [max(0, int(crop[i])) for i in range(len(crop))]1137 else:1138 raise 
ValueError("crop must be an int, or a list/tuple of ints.")1139 self.random = bool(random)1140 # Not checked in base class1141 assert(hasattr(self.random_state, "randint"))1142 if self.data_format == "channels_last":1143 self._axis_offset = 01144 else: # data_format == "channels_first":1145 self._axis_offset = 11146 def __call__(self, inputs):1147 ndim = inputs.ndim - 1 # Channel dimension excluded1148 if isinstance(self.crop, int):1149 crop = [self.crop] * ndim1150 else:1151 crop = self.crop1152 if self._random is None:1153 self._random = [None] * len(crop)1154 if len(crop) != ndim:1155 warnings.warn("The provided number of crop sizes (%d) does not "1156 "match the required number of crop sizes (%d). The "1157 "result may suffer."1158 % (len(crop), ndim))1159 for i in range(len(crop)):1160 crop[i] = min(inputs.shape[self._axis_offset + i], crop[i])1161 slices = []1162 if self._axis_offset > 0:1163 slices.append(slice(None))1164 for i in range(len(crop)):1165 if not self.random:1166 coord = int(((inputs.shape[self._axis_offset + i] / 2)1167 - (crop[i] / 2)) + 0.5)1168 else:1169 if (not self._lock) or (self._random[i] is None):1170 coord = self.random_state.randint(1171 0,1172 max(1,1173 inputs.shape[self._axis_offset + i] - crop[i] + 1))1174 self._random[i] = coord1175 else:1176 coord = self._random[i]1177 slices.append(slice(coord, coord + crop[i]))1178 outputs = inputs[tuple(slices)]1179 return outputs1180class Shear(BaseAugmentation):1181 """Shears an image.1182 Parameters1183 ----------1184 shear : float, or list/tuple of float1185 The angle in degrees to shear the image with. If a single float, use1186 the same shear for all specified axes, otherwise use ``shear[i]`` for1187 axis ``axes[i]``.1188 axes : tuple of int, or list/tuple of tuple of int1189 The first value of each tuple corresponds to the axis to shear parallel1190 to, and the second value of each tuple is the axis to shear. 
The length1191 of axes should be the same as the length of shear, or shear should be a1192 single float (meaning to shear all axes).1193 order : int, optional1194 Integer in [0, 5], the order of the spline used in the interpolation.1195 The order corresponds to the following interpolations:1196 0: Nearest-neighbor1197 1: Bi-linear (default)1198 2: Bi-quadratic1199 3: Bi-cubic1200 4: Bi-quartic1201 5: Bi-quintic1202 Beware! Higher orders than 1 may cause the values to be outside of the1203 allowed range of values for your data. This must be handled manually.1204 Default is 1, i.e. bi-linear interpolation.1205 mode : {"reflect", "constant", "nearest", "mirror", "wrap"}, optional1206 Determines how the border should be handled. Default is "nearest".1207 The behavior for each option is:1208 "reflect": (d c b a | a b c d | d c b a)1209 The input is extended by reflecting about the edge of the last1210 pixel.1211 "constant": (k k k k | a b c d | k k k k)1212 The input is extended by filling all values beyond the edge1213 with the same constant value, defined by the cval parameter.1214 "nearest": (a a a a | a b c d | d d d d)1215 The input is extended by replicating the last pixel.1216 "mirror": (d c b | a b c d | c b a)1217 The input is extended by reflecting about the center of the1218 last pixel.1219 "wrap": (a b c d | a b c d | a b c d)1220 The input is extended by wrapping around to the opposite edge.1221 cval : float, optional1222 Value to fill past edges of input if mode is "constant". Default is1223 0.0.1224 prefilter : bool, optional1225 Whether or not to prefilter the input array with a spline filter before1226 interpolation. Default is True.1227 data_format : str, optional1228 One of `channels_last` (default) or `channels_first`. The ordering of1229 the dimensions in the inputs. `channels_last` corresponds to inputs1230 with shape `(batch, height, width, channels)` while `channels_first`1231 corresponds to inputs with shape `(batch, channels, height, width)`. 
It1232 defaults to the `image_data_format` value found in your Keras config1233 file at `~/.keras/keras.json`. If you never set it, then it will be1234 "channels_last".1235 random_state : int, float, array_like or numpy.random.RandomState, optional1236 A random state to use when sampling pseudo-random numbers. If int,1237 float or array_like, a new random state is created with the provided1238 value as seed. If None, the default numpy random state (np.random) is1239 used. Default is None, use the default numpy random state.1240 Examples1241 --------1242 >>> from nethin.augmentation import Shear1243 >>> import numpy as np1244 >>>1245 >>> X = np.zeros((5, 5, 1))1246 >>> X[1:-1, 1:-1] = 11247 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1248 array([[0., 0., 0., 0., 0.],1249 [0., 1., 1., 1., 0.],1250 [0., 1., 1., 1., 0.],1251 [0., 1., 1., 1., 0.],1252 [0., 0., 0., 0., 0.]])1253 >>> shear = Shear([-45], axes=(1, 0))1254 >>> (shear(X)[:, :, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1255 array([[0, 0, 0, 0, 0, 0, 0, 0, 0],1256 [0, 0, 1, 1, 1, 0, 0, 0, 0],1257 [0, 0, 0, 1, 1, 1, 0, 0, 0],1258 [0, 0, 0, 0, 1, 1, 0, 0, 0],1259 [0, 0, 0, 0, 0, 0, 0, 0, 0]])1260 >>> shear = Shear([45], axes=(1, 0))1261 >>> (shear(X)[:, :, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1262 array([[0, 0, 0, 0, 0, 0, 0, 0, 0],1263 [0, 0, 0, 0, 1, 1, 1, 0, 0],1264 [0, 0, 0, 1, 1, 1, 0, 0, 0],1265 [0, 0, 1, 1, 1, 0, 0, 0, 0],1266 [0, 0, 0, 0, 0, 0, 0, 0, 0]])1267 >>> shear = Shear([45], axes=(0, 1))1268 >>> (shear(X)[:, :, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1269 array([[0, 0, 0, 0, 0],1270 [0, 0, 0, 0, 0],1271 [0, 0, 0, 1, 0],1272 [0, 0, 1, 1, 0],1273 [0, 1, 1, 1, 0],1274 [0, 1, 1, 0, 0],1275 [0, 1, 0, 0, 0],1276 [0, 0, 0, 0, 0],1277 [0, 0, 0, 0, 0]])1278 >>>1279 >>> if False: # Stress test1280 >>> import matplotlib.pyplot as plt1281 >>> s = 111282 >>> d = 111283 >>> ss = np.linspace(-np.pi / 3, np.pi / 3, d) * 180 / np.pi1284 >>> plt.figure()1285 >>> plot_i 
= 11286 >>> for s_i in range(s):1287 >>> X = np.zeros((5 + s_i, 10, 1))1288 >>> X[1:-1, 1:-1] = 11289 >>> for d_i in range(d):1290 >>> plt.subplot(s, d, plot_i)1291 >>> print(s_i + 1)1292 >>> print(d_i + 1)1293 >>> print(plot_i)1294 >>> plot_i += 11295 >>> shear = Shear([ss[d_i]], axes=(1, 0))1296 >>> plt.imshow(shear(X)[:, :, 0])1297 >>>1298 >>> X = np.zeros((5, 5, 5, 1))1299 >>> X[1:-1, 1:-1, 1:-1] = 11300 >>> X[2, :, :, 0] # doctest: +NORMALIZE_WHITESPACE1301 array([[0., 0., 0., 0., 0.],1302 [0., 1., 1., 1., 0.],1303 [0., 1., 1., 1., 0.],1304 [0., 1., 1., 1., 0.],1305 [0., 0., 0., 0., 0.]])1306 >>> shear = Shear([45], axes=(1, 0))1307 >>> Y = shear(X)1308 >>> (Y[:, :, 2, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1309 array([[0, 0, 0, 0, 0, 0, 0, 0, 0],1310 [0, 0, 0, 0, 1, 1, 1, 0, 0],1311 [0, 0, 0, 1, 1, 1, 0, 0, 0],1312 [0, 0, 1, 1, 1, 0, 0, 0, 0],1313 [0, 0, 0, 0, 0, 0, 0, 0, 0]])1314 >>> (Y[:, 3, :, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1315 array([[0, 0, 0, 0, 0],1316 [0, 0, 0, 0, 0],1317 [0, 1, 1, 1, 0],1318 [0, 1, 1, 1, 0],1319 [0, 0, 0, 0, 0]])1320 >>> shear = Shear([45], axes=(2, 1))1321 >>> Y = shear(X)1322 >>> (Y[2, :, :, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1323 array([[0, 0, 0, 0, 0, 0, 0, 0, 0],1324 [0, 0, 0, 0, 1, 1, 1, 0, 0],1325 [0, 0, 0, 1, 1, 1, 0, 0, 0],1326 [0, 0, 1, 1, 1, 0, 0, 0, 0],1327 [0, 0, 0, 0, 0, 0, 0, 0, 0]])1328 >>> (Y[:, :, 3, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1329 array([[0, 0, 0, 0, 0],1330 [0, 0, 1, 1, 0],1331 [0, 0, 1, 1, 0],1332 [0, 0, 1, 1, 0],1333 [0, 0, 0, 0, 0]])1334 >>>1335 >>> shear = Shear([30, 60], axes=[(1, 0), (2, 1)])1336 >>> Y = shear(X)1337 >>> (Y[:, :, 2, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1338 array([[0, 0, 0, 0, 0, 0, 0],1339 [0, 0, 0, 0, 0, 0, 0],1340 [0, 0, 0, 0, 0, 0, 0],1341 [0, 0, 0, 0, 0, 0, 0],1342 [0, 0, 0, 0, 0, 0, 0]])1343 >>> (Y[:, :, 4, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1344 array([[0, 0, 0, 0, 
0, 0, 0],1345 [0, 0, 0, 0, 1, 1, 0],1346 [0, 0, 0, 0, 1, 0, 0],1347 [0, 0, 0, 0, 0, 0, 0],1348 [0, 0, 0, 0, 0, 0, 0]])1349 >>> (Y[:, :, 6, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1350 array([[0, 0, 0, 0, 0, 0, 0],1351 [0, 0, 0, 1, 1, 0, 0],1352 [0, 0, 0, 1, 1, 0, 0],1353 [0, 0, 0, 1, 1, 0, 0],1354 [0, 0, 0, 0, 0, 0, 0]])1355 >>> (Y[:, :, 8, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1356 array([[0, 0, 0, 0, 0, 0, 0],1357 [0, 0, 0, 1, 0, 0, 0],1358 [0, 0, 1, 1, 0, 0, 0],1359 [0, 0, 1, 1, 0, 0, 0],1360 [0, 0, 0, 0, 0, 0, 0]])1361 >>> (Y[:, :, 10, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1362 array([[0, 0, 0, 0, 0, 0, 0],1363 [0, 0, 0, 0, 0, 0, 0],1364 [0, 0, 1, 0, 0, 0, 0],1365 [0, 0, 1, 0, 0, 0, 0],1366 [0, 0, 0, 0, 0, 0, 0]])1367 >>> (Y[:, :, 12, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1368 array([[0, 0, 0, 0, 0, 0, 0],1369 [0, 0, 0, 0, 0, 0, 0],1370 [0, 0, 0, 0, 0, 0, 0],1371 [0, 0, 0, 0, 0, 0, 0],1372 [0, 0, 0, 0, 0, 0, 0]])1373 >>> (Y[2, :, :, 0] + 0.5).astype(int) # doctest: +NORMALIZE_WHITESPACE1374 array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],1375 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],1376 [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0],1377 [0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0],1378 [0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0],1379 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],1380 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]])1381 """1382 def __init__(self,1383 shear,1384 axes,1385 order=1,1386 mode="nearest",1387 cval=0.0,1388 prefilter=True,1389 data_format=None,1390 random_state=None):1391 super().__init__(data_format=data_format,1392 random_state=random_state)1393 if isinstance(shear, (float, int)):1394 self.shear = float(shear)1395 elif isinstance(shear, (list, tuple)):1396 self.shear = [float(shear[i]) for i in range(len(shear))]1397 else:1398 raise ValueError("shear must be a float, or a list/tuple of "1399 "floats.")1400 if isinstance(axes, (list, tuple)):1401 if 
isinstance(axes[0], (list, tuple)):1402 self.axes = [(int(a[0]), int(a[1])) for a in axes]1403 else:1404 self.axes = [axes]1405 else:1406 raise ValueError("axes should be a list/tuple of 2-tuples of ints")1407 if int(order) not in [0, 1, 2, 3, 4, 5]:1408 raise ValueError('``order`` must be in [0, 5].')1409 self.order = int(order)1410 if str(mode).lower() in {"reflect", "constant", "nearest", "mirror",1411 "wrap"}:1412 self.mode = str(mode).lower()1413 else:1414 raise ValueError('``mode`` must be one of "reflect", "constant", '1415 '"nearest", "mirror", or "wrap".')1416 self.cval = float(cval)1417 self.prefilter = bool(prefilter)1418 if self.data_format == "channels_last":1419 self._axis_offset = 01420 else: # data_format == "channels_first":1421 self._axis_offset = 11422 def __call__(self, inputs):1423 ndim = inputs.ndim - 1 # Channel dimension excluded1424 if isinstance(self.shear, float):1425 shear = [self.shear] * ndim1426 else:1427 shear = self.shear1428 if len(self.axes) != len(shear):1429 raise RuntimeError("The provided number of axes (%d) does not "1430 "match the provided number of shear values "1431 "(%d)." 
% (len(self.axes), len(shear)))1432 if self.data_format == "channels_last":1433 num_channels = inputs.shape[-1]1434 else: # data_format == "channels_first":1435 num_channels = inputs.shape[0]1436 S = np.eye(ndim)1437 offset = [0.0] * ndim1438 if self.data_format == "channels_last":1439 output_shape = list(inputs.shape[:-1])1440 else: # data_format == "channels_first":1441 output_shape = list(inputs.shape[1:])1442 # i = 01443 for i in range(len(shear)):1444 idx0 = self.axes[i][0]1445 idx1 = self.axes[i][1]1446 # +-------+---+1447 # | | /|1448 # | I |_/ | h1449 # | |/ |1450 # +-------+---+1451 # w b1452 s = shear[i]1453 h = float(output_shape[idx1] - 1)1454 b = h * np.tan(s * (np.pi / 180.0))1455 S_i = np.eye(ndim)1456 if abs(b) > 10.0 * np.finfo("float").eps:1457 S_i[idx0, idx1] = b / h1458 else:1459 S_i[idx0, idx1] = 0.01460 output_shape[idx0] += int(abs(b) + 0.5)1461 if s > 0.0:1462 offset[idx0] -= b1463 S = np.dot(S, S_i)1464 if self.data_format == "channels_last":1465 outputs = np.zeros(output_shape + [num_channels])1466 else: # data_format == "channels_first":1467 outputs = np.zeros([num_channels] + output_shape)1468 output_shape = tuple(output_shape)1469 for c in range(num_channels):1470 if self.data_format == "channels_last":1471 inputs_ = inputs[..., c]1472 else: # data_format == "channels_first":1473 inputs_ = inputs[c, ...]1474 im = scipy.ndimage.affine_transform(inputs_,1475 S,1476 offset=offset,1477 output_shape=output_shape,1478 output=None,1479 order=self.order,1480 mode=self.mode,1481 cval=self.cval,1482 prefilter=self.prefilter)1483 if self.data_format == "channels_last":1484 outputs[..., c] = im1485 else: # data_format == "channels_first":1486 outputs[c, ...] = im1487 return outputs1488class DistortionField(BaseAugmentation):1489 """Applies a vector field to an image to distort it.1490 Parameters1491 ----------1492 field : ndarray, optional1493 An ndarray of shape ``(*I.shape, I.ndim)``, where ``I`` is the input1494 image. 
Passing None is equal to passing ``np.zeros((*I.shape,1495 I.ndim))``, i.e. no vector field is added (unless there is a random1496 addition). Default is None.1497 random_size : float, optional1498 If ``random_size > 0``, then a new random vector field is added to the1499 provided field at each call, with the largest vector having 2-norm1500 ``random_size``. The amount is independent and uniform for all elements1501 and all dimensions. Default is 0.0, i.e. add no random vector field.1502 reshape : bool, optional1503 If True, the output image is reshaped such that the input image is1504 contained completely in the output image. Default is True.1505 order : int, optional1506 Integer in [0, 5], the order of the spline used in the interpolation.1507 The order corresponds to the following interpolations:1508 0: Nearest-neighbor1509 1: Bi-linear (default)1510 2: Bi-quadratic1511 3: Bi-cubic1512 4: Bi-quartic1513 5: Bi-quintic1514 Beware! Higher orders than 1 may cause the values to be outside of the1515 allowed range of values for your data. This must be handled manually.1516 Default is 1, i.e. bi-linear interpolation.1517 mode : {"reflect", "constant", "nearest", "wrap"}, optional1518 Determines how the border should be handled. Default is "nearest".1519 The behavior for each option is:1520 "reflect": (d c b a | a b c d | d c b a)1521 The input is extended by reflecting about the edge of the last1522 pixel.1523 "constant": (k k k k | a b c d | k k k k)1524 The input is extended by filling all values beyond the edge1525 with the same constant value, defined by the cval parameter.1526 "nearest": (a a a a | a b c d | d d d d)1527 The input is extended by replicating the last pixel.1528 "wrap": (a b c d | a b c d | a b c d)1529 The input is extended by wrapping around to the opposite edge.1530 cval : float, optional1531 Value to fill past edges of input if mode is "constant". 
Default is1532 0.0.1533 prefilter : bool, optional1534 Whether or not to prefilter the input array with a spline filter before1535 interpolation. Default is True.1536 data_format : str, optional1537 One of `channels_last` (default) or `channels_first`. The ordering of1538 the dimensions in the inputs. `channels_last` corresponds to inputs1539 with shape `(batch, height, width, channels)` while `channels_first`1540 corresponds to inputs with shape `(batch, channels, height, width)`. It1541 defaults to the `image_data_format` value found in your Keras config1542 file at `~/.keras/keras.json`. If you never set it, then it will be1543 "channels_last".1544 random_state : int, float, array_like or numpy.random.RandomState, optional1545 A random state to use when sampling pseudo-random numbers. If int,1546 float or array_like, a new random state is created with the provided1547 value as seed. If None, the default numpy random state (np.random) is1548 used. Default is None, use the default numpy random state.1549 Examples1550 --------1551 >>> from nethin.augmentation import DistortionField1552 >>> import nethin.utils as utils1553 >>> import numpy as np1554 >>> np.random.seed(42)1555 >>>1556 >>> d = 51557 >>> X = np.zeros((d, d, 1))1558 >>> for i in range(1, d, 2):1559 ... X[i, 1:-1] = 11560 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1561 array([[0., 0., 0., 0., 0.],1562 [0., 1., 1., 1., 0.],1563 [0., 0., 0., 0., 0.],1564 [0., 1., 1., 1., 0.],1565 [0., 0., 0., 0., 0.]])1566 >>> m = 11567 >>> U = np.tile(np.linspace(-m, m, d).reshape(-1, 1), (1, d))[...,1568 ... np.newaxis]1569 >>> V = np.tile(np.linspace(-m, m, d), (d, 1))[..., np.newaxis]1570 >>> vf = np.concatenate([U, V], axis=2)1571 >>> vf[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1572 array([[-1. , -1. , -1. , -1. , -1. ],1573 [-0.5, -0.5, -0.5, -0.5, -0.5],1574 [ 0. , 0. , 0. , 0. , 0. ],1575 [ 0.5, 0.5, 0.5, 0.5, 0.5],1576 [ 1. , 1. , 1. , 1. , 1. 
]])1577 >>> vf[:, :, 1] # doctest: +NORMALIZE_WHITESPACE1578 array([[-1. , -0.5, 0. , 0.5, 1. ],1579 [-1. , -0.5, 0. , 0.5, 1. ],1580 [-1. , -0.5, 0. , 0.5, 1. ],1581 [-1. , -0.5, 0. , 0.5, 1. ],1582 [-1. , -0.5, 0. , 0.5, 1. ]])1583 >>> distortion_field = DistortionField(vf)1584 >>> Y = distortion_field(X)1585 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1586 array([[1. , 1. , 1. , 1. , 1. ],1587 [0.5, 0.5, 0.5, 0.5, 0.5],1588 [0. , 0. , 0. , 0. , 0. ],1589 [0.5, 0.5, 0.5, 0.5, 0.5],1590 [1. , 1. , 1. , 1. , 1. ]])1591 >>>1592 >>> X = np.zeros((d, d, 1))1593 >>> for i in range(1, d, 2):1594 ... X[1:-1, i] = 11595 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1596 array([[0., 0., 0., 0., 0.],1597 [0., 1., 0., 1., 0.],1598 [0., 1., 0., 1., 0.],1599 [0., 1., 0., 1., 0.],1600 [0., 0., 0., 0., 0.]])1601 >>> distortion_field = DistortionField(vf)1602 >>> Y = distortion_field(X)1603 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1604 array([[1. , 0.5, 0. , 0.5, 1. ],1605 [1. , 0.5, 0. , 0.5, 1. ],1606 [1. , 0.5, 0. , 0.5, 1. ],1607 [1. , 0.5, 0. , 0.5, 1. ],1608 [1. , 0.5, 0. , 0.5, 1. ]])1609 >>>1610 >>> vx = -U / np.sqrt(V**2 + U**2 + 10**-16) * np.exp(-(V**2 + U**2))1611 >>> vy = V / np.sqrt(V**2 + U**2 + 10**-16) * np.exp(-(V**2 + U**2))1612 >>> vf = np.concatenate((vy, vx), axis=2)1613 >>> distortion_field = DistortionField(vf)1614 >>> Y = distortion_field(X)1615 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1616 array([[0. , 0.09529485, 0. , 0. , 0. ],1617 [0. , 0.57111806, 0.77880073, 0.32617587, 0.09529479],1618 [0. , 1. , 0. , 1. , 0. ],1619 [0.09529483, 0.3261759 , 0.77880073, 0.57111812, 0. ],1620 [0. , 0. , 0. , 0.09529477, 0. 
]])1621 >>>1622 >>> V = np.tile(np.linspace(-m, m, d), (d, 1))[..., np.newaxis]1623 >>> U = np.zeros_like(V)1624 >>> vf = np.concatenate([U, V], axis=2)1625 >>> vf[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1626 array([[0., 0., 0., 0., 0.],1627 [0., 0., 0., 0., 0.],1628 [0., 0., 0., 0., 0.],1629 [0., 0., 0., 0., 0.],1630 [0., 0., 0., 0., 0.]])1631 >>> vf[:, :, 1] # doctest: +NORMALIZE_WHITESPACE1632 array([[-1. , -0.5, 0. , 0.5, 1. ],1633 [-1. , -0.5, 0. , 0.5, 1. ],1634 [-1. , -0.5, 0. , 0.5, 1. ],1635 [-1. , -0.5, 0. , 0.5, 1. ],1636 [-1. , -0.5, 0. , 0.5, 1. ]])1637 >>> distortion_field = DistortionField(vf)1638 >>> Y = distortion_field(X)1639 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1640 array([[0. , 0. , 0. , 0. , 0. ],1641 [1. , 0.5, 0. , 0.5, 1. ],1642 [1. , 0.5, 0. , 0.5, 1. ],1643 [1. , 0.5, 0. , 0.5, 1. ],1644 [0. , 0. , 0. , 0. , 0. ]])1645 >>> U = np.tile(np.linspace(-m, m, d).reshape(-1, 1), (1, d))[...,1646 ... np.newaxis]1647 >>> V = np.zeros_like(U)1648 >>> vf = np.concatenate([U, V], axis=2)1649 >>> distortion_field = DistortionField(vf)1650 >>> Y = distortion_field(X)1651 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1652 array([[0., 1., 0., 1., 0.],1653 [0., 1., 0., 1., 0.],1654 [0., 1., 0., 1., 0.],1655 [0., 1., 0., 1., 0.],1656 [0., 1., 0., 1., 0.]])1657 >>>1658 >>> X = np.zeros((d, d, 1))1659 >>> X[d // 2, :] = 11660 >>> X[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1661 array([[0., 0., 0., 0., 0.],1662 [0., 0., 0., 0., 0.],1663 [1., 1., 1., 1., 1.],1664 [0., 0., 0., 0., 0.],1665 [0., 0., 0., 0., 0.]])1666 >>> vf[..., 0] = vf[..., 0].T1667 >>> vf[..., 0] # doctest: +NORMALIZE_WHITESPACE1668 array([[-1. , -0.5, 0. , 0.5, 1. ],1669 [-1. , -0.5, 0. , 0.5, 1. ],1670 [-1. , -0.5, 0. , 0.5, 1. ],1671 [-1. , -0.5, 0. , 0.5, 1. ],1672 [-1. , -0.5, 0. , 0.5, 1. ]])1673 >>> distortion_field = DistortionField(vf)1674 >>> Y = distortion_field(X)1675 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1676 array([[0. , 0. , 0. , 0. , 0. ],1677 [0. , 0. , 0. 
, 0. , 0. ],1678 [1. , 0.5, 0. , 0. , 0. ],1679 [0. , 0.5, 1. , 0.5, 0. ],1680 [0. , 0. , 0. , 0.5, 1. ],1681 [0. , 0. , 0. , 0. , 0. ],1682 [0. , 0. , 0. , 0. , 0. ]])1683 >>> distortion_field = DistortionField(vf, reshape=False)1684 >>> Y = distortion_field(X)1685 >>> Y[:, :, 0] # doctest: +NORMALIZE_WHITESPACE1686 array([[0. , 0. , 0. , 0. , 0. ],1687 [1. , 0.5, 0. , 0. , 0. ],1688 [0. , 0.5, 1. , 0.5, 0. ],1689 [0. , 0. , 0. , 0.5, 1. ],1690 [0. , 0. , 0. , 0. , 0. ]])1691 """1692 def __init__(self,1693 field=None,1694 random_size=0.0,1695 reshape=True,1696 order=1,1697 mode="nearest",1698 cval=0.0,1699 prefilter=True,1700 data_format=None,1701 random_state=None):1702 super().__init__(data_format=data_format,1703 random_state=random_state)1704 if field is None:1705 self.field = field1706 elif isinstance(field, np.ndarray):1707 self.field = field.astype(np.float32)1708 else:1709 raise ValueError("field should be a numpy.ndarray.")1710 if isinstance(random_size, (float, int)):1711 self.random_size = max(0.0, float(random_size))1712 else:1713 raise ValueError("random_size should be a float.")1714 self.reshape = bool(reshape)1715 if int(order) not in [0, 1, 2, 3, 4, 5]:1716 raise ValueError('``order`` must be in [0, 5].')1717 self.order = int(order)1718 # Note: No "mirror" in map_coordinates, so removed in this augmentor!1719 if str(mode).lower() in {"reflect", "constant", "nearest", "wrap"}:1720 self.mode = str(mode).lower()1721 else:1722 raise ValueError('``mode`` must be one of "reflect", "constant", '1723 '"nearest", or "wrap".')1724 self.cval = float(cval)1725 self.prefilter = bool(prefilter)1726 if self.data_format == "channels_last":1727 self._axis_offset = 01728 else: # data_format == "channels_first":1729 self._axis_offset = 11730 def __call__(self, inputs):1731 if self.data_format == "channels_last":1732 shape = inputs[..., 0].shape1733 else:1734 shape = inputs[0, ...].shape1735 if self.field is None:1736 field = None1737 else:1738 if self.field[..., 
0].shape != shape:1739 raise RuntimeError("The shape of the provided vector field "1740 "(%s) does not match the shape of the "1741 "provided inputs (%s)."1742 % (str(self.field.shape), str(shape)))1743 if self.field.shape[-1] != len(shape):1744 raise RuntimeError("The dimension of the vector field (%s) "1745 "does not match the dimension of the "1746 "provided inputs (%s)."1747 % (str(self.field.shape[-1]),1748 str(len(shape))))1749 field = self.field1750 if self.random_size > 0.0:1751 random_field = np.random.rand(*shape)1752 max_norm = np.max(np.linalg.norm(random_field, axis=-1))1753 random_field *= (self.random_size / max_norm)1754 if field is None:1755 field = random_field1756 else:1757 field += random_field1758 if field is None:1759 outputs = inputs1760 else:1761 if self.data_format == "channels_last":1762 num_channels = inputs.shape[-1]1763 else: # data_format == "channels_first":1764 num_channels = inputs.shape[0]1765 dims = [np.arange(d) for d in inputs.shape[:-1]]1766 coords = np.stack(np.meshgrid(*dims, indexing="ij"), axis=-1)1767 coords = coords.astype(field.dtype)1768 pad1 = [0.0] * coords.shape[-1]1769 pad2 = [0.0] * coords.shape[-1]1770 for d in range(coords.shape[-1]):1771 coords[..., d] -= field[..., d]1772 if self.reshape:1773 pad1[d] = max(pad1[d], -np.min(coords[..., d]))1774 pad2[d] = max(pad2[d],1775 np.max(coords[..., d]) - coords.shape[d] + 1)1776 if self.reshape:1777 pad = [(int(b + 0.5), int(a + 0.5))1778 for b, a in zip(pad1, pad2)] + [(0, 0)]1779 pad_mode = self.mode1780 if pad_mode == "nearest":1781 pad_mode = "edge" # Note: Different name in np.pad!1782 elif pad_mode == "reflect":1783 pad_mode = "symmetric" # Note: Different name in np.pad!1784 pad_kwargs = {}1785 if pad_mode == "constant":1786 pad_kwargs = dict(constant_values=self.cval)1787 inputs = np.pad(inputs,1788 pad,1789 pad_mode,1790 **pad_kwargs)1791 # TODO: We redo the coordinate computations. 
Better way?1792 dims = [np.arange(d) for d in inputs.shape[:-1]]1793 coords = np.stack(np.meshgrid(*dims, indexing="ij"), axis=-1)1794 coords = coords.astype(field.dtype)1795 field = np.pad(field,1796 pad,1797 "constant",1798 constant_values=0.0)1799 for d in range(coords.shape[-1]):1800 coords[..., d] -= field[..., d]1801 coords = [coords[..., i] for i in range(coords.shape[-1])]1802 outputs = None1803 for c in range(num_channels):1804 if self.data_format == "channels_last":1805 inputs_ = inputs[..., c]1806 else: # data_format == "channels_first":1807 inputs_ = inputs[c, ...]1808 outputs_ = scipy.ndimage.map_coordinates(1809 inputs_,1810 coords,1811 order=self.order,1812 mode=self.mode,1813 cval=self.cval,1814 prefilter=self.prefilter)1815 if self.data_format == "channels_last":1816 if outputs is None:1817 outputs = np.zeros(1818 list(outputs_.shape) + [num_channels])1819 outputs[..., c] = outputs_1820 else: # data_format == "channels_first":1821 if outputs is None:1822 outputs = np.zeros(1823 [num_channels] + list(outputs_.shape))1824 outputs[c, ...] = outputs_1825 return outputs1826class ImageHistogramShift(BaseAugmentation):1827 """Shifts the histogram of the inputs.1828 Parameters1829 ----------1830 shift : float or Callable1831 Either the amount to shift, or a image-wise shift function (a function1832 that takes a whole image as input).1833 min_value : float, optional1834 The minimum possible or allowed value of the image. If None, no lower1835 clipping will be performed. Default is None.1836 max_value : float, optional1837 The maximum possible or allowed value of the image. If None, no upper1838 clipping will be performed. 
Default is None.1839 Examples1840 --------1841 >>> from nethin.augmentation import ImageHistogramShift1842 >>> import numpy as np1843 >>>1844 >>> np.random.seed(42)1845 >>> X = np.random.rand(3, 3)1846 >>> X # doctest: +NORMALIZE_WHITESPACE1847 array([[ 0.37454012, 0.95071431, 0.73199394],1848 [ 0.59865848, 0.15601864, 0.15599452],1849 [ 0.05808361, 0.86617615, 0.60111501]])1850 >>> shift = ImageHistogramShift(1)1851 >>> shift(X) # doctest: +NORMALIZE_WHITESPACE1852 array([[ 1.37454012, 1.95071431, 1.73199394],1853 [ 1.59865848, 1.15601864, 1.15599452],1854 [ 1.05808361, 1.86617615, 1.60111501]])1855 >>> shift = ImageHistogramShift(-0.5, min_value=0.0)1856 >>> shift(X) # doctest: +NORMALIZE_WHITESPACE1857 array([[ 0. , 0.45071431, 0.23199394],1858 [ 0.09865848, 0. , 0. ],1859 [ 0. , 0.36617615, 0.10111501]])1860 >>> shift = ImageHistogramShift(0.5, max_value=1.0)1861 >>> shift(X) # doctest: +NORMALIZE_WHITESPACE1862 array([[ 0.87454012, 1. , 1. ],1863 [ 1. , 0.65601864, 0.65599452],1864 [ 0.55808361, 1. , 1. 
]])1865 """1866 def __init__(self,1867 shift,1868 min_value=None,1869 max_value=None,1870 data_format=None,1871 random_state=None):1872 super(ImageHistogramShift, self).__init__(data_format=data_format,1873 random_state=random_state)1874 if isinstance(shift, (int, float)):1875 self.shift = lambda I: I + float(shift)1876 elif callable(shift):1877 self.shift = shift1878 else:1879 raise ValueError("``shift`` must either be a scalar or a "1880 "callable.")1881 if min_value is None:1882 self.min_value = min_value1883 else:1884 self.min_value = float(min_value)1885 if max_value is None:1886 self.max_value = max_value1887 else:1888 self.max_value = float(max_value)1889 def __call__(self, inputs):1890 outputs = self.shift(inputs)1891 if (self.min_value is not None) or (self.max_value is not None):1892 outputs = np.clip(outputs, self.min_value, self.max_value)1893 return outputs1894class ImageHistogramScale(BaseAugmentation):1895 """Scales the histogram of the inputs.1896 Parameters1897 ----------1898 scale : float or Callable1899 Either the scale factor, or a image-wise scale function (a function1900 that takes a whole image as input).1901 min_value : float, optional1902 The minimum possible or allowed value of the image. If None, no lower1903 clipping will be performed. Default is None.1904 max_value : float, optional1905 The maximum possible or allowed value of the image. If None, no upper1906 clipping will be performed. 
Default is None.1907 Examples1908 --------1909 >>> from nethin.augmentation import ImageHistogramScale1910 >>> import numpy as np1911 >>>1912 >>> np.random.seed(42)1913 >>> X = np.random.rand(3, 3)1914 >>> X # doctest: +NORMALIZE_WHITESPACE1915 array([[ 0.37454012, 0.95071431, 0.73199394],1916 [ 0.59865848, 0.15601864, 0.15599452],1917 [ 0.05808361, 0.86617615, 0.60111501]])1918 >>> scale = ImageHistogramScale(2)1919 >>> scale(X) # doctest: +NORMALIZE_WHITESPACE1920 array([[ 0.74908024, 1.90142861, 1.46398788],1921 [ 1.19731697, 0.31203728, 0.31198904],1922 [ 0.11616722, 1.73235229, 1.20223002]])1923 >>> scale = ImageHistogramScale(20.0, max_value=10.0)1924 >>> scale(X) # doctest: +NORMALIZE_WHITESPACE1925 array([[ 7.49080238, 10. , 10. ],1926 [ 10. , 3.12037281, 3.11989041],1927 [ 1.16167224, 10. , 10. ]])1928 >>> scale = ImageHistogramScale(0.5, max_value=0.25)1929 >>> scale(X) # doctest: +NORMALIZE_WHITESPACE1930 array([[ 0.18727006, 0.25 , 0.25 ],1931 [ 0.25 , 0.07800932, 0.07799726],1932 [ 0.02904181, 0.25 , 0.25 ]])1933 """1934 def __init__(self,1935 scale,1936 min_value=None,1937 max_value=None,1938 data_format=None,1939 random_state=None):1940 super(ImageHistogramScale, self).__init__(data_format=data_format,1941 random_state=random_state)1942 if isinstance(scale, (int, float)):1943 self.scale = lambda I: I * float(scale)1944 elif callable(scale):1945 self.scale = scale1946 else:1947 raise ValueError("``scale`` must either be a scalar or a "1948 "callable.")1949 if min_value is None:1950 self.min_value = min_value1951 else:1952 self.min_value = float(min_value)1953 if max_value is None:1954 self.max_value = max_value1955 else:1956 self.max_value = float(max_value)1957 def __call__(self, inputs):1958 outputs = self.scale(inputs)1959 if (self.min_value is not None) or (self.max_value is not None):1960 outputs = np.clip(outputs, self.min_value, self.max_value)1961 return outputs1962class ImageHistogramAffineTransform(BaseAugmentation):1963 """Performs an 
affine transformation of the histogram of the inputs.1964 This means that the following affine transformation is applied:1965 I' = scale * I + shift.1966 Parameters1967 ----------1968 scale : float or Callable, optional1969 Either the scale factor, or a image-wise scale function (a function1970 that takes a whole image as input).1971 shift : float or Callable, optional1972 Either the amount to shift, or a image-wise shift function (a function1973 that takes a whole image as input).1974 min_value : float, optional1975 The minimum possible or allowed value of the image. If None, no lower1976 clipping will be performed. Default is None.1977 max_value : float, optional1978 The maximum possible or allowed value of the image. If None, no upper1979 clipping will be performed. Default is None.1980 Examples1981 --------1982 >>> from nethin.augmentation import ImageHistogramAffineTransform1983 >>> import numpy as np1984 >>>1985 >>> np.random.seed(42)1986 >>> X = np.random.rand(3, 3)1987 >>> X # doctest: +NORMALIZE_WHITESPACE1988 array([[ 0.37454012, 0.95071431, 0.73199394],1989 [ 0.59865848, 0.15601864, 0.15599452],1990 [ 0.05808361, 0.86617615, 0.60111501]])1991 >>> affine_transform = ImageHistogramAffineTransform(shift=2, scale=3)1992 >>> affine_transform(X) # doctest: +NORMALIZE_WHITESPACE1993 array([[ 3.12362036, 4.85214292, 4.19598183],1994 [ 3.79597545, 2.46805592, 2.46798356],1995 [ 2.17425084, 4.59852844, 3.80334504]])1996 >>> np.linalg.norm((3 * X + 2) - affine_transform(X))1997 0.01998 >>> affine_transform = ImageHistogramAffineTransform(shift=5.0,1999 ... scale=10.0,2000 ... max_value=10.0)2001 >>> affine_transform(X) # doctest: +NORMALIZE_WHITESPACE2002 array([[ 8.74540119, 10. , 10. ],2003 [ 10. , 6.5601864 , 6.5599452 ],2004 [ 5.58083612, 10. , 10. ]])2005 >>> affine_transform = ImageHistogramAffineTransform(shift=-0.5, scale=1.0,2006 ... min_value=-0.25,2007 ... 
max_value=0.25)2008 >>> affine_transform(X) # doctest: +NORMALIZE_WHITESPACE2009 array([[-0.12545988, 0.25 , 0.23199394],2010 [ 0.09865848, -0.25 , -0.25 ],2011 [-0.25 , 0.25 , 0.10111501]])2012 """2013 def __init__(self,2014 scale=1.0,2015 shift=0.0,2016 min_value=None,2017 max_value=None,2018 data_format=None,2019 random_state=None):2020 super(ImageHistogramAffineTransform,2021 self).__init__(data_format=data_format,2022 random_state=random_state)2023 if isinstance(shift, (int, float)):2024 shift = float(shift)2025 if shift != 0.0:2026 self.shift = lambda I: I + shift2027 else:2028 self.shift = lambda I: I2029 elif callable(shift):2030 self.shift = shift2031 else:2032 raise ValueError("``shift`` must either be a scalar or a "2033 "callable.")2034 if isinstance(scale, (int, float)):2035 scale = float(scale)2036 if scale != 1.0:2037 self.scale = lambda I: scale * I2038 else:2039 self.scale = lambda I: I2040 elif callable(scale):2041 self.scale = scale2042 else:2043 raise ValueError("``scale`` must either be a scalar or a "2044 "callable.")2045 if min_value is None:2046 self.min_value = min_value2047 else:2048 self.min_value = float(min_value)2049 if max_value is None:2050 self.max_value = max_value2051 else:2052 self.max_value = float(max_value)2053 def __call__(self, inputs):2054 outputs = self.shift(self.scale(inputs))2055 if (self.min_value is not None) or (self.max_value is not None):2056 outputs = np.clip(outputs, self.min_value, self.max_value)2057 return outputs2058class ImageHistogramTransform(BaseAugmentation):2059 """Transforms the histogram of the input image (i.e., intensity transform).2060 Parameters2061 ----------2062 transform : Transform or Callable2063 Any scalar function defined on the domain of the input image. Ideally,2064 this function should be monotonically increasing, but this is not a2065 formal requirement by this class.2066 min_value : float, optional2067 The minimum possible or allowed value of the image. 
If None, no lower2068 clipping will be performed. Default is None.2069 max_value : float, optional2070 The maximum possible or allowed value of the image. If None, no upper2071 clipping will be performed. Default is None.2072 Examples2073 --------2074 >>> from nethin.augmentation import ImageHistogramTransform2075 >>> from nethin.utils import simple_bezier2076 >>> import numpy as np2077 >>>2078 >>> np.random.seed(42)2079 >>> x = 0.125 * np.random.randn(10000, 1) + 0.52080 >>> float("%.12f" % np.mean(x))2081 0.4997330020792082 >>> hist, _ = np.histogram(x, bins=15)2083 >>> hist.tolist() # doctest: +NORMALIZE_WHITESPACE2084 [4, 19, 75, 252, 630, 1208, 1787, 2071, 1817, 1176, 626, 239, 73, 20, 3]2085 >>> transf = simple_bezier([-0.45, 0.0, 0.45], controls=[0.15, 0.5, 0.85],2086 ... steps=100)2087 >>> transform = ImageHistogramTransform(transf,2088 ... min_value=0.0, max_value=1.0)2089 >>> x_trans = transform(x)2090 >>> float("%.12f" % np.mean(x_trans))2091 0.499327469582092 >>> hist, _ = np.histogram(x_trans, bins=15)2093 >>> hist.tolist() # doctest: +NORMALIZE_WHITESPACE2094 [656, 621, 699, 690, 696, 649, 652, 674, 657, 678, 676, 684, 666, 644, 658]2095 >>> transf = simple_bezier([-0.45, 0.2, 0.4], controls=[0.15, 0.6, 0.85],2096 ... steps=100)2097 >>> transform = ImageHistogramTransform(transf,2098 ... 
    ...                                     min_value=0.0, max_value=1.0)
    >>> x_trans = transform(x)
    >>> float("%.12f" % np.mean(x_trans))
    0.572189192312
    >>> hist, _ = np.histogram(x_trans, bins=15)
    >>> hist.tolist()  # doctest: +NORMALIZE_WHITESPACE
    [345, 398, 489, 566, 587, 607, 637, 678, 688, 709, 807, 843, 850, 856, 940]
    >>>
    >>> np.random.seed(42)
    >>> x = 125.0 * np.random.randn(10000, 1) + 1500
    >>> float("%.12f" % np.mean(x))
    1499.733002078947
    >>> hist, _ = np.histogram(x, bins=15)
    >>> hist.tolist()  # doctest: +NORMALIZE_WHITESPACE
    [4, 19, 75, 252, 630, 1208, 1787, 2071, 1817, 1176, 626, 239, 73, 20, 3]
    >>> transf_ = simple_bezier([-0.45, 0.0, 0.45], controls=[0.15, 0.5, 0.85],
    ...                         steps=100)
    >>> transf = lambda x: (transf_((x - 1000.0) / 1000.0) * 1000.0 + 1000.0)
    >>> transform = ImageHistogramTransform(transf,
    ...                                     min_value=1000.0, max_value=2000.0)
    >>> x_trans = transform(x)
    >>> float("%.12f" % np.mean(x_trans))
    1499.327469579897
    >>> hist, _ = np.histogram(x_trans, bins=15)
    >>> hist.tolist()  # doctest: +NORMALIZE_WHITESPACE
    [656, 621, 699, 690, 696, 649, 652, 674, 657, 678, 676, 684, 666, 644, 658]
    """
    def __init__(self,
                 transform,
                 min_value=None,
                 max_value=None,
                 vectorize=True,
                 data_format=None,
                 random_state=None):
        super(ImageHistogramTransform, self).__init__(
            data_format=data_format,
            random_state=random_state)

        if not callable(transform):
            raise ValueError('``transform`` must be callable.')
        self.transform = transform

        if min_value is None:
            self.min_value = min_value
        else:
            self.min_value = float(min_value)

        if max_value is None:
            self.max_value = max_value
        else:
            self.max_value = float(max_value)

        self.vectorize = bool(vectorize)
        if self.vectorize:
            self._vec_trans = np.vectorize(self.transform)

    def __call__(self, inputs):
        if isinstance(self.transform, Transform):
            self.transform.prepare()
        if self.vectorize:
            outputs = self._vec_trans(inputs)
        else:
            outputs = self.transform(inputs)
        if (self.min_value is not None) or (self.max_value is not None):
            outputs = np.clip(outputs, self.min_value, self.max_value)
        return outputs


class ImageTransform(BaseAugmentation):
    """Transforms an entire input image.

    Parameters
    ----------
    transform : Transform or Callable
        Any function defined on the domain of the input image mapping to
        another image.

    min_value : float, optional
        The minimum possible or allowed value of the image. If None, no lower
        clipping will be performed. Default is None.

    max_value : float, optional
        The maximum possible or allowed value of the image. If None, no upper
        clipping will be performed. Default is None.

    Examples
    --------
    >>> from nethin.augmentation import ImageTransform
    >>> from nethin.utils import simple_bezier
    >>> import numpy as np
    >>>
    >>> np.random.seed(42)
    """
    def __init__(self,
                 transform,
                 min_value=None,
                 max_value=None,
                 data_format=None,
                 random_state=None):
        super(ImageTransform, self).__init__(data_format=data_format,
                                             random_state=random_state)

        if not callable(transform):
            raise ValueError('``transform`` must be callable.')
        self.transform = transform

        if min_value is None:
            self.min_value = min_value
        else:
            self.min_value = float(min_value)

        if max_value is None:
            self.max_value = max_value
        else:
            self.max_value = float(max_value)

        if (self.min_value is not None) and (self.max_value is not None):
            assert(self.min_value < self.max_value)

    def __call__(self, inputs):
        if isinstance(self.transform, Transform):
            self.transform.prepare()
        outputs = self.transform(inputs)
        if (self.min_value is not None) or (self.max_value is not None):
            outputs = np.clip(outputs, self.min_value, self.max_value)
        return outputs


# Deprecated names!
ImageFlip = Flip
ImageResize = Resize
ImageCrop = Crop


class Pipeline(object):
    """Applies a data augmentation/preprocessing pipeline.

    Parameters
    ----------
    pipeline : list of BaseAugmentation or Callable, optional
        The pipeline. A list of data augmentation functions, that will be
        applied (chained) one at a time starting with the first element of
        ``pipeline`` and ending with the last element of ``pipeline``. Default
        is an empty list.

    Returns
    -------
    outputs : object
        The augmented data. Often of the same type as the input.
    """
    def __init__(self, pipeline=[]):
        try:
            _pipeline = []
            for p in pipeline:
                if isinstance(p, BaseAugmentation):
                    _pipeline.append(p)
                elif callable(p):
                    _pipeline.append(p)
                else:
                    raise RuntimeError()
            self.pipeline = _pipeline
        except (TypeError, RuntimeError):
            raise ValueError('"pipeline" must be a list of '
                             '"BaseAugmentation" or "Callable".')

    def __call__(self, inputs):
        outputs = inputs
        for p in self.pipeline:
            outputs = p(outputs)
        return outputs

    def add(self, p):
        """Add a data augmentation/preprocessing step to the pipeline.
        """
        self.pipeline.append(p)


class Transform(object):
    def __init__(self):
        pass

    def prepare(self):
        pass

    def __call__(self, x):
        return x


if __name__ == "__main__":
    import doctest
    ...
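The doctests in the snippet above rely on doctest's `+NORMALIZE_WHITESPACE` option flag, which makes any run of whitespace in the expected and actual output compare equal, so wide columns and wrapped lines do not break a test. A minimal, self-contained sketch of the flag (independent of nethin; the `columns` function here is purely illustrative):

```python
import doctest


def columns():
    """Print values whose column alignment should not matter to the test.

    The expected output below uses single spaces, while the actual output
    uses wide columns; +NORMALIZE_WHITESPACE makes them compare equal.

    >>> columns()  # doctest: +NORMALIZE_WHITESPACE
    1 2 3
    4 5 6
    """
    print("1   2   3")
    print("4   5   6")


if __name__ == "__main__":
    # Run the docstring example; it passes only because of the flag.
    doctest.testmod()
```

Without the directive the same example fails, because doctest compares the output strings character by character.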

test_cases.py

Source:test_cases.py Github


# vim:fileencoding=utf-8:sw=4:et -*- coding: utf-8 -*-

def dummy():
    u'''
    >>> import langtable
    >>> from langtable import list_locales
    >>> from langtable import list_scripts
    >>> from langtable import list_keyboards
    >>> from langtable import list_inputmethods
    >>> from langtable import list_consolefonts
    >>> from langtable import _test_language_territory
    >>> from langtable import language_name
    >>> from langtable import territory_name
    >>> from langtable import _test_cldr_locale_pattern
    >>> from langtable import supports_ascii
    >>> from langtable import languageId

    ######################################################################
    # Start of tests to reproduce the results from mangleLocale(inLocale)
    # in anaconda, see:
    # https://git.fedorahosted.org/cgit/anaconda.git/tree/pyanaconda/localization.py#n121

    >>> list_locales(show_weights=False, languageId="af") # doctest: +NORMALIZE_WHITESPACE
    ['af_ZA.UTF-8']
    >>> list_locales(show_weights=False, languageId="am") # doctest: +NORMALIZE_WHITESPACE
    ['am_ET.UTF-8']

    # this puts ar_EG first instead of ar_SA from mangleLocale
    # (because EG is the Arabic country with the most inhabitants).
    # But this should not matter, all our Arabic translations
    # are in /usr/share/locale/ar/LC_MESSAGES/ at the moment, i.e. we do
    # not have different Arabic translations for different territories anyway,
    # so it does not matter that much which Arabic locale is chosen.
    # So I do not need to tweak the weights here, I think.
    >>> list_locales(show_weights=False, languageId="ar") # doctest: +NORMALIZE_WHITESPACE
    ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="as") # doctest: +NORMALIZE_WHITESPACE
    ['as_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="ast") # doctest: +NORMALIZE_WHITESPACE
    ['ast_ES.UTF-8']
    >>> list_locales(show_weights=False, languageId="be") # doctest: +NORMALIZE_WHITESPACE
    ['be_BY.UTF-8', 'be_BY.UTF-8@latin']
    >>> list_locales(show_weights=False, languageId="bg") # doctest: +NORMALIZE_WHITESPACE
    ['bg_BG.UTF-8']
    >>> list_locales(show_weights=False, languageId="bn") # doctest: +NORMALIZE_WHITESPACE
    ['bn_BD.UTF-8', 'bn_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="bs") # doctest: +NORMALIZE_WHITESPACE
    ['bs_BA.UTF-8']
    >>> list_locales(show_weights=False, languageId="ca") # doctest: +NORMALIZE_WHITESPACE
    ['ca_ES.UTF-8', 'ca_FR.UTF-8', 'ca_AD.UTF-8', 'ca_IT.UTF-8']
    >>> list_locales(show_weights=False, languageId="cs") # doctest: +NORMALIZE_WHITESPACE
    ['cs_CZ.UTF-8']
    >>> list_locales(show_weights=False, languageId="cy") # doctest: +NORMALIZE_WHITESPACE
    ['cy_GB.UTF-8']
    >>> list_locales(show_weights=False, languageId="da") # doctest: +NORMALIZE_WHITESPACE
    ['da_DK.UTF-8']
    >>> list_locales(show_weights=False, languageId="de") # doctest: +NORMALIZE_WHITESPACE
    ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    >>> list_locales(show_weights=False, languageId="el") # doctest: +NORMALIZE_WHITESPACE
    ['el_GR.UTF-8', 'el_CY.UTF-8']
    >>> list_locales(show_weights=False, languageId="en") # doctest: +NORMALIZE_WHITESPACE
    ['en_US.UTF-8', 'en_GB.UTF-8', 'en_IN.UTF-8', 'en_AU.UTF-8', 'en_CA.UTF-8', 'en_DK.UTF-8', 'en_IE.UTF-8', 'en_NZ.UTF-8', 'en_NG.UTF-8', 'en_HK.UTF-8', 'en_PH.UTF-8', 'en_SG.UTF-8', 'en_ZA.UTF-8', 'en_ZM.UTF-8', 'en_ZW.UTF-8', 'en_BW.UTF-8', 'en_AG.UTF-8']

    # I put es_ES first here which is kind of arbitrary, Spain isn’t the
    # country with the biggest number of Spanish speaking people, but that
    # is what Anaconda’s mangleMap did so far and it is not clear which
    # country to put first in that list anyway.
    >>> list_locales(show_weights=False, languageId="es") # doctest: +NORMALIZE_WHITESPACE
    ['es_ES.UTF-8', 'es_VE.UTF-8', 'es_UY.UTF-8', 'es_US.UTF-8', 'es_SV.UTF-8', 'es_PY.UTF-8', 'es_PR.UTF-8', 'es_PE.UTF-8', 'es_PA.UTF-8', 'es_NI.UTF-8', 'es_MX.UTF-8', 'es_HN.UTF-8', 'es_GT.UTF-8', 'es_EC.UTF-8', 'es_DO.UTF-8', 'es_CU.UTF-8', 'es_CR.UTF-8', 'es_CO.UTF-8', 'es_CL.UTF-8', 'es_BO.UTF-8', 'es_AR.UTF-8']
    >>> list_locales(show_weights=False, languageId="et") # doctest: +NORMALIZE_WHITESPACE
    ['et_EE.UTF-8']
    >>> list_locales(show_weights=False, languageId="eu") # doctest: +NORMALIZE_WHITESPACE
    ['eu_ES.UTF-8']
    >>> list_locales(show_weights=False, languageId="fa") # doctest: +NORMALIZE_WHITESPACE
    ['fa_IR.UTF-8']
    >>> list_locales(show_weights=False, languageId="fi") # doctest: +NORMALIZE_WHITESPACE
    ['fi_FI.UTF-8']
    >>> list_locales(show_weights=False, languageId="fr") # doctest: +NORMALIZE_WHITESPACE
    ['fr_FR.UTF-8', 'fr_CA.UTF-8', 'fr_BE.UTF-8', 'fr_CH.UTF-8', 'fr_LU.UTF-8']
    >>> list_locales(show_weights=False, languageId="gl") # doctest: +NORMALIZE_WHITESPACE
    ['gl_ES.UTF-8']
    >>> list_locales(show_weights=False, languageId="gu") # doctest: +NORMALIZE_WHITESPACE
    ['gu_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="he") # doctest: +NORMALIZE_WHITESPACE
    ['he_IL.UTF-8']
    >>> list_locales(show_weights=False, languageId="hi") # doctest: +NORMALIZE_WHITESPACE
    ['hi_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="hr") # doctest: +NORMALIZE_WHITESPACE
    ['hr_HR.UTF-8']
    >>> list_locales(show_weights=False, languageId="hu") # doctest: +NORMALIZE_WHITESPACE
    ['hu_HU.UTF-8']
    >>> list_locales(show_weights=False, languageId="hy") # doctest: +NORMALIZE_WHITESPACE
    ['hy_AM.UTF-8']
    >>> list_locales(show_weights=False, languageId="id") # doctest: +NORMALIZE_WHITESPACE
    ['id_ID.UTF-8']

    # we have no ilo_PH.UTF-8 locale in glibc!
    >>> list_locales(show_weights=False, languageId="ilo") # doctest: +NORMALIZE_WHITESPACE
    []
    >>> list_locales(show_weights=False, languageId="is") # doctest: +NORMALIZE_WHITESPACE
    ['is_IS.UTF-8']
    >>> list_locales(show_weights=False, languageId="it") # doctest: +NORMALIZE_WHITESPACE
    ['it_IT.UTF-8', 'it_CH.UTF-8']
    >>> list_locales(show_weights=False, languageId="ja") # doctest: +NORMALIZE_WHITESPACE
    ['ja_JP.UTF-8']
    >>> list_locales(show_weights=False, languageId="ka") # doctest: +NORMALIZE_WHITESPACE
    ['ka_GE.UTF-8']
    >>> list_locales(show_weights=False, languageId="kk") # doctest: +NORMALIZE_WHITESPACE
    ['kk_KZ.UTF-8']
    >>> list_locales(show_weights=False, languageId="kn") # doctest: +NORMALIZE_WHITESPACE
    ['kn_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="ko") # doctest: +NORMALIZE_WHITESPACE
    ['ko_KR.UTF-8']
    >>> list_locales(show_weights=False, languageId="lt") # doctest: +NORMALIZE_WHITESPACE
    ['lt_LT.UTF-8']
    >>> list_locales(show_weights=False, languageId="lv") # doctest: +NORMALIZE_WHITESPACE
    ['lv_LV.UTF-8']
    >>> list_locales(show_weights=False, languageId="mai") # doctest: +NORMALIZE_WHITESPACE
    ['mai_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="mk") # doctest: +NORMALIZE_WHITESPACE
    ['mk_MK.UTF-8']
    >>> list_locales(show_weights=False, languageId="ml") # doctest: +NORMALIZE_WHITESPACE
    ['ml_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="mr") # doctest: +NORMALIZE_WHITESPACE
    ['mr_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="ms") # doctest: +NORMALIZE_WHITESPACE
    ['ms_MY.UTF-8']
    >>> list_locales(show_weights=False, languageId="nb") # doctest: +NORMALIZE_WHITESPACE
    ['nb_NO.UTF-8']

    # this puts nds_NL first instead of nds_DE from mangleLocale
    # (because there seem to be more speakers of nds in NL than in DE).
    # It should not matter though at the moment, all our nds translations
    # are in /usr/share/locale/nds/LC_MESSAGES/ at the moment,
    # the right translations will be chosen no matter whether nds_DE.UTF-8
    # or nds_NL.UTF-8 is set as the locale.
    >>> list_locales(show_weights=False, languageId="nds") # doctest: +NORMALIZE_WHITESPACE
    ['nds_NL.UTF-8', 'nds_DE.UTF-8']
    >>> list_locales(show_weights=False, languageId="ne") # doctest: +NORMALIZE_WHITESPACE
    ['ne_NP.UTF-8']
    >>> list_locales(show_weights=False, languageId="nl") # doctest: +NORMALIZE_WHITESPACE
    ['nl_NL.UTF-8', 'nl_BE.UTF-8', 'nl_AW.UTF-8']
    >>> list_locales(show_weights=False, languageId="nn") # doctest: +NORMALIZE_WHITESPACE
    ['nn_NO.UTF-8']
    >>> list_locales(show_weights=False, languageId="nso") # doctest: +NORMALIZE_WHITESPACE
    ['nso_ZA.UTF-8']
    >>> list_locales(show_weights=False, languageId="or") # doctest: +NORMALIZE_WHITESPACE
    ['or_IN.UTF-8']

    # This puts pa_IN first instead of pa_PK to make it do the
    # same as mangleLocale did. There seem to be more speakers of pa in PK
    # than in IN, nevertheless pa_IN is more important for us because
    # we have *only* Punjabi translations for India (all our Punjabi
    # translations use the Gurmukhi script used by the pa_IN.UTF-8 glibc locale).
    # None of our translations use the Perso-Arabic Shahmukhī alphabet
    # used by the pa_PK.UTF-8 glibc locale.
    # All of our Punjabi translations are currently in /usr/share/locale/pa,
    # as they use the Gurmukhi script and seem to be specific to India,
    # they should probably move to /usr/share/locale/pa_IN in future.
    #
    # Giving pa_IN.UTF-8 higher weight should fix
    # https://bugzilla.redhat.com/show_bug.cgi?id=986155
    # Bug 986155 - Punjabi (India) missing in language installation list
    >>> list_locales(show_weights=False, languageId="pa") # doctest: +NORMALIZE_WHITESPACE
    ['pa_IN.UTF-8', 'pa_PK.UTF-8']
    >>> list_locales(show_weights=False, languageId="pl") # doctest: +NORMALIZE_WHITESPACE
    ['pl_PL.UTF-8']

    # different from mangleLocale which gives pt_PT
    # (because Brazil is much bigger than Portugal).
    # Anaconda has translations for both Brazilian and Portuguese Portuguese:
    # $ ls /usr/share/locale/pt*/LC_MESSAGES/*anaco*
    # /usr/share/locale/pt/LC_MESSAGES/anaconda.mo
    # /usr/share/locale/pt_BR/LC_MESSAGES/anaconda.mo
    # So Anaconda needs to be specific here, just selecting languageId="pt"
    # cannot be enough.
    >>> list_locales(show_weights=False, languageId="pt") # doctest: +NORMALIZE_WHITESPACE
    ['pt_BR.UTF-8', 'pt_PT.UTF-8']
    >>> list_locales(show_weights=False, languageId="ro") # doctest: +NORMALIZE_WHITESPACE
    ['ro_RO.UTF-8']
    >>> list_locales(show_weights=False, languageId="ru") # doctest: +NORMALIZE_WHITESPACE
    ['ru_RU.UTF-8', 'ru_UA.UTF-8']
    >>> list_locales(show_weights=False, languageId="si") # doctest: +NORMALIZE_WHITESPACE
    ['si_LK.UTF-8']
    >>> list_locales(show_weights=False, languageId="sk") # doctest: +NORMALIZE_WHITESPACE
    ['sk_SK.UTF-8']
    >>> list_locales(show_weights=False, languageId="sl") # doctest: +NORMALIZE_WHITESPACE
    ['sl_SI.UTF-8']
    >>> list_locales(show_weights=False, languageId="sq") # doctest: +NORMALIZE_WHITESPACE
    ['sq_AL.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_RS.UTF-8@latin', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr", scriptId="Cyrl") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr", scriptId="cyrillic") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr", scriptId="Latn") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']
    >>> list_locales(show_weights=False, languageId="sr", scriptId="latin") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']

    # the script can also be specified in the languageId.
    # If the script is specified in the languageId already, it takes
    # precedence over a script specified in scriptId:
    >>> list_locales(show_weights=False, languageId="sr_Latn") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']
    >>> list_locales(show_weights=False, languageId="sr_Latn", scriptId="Latn") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']
    >>> list_locales(show_weights=False, languageId="sr_Latn", scriptId="Cyrl") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']
    >>> list_locales(show_weights=False, languageId="sr_Cyrl") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr_cyrillic") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr_Cyrl", scriptId="Latn") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr_cyrillic", scriptId="latin") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sr_latin", scriptId="cyrillic") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']
    >>> list_locales(show_weights=False, languageId="sr_Cyrl", scriptId="Cyrl") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_ME.UTF-8']
    >>> list_locales(show_weights=False, languageId="sv") # doctest: +NORMALIZE_WHITESPACE
    ['sv_SE.UTF-8', 'sv_FI.UTF-8']
    >>> list_locales(show_weights=False, languageId="ta") # doctest: +NORMALIZE_WHITESPACE
    ['ta_IN.UTF-8', 'ta_LK.UTF-8']
    >>> list_locales(show_weights=False, languageId="te") # doctest: +NORMALIZE_WHITESPACE
    ['te_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="tg") # doctest: +NORMALIZE_WHITESPACE
    ['tg_TJ.UTF-8']
    >>> list_locales(show_weights=False, languageId="th") # doctest: +NORMALIZE_WHITESPACE
    ['th_TH.UTF-8']
    >>> list_locales(show_weights=False, languageId="tr") # doctest: +NORMALIZE_WHITESPACE
    ['tr_TR.UTF-8', 'tr_CY.UTF-8']
    >>> list_locales(show_weights=False, languageId="uk") # doctest: +NORMALIZE_WHITESPACE
    ['uk_UA.UTF-8']
    >>> list_locales(show_weights=False, languageId="ur") # doctest: +NORMALIZE_WHITESPACE
    ['ur_PK.UTF-8', 'ur_IN.UTF-8']
    >>> list_locales(show_weights=False, languageId="vi") # doctest: +NORMALIZE_WHITESPACE
    ['vi_VN.UTF-8']
    >>> list_locales(show_weights=False, languageId="zu") # doctest: +NORMALIZE_WHITESPACE
    ['zu_ZA.UTF-8']

    # End of tests to reproduce the results from mangleLocale(inLocale) in anaconda
    ######################################################################

    >>> list_locales(languageId="de", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['de_BE.UTF-8']

    # territory given in languageId overrides territory given in territoryId:
    >>> list_locales(languageId="sr_RS", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8', 'sr_RS.UTF-8@latin']

    # script given in languageId overrides script given in scriptId:
    >>> list_locales(languageId="sr_Cyrl_RS", scriptId="Latn") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8']

    # script given in languageId overrides script given in scriptId:
    >>> list_locales(languageId="sr_Latn_RS", scriptId="Cyrl") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8@latin']

    # script and territory given in languageId override script and territory in extra parameters:
    >>> list_locales(languageId="sr_Cyrl_RS", scriptId="Latn", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ['sr_RS.UTF-8']

    # if languageId contains an invalid locale id, it is completely ignored:
    >>> list_locales(languageId="sr_CYrl_RS", scriptId="Latn", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ['de_DE.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']

    # Japanese uses a mixture of hiragana, katakana, and kanji:
    >>> list_scripts(languageId='ja') # doctest: +NORMALIZE_WHITESPACE
    ['Hani', 'Hira', 'Kana']
    >>> list_scripts(languageId='ko') # doctest: +NORMALIZE_WHITESPACE
    ['Hang', 'Hani']
    >>> list_scripts(languageId='vi') # doctest: +NORMALIZE_WHITESPACE
    ['Latn', 'Hani']
    >>> list_scripts(languageId='sr') # doctest: +NORMALIZE_WHITESPACE
    ['Cyrl', 'Latn']
    >>> list_scripts(languageId='ks') # doctest: +NORMALIZE_WHITESPACE
    ['Arab', 'Deva']
    >>> list_scripts(languageId='ks', territoryId='IN') # doctest: +NORMALIZE_WHITESPACE
    ['Deva', 'Arab']
    >>> list_scripts(languageId='ks', territoryId='PK') # doctest: +NORMALIZE_WHITESPACE
    ['Arab']
    >>> list_scripts(languageId='ks_PK') # doctest: +NORMALIZE_WHITESPACE
    ['Arab']
    >>> list_scripts(languageId='ks_IN') # doctest: +NORMALIZE_WHITESPACE
    ['Deva', 'Arab']
    >>> list_scripts(languageId='ks_Deva_IN') # doctest: +NORMALIZE_WHITESPACE
    ['Deva']
    >>> list_scripts(languageId='ks_devanagari_IN') # doctest: +NORMALIZE_WHITESPACE
    ['Deva']
    >>> list_scripts(languageId='ks_IN@devanagari') # doctest: +NORMALIZE_WHITESPACE
    ['Deva']
    >>> list_scripts(languageId='ks_Arab_IN@devanagari') # doctest: +NORMALIZE_WHITESPACE
    ['Arab']
    >>> list_scripts(languageId='ks_IN.UTF-8') # doctest: +NORMALIZE_WHITESPACE
    ['Deva', 'Arab']
    >>> list_scripts(languageId='ks_IN.UTF-8@devanagari') # doctest: +NORMALIZE_WHITESPACE
    ['Deva']
    >>> list_scripts(languageId='ks_Arab_IN.UTF-8@devanagari') # doctest: +NORMALIZE_WHITESPACE
    ['Arab']
    >>> list_scripts(languageId='ks_Arab_IN.UTF-8@devanagari', scriptId='Latn') # doctest: +NORMALIZE_WHITESPACE
    ['Arab']
    >>> list_scripts(languageId='de') # doctest: +NORMALIZE_WHITESPACE
    ['Latn']
    >>> list_scripts(languageId='de', scriptId='Cyrl') # doctest: +NORMALIZE_WHITESPACE
    ['Cyrl']
    >>> list_scripts(languageId='de_Cyrl', scriptId='Latn') # doctest: +NORMALIZE_WHITESPACE
    ['Cyrl']
    >>> list_scripts(scriptId='Zzzz') # doctest: +NORMALIZE_WHITESPACE
    ['Zzzz']
    >>> list_keyboards(languageId="de", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['be(oss)']

    # script and territory given in languageId override script and territory in extra parameters:
    >>> list_keyboards(languageId="sr_Latn", scriptId="Cyrl", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['rs(latin)', 'be(oss)']

    # script and territory given in languageId override script and territory in extra parameters:
    >>> list_keyboards(languageId="sr_Latn_RS", scriptId="Cyrl", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['rs(latin)']

    # script and territory given in languageId override script and territory in extra parameters:
    >>> list_keyboards(languageId="sr_Cyrl", scriptId="Latn", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['rs', 'be(oss)']

    # script and territory given in languageId override script and territory in extra parameters:
    >>> list_keyboards(languageId="sr_Cyrl_RS", scriptId="Latn", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    ['rs']
    >>> list_inputmethods(languageId="ja") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/kkc', 'ibus/anthy']
    >>> list_inputmethods(languageId="ja", territoryId="JP") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/kkc', 'ibus/anthy']
    >>> list_inputmethods(languageId="ja", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/kkc', 'ibus/anthy']
    >>> list_inputmethods(languageId="de", territoryId="JP") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/kkc', 'ibus/anthy']
    >>> list_inputmethods(languageId="ko") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/hangul']
    >>> list_inputmethods(languageId="zh") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/libpinyin', 'ibus/chewing', 'ibus/cangjie']
    >>> list_inputmethods(languageId="zh", territoryId="CN") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/libpinyin']
    >>> list_inputmethods(languageId="zh_CN") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/libpinyin']
    >>> list_inputmethods(languageId="zh", territoryId="HK") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/cangjie']
    >>> list_inputmethods(languageId="zh", territoryId="MO") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/cangjie']
    >>> list_inputmethods(languageId="zh", territoryId="TW") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/chewing']
    >>> list_inputmethods(languageId="zh", territoryId="SG") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/libpinyin']
    >>> list_inputmethods(languageId="as", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:as:phonetic']
    >>> list_inputmethods(languageId="as", territoryId="BD") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:as:phonetic']
    >>> list_inputmethods(languageId="bn") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:bn:inscript']
    >>> list_inputmethods(languageId="gu") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:gu:inscript']
    >>> list_inputmethods(languageId="hi") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:hi:inscript']
    >>> list_inputmethods(languageId="kn") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:kn:kgp']
    >>> list_inputmethods(languageId="mai") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:mai:inscript']
    >>> list_inputmethods(languageId="ml") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:ml:inscript']
    >>> list_inputmethods(languageId="mr") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:mr:inscript']
    >>> list_inputmethods(languageId="or") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:or:inscript']
    >>> list_inputmethods(languageId="pa") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:pa:inscript']
    >>> list_inputmethods(languageId="ta") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:ta:tamil99']
    >>> list_inputmethods(languageId="te") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:te:inscript']
    >>> list_inputmethods(languageId="ur") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:ur:phonetic']
    >>> list_inputmethods(languageId="sd") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:sd:inscript']
    >>> list_inputmethods(languageId="sd", scriptId="Deva") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:sd:inscript']
    >>> list_inputmethods(languageId="sd", scriptId="Arab") # doctest: +NORMALIZE_WHITESPACE
    []
    >>> list_inputmethods(languageId="sd", scriptId="Deva", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:sd:inscript']
    >>> list_inputmethods(languageId="sd", scriptId="Arab", territoryId="PK") # doctest: +NORMALIZE_WHITESPACE
    []
    >>> list_inputmethods(languageId="sd", scriptId="Deva", territoryId="PK") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:sd:inscript']
    >>> list_inputmethods(languageId="sd", scriptId="Arab", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    []
    >>> list_inputmethods(languageId="sd", territoryId="PK") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:sd:inscript']
    >>> list_inputmethods(languageId="sd", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    ['ibus/m17n:sd:inscript']
    >>> list_consolefonts(languageId="de", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ['latarcyrheb-sun16']
    >>> list_consolefonts(languageId="el") # doctest: +NORMALIZE_WHITESPACE
    ['iso07u-16', 'LatGrkCyr-8x16']
    >>> list_consolefonts(territoryId="GR") # doctest: +NORMALIZE_WHITESPACE
    ['iso07u-16', 'LatGrkCyr-8x16']
    >>> list_consolefonts(languageId="el", territoryId="GR") # doctest: +NORMALIZE_WHITESPACE
    ['iso07u-16']
    >>> list_consolefonts(languageId="el", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ['iso07u-16', 'LatGrkCyr-8x16', 'latarcyrheb-sun16']

    # script and territory given in languageId override script and territory in extra parameters:
    >>> list_consolefonts(languageId="el_GR", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ['iso07u-16']
    >>> list_consolefonts(languageId="de", territoryId="GR") # doctest: +NORMALIZE_WHITESPACE
    ['latarcyrheb-sun16', 'iso07u-16', 'LatGrkCyr-8x16']
    >>> _test_language_territory(show_weights=False, languageId=None, territoryId=None) # doctest: +NORMALIZE_WHITESPACE
    None: []
    None: []
    +: []
    None: []
    None: []
    +: []
    >>> _test_language_territory(show_weights=False, languageId="be", territoryId="BY") # doctest: +NORMALIZE_WHITESPACE
    be: ['be_BY.UTF-8', 'be_BY.UTF-8@latin']
    BY: ['be_BY.UTF-8', 'be_BY.UTF-8@latin']
    +: ['be_BY.UTF-8']
    be: ['by']
    BY: ['by']
    +: ['by']
    >>> _test_language_territory(show_weights=False, languageId="de", territoryId="CH") # doctest: +NORMALIZE_WHITESPACE
    de: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    CH: ['de_CH.UTF-8', 'fr_CH.UTF-8', 'it_CH.UTF-8', 'wae_CH.UTF-8']
    +: ['de_CH.UTF-8']
    de: ['de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    CH: ['ch', 'ch(fr)', 'it']
    +: ['ch']
    >>> _test_language_territory(show_weights=False, languageId="fr", territoryId="CH") # doctest: +NORMALIZE_WHITESPACE
    fr: ['fr_FR.UTF-8', 'fr_CA.UTF-8', 'fr_BE.UTF-8', 'fr_CH.UTF-8', 'fr_LU.UTF-8']
    CH: ['de_CH.UTF-8', 'fr_CH.UTF-8', 'it_CH.UTF-8', 'wae_CH.UTF-8']
    +: ['fr_CH.UTF-8']
    fr: ['fr(oss)', 'ca', 'ch(fr)']
    CH: ['ch', 'ch(fr)', 'it']
    +: ['ch(fr)']
    >>> _test_language_territory(show_weights=False, languageId="fr", territoryId="FR") # doctest: +NORMALIZE_WHITESPACE
    fr: ['fr_FR.UTF-8', 'fr_CA.UTF-8', 'fr_BE.UTF-8', 'fr_CH.UTF-8', 'fr_LU.UTF-8']
    FR: ['fr_FR.UTF-8', 'br_FR.UTF-8', 'oc_FR.UTF-8', 'ca_FR.UTF-8']
    +: ['fr_FR.UTF-8']
    fr: ['fr(oss)', 'ca', 'ch(fr)']
    FR: ['fr(oss)']
    +: ['fr(oss)']
    >>> _test_language_territory(show_weights=False, languageId="de", territoryId="FR") # doctest: +NORMALIZE_WHITESPACE
    de: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    FR: ['fr_FR.UTF-8', 'br_FR.UTF-8', 'oc_FR.UTF-8', 'ca_FR.UTF-8']
    +: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'fr_FR.UTF-8', 'de_LU.UTF-8', 'br_FR.UTF-8', 'oc_FR.UTF-8', 'ca_FR.UTF-8']
    de: ['de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    FR: ['fr(oss)']
    +: ['fr(oss)', 'de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    >>> _test_language_territory(show_weights=False, languageId="de", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE
    de: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    BE: ['nl_BE.UTF-8', 'fr_BE.UTF-8', 'de_BE.UTF-8', 'wa_BE.UTF-8', 'li_BE.UTF-8']
    +: ['de_BE.UTF-8']
    de: ['de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    BE: ['be(oss)']
    +: ['be(oss)']
    >>> _test_language_territory(show_weights=False, languageId="de", territoryId="AT") # doctest: +NORMALIZE_WHITESPACE
    de: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    AT: ['de_AT.UTF-8']
    +: ['de_AT.UTF-8']
    de: ['de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    AT: ['at(nodeadkeys)']
    +: ['at(nodeadkeys)']
    >>> _test_language_territory(show_weights=False, languageId="de", territoryId="JP") # doctest: +NORMALIZE_WHITESPACE
    de: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    JP: ['ja_JP.UTF-8']
    +: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'ja_JP.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    de: ['de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    JP: ['jp']
    +: ['jp', 'de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    >>> _test_language_territory(show_weights=False, languageId="ja", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ja: ['ja_JP.UTF-8']
    DE: ['de_DE.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']
    +: ['ja_JP.UTF-8', 'de_DE.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']
    ja: ['jp']
    DE: ['de(nodeadkeys)', 'de(deadacute)']
    +: ['jp', 'de(nodeadkeys)', 'de(deadacute)']
    >>> _test_language_territory(show_weights=False, languageId="de", territoryId="ZA") # doctest: +NORMALIZE_WHITESPACE
    de: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8']
    ZA: ['zu_ZA.UTF-8', 'xh_ZA.UTF-8', 'af_ZA.UTF-8', 'en_ZA.UTF-8', 'nso_ZA.UTF-8', 'tn_ZA.UTF-8', 'st_ZA.UTF-8', 'ts_ZA.UTF-8', 'ss_ZA.UTF-8', 've_ZA.UTF-8', 'nr_ZA.UTF-8']
    +: ['de_DE.UTF-8', 'de_AT.UTF-8', 'de_CH.UTF-8', 'de_BE.UTF-8', 'de_LU.UTF-8', 'zu_ZA.UTF-8', 'xh_ZA.UTF-8', 'af_ZA.UTF-8', 'en_ZA.UTF-8', 'nso_ZA.UTF-8', 'tn_ZA.UTF-8', 'st_ZA.UTF-8', 'ts_ZA.UTF-8', 'ss_ZA.UTF-8', 've_ZA.UTF-8', 'nr_ZA.UTF-8']
    de: ['de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    ZA: ['us']
    +: ['us', 'de(nodeadkeys)', 'de(deadacute)', 'at(nodeadkeys)', 'ch', 'be(oss)']
    >>> _test_language_territory(show_weights=False, languageId="ar", territoryId="EG") # doctest: +NORMALIZE_WHITESPACE
    ar: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    EG: ['ar_EG.UTF-8']
    +: ['ar_EG.UTF-8']
    ar: ['ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    EG: ['ara']
    +: ['ara']
    >>> _test_language_territory(show_weights=False, languageId="ar", territoryId="IQ") # doctest: +NORMALIZE_WHITESPACE
    ar: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    IQ: ['ar_IQ.UTF-8']
    +: ['ar_IQ.UTF-8']
    ar: ['ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    IQ: ['iq']
    +: ['iq']
    >>> _test_language_territory(show_weights=False, languageId="ar", territoryId="MA") # doctest: +NORMALIZE_WHITESPACE
    ar: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    MA: ['ar_MA.UTF-8']
    +: ['ar_MA.UTF-8']
    ar: ['ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    MA: ['ma']
    +: ['ma']
    >>> _test_language_territory(show_weights=False, languageId="ar", territoryId="SY") # doctest: +NORMALIZE_WHITESPACE
    ar: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    SY: ['ar_SY.UTF-8']
    +: ['ar_SY.UTF-8']
    ar: ['ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    SY: ['sy']
    +: ['sy']
    >>> _test_language_territory(show_weights=False, languageId="ar", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    ar: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    IN: ['hi_IN.UTF-8', 'en_IN.UTF-8', 'bn_IN.UTF-8', 'te_IN.UTF-8', 'mr_IN.UTF-8', 'ta_IN.UTF-8', 'ur_IN.UTF-8', 'gu_IN.UTF-8', 'kn_IN.UTF-8', 'ml_IN.UTF-8', 'or_IN.UTF-8', 'pa_IN.UTF-8', 'as_IN.UTF-8', 'mai_IN.UTF-8', 'sat_IN.UTF-8', 'ks_IN.UTF-8', 'ks_IN.UTF-8@devanagari', 'kok_IN.UTF-8', 'sd_IN.UTF-8', 'sd_IN.UTF-8@devanagari', 'doi_IN.UTF-8', 'mni_IN.UTF-8', 'brx_IN.UTF-8', 'bho_IN.UTF-8', 'bo_IN.UTF-8', 'hne_IN.UTF-8', 'mag_IN.UTF-8', 'ar_IN.UTF-8']
    +: ['ar_IN.UTF-8']
    ar: ['ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    IN: ['in(eng)']
    +: ['in(eng)', 'ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    >>> _test_language_territory(show_weights=False, languageId="ar", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE
    ar: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8']
    DE: ['de_DE.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']
    +: ['ar_EG.UTF-8', 'ar_SD.UTF-8', 'ar_DZ.UTF-8', 'ar_MA.UTF-8', 'ar_IQ.UTF-8', 'ar_SA.UTF-8', 'ar_YE.UTF-8', 'ar_SY.UTF-8', 'ar_TN.UTF-8', 'ar_LY.UTF-8', 'ar_JO.UTF-8', 'ar_AE.UTF-8', 'ar_LB.UTF-8', 'ar_KW.UTF-8', 'ar_OM.UTF-8', 'ar_QA.UTF-8', 'de_DE.UTF-8', 'ar_BH.UTF-8', 'ar_IN.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']
    ar: ['ara', 'ara(azerty)', 'iq', 'ma', 'sy']
    DE: ['de(nodeadkeys)', 'de(deadacute)']
    +: ['de(nodeadkeys)', 'ara', 'de(deadacute)', 'ara(azerty)', 'iq', 'ma', 'sy']
    >>> _test_language_territory(show_weights=False, languageId="as", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    as: ['as_IN.UTF-8']
    IN: ['hi_IN.UTF-8', 'en_IN.UTF-8', 'bn_IN.UTF-8', 'te_IN.UTF-8', 'mr_IN.UTF-8', 'ta_IN.UTF-8', 'ur_IN.UTF-8', 'gu_IN.UTF-8', 'kn_IN.UTF-8', 'ml_IN.UTF-8', 'or_IN.UTF-8', 'pa_IN.UTF-8', 'as_IN.UTF-8', 'mai_IN.UTF-8', 'sat_IN.UTF-8', 'ks_IN.UTF-8', 'ks_IN.UTF-8@devanagari', 'kok_IN.UTF-8', 'sd_IN.UTF-8', 'sd_IN.UTF-8@devanagari', 'doi_IN.UTF-8', 'mni_IN.UTF-8', 'brx_IN.UTF-8', 'bho_IN.UTF-8', 'bo_IN.UTF-8', 'hne_IN.UTF-8', 'mag_IN.UTF-8', 'ar_IN.UTF-8']
    +: ['as_IN.UTF-8']
    as: ['in(eng)']
    IN: ['in(eng)']
    +: ['in(eng)']
    >>> _test_language_territory(show_weights=False, languageId="bn", territoryId="BD") # doctest: +NORMALIZE_WHITESPACE
    bn: ['bn_BD.UTF-8', 'bn_IN.UTF-8']
    BD: ['bn_BD.UTF-8']
    +: ['bn_BD.UTF-8']
    bn: ['in(eng)']
    BD: ['us']
    +: ['us', 'in(eng)']
    >>> _test_language_territory(show_weights=False, languageId="bn", territoryId="IN") # doctest: +NORMALIZE_WHITESPACE
    bn: ['bn_BD.UTF-8', 'bn_IN.UTF-8']
    IN: ['hi_IN.UTF-8', 'en_IN.UTF-8', 'bn_IN.UTF-8', 'te_IN.UTF-8', 'mr_IN.UTF-8', 'ta_IN.UTF-8', 'ur_IN.UTF-8', 'gu_IN.UTF-8', 'kn_IN.UTF-8', 'ml_IN.UTF-8', 'or_IN.UTF-8', 'pa_IN.UTF-8', 'as_IN.UTF-8', 'mai_IN.UTF-8', 'sat_IN.UTF-8', 'ks_IN.UTF-8', 'ks_IN.UTF-8@devanagari', 'kok_IN.UTF-8', 'sd_IN.UTF-8', 'sd_IN.UTF-8@devanagari', 'doi_IN.UTF-8', 'mni_IN.UTF-8', 'brx_IN.UTF-8', 'bho_IN.UTF-8', 'bo_IN.UTF-8', 'hne_IN.UTF-8', 'mag_IN.UTF-8', 'ar_IN.UTF-8']
    +: ['bn_IN.UTF-8']
    bn: ['in(eng)']
    IN: ['in(eng)']
    +: ['in(eng)']
    >>> _test_language_territory(show_weights=False, languageId="zh", territoryId="CN") # doctest: +NORMALIZE_WHITESPACE
    zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']
    CN: ['zh_CN.UTF-8']
    +: ['zh_CN.UTF-8']
    zh: ['cn']
    CN: ['cn']
    +: ['cn']
    >>> _test_language_territory(show_weights=False, languageId="zh", territoryId="TW") # doctest: +NORMALIZE_WHITESPACE
    zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']
    TW: ['zh_TW.UTF-8']
    +: ['zh_TW.UTF-8']
    zh: ['cn']
    TW: ['cn']
    +: ['cn']
    >>> _test_language_territory(show_weights=False, languageId="zh", territoryId="HK") # doctest: +NORMALIZE_WHITESPACE
    zh: 
['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']549 HK: ['zh_HK.UTF-8']550 +: ['zh_HK.UTF-8']551 zh: ['cn']552 HK: ['cn']553 +: ['cn']554 >>> _test_language_territory(show_weights=False, languageId="zh", territoryId="MO") # doctest: +NORMALIZE_WHITESPACE555 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']556 MO: ['zh_HK.UTF-8']557 +: ['zh_HK.UTF-8']558 zh: ['cn']559 MO: ['cn']560 +: ['cn']561 >>> _test_language_territory(show_weights=False, languageId="zh", territoryId="SG") # doctest: +NORMALIZE_WHITESPACE562 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']563 SG: ['zh_SG.UTF-8', 'en_SG.UTF-8']564 +: ['zh_SG.UTF-8']565 zh: ['cn']566 SG: ['us', 'cn']567 +: ['cn']568 >>> _test_language_territory(show_weights=False, languageId="en", territoryId="SG") # doctest: +NORMALIZE_WHITESPACE569 en: ['en_US.UTF-8', 'en_GB.UTF-8', 'en_IN.UTF-8', 'en_AU.UTF-8', 'en_CA.UTF-8', 'en_DK.UTF-8', 'en_IE.UTF-8', 'en_NZ.UTF-8', 'en_NG.UTF-8', 'en_HK.UTF-8', 'en_PH.UTF-8', 'en_SG.UTF-8', 'en_ZA.UTF-8', 'en_ZM.UTF-8', 'en_ZW.UTF-8', 'en_BW.UTF-8', 'en_AG.UTF-8']570 SG: ['zh_SG.UTF-8', 'en_SG.UTF-8']571 +: ['en_SG.UTF-8']572 en: ['us', 'gb']573 SG: ['us', 'cn']574 +: ['us']575 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hant", territoryId=None) # doctest: +NORMALIZE_WHITESPACE576 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']577 None: []578 +: ['zh_TW.UTF-8', 'zh_HK.UTF-8']579 zh: ['cn']580 None: []581 +: ['cn']582 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hans", territoryId=None) # doctest: +NORMALIZE_WHITESPACE583 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']584 None: []585 +: ['zh_CN.UTF-8', 'zh_SG.UTF-8']586 zh: ['cn']587 None: []588 +: ['cn']589 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hans", territoryId="SG") # doctest: +NORMALIZE_WHITESPACE590 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 
'zh_SG.UTF-8']591 SG: ['zh_SG.UTF-8', 'en_SG.UTF-8']592 +: ['zh_SG.UTF-8']593 zh: ['cn']594 SG: ['us', 'cn']595 +: ['cn']596 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hans", territoryId="TW") # doctest: +NORMALIZE_WHITESPACE597 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']598 TW: ['zh_TW.UTF-8']599 +: ['zh_CN.UTF-8', 'zh_SG.UTF-8', 'zh_TW.UTF-8']600 zh: ['cn']601 TW: ['cn']602 +: ['cn']603 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hant", territoryId="HK") # doctest: +NORMALIZE_WHITESPACE604 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']605 HK: ['zh_HK.UTF-8']606 +: ['zh_HK.UTF-8']607 zh: ['cn']608 HK: ['cn']609 +: ['cn']610 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hant", territoryId="MO") # doctest: +NORMALIZE_WHITESPACE611 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']612 MO: ['zh_HK.UTF-8']613 +: ['zh_HK.UTF-8']614 zh: ['cn']615 MO: ['cn']616 +: ['cn']617 >>> _test_language_territory(show_weights=False, languageId="zh", scriptId = "Hant", territoryId="CN") # doctest: +NORMALIZE_WHITESPACE618 zh: ['zh_CN.UTF-8', 'zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_SG.UTF-8']619 CN: ['zh_CN.UTF-8']620 +: ['zh_TW.UTF-8', 'zh_HK.UTF-8', 'zh_CN.UTF-8']621 zh: ['cn']622 CN: ['cn']623 +: ['cn']624 >>> _test_language_territory(show_weights=False, languageId="ia", territoryId=None) # doctest: +NORMALIZE_WHITESPACE625 ia: ['ia_FR.UTF-8']626 None: []627 +: ['ia_FR.UTF-8']628 ia: ['us(euro)']629 None: []630 +: ['us(euro)']631 >>> _test_language_territory(show_weights=False, languageId="ia", territoryId="DE") # doctest: +NORMALIZE_WHITESPACE632 ia: ['ia_FR.UTF-8']633 DE: ['de_DE.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']634 +: ['ia_FR.UTF-8', 'de_DE.UTF-8', 'nds_DE.UTF-8', 'hsb_DE.UTF-8', 'fy_DE.UTF-8']635 ia: ['us(euro)']636 DE: ['de(nodeadkeys)', 'de(deadacute)']637 +: ['us(euro)', 'de(nodeadkeys)', 'de(deadacute)']638 
>>> _test_language_territory(show_weights=False, languageId="tt", territoryId="RU") # doctest: +NORMALIZE_WHITESPACE639 tt: ['tt_RU.UTF-8', 'tt_RU.UTF-8@iqtelif']640 RU: ['ru_RU.UTF-8', 'cv_RU.UTF-8', 'mhr_RU.UTF-8', 'os_RU.UTF-8', 'tt_RU.UTF-8', 'tt_RU.UTF-8@iqtelif']641 +: ['tt_RU.UTF-8']642 tt: ['ru(tt)', 'us(altgr-intl)']643 RU: ['ru', 'ru(tt)', 'us(altgr-intl)']644 +: ['ru(tt)']645 >>> _test_language_territory(show_weights=False, languageId="tt", scriptId="Latn", territoryId="RU") # doctest: +NORMALIZE_WHITESPACE646 tt: ['tt_RU.UTF-8', 'tt_RU.UTF-8@iqtelif']647 RU: ['ru_RU.UTF-8', 'cv_RU.UTF-8', 'mhr_RU.UTF-8', 'os_RU.UTF-8', 'tt_RU.UTF-8', 'tt_RU.UTF-8@iqtelif']648 +: ['tt_RU.UTF-8@iqtelif']649 tt: ['ru(tt)', 'us(altgr-intl)']650 RU: ['ru', 'ru(tt)', 'us(altgr-intl)']651 +: ['us(altgr-intl)']652 # according to https://wiki.gnome.org/GnomeGoals/KeyboardData,653 # “us(euro)” keyboard should be used in NL:654 >>> _test_language_territory(show_weights=False, languageId="nl") # doctest: +NORMALIZE_WHITESPACE655 nl: ['nl_NL.UTF-8', 'nl_BE.UTF-8', 'nl_AW.UTF-8']656 None: []657 +: ['nl_NL.UTF-8', 'nl_BE.UTF-8', 'nl_AW.UTF-8']658 nl: ['us(euro)', 'us(altgr-intl)', 'be(oss)']659 None: []660 +: ['us(euro)', 'us(altgr-intl)', 'be(oss)']661 >>> _test_language_territory(show_weights=False, languageId="nl", territoryId="NL") # doctest: +NORMALIZE_WHITESPACE662 nl: ['nl_NL.UTF-8', 'nl_BE.UTF-8', 'nl_AW.UTF-8']663 NL: ['nl_NL.UTF-8', 'fy_NL.UTF-8', 'nds_NL.UTF-8', 'li_NL.UTF-8']664 +: ['nl_NL.UTF-8']665 nl: ['us(euro)', 'us(altgr-intl)', 'be(oss)']666 NL: ['us(euro)', 'us(altgr-intl)']667 +: ['us(euro)', 'us(altgr-intl)']668 # but “be(oss)” keyboard should be used for nl in BE669 # (see: https://bugzilla.redhat.com/show_bug.cgi?id=885345):670 >>> _test_language_territory(show_weights=False, languageId="nl", territoryId="BE") # doctest: +NORMALIZE_WHITESPACE671 nl: ['nl_NL.UTF-8', 'nl_BE.UTF-8', 'nl_AW.UTF-8']672 BE: ['nl_BE.UTF-8', 'fr_BE.UTF-8', 'de_BE.UTF-8', 'wa_BE.UTF-8', 
'li_BE.UTF-8']673 +: ['nl_BE.UTF-8']674 nl: ['us(euro)', 'us(altgr-intl)', 'be(oss)']675 BE: ['be(oss)']676 +: ['be(oss)']677 >>> print(language_name(languageId="de")) # doctest: +NORMALIZE_WHITESPACE678 Deutsch679 >>> print(language_name(languageId="de", territoryId="DE")) # doctest: +NORMALIZE_WHITESPACE680 Deutsch (Deutschland)681 >>> print(language_name(languageId="de", territoryId="CH")) # doctest: +NORMALIZE_WHITESPACE682 Deutsch (Schweiz)683 >>> print(language_name(languageId="de", territoryId="AT")) # doctest: +NORMALIZE_WHITESPACE684 Deutsch (Österreich)685 >>> print(language_name(languageId="de", territoryId="BE")) # doctest: +NORMALIZE_WHITESPACE686 Deutsch (Belgien)687 >>> print(language_name(languageId="de", territoryId="JP")) # doctest: +NORMALIZE_WHITESPACE688 Deutsch (Japan)689 >>> print(language_name(languageId="de", territoryId="BY")) # doctest: +NORMALIZE_WHITESPACE690 Deutsch (Belarus)691 >>> print(language_name(languageId="de", territoryId="BY", languageIdQuery="de", territoryIdQuery="CH")) # doctest: +NORMALIZE_WHITESPACE692 Deutsch (Weissrussland)693 >>> print(language_name(languageId="de", scriptId="Latn", territoryId="DE")) # doctest: +NORMALIZE_WHITESPACE694 Deutsch (Deutschland)695 >>> print(language_name(languageId="pt")) # doctest: +NORMALIZE_WHITESPACE696 português697 >>> print(language_name(languageId="pt", territoryId="PT")) # doctest: +NORMALIZE_WHITESPACE698 português (Portugal)699 >>> print(language_name(languageId="pt", territoryId="BR")) # doctest: +NORMALIZE_WHITESPACE700 português (Brasil)701 >>> print(language_name(languageId="pt", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE702 Portugiesisch703 >>> print(language_name(languageId="pt", territoryId="PT", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE704 Portugiesisch (Portugal)705 >>> print(language_name(languageId="pt", territoryId="BR", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE706 Portugiesisch (Brasilien)707 >>> 
print(language_name(languageId="mai", territoryId="IN", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE708 Maithili (India)709 >>> print(language_name(languageId="mai", territoryId="NP", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE710 Maithili (Nepal)711 >>> print(language_name(languageId="mai", territoryId="IN", languageIdQuery="mai")) # doctest: +NORMALIZE_WHITESPACE712 मैथिली (भारत)713 >>> print(language_name(languageId="mai", territoryId="NP", languageIdQuery="mai")) # doctest: +NORMALIZE_WHITESPACE714 मैथिली (नेपाल)715 >>> print(language_name(languageId="zh")) # doctest: +NORMALIZE_WHITESPACE716 中文717 >>> print(language_name(languageId="zh", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE718 Chinesisch719 >>> print(language_name(languageId="zh", scriptId="Hant", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE720 Chinesisch (traditionell)721 >>> print(language_name(languageId="zh", scriptId="Hans", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE722 Chinesisch (vereinfacht)723 >>> print(language_name(languageId="zh", territoryId="HK", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE724 Traditionelles Chinesisch (Sonderverwaltungszone Hongkong)725 >>> print(language_name(languageId="zh", territoryId="MO", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE726 Traditionelles Chinesisch (Sonderverwaltungszone Macao)727 >>> print(language_name(languageId="zh", territoryId="MO", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE728 Traditional Chinese (Macau SAR China)729 >>> print(language_name(languageId="zh", territoryId="SG", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE730 Vereinfachtes Chinesisch (Singapur)731 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE732 Traditionelles Chinesisch (Taiwan)733 >>> print(language_name(languageId="zh", territoryId="CN")) # doctest: +NORMALIZE_WHITESPACE734 简体中文 (中国)735 >>> 
print(language_name(languageId="zh", territoryId="SG")) # doctest: +NORMALIZE_WHITESPACE736 简体中文 (新加坡)737 >>> print(language_name(languageId="zh", territoryId="TW")) # doctest: +NORMALIZE_WHITESPACE738 繁體中文 (台灣)739 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE740 Traditional Chinese (Republic of China)741 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE742 Traditionelles Chinesisch (Taiwan)743 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="de", territoryIdQuery="DE")) # doctest: +NORMALIZE_WHITESPACE744 Traditionelles Chinesisch (Taiwan)745 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="es")) # doctest: +NORMALIZE_WHITESPACE746 chino tradicional (Taiwán)747 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="es", territoryIdQuery="ES")) # doctest: +NORMALIZE_WHITESPACE748 chino tradicional (Taiwán)749 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="zh")) # doctest: +NORMALIZE_WHITESPACE750 繁体中文 (台湾)751 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="zh", territoryIdQuery="TW")) # doctest: +NORMALIZE_WHITESPACE752 繁體中文 (台灣)753 >>> print(language_name(languageId="zh", territoryId="TW", languageIdQuery="zh", territoryIdQuery="CN")) # doctest: +NORMALIZE_WHITESPACE754 繁体中文 (中华民国)755 >>> print(language_name(languageId="zh", territoryId="HK")) # doctest: +NORMALIZE_WHITESPACE756 繁體中文 (中華人民共和國香港特別行政區)757 >>> print(language_name(languageId="zh", territoryId="MO")) # doctest: +NORMALIZE_WHITESPACE758 繁體中文 (中華人民共和國澳門特別行政區)759 >>> print(language_name(languageId="zh", scriptId="Hans", territoryId="CN")) # doctest: +NORMALIZE_WHITESPACE760 简体中文 (中国)761 >>> print(language_name(languageId="zh", scriptId="Hans", territoryId="SG")) # doctest: +NORMALIZE_WHITESPACE762 简体中文 (新加坡)763 >>> 
print(language_name(languageId="zh", scriptId="Hant", territoryId="TW")) # doctest: +NORMALIZE_WHITESPACE764 繁體中文 (台灣)765 >>> print(language_name(languageId="zh", scriptId="Hant", territoryId="HK")) # doctest: +NORMALIZE_WHITESPACE766 繁體中文 (中華人民共和國香港特別行政區)767 >>> print(language_name(languageId="zh", scriptId="Hant", territoryId="MO")) # doctest: +NORMALIZE_WHITESPACE768 繁體中文 (中華人民共和國澳門特別行政區)769 >>> print(language_name(languageId="sr")) # doctest: +NORMALIZE_WHITESPACE770 српски771 >>> print(language_name(languageId="sr", territoryId="RS")) # doctest: +NORMALIZE_WHITESPACE772 српски (Србија)773 >>> print(language_name(languageId="sr", territoryId="ME")) # doctest: +NORMALIZE_WHITESPACE774 српски (Црна Гора)775 >>> print(language_name(languageId="sr", scriptId="Cyrl")) # doctest: +NORMALIZE_WHITESPACE776 српски (Ћирилица)777 >>> print(language_name(languageId="sr", scriptId="Latn")) # doctest: +NORMALIZE_WHITESPACE778 Srpski (Latinica)779 >>> print(language_name(languageId="sr", scriptId="Cyrl", territoryId="RS")) # doctest: +NORMALIZE_WHITESPACE780 српски (Ћирилица) (Србија)781 >>> print(language_name(languageId="sr", scriptId="Latn", territoryId="RS")) # doctest: +NORMALIZE_WHITESPACE782 Srpski (Latinica) (Srbija)783 >>> print(language_name(languageId="sr", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE784 Serbian785 >>> print(language_name(languageId="sr", territoryId="RS", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE786 Serbian (Serbia)787 >>> print(language_name(languageId="sr", territoryId="ME", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE788 Serbian (Montenegro)789 >>> print(language_name(languageId="sr", scriptId="Cyrl", territoryId="RS", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE790 Serbian (Cyrillic) (Serbia)791 >>> print(language_name(languageId="sr", scriptId="Latn", territoryId="RS", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE792 Serbian (Latin) (Serbia)793 # script and territory given in languageId 
override script and territory in extra parameters:794 >>> print(language_name(languageId="sr_Latn_RS", scriptId="Cyrl", territoryId="DE", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE795 Serbian (Latin) (Serbia)796 >>> print(language_name(languageId="be")) # doctest: +NORMALIZE_WHITESPACE797 беларуская798 >>> print(language_name(languageId="be", territoryId="BY")) # doctest: +NORMALIZE_WHITESPACE799 беларуская (Беларусь)800 >>> print(language_name(languageId="be", scriptId="Cyrl")) # doctest: +NORMALIZE_WHITESPACE801 беларуская802 >>> print(language_name(languageId="be", scriptId="Latn")) # doctest: +NORMALIZE_WHITESPACE803 biełaruskaja804 >>> print(language_name(languageId="be", scriptId="latin", languageIdQuery="be", scriptIdQuery="latin")) # doctest: +NORMALIZE_WHITESPACE805 biełaruskaja806 >>> print(language_name(languageId="be", scriptId="Cyrl", territoryId="BY")) # doctest: +NORMALIZE_WHITESPACE807 беларуская (Беларусь)808 >>> print(language_name(languageId="be", scriptId="Latn", territoryId="BY")) # doctest: +NORMALIZE_WHITESPACE809 biełaruskaja (Bielaruś)810 >>> print(language_name(languageId="be", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE811 Belarusian812 >>> print(language_name(languageId="be", territoryId="BY", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE813 Belarusian (Belarus)814 >>> print(language_name(languageId="be", scriptId="Cyrl", territoryId="BY", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE815 Belarusian (Belarus)816 >>> print(language_name(languageId="be", scriptId="Latn", territoryId="BY", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE817 Belarusian (Belarus)818 # script and territory given in languageId override script and territory in extra parameters:819 >>> print(language_name(languageId="be_Latn_BY", scriptId="Cyrl", territoryId="DE", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE820 Belarusian (Belarus)821 >>> print(language_name(languageId="nds", territoryId="DE")) # 
doctest: +NORMALIZE_WHITESPACE822 Plattdüütsch (Düütschland)823 >>> print(language_name(languageId="nds", territoryId="NL")) # doctest: +NORMALIZE_WHITESPACE824 Plattdüütsch (Nedderlannen)825 >>> print(language_name(languageId="pa")) # doctest: +NORMALIZE_WHITESPACE826 ਪੰਜਾਬੀ827 >>> print(language_name(languageId="pa", territoryId="PK")) # doctest: +NORMALIZE_WHITESPACE828 پنجاب (پکستان)829 >>> print(language_name(languageId="pa", scriptId="Arab", territoryId="PK")) # doctest: +NORMALIZE_WHITESPACE830 پنجاب (پکستان)831 >>> print(language_name(languageId="pa", territoryId="IN")) # doctest: +NORMALIZE_WHITESPACE832 ਪੰਜਾਬੀ (ਭਾਰਤ)833 >>> print(language_name(languageId="pa", scriptId="Guru", territoryId="IN")) # doctest: +NORMALIZE_WHITESPACE834 ਪੰਜਾਬੀ (ਭਾਰਤ)835 >>> print(language_name(languageId="pa", scriptId="Arab")) # doctest: +NORMALIZE_WHITESPACE836 پنجاب837 >>> print(language_name(languageId="pa", scriptId="Guru")) # doctest: +NORMALIZE_WHITESPACE838 ਪੰਜਾਬੀ839 >>> print(language_name(languageId="tl")) # doctest: +NORMALIZE_WHITESPACE840 Tagalog841 >>> print(territory_name(territoryId="AE", languageIdQuery="ar")) # doctest: +NORMALIZE_WHITESPACE842 الإمارات العربية المتحدة843 >>> print(territory_name(territoryId="AE", languageIdQuery="de")) # doctest: +NORMALIZE_WHITESPACE844 Vereinigte Arabische Emirate845 >>> print(territory_name(territoryId="AE", languageIdQuery="en")) # doctest: +NORMALIZE_WHITESPACE846 United Arab Emirates847 >>> print(territory_name(territoryId="AE", languageIdQuery=None)) # doctest: +NORMALIZE_WHITESPACE848 >>> print(territory_name(territoryId="TW", languageIdQuery="zh")) # doctest: +NORMALIZE_WHITESPACE849 台湾850 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", scriptIdQuery="Hant")) # doctest: +NORMALIZE_WHITESPACE851 台灣852 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", scriptIdQuery="Hant", territoryIdQuery="TW")) # doctest: +NORMALIZE_WHITESPACE853 台灣854 >>> print(territory_name(territoryId="TW", 
languageIdQuery="zh", territoryIdQuery="TW")) # doctest: +NORMALIZE_WHITESPACE855 台灣856 >>> print(territory_name(territoryId="HK", languageIdQuery="zh", territoryIdQuery="HK")) # doctest: +NORMALIZE_WHITESPACE857 中華人民共和國香港特別行政區858 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", scriptIdQuery="Hans")) # doctest: +NORMALIZE_WHITESPACE859 台湾860 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", scriptIdQuery="Hans", territoryIdQuery="CN")) # doctest: +NORMALIZE_WHITESPACE861 中华民国862 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", territoryIdQuery="CN")) # doctest: +NORMALIZE_WHITESPACE863 中华民国864 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", scriptIdQuery="Cyrl", territoryIdQuery="CN")) # doctest: +NORMALIZE_WHITESPACE865 中华民国866 >>> print(territory_name(territoryId="TW", languageIdQuery="zh", scriptIdQuery="Hans", territoryIdQuery="DE")) # doctest: +NORMALIZE_WHITESPACE867 台湾868 >>> print(territory_name(territoryId="TW", languageIdQuery="de", scriptIdQuery="Latn", territoryIdQuery="DE")) # doctest: +NORMALIZE_WHITESPACE869 Taiwan870 >>> print(territory_name(territoryId="CH", languageIdQuery="de", scriptIdQuery="Latn", territoryIdQuery="DE")) # doctest: +NORMALIZE_WHITESPACE871 Schweiz872 >>> print(territory_name(territoryId="BY", languageIdQuery="de", scriptIdQuery="Latn", territoryIdQuery="CH")) # doctest: +NORMALIZE_WHITESPACE873 Weissrussland874 # script given in languageIdQuery overrides script given in scriptIdQuery:875 >>> print(territory_name(territoryId="RS", languageIdQuery="sr_Cyrl_RS", scriptIdQuery="Latn", territoryIdQuery="CH")) # doctest: +NORMALIZE_WHITESPACE876 Србија877 ######################################################################878 # testing locale pattern regexp:879 # valid patterns:880 >>> _test_cldr_locale_pattern(localeId="srx_XK") # doctest: +NORMALIZE_WHITESPACE881 [('language', 'srx'), ('script', None), ('territory', 'XK')]882 >>> 
_test_cldr_locale_pattern(localeId="sr_XK") # doctest: +NORMALIZE_WHITESPACE883 [('language', 'sr'), ('script', None), ('territory', 'XK')]884 >>> _test_cldr_locale_pattern(localeId="sr@foo") # doctest: +NORMALIZE_WHITESPACE885 [('language', 'sr'), ('script', None), ('territory', None)]886 >>> _test_cldr_locale_pattern(localeId="sr_Cyrl_RS") # doctest: +NORMALIZE_WHITESPACE887 [('language', 'sr'), ('script', 'Cyrl'), ('territory', 'RS')]888 >>> _test_cldr_locale_pattern(localeId="sr_Cyrl_RS@foo") # doctest: +NORMALIZE_WHITESPACE889 [('language', 'sr'), ('script', 'Cyrl'), ('territory', 'RS')]890 >>> _test_cldr_locale_pattern(localeId="srx_Artc_XK") # doctest: +NORMALIZE_WHITESPACE891 [('language', 'srx'), ('script', 'Artc'), ('territory', 'XK')]892 #----------------------------------------------------------------------893 # invalid patterns:894 >>> _test_cldr_locale_pattern(localeId="srxf_Artc_XK") # doctest: +NORMALIZE_WHITESPACE895 []896 >>> _test_cldr_locale_pattern(localeId="srx_ARtc_XK") # doctest: +NORMALIZE_WHITESPACE897 []898 >>> _test_cldr_locale_pattern(localeId="srx_Artc_XXK") # doctest: +NORMALIZE_WHITESPACE899 []900 >>> _test_cldr_locale_pattern(localeId="srx_XXK") # doctest: +NORMALIZE_WHITESPACE901 []902 >>> _test_cldr_locale_pattern(localeId="srx_Artc_Kx") # doctest: +NORMALIZE_WHITESPACE903 []904 >>> supports_ascii("jp") # doctest: +NORMALIZE_WHITESPACE905 True906 >>> supports_ascii("ru") # doctest: +NORMALIZE_WHITESPACE907 False908 >>> supports_ascii("cz") # doctest: +NORMALIZE_WHITESPACE909 True910 >>> supports_ascii("sk") # doctest: +NORMALIZE_WHITESPACE911 True912 >>> supports_ascii("ara") # doctest: +NORMALIZE_WHITESPACE913 False914 >>> supports_ascii("not_existing_in_database") # doctest: +NORMALIZE_WHITESPACE915 True916 >>> languageId("Sindhi") # doctest: +NORMALIZE_WHITESPACE917 'sd'918 >>> languageId("Српски") # doctest: +NORMALIZE_WHITESPACE919 'sr'920 >>> languageId("Serbian") # doctest: +NORMALIZE_WHITESPACE921 'sr'922 >>> 
languageId("Serbian (Cyrillic)") # doctest: +NORMALIZE_WHITESPACE923 'sr_Cyrl'924 >>> languageId("Serbian (Latin)") # doctest: +NORMALIZE_WHITESPACE925 'sr_Latn'926 >>> languageId("Српски (Ћирилица)") # doctest: +NORMALIZE_WHITESPACE927 'sr_Cyrl'928 >>> languageId("Српски (Србија)") # doctest: +NORMALIZE_WHITESPACE929 'sr_RS'930 >>> languageId("Portuguese") # doctest: +NORMALIZE_WHITESPACE931 'pt'932 >>> languageId("Portuguese (Brazil)") # doctest: +NORMALIZE_WHITESPACE933 'pt_BR'934 >>> languageId("Portuguese (Portugal)") # doctest: +NORMALIZE_WHITESPACE935 'pt_PT'936 >>> languageId("Portugiesisch (Brasilien)") # doctest: +NORMALIZE_WHITESPACE937 'pt_BR'938 >>> languageId("Shuswap language") # doctest: +NORMALIZE_WHITESPACE939 'shs'940 >>> languageId("Shuswap Language") # doctest: +NORMALIZE_WHITESPACE941 'shs'942 >>> languageId("shuswap language") # doctest: +NORMALIZE_WHITESPACE943 'shs'944 >>> languageId("sHuSwAp laNguAge") # doctest: +NORMALIZE_WHITESPACE945 'shs'946 >>> languageId("Czech (Czech Republic)") # doctest: +NORMALIZE_WHITESPACE947 'cs_CZ'948 >>> languageId("English (United Kingdom)") # doctest: +NORMALIZE_WHITESPACE949 'en_GB'950 >>> languageId("Low German (Germany)") # doctest: +NORMALIZE_WHITESPACE951 'nds_DE'952 >>> languageId("Tagalog") # doctest: +NORMALIZE_WHITESPACE953 'tl'954 >>> languageId("Filipino") # doctest: +NORMALIZE_WHITESPACE955 'fil'956 >>> print(langtable.timezone_name(timezoneId='US/Mountain', languageIdQuery='ja')) # doctest: +NORMALIZE_WHITESPACE957 アメリカ合衆国/山地時間958 >>> print(langtable.timezone_name(timezoneId='US/Pacific', languageIdQuery='ja')) # doctest: +NORMALIZE_WHITESPACE959 アメリカ合衆国/太平洋時間960 >>> print(langtable.timezone_name(timezoneId='America/North_Dakota/Center', languageIdQuery='es')) # doctest: +NORMALIZE_WHITESPACE961 América/Dakota del Norte/Centro962 >>> print(langtable.timezone_name(timezoneId='Europe/Berlin', languageIdQuery='zh')) # doctest: +NORMALIZE_WHITESPACE963 欧洲/柏林964 >>> 
print(langtable.timezone_name(timezoneId='Europe/Berlin', languageIdQuery='zh_Hant')) # doctest: +NORMALIZE_WHITESPACE965 歐洲/柏林966 >>> print(langtable.timezone_name(timezoneId='Europe/Berlin', languageIdQuery='zh_CN')) # doctest: +NORMALIZE_WHITESPACE967 欧洲/柏林968 >>> print(langtable.timezone_name(timezoneId='Europe/Berlin', languageIdQuery='zh_TW')) # doctest: +NORMALIZE_WHITESPACE969 歐洲/柏林970 >>> print(langtable.timezone_name(timezoneId='GMT+1', languageIdQuery='cs')) # doctest: +NORMALIZE_WHITESPACE971 GMT+1972 >>> print(langtable.timezone_name(timezoneId='foo/bar', languageIdQuery='cs')) # doctest: +NORMALIZE_WHITESPACE973 foo/bar974 >>> print(langtable.timezone_name(timezoneId='Europe/foo/bar', languageIdQuery='cs')) # doctest: +NORMALIZE_WHITESPACE975 Evropa/foo/bar976 >>> print(langtable.timezone_name(timezoneId='America/Vancouver', languageIdQuery='xxx')) # doctest: +NORMALIZE_WHITESPACE977 America/Vancouver978 >>> print(langtable.timezone_name(timezoneId='Pacific/Pago_Pago', languageIdQuery='xxx')) # doctest: +NORMALIZE_WHITESPACE979 Pacific/Pago_Pago980 >>> print(langtable.timezone_name(timezoneId='America/Vancouver', languageIdQuery='ast')) # doctest: +NORMALIZE_WHITESPACE981 América/Vancouver982 >>> print(langtable.timezone_name(timezoneId='Pacific/Pago_Pago', languageIdQuery='ast')) # doctest: +NORMALIZE_WHITESPACE983 Océanu Pacíficu/Pago Pago984 '''985if __name__ == "__main__":986 import doctest...


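The `_test_cldr_locale_pattern` cases in the snippet above exercise a CLDR-style locale-ID parser: a 2-3 letter language code, an optional 4-letter titlecase script, an optional 2-letter uppercase territory, and an ignored `@variant` suffix. A regex sketch that reproduces those results (this pattern and the `parse_locale` helper are my own reconstruction, not langtable's actual implementation):

```python
import re

# Hypothetical pattern mirroring the valid/invalid cases tested above.
_locale_re = re.compile(
    r'^(?P<language>[a-z]{2,3})'        # "sr", "srx"
    r'(?:_(?P<script>[A-Z][a-z]{3}))?'  # "Cyrl", "Artc" (titlecase, 4 letters)
    r'(?:_(?P<territory>[A-Z]{2}))?'    # "RS", "XK" (uppercase, 2 letters)
    r'(?:@.*)?$'                        # variant suffix like "@foo" is ignored
)

def parse_locale(locale_id):
    """Return (name, value) pairs like the doctests above, or [] if invalid."""
    match = _locale_re.match(locale_id)
    if not match:
        return []
    return [('language', match.group('language')),
            ('script', match.group('script')),
            ('territory', match.group('territory'))]

print(parse_locale('sr_Cyrl_RS@foo'))
# [('language', 'sr'), ('script', 'Cyrl'), ('territory', 'RS')]
```

Invalid inputs from the tests, such as `srxf_Artc_XK` (4-letter language) or `srx_ARtc_XK` (script not titlecase), fail the anchored match and yield `[]`.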
jpacks.py

Source:jpacks.py Github


import setpath
from lib import jopts
from lib.jsonpath import jsonpath as libjsonpath
import json
import operator
import itertools
import re
import functions
import math
import unicodedata

try:
    from collections import OrderedDict
except ImportError:
    # Python 2.6
    from lib.collections26 import OrderedDict

characters_to_clean = re.compile(ur"""[^\w!-~]""", re.UNICODE)

def utf8clean(arg):
    def cleanchar(c):
        c = c.group()[0]
        if c != '\n' and unicodedata.category(c)[0] == 'C':
            return u''
        else:
            return c
    o = ''
    if type(arg) in (str, unicode):
        o += characters_to_clean.sub(cleanchar, arg)
    else:
        o += unicode(arg, errors='replace')
    return o

def jngrams(*args):
    """
    .. function:: jngrams(n,text) -> jpack
    Converts multiple input arguments into a jpack of ngrams.
    Examples:
    >>> sql("select jngrams(1,'This is a test phrase')")
    jngrams(1,'This is a test phrase')
    -------------------------------------------
    [["This"],["is"],["a"],["test"],["phrase"]]
    >>> sql("select jngrams(2,'This is a test phrase')")
    jngrams(2,'This is a test phrase')
    ---------------------------------------------------------
    [["This","is"],["is","a"],["a","test"],["test","phrase"]]
    """
    if type(args[0]) == int:
        n = args[0]
        text = args[1]
    else:
        n = 1
        text = args[0]
    g = text.split(' ')
    listofngrams = []
    for i in xrange(len(g) - n + 1):
        listofngrams.append(g[i:i + n])
    return jopts.toj(listofngrams)
jngrams.registered = True

def jfrequentwords(*args):
    """
    .. function:: jfrequentwords(args...) -> jpack
    Returns the frequent words of a text in a jpack
    """
    wordslist = args[0].split(' ')
    setwords = set(wordslist)
    c = dict.fromkeys(setwords, 0)
    for w in wordslist:
        c[w] += 1
    lenwords = len(setwords)
    extremevals = int(math.ceil(lenwords * 3 * 1.0 / 100))
    frequences = sorted(c.values())[extremevals:(lenwords - extremevals)]
    avgfrequency = math.ceil(sum(frequences) * 1.0 / len(frequences))
    return jopts.toj([k for k, v in c.iteritems() if v >= avgfrequency])
jfrequentwords.registered = True

def jsonstrict(*args):
    """
    .. function:: jsonstrict(args...) -> json string
    Sometimes we wish to process json lists from another application. Jsonstrict function
    tries to always create json compatible lists. So it always returns json lists.
    Examples:
    >>> sql("select jsonstrict('a')")
    jsonstrict('a')
    ---------------
    ["a"]
    >>> sql("select jsonstrict('a','b',3)")
    jsonstrict('a','b',3)
    ---------------------
    ["a","b",3]
    >>> sql("select jsonstrict('a', jpack('b',3))")
    jsonstrict('a', jpack('b',3))
    -----------------------------
    ["a",["b",3]]
    """
    return json.dumps(jopts.elemfromj(*args), separators=(',',':'), ensure_ascii=False)
jsonstrict.registered = True

def jzip(*args):
    """
    .. function:: jzip(args...) -> json string
    It combines the corresponding elements of input jpacks.
    Examples:
    >>> sql('''select jzip('["a", "b"]', '[1,2]','[4,5]')''')
    jzip('["a", "b"]', '[1,2]','[4,5]')
    -----------------------------------
    [["a",1,4],["b",2,5]]
    """
    return json.dumps([list(x) for x in zip(*jopts.elemfromj(*args))], separators=(',',':'), ensure_ascii=False)
jzip.registered = True

def jzipdict(*args):
    """
    .. function:: jzipdict(args...) -> json string
    It combines the corresponding elements of input jpacks into a jdict.
    Examples:
    >>> sql('''select jzipdict('["a", "b"]', '[1,2]','[4,5]')''')
    jzipdict('["a", "b"]', '[1,2]','[4,5]')
    ---------------------------------------
    {"a":[1,4],"b":[2,5]}
    """
    return json.dumps(dict(tuple([x[0], x[1:]]) for x in zip(*jopts.elemfromj(*args))), separators=(',',':'), ensure_ascii=False)
jzipdict.registered = True

def jlen(*args):
    """
    .. function:: jlen(args...) -> int
    Returns the total length in elements of the input jpacks.
    Examples:
    >>> sql("select jlen('abc')")
    jlen('abc')
    -----------
    1
    >>> sql("select jlen('a','b',3)")
    jlen('a','b',3)
    ---------------
    3
    >>> sql("select jlen('a', jpack('b',3))")
    jlen('a', jpack('b',3))
    -----------------------
    3
    >>> sql("select jlen('[1,2,3]')")
    jlen('[1,2,3]')
    ---------------
    3
    """
    return sum([len(x) if type(x) in (dict, list) else 1 for x in (jopts.elemfromj(*args))])
jlen.registered = True

def jfilterempty(*args):
    """
    .. function:: jfilterempty(jpacks.) -> jpack
    Removes from input jpacks all empty elements.
    Examples:
    >>> sql("select jfilterempty('a', '', '[]')")
    jfilterempty('a', '', '[]')
    ---------------------------
    a
    >>> sql("select jfilterempty('a','[null]',3)")
    jfilterempty('a','[null]',3)
    ----------------------------
    ["a",3]
    >>> sql("select jfilterempty('[3]', jpack('b', ''))")
    jfilterempty('[3]', jpack('b', ''))
    -----------------------------------
    [3,"b"]
    """
    return jopts.toj([x for x in jopts.fromj(*args) if x != '' and x != [] and x != None])
jfilterempty.registered = True

def jlengthiest(*args):
    """
    .. function:: jlengthiest(jpacks.) -> jpack
    Returns the string with the greatest length contained in the jpacks.
    Examples:
    >>> sql("select jlengthiest('a', '', '[]')")
    jlengthiest('a', '', '[]')
    --------------------------
    a
    >>> sql("select jlengthiest('a','longer',3)")
    jlengthiest('a','longer',3)
    ---------------------------
    longer
    >>> sql("select jlengthiest('[3]', jpack('b', ''))")
    jlengthiest('[3]', jpack('b', ''))
    ----------------------------------
    3
    """
    maxlen = -1
    res = None

    for i in (x for x in jopts.fromj(*args)):
        if i == None:
            l = -1
        else:
            l = len(unicode(i))
        if l > maxlen:
            maxlen = l
            res = i
    return res
jlengthiest.registered = True

def jchars(*args):
    """
    .. function:: jletters(text) -> character jpack
    Splits an input text into its composing characters.
    Examples:
    >>> sql("select jchars('this is a text')")
    jchars('this is a text')
    ---------------------------------------------------------
    ["t","h","i","s"," ","i","s"," ","a"," ","t","e","x","t"]
    >>> sql("select jchars('another', 'text')")
    jchars('another', 'text')
    ---------------------------------------------
    ["a","n","o","t","h","e","r","t","e","x","t"]
    """
    output = []
    for i in args:
        output += list(i)
    return json.dumps(output, separators=(',',':'), ensure_ascii=False)
jchars.registered = True

def j2s(*args):
    """
    .. function:: j2s(jpack) -> space separated string
    Converts multiple input jpacks to a space separated string. 
Newlines are converted to spaces.215 Examples:216 >>> sql("select j2s('[1,2,3]')") # doctest: +NORMALIZE_WHITESPACE217 j2s('[1,2,3]')218 --------------219 1 2 3220 >>> sql("select j2s('[1,2,3]','a')") # doctest: +NORMALIZE_WHITESPACE221 j2s('[1,2,3]','a')222 ------------------223 1 2 3 a224 >>> sql("select j2s('a', 'b')") # doctest: +NORMALIZE_WHITESPACE225 j2s('a', 'b')226 -------------227 a b228 """229 return ' '.join([ unicode(x).replace('\n',' ') for x in jopts.fromj(*args) ])230j2s.registered=True231def j2t(*args):232 """233 .. function:: j2t(jpack) -> tabpack234 Converts multiple input jpacks to a tab separated pack (tab separated values). If tab or newline characters are found in235 the source jpack they are converted to spaces.236 Examples:237 >>> sql("select j2t('[1,2,3]')") # doctest: +NORMALIZE_WHITESPACE238 j2t('[1,2,3]')239 --------------240 1 2 3241 >>> sql("select j2t('[1,2,3]','a')") # doctest: +NORMALIZE_WHITESPACE242 j2t('[1,2,3]','a')243 ------------------244 1 2 3 a245 >>> sql("select j2t('a', 'b')") # doctest: +NORMALIZE_WHITESPACE246 j2t('a', 'b')247 -------------248 a b249 """250 return '\t'.join([ unicode(x).replace('\t', ' ').replace('\n',' ') for x in jopts.fromj(*args) ])251j2t.registered=True252def t2j(*args):253 """254 .. function:: t2j(tabpack) -> jpack255 Converts a tab separated pack to a jpack.256 Examples:257 >>> sql("select t2j(j2t('[1,2,3]'))") # doctest: +NORMALIZE_WHITESPACE258 t2j(j2t('[1,2,3]'))259 -------------------260 ["1","2","3"]261 >>> sql("select t2j('asdfasdf')") # doctest: +NORMALIZE_WHITESPACE262 t2j('asdfasdf')263 ---------------264 ["asdfasdf"]265 """266 267 fj=[]268 for t in args:269 fj+=t.split('\t')270 return json.dumps(fj, separators=(',',':'), ensure_ascii=False)271t2j.registered=True272def s2j(*args):273 """274 .. 
function:: s2j(tabpack) -> jpack275 Converts a space separated pack to a jpack.276 Examples:277 >>> sql("select s2j('1 2 3 ')") # doctest: +NORMALIZE_WHITESPACE278 s2j('1 2 3 ')279 --------------280 ["1","2","3"]281 """282 fj=[]283 for t in args:284 fj+=[x for x in t.split(' ') if x!='']285 return jopts.toj(fj)286s2j.registered=True287def jmerge(*args):288 """289 .. function:: jmerge(jpacks) -> jpack290 Merges multiple jpacks into one jpack.291 Examples:292 >>> sql("select jmerge('[1,2,3]', '[1,2,3]', 'a', 3 )") # doctest: +NORMALIZE_WHITESPACE293 jmerge('[1,2,3]', '[1,2,3]', 'a', 3 )294 -------------------------------------295 [1,2,3,1,2,3,"a",3]296 """297 return jopts.toj( jopts.fromj(*args) )298jmerge.registered=True299def jset(*args):300 """301 .. function:: jset(jpacks) -> jpack302 Returns a set representation of a jpack, unifying duplicate items.303 Examples:304 >>> sql("select jset('[1,2,3]', '[1,2,3]', 'b', 'a', 3 )") # doctest: +NORMALIZE_WHITESPACE305 jset('[1,2,3]', '[1,2,3]', 'b', 'a', 3 )306 ----------------------------------------307 [1,2,3,"a","b"]308 """309 return jopts.toj(sorted(set(jopts.fromj(*args))))310jset.registered = True311def jexcept(*args):312 """313 .. function:: jexcept(jpackA, jpackB) -> jpack314 Returns the items of jpackA except the items that appear on jpackB.315 Examples:316 >>> sql("select jexcept('[1,2,3]', '[1,2,3]')") # doctest: +NORMALIZE_WHITESPACE317 jexcept('[1,2,3]', '[1,2,3]')318 -----------------------------319 []320 >>> sql("select jexcept('[1,2,3]', '[1,3]')") # doctest: +NORMALIZE_WHITESPACE321 jexcept('[1,2,3]', '[1,3]')322 ---------------------------323 2324 """325 if len(args) < 2:326 raise functions.OperatorError("jexcept","operator needs at least two inputs")327 b = set(jopts.fromj(args[1]))328 return jopts.toj([x for x in jopts.fromj(args[0]) if x not in b])329jexcept.registered = True330def jintersection(*args):331 """332 .. 
function:: jintersection(jpackA, jpackB) -> jpack333 Returns the items of jpackA except the items that appear on jpackB.334 Examples:335 >>> sql("select jintersection('[1,2,3]', '[1,2,3]')") # doctest: +NORMALIZE_WHITESPACE336 jintersection('[1,2,3]', '[1,2,3]')337 -----------------------------------338 [1,2,3]339 >>> sql("select jintersection('[1,2,3]', '[1,3]', 1)") # doctest: +NORMALIZE_WHITESPACE340 jintersection('[1,2,3]', '[1,3]', 1)341 ------------------------------------342 1343 """344 if len(args) < 2:345 raise functions.OperatorError("jintersection","operator needs at least two inputs")346 return jopts.toj(sorted(set.intersection(*[set(jopts.fromj(x)) for x in args])))347jintersection.registered = True348def jsort(*args):349 """350 .. function:: jsort(jpacks) -> jpack351 Sorts the input jpacks.352 Examples:353 >>> sql("select jsort('[1,2,3]', '[1,2,3]', 'b', 'a', 3 )") # doctest: +NORMALIZE_WHITESPACE354 jsort('[1,2,3]', '[1,2,3]', 'b', 'a', 3 )355 -----------------------------------------356 [1,1,2,2,3,3,3,"a","b"]357 """358 return jopts.toj(sorted( jopts.fromj(*args) ))359jsort.registered=True360def jsplitv(*args):361 """362 .. function:: jsplitv(jpacks) -> [C1]363 Splits vertically a jpack.364 Examples:365 >>> sql("select jsplitv(jmerge('[1,2,3]', '[1,2,3]', 'b', 'a', 3 ))") # doctest: +NORMALIZE_WHITESPACE366 C1367 --368 1369 2370 3371 1372 2373 3374 b375 a376 3377 """378 yield ('C1', )379 for j1 in jopts.fromj(*args):380 yield [jopts.toj(j1)]381jsplitv.registered=True382def jsplit(*args):383 """384 .. 
function:: jsplit(jpacks) -> [C1, C2, ...]385 Splits horizontally a jpack.386 Examples:387 >>> sql("select jsplit('[1,2,3]', '[3,4,5]')") # doctest: +NORMALIZE_WHITESPACE388 C1 | C2 | C3 | C4 | C5 | C6389 ---------------------------390 1 | 2 | 3 | 3 | 4 | 5391 """392 fj=[jopts.toj(x) for x in jopts.fromj(*args)]393 if fj==[]:394 yield ('C1',)395 396 yield tuple( ['C'+str(x) for x in xrange(1,len(fj)+1)] )397 yield fj398jsplit.registered=True399def jflatten(*args):400 """401 .. function:: jflattten(jpacks) -> jpack402 Flattens all nested sub-jpacks.403 Examples:404 >>> sql(''' select jflatten('1', '[2]') ''') # doctest: +NORMALIZE_WHITESPACE405 jflatten('1', '[2]')406 --------------------407 ["1",2]408 >>> sql(''' select jflatten('[["word1", 1], ["word2", 1], [["word3", 2], ["word4", 2]], 3]') ''') # doctest: +NORMALIZE_WHITESPACE409 jflatten('[["word1", 1], ["word2", 1], [["word3", 2], ["word4", 2]], 3]')410 -------------------------------------------------------------------------411 ["word1",1,"word2",1,"word3",2,"word4",2,3]412 """413 return jopts.toj( jopts.flatten( jopts.elemfromj(*args) ))414jflatten.registered=True415def jmergeregexp(*args):416 """417 .. function:: jmergeregexp(jpacks) -> jpack418 Creates a regular expression that matches all of the jpack's contents. 
If the input419 jpack contains keyword pairs, then jmergeregexp returns a regular expression420 with named groups.421 Examples:422 >>> sql(''' select jmergeregexp('["abc", "def"]') ''') # doctest: +NORMALIZE_WHITESPACE423 jmergeregexp('["abc", "def"]')424 ------------------------------425 (?:abc)|(?:def)426 >>> sql(''' select jmergeregexp('[["pos", "p1"], ["neg", "n1"], ["pos", "p2"]]') ''') # doctest: +NORMALIZE_WHITESPACE427 jmergeregexp('[["pos", "p1"], ["neg", "n1"], ["pos", "p2"]]')428 -------------------------------------------------------------429 (?P<neg>n1)|(?P<pos>p1|p2)430 >>> sql(''' select jmergeregexp('[]') ''') # doctest: +NORMALIZE_WHITESPACE431 jmergeregexp('[]')432 ------------------433 _^434 >>> sql(''' select jmergeregexp('["ab",""]') ''') # doctest: +NORMALIZE_WHITESPACE435 jmergeregexp('["ab",""]')436 -------------------------437 (?:ab)438 """439 inp = jopts.fromj(*args)440 if len(inp)>0 and type(inp[0]) == list:441 out={}442 for x,y in inp:443 if x not in out:444 out[x] = [y]445 else:446 out[x].append(y)447 res = '|'.join('(?P<'+ x + '>' + '|'.join(y)+')' for x, y in out.iteritems() if y!='')448 if res == '':449 res = '_^'450 return res451 res = '|'.join('(?:'+x+')' for x in inp if x!='')452 if res == '':453 res = '_^'454 return res455jmergeregexp.registered=True456def jmergeregexpnamed(*args):457 """458 .. function:: jmergeregexpnamed(jpacks) -> jpack459 Creates a regular expression that matches all of the jpack's contents with named groups. 
If the number of460 named groups in a regular expression is greater than 99, then the output will be a jpack of regular expressions.461 Examples:462 >>> sql(''' select jmergeregexpnamed('["abc", "def"]') ''') # doctest: +NORMALIZE_WHITESPACE463 jmergeregexpnamed('["abc", "def"]')464 -----------------------------------465 (abc)|(def)466 """467 inp = jopts.fromj(*args)468 inp.sort()469 out = []470 for g in xrange(0, len(inp), 99):471 out.append('|'.join('('+x+')' for x in inp[g:g+99]))472 return jopts.toj(out)473jmergeregexpnamed.registered=True474def jdict(*args):475 """476 .. function:: jdict(key, value, key1, value1) -> jdict477 Returns a jdict of the keys and value pairs.478 Examples:479 >>> sql(''' select jdict('key1', 'val1', 'key2', 'val2') ''') # doctest: +NORMALIZE_WHITESPACE480 jdict('key1', 'val1', 'key2', 'val2')481 -------------------------------------482 {"key1":"val1","key2":"val2"}483 >>> sql(''' select jdict('key', '{"k1":1,"k2":2}') ''') # doctest: +NORMALIZE_WHITESPACE484 jdict('key', '{"k1":1,"k2":2}')485 -------------------------------486 {"key":{"k1":1,"k2":2}}487 >>> sql(''' select jdict('key', '["val1", "val2"]') ''') # doctest: +NORMALIZE_WHITESPACE488 jdict('key', '["val1", "val2"]')489 --------------------------------490 {"key":["val1","val2"]}491 >>> sql(''' select jdict('1') ''') # doctest: +NORMALIZE_WHITESPACE492 Traceback (most recent call last):493 ...494 OperatorError: Madis SQLError:495 Operator JDICT: At least two arguments required496 """497 if len(args)==1:498 raise functions.OperatorError('jdict',"At least two arguments required")499 result = OrderedDict()500 501 for i in xrange(0, len(args), 2):502 result[args[i]] = jopts.fromjsingle(args[i+1])503 return jopts.toj( result )504jdict.registered=True505def jdictkeys(*args):506 """507 .. 
function:: jdictkeys(jdict) -> jpack508 Returns a jpack of the keys of input jdict509 Examples:510 >>> sql(''' select jdictkeys('{"k1":1,"k2":2}', '{"k1":1,"k3":2}') ''') # doctest: +NORMALIZE_WHITESPACE511 jdictkeys('{"k1":1,"k2":2}', '{"k1":1,"k3":2}')512 -----------------------------------------------513 ["k1","k2","k3"]514 >>> sql(''' select jdictkeys('{"k1":1,"k2":2}') ''') # doctest: +NORMALIZE_WHITESPACE515 jdictkeys('{"k1":1,"k2":2}')516 ----------------------------517 ["k1","k2"]518 >>> sql(''' select jdictkeys('test') ''') # doctest: +NORMALIZE_WHITESPACE519 jdictkeys('test')520 -----------------521 []522 >>> sql(''' select jdictkeys(1) ''') # doctest: +NORMALIZE_WHITESPACE523 jdictkeys(1)524 ------------525 []526 """527 528 if len(args)==1:529 keys=[]530 i=args[0]531 try:532 if i[0]=='{' and i[-1]=='}':533 keys=[x for x in json.loads(i, object_pairs_hook=OrderedDict).iterkeys()]534 except TypeError,e:535 pass536 else:537 keys=OrderedDict()538 for i in args:539 try:540 if i[0]=='{' and i[-1]=='}':541 keys.update([(x,None) for x in json.loads(i, object_pairs_hook=OrderedDict).iterkeys()])542 except TypeError,e:543 pass544 keys=list(keys)545 return jopts.toj( keys )546jdictkeys.registered=True547def jdictvals(*args):548 """549 .. 
function:: jdictvals(jdict, [key1, key2,..]) -> jpack550 If only the first argument (jdict) is provided, it returns a jpack of the values of input jdict (sorted by the jdict keys).551 If key values are also provided, it returns only the keys that have been provided.552 Examples:553 >>> sql(''' select jdictvals('{"k1":1,"k2":2}') ''') # doctest: +NORMALIZE_WHITESPACE554 jdictvals('{"k1":1,"k2":2}')555 ----------------------------556 [1,2]557 >>> sql(''' select jdictvals('{"k1":1,"k2":2, "k3":3}', 'k3', 'k1', 'k4') ''') # doctest: +NORMALIZE_WHITESPACE558 jdictvals('{"k1":1,"k2":2, "k3":3}', 'k3', 'k1', 'k4')559 ------------------------------------------------------560 [3,1,null]561 >>> sql(''' select jdictvals('{"k1":1}') ''') # doctest: +NORMALIZE_WHITESPACE562 jdictvals('{"k1":1}')563 ---------------------564 1565 >>> sql(''' select jdictvals('{"k1":1}') ''') # doctest: +NORMALIZE_WHITESPACE566 jdictvals('{"k1":1}')567 ---------------------568 1569 >>> sql(''' select jdictvals(1) ''') # doctest: +NORMALIZE_WHITESPACE570 jdictvals(1)571 ------------572 1573 """574 if type(args[0]) in (int,float) or args[0][0]!='{' or args[0][-1]!='}':575 return args[0]576 d=json.loads(args[0])577 if len(args)==1:578 d=d.items()579 d.sort(key=operator.itemgetter(1,0))580 vals=[x[1] for x in d]581 else:582 vals=[]583 for i in args[1:]:584 try:585 vals.append(d[i])586 except KeyboardInterrupt:587 raise588 except:589 vals.append(None)590 591 return jopts.toj( vals )592jdictvals.registered=True593def jdictsplit(*args):594 """595 .. 
function:: jdictsplit(jdict, [key1, key2,..]) -> columns596 If only the first argument (jdict) is provided, it returns a row containing the values of input jdict (sorted by the jdict keys).597 If key values are also provided, it returns only the columns of which the keys have been provided.598 Examples:599 >>> sql(''' select jdictsplit('{"k1":1,"k2":2}') ''') # doctest: +NORMALIZE_WHITESPACE600 k1 | k2601 -------602 1 | 2603 >>> sql(''' select jdictsplit('{"k1":1,"k2":2, "k3":3}', 'k3', 'k1', 'k4') ''') # doctest: +NORMALIZE_WHITESPACE604 k3 | k1 | k4605 --------------606 3 | 1 | None607 """608 d=json.loads(args[0])609 if len(args)==1:610 d=sorted(d.items())611 yield tuple([x[0] for x in d])612 yield [jopts.toj(x[1]) for x in d]613 else:614 vals=[]615 yield tuple(args[1:])616 for i in args[1:]:617 try:618 vals.append(jopts.toj(d[i]))619 except KeyboardInterrupt:620 raise 621 except:622 vals.append(None)623 yield vals624jdictsplit.registered=True625def jdictsplitv(*args):626 """627 .. function:: jdictsplitv(jdict, [key1, key2,..]) -> columns628 If only the first argument (jdict) is provided, it returns rows containing the values of input jdict.629 If key values are also provided, it returns only the columns of which the keys have been provided.630 Examples:631 >>> sql(''' select jdictsplitv('{"k1":1,"k2":2}') ''') # doctest: +NORMALIZE_WHITESPACE632 key | val633 ---------634 k1 | 1635 k2 | 2636 >>> sql(''' select jdictsplitv('{"k1":1,"k2":2, "k3":3}', 'k3', 'k1', 'k4') ''') # doctest: +NORMALIZE_WHITESPACE637 key | val638 ---------639 k3 | 3640 k1 | 1641 """642 yield ('key', 'val')643 if len(args) == 1:644 dlist = json.loads(args[0], object_pairs_hook=OrderedDict)645 for k, v in dlist.iteritems():646 yield [k, jopts.toj(v)]647 else:648 dlist = json.loads(args[0])649 for k in args[1:]:650 try:651 yield k, jopts.toj(dlist[k])652 except KeyError:653 pass654jdictsplitv.registered = True655def jdictgroupkey(*args):656 """657 .. 
function:: jdictgroupkey(list_of_jdicts, groupkey1, groupkey2, ...)658 It groups an array of jdicts into a hierarchical structure. The grouping is done659 first by groupkey1 then by groupkey2 and so on.660 If no groupkeys are provided, then the first key of array's first jdict is used as a groupkey.661 Examples:662 >>> sql('''select jdictgroupkey('[{"gkey":"v1", "b":1},{"gkey":"v1","b":2},{"gkey":"v2","b":1, "c":2}]') as j''')663 j664 ---------------------------------------------665 {"v1":[{"b":1},{"b":2}],"v2":[{"b":1,"c":2}]}666 >>> sql('''select jdictgroupkey('[{"gkey":"v1", "b":1},{"gkey":"v1","b":2},{"gkey":"v2","b":1, "c":2}]', "gkey") as j''')667 j668 ---------------------------------------------669 {"v1":[{"b":1},{"b":2}],"v2":[{"b":1,"c":2}]}670 >>> sql('''select jdictgroupkey('[{"gkey":"v1", "gkey2":"f1", "b":1},{"gkey":"v1", "gkey2":"f2", "b":2},{"gkey":"v1", "gkey2":"f2", "b":1, "c":2}]', "gkey", "gkey2") as j''')671 j672 --------------------------------------------------------------673 {"v1":{"gkey2":{"f1":[{"b":1}],"f2":[{"b":2},{"b":1,"c":2}]}}}674 """675 def recgroupkey(jdict, gkeys):676 outdict=OrderedDict()677 for d in jdict:678 if d[gkeys[0]] not in outdict:679 outdict[d[gkeys[0]]] = [d]680 else:681 outdict[d[gkeys[0]]].append(d)682 del(d[gkeys[0]])683 if len(gkeys)>1:684 outdict = OrderedDict([(x, recgroupkey(y, gkeys[1:])) for x,y in outdict.iteritems()])685 return {gkeys[0]:outdict}686 outdict=OrderedDict()687 dlist=json.loads(args[0], object_pairs_hook=OrderedDict)688 if len(args) == 1:689 groupkeys = [iter(dlist[0]).next()]690 else:691 groupkeys = args[1:]692 outdict = recgroupkey(dlist, groupkeys)693 return jopts.toj(outdict[groupkeys[0]])694jdictgroupkey.registered=True695def jsplice(*args):696 """697 .. function:: jsplice(jpack, range1_start, range1_end, ...) -> jpack698 Splices input jpack. If only a single range argument is provided, it returns input jpack's element in provided position. 
If defined position699 index is positive, then it starts counting from the beginning of input jpack. If defined position is negative it starts counting from the700 end of input jpack.701 If more than one range arguments are provided, then the arguments are assumed to be provided in pairs (start, end) that define ranges inside702 the input jpack that should be joined together in output jpack.703 Examples:704 >>> sql(''' select jsplice('[1,2,3,4,5]',0) ''') # doctest: +NORMALIZE_WHITESPACE705 jsplice('[1,2,3,4,5]',0)706 ------------------------707 1708 >>> sql(''' select jsplice('[1,2,3,4,5]',-1) ''') # doctest: +NORMALIZE_WHITESPACE709 jsplice('[1,2,3,4,5]',-1)710 -------------------------711 5712 >>> sql(''' select jsplice('[1,2,3,4,5]',10) ''') # doctest: +NORMALIZE_WHITESPACE713 jsplice('[1,2,3,4,5]',10)714 -------------------------715 None716 >>> sql(''' select jsplice('[1,2,3,4,5]', 0, 3, 0, 2) ''') # doctest: +NORMALIZE_WHITESPACE717 jsplice('[1,2,3,4,5]', 0, 3, 0, 2)718 ----------------------------------719 [1,2,3,1,2]720 >>> sql(''' select jsplice('[1,2,3,4,5]', 2, -1) ''') # doctest: +NORMALIZE_WHITESPACE721 jsplice('[1,2,3,4,5]', 2, -1)722 -----------------------------723 [3,4]724 """725 largs=len(args)726 if largs==1:727 return args[0]728 fj=jopts.fromj(args[0])729 if largs==2:730 try:731 return jopts.toj(fj[args[1]])732 except KeyboardInterrupt:733 raise734 except:735 return None736 outj=[]737 for i in xrange(1,largs,2):738 try:739 outj+=fj[args[i]:args[i+1]]740 except KeyboardInterrupt:741 raise 742 except:743 pass744 return jopts.toj(outj)745 746jsplice.registered=True747def jcombinations(*args):748 """749 .. 
function:: jcombinations(jpack, r) -> multiset750 Returns all length r combinations of jpack.751 Examples:752 >>> sql('''select jcombinations('["t1","t2","t3"]',2)''')753 C1 | C2754 -------755 t1 | t2756 t1 | t3757 t2 | t3758 >>> sql('''select jcombinations('["t1","t2",["t3","t4"]]',2)''')759 C1 | C2760 ----------------761 t1 | t2762 t1 | ["t3","t4"]763 t2 | ["t3","t4"]764 >>> sql('''select jcombinations(null,2)''')765 >>> sql('''select jcombinations('["t1","t2","t3","t4"]')''')766 C1767 --768 t1769 t2770 t3771 t4772 """773 r=1774 if len(args)==2:775 r=args[1]776 yield tuple(('C'+str(x) for x in xrange(1,r+1)))777 for p in itertools.combinations(jopts.fromj(args[0]), r):778 yield [jopts.toj(x) for x in p]779jcombinations.registered=True780def jpermutations(*args):781 """782 .. function:: jpermutations(jpack, r) -> multiset783 Returns all length r permutations of jpack.784 Examples:785 >>> sql('''select jpermutations('["t1","t2","t3"]',2)''')786 C1 | C2787 -------788 t1 | t2789 t1 | t3790 t2 | t1791 t2 | t3792 t3 | t1793 t3 | t2794 >>> sql('''select jpermutations('["t1","t2",["t3","t4"]]',2)''')795 C1 | C2796 -------------------------797 t1 | t2798 t1 | ["t3","t4"]799 t2 | t1800 t2 | ["t3","t4"]801 ["t3","t4"] | t1802 ["t3","t4"] | t2803 >>> sql('''select jpermutations(null,2)''')804 >>> sql('''select jpermutations('["t1","t2","t3","t4"]')''')805 C1806 --807 t1808 t2809 t3810 t4811 """812 r=1813 if len(args)==2:814 r=args[1]815 yield tuple(('C'+str(x) for x in xrange(1,r+1)))816 for p in itertools.permutations(jopts.fromj(args[0]), r):817 yield [jopts.toj(x) for x in p]818jpermutations.registered=True819def jsonpath(*args):820 """821 .. function:: jsonpath(JSON, jsonpathexpr1, jsonpathexpr2) -> multiset822 Uses jsonpath expressions to pick values from inside a JSON input. If the outputs of all JSONpath expressions823 have the same number of elements in them, it splits the output into multiple rows.824 .. 
note::825 For more on JSONpath see: http://goessner.net/articles/JsonPath/826 Examples:827 >>> sql('''select jsonpath('{"d1":[{"name":"n1", "value":"v1"}, {"name":"n2", "value":"v2"}]}', '$.d1') ''')828 C1829 -------------------------------------------------------830 [{"name":"n1","value":"v1"},{"name":"n2","value":"v2"}]831 >>> sql('''select jsonpath('{"d1":[{"name":"n1", "value":"v1"}, {"name":"n2", "value":"v2"}]}', '$.d1[*].name') ''')832 C1833 --834 n1835 n2836 >>> sql('''select jsonpath('{"d1":[{"name":"n1", "value":"v1"}, {"name":"n2", "value":"v2"}]}', '$.d1[*].name', '$.d1[*].value') ''')837 C1 | C2838 -------839 n1 | v1840 n2 | v2841 >>> sql('''select jsonpath('{"d1":[{"name":"n1", "value":"v1"}, {"name":"n2", "value":"v2"}]}', '$.d1[*].name', '$.d1[*].nonexisting') ''')842 C1 | C2843 ---------844 n1 | None845 n2 | None846 >>> sql('''select jsonpath('{"d1":[{"name":"n1", "value":"v1"}, {"name":"n2"}]}', '$.d1[*].name', '$.d1[*].value') ''')847 C1 | C2848 ----------------849 ["n1","n2"] | v1850 851 >>> sql('''select jsonpath('{"d1":[{"name":"n1", "value":"v1"}, {"name":"n2", "value":"v2"}]}', '$.nonexisting') ''')852 """853 try:854 j = json.loads(args[0])855 except:856 try:857 j = json.loads(utf8clean(args[0]))858 except:859 import sys860 sys.stderr.write(args[0])861 error = 'Error in input line: '+ args[0]862 raise863 yield tuple( ('C'+str(x)for x in xrange( 1,len(args) ) ) )864 output=[libjsonpath(j, jp, use_eval=False) for jp in args[1:]]865 l=0866 lchanges=0867 for i in output:868 try:869 if len(i)!=l:870 l=len(i)871 lchanges+=1872 except TypeError:873 pass874 if l==0:875 return876 try:877 if lchanges>1:878 yield [jopts.toj(x) if type(x)!=bool else None for x in output]879 else:880 for i in xrange(l):881 yield [jopts.toj(x[i]) if type(x)!=bool else None for x in output]882 except:883 import sys884 sys.stderr.write(args[0])885 error = 'Error in input line: '+ args[0]886 raise Exception(error)887jsonpath.registered=True888if not ('.' 
in __name__):889 """890 This is needed to be able to test the function, put it at the end of every891 new function you create892 """893 import sys894 import setpath895 from functions import *896 testfunction()897 if __name__ == "__main__":898 reload(sys)899 sys.setdefaultencoding('utf-8')900 import doctest...
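Many of the docstrings above enable doctest's NORMALIZE_WHITESPACE option, which makes any run of whitespace in actual and expected output compare equal. A minimal stand-alone Python 3 sketch of what that flag does (this demo is not part of madis itself):

```python
import doctest

# A doctest whose expected output uses single spaces, while the code
# actually prints tab-separated values. With NORMALIZE_WHITESPACE the
# two compare equal because both are split on whitespace first.
source = '''
>>> print("1\\t2\\t3")  # doctest: +NORMALIZE_WHITESPACE
1 2 3
'''

test = doctest.DocTestParser().get_doctest(source, {}, 'normalize_demo', None, 0)
results = doctest.DocTestRunner().run(test)
print(results.failed)  # 0 failures: the tab-separated output matched
```

Without the flag, the same doctest would fail, since `1\t2\t3` and `1 2 3` differ character by character.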
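For readers on Python 3, the core of `jngrams` can be sketched without the madis `jopts` packing helpers. This is a simplified stand-in, not the library code: plain compact JSON stands in for jpack formatting.

```python
import json

def jngrams(n, text):
    # Split on single spaces, then emit every window of n consecutive words.
    words = text.split(' ')
    grams = [words[i:i + n] for i in range(len(words) - n + 1)]
    # madis would pack this with jopts.toj; compact json.dumps is used here.
    return json.dumps(grams, separators=(',', ':'))

print(jngrams(2, 'This is a test phrase'))
# -> [["This","is"],["is","a"],["a","test"],["test","phrase"]]
```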
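The keyword-pair branch of `jmergeregexp` is worth a closer look: pairs of `(group_name, pattern)` collapse into a single regex with one named group per distinct name. A hypothetical Python 3 sketch of that grouping logic (`merge_named` is an illustrative name, not a madis function):

```python
import re
from collections import OrderedDict

def merge_named(pairs):
    # Bucket patterns by group name, preserving first-seen order.
    groups = OrderedDict()
    for name, pattern in pairs:
        groups.setdefault(name, []).append(pattern)
    merged = '|'.join('(?P<%s>%s)' % (n, '|'.join(p)) for n, p in groups.items())
    # '_^' can never match; it mirrors the original's empty-input fallback.
    return merged if merged else '_^'

rx = re.compile(merge_named([('pos', 'p1'), ('neg', 'n1'), ('pos', 'p2')]))
m = rx.search('xx p2 yy')
print(m.lastgroup)  # -> pos
```

`lastgroup` reports which named group fired, which is what makes the merged regex useful as a one-pass keyword classifier.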
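The recursive grouping in `jdictgroupkey` can likewise be sketched in Python 3. This hypothetical `group_by_key` returns plain dicts instead of madis jpacks, but reproduces the hierarchy shown in the docstring examples:

```python
import json
from collections import OrderedDict

def group_by_key(dicts, *keys):
    key = keys[0]
    grouped = OrderedDict()
    for d in dicts:
        d = dict(d)  # copy so popping the group key leaves the input intact
        grouped.setdefault(d.pop(key), []).append(d)
    if len(keys) > 1:
        # Group each bucket further by the remaining keys, nesting the
        # next key's name as in the original's output shape.
        grouped = OrderedDict((k, {keys[1]: group_by_key(v, *keys[1:])})
                              for k, v in grouped.items())
    return grouped

data = [{"gkey": "v1", "b": 1}, {"gkey": "v1", "b": 2}, {"gkey": "v2", "b": 1, "c": 2}]
print(json.dumps(group_by_key(data, "gkey"), separators=(',', ':')))
# -> {"v1":[{"b":1},{"b":2}],"v2":[{"b":1,"c":2}]}
```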
