fix(parser): remove MAX_LINES limit that truncates long chapters

The MAX_LINES=1000 limit was causing text to disappear after ~25 pages
in long chapters. For example, a 93KB chapter generating 242 pages
(~9,680 lines, i.e. ~40 lines per page) was truncated at ~1,000 lines,
which at that density is exactly the ~25-page cutoff observed.

Replaced the hard limit with a safety check that prevents infinite loops
by forcing advancement when nextBreakIndex doesn't progress, while still
allowing all lines to be processed.
commit eb75b4a82e
parent aff4dc6628
Author: Eunchurn Park
Date:   2025-12-27 00:17:41 +09:00

@@ -111,16 +111,17 @@ std::vector<size_t> ParsedText::computeLineBreaks(const int pageWidth, const int
     // Stores the index of the word that starts the next line (last_word_index + 1)
     std::vector<size_t> lineBreakIndices;
     size_t currentWordIndex = 0;
-    constexpr size_t MAX_LINES = 1000;
     while (currentWordIndex < totalWordCount) {
-        if (lineBreakIndices.size() >= MAX_LINES) {
-            break;
+        size_t nextBreakIndex = ans[currentWordIndex] + 1;
+        // Safety check: prevent infinite loop if nextBreakIndex doesn't advance
+        if (nextBreakIndex <= currentWordIndex) {
+            // Force advance by at least one word to avoid infinite loop
+            nextBreakIndex = currentWordIndex + 1;
         }
-        size_t nextBreakIndex = ans[currentWordIndex] + 1;
         lineBreakIndices.push_back(nextBreakIndex);
         currentWordIndex = nextBreakIndex;
     }
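For context, here is a minimal standalone sketch of the fixed loop. Per the
comment in the diff, it assumes ans[i] holds the index of the last word on
the line that starts at word i; the free function and the main() driver are
illustrative stand-ins for the real ParsedText::computeLineBreaks member.

#include <cstddef>
#include <iostream>
#include <vector>

// Sketch only: mirrors the post-fix loop body from the diff above.
std::vector<size_t> computeLineBreaks(const std::vector<size_t>& ans,
                                      size_t totalWordCount) {
    std::vector<size_t> lineBreakIndices;
    size_t currentWordIndex = 0;
    while (currentWordIndex < totalWordCount) {
        size_t nextBreakIndex = ans[currentWordIndex] + 1;
        // Safety check: if the break table ever points at or behind the
        // current word, force progress by one word instead of looping forever.
        if (nextBreakIndex <= currentWordIndex) {
            nextBreakIndex = currentWordIndex + 1;
        }
        lineBreakIndices.push_back(nextBreakIndex);
        currentWordIndex = nextBreakIndex;
    }
    return lineBreakIndices;  // one entry per line, no cap on line count
}

int main() {
    // Hypothetical break table: 6 words, 2 per line, so ans[i] is the
    // last word index of the line starting at word i.
    std::vector<size_t> ans = {1, 1, 3, 3, 5, 5};
    for (size_t breakIndex : computeLineBreaks(ans, ans.size()))
        std::cout << breakIndex << ' ';  // prints: 2 4 6
    std::cout << '\n';
}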