Exposes API for processing pretraining data #672
Merged
Commits (19)
fb2a1aa  adds API for converting documents into pretraining data (RobotSail)
6fbdcfd  removes extraneous validation + account for eos already being present (RobotSail)
fd6f5f3  exposes the pretraining column name in training config (RobotSail)
09a0b90  explicitly do not request a dict (RobotSail)
967a3c4  adds the ability for training to consume pretraining format data (RobotSail)
dd8e363  linting + minor fixes (RobotSail)
333a8ef  add docs (RobotSail)
1cd4aa4  updates mock tokenizer (RobotSail)
fba8ce6  tests (RobotSail)
f57251b  linting (RobotSail)
a354284  double-check failure (RobotSail)
699ca77  fix tests (RobotSail)
9c3905b  revert debug change (RobotSail)
fa0b7d1  fix incorrect access (RobotSail)
5f8f2a5  adds pydantic to mypy for tox (RobotSail)
d7be2a3  more linting (RobotSail)
0275455  revert changes (RobotSail)
77d39f6  fix tests (RobotSail)
554df1b  remove unused fields (RobotSail)
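The commits above center on converting raw documents into pretraining samples, taking care to append the EOS token only when it is not already present (commit 6fbdcfd). A minimal sketch of that idea follows; the function name and returned fields are illustrative assumptions, not the PR's actual API:

```python
def document_to_pretraining_sample(document: str, tokenizer) -> dict:
    """Tokenize a raw document into a pretraining sample ending in EOS.

    Hypothetical helper illustrating the conversion; not the PR's real API.
    """
    input_ids = tokenizer.encode(document)
    # Account for EOS already being present: append it only when the
    # tokenized document is not already terminated with it.
    if not input_ids or input_ids[-1] != tokenizer.eos_token_id:
        input_ids = input_ids + [tokenizer.eos_token_id]
    # For plain causal-LM pretraining data, labels mirror the inputs.
    return {"input_ids": input_ids, "labels": list(input_ids)}
```

The double-append guard matters because some tokenizers (or upstream formatting steps) already emit a trailing EOS, and duplicating it would corrupt the training signal at document boundaries.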
Clarify the rationale:

`return_dict=False` is already the default, not a future-proofing measure. The inline comment suggests that "newer versions of transformers have `return_dict=True` by default," but `apply_chat_template` defaults to `return_dict=False` in transformers 4.55.0 and historically has done so. The explicit parameter here is redundant but harmless. If this is intended as defensive coding against hypothetical future changes, consider clarifying or removing the comment to avoid misleading future readers. If the intent is to document current behavior, the comment should state that the parameter preserves the existing default rather than implying a version-specific workaround.