GPT-2
Tabnine learns common code idioms and patterns by training powerful ML models on code. Our most powerful models use over 380M parameters and are an extension of
GPT-2 specialized for code, combining syntactic and semantic information.
Tabnine trains on highly credible open-source code with permissive licenses and predicts the most likely code based on context, using a combination of language models,
such as GPT-2, and semantic models.
Tabnine models are periodically updated to capture patterns from the latest repositories on GitHub and other credible sources.
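The core idea behind language-model-based completion is simple: given the tokens typed so far, rank candidate next tokens by how likely they are. The toy bigram model below is only an illustrative sketch of that idea; Tabnine's actual models are large GPT-2-style transformers, and all class and method names here are invented for illustration.

```python
from collections import Counter, defaultdict

class ToyCompleter:
    """Toy bigram model: predicts the next token from the previous one.

    Illustrative only; real code completion uses deep transformer models
    with much longer context, not single-token frequency counts.
    """

    def __init__(self):
        self.next_tokens = defaultdict(Counter)

    def train(self, corpus):
        # Learn which token tends to follow which, from whitespace-tokenized code.
        for snippet in corpus:
            tokens = snippet.split()
            for prev, nxt in zip(tokens, tokens[1:]):
                self.next_tokens[prev][nxt] += 1

    def complete(self, context_token, k=3):
        # Rank candidate completions by observed frequency in the training corpus.
        return [tok for tok, _ in self.next_tokens[context_token].most_common(k)]

completer = ToyCompleter()
completer.train([
    "for i in range ( n ) :",
    "for item in items :",
    "for i in range ( 10 ) :",
])
print(completer.complete("for"))  # → ['i', 'item'] ("i" follows "for" twice)
```

The same ranking principle scales up: a transformer replaces the frequency table with a learned probability distribution conditioned on the full surrounding context.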
Supported Languages
Tabnine supports all major programming languages:
JavaScript, Python, TypeScript, PHP,
Java, C++, C, C#, Objective-C,
Go, Rust, Perl, Ruby,
Swift, Haskell, Scala,
F#, Kotlin, Julia, Lua, SQL, Bash.
Most common configuration languages:
JSON, YAML, TOML.
Web technologies: HTML, CSS, Markdown.
Tabnine also supports completions in English text and comments.
Private Code Model
On top of the public GPT-2 model, Tabnine also offers a private local model trained on your code.
This local model is not shared in any way with Tabnine and remains private to your machine.
The private model adapts to your code instantly as you use it in your projects. The more you use Tabnine, the better its suggestions will match your project, style, and coding preferences.
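"Adapting instantly" means the local model keeps updating its statistics as you type, so project-specific names start to outrank generic ones. The sketch below shows that effect with a simple frequency table kept entirely in local memory; it is a hypothetical illustration, not Tabnine's actual private model, and every name in it is invented.

```python
from collections import Counter

class LocalAdapter:
    """Toy local adapter: learns identifier frequencies from code as it is typed.

    Hypothetical sketch of "instant adaptation" via incremental updates;
    nothing here leaves the machine, mirroring the privacy property described above.
    """

    def __init__(self):
        self.identifier_counts = Counter()

    def observe(self, line):
        # Record identifiers from each line the user writes.
        for token in line.replace("(", " ").replace(")", " ").split():
            if token.isidentifier():
                self.identifier_counts[token] += 1

    def suggest(self, k=2):
        # Identifiers frequent in *this* project rank first.
        return [name for name, _ in self.identifier_counts.most_common(k)]

adapter = LocalAdapter()
adapter.observe("fetch_user(session)")
adapter.observe("fetch_user(session)")
adapter.observe("load_config(path)")
print(adapter.suggest())  # → ['fetch_user', 'session'], this project's vocabulary
```

Because updates happen per keystroke rather than per retraining cycle, suggestions shift toward your codebase immediately instead of waiting for the next model release.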