---
license: apache-2.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
inference: true
widget:
- text: |-
    public class HelloWorld {
        public static void main(String[] args) {
  example_title: Hello world
  group: Java
---
# JavaCoder
## Table of Contents

- [Model Summary](#model-summary)
## Model Summary
The JavaCoder models are !B parameter models trained on 80+ programming languages from The Stack (v1.2), with opt-out requests excluded. The model uses Multi Query Attention and a context window of 8192 tokens, and was trained with the Fill-in-the-Middle objective on 1 trillion tokens.
- **Repository:**
- **Project Website:**
- **Paper:**
- **Point of Contact:**
- **Languages:** 80+ programming languages
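
Since the card declares `library_name: transformers` and `pipeline_tag: text-generation`, the model can be loaded with the standard `transformers` API. The sketch below is a minimal, hedged example: the checkpoint id `javacoder/javacoder-base` is a placeholder (the repository link above is not filled in), and the Fill-in-the-Middle sentinel tokens are an assumption based on the StarCoder convention; check the tokenizer's special tokens for the actual names.

```python
# Minimal usage sketch. The checkpoint id below is hypothetical -- replace it
# with the actual repository name linked from this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "javacoder/javacoder-base"  # hypothetical model id
device = "cpu"  # or "cuda" if a GPU is available

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Same prompt as the inference widget above: an unfinished Java class.
prompt = "public class HelloWorld {\n    public static void main(String[] args) {"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Greedy completion; the model continues the method body.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Fill-in-the-Middle prompting (assumption: StarCoder-style sentinel tokens;
# verify against tokenizer.special_tokens_map before relying on these names).
fim_prompt = (
    "<fim_prefix>public static int add(int a, int b) {\n"
    "    return <fim_suffix>;\n}<fim_middle>"
)
fim_inputs = tokenizer(fim_prompt, return_tensors="pt").to(device)
fim_outputs = model.generate(**fim_inputs, max_new_tokens=16)
print(tokenizer.decode(fim_outputs[0]))
```

The first call completes the prompt left to right; the second asks the model to fill the hole between the prefix and suffix, which is what the Fill-in-the-Middle training objective enables.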