# language-models
Here are 62 public repositories matching this topic...
Visualize and explore NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT2).
Updated Aug 2, 2021 - Jupyter Notebook
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
Topics: deep-learning, vietnamese, transformers, python3, named-entity-recognition, language-models, ner, natural-language-inference, bert, pos-tagging, vietnamese-nlp, nli, fairseq, vncorenlp, roberta, bert-embeddings, rdrsegmenter, part-of-speech-tagging, transformers-library, phobert
Updated Dec 7, 2020
An easy-to-use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving state-of-the-art NLP models.
Topics: nlp, docker, machine-learning, natural-language-processing, deep-learning, gpu, transformers, pytorch, api-rest, easy, gpt, language-models, deep-learning-tutorial, bert, fine-tuning, ulmfit, xlnet
Updated Aug 13, 2021 - Jupyter Notebook
ACL'2021: LM-BFF: Better Few-shot Fine-tuning of Language Models
Updated Aug 1, 2021 - Python
This repository contains landmark research papers in Natural Language Processing that came out in this century.
Topics: nlp, language, machine-learning, natural-language-processing, artificial-intelligence, vectors, papers, language-models, bert
Updated Feb 11, 2021
A package built on top of Hugging Face's transformers library that makes it easy to use state-of-the-art NLP models.
Topics: python, nlp, machine-learning, natural-language-processing, ai, deep-learning, text-classification, transformers, artificial-intelligence, question-answering, language-models, bert, roberta
Updated Aug 14, 2021 - Python
Topics: nlp, sentiment-analysis, text-classification, nlu, transformer, named-entity-recognition, persian-language, language-models, ner, bert, downstream-tasks, parsbert, persian-bert, persianber
Updated May 28, 2021 - Jupyter Notebook
Pre-trained models and language resources for Natural Language Processing in Polish
Topics: machine-learning, natural-language-processing, polish, language-models, word-embedding, polish-language, lexicons
Updated Jun 21, 2021
A list of open-source projects from the Microsoft Research NLP Group.
Updated Sep 29, 2020
Language models are open knowledge graphs (unofficial implementation)
Updated Nov 14, 2020 - Python
Generate realistic Instagram captions using transformers 🤗
Updated Aug 26, 2020 - Python
A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related Language Models in production environments.
Topics: python, language, awesome, tutorial, deployment, tensorflow, paper, model, natural-language, production, transformers, resources, smaller, implementation, language-models, knowledge-distillation, bert, elmo, bert-pytorch, distilbert
Updated Apr 8, 2021
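One technique covered by collections like the one above is knowledge distillation: a small student model is trained to match a larger teacher's temperature-softened output distribution. A minimal sketch of the soft-target loss in pure Python (function names are illustrative, not taken from any listed repository):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's softened targets and the
    # student's softened predictions (the soft-target term of the
    # standard distillation objective).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

In practice this term is combined with the ordinary hard-label cross-entropy, weighted and scaled by T².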
Must-read papers on Natural Language Processing (NLP)
Updated Jul 5, 2021
[ICLR 2021] "InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective" by Boxin Wang, Shuohang Wang, Yu Cheng, Zhe Gan, Ruoxi Jia, Bo Li, Jingjing Liu
Topics: information-theory, language-models, bert, adversarial-attacks, roberta, adversarial-defense, adversarial-robustness
Updated Jan 14, 2021 - Python
UBC ARBERT and MARBERT: Deep Bidirectional Transformers for Arabic
Topics: benchmark, awesome, social-media, deep-learning, classification, ubc, language-models, arabic, bert, model-evaluation, arabic-nlp, arabic-language, arabic-dialects, bert-model, huggingface-transformers, arbert, marbert, arabic-models, ubc-dlnlp, ubcnlp
Updated May 18, 2021
Keras implementations of three language models: a character-level RNN, a word-level RNN, and a Sentence VAE (Bowman, Vilnis, et al., 2016).
Updated May 22, 2021 - Python
Smart Language Model
Updated Sep 20, 2020 - C++
Transformer based Turkish language models
Updated Jan 9, 2021 - Python
A neural network language model that generates text in the style of The Lord of the Rings. Built with PyTorch.
Updated Feb 6, 2021 - Python
Python source code for EMNLP 2020 paper "Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT".
Topics: transfer-learning, language-models, cross-lingual, low-resource-languages, residual-adapters, pretraining, unsupervised-machine-translation
Updated Dec 8, 2020 - Python
Course notes for Stanford CS224n: Natural Language Processing with Deep Learning, Winter 2019 (using PyTorch).
Topics: machine-translation, question-answering, language-models, dependency-parsing, cs224n, cs224n-assignment-solutions, cs224nwinter2019
Updated Jan 14, 2020 - JavaScript
Deep-learning transfer-learning models from the NTUA-SLP team, submitted to the IEST shared task of WASSA 2018 at EMNLP 2018.
Topics: python, deep-neural-networks, twitter, deep-learning, sentiment-analysis, pytorch, lstm, transfer-learning, language-models, emotion-analysis
Updated May 20, 2020 - Python
Python implementation of an N-gram language model with Laplace smoothing and sentence generation.
Topics: python, nlp, ngram, ngrams, language-models, language-model, ngram-language-model, laplace-smoothing, perplexity, smoothing-methods
Updated Feb 9, 2018 - Python
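The bigram case of the model described above fits in a few lines; `BigramLM` is a hypothetical name for illustration, not the repository's actual API:

```python
import random
from collections import defaultdict

class BigramLM:
    """Minimal bigram language model with Laplace (add-one) smoothing."""

    def __init__(self, corpus):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()
        for sentence in corpus:
            tokens = ["<s>"] + sentence.split() + ["</s>"]
            self.vocab.update(tokens)
            for prev, word in zip(tokens, tokens[1:]):
                self.counts[prev][word] += 1

    def prob(self, prev, word):
        # Laplace smoothing: add 1 to every count and |V| to the
        # denominator, so unseen bigrams get nonzero probability.
        num = self.counts[prev][word] + 1
        den = sum(self.counts[prev].values()) + len(self.vocab)
        return num / den

    def generate(self, max_len=20):
        # Sample from the smoothed bigram distribution until </s>.
        out, prev = [], "<s>"
        for _ in range(max_len):
            words = sorted(self.vocab - {"<s>"})
            weights = [self.prob(prev, w) for w in words]
            prev = random.choices(words, weights=weights)[0]
            if prev == "</s>":
                break
            out.append(prev)
        return " ".join(out)
```

With smoothing, the probabilities over the vocabulary still sum to one for any history, which is what makes perplexity well defined on held-out text.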
Course on Language Technologies and NLP
Updated May 15, 2017 - TeX
The data and code for NumerSense (EMNLP2020)
Updated Feb 16, 2021 - Python
Code for equipping pretrained language models (BART, GPT-2, XLNet) with commonsense knowledge for generating implicit knowledge statements between two sentences, by (i) finetuning the models on corpora enriched with implicit information; and by (ii) constraining models with key concepts and commonsense knowledge paths connecting them.
Updated Jul 27, 2021 - Python
https://github.com/huggingface/transformers/blob/546dc24e0883e5e9f5eb06ec8060e3e6ccc5f6d7/src/transformers/models/gpt2/modeling_gpt2.py#L698
Assertions can't be relied upon for control flow because they can be disabled, as per the following:
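Running Python with the `-O` flag compiles `assert` statements out entirely, so checks that must always execute should raise exceptions instead. A minimal sketch of the difference (function name is hypothetical, not from the linked file):

```python
def validate_length(n, max_len):
    # Don't use `assert n <= max_len` for input validation here:
    # under `python -O`, the assertion would vanish and bad input
    # would pass through silently. An explicit raise always runs.
    if n > max_len:
        raise ValueError(f"length {n} exceeds maximum {max_len}")
    return n
```

The explicit `raise` keeps the control flow intact regardless of interpreter optimization flags.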