
PyTorch


PyTorch is an open source machine learning library based on the Torch library, used for applications such as computer vision and natural language processing, primarily developed by Facebook's AI Research lab.

Here are 21,533 public repositories matching this topic...

transformers
pytorch-lightning
tsuga
tsuga commented Apr 15, 2022

🐛 Bug

tuner.scale_batch_size finds a suitable batch size and updates it on both the model AND the datamodule.
For the model, tuner.scale_batch_size updates the batch size whether it is stored as model.batch_size or model.hparams.batch_size.

However, for the datamodule, tuner.scale_batch_size updates only datamodule.batch_size and keeps datamodule.hparams.batch_size unchanged.

bug · good first issue · trainer: tune · lightningdatamodule
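
A minimal sketch of the mismatch described above, assuming a 1.x Lightning release where Trainer(auto_scale_batch_size=True) plus trainer.tune drives tuner.scale_batch_size; the BoringModel/BoringDataModule classes and all shapes here are made up for illustration:

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class BoringModel(pl.LightningModule):
    def __init__(self, batch_size=2):
        super().__init__()
        self.save_hyperparameters()      # copies batch_size into self.hparams
        self.batch_size = batch_size
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        return self.layer(batch[0]).sum()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

class BoringDataModule(pl.LightningDataModule):
    def __init__(self, batch_size=2):
        super().__init__()
        self.save_hyperparameters()      # copies batch_size into self.hparams
        self.batch_size = batch_size

    def train_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(256, 32)),
                          batch_size=self.batch_size)

model, dm = BoringModel(), BoringDataModule()
trainer = pl.Trainer(auto_scale_batch_size=True, max_epochs=1)
trainer.tune(model, datamodule=dm)

# Expected per the report: dm.batch_size holds the scaled value,
# while dm.hparams.batch_size still shows the original one (2).
print(dm.batch_size, dm.hparams.batch_size)
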
AnirudhDagar
AnirudhDagar commented Jan 24, 2022

Although the results look nice and ideal in all TensorFlow plots and are consistent across all frameworks, there is a small difference (more of a consistency issue): the TensorFlow training loss/accuracy plots look as if they were sampled at fewer points, appearing straighter, smoother, and less wiggly than the corresponding PyTorch or MXNet plots.

It can be clearly seen in chapter 6 ([CNN LeNet](ht

tensorflow-adapt-track · good first issue
datasets
dlwh
dlwh commented Mar 16, 2022

Describe the bug

Streaming Datasets can't be pickled, so any interaction between them and multiprocessing results in a crash.

Steps to reproduce the bug

import transformers
from transformers import Trainer, AutoModelForCausalLM, TrainingArguments
import datasets

ds = datasets.load_dataset('oscar', "unshuffled_deduplicated_en", split='train', streaming=True).with_format("
bug · good first issue
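
The crash can be seen without the Trainer at all. A minimal sketch, assuming only the datasets library is installed (multiprocessing serializes datasets through the same pickling path when workers are spawned):

import pickle
import datasets

ds = datasets.load_dataset('oscar', "unshuffled_deduplicated_en",
                           split='train', streaming=True)

# Streaming mode returns an IterableDataset backed by generators, and
# generators cannot be pickled, so this is expected to raise at the
# time of the report.
pickle.dumps(ds)
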
chan4cc
chan4cc commented Apr 26, 2021

New Operator

Describe the operator

Why is this operator necessary? What does it accomplish?

This is a frequently used operator in tensorflow/keras

Can this operator be constructed using existing onnx operators?

If so, why not add it as a function?

I don't know.

Is this operator used by any model currently? Which one?

Are you willing to contribute it?

operator · good first issue · enhancement
nni
pkubik
pkubik commented Mar 14, 2022

Describe the issue:
While computing channel dependencies, reshape_break_channel_dependency runs the following code to check whether the number of input channels equals the number of output channels:

def reshape_break_channel_dependency(op_node):
    in_shape = op_node.auxiliary['in_shape']
    out_shape = op_node.auxiliary['out_shape']
    in_channel = in_shape[1]      # assumes dim 1 holds the channel count
    out_channel = out_shape[1]
    return in_channel != out_channel

This is correct

bug · help wanted · good first issue · model compression
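
For illustration, a minimal sketch of how the check behaves, assuming the snippet above is wrapped as the function it comes from and fed a fake node with made-up NCHW shapes (the real op_node comes from NNI's graph utilities):

class FakeOpNode:
    # hypothetical node; made-up shapes in NCHW layout (dim 1 = channels)
    auxiliary = {'in_shape': [1, 64, 28, 28], 'out_shape': [1, 128, 14, 14]}

# 64 != 128, so the check reports that this op breaks the channel dependency
print(reshape_break_channel_dependency(FakeOpNode()))  # True
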
danieldeutsch
danieldeutsch commented Jun 2, 2021

Is your feature request related to a problem? Please describe.
I typically use compressed datasets (e.g. gzipped) to save disk space. This works fine with AllenNLP during training because I can write my dataset reader to load the compressed data. However, the predict command opens the file and reads lines for the Predictor, which fails when it tries to load data from my compressed files.

Good First Issue · Contributions welcome · Feature request
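
The training-time workaround the report alludes to is easy to sketch. A hypothetical helper (gzip support only; the name is made up) that a dataset reader could call, and that the predict command currently bypasses:

import gzip
from typing import IO

def open_maybe_compressed(path: str) -> IO[str]:
    # transparently read gzipped files; fall back to plain text otherwise
    if path.endswith('.gz'):
        return gzip.open(path, 'rt', encoding='utf-8')
    return open(path, 'r', encoding='utf-8')

# e.g. inside a DatasetReader._read implementation:
# with open_maybe_compressed(file_path) as f:
#     for line in f:
#         yield self.text_to_instance(line)
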

Created by Facebook's AI Research lab (FAIR)

Released September 2016

Latest release about 2 months ago

Repository
pytorch/pytorch
Website
pytorch.org
Wikipedia

Related Topics

python pytorch-tutorial