
Upgrade benchmark to v1.5.5 #60750

Closed
wants to merge 1 commit

Conversation

@dreiss (Contributor) commented on Jun 25, 2021

Summary: Upgrading benchmark to v1.5.5 fixes the build for gcc 11.

Test Plan: CI

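The summary doesn't quote the compile error, but the usual class of gcc 11 breakage in older google/benchmark releases is a missing standard-library include: gcc 11's libstdc++ stopped pulling headers such as <limits> in transitively, so code that relied on the transitive include no longer compiles. A minimal sketch of that pattern (illustrative only, not the actual benchmark source):

    // Older toolchains got <limits> transitively through other standard
    // headers; gcc 11's libstdc++ dropped those transitive includes, so the
    // explicit include below is required. Omitting it fails with
    // "error: 'numeric_limits' is not a member of 'std'".
    #include <cstdio>
    #include <limits>

    int main() {
      std::printf("int max: %d\n", std::numeric_limits<int>::max());
      return 0;
    }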
@facebook-github-bot (Contributor) commented on Jun 25, 2021

💊 CI failures summary and remediations

As of commit 59e6e96 (more details on the Dr. CI page and at hud.pytorch.org/pr/60750):


  • 1/1 failures introduced in this PR

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_linux_bionic_py3_8_gcc9_coverage_test2 (1/1)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jun 25 17:08:59 ----------------------------------------------------------------------
Jun 25 17:08:59 Traceback (most recent call last):
Jun 25 17:08:59   File "/opt/conda/lib/python3.8/site-packages/torch/testing/_internal/common_distributed.py", line 399, in wrapper
Jun 25 17:08:59     self._join_processes(fn)
Jun 25 17:08:59   File "/opt/conda/lib/python3.8/site-packages/torch/testing/_internal/common_distributed.py", line 606, in _join_processes
Jun 25 17:08:59     self._check_return_codes(elapsed_time)
Jun 25 17:08:59   File "/opt/conda/lib/python3.8/site-packages/torch/testing/_internal/common_distributed.py", line 655, in _check_return_codes
Jun 25 17:08:59     self.assertEqual(
Jun 25 17:08:59   File "/opt/conda/lib/python3.8/site-packages/torch/testing/_internal/common_utils.py", line 1498, in assertEqual
Jun 25 17:08:59     super().assertTrue(result, msg=self._get_assert_msg(msg, debug_msg=debug_msg))
Jun 25 17:08:59 AssertionError: False is not true : Scalars failed to compare as equal! Comparing -6 and 0 gives a difference of 6, but the allowed difference with rtol=0 and atol=0 is only 0!
Jun 25 17:08:59 Expect process 2 exit code to match Process 0 exit code of 0, but got -6
Jun 25 17:08:59 
Jun 25 17:09:00 ----------------------------------------------------------------------
Jun 25 17:09:00 Ran 367 tests in 1294.240s
Jun 25 17:09:00 
Jun 25 17:09:00 FAILED (failures=1, skipped=7)
Jun 25 17:09:00 
Jun 25 17:09:00 Generating XML reports...
Jun 25 17:09:00 Generated XML report: test-reports/python-unittest/distributed.rpc.test_tensorpipe_agent/TEST-TensorPipeDdpComparisonTestWithSpawn-20210625164725.xml
Jun 25 17:09:00 Generated XML report: test-reports/python-unittest/distributed.rpc.test_tensorpipe_agent/TEST-TensorPipeDdpUnderDistAutogradTestWithSpawn-20210625164725.xml
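A note on reading this failure: the -6 is how Python's multiprocessing reports a child killed by signal 6 (SIGABRT), i.e. one test worker aborted instead of exiting cleanly, and the harness then compares that exit code against process 0's with rtol=0 and atol=0, which yields the scalar-comparison wording above. A hypothetical C++ program that would die this way (just to show where a -6 status comes from):

    #include <cstdlib>

    int main() {
      // std::abort() raises SIGABRT (signal 6). A supervising process such
      // as Python's multiprocessing reports a child killed by signal N as
      // exit code -N, hence the -6 in the log above.
      std::abort();
    }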

This comment was automatically generated by Dr. CI.

@facebook-github-bot (Contributor) commented on Jun 25, 2021

@dreiss has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented on Jun 28, 2021

@dreiss merged this pull request in 971cdaf.
