
I ran the build.sh script while installing the library on my laptop (CentOS Linux release 7.5.1804 (Core)), but it gets stuck on Test #22 for some reason.
Test project /root/cpp-netlib/build
Start 1: uri_parse_test
1/26 Test #1: uri_parse_test ...................................... Passed 0.01 sec
Start 2: uri_parse_scheme_test
2/26 Test #2: uri_parse_scheme_test ............................... Passed 0.00 sec
Start 3: uri_parse_path_test
3/26 Test #3: uri_parse_path_test ................................. Passed 0.00 sec
Start 4: uri_test
4/26 Test #4: uri_test ............................................ Passed 0.01 sec
Start 5: uri_encoding_test
5/26 Test #5: uri_encoding_test ................................... Passed 0.00 sec
Start 6: uri_normalization_test
6/26 Test #6: uri_normalization_test .............................. Passed 0.01 sec
Start 7: uri_comparison_test
7/26 Test #7: uri_comparison_test ................................. Passed 0.00 sec
Start 8: uri_reference_test
8/26 Test #8: uri_reference_test .................................. Passed 0.00 sec
Start 9: uri_resolve_test
9/26 Test #9: uri_resolve_test .................................... Passed 0.01 sec
Start 10: uri_builder_test
10/26 Test #10: uri_builder_test .................................... Passed 0.01 sec
Start 11: uri_stream_test
11/26 Test #11: uri_stream_test ..................................... Passed 0.00 sec
Start 12: optional_test
12/26 Test #12: optional_test ....................................... Passed 0.00 sec
Start 13: cpp-netlib-utils_thread_pool
13/26 Test #13: cpp-netlib-utils_thread_pool ........................ Passed 0.48 sec
Start 14: cpp-netlib-http-response_incremental_parser_test
14/26 Test #14: cpp-netlib-http-response_incremental_parser_test .... Passed 0.00 sec
Start 15: cpp-netlib-http-request_incremental_parser_test
15/26 Test #15: cpp-netlib-http-request_incremental_parser_test ..... Passed 0.00 sec
Start 16: cpp-netlib-http-request_linearize_test
16/26 Test #16: cpp-netlib-http-request_linearize_test .............. Passed 0.00 sec
Start 17: cpp-netlib-http-client_constructor_test
17/26 Test #17: cpp-netlib-http-client_constructor_test ............. Passed 0.01 sec
Start 18: cpp-netlib-http-client_get_test
18/26 Test #18: cpp-netlib-http-client_get_test ..................... Passed 2.94 sec
Start 19: cpp-netlib-http-client_get_different_port_test
19/26 Test #19: cpp-netlib-http-client_get_different_port_test ...... Passed 2.91 sec
Start 20: cpp-netlib-http-client_get_ready_test
20/26 Test #20: cpp-netlib-http-client_get_ready_test ............... Passed 1.56 sec
Start 21: cpp-netlib-http-client_get_streaming_test
21/26 Test #21: cpp-netlib-http-client_get_streaming_test ........... Passed 0.01 sec
Start 22: cpp-netlib-http-server_constructor_test
^Cmake: *** [test] Interrupt
As you can see, I had to interrupt the process manually. Why is this happening? Do all of these tests need to pass before I can start using the library?
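In case it helps with diagnosis, this is how I can re-run just that test in isolation and skip it when running the rest of the suite (the build path is the one from the log above, and the 60-second timeout is an arbitrary choice; -R, -E, --output-on-failure and --timeout are standard CTest options):

# Re-run only the hanging test, with verbose failure output and a 60-second timeout
cd /root/cpp-netlib/build
ctest -R cpp-netlib-http-server_constructor_test --output-on-failure --timeout 60

# Run the remaining tests while excluding the HTTP server tests
ctest -E server --output-on-failure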