
## Summary
This proposal aims to provide end-to-end strong typing for all interactions between SignalR clients and SignalR hubs.
## Motivation and goals

### In scope

- Streaming with `ChannelReader<>` and `IAsyncEnumerable<>` as the underlying stream

### Out of scope
### Risks / unknowns

- Hub-side features such as `HubMethodName` can break strongly-typed clients depending on the particular implementation (see the sketch below).
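For instance, `HubMethodName` changes the name by which a hub method is invoked over the wire. A sketch of the concern, using a hypothetical hub and method names:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class SomeHub : Hub
{
    // Invoked over the wire as "AddNumbers", not "Add". A proxy that derives
    // invocation names from the C# interface/method names would therefore
    // target a method that does not exist on the wire.
    [HubMethodName("AddNumbers")]
    public Task<int> Add(int x, int y) => Task.FromResult(x + y);
}
```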
## Examples

### Client to Server Calls
Let's say there is an interface `IMyHub` as defined below, which is implemented by an ASP.NET Core application as `MyHub : SignalR.Hub, IMyHub`.

A developer currently needs to consume it as below.
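A minimal sketch of such an interface and of the current, weakly-typed consumption. The `IMyHub` members (`Add`, `Notify`) and the URL are assumptions made for illustration; the client calls use the existing string-based `HubConnection` API (`InvokeAsync` / `SendAsync`):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

// Hypothetical hub contract (the members here are assumed, not from the proposal).
public interface IMyHub
{
    Task<int> Add(int x, int y);
    Task Notify(string message);
}

public static class CurrentUsage
{
    public static async Task RunAsync()
    {
        var connection = new HubConnectionBuilder()
            .WithUrl("https://example.com/myhub") // placeholder URL
            .Build();

        await connection.StartAsync();

        // Weakly typed: the method name is a string and the arguments are not
        // checked against IMyHub at compile time.
        int sum = await connection.InvokeAsync<int>("Add", 1, 2);
        await connection.SendAsync("Notify", $"1 + 2 = {sum}");
    }
}
```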
Instead, the developer could make strongly-typed calls.
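One possible call site, continuing from the connection above; the proxy acquisition method (`CreateHubProxy<T>`) is a hypothetical name, since the detailed design is still to be decided:

```csharp
// Hypothetical API: the proxy implements IMyHub, so method names and argument
// types are checked by the compiler.
IMyHub hub = connection.CreateHubProxy<IMyHub>();
int sum = await hub.Add(1, 2);
await hub.Notify($"1 + 2 = {sum}");
```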
We can either have such a proxy acquired from a hub connection or from a builder. Acquisition from a hub connection is simpler, while a builder provides more room for extension in the future.
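A rough sketch of the two acquisition shapes; both names are placeholders rather than an agreed API:

```csharp
// Option 1: acquire the proxy directly from an existing connection (hypothetical name).
IMyHub viaConnection = connection.CreateHubProxy<IMyHub>();

// Option 2: acquire the proxy from a dedicated builder, which leaves room for
// additional configuration later (hypothetical names).
IMyHub viaBuilder = new HubProxyBuilder<IMyHub>()
    .WithConnection(connection)
    .Build();
```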
### Server to Client Call Handlers
One can similarly define an interface `IMyClient` as below, which can then be used in `Hub<IMyClient>` on the server end and implemented by any consumer to provide callbacks.

A developer currently needs to provide such callbacks as below.
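A minimal sketch, assuming a single callback `ReceiveMessage`. The server side uses the existing `Hub<T>` support for strongly-typed outgoing calls; the client side shows today's string-based `On` registration:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;
using Microsoft.AspNetCore.SignalR.Client;

// Hypothetical callback contract (the member here is assumed, not from the proposal).
public interface IMyClient
{
    Task ReceiveMessage(string user, string message);
}

// Server side: Hub<IMyClient> already makes calls *to* clients strongly typed.
// (The same hub could also implement IMyHub as above; omitted for brevity.)
public class MyHub : Hub<IMyClient>
{
    public Task Broadcast(string user, string message)
        => Clients.All.ReceiveMessage(user, message);
}

// Client side today: the callback is registered by method name, and the
// parameter types are repeated by hand.
public static class CurrentCallbackRegistration
{
    public static void Register(HubConnection connection)
    {
        connection.On<string, string>("ReceiveMessage", (user, message) =>
        {
            Console.WriteLine($"{user}: {message}");
        });
    }
}
```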
Instead, the developer could register callbacks in a strongly-typed manner.
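One possible shape for strongly-typed registration; the registration method (`Register<T>` here) is purely hypothetical:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

public class ConsoleClient : IMyClient
{
    public Task ReceiveMessage(string user, string message)
    {
        Console.WriteLine($"{user}: {message}");
        return Task.CompletedTask;
    }
}

public static class TypedCallbackRegistration
{
    public static void Register(HubConnection connection)
    {
        // Hypothetical API: method names and parameter types are taken from
        // IMyClient instead of being repeated as strings and generic arguments.
        connection.Register<IMyClient>(new ConsoleClient());
    }
}
```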
Multiple callback providers can be registered against the same hub connection, so different callbacks can be provided by different classes. However, this does mean overlap is possible; we'd want to decide how to handle this and whether to impose restrictions.
## Detailed design
To be decided.
Some alternatives for client-to-server call implementation are:
**Source-generated proxies** use the C# 9 / .NET 5 source generator feature to generate the proxy at compile time.

Pros:

Cons:
**Dynamic proxies** use `Reflection.Emit` to generate the proxy dynamically at runtime.

Pros:

Cons:
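Both of the above alternatives would ultimately produce a class roughly like the hand-written one below; this is only an illustration of the shape, not the output of any existing generator:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

// Roughly what a generated proxy for IMyHub would look like: each interface
// method forwards to the weakly-typed HubConnection API.
internal sealed class GeneratedMyHubProxy : IMyHub
{
    private readonly HubConnection _connection;

    public GeneratedMyHubProxy(HubConnection connection) => _connection = connection;

    public Task<int> Add(int x, int y)
        => _connection.InvokeAsync<int>(nameof(IMyHub.Add), x, y);

    public Task Notify(string message)
        => _connection.InvokeAsync(nameof(IMyHub.Notify), message);
}
```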
**Expressive proxies** use expressions to describe the call to be invoked, e.g. `await hubConnection.AsExpressive<IMyHub>().InvokeAsync(hub => hub.Do())`.

Pros:

Cons:
Server-to-client calls can be registered with just reflection (no `Reflection.Emit`), which is simple enough and would work on practically any platform. Other alternatives, such as source generation, are possible as well.
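A rough sketch of reflection-only registration, built on the existing untyped `On(string, Type[], Func<object[], Task>)` overload; the helper itself (`RegisterCallbacks`) is hypothetical:

```csharp
using System;
using System.Linq;
using System.Reflection;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

public static class ReflectionCallbackRegistration
{
    // Hypothetical helper: wire every method of TClient on 'target' up to the
    // connection, deriving names and parameter types via reflection.
    public static void RegisterCallbacks<TClient>(HubConnection connection, TClient target)
        where TClient : class
    {
        foreach (MethodInfo method in typeof(TClient).GetMethods())
        {
            Type[] parameterTypes = method.GetParameters()
                .Select(p => p.ParameterType)
                .ToArray();

            connection.On(method.Name, parameterTypes, args =>
            {
                // Dispatch the incoming call to the target implementation.
                object result = method.Invoke(target, args);
                return result as Task ?? Task.CompletedTask;
            });
        }
    }
}
```

Usage would then look like `ReflectionCallbackRegistration.RegisterCallbacks<IMyClient>(connection, new ConsoleClient());`.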