CleverCSV is a Python package for handling messy CSV files. It provides a drop-in replacement for the built-in csv module with improved dialect detection, and comes with a handy command-line application for working with CSV files.
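For illustration, here is a minimal sketch of the drop-in usage, assuming a hypothetical input file `messy.csv`; the `Sniffer` and `reader` calls mirror the standard-library csv module.

```python
import clevercsv

# Read a messy CSV file with automatic dialect detection.
# "messy.csv" is an assumed example file name.
with open("messy.csv", "r", newline="") as fp:
    # Detect delimiter, quote character, etc. from a sample of the file,
    # analogous to csv.Sniffer but with CleverCSV's improved detection.
    dialect = clevercsv.Sniffer().sniff(fp.read(10000))
    fp.seek(0)
    reader = clevercsv.reader(fp, dialect)
    rows = list(reader)

print(rows[:5])  # first five parsed rows
```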
A C# class providing simple, high-performance serializer functionality for manipulating CSV strings. Several test cases were run to handle the particular quirks of CSV character sequences. As published, the class is considered stable and is used in production environments that generate or consume CSV data at large scale.
LaukvikCSV is a powerful API for reading, writing and querying tabular data stored in the CSV format. In contrast to other APIs, it lets you specify data types for each column using metadata. It automatically detects delimiters, so you don't have to worry about whether the delimiter is a comma, tab, pipe, semicolon, etc. Run powerful queries to filter your data easily with a fluent, type-safe query language. Export your tabular data to CSV, JSON, XML and HTML.
Got a long list of email recipients? Let's get that sorted. Just feed in your connection details, your recipient list in CSV format, and the message content, and the daemon triggers the emails automatically.
An extra to easily import users into the MODX user database and batch assign MODX user groups and roles. The importer accepts data files in CSV format.
In a recent use case I wanted to know three statistics:
I was colouring nodes and, in particular, wanted to know whether any nodes at a certain level in the hierarchy remained uncoloured.
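As a hypothetical sketch (not the original code), one way to check for uncoloured nodes at a given level; the `Node` structure and `colour` attribute are assumed names for illustration only.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    # Illustrative node type: a name, an optional colour, and child nodes.
    name: str
    colour: Optional[str] = None
    children: List["Node"] = field(default_factory=list)

def uncoloured_at_level(root: Node, level: int) -> List[Node]:
    """Return the nodes at `level` (root is level 0) that have no colour set."""
    frontier = [root]
    for _ in range(level):
        frontier = [child for node in frontier for child in node.children]
    return [node for node in frontier if node.colour is None]

# Example: one of the two children at level 1 is still uncoloured.
root = Node("root", colour="red",
            children=[Node("a", colour="blue"), Node("b")])
print([n.name for n in uncoloured_at_level(root, 1)])  # ['b']
```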